WO1999017854A1 - Remotely programmable talking toy - Google Patents

Remotely programmable talking toy

Info

Publication number
WO1999017854A1
Authority
WO
WIPO (PCT)
Prior art keywords
server
script
talking
toy
individual
Prior art date
Application number
PCT/US1998/021215
Other languages
French (fr)
Inventor
Stephen J. Brown
Original Assignee
Health Hero Network, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Health Hero Network, Inc. filed Critical Health Hero Network, Inc.
Priority to AU97921/98A priority Critical patent/AU9792198A/en
Publication of WO1999017854A1 publication Critical patent/WO1999017854A1/en

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/08 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B5/14 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations with provision for individual teacher-student communication
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/28 Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 Computerized interactive toys, e.g. dolls

Definitions

  • The present invention relates generally to talking toys such as interactive dolls, and in particular to a remotely programmable talking toy which interacts with a user in accordance with script programs received through a communication network.
  • Since their inception, talking toys have enjoyed considerable popularity in the marketplace.
  • The talking toys are typically embodied as dolls, teddy bears, action figures, or robots which entertain or instruct their users, usually children.
  • Such toys generally rely upon one of two common methods to produce sounds or spoken phrases. They either use electronic voice synthesis to synthesize phrases prestored in memory, or they reproduce prerecorded sounds using an internal audio cassette player.
  • These conventional talking toys are generally limited to speaking a series of prestored phrases in a repetitive sequence selected at the time of manufacture. Because of the repetitious nature of the speech provided by the toys, children quickly lose interest in them. Further, because their speech patterns are preprogrammed during manufacture, these conventional toys are incapable of delivering messages which are personalized, updated, or tailored to the changing needs of their users. Consequently, the educational and entertainment value of the toys is severely limited.
  • U.S. Patent 4,840,602, issued to Rose on June 20, 1989, describes a talking doll which interacts with an external signal source, such as a television or VCR.
  • The external signal source communicates a story or narrative which is prerecorded on a magnetic tape.
  • The tape includes binary data which is broadcast to the doll via a radio frequency transmitter. The binary data instructs the doll to make statements at certain points in the narrative, so that the doll appears to interact intelligently with the external signal source.
  • Although the talking dolls described by Baer and Rose provide somewhat less repetitious speech patterns than those of conventional talking dolls, they cannot be programmed to deliver messages which are tailored to the needs of a user.
  • The dolls simply speak phrases in response to speech signals which have been prerecorded on a cassette tape.
  • The dolls may only be programmed to speak new phrases by purchasing a new cassette tape.
  • U.S. Patent 5,607,336, issued to Lebensfeld et al. on March 4, 1997, discloses a talking doll which delivers messages relating to a user's desired area of interest. Audible messages relating to the area of interest are prerecorded on a read only memory (ROM) chip which is removably mounted in the talking doll. A user inserts the ROM chip in the doll and then activates the doll to hear the recorded messages.
  • Although this practice provides greater speech capability, a new ROM chip must be purchased and inserted into the doll each time the user wishes to hear messages relating to a new area of interest.
  • Lebensfeld does not teach any mechanism for tailoring the prerecorded messages to the needs of a specific user.
  • It is an object of the invention to provide a talking toy which may be programmed to deliver messages which are tailored to the needs of a user. It is another object of the invention to provide a talking toy which may be remotely programmed through a communication network, such as the Internet. It is another object of the invention to provide a talking toy including at least one control button for prompting the toy to execute a script program. It is a further object of the invention to provide a talking toy having at least one user interface means and a signaling unit, for specifying a time at which to execute a script program and for signaling a user that a script program has been executed.
  • The invention presents a networked communication system which includes at least one talking toy for communicating a message to an individual.
  • The system also includes a server and a remote interface connected to the server for specifying the message to be communicated.
  • The server is preferably accessible via a standard network connection such as a world wide web connection, and the remote interface is preferably a personal computer, network terminal, web TV unit, Palm Pilot unit, or interactive voice system connected to the server via the Internet.
  • The talking toy is also connected to the server via a communication network, preferably the Internet.
  • The talking toy of the present invention is remotely programmed through the communication network.
  • The server includes a script generator for generating a script program executable by the talking toy to communicate the message to the individual.
  • The talking toy includes a communication device, such as a modem, for establishing a communication link to the server through the communication network and for receiving the script program from the server.
  • The talking toy also includes a memory for storing the script program and a speech synthesizer for audibly communicating the message to the individual.
  • The talking toy further includes a microcontroller connected to the communication device, the memory, and the speech synthesizer for executing the script program. Because the talking toy is programmed remotely through the use of script programs, the system allows flexible and dynamic updating of the messages delivered by the toy. Further, because the messages may be specified through the remote interface, the system provides for convenient tailoring of the messages to the needs of an individual user or group of users.
  • FIG. 1 is a block diagram of a networked system according to a preferred embodiment of the invention.
  • FIG. 2 is a block diagram illustrating the interaction of the components of the system of FIG. 1.
  • FIG. 3 is a perspective view of a remotely programmable talking toy of the system of FIG. 1.
  • FIG. 4 is a block diagram illustrating the components of the talking toy of FIG. 3.
  • FIG. 5 is a script entry screen according to the preferred embodiment of the invention.
  • FIG. 6 is a listing of a sample script program according to the preferred embodiment of the invention.
  • FIG. 7 is a script assignment screen according to the preferred embodiment of the invention.
  • FIG. 8 is a flow chart illustrating the steps included in a software application executed by the server of FIG. 1 according to the preferred embodiment of the invention.
  • FIG. 9 is a flow chart illustrating the steps included in a control program executed by the talking toy of FIG. 3 according to the preferred embodiment of the invention.
  • FIG. 10 is a flow chart illustrating the steps included in the script program of FIG. 6.
  • FIG. 11 is a block diagram illustrating the interaction of the server of FIG. 1 with the talking toy of FIG. 3 according to a second embodiment of the invention.
  • FIG. 12 is a script entry screen according to the second embodiment of the invention.
  • FIG. 13 is a listing of a generic script program according to the second embodiment of the invention.
  • FIG. 14 is a listing of a custom script program according to the second embodiment of the invention.
  • FIG. 15 is a flow chart illustrating the steps included in a software application executed by the server of FIG. 1 according to the second embodiment of the invention.
  • FIG. 16 is a script entry screen according to an alternative embodiment of the invention.
  • FIG. 17 is a script entry screen according to another embodiment of the invention.
  • The invention presents a networked system which includes one or more talking toys for communicating messages to individuals.
  • The talking toys of the present invention are programmed remotely through the use of script programs.
  • The script programs allow flexible and dynamic updating of the messages delivered by the toys, as well as convenient tailoring of the messages to the needs of the individuals.
  • In a preferred application, the individuals are patients, and the talking toys are remotely programmed to encourage healthy behavior in the patients.
  • For example, the talking toys may be programmed to encourage children to take their medicine or to tolerate difficult healthcare regimens.
  • However, the system of the present invention is not limited to healthcare applications. It will be apparent from the ensuing description that the system is equally well suited for advertising, education, entertainment, or any other application which involves the communication of messages.
  • A networked system 16 includes a server 18 and a workstation 20 connected to server 18 through a communication network 24.
  • Server 18 is preferably accessible via a standard network connection such as a world wide web connection and communication network 24 is preferably the Internet. It will be apparent to one skilled in the art that server 18 may comprise a single stand-alone computer or multiple computers distributed throughout a network.
  • Workstation 20 is preferably a personal computer, remote terminal, web TV unit, Palm Pilot unit, or interactive voice system connected to server 18 via the Internet. Workstation 20 functions as a remote interface for entering in server 18 the messages to be communicated to the individuals.
  • System 16 also includes first and second remotely programmable talking toys 26 and 28. Each talking toy interacts with an individual in accordance with script programs received from server 18. Each talking toy is connected to server 18 through communication network 24, preferably the Internet. Alternatively, the talking toys may be placed in communication with server 18 via wireless communication networks, cellular networks, telephone networks, or any other network which allows each talking toy to exchange data with server 18. For clarity of illustration, only two talking toys are shown in FIG. 1. It is to be understood that system 16 may include any number of talking toys for communicating messages to any number of individuals.
  • FIG. 2 shows server 18, workstation 20, and talking toy 26 in greater detail.
  • Server 18 includes a database 30 for storing script programs 32. The script programs are executed by the talking toys to communicate messages to the individuals.
  • Database 30 further includes a look-up table 34.
  • Table 34 contains a list of the individuals who are to receive messages, and for each of the individuals, a unique identification code and a respective pointer to the script program assigned to the individual.
  • Each talking toy is designed to execute assigned script programs which it receives from server 18.
  • Clock 78 (FIG. 4) enables a loaded script program to remain in the toy for a period of time after downloading.
  • FIGS. 3-4 show the structure of each talking toy according to the preferred embodiment. For clarity, only talking toy 26 is illustrated since each talking toy of the preferred embodiment has substantially identical structure to toy 26.
  • Toy 26 is preferably embodied as a doll, such as a teddy bear.
  • Alternatively, toy 26 may be embodied as an action figure, robot, or any other desired toy.
  • An action figure, robot, or other embodiment of toy 26 may have one or more moving body parts.
  • Toy 26 includes a modem jack 46 for connecting the toy to a telephone jack 22 through a connection cord 48.
  • Toy 26 also includes a signaling unit 51 for signaling a user, and one or more user interface means 53, as well as first and second user control buttons 50 and 52.
  • Button 50 is pressed to instruct the toy to execute a script program.
  • Button 52 is pressed to instruct the toy to establish a communication link to the server and download a new script program.
  • The control buttons may be replaced by, or accompanied by, switches, keys, sensors, or any other type of interface suitable for receiving user input.
  • One or more user interface means 53 may be included for prompting signaling unit 51 to signal a user (e.g., a child).
  • FIG. 4 is a schematic block diagram illustrating the internal components of toy 26.
  • Toy 26 includes an audio processor chip 54, which is preferably an RSC-164 chip commercially available from Sensory Circuits Inc. of 1735 N. First Street, San Jose, California 95112.
  • Audio processor chip 54 has a microcontroller 56 for executing script programs received from server 18.
  • A memory 58 is connected to microcontroller 56.
  • Memory 58 stores the individual's unique identification code, script programs received from server 18, and a script interpreter used by microcontroller 56 to execute the script programs .
  • The script interpreter translates script commands into the native processor code of microcontroller 56. Specific techniques for translating and executing script commands in this manner are well known in the art.
  • Memory 58 also stores a control program executed by microcontroller 56 to perform various control functions which are described in the operation section below.
  • Memory 58 is preferably a non-volatile memory, such as a serial EEPROM.
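  • The command dispatch performed by such a script interpreter can be sketched as follows. This is an illustrative model only: the SPEECH and DELAY command names are assumptions standing in for the patent's actual command set (Table 1), and the speak/pause callables stand in for the toy's speech synthesizer and timer hardware.

```python
def run_script(script_text, speak, pause):
    """Interpret a script program line by line.

    speak(text) and pause(seconds) are stand-ins for the toy's
    speech-synthesis and timing functions.
    """
    for line in script_text.splitlines():
        line = line.strip()
        if not line:
            continue
        # Each line has the form COMMAND: argument
        command, _, argument = line.partition(":")
        if command == "SPEECH":
            speak(argument.strip())        # synthesize the phrase
        elif command == "DELAY":
            pause(float(argument.strip())) # pause between statements
        else:
            raise ValueError("unknown script command: " + command)
```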
  • Toy 26 also includes a modem 85 which is connected between microcontroller 56 and modem jack 46.
  • Modem 85 operates under the control of microcontroller 56 to establish communication links to server 18 through the communication network and to exchange data with the server.
  • the data includes the individual's unique identification code which modem 85 transmits to server 18, as well as assigned script programs which modem 85 receives from server 18.
  • Modem 85 is preferably a complete 28.8 K modem commercially available from Cermetek Microelectronics, Inc., Sunnyvale, CA, although any suitable modem may be used.
  • Toy 26 further includes a speaker 64 and a microphone 66.
  • Audio processor chip 54 has built in speech synthesis functionality for audibly communicating messages and prompts to an individual through speaker 64.
  • Clock 78 (FIG. 4) allows a loaded script program to remain in toy 26 for a specified period of time after downloading.
  • The combination of clock 78 and signaling unit 51 enables messages to be downloaded into toy 26 (e.g., by a parent in the morning) and enables toy 26 to subsequently signal a user (e.g., a child) at a later time during the day that a message is waiting.
  • Signaling unit 51 may provide an audible and/or a visual signal, e.g., a light emitting diode, musical tones, etc.
  • A signal may be provided via flashing eyes, vibration, etc., of toy 26.
  • Alternatively, speech synthesis may be used to relay (audibly communicate) a signal to a user, as described hereinabove.
  • Chip 54 includes a digital to analog converter (DAC) 60 and an amplifier 62.
  • DAC 60 and amplifier 62 drive speaker 64 under the control of microcontroller 56 to communicate the messages and prompts.
  • Audio processor chip 54 also has built in speech recognition functionality for recognizing responses spoken into microphone 66. Audio signals received through microphone 66 are converted to electrical signals and sent to a preamp and gain control circuit 68. Circuit 68 is controlled by an automatic gain control circuit 70, which is in turn controlled by microcontroller 56. After being amplified by preamp 68, the electrical signals enter chip 54 and pass through a multiplexer 72 and an analog to digital converter (ADC) 74. The resulting digital signals pass through a digital logic circuit 76 and enter microcontroller 56 for speech recognition.
  • Audio processor chip 54 also includes a RAM 80 for short term memory storage and a ROM 82 which stores audio sounds for speech synthesis and programs executed by microcontroller 56 to perform speech recognition and speech synthesis.
  • Chip 54 operates at a clock speed determined by a crystal 84.
  • Chip 54 further includes a clock 78 which provides the current date and time to microcontroller 56.
  • Microcontroller 56 is also connected to control buttons 50 and 52 to receive user input.
  • Toy 26 is preferably powered by one or more batteries (not shown). Alternatively, the toy may be powered by a standard wall outlet. Both methods for supplying power to a toy are well known in the art.
  • Server 18 includes a controlling software application 36 which is executed by server 18 to perform the various functions described below.
  • Application 36 includes a script generator 38 and a script assignor 40.
  • Script generator 38 is designed to generate script programs 32 from script information entered through workstation 20.
  • The script information is entered through a script entry screen 42.
  • Script entry screen 42 is implemented as a web page on server 18.
  • Workstation 20 includes a web browser for accessing the web page to enter the script information.
  • FIG. 5 illustrates a sample script entry screen 42 as it appears on workstation 20.
  • Screen 42 includes a script name field 86 for specifying the name of a script program to be generated.
  • Screen 42 also includes entry fields 88 for entering a message, such as a set of statements or phrases, to be communicated to an individual.
  • FIG. 5 illustrates an exemplary set of statements which encourage the individual to comply with his or her diabetes care regimen. However, it is to be understood that any type of message may be entered in screen 42, including advertisements, educational messages, and entertainment messages.
  • Screen 42 further includes a CREATE SCRIPT button 90 for instructing the script generator to generate a script program from the information entered in screen 42.
  • Screen 42 also includes a CANCEL button 92 for canceling the information entered.
  • Each script program created by the script generator conforms to the standard file format used on UNIX systems.
  • Each command is listed in upper case and followed by a colon. Every line in the script program is terminated by a linefeed character <LF>, and only one command is placed on each line.
  • The last character in the script program is a UNIX end of file character <EOF>.
  • Table 1 shows an exemplary listing of script commands used in the preferred embodiment of the invention.
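  • A checker for the line format just described might look like the following sketch. It verifies only the stated constraints (one upper-case command per line, each command followed by a colon, every line terminated by a linefeed); handling of the end-of-file marker is left to the transport layer and is outside this sketch.

```python
def is_valid_script(text):
    """Check a script program against the stated line format."""
    if not text.endswith("\n"):
        return False              # every line must end in a linefeed
    for line in text.split("\n")[:-1]:
        command, sep, _ = line.partition(":")
        if sep != ":" or not command.isupper():
            return False          # missing colon, or command not upper case
    return True
```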
  • Script generator 38 preferably stores a script program template which it uses to create each script program. To generate a script program, script generator 38 inserts into the template the information entered in screen 42. For example, FIG. 6 illustrates a sample script program created by the script generator from the script information shown in FIG. 5. The script program includes speech commands to synthesize the phrases or statements entered in fields 88. The steps included in the script program are also shown in the flow chart of FIG. 10 and will be discussed in the operation section below.
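  • The template-filling step can be sketched as below. The opening and closing phrases are taken from the sample script of FIG. 10; the SPEECH/DELAY command syntax and the fixed delay value are illustrative assumptions, not the patent's actual template.

```python
# Phrases grounded in the sample script program (FIG. 10).
TEMPLATE_HEADER = "SPEECH: SAY 'OK' WHEN YOU ARE READY\n"
TEMPLATE_FOOTER = "SPEECH: PLEASE CONNECT ME TO THE TELEPHONE JACK TO GET NEW MESSAGES\n"

def generate_script(statements, delay_seconds=5):
    """Wrap entered statements in speech commands inside the template."""
    body = ""
    for statement in statements:
        body += "SPEECH: " + statement + "\n"
        body += "DELAY: " + str(delay_seconds) + "\n"
    return TEMPLATE_HEADER + body + TEMPLATE_FOOTER
```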
  • Script assignor 40 assigns script programs 32 to the individuals.
  • Script programs 32 are assigned in accordance with script assignment information entered through workstation 20.
  • The script assignment information is entered through a script assignment screen 44, which is preferably implemented as a web page on server 18.
  • FIG. 7 illustrates a sample script assignment screen 44 as it appears on workstation 20.
  • Screen 44 includes check boxes 94 for selecting a script program to be assigned and check boxes 96 for selecting the individuals to whom the script program is to be assigned.
  • Screen 44 also includes an ASSIGN SCRIPT button 100 for entering the assignments. When button 100 is pressed, the script assignor creates and stores for each individual selected in check boxes 96 a respective pointer to the script program selected in check boxes 94. Each pointer is stored in look-up table 34 of database 30.
  • Screen 44 further includes an ADD SCRIPT button 98 for adding a new script program and a DELETE SCRIPT button 102 for deleting a script program.
  • FIG. 8 is a flow chart illustrating the steps included in the software application executed by server 18.
  • In step 202, server 18 determines if new script information has been entered through script entry screen 42. If new script information has not been entered, server 18 proceeds to step 206. If new script information has been entered, server 18 proceeds to step 204.
  • The script information is entered in server 18 by one or more healthcare providers, such as a physician or case manager assigned to the individuals.
  • However, any person desiring to communicate with the individuals may be granted access to server 18 to create and assign script programs.
  • The system may include any number of remote interfaces for entering script generation and script assignment information in server 18.
  • The script information specifies a message, such as a set of statements or phrases, to be communicated to one or more individuals.
  • In step 204, script generator 38 generates a script program from the information entered in screen 42.
  • The script program is stored in database 30.
  • Steps 202 and 204 are preferably repeated to generate multiple script programs, e.g. a script program for diabetes patients, a script program for asthma patients, etc.
  • Each script program corresponds to a respective one of the sets of statements entered through script entry screen 42.
  • In step 206, server 18 determines if new script assignment information has been entered through assignment screen 44. If new script assignment information has not been entered, server 18 proceeds to step 210. If new script assignment information has been entered, server 18 proceeds to step 208. As shown in FIG. 7, the script assignment information is entered by selecting a desired script program through check boxes 94, selecting the individuals to whom the selected script program is to be assigned through check boxes 96, and pressing the ASSIGN SCRIPT button 100. When button 100 is pressed, script assignor 40 creates for each individual selected in check boxes 96 a respective pointer to the script program selected in check boxes 94. In step 208, each pointer is stored in look-up table 34 of database 30.
  • In step 210, server 18 determines if any one of the talking toys is remotely connected to the server. Each individual is preferably provided with his or her own talking toy which has the individual's unique identification code stored therein. Each individual is thus uniquely associated with a respective one of the talking toys. If none of the talking toys is connected, server 18 returns to step 202. If a talking toy is connected, server 18 receives from the talking toy the individual's unique identification code in step 212. Server 18 uses the received identification code to retrieve from table 34 the pointer to the script program assigned to the individual. In step 214, server 18 retrieves the assigned script program from database 30. In step 216, server 18 transmits the assigned script program to the individual's talking toy through communication network 24.
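  • The retrieval performed in steps 212-214 amounts to a two-stage indexed look-up: the identification code indexes look-up table 34, and the resulting pointer selects the assigned script program in database 30. A sketch with plain dictionaries standing in for the table and database (all names and values below are illustrative):

```python
# Illustrative stand-ins for look-up table 34 and database 30.
lookup_table = {"ID-001": "diabetes_script"}   # id code -> script pointer
script_database = {"diabetes_script": "SPEECH: TAKE YOUR INSULIN\n"}

def fetch_assigned_script(id_code):
    """Resolve an id code to its assigned script program."""
    pointer = lookup_table[id_code]    # step 212: resolve the pointer
    return script_database[pointer]    # step 214: retrieve the program
```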
  • Following step 216, the server returns to step 202.
  • Each talking toy is initially programmed with its user's unique identification code, the script interpreter used by the toy to interpret and execute script program commands, and a control program executed by the toy to control its overall operation.
  • The initial programming may be achieved during manufacture or during an initial connection to server 18.
  • FIG. 9 illustrates the steps included in the control program executed by microcontroller 56 of talking toy 26.
  • In step 302, microcontroller 56 determines if any user input has been received. In the preferred embodiment, user input is received through control buttons 50 and 52. Control button 50 is pressed to instruct the talking toy to speak, and control button 52 is pressed to instruct the toy to connect to the server and download a new script program. If no user input is received for a predetermined period of time, such as two minutes, toy 26 enters sleep mode in step 304. The sleep mode conserves battery power while the toy is not in use. Following step 304, microcontroller 56 returns to step 302 and awaits user input.
  • If user input has been received, microcontroller 56 determines if the input is a speech request, step 306. If the user has pressed control button 50 requesting speech, microcontroller 56 executes the script program last received from the server, step 308. The steps included in a sample script program are shown in the flow chart of FIG. 10 and will be discussed below. Following step 308, microcontroller 56 returns to step 302 and awaits new user input.
  • If the user has instead pressed control button 52, microcontroller 56 attempts to establish a communication link to the server through modem 85 and communication network 24, step 310.
  • In step 312, microcontroller 56 determines if the connection was successful. If the connection failed, the user is prompted to connect toy 26 to telephone jack 22 in step 314.
  • Microcontroller 56 preferably prompts the user by synthesizing the phrase "PLEASE CONNECT ME TO THE TELEPHONE JACK USING THE CONNECTION CORD AND SAY 'DONE' WHEN YOU HAVE FINISHED."
  • Microcontroller 56 then waits until the appropriate reply is received through microphone 66. Upon recognizing the reply 'DONE', microcontroller 56 repeats step 310 to establish a connection to the server.
  • Once the connection succeeds, microcontroller 56 transmits the unique identification code stored in memory 58 to server 18 in step 318.
  • Microcontroller 56 then receives a new script program from the server through communication network 24 and modem 85. The new script program is stored in memory 58 for subsequent execution by microcontroller 56.
  • Following the download, microcontroller 56 returns to step 302 and awaits new user input.
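  • The branch structure of the control program of FIG. 9 can be sketched as follows, with the button input and the hardware actions abstracted behind callables. This is an illustrative model, not the toy's firmware.

```python
def control_step(button_pressed, execute_script, download_script, enter_sleep):
    """Handle one unit of user input: 'speak', 'download', or None."""
    if button_pressed is None:
        enter_sleep()          # step 304: no input, conserve battery power
    elif button_pressed == "speak":
        execute_script()       # step 308: run the last received script
    elif button_pressed == "download":
        download_script()      # steps 310 onward: connect and download
```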
  • FIG. 10 is a flow chart illustrating the steps included in a sample script program executed by microcontroller 56.
  • Microcontroller 56 first prompts the user by synthesizing through speaker 64 "SAY 'OK' WHEN YOU ARE READY".
  • Microcontroller 56 then waits until a reply to the prompt is received through microphone 66. When the reply 'OK' is recognized, microcontroller 56 proceeds to step 406. If no reply is received within a predetermined period of time, such as two minutes, toy 26 preferably enters sleep mode until it is reactivated by pressing one of the control buttons.
  • In step 406, microcontroller 56 executes successive speech commands to synthesize through speaker 64 the phrases or statements specified in the script program.
  • The speech commands are preferably separated by delay commands which instruct microcontroller 56 to pause for a number of seconds between statements. The number of seconds is selected to allow the user sufficient time to absorb each statement. Alternatively, the user may be prompted to acknowledge each statement before a subsequent statement is synthesized.
  • For example, the script program may include commands which instruct microcontroller 56 to synthesize the phrase "SAY 'OK' WHEN YOU ARE READY TO HEAR THE NEXT STATEMENT." Upon recognizing the reply 'OK', microcontroller 56 proceeds to the next speech command in the script program.
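  • The acknowledgment-paced delivery just described can be sketched as follows. The listen() callable stands in for the speech-recognition path and is an assumption of this sketch.

```python
def deliver_statements(statements, speak, listen):
    """Speak each statement, requiring an 'OK' before each next one."""
    for i, statement in enumerate(statements):
        if i > 0:
            speak("SAY 'OK' WHEN YOU ARE READY TO HEAR THE NEXT STATEMENT.")
            while listen() != "OK":
                pass          # keep listening until 'OK' is recognized
        speak(statement)
```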
  • In step 408, the user is reminded to connect toy 26 to telephone jack 22 to download a new script program.
  • Microcontroller 56 synthesizes through speaker 64 "PLEASE CONNECT ME TO THE TELEPHONE JACK TO GET NEW MESSAGES." Following step 408, the script program ends.
  • One advantage of the system of the present invention is that it allows each talking toy to be programmed remotely through the use of script programs. This allows the messages delivered by each talking toy to be tailored to the specific needs of an individual user or group of users. Moreover, each script program may be easily created, assigned, and downloaded by simply accessing a server through a communication network, such as the Internet.
  • The invention thus provides a powerful, convenient, and inexpensive system for communicating messages to a large number of individuals.
  • FIGS. 11-15 illustrate a second embodiment of the invention in which messages are further customized to each individual by merging personal data with the script programs, much like a standard mail merge application.
  • Personal data relating to each individual is preferably stored in look-up table 34 of database 30.
  • The data may include each individual's name, the name of each individual's medication or disease, or any other desired data.
  • Database 30 also stores generic script programs 31 created by script generator 38.
  • Server 18 includes a data merge program 41 for merging the data stored in table 34 with generic script programs 31.
  • Data merge program 41 is designed to retrieve selected data from table 34 and to insert the data into statements in generic script programs 31, thus creating custom script programs 33.
  • Each custom script program contains a message which is customized to an individual. For example, the message may be customized with the individual's name, medication name, disease name, etc.
  • The operation of the second embodiment is illustrated in FIGS. 11-15.
  • The operation of the second embodiment is similar to the operation of the preferred embodiment, except that server 18 transmits custom script programs to each talking toy rather than generic script programs.
  • FIG. 15 is a flow chart illustrating the steps included in a software application executed by server 18 according to the second embodiment.
  • In step 502, server 18 determines if new script information has been entered through script entry screen 42.
  • The script information specifies a message, such as a set of statements or phrases, to be communicated to the individuals.
  • Each statement preferably includes one or more insert commands specifying data from table 34 to be inserted into the statement.
  • The insert commands instruct data merge program 41 to retrieve the specified data from database 30 and to insert the data into the statement.
  • For example, the first statement shown in FIG. 12 includes insert commands instructing the data merge program to insert a patient name and a medication name into the statement.
  • CREATE SCRIPT button 90 is pressed.
  • script generator 38 When button 90 is pressed, script generator 38 generates a generic script program from the information entered in screen 42, step 504.
  • a sample generic script program is illustrated in FIG. 13.
  • The generic script program includes speech commands to synthesize the statements entered in fields 88.
  • Each statement preferably includes one or more insert commands specifying data to be inserted into the script program.
  • The generic script program is stored in database 30.
  • In step 506, server 18 determines if new script assignment information has been entered through assignment screen 44. If new script assignment information has not been entered, server 18 proceeds to step 512. If new script assignment information has been entered, server 18 proceeds to step 508. As shown in FIG. 7, the script assignment information is entered by selecting a desired script program through check boxes 94, selecting the individuals to whom the selected script program is to be assigned through check boxes 96, and pressing the ASSIGN SCRIPT button 100.
  • When button 100 is pressed, data merge program 41 creates a custom script program for each individual selected in check boxes 96, step 508.
  • Each custom script program is preferably created by using the selected generic script program as a template.
  • Data merge program 41 retrieves from database 30 the data specified in the insert commands.
  • Data merge program 41 then inserts the data into the appropriate statements in the generic script program to create a custom script program for the individual.
  • FIG. 14 illustrates a custom script program created from the generic script program of FIG. 13.
  • Each custom script program is stored in database 30.
  • Script assignor 40 assigns the custom script program to the individual, step 510. This is preferably accomplished by creating a pointer to the custom script program and storing the pointer with the individual's unique identification code in table 34.
  • In step 512, server 18 determines if any one of the talking toys is remotely connected to the server. If a talking toy is connected, server 18 receives from the talking toy the individual's unique identification code in step 514.
  • Server 18 uses the received identification code to retrieve from table 34 the pointer to the custom script program assigned to the individual. In step 516, server 18 retrieves the custom script program from database 30. In step 518, server 18 transmits the custom script program to the individual's talking toy. The talking toy receives and executes the script program in the same manner described in the preferred embodiment.
  • The remaining operation of the second embodiment is analogous to the operation of the preferred embodiment described above.
  • Although it is presently preferred to create a custom script program for each individual as soon as script assignment information is received for the individual, it is also possible to wait until the individual's talking toy connects to the server before generating the custom script program. This is accomplished by creating and storing a pointer to the generic script program assigned to the individual, as previously described in the preferred embodiment.
  • When the toy connects, the data merge program creates a custom script program for the individual from the generic script program assigned to the individual.
  • The custom script program is then transmitted to the individual's talking toy for execution.
  • The system of the present invention may be used for any messaging application.
  • For example, the system is particularly well suited for advertising.
  • In a third embodiment, an advertising service is provided with a remote interface to the server for creating and assigning script programs which contain advertising messages.
  • Each advertising message may be conveniently entered through script entry screen 42, like the health-related messages of the preferred embodiment.
  • The operation of the third embodiment is analogous to the operation of the preferred embodiment, except that the talking toys communicate advertising messages rather than health-related messages.
  • The system of the present invention has many other applications.
  • In a fourth embodiment, the user of each talking toy is a child.
  • The child's parent or guardian is provided with a remote interface to the server for creating and assigning script programs which contain messages for the child.
  • Each message may be conveniently entered through script entry screen 42.
  • The operation of the fourth embodiment is analogous to the operation of the preferred embodiment, except that script information is entered in the server by a parent or guardian rather than a healthcare provider.
  • Alternatively, the child may be provided with a remote interface to the server to create and assign his or her own script programs.
  • In a fifth embodiment, script programs may be generated from information received from multiple sources, such as a healthcare provider, an advertiser, and a parent.
  • The script entry screen includes a respective section for each of the sources to enter a message to be communicated.
  • Each of the sources is provided with a remote interface to the server and a password for accessing the script entry screen.
  • A script program is generated which contains a combination of health-related messages, advertisements, educational messages, or entertainment messages.
  • The remaining operation of the fifth embodiment is analogous to the operation of the preferred embodiment described above.
  • The talking toys of the present invention need not be embodied as dolls.
  • The toys may be embodied as action figures, robots, or any other type of toy.
  • Each talking toy need not include a control button for triggering speech output.
  • Speech may instead be triggered by other mechanisms, such as voice prompts, the absence of the user's voice, position sensitive sensors, switches, or the like. Specific techniques for triggering speech in a talking toy are well known in the art.
  • The system of the present invention is not limited to healthcare applications.
  • The system may be used in any application which involves the communication of messages, including advertising, education, or entertainment.
  • Messages from multiple sources may be combined to generate script programs which contain a combination of health-related messages, advertisements, or educational messages.
  • The system may include any number of remote interfaces for entering and assigning script programs, and any number of talking toys for delivering messages.

Abstract

A networked communication system which includes at least one talking toy for communicating messages to an individual. The system also includes a server and a remote interface connected to the server for specifying the messages to be communicated. The talking toy is connected to the server via a communication network, preferably the Internet. The server includes a script generator for generating script programs which are executed by the talking toy to deliver the messages. The script programs may be executed by the talking toy at user-specified times to deliver the messages. The talking toy includes a modem for receiving the script programs from the server, a microcontroller for executing the script programs, and a speech synthesizer for audibly communicating the messages to the individual. The talking toy may further include at least one user interface means, and a signaling means for signaling a user that a message is waiting. Because the talking toy may be programmed remotely through the use of script programs, the system allows flexible and dynamic updating of the messages delivered by the toy, as well as tailoring of the messages to the needs of an individual user or group of users.

Description

REMOTELY PROGRAMMABLE TALKING TOY
FIELD OF THE INVENTION
The present invention relates generally to talking toys such as interactive dolls, and in particular to a remotely programmable talking toy which interacts with a user in accordance with script programs received through a communication network.
BACKGROUND OF THE INVENTION
Since their inception, talking toys have enjoyed considerable popularity in the marketplace. The talking toys are typically embodied as dolls, teddy bears, action figures, or robots which entertain or instruct their users, usually children. Such toys generally rely upon one of two common methods to produce sounds or spoken phrases. They either use electronic voice synthesis to synthesize phrases prestored in memory or they reproduce prerecorded sounds using an internal audio cassette player.
In either case, these conventional talking toys are generally limited to speaking a series of prestored phrases in a repetitive sequence selected at the time of manufacture. Because of the repetitious nature of the speech provided by the toys, children quickly lose interest in them. Further, because their speech patterns are preprogrammed during manufacture, these conventional toys are incapable of delivering messages which are personalized, updated, or tailored to the changing needs of their users. Consequently, the educational and entertainment value of the toys is severely limited.
Several attempts have been made to develop talking toys which have broader speech capabilities. For example, U.S. Patent 4,840,602 issued to Rose on June 20, 1989 describes a talking doll which interacts with an external signal source, such as a television or VCR. The external signal source communicates a story or narrative which is prerecorded on a magnetic tape. In addition to the narrative, the tape includes binary data which is broadcast to the doll via a radio frequency transmitter. The binary data instructs the doll to make statements at certain points in the narrative, so that the doll appears to interact intelligently with the external signal source.
A similar system is disclosed in U.S. Patent 4,846,693 issued to Baer on July 11, 1989. Baer describes a talking doll having a speaker which is electrically connected to a control box. The control box is connected to the audio and video outputs of a conventional video cassette recorder (VCR), which is in turn connected to a television set. The VCR receives a cassette tape having a video/audio story line recorded thereon. The tape also includes control data recorded in the video track for routing selected portions of the audio track to the doll's speaker. The illusion created to a human viewer is that the doll is having an animated conversation with characters on the television monitor.
Although the talking dolls described by Baer and Rose provide somewhat less repetitious speech patterns than those of conventional talking dolls, they cannot be programmed to deliver messages which are tailored to the needs of a user. The dolls simply speak phrases in response to speech signals which have been prerecorded on a cassette tape. Moreover, the dolls may only be programmed to speak new phrases by purchasing a new cassette tape.
U.S. Patent 5,607,336 issued to Lebensfeld et al. on March 4, 1997 discloses a talking doll which delivers messages relating to a user's desired area of interest. Audible messages relating to the area of interest are prerecorded on a read only memory (ROM) chip which is removably mounted in the talking doll. A user inserts the ROM chip in the doll and then activates the doll to hear the recorded messages. Although this practice provides greater speech capability, a new ROM chip must be purchased and inserted into the doll each time the user wishes to hear messages relating to a new area of interest. Moreover, Lebensfeld does not teach any mechanism for tailoring the prerecorded messages to the needs of a specific user.
OBJECTS AND ADVANTAGES OF THE INVENTION
In view of the above, it is an object of the present invention to provide a talking toy which may be programmed to deliver messages which are tailored to the needs of a user. It is another object of the invention to provide a talking toy which may be remotely programmed through a communication network, such as the Internet. It is another object of the invention to provide a talking toy including at least one control button for prompting the toy to execute a script program. It is a further object of the invention to provide a talking toy having at least one user interface means and a signaling unit for specifying a time at which to execute a script program and for signaling a user that a script program has been executed. It is another object of the invention to provide a talking toy capable of transmitting data and receiving data over a communication network. It is a further object of the invention to incorporate the talking toy in a networked system which allows flexible and dynamic updating of the messages delivered by the toy. These and other objects and advantages will become more apparent after consideration of the ensuing description and the accompanying drawings.
SUMMARY
The invention presents a networked communication system which includes at least one talking toy for communicating a message to an individual. The system also includes a server and a remote interface connected to the server for specifying the message to be communicated. The server is preferably accessible via a standard network connection such as a world wide web connection, and the remote interface is preferably a personal computer, network terminal, web TV unit, Palm Pilot unit, or interactive voice system connected to the server via the Internet. The talking toy is also connected to the server via a communication network, preferably the Internet. In contrast to conventional talking toys whose speech is programmed during manufacture or through the insertion of memory chips, the talking toy of the present invention is remotely programmed through the communication network.
The server includes a script generator for generating a script program executable by the talking toy to communicate the message to the individual. The talking toy includes a communication device, such as a modem, for establishing a communication link to the server through the communication network and for receiving the script program from the server. The talking toy also includes a memory for storing the script program and a speech synthesizer for audibly communicating the message to the individual. The talking toy further includes a microcontroller connected to the communication device, the memory, and the speech synthesizer for executing the script program. Because the talking toy is programmed remotely through the use of script programs, the system allows flexible and dynamic updating of the messages delivered by the toy. Further, because the messages may be specified through the remote interface, the system provides for convenient tailoring of the messages to the needs of an individual user or group of users.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a networked system according to a preferred embodiment of the invention. FIG. 2 is a block diagram illustrating the interaction of the components of the system of FIG. 1. FIG. 3 is a perspective view of a remotely programmable talking toy of the system of FIG. 1.
FIG. 4 is a block diagram illustrating the components of the talking toy of FIG. 3. FIG. 5 is a script entry screen according to the preferred embodiment of the invention. FIG. 6 is a listing of a sample script program according to the preferred embodiment of the invention. FIG. 7 is a script assignment screen according to the preferred embodiment of the invention. FIG. 8 is a flow chart illustrating the steps included in a software application executed by the server of FIG. 1 according to the preferred embodiment of the invention. FIG. 9 is a flow chart illustrating the steps included in a control program executed by the talking toy of FIG. 3 according to the preferred embodiment of the invention. FIG. 10 is a flow chart illustrating the steps included in the script program of FIG. 6. FIG. 11 is a block diagram illustrating the interaction of the server of FIG. 1 with the talking toy of
FIG. 3 according to a second embodiment of the invention. FIG. 12 is a script entry screen according to the second embodiment of the invention. FIG. 13 is a listing of a generic script program according to the second embodiment of the invention.
FIG. 14 is a listing of a custom script program according to the second embodiment of the invention. FIG. 15 is a flow chart illustrating the steps included in a software application executed by the server of FIG. 1 according to the second embodiment of the invention. FIG. 16 is a script entry screen according to an alternative embodiment of the invention. FIG. 17 is a script entry screen according to another embodiment of the invention.
DETAILED DESCRIPTION
The invention presents a networked system which includes one or more talking toys for communicating messages to individuals. In contrast to conventional talking toys whose speech is programmed during manufacture or through the insertion of external media, the talking toys of the present invention are programmed remotely through the use of script programs. The script programs allow flexible and dynamic updating of the messages delivered by the toys, as well as convenient tailoring of the messages to the needs of the individuals.
In a preferred embodiment of the invention, the individuals are patients, and the talking toys are remotely programmed to encourage healthy behavior in the patients. For example, the talking toys may be programmed to encourage children to take their medicine or to tolerate difficult healthcare regimens. However, it is to be understood that the system of the present invention is not limited to healthcare applications. It will be apparent from the ensuing description that the system is equally well suited for advertising, education, entertainment, or any other application which involves the communication of messages.
The preferred embodiment of the invention is illustrated in FIGS. 1 - 7. Referring to FIG. 1, a networked system 16 includes a server 18 and a workstation 20 connected to server 18 through a communication network 24. Server 18 is preferably accessible via a standard network connection such as a world wide web connection and communication network 24 is preferably the Internet. It will be apparent to one skilled in the art that server 18 may comprise a single stand-alone computer or multiple computers distributed throughout a network. Workstation 20 is preferably a personal computer, remote terminal, web TV unit, Palm Pilot unit, or interactive voice system connected to server 18 via the Internet. Workstation 20 functions as a remote interface for entering in server 18 the messages to be communicated to the individuals.
System 16 also includes first and second remotely programmable talking toys 26 and 28. Each talking toy interacts with an individual in accordance with script programs received from server 18. Each talking toy is connected to server 18 through communication network 24, preferably the Internet. Alternatively, the talking toys may be placed in communication with server 18 via wireless communication networks, cellular networks, telephone networks, or any other network which allows each talking toy to exchange data with server 18. For clarity of illustration, only two talking toys are shown in FIG. 1. It is to be understood that system 16 may include any number of talking toys for communicating messages to any number of individuals.
FIG. 2 shows server 18, workstation 20, and talking toy 26 in greater detail. Server 18 includes a database 30 for storing script programs 32. The script programs are executed by the talking toys to communicate messages to the individuals. Database 30 further includes a look-up table 34. Table 34 contains a list of the individuals who are to receive messages, and for each of the individuals, a unique identification code and a respective pointer to the script program assigned to the individual. Each talking toy is designed to execute assigned script programs which it receives from server 18. Clock 78 (FIG. 4) enables a loaded script to remain in the toy for a period of time after downloading.
FIGS. 3 - 4 show the structure of each talking toy according to the preferred embodiment. For clarity, only talking toy 26 is illustrated since each talking toy of the preferred embodiment has substantially identical structure to toy 26. Referring to FIG. 3, toy 26 is preferably embodied as a doll, such as a teddy bear. Alternatively, toy 26 may be embodied as an action figure, robot, or any other desired toy. An action figure, robot, or other embodiment of toy 26 may have one or more moving body parts.
Toy 26 includes a modem jack 46 for connecting the toy to a telephone jack 22 through a connection cord 48. Toy 26 also includes a signaling unit 51 for signaling a user, and one or more user interface means 53, as well as first and second user control buttons 50 and 52. Button 50 is pressed to instruct the toy to execute a script program. Button 52 is pressed to instruct the toy to establish a communication link to the server and download a new script program. In alternative embodiments, the control buttons may be replaced by, or accompanied by, switches, keys, sensors, or any other type of interface suitable for receiving user input. For example, one or more user interface means 53 may be included for prompting signaling unit 51 to signal a user (e.g. a child) at a predetermined time, via clock 78, that a message is waiting for the user.
FIG. 4 is a schematic block diagram illustrating the internal components of toy 26. Toy 26 includes an audio processor chip 54, which is preferably an RSC-164 chip commercially available from Sensory Circuits Inc. of 1735 N. First Street, San Jose, California 95112. Audio processor chip 54 has a microcontroller 56 for executing script programs received from server 18. A memory 58 is connected to microcontroller 56. Memory 58 stores the individual's unique identification code, script programs received from server 18, and a script interpreter used by microcontroller 56 to execute the script programs.
The script interpreter translates script commands into the native processor code of microcontroller 56. Specific techniques for translating and executing script commands in this manner are well known in the art. Memory 58 also stores a control program executed by microcontroller 56 to perform various control functions which are described in the operation section below. Memory 58 is preferably a non-volatile memory, such as a serial EEPROM.
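By way of illustration only (the patent does not disclose the interpreter's internals), a line-oriented script interpreter of the kind executed by microcontroller 56 might be sketched as follows. The command names follow the style of Table 1, and the handlers are hypothetical placeholders:

```python
# Illustrative sketch of a script interpreter: each script line is parsed into
# COMMAND: argument and dispatched to a handler. The SPEAK/DELAY/END command
# set and the speak() callback are assumptions, not the patent's actual codes.
def run_script(script_text: str, speak) -> None:
    for line in script_text.splitlines():
        if not line.strip():
            continue
        command, _, argument = line.partition(":")
        if command == "SPEAK":
            speak(argument.strip())   # drive the speech synthesizer
        elif command == "DELAY":
            pass                      # e.g. wait for the given interval (omitted)
        elif command == "END":
            break                     # stop executing the script

spoken = []
run_script("SPEAK: Hello!\nEND:\nSPEAK: never reached", spoken.append)
print(spoken)  # ['Hello!']
```

In such a sketch, translation to "native processor code" is simply the dispatch from a command string to the routine that implements it.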
Toy 26 also includes a modem 85 which is connected between microcontroller 56 and modem jack 46. Modem 85 operates under the control of microcontroller 56 to establish communication links to server 18 through the communication network and to exchange data with the server. The data includes the individual's unique identification code which modem 85 transmits to server 18, as well as assigned script programs which modem 85 receives from server 18. Modem 85 is preferably a complete 28.8 K modem commercially available from Cermetek Microelectronics, Inc., Sunnyvale, CA, although any suitable modem may be used.
Toy 26 further includes a speaker 64 and a microphone 66. Audio processor chip 54 has built in speech synthesis functionality for audibly communicating messages and prompts to an individual through speaker 64. Clock 78 (FIG. 4) allows a loaded script to remain in toy 26 for a specified period of time after downloading a script program. The combination of clock 78 and signaling unit 51 enables messages to be downloaded into toy 26 (e.g. by a parent in the morning), and for toy 26 to subsequently signal a user (e.g. a child) at a later time during the day that a message is waiting. Signaling unit 51 may provide an audible and/or a visual signal, e.g., a light emitting diode, musical tones, etc. Alternatively, a signal may be provided via flashing eyes, vibration, etc., of toy 26. According to an alternative embodiment, speech synthesis may be used to relay (audibly communicate) a signal to a user, as described hereinabove. For speech synthesis, chip 54 includes a digital to analog converter (DAC) 60 and an amplifier 62. DAC 60 and amplifier 62 drive speaker 64 under the control of microcontroller 56 to communicate the messages and prompts.
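The clock-plus-signaling behavior described above could be sketched as follows. This is a minimal illustration, not the disclosed firmware; times are plain integers (minutes since midnight) and the signal is a boolean flag standing in for signaling unit 51:

```python
# Hypothetical sketch: a script downloaded in the morning is held in memory,
# and the toy fires its signaling unit once the clock reaches the scheduled
# time. All names and the integer time representation are assumptions.
class Toy:
    def __init__(self):
        self.pending_script = None   # stands in for memory 58
        self.signal_time = None      # compared against clock 78
        self.signaled = False

    def download(self, script: str, signal_time: int) -> None:
        self.pending_script = script
        self.signal_time = signal_time

    def tick(self, now: int) -> None:
        """Called as the clock advances; fires the signaling unit once."""
        if self.pending_script and not self.signaled and now >= self.signal_time:
            self.signaled = True     # e.g. flash eyes, play musical tones

toy = Toy()
toy.download("SPEAK: Message from Mom!", signal_time=900)  # 15:00 as minutes
toy.tick(800)
toy.tick(905)
print(toy.signaled)  # True
```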
Audio processor chip 54 also has built in speech recognition functionality for recognizing responses spoken into microphone 66. Audio signals received through microphone 66 are converted to electrical signals and sent to a preamp and gain control circuit 68. Circuit 68 is controlled by an automatic gain control circuit 70, which is in turn controlled by microcontroller 56. After being amplified by preamp 68, the electrical signals enter chip 54 and pass through a multiplexer 72 and an analog to digital converter (ADC) 74. The resulting digital signals pass through a digital logic circuit 76 and enter microcontroller 56 for speech recognition.
Audio processor chip 54 also includes a RAM 80 for short term memory storage and a ROM 82 which stores audio sounds for speech synthesis and programs executed by microcontroller 56 to perform speech recognition and speech synthesis. Chip 54 operates at a clock speed determined by a crystal 84. Chip 54 further includes a clock 78 which provides the current date and time to microcontroller 56. Microcontroller 56 is also connected to control buttons 50 and 52 to receive user input. Toy 26 is preferably powered by one or more batteries (not shown). Alternatively, the toy may be powered by a standard wall outlet. Both methods for supplying power to a toy are well known in the art.
Referring again to FIG. 2, server 18 includes a controlling software application 36 which is executed by server 18 to perform the various functions described below. Application 36 includes a script generator 38 and a script assignor 40. Script generator 38 is designed to generate script programs 32 from script information entered through workstation 20. The script information is entered through a script entry screen 42. In the preferred embodiment, script entry screen 42 is implemented as a web page on server 18. Workstation 20 includes a web browser for accessing the web page to enter the script information.
FIG. 5 illustrates a sample script entry screen 42 as it appears on workstation 20. Screen 42 includes a script name field 86 for specifying the name of a script program to be generated. Screen 42 also includes entry fields 88 for entering a message, such as a set of statements or phrases, to be communicated to an individual. FIG. 5 illustrates an exemplary set of statements which encourage the individual to comply with his or her diabetes care regimen. However, it is to be understood that any type of message may be entered in screen 42, including advertisements, educational messages, and entertainment messages. Screen 42 further includes a CREATE SCRIPT button 90 for instructing the script generator to generate a script program from the information entered in screen 42. Screen 42 also includes a CANCEL button 92 for canceling the information entered.
In the preferred embodiment, each script program created by the script generator conforms to the standard file format used on UNIX systems. In the standard file format, each command is listed in upper case and followed by a colon. Every line in the script program is terminated by a linefeed character {LF}, and only one command is placed on each line. The last character in the script program is a UNIX end of file character {EOF}. Table 1 shows an exemplary listing of script commands used in the preferred embodiment of the invention.
TABLE 1 - SCRIPT COMMANDS
(Table 1 is reproduced as an image in the original document.)
The script commands illustrated in Table 1 are representative of the preferred embodiment and are not intended to limit the scope of the invention. After consideration of the ensuing description, it will be apparent to one skilled in the art that many other suitable scripting languages and sets of script commands may be used to implement the invention.
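A script in the described file format (one upper-case command per line, colon-delimited, each line terminated by a linefeed, with an end-of-file marker last) could be assembled as in the sketch below. The `chr(4)` end marker is only an illustrative stand-in for the patent's {EOF} character, and the command names are examples:

```python
# Sketch of building a script in the described format. Command names are
# upper-cased and colon-terminated; every line ends with a linefeed; an
# end-of-file marker (here EOT, chr(4), as a stand-in for {EOF}) comes last.
def build_script(commands: list[tuple[str, str]]) -> str:
    lines = [f"{name.upper()}: {arg}".rstrip() for name, arg in commands]
    return "\n".join(lines) + "\n" + chr(4)

script = build_script([("speak", "Take your medicine!"), ("end", "")])
print(script.splitlines()[0])  # SPEAK: Take your medicine!
```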
Script generator 38 preferably stores a script program template which it uses to create each script program. To generate a script program, script generator 38 inserts into the template the information entered in screen 42. For example, FIG. 6 illustrates a sample script program created by the script generator from the script information shown in FIG. 5. The script program includes speech commands to synthesize the phrases or statements entered in fields 88. The steps included in the script program are also shown in the flow chart of FIG. 10 and will be discussed in the operation section below.
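The template insertion described above, and in particular the data-merge variant of FIGS. 11 - 15 in which insert commands pull personal data from look-up table 34, could be sketched as follows. The `{INSERT:field}` placeholder syntax and all names are hypothetical, since the patent does not specify the insert-command notation:

```python
# Illustrative sketch of the data-merge step: personal data from a look-up
# table is substituted into placeholders embedded in a generic script,
# producing a custom script per individual. All names here are assumptions.
import re

lookup_table = {
    "ID-001": {"name": "Alice", "medication": "insulin"},
}

generic_script = "SPEAK: Hi {INSERT:name}, remember to take your {INSERT:medication} today!"

def merge_script(generic: str, individual_id: str) -> str:
    """Replace each {INSERT:field} placeholder with the individual's data."""
    data = lookup_table[individual_id]
    return re.sub(r"\{INSERT:(\w+)\}", lambda m: data[m.group(1)], generic)

custom = merge_script(generic_script, "ID-001")
print(custom)  # SPEAK: Hi Alice, remember to take your insulin today!
```

The generic script acts as the template; only the placeholders differ between the custom scripts generated for different individuals.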
Referring again to FIG. 2, script assignor 40 assigns script programs 32 to the individuals. Script programs 32 are assigned in accordance with script assignment information entered through workstation 20. The script assignment information is entered through a script assignment screen 44, which is preferably implemented as a web page on server 18.
FIG. 7 illustrates a sample script assignment screen 44 as it appears on workstation 20. Screen 44 includes check boxes 94 for selecting a script program to be assigned and check boxes 96 for selecting the individuals to whom the script program is to be assigned. Screen 44 also includes an ASSIGN SCRIPT button 100 for entering the assignments. When button 100 is pressed, the script assignor creates and stores for each individual selected in check boxes 96 a respective pointer to the script program selected in check boxes 94. Each pointer is stored in look-up table 34 of database 30. Screen 44 further includes an ADD SCRIPT button 98 for adding a new script program and a DELETE SCRIPT button 102 for deleting a script program.
The operation of the preferred embodiment is illustrated in FIGS. 1 - 10. FIG. 8 is a flow chart illustrating the steps included in the software application executed by server 18. In step 202, server 18 determines if new script information has been entered through script entry screen 42. If new script information has not been entered, server 18 proceeds to step 206. If new script information has been entered, server 18 proceeds to step 204.
In the preferred embodiment, the script information is entered in server 18 by one or more healthcare providers, such as a physician or case manager assigned to the individuals. Of course, any person desiring to communicate with the individuals may be granted access to server 18 to create and assign script programs. Further, it is to be understood that the system may include any number of remote interfaces for entering script generation and script assignment information in server 18.
As shown in FIG. 5, the script information specifies a message, such as a set of statements or phrases, to be communicated to one or more individuals. In step 204, script generator 38 generates a script program from the information entered in screen 42. The script program is stored in database 30. Steps 202 and 204 are preferably repeated to generate multiple script programs, e.g. a script program for diabetes patients, a script program for asthma patients, etc. Each script program corresponds to a respective one of the sets of statements entered through script entry screen 42.
In step 206, server 18 determines if new script assignment information has been entered through assignment screen 44. If new script assignment information has not been entered, server 18 proceeds to step 210. If new script assignment information has been entered, server 18 proceeds to step 208. As shown in FIG. 7, the script assignment information is entered by selecting a desired script program through check boxes 94, selecting the individuals to whom the selected script program is to be assigned through check boxes 96, and pressing the ASSIGN SCRIPT button 100. When button 100 is pressed, script assignor 40 creates for each individual selected in check boxes 96 a respective pointer to the script program selected in check boxes 94. In step 208, each pointer is stored in look-up table 34 of database 30.
In step 210, server 18 determines if any one of the talking toys is remotely connected to the server. Each individual is preferably provided with his or her own talking toy which has the individual's unique identification code stored therein. Each individual is thus uniquely associated with a respective one of the talking toys. If none of the talking toys is connected, server 18 returns to step 202. If a talking toy is connected, server 18 receives from the talking toy the individual's unique identification code in step 212. Server 18 uses the received identification code to retrieve from table 34 the pointer to the script program assigned to the individual. In step 214, server 18 retrieves the assigned script program from database 30. In step 216, server 18 transmits the assigned script program to the individual's talking toy through communication network 24. Following step 216, the server returns to step 202. Each talking toy is initially programmed with its user's unique identification code, the script interpreter used by the toy to interpret and execute script program commands, and a control program executed by the toy to control its overall operation. The initial programming may be achieved during manufacture or during an initial connection to server 18.
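The server-side flow of FIG. 8 (steps 202 through 216) can be sketched as one pass through the loop. The function and variable names below (server_step, scripts, lookup_table, and the dictionary-based toy stub) are illustrative assumptions, not terms disclosed in the patent.

```python
# Illustrative sketch of one pass through the server loop of FIG. 8
# (steps 202-216); all names are assumptions for illustration.
scripts = {}       # script name -> script program (database 30)
lookup_table = {}  # identification code -> assigned script name (table 34)

def server_step(entry, assignment, toy):
    """One iteration: script entry, script assignment, toy connection."""
    if entry is not None:                          # step 202
        name, statements = entry
        # step 204: each statement becomes a speech command
        scripts[name] = [("SPEECH", s) for s in statements]
    if assignment is not None:                     # step 206
        script_name, individuals = assignment
        for ident in individuals:                  # step 208
            lookup_table[ident] = script_name      # pointer in table 34
    if toy is not None:                            # step 210
        ident = toy["id"]                          # step 212: receive code
        program = scripts[lookup_table[ident]]     # step 214: retrieve script
        toy["received"] = program                  # step 216: transmit
```

In the patent the loop repeats indefinitely; here each call models a single traversal of the flow chart.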
FIG. 9 illustrates the steps included in the control program executed by microcontroller 56 of talking toy 26. In step 302, microcontroller 56 determines if any user input has been received. In the preferred embodiment, user input is received through control buttons 50 and 52. Control button 50 is pressed to instruct the talking toy to speak, and control button 52 is pressed to instruct the toy to connect to the server and download a new script program. If no user input is received for a predetermined period of time, such as two minutes, toy 26 enters sleep mode in step 304. The sleep mode conserves battery power while the toy is not in use. Following step 304, microcontroller 56 returns to step 302 and awaits user input.
If user input has been received, microcontroller 56 determines if the input is a speech request, step 306. If the user has pressed control button 50 requesting speech, microcontroller 56 executes the script program last received from the server, step 308. The steps included in a sample script program are shown in the flow chart of FIG. 10 and will be discussed below. Following step 308, microcontroller 56 returns to step 302 and awaits new user input.
If the user presses control button 52 requesting a connection to the server, microcontroller 56 attempts to establish a communication link to the server through modem 85 and communication network 24, step 310. In step 312, microcontroller 56 determines if the connection was successful. If the connection failed, the user is prompted to connect toy 26 to telephone jack 22 in step 314. Microcontroller 56 preferably prompts the user by synthesizing the phrase "PLEASE CONNECT ME TO THE TELEPHONE JACK USING THE CONNECTION CORD AND SAY 'DONE' WHEN YOU HAVE FINISHED." In step 316, microcontroller 56 waits until the appropriate reply is received through microphone 66. Upon recognizing the reply 'DONE', microcontroller 56 repeats step 310 to establish a successful connection to the server.
Once a successful connection is established, microcontroller 56 transmits the unique identification code stored in memory 58 to server 18 in step 318. In step 320, microcontroller 56 receives a new script program from the server through communication network 24 and modem 85. The new script program is stored in memory 58 for subsequent execution by microcontroller 56. Following step 320, microcontroller 56 returns to step 302 and awaits new user input.
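The control program of FIG. 9 can be sketched as a small event handler. The TalkingToy class, the server interface methods, and the button labels below are illustrative assumptions; the patent describes the behavior only at the flow-chart level.

```python
# Minimal sketch of the toy control program of FIG. 9; class, method,
# and button names are assumptions for illustration.
class TalkingToy:
    def __init__(self, ident, server):
        self.ident = ident      # unique identification code (memory 58)
        self.server = server
        self.script = []        # last script program received (step 320)

    def handle_input(self, button):
        if button == "SPEAK":       # control button 50: steps 306-308
            return self.script      # execute the stored script program
        if button == "CONNECT":     # control button 52: steps 310-320
            self.server.receive_id(self.ident)                 # step 318
            self.script = self.server.script_for(self.ident)   # step 320
            return None
```

The sleep-mode timeout of step 304 and the connection-retry prompt of steps 312 through 316 are omitted for brevity.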
FIG. 10 is a flow chart illustrating the steps included in a sample script program executed by microcontroller 56. In step 402, microcontroller 56 prompts the user by synthesizing through speaker 64 "SAY 'OK' WHEN YOU ARE READY". In step 404, microcontroller 56 waits until a reply to the prompt is received through microphone 66. When the reply 'OK' is recognized, microcontroller 56 proceeds to step 406. If no reply is received within a predetermined period of time, such as two minutes, toy 26 preferably enters sleep mode until it is reactivated by pressing one of the control buttons.
In step 406, microcontroller 56 executes successive speech commands to synthesize through speaker 64 the phrases or statements specified in the script program. Referring again to FIG. 6, the speech commands are preferably separated by delay commands which instruct microcontroller 56 to pause for a number of seconds between statements. The number of seconds is selected to allow the user sufficient time to absorb each statement. Alternatively, the user may be prompted to acknowledge each statement before a subsequent statement is synthesized. For example, the script program may include commands which instruct microcontroller 56 to synthesize the phrase "SAY 'OK' WHEN YOU ARE READY TO HEAR THE NEXT STATEMENT." Upon recognizing the reply 'OK', microcontroller 56 proceeds to the next speech command in the script program.
In step 408, the user is reminded to connect toy 26 to telephone jack 22 to download a new script program. Microcontroller 56 synthesizes through speaker 64 "PLEASE CONNECT ME TO THE TELEPHONE JACK TO GET NEW MESSAGES." Following step 408, the script program ends.
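The sample script of FIG. 10 can be modeled as a small interpreter over a command list. The (command, argument) tuple format and the callback names are assumptions for illustration; the patent does not disclose the script language's internal representation.

```python
# Hypothetical interpreter for a script program such as the one in FIG. 10.
# The (command, argument) tuple format is an assumed representation.
def run_script(program, speak, wait_for, pause):
    for command, arg in program:
        if command == "SPEECH":
            speak(arg)        # synthesize the phrase through speaker 64
        elif command == "DELAY":
            pause(arg)        # pause between statements (step 406)
        elif command == "WAIT":
            wait_for(arg)     # wait for a spoken reply, e.g. 'OK' (step 404)

# A script mirroring steps 402-408:
sample = [
    ("SPEECH", "SAY 'OK' WHEN YOU ARE READY"),
    ("WAIT", "OK"),
    ("SPEECH", "REMEMBER TO TAKE YOUR MEDICATION"),
    ("DELAY", 5),
    ("SPEECH", "PLEASE CONNECT ME TO THE TELEPHONE JACK TO GET NEW MESSAGES"),
]
```

On the toy, the speak, wait_for, and pause callbacks would be bound to the speech synthesizer, the speech recognizer, and a timer, respectively.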
One advantage of the system of the present invention is that it allows each talking toy to be programmed remotely through the use of script programs. This allows the messages delivered by each talking toy to be tailored to the specific needs of an individual user or group of users. Moreover, each script program may be easily created, assigned, and downloaded by simply accessing a server through a communication network, such as the Internet.
Thus, the invention provides a powerful, convenient, and inexpensive system for communicating messages to a large number of individuals.
FIGS. 11 - 15 illustrate a second embodiment of the invention in which messages are further customized to each individual by merging personal data with the script programs, much like a standard mail merge application. Referring to FIG. 11, personal data relating to each individual is preferably stored in look-up table 34 of database 30. By way of example, the data may include each individual's name, the name of each individual's medication or disease, or any other desired data. As in the preferred embodiment, database 30 also stores generic script programs 31 created by script generator 38.
In the second embodiment, server 18 includes a data merge program 41 for merging the data stored in table 34 with generic script programs 31. Data merge program 41 is designed to retrieve selected data from table 34 and to insert the data into statements in generic script programs 31, thus creating custom script programs 33. Each custom script program contains a message which is customized to an individual. For example, the message may be customized with the individual's name, medication name, disease name, etc.
The operation of the second embodiment is illustrated in FIGS. 11 - 15 and is similar to the operation of the preferred embodiment, except that server 18 transmits custom script programs to each talking toy rather than generic script programs. FIG. 15 is a flow chart illustrating the steps included in a software application executed by server 18 according to the second embodiment.
In step 502, server 18 determines if new script information has been entered through script entry screen 42. If new script information has not been entered, server 18 proceeds to step 506. If new script information has been entered, server 18 proceeds to step 504. As shown in FIG. 12, the script information specifies a message, such as a set of statements or phrases, to be communicated to the individuals. Each statement preferably includes one or more insert commands specifying data from table 34 to be inserted into the statement. The insert commands instruct data merge program 41 to retrieve the specified data from database 30 and to insert the data into the statement. For example, the first statement shown in FIG. 12 includes insert commands instructing the data merge program to insert a patient name and a medication name into the statement.
Following entry of the statements and insert commands, CREATE SCRIPT button 90 is pressed. When button 90 is pressed, script generator 38 generates a generic script program from the information entered in screen 42, step 504. A sample generic script program is illustrated in FIG. 13. The generic script program includes speech commands to synthesize the statements entered in fields 88. Each statement preferably includes one or more insert commands specifying data to be inserted into the script program. The generic script program is stored in database 30.
In step 506, server 18 determines if new script assignment information has been entered through assignment screen 44. If new script assignment information has not been entered, server 18 proceeds to step 512. If new script assignment information has been entered, server 18 proceeds to step 508. As shown in FIG. 7, the script assignment information is entered by selecting a desired script program through check boxes 94, selecting the individuals to whom the selected script program is to be assigned through check boxes 96, and pressing the ASSIGN SCRIPT button 100.
When button 100 is pressed, data merge program 41 creates a custom script program for each individual selected in check boxes 96, step 508. Each custom script program is preferably created by using the selected generic script program as a template. For each individual selected, data merge program 41 retrieves from database 30 the data specified in the insert commands. Next, data merge program 41 inserts the data into the appropriate statements in the generic script program to create a custom script program for the individual. For example, FIG. 14 illustrates a custom script program created from the generic script program of FIG. 13. Each custom script program is stored in database 30.
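The substitution performed by data merge program 41 resembles a standard mail merge. The sketch below models insert commands as Python format placeholders, which is an assumption about the command syntax; the merge function name and the tuple-based script representation are likewise illustrative.

```python
# Hedged sketch of data merge program 41 (step 508): fill each statement's
# insert commands with the individual's data from look-up table 34.
# Modeling insert commands as {field} placeholders is an assumption.
def merge(generic_script, record):
    """Create a custom script program from a generic one (the template)."""
    return [(cmd, text.format(**record)) for cmd, text in generic_script]
```

Applied to a generic statement such as "HELLO {name}, DID YOU TAKE YOUR {medication} TODAY?" with an individual's record, this yields the customized statement of the kind shown in FIG. 14.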
As each custom script program is generated for an individual, script assignor 40 assigns the custom script program to the individual, step 510. This is preferably accomplished by creating a pointer to the custom script program and storing the pointer with the individual's unique identification code in table 34. In step 512, server 18 determines if any one of the talking toys is remotely connected to the server. If a talking toy is connected, server 18 receives from the talking toy the individual's unique identification code in step 514.
Server 18 uses the received identification code to retrieve from table 34 the pointer to the custom script program assigned to the individual. In step 516, server 18 retrieves the custom script program from database 30. In step 518, server 18 transmits the custom script program to the individual's talking toy. The talking toy receives and executes the script program in the same manner described in the preferred embodiment. The remaining operation of the second embodiment is analogous to the operation of the preferred embodiment described above.
Although it is presently preferred to generate a custom script program for each individual as soon as script assignment information is received for the individual, it is also possible to wait until the individual's talking toy connects to the server before generating the custom script program. This is accomplished by creating and storing a pointer to the generic script program assigned to the individual, as previously described in the preferred embodiment. When the individual's talking toy connects to the server, the data merge program creates a custom script program for the individual from the generic script program assigned to the individual. The custom script program is then transmitted to the individual's talking toy for execution.
Although the first and second embodiments focus on healthcare applications, the system of the present invention may be used for any messaging application. For example, the system is particularly well suited for advertising. In a third embodiment of the invention, an advertising service is provided with a remote interface to the server for creating and assigning script programs which contain advertising messages. As shown in FIG. 16, each advertising message may be conveniently entered through script entry screen 42, like the health-related messages of the preferred embodiment. The operation of the third embodiment is analogous to the operation of the preferred embodiment, except that the talking toys communicate advertising messages rather than health-related messages.
Of course, the system of the present invention has many other applications. Typically, the user of each talking toy is a child. In a fourth embodiment of the invention, the child's parent or guardian is provided with a remote interface to the server for creating and assigning script programs which contain messages for the child. As shown in FIG. 17, each message may be conveniently entered through script entry screen 42. The operation of the fourth embodiment is analogous to the operation of the preferred embodiment, except that script information is entered in the server by a parent or guardian rather than a healthcare provider. Alternatively, the child may be provided with a remote interface to the server to create and assign his or her own script programs.
It should also be noted that script programs may be generated from information received from multiple sources, such as a healthcare provider, an advertiser, and a parent. In a fifth embodiment of the invention, the script entry screen includes a respective section for each of the sources to enter a message to be communicated. Each of the sources is provided with a remote interface to the server and a password for accessing the script entry screen. After each source has entered one or more messages in the server, a script program is generated which contains a combination of health-related messages, advertisements, educational messages, or entertainment messages. The remaining operation of the fifth embodiment is analogous to the operation of the preferred embodiment described above.
SUMMARY, RAMIFICATIONS, AND SCOPE
Although the above description contains many specificities, these should not be construed as limitations on the scope of the invention but merely as illustrations of some of the presently preferred embodiments. Many other embodiments of the invention are possible. For example, the scripting language and script commands shown are representative of the preferred embodiment. It will be apparent to one skilled in the art that many other scripting languages and specific script commands may be used to implement the invention.
Moreover, the talking toys of the present invention need not be embodied as dolls. The toys may be embodied as action figures, robots, or any other type of toy. Further, each talking toy need not include a control button for triggering speech output. In alternative embodiments, speech is triggered by other mechanisms, such as voice prompts, the absence of the user's voice, position sensitive sensors, switches, or the like. Specific techniques for triggering speech in a talking toy are well known in the art.
In addition, the system of the present invention is not limited to healthcare applications. The system may be used in any application which involves the communication of messages, including advertising, education, or entertainment. Of course, various combinations of these applications are also possible. For example, messages from multiple sources may be combined to generate script programs which contain a combination of health-related messages, advertisements, or educational messages. Further, the system may include any number of remote interfaces for entering and assigning script programs, and any number of talking toys for delivering messages.
Therefore, the scope of the invention should be determined not by the examples given, but by the appended claims and their legal equivalents.

Claims

What is claimed is:
1. A remotely programmable talking toy comprising: a) communication means for establishing a communication link to a server through a communication network and for receiving from the server a script program executable by the toy to communicate a message to an individual; b) speech synthesis means for communicating the message to the individual; c) memory means for storing the script program; and d) control means connected to the communication means, the speech synthesis means, and the memory means for executing the script program.
2. The talking toy of claim 1, wherein the communication means comprises a modem for establishing the communication link to the server via the Internet.
3. The talking toy of claim 1, wherein the speech synthesis means includes means for prompting the individual, and wherein the talking toy further includes speech recognition means connected to the control means for recognizing responses spoken by the individual.
4. The talking toy of claim 1, wherein the talking toy comprises a doll, an action figure, or a robot.
5. A system for communicating information to an individual, the system comprising: a) a server; b) a remote interface connected to the server for specifying a message to be communicated to the individual; and c) a remotely programmable talking toy for communicating the message to the individual, the talking toy being networked to the server via a communication network; wherein the server includes a script generating means for generating a script program executable by the talking toy to communicate the message to the individual; and wherein the talking toy comprises: i) a communication means for receiving the script program from the server through the communication network; ii) a speech synthesis means for communicating the message to the individual; iii) a memory means for storing the script program; and iv) a control means connected to the communication means, the speech synthesis means, and the memory means for executing the script program.
6. The system of claim 5, wherein the server further includes database means connected to the script generating means for storing data relating to the individual, and wherein the script generating means includes means for inserting the data into the script program to customize the message to the individual.
7. The system of claim 5, wherein the server comprises a web server having a web page for entry of the message, and wherein the remote interface is connected to the web server via the Internet.
8. The system of claim 5, wherein the communication means of the talking toy comprises a modem for establishing the communication link to the server via the Internet.
9. The system of claim 5, wherein the speech synthesis means of the talking toy includes means for prompting the individual, and wherein the talking toy further includes speech recognition means connected to the control means for recognizing responses spoken by the individual.
10. The system of claim 5, wherein the talking toy comprises a doll, an action figure, or a robot.
11. The system of claim 5, further comprising a plurality of remotely programmable talking toys networked to the server for communicating information to a corresponding plurality of individuals, wherein the server includes database means for storing a plurality of script programs, the remote interface includes means for entering in the server script assignment information, the server includes script assignment means connected to the database means for assigning to each of the plurality of individuals at least one of the plurality of script programs in accordance with the script assignment information, and the database means further includes means for storing a list of the plurality of individuals, and for each of the plurality of individuals, a respective pointer to the at least one of the plurality of script programs assigned to each of the plurality of individuals.
12. A method for communicating information to an individual, comprising the steps of: a) providing the individual with a talking toy comprising: i) a communication means for establishing a communication link to a server through a communication network and for receiving from the server a script program executable by the toy to communicate a message to the individual; ii) a speech synthesis means for communicating the message to the individual; iii) a memory means for storing the script program; and iv) a control means connected to the communication means, the speech synthesis means, and the memory means for executing the script program; b) specifying in the server the message to be communicated to the individual; c) generating the script program in the server; d) transmitting the script program from the server to the talking toy through the communication network; and e) executing the script program in the talking toy to communicate the message to the individual.
13. The method of claim 12, wherein the step of transmitting the script program from the server to the talking toy is preceded by the steps of: f) storing in the server data relating to the individual; and g) inserting the data into the script program to customize the message to the individual.
14. The method of claim 12, wherein the server comprises a web server having a web page for entry of the message, and wherein the message is specified in the server by accessing the web page through the Internet and entering the message in the web page.
15. The method of claim 12, wherein the step of transmitting the script program from the server to the talking toy comprises establishing a communication link between the server and the talking toy through the Internet and transmitting the script program through the communication link.
16. The method of claim 12, wherein the talking toy comprises a doll, an action figure, or a robot.
17. The method of claim 12, further comprising the steps of: a) providing a plurality of individuals with a corresponding plurality of talking toys such that each of the plurality of individuals is associated with a respective one of the plurality of talking toys; b) generating in the server a plurality of script programs; c) assigning to each of the plurality of individuals at least one of the plurality of script programs; d) storing in the server the plurality of script programs, a list of the plurality of individuals, and for each of the plurality of individuals, a respective pointer to the at least one of the plurality of script programs assigned to each of the plurality of individuals; and e) transmitting to each of the plurality of talking toys the at least one of the plurality of script programs assigned to each of the plurality of individuals associated with the respective plurality of talking toys.
18. The talking toy of claim 1, further comprising a clock and a signaling unit, wherein the script program received by the talking toy from said server is stored in said memory means for subsequent execution by said control means, and said signaling unit is adapted for signaling a user when the script program is executed by said control means.
19. The talking toy of claim 18, further comprising a user interface means, wherein said user interface means is adapted to specify a time at which a script program is executed by said control means, and said signaling unit signals the user at a time when the script program is executed by said control means.
20. The talking toy of claim 1, further comprising first and second control buttons, wherein said first control button is adapted for instructing the toy to execute a script program, and said second control button is adapted to initiate establishment of a communication link between the talking toy and said server.
21. The talking toy of claim 18, wherein said signaling unit comprises a light emitting diode, flashing eyes, a vibration means, or speech synthesis.
22. The talking toy of claim 1, wherein the talking toy is capable of transmitting data and receiving data over a communication network via said communication means.
23. The talking toy of claim 1, wherein the talking toy transmits user identification code data to said server via said communication means.
PCT/US1998/021215 1997-10-07 1998-10-07 Remotely programmable talking toy WO1999017854A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU97921/98A AU9792198A (en) 1997-10-07 1998-10-07 Remotely programmable talking toy

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US94452997A 1997-10-07 1997-10-07
US08/944,529 1997-10-07

Publications (1)

Publication Number Publication Date
WO1999017854A1 true WO1999017854A1 (en) 1999-04-15

Family

ID=25481584

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/021215 WO1999017854A1 (en) 1997-10-07 1998-10-07 Remotely programmable talking toy

Country Status (2)

Country Link
AU (1) AU9792198A (en)
WO (1) WO1999017854A1 (en)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1072297A1 (en) * 1998-12-24 2001-01-31 Sony Corporation Information processor, portable device, electronic pet device, recorded medium on which information processing procedure is recorded, and information processing method
WO2001012285A1 (en) * 1999-08-19 2001-02-22 Kidkids, Inc. Networked toys
WO2001020586A2 (en) * 1999-09-14 2001-03-22 Aisynth Entertainment Inc. Smart toys
WO2001030047A2 (en) * 1999-10-20 2001-04-26 Curo Interactive Incorporated Audio prompted, binary sensors user interface for a consumer electronic device
WO2001076714A1 (en) * 2000-04-07 2001-10-18 Valagam Rajagopal Raghunathan Intelligent toy
EP1175929A2 (en) * 2000-07-26 2002-01-30 Deutsche Telekom AG Toy with link to an external database
KR20020041483A (en) * 2000-11-28 2002-06-03 박동근 Apparatus and Method for a service of the internet education through a separate type of external equipment
KR20020062057A (en) * 2001-01-19 2002-07-25 (주)리딩엣지 A terminal which can recognize the outer environments and an information offering system and an information offering method using the terminal
WO2002087717A1 (en) * 2001-04-26 2002-11-07 4Kids Entertainment Licensing, Inc. (Formerly Known As Leisure Concepts, Inc.) Talking toys
KR20020097477A (en) * 2001-06-21 2002-12-31 현명택 Intelligent robort toy based on internet and making method for intelligent robort toy
KR100396751B1 (en) * 2000-08-17 2003-09-02 엘지전자 주식회사 Scholarship/growth system and method for toy using web server
KR100396755B1 (en) * 2000-08-18 2003-09-02 엘지전자 주식회사 Toy performance apparatus and method using chatting
US6631351B1 (en) 1999-09-14 2003-10-07 Aidentity Matrix Smart toys
EP1405699A1 (en) * 2002-10-04 2004-04-07 Fujitsu Limited Freely moving robot arranged to locate a user and make an emergency call on user's request
US6771982B1 (en) 1999-10-20 2004-08-03 Curo Interactive Incorporated Single action audio prompt interface utlizing binary state time domain multiple selection protocol
EP1641545A2 (en) * 2003-06-09 2006-04-05 Palwintec Systems Ltd. Story-telling doll
WO2008132489A1 (en) * 2007-04-30 2008-11-06 Sony Computer Entertainment Europe Limited Interactive media
US8690796B2 (en) 2002-04-19 2014-04-08 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US8795022B2 (en) 2007-07-19 2014-08-05 Hydrae Limited Interacting toys
GB2511479A (en) * 2012-12-17 2014-09-10 Librae Ltd Interacting toys
US8845550B2 (en) 2001-06-12 2014-09-30 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US8905945B2 (en) 2002-04-19 2014-12-09 Dominique M. Freeman Method and apparatus for penetrating tissue
US8945910B2 (en) 2003-09-29 2015-02-03 Sanofi-Aventis Deutschland Gmbh Method and apparatus for an improved sample capture device
US8965476B2 (en) 2010-04-16 2015-02-24 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US9034639B2 (en) 2002-12-30 2015-05-19 Sanofi-Aventis Deutschland Gmbh Method and apparatus using optical techniques to measure analyte levels
US9072842B2 (en) 2002-04-19 2015-07-07 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9089678B2 (en) 2002-04-19 2015-07-28 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9089294B2 (en) 2002-04-19 2015-07-28 Sanofi-Aventis Deutschland Gmbh Analyte measurement device with a single shot actuator
CN104898584A (en) * 2014-03-07 2015-09-09 摩豆科技有限公司 Method and device for controlling dolls with applicable programs
US9144401B2 (en) 2003-06-11 2015-09-29 Sanofi-Aventis Deutschland Gmbh Low pain penetrating member
US9226699B2 (en) 2002-04-19 2016-01-05 Sanofi-Aventis Deutschland Gmbh Body fluid sampling module with a continuous compression tissue interface surface
CN105278477A (en) * 2014-06-19 2016-01-27 摩豆科技有限公司 Method and device for operating interactive doll
US9248267B2 (en) 2002-04-19 2016-02-02 Sanofi-Aventis Deustchland Gmbh Tissue penetration device
US9261476B2 (en) 2004-05-20 2016-02-16 Sanofi Sa Printable hydrogel for biosensors
US9314194B2 (en) 2002-04-19 2016-04-19 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US9351680B2 (en) 2003-10-14 2016-05-31 Sanofi-Aventis Deutschland Gmbh Method and apparatus for a variable user interface
US9375169B2 (en) 2009-01-30 2016-06-28 Sanofi-Aventis Deutschland Gmbh Cam drive for managing disposable penetrating member actions with a single motor and motor and control system
US9386944B2 (en) 2008-04-11 2016-07-12 Sanofi-Aventis Deutschland Gmbh Method and apparatus for analyte detecting device
US9427532B2 (en) 2001-06-12 2016-08-30 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US9560993B2 (en) 2001-11-21 2017-02-07 Sanofi-Aventis Deutschland Gmbh Blood testing apparatus having a rotatable cartridge with multiple lancing elements and testing means
US9561000B2 (en) 2003-12-31 2017-02-07 Sanofi-Aventis Deutschland Gmbh Method and apparatus for improving fluidic flow and sample capture
US9795747B2 (en) 2010-06-02 2017-10-24 Sanofi-Aventis Deutschland Gmbh Methods and apparatus for lancet actuation
US9820684B2 (en) 2004-06-03 2017-11-21 Sanofi-Aventis Deutschland Gmbh Method and apparatus for a fluid sampling device
US9839386B2 (en) 2002-04-19 2017-12-12 Sanofi-Aventis Deustschland Gmbh Body fluid sampling device with capacitive sensor

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4846693A (en) 1987-01-08 1989-07-11 Smith Engineering Video based instructional and entertainment system using animated figure
US4840602A (en) 1987-02-06 1989-06-20 Coleco Industries, Inc. Talking doll responsive to external signal
EP0508912A1 (en) * 1991-04-12 1992-10-14 Info Telecom Method and means for materialising a virtual interaction between an objectand an information stand
WO1994008677A1 (en) * 1992-10-19 1994-04-28 Jeffrey Scott Jani Video and radio controlled moving and talking device
EP0606790A2 (en) * 1992-12-08 1994-07-20 Steven Lebensfeld Subject specific, word/phrase selectable, message delivering doll or action figure
US5607336A (en) 1992-12-08 1997-03-04 Steven Lebensfeld Subject specific, word/phrase selectable message delivering doll or action figure

Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1072297A1 (en) * 1998-12-24 2001-01-31 Sony Corporation Information processor, portable device, electronic pet device, recorded medium on which information processing procedure is recorded, and information processing method
EP1072297A4 (en) * 1998-12-24 2005-12-14 Sony Corp Information processor, portable device, electronic pet device, recorded medium on which information processing procedure is recorded, and information processing method
WO2001012285A1 (en) * 1999-08-19 2001-02-22 Kidkids, Inc. Networked toys
US6631351B1 (en) 1999-09-14 2003-10-07 Aidentity Matrix Smart toys
WO2001020586A2 (en) * 1999-09-14 2001-03-22 Aisynth Entertainment Inc. Smart toys
WO2001020586A3 (en) * 1999-09-14 2002-07-11 Aisynth Entertainment Inc Smart toys
WO2001030047A2 (en) * 1999-10-20 2001-04-26 Curo Interactive Incorporated Audio prompted, binary sensors user interface for a consumer electronic device
WO2001030047A3 (en) * 1999-10-20 2001-09-20 Curo Interactive Inc Audio prompted, binary sensors user interface for a consumer electronic device
US6771982B1 (en) 1999-10-20 2004-08-03 Curo Interactive Incorporated Single action audio prompt interface utilizing binary state time domain multiple selection protocol
WO2001076714A1 (en) * 2000-04-07 2001-10-18 Valagam Rajagopal Raghunathan Intelligent toy
EP1175929A2 (en) * 2000-07-26 2002-01-30 Deutsche Telekom AG Toy with link to an external database
EP1175929A3 (en) * 2000-07-26 2003-08-13 Funtel GmbH Toy with link to an external database
KR100396751B1 (en) * 2000-08-17 2003-09-02 엘지전자 주식회사 Scholarship/growth system and method for toy using web server
KR100396755B1 (en) * 2000-08-18 2003-09-02 엘지전자 주식회사 Toy performance apparatus and method using chatting
KR20020041483A (en) * 2000-11-28 2002-06-03 박동근 Apparatus and Method for a service of the internet education through a separate type of external equipment
KR20020062057A (en) * 2001-01-19 2002-07-25 (주)리딩엣지 A terminal which can recognize the outer environments and an information offering system and an information offering method using the terminal
WO2002087717A1 (en) * 2001-04-26 2002-11-07 4Kids Entertainment Licensing, Inc. (Formerly Known As Leisure Concepts, Inc.) Talking toys
US9802007B2 (en) 2001-06-12 2017-10-31 Sanofi-Aventis Deutschland Gmbh Methods and apparatus for lancet actuation
US9694144B2 (en) 2001-06-12 2017-07-04 Sanofi-Aventis Deutschland Gmbh Sampling module device and method
US9427532B2 (en) 2001-06-12 2016-08-30 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US8845550B2 (en) 2001-06-12 2014-09-30 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
KR20020097477A (en) * 2001-06-21 2002-12-31 현명택 Intelligent robot toy based on internet and making method for intelligent robot toy
US9560993B2 (en) 2001-11-21 2017-02-07 Sanofi-Aventis Deutschland Gmbh Blood testing apparatus having a rotatable cartridge with multiple lancing elements and testing means
US8905945B2 (en) 2002-04-19 2014-12-09 Dominique M. Freeman Method and apparatus for penetrating tissue
US9072842B2 (en) 2002-04-19 2015-07-07 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9839386B2 (en) 2002-04-19 2017-12-12 Sanofi-Aventis Deutschland Gmbh Body fluid sampling device with capacitive sensor
US9226699B2 (en) 2002-04-19 2016-01-05 Sanofi-Aventis Deutschland Gmbh Body fluid sampling module with a continuous compression tissue interface surface
US9795334B2 (en) 2002-04-19 2017-10-24 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9724021B2 (en) 2002-04-19 2017-08-08 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9498160B2 (en) 2002-04-19 2016-11-22 Sanofi-Aventis Deutschland Gmbh Method for penetrating tissue
US9248267B2 (en) 2002-04-19 2016-02-02 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US9314194B2 (en) 2002-04-19 2016-04-19 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US8690796B2 (en) 2002-04-19 2014-04-08 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9186468B2 (en) 2002-04-19 2015-11-17 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9089678B2 (en) 2002-04-19 2015-07-28 Sanofi-Aventis Deutschland Gmbh Method and apparatus for penetrating tissue
US9089294B2 (en) 2002-04-19 2015-07-28 Sanofi-Aventis Deutschland Gmbh Analyte measurement device with a single shot actuator
US7447564B2 (en) 2002-10-04 2008-11-04 Fujitsu Limited Robot
EP1405699A1 (en) * 2002-10-04 2004-04-07 Fujitsu Limited Freely moving robot arranged to locate a user and make an emergency call on user's request
US9034639B2 (en) 2002-12-30 2015-05-19 Sanofi-Aventis Deutschland Gmbh Method and apparatus using optical techniques to measure analyte levels
EP1641545A2 (en) * 2003-06-09 2006-04-05 Palwintec Systems Ltd. Story-telling doll
EP1641545A4 (en) * 2003-06-09 2008-03-05 Unity Interactive Llc Story-telling doll
US9144401B2 (en) 2003-06-11 2015-09-29 Sanofi-Aventis Deutschland Gmbh Low pain penetrating member
US10034628B2 (en) 2003-06-11 2018-07-31 Sanofi-Aventis Deutschland Gmbh Low pain penetrating member
US8945910B2 (en) 2003-09-29 2015-02-03 Sanofi-Aventis Deutschland Gmbh Method and apparatus for an improved sample capture device
US9351680B2 (en) 2003-10-14 2016-05-31 Sanofi-Aventis Deutschland Gmbh Method and apparatus for a variable user interface
US9561000B2 (en) 2003-12-31 2017-02-07 Sanofi-Aventis Deutschland Gmbh Method and apparatus for improving fluidic flow and sample capture
US9261476B2 (en) 2004-05-20 2016-02-16 Sanofi Sa Printable hydrogel for biosensors
US9820684B2 (en) 2004-06-03 2017-11-21 Sanofi-Aventis Deutschland Gmbh Method and apparatus for a fluid sampling device
WO2008132489A1 (en) * 2007-04-30 2008-11-06 Sony Computer Entertainment Europe Limited Interactive media
US8636558B2 (en) 2007-04-30 2014-01-28 Sony Computer Entertainment Europe Limited Interactive toy and entertainment device
US8795022B2 (en) 2007-07-19 2014-08-05 Hydrae Limited Interacting toys
US8827761B2 (en) 2007-07-19 2014-09-09 Hydrae Limited Interacting toys
US9386944B2 (en) 2008-04-11 2016-07-12 Sanofi-Aventis Deutschland Gmbh Method and apparatus for analyte detecting device
US9375169B2 (en) 2009-01-30 2016-06-28 Sanofi-Aventis Deutschland Gmbh Cam drive for managing disposable penetrating member actions with a single motor and motor and control system
US8965476B2 (en) 2010-04-16 2015-02-24 Sanofi-Aventis Deutschland Gmbh Tissue penetration device
US9795747B2 (en) 2010-06-02 2017-10-24 Sanofi-Aventis Deutschland Gmbh Methods and apparatus for lancet actuation
GB2511479A (en) * 2012-12-17 2014-09-10 Librae Ltd Interacting toys
CN104898584A (en) * 2014-03-07 2015-09-09 摩豆科技有限公司 Method and device for controlling dolls with applicable programs
CN105278477A (en) * 2014-06-19 2016-01-27 摩豆科技有限公司 Method and device for operating interactive doll

Also Published As

Publication number Publication date
AU9792198A (en) 1999-04-27

Similar Documents

Publication Publication Date Title
WO1999017854A1 (en) Remotely programmable talking toy
US7853645B2 (en) Remote generation and distribution of command programs for programmable devices
US5733131A (en) Education and entertainment device with dynamic configuration and operation
US6246927B1 (en) Inter-cooperating toys
US20020068500A1 (en) Adaptive toy system and functionality
US6290566B1 (en) Interactive talking toy
US8795022B2 (en) Interacting toys
US6684127B2 (en) Method of controlling behaviors of pet robots
US5795213A (en) Reading toy
JP3936749B2 (en) Interactive toys
US20050154594A1 (en) Method and apparatus of simulating and stimulating human speech and teaching humans how to talk
US20030115240A1 (en) Schedule managing character and information providing system and method using same
WO2001012285A9 (en) Networked toys
CN101193684A (en) Figurine using wireless communication to harness external computing power
WO2008096134A2 (en) Toy in the form of a doll
US20150161898A1 (en) Fill-in-the-blank audio-story engine
GB2511479A (en) Interacting toys
JP2003114692A (en) Providing system, terminal, toy, providing method, program, and medium for sound source data
US20050288820A1 (en) Novel method to enhance the computer using and online surfing/shopping experience and methods to implement it
WO2000044460A9 (en) Interactively programmable toy
JP2003108376A (en) Response message generation apparatus, and terminal device thereof
JPH10328421A (en) Automatically responding toy
US20040072498A1 (en) System and method for controlling toy using web
WO2004108239A2 (en) Story-telling doll
WO2005038776A1 (en) Voice controlled toy

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
NENP Non-entry into the national phase

Ref country code: KR

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA