US20020077021A1 - Toy system cooperating with Computer - Google Patents


Info

Publication number
US20020077021A1
Authority
US
Grant status
Application
Prior art keywords
data
digital contents
computer
motion
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09922883
Inventor
Soon Cho
Jung Moon
In Lee
Sang Park
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
E-PLANET Corp
Original Assignee
E-PLANET Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS, BUILDING BLOCKS
    • A63H30/00: Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02: Electrical arrangements
    • A63H30/04: Electrical arrangements using wireless transmission
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS, BUILDING BLOCKS
    • A63H2200/00: Computerized interactive toys, e.g. dolls

Abstract

Disclosed is a toy system for downloading digital contents via the internet and processing the downloaded digital contents through real-time cooperation between a computer and a toy. The computer comprises a recording medium which stores digital contents and a digital contents driving program. The toy comprises a microprocessor, a communication port, a digital sound decoder, a speaker, a motor driving unit, a motor, an LED/lighting device and an LED/lighting device control unit. The digital contents comprise image data, first and second sound data and motion control data. The digital contents driving program processes the following steps: reading the digital contents; displaying the image data of the digital contents on a screen of the computer; outputting the second sound data of the digital contents to a speaker of the computer; and transmitting the first sound data and the motion control data of the digital contents in packet form to the toy connected with the computer.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates generally to a toy for executing digital contents and, more particularly, to a toy system that downloads digital contents via the internet to a computer and processes the downloaded digital contents interactively in the computer and a toy connected to the computer. [0002]
  • 2. Description of the Prior Art [0003]
  • With the recent development of the internet, a number of digital contents providers have come to offer various digital contents. The digital contents include, for example, MP3 music files. To execute these digital contents, a digital contents driving program can be run on a computer or a digital contents player. [0004]
  • Therefore, when a user has the digital contents driving program or the digital contents player, he/she can be provided with new digital contents through access to a digital contents provider's server and execute the contents in the computer or the digital contents player. [0005]
  • Meanwhile, motor-driven toys such as robots, vehicles and the like are currently on the market, which can move by using motors installed therein or output sound by using built-in memories and speakers. These toys of the related art have small memory capacity and cannot be upgraded, so they perform only very simple motions or sounds, and children tire of them after a short time. Also, no long-term educational benefit can be expected from these toys. [0006]
  • In order to solve the foregoing problems, it has been proposed to apply internet technologies to the toys. An example of such proposals is disclosed in Korean Patent Application laid-open No. 1999-0027814, which relates to an apparatus for controlling motion of a toy. More particularly, the apparatus can control a motor driving unit to perform various motions according to a program which is transmitted from a computer after being composed therein. [0007]
  • However, according to the above-identified application, the digital contents downloaded from the computer are stored in a storage unit in the toy, and the toy is operated independently of the computer. Therefore, the toy is required to comprise at least a certain capacity of memory, and the size of the downloadable digital contents is restricted by the memory capacity installed in the toy. [0008]
  • SUMMARY OF THE INVENTION
  • Accordingly, the present invention solves the foregoing problems of the related art. Therefore, it is an object of the invention to provide a toy system which comprises a computer and a toy connected to the computer. The computer of the toy system accesses the server of a digital contents provider and downloads digital contents to a memory of the computer. When the digital contents are executed in the computer, the screen of the computer cooperates with the motion and sound of the toy on a real time base. [0009]
  • It is another object of the invention to provide a toy system which enables cooperation between a toy and a computer by transmitting motion control data or sound data from the computer to the toy without downloading the digital contents to the toy, so that a toy with small memory capacity can drive digital contents larger than the toy's memory. [0010]
  • It is still another object of the invention to provide a toy which can automatically upgrade a motion code table therein without replacement of parts of the toy or the toy itself, thereby performing various motions in response to the digital contents. [0011]
  • It is yet another object of the invention to provide a control apparatus and method which moves the mouth of a toy or a robot according to the amplitude of the sound signal outputted through the mouth, thereby enabling the mouth to move very naturally. [0012]
  • According to an embodiment of the invention to achieve the foregoing objects, a toy system is provided comprising a computer with a recording medium in which digital contents and a digital contents driving program are stored, and a toy interfaced with the computer. [0013]
  • The digital contents are stored in the memory of the computer and comprise image data, sound data and motion control data. [0014]
  • The digital contents driving program processes the steps of reading the digital contents, displaying the image data of the digital contents on a screen of the computer, and transmitting the sound data and the motion control data of the digital contents in packet form to the toy. [0015]
  • The toy is comprised of a communication port for providing an interface with the computer; a microprocessor for decoding a data packet received via the communication port, outputting sound data and a motion control signal, and controlling operations of the toy; memory for storing a program and a motion code table required for the operation of the microprocessor; a decoder for decoding the sound data from the microprocessor to output a sound signal; a motor driving unit for driving a motor in response to the motion control signal from the microprocessor; a motor installed in each part of the toy; and a speaker for outputting the sound signal. [0016]
  • According to the invention, the toy receives the control data and the sound data necessary for the operation thereof from the computer, decodes and outputs the received data in real time so that the motion and sound output of the toy is synchronized with the display image on the computer screen. [0017]
  • Preferably, the toy has a motion code table in the memory, in which the motion code table comprises character ID, motion code table version, motion codes and motor control data corresponding to each motion code. The digital contents driving program further processes the steps of: comparing the version of the motion code table of the toy with that of a motion code table stored in the computer; and if the version of the motion code table of the toy is lower than that of the computer, downloading the motion code table stored in the computer to the memory of the toy. [0018]
  • The toy can also comprise a robot and a stage unit; the robot has motors and an EEPROM in which the motion code table is written, and the robot is detachable from the stage unit. [0019]
  • The digital contents can also comprise a sound data pool, the sound data pool being composed of sound data IDs, and sound data and motion control data corresponding to each sound data ID. Also, the digital contents driving program obtains the sound data and the motion control data corresponding to the sound data ID and transmits the sound data and the motion control data to the toy. [0020]
  • Also, the toy transmits an event signal to the digital contents driving program when the input unit of the toy receives an external input. The digital contents driving program processes an operation corresponding to the event signal, which reports start, stop and other operations from the user control buttons installed in the toy, or reports whether the robot is detached. [0021]
  • According to another embodiment of the invention to achieve the foregoing objects, the present invention provides a method of controlling a motor by using a sound signal, the method comprising the steps of: (a) amplifying the sound signal; (b) removing high-frequency components from the amplified sound signal to detect an envelope of the sound signal; and (c) using the envelope as the reference signal for controlling the motor. [0022]
  • Preferably, the method further comprises the steps of converting an analog signal corresponding to the envelope of the sound signal into a digital signal, and using the converted digital signal as the reference input signal for controlling the motor. [0023]
  • The motor controlling method of the invention is applied to control the motor driving the mouth of a speaking robot or toy, and the sound signal is preferably a sound signal outputted via a speaker installed in the robot or toy. [0024]
  • The control data for operating the toy and the sound data for reciting a fairy tale are transmitted in real time instead of being downloaded from a computer, so that the toy does not require a large capacity of memory for storing the whole digital contents. Also, the motion, sound output and lighting control of the toy can be synchronized in real time with image and sound output of the computer. [0025]
  • It is to be understood that the foregoing general description of the present invention is exemplary and explanatory.[0026]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the principle of the invention. [0027]
  • FIG. 1 is a block diagram generally showing a computer and a toy system which cooperates with the computer according to the present invention; [0028]
  • FIG. 2A shows a configuration of digital contents according to the first embodiment of the present invention; [0029]
  • FIG. 2B shows a configuration of digital contents according to the second embodiment of the present invention; [0030]
  • FIG. 2C shows a configuration of a sound data pool of the digital contents according to the second embodiment of the present invention; [0031]
  • FIG. 3 is a block diagram showing a digital contents driving program of the invention; [0032]
  • FIG. 4 is a flow chart showing the operation of the digital contents driving program according to the present invention; [0033]
  • FIG. 5A through 5D show configurations of packets including sound data, motion control data, LED control data and lighting control data; [0034]
  • FIG. 6 shows a download control program in a digital contents driving program according to the invention; [0035]
  • FIG. 7 is a block diagram showing a toy system according to the invention; [0036]
  • FIG. 8 is a block diagram showing the operation of a microprocessor; [0037]
  • FIG. 9A is a flow chart showing a sequential process of a microprocessor; [0038]
  • FIG. 9B is a flow chart showing an interrupt handling routine of the microprocessor shown in FIG. 9A; [0039]
  • FIG. 10 is a circuit diagram of a motor control unit adopted in a toy according to the invention; and [0040]
  • FIG. 11 is a flow chart showing the operation of the motor control unit shown in FIG. 10. [0041]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Hereinafter, a detailed description will be provided of the configuration and operation of the digital contents, the digital contents driving program and a toy system, with reference to the accompanying drawings. [0042]
  • First, FIG. 1 is a system block diagram generally showing a toy system according to the invention. A description will be provided of the configuration and operation of the first embodiment of the toy system according to the invention with reference to FIG. 1.
  • The toy system of the invention is comprised of a computer having digital contents and a digital contents driving program, and a toy connected with the computer by either a wire or wireless interface. The toy is a movable device having motors. [0043]
  • As shown in FIG. 2A, the digital contents of the first embodiment comprise image data 210, first sound data 216, second sound data 212, motion control data 214 and LED/lighting control data 218. The data need not be written in a fixed order. [0044]
  • A user accesses a server of a digital contents provider via the internet and downloads such digital contents. Here, an updated motion code table corresponding to the character ID of the robot is downloaded together with the contents. [0045]
  • FIG. 2B shows digital contents according to the second embodiment of the invention. The digital contents include a sound data pool and a sound data ID as well as image data, second sound data, motion control data and LED/lighting control data. [0046]
  • Meanwhile, as shown in FIG. 2C, the sound data pool includes sound data IDs, and first sound data, motion control data and LED/lighting control data corresponding to each sound data ID. Here, the first sound data, the motion control data and the LED/lighting control data corresponding to one sound data ID are signals that must operate simultaneously in the toy. In other words, the sound data pool is provided in a certain area of the digital contents, and the sound data ID, functioning as a key into the sound data pool, is used in the digital contents to allow size reduction of the digital contents. [0047]
  • Again, the user can access the server of the digital contents provider via the internet and download such digital contents together with the motion code table corresponding to the character ID of the robot. The downloaded digital contents and the motion code tables are stored in a memory of the computer. The motion code table is downloaded to the memory in the toy by the digital contents driving program. [0048]
  • The image data and the second sound data are respectively outputted to a monitor and a speaker of the computer when the digital contents are executed. The first sound data, the motion control data and the LED/lighting control data are transmitted to the toy from the computer after being packeted therein. A detailed description about each of these data will be provided later in the specification. [0049]
  • If the data read from the digital contents is a sound data ID, the first sound data, the motion control data and the LED/lighting control data corresponding to the sound data ID are read from the sound data pool, packeted and transmitted from the computer to the toy. By including the sound data pool, the digital contents can be reduced in size. [0050]
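  • The sound data pool described above can be pictured as a table keyed by sound data ID, where each entry bundles the first sound data, motion control data and LED/lighting control data that must play together. The sketch below is illustrative only; all field names and byte values are assumptions, not taken from the patent.

```python
# Hypothetical sound data pool: each sound data ID expands into the full
# (first sound, motion, LED/lighting) bundle stored once in the pool.
SOUND_DATA_POOL = {
    0x01: {
        "first_sound": b"\x10\x20\x30",   # compressed speech for the toy's speaker
        "motion": [0x05],                  # motion codes to run while speaking
        "led": [(255, 0, 0, 100)],         # (R, G, B, duration-ms) lighting steps
    },
    0x02: {
        "first_sound": b"\x40\x50",
        "motion": [0x07, 0x02],
        "led": [(0, 0, 255, 200)],
    },
}

def resolve_sound_id(sound_id):
    """Expand a sound data ID found in the contents stream into the
    bundle of data that must be transmitted to the toy together."""
    entry = SOUND_DATA_POOL[sound_id]
    return entry["first_sound"], entry["motion"], entry["led"]
```

Because the contents stream then stores only the short ID instead of the full bundle, a phrase that recurs many times contributes its sound and motion data to the file only once.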
  • Meanwhile, the digital contents can further include data corresponding to encryption codes for preventing any illegal copy thereof. [0051]
  • FIG. 3 is a block diagram showing the function of a driving program 300 for reading and executing the digital contents stored in the memory of the computer, and FIG. 4 is a flow chart showing the operation of the digital contents driving program. [0052]
  • First, a description about a configuration of the driving program in reference to FIG. 3 will be provided. [0053]
  • The driving program comprises a digital contents reader 310 for reading the digital contents from the memory, a PC screen display unit 320 and a PC speaker output unit 322 for outputting data of the digital contents to a screen and a speaker according to data type, respectively, a data packet transmitting unit 324, a download control unit 330 and an input signal processing unit 340. [0054]
  • Upon receiving an event or a request signal from the toy, the input signal processing unit 340 processes the event corresponding to the input signal by using an interrupt handling routine. The type and substance of the event will be described later. [0055]
  • The operation of the download control unit 330 will also be described later. [0056]
  • Hereinafter, a description will be provided about the operation of the digital contents driving program in reference to FIG. 4. [0057]
  • First, the digital contents are read and loaded in step 410, and the digital contents are analyzed in step 420. [0058]
  • Then, if the data of the digital contents are image data in step 430, the image data are displayed on the screen of the computer in step 432. If the data are the second sound data in step 440, the second sound data are outputted to the speaker installed in the computer in step 442. [0059]
  • In sequence, if the data of the digital contents are one of the first sound data, the motion control data and the LED/lighting control data in step 450, the data are packeted and transmitted to the toy interfaced with the computer in step 452. Then, if the digital contents are terminated, the operation of the digital contents driving program ends; if not, it returns to step 420. [0060]
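  • The dispatch loop of FIG. 4 can be sketched as follows, assuming the contents are a sequence of (type, payload) records; the type names and the callback hooks are illustrative, not defined in the patent.

```python
# Minimal sketch of the driving-program loop: image and second sound data go
# to the computer's own screen and speaker, while first sound, motion and
# LED/lighting data are packeted and sent to the toy.
def run_contents(records, screen, pc_speaker, send_packet):
    for kind, payload in records:            # step 420: analyze each record
        if kind == "image":                  # step 430 -> 432
            screen(payload)                  # display on the computer screen
        elif kind == "second_sound":         # step 440 -> 442
            pc_speaker(payload)              # play on the computer's speaker
        elif kind in ("first_sound", "motion", "led_lighting"):  # step 450
            send_packet(kind, payload)       # step 452: packet to the toy
```

In a real player the three callbacks would wrap the graphics, audio and serial-port APIs of the PC; here they can be any callables.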
  • FIG. 5A to FIG. 5D show configurations of the data packet formed by the digital contents driving program. [0061]
  • FIG. 5A is a packet of motion control data, composed of a communication header and the motion control data. The motion control data are codes from the motion code table, which is set in advance for the desired operations and written in the memory of the toy. [0062]
  • FIG. 5B is a packet of lighting control data, composed of a communication header, a start RGB value, a termination RGB value, a fixed time and a number of repetitions. The lighting device sweeps from the start RGB value to the termination RGB value within the fixed time, repeated as many times as the repetition count. [0063]
  • FIG. 5C is a packet of an LED control signal, composed of a communication header, a number of repetitions, and pairs of LED ON/OFF and time fields. Each LED ON/OFF field is a bit that turns its LED on when the data is 1 and off when it is 0. Each LED is turned on/off for the given time, as many times as the repetition count, matched to a relative time from the point of receiving the command. [0064]
  • FIG. 5D is a packet of the first sound data, composed of the first sound data followed by pairs of time and control command fields. While the first sound data are decoded and processed, the packet is synchronized and the control commands are executed so as to cause the robot to speak and move. Here, the control commands are the data required for controlling the motion of the robot and the lighting at the moment the first sound data are outputted. [0065]
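  • The patent names the fields of the FIG. 5B lighting packet but not their widths or encoding. The sketch below is one plausible packing, assuming a one-byte header (its value is hypothetical), one byte per color channel, a 16-bit little-endian time in milliseconds and a one-byte repetition count.

```python
import struct

HEADER_LIGHTING = 0xB1  # hypothetical header value identifying the packet type

def pack_lighting(start_rgb, end_rgb, fixed_time_ms, repeats):
    """Serialize a lighting-control packet in the FIG. 5B field order."""
    return struct.pack(
        "<B3B3BHB",
        HEADER_LIGHTING,
        *start_rgb,       # start RGB value
        *end_rgb,         # termination RGB value
        fixed_time_ms,    # fixed time over which the colour sweeps
        repeats,          # number of repetitions of the sweep
    )

def unpack_lighting(pkt):
    """Decode a lighting-control packet back into its named fields."""
    vals = struct.unpack("<B3B3BHB", pkt)
    return {"header": vals[0], "start": vals[1:4], "end": vals[4:7],
            "time_ms": vals[7], "repeats": vals[8]}
```

The motion, LED and first-sound packets of FIG. 5A, 5C and 5D could be packed the same way, with their headers distinguishing the packet types on the toy side.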
  • FIG. 6 is a flow chart showing the operation of the download control unit. [0066]
  • First, the digital contents driving program requests the character ID of the toy and the version information of the motion code table written in the memory of the toy in step 600, and receives the character ID and the version information from the toy in step 610. [0067]
  • Then, the driving program compares the character ID and the motion code table version received from the toy with the character ID and the motion code table version which are downloaded from the internet and stored in the computer in step 620. [0068]
  • If the character IDs are the same and the motion code table of the toy has a lower version than that of the computer in step 630, the driving program transmits a download start code to the toy in step 640. Then, upon receiving a ready-for-receiving code from the toy in step 650, the driving program downloads the motion code table corresponding to the character ID to the memory installed in the toy in steps 660 and 670, and then terminates. [0069]
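  • The handshake of FIG. 6 can be sketched as below. The two control codes and the toy-side transport methods are hypothetical; the patent specifies only the order of the exchange (query, compare, start code, ready code, table transfer).

```python
DOWNLOAD_START = b"\x01"       # hypothetical download start code
READY_FOR_RECEIVING = b"\x02"  # hypothetical ready-for-receiving code

def maybe_upgrade(toy, pc_char_id, pc_version, pc_table):
    """Upgrade the toy's motion code table if the character IDs match and
    the toy's table version is older than the one stored on the PC."""
    char_id, toy_version = toy.query_id_and_version()       # steps 600-610
    if char_id != pc_char_id or toy_version >= pc_version:  # step 630
        return False                                        # nothing to do
    toy.send(DOWNLOAD_START)                                # step 640
    if toy.receive() == READY_FOR_RECEIVING:                # step 650
        toy.send(pc_table)                                  # steps 660-670
        return True
    return False
```

Keying the upgrade on the character ID lets one PC-side program serve several interchangeable robot characters, each with its own table.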
  • Hereinafter, the configuration and the operation of the toy in reference to FIG. 7 to FIG. 10 are described. [0070]
  • As shown in FIG. 7, the toy comprises a communication port 710, a microprocessor 720, a digital sound decoder 750, a speaker 754, an envelope detector 756, an A/D converter 758, a motor driving unit 760, a motor 762, an LED/lighting device driving unit 770, an LED or a lighting device 772, an input unit 740 and memories 730 and 732. [0071]
  • Here, the toy is comprised of a stage unit and a robot, in which the robot is preferably detachable from the stage unit. In this configuration, the robot can be replaced with a new robot character at any time. Also, the motion code table is varied according to the character type of the robot, and can be replaced together with the robot. [0072]
  • The stage unit includes the communication port, the microprocessor for controlling the entire operations of the toy, the memories, the digital sound decoder, the motor driving unit, the LED/lighting device driving unit, the LED/lighting device and the input unit. [0073]
  • The communication port 710 provides the wire or wireless interface between the toy and the computer. [0074]
  • The microprocessor 720 controls the entire operation of the toy and, as shown in FIG. 8, comprises a communication processor 820, a packet decoder unit 830, an input processing unit 810, a motor control unit 840, a sound data processing unit 850 and an LED/lighting device control unit 860. Hereinafter, a detailed description is provided of the operation of each component of the microprocessor. [0075]
  • After a data stream received from the communication port is stored and receipt of a packet is complete, the communication processor 820 outputs the received packet to the packet decoder unit 830, described below, or transmits data received from the input processing unit 810 to the computer. [0076]
  • The packet decoder unit 830 decodes the packet from the communication processor, and forwards it to the sound data processing unit if the packet is the first sound data, to the motor control unit if it is motion control data, and to the LED/lighting control unit if it is LED/lighting control data. [0077]
  • When an internal event or an external event from the input unit takes place, the input processing unit 810 transmits a signal corresponding to the event to the computer via the communication processor. Here, the internal events include a signal indicating whether the sound to the speaker of the toy and the motion of the toy are synchronized, and another signal indicating whether the robot is attached to or detached from the stage unit. An event from the input unit is a signal from the user control buttons installed in the toy, for example start, stop, forward and backward of the contents, start of the contents reader, and the like. [0078]
  • The motor control unit 840 reads, from the motion code table, the position data and time duration data of the motors corresponding to the motion control data from the packet decoder, derives a motion control signal from the position and time duration data, and outputs the motion control signal to the motor driving unit. [0079]
  • The sound data processing unit 850 outputs the sound data from the packet decoder to the digital sound decoder. [0080]
  • The LED/lighting control unit 860 derives an LED/lighting control signal from the LED/lighting control data outputted from the packet decoder, and then outputs the control signal to the LED/lighting device driving unit. [0081]
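  • The routing performed by the packet decoder unit 830 can be sketched as a dispatch on a packet type byte; the type values are assumptions, since the patent does not specify the on-wire encoding.

```python
# Hypothetical packet type bytes distinguishing the three packet kinds.
TYPE_FIRST_SOUND, TYPE_MOTION, TYPE_LED_LIGHTING = 0x01, 0x02, 0x03

def dispatch_packet(packet, sound_unit, motor_unit, led_unit):
    """Route a decoded packet to the sound, motor or LED/lighting path
    according to its type byte, as the packet decoder unit 830 does."""
    ptype, payload = packet[0], packet[1:]
    if ptype == TYPE_FIRST_SOUND:
        sound_unit(payload)       # -> sound data processing unit 850
    elif ptype == TYPE_MOTION:
        motor_unit(payload)       # -> motor control unit 840
    elif ptype == TYPE_LED_LIGHTING:
        led_unit(payload)         # -> LED/lighting control unit 860
```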
  • The operation and the speaking motion of the toy are controlled in response to input signals from the microprocessor to the toy. [0082]
  • Then, the memories store programs and data required for the operation of the microprocessor. Preferably, the first memory stores the program and data required for the operation of the microprocessor, and the second memory stores the motion code table. More preferably, the second memory is an EEPROM which is an electrically erasable programmable ROM. [0083]
  • The digital sound decoder 750 decodes the sound signal from the microprocessor 720; the decoded signal is amplified by the sound amplifier 752 and outputted to the speaker 754. [0084]
  • Meanwhile, the envelope detector 756 detects an envelope of the sound signal from the digital sound decoder 750, and the A/D converter 758 converts the analog envelope signal into a digital signal. The converted signal is used as the reference input signal to control the motor 762 that moves the mouth of the robot. [0085]
  • The motor driving unit 760 drives the motor 762 in response to the motion control signal from the microprocessor 720. [0086]
  • The LED/lighting device driving unit 770 controls the operation of the LED or the lighting device 772 in response to the LED/lighting control signal produced by the microprocessor 720. The LED/lighting control signal expresses the brightness, color and the like of the LED or the lighting device 772. [0087]
  • The input unit 740 consists of the user control buttons; when an external input takes place, it interrupts the microprocessor to report the input. [0088]
  • The robot comprises a plurality of motors, speakers and memories. [0089]
  • The motion code table is composed of the character ID of the toy, version of the motion code table, the motion codes and control data of the motor respectively corresponding to the motion codes. The control data include at least position data and time duration data. [0090]
  • Upon receiving the motion control data packet from the computer, the toy obtains the motion code from the packet and, using the motion code as a key, looks up the corresponding motor control data in the motion code table. The motors in the robot are controlled by using the motor control data. [0091]
  • If the motion code table has a version lower than that of the motion code table stored in the cooperating computer, a new version of motion code table is downloaded to the second memory of the robot from the computer. Therefore, the motion code table can be newly upgraded without replacing the existing robot or the toy. [0092]
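  • The motion code table described above amounts to a lookup from motion code to per-motor control data, prefixed with the character ID and version used for the upgrade check. The sketch below is illustrative; motor names, angles and durations are invented, since the patent states only that the control data include at least position and time duration.

```python
# Hypothetical motion code table as it might sit in the robot's EEPROM.
MOTION_CODE_TABLE = {
    "char_id": 0x21,   # character ID of this robot
    "version": 3,      # compared against the PC's table during upgrade
    "codes": {
        0x05: [("left_arm", 90, 500), ("right_arm", 45, 500)],  # (motor, position, ms)
        0x07: [("head", 30, 250)],
    },
}

def motor_commands(motion_code, table=MOTION_CODE_TABLE):
    """Resolve a motion code received in a packet into the motor control
    data (position and time duration per motor) stored in the table."""
    return table["codes"][motion_code]
```

Because the packet carries only the one-byte code and the table supplies the detail, a new character or richer motions need only a table download, not new firmware.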
  • The speaker 754 outputs the sound signal transmitted from the computer. [0093]
  • The motors 762 are mounted at portions of the robot to allow the limbs of the robot to move. Each motor is operated in response to the motion control signal which the microprocessor produces by using the motor control data from the motion code table. [0094]
  • The motor connected with the mouth of the robot uses an additional motor control unit, shown in FIG. 10, so that it can be controlled by the sound signal. The motor control unit comprises an amplifier 1000 for amplifying an input signal, a diode 1010 and a low-pass filter 1020. More preferably, the motor control unit further comprises an A/D converter as shown in FIG. 7. [0095]
  • FIG. 9A is a flow chart showing a sequential process of the microprocessor, and FIG. 9B is a flow chart showing a process which is interrupted at a certain time interval in the microprocessor. [0096]
  • Referring to FIG. 9A, when a new packet is received in step 910, the received packet is decoded in step 960 to inspect the data of the packet. Here, if the received packet is the sound data packet in step 962, the packet is stored in step 964. If the received packet is the motion control data packet in step 970, a motion mode is set in step 972. If the received packet is the LED/lighting control data packet in step 980, an LED/lighting mode is set in step 982. [0097]
  • If a new data packet is not received in step 910, it is checked whether the buffer of the sound decoder is full in step 920. If the buffer of the sound decoder is not full, it is checked whether receipt of the data stream has ended in step 930. If the data stream has not ended, it is checked whether restoration of the received sound data packet has finished in step 940. If restoration has not finished, the sound data are outputted in step 950; if it has, transmission of the next packet is requested. Then, the process returns to step 910. [0098]
  • FIG. 9B is an interrupt handling routine carried out at a certain time interval, separately from the sequential process routine of the microprocessor. For control of the LED and the lighting device, this routine must be executed at constant time intervals and is therefore processed as an interrupt routine. For example, after setting a timer interrupt to take place every 25 ms, A/D conversion is started in the timer interrupt. The A/D converted signal may include the output of the envelope detector or a position signal of each motor. When A/D conversion ends, an A/D-conversion-complete interrupt takes place, and the routines for motor control, LED control and lighting control are processed sequentially as shown in FIG. 9B. Therefore, motor control, LED flashing and lighting device control can be executed at an exact interval of 25 ms. [0099]
  • Further, the microprocessor of the toy also uses an external interrupt routine when an input takes place from a switch or an external input unit. Therefore, when any button of the switch is pushed, the microprocessor of the toy is interrupted, reads the switch state and transmits the state value of the switch to the computer through the communication processor. [0100]
  • The microprocessor also executes an interrupt routine whenever data are received from the computer. To be more specific, the data transmitted from the computer are received by the microprocessor via the communication port, and a receipt interrupt takes place whenever one byte is received. On each receipt interrupt, the microprocessor stores the received byte in a previously reserved buffer and, when the last byte of the packet is received, sets a flag informing of the receipt of the last byte. Then, in the flow chart of FIG. 9A, the microprocessor can confirm whether a new data packet has been received by inspecting the flag. When packet decoding is finished, the microprocessor resets the packet flag to “0”. [0101]
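  • The byte-by-byte receive interrupt and packet flag described above can be sketched as follows. The end-of-packet marker byte is an assumption; the patent says only that the last byte of a packet sets the flag.

```python
END_BYTE = 0x7E  # hypothetical byte marking the end of a packet

class Receiver:
    """Model of the receive path: bytes arrive one at a time in an
    interrupt, accumulate in a reserved buffer, and a flag tells the
    main loop of FIG. 9A that a complete packet is waiting."""

    def __init__(self):
        self.buffer = bytearray()
        self.packet_flag = False       # inspected by the main loop

    def on_byte_received(self, byte):  # the receipt interrupt handler
        self.buffer.append(byte)
        if byte == END_BYTE:           # last byte of the packet
            self.packet_flag = True

    def take_packet(self):             # called when decoding starts
        pkt, self.buffer = bytes(self.buffer), bytearray()
        self.packet_flag = False       # reset the flag to "0"
        return pkt
```

Keeping the interrupt handler to a single append plus a flag set mirrors the design intent: the time-critical 25 ms routine and the sequential loop are never blocked by communication work.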
  • Hereinafter, a detailed description will be provided about the motor control unit for moving the mouth of the robot in reference to FIG. 11. [0102]
  • First, in step 1100, the motor control unit amplifies the sound signal decoded from the first sound data, which is transmitted from the computer to the toy in packet form; the amplifier is preferably a differential amplifier. When the sound signal is lower than 0.7 volts, the diode following the amplifier outputs zero voltage, making envelope detection impossible, which is why amplification is needed first. [0103]
  • Then, in step 1110, the sound signal from the amplifier passes through the diode, which removes the negative component of the signal. In step 1120, the output passes through a low-pass filter composed of a capacitor and a resistor, which removes high-frequency components and thereby detects the envelope of the sound signal. [0104]
  • Then, in step 1130, the analog signal corresponding to the detected envelope is converted into a digital signal by the A/D converter, and in step 1140 the converted signal is used as the reference input signal for controlling the motor. [0105]
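Steps 1100 through 1140 can be simulated end to end. The gain, diode drop, RC smoothing factor and A/D range below are assumed illustrative values, not figures from the patent:

```python
import math

GAIN = 10.0        # differential-amplifier gain (assumed)
DIODE_DROP = 0.7   # silicon diode forward voltage, volts
ALPHA = 0.05       # smoothing factor standing in for the RC low-pass filter

def envelope(samples):
    """Return the 8-bit A/D value of the detected envelope of `samples`."""
    level = 0.0
    for v in samples:
        amplified = GAIN * v                          # step 1100: amplify
        rectified = max(amplified - DIODE_DROP, 0.0)  # step 1110: diode removes
                                                      # the negative half and drop
        level += ALPHA * (rectified - level)          # step 1120: low-pass filter
    # step 1130: A/D conversion, 8 bits over an assumed 0-5 V range
    return min(int(level / 5.0 * 255), 255)

# step 1140 would feed the returned value to the motor controller as the
# reference input; a loud burst produces a larger reference than silence.
loud = [0.3 * math.sin(2 * math.pi * i / 8) for i in range(200)]
quiet = [0.0] * 200
```

Note how the amplifier stage matters: an unamplified 0.3 V signal would never exceed the 0.7 V diode drop, and the rectifier output would stay at zero.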
  • The foregoing motor control unit can be used to control the motor connected to the mouth of the robot or toy. Because the control unit uses the sound signal output to the speaker of the robot or toy, the degree of mouth movement varies with the amplitude of the sound signal. As a result, the mouth shape changes with the sound output, so the speaking motion of the robot or toy appears more natural. [0106]
  • According to the toy system described above, the digital contents driving program is executed on the PC to produce multimedia output such as sound, images and animation of the digital contents, and the toy cooperates with this multimedia in real time, moving its limbs while, for example, reciting a fairy tale, thereby stimulating the interest of a person watching the toy. [0107]
  • Therefore, the toy system of the invention can be applied to a robot system that recites a fairy tale in an entertaining manner: the computer screen displays the characters and backgrounds of the fairy tale while the robot recites it, in correspondence with the displayed image, with various motions and stage lighting effects, making the fairy tale much more amusing. [0108]
  • The invention can also be applied to a language education system: the computer screen displays foreign-language images or captions, and the toy speaks in correspondence with the caption data displayed on the screen, so that children or beginners can enjoy studying the language while watching the toy. [0109]
  • The toy system of the invention allows the toy to receive data from the computer in real time and output it, so that the motion and sound output of the toy cooperate with the computer screen. [0110]
  • The robot is detachable from the stage unit of the toy, and the motion code table is written in a memory installed in the robot, enabling substitution of a new robot at any time. The memory for the motion code table is an electrically rewritable EEPROM, so a new version of the motion code table can be downloaded from the computer to update the robot's table to that version. Therefore, the motion code table written in the memory of the robot can be upgraded without any hardware replacement. [0111]
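The update check implied here (and spelled out in claims 2 and 16) reduces to a small decision function; the function name and arguments below are hypothetical, chosen only to mirror the described fields of the motion code table:

```python
# Hypothetical sketch of the motion-code-table update decision:
# download only when the character IDs match and the robot's
# table version is older than the computer's.

def should_download(robot_id, robot_version, pc_id, pc_version):
    """Return True if the PC's motion code table should be written
    to the robot's EEPROM."""
    if robot_id != pc_id:
        return False  # different character: stop the download
    return robot_version < pc_version
```

Checking the character ID first matters: a table for a different character would map motion codes to the wrong motors, so a version comparison alone is not enough.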
  • Also, the sound output signal is used to control the motor for driving the mouth of the toy so that more natural motion of the mouth can be achieved when reciting a fairy tale. [0112]

Claims (18)

    What is claimed is:
  1. A toy system comprising:
    a computer with a recording medium in which digital contents and a digital contents driving program are stored; and
    a movable device interfaced with said computer;
    wherein said digital contents are stored in said recording medium of said computer, and comprise image data, sound data and motion control data;
    wherein said digital contents driving program processes the following steps of:
    (a) reading said digital contents;
    (b) displaying the image data of said digital contents on a screen of said computer; and
    (c) transmitting the sound data and the motion control data of said digital contents as a packet form to said movable device connected with said computer;
    wherein said movable device comprises:
    a communication port for providing an interface with said computer;
    a microprocessor for decoding a data packet received via said communication port, outputting sound data and a motion control signal, and controlling operations of said movable device;
    a memory for storing a program and a motion code table required for the operation of said microprocessor;
    a digital sound decoder for decoding the sound data from said microprocessor to output a sound signal;
    a motor driving unit for driving a motor in response to the motion control signal from said microprocessor; and
    a motor installed in each part of said movable device; and
    wherein the image data is displayed on said computer screen and said movable device decodes the data packet transmitted from said computer to output the sound data to said digital sound decoder and the motion control signal to the motor driving unit when said digital contents driving program is processed in said computer, whereby the motion and sound output of said movable device are synchronized with the screen output of said computer.
  2. A toy system according to claim 1, wherein said motion code table comprises a character ID and a motion code table version, and motion codes and motor control data corresponding to each motion code, and
    wherein said digital contents driving program further processes the steps of:
    comparing the version of said motion code table of the movable device with the version of the motion code table stored in said computer; and
    if said motion code table of said movable device has a lower version than said motion code table stored in said computer, downloading said motion code table stored in said computer to said memory of said movable device.
  3. A toy system according to claim 2, wherein said motor control data is composed of position data and time duration data.
  4. A toy system according to claim 1, wherein said movable device comprises a robot and a stage unit, said robot having a motor and being detachable from said stage unit.
  5. A toy system according to claim 4, wherein said motion code table is stored in an electrically rewritable memory installed in said robot.
  6. A toy system according to claim 4, wherein said memory includes first and second memories, said first memory storing a program necessary for the operation of said microprocessor of said movable device and being installed in said stage unit, said second memory storing said motion code table and being installed in said robot.
  7. A toy system according to claim 6, wherein said second memory is electrically rewritable.
  8. A toy system according to claim 1, wherein said digital contents further comprise a sound data pool, said sound data pool being composed of sound data IDs, and sound data and motion control data corresponding to each sound data ID; and
    wherein said digital contents driving program obtains the sound data and the motion control data corresponding to the sound data ID from said sound data pool and transmits the sound data and the motion control data to said movable device.
  9. A toy system according to claim 1, wherein said movable device further comprises an input unit for inputting an external signal, said microprocessor of said movable device transmitting an event signal to said digital contents driving program when said input unit receives an external input, said digital contents driving program executing an operation corresponding to the event signal.
  10. A toy system according to claim 9, wherein said input unit has user control buttons for instructing start, stop, forward and backward, said event signal informing of the start, stop, forward and backward from said user control buttons installed in said movable device.
  11. A toy system according to claim 3, wherein said movable device further comprises an input unit for inputting an external signal, said microprocessor of said movable device transmitting an event signal to said digital contents driving program when said input unit receives an external input, said digital contents driving program executing an operation corresponding to the event signal, said input unit having user control buttons for instructing start, stop, forward and backward, said event signal informing of the start, stop, forward and backward from said user control buttons installed in said movable device, or informing if said robot is detached.
  12. A toy system according to claim 1, wherein said digital contents comprise additional sound data, said digital contents driving program outputting said additional sound data of said digital contents to a speaker installed in said computer.
  13. A toy system according to claim 1, wherein said microprocessor of said movable device detects an envelope from a waveform of the sound data transmitted via said communication port, and uses the envelope as an electric reference signal to control a motor connected to the mouth of said movable device.
  14. A toy system according to claim 1, wherein said microprocessor comprises:
    a communication processor for successively storing a data stream received from said communication port, and then upon completion of receiving the packet, outputting the received packet to a packet decoder unit or transmitting data received from an input processing unit to said computer;
    a packet decoder unit for decoding the packet from said communication processor, and then transmitting the decoded packet to a motor control unit or a sound data processing unit;
    an input processing unit for transmitting a signal corresponding to each event of an input unit to said communication processor when an input occurs at the input unit;
    a motor control unit for using motion control data from said packet decoder unit to read position and time data of motors corresponding to the motion code from the motion code table, obtain the motion control signal from said position and time data, and output the motion control signal to said motor driving unit; and
    a sound data processing unit for outputting the sound data from said packet decoder unit to said digital sound decoder.
  15. A toy system according to claim 1, wherein said digital contents are downloaded together with the motion code table corresponding to a character ID of said digital contents from a server of a digital contents provider via the internet.
  16. A toy system according to claim 15, wherein said digital contents driving program further performs the steps of:
    (a) receiving the character ID and the motion code table version of said movable device from said movable device;
    (b) comparing the character ID and the motion code table version stored in said computer with the character ID and the motion code table version received from said movable device;
    (c) if the character IDs are not the same, stopping the downloading; and
    (d) if the character IDs are the same and the motion code table version of said movable device is lower than the motion code table version stored in said computer, downloading the motion code table in said computer to said memory in said movable device.
  17. A toy system according to claim 1, wherein said digital contents further comprise LED/lighting control data, said digital contents driving program forming the LED/lighting control data into a packet to transmit the packeted data to said movable device; and
    wherein said movable device further comprises an LED or a lighting device, and controls said LED or said lighting device according to the LED/lighting control data from said digital contents driving program.
  18. A toy system according to claim 17, wherein said microprocessor comprises:
    a communication processor for successively storing a data stream received from said communication port, and then upon completion of receiving the packet, outputting the received packet to a packet decoder unit or transmitting data received from an input processing unit toward said computer;
    a packet decoder unit for decoding the packet from said communication processor, and then transmitting the decoded packet to a motor control unit, a sound data processing unit or an LED/lighting control unit;
    an input processing unit for transmitting a signal corresponding to each event of an input unit to said communication processor when an input occurs at the input unit;
    a motor control unit for using motion control data from said packet decoder unit to read position and time data of motors corresponding to motion code from the motion code table, obtain the motion control signal from said position and time data, and output the motion control signal to said motor driving unit;
    a sound data processing unit for outputting the sound data from said packet decoder unit to said digital sound decoder; and
    an LED/lighting control unit for obtaining an LED/lighting control signal from LED/lighting control data outputted from said packet decoder, and outputting said LED/lighting control signal to an LED/lighting driving unit.
US09922883 2000-12-18 2001-08-06 Toy system cooperating with Computer Abandoned US20020077021A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR2000-77874 2000-12-18
KR20000077874A KR100360722B1 (en) 2000-12-18 Toy system

Publications (1)

Publication Number Publication Date
US20020077021A1 (en) 2002-06-20

Family

ID=19703205

Family Applications (1)

Application Number Title Priority Date Filing Date
US09922883 Abandoned US20020077021A1 (en) 2000-12-18 2001-08-06 Toy system cooperating with Computer

Country Status (1)

Country Link
US (1) US20020077021A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4846693A (en) * 1987-01-08 1989-07-11 Smith Engineering Video based instructional and entertainment system using animated figure
US5655945A (en) * 1992-10-19 1997-08-12 Microsoft Corporation Video and radio controlled moving and talking device
US5855483A (en) * 1994-11-21 1999-01-05 Compaq Computer Corp. Interactive play with a computer
US6012961A (en) * 1997-05-14 2000-01-11 Design Lab, Llc Electronic toy including a reprogrammable data storage device
US6319010B1 (en) * 1996-04-10 2001-11-20 Dan Kikinis PC peripheral interactive doll
US6381515B1 (en) * 1999-01-25 2002-04-30 Sony Corporation Robot apparatus
US6394872B1 (en) * 1999-06-30 2002-05-28 Inter Robot Inc. Embodied voice responsive toy

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040038620A1 (en) * 2002-08-26 2004-02-26 David Small Method, apparatus, and system to synchronize processors in toys
US7297044B2 (en) 2002-08-26 2007-11-20 Shoot The Moon Products Ii, Llc Method, apparatus, and system to synchronize processors in toys
US20060161301A1 (en) * 2005-01-10 2006-07-20 Io.Tek Co., Ltd Processing method for playing multimedia content including motion control information in network-based robot system
US7751936B2 (en) * 2005-01-10 2010-07-06 Robomation Co., Ltd. Processing method for playing multimedia content including motion control information in network-based robot system
US20080263164A1 (en) * 2005-12-20 2008-10-23 Koninklijke Philips Electronics, N.V. Method of Sending Motion Control Content in a Message, Message Transmitting Device Abnd Message Rendering Device
US20090157223A1 (en) * 2007-12-17 2009-06-18 Electronics And Telecommunications Research Institute Robot chatting system and method
WO2010021592A1 (en) * 2008-08-22 2010-02-25 Anthony Michael Jeff & Ng Lily Trading As Day2Day Trading Asia Integrated audio device for action figures and method of manufacture thereof
US20100052864A1 (en) * 2008-08-29 2010-03-04 Boyer Stephen W Light, sound, & motion receiver devices
US8354918B2 (en) * 2008-08-29 2013-01-15 Boyer Stephen W Light, sound, and motion receiver devices

Also Published As

Publication number Publication date Type
KR20010016530A (en) 2001-03-05 application

Legal Events

Date Code Title Description
AS Assignment

Owner name: E-PLANET CORPORATION, KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHO, SOON YOUNG;MOON, JUNG HO;LEE, IN SEOB;AND OTHERS;REEL/FRAME:012085/0719

Effective date: 20010620