WO2006062274A1 - Intelligent robot and mobile game method using the same - Google Patents

Intelligent robot and mobile game method using the same

Info

Publication number
WO2006062274A1
Authority
WO
WIPO (PCT)
Prior art keywords
intelligent robot
user
central processing
processing unit
manipulation
Prior art date
Application number
PCT/KR2005/000825
Other languages
French (fr)
Inventor
Yong Duck Lee
Original Assignee
Rivalkorea Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rivalkorea Co., Ltd. filed Critical Rivalkorea Co., Ltd.
Priority to JP2007544251A priority Critical patent/JP2008522654A/en
Publication of WO2006062274A1 publication Critical patent/WO2006062274A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 11/00 - Self-movable toy figures
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 2200/00 - Computerized interactive toys, e.g. dolls

Landscapes

  • Toys (AREA)

Abstract

The present invention relates to an intelligent robot including a sound output unit (20) and a display unit (22). The intelligent robot includes a manipulation sensing unit (10), a storage unit (18), and a central processing unit (12). The manipulation sensing unit (10) senses individual manipulations of a user. The storage unit (18) stores data required for operation of the intelligent robot. The central processing unit (12) accesses the storage unit to obtain specific reaction data for the intelligent robot corresponding to specific manipulation sensing signals of the user that are provided from the manipulation sensing unit (10), outputs sounds corresponding to the specific manipulations of the user through the sound output unit (20), allows the intelligent robot to develop with the manipulation of the user, and displays the status of the intelligent robot on the display unit (22).

Description

Description
INTELLIGENT ROBOT AND MOBILE GAME METHOD USING THE SAME
Technical Field
[1] The present invention relates, in general, to an intelligent robot and mobile game method using the intelligent robot and, more particularly, to an intelligent robot, which variously reacts to and variously develops with the manipulation of a user, and a mobile game method using the intelligent robot.
Background Art
[2] Generally, a robot having the function of outputting sounds can output various types of preset sounds at regular intervals, or output a specific sound when a user touches a specific part of the robot. Therefore, the user listens to various types of sounds unilaterally output from the robot, and recognizes the current status of the robot. Further, after listening to a specific sound output from the robot when the user touches a specific part of the robot, the user thinks that the robot actually reacts to his or her action.
Disclosure of Invention
Technical Problem
[3] In the prior art described above, a conventional robot is disadvantageous in that it merely outputs various types of sounds in a preset sequence or outputs a sound corresponding to a user's touch; it cannot develop by itself or communicate with another robot.
Technical Solution
[4] Accordingly, the present invention has been made keeping in mind the above problems occurring in the prior art, and an object of the present invention is to provide an intelligent robot, which variously reacts to and variously develops with the manipulation of a user, and a mobile game method using the intelligent robot.
[5] In order to accomplish the above object, the present invention provides an intelligent robot including a sound output unit and a display unit, comprising a manipulation sensing unit for sensing individual manipulations of a user, a storage unit for storing various data required for operation of the intelligent robot, and a central processing unit for accessing the storage unit to obtain specific reaction data for the intelligent robot corresponding to specific manipulation sensing signals of the user that are provided from the manipulation sensing unit, outputting sounds corresponding to the specific manipulations of the user through the sound output unit, developing the intelligent robot according to the manipulations of the user, and displaying status of the intelligent robot on the display unit.
Advantageous Effects
[6] The present invention provides an intelligent robot, which variously reacts to and variously develops with the manipulation of a user, and a mobile game method using the intelligent robot.
Brief Description of the Drawings
[7] FIG. 1 is a block diagram of an intelligent robot according to an embodiment of the present invention;
[8] FIG. 2 is a network configuration view showing a mobile game system using the intelligent robot of FIG. 1 according to an embodiment of the present invention; and
[9] FIG. 3 is a flowchart showing the steps of a mobile game method using a mobile phone of FIG. 2.
Best Mode for Carrying Out the Invention
[10] Reference should now be made to the drawings, in which the same reference numerals are used throughout the different drawings to designate the same or similar components.
[11] FIG. 1 is a block diagram of an intelligent robot according to an embodiment of the present invention. The intelligent robot includes a manipulation sensing unit 10, a central processing unit 12, a storage unit 18, a sound output unit 20, a display unit 22, and a transmission and reception unit 24. The storage unit 18 includes a first storage unit 14 implemented using mask Read Only Memory (ROM), and a second storage unit 16 implemented using a Secure Digital (SD) memory card. The transmission and reception unit 24 uses infrared rays.
[12] In FIG. 1, the manipulation sensing unit 10 senses individual manipulations of a user and provides sensing results to the central processing unit 12. The manipulation sensing unit 10 is provided with various sensors, such as an impact sensor, a vibration sensor, a touch sensor and a sound sensor. The sensors are respectively installed at suitable locations on the intelligent robot. For example, the touch sensor is installed in an armpit portion of the intelligent robot, so that the intelligent robot senses the user's touch as tickling when the user touches the armpit of the intelligent robot. Further, the intelligent robot senses impact using the impact sensor, and senses vibration using the vibration sensor.
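The sensing behaviour described in paragraph [12] can be pictured as a small classification step between raw sensor readings and the sensing signals delivered to the central processing unit 12. The sketch below is illustrative only: the sensor names, locations, thresholds, and signal labels are assumptions, not values given in the patent.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class SensorReading:
    sensor: str    # "impact", "vibration", "touch" or "sound"
    location: str  # e.g. "armpit", "head"
    value: float   # normalized sensor output, 0.0 .. 1.0


class ManipulationSensingUnit:
    """Classifies raw sensor readings into manipulation-sensing signals."""

    def __init__(self, on_signal: Callable[[str], None]):
        self.on_signal = on_signal  # delivers sensing signals to the CPU

    def process(self, reading: SensorReading) -> None:
        # The mapping below is an assumption; the patent only gives the
        # armpit-touch-as-tickling example and names the sensor types.
        if reading.sensor == "touch" and reading.location == "armpit":
            self.on_signal("tickled")
        elif reading.sensor == "impact" and reading.value > 0.5:
            self.on_signal("beaten")
        elif reading.sensor == "vibration":
            self.on_signal("shaken")
        elif reading.sensor == "sound":
            self.on_signal("spoken_to")


if __name__ == "__main__":
    # Forward a touch on the armpit to a stand-in for the central processing unit.
    unit = ManipulationSensingUnit(on_signal=lambda s: print("sensing signal:", s))
    unit.process(SensorReading("touch", "armpit", 0.8))
```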
[13] The storage unit 18 stores therein various data required for the operation of the intelligent robot, and is accessed by the central processing unit 12.
[14] The central processing unit 12 accesses the storage unit 18 to obtain specific reaction data for the intelligent robot corresponding to the user's specific manipulation sensing signals that are provided from the manipulation sensing unit 10, thus outputting sounds corresponding to the specific manipulations through the sound output unit 20. Further, the central processing unit 12 develops the intelligent robot according to the user's manipulation, and displays the status of the intelligent robot on the display unit 22 in the form of graphic images or characters. The status of the intelligent robot may include, for example, development stages and personalities of the intelligent robot.
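A minimal sketch of the reaction loop in paragraph [14], assuming a plain dictionary stands in for the storage unit 18 and print() stands in for the sound output unit 20 and display unit 22; the signal names, sound file names, and the simple development counter are illustrative assumptions.

```python
REACTION_DATA = {          # stands in for the storage unit 18: signal -> sound data
    "beaten": "crying.pcm",
    "tickled": "giggle.pcm",
    "fed": "munching.pcm",
    "shaken": "playing.pcm",
}


class CentralProcessingUnit:
    def __init__(self, storage: dict):
        self.storage = storage
        self.development_stage = 0      # crude development counter (assumption)
        self.personality = "neutral"

    def handle_signal(self, signal: str) -> None:
        sound = self.storage.get(signal)          # fetch the reaction data
        if sound is not None:
            print(f"[sound output unit] playing {sound}")
        self.development_stage += 1               # develop with each manipulation
        self.display_status()

    def display_status(self) -> None:
        # The display unit shows status as graphic images or characters;
        # plain text stands in for it here.
        print(f"[display unit] stage={self.development_stage} "
              f"personality={self.personality}")


if __name__ == "__main__":
    cpu = CentralProcessingUnit(REACTION_DATA)
    cpu.handle_signal("tickled")
```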
[15] FIG. 2 is a network configuration view of a mobile game system using the intelligent robot of FIG. 1 according to an embodiment of the present invention. The mobile game system includes an intelligent robot 30 connected to the Internet 36 through a client (Personal Computer: PC) 32, a server 34 connected to the Internet 36 to operate a homepage for managing various types of contents, and a mobile communication network 38 for connecting the Internet 36 to a mobile phone 40.
[16] In FIG. 2, the SD memory card, implementing the second storage unit 16 in the intelligent robot 30, is removable. Therefore, the SD memory card can be connected to the client 32 through a predetermined connector, the client 32 can be operated so that desired data are stored on the SD memory card, and the SD memory card can then be returned to the intelligent robot 30, where it again serves as the second storage unit 16. In this case, the client 32 transmits and receives various contents, for example, information about development stages and personalities of the intelligent robot 30, to and from the server 34, thus enabling the contents held by the server 34 to be updated at any time.
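Paragraph [16] describes moving data between the SD memory card, the client 32, and the server 34. The sketch below shows one way the client-side exchange might look, assuming a JSON status file on the mounted SD card and a hypothetical server URL; neither the file layout nor the endpoint comes from the patent.

```python
import json
import urllib.request

SD_CARD_STATUS_FILE = "/media/sdcard/robot_status.json"  # hypothetical mount point
SERVER_URL = "http://example.com/robot/status"           # hypothetical endpoint


def read_status_from_sd_card(path: str) -> dict:
    # e.g. {"stage": 3, "personality": "cheerful"} -- assumed layout
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)


def upload_status(status: dict) -> None:
    data = json.dumps(status).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL, data=data, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        print("server replied with HTTP status", resp.status)


# Usage (with a mounted SD card and a reachable server):
#   upload_status(read_status_from_sd_card(SD_CARD_STATUS_FILE))
```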
[17] The mobile phone 40 is provided with information about the development stages and personalities of the intelligent robot 30 from the server 34 through the Internet 36 and the mobile communication network 38. The mobile phone 40 can apply the development stages and personalities of the intelligent robot 30 to a specific game, and can provide the information about the development stages and personalities achieved during the corresponding game process back to the server 34.
[18] FIG. 3 is a flowchart showing the steps of a mobile game method using the mobile phone 40 of FIG. 2.
[19] First, the mobile phone 40 accesses the server 34 through the mobile communication network 38 and the Internet 36 to fetch information about the current status of the intelligent robot 30 according to the user's operation at step S30. The current status information of the intelligent robot 30 includes, for example, information about the development stages and personalities of the intelligent robot 30.
[20] The mobile phone 40 applies the current status information of the intelligent robot 30, fetched from the server 34, to a game according to the user's operation at step S32.
[21] The mobile phone 40 uploads the current status of the intelligent robot 30 to the server 34 according to the user's operation when the game has terminated at step S34. That is, whether the current status of the intelligent robot 30 is uploaded to the server 34 depends on the user's operation of the mobile phone 40.
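The three steps of FIG. 3 (S30, S32, S34) can be summarised as fetch, apply, and optionally upload. The sketch below models the server as a plain dictionary and assumes a simple rule for how a finished game changes the robot's status; both are stand-ins, since the patent does not specify the transport or the game logic.

```python
def fetch_robot_status(server: dict) -> dict:
    # Step S30: fetch the robot's current status from the server.
    return server.get("robot_status", {"stage": 1, "personality": "neutral"})


def play_game(status: dict) -> dict:
    # Step S32: apply the development stage and personality to the game.
    # Assumed rule: each finished game advances the stage by one.
    print(f"playing with stage {status['stage']}, personality {status['personality']}")
    return {**status, "stage": status["stage"] + 1}


def upload_robot_status(server: dict, status: dict, user_agrees: bool) -> None:
    # Step S34: upload the new status only if the user chooses to.
    if user_agrees:
        server["robot_status"] = status


if __name__ == "__main__":
    server = {}  # a dictionary stands in for the server 34
    new_status = play_game(fetch_robot_status(server))
    upload_robot_status(server, new_status, user_agrees=True)
    print("server now holds:", server)
```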
[22] An example of the operation of the intelligent robot 30 implemented in the present invention is described in detail.
[23] First, the user wakes up the intelligent robot 30 by beating the intelligent robot 30.
That is, the manipulation sensing unit 10 senses the user's beating and provides a sensing signal corresponding to the user's beating to the central processing unit 12. At this time, the manipulation sensing unit 10 may sense the user's beating using the impact sensor.
[24] The central processing unit 12 fetches crying sound data from the storage unit 18 and outputs a crying sound through the sound output unit 20, thus notifying the user that the intelligent robot 30 woke up.
[25] Further, if the user touches a specific part of the intelligent robot 30, the manipulation sensing unit 10 senses the user's touch action using the touch sensor, and provides a sensing signal corresponding to the touch action to the central processing unit 12.
[26] The central processing unit 12 fetches sound data corresponding to the touch action from the storage unit 18, and outputs a corresponding sound through the sound output unit 20, thus allowing the user to listen to the corresponding sound.
[27] In addition, the central processing unit 12 outputs various sounds, that is, sounds generated when the intelligent robot 30 is fed, tickled, playing (that is, when the user rotates the intelligent robot 30), and singing, through the sound output unit 20.
[28] In this way, while the user manipulates the intelligent robot 30, the central processing unit 12 develops the intelligent robot 30, and stores the development data both in memory managed by the central processing unit 12 and in the storage unit 18.
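One possible reading of paragraph [28] is that the development data live in the central processing unit's working memory and are mirrored to the storage unit 18 on every change. The sketch below assumes a JSON file standing in for the SD-card-backed storage; the file name and record layout are illustrative.

```python
import json


class DevelopmentRecord:
    def __init__(self, storage_path: str = "development.json"):
        # storage_path stands in for the SD-card-backed storage unit 18.
        self.storage_path = storage_path
        self.data = {"stage": 0, "personality": "neutral"}  # copy in CPU memory

    def update(self, **changes) -> None:
        self.data.update(changes)               # update the in-memory copy
        with open(self.storage_path, "w", encoding="utf-8") as f:
            json.dump(self.data, f)             # mirror it to the storage unit


if __name__ == "__main__":
    record = DevelopmentRecord()
    record.update(stage=1, personality="cheerful")
```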
[29] When the user does not play with the intelligent robot 30 for a predetermined period, the central processing unit 12 determines that the intelligent robot 30 has become infected with a virus, accesses data that is stored in the storage unit 18 and corresponds to infection, and outputs a cough sound through the sound output unit 20. Further, when the user does not play with the intelligent robot 30 for a predetermined period, the central processing unit 12 determines that the intelligent robot 30 sleeps, accesses data that is stored in the storage unit 18 and corresponds to sleeping, and sequentially outputs yawning and snoring sounds through the sound output unit 20.
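The inactivity rules in paragraph [29] amount to a small state function of the time since the user last played. The thresholds below are assumptions (the patent only says "a predetermined period" for both cases), and the ordering of the checks is a design choice for the sketch.

```python
SLEEP_AFTER_S = 30 * 60        # assumed: 30 minutes without play -> sleeping
INFECTED_AFTER_S = 24 * 3600   # assumed: a full day without play -> infected


def idle_state(seconds_since_last_play: float) -> str:
    if seconds_since_last_play >= INFECTED_AFTER_S:
        return "infected"      # output a cough sound
    if seconds_since_last_play >= SLEEP_AFTER_S:
        return "sleeping"      # output yawning, then snoring sounds
    return "awake"


assert idle_state(10) == "awake"
assert idle_state(3600) == "sleeping"
assert idle_state(2 * 24 * 3600) == "infected"
```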
[30] Further, in the case where two intelligent robots are oppositely arranged and their transmission and reception units transmit and receive signals to and from each other, if one intelligent robot senses that the oppositely placed intelligent robot has become infected with a virus, it also becomes infected with the virus and shows the same symptoms as the oppositely placed intelligent robot. That is, if an intelligent robot has become infected with a virus, it outputs information indicating that it has become infected with the virus through its transmission and reception unit, thus allowing an oppositely placed intelligent robot to receive the information.
[31] In the case where two intelligent robots are oppositely arranged and the transmission and reception units thereof transmit and receive signals to and from each other, they converse with each other and sing together. For example, if one intelligent robot sings, an oppositely placed intelligent robot sings along so as to make harmony.
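Paragraphs [30] and [31] describe two robots exchanging infection and song information over their infrared transmission and reception units. The sketch below replaces the infrared link with a direct method call and uses an assumed message format; only the reactions (catching the peer's virus, singing along) come from the text.

```python
class Robot:
    def __init__(self, name: str):
        self.name = name
        self.infected = False

    def send(self, peer: "Robot", message: dict) -> None:
        peer.receive(message)   # a direct call stands in for the infrared link

    def receive(self, message: dict) -> None:
        if message.get("type") == "infected":
            self.infected = True   # show the same symptoms as the peer
            print(f"{self.name}: coughs (caught the virus)")
        elif message.get("type") == "song":
            print(f"{self.name}: sings along with '{message['title']}'")


if __name__ == "__main__":
    a, b = Robot("robot A"), Robot("robot B")
    a.infected = True
    a.send(b, {"type": "infected"})
    a.send(b, {"type": "song", "title": "wedding march"})
```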
[32] When the intelligent robot becomes infected with a virus, the virus is treated if the user plays with the intelligent robot for a predetermined period or longer.
[33] If the user does not give affection to the intelligent robot (does not play with the intelligent robot) within a predetermined period, the robot becomes depressed. If the user still does not give affection after the period has elapsed, the intelligent robot develops a resistant personality. In contrast, if the user plays frequently with the intelligent robot, the intelligent robot becomes cheerful. Further, if the user plays still more with the intelligent robot, the intelligent robot becomes lively.
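The personality progression in paragraph [33] can be expressed as a function of how much the user has played recently. The numeric thresholds below are assumptions; only the ordering (neglect leads from depressed to resistant, play leads from cheerful to lively) follows the text.

```python
def personality(play_sessions_this_week: int) -> str:
    if play_sessions_this_week == 0:
        return "resistant"   # prolonged neglect (beyond the first period)
    if play_sessions_this_week <= 2:
        return "depressed"   # some neglect
    if play_sessions_this_week <= 7:
        return "cheerful"    # frequent play
    return "lively"          # still more play


assert personality(0) == "resistant"
assert personality(5) == "cheerful"
```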
[34] If two intelligent robots are coupled through a homepage operated by the server 34, the two intelligent robots recognize each other as lovers, and they sing a wedding march together when they meet.
[35] At a specific development stage, if male intelligent robots encounter one another, they duel with each other (both intelligent robots output gunshot sounds).
[36] As described above, if more than two intelligent robots reacting to each other are gathered, they operate in the same way as in the case of two intelligent robots.
Industrial Applicability
[37] As described above, the present invention provides an intelligent robot, which variously reacts to and variously develops with the manipulation of a user, thus allowing the user to play with the intelligent robot without becoming bored. Further, the present invention is advantageous in that it allows a user to play a mobile game using the intelligent robot, thus enabling the user to experience more enjoyable games.
[38] Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims.

Claims

Claims
[1] An intelligent robot including a sound output unit and a display unit, comprising: a manipulation sensing unit for sensing individual manipulations of a user; a storage unit for storing various data required for operation of the intelligent robot; and a central processing unit for accessing the storage unit to obtain specific reaction data for the intelligent robot corresponding to specific manipulation sensing signals of the user that are provided from the manipulation sensing unit, outputting sounds corresponding to the specific manipulations of the user through the sound output unit, developing the intelligent robot according to the manipulations of the user, and displaying status of the intelligent robot on the display unit.
[2] The intelligent robot according to claim 1, further comprising a transmission and reception unit for transmitting and receiving specific signals to and from a second intelligent robot under control of the central processing unit.
[3] The intelligent robot according to claim 1 or 2, wherein the manipulation sensing unit comprises an impact sensor, a vibration sensor, a touch sensor and a sound sensor, the sensors being installed at suitable locations on the intelligent robot.
[4] The intelligent robot according to claim 1 or 2, wherein the storage unit comprises mask Read Only Memory (ROM) and a Secure Digital (SD) memory card.
[5] The intelligent robot according to claim 1 or 2, wherein the status of the intelligent robot includes development stages and personalities of the intelligent robot.
[6] The intelligent robot according to claim 1 or 2, wherein the central processing unit is operated so that, if the user's manipulation does not occur for a predetermined period, the central processing unit determines that the intelligent robot has become infected with a virus, accesses data that is stored in the storage unit and corresponds to the infection, and outputs a cough sound through the sound output unit, while if the user's manipulation occurs for a predetermined period or longer, the central processing unit determines that the virus has been treated and performs an operation corresponding to treatment.
[7] The intelligent robot according to claim 1 or 2, wherein the central processing unit is operated so that, if the user's manipulation does not occur for a predetermined period, the central processing unit determines that the intelligent robot sleeps, accesses data that is stored in the storage unit and corresponds to the sleeping, and sequentially outputs yawning and snoring sounds through the sound output unit.
[8] The intelligent robot according to claim 1 or 2, wherein the central processing unit is operated so that, if the user's manipulation does not occur for a first predetermined period, the central processing unit determines that the intelligent robot has become depressed and performs an operation corresponding to depression, and if the user's manipulation still does not occur for a second predetermined period after the first period, the central processing unit determines that the intelligent robot has developed a resistant personality and performs an operation corresponding to the resistant personality.
[9] The intelligent robot according to claim 1 or 2, wherein the central processing unit is operated so that, if the user's manipulation occurs for a first predetermined period, the central processing unit determines that the intelligent robot has become cheerful and performs an operation corresponding to cheerfulness, and if the user's manipulation still occurs for a second predetermined period after the first period, the central processing unit determines that the intelligent robot has developed a lively personality and performs an operation corresponding to the lively personality.
[10] The intelligent robot according to claim 2, wherein the transmission and reception unit transmits and receives specific signals to and from a second intelligent robot using infrared rays.
[11] The intelligent robot according to claim 2, wherein the central processing unit is operated so that, if the intelligent robot communicates with a second intelligent robot through the transmission and reception unit and the second intelligent robot has become infected with a virus, the central processing unit determines that the intelligent robot has also become infected with the virus and performs an operation corresponding to infection.
[12] The intelligent robot according to claim 2, wherein the central processing unit is operated so that, if the intelligent robot communicates with a second intelligent robot through the transmission and reception unit and the second intelligent robot sings, the central processing unit determines that the intelligent robot sings along, accesses corresponding song data stored in the storage unit, and outputs a corresponding song through the sound output unit.
[13] A mobile game method performed in a mobile game system, the mobile game system including a mobile phone that receives information about a specific intelligent robot from a specific server, connected to the Internet, through a mobile communication network and plays the game, comprising the steps of: the mobile phone accessing the server through the mobile communication network and the Internet to fetch information about development stages and personalities of the intelligent robot according to an operation of a user; the mobile phone applying the information about development stages and personalities of the intelligent robot to a game according to the operation of the user; and the mobile phone uploading information about a current development stage and personality of the intelligent robot to the server according to the operation of the user when the game has terminated.
PCT/KR2005/000825 2004-12-07 2005-03-22 Intelligent robot and mobile game method using the same WO2006062274A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007544251A JP2008522654A (en) 2004-12-07 2005-03-22 Intelligent robot and mobile game method using the robot

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20040102166 2004-12-07
KR10-2004-0102166 2004-12-07

Publications (1)

Publication Number Publication Date
WO2006062274A1 (en)

Family

ID=36578073

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2005/000825 WO2006062274A1 (en) 2004-12-07 2005-03-22 Intelligent robot and mobile game method using the same

Country Status (2)

Country Link
JP (1) JP2008522654A (en)
WO (1) WO2006062274A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7109207B2 (en) * 2018-02-23 2022-07-29 パナソニックホールディングス株式会社 Interaction device, interaction method, interaction program and robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004105712A (en) * 1997-03-17 2004-04-08 Bandai Co Ltd Electronic appliance
KR20020061332A (en) * 2001-01-16 2002-07-24 사성동 method of breeding robot pet using on-line and off-line systems simulaneously
JP2002370185A (en) * 2001-04-13 2002-12-24 Sony Corp Robot device, toy device, behavior control system and behavior control method of robot device, control program and recording medium
KR20020097477A (en) * 2001-06-21 2002-12-31 현명택 Intelligent robort toy based on internet and making method for intelligent robort toy
JP2003033573A (en) * 2002-05-30 2003-02-04 Bandai Co Ltd Growth simulation device utilizing network

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2321817A1 (en) * 2008-06-27 2011-05-18 Yujin Robot Co., Ltd. Interactive learning system using robot and method of operating the same in child education
EP2321817A4 (en) * 2008-06-27 2013-04-17 Yujin Robot Co Ltd Interactive learning system using robot and method of operating the same in child education

Also Published As

Publication number Publication date
JP2008522654A (en) 2008-07-03

Similar Documents

Publication Publication Date Title
US7976386B2 (en) Mesh network game controller with voice transmission, search capability, motion detection, and/or position detection
US20080318679A1 (en) Foot game controller with motion detection and/or position detection
US8982133B2 (en) Portable virtual characters
EP1533678A1 (en) Physical feedback channel for entertaining or gaming environments
JP3494567B2 (en) Portable communication toy and information storage medium
US20030115240A1 (en) Schedule managing character and information providing system and method using same
JP2002177656A (en) Electronic toy
CN106878390B (en) Electronic pet interaction control method and device and wearable equipment
US20020028704A1 (en) Information gathering and personalization techniques
EP2018032A1 (en) Identification of proximate mobile devices
JP5211435B2 (en) Accessories, electronic musical instruments, learning devices and programs
KR20000072753A (en) An intelligent toy with Wireless Local Communications.
CN108089798A (en) terminal display control method, flexible screen terminal and computer readable storage medium
CN110830368B (en) Instant messaging message sending method and electronic equipment
WO2018108174A1 (en) Interface interactive assembly control method and apparatus, and wearable device
KR100594005B1 (en) The method and apparatus for Bring up simulation of mobile communication terminal device
CN110796918A (en) Training method and device and mobile terminal
CN109475775A (en) Communication means, computer-readable medium, communication device and server
JP3381648B2 (en) Character display control device, character display control system, and recording medium
WO2006062274A1 (en) Intelligent robot and mobile game method using the same
TWI437503B (en) Figure and figure developing system
CN107767851B (en) Song playing method and mobile terminal
JP2000338859A (en) Information managing method for electronic equipment and electronic equipment system
CN111338598A (en) Message processing method and electronic equipment
JP2002000963A (en) Stuffed toy having communication means

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2007544251

Country of ref document: JP

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05789747

Country of ref document: EP

Kind code of ref document: A1