US20080166945A1 - Lifelike covering and lifelike electronic apparatus with the covering - Google Patents


Info

Publication number
US20080166945A1
Authority
US
United States
Prior art keywords
sensors
lifelike
processing unit
covering
sensing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/971,918
Inventor
Hua-Dong Cheng
Tsu-Li Chiang
Han-Che Wang
Kuan-Hong Hsieh
Xiao-Guang Li
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ensky Technology Shenzhen Co Ltd
Ensky Technology Co Ltd
Original Assignee
Ensky Technology Shenzhen Co Ltd
Ensky Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ensky Technology (Shenzhen) Co Ltd and Ensky Technology Co Ltd
Assigned to ENSKY TECHNOLOGY CO., LTD., ENSKY TECHNOLOGY (SHENZHEN) CO., LTD. reassignment ENSKY TECHNOLOGY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHENG, HUA-DONG, WANG, HAN-CHE, HSIEH, KUAN-HONG, CHIANG, TSU-LI, LI, XIAO-GUANG
Publication of US20080166945A1 publication Critical patent/US20080166945A1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00 Self-movable toy figures
    • A63H11/18 Figure toys which perform a realistic walking motion
    • A63H11/20 Figure toys which perform a realistic walking motion with pairs of legs, e.g. horses
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 Computerized interactive toys, e.g. dolls
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/28 Arrangements of sound-producing means in dolls; Means in dolls for producing sounds


Abstract

A lifelike covering is provided. The covering includes a flexible covering body and a flexible circuit board covered by the flexible covering body. The flexible circuit board is configured with a plurality of sensors and at least one interface. Each of the sensors is configured for sensing an external input and generating a corresponding sensing signal. The interface is configured for transferring power from an external power source to the sensors, and for transferring the sensing signals to an external processing unit, so as to activate the external processing unit to perform a corresponding action control. A lifelike electronic apparatus with the covering is also provided.

Description

    BACKGROUND
  • 1. Technical Field
  • The present invention relates to lifelike electronic apparatuses, and particularly to a lifelike electronic apparatus with a lifelike covering.
  • 2. General Background
  • Recently, quadruped-walking pet robots have been developed and are widely sold. These pet robots resemble dogs or cats and are kept as pets. Such a pet robot is equipped with software that emulates a real animal's emotions. Emotions such as "joy" and "anger" are programmed in the software and can be made to respond to a user's inputs. The robot may respond to inputs such as "patting" and "striking" as well as to input from environmental conditions.
  • Such a pet robot generally includes a housing for accommodating various components, such as sensors, actuators, mechanical movement units, etc. The sensors are configured for sensing the surrounding conditions and generating sensing signals to activate corresponding components to perform actions. However, when the pet robot is overburdened with a large number of sensors, the layouts and arrangements between the sensors and other components in the housing may become complicated, and as a result, assembling the components may take an inordinate amount of time and labor.
  • What is needed, therefore, is a lifelike electronic apparatus that applies an improved component configuration that is effective to reduce the time and labor consumed in assembling the lifelike electronic apparatus.
  • SUMMARY
  • A lifelike electronic apparatus is provided. The apparatus includes a lifelike covering and a housing covered by the lifelike covering. The housing is configured with a power source, a central processing unit (CPU), a plurality of actuators, and a plurality of mechanical movement units. The lifelike covering includes a flexible covering body and a flexible circuit board covered by the flexible covering body. The flexible circuit board is configured with a plurality of sensors and at least one interface. Each of the sensors is configured for sensing an external input and generating a corresponding sensing signal. The interface is configured for transferring power from the power source to the sensors, and for transferring the sensing signals to the CPU, so as to activate the CPU to generate an action control signal to the corresponding actuators, thereby driving the corresponding mechanical movement units to perform a corresponding action.
  • Other advantages and novel features will be drawn from the following detailed description with reference to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The components of the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the lifelike electronic apparatus. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
  • FIG. 1 is an appearance diagram of a lifelike shaped electronic dinosaur apparatus covered with a lifelike covering.
  • FIG. 2 is a schematic diagram of infrastructure of a housing of the lifelike electronic apparatus of FIG. 1.
  • FIG. 3 is a schematic diagram of an inner infrastructure of the lifelike covering of FIG. 1 in accordance with a first embodiment of the present invention.
  • FIG. 4 is a schematic diagram of an inner infrastructure of the lifelike covering of FIG. 1 in accordance with a second embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE EMBODIMENT
  • FIG. 1 and FIG. 2 show a lifelike shaped electronic dinosaur apparatus covered with a lifelike covering. The dinosaur shaped lifelike electronic apparatus (hereinafter, "the apparatus") consists of a lifelike covering 1 and a housing 2. The housing 2 is configured with a power source 20, a central processing unit (CPU) 21, a speech unit 22, a plurality of actuators 23, and a plurality of mechanical movement units 24 (see FIG. 2). The lifelike covering 1 is made of a flexible material, and is designed with a dinosaur shape.
  • Referring to FIG. 3 and FIG. 4, the lifelike covering 1 includes a flexible covering body 10 and a flexible circuit board 11/11′. The flexible covering body 10 covers the flexible circuit board 11/11′. The flexible circuit board 11/11′ is divided into a plurality of areas, such as a head area 110, a neck area 111, a body area 112, a tail area 113, and limb areas 114. Each of the areas corresponds to one particular part of the apparatus. Each of the areas is configured with a plurality of sensors 16, such as 16 a, 16 b, 16 c, and 16 d. Each of the sensors 16 is assigned a coordinate for identification, and is configured for sensing an external input and generating a corresponding sensing signal.
  • Each of the areas can be equipped with particular sensors 16 for performing a particular application. For example, the head area 110 is configured with one or more light sensors 16 a in an eye part of the head area 110 for sensing external light and generating a light sensing signal; the head area 110 is further configured with one or more touch sensitive sensors 16 b for sensing a user's touch thereon and generating a touch sensing signal; the body area 112 is configured with one or more pressure sensors 16 c for sensing a user's tap or blow thereon and generating a pressure sensing signal; the tail area 113 is configured with one or more infrared sensors 16 d for sensing a user's infrared ray and generating an infrared sensing signal. However, it should be noted that a number of the sensors 16, types of the sensors 16, and an arrangement of the sensors 16 are not limited to the embodiments described herein.
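The area-and-coordinate scheme described above can be pictured as a small data model. The following sketch is illustrative only; the names `Sensor` and `SensingSignal`, the coordinate values, and the area strings are assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sensor:
    coordinate: tuple  # position on the flexible circuit board, used as an ID
    area: str          # e.g. "head", "body", "tail" - one area per apparatus part
    kind: str          # "light", "touch", "pressure", or "infrared"

@dataclass(frozen=True)
class SensingSignal:
    coordinate: tuple  # identifies which sensor produced the signal
    value: float       # raw magnitude of the sensed external input

# One possible layout, mirroring the embodiment: light sensors in the eye
# part of the head area, pressure sensors in the body area, and so on.
layout = [
    Sensor(coordinate=(0, 0), area="head", kind="light"),
    Sensor(coordinate=(0, 1), area="head", kind="touch"),
    Sensor(coordinate=(1, 0), area="body", kind="pressure"),
    Sensor(coordinate=(2, 0), area="tail", kind="infrared"),
]

def sense(sensor: Sensor, magnitude: float) -> SensingSignal:
    """Generate a sensing signal tagged with the sensor's coordinate."""
    return SensingSignal(coordinate=sensor.coordinate, value=magnitude)
```

Because every signal carries its sensor's coordinate, a downstream processing unit can tell a tap on the body from a touch on the head without any extra wiring information.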
  • The flexible circuit board 11/11′ is further configured with at least one interface 17. For simplicity, in the embodiment as shown in FIGS. 3 and 4, one interface 17 is shown. The interface 17 is electrically coupled to the sensors 16. The interface 17 is configured for transferring power from the power source 20 to the sensors 16, and for transferring the sensing signals from the sensors 16 to the CPU 21. In addition, the interface 17, in other embodiments, can be divided into two interfaces, one of which is configured for a power transfer while the other is configured for a sensing signal transfer.
  • Referring to FIG. 4, the flexible circuit board 11′ is further configured with at least one processing unit 18. For simplicity, in the embodiment as shown in FIG. 4, one processing unit 18 is shown. The processing unit 18 is electrically coupled with at least one sensor 16 and the interface 17. The processing unit 18 is configured for processing the sensing signal from the coupled sensor 16, and generating an identified signal to the CPU 21. The identified signal indicates a source of the processed sensing signal. The identified signal also indicates a value of the external input corresponding to the sensing signal. For example, with respect to the processed sensing signal from the light sensor 16 a, the identified signal indicates a light intensity value; with respect to the processed sensing signal from the pressure sensor 16 c, the identified signal indicates a pressure value, which directly shows a magnitude of the tap or blow operation and indirectly shows a user's emotion. For example, a high pressure value shows the user is angry, and a low pressure value shows the user is gentle.
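A minimal sketch of the role the on-covering processing unit 18 plays: it wraps a raw sensing value with its source coordinate and an interpreted value before the signal reaches the CPU 21. The `IdentifiedSignal` name, the field choices, and the 0.5 pressure threshold are assumptions for illustration, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IdentifiedSignal:
    source: tuple  # coordinate of the originating sensor
    kind: str      # sensor type, so the CPU can skip re-identification
    value: float   # interpreted value, e.g. light intensity or pressure

def identify(coordinate: tuple, kind: str, raw: float) -> IdentifiedSignal:
    """Emulate processing unit 18: tag a raw reading with source and value."""
    return IdentifiedSignal(source=coordinate, kind=kind, value=raw)

def interpret_pressure(sig: IdentifiedSignal, threshold: float = 0.5) -> str:
    """A high pressure value suggests an angry user; a low one, a gentle user."""
    return "angry" if sig.value > threshold else "gentle"
```

The benefit suggested by the description is that signals arriving at the CPU in this pre-identified form need no further source lookup.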
  • The CPU 21 receives and processes signals from the interface 17, including the identified signals and the sensing signals that have not been processed by the processing unit 18. The CPU 21 generates action control signals based on the processed signals and an input-output comparison table, and transmits the action control signals to corresponding actuators 23 so as to activate the actuators 23 to drive corresponding mechanical movement units 24 to perform an action.
  • The input-output comparison table is configured for recording a relationship between the external inputs and corresponding outputs. That is, the input-output comparison table records the sources of the sensing signals and/or the values of the sensing signals, and the action control signals that consist of control objects and actions. For example, if the sensing signal is from the light sensor 16 a in the eye part of the head area 110 and the light value is greater than a predetermined light value, namely where the apparatus may be in a bright ambience, the corresponding control objects are the actuators 23 in an eye part of the apparatus and the corresponding action is narrowing of the eyes of the apparatus. If the sensing signal is from the pressure sensor 16 c in the body area 112 and the pressure value is greater than a predetermined pressure value, namely where the user of the apparatus may be angry, the control objects are the actuators 23 in a mouth part of the apparatus and the corresponding action is opening a mouth of the apparatus and outputting a speech of "ouch." If the sensing signal is from the infrared sensor 16 d in the tail area 113, the control objects are the actuators 23 in a neck part of the apparatus and the corresponding action is turning a head of the apparatus and outputting a speech of "who is standing behind me."
  • When the CPU 21 receives, from the interface 17, sensing signals that have not been processed by the processing unit 18, the CPU 21 identifies the sensing signals according to the coordinates thereof, namely determines the sources of the sensing signals and the values of the external inputs, and generates corresponding action control signals based on the identified signals and the input-output comparison table. Alternatively, when the CPU 21 receives the identified signals from the interface 17, the CPU 21 directly generates corresponding control signals based on the identified signals and the input-output comparison table.
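The input-output comparison table described above can be pictured as a lookup from a signal's source type and a threshold test to a control object and action. Everything in this sketch (the threshold constants, table keys, and the `lookup` helper) is an illustrative assumption, not the patent's actual table:

```python
# Thresholds are assumed values standing in for the "predetermined" light and
# pressure values mentioned in the description.
LIGHT_THRESHOLD = 0.7
PRESSURE_THRESHOLD = 0.5

# sensor kind -> (control object, action), mirroring the examples in the
# description: bright light narrows the eyes, a hard tap opens the mouth and
# says "ouch", an infrared detection turns the head.
COMPARISON_TABLE = {
    "light":    ("eye actuators",   "narrow eyes"),
    "pressure": ("mouth actuators", 'open mouth, say "ouch"'),
    "infrared": ("neck actuators",  'turn head, say "who is standing behind me"'),
}

def lookup(kind: str, value: float):
    """Return (control object, action) if the input exceeds its threshold."""
    if kind == "light" and value <= LIGHT_THRESHOLD:
        return None  # ambient light too dim to trigger an action
    if kind == "pressure" and value <= PRESSURE_THRESHOLD:
        return None  # gentle touch: no "ouch" response
    return COMPARISON_TABLE.get(kind)
```

A dictionary keyed on the identified source is one natural reading of "comparison table"; the patent does not specify the data structure.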
  • By utilizing the configuration of the lifelike covering 1 described above, during assembly of the apparatus, the sensors 16 are configured directly in the lifelike covering 1 and are separated from the assembly of components of the housing 2. When the lifelike electronic apparatus applies a large number of sensors 16, such a configuration of the lifelike covering 1 is effective to improve assembly speed as compared with current lifelike electronic apparatuses that have sensors installed together with other components in their housings.
  • Although the present invention has been specifically described on the basis of a preferred embodiment thereof, the invention is not to be construed as being limited thereto. Various changes or modifications may be made to the embodiment without departing from the scope and spirit of the invention.

Claims (7)

1. A lifelike covering of a lifelike electronic apparatus comprising:
a flexible covering body; and
a flexible circuit board covered by the flexible covering body, and configured with:
a plurality of sensors configured for sensing an external input and generating a corresponding sensing signal; and
at least one interface, electrically coupled with the sensors, configured for transferring power from an external power source to the sensors, and for transferring sensing signals from the sensors to an external processing unit, so as to activate the external processing unit to perform a corresponding action control.
2. The lifelike covering according to claim 1, wherein the sensors comprise touch sensitive sensors, pressure sensors, light sensors, and infrared sensors.
3. A lifelike covering of a lifelike electronic apparatus comprising:
a flexible covering body;
a flexible circuit board covered by the flexible covering body, and configured with:
a plurality of sensors configured for sensing an external input and generating a corresponding sensing signal; and
at least one processing unit electrically coupled with at least one of the sensors, configured for receiving the sensing signals from the coupled sensors and identifying the received sensing signals; and
at least one interface electrically coupled with the sensors and the processing unit, configured for transferring power from an external power source to the sensors and the processing unit, and for transferring the identified signals and the sensing signals without being processed by the processing unit to an external processing unit, so as to activate the external processing unit to perform a corresponding action control.
4. The lifelike covering according to claim 3, wherein the sensors comprise touch sensitive sensors, pressure sensors, light sensors, and infrared sensors.
5. A lifelike electronic apparatus comprising:
a housing configured with:
a power source;
a central processing unit (CPU);
a plurality of mechanical movement units; and
a plurality of actuators each configured for driving corresponding mechanical movement units; and
a lifelike covering covering the housing, comprising:
a flexible covering body; and
a flexible circuit board covered by the flexible covering body, and configured with:
a plurality of sensors each configured for sensing an external input and generating a corresponding sensing signal; and
at least one interface, electrically coupled with the sensors, configured for transferring power from the power source to the sensors, and for transferring the sensing signals to the CPU, so as to activate the CPU to generate an action control signal to the corresponding actuators, thereby driving the corresponding mechanical movement units to perform a corresponding action.
6. The lifelike electronic apparatus according to claim 5, wherein the sensors comprise touch sensitive sensors, pressure sensors, light sensors, and infrared sensors.
7. A lifelike electronic apparatus comprising:
a housing configured with:
a power source;
a central processing unit (CPU);
a plurality of mechanical movement units; and
a plurality of actuators each configured for driving corresponding mechanical movement units; and
a lifelike covering covering the housing, comprising:
a flexible covering body; and
a flexible circuit board covered by the flexible covering body, and configured with:
a plurality of sensors configured for sensing an external input and generating a corresponding sensing signal; and
at least one processing unit electrically coupled with at least one of the sensors, configured for receiving the sensing signals from the coupled sensors and identifying the received sensing signals; and
at least one interface electrically coupled with the sensors and the processing unit, configured for transferring power from an external power source to the sensors and the processing unit, and for transferring the identified signals and the sensing signals without being processed by the processing unit to an external processing unit, so as to activate the external processing unit to perform a corresponding action control.
US11/971,918 2007-01-10 2008-01-10 Lifelike covering and lifelike electronic apparatus with the covering Abandoned US20080166945A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN200710200035.X 2007-01-10
CN200710200035XA CN101219280B (en) 2007-01-10 2007-01-10 Bionic device ectoderm and bionic device overlapping the ectoderm

Publications (1)

Publication Number Publication Date
US20080166945A1 (en) 2008-07-10

Family

ID=39594715

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/971,918 Abandoned US20080166945A1 (en) 2007-01-10 2008-01-10 Lifelike covering and lifelike electronic apparatus with the covering

Country Status (2)

Country Link
US (1) US20080166945A1 (en)
CN (1) CN101219280B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090104844A1 (en) * 2007-10-19 2009-04-23 Hon Hai Precision Industry Co., Ltd. Electronic dinosaur toys
US20110021108A1 (en) * 2009-07-21 2011-01-27 Borei Corporation Method and system for interactive toys
US20140273722A1 (en) * 2013-03-15 2014-09-18 Mattel, Inc. Toy with an Illuminable Movable Portion
US9108115B1 (en) 2014-08-25 2015-08-18 Silverlit Limited Toy responsive to blowing or sound
EP3000515A1 (en) * 2014-09-25 2016-03-30 Silverlit Limited A toy responsive to blowing or sound
CN105843406A (en) * 2016-06-01 2016-08-10 杨杰 Simulated epidermis with built-in tactile feedback
US10163175B2 (en) * 2009-02-25 2018-12-25 Humana Inc. System and method for improving healthcare through social robotics
US20220118373A1 (en) * 2020-10-20 2022-04-21 Moose Creative Management Pty Limited Toy system
US20230201730A1 (en) * 2021-12-28 2023-06-29 Anthony Blackwell Speaking Doll Assembly

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201735976A (en) * 2016-04-15 2017-10-16 陳萬添 Interactive styling apparatus

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5372511A (en) * 1992-01-13 1994-12-13 Tectron Manufacturing (Hk) Limited Educational toys
US6206745B1 (en) * 1997-05-19 2001-03-27 Creator Ltd. Programmable assembly toy
US6311350B1 (en) * 1999-08-12 2001-11-06 Ferber Technologies, L.L.C. Interactive fabric article
US6554094B1 (en) * 1999-12-17 2003-04-29 Meritor Heavy Vehicle Systems, Llc Method and system for independently electronically controlling steering of vehicle wheels
US20050085157A1 (en) * 2003-08-20 2005-04-21 Kevin Dahlquist Robotic toy
US6892675B1 (en) * 2004-03-16 2005-05-17 Paul H. Comerford Cat toy
US20050225951A1 (en) * 2002-06-24 2005-10-13 Mari Kurakami Electronic apparatus
US7364489B1 (en) * 2003-04-30 2008-04-29 Hasbro, Inc. Electromechanical toy
US7426873B1 (en) * 2006-05-04 2008-09-23 Sandia Corporation Micro electro-mechanical system (MEMS) pressure sensor for footwear

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5288069A (en) * 1992-11-20 1994-02-22 Susan Matsumoto Talking football
US6039628A (en) * 1993-09-02 2000-03-21 Kusmiss; John H. Self-mobile cat toy
CN2292274Y (en) * 1997-01-21 1998-09-23 河北工业大学 Robot contact sensor with filling current converter foam pad
AU7793601A (en) * 2000-08-04 2002-02-18 Mattel Inc Transformable toy figure having alternative sounds
CN2476316Y (en) * 2001-04-12 2002-02-13 梁建宏 Remote controlled multi-joint bionic machine fish
CN100583007C (en) * 2006-12-21 2010-01-20 财团法人工业技术研究院 Movable device with surface display information and interaction function

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5372511A (en) * 1992-01-13 1994-12-13 Tectron Manufacturing (Hk) Limited Educational toys
US6206745B1 (en) * 1997-05-19 2001-03-27 Creator Ltd. Programmable assembly toy
US6311350B1 (en) * 1999-08-12 2001-11-06 Ferber Technologies, L.L.C. Interactive fabric article
US6554094B1 (en) * 1999-12-17 2003-04-29 Meritor Heavy Vehicle Systems, Llc Method and system for independently electronically controlling steering of vehicle wheels
US20050225951A1 (en) * 2002-06-24 2005-10-13 Mari Kurakami Electronic apparatus
US7364489B1 (en) * 2003-04-30 2008-04-29 Hasbro, Inc. Electromechanical toy
US20050085157A1 (en) * 2003-08-20 2005-04-21 Kevin Dahlquist Robotic toy
US6892675B1 (en) * 2004-03-16 2005-05-17 Paul H. Comerford Cat toy
US7426873B1 (en) * 2006-05-04 2008-09-23 Sandia Corporation Micro electro-mechanical system (MEMS) pressure sensor for footwear

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7988522B2 (en) * 2007-10-19 2011-08-02 Hon Hai Precision Industry Co., Ltd. Electronic dinosaur toy
US20090104844A1 (en) * 2007-10-19 2009-04-23 Hon Hai Precision Industry Co., Ltd. Electronic dinosaur toys
US10163175B2 (en) * 2009-02-25 2018-12-25 Humana Inc. System and method for improving healthcare through social robotics
US10971256B2 (en) 2009-02-25 2021-04-06 Humana Inc. System and method for improving healthcare through social robotics
US20110021108A1 (en) * 2009-07-21 2011-01-27 Borei Corporation Method and system for interactive toys
US20140273722A1 (en) * 2013-03-15 2014-09-18 Mattel, Inc. Toy with an Illuminable Movable Portion
US10350505B2 (en) * 2013-03-15 2019-07-16 Mattel, Inc. Toy with an illuminable movable portion
US9108115B1 (en) 2014-08-25 2015-08-18 Silverlit Limited Toy responsive to blowing or sound
EP3000515A1 (en) * 2014-09-25 2016-03-30 Silverlit Limited A toy responsive to blowing or sound
CN105843406A (en) * 2016-06-01 2016-08-10 杨杰 Simulated epidermis with built-in tactile feedback
US20220118373A1 (en) * 2020-10-20 2022-04-21 Moose Creative Management Pty Limited Toy system
US20230008010A1 (en) * 2020-10-20 2023-01-12 Moose Creative Management Pty Limited Toy system
US11786833B2 (en) * 2020-10-20 2023-10-17 Moose Creative Management Pty Limited Toy system
US11786834B2 (en) * 2020-10-20 2023-10-17 Moose Creative Management Pty Limited Toy system
US20230201730A1 (en) * 2021-12-28 2023-06-29 Anthony Blackwell Speaking Doll Assembly

Also Published As

Publication number Publication date
CN101219280A (en) 2008-07-16
CN101219280B (en) 2011-08-24

Similar Documents

Publication Publication Date Title
US20080166945A1 (en) Lifelike covering and lifelike electronic apparatus with the covering
US6337552B1 (en) Robot apparatus
US20090153499A1 (en) Touch action recognition system and method
JP4609584B2 (en) Robot device, face recognition method, and face recognition device
US6708081B2 (en) Electronic equipment with an autonomous function
US20080147239A1 (en) Apparatus with Surface Information Displaying and Interaction Capability
US20120209433A1 (en) Social robot
KR20010095176A (en) Robot and action deciding method for robot
Kaura et al. Gesture controlled robot using image processing
WO2000068879A1 (en) Robot device, its control method, and recorded medium
JP2002239963A (en) Robot device and its action control method, program, and recording medium of robot device
US20220097230A1 (en) Robot control device, robot control method, and program
KR20010052699A (en) Robot, method of robot control, and program recording medium
US20190025931A1 (en) Techniques for Real Object and Hand Representation in Virtual Reality Content
US20230005481A1 (en) Information processor, information processing method, and program
US20200269421A1 (en) Information processing device, information processing method, and program
JP2024009862A (en) Information processing apparatus, information processing method, and program
Shidujaman et al. “roboquin”: A mannequin robot with natural humanoid movements
CN101224343B (en) Biology-like and parts controlling module thereof
WO2021005878A1 (en) Information processing device, information processing method, and information processing program
US11485021B2 (en) Robot, robot control method, and recording medium
US20230195401A1 (en) Information processing apparatus and information processing method
CN104460962A (en) 4D somatosensory interaction system based on game engine
JP2003271958A (en) Method and processor for processing image, program therefor, recording medium therefor, and robot system of type mounted with image processor
Mohith et al. Gesture and voice controlled robotic car using arduino

Legal Events

Date Code Title Description
AS Assignment

Owner name: ENSKY TECHNOLOGY CO., LTD., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHENG, HUA-DONG;CHIANG, TSU-LI;WANG, HAN-CHE;AND OTHERS;REEL/FRAME:020344/0805;SIGNING DATES FROM 20071128 TO 20071229

Owner name: ENSKY TECHNOLOGY (SHENZHEN) CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHENG, HUA-DONG;CHIANG, TSU-LI;WANG, HAN-CHE;AND OTHERS;REEL/FRAME:020344/0805;SIGNING DATES FROM 20071128 TO 20071229

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION