US20110021108A1 - Method and system for interactive toys - Google Patents

Method and system for interactive toys

Info

Publication number
US20110021108A1
Authority
US
United States
Prior art keywords
toy
doll
user
state
states
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/506,933
Inventor
Khanh M. Le
David M. Holmes
Paul P. Campbell
Ling Kun L. Cheng
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOREI CORP
Original Assignee
BOREI CORP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOREI CORP filed Critical BOREI CORP
Priority to US12/506,933
Assigned to BOREI CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CAMPBELL, PAUL P.; CHENG, LING KUN L.; HOLMES, DAVID M.; LE, KHANH M.
Publication of US20110021108A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 3/00 Dolls
    • A63H 3/28 Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 13/00 Toy figures with self-moving parts, with or without movement of the toy as a whole
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 3/00 Dolls
    • A63H 3/006 Dolls provided with electrical lighting
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 2200/00 Computerized interactive toys, e.g. dolls

Abstract

Toy design methods break down the desired behavior of an electro-mechanical toy into unique states represented with electronics and/or mechanical modeling. When nothing is happening, the toy is considered to be in a rest state, and it will remain in that rest state until some external event acts to trigger a state change in one or more of its parts. Any appropriate user, environmental, or sensory input can be defined as an event that acts as a trigger for the toy to react with some predefined behavior. Each of several physical toy states can be uniquely represented with an electronic circuit register. A multi-bit register status at any one particular instant directly represents the entire state the toy is in, and is quick and simple to inspect and act on. State changes triggered by input stimuli cause a change in the register bits reflecting the changing conditions of the toy.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to interactive toys, and in particular to models and control system architectures for defining, describing and coding control programs for interactive toy behaviors.
  • 2. Description of the Prior Art
  • Toys can be far more interesting to play with if they are able to interact with children and adults. People and animals can react in complex and myriad ways to compound stimuli. But basically, inputs are needed that are processed, and outputs deliver the response. In an interactive toy, the inputs can include touch sensors in the hands, feet, abdomen, and head of a doll or animal, temperature sensors, accelerometers, cameras, microphones, and voice recognition. The outputs can be speakers for speech synthesis, and motors and actuators for limb, mouth, eye, and head movements. Translating the inputs into appropriate outputs is complex, and is where the magic lies in making a toy fun and entertaining. True intelligence is not yet possible, but enough of a show can be put on to make a young child believe they are playing with a friend.
  • Mass-produced products like toys are highly sensitive to component costs, so practical devices and mechanisms for making a toy interactive and fun need to be very inexpensive to manufacture.
  • SUMMARY OF THE INVENTION
  • Briefly, an embodiment of the present invention comprises a modular control system architecture for interactive toys with multiple sensory functions. The architecture integrates the acceptance of user inputs from sensors to create an appropriate response. In a doll, sensors in the arms, legs, and elsewhere are activated when a user grabs, touches, or speaks, and the doll responds with different actions. For instance, a wrestling doll is able to sense and recognize the differences when it is subjected to a body slam, back flip, pile driver, etc. Different sensory inputs such as ambient light, pressure, temperature, touch, speech, movement, and position are detected with specific sensory hardware and software. Here, particular sensory functions are registered as electronic, software, and mechanical states and transitions. The complete behavior and operation of the toy can be designed by modeling the sequence of states and state transitions that are triggered by particular internal or external events. The designed interaction with users is broken down into a sequence of states. An input trigger causes a transition between states within a set of predefined events. A trigger can result from a user input, an environmental input, or a sensory input. The various inputs are collected and interpreted by electronic and mechanical devices. The desired behavior or interaction of the toy is describable by a combination of states and state transitions triggered by some event or events. The intended behavior is parsed into several states, inter-dependent or not, and into a set of actions or reactions that themselves can transition the toy from one state to the next. The states are defined according to the particular control system in use, the different sensory functions, and the actual electro-mechanical design. A software language is used to represent toy behavior as several states.
  • These and other objects and advantages of the present invention will no doubt become obvious to those of ordinary skill in the art after having read the following detailed description of the preferred embodiments that are illustrated in the various drawing figures.
  • IN THE DRAWINGS
  • FIG. 1 is a schematic diagram of an electronic control unit for automating a toy, in a toy control system embodiment of the present invention, and includes various user and environmental sensors, a microcomputer with an interactive-play program, and speaker and motor outputs;
  • FIG. 2 is a plan diagram for a flexible circuit layout which could be used to build the electronic control unit of FIG. 1, and shows that all the electronic devices and circuitry are disposed on a single flexible circuit substrate having elongations for the limbs that put touch sensors out in the extremities;
  • FIG. 3 is a more detailed plan view of a flexible circuit layout suggesting how the components of the electronic control unit of FIGS. 1 and 2 could be laid out for a doll embodiment of the present invention;
  • FIG. 4 is an exploded assembly view diagram showing how the flexible circuits of FIGS. 1-3 could be folded up and installed in the back torso of a doll embodiment of the present invention; and
  • FIG. 5 is a behavior tree diagram that plots how each user and environmental input of a wrestling doll is to be interpreted and used to trigger an output that convinces the user the wrestling doll has suffered and recognized a particular wrestling move or take-down the user has just then applied, and such modeling of states and state transitions is suggested here as being a useful design tool embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • An interactive toy changes its existing physical state or takes a certain action in the form of motion, light, speech, or sound upon receiving a user or environmental input. Sensory inputs that impinge on the toy include touch, light, sound, speech, motion, temperature etc. A modular control system architecture integrates input sensors embedded in various parts of a toy, and the toy concocts an interaction output for the user based on the character of the inputs received.
  • FIG. 1 represents a toy control system embodiment of the present invention, and is referred to herein by the general reference numeral 100. Toy control system 100 mounts inside a toy like a doll, plush animal, board game, or puzzle, and comprises a programmable microcomputer 102 with an executable program inside. Inputs to programmable microcomputer 102 include a microphone 103, an ambient light detector 104, a temperature sensor 106, touch sensors 108, and a pressure sensor 110; an interactive-play program is stored inside programmable microcomputer 102. A gyroscope or accelerometers 112 provide balance and orientation information so the toy can sense if it is right-side up, spinning, leaning, lying down, etc. A speaker 114 allows music, sound, and speech output from a voice synthesizer or sound generator, and a motor driver 116 is used to actuate motors in the limbs or head, for example. A motor feedback 118 provides information about position and loading. Programmable microcomputer 102 includes algorithms 120 that can be included in the interactive-play program.
  • Microphone 103, and all the other input and output devices, must be able to perform their functions well enough to engage a child in play. At the same time, such devices must be very inexpensive to manufacture in mass production. In some toys, microphone 103 may be used for limited speech and user recognition, or simply to distinguish when the child makes “happy” sounds. So a very simple microphone and algorithm 120 will do the job. The ambient light detector 104 can be simple enough to tell when room lighting or day light is present. Touch sensors 108 are capacitive sensor types implemented on flex circuit substrates, and provide information about what parts of the toy or game are being touched.
  • Co-pending patent application titled, PRESSURE AND TOUCH SENSORS ON FLEXIBLE SUBSTRATES FOR TOYS, by the present inventors, Ser. No. 12/______, filed Jul. 6______2009, is incorporated herein by reference. It more fully describes the construction and operation of capacitive touch and pressure sensors integrated with flexible circuit substrates.
  • In general, modular control system embodiments of the present invention include multiple sensory functions on a flex substrate. These include electronic circuit hardware, algorithms and operating software useful in toy building blocks, action figures, cars, trucks, etc. Sensory functions are associated with hardware and software, and sensory inputs are provided for light, temperature, touch, speech, movement, position, etc. In design, particular sensory functions are represented as modules with electronic, software, and mechanical descriptors. Embodiments of the present invention integrate a complete electronic system with sensors and power sources on one substrate.
  • FIG. 2 represents how a toy control system like that of FIG. 1 can be laid out on a single flex circuit 200 for a doll or plush animal with four limbs. A panel 202 is provided for a doll's head, a panel 204 is provided for the chest, and limb elongations 206, 208, 210, and 212 are provided, respectively, for the right arm, left arm, right leg, and left leg. These all share a single flex substrate to make manufacturing and installation within the toy simple and reliable. Sensor devices 214, 216, 218, and 220 are provided, respectively, for the right arm, left arm, right leg, and left leg, and produce signals indicating whether the respective limb is being touched by the user. An ambient light sensor 222 located in the doll's head detects if room light is present, or if the head has been covered, as in a wrestling sleeper hold. An acceleration sensor 224 in the head can provide measurements of movement, orientation, impacts, etc. A speaker 226 and audio output circuits are located in the head, e.g., at the mouth, and are used to produce speech, synthesized sounds, and music, sometimes with a personality or theme according to a program script included in algorithms 120 (FIG. 1). A microphone 228 and audio input circuits are also located in the head and provide an awareness of noise, speech, and other sounds. In a complex version, speech recognition is included to receive word commands, and to recognize and acknowledge a particular user. A microcomputer 230 is located on the chest panel 204 and provides for reading the sensory inputs and producing action outputs according to an executing program within that provides interactive play with the user. A chest pressure sensor 232 provides a signal indicating when the chest of the doll is being squeezed. A chest acceleration sensor 234 provides an indication that the whole doll is being rotated, flipped, spun, laid down, stood up, etc. A set of batteries 236 provides operating power.
  • FIG. 3 represents a flex circuit 300 that was used in a prototype of a toy doll embodiment of the present invention. Flex circuit 300 included right and left arm capacitive sensor circuits 304 and 305. These were on elongations of circuit panel 306, which also provided for a power on/off switch (not shown). Right and left leg capacitive sensor circuits 308 and 309 were constructed as elongations of a main circuit panel 310. This attached to an audio circuit panel 312 having connections for a speaker and microphone. A panel 314 provided for mounting support and attachment inside the toy doll. A stiffener was included on the back, and a protective encapsulating coating was applied over the whole.
  • FIG. 4 shows how a flex circuit and sensor electronics assembly 400 can be mounted in the back torso 402 of a toy doll. A capacitive sensor and supporting touch sensor integrated circuit devices for the arms and legs are provided on elongation pads 404-407. These, in turn are fitted near arm and leg sockets 408-411. A battery box 412 provides operating power to flex circuit and sensor electronics assembly 400. An on/off switch (not shown) in switch pocket 414 connects to power switch pads 416. A microphone and speaker (not shown) can be connected to pads provided on a circuit panel 418. A main circuit panel 420 fits to the back of battery box 412, and provides for accelerometers, temperature sensors, touch sensor integrated circuit devices, and a microcontroller unit (MCU).
  • The behavior of an electro-mechanical toy or apparatus and its characteristic interaction with a user can be designed by breaking each down into sequences of constituent states. The inputs are collected and interpreted by electronic and/or mechanical devices. Event triggers from the environment and/or the user are used in real-time to force transitions between these states.
  • Method embodiments of the present invention break down the desired behavior of an electro-mechanical toy or apparatus into unique states represented with electronics and/or mechanical modeling. A toy is always in one state or another at any given time. When nothing is happening, all the parts of the toy are considered to be in a rest state. The toy will remain in such a rest state until some external event acts to trigger a state change in one or more of the parts. Any appropriate user, environmental, or sensory input can be defined as an event that acts as a trigger for the toy to react with some predefined behavior. Each of several physical toy states can be uniquely represented with an electronic circuit register. A multi-bit register status at any one particular instant directly represents the entire state the toy is in, and is quick and simple to inspect and act on. State changes triggered by input stimuli cause a change in the register bits reflecting the changing conditions of the toy.
  • Similarly, toy states can be mechanically represented with mechanical components, e.g., in the form of body language. A toy's physical attributes, and the changes from one condition, such as the position of the head, limbs, tail, etc., to another, can be represented by a set of characteristics of the mechanical components and their changes: shapes, forms, junctions, connections, position, active, idle, etc. Hence, a toy's behavior in a particular state can be represented in terms of its mechanical behavior as well as with the states of an electronic circuit. A change in state may be seen as a change in the mechanical as well as the electronic representation of a toy's state.
  • A state diagram is a type of diagram used in computer science and related fields to describe the behavior of systems. See, http://en.wikipedia.org/wiki/State_diagram. State diagrams try to define a system as a finite number of states. Often this is indeed possible; at other times a state diagram can only serve as a reasonable abstraction.
  • There are many forms of conventional state diagrams; they differ slightly and use different semantics. For example, a classic form of state diagram for a finite state machine is a directed graph having states Q (a finite set of vertices normally represented by circles and labeled with unique designator symbols or words written inside them), input symbols Σ (a finite collection of input symbols or designators), and output symbols Z (a finite collection of output symbols or designators). An output function ω represents the mapping of input symbols into output symbols, denoted mathematically as ω:Σ×Q→Z.
  • Edges δ represent the “transitions” between two states as caused by the input, and are identified by their symbols drawn on the “edges”. An “edge” is usually drawn as an arrow directed from the present-state toward the next-state. This mapping describes the state transitions that are to occur on input of a particular symbol, mathematically δ:Σ×Q→Q. A start state q0∈Q is usually represented by an arrow with no origin pointing to the state. Sometimes the start state is not shown and is inferred. For accepting automata, an accepting state F∈Q is provided, and is usually drawn as a double circle. Sometimes accepting states function as final (halt, trap) states.
  • For a deterministic finite state machine (DFA), nondeterministic finite state machine (NFA), generalized nondeterministic finite state machine (GNFA), or Moore machine, the input is placed on each edge. For a Mealy machine, input and output are signified on each edge, separated with a slash “/”. A “1/0” shows a state change upon encountering the symbol “1”, causing the symbol “0” to be output. For a Moore machine, the state's output is usually written inside the state's circle, also separated from the state's designator with a slash “/”. There are also variants that combine these two notations. For example, if a state has a number of outputs (e.g., “a=motor counter-clockwise=1, b=caution light inactive=0”), then “q5/1,0” designates a state q5 with outputs a=1, b=0. Such a designator is written inside the state's circle. Harel statecharts and Unified Modeling Language (UML) state diagrams can also be usefully employed in embodiments of the present invention.
  • On receipt of user inputs by an electronic circuit or mechanical device in the toy, the toy moves to a known next state, which was engineered to create a designed and scripted experience with the toy. Each state has a unique identification pattern depending on the current state and the previous state. These states and the patterns are describable in software languages, such as C, C++, and others.
  • In embodiments of the present invention, each state is represented in a specific manner related to the hardware, e.g., the physical positioning of different parts of the toy's body, limbs, head, or other parts. From the user's perspective, a description of the complete behavior or operation scripts the full experience with the toy.
  • An Appendix is included herein as an example of a state machine written in C-code for a prototype wrestling doll that provided good results. After initialization, state transitions are included for left and right arms being grabbed, left and right legs being grabbed, combinations of these, helicopter moves, back-flips, body slams, pile driver moves, dog pile moves, sleeper moves, chest pressure, and touches to the lower parts of the arms and/or legs. Responses to these inputs are audio plays, head lights on, etc.
  • Each state registered by an input sensor can be labeled as a variable in a software language. A transition between states, or a change in the variable, occurs when an event causes the toy to change from one state to another. The event can thus be registered in run-time software and used to position the toy in its next desired state. Two types of registers can be used to keep track of states and events, state registers and event registers.
  • The state registers represent each state of the toy with a binary bit in a register word. The event registers describe input events. An event input register flags any input action as either “1” or “0”. After processing of the input events, the output register bits can be set accordingly. Setting the output register bits enables the responsive actions designed for the toy.
  • As an example, when a toy is initially turned on, it positions itself in a rest state, e.g., sitting at attention. The rest state is the starting point from which to originate any action. Each recognizable input event transitions the toy from the rest state to a next state. A reset command will initialize all the electronic circuits. If no input events have yet occurred, all of the input event register bits will be “0”. A new input event will set corresponding bits in the input event register, and the programming will see this and put the toy in the next state in a sequence. As the toy transitions from one state to the next, it writes corresponding bits in the output event register that drive the toy's electronic and electro-mechanical components to react. The toy appears to behave in real time as originally contrived by the toy designer. An advanced toy can receive commands even while transitioning between states.
  • A state register's bits can be used to represent a toy's various states with 1/0, where “0” is inactive and “1” is active. For the rest state, such state register bits would be all “0”, as in Table-I.
  • TABLE I

      register name   LA  RA  LL  RL  HA  CA  CP  HL
      register value   0   0   0   0   0   0   0   0

    where,
      LA  left arm touch sensor
      RA  right arm touch sensor
      LL  left leg touch sensor
      RL  right leg touch sensor
      HA  head accelerometer
      CA  chest accelerometer
      CP  chest pressure sensor
      HL  head light sensor
  • Table-II illustrates how such state register would change its bits in response to various exemplary inputs being triggered.
  • TABLE II

      register name   LA  RA  LL  RL  HA  CA  CP  HL
      register value   0   1   0   1   0   0   0   1

  • Table-III shows how the input triggers logged in the state register could be interpreted.
  • TABLE III

      LA  left arm touch sensor    0  no touch sensed
      RA  right arm touch sensor   1  touch to right arm sensed
      LL  left leg touch sensor    0  no touch sensed
      RL  right leg touch sensor   1  touch to right leg sensed
      HA  head accelerometer      0  no acceleration sensed
      CA  chest accelerometer     0  no acceleration sensed
      CP  chest pressure sensor   0  no pressure sensed
      HL  head light sensor       1  change in light intensity sensed
  • Table-IV shows how the toy states could be represented in the Perl software language.
  • TABLE IV

      #!/usr/bin/perl
      # Rest State representation of registers
      $la = 0; $ra = 0; $ll = 0; $rl = 0;
      $ha = 0; $ca = 0; $cp = 0; $hl = 0;

      # detecting if a sensor is activated
      if ($ra == 1) {
          print "Right arm touch sensor activated\n";
          # Go to the subroutine for right arm sensor activated
          &rightArmSensorActivated;
      } else {
          print "No right arm touch sensor detected\n";
      }
      if ($rl == 1) {
          print "Right leg touch sensor activated\n";
          # Go to the subroutine for right leg sensor activated
          &rightLegSensorActivated;
      } else {
          print "No right leg touch sensor detected\n";
      }
      if ($hl == 1) {
          print "Headlight sensor activated\n";
          # Go to the headlight sensor activated subroutine
          &headlightSensorActivated;
      } else {
          print "No headlight sensor detected\n";
      }

      sub rightArmSensorActivated {
          # perform the necessary actions after detecting right arm sensor
      }
      sub rightLegSensorActivated {
          # perform the necessary actions after detecting right leg sensor
      }
      sub headlightSensorActivated {
          # perform the necessary actions after detecting headlight sensor
      }
  • FIG. 5 represents how the states and transitions of a wrestling doll can be organized into a behavior tree 500. Such is a tool for an engineer and designer to build the electronics and computer programming necessary for an implementation of a wrestling doll embodiment of the present invention. The circuits and construction methods of FIGS. 1-4 would be useful for such. An input sensor complement 502 comprises capacitive touch sensors 504 in the arms and legs, accelerometers 506 in the head and chest, capacitive pressure sensors 508 in the chest, and an ambient light sensor 510 in the head. For example, these produce state transitions 511-514, respectively for the left arm, right arm, right leg, and left leg, if the corresponding state register is active, e.g., “1” rather than “0”. A second tier of state transitions 521-526 for the right arm, left arm, right leg, and left leg proceed if two or more limbs produce active sensor outputs. State transition 528 checks the states of the head and chest accelerometers 506 for wrestling forces being applied to the doll that would characterize so-called helicopter and backflip actions. For a helicopter action, state transition 530 will occur if the doll is in a spin. A state transition 532 sees that the limbs are all released, followed by a state transition 534 consistent with an impact of the doll on the ground. The doll would then produce a sound output recognizing to the user that it had suffered a helicopter throw-down.
  • If a back flip action, then state transition 536 will occur if the doll is flipped. A state transition 538 sees that the limbs are all released, followed by a state transition 540 consistent with an impact of the doll on the ground. The doll would then produce a sound output recognizing to the user that it had suffered a back-flip throw-down.
  • But, if a back flip action produced a state transition 544 immediately followed by a state transition 546 consistent with an impact of the doll on the ground, then a state transition 548 for chest pressure, then a state transition 550 for a release of chest pressure, and then a state transition 552 for a release of the limbs, then the doll should produce a sound output recognizing to the user that it had suffered a body slam throw-down.
  • A state transition 554 recognizes from the accelerometers 506 that a pile driver state 556 has occurred, and when that is followed by an accelerometer reading consistent with an impact of the doll on the ground, then the doll should produce a sound output recognizing to the user that it had suffered a pile-driver throw-down.
  • Sensor readings from the limbs are not involved in the doll recognizing dog pile and sleeper wrestling moves. A state transition 560 is triggered by head and chest accelerometers 506. If a prone state 562 exists, followed by a chest pressure state transition 564 and then a release of pressure state transition 566, then the doll should produce a sound output recognizing to the user that it had been dog-piled.
  • In a sleeper wrestling hold, a state transition 570 will be triggered when the ambient light sensor 510 indicates the doll's head is being covered. A state transition 572 is triggered by head and chest accelerometers 506 for a bent-neck state 574. If that is followed by a hold state 576, then the doll should produce a sound output recognizing to the user that it had been put to sleep, e.g., a snore.
  • The designs, circuits, and methods of FIGS. 1-5 can all be usefully employed in making a toy tiger that would be fun for a child to play with. Table-V represents exemplary mechanical and interactive play characteristics that can be associated with a toy tiger.
  • TABLE V

      Trigger                  Transition             State        Mechanical Characteristics
      active button pressed    returns to rest state  rest state   head straight
                               if not already there                limbs straight down
                                                                   body horizontal
      active button pressed,   motors run             lying down   head straight
      voice command                                                body lowering
                                                                   front legs forward
                                                                   legs spreading
                                                                   hind legs backward
                                                                   body horizontal
  • Table-VI summarizes the electronic characteristics of a toy tiger.
  • TABLE VI

      Trigger                  Transition             State        Electronic Characteristics
      active button pressed    returns to rest state  rest state   motor drivers off
                               if not already there                voice inputs active: waiting for next command
                                                                   voice outputs inactive
                                                                   processor in standby
                                                                   RF link active
      active button pressed,   motors run             lying down   motor drivers off
      voice command                                                positioning gyroscope running
                                                                   voice inputs active: waiting for next command
                                                                   voice outputs active: making “purring” sounds
                                                                   processor active
                                                                   RF link active
  • Although the present invention has been described in terms of the presently preferred embodiments, it is to be understood that the disclosure is not to be interpreted as limiting. Various alterations and modifications will no doubt become apparent to those skilled in the art after having read the above disclosure. Accordingly, it is intended that the appended claims be interpreted as covering all alterations and modifications as fall within the “true” spirit and scope of the invention.

Claims (19)

1. An interactive toy, comprising:
a toy with user inputs and environmental inputs, and having output devices for lights, speech, sounds, and limb movements;
a processor disposed in the toy and providing for changes in its existing physical state by producing motion, light, speech, or sound output on receiving predefined user and environmental inputs; and
a modular control system that integrates input sensors embedded in different parts of the body of the toy with the processor and output devices on a single flex circuit, detecting the inputs received by the sensors and making the toy interact with the user based on the nature of received inputs.
2. The interactive toy of claim 1, further comprising:
a sensory register with bits that correspond to said user inputs and environmental inputs, and machine readable by the processor.
3. The interactive toy of claim 1, further comprising:
an event register with bits that correspond to said output devices, and machine accessible by the processor.
4. The interactive toy of claim 1, further comprising:
a sensory register with bits that correspond to said user inputs and environmental inputs, and machine readable by the processor;
an event register with bits that correspond to said output devices, and machine accessible by the processor; and
a set of algorithms included in executable memory accessible by the processor and providing for a scripted translation of sensory register bit states to event register bit states.
5. A wrestling doll, comprising:
a doll body; and
a modular control system disposed within the doll body and including input sensors, a processor, an executable program, and output devices for interacting with a user, and all disposed on a single flexible circuit substrate with limb elongations;
wherein said input sensors are disposed in different locations of the doll body and can be activated by said user when a corresponding part of the doll body is grabbed, touched, or spoken to;
wherein said output devices are able to make sounds and move the doll body to imitate wrestling moves dependent on signals received from said input sensors.
6. The wrestling doll of claim 5, wherein:
said executable program recognizes wrestling moves that include pile driver, sleeper, helicopter, back-flip, and body slam holds and take-downs applied by a user to the doll body.
7. The wrestling doll of claim 5, further comprising:
a software program that, when compiled into said executable program, allows a designer to define how readings from said input sensors are to be combined and interpreted to produce scripted responses of the doll through the output devices.
8. The wrestling doll of claim 5, further comprising:
a set of touch sensors disposed in the arms and legs of the doll.
9. The wrestling doll of claim 5, further comprising:
at least one accelerometer disposed in the head and chest of the doll to measure orientations, movements, impacts, and spins applied to the doll.
10. A modular control system, comprising:
a plurality of sensory functions and power sources on a single flex substrate for installation inside building blocks, action figures, cars, trucks, and other toys, and providing for user interaction according to a sequence of automated computer commands embedded in a program that directs execution of a specific procedure coded by a script.
11. The modular control system of claim 10, further comprising:
user input sensors to detect touch and/or pressure.
12. The modular control system of claim 10, further comprising:
environmental input sensors to detect ambient light and/or acceleration.
13. The modular control system of claim 10, further comprising:
algorithms that affect said sequence of automated computer commands embedded in said program to direct the execution of procedures that appear to impart a personality to said building blocks, action figures, cars, trucks, and other toys.
14. The modular control system of claim 10, further comprising:
electronic, software, and mechanical descriptors for modeling particular sensory functions and modules.
15. A method for constructing an interactive behavior for an electro-mechanical toy with its user, comprising:
scrutinizing and partitioning an intended interactive behavior of a toy into a sequence of states inter-connected by transitions;
wherein signals generated by user and environmental input devices included within the toy trigger said transitions between said states according to scripts.
16. The method of claim 15, wherein:
said scripts are themselves comprised of a sequence of states.
17. The method of claim 15, further comprising:
collecting and interpreting input data obtained from said user and environmental input devices with electronic devices disposed within said toy.
18. The method of claim 15, further comprising:
parsing each state in said sequence of states, and any events leading to them, into a software language using a computer programming syntax, wherein a complete interactive behavior of said toy is semantically constructed in a software language to ultimately produce program code executable by a microcontroller physically disposed within said toy.
19. The method of claim 15, further comprising:
parsing each state in said sequence of states, and any events leading to them, into a software language using a computer programming syntax, wherein a complete interactive behavior of said toy is modeled electronically or mechanically.
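The method of claims 15 through 19 — partitioning a behavior into states inter-connected by transitions, with input signals selecting the transition — is naturally expressed as a finite-state transition table. The states, signals, and transitions below are made-up examples for a wrestling doll, not taken from the patent; only the table-driven structure illustrates the claimed method.

```c
/* Illustrative states and input signals (not from the patent). */
typedef enum { STATE_IDLE, STATE_TAUNT, STATE_PINNED, NUM_STATES } state_t;
typedef enum { SIG_TOUCH, SIG_SLAM, SIG_TIMEOUT, NUM_SIGNALS } signal_t;

/* Transition table: next_state[current][signal]. This table is the
 * "script" that inter-connects the states; each entry is one of the
 * transitions triggered by an input-device signal. */
static const state_t next_state[NUM_STATES][NUM_SIGNALS] = {
    /* current = IDLE:   touch -> taunt, slam -> pinned, timeout -> idle */
    { STATE_TAUNT, STATE_PINNED, STATE_IDLE },
    /* current = TAUNT:  touch -> taunt, slam -> pinned, timeout -> idle */
    { STATE_TAUNT, STATE_PINNED, STATE_IDLE },
    /* current = PINNED: stays pinned until a timeout resets it         */
    { STATE_PINNED, STATE_PINNED, STATE_IDLE },
};

/* Advance the behavior one step when a sensor signal arrives. */
state_t step(state_t s, signal_t sig)
{
    return next_state[s][sig];
}
```

Claim 18's "parsing each state into a software language" then amounts to emitting a table and step function like this as microcontroller code; claim 16's scripts-of-states would nest one such table inside a state of another.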
US12/506,933 2009-07-21 2009-07-21 Method and system for interactive toys Abandoned US20110021108A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/506,933 US20110021108A1 (en) 2009-07-21 2009-07-21 Method and system for interactive toys

Publications (1)

Publication Number Publication Date
US20110021108A1 true US20110021108A1 (en) 2011-01-27

Family

ID=43497732

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/506,933 Abandoned US20110021108A1 (en) 2009-07-21 2009-07-21 Method and system for interactive toys

Country Status (1)

Country Link
US (1) US20110021108A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5087219A (en) * 1991-03-15 1992-02-11 Hasbro, Inc. Action character figure
US6048209A (en) * 1998-05-26 2000-04-11 Bailey; William V. Doll simulating adaptive infant behavior
US6206745B1 (en) * 1997-05-19 2001-03-27 Creator Ltd. Programmable assembly toy
US6435937B1 (en) * 2001-01-22 2002-08-20 Phillip E. Naegele Toy figure with force measurement and audible messages
US6497607B1 (en) * 1998-12-15 2002-12-24 Hasbro, Inc. Interactive toy
US7364489B1 (en) * 2003-04-30 2008-04-29 Hasbro, Inc. Electromechanical toy
US20080166945A1 (en) * 2007-01-10 2008-07-10 Ensky Technology (Shenzhen) Co., Ltd. Lifelike covering and lifelike electronic apparatus with the covering

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Firebox, http://web.archive.org/web/20081217140445/http://www.firebox.com/product/2183/Extreme-Dog-Toys, 12-17-08, pages 1-4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110065354A1 (en) * 2009-09-11 2011-03-17 Andrew Wolfe Tactile input interaction
US8398451B2 (en) * 2009-09-11 2013-03-19 Empire Technology Development, Llc Tactile input interaction
US9406240B2 (en) * 2013-10-11 2016-08-02 Dynepic Inc. Interactive educational system
CN105749564A (en) * 2016-04-14 2016-07-13 中山市三讯电子有限公司 Toy type acousto-optic generator

Legal Events

Date Code Title Description
AS Assignment

Owner name: BOREI CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LE, KHANH M.;HOLMES, DAVID M.;CAMPBELL, PAUL P.;AND OTHERS;REEL/FRAME:022985/0718

Effective date: 20090717

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION