US20130231029A1 - Interactive Toy - Google Patents

Interactive Toy

Info

Publication number
US20130231029A1
Authority
US
United States
Prior art keywords
vehicle
controllers
toy
controller
orientation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/781,180
Inventor
Gregory Katz
Jang Rui Rim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US29/447,069 priority Critical patent/USD725200S1/en
Priority to US13/781,180 priority patent/US20130231029A1/en
Publication of US20130231029A1 publication Critical patent/US20130231029A1/en
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 30/00 Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H 30/02 Electrical arrangements
    • A63H 30/04 Electrical arrangements using wireless transmission

Definitions

  • The signals from the plurality of transmitters are received by the receiver 210 in the vehicle 200.
  • These signals are passed to a central control unit 220 in the vehicle 200, which processes the received signals, applies a function to them, such as vector addition, and breaks the result into three characteristics: forward/backward direction, left/right direction, and speed.
  • The three characteristics are transmitted to the circuitry controlling the forward/backward direction of the wheels 232, the circuitry controlling the turning direction (right/left) of the wheels 233, and the circuitry controlling the rotational speed of the wheels 234.
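The vector-addition step above can be sketched as follows. The decomposition into the three wheel-circuit characteristics is an illustrative assumption, as the patent does not specify the math; the function name and the frame convention (+x forward, +y left) are ours:

```python
import math

def control_signals(vx, vy):
    # Decompose a summed command vector, expressed in the vehicle's frame
    # (+x forward, +y left), into the three characteristics sent to the
    # wheel circuitry: forward/backward, turning left/right, and speed.
    speed = math.hypot(vx, vy)
    fwd = "forward" if vx >= 0 else "backward"
    if abs(vy) < 1e-9:
        turn = "straight"
    else:
        turn = "left" if vy > 0 else "right"
    return fwd, turn, speed
```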
  • Controllers are not confined to physical orientation as a means of driving the direction of the vehicle. Each controller may have a joystick, button pad, etc., that works in conjunction with commands from the other controllers to set the direction and pace of the vehicle's movement.
  • The vehicle itself may be a combination of two or three sub-vehicles. Each sub-vehicle could be controlled as its own body; however, when joined with the other vehicles, the connected bodies would function as a single unit controlled by multiple users.
  • “First,” “second,” “upper,” “lower,” “top,” “bottom,” “right,” “left,” etc. are used for illustrative purposes relative to other elements only and are not intended to limit the embodiments in any way.
  • “Plurality” as used herein is intended to indicate any number greater than one, either disjunctively or conjunctively as necessary, up to an infinite number.
  • “Joined” is intended to mean putting or bringing two elements together so as to form a unit, and any number of elements, devices, fasteners, etc. may be provided between the joined or connected elements unless otherwise specified by the use of the term “directly” and/or supported by the drawings.

Abstract

In an embodiment of the present invention, an interactive remote controlled toy (10) is provided. The toy (10) includes a vehicle (200) and a plurality of motion sensitive controllers (100) that cooperate to control the vehicle. The plurality of controllers (100) cooperate such that their relative orientation (110) determines the speed and direction in which the vehicle (200) moves.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefit of U.S. Provisional Patent Application No. 61/605,581, filed Mar. 1, 2012, the contents of which are incorporated herein by reference.
  • FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • N/A
  • TECHNICAL FIELD
  • The present invention generally relates to an interactive remote controlled toy used to assist individuals with disabilities.
  • BACKGROUND OF THE INVENTION
  • Autism is a neural social developmental disorder characterized by impaired interaction and communication. The invention is primarily intended to benefit autistic children, but can be used by any child. Users have to work together on verbal, physical, or non-verbal communicative levels. Specifically, users must be aware of each other's hand movements and must collectively work together to coordinate their movements. An autistic child would be able to learn via imitative play by observing peers or adults and their hand motions. The user also has to be very aware of his/her own movements. A child could greatly improve his/her spatial perception and motor planning as a result.
  • One of the ways in which the present invention could benefit a child with autism is with auditory processing, in that a child with autism has to listen to and process verbal directions from partner(s) in order to control the toy effectively. Another potential benefit is spatial orientation, in that the toy is controlled by sensitive motion-based controls. The child has to be conscious of his or her own hand movement for the toy to go forward, backward, left, or right in space. A third potential benefit is visual tracking. The child has to visually follow the path of the toy as it moves through space. A further benefit is motor planning. The child has to initiate, organize, and execute sequential movements in order to effectively maneuver the toy. Lastly, the child benefits from social interaction. A child with autism is required to be aware of his/her partner's actions while playing in a natural, unforced manner.
  • SUMMARY OF THE INVENTION
  • The present invention provides an interactive remote controlled toy. The toy includes a mobile robotic vehicle, and a plurality of motion sensitive controllers that generate data that is integrated to control the vehicle. The plurality of controllers cooperate such that their relative orientation determines the speed and direction of the vehicle.
  • In particular, an interactive remote controlled toy or vehicle includes a plurality of independently operative controllers with each controller having a control and a transmitter for sending at least one signal. A toy or vehicle has a receiver, control unit, and vehicle motion controller, the vehicle being responsive and moving by the collective signals received by the receiver from the controllers. The vehicle's response is both speed and direction of movement of the vehicle. In one embodiment, the control for each controller is an orientation sensor and the vehicle's response is the summation of all the signals received by all the controllers.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • To understand the present invention, it will now be described by way of example, with reference to the accompanying FIGURES in which:
  • FIG. 1 is a schematic diagram of the overall system of the present invention;
  • FIG. 2 is an isometric front view of the interactive controlled toy of an embodiment of the present invention;
  • FIG. 3 is a perspective front view of the toy of FIG. 2;
  • FIG. 4 is an isometric rear view of the toy of FIG. 2;
  • FIG. 5 is a perspective rear view of the toy of FIG. 2;
  • FIG. 6 is a top view of the controller of the toy of FIG. 2;
  • FIG. 7 is a side view of the controller of the toy of FIG. 2;
  • FIG. 8 is a front perspective view of the controller of the toy of FIG. 2;
  • FIG. 9A is an end view of the controller of the toy of FIG. 2;
  • FIG. 9B is an opposite end view of the controller of the toy of FIG. 2;
  • FIG. 10 is a bottom view of the toy of FIG. 2;
  • FIG. 11 shows some exemplary orientations of controllers; and,
  • FIG. 12 shows a schematic diagram of the circuitry within the vehicle.
  • DETAILED DESCRIPTION
  • While this invention is susceptible of embodiments in many different forms, there is shown in the drawings and will herein be described in detail preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated.
  • FIG. 1 shows a schematic diagram of the present invention. The system 10 includes a plurality of controllers 100 and a vehicle 200. There can be any number of controllers (101, 102, 103, 104, 105) in the system. While N controllers are shown, the preferred embodiment uses three (3) controllers (101, 102, 103). Each controller 100 includes within it at least two (2) components: a control 110 and a transmitter 120.
  • The control 110 of each controller 100 involves the recognition of position and/or movement of the controller, i.e. location and/or orientation in space, and linear and/or rotational velocity and/or acceleration, as well as other types of tactile input, such as the presence/absence of touch, activation of buttons, and/or the application of light/hard pressure, etc. Recognition of the position and/or movement of the controller can be achieved with standard off-the-shelf inertial measurement units (IMUs), which incorporate digital accelerometers and/or gyroscopes that measure linear acceleration and/or rotational velocity. Information from these sensors can be integrated over time in standard ways to determine the change in location or orientation over time. For example, the controller can detect that it is in discrete positions such as “forward” (pointed straight ahead), “neutral” (pointed upward), “back” (pointed backwards), “left” (pointed ahead while rolled to the left), and “right” (pointed ahead while rolled to the right). The controller might alternatively detect a continuous angular configuration in space (relative to gravity). The controller might alternatively detect that it is being moved, swung, or spun in a certain direction. Any of these inputs can be detected according to standard methods of using IMUs.
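As a concrete illustration of the discrete positions named above, a controller at rest could classify itself from the accelerometer's gravity reading alone. The axis convention, thresholds, and function name below are assumptions; and since yaw is not observable from gravity alone (one reason the patent also mentions compasses and cameras), "back" is approximated here as the nose tilted downward:

```python
import math

def classify_orientation(ax, ay, az):
    # Accelerometer reading in g at rest: an axis pointing straight up
    # reads +1.  Assumed body axes: +x out the nose, +y left, +z up.
    if ax > 0.7:                     # nose tilted up -> "neutral"
        return "neutral"
    if ax < -0.7:                    # nose tilted down -> treat as "back"
        return "back"
    # Roughly level, pointed ahead: use the roll angle about the nose axis.
    roll = math.degrees(math.atan2(-ay, az))  # sign depends on convention
    if roll > 30:
        return "left"
    if roll < -30:
        return "right"
    return "forward"
```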
  • The controllers may additionally or alternatively incorporate sensors that allow them to establish their position relative to a global reference frame, such as a digital compass (which would allow reference to the planet earth), a camera that tracks a beacon (e.g. LED or fiducial marker) on the vehicle (which would allow reference to the vehicle) or a camera that tracks a set of beacons (e.g. LEDs or fiducial markers) arrayed around the room (which would allow reference to the local environment). For example, the room may be encircled by a ring of infrared LEDs that each blink according to a different prescribed pattern; each camera-equipped controller can then determine the direction it is pointing in the room according to the patterns it is able to see at any moment. By incorporating this global reference information with the movement information from the IMU, the controller can accurately determine its global position and orientation.
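The LED-ring example above can be sketched as follows: assume each LED's bearing in the room is known in advance and the controller's camera reports the IDs it currently sees; a circular mean of the visible bearings then estimates the pointing direction. The eight-LED layout and all names are illustrative assumptions:

```python
import cmath
import math

# Illustrative layout: eight infrared LEDs around the room at known
# bearings (degrees), each blinking a unique ID pattern the camera can
# recognize.
LED_BEARINGS = {1: 0.0, 2: 45.0, 3: 90.0, 4: 135.0,
                5: 180.0, 6: 225.0, 7: 270.0, 8: 315.0}

def pointing_direction(visible_ids):
    # Sum unit vectors so the average handles wrap-around correctly
    # (e.g. 315 deg and 45 deg average to 0 deg, not 180 deg).
    total = sum(cmath.exp(1j * math.radians(LED_BEARINGS[i]))
                for i in visible_ids)
    return math.degrees(cmath.phase(total)) % 360.0
```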
  • The transmitter 120 of each controller takes the movement, button pushed upon, or orientation of that particular controller 100 and turns it into a transmitted signal. There are numerous standard means for doing this, such as Wi-Fi, Bluetooth, ZigBee, other radio frequency (RF) protocols, optical transmission such as Consumer Infrared (IR) or IrDA, etc. The information to be transmitted from each controller would be a stream of up to six numerical values (three linear axes and three rotational axes) encoding position/movement relative to the chosen reference frame (the user, the vehicle, or the local environment).
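A minimal sketch of such a controller-to-vehicle message follows. The wire format here, one controller-ID byte followed by six little-endian 32-bit floats, is purely an assumed example; the patent only says that up to six values are streamed:

```python
import struct

# Assumed packet layout: controller ID byte + six 32-bit floats
# (x, y, z, roll, pitch, yaw), little-endian.
PACKET_FMT = "<B6f"

def encode_packet(controller_id, x, y, z, roll, pitch, yaw):
    return struct.pack(PACKET_FMT, controller_id, x, y, z, roll, pitch, yaw)

def decode_packet(data):
    controller_id, *axes = struct.unpack(PACKET_FMT, data)
    return controller_id, axes
```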
  • The signals from the plurality of transmitters are collected or received by one or more receivers 210 associated with the vehicle 200, preferably within it. Those signals are passed to a central control unit 220 in the vehicle, processed into a processed signal, and passed, via wiring, to the vehicle motion controller 230, which specifically directs the vehicle to move in a particular direction and at a particular speed by controlling the vehicle's drive system, such as a turret and a single wheel or multiple wheels. It is of course recognized that the transmitting system 120 in the controllers 100 must be matched with the receiving system 210 in the vehicle 200.
  • In the preferred embodiment, the vehicle 200 has physical docking ports 240 for each of the controllers 100. The docking ports 240 provide a place to store the individual controllers 100, and, if desired, a place to charge the controllers when the vehicle is being charged.
  • The vehicle's power source 250 can be batteries or rechargeable batteries 260, which may be charged by regular A/C power through an A/C connection and converter 270.
  • The details of the controllers 100 and vehicle 200 are shown in more detail in FIGS. 2-9. In the preferred embodiment, the vehicle 200 of the interactive toy 10 of the present invention takes on a friendly, non-intimidating shape. Here, the vehicle has a front 205, rear 206, two sides 207, a top 208, and a bottom 209. The controllers 100 are stored in ports 240 in the top 208 of the vehicle 200, and the wheels/propulsion system is on the bottom 209. In the preferred embodiment, the controllers 100 are sensitive to motion, orientation, and possibly pressure and/or button input. The vehicle 200 is preferably electronically controlled and operates on rechargeable batteries. The controllers 100 likewise preferably operate on rechargeable batteries, which may be recharged by plugging them into the vehicle 200 or a separate charger (not shown). The controllers may each have two exposed electrical contacts for a connection to the vehicle's or charger's DC power supply, with corresponding contacts in the receptacle, or they may be inductively charged through the proximity of induction coils in both the controller and the vehicle/charger when the controller is placed in its receptacle.
  • In one embodiment, the vehicle is steered and propelled by two independently drivable wheels 215 with additional passive casters 216 or bearings for stability. In other embodiments, the drive wheels may rotate together while being steered relative to the chassis and additional wheels or casters; or the wheels may be omnidirectional and independently steerable; or the vehicle may locomote by other means, such as walking.
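For the two-wheel embodiment, a common way to drive such a differential pair is to mix a forward command with a turn command. The mixing law below is a standard differential-drive sketch, not taken from the patent:

```python
def wheel_speeds(forward, turn, max_speed=1.0):
    # Differential-drive mixing: positive `turn` speeds up the left wheel
    # and slows the right, steering the vehicle to the right.
    left = forward + turn
    right = forward - turn
    # If either wheel would exceed the limit, scale both together so the
    # ratio between them (and hence the turning curvature) is preserved.
    biggest = max(abs(left), abs(right))
    if biggest > max_speed:
        left *= max_speed / biggest
        right *= max_speed / biggest
    return left, right
```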
  • The vehicle 200 is controlled by users operating the controllers 100, and moves along a surface such as a floor in response to the relative orientation of the controllers 100. In an embodiment, each user points his/her controller 101, 102, 103 in a certain direction, and the vehicle 200 moves in response to the relative orientation of the controllers, or the collective sum or pool of the controllers' positions/orientations.
  • For example, the direction/orientation of the controllers can be considered relative to the user, relative to a fixed frame of reference in the world (e.g. the room), or relative to the moving frame of reference of the vehicle. Additionally, the direction of motion of the vehicle can be considered relative to the direction in which the controllers are collectively oriented or the summation of the orientations, or a location in the room toward which the controllers are pointed.
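As a small example of switching between those reference frames, a command bearing measured in the fixed room frame can be re-expressed in the vehicle's moving frame by subtracting the vehicle's current heading. The function name and degree conventions are our assumptions:

```python
def room_to_vehicle_frame(command_bearing, vehicle_heading):
    # Both angles in degrees; the result is wrapped into [-180, 180),
    # so 0 means "straight ahead of the vehicle".  Which sign means
    # left versus right depends on the chosen angular convention.
    return (command_bearing - vehicle_heading + 180.0) % 360.0 - 180.0
```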
  • So far the preferred embodiment of the system has been described where the controllers incorporate sensors (for recognizing their position and/or movement) and transmitters (for relaying this information to the vehicle), and in which the vehicle contains a receiver (for integrating this information into a control signal for directing its movement). In other embodiments, the sensors that measure the position/movement of the controllers may be placed external to the controllers, and the controllers themselves may be passive. For example, the vehicle may incorporate a camera that is able to see the controllers and infer their position and orientation through their color, shape, or emission of light. Alternatively, the room may be instrumented with cameras that can detect the controllers' position and orientation, as well as the vehicle's position, and an external computer system may control the movement of the vehicle according to its measurement of the controllers' position/movement. For example, the controllers may be equipped with spatially arrayed infrared LEDs that blink in different patterns, such that certain LEDs can only be viewed from certain angles; the cameras located on the vehicle or in the room may determine the controllers' position/movement according to the patterns that are visible at any moment.
  • The controllers, the vehicle, and/or an external computer system may communicate the position and/or movement information to each other in order to determine the appropriate behavior of the vehicle relative to the input from the controllers. This communication may take place over any of a standard set of technologies and protocols, including Wi-Fi, Bluetooth, ZigBee, other radio frequency (RF), optical, or sonic transmission.
  • In one embodiment, if there are two users and each has the controller pointed straight ahead (FIG. 11A), the vehicle 200 will move straight ahead. However, if one user points his controller 100 straight to the left (FIG. 11C), and the other user points his controller 100 straight ahead (FIG. 11A), the vehicle 200 will move at a forty-five degree angle (the intermediate direction). The more closely the controllers 101,102,103 are oriented in the same direction, the faster the vehicle 200 moves.
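The blending behavior described above can be sketched as summing the controllers' headings as unit vectors. This is only an illustrative sketch; the patent does not specify an algorithm, and the function names and the 0°-equals-straight-ahead convention below are assumptions:

```python
import math

def blend_headings(angles_deg):
    """Average controller headings by summing unit vectors.

    Returns (heading_deg, alignment), where alignment ranges from
    1.0 (all controllers point the same way) down toward 0.0 as
    the controllers diverge.
    """
    x = sum(math.cos(math.radians(a)) for a in angles_deg)
    y = sum(math.sin(math.radians(a)) for a in angles_deg)
    heading = math.degrees(math.atan2(y, x)) % 360
    alignment = math.hypot(x, y) / len(angles_deg)
    return heading, alignment

def vehicle_command(angles_deg, max_speed=1.0):
    """Speed scales with how closely the controllers agree."""
    heading, alignment = blend_headings(angles_deg)
    return heading, max_speed * alignment
```

With one controller at 0° and one at 90°, this yields the intermediate forty-five degree direction at reduced speed, matching the example above.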
  • In another embodiment, the users may point their controllers at different locations in the room (e.g. on the floor). The vehicle will move to an average position in between the two commanded positions. In order for the vehicle to move to a desired location, rather than an intermediate location, the two users must agree to point their controllers at the same location.
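The intermediate-location behavior reduces to a centroid computation. This minimal sketch assumes each controller's pointed-at floor location has already been resolved to an (x, y) coordinate:

```python
def target_position(points):
    """Return the centroid of the floor locations the controllers
    point at. The vehicle drives toward this averaged position, so
    it reaches a desired spot only when the users agree on it."""
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n)
```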
  • The controllers have a body 161 shaped for easy grasping by children and a size so as to prevent a choking hazard. The orientation sensors 110 and transmitters 120 are built within the body 161.
  • The vehicle 200 will not activate unless two or more controllers 100 are in use. The vehicle 200 also provides feedback features in which the user is rewarded for correct play and teamwork or informed of incorrect play. When users have their controllers 100 pointed in similar directions (FIGS. 11C and 11D), one can assume they are correctly “reading” verbal or non-verbal cues from each other and working together.
  • In a further embodiment, each controller 100 has a color ribbon 160 running across the top, or forming the top itself. When the users' controllers 100 are pointed in similar directions, the color ribbon 160 turns green via a light source (not shown). If the users' controllers 100 are pointed in opposite directions, for example, the ribbon 160 will turn red from a light source (not shown). Additionally, when the users' controllers 100 are pointed in similar directions, the vehicle 200 will actually go faster as a form of positive feedback for good performance. The ribbon can be executed in other embodiments. The controller may have a linear pattern of three lights. One light indicates if the controller is pointing in the same, or a similar, direction as the other controllers. Another light indicates if the controller's direction or orientation is offset at an angle to the right of the other controllers' direction/orientation. The third light indicates if the controller is pointing in a direction/orientation offset at an angle to the left of the other controllers' direction/orientation. Thus, the user is constantly informed about his or her participation in play and is also given incentives to communicate with the other user(s). In addition, the vehicle 200 is provided with eyes 180 or other indicia (FIG. 2). The eyes 180 give directional, visual feedback to the users indicating in which direction the vehicle 200 is traveling. They can, of course, include lights.
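One way to implement the ribbon and three-light feedback logic is to compare a controller's heading with the mean heading of the other controllers. This is a sketch under assumed conventions (headings in degrees, positive angles clockwise, illustrative thresholds); none of these specifics appear in the patent:

```python
def signed_diff(a_deg, b_deg):
    """Signed angular difference a - b, wrapped into (-180, 180]."""
    return (a_deg - b_deg + 180) % 360 - 180

def ribbon_color(my_heading, others_heading, tol=30.0):
    """Ribbon 160: green when aligned, red when roughly opposed."""
    d = abs(signed_diff(my_heading, others_heading))
    if d <= tol:
        return "green"
    if d >= 180 - tol:
        return "red"
    return "off"

def light_pattern(my_heading, others_heading, tol=15.0):
    """Three-light variant: which of the linear lights is lit."""
    d = signed_diff(my_heading, others_heading)
    if abs(d) <= tol:
        return "aligned"
    return "offset-right" if d > 0 else "offset-left"
```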
  • The movement of the vehicle 200 in response to the orientation of the controllers 101,102,103 is summarized in the following Table 1.
  • Specifically, as shown generally in FIG. 12, the signals from the plurality of transmitters are received by receivers 210 in the vehicle 200. In the embodiment shown, there are three receivers 211 and amplifiers 212. These signals are passed to a central control unit 220 in the vehicle 200, which processes the received signals, performs a function on them, such as vector addition, and breaks the signal into three characteristics: direction forward/backward, direction left/right, and speed. The three characteristics are transmitted to the circuitry controlling the direction forward/backward of the wheels 232, the circuitry controlling the direction turning right/turning left of the wheels 233, and the circuitry controlling the rotational speed of the wheels 234.
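The control-unit processing of FIG. 12 can be sketched as a vector addition followed by a split into the three wheel-circuit channels. The function name and the angle convention (0° is straight ahead, positive angles to the right) are assumptions for illustration:

```python
import math

def control_signals(angles_deg):
    """Vector-add the controller headings received by the vehicle and
    split the resultant into the three characteristics sent to the
    wheel circuitry: forward/backward, left/right, and speed."""
    x = sum(math.cos(math.radians(a)) for a in angles_deg)  # forward component
    y = sum(math.sin(math.radians(a)) for a in angles_deg)  # lateral component
    return {
        "forward_backward": "forward" if x >= 0 else "backward",
        "left_right": "straight" if abs(y) < 1e-9
                      else ("right" if y > 0 else "left"),
        "speed": math.hypot(x, y) / len(angles_deg),  # faster when controllers agree
    }
```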
  • It should be noted that the controllers are not confined to physical orientation as a means to drive the direction of the vehicle. Each controller may have a joystick, button pad, etc., that works in conjunction with commands from the other controller to orient the direction and pace of the vehicle's movement. The vehicle itself may be a combination of two or three sub-vehicles. Each sub-vehicle could be controlled as its own body; however, when joined with the other vehicles, the connected bodies would function as a single unit and be controlled by multiple users.
  • The terms “first,” “second,” “upper,” “lower,” “top,” “bottom,” “right,” “left,” etc. are used for illustrative purposes relative to other elements only and are not intended to limit the embodiments in any way. The term “plurality” as used herein is intended to indicate any number greater than one, either disjunctively or conjunctively as necessary, up to an infinite number. The terms “joined,” “attached,” and “connected” as used herein are intended to put or bring two elements together so as to form a unit, and any number of elements, devices, fasteners, etc. may be provided between the joined or connected elements unless otherwise specified by the use of the term “directly” and/or supported by the drawings.
  • While the specific embodiments have been illustrated and described, numerous modifications come to mind without significantly departing from the spirit of the invention, and the scope of protection is only limited by the scope of the accompanying Claims.

Claims (6)

What is claimed is:
1. An interactive remote controlled toy comprising:
a plurality of independently operative controllers, each controller having a control and a transmitter for sending at least one signal; and,
a vehicle having a receiver, control unit, and vehicle motion controller, the vehicle responsive and moving by the collective signals received by the receiver from the controllers.
2. The toy of claim 1 wherein the vehicle's response is both speed and direction of movement of the vehicle.
3. The toy of claim 1, wherein the control for each controller is an orientation sensor.
4. The toy of claim 1 wherein the vehicle's response is the summation of all the signals received by all the controllers.
5. The toy of claim 1 wherein the plurality of controllers cooperate such that their relative orientation determines the direction the vehicle moves.
6. The toy of claim 5 wherein the plurality of controllers cooperate such that their relative orientation determines the speed the vehicle moves.
US13/781,180 2012-03-01 2013-02-28 Interactive Toy Abandoned US20130231029A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US29/447,069 USD725200S1 (en) 2013-02-28 2013-02-28 Interactive toy
US13/781,180 US20130231029A1 (en) 2012-03-01 2013-02-28 Interactive Toy

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261605581P 2012-03-01 2012-03-01
US13/781,180 US20130231029A1 (en) 2012-03-01 2013-02-28 Interactive Toy

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US29/447,069 Continuation-In-Part USD725200S1 (en) 2013-02-28 2013-02-28 Interactive toy

Publications (1)

Publication Number Publication Date
US20130231029A1 2013-09-05

Family

ID=49043099

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/781,180 Abandoned US20130231029A1 (en) 2012-03-01 2013-02-28 Interactive Toy

Country Status (1)

Country Link
US (1) US20130231029A1 (en)

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020107591A1 (en) * 1997-05-19 2002-08-08 Oz Gabai "controllable toy system operative in conjunction with a household audio entertainment player"
US6083104A (en) * 1998-01-16 2000-07-04 Silverlit Toys (U.S.A.), Inc. Programmable toy with an independent game cartridge
US6247994B1 (en) * 1998-02-11 2001-06-19 Rokenbok Toy Company System and method for communicating with and controlling toy accessories
US6656012B1 (en) * 1998-02-11 2003-12-02 Rokenbok Toy Company System and method for communicating with and controlling toy accessories
US6645037B1 (en) * 1998-08-24 2003-11-11 Silverlit Toy Manufactory Ltd. Programmable toy and game
US20010045978A1 (en) * 2000-04-12 2001-11-29 Mcconnell Daniel L. Portable personal wireless interactive video device and method of using the same
US8014897B2 (en) * 2000-10-06 2011-09-06 Innovation First, Inc. System, apparatus, and method for managing and controlling robot competitions
US20040147202A1 (en) * 2001-03-29 2004-07-29 Tord Brabrand Remote control system
US7147535B2 (en) * 2002-06-11 2006-12-12 Janick Simeray Optical remote controller pointing the place to reach
US7833080B2 (en) * 2002-10-22 2010-11-16 Winkler International, Sa Control system and method for electric toy vehicles
US20040082268A1 (en) * 2002-10-23 2004-04-29 Kevin Choi Toy with programmable remote control
US20080290598A1 (en) * 2002-10-31 2008-11-27 Mattel, Inc. Remote controlled toy vehicle, toy vehicle control system and game using remote controlled toy vehicle
US7905761B2 (en) * 2002-10-31 2011-03-15 Mattel, Inc. Remote controlled toy vehicle, toy vehicle control system and game using remote controlled toy vehicle
US7137862B2 (en) * 2003-08-25 2006-11-21 Arnold L Taylor System and method for controlling multiple model vehicles
US20050048870A1 (en) * 2003-08-25 2005-03-03 Arnold L. Taylor System and method for controlling multiple model vehicles
US7905759B1 (en) * 2003-10-07 2011-03-15 Ghaly Nabil N Interactive play set
US7753756B2 (en) * 2004-10-07 2010-07-13 Mt Remote Systems, Llc Radio controlled system and method of remote location motion emulation and mimicry
US7221113B1 (en) * 2004-11-10 2007-05-22 The Creative Train Company, Llc Touch-sensitive model train controls
US7731588B2 (en) * 2005-09-28 2010-06-08 The United States Of America As Represented By The Secretary Of The Navy Remote vehicle control system
US20130072083A1 (en) * 2006-08-02 2013-03-21 Nabil N. Ghaly Interactive play set
US20090267897A1 (en) * 2008-04-23 2009-10-29 Smk Corporation Remote control transmitter
US20100210169A1 (en) * 2009-02-04 2010-08-19 Ulrich Röhr Model Helicopter Control and Receiving Means
US20100315262A1 (en) * 2009-06-15 2010-12-16 David Coombs Multiple User Controlled Object
US8279050B2 (en) * 2009-06-15 2012-10-02 David Coombs Multiple user controlled object

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150165336A1 (en) * 2013-12-12 2015-06-18 Beatbots, LLC Robot
US9358475B2 (en) * 2013-12-12 2016-06-07 Beatbots, LLC Robot
US10094669B2 (en) * 2015-10-29 2018-10-09 Horizon Hobby, LLC Systems and methods for inertially-instituted binding of a RC vehicle
US20180364049A1 (en) * 2015-10-29 2018-12-20 Horizon Hobby, LLC Systems and methods for inertially-instituted binding of a rc vehicle
US10578439B2 (en) * 2015-10-29 2020-03-03 Horizon Hobby, LLC Systems and methods for inertially-instituted binding of a RC vehicle
US20170232358A1 (en) * 2016-02-11 2017-08-17 Disney Enterprises, Inc. Storytelling environment: mapping virtual settings to physical locations
US10369487B2 (en) * 2016-02-11 2019-08-06 Disney Enterprises, Inc. Storytelling environment: mapping virtual settings to physical locations
CN107181818A (en) * 2017-06-27 2017-09-19 华南师范大学 Robot remote control and management system and method based on cloud platform


Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION