US20070173171A1 - Reflected light controlled vehicle - Google Patents

Reflected light controlled vehicle

Info

Publication number
US20070173171A1
Authority
US
United States
Prior art keywords
toy vehicle
sensor
tracking
tracking signal
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/339,572
Inventor
Gyora Mihaly Pal Benedek
Shai Seger
Yehiel Avraham Olti
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/339,572
Publication of US20070173171A1
Legal status: Abandoned (current)

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H30/00: Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02: Electrical arrangements
    • A63H30/04: Electrical arrangements using wireless transmission
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H17/00: Toy vehicles, e.g. with self-drive; Cranes, winches or the like; Accessories therefor
    • A63H17/26: Details; Accessories
    • A63H17/36: Steering-mechanisms for toy vehicles
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H33/00: Other toys
    • A63H33/22: Optical, colour, or shadow toys


Abstract

The remotely controlled toy vehicle of the present invention is configured to respond to a tracking signal of a narrow beam of non-visible light that is projected onto the surface of the ground in proximity of the toy vehicle. The sensors mounted on the toy vehicle are configured to receive, and respond to, the light energy of the beam that is reflected off the surface of the ground. The controller, which is the source of the tracking signal, is a handheld component configured to project the beam of non-visible light to a location as desired by the user. A beam of visible light is also projected as an indicator of the location of the tracking spot. Operation of the toy vehicle is controlled by moving the tracking spot. Preferably, the control circuitry is configured such that the toy vehicle follows the tracking spot. Alternatively, the remotely controlled toy vehicle of the present invention is configured to respond to an auditory tracking signal.

Description

    FIELD AND BACKGROUND OF THE INVENTION
  • The present invention relates to remote controlled toy vehicles and, in particular, it concerns a remotely controlled toy vehicle that is responsive to a reflected tracking signal.
  • Remotely controlled toy vehicles that are responsive to a tracking signal are known in the art. One such device is disclosed in U.S. Pat. No. 6,780,077 to Baumgartener et al. The Baumgartener et al. toy includes a conventionally remotely controlled master toy vehicle with a transmitter configured to broadcast an IR tracking signal, and a slave toy vehicle with at least two directional IR receivers configured to receive a direct signal from the master toy vehicle in order to follow or evade the master toy vehicle.
  • Remotely controlling a toy vehicle with a directly received tracking signal drastically limits the range of movement of the toy vehicle, since the source of the tracking signal must be moved from one location to another in order to direct the movement of the tracking toy vehicle.
  • There is therefore a need for a remotely controlled toy vehicle that is responsive to a reflected tracking signal.
  • SUMMARY OF THE INVENTION
  • The present invention is a remotely controlled toy vehicle that is responsive to a reflected tracking signal.
  • According to the teachings of the present invention there is provided, a method for remotely guiding a toy vehicle, the method comprising: a) providing a first light source configured to project at least a first narrow beam tracking signal; b) projecting said at least a first tracking signal so as to provide a tracking spot on a surface; c) providing a motorized toy vehicle configured with at least one sensor responsive to a position of said at least a first tracking spot; wherein output from said sensor affects at least one operational feature of said motorized toy vehicle, and said operational feature is effective to change a location of the toy vehicle; and d) altering said position of said at least a first tracking spot so as to alter said at least one operational feature and thereby guide the toy vehicle to a change of location.
  • According to a further teaching of the present invention, said first tracking signal is implemented as a beam of light within the spectrum of non-visible light.
  • According to a further teaching of the present invention, said first tracking signal is implemented as infrared light.
  • According to a further teaching of the present invention, there is also provided at least a second light source providing a beam of visible light, configured to indicate said position of said tracking spot.
  • According to a further teaching of the present invention, said light source is implemented in a handheld component operated by a user.
  • According to a further teaching of the present invention, said light source is implemented so as to be wearable by a user.
  • According to a further teaching of the present invention, said tracking spot is projected onto said surface in proximity to said motorized toy vehicle.
  • According to a further teaching of the present invention, there is also provided a control circuit in electronic communication with said at least one sensor, said control circuit configured to receive output from said at least one sensor and to control said at least one operational feature.
  • According to a further teaching of the present invention, said at least one sensor is implemented as a plurality of sensors and said at least one operational feature is implemented as a plurality of operational features.
  • According to a further teaching of the present invention, said plurality of operational features are implemented so as to include at least locomotion and directional steering of said motorized toy vehicle and said control circuit is in electronic communication with at least a drive motor and a steering mechanism.
  • There is also provided according to the teachings of the present invention, a light guided toy vehicle comprising: a) a first light source configured to project at least a first narrow beam tracking signal so as to provide a tracking spot on a surface; and b) a motorized toy vehicle configured with at least one sensor responsive to a position of said at least a first tracking spot; wherein output from said sensor affects at least one operational feature of said motorized toy vehicle, and said operational feature is effective to change a location of the toy vehicle.
  • According to a further teaching of the present invention, said first tracking signal is configured as a beam of light within the spectrum of non-visible light.
  • According to a further teaching of the present invention, said first tracking signal is configured as infrared light.
  • According to a further teaching of the present invention, there is also provided, at least a second light source configured to project a beam of visible light, so as to indicate said position of said tracking spot.
  • According to a further teaching of the present invention, said light source is configured in a handheld component operated by a user.
  • According to a further teaching of the present invention, said at least one sensor is configured to respond to said tracking spot when said tracking spot is projected onto said surface in proximity to said motorized toy vehicle.
  • According to a further teaching of the present invention, there is also provided, a control circuit in electronic communication with said at least one sensor, said control circuit configured to receive output from said at least one sensor and control said at least one operational feature.
  • According to a further teaching of the present invention, said at least one sensor is configured as a plurality of sensors and said at least one operational feature is configured as a plurality of operational features.
  • According to a further teaching of the present invention, said plurality of operational features include locomotion and directional steering of said motorized toy vehicle and said control circuit is in electronic communication with at least a drive motor and a steering mechanism.
  • There is also provided according to the teachings of the present invention, a method for remotely guiding a toy vehicle, the method comprising: a) providing a source of at least a first auditory tracking signal configured to emit said at least a first auditory tracking signal; b) emitting said at least a first auditory tracking signal; c) providing a motorized toy vehicle configured with at least one sensor responsive to a position of said at least a first auditory tracking signal; wherein output from said sensor affects at least one operational feature of said motorized toy vehicle, and said operational feature is effective to change a location of the toy vehicle; and d) altering said position of said at least a first auditory tracking signal so as to alter said at least one operational feature and thereby guide the toy vehicle to a change of location.
  • According to a further teaching of the present invention, said source of said at least a first auditory tracking signal is implemented as an electronic device.
  • According to a further teaching of the present invention, said at least a first auditory tracking signal is implemented so as to be at a pre-determined frequency and said at least one sensor is configured to respond substantially solely to said pre-determined frequency.
  • According to a further teaching of the present invention, said at least one sensor is implemented so as to respond to a loudest auditory signal received.
  • According to a further teaching of the present invention, said at least a first auditory tracking signal is implemented as at least a first voice command.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is herein described, by way of example only, with reference to the accompanying drawings, wherein:
  • FIG. 1 is a schematic top elevation of a first preferred embodiment of toy vehicle constructed and operative according to the teachings of the present invention, shown here with the body covering removed;
  • FIG. 2 is a schematic block drawing of the primary elements of a preferred embodiment of a handheld controller constructed and operative according to the teachings of the present invention;
  • FIG. 3A is a schematic of a single sensor control scheme according to the teachings of the present invention;
  • FIG. 3B is a schematic of a dual sensor control scheme according to the teachings of the present invention;
  • FIG. 3C is a schematic of a triple sensor control scheme according to the teachings of the present invention;
  • FIG. 4 is a photograph of a prototype of a second preferred embodiment of a vehicle constructed and operative according to the teachings of the present invention, shown here with schematic representations of the fields of view of each of the sensors; and
  • FIG. 5 is an exploded view of a preferred embodiment of a handheld controller constructed and operative according to the teachings of the present invention.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present invention is a remotely controlled toy vehicle that is responsive to a reflected tracking signal.
  • The principles and operation of a remotely controlled toy vehicle that is responsive to a reflected tracking signal according to the present invention may be better understood with reference to the drawings and the accompanying description.
  • By way of introduction, the remotely controlled toy vehicle of the present invention is configured to respond to a reflected tracking signal. Preferably, a tracking signal of a narrow beam of non-visible light is projected onto the surface of the ground, creating a tracking spot in proximity to the toy vehicle. The sensors mounted on the toy vehicle are configured to receive, and respond to, the reflected light energy of the beam emanating from the tracking spot. The output from the sensors is used to control the operational features of the toy vehicle such as, but not limited to, locomotion and directional steering. Optionally, other sensors may be deployed on the toy vehicle that are configured to effect other operational features such as, but not limited to, turning lights and sound effects on and off, opening doors, and firing “weapons”.
  • In a preferred embodiment of the present invention, the controller, which is the source of the tracking signal, is a handheld component configured to project the beam of non-visible light to a location as desired by the user. Optionally, the controller also projects a beam of visible light along a path substantially parallel to the beam of non-visible light as an indicator of the location of the tracking spot. Optionally, the controller may be configured so as to enable it to be worn, such as, but not limited to, clipped on a belt, sticking out from a pocket, tucked in a head or hat band, or attached to a wrist or ankle band, in such a way that a tracking spot is projected onto the ground in order to provide hands-free control of the toy vehicle. Alternatively, a separate wearable controller may be provided with the toy.
  • Operation of the toy vehicle is controlled by moving the tracking spot. Preferably, the control circuitry is configured such that the toy vehicle follows the tracking spot, as will be discussed in more detail with regard to the Figures. Optionally, the control circuitry may be configured such that the toy vehicle evades the tracking spot.
  • Referring now to the drawings, it should be noted that directional terms such as left, right, forward and reverse are used with regard to the drawings being discussed and are not intended as limitations to the principles of the present invention.
  • FIG. 1 illustrates the chassis 2 of a preferred embodiment of the toy vehicle of the present invention. In this embodiment, the operational feature of locomotion is affected by the rear drive wheels 4, which are driven by the drive motor 6. The operational feature of directional steering is affected by the steerable front wheels 8 by means of the steering motor 20 and associated steering mechanism, as in conventional remote control vehicles. It is within the scope of the present invention, however, to provide an embodiment implementing a two drive-motor, skid steering, mode of locomotion and directional steering.
  • The front 10 and rear 12 sensor arrangements of this embodiment include three sensors each. This sensor arrangement will be discussed in greater detail with regard to FIG. 3C. The sensors are configured such that their receptive field of view is limited to the ground surface in proximity of the toy vehicle as will be discussed with regard to FIGS. 3A-3C.
  • The control circuit 16 is in electronic communication with the sensors 10 and 12, the drive motor 6 and the steering motor 20. Output signals from the sensors are received by the control circuit 16, which in turn controls the operational features of the toy vehicle such as, but not limited to, locomotion and steering by operating the drive motor 6 and the steering motor 20.
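  • For illustration only, the role of control circuit 16 can be sketched in software as a polling loop that reads the sensor outputs and applies a scheme-specific decision rule; two such rules are sketched after FIGS. 3B and 3C below. The function and parameter names here are hypothetical placeholders, not part of the disclosed hardware.

```python
# Illustrative sketch only: control circuit 16 viewed as a polling loop.
# read_sensors, decide and apply are hypothetical hooks standing in for the
# reflected-light detectors, the scheme-specific decision rule and the
# drive/steering motor drivers.
import time
from typing import Callable, Sequence, Tuple

Command = Tuple[int, int]  # (direction: -1/0/+1, steer: -1/0/+1)


def control_loop(read_sensors: Callable[[], Sequence[bool]],
                 decide: Callable[[Sequence[bool]], Command],
                 apply: Callable[[Command], None],
                 period_s: float = 0.02) -> None:
    """Poll the sensors, derive a command, and drive the motors with it."""
    while True:
        apply(decide(read_sensors()))
        time.sleep(period_s)  # ~50 Hz polling, an arbitrary illustrative rate
```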
  • FIG. 2 illustrates the basic components of the handheld controller 30 of the present invention, which includes a battery 32, an on/off switch 34, a circuit board 36, a source of non-visible light 38, preferably a high intensity infrared LED, a source of visible light 40, preferably a high intensity LED or laser, and lenses 42. The exploded view of FIG. 5 illustrates these components deployed in an exemplary handheld case composed of case segments 44 a, 44 b and 44 c. It should be noted that the controller case may be of substantially any design such as, but not limited to, being similar to a handgun, laser pointer or pen light. It will be appreciated that the visible light source 40 and the non-visible light source 38 may be activated independently of each other. That is, each may have its own on/off switch.
  • FIGS. 3A-3C illustrate various sensor arrangements and schemes for vehicle operation for each arrangement. Each of the sensors is deployed in an enclosure that limits its field of sensing “view”, that is, the area within which stimuli are detected. It will be appreciated that the area of the field of view of the sensor varies with the requirements of each vehicle and the operational scheme chosen. It should be noted that providing sensors with an adjustable field of view is within the scope of the present invention. It will be understood that providing sensor enclosures that also allow the sensors to detect the signal emitted by the controller when the controller is pointed directly at the vehicle is within the scope of the present invention. Such an arrangement provides the option of carrying or wearing a controller in such a manner that the vehicle will follow the user. As mentioned above, this following option may also be achieved by wearing a controller configured to project a tracking spot on the ground near the user as the user walks along.
  • FIG. 3A illustrates a single sensor arrangement. An example of an operational scheme for such an arrangement is simple forward locomotion. When no tracking spot is detected by the sensor 50, the drive motor is off. When the sensor 50 detects a tracking spot within region 52, the drive motor is activated and the vehicle will move straight forward.
  • FIG. 3B illustrates a dual sensor arrangement. Such an arrangement affords both locomotion and directional steering. When no tracking spot is detected by the sensors 60 and 62, the drive motor is off. When both sensors 60 and 62 detect a tracking spot, i.e. the tracking spot is projected into region 68, only the drive motor is activated. When a tracking spot is detected by only one of sensors 60 and 62, i.e. the tracking spot is projected into either region 64 or 66, both the drive motor and the steering motor are activated. Detection of the tracking spot by sensor 60 in region 64 will cause the vehicle to move forward while turning to the left. Similarly, detection of the tracking spot by sensor 62 in region 66 will cause the vehicle to move forward while turning to the right.
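  • For illustration only, the dual sensor scheme of FIG. 3B reduces to a small decision table. The sketch below uses hypothetical names (not part of the disclosure) and returns a (direction, steer) command of the kind consumed by a control loop such as the one sketched above.

```python
def dual_sensor_decide(hits):
    """Decision rule for FIG. 3B: hits = [sensor 60 detects, sensor 62 detects].

    Returns (direction, steer) with direction +1 forward / 0 stopped and
    steer -1 left / 0 straight / +1 right. Illustrative sketch only.
    """
    left, right = hits
    if left and right:        # tracking spot in overlap region 68
        return (+1, 0)        # drive motor only: straight ahead
    if left:                  # spot in region 64
        return (+1, -1)       # forward while turning left
    if right:                 # spot in region 66
        return (+1, +1)       # forward while turning right
    return (0, 0)             # no spot detected: drive motor off
```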
  • The triple sensor arrangement illustrated in FIG. 3C also affords both locomotion and directional steering. When no tracking spot is detected by the sensors 70, 72 and 74, the drive motor is off. When sensor 72 detects a tracking spot, i.e. the tracking spot is projected into region 78, only the drive motor is activated. When a tracking spot is detected by sensor 70, i.e. the tracking spot is projected into region 76, both the drive motor and the steering motor are activated, causing the vehicle to move forward while turning to the left. Similarly, detection of the tracking spot by sensor 74 in region 80 will cause the vehicle to move forward while turning to the right. Therefore, the toy vehicle of FIG. 1, which is configured with triple sensor arrangements on both the front 10 and rear 12, is afforded full six-direction operation: forward straight, forward left and forward right, and reverse straight, reverse left and reverse right.
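  • Again for illustration only, the six-direction operation described above might be expressed as follows, assuming front and rear triple sensor arrays ordered [left, centre, right] and assuming, as a design choice not stated in the disclosure, that the front array takes priority when both arrays detect a spot.

```python
def six_direction_decide(front_hits, rear_hits):
    """Decision rule for the FIG. 1 vehicle with triple arrays front and rear.

    front_hits / rear_hits are [left, centre, right] booleans; the front
    array corresponds to sensors 70, 72 and 74 of FIG. 3C. Returns
    (direction, steer) as in the sketches above. Illustrative only.
    """
    for direction, (left, centre, right) in ((+1, front_hits), (-1, rear_hits)):
        if centre:
            return (direction, 0)      # straight (forward or reverse)
        if left:
            return (direction, -1)     # turning left
        if right:
            return (direction, +1)     # turning right
    return (0, 0)                      # no spot anywhere: drive motor off
```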
  • The second preferred embodiment 90 of the vehicle of the present invention illustrated in FIG. 4 is configured with a triple sensor arrangement in the front and a single sensor arrangement in the rear. Therefore, it is afforded four-direction operation, forward straight, forward left and forward right, and reverse straight. The sensors and regions are numbered according to the corresponding sensor schemes listed above with regard to FIGS. 3A and 3C.
  • A variant embodiment of the present invention includes the use of sound detection sensors rather than light sensors. In such an embodiment, sensor arrangements and schemes for vehicle operation substantially as described above with regard to FIGS. 3A-3C are provided in order to track the source of an auditory signal. The auditory signal may be provided by a controller device configured to emit an auditory tracking signal of a pre-determined frequency, wherein the sensors are configured to respond substantially only to signals at the pre-determined frequency. The frequency need not be within the normal human auditory range. Such a controller may be, for example, an electronic device, a whistle, or substantially any other sound producing device. Alternatively, the sensors may be configured to respond to the loudest auditory signal received such as, but not limited to, a human voice, clapping hands, and a ringing bell.
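  • For illustration only, the frequency-selective response described above could be approximated in software with a Goertzel filter tuned to the controller's pre-determined tone; the sample rate, tone frequency and threshold below are arbitrary illustrative values, not values from the disclosure.

```python
import math


def goertzel_power(samples, sample_rate: float, target_hz: float) -> float:
    """Signal power of `samples` at `target_hz` (standard Goertzel algorithm)."""
    n = len(samples)
    k = int(0.5 + n * target_hz / sample_rate)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2


def hears_tracking_tone(samples, sample_rate=16000.0, tone_hz=4000.0,
                        threshold=1.0e6) -> bool:
    """True when the microphone samples contain the pre-determined tone.

    tone_hz and threshold are illustrative values; a real toy would
    calibrate them to its controller, microphone and gain stage.
    """
    return goertzel_power(samples, sample_rate, tone_hz) > threshold
```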
  • Yet another variant embodiment of the auditory-responsive vehicle of the present invention is configured to respond to voice commands, which may include, for example, calling the toy by name. This may be implemented in conjunction with the sensor arrangements described with regard to FIGS. 3A-3C, such that the vehicle tracks the position of the source of the voice. Alternatively, the vehicle may be configured to respond to voice commands such as, but not limited to, go, turn right, go straight, turn left, stop, and back up. It should be noted that the inclusion of voice recognition software in such an embodiment is within the scope of the present invention.
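  • For illustration only, a command-style variant might map recognized utterances to (direction, steer) commands roughly as sketched below; the vocabulary, and the assumption that some voice-recognition component returns one lower-case phrase per utterance, are hypothetical rather than part of the disclosure.

```python
# Illustrative mapping from recognized voice commands to (direction, steer)
# commands of the kind used in the sketches above. The recognizer itself is
# assumed to exist and to return one lower-case phrase per utterance.
COMMANDS = {
    "go":          (+1, 0),
    "go straight": (+1, 0),
    "turn left":   (+1, -1),
    "turn right":  (+1, +1),
    "back up":     (-1, 0),
    "stop":        (0, 0),
}


def handle_utterance(phrase: str, default=(0, 0)):
    """Return the command for a recognized phrase, or `default` if unknown."""
    return COMMANDS.get(phrase.strip().lower(), default)
```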
  • It will be appreciated that the above descriptions are intended only to serve as examples and that many other embodiments are possible within the spirit and the scope of the present invention.

Claims (24)

1. A method for remotely guiding a toy vehicle, the method comprising:
(a) providing a first light source configured to project at least a first narrow beam tracking signal;
(b) projecting said at least a first tracking signal so as to provide a tracking spot on a surface;
(c) providing a motorized toy vehicle configured with at least one sensor responsive to a position of said at least a first tracking spot;
wherein output from said sensor affects at least one operational feature of said motorized toy vehicle, and said operational feature is effective to change a location of the toy vehicle; and
(d) altering said position of said at least a first tracking spot so as to alter said at least one operational feature and thereby guide the toy vehicle to a change of location.
2. The method of claim 1, wherein said first tracking signal is implemented as a beam of light within the spectrum of non-visible light.
3. The method of claim 2, wherein said first tracking signal is implemented as infrared light.
4. The method of claim 2, further including providing at least a second light source providing a beam of visible light, configured to indicate said position of said tracking spot.
5. The method of claim 1, wherein said light source is implemented in a handheld component operated by a user.
6. The method of claim 1, wherein said light source is implemented so as to be wearable by a user.
7. The method of claim 1, wherein said tracking spot is projected onto said surface in proximity to said motorized toy vehicle.
8. The method of claim 1, further including providing a control circuit in electronic communication with said at least one sensor, said control circuit configured to receive output from said at least one sensor and to control said at least one operational feature.
9. The method of claim 8, wherein said at least one sensor is implemented as a plurality of sensors and said at least one operational feature is implemented as a plurality of operational features.
10. The method of claim 9, wherein said plurality of operational features are implemented so as to include at least locomotion and directional steering of said motorized toy vehicle and said control circuit is in electronic communication with at least a drive motor and a steering mechanism.
11. A light guided toy vehicle comprising:
(a) a first light source configured to project at least a first narrow beam tracking signal so as to provide a tracking spot on a surface; and
(b) a motorized toy vehicle configured with at least one sensor responsive to a position of said at least a first tracking spot;
wherein output from said sensor affects at least one operational feature of said motorized toy vehicle, and said operational feature is effective to change a location of the toy vehicle.
12. The light guided toy vehicle of claim 11, wherein said first tracking signal is configured as a beam of light within the spectrum of non-visible light.
13. The light guided toy vehicle of claim 12, wherein said first tracking signal is configured as infrared light.
14. The light guided toy vehicle of claim 13, further including at least a second light source configured to project a beam of visible light, so as to indicate said position of said tracking spot.
15. The light guided toy vehicle of claim 11, wherein said light source is configured in a handheld component operated by a user.
16. The light guided toy vehicle of claim 11, wherein said at least one sensor is configured to respond to said tracking spot when said tracking spot is projected onto said surface in proximity to said motorized toy vehicle.
17. The light guided toy vehicle of claim 11, further including a control circuit in electronic communication with said at least one sensor, said control circuit configured to receive output from said at least one sensor and control said at least one operational feature.
18. The light guided toy vehicle of claim 17, wherein said at least one sensor is configured as a plurality of sensors and said at least one operational feature is configured as a plurality of operational features.
19. The light guided toy vehicle of claim 18, wherein said plurality of operational features include locomotion and directional steering of said motorized toy vehicle and said control circuit is in electronic communication with at least a drive motor and a steering mechanism.
20. A method for remotely guiding a toy vehicle, the method comprising:
(a) providing a source of at least a first auditory tracking signal configured to emit said at least a first auditory tracking signal;
(b) emitting said at least a first auditory tracking signal;
(c) providing a motorized toy vehicle configured with at least one sensor responsive to a position of said at least a first auditory tracking signal;
wherein output from said sensor affects at least one operational feature of said motorized toy vehicle, and said operational feature is effective to change a location of the toy vehicle; and
(d) altering said position of said at least a first auditory tracking signal so as to alter said at least one operational feature and thereby guide the toy vehicle to a change of location.
21. The method of claim 20, wherein said source of said at least a first auditory tracking signal is implemented as an electronic device.
22. The method of claim 20, wherein said at least a first auditory tracking signal is implemented so as to be at a pre-determined frequency and said at least one sensor is configured to respond substantially solely to said pre-determined frequency.
23. The method of claim 20, wherein said at least one sensor is implemented so as to respond to a loudest auditory signal received.
24. The method of claim 20, wherein said at least a first auditory tracking signal is implemented as at least a first voice command.
US11/339,572 2006-01-26 2006-01-26 Reflected light controlled vehicle Abandoned US20070173171A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/339,572 US20070173171A1 (en) 2006-01-26 2006-01-26 Reflected light controlled vehicle

Publications (1)

Publication Number Publication Date
US20070173171A1 2007-07-26

Family

ID=38286156

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/339,572 Abandoned US20070173171A1 (en) 2006-01-26 2006-01-26 Reflected light controlled vehicle

Country Status (1)

Country Link
US (1) US20070173171A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100304640A1 (en) * 2009-05-28 2010-12-02 Anki, Inc. Distributed System of Autonomously Controlled Toy Vehicles
WO2012012889A1 (en) * 2010-07-30 2012-02-02 Thinking Technology, Inc. Two-sided toy vehicle
WO2012012883A1 (en) * 2010-07-30 2012-02-02 Thinking Technology Improved remote controlled toy
CN102755748A (en) * 2011-04-25 2012-10-31 斯平玛斯特有限公司 System for automatically tracking moving toy vehicle
WO2013022520A1 (en) * 2011-08-09 2013-02-14 The Boeing Company Beam directed motion control system
US8882560B2 (en) 2009-05-28 2014-11-11 Anki, Inc. Integration of a robotic system with one or more mobile computing devices
US20150093958A1 (en) * 2013-10-01 2015-04-02 Rehco, Llc System for Controlled Distribution of Light in Toy Characters
US9144746B2 (en) 2010-08-20 2015-09-29 Mattel, Inc. Toy with locating feature
US9155961B2 (en) 2009-05-28 2015-10-13 Anki, Inc. Mobile agents for manipulating, moving, and/or reorienting components
US9233314B2 (en) 2010-07-19 2016-01-12 China Industries Limited Racing vehicle game
US9996369B2 (en) 2015-01-05 2018-06-12 Anki, Inc. Adaptive data analytics service
US10188958B2 (en) 2009-05-28 2019-01-29 Anki, Inc. Automated detection of surface layout

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3314189A (en) * 1964-08-10 1967-04-18 William P Carroll Remote, light actuated control means for models
US4865575A (en) * 1988-11-04 1989-09-12 Mattel, Inc. Light responsive remote control vehicle
US6482064B1 (en) * 2000-08-02 2002-11-19 Interlego Ag Electronic toy system and an electronic ball
US6780077B2 (en) * 2001-11-01 2004-08-24 Mattel, Inc. Master and slave toy vehicle pair
US7147535B2 (en) * 2002-06-11 2006-12-12 Janick Simeray Optical remote controller pointing the place to reach
US20060073761A1 (en) * 2002-10-31 2006-04-06 Weiss Stephen N Remote controlled toy vehicle, toy vehicle control system and game using remote controlled toy vehicle
US7097532B1 (en) * 2004-10-16 2006-08-29 Peter Rolicki Mobile device with color discrimination

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8845385B2 (en) 2009-05-28 2014-09-30 Anki, Inc. Distributed system of autonomously controlled mobile agents
US9919232B2 (en) 2009-05-28 2018-03-20 Anki, Inc. Mobile agents for manipulating, moving, and/or reorienting components
US10874952B2 (en) 2009-05-28 2020-12-29 Digital Dream Labs, Llc Virtual representation of physical agent
US10188958B2 (en) 2009-05-28 2019-01-29 Anki, Inc. Automated detection of surface layout
US9155961B2 (en) 2009-05-28 2015-10-13 Anki, Inc. Mobile agents for manipulating, moving, and/or reorienting components
US9950271B2 (en) 2009-05-28 2018-04-24 Anki, Inc. Distributed system of autonomously controlled mobile agents
US11027213B2 (en) 2009-05-28 2021-06-08 Digital Dream Labs, Llc Mobile agents for manipulating, moving, and/or reorienting components
US8747182B2 (en) 2009-05-28 2014-06-10 Anki, Inc. Distributed system of autonomously controlled mobile agents
US8353737B2 (en) * 2009-05-28 2013-01-15 Anki, Inc. Distributed system of autonomously controlled toy vehicles
US9694296B2 (en) 2009-05-28 2017-07-04 Anki, Inc. Distributed system of autonomously controlled mobile agents
US8882560B2 (en) 2009-05-28 2014-11-11 Anki, Inc. Integration of a robotic system with one or more mobile computing devices
US20100304640A1 (en) * 2009-05-28 2010-12-02 Anki, Inc. Distributed System of Autonomously Controlled Toy Vehicles
US8951093B2 (en) 2009-05-28 2015-02-10 Anki, Inc. Distributed system of autonomously controlled mobile agents
US8951092B2 (en) 2009-05-28 2015-02-10 Anki, Inc. Distributed system of autonomously controlled mobile agents
US9238177B2 (en) 2009-05-28 2016-01-19 Anki, Inc. Distributed system of autonomously controlled mobile agents
US9067145B2 (en) 2009-05-28 2015-06-30 Anki, Inc. Virtual representations of physical agents
US9597606B2 (en) 2010-07-19 2017-03-21 China Industries Limited Racing vehicle game
US9233314B2 (en) 2010-07-19 2016-01-12 China Industries Limited Racing vehicle game
US8939812B2 (en) 2010-07-30 2015-01-27 Thinking Technology, Inc. Two-sided toy vehicle
WO2012012889A1 (en) * 2010-07-30 2012-02-02 Thinking Technology, Inc. Two-sided toy vehicle
WO2012012883A1 (en) * 2010-07-30 2012-02-02 Thinking Technology Improved remote controlled toy
US9144746B2 (en) 2010-08-20 2015-09-29 Mattel, Inc. Toy with locating feature
CN102755748A (en) * 2011-04-25 2012-10-31 斯平玛斯特有限公司 System for automatically tracking moving toy vehicle
US9195235B2 (en) 2011-08-09 2015-11-24 The Boeing Company Beam directed motion control system
US8874371B2 (en) * 2011-08-09 2014-10-28 The Boeing Company Beam directed motion control system
US20130041544A1 (en) * 2011-08-09 2013-02-14 The Boeing Company Beam Directed Motion Control System
WO2013022520A1 (en) * 2011-08-09 2013-02-14 The Boeing Company Beam directed motion control system
US20150093958A1 (en) * 2013-10-01 2015-04-02 Rehco, Llc System for Controlled Distribution of Light in Toy Characters
US9636594B2 (en) * 2013-10-01 2017-05-02 Rehco, Llc System for controlled distribution of light in toy characters
US10817308B2 (en) 2015-01-05 2020-10-27 Digital Dream Labs, Llc Adaptive data analytics service
US9996369B2 (en) 2015-01-05 2018-06-12 Anki, Inc. Adaptive data analytics service

Similar Documents

Publication Publication Date Title
US20070173171A1 (en) Reflected light controlled vehicle
US6507773B2 (en) Multi-functional robot with remote and video system
US5090708A (en) Non hand-held toy
US20230184512A1 (en) Electronic device for use with deterrent device
CA2482966A1 (en) Apparatus for lighting a patient monitor front panel
US8810407B1 (en) Walker with illumination, location, positioning, tactile and/or sensor capabilities
US20170036127A1 (en) Tracking toy vehicle
KR101199144B1 (en) System for controlling vehicle by recognizing motion
CA2489159C (en) Light controlled movable toy
WO2002029332A1 (en) Air conditioner and temperature detector
US9718395B1 (en) Apparatus and method for controlling a vehicle light with adjustable light output
TWI509530B (en) Adapted mobile carrier and auto following system
WO2009052652A1 (en) Lighting device and method with indication
EP2379188B1 (en) Torch
US20230393664A1 (en) Signalling device for generating a light signal when driving a vehicle
CA2452311A1 (en) A portable remote control unit with a lantern
JPH0639266Y2 (en) Combat toys
US20020127945A1 (en) Dancing doll controlled by acoustic or optical signal
US11344793B2 (en) Game device
KR20000018450U (en) A Flying Saucer

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION