US20150031268A1 - Toy vehicle with telemetrics and track system and method - Google Patents
- Publication number
- US20150031268A1 (U.S. application Ser. No. 14/341,732)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- track
- data
- commands
- mobile device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H17/00—Toy vehicles, e.g. with self-drive; Cranes, winches or the like; Accessories therefor
- A63H17/26—Details; Accessories
- A63H17/36—Steering-mechanisms for toy vehicles
- A63H17/395—Steering-mechanisms for toy vehicles steered by program
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H18/00—Highways or trackways for toys; Propulsion by special interaction between vehicle and track
- A63H18/02—Construction or arrangement of the trackway
Definitions
- The present application relates to the field of toy vehicles. More particularly, the described embodiments relate to a remote control vehicle with on-board telemetry systems that monitor performance and a remote control application that monitors inputs by a user for storage and sharing over a cloud-based computer network.
- FIG. 1 is a schematic diagram showing a system for controlling remote control vehicles and for recording and sharing of data relating to the vehicles.
- FIG. 2 is a schematic diagram showing details for a mobile device, a server, and a related database for one embodiment of the system of FIG. 1 .
- FIG. 3 is a schematic diagram showing details for a remote controlled vehicle and track for one embodiment of the system of FIG. 1 .
- FIG. 4 is a front plan view of a detailed section of the gearing for a remote controlled car with an infrared RPM telemetry sensor.
- FIG. 5 a is a schematic diagram showing a first embodiment of a track having a visual track indicator.
- FIG. 5 b is a schematic diagram showing a second embodiment of a track having a visual track indicator.
- FIG. 5 c is a schematic diagram showing a third embodiment of a track having a visual track indicator.
- FIG. 6 is a front plan view of a tablet computer showing an embodiment for a user interface for use with the system of FIG. 1 .
- FIG. 7 is a flow chart showing a method for operating the system of FIG. 1 .
- FIG. 8 is a flow chart showing a method for detecting and transmitting data in connection with the method of FIG. 7 .
- FIG. 9 is a flow chart showing a method for controlling the start of the vehicle.
- FIG. 1 shows a system 100 that utilizes one or more cloud-based servers 110 to store data in a database 112 related to the operation of one or more remote controlled vehicles 140 , 142 .
- Vehicle 140, for instance, is controlled by a first tablet computer 130.
- An app (not shown in FIG. 1 ) running on the tablet computer 130 allows a user to input commands to the vehicle 140 .
- The vehicle 140, in turn, utilizes telemetry sensors to record data about its operation, such as its speed, the rotations per minute (RPM) of one or more wheels, acceleration, distance travelled, etc. These sensors are read using digital logic found on the vehicle 140. The data from these sensors is then transmitted to the tablet computer 130 for storage and analysis using the app operating on the tablet computer 130.
- In some embodiments, the remote controlled vehicle 140 operates on a track 150.
- In these embodiments, the track 150 can contain indicators that can be read by track sensors found on the vehicle 140.
- The track sensors can be, for instance, image sensors that read data, lines, and other markings found on the track.
- The information read by the track sensors can also be transmitted by the remote controlled vehicle 140 to the tablet computer 130 for analysis and storage.
- the tablet computer 130 stores the telemetry and track-related data that it receives from the vehicle 140 in its memory. This data can be transferred through a wide-area network 120 (such as the Internet) to one or more remote servers 110 for storage in the remote database 112 . In addition, the tablet computer 130 can record the input commands that it received from the user and store these inputs along with the telemetry and track data in its memory for transmission to the remote server 110 . In one embodiment, this information is grouped together for each race (or “run”) through the track by the first vehicle 140 . A run is a particular race by the vehicle 140 through the track 150 , and may constitute one or more laps of the track 150 . In other embodiments, the vehicle 140 need not operate on a particular track 150 , and a run is a set time period during which the remote controlled vehicle 140 is operated and data is collected and stored.
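A run as described above groups time-coded commands, telemetry readings, and track readings into one record before upload. The sketch below is one illustrative way such a record might be organized in the app's memory; all class and field names are assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RunRecord:
    """Hypothetical in-memory grouping of one run's data."""
    user_id: str
    vehicle_id: str
    laps: int = 1
    # each entry: (timestamp_ms, payload) -- time coding lets the
    # streams be compared against one another later
    commands: List[Tuple[int, dict]] = field(default_factory=list)
    telemetry: List[Tuple[int, dict]] = field(default_factory=list)
    track_events: List[Tuple[int, dict]] = field(default_factory=list)

    def duration_ms(self) -> int:
        """Span from the earliest to the latest time-coded sample."""
        stamps = [t for t, _ in self.commands + self.telemetry + self.track_events]
        return max(stamps) - min(stamps) if stamps else 0

run = RunRecord(user_id="u1", vehicle_id="car140")
run.commands.append((0, {"throttle": 0.5}))
run.telemetry.append((120, {"rpm": 900}))
run.track_events.append((250, {"segment": "A"}))
print(run.duration_ms())  # 250
```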
- a second user using a second tablet computer 132 can access the input commands, telemetric data, and track data that the first user stored in the database 112 .
- This information can be used by the second tablet computer 132 to compare the first user's performance with car 140 to the second user's ability to control car 142 .
- the second user can lay out a track 152 that is identical to the track 150 used by the first user. Since the second tablet computer 132 has downloaded all of the information about the first user's run on their track 150 , the second user can use their tablet computer 132 to control their vehicle 142 through the same track layout 152 and compare the results. In effect, the two physical vehicles 140 , 142 are permitted to race head-to-head through the same track layout 150 , 152 . This is true even though the second vehicle 142 may make its run through the track 152 at a later time and at a different location than the run made by the first vehicle 140 through its track 150 .
- FIG. 2 shows a remote control vehicle 200 that is in communication with a mobile device 210 .
- the vehicle is communicating with a local wireless interface 218 found on the mobile device 210 .
- This interface may take the form of a BLUETOOTH® connection that complies with one of the standards of the Bluetooth Special Interest Group (such as Bluetooth 4.0).
- the interface 218 may be a Wi-Fi interface that utilizes one of the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards.
- the mobile device 210 can take the form of a smart phone or tablet computer.
- the device 210 will include a display 212 for displaying information to a user, a user input mechanism 214 for receiving user input commands (such as a touch screen integrated into the display 212), and a processor 216 for processing instructions and data for the device 210.
- the display 212 can use LCDs, OLEDs, or similar technology to provide a color display for the user.
- the processor 216 can be a general purpose CPU, such as those provided by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.), or a mobile specific processor, such as those designed by ARM Holdings (Cambridge, England).
- the mobile device 210 also has the local wireless interface 218 for interfacing with the remote control vehicle 200 , and a wide area network interface 220 for connecting with a network 250 . In some embodiments, these two interfaces 218 , 220 could be the same interface. For instance, the mobile device 210 may interface with both the remote controlled vehicle 200 and the network 250 over a single Wi-Fi network interface 218 , 220 .
- the mobile device 210 further includes a memory 230 for storing processing instructions and data. This memory is typically solid-state memory, such as flash memory.
- Mobile devices such as device 210 generally use specific operating systems 232 designed for such devices, such as iOS from Apple Inc. (Cupertino, Calif.) or ANDROID OS from Google Inc. (Menlo Park, Calif.).
- the operating system 232 is stored on the memory 230 and is used by the processor 216 to provide a user interface for the display 212 , to receive input over the user input device(s) 214 , to handle communications for the device 210 over the interfaces 218 , 220 , and to manage applications (or apps) that are stored in the memory 230 , such as the remote control app 234 .
- the remote control app 234 is responsible for receiving user input 214 related to the control of the remote controlled vehicle 200 and ensuring that these inputs are relayed to the vehicle 200 over interface 218 .
- the app 234 receives data from the vehicle 200 over interface 218 and stores this data in memory 230 .
- the app 234 may receive car telemetry data 238 and track related data 240 .
- some embodiments of the app 234 allow the user to request that the vehicle 200 take video or still images using an image sensor found on the vehicle 200 . This image data 242 is also received by the app 234 and stored in memory 230 .
- In addition to storing this data 238, 240, 242, the app 234 also generates a user interface on the display 212 and shares this data in real time with the user over the display 212. Finally, the app 234 is responsible for connecting with a remote server 260 over network interface 220 and for sharing its data 238, 240, 242 with the server 260. The app 234 can also request data from the server 260 concerning previous runs or runs made by third-parties, and store this data in memory 230.
- the server 260 contains a programmable digital processor 262 , such as a general purpose CPU manufactured by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.).
- the server 260 further contains a wireless or wired network interface 264 to communicate with remote computing devices, such as mobile device 210 , over the network 250 .
- the processor 262 is programmed using a set of software instructions stored on a non-volatile, non-transitory, computer readable memory 266 , such as a hard drive or flash memory device.
- the software typically includes operating system software 268 , such as LINUX (available from multiple companies under open source licensing terms) or WINDOWS (available from Microsoft Corporation of Redmond, Wash.).
- the processor 262 controls the communication with mobile device 210 under the direction of an application program 269 residing on the server memory 266 .
- the application program 269 is further responsible for maintaining data within a database 270 .
- the database 270 may be stored in the memory 266 of the server 260 as structured data (such as separate tables in a relational database, or as database objects in an object-oriented database environment). Alternatively, the database 270 may be stored and managed in a separate device, such as a separate database server computer and/or a storage area network storage device.
- Database programming stored on the memory 266 directs the processor 262 to access, manipulate, update, and report on the data in the database 270 . This database programming can be considered part of the application programming 269 .
- FIG. 2 shows the database 270 with tables or objects for users 272 , vehicles 274 , races or runs 276 , user inputs 278 , telemetry data 280 , track data 282 , and image data 284 . Relationships between the database entities are represented in FIG. 2 using crow's foot notation. For example, FIG. 2 shows that each user 272 can be associated with one or more vehicles 274 , since each actual user may have multiple remote control vehicles 200 . In addition, a physical vehicle 200 may be shared by more than one user, so the database 270 allows a particular vehicle database entity 274 to be associated with multiple user database entities 272 .
- a particular run of the vehicle 200 over a track is represented by a single run database entity 276 , which is associated with a single user 272 and a single vehicle 274 .
- the database 270 can store and track multiple user inputs 278 , telemetry data readings 280 , track data readings 282 , and images and/or video files 284 .
- Associations or relationships between the database entities shown in FIG. 2 can be implemented through a variety of known database techniques, such as through the use of foreign key fields and associative tables in a relational database model. In FIG. 2 , associations are shown directly between two database entities, but entities can also be associated through a third database entity. For example, an image file 284 is directly associated with one run 276 , and through that relationship the image file 284 is also associated with a single vehicle 274 and a single user 272 .
- Each of the database entities 272 - 284 shown in FIG. 2 can be implemented as a separate relational database table within a relational database 270.
- each of the entities 272 - 284 could be implemented using a plurality of related database tables.
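One way (among many) to realize the crow's-foot relationships of FIG. 2 in a relational schema is sketched below: an associative table carries the many-to-many link between users and vehicles, and foreign keys tie each run to a single user and vehicle, so that telemetry rows reach their user only through their run. All table and column names are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users    (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE vehicles (id INTEGER PRIMARY KEY, model TEXT);
CREATE TABLE user_vehicle (          -- associative table: many-to-many
    user_id    INTEGER REFERENCES users(id),
    vehicle_id INTEGER REFERENCES vehicles(id),
    PRIMARY KEY (user_id, vehicle_id)
);
CREATE TABLE runs (                  -- each run: one user, one vehicle
    id         INTEGER PRIMARY KEY,
    user_id    INTEGER REFERENCES users(id),
    vehicle_id INTEGER REFERENCES vehicles(id)
);
CREATE TABLE telemetry (             -- many readings per run
    run_id INTEGER REFERENCES runs(id),
    t_ms   INTEGER,
    rpm    REAL
);
""")
conn.execute("INSERT INTO users VALUES (1, 'first user')")
conn.execute("INSERT INTO vehicles VALUES (1, 'vehicle 140')")
conn.execute("INSERT INTO runs VALUES (1, 1, 1)")
conn.execute("INSERT INTO telemetry VALUES (1, 0, 880.0)")
conn.execute("INSERT INTO telemetry VALUES (1, 100, 910.0)")

# A telemetry row is associated with its user only through its run.
row = conn.execute("""
    SELECT u.name, COUNT(t.rpm) FROM telemetry t
    JOIN runs r ON t.run_id = r.id
    JOIN users u ON r.user_id = u.id
    GROUP BY u.name
""").fetchone()
print(row)  # ('first user', 2)
```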
- FIG. 3 schematically reveals the major electrical components of a remote controlled vehicle 200 .
- the vehicle 200 has a processor 310 that controls the major functions of the vehicle 200 .
- the processor 310 operates under the control of operational programming 322 , which is stored in digital memory 320 that is accessible by the processor 310 .
- the processor 310 and the operational programming 322 could be combined into an application-specific integrated circuit (or “ASIC”) or a field programmable device (such as an FPGA) specifically designed to handle the processing requirements of the vehicle 200 .
- the processor 310 receives instructions for operating the vehicle from a tablet computer 210 over a local wireless interface 318 .
- this interface 318 may operate under Bluetooth protocols, IEEE 802.11 protocols, or any similar local, wireless protocols. Based on these control instructions, the operational programming 322 will control the wheel motor(s) 330 and the steering motor 332 to control forward and backward motion and the steering of the vehicle 200.
- the instructions received from the tablet computer 210 may include directions for the vehicle 200 to take still or video images on its image sensor(s) 316 .
- the resulting images 324 are stored in the vehicle memory 320 for transmission back to the tablet computer 210 when convenient.
- the data from the image sensors 316 is fed to the mobile device 210 as a live feed, allowing the tablet computer 210 to generate still and video image files as desired by the user.
- the processor 310 is monitoring the telemetry sensors 312 to obtain data about how the vehicle 200 is moving and behaving.
- These sensors 312 may include RPM sensors that track wheel revolutions for the vehicle, accelerometer sensors that track acceleration of the vehicle, and even sensors that measure the performance and characteristics (such as heat) of the wheel motors 330 .
- the telemetry sensors 312 include at least an RPM sensor that can indicate wheel rotations, from which can be derived the vehicle speed and distance traveled.
- separate RPM sensors 312 may be placed on wheels driven by the wheel motors 330 and non-driven wheels. The sensors 312 at the powered wheels may detect wheel spin during periods of hard acceleration, at which time the sensors 312 at the non-driven wheels will give a better indication of the vehicle's current speed and distance traveled.
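The driven/non-driven comparison above can be sketched as a simple ratio check: driven wheels turning noticeably faster than free-rolling wheels indicates wheel spin, at which point the non-driven reading is the trustworthy speed source. The tolerance threshold is an assumption; the text does not specify one.

```python
def detect_wheel_spin(driven_rpm, free_rpm, tolerance=1.15):
    """Flag wheel spin when driven wheels outpace the free-rolling
    wheels by more than the assumed tolerance ratio."""
    if free_rpm <= 0:
        # wheels turning while the car is stationary is pure spin
        return driven_rpm > 0
    return driven_rpm / free_rpm > tolerance

# Hard acceleration: driven wheels spin up well past the free wheels.
print(detect_wheel_spin(driven_rpm=2400, free_rpm=1500))  # True
# Steady cruise: the ratios match, so the free-wheel reading is trusted.
print(detect_wheel_spin(driven_rpm=1510, free_rpm=1500))  # False
```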
- FIG. 4 shows one embodiment of a telemetry sensor 400 .
- sensor 400 is an infrared RPM sensor that measures rotations of a wheel on the vehicle 200 .
- This sensor 400 utilizes an infrared transmitting tube 410 that transmits an infrared beam through a gear 420 that is used to drive a wheel of the vehicle 200 .
- On the other side of the gear 420 is an infrared receiving tube 430 that receives and senses the infrared beam transmitted by transmitting tube 410 .
- As the gear 420 rotates, a portion of the gear 420 will interrupt the beam during the rotation.
- By counting these interruptions, the processor/logic 310 can determine rotations of the wheel on the vehicle 200.
- Each interruption may not indicate a complete wheel rotation, as the gear 420 will likely interrupt the infrared signal multiple times in a single rotation, and a single rotation of the gear 420 will not likely lead to a single rotation of the wheel. Nonetheless, based on the gearing of the vehicle and the construction of the gear 420 , there will be a known relationship between interruptions in the light beam and rotations of the wheel.
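Because the pulses-per-gear-revolution and gear-to-wheel ratios are fixed by the vehicle's construction, the conversion from beam interruptions to wheel RPM (and onward to speed) is straightforward arithmetic. The numbers below are invented for illustration, not taken from the disclosure.

```python
import math

def wheel_rpm(pulses, window_s, pulses_per_gear_rev, gear_ratio):
    """Convert IR-beam interruptions counted over window_s seconds
    into wheel RPM.

    pulses_per_gear_rev: beam breaks per revolution of gear 420;
    gear_ratio: gear revolutions per wheel revolution. Both come
    from the known gearing of the vehicle.
    """
    gear_revs = pulses / pulses_per_gear_rev
    wheel_revs = gear_revs / gear_ratio
    return wheel_revs / window_s * 60.0

def speed_m_s(rpm, wheel_diameter_m):
    """Vehicle speed from wheel RPM and wheel circumference."""
    return rpm / 60.0 * math.pi * wheel_diameter_m

# 240 pulses in 0.5 s, 8 beam breaks per gear rev, 3 gear revs per wheel rev
rpm = wheel_rpm(240, 0.5, pulses_per_gear_rev=8, gear_ratio=3.0)
print(rpm)  # 1200.0
```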
- In addition to monitoring telemetry sensors 312 such as the infrared RPM sensor 400 shown in FIG. 4, the processor 310 also monitors one or more track sensors 314 on the vehicle 200 (as seen in FIG. 3). These sensors 314 read track indicators 352 found on a specially designed track 350.
- the track 350 constitutes a plurality of specially designed track pieces of a plurality of known lengths that can be configured into multiple track layouts. These track segments are constructed with track indicators 352 that can be read by the track sensor 314 on the vehicle 200 . As explained below in connection with FIGS. 5 a - 5 c, these track indicators 352 can be visual markings on the surface of the track 350 .
- the track sensor 314 takes the form of an image sensor 314 that, together with the processor 310 , can recognize and interpret the visual markings 352 .
- the track indicators 352 can take the form of radio frequency identifiers, such as specially designed RFID tags, that can be read while the vehicle 200 passes over the track 350 by a reading sensor 314 found on the vehicle 200.
- the track sensor 314 is an image sensor that can detect visual markings 352 found on the track 350 .
- FIGS. 5 a - 5 c show three example markings that may be placed on the track 350 .
- In FIG. 5a, two track segments 510, 520 of differing lengths are shown.
- Each track segment 510 , 520 contains alternating background colors 512 , 514 , with a lighter background 512 always being followed by a darker background 514 .
- Each segment 512 , 514 is of a uniform width.
- the image sensor 314 is pointed downward on the vehicle 200 toward the track segments 510 , 520 .
- When the sensor 314 notes a change in background color on the track 512, 514, the processor 310 will know that the vehicle 200 has traveled the distance necessary to move to the next background segment 512, 514. When this information is transmitted to the mobile device 210, the device 210 will be able to verify that the vehicle 200 was indeed travelling along the track 350. Furthermore, this track traversal information can be compared with distance information obtained from the telemetry sensor 312 to ensure that all of the measurements are consistent with vehicle 200 movement on the track 350. This prevents competitors from spoofing the system 100, such as by faking a run through a track while holding a vehicle 200 in the air.
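The anti-spoofing cross-check amounts to comparing two independent distance estimates: segment-width times the count of background-color changes against the telemetry (wheel-rotation) distance. A vehicle held in the air accumulates telemetry distance but no color changes, so the check fails. The segment width and tolerance values below are assumptions for illustration.

```python
def consistent_with_track(color_changes, segment_width_m,
                          telemetry_distance_m, tolerance_m=0.5):
    """Return True when the track-sensor distance and the telemetry
    distance agree within an assumed tolerance."""
    track_distance = color_changes * segment_width_m
    return abs(track_distance - telemetry_distance_m) <= tolerance_m

# Wheels report ~6.1 m and the sensor saw 30 segment changes of 0.2 m: consistent.
print(consistent_with_track(30, 0.2, 6.1))   # True
# Wheels report 6.1 m but no track changes were seen: a spoofed run.
print(consistent_with_track(0, 0.2, 6.1))    # False
```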
- FIG. 5 b shows another technique for creating visual track markings 352 .
- the track segments 530 , 540 each have color lines 532 , 534 , 536 perpendicular to the length of the track segment at regularly spaced intervals.
- the sensors 314 can read the presence and color of these lines 532 - 536 and transmit this information to the processor 310 .
- the tablet computer 210 can also determine distance traversal along the track 350 . By alternating three colors, the tablet computer 210 can identify situations where a vehicle has reversed directions and is traversing backwards along the track 350 .
- the use of three different, identifiable shades or colors could also be used in the embodiment shown in FIG. 5 a to detect backwards movement along the track.
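With three colors repeating in a fixed cyclic order, forward motion always reveals the next color in the cycle and reverse motion the previous one, which is exactly why two alternating colors alone cannot distinguish the cases. The numeric encoding below is an assumption for illustration.

```python
def direction(prev_color, new_color, n_colors=3):
    """Infer travel direction from two consecutive color readings.

    Colors are encoded 0..n_colors-1 in their repeating track order.
    Returns +1 for forward, -1 for reverse, 0 if the transition is
    unreadable (e.g. a stripe was skipped or misread).
    """
    if (prev_color + 1) % n_colors == new_color:
        return 1
    if (prev_color - 1) % n_colors == new_color:
        return -1
    return 0

print(direction(0, 1))  # 1: forward along the track
print(direction(1, 0))  # -1: vehicle has reversed
print(direction(2, 0))  # 1: the three-color sequence wraps around
```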
- In FIG. 5c, the track segments 550, 560 each have a visual bar code 552, 562 that can be read as the track sensor 314 passes along the track 350.
- Each code 552 , 562 identifies the track segment 550 , 560 , which can be used to determine the length and configuration of that track segment 550 , 560 .
- the codes 552 , 562 are identification numbers, and the length and configuration of the related segments 550 , 560 must be determined by consulting a look-up table located on the vehicle memory 320 , the tablet computer 210 , or on the remote server ( 110 , 260 ). In other configurations, this information is directly read from the codes 552 , 562 that are printed on the track segments 550 , 560 .
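A minimal look-up table of the kind described above: an identification code read from the track resolves to the segment's length and configuration, and the sequence of codes lets the track layout be reconstructed. The code values and geometry entries are invented for illustration.

```python
# Hypothetical look-up table; in practice it could live in the
# vehicle memory 320, on the tablet, or on the remote server.
SEGMENT_TABLE = {
    552: {"length_m": 0.30, "shape": "straight"},
    562: {"length_m": 0.45, "shape": "curve-90"},
}

def segment_info(code):
    """Resolve a bar-code identification number to segment data."""
    try:
        return SEGMENT_TABLE[code]
    except KeyError:
        raise ValueError(f"unknown track segment code {code}")

# As codes are read in order, the full layout and its length emerge.
layout = [segment_info(c) for c in (552, 562, 552)]
total = sum(s["length_m"] for s in layout)
print(round(total, 2))  # 1.05
```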
- a finish line could be created with a unique background color in track segment 510 , or with a unique color line in segment 530 , or with a unique code in segment 550 .
- the vehicle 200 can sense when it crosses the finish line, in order to stop timing a particular run, or to count laps when a particular race involves multiple laps of the track 350 .
- FIG. 6 shows a user interface 600 on a mobile device 210 that is generated by the remote control app 234 .
- the main portion of the interface 600 shows a live view 602 of the image being seen by the image sensor 316 on the vehicle 200 . In this case, the vehicle 200 is seen following a hiker on a dirt trail.
- the interface 600 allows the user to control the movement of the controlled vehicle through direction controls 610 and gear controls 620 .
- the direction controls 610 are shown as four directional arrows superimposed on the image 602 .
- the gear controls 620 are composed of an up and down arrow superimposed over the image 602 , allowing the user to change gears by simply pressing one of the arrows 620 .
- Speed and gear information 630 are shown on the bottom of the interface 600 .
- the touch mode control changes the steering controls from the directional controls 610 to tilt control.
- Tilt control uses sensors within the mobile device 210 to determine when a user tilts the device 210 and sends signals to steer the vehicle in the direction of the tilt.
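A plausible mapping from device tilt to a steering command is a clamped linear scale with a small dead zone so the car runs straight when the tablet is held roughly level. The angle thresholds below are assumptions; the disclosure does not specify them.

```python
def tilt_to_steering(roll_deg, max_tilt_deg=30.0, dead_zone_deg=2.0):
    """Map device roll angle to a steering command in [-1.0, 1.0].

    Negative values steer left, positive steer right; within the
    dead zone the command is neutral.
    """
    if abs(roll_deg) < dead_zone_deg:
        return 0.0
    steering = roll_deg / max_tilt_deg
    return max(-1.0, min(1.0, steering))

print(tilt_to_steering(1.0))    # 0.0: inside the dead zone
print(tilt_to_steering(15.0))   # 0.5: half tilt gives half steering
print(tilt_to_steering(-45.0))  # -1.0: clamped at full left
```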
- the timer display at the center top of the interface 600 allows a user to see the current time of a particular “run.”
- the photo button stores the current image as a still image, while the video button causes the mobile device 210 to record the image stream coming from the vehicle 200 as a video file.
- a red circle is placed on the video button to indicate that a video file is currently being created of the image feed being displayed at location 602 .
- the help button presents a help system to the user, while the screen control menu allows the user to change various settings with the interface 600 and the app 234 as a whole.
- FIG. 7 shows a method 700 for utilization of the vehicle 200 , mobile device 210 , server 260 , and track 350 .
- the method starts at step 705 with the arrangement of track segments, such as segments 530 , 540 , into a track 350 .
- the track 350 may be arranged in a complex loop, where the vehicle 200 can traverse the closed loop multiple times for a single race. This is not necessary, however, as the track 350 may be arranged with a separate beginning and an end.
- a wireless communication path is established between the vehicle 200 and the mobile device 210 .
- this can be a Bluetooth connection, a Wi-Fi connection, or any similar wireless data communication path between the devices 200 , 210 .
- the vehicle 200 is placed on the track 350 .
- the mobile device 210 is used to control the vehicle in a race or run around the track 350 .
- the mobile device acquires data related to the race (step 725 ).
- the mobile device 210 receives car control input from the user at step 805 , and then transmits control signals reflecting that input to the vehicle 200 in step 810 .
- the vehicle 200 receives those signals at step 815 , and then adjusts the car performance in step 820 .
- the vehicle can manipulate the wheel motors 330 and the steering motor 332 to control motion and direction of the vehicle 200 .
- Other control inputs and responses are possible depending on the capabilities of the vehicle 200 .
- the vehicle may have additional motors to control such things as a camera mount, a crane, a plow blade, or additional drive wheels; or the vehicle may have additional electronic components such as a microphone, multiple cameras, a speaker, a touch sensor, etc.
- the user interface 600 allows a user to control such features, and control inputs for these features can be transmitted at step 810 , received at step 815 , and implemented at step 820 .
- the telemetry sensors 312 , the track sensors 314 , and the image sensors 316 will be read at steps 825 , 830 , and 835 respectively.
- the data from these sensors 312 - 316 will be immediately transmitted to the mobile device 210 at step 840 .
- the mobile device 210 will receive this data at step 845 , and then display and/or store the data.
- the data from the sensors 312 - 316 will be stored in the memory 320 of the vehicle 200 for transmission to the mobile device 210 at a later time. In the preferred embodiment, all of this data will be time coded so as to be able to compare each element of data temporally with the other data, including the received control signals.
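Time coding every sample makes temporal comparison a matter of aligning streams by timestamp. One simple alignment, sketched below, is a "latest sample at or before t" lookup; the disclosure requires only that the data be comparable temporally, not any particular method.

```python
import bisect

def sample_at(stream, t_ms):
    """Return the most recent value at or before t_ms from a
    time-coded stream of (timestamp_ms, value) pairs sorted by time,
    or None if no sample exists yet."""
    times = [t for t, _ in stream]
    i = bisect.bisect_right(times, t_ms) - 1
    return stream[i][1] if i >= 0 else None

commands  = [(0, {"throttle": 0.2}), (500, {"throttle": 0.9})]
telemetry = [(100, {"rpm": 300}), (600, {"rpm": 2100})]

# What was the driver commanding when the RPM spiked at t = 600 ms?
print(sample_at(commands, 600))   # {'throttle': 0.9}
print(sample_at(telemetry, 50))   # None: no reading yet
```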
- the mobile device 210 will upload the data that it acquires for a particular run of the vehicle to the cloud server 260 for storage in the database 270 .
- even the image data is uploaded to the server 260 and stored in the database 270 .
- the mobile device 210 downloads from the server 260 data from a race run by a third party. This allows the user of the mobile device 210 to compare the third-party race results with their own. If the third party ran the race on the same track configuration, it would be possible to compare the performance of each user head-to-head. The total time for the race could be compared to determine a race winner. Individual statistics could be compared, such as fastest lap time, longest wheel skid, etc. If the user elects to perform their race again at step 745 , the third-party race results could be displayed live on the interface 600 while the user controlled their vehicle 200 over the track 350 .
- the interface 600 could display the user's lead over the opponent, or how far behind their opponent their vehicle 200 is at any point in the race.
- the interface 600 could even superimpose an image of the third-party vehicle on the image portion 602 of the interface 600 whenever the user was running behind the third party during their race.
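Showing the user's lead or deficit at any point in the race reduces to comparing the two runs' distance-along-track at the same elapsed time. The sketch below assumes each run is stored as (timestamp, distance) samples with linear interpolation between them; that representation is an assumption, since the text describes the comparison only at a high level.

```python
def lead_m(own_samples, ghost_samples, t_ms):
    """Positive: the live vehicle is ahead of the recorded run at
    time t_ms; negative: it is behind."""
    def distance_at(samples, t):
        for (t0, d0), (t1, d1) in zip(samples, samples[1:]):
            if t0 <= t <= t1:
                # linear interpolation between bracketing samples
                return d0 + (d1 - d0) * (t - t0) / (t1 - t0)
        return samples[-1][1] if t > samples[-1][0] else samples[0][1]
    return distance_at(own_samples, t_ms) - distance_at(ghost_samples, t_ms)

own   = [(0, 0.0), (1000, 4.0), (2000, 9.0)]
ghost = [(0, 0.0), (1000, 5.0), (2000, 8.0)]
print(lead_m(own, ghost, 1000))  # -1.0: one metre behind the ghost
print(lead_m(own, ghost, 2000))  # 1.0: now one metre ahead
```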
- the third-party user input data 236 could even be used to control the user's vehicle 200 . While environmental differences and variations in starting positions may prevent input data 236 from a multi-lap third-party race from recreating the entire race with the user's vehicle, this ability could prove useful to recreate maneuvers performed by third parties.
- a third-party may be able to perform interesting tricks with their vehicle based on a particular sequence of input commands.
- the third-party could upload the run data for the trick.
- the current user sees video of the trick (such as on a website operated using the data in database 270 ) they decide to download the third-party run data for the trick.
- FIG. 9 shows an additional method 900 that uses the sensors 312 , 314 on the vehicle 200 to create a controlled start mode.
- One problem with racing toy vehicles 200 is that their motors 330 are usually engineered to provide a great deal of torque to the vehicle wheels relative to the vehicle's size and weight.
- Novice users tend to run their first race at “full throttle” and soon find the vehicle 200 impossible to control.
- Method 900 solves this issue by implementing a controlled start mode that may be useful for novice users.
- the method 900 starts at step 905 , in which the mobile device 210 receives a command through the user interface to put the vehicle 200 into a controlled start mode. At step 910 , the mobile device 210 transmits this mode change to the vehicle 200 . At step 915 , the vehicle 200 receives this transmission and sets the vehicle 200 to operate in this mode.
- the mobile device 210 receives a command through its user interface to increase the throttle of the vehicle 200 to a user-selected level (step 920 ). This command is then sent as an instruction to the vehicle 200 at step 925 . The vehicle 200 receives this command at step 930 . However, instead of operating the motor 330 at the user-selected level, the vehicle 200 recognizes that it is not currently moving and has been set to operate in the controlled start mode. As a result, the vehicle starts the motor 330 operating at maximum torque (aka “throttle”) in step 930 . This allows the vehicle 200 to start as quickly as possible.
- the vehicle 200 senses that it has begun to move. This sensing can be accomplished through the track sensor 314 . In other embodiments, the telemetry sensor can determine that the vehicle wheels are now moving the vehicle 200 .
- the vehicle automatically reduces the amount of torque being provided from the motor 330 to the vehicle wheels (step 940). In one embodiment, the amount of torque being provided by the motor 330 at step 940 is completely controlled by the throttle commands received from the mobile device 210. If the user is requesting 90% throttle, step 940 will now provide 90% throttle. In another embodiment, user throttle commands are not allowed to drive the motor 330 above a pre-set throttle/torque level, where such level is below the maximum level provided at step 930.
- the controlled start mode may prevent the motor from ever providing greater than 60% throttle (other than during start-up at step 930 ). If the user requests 20% throttle, the motor 330 is operated at 20%, but if the user requests 100% throttle, the motor 330 is operated at the pre-set level (such as 60%).
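The controlled start logic of steps 930 and 940 can be sketched as a small state machine: maximum torque while stationary for the quickest possible launch, then user throttle capped at a pre-set ceiling once motion is detected. The 0.6 ceiling mirrors the 60% example above; the class structure is an illustrative assumption.

```python
class ControlledStart:
    """Sketch of the controlled start mode described in FIG. 9."""

    def __init__(self, ceiling=0.6):
        self.ceiling = ceiling   # assumed pre-set throttle cap
        self.moving = False

    def motor_output(self, requested, vehicle_moving):
        """Throttle (0.0-1.0) actually applied to the motor for a
        user request, given whether the sensors report motion."""
        if vehicle_moving:
            self.moving = True
        if not self.moving:
            return 1.0                       # step 930: maximum torque
        return min(requested, self.ceiling)  # step 940: capped throttle

start = ControlledStart()
print(start.motor_output(0.2, vehicle_moving=False))  # 1.0 at launch
print(start.motor_output(0.2, vehicle_moving=True))   # 0.2, under the cap
print(start.motor_output(1.0, vehicle_moving=True))   # 0.6, clamped
```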
- the vehicle 200 always operates its motor 330 as instructed by the mobile device 210 .
- the mobile device 210 becomes responsible for operating the vehicle 200 in controlled start mode. To accomplish this, the vehicle 200 must constantly provide telemetry data 326 and track data 328 to the mobile device 210 . In this way, the mobile device 210 will know when the vehicle 200 is stopped and when it begins moving. The mobile device 210 will instruct the vehicle 200 to operate the motor 330 at full throttle when starting, as described above in connection with step 930 . When the data received from the vehicle 200 indicates to the mobile device 210 that the vehicle 200 has started moving, the mobile device 210 will change its throttle command to the vehicle 200 to operate under controlled throttle (as described above in connection with step 940 ).
Abstract
A remote control vehicle is presented that has a telemetry sensor, which detects wheel rotation, and a track sensor that reads track indicators found on a track over which the vehicle is driven. The vehicle receives commands from a mobile device to control its operation. Wheel rotation data and track sensor data are transmitted from the vehicle back to the mobile device for storage along with the commands that were transmitted to the vehicle. The commands, rotation data, and track sensor data are transmitted to a server computer over a wide area network, and thereafter shared with another user as run data. When downloaded by the other user, the run data can be used to compare the other user's ability to control their vehicle with the run data. The run data can further be used to control the other user's vehicle.
Description
- This application claims the benefit of U.S. Provisional Application Ser. No. 61/858,440, filed Jul. 25, 2013, which is hereby incorporated by reference in its entirety herein.
- The present application relates to the field of toy vehicles. More particularly, the described embodiments relate to a remote control vehicle with on-board telemetry systems that monitor performance and a remote control application that monitors inputs by a user for storage and sharing over a cloud-based computer network.
- FIG. 1 is a schematic diagram showing a system for controlling remote control vehicles and for recording and sharing of data relating to the vehicles.
- FIG. 2 is a schematic diagram showing details for a mobile device, a server, and a related database for one embodiment of the system of FIG. 1.
- FIG. 3 is a schematic diagram showing details for a remote controlled vehicle and track for one embodiment of the system of FIG. 1.
- FIG. 4 is a front plan view of a detailed section of the gearing for a remote controlled car with an infrared RPM telemetry sensor.
- FIG. 5a is a schematic diagram showing a first embodiment of a track having a visual track indicator.
- FIG. 5b is a schematic diagram showing a second embodiment of a track having a visual track indicator.
- FIG. 5c is a schematic diagram showing a third embodiment of a track having a visual track indicator.
- FIG. 6 is a front plan view of a tablet computer showing an embodiment for a user interface for use with the system of FIG. 1.
- FIG. 7 is a flow chart showing a method for operating the system of FIG. 1.
- FIG. 8 is a flow chart showing a method for detecting and transmitting data in connection with the method of FIG. 7.
- FIG. 9 is a flow chart showing a method for controlling the start of the vehicle.
-
FIG. 1 shows a system 100 that utilizes one or more cloud-based servers 110 to store data in a database 112 related to the operation of one or more remote controlled vehicles 140, 142. Vehicle 140, for instance, is controlled by a first tablet computer 130. An app (not shown in FIG. 1) running on the tablet computer 130 allows a user to input commands to the vehicle 140. The vehicle 140, in turn, utilizes telemetry sensors to record data about its operation, such as its speed, the rotations per minute (RPM) of one or more wheels, acceleration, distance travelled, etc. These sensors are read using digital logic found on the vehicle 140. The data from these sensors is then transmitted to the tablet computer 130 for storage and analysis using the app operating on the tablet computer 130. In some embodiments, the remote controlled vehicle 140 operates on a track 150. In these embodiments, the track 150 can contain indicators that can be read by track sensors found on the vehicle 140. The track sensors can be, for instance, image sensors that read data, lines, and other markings found on the track. The information read by the track sensors can also be transmitted by the remote controlled vehicle 140 to the tablet computer 130 for analysis and storage. - The
tablet computer 130 stores the telemetry and track-related data that it receives from the vehicle 140 in its memory. This data can be transferred through a wide-area network 120 (such as the Internet) to one or more remote servers 110 for storage in the remote database 112. In addition, the tablet computer 130 can record the input commands that it received from the user and store these inputs along with the telemetry and track data in its memory for transmission to the remote server 110. In one embodiment, this information is grouped together for each race (or “run”) through the track by the first vehicle 140. A run is a particular race by the vehicle 140 through the track 150, and may constitute one or more laps of the track 150. In other embodiments, the vehicle 140 need not operate on a particular track 150, and a run is a set time period during which the remote controlled vehicle 140 is operated and data is collected and stored. - A second user using a
second tablet computer 132 can access the input commands, telemetric data, and track data that the first user stored in the database 112. This information can be used by the second tablet computer 132 to compare the first user's performance with car 140 to the second user's ability to control car 142. In particular, the second user can lay out a track 152 that is identical to the track 150 used by the first user. Since the second tablet computer 132 has downloaded all of the information about the first user's run on their track 150, the second user can use their tablet computer 132 to control their vehicle 142 through the same track layout 152 and compare the results. In effect, the two physical vehicles 140, 142 can race against each other over the same track layout 150, 152, even though the second vehicle 142 may make its run through the track 152 at a later time and at a different location than the run made by the first vehicle 140 through its track 150.
-
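The head-to-head comparison described above — computing a lead or deficit between a local run and a downloaded third-party run — can be sketched as follows. This is an illustrative sketch only: the run format (time-stamped distance samples), the function names, and the interpolation scheme are assumptions, not taken from the patent.

```python
# Hypothetical sketch: compare two recorded runs head-to-head.
# A "run" here is a list of (timestamp_seconds, distance_m) samples,
# as might be reconstructed from stored, time-coded telemetry data.
from bisect import bisect_right

def distance_at(run, t):
    """Linearly interpolate the distance traveled at time t."""
    times = [sample[0] for sample in run]
    i = bisect_right(times, t)
    if i == 0:
        return run[0][1]          # before the first sample
    if i == len(run):
        return run[-1][1]         # after the last sample
    (t0, d0), (t1, d1) = run[i - 1], run[i]
    return d0 + (d1 - d0) * (t - t0) / (t1 - t0)

def lead(run_a, run_b, t):
    """Positive result: run_a is ahead of run_b at time t (meters)."""
    return distance_at(run_a, t) - distance_at(run_b, t)

run_a = [(0.0, 0.0), (1.0, 2.0), (2.0, 5.0)]   # local user's run
run_b = [(0.0, 0.0), (1.0, 1.5), (2.0, 4.0)]   # downloaded third-party run
print(lead(run_a, run_b, 1.5))  # 0.75
```

A display such as interface 600 could sample `lead()` continuously during a re-run to show how far ahead or behind the opponent the vehicle is at any point in the race.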
FIG. 2 shows a remote control vehicle 200 that is in communication with a mobile device 210. In particular, the vehicle is communicating with a local wireless interface 218 found on the mobile device 210. This interface may take the form of a BLUETOOTH® connection that complies with one of the standards of the Bluetooth Special Interest Group (such as Bluetooth 4.0). Alternatively, the interface 218 may be a Wi-Fi interface that utilizes one of the Institute of Electrical and Electronics Engineers' (IEEE) 802.11 standards. - The
mobile device 210 can take the form of a smart phone or tablet computer. As such, the device 210 will include a display 212 for displaying information to a user, a user input mechanism 214 for receiving user input commands (such as a touch screen integrated into the display 212), and a processor 216 for processing instructions and data for the device 210. The display 212 can use LCDs, OLEDs, or similar technology to provide a color display for the user. The processor 216 can be a general purpose CPU, such as those provided by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.), or a mobile specific processor, such as those designed by ARM Holdings (Cambridge, England). - The
mobile device 210 also has the local wireless interface 218 for interfacing with the remote control vehicle 200, and a wide area network interface 220 for connecting with a network 250. In some embodiments, these two interfaces 218, 220 can be combined; for instance, the mobile device 210 may interface with both the remote controlled vehicle 200 and the network 250 over a single Wi-Fi network interface. The mobile device 210 further includes a memory 230 for storing processing instructions and data. This memory is typically solid-state memory, such as flash memory. Mobile devices such as device 210 generally use specific operating systems 232 designed for such devices, such as iOS from Apple Inc. (Cupertino, Calif.) or ANDROID OS from Google Inc. (Menlo Park, Calif.). The operating system 232 is stored on the memory 230 and is used by the processor 216 to provide a user interface for the display 212, to receive input over the user input device(s) 214, to handle communications for the device 210 over the interfaces 218, 220, and to run applications stored in the memory 230, such as the remote control app 234. - The
remote control app 234 is responsible for receiving user input 214 related to the control of the remote controlled vehicle 200 and ensuring that these inputs are relayed to the vehicle 200 over interface 218. In addition, the app 234 receives data from the vehicle 200 over interface 218 and stores this data in memory 230. In particular, the app 234 may receive car telemetry data 238 and track related data 240. In addition, some embodiments of the app 234 allow the user to request that the vehicle 200 take video or still images using an image sensor found on the vehicle 200. This image data 242 is also received by the app 234 and stored in memory 230. In addition to storing this data 238, 240, 242, the app 234 also generates a user interface on the display 212 and shares this data in real time with the user over the display 212. Finally, the app 234 is responsible for connecting with a remote server 260 over network interface 220 and for sharing its data with the server 260. The app 234 can also request data from the server 260 concerning previous runs or runs made by third-parties, and store this data in memory 230. - The
server 260 contains a programmable digital processor 262, such as a general purpose CPU manufactured by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.). The server 260 further contains a wireless or wired network interface 264 to communicate with remote computing devices, such as mobile device 210, over the network 250. The processor 262 is programmed using a set of software instructions stored on a non-volatile, non-transitory, computer readable memory 266, such as a hard drive or flash memory device. The software typically includes operating system software 268, such as LINUX (available from multiple companies under open source licensing terms) or WINDOWS (available from Microsoft Corporation of Redmond, Wash.). - The
processor 262 controls the communication with mobile device 210 under the direction of an application program 269 residing on the server memory 266. The application program 269 is further responsible for maintaining data within a database 270. The database 270 may be stored in the memory 266 of the server 260 as structured data (such as separate tables in a relational database, or as database objects in an object-oriented database environment). Alternatively, the database 270 may be stored and managed in a separate device, such as a separate database server computer and/or a storage area network storage device. Database programming stored on the memory 266 directs the processor 262 to access, manipulate, update, and report on the data in the database 270. This database programming can be considered part of the application programming 269.
-
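One possible realization of the database entities and relationships of FIG. 2 is a set of relational tables with foreign keys, including an associative table for the many-to-many user/vehicle relationship and a run table tying each run to a single user and vehicle. The sketch below uses Python's standard-library sqlite3; all table and column names are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch of the FIG. 2 entities as relational tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users    (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE vehicles (id INTEGER PRIMARY KEY, model TEXT);
-- associative table: a user may have several vehicles and a
-- vehicle may be shared by several users
CREATE TABLE user_vehicles (
    user_id    INTEGER REFERENCES users(id),
    vehicle_id INTEGER REFERENCES vehicles(id));
-- each run is associated with exactly one user and one vehicle
CREATE TABLE runs (
    id         INTEGER PRIMARY KEY,
    user_id    INTEGER REFERENCES users(id),
    vehicle_id INTEGER REFERENCES vehicles(id));
-- time-coded readings associated with a single run
CREATE TABLE telemetry (run_id INTEGER REFERENCES runs(id),
                        t REAL, rpm REAL);
""")
conn.execute("INSERT INTO users VALUES (1, 'alice')")
conn.execute("INSERT INTO vehicles VALUES (1, 'rc-car')")
conn.execute("INSERT INTO runs VALUES (1, 1, 1)")
conn.execute("INSERT INTO telemetry VALUES (1, 0.5, 1200.0)")

# A telemetry reading reaches its user indirectly, through its run,
# mirroring the indirect associations described for FIG. 2.
row = conn.execute("""SELECT u.name FROM telemetry td
                      JOIN runs  r ON td.run_id  = r.id
                      JOIN users u ON r.user_id  = u.id""").fetchone()
print(row[0])  # alice
```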
FIG. 2 shows the database 270 with tables or objects for users 272, vehicles 274, races or runs 276, user inputs 278, telemetry data 280, track data 282, and image data 284. Relationships between the database entities are represented in FIG. 2 using crow's foot notation. For example, FIG. 2 shows that each user 272 can be associated with one or more vehicles 274, since each actual user may have multiple remote control vehicles 200. In addition, a physical vehicle 200 may be shared by more than one user, so the database 270 allows a particular vehicle database entity 274 to be associated with multiple user database entities 272. A particular run of the vehicle 200 over a track is represented by a single run database entity 276, which is associated with a single user 272 and a single vehicle 274. For each run 276, the database 270 can store and track multiple user inputs 278, telemetry data readings 280, track data readings 282, and images and/or video files 284. Associations or relationships between the database entities shown in FIG. 2 can be implemented through a variety of known database techniques, such as through the use of foreign key fields and associative tables in a relational database model. In FIG. 2, associations are shown directly between two database entities, but entities can also be associated through a third database entity. For example, an image file 284 is directly associated with one run 276, and through that relationship the image file 284 is also associated with a single vehicle 274 and a single user 272. - Each of the database entities 272-284 shown in
FIG. 2 can be implemented as a separate relational database table within a relational database 270. Alternatively, each of the entities 272-284 could be implemented using a plurality of related database tables. It is also possible to implement the entities 272-284 as objects in an object oriented database. The distinctions made between the entities 272-284 in FIG. 2 and the following description are made for ease in understanding the data maintained and manipulated by the server 260, and should not be seen to limit the scope of the invention described herein.
-
FIG. 3 schematically reveals the major electrical components of a remote controlled vehicle 200. The vehicle 200 has a processor 310 that controls the major functions of the vehicle 200. In the embodiment shown in FIG. 3, the processor 310 operates under the control of operational programming 322, which is stored in digital memory 320 that is accessible by the processor 310. Alternatively, the processor 310 and the operational programming 322 could be combined into an application-specific integrated circuit (or “ASIC”) or a field programmable device (such as an FPGA) specifically designed to handle the processing requirements of the vehicle 200. Whether formed as a general-purpose processor, an ASIC, or an FPGA, the processor 310 receives instructions for operating the vehicle from a tablet computer 210 over a local wireless interface 318. As described above, this interface 318 may operate under Bluetooth protocols, IEEE 802.11 protocols, or any similar local, wireless protocols. Based on these control instructions, the operational programming 322 will control the wheel motor(s) 330 and the steering motor 332 to control forward and backward motion and the steering of the vehicle 200. - The instructions received from the
tablet computer 210 may include directions for the vehicle 200 to take still or video images on its image sensor(s) 316. In some embodiments, the resulting images 324 are stored in the vehicle memory 320 for transmission back to the tablet computer 210 when convenient. In the preferred embodiment, the data from the image sensors 316 is fed to the mobile device 210 as a live feed, allowing the tablet computer 210 to generate still and video image files as desired by the user. - While the
vehicle 200 is moving under control of these instructions, the processor 310 is monitoring the telemetry sensors 312 to obtain data about how the vehicle 200 is moving and behaving. These sensors 312 may include RPM sensors that track wheel revolutions for the vehicle, accelerometer sensors that track acceleration of the vehicle, and even sensors that measure the performance and characteristics (such as heat) of the wheel motors 330. Preferably, the telemetry sensors 312 include at least an RPM sensor that can indicate wheel rotations, from which can be derived the vehicle speed and distance traveled. In some embodiments, separate RPM sensors 312 may be placed on wheels driven by the wheel motors 330 and on non-driven wheels. The sensors 312 at the powered wheels may detect wheel spin during periods of hard acceleration, at which time the sensors 312 at the non-driven wheels will give a better indication of the vehicle's current speed and distance traveled.
-
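The derivation of speed and distance from wheel-rotation counts, together with a wheel-spin check that compares driven and non-driven wheels, might look like the following sketch. The pulses-per-revolution and wheel-diameter constants, and the spin threshold, are invented for illustration; a real vehicle would fix these from its gearing.

```python
# Sketch under assumptions: converting raw pulse counts from an RPM
# sensor into speed and distance, and flagging wheel spin by comparing
# driven and non-driven wheels.
import math

PULSES_PER_WHEEL_REV = 12   # fixed by the gearing (assumed value)
WHEEL_DIAMETER_M = 0.065    # assumed wheel size

def wheel_stats(pulses, dt_s):
    """Return (speed m/s, distance m) for pulses counted over dt_s."""
    revs = pulses / PULSES_PER_WHEEL_REV
    dist = revs * math.pi * WHEEL_DIAMETER_M
    return dist / dt_s, dist

def wheel_spin(driven_pulses, free_pulses, threshold=1.5):
    """Driven wheels turning much faster than free wheels => spin."""
    return driven_pulses > threshold * max(free_pulses, 1)

speed, dist = wheel_stats(pulses=120, dt_s=1.0)
print(round(dist, 3))        # ten wheel revolutions worth of travel
print(wheel_spin(240, 120))  # True: driven wheels spinning
```

During detected spin, the non-driven wheel's count would be preferred for the speed and distance estimate, as the description notes.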
FIG. 4 shows one embodiment of a telemetry sensor 400. In particular, sensor 400 is an infrared RPM sensor that measures rotations of a wheel on the vehicle 200. This sensor 400 utilizes an infrared transmitting tube 410 that transmits an infrared beam through a gear 420 that is used to drive a wheel of the vehicle 200. On the other side of the gear 420 is an infrared receiving tube 430 that receives and senses the infrared beam transmitted by transmitting tube 410. When the gear 420 rotates, a portion of the gear 420 will interrupt the beam during the rotation. By counting the interruptions in the infrared signal detected by the infrared receiving tube 430, the processor/logic 310 can determine rotations of the wheel on the vehicle 200. Each interruption may not indicate a complete wheel rotation, as the gear 420 will likely interrupt the infrared signal multiple times in a single rotation, and a single rotation of the gear 420 will not likely lead to a single rotation of the wheel. Nonetheless, based on the gearing of the vehicle and the construction of the gear 420, there will be a known relationship between interruptions in the light beam and rotations of the wheel. - In addition to
monitoring telemetry sensors 312 such as the infrared RPM sensor 400 shown in FIG. 4, the processor 310 also monitors one or more track sensors 314 on the vehicle 200 (as seen in FIG. 3). These sensors 314 read track indicators 352 found on a specially designed track 350. In the preferred embodiment, the track 350 constitutes a plurality of specially designed track pieces of a plurality of known lengths that can be configured into multiple track layouts. These track segments are constructed with track indicators 352 that can be read by the track sensor 314 on the vehicle 200. As explained below in connection with FIGS. 5a-5c, these track indicators 352 can be visual markings on the surface of the track 350. In these cases, the track sensor 314 takes the form of an image sensor 314 that, together with the processor 310, can recognize and interpret the visual markings 352. In other embodiments, the track indicators 352 can take the form of radio frequency identifiers, such as specially designed RFID tags, that can be read by a reading sensor 314 found on the vehicle 200 as the vehicle 200 passes over the track 350. - As explained above, in one embodiment the
track sensor 314 is an image sensor that can detect visual markings 352 found on the track 350. FIGS. 5a-5c show three example markings that may be placed on the track 350. In FIG. 5a, two track segments 510, 520 are shown, with each track segment having alternating background colors, a lighter background 512 always being followed by a darker background 514. Each segment 512, 514 has a known length. The image sensor 314 is pointed downward on the vehicle 200 toward the track segments 510, 520. When the sensor 314 notes a change in background color on the track 350, the processor 310 will know that the vehicle 200 has traveled the distance necessary to move to the next background segment 512, 514. When this information is relayed to the mobile device 210, the device 210 will be able to verify that the vehicle 200 was indeed travelling along the track 350. Furthermore, this track traversal information can be compared with distance information obtained from the telemetry sensor 312 to ensure that all of the measurements are consistent with vehicle 200 movement on the track 350. This prevents competitors from spoofing the system 100, such as by faking a run through a track while holding a vehicle 200 in the air.
-
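The distance-from-transitions idea of FIG. 5a, and the anti-spoofing cross-check against telemetry-derived distance, can be sketched as follows. The segment length, color encoding, and tolerance are assumptions made for illustration.

```python
# Illustrative sketch: each change in track background color means the
# vehicle advanced one segment band of known length (FIG. 5a). Comparing
# the track-derived distance with the telemetry-derived distance catches
# "runs" made with the vehicle held in the air.

SEGMENT_LENGTH_M = 0.30  # assumed known length of each background band

def track_distance(color_samples):
    """Count background transitions in a stream of color readings."""
    transitions = sum(1 for a, b in zip(color_samples, color_samples[1:])
                      if a != b)
    return transitions * SEGMENT_LENGTH_M

def plausible_run(track_dist_m, telemetry_dist_m, tolerance=0.2):
    """Reject runs whose two distance estimates disagree badly."""
    if telemetry_dist_m == 0:
        return track_dist_m == 0
    return abs(track_dist_m - telemetry_dist_m) / telemetry_dist_m <= tolerance

samples = ["light", "light", "dark", "dark", "light", "dark"]
d = track_distance(samples)          # 3 transitions * 0.30 m
print(plausible_run(d, 0.95))        # True: estimates agree
print(plausible_run(0.0, 0.95))      # False: wheels spun, track never moved
```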
FIG. 5b shows another technique for creating visual track markings 352. In this Figure, the track segments 530 and 540 contain colored lines 532, 534, 536 that cross the track. The sensors 314 can read the presence and color of these lines 532-536 and transmit this information to the processor 310. Using this information, the tablet computer 210 can also determine distance traversal along the track 350. By alternating three colors, the tablet computer 210 can identify situations where a vehicle has reversed directions and is traversing backwards along the track 350. The use of three different, identifiable shades or colors could also be used in the embodiment shown in FIG. 5a to detect backwards movement along the track. - In
FIG. 5c, the track segments 550 and 560 each contain a visual bar code that can be read by the track sensor 314 as the vehicle 200 passes along the track 350. Each code can uniquely identify a track segment or a type of track segment. In one embodiment, the track segment codes identify the length and shape of the related segments, with this information being stored in the vehicle memory 320, the tablet computer 210, or on the remote server (110, 260). In other configurations, this information is directly read from the codes on the track segments 550, 560. - Regardless of which marking system is used, it is possible to create special markings on the track segments 510-560 to indicate unique locations on the
track 350. For instance, a finish line could be created with a unique background color intrack segment 510, or with a unique color line insegment 530, or with a unique code insegment 550. In this way, thevehicle 200 can sense when it crosses the finish line, in order to stop timing a particular run, or to count laps when a particular race involves multiple laps of thetrack 350. -
FIG. 6 shows a user interface 600 on a mobile device 210 that is generated by the remote control app 234. The main portion of the interface 600 shows a live view 602 of the image being seen by the image sensor 316 on the vehicle 200. In this case, the vehicle 200 is seen following a hiker on a dirt trail. The interface 600 allows the user to control the movement of the controlled vehicle through direction controls 610 and gear controls 620. The direction controls 610 are shown as four directional arrows superimposed on the image 602. As the tablet computer 210 uses a touch screen for user input 214, the user need only touch one of the four direction arrows 610 to change the direction or steer the vehicle 200. The gear controls 620 are composed of an up and down arrow superimposed over the image 602, allowing the user to change gears by simply pressing one of the arrows 620. Speed and gear information 630 are shown on the bottom of the interface 600. - At the top of the interface are a variety of controls. The touch mode control changes the steering controls from the
directional controls 610 to tilt control. Tilt control uses sensors within the mobile device 210 to determine when a user tilts the device 210 and sends signals to steer the vehicle in the direction of the tilt. The timer display at the center top of the interface 600 allows a user to see the current time of a particular “run.” The photo button stores the current image as a still image, while the video button causes the mobile device 210 to record the image stream coming from the vehicle 200 as a video file. In FIG. 6, a red circle is placed on the video button to indicate that a video file is currently being created of the image feed being displayed at location 602. The help button presents a help system to the user, while the screen control menu allows the user to change various settings with the interface 600 and the app 234 as a whole.
-
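The tilt-control mapping — from a device tilt angle to a steering signal — might be implemented along these lines. The angle limit, dead zone, and normalized output range are illustrative assumptions; the patent does not specify them.

```python
# Hedged sketch of the tilt-control mode: map the mobile device's roll
# angle to a steering command in [-1.0, 1.0].

MAX_TILT_DEG = 45.0   # tilts beyond this steer at full lock (assumption)

def tilt_to_steering(roll_deg, dead_zone_deg=3.0):
    """Return steering in [-1.0, 1.0]; negative = left, positive = right."""
    if abs(roll_deg) < dead_zone_deg:
        return 0.0    # ignore hand tremor near level
    clamped = max(-MAX_TILT_DEG, min(MAX_TILT_DEG, roll_deg))
    return clamped / MAX_TILT_DEG

print(tilt_to_steering(0.5))    # 0.0 (inside dead zone)
print(tilt_to_steering(22.5))   # 0.5
print(tilt_to_steering(-90.0))  # -1.0 (clamped to full left)
```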
FIG. 7 shows a method 700 for utilization of the vehicle 200, mobile device 210, server 260, and track 350. The method starts at step 705 with the arrangement of track segments, such as segments 510 and 520, into a track 350. The track 350 may be arranged in a complex loop, where the vehicle 200 can traverse the closed loop multiple times for a single race. This is not necessary, however, as the track 350 may be arranged with a separate beginning and an end. - At
step 710, a wireless communication path is established between thevehicle 200 and themobile device 210. As explained above, this can be a Bluetooth connection, a Wi-Fi connection, or any similar wireless data communication path between thedevices step 715, thevehicle 200 is placed on thetrack 350. Atstep 720, themobile device 210 is used to control the vehicle in a race or run around thetrack 350. At the same time (or immediately after the race), the mobile device acquires data related to the race (step 725). - These
steps 720 and 725 are shown in more detail in flow chart 800 in FIG. 8. In particular, the mobile device 210 receives car control input from the user at step 805, and then transmits control signals reflecting that input to the vehicle 200 in step 810. The vehicle 200 receives those signals at step 815, and then adjusts the car performance in step 820. In particular, the vehicle can manipulate the wheel motors 330 and the steering motor 332 to control motion and direction of the vehicle 200. Other control inputs and responses are possible depending on the capabilities of the vehicle 200. For example, the vehicle may have additional motors to control such things as a camera mount, a crane, a plow blade, or additional drive wheels; or the vehicle may have additional electronic components such as a microphone, multiple cameras, a speaker, a touch sensor, etc. The user interface 600 allows a user to control such features, and control inputs for these features can be transmitted at step 810, received at step 815, and implemented at step 820. - While the
vehicle 200 is in motion, the telemetry sensors 312, the track sensors 314, and the image sensors 316 will be read, and the resulting data will be transmitted to the mobile device 210 at step 840. The mobile device 210 will receive this data at step 845, and then display and/or store the data. In other embodiments, the data from the sensors 312-316 will be stored in the memory 320 of the vehicle 200 for transmission to the mobile device 210 at a later time. In the preferred embodiment, all of this data will be time coded so as to be able to compare each element of data temporally with the other data, including the received control signals. - Returning to step 730 of
FIG. 7, the mobile device 210 will upload the data that it acquires for a particular run of the vehicle to the cloud server 260 for storage in the database 270. This includes not only the telemetry data 326 and the track data 328, but also the user inputs 236 that were used to drive the vehicle 200. In some embodiments, even the image data is uploaded to the server 260 and stored in the database 270. - At
step 735, themobile device 210 downloads from theserver 260 data from a race run by a third party. This allows the user of themobile device 210 to compare the third-party race results with their own. If the third party ran the race on the same track configuration, it would be possible to compare the performance of each user head-to-head. The total time for the race could be compared to determine a race winner. Individual statistics could be compared, such as fastest lap time, longest wheel skid, etc. If the user elects to perform their race again atstep 745, the third-party race results could be displayed live on theinterface 600 while the user controlled theirvehicle 200 over thetrack 350. Theinterface 600 could display the user's lead over the opponent, or how far behind their opponent theirvehicle 200 is at any point in the race. Theinterface 600 could even superimpose an image of the third-party vehicle on theimage portion 602 of theinterface 600 whenever the user was running behind the third party during their race. - At
step 750, the third-partyuser input data 236 could even be used to control the user'svehicle 200. While environmental differences and variations in starting positions may preventinput data 236 from a multi-lap third-party race from recreating the entire race with the user's vehicle, this ability could prove useful to recreate maneuvers performed by third parties. For example, a third-party may be able to perform interesting tricks with their vehicle based on a particular sequence of input commands. The third-party could upload the run data for the trick. When the current user sees video of the trick (such as on a website operated using the data in database 270), they decide to download the third-party run data for the trick. Atstep 745, they use the third-party user inputs to control their ownphysical vehicle 200 as the trick is recreated in front of them. -
FIG. 9 shows an additional method 900 that uses the sensors on the vehicle 200 to create a controlled start mode. One problem with racing toy vehicles 200 is that their motors 330 are usually engineered to provide a great deal of torque to the vehicle wheels relative to the vehicle's size and weight. Novice users tend to run their first race at “full throttle” and soon find the vehicle 200 impossible to control. When they attempt their next race at a lower throttle, they find that the vehicle 200 is difficult to start quickly from a stopped position without sufficient torque being supplied by the wheel motor 330. Method 900 solves this issue by implementing a controlled start mode that may be useful for novice users. - The
method 900 starts at step 905, in which the mobile device 210 receives a command through the user interface to put the vehicle 200 into a controlled start mode. At step 910, the mobile device 210 transmits this mode change to the vehicle 200. At step 915, the vehicle 200 receives this transmission and sets the vehicle 200 to operate in this mode. - Some time later, the
mobile device 210 receives a command through its user interface to increase the throttle of the vehicle 200 to a user-selected level (step 920). This command is then sent as an instruction to the vehicle 200 at step 925. The vehicle 200 receives this command at step 930. However, instead of operating the motor 330 at the user-selected level, the vehicle 200 recognizes that it is not currently moving and has been set to operate in the controlled start mode. As a result, the vehicle starts the motor 330 operating at maximum torque (aka “throttle”) in step 930. This allows the vehicle 200 to start as quickly as possible. - At
step 935, thevehicle 200 senses that it has begun to move. This sensing can be accomplished through thetrack sensor 314. In other embodiments, the telemetry sensor can determine that the vehicle wheels are now moving thevehicle 200. Once thevehicle 200 has begun to move, the vehicle automatically reduces the amount of torque being provided from themotor 330 to the vehicle wheels (step 940). In one embodiment, the amount torque being provided by themotor 330 atstep 940 is completely controlled by the throttle commands received from themobile device 210. If the user is request 90% throttle,step 940 will now provide 90% throttle. In another embodiment, user throttle commands are not allowed to drive themotor 330 above a pre-set throttle/torque level, where such level is below the maximum level provided atstep 930. For example, to help novice users control the vehicle, the controlled start mode may prevent the motor from ever providing greater than 60% throttle (other than during start-up at step 930). If the user requests 20% throttle, themotor 330 is operated at 20%, but if the user requests 100% throttle, themotor 330 is operated at the pre-set level (such as 60%). - In another embodiment, the
vehicle 200 always operates its motor 330 as instructed by the mobile device 210. In this embodiment, the mobile device 210 becomes responsible for operating the vehicle 200 in controlled start mode. To accomplish this, the vehicle 200 must constantly provide telemetry data 326 and track data 328 to the mobile device 210. In this way, the mobile device 210 will know when the vehicle 200 is stopped and when it begins moving. The mobile device 210 will instruct the vehicle 200 to operate the motor 330 at full throttle when starting, as described above in connection with step 930. When the data received from the vehicle 200 indicates to the mobile device 210 that the vehicle 200 has started moving, the mobile device 210 will change its throttle command to the vehicle 200 to operate under controlled throttle (as described above in connection with step 940). - The many features and advantages of the invention are apparent from the above description. Numerous modifications and variations will readily occur to those skilled in the art. Since such modifications are possible, the invention is not to be limited to the exact construction and operation illustrated and described. Rather, the present invention should be limited only by the following claims.
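As a closing illustration, the controlled start logic of FIG. 9 — full throttle at step 930 until motion is sensed, then user throttle clamped at step 940 — can be summarized in a short sketch. The 60% cap follows the example in the description; the function and constant names are illustrative assumptions.

```python
# Minimal sketch of the controlled start mode of FIG. 9 (vehicle-side
# variant): full throttle while stationary, then a pre-set cap once moving.

START_BURST = 1.00    # full throttle while stationary (step 930)
THROTTLE_CAP = 0.60   # pre-set ceiling once moving (step 940 example)

def controlled_throttle(user_throttle, is_moving):
    """Return the throttle actually applied to the wheel motor."""
    if not is_moving and user_throttle > 0.0:
        return START_BURST                   # maximum torque to get rolling
    return min(user_throttle, THROTTLE_CAP)  # obey the user up to the cap

print(controlled_throttle(0.20, is_moving=False))  # 1.0: start burst
print(controlled_throttle(0.20, is_moving=True))   # 0.2: below the cap
print(controlled_throttle(1.00, is_moving=True))   # 0.6: clamped
```

In the mobile-device-side embodiment the same decision would run on the device 210 instead, fed by the constantly reported telemetry data 326 and track data 328.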
Claims (12)
1. A remote control vehicle system comprising:
a) a remote control vehicle having:
i) a wireless vehicle communication interface for receiving commands and transmitting data,
ii) wheels,
iii) a motor to control wheels in response to the received commands,
iv) a wheel sensor to create wheel rotation data, and
v) a vehicle processor to transmit wheel rotation data across the wireless vehicle communication interface;
b) a mobile computing device having:
i) at least one wireless mobile device communication interface for
(1) transmitting commands to the vehicle,
(2) receiving wheel rotation data from the vehicle,
(3) communicating with a remote server computer,
ii) a user interface to receive commands to be transmitted to the vehicle,
iii) mobile device memory to store commands and received wheel rotation data, and
iv) a mobile device processor to manage commands and received wheel rotation data and to direct transmission of commands and data to the remote server computer;
c) the remote server computer having:
i) a server communication interface for receiving commands and wheel rotation data from the mobile computing device,
ii) a server memory containing a database of structured data, and
iii) a server processor for receiving commands and wheel rotation data over the server communication interface and for storing the input commands and wheel rotation data in the database.
2. The vehicle system of claim 1 , wherein the mobile device processor receives third-party commands from the remote server computer and transmits the third-party commands to the remote control vehicle.
3. The vehicle system of claim 2 , wherein the third-party commands were received by the remote server from a third-party mobile computer device that previously used the third-party commands to drive a third-party vehicle.
4. The vehicle system of claim 1 , further comprising:
d) a track having track indicators,
wherein the remote control vehicle further comprises a track sensor that reads the track indicators off the track as the vehicle passes over the track.
5. The vehicle system of claim 1 , wherein the vehicle processor operates in a controlled start mode when the vehicle starts from a stopped position, wherein the torque provided by the motor to the wheels is limited to a predetermined value when the vehicle begins to move.
6. The vehicle system of claim 5 , wherein the torque provided by the motor to the wheels is maximized when the vehicle is in the stopped position.
7. The vehicle system of claim 5 , wherein the wheel sensor is used to determine whether or not the vehicle is moving.
8. The vehicle system of claim 5 , wherein the mobile device controls the torque provided by the motor during the controlled start mode.
9. The vehicle system of claim 5 , wherein the vehicle controls the torque provided by the motor during the controlled start mode.
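Claims 5 through 9 permit the torque limit to be enforced either on the vehicle or on the mobile device. A minimal sketch of the vehicle-side variant (claim 9) follows; the function name and both constants are invented, and the specific limit value is arbitrary — the claims only require some predetermined value:

```python
MAX_TORQUE = 1.0          # full torque, allowed while the vehicle is stopped
START_TORQUE_LIMIT = 0.5  # the claimed "predetermined value" (arbitrary here)

def motor_torque(wheel_rpm, requested):
    """Clamp the requested torque during controlled start mode.

    While the wheel sensor reports no rotation, the full requested torque
    is allowed (claim 6); once the vehicle begins to move, torque is
    limited to the predetermined value (claim 5). The wheel sensor supplies
    the moving/stopped determination (claim 7).
    """
    if wheel_rpm == 0:
        return min(requested, MAX_TORQUE)
    return min(requested, START_TORQUE_LIMIT)
```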
10. A method for operating a remote control vehicle comprising:
a) laying out track having track indicators;
b) locating the remote control vehicle on the track;
c) controlling the remote control vehicle by inputting commands on a user interface on a mobile device, wherein the mobile device sends commands to the remote control vehicle;
d) on the remote control vehicle, reading track data by sensing the track indicators found on the track using a track sensor on the vehicle as the vehicle passes over the track;
e) sending track data from the remote control vehicle to the mobile device;
f) sending track data and the commands from the mobile device to a remote server.
11. The method of claim 10 , wherein the remote control vehicle accumulates wheel rotation data and sends wheel rotation data to the mobile device.
12. The method of claim 11 , wherein the wheel rotation data is sent with the track data and commands from the mobile device to the remote server.
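The method of claims 10 through 12 can be illustrated with a short sketch of the data pipeline: track indicators read by the vehicle's track sensor as it passes over the track, then bundled by the mobile device with the input commands and wheel rotation data for upload to the remote server. All function names and dictionary keys here are invented for illustration:

```python
def read_track(track_sections):
    """Simulate the track sensor reading each section's indicator in passing."""
    return [section["indicator"] for section in track_sections]

def build_upload(commands, track_data, wheel_rotation_data):
    """Bundle what the mobile device forwards to the remote server
    (commands and track data per claim 10, wheel rotation data per
    claims 11-12)."""
    return {
        "commands": commands,
        "track_data": track_data,
        "wheel_rotations": wheel_rotation_data,
    }
```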
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/341,732 US20150031268A1 (en) | 2013-07-25 | 2014-07-25 | Toy vehicle with telemetrics and track system and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361858440P | 2013-07-25 | 2013-07-25 | |
US14/341,732 US20150031268A1 (en) | 2013-07-25 | 2014-07-25 | Toy vehicle with telemetrics and track system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150031268A1 true US20150031268A1 (en) | 2015-01-29 |
Family
ID=52390878
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/341,732 Abandoned US20150031268A1 (en) | 2013-07-25 | 2014-07-25 | Toy vehicle with telemetrics and track system and method |
Country Status (1)
Country | Link |
---|---|
US (1) | US20150031268A1 (en) |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110106339A1 (en) * | 2006-07-14 | 2011-05-05 | Emilie Phillips | Autonomous Behaviors for a Remote Vehicle |
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9511353B2 (en) | 2013-03-15 | 2016-12-06 | Clean Diesel Technologies, Inc. (Cdti) | Firing (calcination) process and method related to metallic substrates coated with ZPGM catalyst |
US9511350B2 (en) | 2013-05-10 | 2016-12-06 | Clean Diesel Technologies, Inc. (Cdti) | ZPGM Diesel Oxidation Catalysts and methods of making and using same |
US9771534B2 (en) | 2013-06-06 | 2017-09-26 | Clean Diesel Technologies, Inc. (Cdti) | Diesel exhaust treatment systems and methods |
US9545626B2 (en) | 2013-07-12 | 2017-01-17 | Clean Diesel Technologies, Inc. | Optimization of Zero-PGM washcoat and overcoat loadings on metallic substrate |
US9511358B2 (en) | 2013-11-26 | 2016-12-06 | Clean Diesel Technologies, Inc. | Spinel compositions and applications thereof |
US9555400B2 (en) | 2013-11-26 | 2017-01-31 | Clean Diesel Technologies, Inc. | Synergized PGM catalyst systems including platinum for TWC application |
US9475004B2 (en) | 2014-06-06 | 2016-10-25 | Clean Diesel Technologies, Inc. | Rhodium-iron catalysts |
US9579604B2 (en) | 2014-06-06 | 2017-02-28 | Clean Diesel Technologies, Inc. | Base metal activated rhodium coatings for catalysts in three-way catalyst (TWC) applications |
US9475005B2 (en) | 2014-06-06 | 2016-10-25 | Clean Diesel Technologies, Inc. | Three-way catalyst systems including Fe-activated Rh and Ba-Pd material compositions |
US9833725B2 (en) * | 2014-06-16 | 2017-12-05 | Dynepic, Inc. | Interactive cloud-based toy |
US20170052537A1 (en) * | 2014-09-02 | 2017-02-23 | Robert Valentin Salinas | System and Method for Control of Autonomous Vehicle by Mobile Device |
US9731279B2 (en) | 2014-10-30 | 2017-08-15 | Clean Diesel Technologies, Inc. | Thermal stability of copper-manganese spinel as Zero PGM catalyst for TWC application |
US10004997B2 (en) * | 2014-11-07 | 2018-06-26 | Meeper Technology, LLC | Smart phone controllable construction brick vehicle |
US20160129358A1 (en) * | 2014-11-07 | 2016-05-12 | Meeper Technology, LLC | Smart Phone Controllable Construction Brick Vehicle |
US10124256B2 (en) * | 2015-02-23 | 2018-11-13 | Peter Garbowski | Real-time video feed based multiplayer gaming environment |
US9987557B2 (en) * | 2015-02-23 | 2018-06-05 | Peter Garbowski | Real-time video feed based multiplayer gaming environment |
US20160243441A1 (en) * | 2015-02-23 | 2016-08-25 | Peter Garbowski | Real-time video feed based multiplayer gaming environment |
US9700841B2 (en) | 2015-03-13 | 2017-07-11 | Byd Company Limited | Synergized PGM close-coupled catalysts for TWC applications |
WO2016154938A1 (en) * | 2015-03-31 | 2016-10-06 | SZ DJI Technology Co., Ltd. | Systems and methods for analyzing flight behavior |
US9875584B2 (en) | 2015-03-31 | 2018-01-23 | SZ DJI Technology Co., Ltd | Systems and methods for monitoring flight |
US10692311B2 (en) | 2015-03-31 | 2020-06-23 | SZ DJI Technology Co., Ltd. | Systems and methods for monitoring flight |
US9951706B2 (en) | 2015-04-21 | 2018-04-24 | Clean Diesel Technologies, Inc. | Calibration strategies to improve spinel mixed metal oxides catalytic converters |
US10616310B2 (en) | 2015-06-15 | 2020-04-07 | Dynepic, Inc. | Interactive friend linked cloud-based toy |
US10533472B2 (en) | 2016-05-12 | 2020-01-14 | Cdti Advanced Materials, Inc. | Application of synergized-PGM with ultra-low PGM loadings as close-coupled three-way catalysts for internal combustion engines |
US9861964B1 (en) | 2016-12-13 | 2018-01-09 | Clean Diesel Technologies, Inc. | Enhanced catalytic activity at the stoichiometric condition of zero-PGM catalysts for TWC applications |
US10265684B2 (en) | 2017-05-04 | 2019-04-23 | Cdti Advanced Materials, Inc. | Highly active and thermally stable coated gasoline particulate filters |
US10652719B2 (en) | 2017-10-26 | 2020-05-12 | Mattel, Inc. | Toy vehicle accessory and related system |
WO2019087019A1 (en) * | 2017-10-30 | 2019-05-09 | Kovacic Mitja | Racing unit with toy vehicles, racing assembly and a system for managing racing units and for statistical data processing and publication, and a method for topping up and depleting an account of a racing unit user |
US11471783B2 (en) | 2019-04-16 | 2022-10-18 | Mattel, Inc. | Toy vehicle track system |
US11964215B2 (en) | 2019-04-16 | 2024-04-23 | Mattel, Inc. | Toy vehicle track system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20150031268A1 (en) | Toy vehicle with telemetrics and track system and method | |
US11899472B2 (en) | Aerial vehicle video and telemetric data synchronization | |
US10453333B1 (en) | Methods and apparatus for leveraging a mobile phone or mobile computing device for use in controlling model vehicles | |
KR102129798B1 (en) | Vehicle and method for controlling the same | |
US10831186B2 (en) | System for authoring, executing, and distributing unmanned aerial vehicle flight-behavior profiles | |
US9554508B2 (en) | Autonomous mobile robot | |
EP3970590B1 (en) | Method and system for controlling a robot cleaner | |
KR20200084934A (en) | Apparatus for controlling driving of vehicle and method for performing calibration thereof | |
US20160327389A1 (en) | Calibration Transfer Between Two Devices | |
WO2017218111A1 (en) | Method and system for saving a snapshot of game play and used to begin later execution of the game play by any user as executed on a game cloud system | |
WO2018017428A1 (en) | Method and system for accessing previously stored game play via video recording as executed on a game cloud system | |
WO2012096347A1 (en) | Network system, control method, controller, and control program | |
CN204425487U (en) | Flying Camera head unit, bicycle and system of riding | |
CN108873881A (en) | Portable mobile robot and its operating method | |
WO2016161426A1 (en) | Systems and methods for controlling pilotless aircraft | |
CN106020198B (en) | Somatosensory vehicle carrying method and somatosensory vehicle | |
JP6829513B1 (en) | Position calculation method and information processing system | |
CN105808062A (en) | Method for controlling intelligent device and terminal | |
US20210360849A1 (en) | Mover robot system and controlling method for the same | |
WO2014125867A1 (en) | Control device, computer program, mobile-body system, and control method | |
JP6560479B1 (en) | Unmanned aircraft control system, unmanned aircraft control method, and program | |
JP2018190363A (en) | Portable mobile robot and operation method thereof | |
KR102108170B1 (en) | Golf Drones | |
JP2020166673A (en) | Parking position guiding system | |
KR102548032B1 (en) | Ai golf cart and golf system comprising the ai golf cart |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SKUSION LLC, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WAITES, NIGEL;GOLDEN, STEPHEN;SIGNING DATES FROM 20170324 TO 20180105;REEL/FRAME:044580/0983 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |