WO2014013492A1 - Wireless interactive device system and method - Google Patents

Wireless interactive device system and method

Info

Publication number
WO2014013492A1
Authority
WO
WIPO (PCT)
Prior art keywords
interactive device
interactive
information
transceiver
memory
Prior art date
Application number
PCT/IL2013/050609
Other languages
French (fr)
Inventor
Eran ESHED
Original Assignee
Eshed Eran
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eshed Eran filed Critical Eshed Eran
Publication of WO2014013492A1 publication Critical patent/WO2014013492A1/en


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/211: Input arrangements for video game devices characterised by their sensors, purposes or types, using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/212: Input arrangements characterised by their sensors, purposes or types, using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/235: Input arrangements for interfacing with the game device, e.g. specific interfaces between game controller and console, using a wireless connection, e.g. infrared or piconet
    • A63F 13/245: Constructional details of input arrangements, e.g. game controllers, specially adapted to a particular type of game, e.g. steering wheels
    • A63F 9/143: Racing games, traffic games, or obstacle games characterised by figures moved by action of the players; electric
    • A63F 9/24: Electric games; games using electronic circuits not otherwise provided for
    • A63F 2009/2451: Output devices; visual, using illumination, e.g. with lamps
    • A63F 2009/2489: Remotely playable by radio transmitters, e.g. using RFID
    • A63F 2250/52: Miscellaneous game characteristics with a remote control
    • A63F 2300/1037: Input arrangements adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A63F 3/00643: Electric board games; electric features of board games
    • A63F 9/0468: Electronic dice; electronic dice simulators
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 2200/00: Computerized interactive toys, e.g. dolls

Definitions

  • a command is sent from a central console or terminal, such as a computer, game console, telephone, etc. to a separate device.
  • the separate device then performs some function based on the command that was sent.
  • Remote control toys, for example, receive commands or signals wirelessly and execute a predetermined action in response to that command.
  • Some interactive devices may also receive firmware updates wirelessly, which can change the rules by which the devices operate in different situations.
  • the interactive device may also send information back to the central console or terminal, indicating that it has received the command, executed the command, or updated its firmware.
  • Some devices seem to "sense" users' movements and/or their environment through various sensors located on the device. Examples of such devices include electronic game controllers, motion-sensing cameras, IR- or RF-responsive toys, etc., which are programmed to run a particular routine when a particular event is detected. The devices can then relay the information about the event or execution of its commands back to the central console. The central console can then send commands to the device based on the sensed event. This interaction of reacting to pre-programmed routines or commands received from a console in response to an on-board sensor may cause the devices to appear to be "aware" of their environment.
  • Conventional interactive devices are also limited as to their real-time responsiveness to other devices in their vicinity. Many devices require time-consuming installation procedures to allow interactivity with a specific device or console in its proximity. New devices must be intentionally added to a network, and the network may be limited to a short distance away from the device, the console, or the access point. In order to determine all of the nearby interactive devices, many conventional systems require a time-consuming addressing procedure, in which the command unit requests the location of each device and maps a grid from the responses received.
  • FIG. 1 is a diagrammatic view of one embodiment of an interactive device according to the invention.
  • FIG. 2 is a schematic view of a mesh network according to another embodiment of a system according to the invention.
  • FIG. 3 is a perspective view of a dice-playing game according to another embodiment of the invention.
  • FIG. 4 is a perspective view of an action figure system according to yet another embodiment of the invention.
  • FIG. 5 is a perspective view of a content interaction system according to yet another embodiment of the invention.
  • FIG. 6 is an exploded perspective view of a game board system according to one embodiment of the invention.
  • FIG. 7 is a perspective view of a block interaction system according to one embodiment of the invention.
  • FIG. 8 is a perspective view of a medication monitoring system according to another embodiment of the invention.
  • FIG. 9 is a perspective view of an interactive glove according to yet another embodiment of the invention.
  • FIG. 10 is a perspective view of a remote control car system according to another embodiment of the invention.
  • FIG. 11 is a process flow diagram of one embodiment of a method according to the invention.
  • FIG. 1 shows a block diagram of an interactive device 1 according to one embodiment of the invention.
  • the interactive device 1 includes a control unit 10, an inertial monitoring unit (IMU) 20, an audio unit 30, a tactile feedback module 40, a radio frequency identification (RFID) module 50, a motor module 60, and a display module 70, all connected via a shared bus 80.
  • the control unit 10 includes a central processing unit (CPU) 12, a memory 14, an energy unit 18, a transceiver 16, and a tilt sensor 19.
  • the memory 14 may contain drivers for hardware, instructions for carrying out routines, and instructions for responding in different ways based on information sensed from the environment or commands received.
  • the memory may also contain information about the user of the device, such as statistics from past games, characteristics of the user, user-specified functions, highest levels reached in various applications, achievements, personalized data, etc., which are correlated to unique identification indicia of the device, as well as data for the specific model (unit), such as whether other devices represent friends or foes, shared data, abilities, etc.
  • the energy unit 18 in this embodiment may be a power source, such as a battery, charger, solar cell, etc., and supplies power to the device 1.
  • the energy unit 18 is responsive to the tilt sensor 19 to decrease or turn off the power source when the device 1 is stationary for a certain amount of time, and to turn on the power source if the device 1 is moved.
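  • As a minimal illustrative sketch (not part of the disclosure), the following Python shows one way the energy unit 18 could respond to the tilt sensor 19; the idle timeout and the sensor/power interfaces are assumptions made for illustration.

```python
import time

IDLE_TIMEOUT_S = 120  # assumed idle period before reducing power; not specified in the embodiment

class EnergyUnit:
    """Sketch: cut or reduce power when the tilt sensor reports no movement for a while."""
    def __init__(self, tilt_sensor, power_source):
        self.tilt_sensor = tilt_sensor    # assumed interface: .moved() -> bool
        self.power_source = power_source  # assumed interface: .on(), .low_power()
        self.last_motion = time.monotonic()

    def poll(self):
        now = time.monotonic()
        if self.tilt_sensor.moved():
            self.last_motion = now
            self.power_source.on()        # turn the power source back on when the device is moved
        elif now - self.last_motion > IDLE_TIMEOUT_S:
            self.power_source.low_power() # decrease or turn off power when stationary for a while
```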
  • the transceiver 16 can send/receive signals to/from other interactive devices, consoles, or access points (see FIG. 2).
  • the transceiver 16 may also receive signals indicative of the device's geographic location through, for example, a GPS unit on-board the device or by receiving GPS information from a device, console, or access point proximate to it.
  • the transceiver 16 may receive signals indicative of the device's location through triangulating other nearby devices (see FIG. 2).
  • the transceiver 16 may also be responsive to the tilt sensor 19, to indicate when the position of the device 1 is to be updated.
  • the transceiver 16 includes a 2.4 GHz IEEE 802.15.4 compliant RF transceiver with internal antenna.
  • transceivers capable of sending and receiving commands in real-time may be used.
  • ZigBee, WiFi, Bluetooth or other wireless transceivers may be substituted for applications within those transceivers' real-time response capabilities and proximity ranges.
  • the IMU 20 includes an accelerometer 22, a gyroscope 24, and a magnetometer 25.
  • the accelerometer 22 senses the g-force experienced by the device 1
  • the gyroscope 24 senses its orientation.
  • the magnetometer 25 may detect and measure the strength and/or direction of a magnetic field.
  • the audio unit 30 includes an audio input 32, such as a microphone, RCA jack, or the like, and an audio output 34, such as a speaker.
  • the tactile feedback module 40 includes, for example, a vibration mechanism 42 to provide tactile vibration, force feedback, temperature, etc.
  • the RFID module 50 includes a passive RFID reader 52 capable of reading radio frequency identification tags or signals.
  • the motor module 60 includes a motor controller 62 connected to a motor and movement mechanism 64, such as, for example, wheels, a propeller, movable legs, movable arms, etc., which allow the device 1 to move from one location to another or to move one portion of the device relative to another.
  • Display module 70 includes visual feedback mechanism 72 such as, for example, RGB LED lights, a display, or other form of visual feedback as is known in the art.
  • Although the components of device 1 are shown in FIG. 1 as being located in distinct functional units or modules 10, 20, 30, 40, 50, 60, and 70, one skilled in the art will understand that these components may be located within a single unit or module, or distributed within the device on different units or modules or within separate housings, while remaining within the scope of the invention. One skilled in the art will also understand that fewer than all of the functional units or modules 20, 30, 40, 50, 60, and 70 may be included in an interactive device 1, while remaining within the scope of the invention. Additional functional units or modules may also be added to the interactive devices, such as, for example, galvanic skin response detectors, heat sensors, motion detectors, cameras, or interfaces, such as buttons, keyboards, etc.
  • the CPU 12 sends repeated signals through the shared bus to the IMU 20, requesting response if anything has changed. Any changes in acceleration or orientation are then reported to the CPU 12 via the shared bus. Precise information on the device's location can then be determined by the CPU 12, through analysis of the changes reported from the IMU 20 and the location information received through the transceiver 16. These changes may be recorded in the memory 14 to create an ongoing log of changing location, orientation, and acceleration.
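  • A minimal sketch of this polling-and-logging loop is shown below; the change threshold, the data layout, and the read_accel/read_gyro callables are assumptions used only for illustration.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ImuSample:
    t: float
    acceleration: tuple   # reading from the accelerometer 22
    orientation: tuple    # reading from the gyroscope 24

@dataclass
class MotionLog:
    """Ongoing log of changing orientation and acceleration, kept in memory 14."""
    samples: list = field(default_factory=list)

    def record_if_changed(self, sample: ImuSample, threshold: float = 0.01) -> bool:
        if not self.samples:
            self.samples.append(sample)
            return True
        last = self.samples[-1]
        changed = any(abs(a - b) > threshold
                      for a, b in zip(sample.acceleration + sample.orientation,
                                      last.acceleration + last.orientation))
        if changed:
            self.samples.append(sample)
        return changed

def poll_imu(read_accel, read_gyro, log: MotionLog) -> bool:
    """One polling cycle: request current readings and log them only if something changed."""
    return log.record_if_changed(ImuSample(time.monotonic(), read_accel(), read_gyro()))
```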
  • the devices 1 can network seamlessly as an ad-hoc network without time-consuming setup procedures or the need for a central console, telephone or terminal.
  • additional network-connected devices may also be added to the network, such as game console 106, telephone 104, access point 102, or cloud device 111.
  • Game console 106 may be connected to, for example, a display device 110 and a controller 112.
  • the access point 102 may be connected to terminal 100 through a wired connection and to terminal 108 through, for example, the Internet, which may allow tracking and updates to functionality, firmware, stored routines, etc. of the CPUs of the local interactive devices 1.
  • Additional interactive devices 1 may also connect to the access point 102 remotely through, for example, the Internet.
  • Cloud device 111 may bridge between an interactive device 1 and a Wi-Fi enabled device, and may translate the protocol of interactive device 1 to a Wi-Fi protocol, and vice versa, to enable two-way communication between regular Wi-Fi devices and interactive device 1.
  • a data or update request may be sent from one device 1 and have the request "bounce" from device to device within the mesh network coverage until it reaches a device connected to a wireless network, such as telephone 104.
  • the telephone 104 may then download the requested data and send it back to device 1.
  • Alternatively, the request may reach a device, for example, terminal 100, that contains the requested data, and the terminal 100 sends it through the mesh network back to interactive device 1.
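  • The following Python sketch illustrates the bouncing behavior as a simple flood with a hop limit; the device attributes (.id, .neighbors, .can_serve, .serve) and the hop limit are illustrative assumptions, not a definition of the actual protocol.

```python
def forward_request(device, request, visited=None, max_hops=8):
    """Pass a request from device to device until some node can serve it (simplified flood)."""
    visited = visited if visited is not None else set()
    if device.id in visited or max_hops <= 0:
        return None
    visited.add(device.id)
    if device.can_serve(request):          # e.g. a telephone 104 with Internet access,
        return device.serve(request)       # or a terminal 100 already holding the requested data
    for neighbor in device.neighbors:      # "bounce" to every device within radio coverage
        result = forward_request(neighbor, request, visited, max_hops - 1)
        if result is not None:
            return result                  # the reply travels back along the same chain
    return None
```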
  • the information about the similar nearby units 1 that is sent back to the interactive device 1 includes information about the similar unit 1, such as a unique identifier of the similar unit 1 or the owner of the similar unit 1, etc.
  • This information can then be stored in the memory 14 of the interactive device 1 as a "friends list," such as a database of devices with which the interactive device 1 has been near or has interacted in physical space. Interactions of the interactive device 1 may be limited to such "friends," to protect children from the dangers of child molesters, who may otherwise attempt to befriend the children through online play.
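  • A minimal sketch of such a "friends list" follows; the field names and the interaction check are assumptions used only to illustrate restricting interaction to devices met in physical space.

```python
class FriendsList:
    """Sketch of the 'friends list' kept in memory 14: devices encountered in physical space."""
    def __init__(self):
        self._friends = {}                      # unique device id -> info (owner, last seen, ...)

    def add_encounter(self, device_id: str, info: dict) -> None:
        self._friends[device_id] = info

    def is_friend(self, device_id: str) -> bool:
        return device_id in self._friends

def allow_interaction(friends: FriendsList, remote_id: str) -> bool:
    # Interactions may be limited to devices the interactive device has previously met.
    return friends.is_friend(remote_id)
```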
  • the above embodiments are discussed in terms of embedded devices that communicate seamlessly with each other without the need for "plug and play" setup.
  • the interactive device may be insertable into multiple, different housings or devices.
  • the interactive device may be connected via an interface port to an action figure to allow interactivity and functionality as described above, disconnected from the action figure, and then connected to an interface port in a refrigerator, thus allowing the refrigerator to interact with nearby interactive devices.
  • the interactive device may be embedded in the action figure, refrigerator, display 1 10, etc., such that all are connected wirelessly on a continuous basis.
  • FIG. 3 shows one application of an interactive device as described above.
  • Interactive devices 200, 202 are formed as dice, including separate displays 204 on each face, which can change each face's appearance with indicia, such as numbers, pictures, text, color, etc., relating to the game played.
  • the interactive devices 200, 202 each include a CPU 12, a memory 14, an energy unit 18, a transceiver 16, a tilt sensor 19, and a gyroscope 24, substantially as described in connection with FIG. 1.
  • the memory 14 for each interactive device 200, 202 includes rules for a dice game application.
  • the CPU 12 sends out a signal via transceiver 16 to determine whether any other interactive devices are in its proximity.
  • Interactive device 202 receives this signal through its transceiver 16 and sends a response back to the interactive device 200 that it is in the proximity of interactive device 200.
  • Interactive device 200 then sends a signal to interactive device 202 indicating that its memory includes a dice game application and requests interaction with device 202 in accordance with that application.
  • Interactive device 202 may also have the dice game application in its memory, and a signal can be sent back to interactive device 200 that the game is initiated. Alternatively, the interactive device 202 may send a response that the dice game application is not in its memory and can either decline initiation of the game or request an update to its memory be sent via either the other interactive device 200, a nearby access point, or other device.
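  • A message-level sketch of this proximity and application handshake is given below; the method names and the shape of the exchange are invented for illustration and do not limit the embodiment.

```python
def initiate_dice_game(die_a, die_b):
    """Illustrative handshake between two interactive dice.
    Each die is assumed to expose .has_app(name), .app_payload(name), and .install(payload)."""
    # 1. die_a broadcasts a proximity query; die_b answers that it is nearby.
    nearby = [die_b]                      # stands in for the replies received via the transceivers
    if not nearby:
        return False
    # 2. die_a announces that its memory holds the dice game application and requests interaction.
    if die_b.has_app("dice_game"):
        return True                       # die_b confirms and the game is initiated
    # 3. Otherwise die_b may decline, or request the application as an update from die_a,
    #    a nearby access point, or another connected device.
    payload = die_a.app_payload("dice_game")
    return bool(payload) and die_b.install(payload)
```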
  • the dice game may be played by, for example, each user taking turns rolling one or both dice.
  • the CPU 12 may request orientation information from the gyroscope 24 along the shared bus.
  • the gyroscope 24 can respond to the CPU 12 request and send information to the CPU indicating which face of the die 200 is currently facing upward. This information may then be recorded in the memory 14.
  • the second die 202 may then be rolled, and the face of the die 202 facing upward after the roll may then be recorded in the memory of the second die 202.
  • the respective values of the rolls can then be communicated between the devices 200, 202 via their respective transceivers 16, and an indication of which user had a higher value roll can then be recorded in the memory 14 of the dice 200, 202.
  • the die 200 or 202 that had the higher roll may, for example, vibrate or light up, to indicate which die 200 or 202 won the particular round. It might also synchronize the outcome to all of the other relevant devices, to make sure that all of the game pieces coordinate their functions and rule sets according to the game rules. The synchronization may also take place with remote devices, such as when playing with a remote friend online, as will be discussed below.
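  • One way to resolve a single round is sketched in Python below; the orientation-to-face mapping and the device methods are illustrative assumptions.

```python
def face_up_value(gyro_orientation):
    """Map a gyroscope reading to the face currently pointing upward (mapping is illustrative)."""
    axis, sign = gyro_orientation         # e.g. ("z", 1) meaning the positive z-axis points up
    faces = {("z", 1): 1, ("z", -1): 6, ("y", 1): 2, ("y", -1): 5, ("x", 1): 3, ("x", -1): 4}
    return faces[(axis, sign)]

def resolve_round(die_a, die_b):
    """Read both rolls, exchange the values, record the result, and signal the winner."""
    roll_a = face_up_value(die_a.read_gyro())   # recorded in die_a's memory 14
    roll_b = face_up_value(die_b.read_gyro())   # recorded in die_b's memory 14
    for die in (die_a, die_b):                  # values are exchanged via the transceivers 16
        die.memory.setdefault("rounds", []).append((roll_a, roll_b))
    winner = die_a if roll_a > roll_b else die_b if roll_b > roll_a else None
    if winner is not None:
        winner.vibrate()                        # or light up, per the stored routine
    # The outcome may also be synchronized to other relevant devices (console, remote dice, etc.).
    return winner
```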
  • the interactive devices 200, 202 may also be connected to other devices, such as console 106, telephone 104, access point 102, terminals 100, 108, display 110, cloud device 111 or remote interactive device 1. It is also within the scope of the invention for the interactive device to include a wireless dongle for connection to a terminal, the Internet, and/or the cloud device without going through a remote access point. Ongoing scores, user statistics, etc., may also be sent to these other devices and saved in their respective memories. Alternatively, interactive devices 200 and 202 may be located in remote locations and networked via the Internet through an access point 102.
  • console 106 is optionally wirelessly connected to interactive device 200.
  • Console 106, also connected to display 110 and controller 112, may also be running a routine related to the dice game and displaying content related to the dice game on the display 110.
  • the data related to the orientation of the dice 200, 202, the user scores, etc. may dictate content displayed on the display 110, such as, for example, advancing characters in an ongoing game based on the value rolled on the dice, increasing or decreasing the powers of characters, changing locations, starting or stopping video content to be displayed, or any other content, etc.
  • the dice game between the interactive devices 200, 202 may be initiated by routines running on the console 106, based on, for example, initiation of a battle between characters in the game running on the console, etc.
  • the console 106 or other connected device may alternatively update the interactive devices 200, 202 with the dice-playing rules or additional routines based on the game/content/routine running on the connected device, the functionality of the interactive devices 200, 202, additional routines purchased by a user, etc.
  • the interactive device may include instructions to light up, move in a particular direction, or make a particular sound based on particular content running on the console.
  • the gyroscope or accelerometer of the interactive device may send commands based on the manner of movement to the console or other connected device to change the content or routines running on the console or device.
  • Although FIG. 3 is described in terms of a dice-playing game, any number of routines or sequences may be stored in the memory of the devices 200, 202, providing for various behaviors and series of smart interactions between the respective devices 200, 202 and/or other networked devices, such as consoles, displays, telephones, access points, terminals, or any embedded device or object, such as an action figure 300 as described in FIG. 4.
  • These routines may go far beyond a single behavior or series of behaviors in response to a command. As described, they may also include rules and routines for smart, ongoing interaction with the other devices, changed functionality, such as new skills or games, and updated applications.
  • the dice may alternatively act as blocks of light that may be placed or constructed in any way, sensitive to user's proximity and gestures. Although the dice are depicted in FIG. 3 with separate displays on each face, it is also within the scope of the invention for the displays to be replaced by non-display surfaces with indicia on each face.
  • FIG. 4 shows another embodiment of an interactive device according to the invention.
  • action figure 300 includes a CPU 12, energy unit 18, transceiver 16, memory 14, tilt sensor 19, microphone 332, visual feedback in the form of LED lights 372, RFID reader 53, motor controller 62, and movement mechanism 64.
  • vehicle 400 includes a CPU 12, energy unit 18, transceiver 16, memory 14, tilt sensor 19, speaker 434, motor controller 62, movement mechanism 64, an RFID tag 402, and a handlebar button 404.
  • Vehicle 400 receives the signal through its transceiver 16, and its CPU 12 sends back information regarding its location, along with information about its capabilities. This information is received by the CPU 12 of the action figure 300, which checks its memory 14 for stored routines that can be used with devices having the characteristics of the vehicle 400.
  • the action figure 300 may send out a request to the vehicle 400 for an update to its stored routines to enable interaction. Updates may be sent back by the vehicle 400 automatically or through qualification based on a number of factors, such as, for example, comparison of the user statistics stored in the memory 14 of the action figure 300 with a certain minimum experience points required for a particular functionality, characteristics of the action figure 300 stored in the action figure's memory 14, past interactions between the action figure 300 and the vehicle 400, whether information stored in the action figure's memory 14 indicates that the user of the action figure is on a predetermined list of approved users, etc.
  • the vehicle 400 may also be connected to another device or to the Internet, and information relayed from the memory 14 of the action figure 300 can be, for example, checked against a list of paid subscribers found through the Internet, a list of prior interacting devices with the other connected device, etc.
  • the action figure 300 can send a request to the vehicle 400 to initiate the routine, allowing the devices to interact according to a set of rules.
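  • The qualification logic described above might be sketched as follows; the field names and the experience threshold are assumptions, and an actual implementation could weigh the factors differently.

```python
def qualify_for_update(figure_memory: dict, vehicle_memory: dict,
                       min_experience: int = 100) -> bool:
    """Sketch of deciding whether the vehicle 400 sends a routine update to the action figure 300."""
    experienced = figure_memory.get("experience_points", 0) >= min_experience
    approved = figure_memory.get("user_id") in vehicle_memory.get("approved_users", [])
    met_before = figure_memory.get("device_id") in vehicle_memory.get("past_interactions", [])
    # Updates may alternatively be sent automatically, or after checking a subscriber
    # list reached through another connected device or the Internet.
    return experienced or approved or met_before
```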
  • the vehicle 400 or the action figure 300 may also vary their functions or behavior based on the proximity of the two devices to each other, exhibiting different functions or behavior when they are closer to each other than farther away.
  • One with ordinary skill in the art will understand that a multitude of applications, routines, rules, or skill sets may apply to the interaction of the two devices 300, 400, but one illustrative example of such a routine will now be described.
  • the speaker 434 may output a sound of a revving motorcycle engine.
  • the microphone 332 may detect this sound and the routine may indicate that the LED lights 372 blink in response.
  • Alternatively, the vehicle 400 may communicate to the action figure 300 that the sound is being output via its speaker 434, or the same shared routine may initiate both output of the sound via speaker 434 and the blinking of the LED lights 372, without the sound being directly detected by the microphone 332.
  • the RFID reader 53 may communicate that contact to the CPU 12 of the action figure 300, and the CPU 12 may look up one or more valid responses to the contact in the routine stored in the memory 14, and instruct the action figure motor controller 62 to move the arm 64 of the action figure 300 until it pushes the handlebar button 404.
  • the CPU 12 of the vehicle 400 may detect the engagement of the handlebar button 404, and operate the motor controller 62 of the vehicle 400 to turn the wheels 64 of the vehicle 400.
  • the interactive devices may do more than simply communicate and respond to commands - they may interact with other elements in the environment, such as other interactive devices, objects in the environment, sounds, sights, proximity, etc.
  • various objects may be scattered in the vicinity of the vehicle, representing road hazards.
  • the vehicle 400 may detect a bumpy floor surface through its gyroscope as it is moving and communicate this to the action figure 300, which may then play a sound file: "This is a bumpy ride."
  • the sound file may alternatively be played in response to communication from the vehicle 400, other interactive device 1 or from the shared routine, rather than through direct detection.
  • the RFID reader 53 of the action figure 300 may detect an RFID tag or interactive device embedded in its environment, which represents a "power up.” The action figure 300 may then communicate this to the vehicle 400, prompting it to increase its speed.
  • the action figure 300 may detect a weapon held in its hand through, for example, an RFID reader, pressed button, etc., which may provide additional skills to the action figure, hit points attributed to movements of the action figure 300, etc.
  • an RFID tag or interactive device may be embedded in playing cards, in a patch on clothing, etc.
  • a user may have, for example, their favorite character's T-shirt with an RFID tag embedded in a patch, which can give the RFID-reading action figure additional functions, etc.
  • Various routines and skill sets between one or more interactive devices are possible, including proximity-based alliances or confrontations that will update user statistics for an ongoing game in the device's memory based on the interactions.
  • Skill sets, routines, and interactive capabilities may also be updated in real time based on the interactions, etc. For example, if the action figure 300 is able to stay on the vehicle 400 for a minimum amount of time while it is in motion, the skill set of the action figure 300 may be updated via either the vehicle 400 or a wireless Internet connection to include performance of a handstand on the vehicle 400.
  • a modular audio unit 30 (FIG. 1), including an audio input 32, such as a microphone, RCA jack, or the like, and an audio output 34, such as a speaker, may also be added to the interactive devices.
  • Connecting such audio units 30 to the interactive devices may enable, for example, use of the interactive devices as communications units, such as walkie-talkies, VOIP (Voice-Over-IP) or the like through either direct wireless connection between the interactive devices or, if the interactive devices are in remote locations, via the Internet through an access point 102 (FIG. 2).
  • Voice recognition modules may also be added to allow control of remote terminals or of other embedded devices.
  • Although FIG. 4 features the interactive device as an action figure and vehicle, the interactive device can be formed in various sizes, shapes, and functionalities, depending on the desired application and functionality of the device. Examples include balls, refrigerators, game pieces, wearable objects, interactive appliances, TV screens, hard disk drives, or any other object desired to interact with digitally or to give added value when digitized, etc.
  • FIG. 5 shows yet another embodiment of the invention.
  • An interactive device system 500 includes an interactive device in the form of a bracelet 501, a console 506, and a display 510.
  • the bracelet 501 includes a CPU 12, energy unit 18, transceiver 16, memory 14, tilt sensor 19, gyroscope 24, and visual feedback in the form of different color LED lights 572.
  • the CPU 12 of the bracelet 501 sends out a signal asking about devices in its proximity after indication of movement by, for example, the tilt sensor 19.
  • the console 506 sends back a signal indicating that it is in the proximity of the bracelet 501.
  • the console 506 and the bracelet 501 can then send out various requests related to shared routines, information stored in the memory 14 of the bracelet 501, capabilities of the bracelet 501, and information related to the content running on the console 506.
  • the console 506 can send information to the bracelet 501 indicating the game that is currently running on the console 506, and the bracelet 501 can send the statistics of the bracelet's user in that game such as, for example, stored characters, highest level attained, special character skills, current points, etc.
  • the console 506 may receive information only related to a unique identification number of the interactive device and reference a table in its memory or a remote location through a connected access point (FIG. 2) that includes the more detailed information about the user.
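  • A small sketch of that lookup follows; the identifier, the table contents, and the remote_lookup callable are invented for illustration only.

```python
def load_user_profile(bracelet_id: str, local_table: dict, remote_lookup=None) -> dict:
    """The console may receive only a unique identifier from the bracelet 501 and resolve the
    detailed user data from a local table or from a remote location via a connected access point."""
    profile = local_table.get(bracelet_id)
    if profile is None and remote_lookup is not None:
        profile = remote_lookup(bracelet_id)     # stands in for the remote query
        local_table[bracelet_id] = profile       # cache for the next visit
    return profile or {}

# Usage sketch: porting characters and achievements onto a friend's console.
table = {"bracelet-42": {"characters": ["knight"], "highest_level": 7, "points": 1250}}
print(load_user_profile("bracelet-42", table))
```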
  • a user may functionally port all of her characters, achievements, statistics, etc. for use on a friend's console 506 with little or no set up time.
  • Although described in terms of consoles, it is also within the scope of the invention for such an interactive device to be used in other applications such as, for example, retail environments, in which past sales of a user at a particular location are stored in the device's memory or in the memory of a console or other interactive device at the location.
  • the console or interactive device may suggest new products to the user that are similar to past purchases of the user stored in the memory.
  • the bracelet 501 may also function as a remote controller of the console 506, mapping movements and location of the user detected by the gyroscope 24 and accelerometer 22, or as described above in connection with FIG. 1, into commands for interacting with the content running on the console 506 and displayed on the display 510.
  • the bracelet 501 may respond to the content running on the console 506 or to commands from the console 506.
  • one color of LED light 572 may light up when the character in the game is moving close to an enemy, and a different color of LED light may light up when the character in the game is moving toward a hidden treasure.
  • One skilled in the art will also understand that such interaction may occur with audio, vibrations, etc., in other situations as well, such as when powering up, getting hit, etc.
  • An interactive appliance 520 in the form of, for example, a lamp may also be included in system 500.
  • the lamp 520 may also include a CPU 12, energy unit 18, transceiver 16, memory 14, and a light source 521, such as an RGB light bulb.
  • the interactive appliance 520 may also interact with the content running on the console 506 and displayed on the display 510 to, for example, dim, blink, or change colors at pivotal moments of the game or any other digital content running on the console 506. This interaction may be based on, for example, commands sent from the console 506, sounds or visual signals detected from a speaker (not shown) or display 510, a synchronized routine running on both the lamp 520 and the console 506, the time period measured from the beginning of the content interaction, synced with a running script, etc.
  • As another example, an interactive thermostat may be set far lower during a cold scene of a movie running on the console.
  • the presence of both the bracelet 501 and the lamp 520 may be detected by each of the interactive devices 501, 520, and the console 506, and additional content may become available in the game or movie, or additional functionalities of the bracelet 501 and/or lamp 520 may become available.
  • Information about experiences the user had while interacting with the game content such as successes, additional levels, points, etc. may be stored in the memory of the bracelet 501. Indicia of that information may remain on the bracelet, such as a certain color LED light may remain lit on the bracelet 501, after the interaction to indicate to others that the user has, for example, achieved a certain level of experience, even after the bracelet is moved away from the proximity of the console 506.
  • Although the interactive device is described in terms of a bracelet 501, it is also within the scope of the invention for the device to be in the form of, for example, other wearable devices, such as a watch, ring, necklace, sticker, glove, head-mounted display, augmented reality glasses, etc., or non-wearable devices, such as those described above.
  • Although the content is described as running on a console 506, it is also within the scope of the invention for the content to run on a PC, smart TV, DVD player, or any other media device.
  • FIG. 6 depicts another embodiment of a system according to the invention.
  • Referring to FIG. 6, interactive system 600 includes interactive devices 601 in the form of action figures, a grid 603 of RFID tags 604, and a game board overlay 606.
  • each of the interactive devices 601 includes a CPU 12, energy unit 18, transceiver 16, memory 14, and RFID reader 53.
  • the RFID tags 604 are arranged in an array to form grid 603.
  • the game board overlay 606 may, for example, depict a map, picture, game board with a path, etc., downloaded from a website, according to the game intended to be played by the user. It is also within the scope of the invention for the game board overlay 606 and the RFID grid 603 to be a single element.
  • Game board overlay 606 is laid on top of the grid 603, and the action figures 601 are placed on top of the game board overlay 606 in various locations.
  • the RFID reader 53 notifies the CPU 12 that the action figure 601 is adjacent to a particular RFID tag within the grid 603, effectively notifying the action figure 601 about its location on the game board overlay 606.
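  • The location lookup might amount to a simple tag-to-coordinate map, as in the sketch below; the tag identifiers and grid size are illustrative only.

```python
def build_tag_map(rows: int, cols: int) -> dict:
    """Assign a grid coordinate to each RFID tag id (the ids here are invented for illustration)."""
    return {f"tag-{r}-{c}": (r, c) for r in range(rows) for c in range(cols)}

def locate_figure(detected_tag_id: str, tag_map: dict):
    """The RFID reader 53 reports the adjacent tag; the CPU 12 resolves the board position."""
    return tag_map.get(detected_tag_id)

tag_map = build_tag_map(rows=8, cols=8)
print(locate_figure("tag-3-5", tag_map))   # -> (3, 5): row 3, column 5 on the game board overlay
```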
  • the action figures 601 can then communicate the respective locations to each other or to another device (see FIG. 2), such as a telephone 104, console 106 connected to a display 110, access point 102, terminals 100, 108, or remote device 1. They may also or alternatively communicate with interactive dice 200, 202, as described in FIG. 3, to incorporate dice rolling features in a game.
  • each action figure 601 can also be updated according to the rules of the game being played with, for example, the current location of the action figure 601, new experience levels, etc., and the CPU 12 of the action figure 601 may indicate certain behavior or rules for the action figure 601, such as, for example, eyes blinking, arms moving, a sound playing, helping other characters in the game, giving power, health, weapons etc.
  • the location of the action figures 601 may also be depicted on display 110, terminals 100, 108, etc., for remote observation and/or remote game play.
  • the system may also include a second grid of RFID tags, game board overlay, and interactive devices in a remote location, which can interact with the local system via access point 102. In this way, locations of both the local and remote interactive devices can be communicated to all of the devices, and remote users may effectively play the game together. It is also within the scope of the invention for a remote user to move virtual characters on a display, and that information can be communicated to the action figures 601 to enable remote game play.
  • Different game board overlays 606, corresponding to different games or variations within a game, may be laid on the grid 603 for a variety of applications.
  • the particular game board overlay 606 placed on the grid 603 may be, for example, automatically detected by the action figures 601 or other device, or communicated directly by the user to one or more of the connected devices including dice. It is also within the scope of the invention for the game board to itself be an interactive device, sensing locations of all of the action figures/game pieces in contact with it, etc.
  • FIG. 7 depicts another embodiment of a system 700 according to the invention.
  • interactive blocks in various shapes, such as a cube 701, interlocking block 703, pyramid 705, animal 707, base 709, legs 711, etc., each include a CPU 12, energy unit 18, transceiver 16, and memory 14.
  • the blocks may be placed in a desired arrangement to form a physical creation.
  • a virtual 3D model of their physical creation is uploaded to their personal virtual space via, for example, console 706. While visible on display 710, the user may then use a controller 712 to move around virtual objects representing the interactive blocks, scale them, and so on. The interactive blocks may then be taken apart, rearranged, etc. Each time the finished arrangement may be uploaded and saved as a virtual playing ground.
  • a user may build whole cities, castles etc. and add physical action figures, which may play and interact with the virtual model. For instance, after building a city, the player can play with his physical "Godzilla" action figure 713 and demolish virtual buildings or run rampage through the streets.
  • Certain interactive blocks may have different characteristics, such as water (and other elements), mountains (and other terrain), black holes, temperature, traps and so on.
  • the blocks may also be embedded with RGB LEDs and react to the played scenarios according to the characteristics ascribed to the block and any shared rules or routines.
  • FIG. 8 depicts another embodiment of a system 800 according to the invention.
  • a bracelet 801 includes a CPU 12, energy unit 18, transceiver 16, memory 14, RFID reader 52, and RGB LED lights 72.
  • a plurality of medication boxes 803, 805, 807 each has affixed to it a different color RFID tag 809, 811, 813, which may be in the form of, for example, stickers.
  • the CPU 12 of the bracelet 801 may indicate that a blue LED light 72 should be lit on the bracelet 801, indicating that medication from medication box 803 with a blue RFID tag 809 should be taken.
  • the RFID reader 52 of the bracelet 801 can be brought into proximity of the RFID tag 809, indicating that the relevant medication has been taken.
  • the CPU 12 of the bracelet 801 can then indicate that a red LED light 72 should be lit on the bracelet 801, indicating that medication from medication box 805 with a red RFID tag 811 should be taken. A similar procedure as outlined above can then be followed for that medication, as well as the medication in box 807 with a green RFID tag 813. Once the CPU 12 of the bracelet 801 is updated that all medications have been taken, the LED light 72 may be turned off, indicating that no medications are due to be taken at that time.
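  • The sequencing of reminders could look like the sketch below; the tag identifiers, the set_led interface, and the schedule ordering are assumptions for illustration.

```python
SCHEDULE = [("blue", "tag-809"), ("red", "tag-811"), ("green", "tag-813")]  # assumed color/tag pairing

class MedicationReminder:
    """Sketch of the bracelet 801 routine: light the LED for the next due medication,
    advance when the matching medication-box tag is read, and turn off when all are taken."""
    def __init__(self, set_led):
        self.set_led = set_led            # assumed LED interface: callable(color or None)
        self.index = 0
        self.set_led(SCHEDULE[0][0])      # start with the first medication's color

    def on_tag_read(self, tag_id: str) -> None:
        _, expected_tag = SCHEDULE[self.index]
        if tag_id != expected_tag:
            return                        # wrong box; keep the current reminder lit
        self.index += 1
        if self.index < len(SCHEDULE):
            self.set_led(SCHEDULE[self.index][0])
        else:
            self.set_led(None)            # all medications taken; no reminder due
```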
  • Other means of alerting the wearer of the bracelet 801, such as, for example, vibration alerts, a display with readable information, audio alerts, etc., may be substituted for the LED lights 72.
  • the transceiver 16 may send information to another device, such as, for example, a remote terminal 108 or telephone 104 when a dose of medication is missed, when the CPU 12 indicates that medication within the medication box is running low, a button is pushed on the bracelet indicating an emergency, or any other alerts desired by a user.
  • This information may be synchronized with any number of other digital devices, such as, for example, computers, personal web pages, email notifications, SMS or updates to mobile device applications, patient call center terminals, etc.
  • a GPS or other position tracker may also be added to the device for additional safety of the user.
  • Although the device in this embodiment is described in terms of a bracelet 801, it is also within the scope of the invention for the interactive device to be constructed in any desired shape or size, including, for example, a pyramid, box, plastic card, pill case that automatically opens when medication is needed, etc.
  • FIG. 9 shows another embodiment of an interactive device according to the invention.
  • a glove 900 includes a strap 902 coupled to a flexible fabric 904 shaped to receive a user's fingers 906.
  • three fingers 906 are received in the glove 900, but it is within the scope of the invention for more or fewer fingers to be received in the glove 900.
  • Each finger of the glove 900 includes a CPU 12, transceiver 16, memory 14, accelerometer 22 and gyroscope 24, allowing the movements, speed, and location of each received finger 906 to be tracked.
  • Ends of each gloved finger 906 also include silicone-type pads 908, which allow work with touch-sensitive devices, such as, for example, an iPod, while the glove is worn.
  • each CPU 12 may be updated with movements and locations of each of the other fingers 906 via their respective transceivers 16. Likewise, those movements and locations may be updated to other devices, such as a console 106, additional interactive device 1, or the like.
  • the glove 900 can thus be used as a seamless controller for any number of devices through use of coordinated gestures, speed of movement, etc.
  • the CPU 12 of each finger may detect that their proximity to each other has increased, thus indicating that the user is making a fist gesture.
  • a fist gesture may be determined through a look-up table, for example, to indicate a command to close a particular application that is running on the console 106.
  • a fist gesture may communicate that a subsequent movement from the glove 900 should be mapped into a game running on the console 106 as a punch, rather than a slap.
  • a command may be sent to the console 106 that a game character should be shown to punch another game character with a certain force and with a certain hit point level.
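  • A minimal sketch of such gesture detection and lookup is given below; the distance threshold, coordinate units, and command table are assumptions made only to illustrate the idea.

```python
import math

GESTURE_COMMANDS = {"fist": "close_application", "open_hand": "select_target"}  # illustrative table

def is_fist(fingertips, threshold=0.03):
    """Fingertip positions (metres) derived from each finger's accelerometer/gyroscope track.
    If every pair of fingertips has converged to within `threshold`, treat the pose as a fist."""
    return all(math.dist(a, b) < threshold
               for i, a in enumerate(fingertips) for b in fingertips[i + 1:])

def gesture_to_command(fingertips) -> str:
    gesture = "fist" if is_fist(fingertips) else "open_hand"
    return GESTURE_COMMANDS[gesture]        # e.g. sent to console 106 via the transceiver 16

print(gesture_to_command([(0.0, 0.0, 0.0), (0.01, 0.0, 0.0), (0.02, 0.01, 0.0)]))  # -> close_application
```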
  • Although glove 900 has been described in terms of a fist gesture, it is understood that an infinite variety of gestures may be created, learned, or looked up, based on the individual or coordinated movement of the various fingers 906.
  • a user may point to a device displaying certain information to "grab" the information, and then make a throwing gesture toward another device to display that information on the other device.
  • a voice recognition module 919, including an audio input 32, may be included, allowing the user to "call" his home or office computer (or hard disk) and search for a specific file. Once the file is found, the user may grab it using gesture recognition or ask the remote terminal to send it to his glove (or another required location) using voice recognition. When the file is received and stored in the glove's memory, the user may point to a terminal and the file will be sent and opened on the designated terminal.
  • the glove may also include a visual feedback module, as discussed above.
  • Such an interface may overcome many problems of prior art motion-mapping applications, which require cameras tracking lights on gloves to determine location and motion.
  • the user of this embodiment may, for example, move in many different directions, hide his hands, etc., while still being tracked. Such movement would not be tracked by such prior art systems, and would therefore greatly limit the movement and behavior of the user. Interaction with consoles that do not include controllers, such as, for example, Microsoft Kinect systems, may therefore be greatly enhanced.
  • Although each finger 906 of the glove 900 is shown including a separate CPU 12, etc., it is also within the scope of the invention for the various components to be shared between fingers, as the location, proximity, and movement of the fingers 906 are mapped, or for the above-described functions to be performed by, for example, a ring-sized device.
  • additional functional elements such as RFID readers, LED lights, vibration, motors, etc., as described above may be added to provide additional interactivity.
  • FIG. 10 shows another embodiment of an interactive device system according to the invention.
  • a remote control car system 1000 includes interactive devices in the form of a remote controller 1010, a remote control car 1020, an interactive element 1030, checkpoints 1050 on a racetrack 1060, a drawbridge 1070, a cave 1080, and a mechanized ramp 1090.
  • the remote controller 1010 includes a CPU 12, energy unit 18, transceiver 16, memory 14, speaker 34, lights 72, interface button 1014, and a tactile feedback module 40, which may be coupled to a manual control 1012, such as a joy stick, steering wheel, gas pedal, or the like, for controlling the remote control car 1020 remotely.
  • the remote control car 1020 includes a CPU 12, energy unit 18, transceiver 16, memory 14, lights 72, speaker 34, and wheels 64.
  • the car 1020 is sized to move on the racetrack 1060, depicted in FIG. 10 as a circular track.
  • Checkpoints 1050 are positioned near each other to form the racetrack 1060 and each include a CPU 12, energy unit 18, transceiver 16, memory 14, and lights 72.
  • Interactive element 1030 is re-positionable within the system and includes a CPU 12, energy unit 18, transceiver 16, memory 14, activation button 1034, and lights 72.
  • the drawbridge 1070 and mechanized ramp 1090 each include a CPU 12, energy unit 18, transceiver 16, memory 14, motor controller 62, and movement mechanism 64 in the form of a driving surface that is movable, such as between a horizontal alignment and vertical alignment, by the motor controller 62.
  • the cave 1080 may include, for example, an entrance with multiple exits. When a car enters, a small motor moves one of the possible exit lanes to connect with the entrance lane, so the player does not know where his car will come out. The car may exit, for example, from one of the sides of the cave, from the top of the cave, etc., simulating a changing of train tracks.
  • the remote controller 1010 may communicate game information and track updates to the player through audio and sounds via speaker 34, vibration via tactile feedback module 40 and color change via lights 72.
  • the remote controller 1010 can thus warn the player of road hazards or traps, thus functioning as a virtual "navigator”.
  • the buttons 1014 allow initiation or activation of power ups or traps for other players, as will be discussed in more detail below.
  • the manual control 1012 reacts to the interactive element 1030 with loose (sluggish) or tight (hectic) controls depending on the state or type of the element 1030.
  • the remote control car 1020 can change its colors via its lights 72 when under the effect of certain traps or road hazards. Brightness levels may change to show the current power of the remote control car 1020, the driver's skill, etc. When winning a race, for example, the lights 72 on the car may light up brightly.
  • the car may emit sound effects through its speaker 34 that enrich the driving experience and enhance the feeling of driving through the road hazards. Sounds can be, for example, screeching wheels, a flat tire, sliding on ice, going through rubble, etc.
  • the car 1020 may change its functionality during a race according to the element 1030 it passes and activates. When under the influence of an element, the car may react in changed steering, navigation, speed, handling etc.
  • the element 1030 may be shaped to be flat, so that the car 1020 may roll directly over the element 1030, but it is also within the scope of the invention for the element 1030 to have a different shape, and mere proximity to the element 1030 will cause the car 1020 to change its behavior and responsiveness.
  • One or more elements 1030 may be included in the system, which are similar in design but can differ in functionality. Before starting a race, the players may select the functionality of each element 1030 by pressing the button 1034 at the bottom of the element 1030 or the element 1030 may alternate its functionality at random. This random functionality may add excitement for the players, as they know only at the last moment what the element's functionality is with very little time to react or plan ahead.
  • Each function of the element 1030 may have a unique color and lighting sequence (from steady colors and brightness to patterns or flickering colors and lights) shown by the lights 72. When a car 1020 passes near or over the element 1030, the element affects the car's steering and handling or the manual control 1012.
  • the wheels 64 of the car 1020 can lock for 3 seconds, making the car 1020 skid off the racetrack 1060 unless the player can overcome the obstacle.
  • Passing an element 1030 that has a "power boost” functionality may, for example, cause the car 1020 to increase its maximum speed for 5 seconds.
  • Different functionalities of the element 1030 have different effects on the car 1020 and the controller 1010.
  • the element 1030 can be active (displaying its colors and functionality) during the entire race, or it can light up gradually after a nearby checkpoint 1050 is crossed, adding to the surprise element.
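  • The element's button cycling and timed effects could be sketched as below; the function names, the color pairings beyond those mentioned in the text, and the car's set_effect interface are assumptions.

```python
import random
import time

FUNCTIONS = {                                  # durations follow the examples given above
    "wheel_lock":  {"color": "light blue", "duration_s": 3},   # ice-like skid for 3 seconds
    "power_boost": {"color": "red",        "duration_s": 5},   # increased maximum speed for 5 seconds
}

class Element:
    """Sketch of element 1030: button 1034 toggles between 'random' and the fixed functions."""
    def __init__(self):
        self.modes = ["random"] + list(FUNCTIONS)
        self.mode_index = 0

    def press_button(self) -> str:
        self.mode_index = (self.mode_index + 1) % len(self.modes)
        return self.modes[self.mode_index]

    def current_function(self) -> str:
        mode = self.modes[self.mode_index]
        return random.choice(list(FUNCTIONS)) if mode == "random" else mode

def apply_effect(car, element: Element):
    """When car 1020 passes near or over the element, apply the corresponding timed effect."""
    name = element.current_function()
    spec = FUNCTIONS[name]
    car.set_effect(name, until=time.monotonic() + spec["duration_s"])  # assumed car interface
    return name, spec["color"]
```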
  • An existing element 1030 can be updated with new functionalities over time. Adding more functions creates more versatility and a much richer experience. Players who own only 4 elements, for instance, will want to purchase more so they can combine and play with any functionality they wish.
  • One or more checkpoints 1050 may be included in the system.
  • the checkpoints 1050 are arranged in pairs facing each other.
  • a player may arrange the checkpoints 1050 and the racetrack 1060 as they wish.
  • the racetrack 1060 can be small if played in a small room, for example, or a massive course when playing outdoors or in a large open space.
  • the checkpoints 1050 may give much-needed structure, since all players recognize a clear track and understand where they need to go instead of driving around running into each other. To avoid cheating, players may need to pass through a set number of checkpoints in order to win; this way, no one can cut corners or simply cut across the track field.
  • the checkpoints 1050 may change color via their lights 72 to the color of the passing car 1020.
  • the controller 1010, car 1020, element 1030, etc. may add or change its features through remote update from, for example, a remote manufacturer, to keep the game and experience fresh and exciting for a long time.
  • the element 1030 can change functionality to mimic the effect of starting line lights, snow, an oil patch, traffic lights, a train crossing, a gas drain, rain, etc.
  • Each function may have its unique color and lighting scheme with matching sound effects via the controller 1010, car 1020, element 1030, checkpoint 1050, etc.
  • Weapons (forward and backward attacks), defenses (shields, defensive maneuvers etc.), etc. can also be functionalities added to the car 1020, for example. Additional game types and mini games can be introduced with a purchased element 1030, such as drag racing, delivery and drop off challenges, and so forth.
  • a first user takes out two checkpoints 1050 and aligns them to face each other.
  • the checkpoints 1050 light up with a dim white color, indicating activation.
  • the first user takes out two more checkpoints 1050 and places them at a distance from the first pair of checkpoints 1050, creating a virtual track/path between both gates. He angles them slightly to create a turn.
  • the checkpoints 1050 light up and he moves on to set the finish line. He decides to place the last checkpoint pair 1050 at an angle facing the first pair in case he wants to use the same track later for three or five laps.
  • the first user then takes out an element 1030 in the form of a flat game piece. He flips it over and presses the activation button 1034. Pressing the button 1034 toggles through the functions, starting from "random," which means he would not know what effect the element will have on his car until he drives near it. He presses the button 1034 again until the lights 72 of the element 1030 turn "red," indicating the "Power Boost" function. He places the element 1030 on the racetrack 1060 between the first and second checkpoint pairs 1050.
  • the first user then takes out the car 1020 and remote controller 1010 and places the car 1020 at a distance from the first checkpoint pair 1050 to indicate the starting line for this race. He then presses the button 1014 on his controller 1010. This "revs his engine," as he hears the engine roar from the car's 1020 speakers 34. A voice coming from the controller's 1010 speaker 34 is heard counting down "3-2-1 GO!! GO!! GO!!!" The car 1020 shoots forward and races towards the first checkpoint 1050. The car 1020 passes between the first checkpoints 1050, which turn red, matching the car's color, and the crowd roars with excitement through the controller 1010 speakers 34.
  • the car 1020 goes back to its original speed shortly after.
  • the final checkpoints 1050 change their lights 72 to red, and the controller speaker 34 sounds "WINNER!!", plays sounds of a crowd cheering, and the car 1020 shines its lights 72 in bright red colors.
  • the first user toggles the element 1030 to the "random" function and sets the number of laps in the controller 1010 to three.
  • the element 1030 lights up and is activated only when the first user passes the first checkpoint 1050, creating a surprise by not knowing in advance its functionality.
  • the element is activated and lit with a light blue color.
  • the car's 1020 wheels 64 are locked and he hears a sound of wheels sliding over ice from the car's 1020 speakers 34.
  • the car 1020 slides for three seconds until he regains control.
  • the announcer shouts "two more laps to go" via the controller 1010 speakers 34, and the car 1020 continues to the first checkpoints 1050. All of the checkpoints 1050 and the element(s) 1030 have reset themselves for the next lap. The element 1030 has a different function each lap in random mode. Going through the third pair of checkpoints 1050 on the third lap will end the race, as they become the finish line.
  • a first and second user decide to race in a system with two controllers 1010, two cars 1020, fourteen checkpoints 1050 (forming seven gates), a drawbridge 1070, and five elements 1030 included.
  • the first user updates the original element 1030 over the air to include a new functionality, shown by a grey color showing in the lights 72.
  • the red and yellow cars 1020 begin the race.
  • the red car drives over a puddle of deep mud (triggered by a brown colored element), and the responsiveness of the manual control 1012 becomes loose and the car 1020 is sluggish, slowing down and responding with a delay.
  • the second user, seeing what happens to the first user's car, tries to steer clear, but his car 1020 is also affected for a brief moment. He gets his bearings and the yellow car 1020 crosses the first checkpoint pair 1050, coloring it yellow.
  • the second user takes the lead and races on towards the next element 1030.
  • His car 1020 runs into electronic problems and has a short circuit (triggered by a blue colored element 1030), leaving his controls 1012 highly sensitive and extremely responsive to each delicate touch of the controls 1012.
  • the first user passes the first checkpoint pair 1050, coloring it red, and presses on. He manages to steer the red car successfully despite the short circuit and takes the lead.
  • the crowd cheers via the controller 1010 speakers 34.
  • the red car drives over a Power Boost (triggered by a red colored element), but he decides to save it and use it later when he might need it more. He therefore does not push the button 1014 on the controller.
  • after both cars 1020 clear the third checkpoint pair 1050, they reach an element emitting a greyish light.
  • the first user's controller 1010 speakers 34 warn him of an incoming pile of rubble on the road (triggered by a gray colored element).
  • the car 1020 goes over the pile and starts jittering; its power flickers on and off, making the car's engine hiccup and the car hard to control.
  • a fierce side wind appears all of a sudden (triggered by a white colored element), causing their cars to veer slowly off the racetrack 1060.
  • the second user activates the drawbridge 1070 by passing a checkpoint 1050, causing the bridge to slowly rise up.
  • the second car 1020 is close enough to cross it, jumping over the low ramp.
  • the first user, whose car 1020 is now in second place, fears that his car will not make it in time to cross the drawbridge and that he will have to wait until it lowers back down, so he activates his stored "power boost" by pressing the button 1014 on his controller 1010, shoots his car 1020 off the ramp and into the air, passes the second car, and is announced the winner.
  • the elements 1030 are scattered around the house in the bedrooms, kitchen and living room, under the dining table, behind the couch, under the bed and so on. The users set up the game and the cars 1020 begin.
  • the first and second users drive their cars 1020 via their remote controller 1010 around the house looking for a lit element 1030.
  • the elements 1030 light up in a random order. When a user reaches a lit element 1030 he collects one point, the element shuts down and the next one lights up. Only a single element 1030 is lit at any given time.
  • the interactive devices may send updates to a website, such as a social media website, after a game is completed to report or log the result of the game, update the user statistics, announce that the statistics have been changed, etc.
  • although this embodiment is described in terms of remote controllers, cars, drawbridges, etc., it can also function similarly in other forms, such as, for example, interactive toy guns that react to the player's arsenal.
  • the arsenal may comprise various elements that function as grenades, proximity mines, etc.
  • when a player holding a gun is in range of an element functioning as a grenade, for example, the gun may vibrate and be unable to fire for ten seconds. If the element functions as a freezing grenade, for example, lights on the gun may turn ice blue.
  • An interactive device, such as a device as described above, sends out a request for reply by devices in its vicinity 1110.
  • the request for reply may be sent out following detection of movement of the interactive device 1104 through, for example, a tilt sensor, gyroscope, touch button or screen, or the like.
  • the request for reply may be sent out when the interactive device receives a request for reply from another device 1106, periodically in case the previously detected mesh network has changed somehow, or the request may be sent out at a time when a user of the interactive device turns it on, operates a control on the device, sends a command to the device via another connected device, or pushes an over-the-air update, etc. 1108.
  • Devices in its vicinity respond directly to the interactive device, indicating that they are in its vicinity 1112, and the interactive device and/or devices in its vicinity are updated with the information regarding the nearby devices, or network nodes 1114.
  • Information regarding the nearby devices may also be sent to the interactive device 1116 either after a request from the interactive device or automatically in response to the initial request.
  • This information may include, for example, unique identification indicia of the specific device (e.g., an identification number uniquely assigned to the particular device) or type of device (e.g., a model number of the device), user information stored in the device, information regarding routines that are currently stored on the device, etc.
  • the interactive device receiving this information may then look up the unique identification indicia of the device or type of device to determine information about the device, such as capabilities, etc.
  • the interactive device may also, for example, execute a routine shared by it and the nearby device 1128, request an update 1126 from the nearby device based on a routine, skill, or behavior that it does not have in its memory 1124, send an offer of update to the nearby device for a routine, skill, or behavior that the nearby device does not have in its memory, send an offer to the nearby device to participate in a shared routine, etc. (a simplified sketch of this discovery and update exchange appears after this list).
  • the interactive device may have a default routine or functionality that is automatically initiated 1122, allowing basic functionality (e.g., a voice recognition routine is run when sound is detected, and outputting programmed responses to recognized queries; blinking lights in response to detection of a specific RFID tag, etc.) when no special shared routines are detected 1120.
  • although updates are described in terms of updates between nearby interactive devices, it is also within the scope of the invention for updates to be sent or received via other devices, such as consoles, telephones, access points, terminals, etc., and they can be sent via wired or wireless connections or through other devices on the mesh network.
  • the updates described above may be, for example, for specific behaviors of the interactive device to occur immediately (e.g., to immediately play a song), a new skill or behavior to include in future routines (e.g., to change/ replace an audio file, such as a character's signature sound, or the ability to play a specific song in addition to other songs already known), a new capability with its current hardware (e.g., the ability to play music), a new routine (e.g., to play an audio file when it detects a device in the future with the same model number as the nearby device), a new application (e.g., to play a game of chess by detecting its and other devices' roles and locations on an RFID game board and scoring accordingly), or an upgrade to the interactive device's operating system.
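By way of illustration, the discovery and update exchange of FIG. 11 may be pictured as the following simplified Python-style sketch; the class, method, and field names here are hypothetical examples only and do not correspond to any particular device's firmware.
    # Hypothetical sketch of the FIG. 11 flow: a device that detects movement broadcasts
    # a request for reply, records the devices that answer, and then either runs a shared
    # routine, requests an update for a missing routine, or falls back to a default routine.
    class InteractiveDevice:
        def __init__(self, device_id, routines):
            self.device_id = device_id          # unique identification indicia (1116)
            self.routines = set(routines)       # routines currently stored in memory
            self.neighbors = {}                 # nearby devices / network nodes (1114)
            self.outbox = []                    # messages queued for the transceiver

        def on_movement_detected(self, replies):
            # 1104/1110: movement triggers the request for reply; 'replies' stands in for
            # the direct responses received from devices in the vicinity (1112).
            for reply in replies:
                self.neighbors[reply["id"]] = reply
                self.handle_neighbor(reply)

        def handle_neighbor(self, info):
            shared = self.routines & set(info.get("routines", ()))
            if shared:
                self.run(sorted(shared)[0], info)            # 1128: execute a shared routine
            elif info.get("routines"):
                # 1124/1126: request a routine, skill, or behavior missing from memory
                self.outbox.append({"to": info["id"], "type": "update_request",
                                    "routine": sorted(info["routines"])[0]})
            else:
                self.run("default", info)                    # 1120/1122: default routine

        def run(self, routine, info):
            print(f"{self.device_id}: running '{routine}' with {info['id']}")

    # Example: a car hears back from a checkpoint that shares a "race" routine.
    car = InteractiveDevice("car-1020", {"race", "default"})
    car.on_movement_detected([{"id": "checkpoint-1050", "routines": ["race"]}])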

Abstract

A method for interactive environment sensing that comprises providing a first interactive device comprising a shared bus, a central processing unit coupled to the shared bus, and an inertial monitoring unit coupled to the central processing unit, and further providing a second interactive device in proximity to the first interactive device.

Description

WIRELESS INTERACTIVE DEVICE SYSTEM AND METHOD
Background of the Invention
In many conventional interactive devices, a command is sent from a central console or terminal, such as a computer, game console, telephone, etc. to a separate device. The separate device then performs some function based on the command that was sent. Remote control toys, for example, receive commands or signals wirelessly and execute a predetermined action in response to that command. Some interactive devices may also receive firmware updates wirelessly, which can change the rules by which the devices operate in different situations. In some conventional devices, the interactive device may also send information back to the central console or terminal, indicating that it has received the command, executed the command, or updated its firmware.
Some devices seem to "sense" users' movements and/or their environment through various sensors located on the device. Examples of such devices include electronic game controllers, motion-sensing cameras, IR- or RF-responsive toys, etc., which are programmed to run a particular routine when a particular event is detected. The devices can then relay the information about the event or execution of its commands back to the central console. The central console can then send commands to the device based on the sensed event. This interaction of reacting to pre-programmed routines or commands received from a console in response to an on-board sensor may cause the devices to appear to be "aware" of their environment.
As toys and other interactive devices become more sophisticated and complex, the time needed to send, receive, and confirm receipt of the commands increases, and real-time functionality of these devices is limited. The necessity of a central relay for sending commands also limits the applications of such interactive devices. Further, the "awareness" of these devices is limited to mere sensing of an event and performing pre-programmed routines based on the sensed event or on a command received from a central console.
Conventional interactive devices are also limited as to their real-time responsiveness to other devices in their vicinity. Many devices require time-consuming installation procedures to allow interactivity with a specific device or console in its proximity. New devices must be intentionally added to a network, and the network may be limited to a short distance away from the device, the console, or the access point. In order to determine all of the nearby interactive devices, many conventional systems require a time-consuming addressing procedure, in which the command unit requests the location of each device and maps a grid from the responses received.
Brief Description of the Drawings
FIG. 1 is a diagrammatic view of one embodiment of an interactive device according to the invention.
FIG. 2 is a schematic view of a mesh network according to another embodiment of a system according to the invention.
FIG. 3 is a perspective view of a dice-playing game according to another embodiment of the invention.
FIG. 4 is a perspective view of an action figure system according to yet another embodiment of the invention.
FIG. 5 is a perspective view of a content interaction system according to yet another embodiment of the invention.
FIG. 6 is an exploded perspective view of a game board system according to one embodiment of the invention.
FIG. 7 is a perspective view of a block interaction system according to one embodiment of the invention.
FIG. 8 is a perspective view of a medication monitoring system according to another embodiment of the invention.
FIG. 9 is a perspective view of an interactive glove according to yet another embodiment of the invention.
FIG. 10 is a perspective view of a remote control car system according to another embodiment of the invention.
FIG. 11 is a process flow diagram of one embodiment of a method according to the invention.
Detailed Description of the Invention
FIG. 1 shows a block diagram of an interactive device 1 according to one embodiment of the invention. The interactive device 1 includes a control unit 10, an inertial monitoring unit (IMU) 20, an audio unit 30, a tactile feedback module 40, a radio frequency identification (RFID) module 50, a motor module 60, and a display module 70, all connected via a shared bus 80.
The control unit 10 includes a central processing unit (CPU) 12, a memory 14, an energy unit 18, a transceiver 16, and a tilt sensor 19. The memory 14 may contain drivers for hardware, instructions for carrying out routines, and instructions for responding in different ways based on information sensed from the environment or commands received. The memory may also contain information about the user of the device, such as statistics from past games, characteristics of the user, user-specified functions, highest levels reached in various applications, achievements and personalized data etc., which are correlated to unique identification indicia of the device and data for the specific model (unit), such as whether other devices represent friends or foes, shared data, abilities etc.
The energy unit 18 in this embodiment may be a power source, such as a battery, charger, solar cell, etc., and supplies power to the device 1. The energy unit 18 is responsive to the tilt sensor 19 to decrease or turn off the power source when the device 1 is stationary for a certain amount of time, and to turn on the power source if the device 1 is moved.
The transceiver 16 can send/receive signals to/from other interactive devices, consoles, or access points (see FIG. 2). The transceiver 16 may also receive signals indicative of the device's geographic location through, for example, a GPS unit on-board the device or by receiving GPS information from a device, console, or access point proximate to it. Alternatively, the transceiver 16 may receive signals indicative of the device's location through triangulating other nearby devices (see FIG. 2). The transceiver 16 may also be responsive to the tilt sensor 19, to indicate when the position of the device 1 is to be updated.
In this embodiment, the transceiver 16 includes a 2.4 GHz IEEE 802.15.4 compliant RF transceiver with internal antenna. However, one with ordinary skill in the art will recognize that other transceivers capable of sending and receiving commands in real-time may be used. For example, ZigBee, WiFi, Bluetooth or other wireless transceivers may be substituted for applications within those transceivers' real-time response capabilities and proximity ranges.
The IMU 20 includes an accelerometer 22, a gyroscope 24, and a magnetometer 25. The accelerometer 22 senses the g-force experienced by the device 1, and the gyroscope 24 senses its orientation. The magnetometer 25 may detect and measure the strength and/or direction of a magnetic field. The audio unit 30 includes an audio input 32, such as a microphone, RCA jack, or the like, and an audio output 34, such as a speaker. The tactile feedback module 40 includes, for example, a vibration mechanism 42 to provide tactile vibration, force feedback, temperature, etc. The RFID module 50 includes a passive RFID reader 52 capable of reading radio frequency identification tags or signals. The motor module 60 includes a motor controller 62 connected to a motor and movement mechanism 64, such as, for example, wheels, a propeller, movable legs, movable arms, etc., which allow the device 1 to move from one location to another or to move one portion of the device relative to another. Display module 70 includes visual feedback mechanism 72 such as, for example, RGB LED lights, a display, or other form of visual feedback as is known in the art.
Although the components of device 1 are shown in FIG. 1 as being located in distinct functional units or modules 10, 20, 30, 40, 50, 60, and 70, one skilled in the art will understand that these components may be located within a single unit or module, or distributed within the device on different units or modules or within separate housings, while remaining within the scope of the invention. One skilled in the art will also understand that fewer than all of the functional units or modules 20, 30, 40, 50, 60, and 70 may be included in an interactive device 1, while remaining within the scope of the invention. Additional functional units or modules may also be added to the interactive devices, such as, for example, galvanic skin response detectors, heat sensors, motion detectors, cameras, or interfaces, such as buttons, keyboards, etc.
In operation, the CPU 12 sends repeated signals through the shared bus to the IMU 20, requesting response if anything has changed. Any changes in acceleration or orientation are then reported to the CPU 12 via the shared bus. Precise information on the device's location can then be determined by the CPU 12, through analysis of the changes reported from the IMU 20 and the location information received through the transceiver 16. These changes may be recorded in the memory 14 to create an ongoing log of changing location, orientation, and acceleration.
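A minimal sketch of such a polling-and-logging loop, with hypothetical function names and stubbed sensor readings standing in for the accelerometer 22 and gyroscope 24, might look as follows.
    # Hypothetical sketch: the CPU repeatedly queries the IMU over the shared bus and
    # appends any reported change in acceleration or orientation to a log in memory.
    import time

    def poll_imu(read_accel, read_gyro, log, cycles=10, interval_s=0.1):
        last = None
        for _ in range(cycles):
            reading = (read_accel(), read_gyro())       # current g-force and orientation
            if reading != last:                         # only changes are recorded
                log.append({"t": time.time(), "accel": reading[0], "orient": reading[1]})
                last = reading
            time.sleep(interval_s)
        return log

    # Stubbed sensors; a real device would read these values over the shared bus.
    history = poll_imu(read_accel=lambda: (0.0, 0.0, 1.0),
                       read_gyro=lambda: (0.0, 0.0, 0.0),
                       log=[])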
Referring now to FIGs. 1 and 2, when the CPU 12 is informed of movement of the device 1 through, for example, the tilt sensor 19, the CPU 12 can send a signal out via the transceiver 16, "are there any units around me?" Similar nearby units 1 in the device's proximity can then send back a signal directly to the interactive device 1 with information about the similar units 1 and their locations. Thus, the devices 1 can network seamlessly as an ad-hoc network without time-consuming setup procedures or the need for a central console, telephone or terminal.
As shown in FIG. 2, additional network-connected devices may also be added to the network, such as game console 106, telephone 104, access point 102, or cloud device 111. Game console 106 may be connected to, for example, a display device 110 and a controller 112. As shown in FIG. 2, the access point 102 may be connected to terminal 100 through a wired connection and to terminal 108 through, for example, the Internet, which may allow tracking and updates to functionality, firmware, stored routines, etc. of the CPUs of the local interactive devices 1. An additional interactive device 1 may also connect to the access point 102 remotely through, for example, the Internet. Cloud device 111 may bridge between an interactive device 1 and a Wi-Fi enabled device, and may translate the protocol of interactive device 1 to a Wi-Fi protocol, and vice versa, to enable two-way communication between regular Wi-Fi devices and interactive device 1.
In another embodiment, a data or update request may be sent from one device 1 and have the request "bounce" from device to device within the mesh network coverage until it reaches a device connected to a wireless network, such as telephone 104. The telephone 104 may then download the requested data and send it back to device 1. In another embodiment, the request reaches a device, for example, terminal 100, that contains the requested data, and the terminal 100 sends it through the mesh network back to interactive device 1.
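One simplified way to picture this "bounce" behavior is a hop-limited search in which each device forwards the request to its neighbors until some node can satisfy it; the node and data names in the sketch below are hypothetical.
    # Hypothetical sketch: a request hops through the mesh until it reaches a node that
    # holds the requested data (for example, an Internet-connected telephone), which then
    # returns the data so it can be relayed back to the requesting device.
    def bounce_request(network, start, item, max_hops=8):
        # network: node -> (set of neighboring nodes, dict of data held locally)
        visited, frontier = {start}, [start]
        for _ in range(max_hops):
            next_frontier = []
            for node in frontier:
                neighbors, data = network[node]
                if item in data:
                    return node, data[item]
                next_frontier.extend(n for n in neighbors if n not in visited)
                visited.update(neighbors)
            frontier = next_frontier
        return None, None

    network = {
        "device-1":  ({"toy-a"}, {}),
        "toy-a":     ({"device-1", "phone-104"}, {}),
        "phone-104": ({"toy-a"}, {"firmware-v2": b"..."}),   # node with the requested data
    }
    print(bounce_request(network, "device-1", "firmware-v2"))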
In one embodiment, the information about the similar nearby units 1 that is sent back to the interactive device 1 includes information about the similar unit 1, such as a unique identifier of the similar unit 1 or the owner of the similar unit 1, etc. This information can then be stored in the memory 14 of the interactive device 1 as a "friends list," such as a database of devices with which the interactive device 1 has been near or has interacted with in physical space. Interactions of the interactive device 1 may be limited to such "friends," to protect children from the dangers of child molesters, who may otherwise attempt to befriend the children through online play.
The above embodiments are discussed in terms of embedded devices that communicate seamlessly with each other without the need for "plug and play" setup. It is also within the scope of the invention, however, for the interactive device to be insertable into multiple, different housings or devices. For example, the interactive device may be connected via an interface port to an action figure to allow interactivity and functionality as described above, disconnected from the action figure, and then connected to an interface port in a refrigerator, thus allowing the refrigerator to interact with nearby interactive devices. Alternatively, the interactive device may be embedded in the action figure, refrigerator, display 110, etc., such that all are connected wirelessly on a continuous basis.
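The "friends list" described above amounts to a small database keyed by unique identifiers; the following hypothetical sketch shows how interactions might be limited to devices the interactive device has actually met.
    # Hypothetical sketch of a "friends list": devices met in physical space are recorded
    # in memory, and later interaction requests are accepted only from recorded devices.
    class FriendsList:
        def __init__(self):
            self._friends = {}                      # unique identifier -> last known info

        def met_in_person(self, device_id, info):
            self._friends[device_id] = info         # stored in the device's memory 14

        def may_interact(self, device_id):
            return device_id in self._friends       # unknown devices are refused

    friends = FriendsList()
    friends.met_in_person("figure-300", {"owner": "a classmate"})
    print(friends.may_interact("figure-300"))       # True
    print(friends.may_interact("unknown-999"))      # False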
FIG. 3 shows one application of an interactive device as described above. Interactive devices 200, 202 are formed as dice, including separate displays 204 on each face, which can change each face's appearance with indicia, such as numbers, pictures, text, color, etc., relating to the game played. The interactive devices 200, 202 each include a CPU 12, a memory 14, an energy unit 18, a transceiver 16, a tilt sensor 19, and a gyroscope 24, substantially as described in connection with FIG. 1. Referring to FIGs. 1 and 3, the memory 14 for each interactive device 200, 202 includes rules for a dice game application.
When the tilt sensor 19 of interactive device 200 indicates to the CPU 12 that the interactive device 200 is moved, the CPU 12 sends out a signal via transceiver 16 to determine whether any other interactive devices are in its proximity. Interactive device 202 receives this signal through its transceiver 16 and sends a response back to the interactive device 200 that it is in the proximity of interactive device 200. Interactive device 200 then sends a signal to interactive device 202 indicating that its memory includes a dice game application and requests interaction with device 202 in accordance with that application.
Interactive device 202 may also have the dice game application in its memory, and a signal can be sent back to interactive device 200 that the game is initiated. Alternatively, the interactive device 202 may send a response that the dice game application is not in its memory and can either decline initiation of the game or request an update to its memory be sent via either the other interactive device 200, a nearby access point, or other device.
The dice game may be played by, for example, each user taking turns rolling one or both dice. When die 200 is rolled, the CPU 12 may request orientation information from the gyroscope 24 along the shared bus. The gyroscope 24 can respond to the CPU 12 request and send information to the CPU indicating which face of the die 200 is currently facing upward. This information may then be recorded in the memory 14. The second die 202 may then be rolled, and the face of the die 202 facing upward after the roll may then be recorded in the memory of the second die 202. The respective values of the rolls can then be communicated between the devices 200, 202 via their respective transceivers 16, and an indication of which user had a higher value roll can then be recorded in the memory 14 of the dice 200, 202.
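The roll exchange described above may be pictured with the following simplified sketch, in which a random number stands in for the gyroscope's report of the upward-facing face and both dice record the outcome; all names are hypothetical.
    # Hypothetical sketch of the dice exchange: each die records its own roll, the values
    # are exchanged over the transceivers, and the winner is logged in both memories.
    import random

    class Die:
        def __init__(self, name):
            self.name, self.memory = name, []

        def roll(self):
            face = random.randint(1, 6)             # stands in for the gyroscope reading
            self.memory.append(("my_roll", face))
            return face

    def play_round(die_a, die_b):
        a, b = die_a.roll(), die_b.roll()
        winner = die_a.name if a > b else die_b.name if b > a else "tie"
        for die in (die_a, die_b):                  # both dice record the outcome
            die.memory.append(("winner", winner))
        return winner

    print(play_round(Die("die-200"), Die("die-202")))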
If the interactive devices 200, 202 also include vibration mechanisms 42 or visual feedback 72, the die 200 or 202 that had the higher roll may, for example, vibrate or light up, to indicate which die 200 or 202 won the particular round. It might also synchronize the outcome to all of the other relevant devices, to make sure that all of the game pieces coordinate their functions and rule sets according to the game rules. The synchronization may also take place with remote devices, such as when playing with a remote friend online, as will be discussed below.
Referring to FIGs. 1-3, the interactive devices 200, 202 may also be connected to other devices, such as console 106, telephone 104, access point 102, terminals 100, 108, display 110, cloud device 111 or remote interactive device 1. It is also within the scope of the invention for the interactive device to include a wireless dongle for connection to a terminal, the Internet, and/or the cloud device without going through a remote access point. Ongoing scores, user statistics, etc., may also be sent to these other devices and saved in their respective memories. Alternatively, interactive devices 200 and 202 may be located in remote locations and networked via the Internet through an access point 102.
As shown in FIG. 3, console 106 is optionally wirelessly connected to interactive device 200. Console 106, also connected to display 110 and controller 112, may also be running a routine related to the dice game and displaying content related to the dice game on the display 110. The data related to the orientation of the dice 200, 202, the user scores, etc., may dictate content displayed on the display 110, such as, for example, advancing characters in an ongoing game based on the value rolled on the dice, increasing or decreasing powers of characters, changing locations, starting or stopping video content to be displayed, or any other content, etc. Likewise, the dice game between the interactive devices 200, 202 may be initiated by routines running on the console 106, based on, for example, initiation of a battle between characters in the game running on the console, etc. The console 106 or other connected device may alternatively update the interactive devices 200, 202 with the dice-playing rules or additional routines based on the game/content/routine running on the connected device, the functionality of the interactive devices 200, 202, additional routines purchased by a user, etc. For example, the interactive device may include instructions to light up, move in a particular direction, or make a particular sound based on particular content running on the console. Likewise, the gyroscope or accelerometer of the interactive device may send commands based on the manner of movement to the console or other connected device to change the content or routines running on the console or device.
Although FIG. 3 is described in terms of a dice-playing game, any number of routines or sequences may be stored in the memory of the devices 200, 202, providing for various behaviors and series of smart interactions between the respective devices 200, 202 and/or other networked devices, such as consoles, displays, telephones, access points, terminals, or any embedded device or object, such as an action figure 300 as described in FIG. 4. These routines may go far beyond a single behavior or series of behaviors in response to a command. As described, they may also include rules and routines for smart, ongoing interaction with the other devices, changed functionality, such as new skills or games, and updated applications. The dice may alternatively act as blocks of light that may be placed or constructed in any way, sensitive to user's proximity and gestures. Although the dice are depicted in FIG. 3 with separate displays on each face, it is also within the scope of the invention for the displays to be replaced by non-display surfaces with indicia on each face.
Although this embodiment is described in terms of dice, one skilled in the art will understand that the interactive devices may be formed in any suitable shape or size, depending on the specific functionality or application desired.
FIG. 4 shows another embodiment of an interactive device according to the invention.
Interactive devices are shown in the form of an action figure 300 and a vehicle 400. In reference to FIGs. 1 and 4, action figure 300 includes a CPU 12, energy unit 18, transceiver 16, memory 14, tilt sensor 19, microphone 332, visual feedback in the form of LED lights 372, RFID reader 53, motor controller 62, and movement mechanism 64. Likewise, vehicle 400 includes a CPU 12, energy unit 18, transceiver 16, memory 14, tilt sensor 19, speaker 434, motor controller 62, movement mechanism 64, an RFID tag 402, and a handlebar button 404. When the tilt sensor 19 of the action figure 300 indicates that it has been moved, the CPU 12 of the action figure 300 sends out a signal via the transceiver, asking what devices are nearby. Vehicle 400 receives the signal through its transceiver 16, and its CPU 12 sends back information regarding its location, along with information about its capabilities. This information is received by the CPU 12 of the action figure 300, which checks its memory 14 for stored routines that can be used with devices having the characteristics of the vehicle 400.
If none are found, the action figure 300 may send out a request to the vehicle 400 for an update to its stored routines to enable interaction. Updates may be sent back by the vehicle 400 automatically or through qualification based on a number of factors, such as, for example, comparison of the user statistics stored in the memory 14 of the action figure 300 with a certain minimum experience points required for a particular functionality, characteristics of the action figure 300 stored in the action figure's memory 14, past interactions between the action figure 300 and the vehicle 400, whether information stored in the action figure's memory 14 indicates that the user of the action figure is on a predetermined list of approved users, etc. The vehicle 400 may also be connected to another device or to the Internet, and information relayed from the memory 14 of the action figure 300 can be, for example, checked against a list of paid subscribers found through the Internet, a list of prior interacting devices with the other connected device, etc.
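The qualification step described above can be pictured as a small rule check run before an update is granted; the criteria, thresholds, and field names below are hypothetical examples only.
    # Hypothetical sketch of qualifying an update request: information relayed from the
    # requesting figure's memory is compared against simple criteria before an update is sent.
    def qualify_update(requester, min_experience=100, approved_users=frozenset()):
        if requester.get("experience_points", 0) < min_experience:
            return False                            # not enough experience for this functionality
        if approved_users and requester.get("user") not in approved_users:
            return False                            # user is not on the predetermined list
        return True

    figure_info = {"user": "alice", "experience_points": 240}
    print(qualify_update(figure_info, min_experience=100, approved_users={"alice", "bob"}))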
If a stored routine or application is found or updated in the action figure 300, the action figure 300 can send a request to the vehicle 400 to initiate the routine, allowing the devices to interact according to a set of rules. The vehicle 400 or the action figure 300 may also vary their functions or behavior based on the proximity of the two devices to each other, exhibiting different functions or behavior when they are closer to each other than farther away. One with ordinary skill in the art will understand that a multitude of applications, routines, rules, or skill sets may apply to the interaction of the two devices 300, 400, but one illustrative example of such a routine will now be described.
Once the routine is initiated, for example, the speaker 434 may output a sound of a revving motorcycle engine. The microphone 332 may detect this sound and the routine may indicate that the LED lights 372 blink in response. Alternatively, the vehicle 400 may communicate to the action figure 300 that the sound is being output via its speaker 434, which may trigger the blinking of the LED lights 372, or the same routine may initiate output of the sound via speaker 434 and the blinking of the LED lights 372.
If the action figure 300 is positioned to be seated on the vehicle 400, in contact with the RFID tag 402, the RFID reader 53 may communicate that contact to the CPU 12 of the action figure 300, and the CPU 12 may look up one or more valid responses to the contact in the routine stored in the memory 14, and instruct the action figure motor controller 62 to move the arm 64 of the action figure 300 until it pushes the handlebar button 404. The CPU 12 of the vehicle 400 may detect the engagement of the handlebar button 404, and operate the motor controller 62 of the vehicle 400 to turn the wheels 64 of the vehicle 400. Thus, the interactive devices may do more than simply communicate and respond to commands - they may interact with other elements in the environment, such as other interactive devices, objects in the environment, sounds, sights, proximity, etc.
For example, various objects may be scattered in the vicinity of the vehicle, representing road hazards. The vehicle 400 may encounter a bumpy floor surface through a gyroscope as it is moving, communicate this to the action figure 300, which may then play a sound file, "This is a bumpy ride." As discussed above, the sound file may alternatively be played in response to communication from the vehicle 400, other interactive device 1 or from the shared routine, rather than through direct detection. Likewise, the RFID reader 53 of the action figure 300 may detect an RFID tag or interactive device embedded in its environment, which represents a "power up." The action figure 300 may then communicate this to the vehicle 400, prompting it to increase its speed. In another example, the action figure 300 may detect a weapon held in its hand through, for example, an RFID reader, pressed button, etc., which may provide additional skills to the action figure, hit points attributed to movements of the action figure 300, etc. In other embodiments, an RFID tag or interactive device may be embedded in playing cards, in a patch on clothing, etc. A user may have, for example, their favorite character's T-shirt with an RFID tag embedded in a patch, which can give the RFID-reading action figure additional functions, etc.
Infinite variations of routines and skill sets between one or more interactive devices, such as the above described, are possible, including proximity-based alliances or confrontations that will update user statistics for an ongoing game in the device's memory based on the interactions. Skill sets, routines, and interactive capabilities may also be updated in real time based on the interactions, etc. For example, if the action figure 300 is able to stay on the vehicle 400 for a minimum amount of time while it is in motion, the skill set of the action figure 300 may be updated via either the vehicle 400 or a wireless Internet connection to include performance of a handstand on the vehicle 400.
It is also within the scope of the invention for additional functional modules to be added to the interactive devices such as, for example, a modular audio unit 30 (FIG. 1) including an audio input 32, such as a microphone, RCA jack, or the like, and an audio output 34, such as a speaker. Connecting such audio units 30 to the interactive devices may enable, for example, use of the interactive devices as communications units, such as walkie-talkies, VoIP (Voice over IP) units or the like, through either direct wireless connection between the interactive devices or, if the interactive devices are in remote locations, via the Internet through an access point 102 (FIG. 2). Voice recognition modules may also be added to allow control of remote terminals or of other embedded devices.
Although the embodiment described in FIG. 4 features the interactive device as an action figure and vehicle, one having ordinary skill in the art will recognize that the interactive device can be formed of various sizes, shapes, and functionalities, depending on the desired application and functionality of the device. Examples include balls, refrigerators, game pieces, wearable objects, interactive appliances, TV screens, hard disk drives, or any other object desired to interact with digitally or to give added value when digitized, etc.
FIG. 5 shows yet another embodiment of the invention. An interactive device system 500 includes an interactive device in the form of a bracelet 501, a console 506, and a display 510. The bracelet 501 includes a CPU 12, energy unit 18, transceiver 16, memory 14, tilt sensor 19, gyroscope 24, and visual feedback in the form of different color LED lights 572.
In this embodiment, the CPU 12 of the bracelet 501 sends out a signal asking about devices in its proximity after indication of movement by, for example, the tilt sensor 19. The console 506 sends back a signal indicating that it is in the proximity of the bracelet 501. The console 506 and the bracelet 501 can then send out various requests related to shared routines, information stored in the memory 14 of the bracelet 501, capabilities of the bracelet 501, and information related to the content running on the console 506. For example, the console 506 can send information to the bracelet 501 indicating the game that is currently running on the console 506, and the bracelet 501 can send the statistics of the bracelet's user in that game such as, for example, stored characters, highest level attained, special character skills, current points, etc. Alternatively, the console 506 may receive information only related to a unique identification number of the interactive device and reference a table in its memory or a remote location through a connected access point (FIG. 2) that includes the more detailed information about the user. Thus, a user may functionally port all of her characters, achievements, statistics, etc. for use on a friend's console 506 with little or no setup time.
Although this embodiment is described in terms of games and consoles, it is also within the scope of the invention for such an interactive device to be used for other applications such as, for example, retail environments, in which past sales of a user at a particular location are stored in the device's memory or in the memory of a console or other interactive device at the location. When the user information is communicated to the location's console or interactive device, the console or interactive device may suggest new products to the user that are similar to past purchases of the user stored in the memory.
Referring again to FIGs. 1 and 5, the bracelet 501 may also function as a remote controller of the console 506, mapping movements and location of the user detected by the gyroscope 24 and accelerometer 22, or as described above in connection with FIG. 1, into commands for interacting with the content running on the console 506 and displayed on the display 510. Likewise, the bracelet 501 may respond to the content running on the console 506 or to commands from the console 506. For example, one color of LED light 572 may light up when the character in the game is moving close to an enemy, and a different color of LED light may light up when the character in the game is moving toward a hidden treasure. One skilled in the art will also understand that such interaction may occur with audio, vibrations, etc., in other situations as well, such as when powering up, getting hit, etc.
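A simplified sketch of this two-way mapping, with sensed motion translated into console commands and game events translated into LED colors, is shown below; the thresholds, motion classes, and event names are hypothetical.
    # Hypothetical sketch: bracelet motion is mapped into commands for the console, and
    # game events reported back by the console are mapped into LED colors on the bracelet.
    MOTION_TO_COMMAND = {"swing": "attack", "twist": "open_inventory", "still": None}
    EVENT_TO_LED = {"enemy_near": "red", "treasure_near": "green", "power_up": "blue"}

    def classify_motion(accel_magnitude, gyro_rate):
        if accel_magnitude > 2.0:
            return "swing"
        if gyro_rate > 1.5:
            return "twist"
        return "still"

    def bracelet_step(accel_magnitude, gyro_rate, console_event):
        command = MOTION_TO_COMMAND[classify_motion(accel_magnitude, gyro_rate)]
        led_color = EVENT_TO_LED.get(console_event)
        return command, led_color

    print(bracelet_step(2.4, 0.1, "enemy_near"))    # ('attack', 'red')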
An interactive appliance 520 in the form of, for example, a lamp may also be included in system 500. The lamp 520 may also include a CPU 12, energy unit 18, transceiver 16, memory 14, and a light source 521, such as an RGB light bulb. The interactive appliance 520 may also interact with the content running on the console 506 and displayed on the display 510 to, for example, dim, blink, or change colors at pivotal moments of the game or any other digital content running on the console 506. This interaction may be based on, for example, commands sent from the console 506, sounds or visual signals detected from a speaker (not shown) or display 510, a synchronized routine running on both the lamp 520 and the console 506, the time period measured from the beginning of the content interaction, synced with a running script, etc. In another example of an interactive appliance, if an interactive device is connected to an interface port of an air conditioning unit, the thermostat may be set to be far lower during a cold scene of a movie running on the console. In one embodiment, the presence of both the bracelet 501 and the lamp 520 may be detected by each of the interactive devices 501, 520, and the console 506, and additional content may become available in the game or movie, or additional functionalities of the bracelet 501 and/or lamp 520 may become available.
Information about experiences the user had while interacting with the game content, such as successes, additional levels, points, etc., may be stored in the memory of the bracelet 501. Indicia of that information may remain on the bracelet 501, such as a certain color LED light remaining lit after the interaction, to indicate to others that the user has, for example, achieved a certain level of experience, even after the bracelet is moved away from the proximity of the console 506.
Although the interactive device is described in terms of a bracelet 501, it is also within the scope of the invention for the device to be in the form of, for example, other wearable devices, such as a watch, ring, necklace, sticker, glove, head-mounted display, augmented reality glasses, etc., or non-wearable devices, such as those described above. Although the content is described as running on a console 506, it is also within the scope of the invention for the content to run on a PC, smart TV, DVD player, or any other media device.
FIG. 6 depicts another embodiment of a system according to the invention. Referring now to FIGs. 1 and 6, interactive system 600 includes interactive devices 601 in the form of action figures, a grid 603 of RFID tags 604, and game board overlay 606. In this embodiment, each of the interactive devices 601 includes a CPU 12, energy unit 18, transceiver 16, memory 14, and RFID reader 53. As seen in FIG. 6, the RFID tags 604 are arranged in an array to form grid 603. The game board overlay 606 may, for example, depict a map, picture, game board with path, etc., downloaded from a website, according to the game intended to be played by the user. It is also within the scope of the invention for the game board overlay 606 and the RFID grid 603 to be a single element.
Game board overlay 606 is laid on top of the grid 603, and the action figures 601 are placed on top of the game board overlay 606 in various locations. In each of the action figures 601, the RFID reader 53 notifies the CPU 12 that the action figure 601 is adjacent to a particular RFID tag within the grid 603, effectively notifying the action figure 601 about its location on the game board overlay 606. The action figures 601 can then communicate the respective locations to each other or to another device (see FIG. 2), such as a telephone 104, console 106 connected to a display 110, access point 102, terminals 100, 108, or remote device 1. They may also or alternatively communicate with interactive dice 200, 202, as described in FIG. 3, to incorporate dice rolling features in a game.
The memory 14 of each action figure 601 can also be updated according to the rules of the game being played with, for example, the current location of the action figure 601, new experience levels, etc., and the CPU 12 of the action figure 601 may indicate certain behavior or rules for the action figure 601, such as, for example, eyes blinking, arms moving, a sound playing, helping other characters in the game, giving power, health, weapons etc.
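The location step can be pictured as a simple mapping from RFID tag identifiers to board coordinates, which each figure reads and then shares with its peers; the grid size and names below are hypothetical.
    # Hypothetical sketch: each RFID tag in the grid 603 maps to a board coordinate, so a
    # figure that reads a tag learns its own location and can broadcast it to other figures.
    GRID = {f"tag-{row}-{col}": (row, col) for row in range(8) for col in range(8)}

    class Figure:
        def __init__(self, name):
            self.name, self.location, self.peers = name, None, {}

        def on_rfid_read(self, tag_id):
            self.location = GRID[tag_id]            # RFID reader -> CPU -> memory
            return {"from": self.name, "location": self.location}

        def on_peer_update(self, message):
            self.peers[message["from"]] = message["location"]

    knight, dragon = Figure("knight-601"), Figure("dragon-601")
    dragon.on_peer_update(knight.on_rfid_read("tag-2-5"))
    print(dragon.peers)                             # {'knight-601': (2, 5)}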
If the location information of the action figures 601 is sent to another device, as discussed above, the location of the action figures 601 may also be depicted on display 110, terminals 100, 108, etc., for remote observation and/or remote game play. Accordingly, the system may also include a second grid of RFID tags, game board overlay, and interactive devices in a remote location, which can interact with the local system via access point 102. In this way, locations of both the local and remote interactive devices can be communicated to all of the devices, and remote users may effectively play the game together. It is also within the scope of the invention for a remote user to move virtual characters on a display, and that information can be communicated to the action figures 601 to enable remote game play.
Different game board overlays 606, corresponding to different games or variations within a game, may be laid on the grid 603 for a variety of applications. The particular game board overlay 606 placed on the grid 603 may be, for example, automatically detected by the action figures 601 or other device, or communicated directly by the user to one or more of the connected devices including dice. It is also within the scope of the invention for the game board to itself be an interactive device, sensing locations of all of the action figures/game pieces in contact with it, etc.
FIG. 7 depicts another embodiment of a system 700 according to the invention. Referring to FIGs. 1 and 7, interactive blocks in various shapes, such as a cube 701, interlocking block 703, pyramid 705, animal 707, base 709, legs 711, etc., each include a CPU 12, energy unit 18, transceiver 16, and memory 14. The blocks may be placed in a desired arrangement relative to each other to construct structures, scenes, etc. Once the user is finished with whatever they desired to build, a virtual 3D model of their physical creation is uploaded to their personal virtual space via, for example, console 706. While it is visible on display 710, the user may then use a controller 712 to move around virtual objects representing the interactive blocks, scale them, and so on. The interactive blocks may then be taken apart, rearranged, etc. Each time, the finished arrangement may be uploaded and saved as a virtual playing ground. A user may build whole cities, castles, etc. and add physical action figures, which may play and interact with the virtual model. For instance, after building a city, the player can play with his physical "Godzilla" action figure 713 and demolish virtual buildings or run rampage through the streets.
Certain interactive blocks may have different characteristics, such as water (and other elements), mountains (and other terrain), black holes, temperature, traps and so on. The blocks may also be embedded with RGB LEDs and react to the played scenarios according to the characteristics ascribed to the block and any shared rules or routines.
As mentioned, all structures and built objects can be saved to the users' virtual space and can be accessed by friends to enable cooperative play, as described in connection with FIG. 6. Remote players can build together (each from his own home) and can visit other virtual worlds of different players. Such integration may also be achieved through social networking sites, such as Facebook and YouTube. Although this embodiment is described in terms of each of the blocks 701, 703, 705, 707, 709, 711, and 713 having a CPU 12, energy unit 18, transceiver 16, and memory 14, it is also within the scope of the invention to have, for example, only a single interactive block, which is capable of sensing positions of the remaining blocks and communicating those positions to the console 706 or other device.
FIG. 8 depicts another embodiment of a system 800 according to the invention. Referring to FIGs. 1 and 8, a bracelet 801 includes a CPU 12, energy unit 18, transceiver 16, memory 14, RFID reader 52, and RGB LED lights 72. A plurality of medication boxes 803, 805, 807 each has affixed to it a different color RFID tag 809, 811, 813, which may be in the form of, for example, stickers. After detecting that a certain amount of time has passed since previous medication was taken, the CPU 12 of the bracelet 801 may indicate that a blue LED light 72 should be lit on the bracelet 801, indicating that medication from medication box 803 with a blue RFID tag 809 should be taken. After the user takes medication from medication box 803, the RFID reader 52 of the bracelet 801 can be brought into proximity of the RFID tag 809, indicating that the relevant medication has been taken.
The CPU 12 of the bracelet 801 can then indicate that a red LED light 72 should be lit on the bracelet 801, indicating that medication from medication box 805 with a red RFID tag 811 should be taken. A similar procedure as outlined above can then be followed for that medication, as well as the medication in box 807 with a green RFID tag 813. Once the CPU 12 of the bracelet 801 is updated that all medications have been taken, the LED light 72 may be turned off, indicating that no medications are due to be taken at that time.
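The reminder cycle described above may be summarized as the following small state machine, with hypothetical tag identifiers matching the colored RFID stickers on the boxes.
    # Hypothetical sketch of the bracelet's reminder cycle: the LED shows the color of the
    # next medication box, and scanning that box's RFID tag marks the dose as taken.
    SCHEDULE = [("blue", "tag-809"), ("red", "tag-811"), ("green", "tag-813")]

    class MedicationBracelet:
        def __init__(self):
            self.pending = list(SCHEDULE)

        def led_color(self):
            return self.pending[0][0] if self.pending else "off"

        def on_tag_scanned(self, tag_id):
            if self.pending and tag_id == self.pending[0][1]:
                self.pending.pop(0)                 # dose confirmed as taken
            return self.led_color()

    bracelet = MedicationBracelet()
    print(bracelet.led_color())                     # 'blue'
    print(bracelet.on_tag_scanned("tag-809"))       # 'red'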
One with ordinary skill in the art will appreciate that other means of alerting the wearer of the bracelet 801 may be substituted for the LED lights 72 such as, for example, vibration alerts, a display with readable information, audio alerts, etc.
Referring to FIGs. 1, 2, and 8, the transceiver 16 may send information to another device, such as, for example, a remote terminal 108 or telephone 104 when a dose of medication is missed, when the CPU 12 indicates that medication within the medication box is running low, a button is pushed on the bracelet indicating an emergency, or any other alerts desired by a user. This information may be synchronized with any number of other digital devices, such as, for example, computers, personal web pages, email notifications, SMS or updates to mobile device applications, patient call center terminals, etc. A GPS or other position tracker may also be added to the device for additional safety of the user.
Although the device in this embodiment is described in terms of a bracelet 801, it is also within the scope of the invention for the interactive device to be constructed in any desired shape or size, including, for example, a pyramid, box, plastic card, pill case that automatically opens when medication is needed, etc.
FIG. 9 shows another embodiment of an interactive device according to the invention. Referring to FIGs. 1, 2, and 9, a glove 900 includes a strap 902 coupled to a flexible fabric 904 shaped to receive a user's fingers 906. In this embodiment, three fingers 906 are received in the glove 900, but it is within the scope of the invention for more or fewer fingers to be received in the glove 900. Each finger of the glove 900 includes a CPU 12, transceiver 16, memory 14, accelerometer 22 and gyroscope 24, allowing the movements, speed, and location of each received finger 906 to be tracked. Ends of each gloved finger 906 also include silicone-type pads 908, which allow work with touch-sensitive devices, such as, for example, an iPod, while wearing the glove 900. A battery 918 provides power to the components on each finger 906. As discussed above, each CPU 12 may be updated with movements and locations of each of the other fingers 906 via their respective transceivers 16. Likewise, those movements and locations may be updated to other devices, such as a console 106, additional interactive device 1, or the like.
The glove 900 can thus be used as a seamless controller for any number of devices through use of coordinated gestures, speed of movement, etc. For example, the CPU 12 of each finger may detect that their proximity to each other has increased, thus indicating that the user is making a fist gesture. A fist gesture may be determined through a look-up table, for example, to indicate a command to close a particular application that is running on the console 106.
Alternatively, a fist gesture may communicate that a subsequent movement from the glove 900 should be mapped into a game running on the console 106 as a punch, rather than a slap. When each accelerometer 22 indicates to each CPU 12 that fingers 906 are moving at a particular speed while in a fist configuration, a command may be sent to the console 106 that a game character should be shown to punch another game character with a certain force and with a certain hit point level.
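A simplified sketch of such gesture mapping, checking whether the tracked fingertips have closed into a fist and scaling the punch by hand speed, is shown below; the thresholds and scaling factors are hypothetical.
    # Hypothetical sketch: fingertip positions are checked for a fist (all tips close to
    # their centroid), and the hand's speed while in a fist is mapped to a punch command.
    import math

    def is_fist(finger_positions, threshold_m=0.03):
        # finger_positions: list of (x, y, z) fingertip coordinates, in meters
        centroid = [sum(axis) / len(finger_positions) for axis in zip(*finger_positions)]
        return all(math.dist(p, centroid) < threshold_m for p in finger_positions)

    def gesture_command(finger_positions, hand_speed):
        if is_fist(finger_positions):
            return {"action": "punch", "hit_points": round(hand_speed * 10)}
        return {"action": "slap", "hit_points": round(hand_speed * 3)}

    fingertips = [(0.010, 0.000, 0.000), (0.012, 0.004, 0.000), (0.009, -0.003, 0.001)]
    print(gesture_command(fingertips, hand_speed=4.2))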
Although the operation of glove 900 has been described in terms of a fist gesture, it is understood that an infinite variety of gestures may be created, learned, or looked-up, based on the individual or coordinated movement of the various fingers 906. In another example, a user may point to a device displaying certain information to "grab" the information, and then make a throwing gesture toward another device to display that information on the other device.
In another variation, a voice recognition module 919, including an audio input 32, may be included, allowing the user to "call" his home or office computer (or hard disk) and search for a specific file. Once the file is found, the user may grab it using gesture recognition or ask the remote terminal to send it to his glove (or other required location) using the voice recognition. When the file is received and stored in the glove's memory, the user may point to a terminal and the file will be sent to and opened on the designated terminal. The glove may also include a visual feedback module, as discussed above.
Such an interface may overcome many problems of prior art motion-mapping applications, which require cameras tracking lights on gloves to determine location and motion. The user of this embodiment may, for example, move in many different directions, hide his hands, etc., while still being tracked. Such movement would not be tracked by such prior art systems, which would therefore greatly limit the movement and behavior of the user. Interaction with consoles that do not include controllers, such as, for example, Microsoft Kinect systems, may therefore be greatly enhanced.
Although this embodiment is described in terms of each finger 906 of the glove 900 including a separate CPU 12, etc., it is within the scope of the invention for the various components to be shared between fingers, as the location, proximity, and movement of the fingers 906 are mapped, or for the above-described functions to be performed by, for example, a ring-sized device. Likewise, additional functional elements, such as RFID readers, LED lights, vibration, motors, etc., as described above may be added to provide additional interactivity.
FIG. 10 shows another embodiment of an interactive device system according to the invention. Referring to FIGs. 1 and 10, a remote control car system 1000 includes interactive devices in the form of a remote controller 1010, a remote control car 1020, an interactive element 1030, checkpoints 1050 on a racetrack 1060, a drawbridge 1070, a cave 1080, and a mechanized ramp 1090. The remote controller 1010 includes a CPU 12, energy unit 18, transceiver 16, memory 14, speaker 34, lights 72, interface button 1014, and a tactile feedback module 40, which may be coupled to a manual control 1012, such as a joystick, steering wheel, gas pedal, or the like, for controlling the remote control car 1020 remotely.
The remote control car 1020 includes a CPU 12, energy unit 18, transceiver 16, memory 14, speaker 34, motor controller 62, movement mechanism 64 in the form of wheels, and lights 72. The car 1020 is sized to move on the racetrack 1060, depicted in FIG. 10 as a circular track. Checkpoints 1050 are positioned near each other to form the racetrack 1060 and each include a CPU 12, energy unit 18, transceiver 16, memory 14, and lights 72. Interactive element 1030 is re-positionable within the system and includes a CPU 12, energy unit 18, transceiver 16, memory 14, activation button 1034, and lights 72. The drawbridge 1070 and mechanized ramp 1090 each include a CPU 12, energy unit 18, transceiver 16, memory 14, motor controller 62, and movement mechanism 64 in the form of a driving surface that is movable, such as between a horizontal alignment and vertical alignment, by the motor controller 62. The cave 1080 may include, for example, an entrance with multiple exits. When a car enters, a small motor moves one of the possible exit lanes to connect with the entrance lane, so the player does not know where his car will come out. The car may exit, for example, from one of the sides of the cave, from the top of the cave, etc., simulating a changing of train tracks.
The remote controller 1010 may communicate game information and track updates to the player through audio and sounds via the speaker 34, vibration via the tactile feedback module 40, and color changes via the lights 72. The remote controller 1010 can thus warn the player of road hazards or traps, functioning as a virtual "navigator". The button 1014 allows initiation or activation of power-ups or traps for other players, as will be discussed in more detail below. The manual control 1012 reacts to the interactive element 1030 with loose (sluggish) or tight (hectic) controls depending on the state or type of the element 1030.
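By way of illustration only, the following sketch (in Python) shows one possible way the remote controller 1010 might map game events to its feedback outputs. The event names, class names, and output values are assumptions introduced for this example and are not part of the disclosure.

class Speaker:
    def play(self, sound):
        print(f"speaker 34: playing '{sound}'")

class Vibration:
    def pulse(self, strength):
        print(f"tactile feedback module 40: vibrating at strength {strength}")

class Lights:
    def set_color(self, color):
        print(f"lights 72: color set to {color}")

# Hypothetical mapping of game events to (sound, vibration strength, color).
FEEDBACK = {
    "road_hazard_ahead": ("warning_tone", 3, "orange"),
    "trap_triggered":    ("trap_sound",   5, "red"),
    "power_up_ready":    ("chime",        1, "green"),
}

def notify(event, speaker, vibration, lights):
    # Warn the player of the given event through sound, vibration, and color,
    # acting as the virtual "navigator" described above.
    sound, strength, color = FEEDBACK[event]
    speaker.play(sound)
    vibration.pulse(strength)
    lights.set_color(color)

notify("road_hazard_ahead", Speaker(), Vibration(), Lights())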
The remote control car 1020 can change its colors via its lights 72 when under the effect of certain traps or road hazards. Brightness levels may change to show the current power of the remote control car 1020, the driver's skill, etc. When winning a race, for example, the lights 72 on the car may light up brightly. The car may emit sound effects through its speaker 34 that enrich the driving experience and enhance the feeling of driving through the road hazards. Sounds can be, for example, screeching wheels, a flat tire, sliding on ice, going through rubble, etc.
The car 1020 may change its functionality during a race according to the element 1030 it passes and activates. When under the influence of an element, the car may react with changed steering, navigation, speed, handling, etc. The element 1030 may be shaped to be flat, so that the car 1020 may roll directly over the element 1030, but it is also within the scope of the invention for the element 1030 to have a different shape, such that mere proximity to the element 1030 will cause the car 1020 to change its behavior and responsiveness.
One or more elements 1030 may be included in the system, which are similar in design but can differ in functionality. Before starting a race, the players may select the functionality of each element 1030 by pressing the button 1034 at the bottom of the element 1030, or the element 1030 may alternate its functionality at random. This random functionality may add excitement for the players, as they learn only at the last moment what the element's functionality is, with very little time to react or plan ahead. Each function of the element 1030 may have a unique color and lighting sequence (from steady colors and brightness to patterns or flickering colors and lights) shown by the lights 72. When a car 1020 passes near or over the element 1030, the element affects the car's steering and handling or the manual control 1012. When driving over an element 1030 that has the functionality of a patch of ice, for instance (one of many possible functions of an element), the wheels 64 of the car 1020 can lock for 3 seconds, making the car 1020 skid off the racetrack 1060 unless the player can overcome the obstacle.
Passing an element 1030 that has a "power boost" functionality may, for example, cause the car 1020 to increase its maximum speed for 5 seconds. Different functionalities of the element 1030 have different effects on the car 1020 and the controller 1010. The element 1030 can be active (displaying its colors and functionality) during the entire race, or it can light up gradually after a nearby checkpoint 1050 is crossed, adding to the surprise element.
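By way of illustration only, the following sketch shows how the timed effects described above (a 3-second wheel lock for the "ice" function and a 5-second speed increase for the "power boost" function) might be modeled on the car 1020. The Car interface and effect names are assumptions introduced for this example and are not part of the disclosure.

import time

class Car:
    # Minimal stand-in for the remote control car 1020; only the attributes
    # touched by the example effects are modeled here.
    def __init__(self):
        self.max_speed = 1.0
        self.wheels_locked = False

    def apply_effect(self, effect):
        # Apply an element effect for its duration, then restore normal behavior.
        if effect == "ice":
            self.wheels_locked = True
            print("car 1020: wheels 64 locked, skidding")
            time.sleep(3)            # locked for 3 seconds
            self.wheels_locked = False
        elif effect == "power_boost":
            self.max_speed = 1.5
            print("car 1020: maximum speed increased")
            time.sleep(5)            # boosted for 5 seconds
            self.max_speed = 1.0
        print("car 1020: back to normal handling")

Car().apply_effect("power_boost")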
An existing element 1030 can be updated with new functionalities over time. Adding more functions creates more versatility and a much richer experience. Players who own only 4 elements, for instance, will want to purchase more so they can combine and play with any functionality they wish.
One or more checkpoints 1050 may be included in the system. In this embodiment, the checkpoints 1050 are arranged in pairs facing each other. A player may arrange the checkpoints 1050 and the racetrack 1060 as desired. The racetrack 1060 can be small if played in a small room, for example, or a massive course when playing outdoors or in a large open space. The checkpoints 1050 may give much needed structure, as all players recognize a clear track and understand where they need to go instead of driving around running into each other. To prevent cheating, players may need to go through a set number of checkpoints in order to win. This way, no one can cut corners or simply cut across the track field. When passed, the checkpoints 1050 may change color via their lights 72 to the color of the passing car 1020.
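By way of illustration only, the following sketch shows one possible way of enforcing the rule that a set number of checkpoints 1050 must be passed, in order, before a lap or win counts. The gate numbering and win condition are assumptions introduced for this example.

class LapTracker:
    # Tracks gate crossings for one car; a lap counts only when every gate
    # is passed in order, so cutting across the track field does not help.
    def __init__(self, num_gates, laps_to_win):
        self.num_gates = num_gates
        self.laps_to_win = laps_to_win
        self.next_gate = 0
        self.laps_done = 0

    def gate_passed(self, gate_index, car_color):
        # Register a gate crossing; out-of-order crossings are ignored.
        if gate_index != self.next_gate:
            return False
        print(f"checkpoint pair {gate_index} lights up {car_color}")
        self.next_gate += 1
        if self.next_gate == self.num_gates:   # full lap completed
            self.laps_done += 1
            self.next_gate = 0
        return self.laps_done >= self.laps_to_win

race = LapTracker(num_gates=3, laps_to_win=1)
for gate in (0, 1, 2):
    if race.gate_passed(gate, "red"):
        print("WINNER!")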
As discussed above in relation to remote update of an interactive device, the controller 1010, car 1020, element 1030, etc. may add or change their features through remote update from, for example, a remote manufacturer, to keep the game and experience fresh and exciting for a long time. For example, the element 1030 can change functionality to mimic the effect of starting line lights, snow, an oil patch, traffic lights, a train crossing, a gas drain, rain, etc. Each function may have its unique color and lighting scheme with matching sound effects via the controller 1010, car 1020, element 1030, checkpoint 1050, etc. Weapons (forward and backward attacks), defenses (shields, defensive maneuvers, etc.), and the like can also be functionalities added to the car 1020, for example. Additional game types and mini games can be introduced with a purchased element 1030, such as drag racing, delivery and drop off challenges, and so forth.
An example of a user's experience with the system described in FIGs. 1 and 10 will now be described. A first user takes out two checkpoints 1050 and aligns them to face each other. The checkpoints 1050 light up with a dim white color, indicating activation. The first user takes out two more checkpoints 1050 and places them at a distance from the first pair of checkpoints 1050, creating a virtual track/path between the two gates. He gives them a slight angle to create a turn. The checkpoints 1050 light up and he moves on to set the finish line. He decides to place the last checkpoint pair 1050 at an angle facing the first pair in case he wants to use the same track later for three or five laps.
The first user then takes out an element 1030 in the form of a flat game piece. He flips it over and presses the activation button 1034. Pressing the button 1034 toggles through the functions, starting from "random", which means he will not know what effect the element will have on his car until he drives near it. He presses the button 1034 again until the lights 72 of the element 1030 turn red, indicating the "Power Boost" function. He places the element 1030 on the racetrack 1060 between the first and second checkpoint pairs 1050.
The first user then takes out the car 1020 and remote controller 1010 and places the car 1020 at a distance from the first checkpoint pair 1050 to indicate the starting line for this race. He then presses the button 1014 on his controller 1010. This "revs his engine", and he hears the engine roar from the car's 1020 speaker 34. A voice from the controller's 1010 speaker 34 counts down "3-2-1 GO!! GO!! GO!!" The car 1020 shoots forward and races towards the first checkpoint 1050. The car 1020 passes between the first checkpoints 1050, which turn red to match the car's color, and the crowd roars with excitement through the controller 1010 speaker 34. He drives past the second checkpoint 1050, nearing the red-lit element, when all of a sudden his car starts flashing bright red and shoots forward at a higher speed. The car 1020 returns to its original speed shortly after. As the car 1020 clears the final gate to the sound of his adoring fans from the controller 1010 speaker 34, the final checkpoints 1050 change their lights 72 to red, the controller speaker 34 sounds "WINNER!!" and plays sounds of a crowd cheering, and the car 1020 shines its lights 72 in bright red.
In a variation of this experience, the first user toggles the element 1030 to the "random" function and sets the number of laps in the controller 1010 to three. The element 1030 lights up and is activated only when the first user passes the first checkpoint 1050, creating a surprise by not revealing its functionality in advance. After the user clears the first checkpoints 1050, the element is activated and lit with a light blue color. As he nears the element 1030, the car's 1020 wheels 64 lock and he hears a sound of wheels sliding over ice from the car's 1020 speaker 34. The car 1020 slides for three seconds until he regains control. Once the car 1020 clears the third checkpoints 1050, the announcer shouts "two more laps to go" via the controller 1010 speaker 34, and the car 1020 continues to the first checkpoints 1050. All of the checkpoints 1050 and the element(s) 1030 have reset themselves for the next lap. The element 1030 has a different function each lap in random mode. Going through the third pair of checkpoints 1050 on the third lap ends the race, as they become the finish line.
In another variation of this experience, a first and second user decide to race in a system that includes two controllers 1010, two cars 1020, fourteen checkpoints 1050 (forming seven gates), a drawbridge 1070, and five elements 1030. The first user updates the original element 1030 over the air to include a new functionality, shown by a grey color in the lights 72. The red and yellow cars 1020 begin the race. The red car drives over a puddle of deep mud (triggered by a brown colored element), the responsiveness of the manual control 1012 becomes loose, and the car 1020 is sluggish, slowing down and responding with a delay. The second user, seeing what happens to the first user's car, tries to stay clear, but his car 1020 is also affected for a brief moment. He gets his bearings and the yellow car 1020 crosses the first checkpoint pair 1050, coloring it yellow.
The second user takes the lead and races on towards the next element 1030. His car 1020 runs into electronic problems and has a short circuit (triggered by a blue colored element 1030), leaving his controls 1012 highly sensitive and extremely responsive to each delicate touch. The second user passes the first checkpoint pair 1050, coloring it red, and presses on. He manages to steer the red car successfully despite the short circuit and takes the lead. When he passes the second checkpoint pair 1050, the crowd cheers via the controller 1010 speaker 34. The red car drives over a Power Boost (triggered by a red colored element), but the first user decides to save it and use it later when he might need it more. He therefore does not push the button 1014 on the controller. As both cars 1020 clear the third checkpoint pair 1050, they reach an element emitting a greyish light. The first user's controller 1010 speaker 34 warns him of an incoming pile of rubble on the road (triggered by a gray colored element). His car 1020 goes over the pile and starts jittering; its power flickers on and off, making the car's engine hiccup and hard to control. As both cars 1020 move on to clear the fourth checkpoint pair 1050, a fierce side wind appears all of a sudden (triggered by a white colored element), causing their cars to veer slowly off the racetrack 1060. The second user activates the drawbridge 1070 by passing a checkpoint 1050, causing the bridge to slowly rise up. Having activated the bridge, the second user's car 1020 is close enough to cross it, jumping over the low ramp. The first user, now in second place, fears that his car will not make it in time to cross the drawbridge and that he will have to wait until it lowers back down, so he activates his stored "power boost" by pressing the button 1014 on his controller 1010, shoots his car 1020 off the ramp and into the air, passes the second car, and is announced the winner.
In another variation, the elements 1030 are scattered around the house in the bedrooms, kitchen, and living room, under the dining table, behind the couch, under the bed, and so on. The users set up the game and the cars 1020 begin. The first and second users drive their cars 1020 via their remote controllers 1010 around the house looking for a lit element 1030. The elements 1030 light up in a random order. When a user reaches a lit element 1030 he collects one point, the element shuts down, and the next one lights up. Only a single element 1030 is lit at any given time. The first user to collect ten points wins the game. The interactive devices may send updates to a web site, such as a social media website, after a game is completed to report or log the result of the game, update the user statistics, announce that the statistics have been changed, etc.
Although this embodiment is described in terms of remote controllers, cars, drawbridges, etc., it can also function similarly in other forms, such as, for example, interactive toy guns that react to the player's arsenal. The arsenal may comprise various elements that function as grenades, proximity mines, etc. When a player holding a gun is in range of an element functioning as a grenade, for example, the gun may vibrate and be unable to fire for ten seconds. If the element functions as a freezing grenade, for example, lights on the gun may turn ice blue.
The updating of functions or routines running on an interactive device according to the invention will now be described with reference to FIG. 11. An interactive device, such as a device as described above, sends out a request for reply by devices in its vicinity 1110. The request for reply may be sent out following detection of movement of the interactive device 1104 through, for example, a tilt sensor, gyroscope, touch button or screen, or the like. Alternatively, the request for reply may be sent out when the interactive device receives a request for reply from another device 1106, periodically in case the previously-detected network mesh has changed somehow, or at a time when a user of the interactive device turns it on, operates a control on the device, sends a command to the device via another connected device, or pushes an over-the-air update, etc. 1108.
Devices in its vicinity respond directly to the interactive device, indicating that they are in its vicinity 1112, and the interactive device and/or the devices in its vicinity are updated with the information regarding the nearby devices, or network nodes 1114. Information regarding the nearby devices may also be sent to the interactive device 1116, either after a request from the interactive device or automatically in response to the initial request. This information may include, for example, unique identification indicia of the specific device (e.g., an identification number uniquely assigned to the particular device) or type of device (e.g., a model number of the device), user information stored in the device, information regarding routines that are currently stored on the device, etc. The interactive device receiving this information may then look up the unique identification indicia of the device or type of device to determine information about the device, such as capabilities, etc. 1118. The interactive device may also, for example, execute a routine shared by it and the nearby device 1128, request an update 1126 from the nearby device based on a routine, skill, or behavior that it does not have in its memory 1124, send an offer of update to the nearby device for a routine, skill, or behavior that the nearby device does not have in its memory, send an offer to the nearby device to participate in a shared routine, etc. Alternatively, the interactive device may have a default routine or functionality that is automatically initiated 1122, allowing basic functionality (e.g., a voice recognition routine is run when sound is detected, outputting programmed responses to recognized queries; blinking lights in response to detection of a specific RFID tag, etc.) when no special shared routines are detected 1120.
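By way of illustration only, the following sketch outlines the FIG. 11 flow described above: a device requests replies from nearby devices, inspects the identification and routine information returned, and either runs a shared routine, requests an update for a routine it lacks, or falls back to a default routine. The message fields, class names, and example device identifiers are assumptions introduced for this example and are not part of the disclosure.

class InteractiveDevice:
    def __init__(self, device_id, model, routines):
        self.device_id = device_id
        self.model = model
        self.routines = set(routines)

    def reply_info(self):
        # Information a nearby device returns after receiving a request for reply:
        # a unique identifier, a model number, and the routines it currently stores.
        return {"id": self.device_id, "model": self.model,
                "routines": set(self.routines)}

    def on_movement_detected(self, neighbours):
        # Triggered, for example, by the tilt sensor: send a request for reply,
        # then react to the information returned by each nearby device.
        replies = [n.reply_info() for n in neighbours]
        if not replies:
            print(f"{self.device_id}: no neighbours found, running default routine")
            return
        for info in replies:
            shared = self.routines & info["routines"]
            missing = info["routines"] - self.routines
            if shared:
                print(f"{self.device_id}: running shared routine "
                      f"'{shared.pop()}' with {info['id']}")
            elif missing:
                routine = missing.pop()
                print(f"{self.device_id}: requesting update '{routine}' from {info['id']}")
                self.routines.add(routine)   # store the received routine in memory

glove = InteractiveDevice("glove-1", "G900", {"blink_lights"})
car = InteractiveDevice("car-7", "C1020", {"blink_lights", "race_mode"})
toy = InteractiveDevice("toy-3", "T55", {"play_song"})
glove.on_movement_detected([car, toy])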
Although the updates mentioned above are described in terms of updates between nearby interactive devices, it is also within the scope of the invention for updates to be sent or received via other devices, such as consoles, telephones, access points, terminals, etc., and they can be sent via wired or wireless connections or through other devices on the mesh network.
The updates described above may be, for example, for specific behaviors of the interactive device to occur immediately (e.g., to immediately play a song), a new skill or behavior to include in future routines (e.g., to change/replace an audio file, such as a character's signature sound, or the ability to play a specific song in addition to other songs already known), a new capability with its current hardware (e.g., the ability to play music), a new routine (e.g., to play an audio file when it detects a device in the future with the same model number as the nearby device), a new application (e.g., to play a game of chess by detecting its and other devices' roles and locations on an RFID game board and scoring accordingly), or an upgrade to the interactive device's operating system.
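By way of illustration only, the following sketch shows how a receiving device might dispatch some of the update kinds listed above. The payload fields and handler logic are assumptions introduced for this example and are not part of the disclosure.

def handle_update(update, device):
    # Dispatch an update payload to the appropriate handling on the device,
    # represented here as a simple dictionary of stored state.
    kind = update["kind"]
    if kind == "immediate_behavior":          # e.g. play a song right now
        device.setdefault("queue", []).append(update["action"])
    elif kind == "new_skill":                 # e.g. a replacement signature sound
        device.setdefault("skills", {})[update["name"]] = update["data"]
    elif kind == "new_routine":               # e.g. react to a given model number
        device.setdefault("routines", {})[update["trigger"]] = update["action"]
    elif kind == "new_application":           # e.g. an RFID chess game
        device.setdefault("applications", []).append(update["name"])
    elif kind == "os_upgrade":                # replace the operating system version
        device["os_version"] = update["version"]
    return device

device = {"os_version": "1.0"}
handle_update({"kind": "new_skill", "name": "signature_sound", "data": "audio-file"}, device)
handle_update({"kind": "os_upgrade", "version": "1.1"}, device)
print(device)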
The embodiments of the invention described herein are illustrative, rather than restrictive. Modifications may be made without departing from the spirit of the invention as defined by the following claims and their equivalents.

Claims
1. A method for interactive environment sensing, the method comprising:
providing a first interactive device comprising:
a shared bus;
a central processing unit coupled to the shared bus;
an inertial monitoring unit coupled to the central processing unit for detecting at least one inertial feature of the following group of inertial features of the first interactive device or its environment: g-force experienced, orientation, strength of a magnetic field, and direction of a magnetic field;
a memory coupled to the central processing unit for receiving and storing position information related to that detected by the inertial monitoring unit;
an energy unit coupled to the shared bus for supplying power to the first interactive device;
a tilt sensor coupled to the central processing unit for sensing a tilt of the first interactive device; and
a transceiver coupled to the central processing unit for sending and receiving the position information wirelessly;
receiving, in the central processing unit, inertial information detected by the inertial monitoring unit and tilt information sensed by the tilt sensor;
analyzing, in the central processing unit, the inertial information and/or the tilt information to determine the position information to be stored in the memory, the position information indicating a position of the first interactive device;
providing a second interactive device in proximity to the first interactive device, the second interactive device having a second transceiver, and a second memory coupled to a second central processing unit, the second memory storing information relating to a position of the second interactive device;
transmitting, via the transceiver, the position information of the first interactive device to the second interactive device;
receiving, via the second transceiver, the position information;
storing the position information of the first interactive device in the second memory;
transmitting, via the second transceiver, the information relating to the position of the second interactive device; and
receiving, via the transceiver of the first interactive device, the information relating to the position of the second interactive device.
2. The method of claim 1, wherein the analyzing occurs when the tilt sensor detects a tilt of the first interactive device.
3. The method of claim 1, wherein the transmitting via the second transceiver occurs when a request for location is received from the first interactive device.
4. The method of claim 3, wherein the request for location is transmitted when a tilt is detected by the tilt sensor.
5. The method of claim 1, wherein the analyzing further comprises analyzing the information relating to the position of the second interactive device to determine the position information.
6. The method of claim 1, wherein the transmitting via the second transceiver further comprises transmitting characteristic information about the second interactive device, the method further comprising storing the characteristic information in the memory.
7. The method of claim 6, wherein the characteristic information includes a unique identifier, an application stored in the second memory, user data relating to a user or user activity of the second interactive device, and/or instructions for potential interactions with the second interactive device.
PCT/IL2013/050609 2012-07-19 2013-07-18 Wireless interactive device system and method WO2014013492A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261673691P 2012-07-19 2012-07-19
US61/673,691 2012-07-19

Publications (1)

Publication Number Publication Date
WO2014013492A1 true WO2014013492A1 (en) 2014-01-23

Family

ID=49948371

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2013/050609 WO2014013492A1 (en) 2012-07-19 2013-07-18 Wireless interactive device system and method

Country Status (1)

Country Link
WO (1) WO2014013492A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2960889A1 (en) * 2014-06-26 2015-12-30 Rovio Entertainment Ltd Enhanced experience for reading, playing, learning, by combining digital media such as audio, video, with physical form such as books, magazine, board games
WO2017006187A3 (en) * 2015-07-07 2017-05-04 Back Nimrod Sided game accessory device
US9886865B2 (en) 2014-06-26 2018-02-06 Rovio Entertainment Ltd. Providing enhanced experience based on device location
DE102018106482B4 (en) * 2018-03-20 2020-02-13 Logic Glas Gmbh Method for controlling the visible surfaces of a hand-movable electronic cube
EP4282498A1 (en) * 2022-05-26 2023-11-29 Kamil PODHOLA Chip board system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1899939A1 (en) * 2005-03-24 2008-03-19 Smalti Technology Limited Manipulable interactive devices
WO2010144390A1 (en) * 2009-06-08 2010-12-16 Cfph, Llc Interprocess communication regarding movement of game devices

Similar Documents

Publication Publication Date Title
US10864440B2 (en) Augmented reality gaming systems and methods
US10639544B2 (en) Gaming system for modular toys
US20210001211A1 (en) Location-based games and augmented reality systems
WO2021012850A1 (en) Prompt information sending method and apparatus in multiplayer online battle program, and terminal
US9884254B2 (en) Augmented reality gaming systems and methods
US10561950B2 (en) Mutually attachable physical pieces of multiple states transforming digital characters and vehicles
US8332544B1 (en) Systems, methods, and devices for assisting play
US20160287979A1 (en) A Modular Connected Game Board System and Methods of Use
JP6196378B2 (en) Method, apparatus and computer readable medium for providing a dart game play mode
JP2018511020A (en) Apparatus for providing dart game play mode to play with virtual player and computer program stored in computer readable medium
WO2014013492A1 (en) Wireless interactive device system and method
US11766607B2 (en) Portal device and cooperating video game machine
EP3251732A1 (en) Server and dart game device for providing dart game in accordance with hitting area on basis of location of dart pin, and computer program
KR20160076158A (en) Toy and method for providing game and computer program
JP2012019494A (en) Self-position utilization system
KR101473422B1 (en) Toy driving system and application for controlling toy
WO2021181851A1 (en) Information processing device, method, and program
JP6901541B2 (en) Game programs, computers, and game systems
Zhao How augmented reality redefines interaction design in mobile games
WO2016203097A1 (en) Combining physical and digital playing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13820459; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 13820459; Country of ref document: EP; Kind code of ref document: A1)