US10569165B2 - Tactile feedback systems and methods for augmented reality and virtual reality systems - Google Patents

Tactile feedback systems and methods for augmented reality and virtual reality systems

Info

Publication number
US10569165B2
US10569165B2
Authority
US
United States
Prior art keywords
weapon
gun
virtual reality
tactile feedback
microcontroller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/241,007
Other versions
US20180050268A1 (en)
Inventor
Allan Jones
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Activision Publishing Inc
Original Assignee
Activision Publishing Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Activision Publishing Inc
Priority to US15/241,007
Assigned to ACTIVISION PUBLISHING, INC. Assignment of assignors interest (see document for details). Assignors: JONES, ALLAN
Publication of US20180050268A1
Application granted
Publication of US10569165B2
Legal status: Active
Adjusted expiration

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/25 Output arrangements for video game devices
    • A63F 13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F 13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/533 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 Shooting of targets
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8076 Shooting
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8082 Virtual reality

Definitions

  • the present invention relates generally to augmented reality and virtual reality systems, and more particularly, to tactile feedback systems and methods for augmented and virtual reality systems.
  • Virtual reality is a computer technology that replicates an environment, which may be a simulation of a real life environment or an imaginary environment, and simulates a user's presence in the environment allowing the user to interact with the environment.
  • Current forms of virtual reality are displayed on a computer display or with a special virtual reality headset worn over the user's eyes, which provides visual and sound simulations of the virtual reality experience.
  • Some simulations also include additional sensory input, such as haptic/tactile feedback to the user to simulate various interactions with the virtual reality.
  • the user may interact with the virtual reality using standard computer input devices such as a keyboard and mouse, and also through multimodal devices such as gloves or other wearables having sensors to detect motion and forces, and/or external motion sensors which detect a user's position and motions.
  • Augmented reality is similar to virtual reality in that it involves a computerized simulation, but differs in that augmented reality utilizes a real-world environment in which certain elements are augmented and/or supplemented by a computer-generated simulation of sensory input such as video, sound, graphics, and/or tactile feedback.
  • the computer-generated simulation may be overlaid onto a computer reproduction of the real-world environment.
  • virtual reality shall mean either, or both, virtual reality and augmented reality, as their differences do not affect the present invention.
  • tactile feedback devices also referred to as haptic devices
  • Tactile feedback devices apply forces, vibrations or motions to a user in order to provide tactile sensory input in virtual reality to a user, such as to simulate the sense of touch or to provide other feedback of experiences within the virtual reality, such as crashing, being shot, loading or shooting a gun, or colliding with something.
  • the present invention is directed to an innovative tactile feedback system for use with a virtual reality headset which utilizes a plurality of tactile feedback devices to enhance the immersion into the virtual reality experience and provide entertaining and useful tactile sensory perception to a user within the virtual reality.
  • in one embodiment, a multi-directional, tactile feedback system for use with a virtual reality headset being used to present a virtual reality experience is provided.
  • the tactile feedback system has a support structure, a plurality of tactile feedback units mounted to the support structure in spaced apart relation, and a microcontroller coupled to the support structure.
  • the support structure may be the housing and/or other structure of the headset or a separate wearable device such as a headband, helmet, cap, other head mounting structure, etc.
  • Each of the tactile feedback units is operably coupled to the microcontroller and is configured to produce a tactile feedback to a user, and includes a transducer which can apply forces, vibrations or motions to the user thereby providing tactile sensory input to the user.
  • the tactile feedback units may be positioned in an angularly spaced array or other pattern around the user's body, such as around the user's head.
  • the microcontroller is configured to utilize control signals generated during the operation of the headset to control the operation of the tactile feedback units.
  • the headset is presenting a virtual reality experience, such as playing a game or showing a video
  • the game or video may generate control signals which the microcontroller utilizes to control the operation of the tactile feedback units.
  • the tactile feedback system may be integrated into the headset such that the microcontroller is integrated into the headset, in which case the microcontroller may be integrated with the electronics of the headset or a separate electronic system operably coupled to the headset.
  • the microcontroller may be a separate system having a communication interface configured to electronically communicate with the headset, for instance a wireless communication interface, such as Bluetooth, Wi-Fi, HaLow or other future wireless standards, or a wired communication interface, such as USB.
  • the microcontroller receives the control signals from the headset via the communication interface.
  • the microcontroller is configured to operate the tactile feedback units in response to the control signals to indicate a directional aspect of the virtual reality experience.
  • the microcontroller may activate a specific tactile feedback unit that is angularly located to correspond to the angular location of an event occurring in the virtual reality experience, such as the location of a hit (e.g., hit by a shot from a weapon, or punched by another player) on the user's persona within the virtual reality experience.
  • the microcontroller may activate a tactile feedback unit located on the right side of the angularly spaced array.
  • the control signals may include directional data related to the directional aspect of the virtual reality experience being signaled using the tactile feedback system.
  • the tactile feedback system may be configured to provide a number of use cases.
  • the tactile feedback system may be configured to provide a threat detection warning signal to the user to indicate that the user is being threatened within the virtual reality experience. This is similar to a “sixth sense” or “Spidey sense.”
  • the threat detection warning signal can be a general threat detection warning signal, or a specific threat detection warning signal such as a directional threat detection warning signal by using the directional aspect described above.
  • Other specific threat detection warning signals include a severity threat detection warning signal to indicate a severity aspect of the virtual reality experience, and an imminence/urgency threat detection warning signal to indicate an urgency aspect of the virtual reality experience.
  • the microcontroller may activate the tactile feedback system with a level of force corresponding to a severity level occurring in the virtual reality experience (e.g., a fender bender versus a head-on crash).
  • the microcontroller may activate the tactile feedback system in a pattern or other manner corresponding to a degree of urgency occurring in the virtual reality experience. For example, if an enemy is approaching from far away, the tactile feedback units may vibrate slowly or be activated in a slow-rotating pattern, and as the enemy approaches closer, the vibrations and/or the speed of the pattern activation increase.
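  • As a minimal illustrative sketch (not taken from the patent), the urgency scaling described above could map an approaching enemy's distance to a pulse period on the microcontroller. The set_unit_intensity() routine and the 5-50 m / 100-1000 ms ranges are assumptions standing in for a real motor-driver interface and tuning:

        #include <stdio.h>

        /* Hypothetical hardware hook: drive one vibration motor at a duty 0-255. */
        static void set_unit_intensity(int unit, int duty) {
            printf("unit %d -> duty %d\n", unit, duty);
        }

        /* Map an enemy's distance to a pulse period so the warning speeds up
           as the enemy closes in; the range and bounds are illustrative values. */
        static int pulse_period_ms(float distance_m) {
            if (distance_m > 50.0f) distance_m = 50.0f;
            if (distance_m < 5.0f)  distance_m = 5.0f;
            return (int)(distance_m * 20.0f);   /* 100 ms (close) .. 1000 ms (far) */
        }

        int main(void) {
            for (float d = 50.0f; d >= 5.0f; d -= 15.0f) {
                set_unit_intensity(0, 180);     /* in firmware: pulse, wait, repeat */
                printf("distance %.0f m -> period %d ms\n", d, pulse_period_ms(d));
            }
            return 0;
        }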
  • the tactile feedback system may be configured to indicate that a user has been killed in a game, eliminated from a game, or that the user's game has otherwise ended.
  • the microcontroller may be configured to activate all of the tactile feedback units at the same time and/or in a certain manner in response to a control signal corresponding to such an event.
  • the tactile feedback system may be configured to signal to a user the direction and/or intensity of damage to the user within the virtual reality experience.
  • the microcontroller may be configured to activate particular tactile feedback unit(s) in the array related to the direction of damage and/or to adjust the intensity of the activation of such tactile feedback units based on the control signals.
  • the tactile feedback system may also be configured to signal to the user other aspects of a virtual reality experience, such as vibration thrusters, various other warnings such as low ammunition, weapon re-loading, weapon upgrading, etc.
  • Another embodiment of the present invention is directed to methods of providing tactile feedback with a virtual reality system being used to present a virtual reality experience.
  • the methods may include the methods of using the tactile feedback system described herein, including the additional aspects, features and use cases.
  • Still another embodiment of the present invention is directed to a haptic, toy gun game controller for use with a virtual reality headset for presenting a virtual reality experience, such as playing a virtual reality game.
  • the haptic gun may be shaped and configured as any type of gun, such as a handgun, rifle, shotgun, machine gun, laser gun, BB gun, paintball gun, pellet gun, light machine gun (“LMG”), etc.
  • the haptic gun includes a main body having a handle. The body may be in the shape of the type of gun.
  • a microcontroller having a processor is housed in the main body.
  • a first communication interface is also housed in the main body. The first communication interface is operably coupled to the microcontroller and is configured to electronically communicate with the headset.
  • the first communication interface may be any suitable communication interface, including those described herein.
  • the haptic gun has a trigger coupled to the main body, and operably coupled to the microcontroller.
  • a plurality of tactile feedback units are coupled to the main body and are spaced apart around the body. Each of the tactile feedback units is operably coupled to the microcontroller.
  • the tactile feedback units are configured to produce a haptic feedback to a user, such as the tactile feedback units described above.
  • a linear actuator is coupled to the main body and operably coupled to the microcontroller.
  • the linear actuator is configured to provide a linear force simulating a recoil from firing the haptic gun.
  • the haptic gun also has a tracker device housed within the main body.
  • the tracker device is configured to provide tracking data to the virtual reality system (e.g., a virtual reality headset).
  • the tracker device may include accelerometer(s), and/or other sensors to detect the motion and orientation of the haptic gun.
  • the tracker device may have a tracker device communication interface, such as a wireless interface or wired interface for communicating with the headset. In other aspects, the tracker device may be integrated with the microcontroller and/or the first communication interface.
  • the microcontroller of the haptic gun is configured to electronically communicate with the headset via the communication interface to receive control signals from the headset for controlling operation of the tactile feedback units and the linear actuator.
  • the microcontroller operates the tactile feedback units and the linear actuator based on the control signals.
  • the microcontroller also sends a trigger signal to the headset in response to actuation of the trigger by the user.
  • the user is utilizing the haptic handgun with a virtual reality headset and is playing a virtual reality game.
  • the user sees a target (e.g., an enemy) to shoot at.
  • the user moves and aims the haptic gun.
  • the tracker device senses the motion and aim point of the gun and provides tracking data to the microcontroller and sends the tracking data to the headset via the tracker device communication interface.
  • the virtual reality headset may be configured to utilize the tracking data to control the image of a gun within the virtual reality game which corresponds to the haptic gun. For instance, as the user moves and aims the haptic gun, the virtual reality headset may move and aim the gun in the virtual reality game accordingly.
  • When the user actuates the trigger, the microcontroller detects that the trigger has been actuated and sends a trigger signal to the headset. The headset then sends control signals to the microcontroller which the microcontroller uses to control the operation of the tactile feedback units and the linear actuator.
  • the haptic gun may also be configured to provide various use cases for controlling the tactile feedback units and the linear actuator based on the control signals received from the headset.
  • the microcontroller may receive a control signal from the headset indicating whether to fire in response to sending a trigger signal to the headset.
  • the headset receives the trigger signal and determines whether the corresponding gun in the virtual reality game can be fired (e.g., is there ammunition? is the gun damaged or otherwise prevented from firing?). If the headset determines the gun can be fired, the headset sends a “fire gun” control signal to the microcontroller.
  • the microcontroller is configured to activate the linear actuator to simulate a recoil from firing the gun.
  • the headset can send a “do not fire” control signal, or no control signal at all.
  • the microcontroller may be configured not to activate the linear actuator, and/or to activate one or more of the tactile feedback units to simulate a hammer or other component of the gun actuating when the gun fails to fire upon pulling the trigger.
  • the headset may be configured to send a weapon upgrade control signal, a weapon damage control signal, or other control signals.
  • the microcontroller is configured to respond to such signals by activating the linear actuator and/or tactile feedback units in a certain manner.
  • Still another embodiment of the present invention is directed to a video game system for providing a single game play instance in which multiple clients can play, each client utilizing a different game platform.
  • the video game system includes a first game platform executing a virtual reality game on a virtual reality system.
  • the virtual reality system may be a virtual reality headset, as described herein, or other virtual reality system.
  • the video game system includes a second game platform executing a game having a similar representation of the virtual reality game but modified for the second game platform.
  • the first game platform and second game platform are configured to communicate with each other to provide a single instance game space for the first game platform and second game platform.
  • the first game platform and second game platform may be configured to communicate over a communication network comprising the Internet.
  • the first game platform may be linked to a third game platform which displays a representation of the virtual reality game on a video display as the game is being played on the first game platform.
  • FIG. 1 is a top view of a multi-directional, tactile feedback system for use with a virtual reality system, according to one embodiment of the present invention;
  • FIG. 2 is a rear view of the tactile feedback system of FIG. 1 ;
  • FIG. 3 is a front view of the tactile feedback system of FIG. 1 , as worn on the head of a user;
  • FIG. 4 is a rear view of the tactile feedback system of FIG. 1 , as worn on the head of a user;
  • FIG. 5 is a schematic view of the tactile feedback system of FIG. 1 ;
  • FIG. 6 is a schematic view of the tactile feedback system of FIG. 1 in combination with a virtual reality headset for presenting a virtual reality experience;
  • FIG. 7 is a diagram depicting a directional damage scenario being signaled by the tactile feedback system of FIG. 1 ;
  • FIG. 8 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the directional damage scenario of FIG. 7 , according to another embodiment of the present invention;
  • FIG. 9 is a schematic depicting a direction and damage intensity being signaled by the tactile feedback system of FIG. 1 ;
  • FIG. 10 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the direction and damage intensity scenario of FIG. 9 , according to another embodiment of the present invention.
  • FIG. 11 is a schematic depicting a general threat detection warning scenario being signaled by the tactile feedback system of FIG. 1 ;
  • FIG. 12 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the general threat detection warning scenario of FIG. 11 , according to another embodiment of the present invention;
  • FIG. 13 is a schematic depicting a directional threat detection warning scenario being signaled by the tactile feedback system of FIG. 1 ;
  • FIG. 14 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the directional threat detection warning scenario, according to another embodiment of the present invention;
  • FIG. 15 is a side view of a haptic gun, according to another embodiment of the present invention.
  • FIG. 16 is a side, cut-away view of the haptic gun of FIG. 15 ;
  • FIG. 17 is a high-level schematic view of the electronic system for the haptic gun of FIG. 15 , according to another embodiment of the present invention.
  • FIG. 18 is a schematic view of the electronic system for the linear actuator of the haptic gun of FIG. 15 , according to another embodiment of the present invention.
  • FIG. 19 is a flow chart illustrating a method of operation of the haptic gun of FIG. 15 , according to another embodiment of the present invention.
  • FIG. 20 is a flow chart illustrating a method of operation of the haptic gun of FIG. 15 for a trigger actuation scenario, according to another embodiment of the present invention.
  • FIG. 21 is a schematic view of a virtual reality system for providing a single game instance for multiple clients each utilizing a different game platform.
  • Referring to FIGS. 1-6 , one embodiment of a multi-directional, tactile feedback system 10 for use with a virtual reality system 30 (see FIG. 6 , in which the virtual reality system 30 is a virtual reality headset 30 ) configured to present a virtual reality experience to a user is illustrated.
  • the virtual reality system 30 may be any suitable system for presenting a virtual reality experience, such as a virtual reality headset, goggles, helmet, video display(s), etc.
  • the virtual reality system 30 includes a processor (e.g., a microprocessor), a software program configured to program the virtual reality system to present the virtual reality experience, a display device (e.g., video display(s)), and a system communication interface.
  • the virtual reality experience can be any type of entertainment or informational program, such as a video game, a movie, a video, etc.
  • the term “virtual reality” as used herein includes either, or both, virtual reality and augmented reality.
  • the system communication interface may be any suitable communication interface for communicating with the tactile feedback system 10 and/or other electronic systems.
  • the interface can be a wireless communication interface, such as Bluetooth or Wi-Fi, or a wired communication interface, such as USB.
  • the tactile feedback system 10 includes a support structure 12 , which in this case is a headband 12 configured to be worn on the head 14 of a user 11 (see FIGS. 3 and 4 ).
  • the support structure 12 may be integrated with a virtual reality system 30 (e.g., integrated with a virtual reality headset), or it may be a separate wearable device such as the headband 12 , or other wearable device such as a hat, helmet, goggles, glasses, etc.
  • the tactile feedback system 10 is a separate wearable device which communicates with a virtual reality system 30 , such as the virtual reality headset 30 as shown in FIG. 6 .
  • a plurality of tactile feedback units 16 are mounted to the headband 12 angularly spaced apart around the headband 12 .
  • Each of the tactile feedback units 16 is a tactile feedback device which can apply forces, vibrations, or motions which provide tactile sensory input to the user 11 .
  • the tactile feedback units 16 may comprise a vibration motor for applying a vibration feedback to the user 11 , such as a 10 mm, 3-4.5 volt vibration motor.
  • Each of the tactile feedback units 16 is operably coupled to a microcontroller 20 (described below) such that each tactile feedback unit 16 can be operated independently of the other tactile feedback units 16 .
  • the feedback system 10 has eight tactile feedback units 16 which are angularly spaced at about 45 degrees from each other.
  • tactile feedback units 16 may be spaced about equal angles around the headband, or they may be spaced unequally, or even spaced in multiple groups.
  • the feedback system 10 may have from 1 to 30, or more, tactile feedback units 16 , such as four units 16 (e.g., spaced at about 90 degrees), five units 16 (e.g., spaced at about 72 degrees), six units 16 (e.g., spaced at about 60 degrees), ten units 16 (e.g., spaced at about 36 degrees), etc.
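  • For N equally spaced units, the mapping from an event angle to the nearest tactile feedback unit reduces to rounding the angle to the nearest multiple of 360/N degrees. A sketch of that arithmetic follows, assuming unit 0 sits at the front of the headband and indices increase clockwise (a convention the patent does not specify):

        #include <math.h>
        #include <stdio.h>

        /* Map an event angle (degrees, 0 = straight ahead, increasing clockwise)
           to the index of the nearest of n equally spaced tactile feedback units. */
        static int nearest_unit(double angle_deg, int n) {
            double step = 360.0 / n;
            double a = fmod(fmod(angle_deg, 360.0) + 360.0, 360.0); /* normalize to [0, 360) */
            return (int)lround(a / step) % n;
        }

        int main(void) {
            /* With 8 units at 45-degree spacing, a hit from 300 degrees
               (front/left) maps to unit 7. */
            printf("%d\n", nearest_unit(300.0, 8));
            return 0;
        }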
  • the control module 18 includes an electronic circuit 26 , such as a printed circuit board or integrated circuit, operably connecting the microcontroller 20 , the power supply 24 , and the user input controls 22 .
  • the control module 18 also operably connects each of the tactile feedback units 16 to the microcontroller 20 and the power supply 24 in order to control and power the actuation of the tactile feedback units 16 .
  • the control module 18 also has a USB communication interface 28 , which may be integrated into the microcontroller 20 , or separate from the microcontroller 20 and operably coupled to the microcontroller 20 .
  • the USB communication interface 28 is configured to provide electronic communication between the microcontroller 20 and the virtual reality system 30 .
  • the tactile feedback system 10 also has a wireless communication interface 32 , which may be integrated into the microcontroller 20 , or separate from the microcontroller 20 and operably coupled to the microcontroller 20 .
  • the wireless communication interface 32 is configured to provide electronic communication between the microcontroller 20 and the virtual reality system 30 .
  • the tactile feedback system 10 may be operably connected to the virtual reality system 30 using either, or both, of the USB communication interface 28 and the wireless communication interface 32 , to receive control signals from the virtual reality system 30 which the microcontroller 20 uses to control the operation of the tactile feedback units 16 .
  • the user input controls 22 are operably coupled to the microcontroller 20 for adjusting the baseline activation intensity (e.g., vibration intensity) of the tactile feedback system 10 .
  • the baseline activation intensity is the reference intensity from which the microcontroller 20 scales the different levels of intensity at which the tactile feedback units 16 are activated for various tactile signals to the user, as further described below.
  • the user input controls 22 may comprise a pair of push buttons operably coupled to the microcontroller 20 , or any other suitable controls for providing user input to the microcontroller 20 to adjust the baseline activation intensity.
  • the microcontroller 20 is configured to adjust the baseline activation intensity at which the microcontroller 20 activates the tactile feedback units 16 in response to the user input controls 22 .
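  • A sketch of how the two-button baseline adjustment might look in firmware; the button-polling hooks and the 0-255 scale are assumptions, not details from the patent:

        #include <stdio.h>

        static int baseline = 128;                     /* assumed default, 0-255 scale */

        /* Hypothetical input hooks: return nonzero while a button is pressed. */
        static int up_button_pressed(void)   { return 1; }
        static int down_button_pressed(void) { return 0; }

        /* Called periodically: nudge the baseline in response to the buttons. */
        static void poll_intensity_buttons(void) {
            if (up_button_pressed()   && baseline <= 247) baseline += 8;
            if (down_button_pressed() && baseline >= 8)   baseline -= 8;
        }

        /* Scale a commanded signal level (0-255) by the user-set baseline so
           every tactile signal respects the user's comfort setting. */
        static int scaled_intensity(int level) {
            return (level * baseline) / 255;
        }

        int main(void) {
            poll_intensity_buttons();
            printf("level 200 scales to %d\n", scaled_intensity(200));
            return 0;
        }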
  • Referring to FIGS. 7 and 8 , a diagram and a flow chart depict the tactile feedback system 10 configured to perform a method 100 for signaling a directional damage scenario.
  • the tactile feedback system 10 is configured to activate the tactile feedback unit 16 a to indicate to the user 11 the approximate direction of damage sustained by the user 11 in the virtual reality experience.
  • the user 11 is damaged (e.g., shot) in the front/left, such that the tactile feedback system 10 activates the tactile feedback unit 16 a located on the front/left of the tactile feedback system 10 .
  • the user 11 turns to his left in response to the activation of the tactile feedback unit 16 a.
  • the method 100 comprises a step 102 , in which an event occurs at an angular location relative to the user 11 in the virtual reality experience being executed on the virtual reality system 30 .
  • the event is the user 11 being shot in the front/left of the user 11 .
  • the virtual reality system 30 sends a direction control signal to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32 .
  • the direction control signal indicates the angular direction of the event.
  • the control signals may be coded and the microcontroller 20 is configured to relate the coded signal to an angular direction.
  • the microcontroller 20 processes the direction control signal to determine the tactile feedback unit 16 a corresponding to the angular location of the event.
  • the corresponding tactile feedback unit 16 may be the tactile feedback unit which best represents the direction of the event.
  • the microcontroller 20 activates the tactile feedback unit 16 a based on the direction control signal corresponding to the direction of the event relative to the location of the user 11 in the virtual reality experience.
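  • A sketch of the method-100 handler on the microcontroller side. Because the patent leaves the coding of the direction control signal unspecified, the one-byte sector encoding below is an assumption, as is the set_unit_intensity() hook:

        #include <stdio.h>

        #define NUM_UNITS 8

        /* Hypothetical hardware hook. */
        static void set_unit_intensity(int unit, int duty) {
            printf("activate unit %d at duty %d\n", unit, duty);
        }

        /* Assumed wire format: the direction control signal carries the event
           angle quantized into NUM_UNITS sectors, 0 = front, clockwise. */
        static void handle_direction_signal(unsigned char coded_sector) {
            int unit = coded_sector % NUM_UNITS;  /* step 106: relate code to unit */
            set_unit_intensity(unit, 200);        /* step 108: signal the direction */
        }

        int main(void) {
            handle_direction_signal(7);           /* hit from the front/left */
            return 0;
        }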
  • FIGS. 9 and 10 illustrate another use case for the tactile feedback system 10 performing a method 120 to signal to the user the direction of damage and damage intensity incurred by the user 11 in the virtual reality experience.
  • the angular location of the damage is determined in substantially the same way as described above for the method 100 .
  • instead of activating just the one tactile feedback unit 16 a , the microcontroller 20 also activates one or more adjacent tactile feedback units 16 b and 16 c at a lower intensity than it activates the main tactile feedback unit 16 a .
  • the number of adjacent tactile feedback units 16 and the intensity of the activation of the adjacent tactile feedback units 16 depend on the level of intensity of the damage. For instance, for minimal damage, perhaps no adjacent tactile feedback units 16 are activated.
  • for damage above a first threshold and below a second threshold, perhaps only the two adjacent tactile feedback units 16 b and 16 c are activated at a very low intensity.
  • as the damage increases toward the second threshold, the intensity of the two adjacent tactile feedback units 16 b and 16 c may be increased, but still kept lower than the intensity of the main tactile feedback unit 16 a in order to still effectively signal the direction of the damage.
  • once the second threshold is reached, three or four adjacent tactile feedback units 16 may be activated, with increasing intensity based on an increase in the intensity of damage, and so on as the intensity of the damage increases.
  • the method 120 comprises step 122 , in which damage occurs to the user 11 in the virtual reality experience being executed on the virtual reality system 30 .
  • the damage has an associated direction of damage and intensity of damage.
  • the virtual reality system 30 sends a damage direction and damage intensity control signal (this may be a single signal or a separate signal for damage direction and damage intensity) to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32 , indicating the direction of the damage and the intensity of the damage.
  • the microcontroller 20 processes the damage direction and damage intensity control signal(s) to determine the main tactile feedback unit 16 a corresponding to the angular location of the event and the adjacent tactile feedback unit(s) 16 b and 16 c to be activated and their respective intensity. Then, at step 128 , the microcontroller 20 activates the tactile feedback units 16 a , 16 b and 16 c based on the control signal to indicate the direction and intensity of the damage occurring in the virtual reality experience.
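  • A sketch of the threshold logic of method 120; the threshold values (64, 160 on a 0-255 damage scale) and the halving falloff for neighbors are illustrative assumptions:

        #include <stdio.h>

        #define NUM_UNITS 8

        static void set_unit_intensity(int unit, int duty) {  /* hypothetical hook */
            printf("unit %d -> duty %d\n", unit, duty);
        }

        /* Activate the main unit plus a damage-dependent number of neighbors
           at reduced intensity, so the direction remains unambiguous. */
        static void signal_damage(int main_unit, int damage /* 0-255 */) {
            int spread = 0;                 /* minimal damage: main unit only   */
            if (damage >= 64)  spread = 1;  /* first threshold: 2 neighbors     */
            if (damage >= 160) spread = 2;  /* second threshold: 4 neighbors    */
            set_unit_intensity(main_unit, damage);
            for (int d = 1; d <= spread; d++) {
                int falloff = damage >> d;  /* neighbors weaker than main unit  */
                set_unit_intensity((main_unit + d) % NUM_UNITS, falloff);
                set_unit_intensity((main_unit - d + NUM_UNITS) % NUM_UNITS, falloff);
            }
        }

        int main(void) {
            signal_damage(7, 200);          /* heavy hit from the front/left */
            return 0;
        }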
  • Referring to FIGS. 11 and 12 , a diagram and a flow chart depict the tactile feedback system 10 configured to perform a method 130 for signaling a general threat detection warning.
  • a general threat detection warning means it is non-directional, i.e., it does not indicate to the user a direction of a threat (e.g., potential danger) to the user occurring in the virtual reality experience.
  • a general threat detection warning also does not necessarily indicate other specific characteristics of the threat, such as imminence/urgency or severity.
  • the arrow 131 shows that the threat detection warning signal may be performed by sequentially activating each of the tactile feedback units 16 (individually or in groups) for a short duration or pulse in a rotational sequence around the array.
  • the arrow 131 shows the tactile feedback units 16 being activated in a counter-clockwise sequence, but the sequence could also be clockwise, or a counter-clockwise sequence could signal a first type of threat while a clockwise sequence could signal a second type of threat.
  • the method 130 comprises step 132 in which a threat detection warning event occurs in the virtual reality experience being executed on the virtual reality system 30 .
  • the virtual reality system 30 sends a general threat detection warning control signal to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32 .
  • the microcontroller 20 processes the general threat detection warning control signal to determine the sequence for activating the tactile feedback units 16 corresponding to the control signal.
  • the microcontroller 20 activates the tactile feedback units 16 (e.g., in a sequence) to signal a general threat detection warning, as described above.
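  • A sketch of the rotational sweep of method 130, with the sweep direction selecting between two warning types as suggested above; the timing and intensity values, and the delay_ms()/set_unit_intensity() hooks, are assumptions:

        #include <stdio.h>

        #define NUM_UNITS 8

        static void set_unit_intensity(int unit, int duty) {  /* hypothetical hook */
            printf("unit %d -> duty %d\n", unit, duty);
        }
        static void delay_ms(int ms) { (void)ms; }            /* firmware stub */

        /* General (non-directional) threat warning: pulse each unit briefly in
           rotation. dir = +1 sweeps clockwise, dir = -1 counter-clockwise,
           which could distinguish two warning types. */
        static void threat_sweep(int dir, int laps) {
            int unit = 0;
            for (int i = 0; i < laps * NUM_UNITS; i++) {
                set_unit_intensity(unit, 180);
                delay_ms(60);
                set_unit_intensity(unit, 0);
                unit = (unit + dir + NUM_UNITS) % NUM_UNITS;
            }
        }

        int main(void) {
            threat_sweep(-1, 2);   /* two counter-clockwise laps: first threat type */
            return 0;
        }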
  • a directional threat detection warning indicates both that there is a threat (e.g., a potential danger) and the direction of the threat event relative to the user 11 occurring in the virtual reality experience.
  • the tactile feedback system 10 is configured to activate the tactile feedback unit 16 a to indicate to the user 11 the approximate angular direction of the threat to the user 11 in the virtual reality experience.
  • the microcontroller 20 may be configured to activate the tactile feedback unit 16 a by oscillating or pulsing (i.e., quickly activating and de-activating the tactile feedback unit 16 a multiple times).
  • the threat is located in the front/left of the user 11 , such that the tactile feedback system 10 activates the tactile feedback unit 16 a located on the front/left of the tactile feedback system 10 .
  • a similar method to method 140 may also be utilized to provide a directional signal to a user for events, items or situations other than threats, such as for providing a directional indicator to signal an event, item or situation in a particular direction within the virtual reality experience.
  • the method 140 comprises step 142 in which a threat detection warning event occurs at an angular location relative to the user 11 in the virtual reality experience being executed on the virtual reality system 30 .
  • the virtual reality system 30 sends a directional threat detection warning control signal to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32 .
  • the directional threat detection warning control signal indicates the angular direction of the threat event.
  • the microcontroller 20 processes the directional threat detection warning control signal to determine the tactile feedback unit 16 corresponding to the angular location of the threat event.
  • the microcontroller 20 activates the tactile feedback unit 16 a corresponding to the direction of the threat event relative to the user 11 in the virtual reality experience by, e.g., oscillating or pulsing the tactile feedback unit 16 a , as described above.
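  • A sketch of the pulsed activation of method 140, which reads as a warning rather than a hit because the unit is cycled on and off; the pulse count and timing are illustrative assumptions:

        #include <stdio.h>

        static void set_unit_intensity(int unit, int duty) {  /* hypothetical hook */
            printf("unit %d -> duty %d\n", unit, duty);
        }
        static void delay_ms(int ms) { (void)ms; }            /* firmware stub */

        /* Directional threat warning: pulse one unit several times. */
        static void pulse_unit(int unit, int pulses) {
            for (int i = 0; i < pulses; i++) {
                set_unit_intensity(unit, 220);
                delay_ms(80);
                set_unit_intensity(unit, 0);
                delay_ms(80);
            }
        }

        int main(void) {
            pulse_unit(7, 3);   /* threat approaching from the front/left */
            return 0;
        }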
  • Referring to FIGS. 15-18 , another embodiment of the present invention is directed to a haptic, toy gun game controller 150 for use with a virtual reality system 30 .
  • the virtual reality system 30 may be the same, or similar to the virtual reality system described above for the tactile feedback system 10 .
  • the haptic gun 150 is shaped like a handgun and has a main body 152 having a handle 154 .
  • while the gun shown in FIGS. 15-16 is in the shape of a handgun, the gun may be shaped and configured as any suitable type of gun, including without limitation, a handgun, rifle, shotgun, machine gun, laser gun, BB gun, paintball gun, pellet gun, light machine gun, replica of an actual gun, or any made-up gun.
  • the body 152 is in the shape of the desired gun.
  • a plurality of tactile feedback units 16 are coupled to the main body 152 , such as being housed in the main body 152 .
  • the tactile feedback units 16 may be the same as, or similar to, the tactile feedback units 16 described above with respect to the tactile feedback system 10 .
  • the tactile feedback units 16 are located spaced apart within the main body 152 , such as a tactile feedback unit 16 located in the handle 154 , a tactile feedback unit 16 located in the middle of the main body 152 , and a tactile feedback unit 16 located near the front end of the main body 152 (e.g., proximate the end of the barrel of the gun 150 ). More or fewer tactile feedback units 16 may be utilized, depending on the desired effects to be produced by the tactile feedback units 16 .
  • a linear actuator 156 is also coupled to the main body 152 , in this case housed within the main body 152 .
  • the linear actuator 156 is configured to be activated to provide a linear force simulating a recoil from firing the haptic gun 150 .
  • the linear actuator 156 may also provide a linear force in the opposite direction when the linear actuator is reset which can simulate the gun 150 loading another round of ammunition.
  • a microcontroller 20 having a processor, a tracker device 158 and a power source 160 are also housed in the main body 152 .
  • the microcontroller 20 may be the same as, or similar to, the microcontroller 20 described above with respect to the tactile feedback system 10 .
  • the haptic gun 150 has a gun communication interface 166 which may be integrated into the microcontroller 20 , or may be a separate component operably coupled to the microcontroller 20 .
  • the gun communication interface 166 may be, for example, a wired communication interface such as USB, or a wireless communication interface such as Bluetooth or Wi-Fi, and is configured to provide electronic communication between the microcontroller 20 and system communication interface of the virtual reality system 30 .
  • Each of the tactile feedback units 16 and the linear actuator 156 are operably coupled to the microcontroller 20 such that each of the tactile feedback units 16 and the linear actuator 156 may be activated independently of each other.
  • a trigger 162 is also coupled to the main body 152 .
  • the trigger 162 is configured to be actuated by pulling the trigger with a finger of a user.
  • the trigger 162 is also operably coupled to the microcontroller 20 .
  • the tracker device 158 is configured to provide tracking data to the virtual reality system 30 .
  • the tracker device 158 has a tracker communication interface, such as a wireless communication interface or a USB communication interface, for communicating the tracking data to the system communication interface of the virtual reality system 30 .
  • the tracker device 158 may have accelerometer(s), and/or other sensors to detect the motion, location, and/or orientation of the haptic gun 150 .
  • Referring to FIG. 17 , a high-level schematic of the electronic system of the haptic gun 150 is shown. It is understood that the schematic of FIG. 17 does not include all of the components of the electronic system (e.g., transistors, diodes, resistors, etc.), but one of ordinary skill in the art is enabled to incorporate the high-level schematic into an operable electronic system.
  • the electronic system includes an electronic circuit 164 operably interconnecting the microcontroller 20 , the power source 160 , the tracker device 158 , the tactile feedback units 16 and the linear actuator 156 .
  • Referring to FIG. 18 , a schematic is shown for an exemplary dual power relay 168 for powering the linear actuator 156 .
  • the dual power relay 168 includes two 5 volt power relays 170 which are operably coupled to the microcontroller 20 and to the linear actuator 156 .
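  • A sketch of how the microcontroller might drive the two relays of FIG. 18; the pin numbers, the gpio_write() hook, and the extend/reset wiring split are assumptions, since the full circuit is not reproduced here:

        #include <stdio.h>

        /* Hypothetical GPIO hook standing in for the microcontroller's pin API. */
        static void gpio_write(int pin, int level) {
            printf("pin %d -> %d\n", pin, level);
        }

        /* Assumed wiring: one relay powers the recoil (extend) stroke, the
           other powers the return (reset) stroke; never both at once. */
        enum { RELAY_EXTEND_PIN = 4, RELAY_RESET_PIN = 5 };

        static void actuator_extend(void) {   /* recoil stroke */
            gpio_write(RELAY_RESET_PIN, 0);
            gpio_write(RELAY_EXTEND_PIN, 1);
        }

        static void actuator_reset(void) {    /* return stroke, which can
                                                 simulate loading the next round */
            gpio_write(RELAY_EXTEND_PIN, 0);
            gpio_write(RELAY_RESET_PIN, 1);
        }

        static void actuator_off(void) {
            gpio_write(RELAY_EXTEND_PIN, 0);
            gpio_write(RELAY_RESET_PIN, 0);
        }

        int main(void) {
            actuator_extend();   /* in firmware: hold briefly, then reset */
            actuator_reset();
            actuator_off();
            return 0;
        }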
  • Referring to FIG. 19 , a method 200 for operating the haptic gun 150 while using the haptic gun 150 with a virtual reality system 30 is shown.
  • the haptic gun 150 is linked to the virtual reality system 30 such that the microcontroller 20 and virtual reality system may electronically communicate via the gun communication interface 166 and the system communication interface.
  • the virtual reality system 30 displays a target in the virtual reality experience which is seen by the user.
  • the user moves and aims the haptic gun 150 to aim at the target, and the tracker device 158 senses the motion and aim point of the haptic gun 150 .
  • the tracker device 158 sends tracking data to the virtual reality system 30 via the tracker communication interface and the system communication interface.
  • the virtual reality system 30 controls an image of a gun within the virtual reality experience corresponding to the haptic gun 150 , and displays the image to the user based on the tracking data.
  • the microcontroller 20 detects whether the trigger 162 has been actuated. If the user actuates the trigger 162 , the microcontroller 20 detects that the trigger has been actuated, and at step 212 , the microcontroller 20 sends a trigger signal to the virtual reality system 30 .
  • the virtual reality system 30 processes the trigger signal and determines a control signal or signals to send to the microcontroller 20 for controlling the operation of the tactile feedback units 16 and/or the linear actuator 156 .
  • the virtual reality system 30 sends the control signal(s) to the microcontroller 20 .
  • the microcontroller 20 processes the control signal(s) to determine if and how to actuate the tactile feedback unit(s) 16 and/or the linear actuator 156 .
  • the microcontroller activates the tactile feedback unit(s) 16 and/or the linear actuator 156 based on the control signal(s). If the trigger has not been actuated at step 210 , then at step 222 , the virtual reality system determines a control signal based on the virtual reality experience. For example, the corresponding gun in the virtual reality experience may be hit or damaged.
  • the virtual reality system 30 sends a control signal to the microcontroller 20 .
  • the microcontroller 20 processes the control signal to determine if and how to actuate the tactile feedback unit(s) 16 and/or the linear actuator 156 .
  • the microcontroller activates the tactile feedback unit(s) 16 and/or the linear actuator 156 based on the control signal(s).
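  • A sketch of one iteration of the method-200 loop on the gun side. The transport hooks and the control-signal codes are assumptions; the patent does not define a wire format:

        #include <stdio.h>

        /* Hypothetical hooks for the trigger, the gun communication interface,
           and the haptic hardware. */
        static int  trigger_pulled(void)                  { return 1; }
        static void send_trigger_signal(void)             { puts("-> trigger"); }
        static int  recv_control_signal(unsigned char *c) { (void)c; return 0; }
        static void fire_recoil(void)                     { puts("recoil"); }
        static void buzz_units(int duty)                  { printf("buzz %d\n", duty); }

        /* Assumed control codes. */
        enum { SIG_FIRE_GUN = 1, SIG_NO_AMMO = 2, SIG_GUN_DAMAGED = 3 };

        /* Steps 210-220: report trigger pulls to the virtual reality system
           and act on whatever control signals come back. */
        static void gun_loop_step(void) {
            if (trigger_pulled())
                send_trigger_signal();                    /* step 212 */

            unsigned char code;
            while (recv_control_signal(&code)) {          /* steps 216-218 */
                switch (code) {
                case SIG_FIRE_GUN:    fire_recoil();   break;  /* step 220 */
                case SIG_NO_AMMO:     buzz_units(80);  break;  /* dry-fire click */
                case SIG_GUN_DAMAGED: buzz_units(160); break;
                default:              break;                   /* ignore unknown */
                }
            }
        }

        int main(void) {
            gun_loop_step();
            return 0;
        }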
  • Referring to FIG. 20 , a method 230 for a particular use case for firing the haptic gun 150 is shown.
  • the method 230 may be performed following step 212 and replaces the steps following step 212 in the method 200 described above and shown in FIG. 19 .
  • the virtual reality system 30 determines whether a virtual gun within the virtual reality experience corresponding to the haptic gun 150 has ammunition. If the virtual gun does not have ammunition, then at step 234 , the virtual reality system 30 sends a “No Ammo” control signal to the microcontroller 20 .
  • the microcontroller processes the “No Ammo” control signal to determine if and how to activate the tactile feedback units 16 and/or the linear actuator 156 .
  • If the microcontroller 20 is configured (i.e., programmed) to activate the tactile feedback units 16 and/or the linear actuator 156 in response to a “No Ammo” control signal, then at step 238 , the microcontroller 20 activates the tactile feedback units 16 and/or the linear actuator 156 .
  • the microcontroller 20 does not activate the linear actuator 156 , but may activate one or more of the tactile feedback units 16 to simulate dry fire (e.g., to simulate a hammer or firing pin being actuated, but no round fired).
  • if the virtual reality system 30 determines at step 232 that the gun has ammunition to be fired, then at step 240 , the virtual reality system 30 determines if the virtual gun is damaged or cannot fire for any other reason within the virtual reality experience. If the gun is damaged or cannot fire, at step 242 , the virtual reality system sends a “gun damaged” or “gun inoperable” control signal to the microcontroller 20 . At step 244 , the microcontroller 20 processes the control signal to determine if and how to activate the tactile feedback units 16 and/or the linear actuator 156 .
  • if the microcontroller 20 is configured (i.e., programmed) to activate the tactile feedback units 16 and/or the linear actuator 156 in response to a “gun damaged” or “gun inoperable” control signal, then at step 246 , the microcontroller 20 activates the tactile feedback units 16 and/or the linear actuator 156 .
  • the microcontroller 20 does not activate the linear actuator 156 , but may activate one or more of the tactile feedback units 16 to simulate dry fire (e.g., to simulate a failed attempt to fire the gun).
  • if the virtual reality system 30 determines at step 240 that the gun is not damaged or otherwise inoperable, then at step 248 , the virtual reality system 30 sends a “fire gun” control signal to the microcontroller 20 .
  • the microcontroller 20 processes the “fire gun” control signal to determine if and how to activate the tactile feedback units 16 and/or the linear actuator 156 , or to simply execute a pre-programmed “firing sequence.”
  • the microcontroller 20 is configured (i.e., programmed) to activate the linear actuator 156 in response to a “fire gun” control signal.
  • the microcontroller 20 may also activate one or more of the tactile feedback units 16 to indicate certain scenarios, such as firing a very powerful gun, or other scenarios.
  • the microcontroller 20 activates the linear actuator 156 and/or tactile feedback units 16 , based on the “fire gun” control signal.
  • the virtual reality system 30 may send a control signal to the microcontroller 20 indicating how many rounds of ammunition the gun has to fire, and the microcontroller 20 is configured to perform the firing sequence each time the trigger is actuated (or held depressed in the case of a simulated automatic gun) for the number of rounds of ammunition remaining.
  • the microcontroller 20 receives another control signal from the virtual reality system in order to perform any further firing sequences.
  • This aspect can account for communication delays between the haptic gun 150 and the virtual reality system 30 , such as delays caused by EMI or other communication interference.
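  • A sketch of that rounds-remaining scheme: the virtual reality system grants a budget of rounds the gun may fire locally, so a trigger pull does not need a round trip per shot. The counter semantics below are an assumption:

        #include <stdio.h>

        static void fire_recoil(void) { puts("recoil"); }  /* hypothetical hook */

        static int rounds_authorized = 0;  /* set by a control signal from the
                                              virtual reality system            */

        static void on_ammo_control_signal(int rounds) {
            rounds_authorized = rounds;    /* budget granted by the headset */
        }

        /* Trigger handler: perform the firing sequence locally while rounds
           remain, masking communication delays (e.g., EMI) from the user. */
        static void on_trigger_pull(void) {
            if (rounds_authorized > 0) {
                fire_recoil();
                rounds_authorized--;
            }
            /* at zero, wait for another control signal before firing again */
        }

        int main(void) {
            on_ammo_control_signal(2);
            on_trigger_pull();
            on_trigger_pull();
            on_trigger_pull();   /* third pull: budget exhausted, no recoil */
            return 0;
        }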
  • Referring to FIG. 21 , another embodiment of the present invention is directed to a video game system 300 for providing a single game play instance wherein multiple clients can play the same game instance with each client utilizing a different game platform.
  • the video game system 300 includes a first game platform 302 which is a virtual reality system 30 , as described herein.
  • the first game platform 302 executes a virtual reality game 304 on the virtual reality system 30 .
  • the virtual reality system 30 may be a virtual reality headset, as described herein, or other virtual reality system.
  • the video game system 300 includes a second game platform 306 executing a game 308 having a similar representation of the virtual reality game but modified for the second game platform 306 .
  • the first game platform 302 and second game platform 306 each have communication interfaces configured to allow them to communicate with each other to provide a single instance game space for the first game platform 302 and second game platform 306 .
  • the communication interfaces may be wired interfaces, or wireless interfaces, as described herein.
  • the first game platform 302 and second game platform 306 may communicate with each other over the Internet 310 using their respective communication interfaces to connect to the Internet 310 .
  • the video game system 300 may also include a third game platform 312 which is in communication with the first game platform 302 .
  • the first game platform 302 and third game platform 312 are configured to allow the third game platform 312 to display a representation of the virtual reality game 304 on a video display as it is being played on the first game platform 302 .
  • the third game platform 312 may be a tablet computer, a smart TV, a personal computer, a smart phone, or other computing device not having virtual reality capability, such that a spectator can watch the game play while a user is using the virtual reality game 304 .
  • additional game platforms, the same as, similar to, or different from the first game platform 302 or the second game platform 306 , can be connected and added to the video game system 300 to allow additional clients to play the same single game instance.
  • additional game platforms, the same as, similar to, or different from the third game platform 312 , may be connected to the video game system 300 to allow additional observers to view the game play.
  • additional game platforms 302 may be connected to the video game system 300 .
  • specific game platforms 312 can be slaved to various headsets, and/or game platforms 312 can be allowed to switch from one game platform 302 to another game platform 302 to view the game from different perspectives.
  • if a client's active participation in the game ends, the system 300 allows such client to continue to view the game. For instance, the system 300 may allow such client to view the game from the perspective of one of the other clients that is still active in the game.
  • Any of the game platforms 302 and 306 may also be configured to register a client's death and/or termination of such client's active participation in the game, for example, as a result of a “game conclusion” event.
  • the game platform may send an “end game” signal to all of the other game platforms involved in the single game instance which indicates that the respective client is dead or has terminated active participation in the game.
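  • A sketch of broadcasting such an “end game” signal to the other platforms in the single game instance; the message layout and the send_to_platform() transport are assumptions, since the patent does not define a protocol:

        #include <stdio.h>

        /* Assumed message format. */
        struct game_event {
            int  client_id;
            char type[16];     /* e.g., "end game" */
        };

        /* Hypothetical transport hook (in practice, e.g., a socket per platform). */
        static void send_to_platform(int platform, const struct game_event *ev) {
            printf("platform %d <- client %d: %s\n", platform, ev->client_id, ev->type);
        }

        /* Notify every other platform that this client is dead or has
           terminated active participation in the game. */
        static void broadcast_end_game(int self, int client_id, int n_platforms) {
            struct game_event ev = { client_id, "end game" };
            for (int p = 0; p < n_platforms; p++)
                if (p != self)
                    send_to_platform(p, &ev);
        }

        int main(void) {
            broadcast_end_game(0, 42, 3);
            return 0;
        }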
  • observers viewing a game through a game platform connected to the video game system 300 may participate in or interact with game play through certain predefined audience actions, such as cheering or booing, wagering on a game play result or event, conversing or chatting with one or more players or observers, deploying game play objects into game play, or any other game play-related action.
  • audience actions may include causing an attack/defense in a combat-themed game (e.g., throwing a grenade, firing a shot, launching a missile, casting a spell into the game); helping or obstructing a player (e.g., giving/taking health, mana, ammunition, virtual currency, weapons, power-ups, etc.); giving hints to players of the game; wagering with other observers/players on the outcome of the game; cheering on a particular player.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A tactile feedback system for use with a virtual reality system comprises a support structure and tactile feedback units coupled to and spaced apart around the support structure. A microcontroller is coupled to the tactile feedback units to independently control their activation. The microcontroller has a communication interface configured to communicate with the virtual reality system to receive control signals for controlling operation of the tactile feedback units. The microcontroller activates the tactile feedback units in response to control signals received from the virtual reality system to indicate aspects of the virtual reality experience. In another aspect, a haptic gun for use with a virtual reality system has tactile feedback units and a linear actuator for simulating a recoil of firing the gun. In another aspect, a video game system provides a single game play instance for multiple clients utilizing different game platforms.

Description

FIELD OF THE INVENTION
The present invention relates generally to augmented reality and virtual reality systems, and more particularly, to tactile feedback systems and methods for augmented and virtual reality systems.
BACKGROUND OF THE INVENTION
Virtual reality is a computer technology that replicates an environment, which may be a simulation of a real life environment or an imaginary environment, and simulates a user's presence in the environment allowing the user to interact with the environment. Current forms of virtual reality are displayed on a computer display or with a special virtual reality headset worn over the user's eyes, which provides visual and sound simulations of the virtual reality experience. Some simulations also include additional sensory input, such as haptic/tactile feedback to the user to simulate various interactions with the virtual reality. The user may interact with the virtual reality using standard computer input devices such as a keyboard and mouse, and also through multimodal devices such as gloves or other wearables having sensors to detect motion and forces, and/or external motion sensors which detect a user's position and motions.
Augmented reality is similar to virtual reality in that it involves a computerized simulation, but differs in that augmented reality utilizes a real-world environment in which certain elements are augmented and/or supplemented by a computer-generated simulation of sensory input such as video, sound, graphics, and/or tactile feedback. The computer-generated simulation may be overlaid onto a computer reproduction of the real-world environment.
For purposes of the present description, the term “virtual reality” shall mean either, or both, virtual reality and augmented reality, as their differences do not affect the present invention.
In order to enhance the virtual reality experience, tactile feedback devices, also referred to as haptic devices, have been employed with virtual reality systems. Tactile feedback devices apply forces, vibrations or motions to a user in order to provide tactile sensory input in virtual reality to a user, such as to simulate the sense of touch or to provide other feedback of experiences within the virtual reality, such as crashing, being shot, loading or shooting a gun, or colliding with something.
SUMMARY
In one aspect, the present invention is directed to an innovative tactile feedback system for use with a virtual reality headset which utilizes a plurality of tactile feedback devices to enhance the immersion into the virtual reality experience and provide entertaining and useful tactile sensory perception to a user within the virtual reality.
Accordingly, in one embodiment, a multi-directional, tactile feedback system for use with a virtual reality headset being used to present a virtual reality experience is provided. The tactile feedback system has a support structure, a plurality of tactile feedback units mounted to the support structure in spaced apart relation, and a microcontroller coupled to the support structure. The support structure may be the housing and/or other structure of the headset or a separate wearable device such as a headband, helmet, cap, other head mounting structure, etc. Each of the tactile feedback units is operably coupled to the microcontroller and is configured to produce a tactile feedback to a user, and includes a transducer which can apply forces, vibrations or motions to the user thereby providing tactile sensory input to the user. For instance, the tactile feedback units may be positioned in an angularly spaced array or other pattern around the user's body, such as around the user's head. The microcontroller is configured to utilize control signals generated during the operation of the headset to control the operation of the tactile feedback units. For example, while the headset is presenting a virtual reality experience, such as playing a game or showing a video, the game or video may generate control signals which the microcontroller utilizes to control the operation of the tactile feedback units. In one aspect, the tactile feedback system may be integrated into the headset such that the microcontroller is integrated into the headset, in which case the microcontroller may be integrated with the electronics of the headset or a separate electronic system operably coupled to the headset.
In another aspect, the microcontroller may be a separate system having a communication interface configured to electronically communicate with the headset, for instance a wireless communication interface, such as Bluetooth, Wi-Fi, HaLow or other future wireless standards, or a wired communication interface, such as USB. In this case, the microcontroller receives the control signals from the headset via the communication interface.
In another aspect, the microcontroller is configured to operate the tactile feedback units in response to the control signals to indicate a directional aspect of the virtual reality experience. For example, the microcontroller may activate a specific tactile feedback unit that is angularly located to correspond to the angular location of an event occurring in the virtual reality experience, such as the location of a hit (e.g., hit by a shot from a weapon, or punched by another player) on the user's persona within the virtual reality experience. If the user is shot on the right side, the microcontroller may activate a tactile feedback unit located on the right side of the angularly spaced array. The control signals may include directional data related to the directional aspect of the virtual reality experience being signaled using the tactile feedback system.
In additional aspects, the tactile feedback system may be configured to provide a number of use cases. In one use case, the tactile feedback system may be configured to provide a threat detection warning signal to the user to indicate that the user is being threatened within the virtual reality experience. This is similar to a “sixth sense” or “Spidey Sense.” The threat detection warning signal can be a general threat detection warning signal, or a specific threat detection warning signal such as a directional threat detection warning signal using the directional aspect described above. Other specific threat detection warning signals include a severity threat detection warning signal to indicate a severity aspect of the virtual reality experience, and an imminence/urgency threat detection warning signal to indicate an urgency aspect of the virtual reality experience. For example, the microcontroller may activate the tactile feedback system with a level of force corresponding to a severity level occurring in the virtual reality experience (e.g., a fender bender versus a head-on crash). Similarly, the microcontroller may activate the tactile feedback system in a pattern or other manner corresponding to a degree of urgency occurring in the virtual reality experience. For example, if an enemy is approaching from far away, the tactile feedback units may vibrate slowly or be activated in a slow-rotating pattern, and as the enemy draws closer, the vibrations and/or the speed of the pattern activation increase.
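By way of non-limiting illustration only, the following Python sketch shows one way such an urgency pattern could be generated; the set_unit driver function, the distance scale, and the timing constants are hypothetical assumptions and not part of this disclosure.

    import time

    NUM_UNITS = 8  # angularly spaced tactile feedback units

    def set_unit(index, on):
        # Hypothetical hardware driver: energize or de-energize one vibration motor.
        pass

    def urgency_pattern(enemy_distance, max_distance=100.0, duration_s=2.0):
        # A closer threat yields a shorter interval, i.e., faster pulses and a
        # faster rotation of the activation around the array.
        closeness = max(0.0, min(1.0, 1.0 - enemy_distance / max_distance))
        interval_s = 0.6 - 0.5 * closeness  # 0.6 s between pulses when far, 0.1 s when near
        stop = time.monotonic() + duration_s
        unit = 0
        while time.monotonic() < stop:
            set_unit(unit, True)
            time.sleep(interval_s)
            set_unit(unit, False)
            unit = (unit + 1) % NUM_UNITS  # rotate the activation around the user's head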
In another use case, the tactile feedback system may be configured to indicate that a user has been killed in a game, eliminated from a game, or that the user's game has otherwise ended. For instance, the microcontroller may be configured to activate all of the tactile feedback units at the same time and/or in a certain manner in response to a control signal corresponding to such an event.
In still another use case, the tactile feedback system may be configured to signal to a user the direction and/or intensity of damage to the user within the virtual reality experience. The microcontroller may be configured to activate particular tactile feedback unit(s) in the array related to the direction of damage and/or to adjust the intensity of the activation of such tactile feedback units based on the control signals.
The tactile feedback system may also be configured to signal to the user other aspects of a virtual reality experience, such as the vibration of thrusters, as well as various other warnings such as low ammunition, weapon re-loading, weapon upgrading, etc.
Another embodiment of the present invention is directed to methods of providing tactile feedback with a virtual reality system being used to present a virtual reality experience. For example, the methods may include the methods of using the tactile feedback system described herein, including the additional aspects, features and use cases.
Still another embodiment of the present invention is directed to a haptic, toy gun game controller for use with a virtual reality headset for presenting a virtual reality experience, such as playing a virtual reality game. The haptic gun may be shaped and configured as any type of gun, such as a handgun, rifle, shotgun, machine gun, laser gun, BB gun, paintball gun, pellet gun, light machine gun (“LMG”), etc. The haptic gun includes a main body having a handle. The body may be in the shape of the type of gun. A microcontroller having a processor is housed in the main body. A first communication interface is also housed in the main body. The first communication interface is operably coupled to the microcontroller and is configured to electronically communicate with the headset. The first communication interface may be any suitable communication interface, including those described herein. The haptic gun has a trigger coupled to the main body, and operably coupled to the microcontroller. A plurality of tactile feedback units are coupled to the main body and are spaced apart around the body. Each of the tactile feedback units is operably coupled to the microcontroller. The tactile feedback units are configured to produce a haptic feedback to a user, such as the tactile feedback units described above.
A linear actuator is coupled to the main body and operably coupled to the microcontroller. The linear actuator is configured to provide a linear force simulating a recoil from firing the haptic gun. The haptic gun also has a tracker device housed within the main body. The tracker device is configured to provide tracking data to the virtual reality system (e.g., a virtual reality headset). For example, the tracker device may include accelerometer(s), and/or other sensors to detect the motion and orientation of the haptic gun. The tracker device may have a tracker device communication interface, such as a wireless interface or wired interface for communicating with the headset. In other aspects, the tracker device may be integrated with the microcontroller and/or the first communication interface.
The microcontroller of the haptic gun is configured to electronically communicate with the headset via the communication interface to receive control signals from the headset for controlling operation of the tactile feedback units and the linear actuator. The microcontroller operates the tactile feedback units and the linear actuator based on the control signals. The microcontroller also sends a trigger signal to the headset in response to actuation of the trigger by the user.
As an example of the operation of the haptic handgun, the user is utilizing the haptic handgun with a virtual reality headset and is playing a virtual reality game. Within the virtual reality game, the user sees a target (e.g., an enemy) to shoot at. The user moves and aims the haptic gun. The tracker device senses the motion and aim point of the gun and provides tracking data to the microcontroller and sends the tracking data to the headset via the tracker device communication interface. The virtual reality headset may be configured to utilize the tracking data to control the image of a gun within the virtual reality game which corresponds to the haptic gun. For instance, as the user moves and aims the haptic gun, the virtual reality headset may move and aim the gun in the virtual reality game accordingly. When the user actuates the trigger, the microcontroller detects that the trigger has been actuated and sends a trigger signal to the headset. The headset then sends control signals to the microcontroller which the microcontroller uses to control the operation of the tactile feedback units and the linear actuator.
In additional aspects, the haptic gun may also be configured to provide various use cases for controlling the tactile feedback units and the linear actuator based on the control signals received from the headset. In one use case, the microcontroller may receive a control signal from the headset indicating whether to fire in response to sending a trigger signal to the headset. The headset receives the trigger signal and determines whether the corresponding gun in the virtual reality game can be fired (e.g., is there ammunition?; is the gun damaged or otherwise prevented from firing?). If the headset determines the gun can be fired, the headset sends a “fire gun” control signal to the microcontroller. In response to receiving the fire gun control signal, the microcontroller is configured to activate the linear actuator to simulate a recoil from firing the gun. If the headset determines the gun cannot be fired when receiving the trigger signal, the headset can send a “do not fire” control signal, or no control signal at all. In response to receiving a do not fire control signal, the microcontroller may be configured not to activate the linear actuator, and/or to activate one or more of the tactile feedback units to simulate a hammer or other component of the gun actuating when the gun fails to fire upon pulling the trigger.
In additional use cases, the headset may be configured to send a weapon upgrade control signal, a weapon damage control signal, or other control signals. The microcontroller is configured to respond to such signals by activating the linear actuator and/or tactile feedback units in a certain manner.
Still another embodiment of the present invention is directed to a video game system for providing a single game play instance in which multiple clients can play, each client utilizing a different game platform. The video game system includes a first game platform executing a virtual reality game on a virtual reality system. The virtual reality system may be a virtual reality headset, as described herein, or other virtual reality system. The video game system includes a second game platform executing a game having a similar representation of the virtual reality game but modified for the second game platform. In addition, the first game platform and second game platform are configured to communicate with each other to provide a single instance game space for the first game platform and second game platform.
In another aspect of the video game system, the first game platform and second game platform may be configured to communicate over a communication network comprising the Internet. In still another aspect, the first game platform may be linked to a third game platform which displays a representation of the virtual reality game on a video display as the game is being played on the first game platform.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other aspects of embodiments are described in further detail with reference to the accompanying drawings, wherein like reference numerals refer to like elements and the description for like elements shall be applicable for all described embodiments wherever relevant:
FIG. 1 is a top view of a multi-directional, tactile feedback system for use with a virtual reality system, according to one embodiment of the present invention;
FIG. 2 is a rear view of the tactile feedback system of FIG. 1;
FIG. 3 is a front view of the tactile feedback system of FIG. 1, as worn on the head of a user;
FIG. 4 is a rear view of the tactile feedback system of FIG. 1, as worn on the head of a user;
FIG. 5 is a schematic view of the tactile feedback system of FIG. 1;
FIG. 6 is a schematic view of the tactile feedback system of FIG. 1 in combination with a virtual reality headset for presenting a virtual reality experience;
FIG. 7 is a diagram depicting a directional damage scenario being signaled by the tactile feedback system of FIG. 1;
FIG. 8 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the directional damage scenario of FIG. 7, according to another embodiment of the present invention;
FIG. 9 is a schematic depicting a direction and damage intensity being signaled by the tactile feedback system of FIG. 1;
FIG. 10 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the direction and damage intensity scenario of FIG. 9, according to another embodiment of the present invention;
FIG. 11 is a schematic depicting a general threat detection warning scenario being signaled by the tactile feedback system of FIG. 1;
FIG. 12 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the general threat detection warning scenario of FIG. 11, according to another embodiment of the present invention;
FIG. 13 is a schematic depicting a directional threat detection warning scenario being signaled by the tactile feedback system of FIG. 1;
FIG. 14 is a flow chart illustrating a method of operation of the tactile feedback system of FIG. 1 for the directional threat detection warning scenario, according to another embodiment of the present invention;
FIG. 15 is a side view of a haptic gun, according to another embodiment of the present invention;
FIG. 16 is a side, cut-away view of the haptic gun of FIG. 15;
FIG. 17 is a high-level schematic view of the electronic system for the haptic gun of FIG. 15, according to another embodiment of the present invention;
FIG. 18 is a schematic view of the electronic system for the linear actuator of the haptic gun of FIG. 15, according to another embodiment of the present invention;
FIG. 19 is a flow chart illustrating a method of operation of the haptic gun of FIG. 15, according to another embodiment of the present invention;
FIG. 20 is a flow chart illustrating a method of operation of the haptic gun of FIG. 15 for a trigger actuation scenario, according to another embodiment of the present invention;
FIG. 21 is a schematic view of a virtual reality system for providing a single game instance for multiple clients each utilizing a different game platform.
DETAILED DESCRIPTION
Referring to FIGS. 1-6, one embodiment of a multi-directional, tactile feedback system 10 for use with a virtual reality system 30 (see FIG. 6, in which the virtual reality system 30 is a virtual reality headset 30) configured to present a virtual reality experience to a user is illustrated. The virtual reality system 30 may be any suitable system for presenting a virtual reality experience, such as a virtual reality headset, goggles, helmet, video display(s), etc. In general, the virtual reality system 30 includes a processor (e.g., a microprocessor), a software program configured to program the virtual reality system to present the virtual reality experience, a display device (e.g., video display(s)), and a system communication interface. The virtual reality experience can be any type of entertainment or informational program, such as a video game, a movie, a video, etc. As explained above, the term “virtual reality” as used herein includes either, or both, virtual reality and augmented reality. The system communication interface may be any suitable communication interface for communicating with the tactile feedback system 10 and/or other electronic systems. For instance, the interface can be a wireless communication interface, such as Bluetooth or Wi-Fi, or a wired communication interface, such as USB.
Turning to FIGS. 1 and 2, the tactile feedback system 10 includes a support structure 12, which in this case is a headband 12 configured to be worn on the head 14 of a user 11 (see FIGS. 3 and 4). As explained herein, the support structure 12 may be integrated with a virtual reality system 30 (e.g., integrated with a virtual reality headset), or it may be a separate wearable device such as the headband 12, or other wearable device such as a hat, helmet, goggles, glasses, etc. In this described embodiment, the tactile feedback system 10 is a separate wearable device which communicates with a virtual reality system 30, such as the virtual reality headset 30 as shown in FIG. 6. A plurality of tactile feedback units 16 are mounted to the headband 12 angularly spaced apart around the headband 12. Each of the tactile feedback units 16 is a tactile feedback device which can apply forces, vibrations, or motions which provide tactile sensory input to the user 11. For example, the tactile feedback units 16 may comprise a vibration motor for applying a vibration feedback to the user 11, such as a 10 mm, 3-4.5 volt vibration motor. Each of the tactile feedback units 16 is operably coupled to a microcontroller 20 (described below) such that each tactile feedback unit 16 can be operated independently of the other tactile feedback units 16. In this described embodiment, the feedback system 10 has eight tactile feedback units 16 which are angularly spaced at about 45 degrees from each other. More or fewer tactile feedback units 16 may be utilized, and they may be spaced at about equal angles around the headband, spaced unequally, or even spaced in multiple groups. For instance, the feedback system 10 may have from 1 to 30, or more, tactile feedback units 16, such as 4 units 16 (e.g., spaced at about 90 degrees), 5 units 16 (e.g., spaced at about 72 degrees), 6 units 16 (e.g., spaced at about 60 degrees), 10 units 16 (e.g., spaced at about 36 degrees), etc.
A control module 18 housing the microcontroller 20 having a processor, user input controls 22, and a power supply 24 (which may include a battery or other power source) is also mounted to the headband 12. Turning to FIGS. 5 and 6, the control module 18 includes an electronic circuit 26, such as a printed circuit board or integrated circuit, operably connecting the microcontroller 20, the power supply 24, and the user input controls 22. The control module 18 also operably connects each of the tactile feedback units 16 to the microcontroller 20 and the power supply 24 in order to control and power the actuation of the tactile feedback units 16. The control module 18 also has a USB communication interface 28, which may be integrated into the microcontroller 20, or separate from the microcontroller 20 and operably coupled to the microcontroller 20. The USB communication interface 28 is configured to provide electronic communication between the microcontroller 20 and the virtual reality system 30. The tactile feedback system 10 also has a wireless communication interface 32, which may be integrated into the microcontroller 20, or separate from the microcontroller 20 and operably coupled to the microcontroller 20. The wireless communication interface 32 is likewise configured to provide electronic communication between the microcontroller 20 and the virtual reality system 30. In use, the tactile feedback system 10 may be operably connected to the virtual reality system 30 using either, or both, of the USB communication interface 28 and the wireless communication interface 32, to receive control signals from the virtual reality system 30 which the microcontroller 20 uses to control the operation of the tactile feedback units 16.
The user input controls 22 are operably coupled to the microcontroller 20 for adjusting the baseline activation intensity (e.g., vibration intensity) of the tactile feedback system 10. The baseline activation intensity is a reference intensity from which the microcontroller 20 scales the different levels of intensity at which the tactile feedback units 16 are activated for the various tactile signals to the user, as further described below. The user input controls 22 may comprise a pair of push buttons operably coupled to the microcontroller 20, or any other suitable controls for providing user input to the microcontroller 20 to adjust the baseline activation intensity. The microcontroller 20 is configured to adjust the baseline activation intensity at which the microcontroller 20 activates the tactile feedback units 16 in response to the user input controls 22.
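A minimal Python sketch of such a baseline adjustment is shown below; the step size, limits, and method names are illustrative assumptions rather than part of the described embodiment.

    class BaselineIntensity:
        # Models the two push buttons adjusting a baseline vibration level.
        STEP, MIN_LEVEL, MAX_LEVEL = 0.1, 0.1, 1.0

        def __init__(self, baseline=0.5):
            self.baseline = baseline

        def button_up(self):
            self.baseline = min(self.MAX_LEVEL, self.baseline + self.STEP)

        def button_down(self):
            self.baseline = max(self.MIN_LEVEL, self.baseline - self.STEP)

        def drive_level(self, signal_level):
            # Each tactile signal is scaled relative to the user's baseline.
            return max(0.0, min(1.0, signal_level * self.baseline))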
Turning to FIGS. 7 and 8, a diagram and flowchart depict the tactile feedback system 10 configured to perform a method 100 for signaling a directional damage scenario. As shown in FIG. 7, the tactile feedback system 10 is configured to activate the tactile feedback unit 16 a to indicate to the user 11 the approximate direction of damage sustained by the user 11 in the virtual reality experience. For instance, in the example depicted in FIG. 7, the user 11 is damaged (e.g., shot) in the front/left, such that the tactile feedback system 10 activates the tactile feedback unit 16 a located on the front/left of the tactile feedback system 10. As shown in the diagram, the user 11 turns to his left in response to the activation of the tactile feedback unit 16 a.
Turning to FIG. 8, the method 100 comprises a step 102, in which an event occurs at an angular location relative to the user 11 in the virtual reality experience being executed on the virtual reality system 30. In the example of FIG. 7, the event is the user 11 being shot in the front/left of the user 11. At step 104, the virtual reality system 30 sends a direction control signal to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32. The direction control signal indicates the angular direction of the event. For instance, the control signals may be coded and the microcontroller 20 is configured to relate the coded signal to an angular direction. At step 106, the microcontroller 20 processes the direction control signal to determine the tactile feedback unit 16 a corresponding to the angular location of the event. For example, the corresponding tactile feedback unit 16 may be the tactile feedback unit which best represents the direction of the event. Then, at step 108, the microcontroller 20 activates the tactile feedback unit 16 a based on the direction control signal corresponding to the direction of the event relative to the location of the user 11 in the virtual reality experience.
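For illustration only, the following sketch maps an event direction to the nearest of the eight units in this embodiment; the coordinate convention (degrees clockwise from straight ahead, with unit 0 assumed at the front) is an assumption.

    NUM_UNITS = 8
    SECTOR = 360.0 / NUM_UNITS  # 45 degrees per unit in the described embodiment

    def unit_for_direction(angle_deg):
        # Return the index of the feedback unit nearest the event direction.
        return int(((angle_deg % 360.0) + SECTOR / 2.0) // SECTOR) % NUM_UNITS

    # Example: a hit from the front/left (315 degrees clockwise, i.e., -45
    # degrees) maps to unit 7, the unit just left of straight ahead.
    assert unit_for_direction(315) == 7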
FIGS. 9 and 10 illustrate another use case in which the tactile feedback system 10 performs a method 120 to signal to the user 11 the direction of damage and the damage intensity incurred by the user 11 in the virtual reality experience. The angular location of the damage is determined basically the same way as described above for the method 100. However, as shown in FIG. 9, instead of activating just the one tactile feedback unit 16 a, the microcontroller 20 also activates one or more adjacent tactile feedback units 16 b and 16 c at a lower intensity than it activates the main tactile feedback unit 16 a. The number of adjacent tactile feedback units 16 and the intensity of the activation of the adjacent tactile feedback units 16 depend on the level of intensity of the damage. For instance, for minimal damage, perhaps no adjacent tactile feedback units 16 are activated. Above a first threshold and below a second threshold, perhaps only the two adjacent tactile feedback units 16 b and 16 c are activated at a very low intensity. As the level of intensity increases from the first threshold to the second threshold, the intensity of the two adjacent tactile feedback units 16 b and 16 c may be increased, but still kept lower than the intensity of the main tactile feedback unit 16 a in order to still effectively signal the direction of the damage. When the second threshold is reached, three or four adjacent tactile feedback units 16 may be activated, with increasing intensity based on an increase in the intensity of damage, and so on as the intensity of the damage increases.
Accordingly, turning to the flow chart of FIG. 10, the method 120 comprises step 122, in which damage occurs to the user 11 in the virtual reality experience being executed on the virtual reality system 30. The damage has an associated direction of damage and intensity of damage. At step 124, the virtual reality system 30 sends a damage direction and damage intensity control signal (this may be a single signal or a separate signal for damage direction and damage intensity) to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32, indicating the direction of the damage and the intensity of the damage. At step 126, the microcontroller 20 processes the damage direction and damage intensity control signal(s) to determine the main tactile feedback unit 16 a corresponding to the angular location of the event and the adjacent tactile feedback unit(s) 16 b and 16 c to be activated and their respective intensity. Then, at step 128, the microcontroller 20 activates the tactile feedback units 16 a, 16 b and 16 c based on the control signal to indicate the direction and intensity of the damage occurring in the virtual reality experience.
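One possible realization of the threshold scheme described above is sketched below; the threshold values and the attenuation factor applied to the adjacent units are hypothetical.

    T1, T2 = 0.33, 0.66  # assumed first and second damage-intensity thresholds

    def damage_pattern(main_unit, intensity, num_units=8):
        # Return {unit index: drive level} for a direction-plus-intensity signal.
        pattern = {main_unit: intensity}
        if intensity <= T1:
            return pattern  # minimal damage: main unit only
        width = 1 if intensity <= T2 else 2  # widen the pattern as damage grows
        for step in range(1, width + 1):
            # Neighbors vibrate at a fraction of the main unit's level so the
            # direction of the damage remains distinguishable.
            level = intensity * (0.5 ** step)
            pattern[(main_unit + step) % num_units] = level
            pattern[(main_unit - step) % num_units] = level
        return pattern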
Turning to FIGS. 11 and 12, a diagram and flowchart depict the tactile feedback system 10 configured to perform a method 130 for signaling a general threat detection warning. A general threat detection warning is non-directional, i.e., it does not indicate to the user the direction of a threat (e.g., a potential danger) occurring in the virtual reality experience. A general threat detection warning also does not necessarily indicate other specific characteristics of the threat, such as imminence/urgency or severity. As shown in the diagram of FIG. 11, the arrow 131 shows that the threat detection warning signal may be performed by sequentially activating each of the tactile feedback units 16 (individually or in groups) for a short duration or pulse in a rotational sequence around the array. The arrow 131 shows the tactile feedback units 16 being activated in a counter-clockwise sequence, but the sequence could also be clockwise, or a counter-clockwise sequence could signal a first type of threat while a clockwise sequence signals a second type of threat.
Referring to FIG. 12, the method 130 comprises step 132 in which a threat detection warning event occurs in the virtual reality experience being executed on the virtual reality system 30. At step 134, the virtual reality system 30 sends a general threat detection warning control signal to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32. At step 136, the microcontroller 20 processes the general threat detection warning control signal to determine the sequence for activating the tactile feedback units 16 corresponding to the control signal. At step 138, the microcontroller 20 activates the tactile feedback units 16 (e.g., in a sequence) to signal a general threat detection warning, as described above.
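A sketch of one such rotational sequence follows, assuming a set_unit driver as before and an arbitrary pulse length:

    import time

    def general_threat_sweep(set_unit, num_units=8, pulse_s=0.08, clockwise=False):
        # One rotation of short pulses around the array (arrow 131 of FIG. 11);
        # the direction of rotation could encode the type of threat.
        order = range(num_units - 1, -1, -1) if clockwise else range(num_units)
        for i in order:
            set_unit(i, True)
            time.sleep(pulse_s)  # brief pulse, then move to the next unit
            set_unit(i, False)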
Referring now to FIGS. 13 and 14, a diagram and flowchart depict the tactile feedback system 10 configured to perform a method 140 for signaling a directional threat detection warning. A directional threat detection warning indicates both that there is a threat (e.g., a potential danger) and the direction of the threat event relative to the user 11 occurring in the virtual reality experience. As shown in FIG. 13, the tactile feedback system 10 is configured to activate the tactile feedback unit 16 a to indicate to the user 11 the approximate angular direction of the threat to the user 11 in the virtual reality experience. To indicate a threat, as opposed to damage for example, the microcontroller 20 may be configured to activate the tactile feedback unit 16 a by oscillating or pulsing it (i.e., quickly activating and de-activating the tactile feedback unit 16 a multiple times). For instance, in the example depicted in FIG. 13, the threat is located in the front/left of the user 11, such that the tactile feedback system 10 activates the tactile feedback unit 16 a located on the front/left of the tactile feedback system 10. A method similar to the method 140 may also be utilized to provide a directional signal to a user for events, items or situations other than threats, such as a directional indicator signaling an event, item or situation in a particular direction within the virtual reality experience.
As shown in the flow chart of FIG. 14, the method 140 comprises step 142 in which a threat detection warning event occurs at an angular location relative to the user 11 in the virtual reality experience being executed on the virtual reality system 30. At step 144, the virtual reality system 30 sends a directional threat detection warning control signal to the microcontroller 20 via the USB communication interface 28 or the wireless communication interface 32. The directional threat detection warning control signal indicates the angular direction of the threat event. At step 146, the microcontroller 20 processes the directional threat detection warning control signal to determine the tactile feedback unit 16 corresponding to the angular location of the threat event. At step 148, the microcontroller 20 activates the tactile feedback unit 16 a corresponding to the direction of the threat event relative to the user 11 in the virtual reality experience by, e.g., oscillating or pulsing the tactile feedback unit 16 a, as described above.
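The pulsed activation might be sketched as follows, with the pulse count and timings being illustrative assumptions:

    import time

    def directional_threat_pulse(set_unit, unit, pulses=4, on_s=0.05, off_s=0.05):
        # Rapid on/off pulsing of the one unit nearest the threat direction,
        # so a threat warning feels distinct from other signals.
        for _ in range(pulses):
            set_unit(unit, True)
            time.sleep(on_s)
            set_unit(unit, False)
            time.sleep(off_s)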
Referring now to FIGS. 15-18, another embodiment of the present invention is directed to a haptic, toy gun game controller 150 for use with a virtual reality system 30. The virtual reality system 30 may be the same as, or similar to, the virtual reality system described above for the tactile feedback system 10. As shown in FIG. 15, the haptic gun 150 is shaped like a handgun and has a main body 152 having a handle 154. Although the exemplary embodiment shown in FIGS. 15-16 is in the shape of a handgun, the gun may be shaped and configured as any suitable type of gun, including without limitation, a handgun, rifle, shotgun, machine gun, laser gun, BB gun, paintball gun, pellet gun, light machine gun, replica of an actual gun, or any made-up gun. Thus, the body 152 is in the shape of the desired gun.
Turning to FIG. 16, a plurality of tactile feedback units 16 are coupled to the main body 152, such as being housed in the main body 152. The tactile feedback units 16 may be the same as, or similar to, the tactile feedback units 16 described above with respect to the tactile feedback system 10. The tactile feedback units 16 are spaced apart within the main body 152, such as a tactile feedback unit 16 located in the handle 154, a tactile feedback unit 16 located in the middle of the main body 152, and a tactile feedback unit 16 located near the front end of the main body 152 (e.g., proximate the end of the barrel of the gun 150). More or fewer tactile feedback units 16 may be utilized, depending on the desired effects to be produced by the tactile feedback units 16.
A linear actuator 156 is also coupled to the main body 152, in this case housed within the main body 152. The linear actuator 156 is configured to be activated to provide a linear force simulating a recoil from firing the haptic gun 150. The linear actuator 156 may also provide a linear force in the opposite direction when the linear actuator is reset, which can simulate the gun 150 loading another round of ammunition.
A microcontroller 20 having a processor, a tracker device 158 and a power source 160 are also housed in the main body 152. The microcontroller 20 may be the same as, or similar to, the microcontroller 20 described above with respect to the tactile feedback system 10. The haptic gun 150 has a gun communication interface 166 which may be integrated into the microcontroller 20, or may be a separate component operably coupled to the microcontroller 20. The gun communication interface 166 may be, for example, a wired communication interface such as USB, or a wireless communication interface such as Bluetooth or Wi-Fi, and is configured to provide electronic communication between the microcontroller 20 and the system communication interface of the virtual reality system 30.
Each of the tactile feedback units 16 and the linear actuator 156 are operably coupled to the microcontroller 20 such that each of the tactile feedback units 16 and the linear actuator 156 may be activated independently of each other. A trigger 162 is also coupled to the main body 152. The trigger 162 is configured to be actuated by pulling the trigger with a finger of a user. The trigger 162 is also operably coupled to the microcontroller 20.
The tracker device 158 is configured to provide tracking data to the virtual reality system 30. The tracker device 158 has a tracker communication interface, such as a wireless communication interface or a USB communication interface, for communicating the tracking data to the system communication interface of the virtual reality system 30. The tracker device 158 may have accelerometer(s), and/or other sensors to detect the motion, location, and/or orientation of the haptic gun 150.
Turning to FIG. 17, a high-level schematic of the electronic system of the haptic gun 150 is shown. It is understood that the schematic of FIG. 17 does not include all of the components of the electronic system (e.g., transistors, diodes, resistors, etc.), but one of ordinary skill in the art is enabled to incorporate the high-level schematic into an operable electronic system. The electronic system includes an electronic circuit 164 operably interconnecting the microcontroller 20, the power source 160, the tracker device 158, the tactile feedback units 16 and the linear actuator 156. Referring to FIG. 18, a schematic is shown for an exemplary dual power relay 168 for powering the linear actuator 156. The dual power relay 168 includes two 5 volt power relays 170 which are operably coupled to the microcontroller 20 and to the linear actuator 156.
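For illustration, a Python sketch of driving such a dual relay is given below, assuming a Raspberry Pi as a stand-in for the microcontroller 20; the RPi.GPIO pin numbers and stroke timing are hypothetical.

    import time
    import RPi.GPIO as GPIO

    EXTEND_PIN, RETRACT_PIN = 17, 27  # assumed pins driving the two relays 170

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(EXTEND_PIN, GPIO.OUT, initial=GPIO.LOW)
    GPIO.setup(RETRACT_PIN, GPIO.OUT, initial=GPIO.LOW)

    def recoil_pulse(stroke_s=0.05):
        # Extend the actuator to simulate recoil, then reset it; the reset
        # stroke can simulate the gun loading another round.
        GPIO.output(EXTEND_PIN, GPIO.HIGH)
        time.sleep(stroke_s)
        GPIO.output(EXTEND_PIN, GPIO.LOW)
        GPIO.output(RETRACT_PIN, GPIO.HIGH)
        time.sleep(stroke_s)
        GPIO.output(RETRACT_PIN, GPIO.LOW)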
Turning to FIG. 19, a method 200 for operating the haptic gun 150 while using the haptic gun 150 with a virtual reality system 30 is shown. The haptic gun 150 is linked to the virtual reality system 30 such that the microcontroller 20 and virtual reality system may electronically communicate via the gun communication interface 166 and the system communication interface. At step 202, the virtual reality system 30 displays a target in the virtual reality experience which is seen by the user. At step 204, the user moves and aims the haptic gun 150 to aim at the target, and the tracker device 158 senses the motion and aim point of the haptic gun 150. At step 206, the tracker device 158 sends tracking data to the virtual reality system 30 via the tracker communication interface and the system communication interface. At step 208, the virtual reality system 30 controls an image of a gun within the virtual reality experience corresponding to the haptic gun 150, and displays the image to the user based on the tracking data. At step 210, the microcontroller 20 detects whether the trigger 162 has been actuated. If the user actuates the trigger 162, the microcontroller 20 detects that the trigger has been actuated, and at step 212, the microcontroller 20 sends a trigger signal to the virtual reality system 30. At step 214, the virtual reality system 30 processes the trigger signal and determines a control signal or signals to send to the microcontroller 20 for controlling the operation of the tactile feedback units 16 and/or the linear actuator 156. At step 216, the virtual reality system 30 sends the control signal(s) to the microcontroller 20. At step 218, the microcontroller 20 processes the control signal(s) to determine if and how to actuate the tactile feedback unit(s) 16 and/or the linear actuator 156. At step 220, the microcontroller activates the tactile feedback unit(s) 16 and/or the linear actuator 156 based on the control signal(s). If the trigger has not been actuated at step 210, then at step 222, the virtual reality system determines a control signal based on the virtual reality experience. For example, the corresponding gun in the virtual reality experience may be hit or damaged. At step 224, the virtual reality system 30 sends a control signal to the microcontroller 20. Then, at step 226, the microcontroller 20 processes the control signal to determine if and how to actuate the tactile feedback unit(s) 16 and/or the linear actuator 156. At step 228, the microcontroller activates the tactile feedback unit(s) 16 and/or the linear actuator 156 based on the control signal(s).
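Seen from the gun side, the method 200 reduces to a simple event loop; the sketch below assumes a hypothetical vr_link transport and driver callbacks, which are not part of this disclosure.

    def gun_event_loop(vr_link, trigger_pulled, apply_control):
        # trigger_pulled() reads the trigger 162; apply_control() drives the
        # tactile feedback units 16 and/or the linear actuator 156.
        while True:
            if trigger_pulled():
                vr_link.send({"type": "trigger"})   # step 212
            msg = vr_link.receive(timeout_s=0.005)  # steps 216 / 224
            if msg is not None:
                apply_control(msg)                  # steps 218-220 / 226-228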
Turning to the flow chart of FIG. 20, a method 230 for a particular use case for firing the haptic gun 150 is shown. The method 230 may be performed following step 212 and replaces the steps following 212 in the method 200 described above and shown in FIG. 19. At step 232, the virtual reality system 30 determines whether a virtual gun within the virtual reality experience corresponding to the haptic gun 150 has ammunition. If the virtual gun does not have ammunition, then at step 234, the virtual reality system 30 sends a “No Ammo” control signal to the microcontroller 20. At step 236, the microcontroller processes the “No Ammo” control signal to determine if and how to activate the tactile feedback units 16 and/or the linear actuator 156. If the microcontroller 20 is configured (i.e., programmed) to activate the tactile feedback units 16 and/or the linear actuator 156 in response to a “No Ammo” control signal, then at step 238, the microcontroller 20 activates the tactile feedback units 16 and/or the linear actuator 156. Typically, in a “No Ammo” scenario, the microcontroller 20 does not activate the linear actuator 156, but may activate one or more of the tactile feedback units 16 to simulate dry fire (e.g., to simulate a hammer or firing pin being actuated, but no round fired).
If the virtual reality system 30 determines at step 232 that the gun has ammunition to be fired, then at step 240, the virtual reality system 30 determines if the virtual gun is damaged or cannot fire for any other reason within the virtual reality experience. If the gun is damaged or cannot fire, at step 242, the virtual reality system sends a “gun damaged” or “gun inoperable” control signal to the microcontroller 20. At step 244, the microcontroller 20 processes the control signal to determine if and how to activate the tactile feedback units 16 and/or the linear actuator 156. If the microcontroller 20 is configured (i.e., programmed) to activate the tactile feedback units 16 and/or the linear actuator 156 in response to a “gun damaged” or “gun inoperable” control signal, then at step 246, the microcontroller 20 activates the tactile feedback units 16 and/or the linear actuator 156. Typically, in a “gun damaged” or “gun inoperable” scenario, the microcontroller 20 does not activate the linear actuator 156, but may activate one or more of the tactile feedback units 16 to simulate dry fire (e.g., to simulate a failed attempt to fire the gun).
If the virtual reality system 30 determines at step 240 that the gun is not damaged or otherwise inoperable, then at step 248, the virtual reality system 30 sends a “fire gun” control signal to the microcontroller 20. At step 250, the microcontroller 20 processes the “fire gun” control signal to determine if and how to activate the tactile feedback units 16 and/or the linear actuator 156, or to simply execute a pre-programmed “firing sequence.” Typically, in a “fire gun” scenario, the microcontroller 20 is configured (i.e., programmed) to activate the linear actuator 156 in response to a “fire gun” control signal. The microcontroller 20 may also activate one or more of the tactile feedback units 16 to indicate certain scenarios, such as firing a very powerful gun, or other scenarios. At step 252, the microcontroller 20 activates the linear actuator 156 and/or tactile feedback units 16, based on the “fire gun” control signal.
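The three branches of the method 230, as seen by the microcontroller 20, might be dispatched as in the following sketch; the signal names and callbacks are illustrative assumptions.

    HANDLE_UNIT = 0  # e.g., the tactile feedback unit 16 in the handle 154

    def apply_control(signal, pulse_unit, recoil):
        if signal in ("NO_AMMO", "GUN_DAMAGED", "GUN_INOPERABLE"):
            pulse_unit(HANDLE_UNIT)  # simulate dry fire: hammer/firing pin only
        elif signal == "FIRE_GUN":
            recoil()                 # linear actuator 156 recoil sequence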
In another aspect, the virtual reality system 30 may send a control signal to the microcontroller 20 indicating how many rounds of ammunition the gun has to fire, and the microcontroller 20 is configured to perform the firing sequence each time the trigger is actuated (or held depressed in the case of a simulated automatic gun) for the number of rounds of ammunition remaining. When the number of rounds has been exhausted, the microcontroller 20 must receive another control signal from the virtual reality system 30 in order to perform any further firing sequences. This aspect can account for communication delays between the haptic gun 150 and the virtual reality system 30, such as delays caused by EMI or other communication interference.
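A sketch of this delay-tolerant round counting, with hypothetical names:

    class LocalRoundCounter:
        # The gun caches the round count sent by the virtual reality system 30
        # and fires locally until exhausted, avoiding a per-shot round trip.
        def __init__(self):
            self.rounds = 0

        def on_ammo_signal(self, rounds):
            self.rounds = rounds  # replenished by a control signal

        def on_trigger(self, recoil):
            if self.rounds > 0:
                self.rounds -= 1
                recoil()      # firing sequence runs without waiting on the link
                return True
            return False      # exhausted: await the next control signal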
Turning to FIG. 21, another embodiment of the present invention is directed to a video game system 300 for providing a single game play instance wherein multiple clients can play the same game instance with each client utilizing a different game platform. The video game system 300 includes a first game platform 302 which is a virtual reality system 30, as described herein. The first game platform 302 executes a virtual reality game 304 on the virtual reality system 30. The virtual reality system 30 may be a virtual reality headset, as described herein, or other virtual reality system. The video game system 300 includes a second game platform 306 executing a game 308 having a similar representation of the virtual reality game but modified for the second game platform 306. The first game platform 302 and second game platform 306 each have communication interfaces configured to allow them to communicate with each other to provide a single instance game space for the first game platform 302 and second game platform 306. For example, the communication interfaces may be wired interfaces, or wireless interfaces, as described herein. Furthermore, the first game platform 302 and second game platform 306 may communicate with each other over the Internet 310 using their respective communication interfaces to connect to the Internet 310.
In another aspect, the video game system 300 may also include a third game platform 312 which is in communication with the first game platform 302. The first game platform 302 and third game platform 312 are configured to allow the third game platform 312 to display a representation of the virtual reality game 304 on a video display as it is being played on the first game platform 302. For example, the third game platform 312 may be a tablet computer, a smart TV, a personal computer, a smart phone, or other computing device not having virtual reality capability, such that a spectator can watch the game play while a user is using the virtual reality game 304. For instance, if the user is using a virtual reality headset, only the user can see the game play, so it is useful to allow others to view the game play on the third game platform 312. It is understood that additional game platforms, same, similar, or different to the first game platform 302 or the second game platform 306, can be connected and added to the video game system 300 to allow additional clients to play the same single game instance. Similarly, additional game platforms, same, similar, or different to the third game platform 312, may be connected to the video game system 300 to allow additional observers to view the game play. Also, additional game platforms 302 may be connected to the video game system 300. In the case of additional game platforms 302, specific game platforms 312 can be slaved to various headsets, and/or game platforms 312 can be allowed to switch from one game platform 302 to another game platform 302 to view the game from different perspectives. In addition, when a client's game ends, e.g., the client's character is dead or the client's active participation in the game otherwise terminates, the system 300 allows such client to continue to view the game. For instance, the system 300 may allow such client to view the game from the perspective of one of the other clients that is still active in the game. Any of the game platforms 302 and 306 may also be configured to register a client's death and/or termination of such client's active participation in the game, for example, as a result of a “game conclusion” event. The game platform may send an “end game” signal to all of the other game platforms involved in the single game instance which indicates that the respective client is dead or has terminated active participation in the game.
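The “end game” notification described above could be broadcast as in the following sketch; the message format and platform objects are assumptions for illustration only.

    def broadcast_end_game(platforms, ended_client_id):
        # Notify every other platform in the single game instance that the
        # client is dead or has terminated active participation.
        msg = {"type": "end_game", "client": ended_client_id}
        for platform in platforms:
            if platform.client_id != ended_client_id:
                platform.send(msg)  # receivers may offer the ended client a
                                    # spectator view of a still-active player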
In some embodiments, observers viewing a game through a game platform connected to the video game system 300 may participate in or interact with game play through certain predefined audience actions, such as cheering or booing, wagering on a game play result or event, conversing or chatting with one or more players or observers, deploying game play objects into game play, or any other game play-related action. Additional non-limiting examples of audience actions may include causing an attack/defense in a combat-themed game (e.g., throwing a grenade, firing a shot, launching a missile, casting a spell into the game); helping or obstructing a player (e.g., giving/taking health, mana, ammunition, virtual currency, weapons, power-ups, etc.); giving hints to players of the game; wagering with other observers/players on the outcome of the game; cheering on a particular player.
Although particular embodiments have been shown and described, it is to be understood that the above description is not intended to limit the scope of these embodiments. While embodiments and variations of the many aspects of the invention have been disclosed and described herein, such disclosure is provided for purposes of explanation and illustration only. Thus, various changes and modifications may be made without departing from the scope of the claims. For example, not all of the components described in the embodiments are necessary, and the invention may include any suitable combinations of the described components, and the general shapes and relative sizes of the components of the invention may be modified. Accordingly, embodiments are intended to exemplify alternatives, modifications, and equivalents that may fall within the scope of the claims. The invention, therefore, should not be limited, except to the following claims, and their equivalents.

Claims (14)

What is claimed is:
1. A haptic gun for use with a virtual reality system configured to present a virtual reality experience, comprising:
a main body having a handle;
a microcontroller housed in the main body, the microcontroller comprising a processor;
a communication interface housed in the main body, the communication interface operably coupled to the microcontroller and configured to electronically communicate with the virtual reality system;
a trigger coupled to the main body, and operably coupled to the microcontroller;
a plurality of tactile feedback units coupled to the main body, each of the tactile feedback units operably coupled to the microcontroller and configured to produce a haptic feedback to a user;
a linear actuator coupled to the main body, the linear actuator operably coupled to the microcontroller and configured to provide a linear force simulating a recoil from firing the gun; and
a tracker device housed within the main body, the tracker device configured to provide tracking data to the virtual reality system representative of a motion and an aim point of the gun; and
the microcontroller is configured to: electronically communicate with the virtual reality system via the communication interface including receiving control signals from the virtual reality system for controlling operation of the tactile feedback units and the linear actuator; the control signals including at least two different weapon condition control signals, each weapon condition control signal corresponding to a different weapon condition event occurring in a virtual reality experience, the weapon condition events being events relating to the condition of the weapon, the at least two weapon condition events selected from the group consisting of (a) weapon is damaged, (b) weapon is upgraded, and (c) weapon is out of ammunition; operate the tactile feedback units and the linear actuator based on the control signals; and send a trigger signal to the virtual reality system in response to actuation of the trigger; and, in response to receiving a respective one of the weapon condition control signals, selectively actuate one or more of the tactile feedback units in a predetermined sequence corresponding to the respective weapon condition control signal, wherein the predetermined sequence corresponding to the “weapon is damaged” event is a weapon damaged sequence, the predetermined sequence corresponding to the “weapon is upgraded” event is a weapon upgrade sequence, and the predetermined sequence corresponding to the “weapon is out of ammunition” event is a weapon out of ammo sequence.
2. The haptic gun of claim 1, wherein the tracker device comprises a tracker communication interface.
3. The haptic gun of claim 1, wherein the microcontroller comprises the communication interface.
4. The haptic gun of claim 1, wherein the communication interface is a wireless communication interface operating on a wireless communication protocol.
5. The haptic gun of claim 1, wherein the weapon out of ammo sequence simulates a hammer or firing pin being actuated.
6. The haptic gun of claim 1, wherein the weapon out of ammo sequence simulates a failed attempt to fire a gun.
7. The haptic gun of claim 1, wherein the weapon out of ammo sequence simulates a hammer or firing pin being actuated and the weapon out of ammo sequence simulates a failed attempt to fire a gun.
8. A system for providing a virtual reality experience utilizing a haptic gun controller, comprising:
a virtual reality system comprising a processor, a software program, a display device, and a system communication interface, and configured to present a virtual reality experience to a user; and
a haptic gun controller comprising:
a main body having a handle;
a microcontroller housed in the main body, the microcontroller comprising a processor and a gun communication interface;
a trigger coupled to the main body, and operably coupled to the microcontroller;
a plurality of tactile feedback units coupled to the main body, each of the tactile feedback units operably coupled to the microcontroller and configured to produce a haptic feedback to a user;
a linear actuator coupled to the main body, the linear actuator operably coupled to the microcontroller and configured to provide a linear force simulating a recoil from firing the gun; and
a tracker device housed within the main body, the tracker device operably coupled to the microcontroller, the tracker device configured to provide tracking data to the virtual reality system representative of a motion and an aim point of the gun;
wherein the microcontroller is configured to: electronically communicate with the virtual reality system via the communication interface including receiving control signals from the virtual reality system for controlling operation of the tactile feedback units and the linear actuator; the control signals including at least two different weapon condition control signals, each weapon condition control signal corresponding to a different weapon condition event occurring in a virtual reality experience, the weapon condition events being events relating to the condition of the weapon, the at least two weapon condition events selected from the group consisting of (a) weapon is damaged, (b) weapon is upgraded, and (c) weapon is out of ammunition; operate the tactile feedback units and the linear actuator based on the control signals; and send a trigger signal to the virtual reality system in response to actuation of the trigger; and, in response to receiving a respective one of the weapon condition control signals, selectively actuate one or more of the tactile feedback units in a predetermined sequence corresponding to the respective weapon condition control signal, wherein the predetermined sequence corresponding to the “weapon is damaged” event is a weapon damaged sequence, the predetermined sequence corresponding to the “weapon is upgraded” event is a weapon upgrade sequence, and the predetermined sequence corresponding to the “weapon is out of ammunition” event is a weapon out of ammo sequence;
wherein the virtual reality system is configured to: communicate with the gun communication interface via the system communication interface; receive the trigger signal from the haptic gun; send the control signals to the haptic gun; receive the tracking data; and utilize the trigger signal and tracking data to control a virtual gun within the virtual reality experience corresponding to the haptic gun.
9. The system of claim 8, wherein the virtual reality system is further configured such that upon receiving a trigger signal from the gun that the trigger has been activated, determine whether a virtual gun within the virtual reality experience has ammunition, and (i) when determining the virtual gun does not have ammunition, send a control signal to the haptic gun that the ammunition is empty (not to fire); (ii) when determining that the virtual gun has ammunition, send a firing sequence control signal to the haptic gun; and
wherein the haptic gun controller is further configured to activate the linear actuator to simulate a recoil from firing the gun and to reset the linear actuator, both in response to receiving the firing sequence control signal.
10. The system of claim 9, wherein one of the weapon condition control signals is a weapon upgrade control signal corresponding to a weapon upgrade event in the virtual reality experience and the virtual reality system is further configured to send the weapon upgrade control signal to the haptic gun controller; and
wherein the predetermined sequence corresponding to the weapon upgrade control signal is a weapon upgrade sequence, and the haptic gun controller is further configured to activate one or more of the tactile feedback units and/or the linear actuator in the weapon upgrade sequence in response to receiving the weapon upgrade control signal.
11. The system of claim 8, wherein one of the weapon condition control signals is a weapon damage control signal corresponding to a weapon damage event in the virtual reality experience and the virtual reality system is further configured to send the weapon damage control signal to the haptic gun controller; and
wherein the haptic gun controller is further configured to activate one or more of the tactile feedback units and/or the linear actuator in a weapon damage sequence in response to receiving the weapon damage control signal.
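Claims 10 and 11 permit a predetermined sequence to drive the tactile feedback units and/or the linear actuator. One hypothetical way to express that in the step format sketched after claim 8 is a sentinel unit index that routes a step to the actuator instead of a tactile unit; the sentinel value and driver function are assumptions:

```c
/* Hypothetical extension of the earlier haptic_step routing for claims 10-11:
 * a sentinel index sends a step to the linear actuator rather than to one of
 * the tactile feedback units. */
#include <stdint.h>

#define LINEAR_ACTUATOR 0xFF   /* sentinel "unit" index for actuator steps */

/* Assumed drivers provided elsewhere in the firmware. */
void drive_unit(uint8_t unit, uint8_t intensity, uint16_t duration_ms);
void pulse_linear_actuator(uint8_t force, uint16_t duration_ms);

static void play_step(uint8_t unit, uint8_t intensity, uint16_t duration_ms)
{
    if (unit == LINEAR_ACTUATOR)
        pulse_linear_actuator(intensity, duration_ms);  /* e.g. the thud in a damage sequence */
    else
        drive_unit(unit, intensity, duration_ms);
}
```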
12. The system of claim 8, wherein the weapon out of ammo sequence simulates a hammer or firing pin being actuated.
13. The system of claim 8, wherein the weapon out of ammo sequence simulates a failed attempt to fire a gun.
14. The system of claim 8, wherein the weapon out of ammo sequence simulates both a hammer or firing pin being actuated and a failed attempt to fire a gun.
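Claims 12 through 14 characterize the out-of-ammo sequence as the feel of a hammer or firing pin falling on an empty chamber, i.e., a failed attempt to fire. Physically this suggests a much shorter, sharper actuation than a live-fire recoil stroke; a hypothetical comparison, with invented timing and force values:

```c
/* Hypothetical out-of-ammo "dry fire" click for claims 12-14: one brief,
 * full-force pulse, in contrast to the longer stroke used for recoil. */
#include <stdint.h>

void pulse_linear_actuator(uint8_t force, uint16_t duration_ms);  /* assumed driver */

void play_out_of_ammo_click(void)
{
    pulse_linear_actuator(255, 12);   /* short and sharp: firing pin on an empty chamber */
}

void play_recoil(void)
{
    pulse_linear_actuator(255, 60);   /* longer stroke: simulated live-fire recoil */
}
```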

Priority Applications (1)

US15/241,007, filed 2016-08-18, priority date 2016-08-18: Tactile feedback systems and methods for augmented reality and virtual reality systems

Publications (2)

US20180050268A1 (en), published 2018-02-22
US10569165B2 (en), granted 2020-02-25

Family

Family ID: 61191110

Family Applications (1)

US15/241,007 (granted as US10569165B2, en), filed 2016-08-18, priority date 2016-08-18, status Active, adjusted expiration 2037-09-06: Tactile feedback systems and methods for augmented reality and virtual reality systems

Country Status (1)

US: US10569165B2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11726568B2 (en) 2019-05-31 2023-08-15 Apple Inc. Haptics application programming interface
US11087162B2 (en) * 2019-08-01 2021-08-10 Nvidia Corporation Determining relative regions of interest in images using object detection
CN111388993A (en) * 2020-03-16 2020-07-10 NetEase (Hangzhou) Network Co., Ltd. Control method and device for a virtual reality shooting game
US20210402292A1 (en) * 2020-06-25 2021-12-30 Sony Interactive Entertainment LLC Method of haptic responses and interacting
EP4251941A4 (en) * 2020-11-30 2024-10-16 Green Ammo As Recoil simulation device
US11946716B2 (en) 2020-11-30 2024-04-02 Green Ammo As Recoil simulation device
EP4255592A1 (en) * 2020-12-04 2023-10-11 HVRT Corp. Systems and methods for shooting simulation and training
JP2024515645A (en) * 2021-04-15 2024-04-10 Niantic, Inc. Augmented Reality Hat
CN113230654B (en) * 2021-06-18 2023-02-24 Tencent Technology (Shenzhen) Co., Ltd. Shooting display method and device for a virtual gun, computer device, and storage medium
USD1044805S1 (en) 2022-08-04 2024-10-01 Niantic, Inc. Augmented reality hat
CN117742484A (en) * 2023-11-28 2024-03-22 South China University of Technology Multimodal haptic feedback control device for a virtual simulation training platform

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6509896B1 (en) * 1997-03-03 2003-01-21 Kabushiki Kaisha Sega Enterprises Image processor, image processing method, medium and game machine
US6269730B1 (en) * 1999-10-22 2001-08-07 Precision Remotes, Inc. Rapid aiming telepresent system
US20100062678A1 (en) * 2006-12-22 2010-03-11 Konami Digital Entertainment Co., Ltd. Shooting toy used in game for two or more players
US20100173686A1 (en) * 2009-01-06 2010-07-08 Immersion Corporation Programmable Game-Based Haptic Enabled Gun Controller
US20150072790A1 (en) * 2009-05-19 2015-03-12 David Kotkin Gaming accessory
US20120050198A1 (en) * 2010-03-22 2012-03-01 Bruce Cannon Electronic Device and the Input and Output of Data
US20110275435A1 (en) * 2010-05-10 2011-11-10 Torre David P Polymorphic firearm controller
US20120122576A1 (en) * 2010-11-17 2012-05-17 Sony Computer Entertainment Inc. Smart shell to a game controller
US20140087844A1 (en) * 2012-09-27 2014-03-27 Igt Gaming system and method for providing a skill-based game
US20160292924A1 (en) * 2012-10-31 2016-10-06 Sulon Technologies Inc. System and method for augmented reality and virtual reality applications
US20150278263A1 (en) * 2014-03-25 2015-10-01 Brian Bowles Activity environment and data system for user activity processing
US20160330593A1 (en) * 2015-05-08 2016-11-10 Kerry CAPERELL Alert system capable of indicating the nature of an object, person, or pet

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Firgelli Automations Team, "How Does a Linear Actuator Work," https://www.firgelliauto.com/blogs/news/how-does-a-linear-actuator-work, Nov. 17, 2014 (Year: 2014). *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11325030B2 (en) * 2018-09-20 2022-05-10 Protubevr Game pad holder for virtual reality video games, comprising a mechanical force feedback device
WO2022046375A1 (en) * 2020-08-25 2022-03-03 Sony Interactive Entertainment LLC Positional haptics via head-mounted peripheral
US11291913B2 (en) 2020-08-25 2022-04-05 Sony Interactive Entertainment LLC Positional haptics via head-mounted peripheral

Similar Documents

Publication Publication Date Title
US10155159B2 (en) Tactile feedback systems and methods for augmented reality and virtual reality systems
US10569165B2 (en) Tactile feedback systems and methods for augmented reality and virtual reality systems
JP6877249B2 (en) Systems and methods that provide complex tactile stimuli during and in connection with the operation of virtual devices during operation gesture input.
EP1825209B1 (en) Instructor-lead training environment and interfaces therewith
US6752720B1 (en) Mobile remote control video gaming system
US7126607B2 (en) Electronic game and method for effecting game features
CN110585712A (en) Method, device, terminal and medium for throwing virtual explosives in virtual environment
US20080096654A1 (en) Game control using three-dimensional motions of controller
CN111001159B (en) Virtual item control method, device, equipment and storage medium in virtual scene
JP5138520B2 (en) Program and game device
WO2021227733A1 (en) Method and apparatus for displaying virtual prop, and device and storage medium
US10928915B2 (en) Distributed storytelling environment
CN111359206B (en) Virtual object control method, device, terminal and storage medium
CN112933601B (en) Virtual throwing object operation method, device, equipment and medium
CN112121431A (en) Interactive processing method and device of virtual prop, electronic equipment and storage medium
JP7071823B2 (en) Simulation system and program
CN114130031A (en) Using method, device, equipment, medium and program product of virtual prop
CN111202983A (en) Method, device, equipment and storage medium for using props in virtual environment
CN112316430B (en) Prop using method, device, equipment and medium based on virtual environment
US20180050269A1 (en) Systems and methods for providing a single virtual reality game play instance for multiple clients using different game platforms
US10369487B2 (en) Storytelling environment: mapping virtual settings to physical locations
KR20190024059A (en) System for immersive virtual reality game with wearable device and method using the same
JP2015160114A (en) Competition type shooting game apparatus
KR102583169B1 (en) Shooting game apparatus using the drone
CN114288678A (en) Interactive processing method and device for virtual scene, electronic equipment and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: ACTIVISION PUBLISHING, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JONES, ALLAN;REEL/FRAME:039751/0839

Effective date: 20160801

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4