US20150268722A1 - Systems and Methods for a Shared Haptic Experience
- Publication number
- US20150268722A1 (application US 14/219,882)
- Authority
- US
- United States
- Prior art keywords
- haptic
- user
- haptic effect
- electronic device
- processor
- Prior art date
- Legal status
- Granted
Classifications
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- A63F13/216—Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
- A63F13/235—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
- A63F13/33—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/837—Shooting of targets
- A63F13/843—Special adaptations for executing a specific game genre or game mode involving concurrently two or more players on the same game device, e.g. requiring the use of a plurality of controllers or of a specific view of game data for each player
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- A63F2300/1037—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
Definitions
- the present invention relates to the field of user interface devices. More specifically, the present invention relates to systems and methods for providing a shared haptic experience.
- Computer users continue to desire a more interactive experience. For example, as video games become more interactive, demand for multiplayer games, wherein users can play with or against each other, has increased. Users may play video games with or against each other in a multitude of ways.
- One common way for users to play multiplayer games is through a single game console, such as a Sony PlayStation, in which all the users are located in close proximity to one another, often in the same room, and manipulate virtual characters through handheld controllers connected to the game console.
- Users also commonly play multiplayer games over the Internet, wherein users play with or against each other from sometimes remote corners of the world, often via different kinds of devices, such as computers, game consoles, and smart phones.
- Some multiplayer games and game systems may allow players to share audio and video content with one another. While various techniques have been used to improve the multiplayer gaming experience, there is a need for multiplayer games, game systems, and similar collaborative computing environments to allow users to share their haptic content in order to enhance the interactive and collaborative nature of the system.
- Embodiments of the present disclosure comprise systems and methods for providing a shared haptic experience.
- a system of the present disclosure may comprise a processor configured to: receive a first haptic effect signal, the first haptic effect signal based at least in part on a haptic event and adapted to cause a first haptic effect to be output to a first user, determine a second haptic effect based at least in part on the first haptic effect and a characteristic independent of the haptic event, generate a second haptic effect signal based at least in part on the second haptic effect, and transmit the second haptic effect signal to a second haptic output device.
- the system may further comprise a second haptic output device in communication with the processor, wherein the second haptic output device is configured to receive the second haptic effect signal and output the second haptic effect to a second user.
- a method of the present disclosure may comprise: receiving a first haptic effect signal, the first haptic effect signal based at least in part on a haptic event and adapted to cause a first haptic effect to be output to a first user, determining a second haptic effect based at least in part on the first haptic effect and a characteristic independent of the haptic event, generating a second haptic effect signal based at least in part on the second haptic effect, and transmitting the second haptic effect signal to a second haptic output device.
- Yet another embodiment comprises a computer-readable medium for implementing such a method.
- FIG. 1 is a block diagram showing a system for providing a shared haptic experience in one embodiment
- FIG. 2 is another block diagram showing one embodiment of a system for providing a shared haptic experience
- FIG. 3 shows an external view of a system for providing a shared haptic experience in one embodiment
- FIG. 4 shows another external view of a system for providing a shared haptic experience in one embodiment
- FIG. 5 shows one embodiment of an external view of a system for providing a shared haptic experience
- FIG. 6 shows an external view of a system for providing a shared haptic experience in one embodiment
- FIG. 7 is a flowchart showing a method for providing a shared haptic experience in one embodiment.
- FIG. 8 is a flowchart showing another method for providing a shared haptic experience in one embodiment.
- the gaming system includes one or more game consoles or other computing systems that are in communication with user interface devices, such as a game controller, smart phone, or tablet.
- Such gaming systems may include, for example, the Microsoft Xbox, Sony PlayStation, Nintendo Wii, or the Sega Zone.
- the user interface devices may comprise and/or may be in communication with one or more user input elements.
- Such elements may include, for example, a button, joystick, camera, gyroscope, accelerometer, or touch-sensitive surface, any of which can be used to detect a user input alone or in combination with one another.
- the user input device also comprises and/or may be in communication with one or more haptic output devices.
- the haptic output device receives a signal from the gaming system and outputs a haptic effect to a user.
- Each haptic output device may include one or more actuators, such as an eccentric rotating mass (ERM) motor for providing a vibratory effect.
- a first user interacts with the gaming system through a user input device, such as a game controller, to control the actions of an avatar on the screen during a game.
- the first user controls the first user's character to achieve some goal, such as advancing through a level.
- the first user may experience one or more haptic effects in response to the game events.
- the first user's virtual character gets shot, and in response, a haptic effect, such as a vibration, is output to the first user's controller.
- the gaming system is adapted to share the first user's haptic effects with one or more other users.
- the characteristics of the haptic effect transmitted to a second user are based, at least in part, on the haptic effect generated for the first user and on other factors.
- the gaming system may generate an effect to be delivered to the second user by starting with the haptic effect generated for the first user and modifying that effect based on the relative position of the first user's virtual character to the second user's virtual character in the video game.
- the strength of the haptic effect transmitted to the second user is inversely proportional to the relative distance between the first user's virtual character and the second user's virtual character in virtual space.
- For example, when the first user's virtual character is far from the second user's virtual character, the haptic effect transmitted to the second user is weaker than if the first user's virtual character were standing three feet from the second user's virtual character.
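The following Python sketch illustrates one way the inverse-distance scaling described above could work; the function name, the three-foot reference distance, and the full-strength cutoff are illustrative assumptions rather than anything specified by the disclosure.

```python
# Illustrative sketch only: attenuate a shared haptic effect's strength
# inversely with the virtual distance between the two characters.
# The reference distance and names are assumptions, not from the patent.

def scale_shared_intensity(base_intensity: float,
                           distance: float,
                           reference_distance: float = 3.0) -> float:
    """Intensity for the second user's effect.

    At or within `reference_distance` (e.g., three virtual feet) the
    effect plays at full strength; beyond it, strength falls off as
    1/distance.
    """
    if distance <= reference_distance:
        return base_intensity
    return base_intensity * (reference_distance / distance)

# A full-strength (1.0) effect felt by the first user is attenuated to
# roughly 0.1 for a second user whose character is 30 virtual feet away.
print(scale_shared_intensity(1.0, 30.0))  # 0.1
```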
- FIG. 1 is a block diagram showing a system 100 for providing a shared haptic experience in one embodiment.
- system 100 comprises a computing device 101 having a processor 102 in communication with other hardware via bus 106 .
- a memory 104 which can comprise any suitable tangible (and non-transitory) computer-readable medium such as RAM, ROM, EEPROM, or the like, embodies program components that configure operation of the computing device 101 .
- computing device 101 further comprises one or more network interface devices 110 , input/output (I/O) interface components 112 , and additional storage 114 .
- Network device 110 can represent one or more of any components that facilitate a network connection. Examples include, but are not limited to, wired interfaces such as Ethernet, USB, IEEE 1394, and/or wireless interfaces such as IEEE 802.11, Bluetooth, or radio interfaces for accessing cellular telephone networks (e.g., transceiver/antenna for accessing a CDMA, GSM, UMTS, or other mobile communications network).
- I/O components 112 may be used to facilitate connection to devices such as one or more displays, keyboards, mice, joysticks, video game controllers, buttons, speakers, microphones, and/or other hardware used to input data or output data.
- Storage 114 represents nonvolatile storage such as magnetic, optical, or other storage media included in device 101 .
- Display 116 may be used to facilitate the output of one or more images, and may comprise, for example, a television set, a touchscreen display, a computer monitor, or a projector.
- computing device 101 is in communication with two external electronic devices (hereinafter “external devices”), first electronic device 118 and second electronic device 120 .
- computing device 101 may be in communication with any number of external electronic devices.
- these external devices may be similar, such as game controllers for use with a single game console, like a Sony PlayStation.
- the external devices may be of different types, such as smart phones, tablets, e-readers, laptop computers, desktop computers, or wearable devices.
- Although computing device 101 and first electronic device 118 are illustrated in FIG. 1 as separate devices, the computing device 101 and first electronic device 118 may comprise a single integrated device capable of performing the functions described in relation to computing device 101 as well as serving as an input and output device for the user.
- First electronic device 118 and second electronic device 120 comprise a first haptic output device 122 and a second haptic output device 130 , respectively.
- These haptic output devices may be configured to output haptic effects, for example, vibrations, changes in a perceived coefficient of friction, simulated textures, or surface deformations in response to haptic signals.
- haptic output devices 122 and 130 may provide haptic effects that move the surfaces of the external devices in a controlled manner.
- haptic effects may utilize an actuator coupled to a housing of the external devices, and some haptic effects may use multiple actuators in sequence and/or in concert.
- haptic output devices 122 and 130 may comprise one or more of, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA).
- haptic output devices 122 and 130 may use electrostatic attraction, for example by use of an electrostatic surface actuator, to simulate a texture or to vary the coefficient of friction the user feels when moving his or her finger across the surface of the external devices.
- haptic output devices 122 and 130 comprise a device that applies voltages and currents instead of mechanical motion to generate a haptic effect.
- the electrostatic actuator comprises a conducting layer and an insulating layer.
- the conducting layer may be any semiconductor or other conductive material, such as copper, aluminum, gold, or silver.
- the insulating layer may be glass, plastic, polymer, or any other insulating material.
- the processor 102 may operate the electrostatic actuator by applying an electric signal to the conducting layer.
- the electric signal may be an AC signal that is generated by a high-voltage amplifier.
- the AC signal may capacitively couple the conducting layer with an object near or touching the surface of the external devices. The capacitive coupling may simulate a friction coefficient or texture on the surface of the external devices.
- the surface of first electronic device 118 is smooth, but the capacitive coupling may produce an attractive force between an object, such as a user's hand, and the surface of first electronic device 118 .
- varying the levels of attraction between the object and the conducting layer can create or vary a simulated texture on an object moving across the surface of the external devices.
- an electrostatic actuator may be used in conjunction with traditional actuators to create or vary a simulated texture on the surface of the external devices.
- haptic effects may be output using a flexible surface layer configured to vary its texture based upon contact from a surface reconfigurable haptic substrate (including, but not limited to, e.g., fibers, nanotubes, electroactive polymers, piezoelectric elements, or shape memory alloys) or a magnetorheological fluid.
- haptic effects may be output by raising or lowering one or more surface features, for example, with a deforming mechanism, air or fluid pockets, local deformation of materials, resonant mechanical elements, piezoelectric materials, micro-electromechanical systems (“MEMS”) elements, thermal fluid pockets, MEMS pumps, variable porosity membranes, or laminar flow modulation.
- first haptic output device 122 is embedded in a device external to computing device 101, i.e., first electronic device 118.
- first haptic output device 122 may be embedded within computing device 101 .
- computing device 101 may comprise a laptop computer further comprising first haptic output device 122 .
- first and second haptic output devices 122 and 130 are depicted as single devices, each haptic output device may comprise one or more haptic output devices.
- first haptic output device 122 may comprise two or more actuators to provide different types of effects to a user. It will be recognized by those of ordinary skill in the art that other embodiments may contain additional configurations of the first haptic output device 122 , second haptic output device 130 , and computing system 101 .
- detection module 124 configures processor 102 to monitor a virtual environment, such as a video game environment, for a haptic event, such as a virtual gun shot.
- module 124 may sample videogame data to track the presence of a haptic event and, if a haptic event is present, to track one or more of the type, duration, location, intensity, and/or other characteristics of the haptic event.
- detection module 124 configures processor 102 to monitor the virtual environment for the receipt of haptic content from other players or for the triggering of a sharing event, such as a button press, which may indicate that computing device 101 should replicate (i.e., generate substantially the same effect as) another user's haptic content.
- module 124 may sample network 110 data to track the presence of another user's shared haptic content.
- detection module 124 may track one or more of the type, duration, location, and/or other characteristics of the one or more haptic effects.
- Haptic effect determination module 126 represents a program component that analyzes data regarding the shared haptic effect in order to determine a haptic effect to locally generate. Particularly, haptic effect determination module 126 may comprise code that determines, based on the type, duration, location, and/or other characteristics of the shared haptic content, a haptic effect to locally output. Haptic effect determination module 126 may further base this determination on the relative position of a first user to a second user in real space, the relative position of a first user to a gaming device in real space, the relative position of a virtual character controlled by a first user to a virtual character controlled by a second user in a virtual environment, or a variety of other real or virtual environmental characteristics.
- haptic effect determination module 126 may determine that, because of a large distance between a first user's virtual character and a second user's virtual character, the proper effect to output to the second user is a short, mild vibration.
- haptic effect determination module 126 may comprise code that determines which actuators to use in order to generate the haptic effect.
- the second user's gaming device may comprise four actuators; two actuators vertically aligned on the left side of the gaming device and two actuators vertically aligned on the right side of the user's gaming device.
- haptic effect determination module 126 may determine that because a first user's virtual character is positioned northwest of a second user's virtual character in a virtual environment, only the actuator in the front, left side of the second user's gaming device should be actuated to generate the haptic effect.
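One plausible way to map a relative bearing to one of four corner actuators, as in the northwest example above, is sketched below; the actuator layout, coordinate convention, and names are assumptions for illustration only.

```python
# Illustrative sketch only: choose one of four corner actuators from the
# offset of the haptic event's source relative to the second user's
# character. The actuator layout and names are assumptions.

ACTUATORS = {
    (True, True): "front_left",
    (True, False): "front_right",
    (False, True): "rear_left",
    (False, False): "rear_right",
}

def select_actuator(dx: float, dy: float) -> str:
    """dx, dy: offset of the source relative to the second user's
    character (negative dx = west/left, positive dy = north/front)."""
    in_front = dy >= 0
    on_left = dx < 0
    return ACTUATORS[(in_front, on_left)]

# A source to the northwest (dx < 0, dy > 0) maps to the front-left
# actuator, matching the example above.
print(select_actuator(-10.0, 10.0))  # front_left
```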
- haptic effect determination module 126 may comprise code that determines how best to output a local haptic effect that is substantially similar to the shared haptic content. For example, if the shared haptic content comprises a series of vibrations of varying intensity, haptic effect determination module 126 may determine how best to locally output an effect that is substantially similar to a series of vibrations of varying intensity. In such an embodiment, for example, haptic effect determination module 126 may determine that first electronic device 118 does not have the vibratory hardware, such as an ERM or LRA, to directly implement a series of vibrations. In one such embodiment, haptic effect determination module 126 may determine that the closest sensation to a series of vibrations of varying intensity that can be output is a series of changes in the coefficient of friction at the surface of the first electronic device 118 via an ESF actuator.
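A minimal sketch of this capability fallback, assuming a simple capability-set model of the local hardware; all names and the amplitude-to-friction mapping are hypothetical.

```python
# Illustrative sketch only: if the local device lacks vibratory hardware
# (ERM/LRA), render the shared vibration envelope as a series of friction
# changes on an ESF actuator. The capability-set model is an assumption.

def render_shared_effect(effect_type: str,
                         samples: list[float],
                         capabilities: set[str]) -> tuple[str, list[float]]:
    """Return (output_modality, signal) best matching the shared effect."""
    if effect_type == "vibration":
        if "erm" in capabilities or "lra" in capabilities:
            return "vibration", samples          # play back directly
        if "esf" in capabilities:
            # Map vibration amplitudes onto friction-coefficient levels.
            return "friction", [0.2 + 0.8 * s for s in samples]
    raise ValueError("no suitable haptic hardware for effect")

# A device with only an ESF actuator plays a vibration envelope back as
# changes in the perceived coefficient of friction.
print(render_shared_effect("vibration", [0.1, 0.9, 0.4], {"esf"}))
```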
- Haptic effect generation module 128 represents programming that causes processor 102 to generate and transmit a haptic signal to a haptic output device, such as first haptic output device 122 , to generate the determined haptic effect.
- generation module 128 may access stored waveforms or commands to send to first haptic output device 122 .
- haptic effect generation module 128 may receive a desired type of effect and utilize signal processing algorithms to generate an appropriate signal to send to first haptic output device 122 .
- a desired texture may be indicated to be output at the surface of first electronic device 118, along with target coordinates for the texture, and an appropriate waveform sent to one or more actuators to generate appropriate displacement of the surface of first electronic device 118 (and/or other device components) to provide the texture.
- in one such embodiment, the first electronic device 118 is a smart phone comprising a touch screen display.
- Some embodiments may utilize multiple haptic output devices in concert to generate the haptic effect.
- first haptic output device 122 may comprise a plurality of haptic output devices, wherein in order to generate a desired effect, one haptic output device changes the perceived coefficient of friction at the surface of the first electronic device 118 while another haptic output device vibrates the surface of the first electronic device 118 .
- FIG. 2 is another block diagram showing one embodiment of a system for providing a shared haptic experience.
- the system comprises a first electronic device 202 and a second electronic device 204 .
- the system may comprise any number of electronic devices.
- a first user may use first electronic device 202 to control a virtual character in a videogame
- a second user may use second electronic device 204 to control a different virtual character in the videogame.
- Second electronic device 204 comprises a “share” button 214 for triggering sharing of the second user's haptic content with the first user.
- First electronic device 202 comprises a first haptic output device 206 , which in turn comprises actuators 210 for outputting a haptic effect.
- second electronic device 204 comprises a second haptic output device 208 , which in turn comprises actuators (not shown) for outputting a haptic effect.
- first haptic output device 206 comprises two actuators on the right side of first electronic device 202 and two actuators on the left side of first electronic device 202 .
- the two actuators on each side of first electronic device 202 are horizontally aligned.
- the two actuators on each side of first electronic device 202 may be vertically aligned.
- actuators 210 may be of a similar type.
- actuators 210 may be of different types. It will be recognized by those skilled in the art that any number, type, and arrangement of such actuators may be possible.
- first electronic device 202 may be different in type from second electronic device 204 .
- first electronic device 202 comprises a smart phone and second electronic device 204 comprises a game console controller.
- first electronic device 202 and second electronic device 204 may be any combination of laptop computers, game consoles, desktop computers, smart phones, tablets, e-readers, portable gaming systems, game console controllers, personal digital assistants, or other electronic devices.
- first haptic output device 206 and/or actuators 210 may be of a different type than second haptic output device 208 and/or actuators contained therein.
- first haptic output device 206 comprises a series of ERMs
- second haptic output device 208 comprises a mechanism for deforming the surface of second electronic device 204 .
- FIG. 3 shows an external view of a system for providing a shared haptic experience in one embodiment.
- the system comprises a first electronic device 302 , here a game console controller, which comprises a first haptic output device.
- the first electronic device 302 is in communication with gaming system 306 .
- a first user may be playing, for example, an army game on gaming system 306 .
- the first user may control a virtual marine using first electronic device 302 .
- a second user may also be playing the army game via second electronic device 304 , here also a game console controller, and through which he or she can control his or her own virtual marine.
- gaming system 306 may output various haptic effects to second electronic device 304 as the second user plays the army game. For example, if the second user's virtual character gets shot, gaming system 306 may cause second electronic device 304 to vibrate.
- the first user controlling first electronic device 302 may receive modified versions of the haptic effects sent to the second user.
- first electronic device 302 may output haptic effects modified based on the relative position of first electronic device 302 to second electronic device 304 in real space. For example, in one embodiment, first electronic device 302 outputs haptic effects with strengths inversely proportional to the relative distance 310 between the first electronic device 302 and second electronic device 304 in real space.
- first electronic device 302 may output haptic effects modified based on the relative position of second electronic device 304 to gaming system 306 in real space. For example, in one such embodiment, first electronic device 302 outputs haptic effects with strengths inversely proportional to the relative distance 308 between second electronic device 304 and gaming system 306 in real space. Thus, if the second user is holding second electronic device 304 ten feet from the gaming system 306, the haptic effect transmitted to the first user is weaker than if the second user were holding second electronic device 304 three feet from the gaming system 306.
- FIG. 4 shows another external view of a system for providing a shared haptic experience in one embodiment.
- the system comprises a first electronic device 402 , here a game console controller, which comprises a first haptic output device 122 .
- the first electronic device 402 is in communication with gaming system 406 .
- a first user may be playing, for example, an army game on gaming system 406 .
- the first user may control a first character 408 in the game, here a marine, using first electronic device 402 .
- a second user may also be playing the army game on gaming system 406 .
- the second user may control a second character 410 in the game using second electronic device 404 , here also a game console controller, which comprises a second haptic output device.
- gaming system 406 may output various haptic effects to second electronic device 404 as the second user plays the army game. For example, if the second user's virtual character is near an explosion, gaming system 406 may cause second electronic device 404 to vibrate via the second haptic output device.
- the first electronic device 402 may output modified versions of the haptic effects sent to the second user.
- first electronic device 402 may output haptic effects modified based on the relative position of first character 408 to second character 410 in the virtual environment. For example, in one embodiment, first electronic device 402 outputs haptic effects with strengths inversely proportional to the relative virtual distance between first character 408 and second character 410 . In such an embodiment, as the virtual distance between the two virtual characters increases, the strength of the haptic effect output to first electronic device 402 decreases proportionately.
- the first electronic device 402 may output versions of the haptic effects sent to the second user that are modified based on the relative size of first character 408 to second character 410 in a virtual environment. For example, if the first character 408 is standing and the second character 410 is kneeling or crawling, first electronic device 402 may output haptic effects with strengths that are intensified compared to those output by second electronic device 404 . In another embodiment, first electronic device 402 may output intensified haptic effects if second character's 410 virtual size is larger than first character's 408 virtual size.
- For example, if second character's 410 virtual size is substantially larger than first character's 408, first electronic device 402 will output a substantially intensified haptic effect, such as a long, intense vibration, due to the virtual size differential.
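A sketch combining the two modifiers discussed for FIG. 4, distance attenuation and size-based intensification; the specific formula, constants, and clamping are illustrative assumptions.

```python
# Illustrative sketch only: attenuate with virtual distance, then scale by
# the ratio of the two characters' virtual sizes, clamping to the device's
# output range. The formula and constants are assumptions.

def modify_for_second_user(first_intensity: float,
                           virtual_distance: float,
                           first_size: float,
                           second_size: float) -> float:
    distance_factor = 1.0 / max(virtual_distance, 1.0)  # inverse falloff
    size_factor = second_size / first_size              # larger source => stronger
    return min(first_intensity * distance_factor * size_factor, 1.0)

# A much larger character (size 5.0) two virtual meters away still yields
# an intensified, here fully saturated, effect for the other player.
print(modify_for_second_user(0.6, 2.0, 1.0, 5.0))  # 1.0 (clamped)
```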
- FIG. 5 shows one embodiment of an external view of a system for providing a shared haptic experience.
- the system comprises a first electronic device 502 , here a game console controller, which comprises a first haptic output device.
- the first electronic device 502 is in communication with gaming system 506 .
- a first user may be playing, for example, an army game on gaming system 506 .
- the first user may control a first character 508 in the game, here a marine, using first electronic device 502 .
- a second user may also be playing the army game on gaming system 506 .
- the second user may control a second character 510 in the game using second electronic device 504 , here also a game console controller, which comprises a second haptic output device.
- gaming system 506 may output various haptic effects to second electronic device 504 as the second user plays the army game. For example, if the second user's virtual character is driving a virtual tank over a bumpy road, gaming system 506 may cause second haptic output device to vibrate.
- first electronic device 502 may output modified versions of the haptic effects sent to the second user. In some such embodiments, the modifications may be based on a virtual environmental characteristic 512 .
- virtual environmental characteristic 512 may comprise one or more of a characteristic of an object or barrier, an ambient temperature, a humidity level, or a density of a medium in which a character is located.
- In FIG. 5, environmental characteristic 512 comprises a barrier that is a virtual brick wall.
- first electronic device 502 may output versions of haptic effects sent to second electronic device 504 with dampened strengths because environmental characteristic 512 , a brick wall, is positioned between first character 508 and second character 510 .
- environmental characteristic 512 may comprise the medium in which the first and/or second characters 508 or 510 are located.
- the first character 508 may be swimming in water while the second character 510 is on land.
- the haptic effect transmitted to the first electronic device 502 may be a dampened version of the haptic effect transmitted to the second electronic device 504 because water is denser than air.
- the environmental characteristic 512 may comprise physical properties, like the Doppler effect. For example, in one such embodiment, as second character 510 drives past first character 508 in a virtual car, first electronic device 502 outputs versions of haptic effects sent to second electronic device 504 with characteristics modified based on the Doppler effect.
- environmental characteristic 512 may comprise a virtual ambient temperature or humidity.
- first electronic device 502 may output versions of haptic effects sent to second electronic device 504 with characteristics modified based on the virtual ambient temperature or humidity.
- first electronic device 502 outputs versions of haptic effects sent to second electronic device 504 with their strengths dampened because environmental characteristic 512 comprises high virtual humidity.
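One way to model these environmental modifiers is as a chain of multiplicative attenuation factors, as sketched below; every constant in the table is an illustrative assumption, not a value from the disclosure.

```python
# Illustrative sketch only: model environmental characteristics as
# multiplicative attenuation factors applied to the shared effect.
# Every constant here is an assumption, not a value from the disclosure.

ATTENUATION = {
    "brick_wall": 0.5,      # barrier between the two characters
    "water": 0.6,           # receiving character is in a denser medium
    "high_humidity": 0.8,   # high virtual humidity dampens the effect
}

def apply_environment(intensity: float, characteristics: list[str]) -> float:
    for c in characteristics:
        intensity *= ATTENUATION.get(c, 1.0)  # unknown factors pass through
    return intensity

# An effect crossing a brick wall to a character swimming in water:
print(apply_environment(1.0, ["brick_wall", "water"]))  # 0.3
```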
- environmental characteristic 512 may be present in real space.
- environmental characteristic 512 may comprise one or more of an ambient temperature, a characteristic of a barrier, a humidity level, or a density of a medium in real space.
- first electronic device 502 comprises a temperature sensor.
- first electronic device 502 can determine the temperature in real space, such as the room in which users are playing the army video game, and vary its haptic output based on the temperature determination.
- first electronic device 502 may output versions of haptic effects sent to second electronic device 504 modified based on the temperature in real space.
- first electronic device 502 may output versions of haptic effects sent to second electronic device 504 with characteristics modified based on a physical obstruction in real space, like a real brick wall between first electronic device 502 and second electronic device 504 .
- FIG. 6 shows an external view of a system for providing a shared haptic experience in one embodiment.
- the system comprises a first electronic device 602 , which in this example is a game console controller, in communication with a computing device, which in this example is a gaming system 606 .
- Gaming system 606 is connected to the internet 608 for multiplayer gameplay.
- a first user may be playing, for example, a basketball game on gaming system 606 and may control his virtual basketball player using first electronic device 602 .
- a second user may be playing the basketball game via a second electronic device 604 , which in this example is a smartphone, such as an iPhone or Android phone.
- second electronic device 604 is wirelessly connected to the internet 608 for multiplayer gameplay.
- First electronic device 602 comprises a first haptic output device and second electronic device 604 comprises a second haptic output device.
- gaming system 606 may output various haptic effects to first electronic device 602 as the first user plays the basketball game. For example, if the first user's virtual character takes a shot that bounces off the rim of the basketball net, gaming system 606 may cause first electronic device 602 to vibrate.
- first electronic device 602 may comprise a “share” button, through which the first user may initiate the sharing of his haptic content with the second user.
- the first user may press a “share” button on the first electronic device 602 , indicating he or she wants to share his or her haptic feedback with a second user.
- the gaming system 606 may generate an effect to be delivered to the second electronic device 604 that is substantially the same as the effect that was delivered to the first electronic device 602 .
- the second user may not actually be participating in playing the basketball game, but rather may be simply observing in order to learn how to play the game.
- the first user may press the “share” button on first electronic device 602 , triggering haptic sharing among the two users.
- second electronic device 604 replicates any haptic effects delivered to first electronic device 602 , such as vibrations, as a result of gameplay.
- the first user may share not only haptic content with the second user by pressing the “share button,” but also his video data, audio data, and/or gameplay controls.
- the second user may take over control of the first user's virtual character and the first user may become an observer.
- second electronic device 604 may replicate any haptic effects delivered to first electronic device 602 , such as vibrations, as a result of gameplay.
- a software-generated event may trigger sharing of a first user's haptic content with a second user.
- the game system 606 may initiate sharing of a first user's haptic feedback with a second user upon the death of the second user's virtual character in a multiplayer game.
- the first user may be playing a virtual basketball game against the second user. If the second user commits a virtual foul against the first user, the first user may be entitled to two “free throws,” in which the first user may take unopposed shots from a “foul line” on the virtual basketball court.
- the game may disable the second user's controls and change the second user's virtual perspective to that of the first user while the first user is allowed to take his or her free throws.
- the change in virtual perspective may automatically trigger sharing of the first user's haptic content with the second user.
- the first electronic device 602 may output a haptic effect, such as a vibration.
- the second electronic device 604 outputs a similar haptic effect.
- the first user may press the “share” button on first electronic device 602 , which may begin recording of his or her haptic content.
- the second user may be able to trigger a playback event, such as by a button press, and subsequently play back the haptic content on second electronic device 604 .
- second electronic device 604 replicates the saved haptic content as closely as possible. For example, a first user may press a “share” button on a game controller indicating he or she wants to share his or her haptic content with a second user. Thereafter, haptic content (e.g., haptic effects) generated for the first user is recorded.
- the saved haptic content is delivered to the second user.
- the first user's audio and/or video content may be recorded in addition to the first user's haptic content when he or she presses the “share” button.
- the saved haptic content as well as the saved audio and/or video content may be delivered to the second user.
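A minimal sketch of the record-and-playback flow described above, assuming a simple timestamped event buffer; the class and method names are hypothetical.

```python
# Illustrative sketch only: record timestamped haptic effects after a
# "share" press, then replay them with the original timing on a playback
# event. Class and method names are hypothetical.
import time

class HapticRecorder:
    def __init__(self):
        self.events = []   # list of (offset_seconds, effect) pairs
        self._start = None

    def start(self):
        """Begin recording, e.g., when the user presses "share"."""
        self._start = time.monotonic()
        self.events.clear()

    def record(self, effect: dict):
        if self._start is not None:
            self.events.append((time.monotonic() - self._start, effect))

    def playback(self, output_device):
        """Replay saved effects, preserving their relative timing."""
        begin = time.monotonic()
        for offset, effect in self.events:
            time.sleep(max(0.0, offset - (time.monotonic() - begin)))
            output_device(effect)

recorder = HapticRecorder()
recorder.start()
recorder.record({"type": "vibration", "intensity": 0.8, "duration_s": 0.2})
recorder.playback(lambda effect: print("replaying", effect))
```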
- Although FIGS. 3-6 depict only a first electronic device and a second electronic device, in some embodiments, a plurality of such devices may be used to output haptic effects of the types described throughout this specification.
- the system may comprise one or more automobiles.
- a first user may be, for example, driving a first automobile on the highway.
- the first user may control the first automobile via a steering wheel, which may comprise a first haptic output device.
- the first automobile is in communication with the first haptic output device.
- a second user may also be driving a second automobile on the highway.
- the second user may control the second automobile via a steering wheel, which may comprise a second haptic output device.
- the second automobile is in communication with the second haptic output device.
- one or both automobiles may have blind spot detection enabled, in which an automobile can detect if another vehicle is in its blind spot and output an associated alert to one or both drivers.
- the first automobile may detect the presence of the second automobile in the first user's blind spot. Based on this detection, the first user's automobile may cause the first haptic output device to output a haptic effect to the first user.
- the haptic effect may comprise a vibration.
- the magnitude of the vibration may change based on the distance between the first and second automobiles. For example, in one such embodiment, the first user may activate his left blinker signal.
- the first user's automobile may detect the presence of the second automobile in the first user's blind spot, determine that the distance between the first and second automobiles is half a meter, and output a haptic effect via the first haptic output device comprising an intense (e.g., high magnitude) vibration.
- the haptic effect may be output on the side of the first user's steering wheel corresponding to the side of the first automobile on which the second automobile is detected. For example, if the second automobile is detected in the blind spot on the left side of the first automobile, the first automobile may output a haptic effect on the left side of the steering wheel.
- the second automobile may output haptic effects based on the haptic effects sent to the first user in the first automobile.
- the second automobile may output a version of the haptic effect sent to the first user in which the location on the steering wheel that the first haptic effect was output is modified. For example, if the first haptic effect is output to the first user on the left side of the first user's steering wheel, the modified haptic effect may be output to the second user on the right side of the second user's steering wheel.
- the second automobile may output a version of the first haptic effect sent to the first user in which the magnitude of the first haptic effect is modified.
- the second automobile outputs a version of the first haptic effect with the magnitude reduced by 50%.
- how the first haptic effect is modified in order to generate the second haptic effect may change as the distance between the two automobiles changes. For example, in some embodiments, if the two automobiles are more than one meter apart, the second automobile may output a version of the first haptic effect modified such that its magnitude is reduced by 50%. As the two automobiles move closer together, the amount that the magnitude of the first haptic effect is reduced in order to generate the second haptic effect may decrease, so that by the time the two automobiles are within two-tenths of a meter, there is no magnitude reduction between the first haptic effect and the second haptic effect.
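The distance-dependent reduction just described (50% at one meter or more, easing linearly to no reduction at two-tenths of a meter) can be sketched directly; only the function and variable names are assumptions.

```python
# Sketch of the distance-dependent reduction described above: a 50%
# magnitude reduction at separations of one meter or more, easing linearly
# to no reduction at 0.2 m. Only the names are assumptions.

def second_effect_magnitude(first_magnitude: float,
                            separation_m: float) -> float:
    if separation_m >= 1.0:
        reduction = 0.5
    elif separation_m <= 0.2:
        reduction = 0.0
    else:
        # Linear interpolation between the two endpoints.
        reduction = 0.5 * (separation_m - 0.2) / (1.0 - 0.2)
    return first_magnitude * (1.0 - reduction)

print(second_effect_magnitude(1.0, 1.5))  # 0.5  (far apart)
print(second_effect_magnitude(1.0, 0.6))  # 0.75 (halfway)
print(second_effect_magnitude(1.0, 0.1))  # 1.0  (very close)
```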
- It will be recognized that haptic effects may be shared among a plurality of automobiles, and that a multitude of other haptic triggering events (e.g., a change in the automobile's radio station, a GPS navigation event, pressing the brakes or the gas pedal, the failure of an automobile component, or a low car battery), haptic output device configurations (e.g., placing the haptic output devices in the gear shifter, the brake or gas pedal, or a car seat), and haptic effects (e.g., a perceived change in a coefficient of friction or a texture) are possible.
- the system may comprise a virtual training program.
- an expert may use a first electronic device, which comprises a first haptic output device, to perform a task (e.g., a surgery).
- haptic effects may be delivered to the expert via the first haptic output device upon the occurrence of an event (e.g., if the expert touches a specific portion of the patient's body).
- a student may use a second electronic device, which comprises a second haptic output device, to learn how to perform the task.
- the haptic content delivered to the first haptic output device is immediately transmitted to the second electronic device, which outputs the haptic effect.
- the expert may be able to record the haptic content, as well as any video and/or audio content, for subsequent playback upon the occurrence of a playback event.
- the student can initiate the playback event by, for example, pressing a button, which delivers the saved haptic content, and any saved audio and/or video content, to the second electronic device.
- the second electronic device then delivers the haptic content to the second haptic output device, which outputs the haptic effects to the student.
- the second electronic device outputs modified versions of haptic content delivered to the first haptic output device.
- the second electronic device may output a version of an effect sent to the first haptic output device with the magnitude amplified. Such a magnitude increase may allow the student to more easily detect what might otherwise be subtle, but important, haptic cues.
- FIG. 7 is a flowchart showing a method for providing a shared haptic experience in one embodiment.
- the steps in FIG. 7 may be implemented in program code that is executed by a processor, for example, the processor in a general purpose computer, a mobile device, or a server. In some embodiments, these steps may be implemented by a group of processors. The steps below are described with reference to components described above with regard to system 100 shown in FIG. 1 .
- Method 700 begins at step 702 , with the receipt of a first haptic effect signal, the first haptic effect signal based at least in part on a haptic event (e.g., a user's character getting shot in a game, the completion of a level, driving a virtual vehicle over a bumpy virtual road).
- Detection module 124 or processor 102 may detect the first haptic effect signal.
- the method 700 continues at step 704 when processor 102 determines a second haptic effect based at least in part on the first haptic effect and a characteristic external to the haptic event.
- the characteristic external to the haptic event may comprise a relative position of a first user with respect to a second user.
- the relative positions of the first user and the second user comprise the physical positions of the first user and second user in real space.
- the physical positions of the first electronic device 118, controlled by the first user, and the second electronic device 120, controlled by the second user, may be used as reasonable approximations of the physical positions of the first user and the second user in real space.
- First and second electronic devices 118 and 120 may comprise one or more of an accelerometer, a gyroscope, an inclinometer, a global positioning system (GPS) unit, or another sensor for determining the positions of first electronic device 118 and second electronic device 120 , respectively, in real space.
- processor 102 may receive sensor signals from a first accelerometer and a first gyroscope embedded within the first electronic device 118 .
- processor 102 may receive sensor signals from a second accelerometer and a second gyroscope embedded within the second electronic device 120 . Based on these sensor signals, processor 102 may determine (via, for example, algorithms or a lookup table) the relative positions of the first electronic device 118 and the second electronic device 120 in real space. In such an embodiment, processor 102 may further determine the relative position of first electronic device 118 with respect to second electronic device 120 in real space.
- the relative positions of the first user and the second user comprise the virtual positions of virtual characters controlled by the first user and the second user in a virtual environment.
- processor 102 may determine the relative position of a virtual character controlled by the first user and the relative position of a virtual character controlled by the second user directly from data about the virtual environment. For example, in some embodiments, processor 102 may sample network 110 data to track the location of the first user's virtual character and the second user's virtual character. Based on the sampled data, processor 102 may determine the virtual positions of the first user's virtual character and the second user's virtual character, respectively. In such an embodiment, processor 102 may further determine the relative position of first user's virtual character with respect to the second user's virtual character in the virtual environment.
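A sketch of deriving the relative position of one virtual character with respect to another from sampled position data; the 2-D coordinate convention (y pointing north) is an assumption.

```python
# Illustrative sketch only: compute distance and bearing of the first
# user's character relative to the second user's from sampled positions.
# The 2-D coordinate convention (y = north) is an assumption.
import math

def relative_position(first_pos: tuple[float, float],
                      second_pos: tuple[float, float]) -> dict:
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    return {
        "distance": math.hypot(dx, dy),
        "bearing_deg": math.degrees(math.atan2(dx, dy)) % 360,  # 0 = north
    }

# A character about 40 m to the northeast of the other:
print(relative_position((28.3, 28.3), (0.0, 0.0)))
# {'distance': ~40.0, 'bearing_deg': ~45.0}
```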
- the characteristic external to the haptic event may comprise an environmental characteristic.
- the environmental characteristic may be an environmental characteristic in real space.
- computing system 101 may comprise one or more sensors such as a temperature sensor, a humidity sensor, a camera, an accelerometer, a gyroscope, a sonar device, and/or other electronic devices configured to send sensor signals to processor 102 .
- Processor 102 may determine an environmental characteristic directly from the sensor signal, or may apply the sensor signal data to an algorithm or a lookup table to determine the environmental characteristic. For example, in one such embodiment, processor 102 receives a sensor signal from a humidity sensor or temperature sensor and determines the humidity or temperature in the environment in which the first user and/or second user may be located.
- processor 102 receives a sensor signal from a camera or sonar device and determines any environmental obstructions, like walls, in the environment in which the first user and/or second user may be located. In still another embodiment, processor 102 determines, based on a camera sensor signal, the medium in which the first user or second user is located, for example, if the first user is located in water. In yet another such embodiment, processor 102 receives sensor signals from a camera and determines whether or not the first user or second user is in a vehicle, the size of the vehicle, and/or the direction or velocity in which the vehicle is moving.
- the environmental characteristic may be a virtual environmental characteristic in a virtual environment.
- processor 102 may determine the environmental characteristic directly from data about the virtual environment. For example, in some embodiments, processor 102 samples network 110 data to track the presence or absence of environmental characteristics. Based on the sampled data, processor 102 may determine an environmental characteristic. In some embodiments, processor 102 may determine the virtual environmental characteristic directly from the sampled data, or may apply the sampled data to an algorithm or a lookup table to determine the virtual environmental characteristic. For example, in one such embodiment, processor 102 samples network data and applies an algorithm to determine the presence of an object, e.g. a virtual brick wall, in the virtual environment. Similarly, in some embodiments, processor 102 may sample network data and determine that the environmental characteristic comprises a physical principle, such as the Doppler effect.
- processor 102 may apply data about the characteristics of the first haptic effect to an algorithm in order to determine the strength, duration, location, and/or other characteristics of the second haptic effect. For example, processor 102 may use the strength and intensity characteristics of the first haptic effect to determine what second haptic effect to generate and through which actuators to generate it in the second electronic device 120. In some embodiments, the processor 102 may determine the second haptic effect based on the real or virtual relative position of the first user with respect to the second user, as determined in step 704, or any real or virtual environmental characteristics, as determined in step 706.
- the processor 102 may rely on programming contained in haptic effect determination module 126 to determine the second haptic effect. For example, in some embodiments, haptic effect determination module 126 may determine the haptic effect to output, and which actuators to use to output the effect, based on algorithms. In some embodiments, such algorithms may assess the relative virtual position of a second user's virtual character with respect to a first user's virtual character. For example, in one embodiment, if a first user's virtual character is 40 meters northeast of the second user's virtual character, the processor 102 determines that the second haptic effect should be generated by actuators in the front right side of the second electronic device 120 . Further, in such an embodiment, processor 102 may determine that the second haptic effect should be a substantially dampened version of the first haptic effect due to the 40-meter distance between the first user's virtual character and the second user's virtual character.
- processor 102 may determine the second haptic effect to output, and which actuators to use to output the second haptic effect, based on algorithms that assess the relative position of first electronic device 118 to second electronic device 120 in real space. For example, in one embodiment, if first electronic device 118 is a half meter northeast of second electronic device 120 in real space, the processor 102 determines that the second haptic effect should be generated by actuators in the front right side of the second electronic device 120 . Further, in such an embodiment, processor 102 may determine that the second haptic effect should be not be dampened due to the mere half meter distance between the first electronic device 118 and the second electronic device 120 .
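- The distance- and direction-based determination described in the preceding paragraphs can be pictured with the following minimal Python sketch. The actuator layout, the assumption that the device faces north, and the attenuation rule are illustrative choices, not the patent's specified algorithm.

```python
import math

# Hypothetical sketch: choose actuators by bearing and dampen by distance.
# Assumes the second device faces north (positive dy = "front").

ACTUATORS = {  # quadrant of the controller -> actuator identifier
    "front_right": "actuator_fr", "front_left": "actuator_fl",
    "rear_right": "actuator_rr",  "rear_left": "actuator_rl",
}

def second_effect(first_intensity, dx, dy, reference_distance=1.0):
    """dx, dy: offset of the first user (or device) from the second, in meters."""
    distance = math.hypot(dx, dy)
    quadrant = ("front" if dy > 0 else "rear") + "_" + ("right" if dx > 0 else "left")
    # Dampen with distance; effects within the reference distance are undampened.
    scale = min(1.0, reference_distance / distance) if distance else 1.0
    return ACTUATORS[quadrant], first_intensity * scale

# 40 m to the northeast: front-right actuator, heavily dampened.
print(second_effect(1.0, dx=28.3, dy=28.3))
# Half a meter to the northeast: front-right actuator, no dampening.
print(second_effect(1.0, dx=0.35, dy=0.35))
```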
- In some embodiments, haptic effect determination module 126 may comprise a haptic actuator lookup table. In some embodiments, the lookup table may comprise data associating a haptic actuator of one type with a plurality of haptic actuators of different types that are capable of outputting similar haptic effects. For example, the lookup table may comprise data associating an ERM with a plurality of other haptic devices capable of outputting similar haptic effects, such as an LRA, a piezoelectric actuator, an electric motor, or an electromagnetic actuator.
- In some embodiments, processor 102 may receive data indicating that the first haptic effect was generated in the first electronic device 118 by a signal of a specific intensity and duration designated for a specific type of haptic actuator, for example, an ERM. Based on this data, in one such embodiment, processor 102 may consult the lookup table to determine what hardware in second haptic output device 130 can be used as a substitute to generate a second haptic effect with characteristics similar to the first haptic effect. For example, if second haptic output device 130 does not contain an ERM but does contain an electric motor, processor 102 consults the lookup table and determines that the electric motor may be able to act as a suitable substitute to generate the second haptic effect.
- In other embodiments, processor 102 may determine a default haptic effect. For example, second haptic output device 130 may not be able to generate a particular haptic effect, such as a vibration, due to a lack of appropriate hardware. In one such embodiment, second haptic output device 130 may comprise an ESF actuator capable of varying the perceived coefficient of friction on the surface of second electronic device 120, and this friction change may serve as the default haptic effect. In such an embodiment, processor 102 associates any first haptic effect comprising a vibration with a default second haptic effect comprising a perceived change in the coefficient of friction at the surface of second electronic device 120.
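- A minimal sketch of the actuator lookup table and the default-effect fallback described above might look like the following Python fragment; the table entries and names are assumptions for illustration.

```python
# Hypothetical actuator-substitution table with a default-effect fallback;
# entries and identifiers are illustrative, not the patent's data.

SUBSTITUTES = {
    # Actuator types that can output effects similar to the keyed type.
    "ERM": ["LRA", "piezoelectric", "electric_motor", "electromagnetic"],
}

DEFAULT_EFFECT = "esf_friction_change"  # e.g., ESF coefficient-of-friction change

def pick_output(effect_actuator_type, available):
    """Pick hardware on the second device to reproduce the first effect."""
    if effect_actuator_type in available:
        return effect_actuator_type
    for candidate in SUBSTITUTES.get(effect_actuator_type, []):
        if candidate in available:
            return candidate          # closest substitute that exists
    return DEFAULT_EFFECT             # no suitable hardware: default effect

# Second device has an electric motor and an ESF actuator, but no ERM.
print(pick_output("ERM", {"electric_motor", "esf_friction_change"}))
# -> 'electric_motor'
print(pick_output("ERM", {"esf_friction_change"}))
# -> 'esf_friction_change' (the default haptic effect)
```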
- In some embodiments, processor 102 may determine a second haptic effect based on data in a lookup table. For example, the lookup table may comprise data associating environmental characteristics with a plurality of haptic effect modifications. In one such embodiment, the lookup table may associate a brick wall with a haptic effect modification, such as a decrease in haptic effect intensity of 30%. In such an embodiment, if a brick wall lies between the first user and the second user, processor 102 consults the lookup table and determines that the second haptic effect should be a modified version of the first haptic effect with 30% less intensity.
- In some embodiments, processor 102 may base its determination in part on other outside factors, such as the state of the gaming device. For example, in one such embodiment, processor 102 may base its determination partially on the amount of battery life the gaming device has remaining. In such an embodiment, haptic effect determination module 126 may receive data indicating that the first haptic effect comprised a short, intense vibration. Because the battery life on the second electronic device 120, for example a smart phone, may be low, processor 102 may determine that a longer, but significantly less intense, vibration may achieve substantially the same effect without depleting the battery life to a detrimental level.
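- The environmental-modification lookup and the battery-aware trade-off described above could be combined as in this hypothetical sketch. The 30% brick-wall reduction comes from the example above; the water entry, battery threshold, and scaling factors are illustrative assumptions.

```python
# Hypothetical sketch combining the environment lookup table and a
# battery-aware trade-off; all values are illustrative.

MODIFIERS = {
    "brick_wall": 0.70,   # decrease intensity by 30%
    "water":      0.50,   # assumed: a dense medium dampens further
}

def modify_effect(intensity, duration, characteristics, battery_frac=1.0):
    for characteristic in characteristics:
        intensity *= MODIFIERS.get(characteristic, 1.0)
    if battery_frac < 0.2:
        # Trade intensity for duration so the effect stays noticeable
        # without draining a nearly empty battery.
        intensity *= 0.5
        duration *= 2.0
    return intensity, duration

# Brick wall between the characters, healthy battery:
print(modify_effect(1.0, 0.1, ["brick_wall"]))          # (0.7, 0.1)
# Same event on a phone with 10% battery remaining:
print(modify_effect(1.0, 0.1, ["brick_wall"], 0.10))    # (0.35, 0.2)
```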
- The method 700 continues at step 706 with the detection of a triggering event.
- Detection module 124 or processor 102 may detect the triggering event.
- In some embodiments, the triggering event may initiate storing of second haptic effects for subsequent playback. In other embodiments, the triggering event may initiate the sharing of haptic feedback between users. The triggering event may be user generated, such as by a button press, or software generated, such as when a virtual character is killed in a video game.
- The method 700 continues when processor 102 generates a second haptic effect signal (step 710) and transmits the second haptic effect signal to the second haptic output device 130 (step 712), which outputs the haptic effect.
- The second haptic effect signal is based at least in part on the second haptic effect. To generate it, the processor 102 may access drive signals stored in memory 104 and associated with particular haptic effects. In other embodiments, a signal may be generated by accessing a stored algorithm and inputting parameters associated with an effect. For example, an algorithm may output data for use in generating a drive signal based on amplitude and frequency parameters. Alternatively, a second haptic signal may comprise data sent to an actuator to be decoded by the actuator; in such an embodiment, the actuator may itself respond to commands specifying parameters such as amplitude and frequency.
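- As a rough illustration of generating a drive signal from amplitude and frequency parameters, consider the following hypothetical Python sketch; the sample rate and the sine-wave form are assumptions, not the disclosed algorithm.

```python
import math

# Hypothetical drive-signal generation from amplitude and frequency
# parameters; the sample rate and function name are illustrative.

def drive_signal(amplitude, frequency_hz, duration_s, sample_rate=8000):
    """Return drive-signal samples for a simple periodic haptic effect."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * t / sample_rate)
            for t in range(n)]

# A 0.05 s, 175 Hz vibration at half amplitude: 400 samples to stream
# to a haptic output device (or to an actuator that decodes them itself).
samples = drive_signal(amplitude=0.5, frequency_hz=175, duration_s=0.05)
print(len(samples))  # 400
```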
- FIG. 8 is a flowchart showing another method for providing a shared haptic experience in one embodiment.
- In some embodiments, the steps in FIG. 8 may be implemented in program code that is executed by a processor, for example, the processor in a general purpose computer, a mobile device, or a server. In some embodiments, these steps may be implemented by a group of processors. The steps below are described with reference to components described above with regard to system 100 shown in FIG. 1.
- Method 800 begins at step 802 , with the detection of a triggering event indicating that a plurality of first haptic effects should be stored for later playback.
- Detection module 124 or processor 102 may detect the triggering event.
- The triggering event may be user generated, such as by a button press, or software generated, such as when a virtual character is killed in a video game.
- The method 800 continues at step 804 when processor 102 determines a second plurality of haptic effects to generate based at least in part on each of a first plurality of haptic effects.
- In some embodiments, the processor 102 may rely on programming contained in haptic effect determination module 126 to determine the second plurality of haptic effects. For example, processor 102 may use any of the methods discussed with respect to FIG. 7 to determine each of the second plurality of haptic effects.
- The method 800 continues at step 806 when processor 102 causes the second plurality of haptic effects to be stored in memory 104 for subsequent playback.
- Processor 102 may store the second plurality of haptic effects by type, name, duration, intensity, timestamp, or any other characteristics such that they can later be recalled and output in a sequential, or in some embodiments nonsequential, manner.
- Method 800 continues at step 808 where system 100 waits for an event indicating the saved plurality of second haptic effects should be played back.
- The event may be user generated, such as by pressing a button, or software generated, such as by the death of a virtual character in a video game. If the playback event occurs, the method 800 continues to step 810. Otherwise, the method 800 returns to steps 804 and 806, where it continues to determine and store the second plurality of haptic effects.
- The method 800 continues at steps 810 and 812, where processor 102 generates a second plurality of haptic effect signals based on the stored second plurality of haptic effects and transmits each of the second plurality of haptic signals to the second haptic output device 130, which outputs the haptic effects.
- The processor 102 may access drive signals or algorithms stored in memory 104 and associated with particular haptic effects to generate the second plurality of haptic effect signals. In some embodiments, processor 102 may use any of the methods discussed with respect to FIG. 7 to generate and transmit each of the second plurality of haptic signals to the second haptic output device 130.
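- Steps 804 through 812 of method 800 can be pictured with the following hypothetical record-and-replay sketch; the data structure and names are illustrative assumptions rather than the patent's own implementation.

```python
import time
from dataclasses import dataclass, field

# Hypothetical record-and-replay sketch for method 800; class and field
# names are illustrative only.

@dataclass
class StoredEffect:
    kind: str            # e.g., "vibration"
    intensity: float
    duration_s: float
    timestamp: float = field(default_factory=time.monotonic)

recorded = []

def record(effect):
    recorded.append(effect)        # step 806: store for later playback

def play_back(output_device):
    # Steps 810/812: replay the stored effects in timestamp order.
    for effect in sorted(recorded, key=lambda e: e.timestamp):
        output_device(effect)

record(StoredEffect("vibration", 0.8, 0.10))
record(StoredEffect("vibration", 0.3, 0.25))
play_back(print)  # stand-in for transmitting to second haptic output device 130
```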
- Such systems may provide more compelling gaming experiences by allowing users to feel the sensations their fellow game players feel. For example, if a first user's gaming device vibrates because a virtual character controlled by the first user is shot, a second user's gaming device may output substantially the same, or a modified version of, the vibration, providing a more interactive experience. This may increase overall user satisfaction.
- Sharing haptic feedback among users may also lead to improved collaboration when performing tasks in a virtual environment, because users will have a better understanding of what actions their teammates are taking by feeling the same, or modified versions of, the associated haptic responses.
- For example, two users may be playing a game in which they have to collaborate to achieve a military objective. The first user may be controlling a virtual gunner character, while the second user is controlling a virtual medic character. As the first user manipulates his or her virtual character to take part in an attack on a military stronghold, the first user's gaming device may vibrate if his or her virtual gunner character gets shot. In some embodiments, this haptic content may be modified and shared with the second user, causing the second user's gaming device to also vibrate. In one such embodiment, the second user feels a weak vibration on the front left side of his or her controller. This may indicate to the second user how far away, and in what direction, the first user's character is located, so he or she may render aid.
- Some embodiments may improve virtual training programs. For example, an expert may be able to play a videogame or perform a task in a virtual environment and save his or her audio, video, and, in some embodiments, haptic content. In such an embodiment, a novice may be able to play back the saved content and learn how to play the game or perform the task. Playing back haptic content, in addition to audio and video, may make such learning more effective.
- Configurations may be described as a process that is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
- Examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
- A computer may comprise a processor or processors. The processor comprises, or has access to, a computer-readable medium, such as a random access memory (RAM), coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs including a sensor sampling routine, selection routines, and other routines to perform the methods described above.
- Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines.
- Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
- Such processors may comprise, or may be in communication with, media, for example tangible computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor.
- Embodiments of computer-readable media may comprise, but are not limited to, all electronic, optical, magnetic, or other storage devices capable of providing a processor, such as the processor in a web server, with computer-readable instructions.
- Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read.
- Various other devices may comprise computer-readable media, such as a router, private or public network, or other transmission device. The processor, and the processing described, may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Environmental & Geological Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Networks & Wireless Communication (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
- The present invention relates to the field of user interface devices. More specifically, the present invention relates to systems and methods for providing a shared haptic experience.
- Computer users continue to desire a more interactive experience. For example, as video games become more interactive, demand for multiplayer games, wherein users can play with or against each other, has increased. Users may play video games with or against each other in a multitude of ways. One common way for users to play multiplayer games is through a single game console, such as a Sony PlayStation, in which all the users are located in close proximity to one another, often the same room, and manipulate virtual characters through handheld controllers connected to the game console. Users also commonly play multiplayer games over the Internet, wherein users play with or against each other from sometimes remote corners of the world, often via different kinds of devices, such as computers, game consoles, and smart phones. Some multiplayer games and game systems may allow players to share audio and video content with one another. While various techniques have been used to improve the multiplayer gaming experience, there is a need for multiplayer games, game systems, and similar collaborative computing environments to allow users to share their haptic content in order to enhance the interactive and collaborative nature of the system.
- Embodiments of the present disclosure comprise systems and methods for providing a shared haptic experience. In one embodiment, a system of the present disclosure may comprise a processor configured to: receive a first haptic effect signal, the first haptic effect signal based at least in part on a haptic event and adapted to cause a first haptic effect to be output to a first user, determine a second haptic effect based at least in part on the first haptic effect and a characteristic independent of the haptic event, generate a second haptic effect signal based at least in part on the second haptic effect, and transmit the second haptic signal to a second haptic output device. The system may further comprise a second haptic output device in communication with the processor, wherein the second haptic output device is configured to receive the second haptic effect signal and output the second haptic effect to a second user.
- In another embodiment, a method of the present disclosure may comprise: receiving a first haptic effect signal, the first haptic effect signal based at least in part on a haptic event and adapted to cause a first haptic effect to be output to a first user, determining a second haptic effect based at least in part on the first haptic effect and a characteristic independent of the haptic event, generating a second haptic effect signal based at least in part on the second haptic effect, and transmitting the second haptic signal to a second haptic output device. Yet another embodiment comprises a computer-readable medium for implementing such a method.
- These illustrative embodiments are mentioned not to limit or define the limits of the present subject matter, but to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, and further description is provided there. Advantages offered by various embodiments may be further understood by examining this specification and/or by practicing one or more embodiments of the claimed subject matter.
- A full and enabling disclosure is set forth more particularly in the remainder of the specification. The specification makes reference to the following appended figures.
- FIG. 1 is a block diagram showing a system for providing a shared haptic experience in one embodiment;
- FIG. 2 is another block diagram showing one embodiment of a system for providing a shared haptic experience;
- FIG. 3 shows an external view of a system for providing a shared haptic experience in one embodiment;
- FIG. 4 shows another external view of a system for providing a shared haptic experience in one embodiment;
- FIG. 5 shows one embodiment of an external view of a system for providing a shared haptic experience;
- FIG. 6 shows an external view of a system for providing a shared haptic experience in one embodiment;
- FIG. 7 is a flowchart showing a method for providing a shared haptic experience in one embodiment; and
- FIG. 8 is a flowchart showing another method for providing a shared haptic experience in one embodiment.
- Reference will now be made in detail to various and alternative illustrative embodiments and to the accompanying drawings. Each example is provided by way of explanation, and not as a limitation. It will be apparent to those skilled in the art that modifications and variations can be made. For instance, features illustrated or described as part of one embodiment may be used in another embodiment to yield a still further embodiment. Thus, it is intended that this disclosure include modifications and variations as come within the scope of the appended claims and their equivalents.
- One illustrative embodiment of the present disclosure comprises a gaming system configured to provide a shared haptic experience. The gaming system includes one or more game consoles or other computing systems that are in communication with user interface devices, such as a game controller, smart phone, or tablet. Such gaming systems may include, for example, the Microsoft Xbox, Sony PlayStation, Nintendo Wii, or the Sega Zone. The user interface devices may comprise and/or may be in communication with one or more user input elements. Such elements may include, for example, a button, joystick, camera, gyroscope, accelerometer, or touch-sensitive surface, any of which can be used to detect a user input alone or in combination with one another.
- In the illustrative embodiment, the user input device also comprises and/or may be in communication with one or more haptic output devices. The haptic output device receives a signal from the gaming system and outputs a haptic effect to a user. Each haptic output device may include one or more actuators, such as an eccentric rotating mass (ERM) motor for providing a vibratory effect.
- In the illustrative embodiment, a first user interacts with the gaming system through a user input device, such as a game controller, to control the actions of an avatar on the screen during a game. For example, if the first user is playing a first-person shooting game, then the first user controls the first user's character to achieve some goal, such as advancing through a level. As events occur in the game, the first user may experience one or more haptic effects in response to the game events. For example, in one embodiment, the first user's virtual character gets shot, and in response, a haptic effect, such as a vibration, is output to the first user's controller.
- As video games become more interactive, demand for multiplayer games, wherein users can play with or against each other, has increased. Users may play video games with or against each other in a multitude of ways. Users commonly play multiplayer video games with or against one another via a single game console or over the Internet. Some users may wish to share haptic feedback that they experience with one or more other users.
- In the illustrative embodiment, the gaming system is adapted to share the first user's haptic effects with one or more other users. In the illustrative embodiment, the characteristics of the haptic effect transmitted to a second user are based, at least in part, on the haptic effect generated for the first user and on other factors. For example, the gaming system may generate an effect to be delivered to the second user by starting with the haptic effect generated for the first user and modifying that effect based on the relative position of the first user's virtual character to the second user's virtual character in the video game. In one such embodiment, the strength of the haptic effect transmitted to the second user is inversely proportional to the relative distance between the first user's virtual character and the second user's virtual character in virtual space. That is, if the first user's virtual character is standing ten feet from the second user's virtual character in the video game, the haptic effect transmitted to the second user is weaker than if the first user's virtual character was standing three feet from the second user's virtual character.
- The description of the illustrative embodiment above is provided merely as an example, not to limit or define the limits of the present subject matter. Various other embodiments of the present invention are described herein and variations of such embodiments would be understood by one of skill in the art. As will be discussed in further detail below, such embodiments are not limited to gaming systems, and the systems and methods described herein for generating haptic effects can be modified in any number of ways. Advantages offered by various embodiments may be further understood by examining this specification and/or by practicing one or more embodiments of the claimed subject matter.
- FIG. 1 is a block diagram showing a system 100 for providing a shared haptic experience in one embodiment. In this example, system 100 comprises a computing device 101 having a processor 102 in communication with other hardware via bus 106. A memory 104, which can comprise any suitable tangible (and non-transitory) computer-readable medium such as RAM, ROM, EEPROM, or the like, embodies program components that configure operation of the computing device 101. In this example, computing device 101 further comprises one or more network interface devices 110, input/output (I/O) interface components 112, and additional storage 114.
- Network device 110 can represent one or more of any components that facilitate a network connection. Examples include, but are not limited to, wired interfaces such as Ethernet, USB, or IEEE 1394, and/or wireless interfaces such as IEEE 802.11, Bluetooth, or radio interfaces for accessing cellular telephone networks (e.g., a transceiver/antenna for accessing a CDMA, GSM, UMTS, or other mobile communications network).
- I/O components 112 may be used to facilitate connection to devices such as one or more displays, keyboards, mice, joysticks, video game controllers, buttons, speakers, microphones, and/or other hardware used to input data or output data. Storage 114 represents nonvolatile storage such as magnetic, optical, or other storage media included in device 101. Display 116 may be used to facilitate the output of one or more images, and may comprise, for example, a television set, a touchscreen display, a computer monitor, or a projector.
- In this example, computing device 101 is in communication with two external electronic devices (hereinafter "external devices"), first electronic device 118 and second electronic device 120. In some embodiments, computing device 101 may be in communication with any number of external electronic devices. In some embodiments, these external devices may be similar, such as game controllers for use with a single game console, like a Sony PlayStation. In other embodiments, the external devices may be of different types, such as smart phones, tablets, e-readers, laptop computers, desktop computers, or wearable devices. While computing device 101 and first electronic device 118 are illustrated in FIG. 1 as separate devices, the computing device 101 and first electronic device 118 may comprise a single integrated device capable of performing the functions described in relation to computing system 101 as well as serving as an input and output device for the user.
- First electronic device 118 and second electronic device 120 comprise a first haptic output device 122 and a second haptic output device 130, respectively. These haptic output devices may be configured to output haptic effects, for example, vibrations, changes in a perceived coefficient of friction, simulated textures, or surface deformations in response to haptic signals. Additionally or alternatively, haptic output devices 122 and 130 may each comprise one or more actuators for generating such effects.
- In other embodiments, haptic output devices 122 and 130 may use electrostatic attraction to output haptic effects. In such embodiments, the haptic output devices may comprise an electrostatic actuator with a conducting layer, and processor 102 may operate the electrostatic actuator by applying an electric signal to the conducting layer. In some embodiments, the electric signal may be an AC signal that is generated by a high-voltage amplifier. In some embodiments, the AC signal may capacitively couple the conducting layer with an object near or touching the surface of the external devices. The capacitive coupling may simulate a friction coefficient or texture on the surface of the external devices.
- For example, in one embodiment, the surface of first electronic device 118 is smooth, but the capacitive coupling may produce an attractive force between an object, such as a user's hand, and the surface of first electronic device 118. In some embodiments, varying the levels of attraction between the object and the conducting layer can create or vary a simulated texture on an object moving across the surface of the external devices. Furthermore, in some embodiments, an electrostatic actuator may be used in conjunction with traditional actuators to create or vary a simulated texture on the surface of the external devices.
- One of ordinary skill in the art will recognize that, in addition to vibrating or varying the coefficient of friction on a surface, other techniques or methods can be used to output haptic effects. For example, in some embodiments, haptic effects may be output using a flexible surface layer configured to vary its texture based upon contact from a surface reconfigurable haptic substrate (including, but not limited to, e.g., fibers, nanotubes, electroactive polymers, piezoelectric elements, or shape memory alloys) or a magnetorheological fluid. In another embodiment, haptic effects may be output by raising or lowering one or more surface features, for example, with a deforming mechanism, air or fluid pockets, local deformation of materials, resonant mechanical elements, piezoelectric materials, micro-electromechanical systems ("MEMS") elements, thermal fluid pockets, MEMS pumps, variable porosity membranes, or laminar flow modulation.
- Although in this example first haptic output device 122 is embedded in a device external to computing device 101, i.e., first electronic device 118, in other embodiments first haptic output device 122 may be embedded within computing device 101. For example, computing device 101 may comprise a laptop computer further comprising first haptic output device 122. Also, while first and second haptic output devices 122 and 130 are each depicted as a single element, first haptic output device 122 may comprise two or more actuators to provide different types of effects to a user. It will be recognized by those of ordinary skill in the art that other embodiments may contain additional configurations of the first haptic output device 122, second haptic output device 130, and computing system 101.
- Turning to memory 104, illustrative program components 124, 126, and 128 are depicted to illustrate how a device may be configured to provide a shared haptic experience. In this example, detection module 124 configures processor 102 to monitor a virtual environment, such as a video game environment, for a haptic event, such as a virtual gun shot. For example, module 124 may sample videogame data to track the presence of a haptic event and, if a haptic event is present, to track one or more of the type, duration, location, intensity, and/or other characteristics of the haptic event. Further, in some embodiments, detection module 124 configures processor 102 to monitor the virtual environment for the receipt of haptic content from other players or for the triggering of a sharing event, such as a button press, which may indicate that computing device 101 should replicate (i.e., generate substantially the same effect as) another user's haptic content. For example, module 124 may sample network 110 data to track the presence of another user's shared haptic content. In such an embodiment, if another user's shared haptic content is present, detection module 124 may track one or more of the type, duration, location, and/or other characteristics of the one or more haptic effects.
- Haptic effect determination module 126 represents a program component that analyzes data regarding the shared haptic effect in order to determine a haptic effect to generate locally. Particularly, haptic effect determination module 126 may comprise code that determines, based on the type, duration, location, and/or other characteristics of the shared haptic content, a haptic effect to output locally. Haptic effect determination module 126 may further base this determination on the relative position of a first user to a second user in real space, the relative position of a first user to a gaming device in real space, the relative position of a virtual character controlled by a first user to a virtual character controlled by a second user in a virtual environment, or a variety of other real or virtual environmental characteristics. For example, if the shared haptic effect comprises a long, intense (e.g., high magnitude) vibration, haptic effect determination module 126 may determine that, because of a large distance between a first user's virtual character and a second user's virtual character, the proper effect to output to the second user is a short, mild vibration.
- In some embodiments, haptic effect determination module 126 may comprise code that determines which actuators to use in order to generate the haptic effect. For example, in some embodiments, the second user's gaming device may comprise four actuators: two actuators vertically aligned on the left side of the gaming device and two actuators vertically aligned on the right side of the gaming device. In such an embodiment, haptic effect determination module 126 may determine that, because a first user's virtual character is positioned northwest of a second user's virtual character in a virtual environment, only the actuator in the front left side of the second user's gaming device should be actuated to generate the haptic effect.
- In some embodiments, haptic effect determination module 126 may comprise code that determines how best to output a local haptic effect that is substantially similar to the shared haptic content. For example, if the shared haptic content comprises a series of vibrations of varying intensity, haptic effect determination module 126 may determine how best to locally output an effect that is substantially similar to a series of vibrations of varying intensity. In such an embodiment, haptic effect determination module 126 may determine that first electronic device 118 does not have the vibratory hardware, such as an ERM or LRA, to directly implement a series of vibrations. In one such embodiment, haptic effect determination module 126 may determine that the closest sensation to a series of vibrations of varying intensity that can be output is a series of changes in the coefficient of friction at the surface of the first electronic device 118 via an ESF actuator.
- Haptic effect generation module 128 represents programming that causes processor 102 to generate and transmit a haptic signal to a haptic output device, such as first haptic output device 122, to generate the determined haptic effect. For example, generation module 128 may access stored waveforms or commands to send to first haptic output device 122. As another example, haptic effect generation module 128 may receive a desired type of effect and utilize signal processing algorithms to generate an appropriate signal to send to first haptic output device 122. As a further example, a desired texture may be indicated to be output at the surface of first electronic device 118, along with target coordinates for the texture, and an appropriate waveform sent to one or more actuators to generate appropriate displacement of the surface of first electronic device 118 (and/or other device components) to provide the texture. Such an embodiment may be particularly applicable wherein the first electronic device 118 comprises a touch screen display, such as a smart phone. Some embodiments may utilize multiple haptic output devices in concert to generate the haptic effect. For instance, first haptic output device 122 may comprise a plurality of haptic output devices, wherein, in order to generate a desired effect, one haptic output device changes the perceived coefficient of friction at the surface of the first electronic device 118 while another haptic output device vibrates the surface of the first electronic device 118.
- FIG. 2 is another block diagram showing one embodiment of a system for providing a shared haptic experience. The system comprises a first electronic device 202 and a second electronic device 204. In some embodiments, the system may comprise any number of electronic devices. In some embodiments, a first user may use first electronic device 202 to control a virtual character in a videogame, and a second user may use second electronic device 204 to control a different virtual character in the videogame. Second electronic device 204 comprises a "share" button 214 for triggering sharing of the second user's haptic content with the first user. First electronic device 202 comprises a first haptic output device 206, which in turn comprises actuators 210 for outputting a haptic effect. Likewise, second electronic device 204 comprises a second haptic output device 208, which in turn comprises actuators (not shown) for outputting a haptic effect. In this example, first haptic output device 206 comprises two actuators on the right side of first electronic device 202 and two actuators on the left side of first electronic device 202. Further, the two actuators on each side of first electronic device 202 are horizontally aligned. In other embodiments, the two actuators on each side of first electronic device 202 may be vertically aligned. In some embodiments, actuators 210 may be of a similar type. In other embodiments, actuators 210 may be of different types. It will be recognized by those skilled in the art that any number, type, and arrangement of such actuators may be possible.
- In some embodiments, first electronic device 202 may be different in type from second electronic device 204. For example, in one such embodiment, first electronic device 202 comprises a smart phone and second electronic device 204 comprises a game console controller. In other embodiments, first electronic device 202 and second electronic device 204 may be any combination of laptop computers, game consoles, desktop computers, smart phones, tablets, e-readers, portable gaming systems, game console controllers, personal digital assistants, or other electronic devices. Consequently, in some embodiments, first haptic output device 206 and/or actuators 210 may be of a different type than second haptic output device 208 and/or the actuators contained therein. For example, in one such embodiment, first haptic output device 206 comprises a series of ERMs, while second haptic output device 208 comprises a mechanism for deforming the surface of second electronic device 204.
- FIG. 3 shows an external view of a system for providing a shared haptic experience in one embodiment. In this example, the system comprises a first electronic device 302, here a game console controller, which comprises a first haptic output device. The first electronic device 302 is in communication with gaming system 306. In some embodiments, a first user may be playing, for example, an army game on gaming system 306. The first user may control a virtual marine using first electronic device 302. Likewise, a second user may also be playing the army game via second electronic device 304, here also a game console controller, through which he or she can control his or her own virtual marine.
- In such an embodiment, gaming system 306 may output various haptic effects to second electronic device 304 as the second user plays the army game. For example, if the second user's virtual character gets shot, gaming system 306 may cause second electronic device 304 to vibrate. In some embodiments, the first user controlling first electronic device 302 may receive modified versions of the haptic effects sent to the second user. In some such embodiments, first electronic device 302 may output haptic effects modified based on the relative position of first electronic device 302 to second electronic device 304 in real space. For example, in one embodiment, first electronic device 302 outputs haptic effects with strengths inversely proportional to the relative distance 310 between the first electronic device 302 and second electronic device 304 in real space. That is, in such an embodiment, as the distance 310 between the two electronic devices increases, the strength of the haptic effect output to first electronic device 302 decreases proportionately. In other embodiments, first electronic device 302 may output haptic effects modified based on the relative position of second electronic device 304 to gaming system 306 in real space. For example, in one such embodiment, first electronic device 302 outputs haptic effects with strengths inversely proportional to the relative distance 308 between second electronic device 304 and gaming system 306 in real space. Thus, if the second user is holding second electronic device 304 ten feet from the gaming system 306, the haptic effect transmitted to the first user is weaker than if the second user were holding second electronic device 304 three feet from the gaming system 306.
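- The inverse-proportional strength rule described above can be sketched as follows; the three-foot reference distance is taken from the example above, while the function shape is an illustrative assumption.

```python
# Hypothetical sketch of inverse-proportional strength scaling; the
# reference distance and function name are illustrative assumptions.

def shared_strength(original_strength, distance_ft, reference_ft=3.0):
    """Strength falls off in inverse proportion to distance."""
    if distance_ft <= reference_ft:
        return original_strength
    return original_strength * (reference_ft / distance_ft)

print(shared_strength(1.0, 3.0))   # 1.0 -> device held three feet away
print(shared_strength(1.0, 10.0))  # 0.3 -> weaker when ten feet away
```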
- FIG. 4 shows another external view of a system for providing a shared haptic experience in one embodiment. In this example, the system comprises a first electronic device 402, here a game console controller, which comprises a first haptic output device 122. The first electronic device 402 is in communication with gaming system 406. In some embodiments, a first user may be playing, for example, an army game on gaming system 406. The first user may control a first character 408 in the game, here a marine, using first electronic device 402. Likewise, a second user may also be playing the army game on gaming system 406. The second user may control a second character 410 in the game using second electronic device 404, here also a game console controller, which comprises a second haptic output device.
- In such an embodiment, gaming system 406 may output various haptic effects to second electronic device 404 as the second user plays the army game. For example, if the second user's virtual character is near an explosion, gaming system 406 may cause second electronic device 404 to vibrate via second haptic output device 130. In some embodiments, the first electronic device 402 may output modified versions of the haptic effects sent to the second user. In some such embodiments, first electronic device 402 may output haptic effects modified based on the relative position of first character 408 to second character 410 in the virtual environment. For example, in one embodiment, first electronic device 402 outputs haptic effects with strengths inversely proportional to the relative virtual distance between first character 408 and second character 410. In such an embodiment, as the virtual distance between the two virtual characters increases, the strength of the haptic effect output to first electronic device 402 decreases proportionately.
- In some embodiments, the first electronic device 402 may output versions of the haptic effects sent to the second user that are modified based on the relative size of first character 408 to second character 410 in a virtual environment. For example, if the first character 408 is standing and the second character 410 is kneeling or crawling, first electronic device 402 may output haptic effects with strengths that are intensified compared to those output by second electronic device 404. In another embodiment, first electronic device 402 may output intensified haptic effects if second character 410's virtual size is larger than first character 408's virtual size. For example, in one such embodiment, if second character 410 is a bear and first character 408 is an ant, and the second user's haptic effect is a light vibration in response to a virtual car driving by, first electronic device 402 will output a substantially intensified haptic effect, such as a long, intense vibration, due to the virtual size differential.
- FIG. 5 shows one embodiment of an external view of a system for providing a shared haptic experience. In this example, the system comprises a first electronic device 502, here a game console controller, which comprises a first haptic output device. The first electronic device 502 is in communication with gaming system 506. In some embodiments, a first user may be playing, for example, an army game on gaming system 506. The first user may control a first character 508 in the game, here a marine, using first electronic device 502. Likewise, a second user may also be playing the army game on gaming system 506. The second user may control a second character 510 in the game using second electronic device 504, here also a game console controller, which comprises a second haptic output device.
- In such an embodiment, gaming system 506 may output various haptic effects to second electronic device 504 as the second user plays the army game. For example, if the second user's virtual character is driving a virtual tank over a bumpy road, gaming system 506 may cause the second haptic output device to vibrate. In some embodiments, first electronic device 502 may output modified versions of the haptic effects sent to the second user. In some such embodiments, the modifications may be based on a virtual environmental characteristic 512. In some embodiments, virtual environmental characteristic 512 may comprise one or more of a characteristic of an object or barrier, an ambient temperature, a humidity level, or a density of a medium in which a character is located. In FIG. 5, environmental characteristic 512 comprises a barrier that is a virtual brick wall. In one such embodiment, first electronic device 502 may output versions of haptic effects sent to second electronic device 504 with dampened strengths because environmental characteristic 512, a brick wall, is positioned between first character 508 and second character 510. In another embodiment, environmental characteristic 512 may comprise the medium in which the first and/or second characters 508 and 510 are located. For example, in one embodiment, the first character 508 may be swimming in water while the second character 510 is on land. In such an embodiment, the haptic effect transmitted to the first electronic device 502 may be a dampened version of the haptic effect transmitted to the second electronic device 504 because water is denser than air.
- In some embodiments, the environmental characteristic 512 may comprise physical properties, like the Doppler effect. For example, in one such embodiment, as second character 510 drives past first character 508 in a virtual car, first electronic device 502 outputs versions of haptic effects sent to second electronic device 504 with characteristics modified based on the Doppler effect. In another embodiment, environmental characteristic 512 may comprise a virtual ambient temperature or humidity. In such an embodiment, first electronic device 502 may output versions of haptic effects sent to second electronic device 504 with characteristics modified based on the virtual ambient temperature or humidity. For example, in one embodiment, first electronic device 502 outputs versions of haptic effects sent to second electronic device 504 with their strengths dampened because environmental characteristic 512 comprises high virtual humidity.
- Although the environmental characteristic 512 shown in FIG. 5 is part of a virtual environment, in some embodiments, environmental characteristic 512 may be present in real space. In some embodiments, environmental characteristic 512 may comprise one or more of an ambient temperature, a characteristic of a barrier, a humidity level, or a density of a medium in real space. For example, in one embodiment, first electronic device 502 comprises a temperature sensor. In such an embodiment, first electronic device 502 can determine the temperature in real space, such as the room in which users are playing the army video game, and vary its haptic output based on the temperature determination. In some such embodiments, first electronic device 502 may output versions of haptic effects sent to second electronic device 504 modified based on the temperature in real space. Likewise, in some embodiments, first electronic device 502 may output versions of haptic effects sent to second electronic device 504 with characteristics modified based on a physical obstruction in real space, like a real brick wall between first electronic device 502 and second electronic device 504.
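- The Doppler-based modification mentioned above might, for example, shift the frequency of a shared vibration according to the classical Doppler formula, as in this hypothetical sketch; treating the effect like a sound wave in air is an illustrative assumption, not the patent's stated method.

```python
# Hypothetical Doppler-style modification of a shared vibration's
# frequency; the wave-speed constant and names are illustrative.

WAVE_SPEED = 343.0  # m/s; treats the effect like a sound wave in air

def doppler_frequency(frequency_hz, approach_speed_mps):
    """approach_speed_mps > 0 while approaching, < 0 while receding."""
    return frequency_hz * WAVE_SPEED / (WAVE_SPEED - approach_speed_mps)

# A 60 Hz rumble from a virtual car passing at 30 m/s:
print(round(doppler_frequency(60.0, 30.0), 1))   # 65.8 while approaching
print(round(doppler_frequency(60.0, -30.0), 1))  # 55.2 while receding
```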
- FIG. 6 shows an external view of a system for providing a shared haptic experience in one embodiment. In this example, the system comprises a first electronic device 602, which in this example is a game console controller, in communication with a computing device, which in this example is a gaming system 606. Gaming system 606 is connected to the internet 608 for multiplayer gameplay. In some embodiments, a first user may be playing, for example, a basketball game on gaming system 606 and may control his or her virtual basketball player using first electronic device 602. Likewise, a second user may be playing the basketball game via a second electronic device 604, which in this example is a smartphone, such as an iPhone or Android phone. In this example, second electronic device 604 is wirelessly connected to the internet 608 for multiplayer gameplay. First electronic device 602 comprises a first haptic output device and second electronic device 604 comprises a second haptic output device.
- In such an embodiment, gaming system 606 may output various haptic effects to first electronic device 602 as the first user plays the basketball game. For example, if the first user's virtual character takes a shot that bounces off the rim of the basketball net, gaming system 606 may cause first electronic device 602 to vibrate.
- In one such embodiment, first electronic device 602 may comprise a "share" button, through which the first user may initiate the sharing of his or her haptic content with the second user. In such an embodiment, the first user may press the "share" button on the first electronic device 602, indicating he or she wants to share his or her haptic feedback with a second user. Thereafter, the gaming system 606 may generate an effect to be delivered to the second electronic device 604 that is substantially the same as the effect that was delivered to the first electronic device 602. For example, in one embodiment, the second user may not actually be participating in playing the basketball game, but rather may be simply observing in order to learn how to play the game. In such an embodiment, the first user may press the "share" button on first electronic device 602, triggering haptic sharing among the two users. In such an embodiment, second electronic device 604 replicates any haptic effects delivered to first electronic device 602, such as vibrations, as a result of gameplay.
- In other embodiments, the first user may share not only haptic content with the second user by pressing the "share" button, but also his or her video data, audio data, and/or gameplay controls. In such an embodiment, the second user may take over control of the first user's virtual character and the first user may become an observer. In some such embodiments, second electronic device 604 may replicate any haptic effects delivered to first electronic device 602, such as vibrations, as a result of gameplay.
- In some embodiments, a software-generated event, rather than a button press, may trigger sharing of a first user's haptic content with a second user. For example, in some embodiments, the game system 606 may initiate sharing of a first user's haptic feedback with a second user upon the death of the second user's virtual character in a multiplayer game. In another embodiment, the first user may be playing a virtual basketball game against the second user. If the second user commits a virtual foul against the first user, the first user may be entitled to two "free throws," in which the first user may take unopposed shots from a "foul line" on the virtual basketball court. In response to the foul event, the game may disable the second user's controls and change the second user's virtual perspective to that of the first user while the first user is allowed to take his or her free throws. Further, in some embodiments, the change in virtual perspective may automatically trigger sharing of the first user's haptic content with the second user. In such an embodiment, if the first user takes a free throw and misses, the ball hitting the virtual basketball net's rim, the first electronic device 602 may output a haptic effect, such as a vibration. Likewise, because haptic sharing has been triggered, the second electronic device 604 outputs a similar haptic effect.
- In some embodiments, the first user may press the "share" button on first electronic device 602, which may begin recording of his or her haptic content. In some embodiments, the second user may be able to trigger a playback event, such as by a button press, and subsequently play back the haptic content on second electronic device 604. In one such embodiment, second electronic device 604 replicates the saved haptic content as closely as possible. For example, a first user may press a "share" button on a game controller indicating he or she wants to share his or her haptic content with a second user. Thereafter, haptic content (e.g., haptic effects) generated for the first user is recorded. Upon the occurrence of a playback event, such as a software-generated event, the saved haptic content is delivered to the second user. In some embodiments, the first user's audio and/or video content may be recorded in addition to the first user's haptic content when he or she presses the "share" button. Upon the occurrence of a playback event, the saved haptic content as well as the saved audio and/or video content may be delivered to the second user.
- It should be recognized that although the embodiments shown in FIGS. 3-6 depict only a first electronic device and a second electronic device, in some embodiments a plurality of such devices may be used to output haptic effects of the types described throughout this specification.
- In some embodiments, upon the first user activating the left or right blinker, the first automobile may detect the presence of the second automobile in the first user's blind spot. Based on this detection, the first user's automobile may cause the first haptic output device to output a haptic effect to the first user. In some embodiments, the haptic effect may comprise a vibration. Further, in some embodiments, the magnitude of the vibration may change based on the distance between the first and second automobiles. For example, in one such embodiment, the first user may activate his left blinker signal. In such an embodiment, the first user's automobile may detect the presence of the second automobile in the first user's blind spot, determine that the distance between the first and second automobiles is half a meter, and output a haptic effect via the first haptic output device comprising an intense (e.g., high magnitude) vibration. In some embodiments, the haptic effect may be output on the side of the first user's steering wheel corresponding to the side of the first automobile on which the second automobile is detected. For example, if the second automobile is detected in the blind spot on the left side of the first automobile, the first automobile may output a haptic effect on the left side of the steering wheel.
- Further, in some embodiments, the second automobile (via the second haptic output device) may output haptic effects based on the haptic effects sent to the first user in the first automobile. In some such embodiments, the second automobile may output a version of the haptic effect sent to the first user in which the location on the steering wheel that the first haptic effect was output is modified. For example, if the first haptic effect is output to the first user on the left side of the first user's steering wheel, the modified haptic effect may be output to the second user on the right side of the second user's steering wheel. In some embodiments, the second automobile may output a version of the first haptic effect sent to the first user in which the magnitude of the first haptic effect is modified. For example, in one embodiment, the second automobile outputs a version of the first haptic effect with the magnitude reduced by 50%. In some embodiments, how the first haptic effect is modified in order to generate the second haptic effect may change as the distance between the two automobiles changes. For example, in some embodiments, if the two automobiles are more than one meter apart, the second automobile may output a version of the first haptic effect modified such that its magnitude is reduced by 50%. As the two automobiles move closer together, the amount that the magnitude of the first haptic effect is reduced in order to generate the second haptic effect may decrease, so that by the time the two automobiles are within two-tenths of a meter, there is no magnitude reduction between the first haptic effect and the second haptic effect.
- One of ordinary skill in the art will recognize that haptic effects may be shared among a plurality of automobiles, and that a multitude of other haptic triggering events (e.g., a change in the automobile's radio station, a GPS navigation event, pressing the breaks or the gas pedal, the failure of an automobile component, or a low car battery), haptic output device configurations (e.g., placing the haptic output devices in the gear shifter, the break or gas pedal, or a car seat), and haptic effects (e.g, a perceived change in a coefficient of friction or a texture) are possible.
- In some embodiments, the system may comprise a virtual training program. In some such embodiments, an expert may use a first electronic device, which comprises a first haptic output device, to perform a task (e.g., a surgery). As the expert performs the task, haptic effects may be delivered to the expert via the first haptic output device upon the occurrence of an event (e.g., if the expert touches a specific portion of the patient's body). In some embodiments, a student may use a second electronic device, which comprises a second haptic output device, to learn how to perform the task.
- In some embodiments, the haptic content delivered to the first haptic output device is immediately transmitted to the second electronic device, which outputs the haptic effect. In other embodiments, the expert may be able to record the haptic content, as well as any video and/or audio content, for subsequent playback upon the occurrence of a playback event. In some such embodiments, the student can initiate the playback event by, for example, pressing a button, which delivers the saved haptic content, and any saved audio and/or video content, to the second electronic device. The second electronic device then delivers the haptic content to the second haptic output device, which outputs the haptic effects to the student.
- In some embodiments, the second electronic device outputs modified versions of haptic content delivered to the first haptic output device. For example, in some embodiments, the second electronic device may output a version of an effect sent to the first haptic output device with the magnitude amplified. Such a magnitude increase may allow the student to more easily detect what might otherwise be subtle, but important, haptic cues.
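- A minimal sketch of such amplified playback appears below. It assumes a recorded track of (time offset, magnitude, duration) tuples, a hypothetical `student_device.output()` interface, and a 1.5x gain; the disclosure says only that the magnitude may be amplified.

```python
import time

def play_for_student(recorded_track, student_device, gain=1.5):
    """Replay the expert's recorded haptic track on the student's
    device, amplifying magnitudes so subtle cues are easier to feel.

    recorded_track: list of (offset_s, magnitude, duration_s) tuples
    captured while the expert performed the task.
    """
    start = time.monotonic()
    for offset_s, magnitude, duration_s in recorded_track:
        # Wait until this effect's position in the recording.
        time.sleep(max(0.0, offset_s - (time.monotonic() - start)))
        student_device.output(magnitude=min(1.0, magnitude * gain),
                              duration=duration_s)
```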
-
FIG. 7 is a flowchart showing a method for providing a shared haptic experience in one embodiment. In some embodiments, the steps in FIG. 7 may be implemented in program code that is executed by a processor, for example, the processor in a general purpose computer, a mobile device, or a server. In some embodiments, these steps may be implemented by a group of processors. The steps below are described with reference to components described above with regard to system 100 shown in FIG. 1. -
Method 700 begins at step 702, with the receipt of a first haptic effect signal, the first haptic effect signal based at least in part on a haptic event (e.g., a user's character getting shot in a game, the completion of a level, driving a virtual vehicle over a bumpy virtual road). Detection module 124 or processor 102 may detect the first haptic effect signal. - The
method 700 continues at step 704 when processor 102 determines a second haptic effect based at least in part on the first haptic effect and a characteristic external to the haptic event. - In some embodiments, the characteristic external to the haptic event may comprise a relative position of a first user with respect to a second user. In some embodiments, the relative positions of the first user and the second user comprise the physical positions of the first user and second user in real space. In some such embodiments, the physical positions of the first
electronic device 118, controlled by the first user, and the second electronic device 120, controlled by the second user, may be used as reasonable approximations of the physical positions of the first user and the second user in real space. First and second electronic devices 118 and 120 may comprise sensors configured to detect the positions of the first electronic device 118 and second electronic device 120, respectively, in real space. In some such embodiments, processor 102 may receive sensor signals from a first accelerometer and a first gyroscope embedded within the first electronic device 118. Similarly, processor 102 may receive sensor signals from a second accelerometer and a second gyroscope embedded within the second electronic device 120. Based on these sensor signals, processor 102 may determine (via, for example, algorithms or a lookup table) the relative positions of the first electronic device 118 and the second electronic device 120 in real space. In such an embodiment, processor 102 may further determine the relative position of first electronic device 118 with respect to second electronic device 120 in real space. - In other embodiments, the relative positions of the first user and the second user comprise the virtual positions of virtual characters controlled by the first user and second user in a virtual environment. In such an embodiment,
processor 102 may determine the relative position of a virtual character controlled by the first user and the relative position of a virtual character controlled by the second user directly from data about the virtual environment. For example, in some embodiments, processor 102 may sample network 110 data to track the location of the first user's virtual character and the second user's virtual character. Based on the sampled data, processor 102 may determine the virtual positions of the first user's virtual character and the second user's virtual character, respectively. In such an embodiment, processor 102 may further determine the relative position of the first user's virtual character with respect to the second user's virtual character in the virtual environment.
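- Whether the positions come from real-space sensor signals or from sampled network data, the relative-position computation itself can be illustrated with a short sketch. The coordinate conventions (x east, y north) and the two-dimensional simplification are assumptions made for illustration:

```python
import math

def relative_position(pos_a, pos_b):
    """Distance and bearing of user A relative to user B (sketch).

    pos_a, pos_b: (x, y) coordinates, either estimated from
    accelerometer/gyroscope signals in real space or sampled from
    network 110 data for virtual characters.
    Returns (distance, bearing_deg), where 0 degrees means A is due
    north of B and 90 degrees means due east.
    """
    dx = pos_a[0] - pos_b[0]  # +x assumed to point east
    dy = pos_a[1] - pos_b[1]  # +y assumed to point north
    distance = math.hypot(dx, dy)
    bearing_deg = math.degrees(math.atan2(dx, dy)) % 360.0
    return distance, bearing_deg
```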
- In some embodiments, the characteristic external to the haptic event may comprise an environmental characteristic. In some embodiments, the environmental characteristic may be an environmental characteristic in real space. In such an embodiment, computing system 101 may comprise one or more sensors such as a temperature sensor, a humidity sensor, a camera, an accelerometer, a gyroscope, a sonar device, and/or other electronic devices configured to send sensor signals to processor 102. Processor 102 may determine an environmental characteristic directly from the sensor signal, or may apply the sensor signal data to an algorithm or a lookup table to determine the environmental characteristic. For example, in one such embodiment, processor 102 receives a sensor signal from a humidity sensor or temperature sensor and determines the humidity or temperature in the environment in which the first user and/or second user may be located. In another embodiment, processor 102 receives a sensor signal from a camera or sonar device and determines any environmental obstructions, like walls, in the environment in which the first user and/or second user may be located. In still another embodiment, processor 102 determines, based on a camera sensor signal, the medium in which the first user or second user is located, for example, if the first user is located in water. In yet another such embodiment, processor 102 receives sensor signals from a camera and determines whether or not the first user or second user is in a vehicle, the size of the vehicle, and/or the direction or velocity in which the vehicle is moving.
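- One way to read 'apply the sensor signal data to an algorithm or a lookup table' is a simple rule table mapping raw readings to a characteristic label. The thresholds below are invented purely for illustration:

```python
def classify_environment(humidity_pct, temperature_c, submerged):
    """Map raw sensor readings to an environmental characteristic.

    All thresholds are illustrative assumptions; the disclosure does
    not specify particular values.
    """
    if submerged:               # e.g., inferred from a camera signal
        return "water"
    if temperature_c <= 0.0:
        return "freezing air"
    if humidity_pct >= 90.0:
        return "humid air"
    return "dry air"
```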
- In other embodiments, the environmental characteristic may be a virtual environmental characteristic in a virtual environment. In such an embodiment, processor 102 may determine the environmental characteristic directly from data about the virtual environment. For example, in some embodiments, processor 102 samples network 110 data to track the presence or absence of environmental characteristics. Based on the sampled data, processor 102 may determine an environmental characteristic. In some embodiments, processor 102 may determine the virtual environmental characteristic directly from the sampled data, or may apply the sampled data to an algorithm or a lookup table to determine the virtual environmental characteristic. For example, in one such embodiment, processor 102 samples network data and applies an algorithm to determine the presence of an object, e.g., a virtual brick wall, in the virtual environment. Similarly, in some embodiments, processor 102 may sample network data and determine that the environmental characteristic comprises a physical principle, such as the Doppler effect. - In some embodiments,
processor 102 may apply data about the characteristics of the first haptic effect to an algorithm in order to determine the strength, duration, location, and/or other characteristics of the second haptic effect. For example, processor 102 may use the strength and intensity characteristics of the first haptic effect to determine what second haptic effect to generate and through which actuators to generate it in the second electronic device 120. In some embodiments, the processor 102 may determine the second haptic effect based on the real or virtual relative position of the first user with respect to the second user, or any real or virtual environmental characteristics, determined as described above. - The
processor 102 may rely on programming contained in haptic effect determination module 126 to determine the second haptic effect. For example, in some embodiments, haptic effect determination module 126 may determine the haptic effect to output, and which actuators to use to output the effect, based on algorithms. In some embodiments, such algorithms may assess the relative virtual position of a second user's virtual character with respect to a first user's virtual character. For example, in one embodiment, if a first user's virtual character is 40 meters northeast of the second user's virtual character, the processor 102 determines that the second haptic effect should be generated by actuators in the front right side of the second electronic device 120. Further, in such an embodiment, processor 102 may determine that the second haptic effect should be a substantially dampened version of the first haptic effect due to the 40-meter distance between the first user's virtual character and the second user's virtual character.
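- A sketch of such an algorithm follows, using the same bearing convention as the `relative_position()` sketch above. The four-corner actuator layout and the inverse-distance damping beyond an assumed five-meter full-strength range are illustrative choices, not taken from the disclosure; the sketch does reproduce the examples in the text (40 m northeast yields a substantially dampened "front right" effect, while a half meter yields no damping).

```python
import math

def route_second_effect(bearing_deg, distance_m, magnitude,
                        full_strength_m=5.0):
    """Choose an actuator zone and damp the effect by distance (sketch)."""
    # Quadrant selection: a bearing of 45 degrees (northeast) lands in
    # the "front right" zone of the second electronic device.
    ns = "front" if math.cos(math.radians(bearing_deg)) >= 0.0 else "rear"
    ew = "right" if math.sin(math.radians(bearing_deg)) >= 0.0 else "left"

    # Substantial damping at long range: 40 m -> factor 0.125 here;
    # anything within full_strength_m is not dampened at all.
    damping = min(1.0, full_strength_m / max(distance_m, full_strength_m))
    return f"{ns} {ew}", magnitude * damping
```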
- In some embodiments, processor 102 may determine the second haptic effect to output, and which actuators to use to output the second haptic effect, based on algorithms that assess the relative position of first electronic device 118 to second electronic device 120 in real space. For example, in one embodiment, if first electronic device 118 is a half meter northeast of second electronic device 120 in real space, the processor 102 determines that the second haptic effect should be generated by actuators in the front right side of the second electronic device 120. Further, in such an embodiment, processor 102 may determine that the second haptic effect should not be dampened due to the mere half-meter distance between the first electronic device 118 and the second electronic device 120. - In some embodiments, haptic
effect determination module 126 may comprise a haptic actuator lookup table. In one such embodiment, the lookup table may comprise data associating a haptic actuator of one type with a plurality of haptic actuators of different types that are capable of outputting similar haptic effects. For example, in one such embodiment, the lookup table may comprise data associating an ERM with a plurality of other haptic devices capable of outputting similar haptic effects as an ERM, such as an LRA, a piezoelectric actuator, an electric motor, or an electromagnetic actuator. In such an embodiment, processor 102 may receive data indicating that the first haptic effect was generated in the first electronic device 118 by a signal of a specific intensity and duration designated for a specific type of haptic actuator, for example, an ERM. Based on this data, in one such embodiment, processor 102 may consult the lookup table to determine what hardware in second haptic output device 130 can be used as a substitute to generate a second haptic effect with characteristics similar to the first haptic effect. For example, if second haptic output device 130 does not contain an ERM but does contain an electric motor, processor 102 consults the lookup table and determines that the electric motor may be able to act as a suitable substitute to generate the second haptic effect. - In other embodiments,
processor 102 may determine a default haptic effect. For example, in one such embodiment, second haptic output device 130 may not be able to generate a haptic effect, such as a vibration, due to lack of appropriate hardware. However, in such an embodiment, second haptic output device 130 may comprise an ESF actuator capable of varying the perceived coefficient of friction on the surface of second electronic device 120, which is the default haptic effect. Thus, in one such embodiment, processor 102 associates any first haptic effect comprising a vibration with a default second haptic effect comprising a perceived change in the coefficient of friction at the surface of second electronic device 120.
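- The substitution lookup and the default fallback just described might be combined as below. The table contents beyond the ERM row, the preference ordering, and the function names are assumptions:

```python
# Hypothetical lookup table: for a source actuator type, candidate
# substitutes capable of outputting similar effects, by preference.
ACTUATOR_SUBSTITUTES = {
    "ERM": ["LRA", "piezoelectric actuator", "electric motor",
            "electromagnetic actuator"],
}

def choose_actuator(source_type, available_hardware,
                    default_effect="ESF friction change"):
    """Pick hardware on the second device to reproduce the first
    effect, falling back to the device's default effect."""
    if source_type in available_hardware:
        return source_type
    for candidate in ACTUATOR_SUBSTITUTES.get(source_type, []):
        if candidate in available_hardware:
            return candidate
    return default_effect

# E.g., choose_actuator("ERM", {"electric motor"}) -> "electric motor";
# with no listed substitute available, the ESF default is returned.
```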
- In some embodiments, processor 102 may determine a second haptic effect based on data in a lookup table. In some such embodiments, the lookup table may comprise data pairing environmental characteristics with a plurality of haptic effect modifications. For example, in one such embodiment, the lookup table may pair a brick wall with a haptic effect modification, such as a decrease in haptic effect intensity by 30%. In such an embodiment, if there is a brick wall between a second user's virtual character and the first user's virtual character, processor 102 consults the lookup table and determines that the second haptic effect should be a modified version of the first haptic effect with 30% less intensity.
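- Only the brick-wall entry below comes from the text; the other rows and the dictionary layout are assumed for illustration:

```python
# Hypothetical environmental-characteristic lookup table.
INTENSITY_REDUCTION = {
    "brick wall": 0.30,    # from the example above: 30% less intensity
    "wooden fence": 0.10,  # assumed entry
    "open ground": 0.00,   # assumed entry
}

def modify_for_environment(intensity, characteristic):
    """Scale the first effect's intensity by the tabled reduction."""
    return intensity * (1.0 - INTENSITY_REDUCTION.get(characteristic, 0.0))
```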
- In some embodiments, processor 102 may make its determination based in part on other outside factors, such as the state of the gaming device. For example, in one such embodiment, processor 102 may make its determination based partially on the amount of battery life the gaming device has. In such an embodiment, haptic effect determination module 126 may receive data indicating that the first haptic effect comprised a short, intense vibration. Because the battery life on the second electronic device 120, for example a smart phone, may be low, processor 102 may determine that a longer, but significantly less intense, vibration may achieve substantially the same effect without depleting the battery life to a detrimental level.
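- A toy version of that battery-aware trade-off is shown below; the 20% threshold and the half-magnitude/double-duration exchange are invented values, since the disclosure states only the general idea:

```python
def battery_aware_effect(magnitude, duration_s, battery_pct):
    """Trade intensity for duration when battery life is low (sketch)."""
    if battery_pct < 20.0:  # assumed low-battery threshold
        return magnitude * 0.5, duration_s * 2.0
    return magnitude, duration_s
```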
- The method 700 continues at step 706 with the detection of a triggering event. Detection module 124 or processor 102 may detect the triggering event. In some embodiments, the triggering event may initiate storing of second haptic effects for subsequent playback. In some embodiments, the triggering event may initiate the sharing of haptic feedback between users. The triggering event may be user generated, such as by a button press, or software generated, such as when a virtual character is killed in a video game. - The
method 700 continues when processor 102 generates a second haptic effect signal (step 710) and transmits the second haptic effect signal (step 712) to second haptic output device 130, which outputs the haptic effect. The second haptic effect signal is based at least in part on the second haptic effect. In some embodiments, the processor 102 may access drive signals stored in memory 104 and associated with particular haptic effects. In one embodiment, a signal is generated by accessing a stored algorithm and inputting parameters associated with an effect. For example, in such an embodiment, an algorithm may output data for use in generating a drive signal based on amplitude and frequency parameters. As another example, a second haptic signal may comprise data sent to an actuator to be decoded by the actuator. For instance, the actuator may itself respond to commands specifying parameters such as amplitude and frequency.
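- As one plausible reading of 'a stored algorithm... based on amplitude and frequency parameters', the sketch below synthesizes sinusoidal drive-signal samples; the sample rate and the sine waveform are assumptions:

```python
import math

def sine_drive_signal(amplitude, frequency_hz, duration_s,
                      sample_rate_hz=8000):
    """Generate drive-signal samples from amplitude and frequency
    parameters, as a stored algorithm might (illustration only)."""
    n_samples = int(duration_s * sample_rate_hz)
    step = 2.0 * math.pi * frequency_hz / sample_rate_hz
    return [amplitude * math.sin(step * i) for i in range(n_samples)]
```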
- FIG. 8 is a flowchart showing another method for providing a shared haptic experience in one embodiment. In some embodiments, the steps in FIG. 8 may be implemented in program code that is executed by a processor, for example, the processor in a general purpose computer, a mobile device, or a server. In some embodiments, these steps may be implemented by a group of processors. The steps below are described with reference to components described above with regard to system 100 shown in FIG. 1. -
Method 800 begins at step 802, with the detection of a triggering event indicating that a plurality of first haptic effects should be stored for later playback. Detection module 124 or processor 102 may detect the triggering event. The triggering event may be user generated, such as by a button press, or software generated, such as when a virtual character is killed in a video game. - The
method 800 continues at step 804 when processor 102 determines a second plurality of haptic effects to generate based at least in part on each of a first plurality of haptic effects. The processor 102 may rely on programming contained in haptic effect determination module 126 to determine the second plurality of haptic effects. In some embodiments, processor 102 may use any of the methods discussed with respect to FIG. 7 to determine each of the second plurality of haptic effects. - The
method 800 continues at step 806 when processor 102 causes the second plurality of haptic effects to be stored in memory 104 for subsequent playback. Processor 102 may store the second plurality of haptic effects by type, name, duration, intensity, timestamp, or any other characteristics such that they can later be recalled and output in a sequential, or in some embodiments nonsequential, manner.
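- A sketch of such a store follows: effects are indexed by timestamp for sequential replay and by type for nonsequential recall. The dual-index layout and the hypothetical `output_device.output()` interface are assumptions:

```python
from collections import defaultdict

effects_by_time = []                  # (timestamp, effect) pairs
effects_by_type = defaultdict(list)   # e.g., "vibration" -> [effect, ...]

def store_second_effect(timestamp, effect_type, effect):
    """Index a determined second haptic effect for later playback."""
    effects_by_time.append((timestamp, effect))
    effects_by_type[effect_type].append(effect)

def replay_sequential(output_device):
    """Output the stored effects in timestamp order."""
    for _, effect in sorted(effects_by_time, key=lambda pair: pair[0]):
        output_device.output(**effect)
```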
- Method 800 continues at step 808, where system 100 waits for an event indicating the saved plurality of second haptic effects should be played back. The event may be user generated, such as by pressing a button, or software generated, such as by the death of a virtual character in a video game. If the playback event occurs, the method 800 continues to step 810. Otherwise, the method 800 returns to step 808. - In response to the playback event, the
method 800 continues at steps 810 and 812, when processor 102 generates a second plurality of haptic effect signals based on the stored second plurality of haptic effects and transmits each of the second plurality of haptic signals to the second haptic output device 130, which outputs the haptic effects. In some embodiments, the processor 102 may access drive signals or algorithms stored in memory 104 and associated with particular haptic effects to generate the second plurality of haptic effect signals. In some embodiments, processor 102 may use any of the methods discussed with respect to FIG. 7 to generate and transmit each of the second plurality of haptic signals to the second haptic output device 130. - There are numerous advantages of providing shared haptic experiences. Such systems may provide more compelling gaming experiences by allowing users to feel the sensations their fellow game players feel. For example, if a first user's gaming device vibrates because a virtual character controlled by the first user is shot, a second user's gaming device may output substantially the same, or a modified version of, the vibration, providing a more interactive experience. This may increase overall user satisfaction.
- In some embodiments, sharing haptic feedback among users may lead to improved collaboration when performing tasks in a virtual environment. This is because users will have a better understanding of what actions their teammates are taking by feeling the same, or modified versions of, the associated haptic responses. For example, two users may be playing a game in which they have to collaborate to achieve a military objective. The first user may be controlling a virtual gunner character, while the second user is controlling a virtual medic character. As the first user manipulates his or her virtual character to take part in an attack on a military stronghold, the first user's gaming device may vibrate if his or her virtual gunner character gets shot. In some embodiments, this haptic content may be modified and shared with the second user, causing the second user's gaming device to also vibrate. For example, in such an embodiment, if the gunner character is shot 500 meters northwest of the medic character's position, the second user feels a weak vibration on the front left side of his or her controller. This may indicate to the second user how far away, and in what direction, the first user's character may be located so he or she may render aid.
- Further, some embodiments may improve virtual training programs. For example, an expert may be able to play a videogame or perform a task in a virtual environment and save his or her audio, video, and, in some embodiments, haptic content. In such an embodiment, a novice may be able to play back the saved content and learn how to play the game or perform the task. Playing back haptic content, in addition to audio and video, may make such learning more effective.
- The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
- Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
- Also, configurations may be described as a process that is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
- Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not bound the scope of the claims.
- The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
- Embodiments in accordance with aspects of the present subject matter can be implemented in digital electronic circuitry, in computer hardware, firmware, software, or in combinations of the preceding. In one embodiment, a computer may comprise a processor or processors. The processor comprises or has access to a computer-readable medium, such as a random access memory (RAM) coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs including a sensor sampling routine, selection routines, and other routines to perform the methods described above.
- Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
- Such processors may comprise, or may be in communication with, media, for example tangible computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, all electronic, optical, magnetic, or other storage devices capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. Also, various other devices may comprise computer-readable media, such as a router, private or public network, or other transmission device. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
- While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.