US20190265794A1 - Haptic feedback for opportunistic displays - Google Patents
Info
- Publication number
- US20190265794A1 (U.S. application Ser. No. 16/227,681)
- Authority
- US
- United States
- Prior art keywords
- display device
- user
- computing device
- haptic
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
Abstract
One illustrative system disclosed herein includes a computing device in communication with a sensor configured to detect a display device near the sensor or the computing device, or within a field of view of a user of the computing device. The sensor can also be configured to transmit a signal associated with the display device to a processor in communication with the sensor. The processor is configured to determine an availability of the display device to display or output content (e.g., texts, images, sounds, videos, etc.) and a location of the display device based on the signal. The processor is also configured to determine a haptic effect based on the availability and the location of the display device and transmit a haptic signal associated with the haptic effect. The illustrative system also includes a haptic output device configured to receive the haptic signal and output the haptic effect.
Description
- The present application is a continuation of U.S. Nonprovisional application Ser. No. 15/176,692, filed on Jun. 8, 2016, entitled “Haptic Feedback for Opportunistic Displays,” the entire contents of which are hereby incorporated by reference for all purposes.
- The present disclosure relates generally to user interface devices. More specifically, but not by way of limitation, this disclosure relates to haptic feedback for opportunistic displays.
- Many modern devices include a display device that can be used to provide content to a user of the device. The content can include text, images, sounds, videos, etc. A spectator (e.g., another user looking at the device) may also view content provided on the display device. Some display devices, however, may lack haptic feedback capabilities. Moreover, the spectator's attention may not be drawn to the display if the display is providing content that may be relevant to the spectator.
- Various embodiments of the present disclosure provide systems and methods for providing haptic feedback for opportunistic displays.
- In one embodiment, a system of the present disclosure may comprise a computing device and a sensor communicatively coupled to the computing device. The sensor can be configured to detect a display device proximate to the sensor and within a field of view of a user of the computing device and transmit a signal associated with the display device. The system also comprises a processor in communication with the sensor. The processor is configured to receive the signal from the sensor, determine an availability of the display device to display or output content (e.g., texts, images, sounds, videos, animated graphics, etc.) to the user, and determine a location of the display device based on the signal. The processor can also be configured to determine a first haptic effect based at least in part on the availability of the display device and the location of the display device. The processor can transmit a first haptic signal associated with the first haptic effect. The system may further comprise a haptic output device configured to receive the first haptic signal and output the first haptic effect.
- In another embodiment, a method of the present disclosure may comprise: detecting, by a sensor communicatively coupled to a computing device, a display device proximate to the sensor and within a field of view of a user of the computing device; transmitting, by the sensor, a sensor signal associated with the display device to a processor. The method may also comprise determining, by the processor, an availability of the display device to output content to the user and a location of the display device based at least in part on the sensor signal. The method may also comprise determining, by the processor, a first haptic effect based at least in part on the availability of the display device and the location of the display device. The method may also comprise transmitting, by the processor, a first haptic signal associated with the first haptic effect to a haptic output device. The haptic output device may be configured to receive the first haptic signal and output the first haptic effect. Yet another embodiment comprises a computer-readable medium for implementing such a method.
- These illustrative embodiments are mentioned not to limit or define the limits of the present subject matter, but to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, and further description is provided there. Advantages offered by various embodiments may be further understood by examining this specification and/or by practicing one or more embodiments of the claimed subject matter.
- A full and enabling disclosure is set forth more particularly in the remainder of the specification. The specification makes reference to the following appended figures.
- FIG. 1 is a block diagram showing a system for haptic feedback for opportunistic displays according to one embodiment.
- FIG. 2 shows an embodiment of a system for haptic feedback for opportunistic displays.
- FIG. 3 shows another embodiment of a system for haptic feedback for opportunistic displays.
- FIG. 4 is a flow chart of steps for performing a method for providing haptic feedback for opportunistic displays according to one embodiment.
- FIG. 5 is a flow chart of steps for performing another method for providing haptic feedback for opportunistic displays according to another embodiment.
- Reference now will be made in detail to various and alternative illustrative embodiments and to the accompanying drawings. Each example is provided by way of explanation, and not as a limitation. It will be apparent to those skilled in the art that modifications and variations can be made. For instance, features illustrated or described as part of one embodiment may be used in another embodiment to yield a still further embodiment. Thus, it is intended that this disclosure include modifications and variations that come within the scope of the appended claims and their equivalents.
- One illustrative embodiment of the present disclosure comprises a computing device, such as a smartphone or a smartwatch. The computing device comprises a sensor, a memory, a haptic output device, and a processor in communication with each of these elements.
- In the illustrative embodiment, the sensor may detect an opportunistic display device near the computing device. An opportunistic display device includes a display device that is proximate to the computing device, the sensor, or a user of the computing device (e.g., a user wearing the smartwatch or a user of the smartphone) and can be used to output or display content to the user of the computing device or another user. For example, the opportunistic display device can be a display device on another smartphone, smartwatch, or other display device near the user and within the field of view of the user of the computing device and able to communicate with the computing device such that the display device can be used to provide content to the user. Thus, for example, the user could receive content on the user's computing device but wish to see the content displayed on the opportunistic display. For instance, if the user is sitting across from another person and can see the other person's smartwatch display, the user may wish certain text messages to be displayed on the other person's smartwatch display while the smartwatch is in the user's field of view. Doing so allows the user to see the content of the text message without having to access or view the user's own computing device and with minimal disruption to the other person.
- In the illustrative embodiment, the sensor detects the opportunistic display device and transmits a signal to the processor, which determines an availability and location of the opportunistic display device based on the signal. The availability of the opportunistic display device can indicate that the opportunistic display device is available to display content to the user and/or is displaying content to the user. In such an embodiment, the processor can be configured to transmit data that includes content to the opportunistic display device to be output by the opportunistic display device. In the illustrative embodiment, the processor is configured to determine a first haptic effect based at least in part on the availability of the opportunistic display device and a second haptic effect based at least in part on the location of the opportunistic display device. These haptic effects may be combined to form a single, composite haptic effect.
- In the illustrative embodiment, the processor is configured to transmit a first haptic signal associated with the first haptic effect and a second haptic signal associated with the second haptic effect to a haptic output device. The haptic output device is configured to output one or more first haptic effects (e.g., textures, vibrations, stroking sensations, and/or stinging sensations) associated with the availability of the opportunistic display device. In the illustrative embodiment, the one or more first haptic effects can indicate to the user of the computing device that the opportunistic display device is available (e.g., displaying content received from the processor, which may be relevant to the user).
- In the illustrative embodiment, the haptic output device is also configured to output one or more second haptic effects associated with the location of the opportunistic display device. In such an embodiment, the one or more second haptic effects indicate to the user of the computing device the location of the opportunistic display device (e.g., where the user should look to view content being displayed on the opportunistic display device). In this manner, the first and second haptic effects indicate to the user of the computing device when and where to look to view content displayed on the opportunistic display device.
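As a concrete sketch of this two-part cue, the example below pairs an availability effect ("when to look") with a location-dependent effect ("where to look"). All names, thresholds, and parameter values here are hypothetical illustrations, not the disclosed implementation:

```python
# Hypothetical sketch of the "when and where" cues described above.
from dataclasses import dataclass

@dataclass
class HapticEffect:
    pattern: str        # e.g., "pulse", "left", "right", "front"
    intensity: float    # 0.0 (off) .. 1.0 (strongest)
    duration_ms: int

def availability_effect() -> HapticEffect:
    """First effect: the opportunistic display is showing relevant content."""
    return HapticEffect(pattern="pulse", intensity=0.6, duration_ms=150)

def location_effect(bearing_deg: float, distance_m: float) -> HapticEffect:
    """Second effect: encodes where to look; nearer displays feel stronger."""
    intensity = max(0.2, min(1.0, 1.0 - distance_m / 10.0))
    if bearing_deg < -30:
        pattern = "left"
    elif bearing_deg > 30:
        pattern = "right"
    else:
        pattern = "front"
    return HapticEffect(pattern=pattern, intensity=intensity, duration_ms=300)

if __name__ == "__main__":
    print(availability_effect())                               # when to look
    print(location_effect(bearing_deg=-45.0, distance_m=2.5))  # where to look
```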
- In the illustrative embodiment, the opportunistic display device may comprise a sensor, a memory, a haptic output device, and a processor in communication with each of these elements. In such an embodiment, the sensor may detect a field of view of the user of the computing device or a direction of an eye gaze of the user. For example, the opportunistic display may comprise a video screen with an embedded or proximate sensor for detecting a passing user's field of view with respect to the video screen. The sensor can transmit a signal to the processor, which transmits a haptic signal based on the direction of the field of view of the user of the computing device to the haptic output device. The haptic output device is configured to receive the haptic signal from the processor and output one or more haptic effects associated with the field of view of the user of the computing device. In the illustrative embodiment, the haptic effects can indicate to another user associated with the opportunistic display device (e.g., a user wearing a smartwatch that is being used as the opportunistic display device) that the user of the computing device is looking at the opportunistic display device.
- These illustrative examples are given to introduce the reader to the general subject matter discussed here and are not intended to limit the scope of the disclosed concepts. The following sections describe various additional features and examples with reference to the drawings in which like numerals indicate like elements, and directional descriptions are used to describe the illustrative examples but, like the illustrative examples, should not be used to limit the present disclosure.
- FIG. 1 is a block diagram showing a system 100 for haptic feedback for opportunistic displays according to one embodiment. In the embodiment depicted in FIG. 1, the system 100 comprises a computing device 101 having a processor 102 in communication with other hardware via a bus 106. The computing device 101 may comprise, for example, a mobile device (e.g., a smartphone), tablet, e-reader, smartwatch, a head-mounted display, glasses, a wearable device, etc.
- A memory 104, which can comprise any suitable tangible (and non-transitory) computer-readable medium such as RAM, ROM, EEPROM, or the like, embodies program components that configure operation of the computing device 101. In the embodiment shown, the computing device 101 further includes one or more network interface devices 110, input/output (I/O) interface components 112, and storage 114.
- Network interface device 110 can represent one or more of any components that facilitate a network connection. Examples include, but are not limited to, wired interfaces such as Ethernet, USB, IEEE 1394, and/or wireless interfaces such as IEEE 802.11, Bluetooth, or radio interfaces for accessing cellular telephone networks (e.g., a transceiver/antenna for accessing a CDMA, GSM, UMTS, or other mobile communications network).
- I/O components 112 may be used to facilitate wired or wireless connection to devices such as one or more displays 134, game controllers, keyboards, mice, joysticks, cameras, buttons, speakers, microphones, and/or other hardware used to input or output data. Storage 114 represents nonvolatile storage such as magnetic, optical, or other storage media included in computing device 101 or coupled to the processor 102.
- In some embodiments, the computing device 101 includes a touch sensitive surface 116. Touch sensitive surface 116 represents any surface that is configured to sense tactile input of a user. One or more touch sensors 108 are configured to detect a touch in a touch area (e.g., when an object contacts the touch sensitive surface 116) and transmit signals associated with the touch to the processor 102. Any suitable number, type, or arrangement of touch sensors 108 can be used. For example, resistive and/or capacitive sensors may be embedded in the touch sensitive surface 116 and used to determine the location of a touch and other information, such as pressure, speed, and/or direction.
- The touch sensor 108 can additionally or alternatively comprise other types of sensors. For example, optical sensors with a view of the touch sensitive surface 116 may be used to determine the touch position. As another example, the touch sensor 108 may comprise an LED (Light Emitting Diode) finger detector mounted on the side of a display. In some embodiments, the touch sensor 108 may be configured to detect multiple aspects of the user interaction. For example, the touch sensor 108 may detect the speed, pressure, and direction of a user interaction and incorporate this information into the signal transmitted to the processor 102.
- In some embodiments, the computing device 101 comprises a touch-enabled display that combines a touch sensitive surface 116 and a display 134 of the computing device 101. The touch sensitive surface 116 may correspond to the display 134 exterior or one or more layers of material above components of the display 134. In other embodiments, the computing device 101 comprises a touch sensitive surface 116, which may be mapped to a graphical user interface provided in a display 134 that is included in a system interfaced to the computing device 101.
- In some embodiments, the computing device 101 comprises a camera 130. Although the camera 130 is depicted in FIG. 1 as being internal to the computing device 101, in some embodiments the camera 130 may be external to and in communication with the computing device 101. As an example, the camera 130 may be external to and in communication with the computing device 101 via wired interfaces such as, for example, Ethernet, USB, or IEEE 1394, and/or wireless interfaces such as IEEE 802.11, Bluetooth, or radio interfaces.
- In some embodiments, the computing device 101 comprises one or more sensors 132. In some embodiments, the sensor 132 may comprise, for example, a gyroscope, an accelerometer, a global positioning system (GPS) unit, a range sensor, a depth sensor, a Bluetooth device, a camera, an infrared sensor, a quick response (QR) code sensor, etc. In some embodiments, the sensor 132 may comprise a camera including, as an example, a body-worn camera (e.g., Google Glass) that can identify devices. In some embodiments, the computing device 101 may comprise a wearable device (e.g., glasses) and the sensor 132 may comprise any device for detecting eye gaze, line-of-sight, or field of view of a user of the computing device 101. In some embodiments, the sensor 132 is external to the computing device 101 and in wired or wireless communication with the computing device.
- In some embodiments, the sensor 132 may detect an availability, presence, or location of an opportunistic display device (e.g., a display device 136, or another computing device) that is proximate to the sensor 132, the computing device 101, or a user of the computing device 101. As an example, the sensor 132 may be a Bluetooth device or other network device configured to detect a presence and location of another Bluetooth device (e.g., a Bluetooth display device) or network device by analyzing signal strength between the sensor 132 and the display device. In some embodiments, the sensor 132 may detect a distance between the sensor 132, the computing device 101, or a user of the computing device 101, and the display device (e.g., based on the strength of the Bluetooth signal between the sensor 132 and the display device). As another example, the sensor 132 may determine the availability, presence, or location of the display device using indoor positioning technology. In still another example, the sensor 132 may transmit data about a position or location of the sensor 132, the computing device 101, or a user of the computing device 101 to a server or other device. In response, the server or other device may transmit data about a position, location, or availability of other devices (e.g., display device 136) that are near the sensor 132 or computing device 101 based on the data transmitted by the sensor 132.
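For illustration, proximity from Bluetooth signal strength is often approximated with a log-distance path-loss model. The sketch below is a hypothetical helper, not part of the disclosure; the reference power and path-loss exponent are assumed values that would need per-device calibration:

```python
# Rough distance estimate from Bluetooth RSSI using a log-distance
# path-loss model. Constants are assumed and calibration-dependent.
def estimate_distance_m(rssi_dbm: float,
                        tx_power_dbm: float = -59.0,  # RSSI at 1 m (assumed)
                        path_loss_exponent: float = 2.0) -> float:
    """Return an approximate distance in meters for a measured RSSI."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def is_available(rssi_dbm: float, max_range_m: float = 5.0) -> bool:
    """Treat a display as 'available' when it is within a usable range."""
    return estimate_distance_m(rssi_dbm) <= max_range_m

if __name__ == "__main__":
    for rssi in (-50, -65, -80):
        print(rssi, round(estimate_distance_m(rssi), 2), is_available(rssi))
```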
- In some embodiments, the sensor 132 may detect other data about devices near the sensor 132. For example, the sensor 132 may detect a type, size, shape, etc., of devices near the sensor 132 (e.g., the display device 136).
- In some embodiments, the processor 102 may be in communication with a single sensor 132 and, in other embodiments, the processor 102 may be in communication with a plurality of sensors 132, for example, a gyroscope, an accelerometer, and a Bluetooth device. The sensor 132 is configured to transmit sensor signals to the processor 102.
- In some embodiments, the system 100 further includes a haptic output device 118 in communication with the processor 102. The haptic output device 118 is configured to output a haptic effect in response to a haptic signal. For example, the haptic output device 118 can output a haptic effect in response to a haptic signal from the processor 102. In some embodiments, the haptic output device 118 is configured to output a haptic effect comprising, for example, a vibration, a squeeze, a poke, a change in a perceived coefficient of friction, a simulated texture, a stroking sensation, an electro-tactile effect, a surface deformation (e.g., a deformation of a surface associated with the computing device 101), and/or a puff of a solid, liquid, or gas. Further, some haptic effects may use multiple haptic output devices 118 of the same or different types in sequence and/or in concert. Although a single haptic output device 118 is shown in FIG. 1, some embodiments may use multiple haptic output devices 118 of the same or different type to produce haptic effects.
- In some embodiments, the haptic output device 118 is in communication with the processor 102 and internal to the computing device 101. In other embodiments, the haptic output device 118 is external to the computing device 101 and in communication with the computing device 101 (e.g., via wired interfaces such as Ethernet, USB, or IEEE 1394, and/or wireless interfaces such as IEEE 802.11, Bluetooth, or radio interfaces). For example, the haptic output device 118 may be associated with (e.g., coupled to) a wearable device (e.g., a wristband, bracelet, hat, headband, etc.) and configured to receive haptic signals from the processor 102.
- In some embodiments, the haptic output device 118 is configured to output a haptic effect comprising a vibration. The haptic output device 118 may comprise, for example, one or more of a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA).
- In some embodiments, the haptic output device 118 is configured to output a haptic effect modulating the perceived coefficient of friction of a surface associated with the haptic output device 118. In one embodiment, the haptic output device 118 comprises an ultrasonic actuator. An ultrasonic actuator may vibrate at an ultrasonic frequency, for example 20 kHz, increasing or reducing the perceived coefficient of friction of the surface associated with the haptic output device 118. In some embodiments, the ultrasonic actuator may comprise a piezo-electric material.
- In some embodiments, the haptic output device 118 uses electrostatic attraction, for example by use of an electrostatic actuator, to output a haptic effect. The haptic effect may comprise a simulated texture, a simulated vibration, a stroking sensation, or a perceived change in a coefficient of friction on a surface associated with the computing device 101 (e.g., the touch sensitive surface 116). In some embodiments, the electrostatic actuator may comprise a conducting layer and an insulating layer. The conducting layer may be any semiconductor or other conductive material, such as copper, aluminum, gold, or silver. The insulating layer may be glass, plastic, polymer, or any other insulating material. Furthermore, the processor 102 may operate the electrostatic actuator by applying an electric signal, for example an AC signal, to the conducting layer. In some embodiments, a high-voltage amplifier may generate the AC signal. The electric signal may generate a capacitive coupling between the conducting layer and an object (e.g., a user's finger or other body part, or a stylus) near or touching the touch sensitive surface 116. Varying the levels of attraction between the object and the conducting layer can vary the haptic effect perceived by a user.
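As a rough illustration of the drive-signal idea, an AC signal whose amplitude tracks the desired friction level, consider the sketch below. The carrier frequency, amplitude scale, and function names are invented for the example; a real driver would feed such samples to a high-voltage amplifier:

```python
# Illustrative electrostatic-friction drive signal: an AC carrier whose
# amplitude sets the perceived friction level. All values are arbitrary
# example numbers, not device specifications.
import math

def esf_waveform(duration_s: float = 0.1,
                 carrier_hz: float = 1000.0,
                 friction_level: float = 0.5,   # 0.0 .. 1.0
                 sample_rate: int = 10000):
    """Return signal samples intended for the actuator's conducting layer."""
    n = int(duration_s * sample_rate)
    return [friction_level * math.sin(2 * math.pi * carrier_hz * t / sample_rate)
            for t in range(n)]

if __name__ == "__main__":
    samples = esf_waveform()
    print(len(samples), round(max(samples), 3))  # 1000 samples, peak ~ 0.48
```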
- In some embodiments, the haptic output device 118 comprises a deformation device configured to output a deformation haptic effect. The deformation haptic effect may comprise raising or lowering portions of a surface associated with the computing device 101. For example, the deformation haptic effect may comprise raising portions of the touch sensitive surface 116. In some embodiments, the deformation haptic effect may comprise bending, folding, rolling, twisting, squeezing, flexing, changing the shape of, or otherwise deforming a surface associated with the computing device 101. For example, the deformation haptic effect may apply a force on the computing device 101 or a surface associated with the computing device 101 (e.g., the touch sensitive surface 116), causing it to bend, fold, roll, twist, squeeze, flex, change shape, or otherwise deform.
- In some embodiments, the haptic output device 118 comprises fluid configured for outputting a deformation haptic effect (e.g., for bending or deforming a surface associated with the computing device 101). For example, the fluid may comprise a smart gel. A smart gel comprises a fluid with mechanical or structural properties that change in response to a stimulus or stimuli (e.g., an electric field, a magnetic field, temperature, ultraviolet light, shaking, or a pH variation). For instance, in response to a stimulus, a smart gel may change in stiffness, volume, transparency, and/or color. In some embodiments, stiffness may comprise the resistance of a surface associated with the computing device 101 (e.g., the touch sensitive surface 116) against deformation. In some embodiments, one or more wires may be embedded in or coupled to the smart gel. As current runs through the wires, heat is emitted, causing the smart gel to expand or contract, which may cause the computing device 101 or a surface associated with the computing device 101 to deform.
- As another example, the fluid may comprise a rheological (e.g., a magneto-rheological or electro-rheological) fluid. A rheological fluid comprises metal particles (e.g., iron particles) suspended in a fluid (e.g., oil or water). In response to an electric or magnetic field, the order of the molecules in the fluid may realign, changing the overall damping and/or viscosity of the fluid. This may cause the computing device 101 or a surface associated with the computing device 101 to deform.
- In some embodiments, the haptic output device 118 comprises a mechanical deformation device. For example, in some embodiments, the haptic output device 118 may comprise an actuator coupled to an arm that rotates a deformation component. The deformation component may comprise, for example, an oval, starburst, or corrugated shape. The deformation component may be configured to move a surface associated with the computing device 101 at some rotation angles but not others. The actuator may comprise a piezo-electric actuator, rotating/linear actuator, solenoid, an electroactive polymer actuator, macro fiber composite (MFC) actuator, shape memory alloy (SMA) actuator, and/or other actuator. As the actuator rotates the deformation component, the deformation component may move the surface, causing it to deform. In such an embodiment, the deformation component may begin in a position in which the surface is flat. In response to receiving a signal from the processor 102, the actuator may rotate the deformation component. Rotating the deformation component may cause one or more portions of the surface to raise or lower. The deformation component may, in some embodiments, remain in this rotated state until the processor 102 signals the actuator to rotate the deformation component back to its original position.
- Further, other techniques or methods can be used to deform a surface associated with the computing device 101. For example, the haptic output device 118 may comprise a flexible surface layer configured to deform its surface or vary its texture based upon contact from a surface reconfigurable haptic substrate (including, but not limited to, e.g., fibers, nanotubes, electroactive polymers, piezoelectric elements, or shape memory alloys). In some embodiments, the haptic output device 118 is deformed, for example, with a deforming mechanism (e.g., a motor coupled to wires), air or fluid pockets, local deformation of materials, resonant mechanical elements, piezoelectric materials, micro-electromechanical systems ("MEMS") elements or pumps, thermal fluid pockets, variable porosity membranes, or laminar flow modulation.
- Turning to memory 104, modules 124, 126, 128, and 129 are depicted to show how a device can be configured in some embodiments to provide haptic feedback for opportunistic displays.
- In this example, a detection module 124 configures the processor 102 to monitor the touch sensitive surface 116 via the touch sensor 108 to determine a position of a touch. For example, the detection module 124 may sample the touch sensor 108 in order to track the presence or absence of a touch and, if a touch is present, to track one or more of the location, path, velocity, acceleration, pressure, and/or other characteristics of the touch over time.
- In some embodiments, a content provision module 129 configures the processor 102 to provide content (e.g., texts, images, sounds, videos, etc.) to a user (e.g., to a user of the computing device 101 or another user). As an example, the content provision module 129 configures the processor 102 to provide an image in any format (e.g., in an animated graphics interchange format) to the user. If the content includes computer-generated images, the content provision module 129 is configured to generate the images for display on a display device (e.g., the display 134 of the computing device 101, the display device 136, or another display communicatively coupled to the processor 102). If the content includes video and/or still images, the content provision module 129 is configured to access the video and/or still images and generate views of the video and/or still images for display on the display device. If the content includes audio content, the content provision module 129 is configured to generate electronic signals that will drive a speaker, which may be part of the display device, to output corresponding sounds. In some embodiments, the content, or the information from which the content is derived, may be obtained by the content provision module 129 from the storage 114, which may be part of the computing device 101, as illustrated in FIG. 1, or may be separate from the computing device 101 and communicatively coupled to the computing device 101. In some embodiments, the content provision module 129 can cause the processor 102 to transmit content to another device. As an example, the content provision module 129 can generate or access content and cause the processor 102 to transmit the content to the display device 136.
- In some embodiments, the haptic effect determination module 126 represents a program component that analyzes data to determine a haptic effect to generate. The haptic effect determination module 126 may comprise code that selects one or more haptic effects to output using one or more algorithms or lookup tables. In some embodiments, the haptic effect determination module 126 comprises one or more algorithms or lookup tables usable by the processor 102 to determine a haptic effect.
- Particularly, in some embodiments, the haptic effect determination module 126 may determine a haptic effect based at least in part on sensor signals received from the sensor 132. For example, the processor 102 may receive sensor signals from the sensor 132 and determine an availability of a display device (e.g., a display device 136) near the computing device 101. The haptic effect determination module 126 may determine a first haptic effect based at least in part on the determined availability of the display device. For example, in one such embodiment, the haptic effect determination module 126 may determine a first haptic effect that is output to a user associated with the computing device 101 (e.g., a user wearing the computing device) to indicate to the user that the display device is available (e.g., displaying content, which may be relevant to the user).
- For instance, the display device 136 may be near the computing device 101 and the sensor 132 can detect the display device 136 via Bluetooth (e.g., based on a strength of the Bluetooth signal between the sensor 132 and the display device 136). The computing device 101 can receive a signal from the sensor 132 and determine that the display device 136 is available (e.g., that the display device 136 is discoverable and available to connect to the computing device 101 via Bluetooth to receive content from the computing device 101). The haptic effect determination module 126 may determine a vibration or a series of vibrations to be output to indicate to the user that the display device 136 is near the user and is receiving content from the computing device 101 via Bluetooth to be displayed by the display device 136.
- In some embodiments, the processor 102 may receive sensor signals from the sensor 132 and determine a location or position of the display device. In such an embodiment, the processor 102 may determine a distance between the sensor 132, the computing device 101, or a user associated with the computing device 101 or sensor 132, and the display device 136. The haptic effect determination module 126 may determine a second haptic effect based at least in part on the determined location or position of the display device. The second haptic effect may include one or more haptic effects that can indicate the location or position of the display device 136. In some embodiments, the haptic effect determination module 126 may determine a characteristic (e.g., magnitude, duration, location, type, frequency, etc.) of a haptic effect based on the location or position of the display device 136. For example, in one such embodiment, the haptic effect determination module 126 may determine a second haptic effect that is output to the user of the computing device 101 to indicate to the user the location (e.g., the vertical or horizontal position) of the display device 136 relative to the user.
- As an example, the haptic effect determination module 126 may determine a haptic effect if the display device 136 is positioned to the left of the user, another haptic effect if the display device 136 is in front of the user, and still another haptic effect if the display device 136 is above the user. As another example, the haptic effect determination module 126 may determine a weak or short haptic effect if the user is far from the display device 136 or a strong or long haptic effect if the user is near the display device 136.
- In some embodiments, the haptic effect determination module 126 may determine sequential haptic feedback to indicate to the user the location of the display device 136 relative to the user. In such embodiments, the haptic effect determination module 126 may determine one or more haptic effects that can create the perception of a flow toward a direction (e.g., by determining a series of haptic outputs along an arm of a user), which can indicate a direction of the display device 136 relative to the user.
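To make this mapping concrete, the sketch below shows one way a determination module might translate relative bearing and distance into effect parameters, including the sequential "flow" across actuators along the user's forearm described above. The three-actuator layout, thresholds, and all names are assumptions for illustration, not the patent's implementation:

```python
# Hypothetical mapping from a display's relative position to haptic
# effect parameters; a timed sequence across forearm actuators creates
# a perceived "flow" toward the display. Layout and values are assumed.
from typing import List, Tuple

def directional_flow(bearing_deg: float,
                     actuators: Tuple[str, ...] = ("wrist", "mid_arm", "elbow"),
                     step_ms: int = 120) -> List[Tuple[str, int]]:
    """Return (actuator, start_time_ms) pairs whose firing order suggests
    direction: a positive bearing plays wrist -> elbow, negative reverses it."""
    order = actuators if bearing_deg >= 0 else tuple(reversed(actuators))
    return [(name, i * step_ms) for i, name in enumerate(order)]

def intensity_for_distance(distance_m: float) -> float:
    """Stronger effects for nearer displays, clamped to [0.1, 1.0]."""
    return max(0.1, min(1.0, 1.0 / (1.0 + distance_m)))

if __name__ == "__main__":
    print(directional_flow(bearing_deg=40.0))    # flow toward the right
    print(directional_flow(bearing_deg=-40.0))   # flow toward the left
    print(intensity_for_distance(0.5), intensity_for_distance(8.0))
```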
- In some embodiments, the computing device 101 may include one or more haptic output devices 118 for providing various output effects associated with the location of a device (e.g., the display device 136) relative to the user of the computing device 101.
- In some embodiments, the processor 102 may receive sensor signals from the sensor 132 and determine a size, type, shape, or other attribute of the display device. In such an embodiment, the haptic effect determination module 126 may determine one or more haptic effects to indicate the size, type, shape, or other attribute of the display device. For example, the haptic effects may be selected based on the size of the display device (e.g., a strong vibration if the display device is large, for example, a large LCD screen, and a weaker vibration if the display device is small, for example, a smartwatch).
- In another embodiment, the haptic effect determination module 126 may comprise code that determines a haptic effect based on content provided by the content provision module 129. For example, the content provision module 129 may provide visual content to be output on the display device 136. In one embodiment, the haptic effect determination module 126 may determine a haptic effect associated with the visual content. For example, in one such embodiment, the haptic effect determination module 126 may determine a haptic effect for providing a haptic track associated with a video being provided by the display device 136. A haptic track can include a haptic effect (e.g., a vibration) or a series of haptic effects that correspond to events occurring in the video being provided. For instance, if the video includes a series of explosions, the haptic track can be a series of vibrations that correspond to each explosion. Thus, as the user watches the video, the user may perceive the haptic effects associated with the video.
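For illustration, a haptic track of this kind can be modeled simply as a list of timestamped effects keyed to video events. The event data and field names below are invented for the example:

```python
# Schematic haptic track: timestamped vibrations aligned with video
# events (the event list here is invented for illustration).
video_events = [
    {"t_s": 1.2, "kind": "explosion"},
    {"t_s": 3.8, "kind": "explosion"},
    {"t_s": 5.0, "kind": "dialogue"},
]

def build_haptic_track(events):
    """Map each qualifying video event to a vibration in the track."""
    track = []
    for ev in events:
        if ev["kind"] == "explosion":
            track.append({"t_s": ev["t_s"], "intensity": 1.0, "duration_ms": 250})
    return track

if __name__ == "__main__":
    for effect in build_haptic_track(video_events):
        print(effect)  # one vibration per explosion, at the matching time
```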
- In some embodiments, the haptic effect determination module 126 may comprise code that determines, based on a location of a touch on the touch sensitive surface 116, a haptic effect to output and code that selects one or more haptic effects to provide in order to simulate the effect. For example, different haptic effects may be selected based on the location of a touch in order to simulate the presence of a virtual object (e.g., a virtual piece of furniture, automobile, animal, cartoon character, button, lever, logo, or person) on the display 134. Further, in some embodiments, the haptic effect determination module 126 may comprise code that determines, based on the size, color, location, movement, and/or other characteristics of a virtual object, a haptic effect to output and code that selects one or more haptic effects to provide in order to simulate the effect. For example, haptic effects may be selected based on the color of a virtual object (e.g., a strong vibration if the virtual object is red, and a weaker vibration if the virtual object is green).
- In some embodiments, the haptic effect determination module 126 comprises code that determines a haptic effect based on an event. An event, as used herein, is any interaction, action, collision, or other event which occurs during operation of the computing device 101 and which can potentially comprise an associated haptic effect. In some embodiments, an event may comprise user input (e.g., a button press, manipulating a joystick, interacting with a touch sensitive surface 116, or tilting or orienting the device), a system status (e.g., low battery, low memory, or a system notification, such as a notification generated based on the system receiving a message, an incoming phone call, a notification, or an update), sending data, receiving data, or a program event (e.g., if the program is a game, a program event may comprise explosions, gunshots, collisions, interactions between game characters, advancing to a new level, or driving over bumpy terrain).
- In some embodiments, the haptic effect generation module 128 represents programming that causes the processor 102 to generate and transmit haptic signals to the haptic output device 118 to generate the selected haptic effect. In some examples, the haptic effect generation module 128 causes the haptic output device 118 to generate a haptic effect determined by the haptic effect determination module 126. For example, the haptic effect generation module 128 may access stored waveforms or commands to send to the haptic output device 118 to create the selected haptic effect. In some embodiments, the haptic effect generation module 128 may comprise algorithms to determine the haptic signal. The haptic effect generation module 128 may comprise algorithms to determine target coordinates for the haptic effect (e.g., coordinates for a location on the computing device 101, such as on the touch sensitive surface 116, at which to output the haptic effect).
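A toy sketch of this generation step, looking up a stored waveform for a selected effect and sending it, with optional target coordinates, to an output device, might look as follows. The waveform table and device interface are stand-ins, not a real haptics API:

```python
# Hypothetical generation module: look up a stored waveform for the
# selected effect and transmit it, with target coordinates, to a haptic
# output device. The device class is a stand-in for illustration only.
WAVEFORMS = {
    "availability_pulse": [0.0, 0.8, 0.0, 0.8, 0.0],   # sample amplitudes
    "location_ramp":      [0.1, 0.3, 0.5, 0.7, 0.9],
}

class FakeHapticDevice:
    """Stand-in for a haptic output device; just logs what it receives."""
    def play(self, samples, x=None, y=None):
        where = f" at ({x}, {y})" if x is not None else ""
        print(f"playing {len(samples)} samples{where}: {samples}")

def generate_effect(device, effect_name, target_xy=None):
    """Fetch the stored waveform and transmit it as the haptic signal."""
    samples = WAVEFORMS[effect_name]
    x, y = target_xy if target_xy else (None, None)
    device.play(samples, x=x, y=y)

if __name__ == "__main__":
    generate_effect(FakeHapticDevice(), "availability_pulse")
    generate_effect(FakeHapticDevice(), "location_ramp", target_xy=(120, 40))
```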
- In some embodiments, the display device 136 includes a processor 138, a memory 140, a bus 142, I/O components 144, storage 146, a network interface device 148, a display 150, a touch sensitive surface 152, touch sensors 154, a camera 156, sensors 158, and a haptic output device 160, each of which may be configured in substantially the same manner as the processor 102, the memory 104, the bus 106, the I/O components 112, the storage 114, the network interface device 110, the display 134, the touch sensitive surface 116, the touch sensors 108, the camera 130, the sensors 132, and the haptic output device 118 of the computing device 101, although they need not be. In some embodiments, the display device 136 may include all or some of the components depicted in FIG. 1.
- In some embodiments, the display device 136 comprises, for example, a mobile device (e.g., a smartphone), tablet, e-reader, smartwatch, a head-mounted display, glasses, a wearable device, an interactive kiosk, an interactive poster, a screen of a cash register, or any other device that includes a display for providing content.
- The computing device 101 may be communicatively coupled to the display device 136. For example, the computing device 101 and the display device 136 can communicate (e.g., transmit or receive data or signals 172, 174) using the network interface devices 110 and 148. As an example, the computing device 101 and the display device 136 can each be connected to a common wireless network and can communicate via the wireless network. As another example, the computing device 101 and the display device 136 can be communicatively coupled via a Bluetooth connection.
- In some embodiments, the display device 136 may be an opportunistic display device. For example, the display device 136 can be a display device near a user of the computing device 101 and within the field of view of the user such that the display device 136 can be used to provide content to the user.
- In some embodiments, the sensor 158 of the display device 136 may detect a presence or location of another device (e.g., the computing device 101). For example, the sensor 158 may be configured in substantially the same manner as the sensor 132 for detecting the presence or location of another device.
- In some embodiments, the sensor 158 may detect a direction of a field of view of the user of the computing device 101. For example, the sensor 158 may include any device used to detect eye gaze, line-of-sight, or field of view of the user. For example, the sensor 158 may include a camera configured to capture an image of the eye of the user of the computing device 101, and the processor 138 can determine the direction of the field of view of the user of the computing device 101 based at least in part on the image by using various image processing methods and techniques. In another embodiment, the sensor 158 is configured to monitor movements of an eye of the user or muscles near an eye of the user of the computing device 101, and the processor 138 is configured to determine the direction of the user's field of view based at least in part on the monitored movements. In still another embodiment, the sensor 158 may be configured to monitor or measure electrical activity of muscles moving the eye of the user of the computing device 101, and the processor 138 can be configured to determine the direction of the user's field of view. In some embodiments, the sensor 158 may include other sensors used to determine a user's intent or volition, including, for example, sensors associated with functional magnetic resonance imaging (fMRI) or electroencephalogram (EEG). In still another embodiment, the sensor 158 may detect the user's eye gaze, line-of-sight, or field of view through various methods and techniques, including, for example, analyzing the user's body or head posture.
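Whatever sensing method supplies the gaze direction, the downstream check reduces to comparing that direction with the display's bearing from the user. A minimal geometric sketch with invented example inputs:

```python
# Minimal geometric check: is the user's gaze aimed at the display?
# Positions and the gaze angle are invented example inputs; a real
# system would derive them from the sensor (camera, EMG, posture).
import math

def is_looking_at(gaze_dir_deg: float,
                  user_xy: tuple, display_xy: tuple,
                  tolerance_deg: float = 15.0) -> bool:
    """True when the display's bearing falls within the gaze tolerance cone."""
    dx = display_xy[0] - user_xy[0]
    dy = display_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - gaze_dir_deg + 180) % 360 - 180  # wrap to [-180, 180)
    return abs(diff) <= tolerance_deg

if __name__ == "__main__":
    print(is_looking_at(gaze_dir_deg=0.0, user_xy=(0, 0), display_xy=(2, 0.3)))   # True
    print(is_looking_at(gaze_dir_deg=90.0, user_xy=(0, 0), display_xy=(2, 0.3)))  # False
```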
- In some embodiments, the memory 140 includes modules corresponding to the modules of the memory 104. In some embodiments, the memory 140 includes all or some of those modules (e.g., a haptic effect determination module 164 and a content provision module 170), and each can be configured in substantially the same manner as the respective modules 124, 126, 128, and 129 of the computing device 101, although they need not be.
- In some embodiments, the haptic effect determination module 164 may determine a haptic effect based at least in part on sensor signals received from the sensor 158. In some embodiments, the processor 138 may receive sensor signals from the sensor 158 and determine a direction of an eye gaze or field of view of a user of the computing device 101. Based on this determination, the haptic effect determination module 164 may determine a haptic effect. For example, in one such embodiment, the processor 138 may determine that the user of the computing device 101 is looking at, or in the direction of, the display device 136 (e.g., at the display 150 of the display device). Based on this determination, the haptic effect determination module 164 may determine a haptic effect that is output to the user of the display device 136 to indicate that the user of the computing device 101 is looking at the display device 136. For instance, a smartwatch might vibrate briefly to indicate that another user is viewing the display device 136. In this manner, the user of the display device 136 may become aware that a person (e.g., the user of the computing device 101) is looking at (e.g., reading) content on the display device that may be relevant to that person.
- In some embodiments, the haptic effect determination module 164 may determine another haptic effect based on the direction of the eye gaze or field of view of the user of the computing device 101 to indicate to the user of the display device 136 how to orient the display device 136. For example, the haptic effect may indicate to the user of the display device 136 that the user should adjust an arm, wrist, leg, or hand to allow the user of the computing device 101 to view content on the display device 136. For instance, a smartwatch on the user's wrist may vibrate toward the top of the smartwatch (e.g., toward the top of the user's wrist) to indicate that another person (e.g., the user of the computing device 101) is looking at the top of the display of the smartwatch, so that the user can adjust the user's wrist such that the top of the display is more visible to the other person.
- In some embodiments, the content provision module 170 can be configured in substantially the same manner as the content provision module 129 and can configure the processor 138 to provide content (e.g., texts, images, sounds, videos, animated graphics, etc.) to a user (e.g., to a user of the computing device 101, the user of the display device 136, or another user). In some embodiments, the content, or the information from which the content is derived, may be obtained from another device (e.g., the computing device 101). For example, the content provision module 170 can be configured to cause the processor 138 to receive data that includes content from the computing device 101. The content may then be displayed by the display device 136 (e.g., via the display 150).
- FIG. 2 shows an embodiment of a system for haptic feedback for opportunistic displays. The system includes a computing device 101 and a display device 136. In the example shown in FIG. 2, the display device 136 is an opportunistic display device (e.g., the display device 136 is configured to display content, for example, via display 150, and is within the field of view 208 of a user 204 of the computing device 101).
- The computing device 101 comprises a sensor (e.g., the sensor 132 of FIG. 1). In some embodiments, the sensor may be internal to the computing device 101 or external to the computing device 101. The sensor may be configured to detect devices proximate to the computing device 101 or a user 204 associated with the computing device 101 (e.g., a user using or wearing the computing device 101).
- In the embodiment shown in FIG. 2, the sensor may detect a display device 136 that is near the sensor, the computing device 101, or the user 204. In some embodiments, the sensor may detect the display device 136 and transmit sensor signals associated with the display device 136 to a processor associated with the computing device 101 (e.g., the processor 102 of FIG. 1). The sensor signal can correspond to any attribute of the display device. For example, the sensor signal can indicate an availability of the display device 136. For example, the display device 136 can be available to provide or output content (e.g., text, alerts, sounds, images, video, animated graphics, etc.) via a display 150. In some embodiments, the sensor signal can indicate a location of the display device 136 (e.g., a location or position of the display device relative to the sensor, the computing device 101, or the user 204). In another embodiment, the sensor signal can indicate a type, size, or shape of the display device 136.
- In some embodiments, the computing device 101 may determine and output one or more haptic effects based at least in part on the sensor signals. The haptic effects may each comprise a vibration or any other haptic effect. For example, the computing device 101 may receive sensor signals and determine that the display device 136 is available (e.g., nearby, within the field of view 208 of the user 204, and/or displaying content that is relevant to the user 204). In some embodiments, the content may be transmitted from the computing device 101 to the display device 136. The computing device 101 may output a haptic effect (e.g., a short vibration) to a body part of the user 204 (e.g., an arm, leg, wrist, hand, or other body part) to indicate to the user 204 that the display device 136 is available.
- In some embodiments, the computing device 101 may output one or more other haptic effects to a body part of the user 204 to indicate to the user the location or position of the display device (e.g., relative to the user 204). For example, in some embodiments, the computing device 101 outputs a haptic effect to indicate to the user 204 that the display device 136 is up or down relative to the user 204. As another example, the computing device 101 outputs a haptic effect to indicate to the user 204 that the display device 136 is to the left or right of the user 204. In still another example, the computing device 101 outputs a haptic effect to indicate to the user 204 that the display device 136 is in front of, or behind, the user 204.
- As an illustrative example, the computing device 101 may comprise a wearable device (e.g., a wristband or headband) and may comprise one or more haptic output devices. The computing device 101 may be configured to output a haptic effect to a left side of the computing device 101 or body of the user 204 (e.g., if the user has a wristband on a left wrist or if a haptic output device is positioned on the left side of the body of the user 204) to indicate to the user 204 that the display device 136 is to the left of the user. The computing device 101 may also be configured to output a haptic effect to a top part of the computing device 101 or a body part of the user 204 (e.g., a top part of the wrist of the user 204 if the user is wearing a wristband) to indicate to the user that the display device 136 is in front of or above the user 204. In still another embodiment, the computing device 101 may output a weak haptic effect (e.g., a weak vibration) when the user is far from the display device 136 and a strong haptic effect (e.g., a strong vibration) when the user 204 is close to the display device 136.
- In this manner, the computing device 101 may indicate to the user 204 the horizontal and/or vertical position of the display device 136 so that the user 204 may view content provided on the display device 136, which may be relevant to the user 204 (e.g., content being provided by the computing device 101).
- In another embodiment, the computing device 101 may output one or more other haptic effects to the user 204 to indicate a size, shape, type, or other attribute of the display device 136 or a display 150 of the display device 136. For example, in some embodiments, the computing device 101 may determine a size of the display device 136 or the display 150 based at least in part on sensor signals. The computing device 101 may determine and output a haptic effect to indicate to the user 204 the size of the display device 136. As an example, the computing device 101 may output a strong vibration to the user 204 if the display device 136 is a large display device (e.g., a large LED screen). In another example, the computing device 101 may output a weak vibration to the user 204 if the display device 136 is a small display device (e.g., a smartwatch or wristband). In this manner, the haptic effects may indicate to the user 204 a size, or other attribute, of the display device 136 or the display 150, which may be displaying content relevant to the user 204.
- In some embodiments, the computing device 101 may be configured to provide content to the display device 136. For example, the computing device 101 may transmit data that includes content, and the display device 136 may be configured to display the content (e.g., via the display 150). The computing device 101 may output one or more haptic effects associated with the content displayed on the display device 136. For example, the content may comprise a video or a video game, and the computing device 101 may output one or more haptic effects that correspond to the video game or video content displayed on the display device 136. As an example, the video content may include one or more actions or interactions (e.g., an interaction between characters or objects in the video). The computing device 101 can output one or more vibrations associated with the actions or interactions so that the user may perceive the actions and/or interactions as they are displayed on the display device 136.
- In some embodiments, the display device 136 may comprise a sensor 206 for detecting a presence or proximity of the computing device 101. In some embodiments, the sensor 206 may be configured to detect a line-of-sight or field of view 208 of the user 204 or a direction of an eye gaze 210 of the user 204. In the embodiment shown in FIG. 2, the display device 136 comprises the sensor 206, which may be configured in substantially the same manner as the camera 156 or the sensor 158 of FIG. 1.
- In some embodiments, the display device 136 may transmit data to the computing device 101 based at least in part on data from the sensor 206. For example, the sensor 206 may detect the presence or proximity of the computing device 101 (e.g., via a Bluetooth signal), and the display device 136 may transmit data indicating an availability, location, or attribute of the display device 136 based on data from the sensor 206.
- As another example, the sensor 206 may detect a field of view 208 of the user 204 or the direction of the eye gaze 210 of the user. The display device 136 may transmit data indicating an availability, location, or attribute of the display device 136 based on data from the sensor 206. As an example, the display device 136 may determine that the user 204 is looking at, or in the direction of, the display device 136 or the display 150 of the display device, based on data from the sensor 206, and transmit data to the computing device 101 about an availability, location, or attribute of the display device based on this determination. In some embodiments, the computing device 101 may output one or more haptic effects to the user 204 based on the availability, location, or attribute of the display device 136, as described above.
- In some embodiments, the display device 136 may determine a haptic effect based at least in part on data from the sensor 206. For example, in some embodiments, the display device 136 may determine a haptic effect based on the field of view 208 of the user 204 or the direction of the eye gaze 210 of the user. In one such embodiment, the display device 136 determines that the user 204 is looking at, or in the direction of, the display device 136 or the display 150, based on data from the sensor 206, and may output a haptic effect (e.g., a vibration) to a user 212 of the display device 136 (e.g., to a wrist, hand, arm, or leg of the user). The haptic effect may indicate to the user 212 that the user 204 is looking at the display device 136 or the display 150. For example, in the embodiment shown in FIG. 2, the display device 136 may output a haptic effect at the user's wrist 214 to indicate to the user 212 that the user 204 is reading or viewing content on the display 150 of the display device that may be relevant to the user 204 (e.g., content transmitted from the computing device 101 to the display device 136). In some embodiments, the user 212 may keep the display device 136 in position in response to the haptic effect. In another embodiment, the user 212 may move the display device 136 out of the field of view 208 of the user 204 in response to the haptic effect.
- In some embodiments, the display device 136 may output one or more other haptic effects to the user 212 to indicate to the user 212 how the user may orient the display device 136 based on data from the sensor 206. For example, the sensor may detect that the direction of the eye gaze 210 of the user 204 is toward a top portion of the display 150. The display device 136 may output one or more haptic effects to the user 212 based on data from the sensor 206. The haptic effects may indicate to the user 212 that the user should rotate the user's wrist 214 such that the user 204 can view content on the top portion of the display 150. As an example, the display device 136 may output a vibration toward the top of the display device (e.g., toward the top of the user's wrist 214) to indicate to the user 212 that the user 204 is looking at the top of the display 150, which may cause the user 212 to adjust the display device 136 accordingly.
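The orientation cue described in this paragraph can be sketched as a mapping from the gazed-at display region to a wearable actuator. The four-region layout, actuator names, and threshold values below are illustrative assumptions only, not the disclosed implementation.

```python
# Hypothetical sketch: choose a wrist-worn actuator from the display
# region a viewer is gazing at, so the wearer can reorient the device.
# Region names and the actuator mapping are assumptions for
# illustration, not the patent's actual API.
from enum import Enum

class Region(Enum):
    TOP = "top"
    BOTTOM = "bottom"
    LEFT = "left"
    RIGHT = "right"

# One actuator mounted near each edge of the watch body (assumed layout).
ACTUATOR_FOR_REGION = {
    Region.TOP: "actuator_top",
    Region.BOTTOM: "actuator_bottom",
    Region.LEFT: "actuator_left",
    Region.RIGHT: "actuator_right",
}

def gaze_to_region(gaze_x: float, gaze_y: float) -> Region:
    """Map a normalized gaze point (0..1, origin top-left) to a display region."""
    if gaze_y < 0.33:
        return Region.TOP
    if gaze_y > 0.66:
        return Region.BOTTOM
    return Region.LEFT if gaze_x < 0.5 else Region.RIGHT

def cue_wearer(gaze_x: float, gaze_y: float) -> str:
    """Return the actuator to pulse so the wearer rotates the display."""
    return ACTUATOR_FOR_REGION[gaze_to_region(gaze_x, gaze_y)]

print(cue_wearer(0.5, 0.1))  # viewer looks at the top edge -> "actuator_top"
```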
- FIG. 3 shows another embodiment of a system for haptic feedback for opportunistic displays. In the example depicted in FIG. 3, the system comprises a computing device 302, which comprises a smartphone or a tablet, and an opportunistic display device 304. The computing device 302 comprises a sensor (not shown), which may detect a presence, availability, or location of the opportunistic display device 304. The opportunistic display device 304 comprises any device that includes a display 306. In some embodiments, the opportunistic display device 304 may comprise a sensor 310.
- In the example depicted in FIG. 3, a user 308 of the computing device 302 may be walking by the opportunistic display device 304, and the computing device 302 may be in a pocket of the user 308 or elsewhere. The sensor of the computing device 302 may detect the opportunistic display device 304 when the user 308 is near the opportunistic display device 304 (e.g., when the opportunistic display device 304 is within a field of view of the user 308). In some embodiments, the sensor may detect the opportunistic display device 304 upon occurrence of an event. An event, as used herein, is any interaction, action, or other event which occurs during operation of the computing device 302, which can potentially comprise an associated haptic effect. In some embodiments, an event may comprise a system status (e.g., low battery, low memory), a system notification (e.g., a notification generated based on the system receiving an incoming call, an alert, an update, an upcoming meeting reminder, etc.), or receiving data (e.g., receiving a text message). For example, the computing device 302 may receive an alert or update (e.g., an alert regarding a meeting or a news alert). The sensor may detect the opportunistic display device 304 within the field of view of the user 308 in response to the alert.
- In some embodiments, the opportunistic display device 304 may detect the computing device 302 as the user 308 approaches the opportunistic display device 304. For example, the opportunistic display device 304 may include a sensor 310 (e.g., a camera or other sensor configured in substantially the same manner as sensor 206 of FIG. 2) that may detect the computing device 302 when the user 308 is near the opportunistic display device 304. In another embodiment, the sensor 310 may detect a field of view of the user 308, or a direction of an eye gaze of the user 308 (e.g., when the user is looking in the direction of the opportunistic display device 304). In some embodiments, the opportunistic display device 304 may transmit a signal associated with an availability or location of the opportunistic display device 304 to the computing device 302 based on data from the sensor 310. For example, the opportunistic display device 304 may transmit a signal about the availability and location of the opportunistic display device 304 when the opportunistic display device 304 is within the field of view of the user 308.
- In some embodiments, the computing device 302 may determine an availability or location of the opportunistic display device 304 based on sensor data (e.g., data from the sensor of the computing device 302 or data from the sensor 310 of the opportunistic display device 304). For example, the computing device 302 may determine if the opportunistic display device 304 is available to display content associated with an event (e.g., the alert received by the computing device 302) based on sensor data. The computing device 302 may transmit data to the opportunistic display device 304 in response to determining that the display device is available. The opportunistic display device 304 may display content associated with the event based on the data (e.g., via the display 306). In the example shown in FIG. 3, the computing device 302 transmits an alert 309 to the opportunistic display device 304 to be shown on display 306.
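The availability check and content hand-off in this paragraph amount to a simple gate, sketched below under assumed names. The Display class, its fields, and the send call are hypothetical stand-ins for whatever interface a real implementation would expose.

```python
# Hypothetical sketch of the decision in this paragraph: push content to
# a nearby display only if sensor data says it is available and in view.
# The Display class and its fields are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Display:
    name: str
    available: bool
    in_field_of_view: bool
    shown: list[str] = field(default_factory=list)

    def send(self, content: str) -> None:
        self.shown.append(content)

def maybe_push(display: Display, content: str) -> bool:
    """Transmit content only when the display can show it to the user."""
    if display.available and display.in_field_of_view:
        display.send(content)
        return True
    return False

sign = Display("lobby-sign", available=True, in_field_of_view=True)
print(maybe_push(sign, "Meeting moved to 3 PM"))  # True; alert now on display
```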
- In some embodiments, the computing device 302 may determine and output a first haptic effect based at least in part on the availability of the opportunistic display device 304. For example, in some embodiments, the computing device 302 may determine the first haptic effect based on the opportunistic display device 304 displaying content associated with the event (e.g., when the opportunistic display device 304 is displaying the alert 309). The computing device 302 may output the first haptic effect (e.g., a vibration, a squeeze, or a poke) to a body part of the user 308 (e.g., a hip or other body part) to indicate to the user 308 that the opportunistic display device 304 is displaying content relevant to the user 308 (e.g., that the opportunistic display device 304 is displaying the alert 309 received by the computing device 302).
- In some embodiments, the computing device 302 may determine and output a second haptic effect based at least in part on a location or position of the opportunistic display device 304 (e.g., relative to the user 308). In some embodiments, the second haptic effect may comprise one or more haptic effects. In some embodiments, the computing device 302 may output the second haptic effect (e.g., a vibration) to a body part of the user to indicate a proximity or distance between the user 308 and the display device. For example, the computing device 302 may output a strong haptic effect (e.g., a strong vibration) to the user 308 to indicate that the opportunistic display device 304 is near the user 308. The computing device 302 may also output another haptic effect (e.g., a vibration) at the top of the computing device 302 to indicate that the opportunistic display device 304 is in front of the user 308. As another example, the computing device may be worn on an arm of the user 308. The computing device 302 may output one or more haptic effects that can create the perception of a flow in a direction toward the opportunistic display device 304 (e.g., by outputting a series of haptic outputs along the arm of the user 308), which can indicate a direction of the opportunistic display device 304 relative to the user. In some embodiments, the first and second haptic effects can be combined to form a single composite haptic effect for indicating the availability of the display device 304 and the location of the display device 304 relative to the user 308.
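The flow-style effect mentioned above can be approximated by pulsing a line of actuators in sequence toward the target. In the sketch below, the actuator list, the pulse primitive, and the 80 ms spacing are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch: create an apparent "flow" toward a display by
# pulsing actuators along the arm in sequence. The actuator list, pulse
# primitive, and 80 ms spacing are illustrative assumptions.
import time

def pulse(actuator_id: str, magnitude: float, duration_ms: int) -> None:
    # Placeholder for a real haptic driver call.
    print(f"pulse {actuator_id}: mag={magnitude}, dur={duration_ms}ms")

def flow_toward_display(arm_actuators: list[str], toward_hand: bool) -> None:
    """Pulse actuators elbow-to-wrist or wrist-to-elbow to suggest direction."""
    order = arm_actuators if toward_hand else list(reversed(arm_actuators))
    for actuator in order:
        pulse(actuator, magnitude=0.7, duration_ms=60)
        time.sleep(0.08)  # short gap so pulses read as motion, not one buzz

# Example: display is ahead of the user's hand, so flow runs elbow -> wrist.
flow_toward_display(["elbow", "mid_forearm", "wrist"], toward_hand=True)
```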
- In some embodiments, the computing device 302 may determine and output a third haptic effect based at least in part on the content displayed by the opportunistic display device 304. In some embodiments, the third haptic effect may comprise one or more haptic effects. For example, the computing device 302 may output haptic effects associated with the alert 309 displayed by the display device. As an example, the alert 309 may be a video, and the computing device 302 may output one or more haptic effects to the user 308 that correspond to the video displayed by the opportunistic display device 304. For example, the video can include a collision or a series of collisions (e.g., in a news alert or other alert), and the third haptic effect can be a series of vibrations corresponding to each collision, which allows the user to experience haptic effects associated with the one or more collisions in the video.
- In some embodiments, the opportunistic display device 304 may comprise a haptic output device. In such embodiments, the opportunistic display device 304 may determine and output a haptic effect based on data from the sensor 310. For example, in one such embodiment, the sensor 310 may detect the computing device 302 as the user 308 nears the opportunistic display device 304 or may detect the field of view of the user 308. The opportunistic display device 304 may be communicatively coupled to the computing device 302 (e.g., for receiving data associated with content to be displayed by the opportunistic display device 304). The opportunistic display device 304 may output a haptic effect (e.g., a puff of a solid, liquid, or gas, or a vibration) as the user 308 nears the opportunistic display device 304. The haptic effect may indicate to the user 308 that the opportunistic display device 304 is available (e.g., displaying the alert 309 received by the computing device 302) or may indicate the location of the opportunistic display device 304 relative to the user 308. As an example, the opportunistic display device 304 can output a strong vibration when the user 308 is near the opportunistic display device 304 to indicate that the alert 309 is being displayed and to indicate the location of the opportunistic display device 304.
- FIG. 4 is a flow chart of steps for performing a method 400 for providing haptic feedback for opportunistic displays according to one embodiment. In some embodiments, the steps in FIG. 4 may be implemented in program code that is executable by a processor, for example, the processor in a general purpose computer, a mobile device, or a server. In some embodiments, these steps may be implemented by a group of processors. In some embodiments, one or more steps shown in FIG. 4 may be omitted or performed in a different order. Similarly, in some embodiments, additional steps not shown in FIG. 4 may also be performed. The steps below are described with reference to components described above with regard to the systems shown in FIGS. 1 and 2.
- The method 400 begins at step 402 when the sensor 132 of the computing device 101 detects one or more display devices 136 within a field of view of a user 204. In some embodiments, the sensor 132 can detect a display device 136 near the sensor 132 or the computing device 101. The display device 136 can be an opportunistic display, which includes a display device that is near the sensor 132 or the computing device 101 and can be used to display content. In some embodiments, the display device 136 is within the field of view of a user 204 such that the display device 136 can be used to display content to the user 204. For example, the sensor 132 can detect the presence and location of display devices 136 by analyzing the strength of a Bluetooth signal between the sensor 132 and the display device 136. In some embodiments, the sensor 132 may detect a distance between the sensor 132, the computing device 101, and the display device 136 (e.g., based on the strength of the Bluetooth signal between the sensor 132 and the display device 136).
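One common way to turn Bluetooth signal strength into an approximate distance is the log-distance path-loss model, sketched below. The reference power at 1 meter and the path-loss exponent are assumptions that would need per-device, per-environment calibration; indoor accuracy is rough at best.

```python
# Hypothetical sketch: estimate distance to a detected display from
# Bluetooth signal strength using the common log-distance path-loss
# model. tx_power (RSSI at 1 m) and the path-loss exponent are
# assumptions that must be calibrated per device and environment.
def estimate_distance_m(rssi_dbm: float,
                        tx_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Rough distance in meters; accuracy degrades badly indoors."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(round(estimate_distance_m(-59), 1))  # 1.0 m at the reference RSSI
print(round(estimate_distance_m(-75), 1))  # ~6.3 m under these assumptions
```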
- The method 400 continues at step 404 when a signal about the display device 136 is transmitted to a processor 102. In some embodiments, the sensor 132 transmits the signal about the display device 136 to the processor 102. In other embodiments, another sensor associated with the display device (e.g., the sensor 206 of the display device 136) transmits the signal about the display device 136.
- The method continues at step 406 when the processor 102 determines an availability and a location of the display device 136 based on the signal. In some embodiments, the processor 102 determines the availability of the display device 136 to display content (e.g., texts, images, sounds, videos, animated graphics, etc.) to the user 204. For example, the processor 102 can determine if the display device 136 is discoverable and available to connect to the computing device 101 via Bluetooth to receive content from the computing device 101. In some embodiments, the processor 102 can be configured to provide the content to the display device 136 (e.g., in step 410 described below).
- In some embodiments, the processor 102 determines a location of the display device 136 relative to the user 204. For example, the processor 102 determines a horizontal and vertical position of the display device 136 relative to the user 204.
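A horizontal and vertical position of this kind can be expressed as an azimuth and elevation computed from 3D positions, as in the sketch below. The coordinate frame (x right, y forward, z up, units in meters) is an assumption for illustration.

```python
# Hypothetical sketch: derive the display's horizontal bearing and
# vertical elevation relative to the user from 3D positions (meters).
# Coordinate frame (x right, y forward, z up) is an assumption.
import math

def relative_position(user, display):
    dx, dy, dz = (d - u for d, u in zip(display, user))
    azimuth = math.degrees(math.atan2(dx, dy))      # + right, - left of user
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation

az, el = relative_position((0, 0, 1.7), (2.0, 4.0, 2.5))
print(f"display is {az:.0f} deg to the right and {el:.0f} deg above eye level")
```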
- The method 400 continues at step 408 when the processor 102 determines a characteristic of the display device 136 based on the signal. In some embodiments, the processor 102 determines a type, size, or shape of the display device 136 based on the signal from the sensor. For instance, the processor 102 determines that the display device 136 is a large display device (e.g., a large LCD screen) or a small display device (e.g., a smartwatch). As another example, the processor 102 determines the type of the display device 136, for example, whether the display device 136 is an LCD screen or a smartwatch.
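A characteristic determination of this sort could, for example, parse fields that the display advertises, as sketched below. The advertisement dictionary and the size cutoff are hypothetical; real devices would report capabilities in whatever format their discovery protocol defines.

```python
# Hypothetical sketch: classify a detected display from fields the
# device might advertise. The advertisement dict and the size cutoffs
# are illustrative assumptions, not a real Bluetooth payload format.
def classify_display(advertisement: dict) -> str:
    kind = advertisement.get("kind", "unknown")
    diagonal = advertisement.get("diagonal_cm", 0)
    size = "large" if diagonal >= 100 else "small" if diagonal > 0 else "unknown"
    return f"{size} {kind}"

print(classify_display({"kind": "LCD screen", "diagonal_cm": 140}))  # large LCD screen
print(classify_display({"kind": "smartwatch", "diagonal_cm": 4}))    # small smartwatch
```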
- The method 400 continues at step 410 when the processor 102 transmits content to the display device 136. In some embodiments, the content provision module 129 causes the processor 102 to transmit content to the display device 136. For example, the processor 102 can transmit a text message or an alert to the display device 136 if the processor 102 determines that the display device 136 is available (e.g., in step 406).
- The method 400 continues at step 412 when the processor 102 determines a first haptic effect based at least in part on the availability and the location of the display device 136. In some embodiments, the haptic effect determination module 126 causes the processor 102 to determine the first haptic effect. In some embodiments, the first haptic effect can include one or more haptic effects. For example, the first haptic effect can include an availability haptic effect based on the availability of the display device 136 and a location haptic effect based on the location of the display device 136.
- For example, the processor 102 can determine the first haptic effect (e.g., a series of vibrations) associated with the display device 136 being available (e.g., available to display content received from the computing device 101 and/or displaying content from the computing device 101) and with the horizontal and vertical position of the display device 136 relative to the user 204. As an example, the first haptic effect can include a first vibration that indicates to the user 204 that the display device 136 is displaying content, followed by a series of intermittent vibrations output to a left or right side of the computing device 101 or a body part of the user 204 to indicate the horizontal position of the display device 136 relative to the user, and/or a series of intermittent vibrations output to a top or bottom portion of the computing device 101 or a body part of the user 204 to indicate the vertical position of the display device 136 relative to the user 204.
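A composite pattern like the one just described can be represented as an ordered list of pulses, as in the sketch below. The (region, magnitude, duration) tuple format and the specific timings are illustrative assumptions.

```python
# Hypothetical sketch of the composite first haptic effect described
# above: one "availability" pulse, then intermittent pulses on the side
# and edge matching the display's horizontal and vertical position.
# The (region, magnitude, duration_ms) tuple format is an assumption.
def first_haptic_effect(horizontal: str, vertical: str):
    """horizontal: 'left' | 'right'; vertical: 'top' | 'bottom'."""
    pattern = [("center", 1.0, 300)]            # display is available
    pattern += [(horizontal, 0.6, 100)] * 3     # intermittent side pulses
    pattern += [(vertical, 0.6, 100)] * 3       # intermittent edge pulses
    return pattern

for region, magnitude, ms in first_haptic_effect("left", "top"):
    print(f"{region}: mag={magnitude} dur={ms}ms")
```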
- The method 400 continues at step 414 when the processor determines a second haptic effect based at least in part on a characteristic of the display device 136. In some embodiments, the haptic effect determination module 126 causes the processor 102 to determine the second haptic effect.
- For example, the processor 102 can determine the second haptic effect based on a size, type, shape, or other attribute of the display device 136. As an example, the second haptic effect can be a strong vibration if the display device 136 is a large display device (e.g., a large LCD screen). As another example, the second haptic effect can be a weak vibration if the display device is a small display device (e.g., a smartwatch).
- The method 400 continues at step 416 when the processor determines a third haptic effect based at least in part on the content transmitted to the display device 136 (e.g., in step 410). In some embodiments, the haptic effect determination module 126 causes the processor 102 to determine the third haptic effect.
- For example, if the content transmitted to the display device 136 is a video, the processor 102 can determine a third haptic effect that corresponds to the video. As an example, the video may include one or more explosions or collisions, and the third haptic effect can comprise one or more vibrations that correspond to the explosions or collisions.
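Deriving such an effect can be as simple as mapping timestamped events in the video to vibration commands, as sketched below. The event list, intensity table, and output tuple format are assumptions for illustration, not the disclosed data model.

```python
# Hypothetical sketch: derive the third haptic effect from timestamped
# events in the transmitted video. The event list and output tuples
# are illustrative assumptions, not the patent's data model.
VIDEO_EVENTS = [
    (2.4, "collision"),
    (5.1, "explosion"),
    (9.8, "collision"),
]

INTENSITY = {"collision": 0.7, "explosion": 1.0}

def third_haptic_effect(events):
    """Return (time_s, magnitude, duration_ms) vibrations matching events."""
    return [(t, INTENSITY[kind], 200) for t, kind in events if kind in INTENSITY]

print(third_haptic_effect(VIDEO_EVENTS))
# [(2.4, 0.7, 200), (5.1, 1.0, 200), (9.8, 0.7, 200)]
```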
- The method continues at step 418 when the processor 102 transmits a first haptic signal associated with the first haptic effect, a second haptic signal associated with the second haptic effect, or a third haptic signal associated with the third haptic effect to the haptic output device 118. In some embodiments, the processor 102 may transmit one or more of the first, second, and third haptic signals to the haptic output device 118. In some embodiments, the haptic effect generation module 128 causes the processor 102 to generate and transmit the first, second, and third haptic signals to the haptic output device 118.
- The method 400 continues at step 420 when the haptic output device 118 outputs the first haptic effect based on the availability and location of the display device 136. In some embodiments, the first haptic effect comprises a vibration, a surface deformation, a squeeze, a poke, and/or a puff of a solid, liquid, gas, or plasma. In some embodiments, the first haptic effect can indicate to the user 204 that the display device 136 is displaying content relevant to the user 204 (e.g., content transmitted from the computing device 101 to the display device 136) and indicate to the user 204 the location of the display device 136. In this manner, the user 204 can be alerted to view content being displayed on the display device 136, which may be relevant to the user 204.
- The method 400 continues at step 422 when the haptic output device 118 outputs the second haptic effect based on the characteristic of the display device 136 or the third haptic effect based on the content transmitted to the display device 136. In some embodiments, the second or third haptic effect can each comprise a vibration, a surface deformation, a squeeze, a poke, and/or a puff of a solid, liquid, gas, or plasma. In some embodiments, the second haptic effect can indicate to the user 204 the size, type, or shape of the display device 136 that is displaying content relevant to the user. The third haptic effect can allow the user 204 to perceive haptic effects associated with content displayed by the display device 136.
- FIG. 5 is a flow chart of steps for performing another method 500 for providing haptic feedback for opportunistic displays according to another embodiment. In some embodiments, the steps in FIG. 5 may be implemented in program code that is executable by a processor, for example, the processor in a general purpose computer, a mobile device, or a server. In some embodiments, these steps may be implemented by a group of processors. In some embodiments, one or more steps shown in FIG. 5 may be omitted or performed in a different order. Similarly, in some embodiments, additional steps not shown in FIG. 5 may also be performed. The steps described below are described with reference to components described above with regard to the systems shown in FIGS. 1 and 2.
- The method 500 begins at step 502 when the sensor 206 of the display device 136 detects the computing device 101 near the display device 136. In some embodiments, the sensor 206 may detect a presence or proximity of the computing device 101. For example, the sensor 206 may include a Bluetooth device that can detect the presence and proximity of the computing device 101 by analyzing signal strength between the sensor 206 and the computing device 101 (e.g., by determining if the computing device 101 is discoverable and determining a proximity of the computing device 101 based on the strength of the Bluetooth signal).
- The method 500 continues at step 504 when the sensor 206 detects a field of view 208 of a user 204 of the computing device 101. In some embodiments, the sensor 206 comprises a camera, or other suitable device, that can detect the field of view 208 or a direction of an eye gaze 210 of the user 204.
- The method continues at step 506 when a first signal about the field of view 208 or the direction of an eye gaze 210 of the user 204 is transmitted to a processor 138 of the display device 136. In some embodiments, the sensor 206 transmits the first signal to the processor 138.
- The method continues at step 508 when the processor 138 determines the direction of the field of view 208 or a direction of an eye gaze 210 of the user 204 based on the first signal (e.g., the first signal transmitted from the sensor 206 to the processor 138 in step 506).
- For example, the sensor 206 can transmit data about the field of view 208 or the direction of the eye gaze 210 of the user 204 to the processor 138 of the display device 136. The processor 138 can determine that the user 204 is looking at, or in the direction of, the display device 136 based on the field of view 208 or the direction of the eye gaze 210 of the user 204. As an example, the sensor 206 can include a camera for monitoring movements of an eye of the user 204 or muscles near the eye of the user 204, and the processor 138 can determine the field of view 208 or the direction of the eye gaze 210 of the user 204 based on the monitored movements. As another example, the sensor 206 monitors or measures electrical activity of the muscles moving the eye of the user 204, and the processor 138 determines the field of view 208 or the direction of the eye gaze 210 of the user 204 based on the electrical activity of the muscles.
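One elementary step in camera-based gaze estimation is classifying gaze direction from the pupil center's offset within the detected eye bounding box, sketched below. The (x, y, w, h) box format and the 0.25 dead-zone threshold are illustrative assumptions; a production eye tracker would be far more involved.

```python
# Hypothetical sketch of one gaze-estimation step this paragraph
# describes: classify gaze direction from the pupil center's offset
# inside the detected eye bounding box. Box format (x, y, w, h) and the
# 0.25 dead-zone threshold are assumptions for illustration.
def gaze_direction(pupil_x: float, pupil_y: float, eye_box: tuple) -> str:
    x, y, w, h = eye_box
    # Normalize pupil position to [-1, 1] around the eye-box center.
    nx = (pupil_x - (x + w / 2)) / (w / 2)
    ny = (pupil_y - (y + h / 2)) / (h / 2)
    horiz = "center" if abs(nx) < 0.25 else ("right" if nx > 0 else "left")
    vert = "center" if abs(ny) < 0.25 else ("down" if ny > 0 else "up")
    return f"{horiz}/{vert}"

# Pupil sits left of and slightly above the eye-box center.
print(gaze_direction(112, 58, (100, 50, 40, 20)))  # "left/center"
```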
- The method continues at step 510 when a second signal about the display device 136 is transmitted to the computing device 101 (e.g., to the processor 102 of the computing device 101). In some embodiments, the display device 136 can transmit the second signal to the computing device 101 based at least in part on the field of view 208 or the direction of the eye gaze 210 of the user 204.
- For example, the processor 138 of the display device 136 can determine that the user 204 is looking at, or in the direction of, the display device 136, and the display device 136 can transmit the second signal indicating an availability, a location, or an attribute of the display device 136 to the computing device 101 based on the determination. In some embodiments, the computing device 101 can determine the availability, location, or attribute of the display device based on the second signal as described above.
- The method continues at step 512 when the processor 138 determines a display device haptic effect based at least in part on the field of view 208 of the user 204. In some embodiments, the haptic effect determination module 164 of the display device causes the processor 138 to determine the display device haptic effect.
- For example, the processor 138 can determine the display device haptic effect based on the user 204 of the computing device 101 looking at, or in the direction of, the display device 136 (e.g., in step 510). The processor 138 can determine the display device haptic effect to be output to a user 212 of the display device 136 to indicate to the user 212 that the user 204 is looking at, or in the direction of, the display device 136. As an example, the display device 136 can be a smartwatch and can vibrate briefly to indicate to the user 212 that the user 204 is looking at the display 150 of the display device 136 (e.g., the user 204 is viewing content on the display 150).
- In some embodiments, the display device haptic effect can include one or more haptic effects. For example, the display device haptic effect can include a haptic effect (e.g., a brief vibration) to indicate to the user 212 that the user 204 is looking at the display device 136. The display device haptic effect can include another haptic effect (e.g., a strong vibration) to indicate to the user 212 that the user 212 should adjust an arm, wrist, leg, or hand to allow the user 204 to view content being displayed on the display device 136. As an example, the smartwatch can vibrate at the bottom of the smartwatch toward the bottom of the user's wrist 214 to indicate to the user 212 that the user 204 is looking at the bottom of the display device 136. The user 212 may adjust the user's wrist 214 in response to the display device haptic effect to allow the user 204 to better view the content on the display device 136 (e.g., rotate the wrist 214 such that the bottom of the display device 136 is more visible).
- The method 500 continues at step 514 when the processor 138 transmits a display device haptic signal associated with the display device haptic effect to the haptic output device 160 of the display device 136. In some embodiments, the haptic effect generation module 168 causes the processor 138 to generate and transmit the display device haptic signal to the haptic output device 160.
- The method 500 continues at step 516 when the haptic output device 160 outputs the display device haptic effect. In some embodiments, the display device haptic effect can be output to the user 212 of the display device 136 based on the direction of the field of view 208 of the user 204 of the computing device 101. In other embodiments, the display device haptic effect can be output to the user 204 of the computing device 101 based on the direction of the field of view 208 of the user 204. In some embodiments, the display device haptic effect comprises a vibration, a surface deformation, a squeeze, a poke, and/or a puff of a solid, liquid, gas, or plasma. In some embodiments, the display device haptic effect comprises one or more haptic effects for indicating to the user 212 that the user 204 is looking at the display device 136. In other embodiments, the display device haptic effect comprises one or more haptic effects for indicating to the user 204 that the display device 136 is available and for indicating the location of the display device 136 relative to the user 204.
- The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
- Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
- Also, configurations may be described as a process that is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
- Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not bound the scope of the claims.
- The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
- Embodiments in accordance with aspects of the present subject matter can be implemented in digital electronic circuitry, in computer hardware, firmware, software, or in combinations of the preceding. In one embodiment, a computer may comprise a processor or processors. The processor comprises or has access to a computer-readable medium, such as a random access memory (RAM) coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs including a sensor sampling routine, selection routines, and other routines to perform the methods described above.
- Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
- Such processors may comprise, or may be in communication with, media, for example tangible computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, all electronic, optical, magnetic, or other storage devices capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. Also, various other devices may comprise computer-readable media, such as a router, private or public network, or other transmission device. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
- While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.
Claims (23)
1-20. (canceled)
21. A method comprising:
receiving, by a computing device, a sensor signal indicating a user's field of view;
determining, by the computing device, an availability of a display device within the user's field of view to be communicatively connected to the computing device;
connecting, by the computing device, the computing device to the display device;
determining, by the computing device, a haptic effect indicating the availability of the display device;
sending, by the computing device to the display device, content to be displayed by the display device; and
outputting a haptic signal to cause a haptic output device to output the haptic effect.
22. The method of claim 21, further comprising determining a location of the display device, and wherein the haptic effect is based on the location of the display device.
23. The method of claim 22, wherein determining the location of the display device comprises determining a horizontal or vertical position of the display device with respect to the user.
24. The method of claim 21, wherein the sensor signal indicates the user's eye-gaze direction.
25. The method of claim 21, further comprising receiving one or more signals from the display device indicating an availability of the display device.
26. The method of claim 25, wherein the one or more signals indicate a location of the display device.
27. A system comprising:
a sensor;
a non-transitory computer-readable medium; and
a processor communicatively coupled to the sensor and the non-transitory computer-readable medium, the processor configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to:
receive a sensor signal indicating a user's field of view;
determine an availability of a display device within the user's field of view to be communicatively connected to the computing device;
connect the computing device to the display device;
determine a haptic effect indicating the availability of the display device;
send content to be displayed by the display device; and
output a haptic signal to cause a haptic output device to output the haptic effect.
28. The system of claim 27, wherein the processor is further configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to determine a location of the display device, and wherein the haptic effect is based on the location of the display device.
29. The system of claim 28, wherein the processor is configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to determine a horizontal or vertical position of the display device with respect to the user.
30. The system of claim 27, wherein the sensor signal indicates the user's eye-gaze direction.
31. The system of claim 27, wherein the processor is configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to receive one or more signals from the display device indicating an availability of the display device.
32. The system of claim 31, wherein the one or more signals indicate a location of the display device.
33. A method comprising:
receiving, by a computing device, one or more signals from a display device indicating an availability of the display device and a user's field of view associated with the display device;
connecting, by the computing device, the computing device to the display device;
determining, by the computing device, a haptic effect indicating the availability of the display device;
sending, by the computing device to the display device, content to be displayed by the display device; and
outputting a haptic signal to cause a haptic output device to output the haptic effect.
34. The method of claim 33, wherein the one or more signals indicate the user's eye-gaze direction.
35. The method of claim 33, further comprising determining a location of the display device, and wherein the haptic effect is based on the location of the display device.
36. The method of claim 35, wherein determining the location of the display device comprises determining a horizontal or vertical position of the display device with respect to the user.
37. The method of claim 35, further comprising receiving one or more signals indicating the location of the display device.
38. A system comprising:
a sensor;
a non-transitory computer-readable medium; and
a processor communicatively coupled to the sensor and the non-transitory computer-readable medium, the processor configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to:
receive, by a computing device, one or more signals from a display device indicating an availability of the display device and a user's field of view associated with the display device;
connect the computing device to the display device;
determine a haptic effect indicating the availability of the display device;
send content to be displayed by the display device; and
output a haptic signal to cause a haptic output device to output the haptic effect.
39. The system of claim 38, wherein the one or more signals indicate the user's eye-gaze direction.
40. The system of claim 38, wherein the processor is configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to determine a location of the display device, and wherein the haptic effect is based on the location of the display device.
41. The system of claim 40, wherein the processor is configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to determine a horizontal or vertical position of the display device with respect to the user.
42. The system of claim 40, wherein the processor is configured to execute processor-executable instructions stored in the non-transitory computer-readable medium to receive one or more signals indicating the location of the display device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/227,681 US20190265794A1 (en) | 2016-06-08 | 2018-12-20 | Haptic feedback for opportunistic displays |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/176,692 US10198072B2 (en) | 2016-06-08 | 2016-06-08 | Haptic feedback for opportunistic displays |
US16/227,681 US20190265794A1 (en) | 2016-06-08 | 2018-12-20 | Haptic feedback for opportunistic displays |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date | |
---|---|---|---|---|
US15/176,692 Continuation US10198072B2 (en) | 2016-06-08 | 2016-06-08 | Haptic feedback for opportunistic displays |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190265794A1 true US20190265794A1 (en) | 2019-08-29 |
Family
ID=59034661
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/176,692 Expired - Fee Related US10198072B2 (en) | 2016-06-08 | 2016-06-08 | Haptic feedback for opportunistic displays |
US16/227,681 Abandoned US20190265794A1 (en) | 2016-06-08 | 2018-12-20 | Haptic feedback for opportunistic displays |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/176,692 Expired - Fee Related US10198072B2 (en) | 2016-06-08 | 2016-06-08 | Haptic feedback for opportunistic displays |
Country Status (5)
Country | Link |
---|---|
US (2) | US10198072B2 (en) |
EP (1) | EP3255525A1 (en) |
JP (1) | JP2017224290A (en) |
KR (1) | KR20170138937A (en) |
CN (1) | CN107479687A (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102234432B1 (en) * | 2019-05-13 | 2021-04-01 | (주)부품디비 | Kinesthetic actuator using wavy-shaped mre and electro-permanent magnet |
KR102234431B1 (en) * | 2019-05-13 | 2021-04-01 | (주)부품디비 | Vibration actuator using wavy-shaped mre and electro-permanent magnet |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110037712A1 (en) * | 2009-08-11 | 2011-02-17 | Lg Electronics Inc. | Electronic device and control method thereof |
US20130227410A1 (en) * | 2011-12-21 | 2013-08-29 | Qualcomm Incorporated | Using haptic technologies to provide enhanced media experiences |
US9380621B2 (en) * | 2012-04-12 | 2016-06-28 | Telefonaktiebolaget Lm Ericsson (Publ) | Pairing a mobile terminal with a wireless device |
US20170127289A1 (en) * | 2015-11-04 | 2017-05-04 | At&T Intellectual Property I, L.P. | Augmented Reality Visual Wi-Fi Signal |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7801569B1 (en) * | 2007-03-22 | 2010-09-21 | At&T Intellectual Property I, L.P. | Mobile communications device with distinctive vibration modes |
EP2000885B1 (en) * | 2007-06-08 | 2011-01-26 | Research In Motion Limited | Haptic display for a handheld electronic device |
KR101467796B1 (en) * | 2009-01-12 | 2014-12-10 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US8754757B1 (en) * | 2013-03-05 | 2014-06-17 | Immersion Corporation | Automatic fitting of haptic effects |
US9833697B2 (en) | 2013-03-11 | 2017-12-05 | Immersion Corporation | Haptic sensations as a function of eye gaze |
CN105283840B (en) * | 2013-06-08 | 2019-07-05 | 苹果公司 | For synchronizing the equipment, method and graphic user interface of two or more displays |
KR102092332B1 (en) * | 2013-07-01 | 2020-04-14 | 삼성전자주식회사 | Portable apparatus and method for displaying a screen |
US10067566B2 (en) | 2014-03-19 | 2018-09-04 | Immersion Corporation | Systems and methods for a shared haptic experience |
US10613627B2 (en) | 2014-05-12 | 2020-04-07 | Immersion Corporation | Systems and methods for providing haptic feedback for remote interactions |
2016
- 2016-06-08: US application US15/176,692 filed (US10198072B2; status: Expired - Fee Related)
2017
- 2017-06-02: KR application KR1020170068880A filed (KR20170138937A; status: unknown)
- 2017-06-06: JP application JP2017111445A filed (JP2017224290A; status: Pending)
- 2017-06-07: CN application CN201710422385.4A filed (CN107479687A; status: Pending)
- 2017-06-08: EP application EP17275083.8A filed (EP3255525A1; status: Withdrawn)
2018
- 2018-12-20: US application US16/227,681 filed (US20190265794A1; status: Abandoned)
Also Published As
Publication number | Publication date |
---|---|
KR20170138937A (en) | 2017-12-18 |
EP3255525A1 (en) | 2017-12-13 |
US10198072B2 (en) | 2019-02-05 |
US20170357315A1 (en) | 2017-12-14 |
CN107479687A (en) | 2017-12-15 |
JP2017224290A (en) | 2017-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10564729B2 (en) | Haptic feedback using a field of view | |
US10248210B2 (en) | Systems and methods for haptically-enabled conformed and multifaceted displays | |
US10185441B2 (en) | Systems and methods for position-based haptic effects | |
US10561334B2 (en) | Portable apparatus and method of changing screen of content thereof | |
EP3279784A1 (en) | Systems and methods for deformation and haptic effects | |
JP6653536B2 (en) | System and method for shape input and output for tactilely usable variable surfaces | |
CN106293055B (en) | Electronic device and method for providing tactile feedback thereof | |
JP2018147517A (en) | Wearable device manager | |
JP2015228214A (en) | Haptic notification manager | |
CN103733115A (en) | Wearable computer with curved display and navigation tool | |
US20190265794A1 (en) | Haptic feedback for opportunistic displays | |
US20200192480A1 (en) | Systems and methods for providing haptic effects based on a user's motion or environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: IMMERSION CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: LEVESQUE, VINCENT; REEL/FRAME: 048290/0901. Effective date: 20170601 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |