US20170017310A1 - Systems and Methods for Optical Transmission of Haptic Display Parameters - Google Patents
- Publication number: US20170017310A1 (application US15/278,567)
- Authority: US (United States)
- Prior art keywords: haptic, signal, sensor, image, stylus
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F3/0304 — Detection arrangements using opto-electronic means (input arrangements for interaction between user and computer)
- G06F3/005 — Input arrangements through a video camera
- G06F3/016 — Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/03545 — Pointing devices; pens or stylus
- G06T7/408
- G06T7/60 — Image analysis; analysis of geometric attributes
- G06T7/90 — Image analysis; determination of colour characteristics
- G06F2203/014 — Force feedback applied to GUI
- H04N21/8358 — Generation of protective data involving watermark
General Considerations
- A device may comprise a processor or processors. The processor comprises a computer-readable medium, such as a random access memory (RAM), coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs for editing an image. Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
- Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, an ASIC, a configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. The processor, and the processing, described may be in one or more structures and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
Description
- The present disclosure relates generally to systems and methods for optical transmission of haptic display parameters.
- Touch-enabled devices have become increasingly popular. For instance, peripherals, accessories, and other devices may be configured with haptic output devices so that a user can be provided with haptic sensations that convey information, such as an indication of the occurrence of various events. Such effects may be used, for example, to supplant or augment a visual or auditory effect. Such peripherals may receive information for generating haptic effects over a wireless communication channel, such as Bluetooth or Wi-Fi; however, at times such communication channels may introduce latency into the communication and hamper the ability of the peripheral to provide the haptic effect efficiently and effectively. Optical transmission of at least some of the parameters used to generate the effect may help improve the ability of a device to provide haptic effects.
- Embodiments provide systems and methods for optical transmission of haptic display parameters. For example, one disclosed method comprises receiving an image signal from an image sensor, the image signal associated with an image; determining a haptic signal based at least in part on the image signal; generating a haptic effect signal based at least in part on the haptic signal; and transmitting the haptic effect signal to a haptic output device. In another embodiment, a computer-readable medium comprises program code for implementing such a method.
- In one embodiment, a device comprises an image sensor configured to generate an image signal associated with an image, a haptic output device configured to receive a haptic effect signal and generate a haptic effect based at least in part on the haptic effect signal, and a processor in communication with the image sensor and the haptic output device. In one such device, the processor is configured to determine a haptic signal based at least in part on the image signal, generate a haptic effect signal based at least in part on the haptic signal, and transmit the haptic effect signal to the haptic output device.
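- Expressed as code, the pipeline described in the two paragraphs above might be organized as in the following minimal sketch. It is illustrative only: the disclosure defines no API, so every name here (HapticSignal, determine_haptic_signal, process_image, the haptic_output_device object) is an assumption.

```python
# Minimal sketch of the disclosed pipeline: image signal -> haptic signal ->
# haptic effect signal -> haptic output device. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class HapticSignal:
    magnitude: float  # normalized 0.0-1.0, decoded from the image
    texture_id: int   # texture identifier, decoded from the image

def determine_haptic_signal(image_signal: bytes) -> HapticSignal:
    """Decode haptic parameters embedded in the captured image (placeholder)."""
    if not image_signal:
        return HapticSignal(0.0, 0)
    texture = image_signal[1] if len(image_signal) > 1 else 0
    return HapticSignal(image_signal[0] / 255.0, texture)

def generate_haptic_effect_signal(haptic: HapticSignal) -> dict:
    """Shape a drive signal for the haptic output device from the haptic signal."""
    return {"amplitude": haptic.magnitude, "texture": haptic.texture_id}

def process_image(image_signal: bytes, haptic_output_device) -> None:
    """Receive an image signal, derive the two signals, and transmit the result."""
    haptic_signal = determine_haptic_signal(image_signal)
    effect_signal = generate_haptic_effect_signal(haptic_signal)
    haptic_output_device.play(effect_signal)  # transmit to the actuator
```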
- These illustrative embodiments are mentioned not to limit or define the invention, but rather to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, which provides further description of the invention. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
- The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more examples of embodiments and, together with the description of example embodiments, serve to explain the principles and implementations of the embodiments.
- FIG. 1 illustrates an electronic device for optical transmission of haptic display parameters in accordance with an embodiment;
- FIG. 2 illustrates an electronic device for optical transmission of haptic display parameters in accordance with an embodiment;
- FIG. 3 is a flow chart illustrating a method for optical transmission of haptic display parameters in accordance with an embodiment; and
- FIG. 4 is a flow chart illustrating a method for evaluating a color signal in a method for optical transmission of haptic display parameters in accordance with an embodiment.
- Example embodiments are described herein in the context of systems and methods for optical transmission of haptic display parameters. Those of ordinary skill in the art will realize that the following description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of example embodiments as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following description to refer to the same or like items.
- In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another.
- FIG. 1 illustrates an electronic device 100 for optical transmission of haptic display parameters. In the embodiment shown, the electronic device 100 is a tablet. The device 100 includes a housing 110, in which are disposed a display 120 and an IR diode 130. The embodiment shown also includes a stylus 140.
- The stylus 140 shown is able to detect and process optical display information to generate and play haptic effects in real-time or near real-time. The optical information can be incident to, projected on, or transmitted by a display surface, device, or other light array. The haptic playback in the stylus is initiated via optical transmission from the surface display 120. For example, with the stylus 140, an "active" stylus can be created that is able to process optical information via a small embedded camera and render haptic sensations via a haptic effect generator housed in the stylus. Proprietary optical signatures can be embedded in the display surface, allowing the application developer to control what haptic effects are played in the stylus according to the location, state, and movement of the stylus. In such an embodiment, an end user would feel haptic effects in the stylus 140 that may be modulated based on the speed, location, inclination, etc. of the stylus tip across a display surface, such as a capacitive touchscreen.
- Numerous other embodiments are disclosed herein and variations are within the scope of this disclosure. For example, a peripheral or accessory other than a stylus may include an image sensor and be capable of sensing optical haptic parameters. For instance, a handheld scanner may implement the methods described herein to impart information to the user of the scanner. Similarly, a mobile phone that includes a camera may implement methods such as those described herein. These are merely additional examples and are not meant to limit the scope of the present disclosure.
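- The modulation just described, varying effects with the speed, location, and inclination of the stylus tip, might look like the sketch below. The linear scaling rule, the parameter ranges, and all names are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical sketch: modulate a base haptic magnitude by stylus tip speed
# and inclination. The scaling rule and constants are invented.

def modulate_effect(base_magnitude: float, tip_speed_mm_s: float,
                    inclination_deg: float, max_speed_mm_s: float = 500.0) -> float:
    """Scale a base magnitude by how fast and how steeply the stylus moves."""
    speed_factor = min(tip_speed_mm_s / max_speed_mm_s, 1.0)
    # Assume a more upright stylus presses harder, so weight it slightly higher.
    incline_factor = 0.5 + 0.5 * abs(inclination_deg) / 90.0
    return max(0.0, min(1.0, base_magnitude * speed_factor * incline_factor))
```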
- This illustrative example is given to introduce the reader to the general subject matter discussed herein. The invention is not limited to this example. The following sections describe various additional non-limiting embodiments and examples of devices, systems, and methods for optical transmission of haptic display parameters.
- FIG. 2 illustrates a stylus 140 for optical transmission of haptic display parameters according to one embodiment. The stylus 140 includes a processor 210 and memory 220 in communication with the processor 210. The stylus 140 also includes a communication interface 230.
- In FIG. 2, the communication interface 230 is in communication with the processor 210 and provides wired or wireless communications from the stylus 140 to other components or other devices. For example, the communication interface 230 may provide wireless communications between the stylus 140 and a wireless sensor or a wireless actuation device. In some embodiments, the communication interface 230 may provide communications to one or more other devices, such as another stylus 140, to allow users to interact with each other at their respective devices. The communication interface 230 can be any component or collection of components that enables the stylus 140 to communicate with another component or device. For example, the communication interface 230 may comprise a PCI network adapter, a USB network adapter, or an Ethernet adapter. The communication interface 230 may communicate using wireless Ethernet, including the 802.11a, b, g, or n standards. In one embodiment, the communication interface 230 can communicate using Radio Frequency (RF), Bluetooth, CDMA, TDMA, FDMA, GSM, Wi-Fi, satellite, or other cellular or wireless technology. In other embodiments, the communication interface 230 may communicate through a wired connection and may be in communication with one or more networks, such as Ethernet, token ring, USB, FireWire 1394, fiber optic, etc. In some embodiments, the stylus 140 comprises a single communication interface 230. In other embodiments, the stylus 140 comprises two, three, four, or more communication interfaces. Thus, in embodiments, the stylus 140 can communicate with one or more components and/or devices through one or more communication interfaces. In other embodiments, a stylus 140 may not comprise a communication interface 230.
- The stylus 140 shown in FIG. 2 also comprises a haptic output device 240, which can be any component or collection of components capable of outputting one or more haptic effects. For example, a haptic output device can be one of various types including, but not limited to, an eccentric rotating mass (ERM) actuator, a linear resonant actuator (LRA), a piezoelectric actuator, a voice coil actuator, an electro-active polymer (EAP) actuator, a shape memory alloy, a pager, a DC motor, an AC motor, a moving magnet actuator, an E-core actuator, a smartgel, an electrostatic actuator, an electrotactile actuator, a deformable surface, an electrostatic friction (ESF) device, an ultrasonic friction (USF) device, a thermal actuator for changing the surface temperature, or any other haptic output device or collection of components that perform the functions of a haptic output device or that are capable of outputting a haptic effect. Multiple haptic output devices or different-sized haptic output devices may be used to provide a range of vibrational frequencies, which may be actuated individually or simultaneously. Various embodiments may include a single haptic output device or multiple haptic output devices, and may have the same type or a combination of different types of haptic output devices. In some embodiments, one or more haptic output devices are directly or indirectly in communication with the electronic device, such as via wired or wireless communication. In one embodiment, the electronic device can be placed in, or integrated into, a vehicle, and one or more haptic output devices are embedded into the vehicle. For example, the stylus 140 may be used to interact with a display screen mounted in the dashboard of the vehicle. In some embodiments, instead of, or in addition to, having a haptic output device 240, the stylus 140 has one or more other output devices. For example, the stylus 140 may have a speaker and/or a small display. In one embodiment, the stylus 140 has one or more haptic output devices, one or more speakers, and one or more displays. Numerous other embodiments are disclosed herein and variations are within the scope of this disclosure.
- In various embodiments, one or more haptic effects may be produced in any number of ways or in a combination of ways. For example, in one embodiment, one or more vibrations may be used to produce a haptic effect, such as by rotating an eccentric mass or by linearly oscillating a mass. In some such embodiments, the haptic effect may be configured to impart a vibration to the entire electronic device or to only one surface or a limited part of the electronic device. In another embodiment, friction between two or more components, or friction between at least one component and at least one contact, may be used to produce a haptic effect, such as by applying a brake to a moving component to provide resistance to movement or to provide a torque. In order to generate vibration effects, many devices utilize some type of actuator and/or other haptic output device. Known haptic output devices used for this purpose include an electromagnetic actuator such as an eccentric rotating mass ("ERM") in which an eccentric mass is moved by a motor, a linear resonant actuator ("LRA") in which a mass attached to a spring is driven back and forth, or a "smart material" such as piezoelectric, electro-active polymers, or shape memory alloys.
- In other embodiments, deformation of one or more components can be used to produce a haptic effect. For example, one or more haptic effects may be output to change the shape of a surface or the coefficient of friction of a surface. In an embodiment, one or more haptic effects are produced by creating electrostatic forces and/or ultrasonic forces that are used to change friction on a surface. In other embodiments, an array of transparent deforming elements may be used to produce a haptic effect, such as one or more areas comprising a smartgel. Haptic output devices also broadly include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF) or ultrasonic surface friction (USF), those that induce acoustic radiation pressure with an ultrasonic haptic transducer, those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on. In some embodiments, a haptic effect is a kinesthetic effect. U.S. patent application Ser. No. 13/092,484 describes ways that one or more haptic effects can be produced and describes various haptic output devices. The entirety of U.S. patent application Ser. No. 13/092,484, filed Apr. 22, 2011, is hereby incorporated by reference.
- In FIG. 2, the stylus 140 also comprises a sensor 250 to detect movement of or interaction with the stylus 140, an electronic device, and/or a surface. The sensor 250 is in communication with the processor 210 and provides sensor information to the processor 210. For example, sensor 250 may provide one or more interactions to the processor 210. The sensor 250 may provide an input signal indicating one or more interactions. As another example, sensor 250 can provide information corresponding to one or more interactions with the stylus 140 to the processor 210. In embodiments, the information the sensor 250 provides to the processor 210 corresponds to an interaction with the entire stylus 140, such as a user shaking the stylus 140. In other embodiments, the information sensor 250 provides to the processor 210 corresponds to an interaction with a part of the stylus 140, such as the tip of the stylus 140.
- The sensor 250 may be utilized to detect one or more of a number of conditions. For example, in one embodiment, the sensor, such as an accelerometer, gyroscope, or compass, can detect the inclination of the pen. In another embodiment, the sensor comprises an accelerometer capable of measuring the user's writing speed. Another embodiment uses the camera 260 instead of, or in addition to, the sensor 250 to detect the user's writing speed. In yet another embodiment, the sensor 250 detects the pressure of the user's grip on the stylus 140, using, for example, a pressure sensor on the surface of the stylus 140. In yet another embodiment, the sensor 250 detects the pressure exerted by the user on the writing surface using a pressure sensor on the tip of the pen.
- In one embodiment of the present invention, sensor 250 comprises one or more biometric sensors that can be installed on the pen to detect or measure the mood of the user (e.g., relaxed or excited). The input from the biometric sensor can be used to vary the haptic effect. For instance, the haptic feedback may be intensified, such as by increasing the frequency and/or magnitude, if the user is determined to be excited.
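- As a sketch of the rule just described, intensifying feedback when the user appears excited, the following uses heart rate as the biometric input; the threshold and multipliers are invented for illustration.

```python
# Hypothetical sketch: vary frequency and magnitude with user arousal as
# estimated from a biometric sensor. The 100 bpm threshold is an assumption.

def adjust_for_mood(base_freq_hz: float, base_magnitude: float,
                    heart_rate_bpm: float) -> tuple[float, float]:
    """Intensify the effect when the user is determined to be excited."""
    if heart_rate_bpm > 100.0:  # "excited": raise frequency and magnitude
        return base_freq_hz * 1.5, min(1.0, base_magnitude * 1.25)
    return base_freq_hz, base_magnitude  # "relaxed": leave the effect unchanged
```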
- The embodiment shown in FIG. 2 depicts a single sensor 250. In some embodiments, multiple sensors can be used. Additionally, a sensor may be housed in the same component as the other components of the stylus 140 or in a separate component. For example, in some embodiments, the processor 210, memory 220, and sensor 250 are all comprised in a stylus 140. In some embodiments, a sensor is placed in a component separate from the portion of the stylus 140 that the user holds. For instance, a wearable sensor may be in communication with the processor 210 and provide information regarding the stylus.
- Sensor 250 may comprise any number and/or type of sensing components. For example, sensor 250 can comprise an accelerometer and/or gyroscope. A non-limiting list of examples of sensors and interactions is provided in Table 1 below:
- TABLE 1 — Exemplary Sensors and Conditions
  - Accelerometer: force in one, two, or three directions
  - Altimeter: altitude
  - Thermometer: ambient temperature; user body temperature
  - Heart rate monitor: heart rate of device user
  - Skin resistance monitor: skin resistance of device user
  - Oxygen sensor: oxygen use of device user
  - Audio sensor/microphone: ambient audio and/or audio generated by device user
  - Photosensor: ambient light
  - IR/photosensor: user eye movement, position, body temperature
  - Hygrometer: relative humidity
  - Speedometer: velocity
  - Pedometer/odometer: distance traveled
  - Chronometer: time of day, date
  - Weight sensor: mass or quantity of matter
- The stylus 140 shown in FIG. 2 also comprises a camera 260. The camera 260 is used to "see" the writing surface, for example, a touch-sensitive device with a collocated visual display (host device), such as is shown in FIG. 1. The pen point of the stylus 140 may comprise a conductive material capable of activating a capacitive touch screen when the user holds the pen on a capacitive-sensitive touch screen. In various embodiments, the stylus 140 may be used on a variety of surfaces on which predefined patterns are displayed or printed (e.g., paper, walls, and tables). The stylus 140 is able to recognize these patterns with the camera 260 and corresponding software and/or firmware and produce haptic effects based on the surface optical patterns recognized by the camera 260.
- In one embodiment, the camera 260 is used to see the writing surface, which may be a touch-sensitive device with a collocated visual display. The pen point in such an embodiment could be made of conductive material to activate a capacitive touch screen when the user holds the pen on a capacitive-sensitive touch screen. The same pen can be used on other surfaces with predefined patterns (e.g., paper, walls, and tables) and produce haptic effects based on the surface optical patterns recognized by the camera 260.
- In one embodiment, the stylus 140 comprises a smart pen. Smart pens are pen-like devices that can record handwriting and audio at the same time for up to several hours. Such pens comprise a camera behind the pen point (cartridge) that looks over a patterned paper to keep track of the information being written or drawn.
- As will be clear to one of skill in the art, numerous other embodiments are disclosed herein and variations are within the scope of this disclosure.
- FIG. 3 illustrates a flow chart directed to a method 300 of optical transmission of haptic display parameters in accordance with an embodiment. The method 300 shown in FIG. 3 will be described with respect to the stylus 140 shown in FIGS. 1 and 2.
- The method 300 begins in block 310 when a first image signal is received via camera 260. For example, as the stylus 140 is passed over an image, the camera 260 captures and processes the image and communicates an image signal to the processor 210.
- Next, in block 320, the processor 210 receives a sensor signal from sensor 250. The sensor signal may indicate, for example, that the stylus 140 is being moved.
- The processor 210 next determines a haptic effect based at least in part on the image signal and the sensor signal (block 330). In some embodiments, the processor 210 relies solely on the image signal.
- The processor 210 then generates a haptic effect signal based at least in part on the haptic effect (block 340). In some embodiments, in addition to the haptic effect, the processor 210 may, for example, take the type or number of haptic output devices 240 into account when generating the haptic effect signal.
- Finally, the processor 210 transmits the haptic effect signal to the haptic output device 240 in order to output the effect (block 350). The effect can then be felt by the user as, for example, a vibration of the stylus.
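- Read as code, the blocks of method 300 might be arranged as in the sketch below. The camera, sensor, processor, and actuator interfaces are assumed; only the block numbering comes from FIG. 3.

```python
# Hypothetical sketch of method 300; blocks 310-350 are marked in comments.

def method_300(camera, sensor, processor, haptic_output_device) -> None:
    image_signal = camera.capture()                     # block 310: image signal
    sensor_signal = sensor.read()                       # block 320: sensor signal
    haptic_effect = processor.determine_haptic_effect(  # block 330: may rely on
        image_signal, sensor_signal)                    #   the image signal alone
    effect_signal = processor.generate_effect_signal(   # block 340: may account
        haptic_effect, haptic_output_device.kind)       #   for the actuator type
    haptic_output_device.play(effect_signal)            # block 350: output effect
```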
- In one embodiment, when the device 100 housing the display 120 is turned on, an initial synchronization procedure is executed in which the initial conditions of the current graphical user interface ("GUI") presented by the device 100 on the display 120 are transmitted to the stylus 140, thus preparing the active stylus with the information necessary to generate haptic effects. In such an embodiment, if the stylus 140 has been far away from the device 100 and the connection is re-established, a synchronization process is executed again. The synchronization may include, for example, whether the device 100 is displaying a GUI or a specific application. With this information, the stylus is able to determine what type of effect to generate in response to detected optical patterns or when drawing something on the display 120.
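- One way to picture the synchronization payload is sketched below; the fields and JSON encoding are purely illustrative assumptions, since the disclosure does not specify a format.

```python
# Hypothetical sketch of an initial synchronization message sent from the host
# device 100 to the stylus 140 when the display is turned on or a connection
# is re-established. Field names and encoding are invented.

import json

def build_sync_message(showing_gui: bool, app_id: str, effect_map_version: int) -> bytes:
    """Describe the current GUI state so the stylus can choose effect types."""
    state = {
        "gui": showing_gui,               # GUI vs. a specific application
        "application": app_id,            # which app's optical patterns to expect
        "effect_map": effect_map_version, # which pattern-to-effect table to use
    }
    return json.dumps(state).encode("utf-8")
```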
- FIG. 4 is a flow chart illustrating a method for evaluating a color signal in a method for optical transmission of haptic display parameters in accordance with an embodiment. The embodiment shown is similar to some steganography techniques used in images. First, the processor 210 evaluates the color of an image (block 410).
- The color of the image includes various types of information, including an RGB value. In one embodiment, the processor evaluates the lower 2 bits of the RGB value (block 420). The processor 210 then determines the magnitude of a haptic effect based at least in part on the lower 2 bits of the RGB value (block 430).
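- A minimal sketch of this least-significant-bit scheme follows. The disclosure only says the lower 2 bits of the RGB value are evaluated; packing the six recovered bits into a normalized magnitude is an assumption.

```python
# Hypothetical sketch: recover a haptic magnitude hidden in the lower 2 bits
# of each RGB channel, in the spirit of LSB steganography.

def decode_magnitude(r: int, g: int, b: int) -> float:
    """Extract 2 bits per channel and normalize the 6-bit value to 0.0-1.0."""
    bits = ((r & 0b11) << 4) | ((g & 0b11) << 2) | (b & 0b11)
    return bits / 63.0  # 63 is the largest 6-bit value

# Two visually near-identical whites can carry very different payloads:
assert decode_magnitude(255, 255, 255) == 1.0  # lower bits all set
assert decode_magnitude(252, 252, 252) == 0.0  # lower bits all clear
```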
- In another embodiment, the processor 210 utilizes the color and width of the drawing to determine the type of haptic effect to be played and the magnitude, modulation, or texture to be used.
- In another embodiment, a region of the display 120 may contain a special pixel pattern or grid. When the camera 260 provides the image signal to the processor 210, the processor 210 can then generate haptic effects that correspond to "textures" in that region of the display 120, or that are modulated by the speed at which the user moves the stylus across the region. In yet another embodiment, when one or more buttons are displayed on the display 120, the haptic information can be encoded in the color patterns displayed at the edges of the icon/button.
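- A sketch of the region-based idea above: map a recognized pixel pattern to a texture, then modulate it by traversal speed. The registry contents, pattern names, and modulation rule are invented for illustration.

```python
# Hypothetical sketch: look up a haptic "texture" for a recognized pixel
# pattern and scale its magnitude by stylus traversal speed.

from typing import Optional

TEXTURE_REGISTRY = {
    "hatch_grid": {"base_magnitude": 0.6, "period_ms": 20},  # rough texture
    "dot_matrix": {"base_magnitude": 0.3, "period_ms": 50},  # gentle texture
}

def effect_for_region(pattern_id: str, speed_mm_s: float) -> Optional[dict]:
    texture = TEXTURE_REGISTRY.get(pattern_id)
    if texture is None:
        return None  # no optical signature recognized in this region
    scale = min(1.0, 0.25 + speed_mm_s / 400.0)  # faster traversal, stronger effect
    return {"magnitude": texture["base_magnitude"] * scale,
            "period_ms": texture["period_ms"]}
```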
- In some embodiments, the device 100 may be equipped with an infrared ("IR") light not visible to the human eye, located such that the stylus 140 can detect the IR light when the stylus 140 is in close proximity to the device 100. In some embodiments, the IR light is emitted by an LED, but in other embodiments, the light generator may be embedded in the visual display.
- In such an embodiment, the device 100 may be capable of delivering more information from the device 100 to the stylus 140 than over some other means, such as Wi-Fi or Bluetooth. In one such embodiment, a pulse-width modulated ("PWM") signal encodes information pertinent to the generation of the haptic effect. In one such embodiment, a complete protocol is utilized to establish optical communication of the haptic information. For example, the PWM signal could indicate the magnitude and texture of a haptic effect as a 16-bit number over a 5-msec time period, which in embodiments is a sufficiently frequent refresh rate to accurately output the haptic information. Note that the haptic loop within the pen will be much faster. In some embodiments, more complex information is transmitted to the stylus 140, depending on the clock used to generate the PWM and the amount of information to be transmitted.
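- The figures above imply a data rate of roughly 3.2 kbit/s (16 bits every 5 ms, a 200 Hz refresh). The sketch below reassembles such a word from an already-sampled bit stream; the split into magnitude and texture bytes, and the sampling front end it presumes, are assumptions.

```python
# Hypothetical sketch: pack 16 optically received bits (MSB first) into
# magnitude and texture fields. The byte-level field layout is invented.

from typing import Iterable

def decode_pwm_word(bits: Iterable[int]) -> tuple[float, int]:
    """Reassemble one 16-bit haptic word from sampled PWM bits."""
    word = 0
    for bit in bits:
        word = (word << 1) | (bit & 1)
    word &= 0xFFFF                    # keep exactly 16 bits
    magnitude = (word >> 8) / 255.0   # assumed: high byte encodes magnitude
    texture_id = word & 0xFF          # assumed: low byte encodes texture
    return magnitude, texture_id
```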
touchscreen tablet device 100. The stylus 140 contains a capacitive tip as well as acamera 260,processor 210, andhaptic actuator 250. A special optical signature (“grid” or “pattern” within thedisplay surface 120, which may be invisible or difficult to perceive by an end user) could indicate when the stylus 140 is writing on a paper surface, or it is being used to choose a writing implement from a menu. Thecamera 250 may also detect position information about the stylus angle, and this information might be used to modulate haptic effects being played in the stylus accordingly. The camera may detect information about the stylus' speed on the surface, when the stylus is touching the surface, or not, and all this information may be used to modulate haptic effects displayed to a user. In various embodiments, a user may be provided with different haptic effects depending upon the writing instrument chosen (different vibration frequency or effect strength), paper type chosen, and the speed of writing. - In another embodiment, a user plays a game on a
- In another embodiment, a user plays a game on a touchscreen device 100 with a stylus 140. The stylus 140 contains a capacitive tip as well as a camera 260, a processor 210, and a haptic output device 240. When the user touches gaming elements on the display 120, the stylus's camera 260 detects optical information that differentiates the elements from the background, for instance RGB color, intensity, or pixel density. The stylus 140 processes this optical information and delivers a haptic effect via the actuator. In one example, a user is playing a game where she can pop bubbles with the stylus 140 and feel the "pops" as vibrotactile effects in the stylus 140.
- In yet another embodiment, a group of users operates a large tabletop display with multiple styluses in order to draw and annotate a group sketch. The tabletop display device uses frustrated total internal reflection (FTIR) for touch detection. Each user has a stylus that is capable of reading optical information produced by the display surface and rendering haptic effects to the hand of the user. Each user is able to write simultaneously and feel effects independently. Such a configuration would be very challenging to accomplish using wireless protocols due to the number of independent devices.
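A hypothetical sketch of the bubble-popping logic: classify each sampled pixel as game element or background by RGB distance, and trigger a short vibrotactile pulse only on the transition onto a bubble. The reference color, threshold, and play_pop callback are assumptions for illustration:

```python
def is_bubble(pixel, bubble_rgb=(80, 180, 255), threshold=60.0):
    """Classify a sampled pixel as a game element by color distance."""
    dist = sum((p - b) ** 2 for p, b in zip(pixel, bubble_rgb)) ** 0.5
    return dist < threshold

def on_camera_sample(pixel, was_on_bubble, play_pop):
    """Fire the 'pop' effect once, on entry into a bubble region."""
    on_bubble = is_bubble(pixel)
    if on_bubble and not was_on_bubble:
        play_pop()  # e.g., a brief vibrotactile pulse on the actuator
    return on_bubble  # carried into the next sample as state
```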
- In a further embodiment, a user is playing with a game controller or home entertainment system that uses an IR light array for detecting gestures and movement of the human body. The user can hold a controller unit with an optical sensor that can detect the light array. Detected transitions or signal modulation might trigger the controller to play different haptic effects. The system could detect when the device is pointing at a gaming element onscreen and play an effect; for example, the user might feel an effect when a virtual bat makes contact with a virtual baseball.
- Latency can be a key usability challenge in device interactions that use wireless communication protocols (e.g., Bluetooth, Wi-Fi, or NFC) to receive transmission signals and display haptic information to the user. For instance, a user operating a stylus with a touchscreen mobile device, where the device sends information wirelessly to the stylus to display haptic information, may encounter latency that negatively impacts usability and performance.
- One benefit of using optical transmission of haptic effects is that the haptic device can receive initiating signals in real time, without reliance on wireless communication signals to initiate haptic effect playback. This also may lessen the usability burden of "pairing" the device via handshakes to establish the communication protocol, which can be cumbersome in the case of Bluetooth or Wi-Fi networking connections. Further, an optical-detecting haptic device could be sold as a third-party device and used with any display surface or device that displays an optical signature it can recognize.
- In one embodiment, application developers are able to embed specific optical information into their applications that will provide a user who has an optical-detecting haptic stylus or other device with an enhanced experience. Further, pre-determined optical signatures could be made available to application developers, along with the technology to read them and render haptic effects.
- Some devices described herein may deliver haptic output in a way that creates realism and enhances usability by initiating the signal for triggering haptic effects at a host device. The host device then processes the haptic requirements for specific software applications, creates the haptic signal/information, and sends it to the haptic device, lessening the demands on the device providing the haptic feedback. Since this processing requires processing cycles, memory, and time, devices that can overcome the restrictions imposed by limited throughput in a wireless communication channel would be advantageous. Further, a haptic device that reads haptic parameters directly from a graphical user interface, displayed as optical information, may substantially eliminate the latency inherent in a wireless communication channel, allowing the haptic effect to be generated such that it feels almost immediate to a user.
- While the methods and systems herein are described in terms of software executing on various machines, the methods and systems may also be implemented as specifically configured hardware, such as a field-programmable gate array (FPGA) specifically configured to execute the various methods. For example, embodiments can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or a combination thereof. In one embodiment, a device may comprise a processor or processors. The processor comprises a computer-readable medium, such as a random access memory (RAM), coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs for editing an image. Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field-programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as programmable logic controllers (PLCs), programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
- Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
- The foregoing description of some embodiments of the invention has been presented only for the purpose of illustration and description and is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Numerous modifications and adaptations thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention.
- Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, operation, or other characteristic described in connection with the embodiment may be included in at least one implementation of the invention. The invention is not restricted to the particular embodiments described as such. The appearance of the phrase “in one embodiment” or “in an embodiment” in various places in the specification does not necessarily refer to the same embodiment. Any particular feature, structure, operation, or other characteristic described in this specification in relation to “one embodiment” may be combined with other features, structures, operations, or other characteristics described in respect of any other embodiment.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/278,567 US20170017310A1 (en) | 2013-12-13 | 2016-09-28 | Systems and Methods for Optical Transmission of Haptic Display Parameters |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/105,266 US9489048B2 (en) | 2013-12-13 | 2013-12-13 | Systems and methods for optical transmission of haptic display parameters |
US15/278,567 US20170017310A1 (en) | 2013-12-13 | 2016-09-28 | Systems and Methods for Optical Transmission of Haptic Display Parameters |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/105,266 Continuation US9489048B2 (en) | 2013-12-13 | 2013-12-13 | Systems and methods for optical transmission of haptic display parameters |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170017310A1 (en) | 2017-01-19 |
Family
ID=52272817
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/105,266 Active 2034-01-25 US9489048B2 (en) | 2013-12-13 | 2013-12-13 | Systems and methods for optical transmission of haptic display parameters |
US15/278,567 Abandoned US20170017310A1 (en) | 2013-12-13 | 2016-09-28 | Systems and Methods for Optical Transmission of Haptic Display Parameters |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/105,266 Active 2034-01-25 US9489048B2 (en) | 2013-12-13 | 2013-12-13 | Systems and methods for optical transmission of haptic display parameters |
Country Status (5)
Country | Link |
---|---|
US (2) | US9489048B2 (en) |
EP (1) | EP2884370B1 (en) |
JP (2) | JP6449639B2 (en) |
KR (1) | KR20150069545A (en) |
CN (1) | CN104714687B (en) |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10032065B2 (en) * | 2013-10-25 | 2018-07-24 | Wacom Co., Ltd. | Dynamic handwriting verification, handwriting-based user authentication, handwriting data generation, and handwriting data preservation |
JP5841297B1 (en) | 2013-10-25 | 2016-01-13 | 株式会社ワコム | Handwritten data output method and computer system |
US9489048B2 (en) | 2013-12-13 | 2016-11-08 | Immersion Corporation | Systems and methods for optical transmission of haptic display parameters |
US9817489B2 (en) | 2014-01-27 | 2017-11-14 | Apple Inc. | Texture capture stylus and method |
US20150212578A1 (en) * | 2014-01-27 | 2015-07-30 | Apple Inc. | Touch Implement with Haptic Feedback for Simulating Surface Texture |
US10339342B2 (en) * | 2014-05-09 | 2019-07-02 | Lenovo (Singapore) Pte. Ltd. | Data transfer based on input device identifying information |
KR101846256B1 (en) * | 2014-05-09 | 2018-05-18 | 삼성전자주식회사 | Tactile feedback apparatus and method for providing tactile feeling |
US9400570B2 (en) | 2014-11-14 | 2016-07-26 | Apple Inc. | Stylus with inertial sensor |
JP2018501558A (en) * | 2014-12-02 | 2018-01-18 | Thomson Licensing | Haptic method and device for capturing and rendering sliding friction |
US9575573B2 (en) | 2014-12-18 | 2017-02-21 | Apple Inc. | Stylus with touch sensor |
JP6651297B2 (en) * | 2015-03-27 | 2020-02-19 | University of Tampere | Haptic stylus |
US20180299976A1 (en) * | 2015-09-07 | 2018-10-18 | Somboon Chiewcharnpipat | Digitized writing apparatus |
KR20170037158A (en) * | 2015-09-25 | 2017-04-04 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US9851818B2 (en) * | 2015-10-19 | 2017-12-26 | Microsoft Technology Licensing, Llc | Handheld input apparatus |
CN105353969B (en) * | 2015-10-23 | 2019-08-06 | 广东小天才科技有限公司 | Method and system for awakening screen |
WO2017110195A1 (en) * | 2015-12-25 | 2017-06-29 | 住友理工株式会社 | Tactile vibration presentation device |
US10671186B2 (en) * | 2016-06-15 | 2020-06-02 | Microsoft Technology Licensing, Llc | Autonomous haptic stylus |
US10296089B2 (en) | 2016-08-10 | 2019-05-21 | Microsoft Technology Licensing, Llc | Haptic stylus |
US10725544B1 (en) * | 2016-09-09 | 2020-07-28 | Apple Inc. | Pencil haptics |
US10268288B1 (en) | 2016-09-20 | 2019-04-23 | Apple Inc. | Stiffness rendering for a pencil |
US20190235650A1 (en) * | 2016-10-11 | 2019-08-01 | Hewlett-Packard Development Company, Lp | Digital stylus nib including wear indicator |
DE102017111897B3 (en) | 2017-05-31 | 2018-09-06 | Trw Automotive Electronics & Components Gmbh | Operating device for a vehicle component and method for generating a feedback |
DE102018120760B4 (en) | 2018-07-12 | 2022-11-17 | Tdk Electronics Ag | Pen-type input and/or output device and method for generating a haptic signal |
CN109491526B (en) * | 2019-01-11 | 2024-05-28 | 桂林理工大学南宁分校 | Electronic pen with writing touch sense and implementation method |
US20220171518A1 (en) * | 2019-02-26 | 2022-06-02 | Sony Group Corporation | Information processing device, information processing method, and program |
JP2020177587A (en) * | 2019-04-22 | 2020-10-29 | ソニー株式会社 | Information processing device, information processing method, and program |
JP7564699B2 (en) | 2020-04-01 | 2024-10-09 | 株式会社ワコム | Handwritten data generating device, handwritten data reproducing device, and digital ink data structure |
WO2023171070A1 (en) * | 2022-03-07 | 2023-09-14 | 株式会社ワコム | Electronic pen |
CN114840075A (en) * | 2022-04-06 | 2022-08-02 | 华为技术有限公司 | Stylus and terminal equipment |
KR20240138324A (en) * | 2023-03-10 | 2024-09-20 | 삼성전자주식회사 | Tactile display apparatus |
JP7434633B1 (en) | 2023-03-23 | 2024-02-20 | レノボ・シンガポール・プライベート・リミテッド | Information processing device, information processing system, input device and control method |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6159013A (en) * | 1996-01-19 | 2000-12-12 | Parienti; Raoul | Portable reading device for the blind |
US20010035854A1 (en) * | 1998-06-23 | 2001-11-01 | Rosenberg Louis B. | Haptic feedback for touchpads and other touch controls |
US20080226134A1 (en) * | 2007-03-12 | 2008-09-18 | Stetten George Dewitt | Fingertip visual haptic sensor controller |
US20090251336A1 (en) * | 2008-04-03 | 2009-10-08 | Livescribe, Inc. | Quick Record Function In A Smart Pen Computing System |
US20100160041A1 (en) * | 2008-12-19 | 2010-06-24 | Immersion Corporation | Interactive painting game and associated controller |
US20100231541A1 (en) * | 2009-03-12 | 2010-09-16 | Immersion Corporation | Systems and Methods for Using Textures in Graphical User Interface Widgets |
US20110032088A1 (en) * | 2009-08-10 | 2011-02-10 | Electronics And Telecommunications Research Institute | Method of encoding haptic information on image, method of decoding haptic information from image and apparatus of processing haptic information for the same |
US20110155044A1 (en) * | 2007-12-21 | 2011-06-30 | David Burch | Kinesthetically concordant optical, haptic image sensing device |
US20110267182A1 (en) * | 2010-04-29 | 2011-11-03 | Microsoft Corporation | Active vibrations |
US20120127088A1 (en) * | 2010-11-19 | 2012-05-24 | Apple Inc. | Haptic input device |
US20120133616A1 (en) * | 2010-11-29 | 2012-05-31 | Nishihara H Keith | Creative design systems and methods |
US20130113763A1 (en) * | 2011-11-09 | 2013-05-09 | Crayola Llc | Stylus |
US20130225261A1 (en) * | 2008-11-19 | 2013-08-29 | Immersion Corporation | Method and apparatus for generating mood-based haptic feedback |
US20130222280A1 (en) * | 2011-12-19 | 2013-08-29 | Qualcomm Incorporated | Integrating sensation functionalities into a mobile device using a haptic sleeve |
US20130234934A1 (en) * | 2010-12-22 | 2013-09-12 | Zspace, Inc. | Three-Dimensional Collaboration |
US20140015750A1 (en) * | 2012-07-11 | 2014-01-16 | Po Hsin Chen | Multimode pointing device |
US20140199673A1 (en) * | 2013-01-11 | 2014-07-17 | Superd Co. Ltd. | 3d virtual training system and method |
US9489048B2 (en) * | 2013-12-13 | 2016-11-08 | Immersion Corporation | Systems and methods for optical transmission of haptic display parameters |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3124886B2 (en) * | 1994-03-31 | 2001-01-15 | シャープ株式会社 | Light pen |
JP4567817B2 (en) * | 1997-09-11 | 2010-10-20 | ソニー株式会社 | Information processing apparatus and control method thereof |
WO2001091100A1 (en) | 2000-05-24 | 2001-11-29 | Immersion Corporation | Haptic devices using electroactive polymers |
US20080192234A1 (en) | 2007-02-08 | 2008-08-14 | Silverbrook Research Pty Ltd | Method of sensing motion of a sensing device relative to a surface |
JP2010224665A (en) * | 2009-03-19 | 2010-10-07 | Sony Corp | Light-tactility conversion system, and method for providing tactile feedback |
JP5254117B2 (en) * | 2009-04-20 | 2013-08-07 | シャープ株式会社 | INPUT DEVICE, ITS CONTROL METHOD, CONTROL PROGRAM, COMPUTER-READABLE RECORDING MEDIUM, AND TOUCH PANEL INPUT SYSTEM |
WO2011043415A1 (en) * | 2009-10-07 | 2011-04-14 | 日本電気株式会社 | Digital pen system and pen-based input method |
US9678569B2 (en) | 2010-04-23 | 2017-06-13 | Immersion Corporation | Systems and methods for providing haptic effects |
US8401224B2 (en) | 2010-05-05 | 2013-03-19 | Digimarc Corporation | Hidden image signalling |
US8798534B2 (en) | 2010-07-09 | 2014-08-05 | Digimarc Corporation | Mobile devices and methods employing haptics |
2013
- 2013-12-13 US US14/105,266 patent/US9489048B2/en active Active

2014
- 2014-12-03 EP EP14196166.4A patent/EP2884370B1/en not_active Not-in-force
- 2014-12-12 JP JP2014251389A patent/JP6449639B2/en not_active Expired - Fee Related
- 2014-12-12 KR KR1020140179151A patent/KR20150069545A/en not_active Application Discontinuation
- 2014-12-12 CN CN201410769370.1A patent/CN104714687B/en not_active Expired - Fee Related

2016
- 2016-09-28 US US15/278,567 patent/US20170017310A1/en not_active Abandoned

2018
- 2018-12-06 JP JP2018229163A patent/JP2019061711A/en active Pending
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018152383A1 (en) * | 2017-02-20 | 2018-08-23 | Microsoft Technology Licensing, Llc | Device and method for communicating with a stylus |
EP3796136A4 (en) * | 2018-05-18 | 2021-07-14 | Wacom Co., Ltd. | Position indication device and information processing device |
EP3792735B1 (en) * | 2019-09-16 | 2022-02-09 | Microsoft Technology Licensing, LLC | Stylus speed |
US11966533B2 (en) | 2019-09-16 | 2024-04-23 | Microsoft Technology Licensing, Llc. | Stylus speed |
US11507189B1 (en) | 2022-01-21 | 2022-11-22 | Dell Products, Lp | System and method for a haptic thin-film actuator on active pen to provide variable writing pressure feedback |
US11914800B1 (en) * | 2022-10-28 | 2024-02-27 | Dell Products L.P. | Information handling system stylus with expansion bay and replaceable module |
US11983061B1 (en) | 2022-10-28 | 2024-05-14 | Dell Products L.P. | Information handling system peripheral device sleep power management |
US11983337B1 (en) | 2022-10-28 | 2024-05-14 | Dell Products L.P. | Information handling system mouse with strain sensor for click and continuous analog input |
Also Published As
Publication number | Publication date |
---|---|
JP2019061711A (en) | 2019-04-18 |
JP6449639B2 (en) | 2019-01-09 |
CN104714687A (en) | 2015-06-17 |
CN104714687B (en) | 2020-02-11 |
EP2884370A1 (en) | 2015-06-17 |
US20150169056A1 (en) | 2015-06-18 |
EP2884370B1 (en) | 2018-10-10 |
KR20150069545A (en) | 2015-06-23 |
US9489048B2 (en) | 2016-11-08 |
JP2015115076A (en) | 2015-06-22 |
Similar Documents
Publication | Title
---|---
US9489048B2 (en) | Systems and methods for optical transmission of haptic display parameters
US20220083149A1 (en) | Computing interface system
US10466791B2 (en) | Interactivity model for shared feedback on mobile devices
CN104914987B (en) | Systems and methods for a haptically-enabled projected user interface
US8711118B2 (en) | Interactivity model for shared feedback on mobile devices
JP7125920B2 (en) | Information processing program
US10037081B2 (en) | Systems and methods for haptic fiddling
CN107430450B (en) | Apparatus and method for generating input
KR102087392B1 (en) | Method of operating and electronic device thereof
US20140340326A1 (en) | Drawing apparatus and drawing system
US20140340328A1 (en) | Drawing apparatus and drawing system
US20180011538A1 (en) | Multimodal haptic effects
EP3113014B1 (en) | Mobile terminal and method for controlling the same
WO2013163233A1 (en) | Detachable sensory-interface device for a wireless personal communication device and method
WO2017058637A1 (en) | Filtering controller input mode
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner name: IMMERSION CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: WEDDLE, AMAYA; CRUZ-HERNANDEZ, JUAN MANUEL; REEL/FRAME: 048291/0584. Effective date: 20131212
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION