US20190041987A1 - Haptic effect encoding and rendering system - Google Patents

Haptic effect encoding and rendering system

Info

Publication number
US20190041987A1
US20190041987A1 (application US15/668,125)
Authority
US
United States
Prior art keywords
haptic
effect pattern
haptic effect
effects
media object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/668,125
Inventor
Shadi ASFOUR
Eric Gervais
Hugues-Antoine Oliver
Eric Lajeunesse
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immersion Corp
Priority to US15/668,125
Assigned to IMMERSION CORPORATION. Assignors: Asfour, Shadi; Gervais, Eric; Lajeunesse, Eric; Oliver, Hugues-Antoine
Priority to EP18180409.7A (published as EP3438792A1)
Priority to KR1020180081549A (published as KR102622570B1)
Priority to JP2018136552A (published as JP7278037B2)
Priority to CN201810814925.8A (published as CN109388234B)
Publication of US20190041987A1
Priority to US17/329,222 (published as US11579697B2)
Legal status: Abandoned (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/432Content retrieval operation from a local storage medium, e.g. hard-disk
    • H04N21/4325Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00Tactile signalling systems, e.g. personal calling systems

Definitions

  • the embodiments of the present invention are generally directed to electronic devices, and more particularly, to electronic devices that encode and render haptic effects.
  • kinesthetic feedback e.g., active and resistive force feedback
  • tactile feedback e.g., vibration, texture, temperature variation, and the like
  • Haptic feedback provides cues that intuitively enhance and simplify a user's interaction with an electronic device.
  • the haptic effects may provide cues to the user of the electronic device to alert the user to specific events, or provide realistic feedback to generate greater sensory immersion within a simulated or virtual environment.
  • Haptic feedback has also been increasingly incorporated in a variety of portable electronic devices, such as cellular telephones, smart phones, tablets, portable gaming devices, and a variety of other portable electronic devices.
  • some known devices modify or generate haptic effects in real-time or based on an audio file.
  • Embodiments of the present invention are directed toward electronic devices configured to produce haptic effects that substantially improve upon the related art.
  • the methods, non-transitory mediums, and systems for encoding and generating haptic effects include retrieving a media object, analyzing the media object to determine one or more time periods for rendering haptic effects, determining the haptic effects for rendering during the time periods, encoding the haptic effects as a haptic effect pattern that identifies the start time and duration of each of the haptic effects, and rendering the haptic effects according to the haptic pattern.
  • FIG. 1 is a simplified block diagram of a haptically-enabled system/device according to an example embodiment of the present invention.
  • FIG. 2 is a simplified block diagram illustrating a system for generating a haptic pattern according to an example embodiment of the present invention.
  • FIG. 3 illustrates a flow diagram of a functionality for encoding and rendering haptic effects based on a haptic effect pattern according to an example embodiment of the present invention.
  • FIGS. 4A-4C illustrate a haptic effect pattern and haptic effect timeline according to example embodiments of the present invention.
  • FIG. 5 illustrates a system environment for generating and rendering haptic feedback according to an example embodiment of the present invention.
  • FIG. 6 illustrates haptic effect timelines of haptic effects rendered according to a haptic effect pattern according to an example embodiment of the present invention.
  • FIG. 7 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to another example embodiment of the present invention.
  • FIG. 8 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to yet another example embodiment of the present invention.
  • a haptic effect pattern is used to identify one or more haptic effects according to a variety of haptic parameters.
  • the haptic effect pattern ignores haptic-free periods.
  • the embodiments reduce processor computations and power consumption.
  • FIG. 1 is a simplified block diagram of a haptically-enabled system/device 10 according to an example embodiment of the present invention.
  • System 10 includes a touch sensitive surface 11 , such as a touchscreen, or other type of user interface mounted within a housing 15 , and may include mechanical keys/buttons 13 and a speaker 28 .
  • a haptic feedback system that generates haptic effects on system 10 and includes a processor 12 .
  • Coupled to processor 12 is a memory 20, and a haptic drive circuit 16, which is coupled to an actuator 18 or other haptic output device.
  • Processor 12 can determine which haptic effects are rendered and the order in which the haptic effects are rendered based on an encoded haptic effect pattern.
  • the haptic effect pattern defines one or more haptic rendering time periods for each haptic effect.
  • the haptic effect pattern tracks and stores the start time and duration for each haptic effect. Additional high level parameters, such as the magnitude, frequency, and type of haptic effect may also be specified.
  • a haptic effect may be considered “dynamic” if it includes some variation of these parameters when the haptic effect is rendered or a variation of these parameters based on a user's interaction. Examples of such dynamic effects include ramp-up, ramp-down, spatial, and other haptic effects.
  • the haptic feedback system in one embodiment, generates vibrations 30 , 31 or other types of haptic effects on system 10 .
  • Processor 12 outputs the control signals to haptic drive circuit 16 , which includes electronic components and circuitry used to supply actuator 18 with the required electrical current and voltage (i.e., “motor signals”) to render the desired haptic effects.
  • System 10 may include more than one actuator 18 or other haptic output device, and each actuator may include a separate drive circuit 16 , all coupled to a common processor 12 .
  • Haptic drive circuit 16 is configured to drive actuator 18 .
  • haptic drive circuit 16 may attenuate the haptic drive signal at and around the resonance frequency (e.g., +/−20 Hz, 30 Hz, 40 Hz, etc.) of actuator 18.
  • haptic drive circuit 16 may comprise a variety of signal processing stages, each stage defining a subset of the signal processing stages applied to modify the haptic drive signal.
  • Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire system 10 , or may be a separate processor.
  • Memory 20 may include a variety of computer-readable media that may be accessed by processor 12 .
  • memory 20 and other memory devices described herein may include a volatile and nonvolatile medium, removable and non-removable medium.
  • memory 20 may include any combination of random access memory (“RAM”), dynamic RAM (“DRAM”), static RAM (“SRAM”), read only memory (“ROM”), flash memory, cache memory, and/or any other type of non-transitory computer-readable medium.
  • Memory 20 stores instructions executed by processor 12 .
  • memory 20 includes media haptic simulation module 22 , which are instructions that, when executed by processor 12 , generate the haptic effects using actuator 18 in combination with touch sensitive surface 11 and/or speaker 28 , and by encoding haptic effects as discussed below.
  • Memory 20 may also be located internal to processor 12 , or any combination of internal and external memory.
  • Actuator 18 may be any type of actuator or haptic output device that can generate a haptic effect.
  • an actuator is an example of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, electrostatic friction haptic effects, temperature variation, and/or deformation haptic effects, in response to a drive signal.
  • Actuator 18 may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a harmonic ERM motor (“HERM”), a linear resonance actuator (“LRA”) a solenoid resonance actuator (“SRA”), a piezoelectric actuator, a macro fiber composite (“MFC”) actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, an ultrasonic vibration generator, or the like.
  • the actuator itself may include a haptic drive circuit.
  • system 10 may include other types of haptic output devices (not shown) that may be non-mechanical or non-vibratory devices such as devices that use electrostatic friction (“ESF”), ultrasonic surface friction (“USF”), devices that induce acoustic radiation pressure with an ultrasonic haptic transducer, devices that use a haptic substrate and a flexible or deformable surface or shape changing devices and that may be attached to a user's body, devices that provide projected haptic output such as a puff of air using an air jet, etc.
  • an actuator may be characterized as a standard definition (“SD”) actuator that generates vibratory haptic effects at a single frequency.
  • SD actuator examples include ERM and LRA.
  • a high definition (“HD”) actuator or high fidelity actuator such as a piezoelectric actuator or an EAP actuator is capable of generating high bandwidth/definition haptic effects at multiple frequencies.
  • HD actuators are characterized by their ability to produce wide bandwidth tactile effects with variable amplitude and with a fast response to transient drive signals.
  • System 10 may be any type of portable electronic device, such as a cellular telephone, personal digital assistant (“PDA”), smartphone, computer tablet, gaming console, remote control, or any other type of device that includes a haptic effect system that includes one or more actuators. In multiple actuator configurations, respective haptic effect patterns may be linked with each of the actuators.
  • System 10 may be a wearable device such as wrist bands, headbands, eyeglasses, rings, leg bands, arrays integrated into clothing, etc., or any other type of device that a user may wear on a body or can be held by a user and that is haptically enabled, including furniture or a vehicle steering wheel. Further, some of the elements or functionality of system 10 may be remotely located or may be implemented by another device that is in communication with the remaining elements of system 10 .
  • FIG. 2 is a simplified block diagram illustrating a system 200 for generating a haptic pattern according to an example embodiment of the present invention.
  • processor 212 may execute various programs, such as an application 210 .
  • application 210 generates media objects including audio and/or video streams, such as media stream 211 .
  • Media stream 211 may be sampled, by sampler 215 or alternatively by processor 212 , to generate haptic streams, such as haptic stream 218 .
  • a 40 second media stream 211 may be sampled at a predetermined rate, such as 200 samples per second.
  • the example 40 second media stream 211 is represented by 8000 sampled values.
  • a haptic value is assigned to each of the 8000 sampled values of media stream 211 .
  • each of the 8000 assigned haptic values is processed by such conventional systems.
  • Although a wide variety of haptic effects may be generated from audio, an excessive amount of processing power is required. Excessive use of the processor can significantly reduce the battery life span of portable electronic devices.
  • processor 212 converts or encodes haptic stream 218 into a haptic effect pattern 219 by analyzing the sampled values or waveforms of media stream 211 .
  • haptic effect pattern 219 is used to identify the start time and duration of each haptic effect. By identifying the start time of each haptic effect, processor 212 needs only to process the haptic effects during haptically-active time periods. As a result, processor 212 may disengage from processing haptic effects when no haptic effects are scheduled for rendering.
  • For example, if haptic effect pattern 219 corresponds to a media stream 211 having a duration of 10 seconds, and includes a haptic effect that starts at a time of 9 seconds with a duration of 1 second, then processor 212 may begin to process and render the haptic effect at the start time of the haptic effect, that is 9 seconds, and during the haptic effect duration, which is 1 second.
  • haptic effect pattern 219 may be stored in memory 220 instead of haptic stream 218 . As a result, memory usage within memory 220 is reduced. Haptic effect pattern 219 also may specify other haptic parameters, such as haptic effect type, magnitude, frequency, etc. In some instances, processor 212 also may adjust the start times and durations of the haptic effects to provide either synchronous or asynchronous haptic effects.
  • In an alternative embodiment, haptic data, such as haptic stream 218, may be incorporated directly within media stream 211.
  • haptic intensity or magnitude may be varied depending on the user's distance from a virtual reality object.
  • a 360 degree view may be split into a plurality of haptic tracks that are rendered simultaneously while muting the ones not in the view of the user.
  • haptic tracks may be mixed together to give a more accurate haptic representation.
  • the haptic effect pattern may be predetermined and transmitted to the electronic device.
  • FIG. 3 illustrates a flow diagram of a functionality 300 for encoding and rendering haptic effects based on a haptic effect pattern according to an example embodiment of the present invention.
  • the functionality of the flow diagram of FIG. 3 is implemented by software stored in memory or other computer readable or tangible media, and executed by a processor.
  • the functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software.
  • functionality 300 receives one or more media objects as input.
  • the media objects may include one or more audio, video, other media files (e.g., animation objects), or any combination thereof.
  • the media object may include predetermined media objects or media objects rendered “on the fly” based on actions of a user (e.g., within a gaming application).
  • functionality 300 samples the media object to generate a haptic object, such as a haptic stream.
  • the media object is sampled at a predetermined rate, such as 200 samples per second.
  • a 40 second media object may be sampled at 200 samples per second.
  • the 40 second media object may be represented by 8000 sample values.
  • a haptic value is assigned to each of the 8000 sample values of the media object.
  • functionality 300 iterates through the media object to identify time periods suitable for rendering haptic effects.
  • the sampled values or the wave form of the media object is analyzed to identify one or more haptically-relevant events.
  • the tempo of the media object may indicate that weak or strong haptic effects should be rendered (e.g., weaker haptic effects when slow, stronger haptic effects when fast).
  • the haptic effects may be selected depending on other events (e.g., crash, explosion, etc.) detected within the media object.
  • one or more haptic instructions are generated for rendering haptic effects.
  • one or more haptic instructions are generated to cause haptic effects to be rendered by one or more haptic output devices (e.g., actuator 18 of FIG. 1 ).
  • the haptic instructions may cause haptic effects to be retrieved from a haptic library.
  • functionality 300 encodes the haptic object as a haptic pattern.
  • the embodiments of the present invention store, within the haptic pattern, the haptic effect start times, durations, and effect data. For example, if the haptic pattern corresponds to a media object having a duration of 10 seconds, and includes a haptic effect that starts at a time of 9 seconds with a duration of 1 second, then the data segment of the haptic pattern specifies the start time of the haptic effect, that is 9 seconds, the haptic effect duration, which is 1 second, and the effect data, which is 1 second of haptic pattern data.
  • Although .hapt files (a haptic file type from Immersion Corp.) may be utilized in conjunction with the embodiments of the present invention, the file size of the .hapt files is substantially reduced.
  • haptically-active time periods are stored and time periods without haptic effects are not stored or otherwise processed.
  • Other parameters such as frequency, magnitude, and haptic effect type also may be stored within the haptic effect pattern.
  • functionality 300 renders the haptic effects according to the haptic effect pattern.
  • the haptic effect pattern may be used to render haptic effects.
  • the haptic effects are rendered by scheduling events to occur at the assigned start times of the haptic effects.
  • the haptic effect pattern is used to render the haptic effects.
  • the haptic output is scheduled to occur at 9 seconds after rendering the haptic file. In other words, after 9 seconds, the processor fetches the haptic instruction and renders the corresponding haptic effects.
  • FIGS. 4A-4C illustrate a haptic effect pattern 410 , 430 and a haptic effect timeline 420 according to example embodiments of the present invention.
  • example haptic effect pattern 410 includes a list of haptic effects 411 A-D, and a plurality of corresponding fields for each haptic effect including time field 412 , duration field 413 , and pattern field 414 .
  • Haptic effects 411 A-D may be stored in a list that is sorted by start time (e.g., start times 0, 340, 610, and 9100), as indicated within time field 412 .
  • Optional duration field 413 indicates the total duration of the pattern stored within corresponding pattern field 414 .
  • Duration field 413 may be used to more readily provide more advanced or dynamic haptic effects, such as ramp-up, ramp down, and spatial haptic effects.
  • Pattern field 414 includes the duration times for alternating actuator OFF and actuator ON time periods.
  • When each haptic effect 411A-D is triggered (i.e., the start time of the respective haptic effect 411A-D is reached), the corresponding pattern field 414 is rendered at the target haptic output device (e.g., actuator 18 of FIG. 1).
  • the pattern for haptic effect 411A, starting at time 0 seconds, is OFF for 0 seconds, ON for 20 seconds, OFF for 20 seconds, ON for 40 seconds, and OFF for 30 seconds.
  • the pattern for haptic effect 411B, starting at time 340 seconds, is OFF for 0 seconds, ON for 20 seconds, OFF for 40 seconds, ON for 50 seconds, OFF for 50 seconds, ON for 30 seconds, and OFF for 30 seconds.
  • the pattern for haptic effect 411C, starting at time 610 seconds, is OFF for 0 seconds, ON for 30 seconds, OFF for 30 seconds, ON for 30 seconds, and OFF for 30 seconds.
  • the pattern for haptic effect 411D, starting at time 9100 seconds, is OFF for 0 seconds, ON for 20 seconds, OFF for 20 seconds, ON for 20 seconds, and OFF for 20 seconds.
  • FIG. 4B illustrates a haptic effect timeline 420 of the haptic effects rendered according to haptic effect pattern 410 .
  • the haptic effect timeline 420 visually represents haptic effect pattern 410 which is depicted in FIG. 4A .
  • example haptic effect pattern 430 includes haptic effect 431 , and a plurality of corresponding fields for each haptic effect including time field 432 , duration field 433 , and pattern field 434 .
  • Pattern field 434 includes the duration times and corresponding strengths for the actuator.
  • the pattern for haptic effect 431 is strength of 0.5 (e.g., half strength) for 20 seconds, strength of 0 (i.e., OFF) for 20 seconds, strength of 1.0 (e.g., full strength) for 20 seconds, strength of 0.5 for 40 seconds, and strength of 1.0 for 30 seconds.
  • a system 500 includes a media object 502 stored on one or more media server(s) 504 and haptic instructions 508 , such as a haptic library, stored on one or more haptic media server(s) 510 .
  • each of media server(s) 504 and haptic media server(s) 510 may include one or more server(s) with standard components known in the art, e.g., processor, memory, data storage, network connection(s), and software configured to store and access data stored on the server.
  • Both the media server(s) 504 and the haptic media server(s) 510 are coupled to one of cloud or Internet connections 506 ( a ) or 506 ( b ). Although shown as separate servers, media server(s) 504 and haptic media server(s) 510 may be configured as part of a single server. Connections 506 ( a ) or 506 ( b ) comprise wired and/or wireless internet connections as is known in the art.
  • media object 502 may be transmitted separately from haptic instructions 508 .
  • haptic instructions 508 may be retrieved from a haptic library after one or more haptic effects are identified and/or otherwise assigned to media object 502 .
  • An application such as a publisher application 512 (e.g., a haptic-enabled Android application or haptic media software development kit (“SDK”)), may be accessed to synchronize and/or otherwise render the media and haptic objects.
  • the embodiments may be configured to be used in conjunction with a variety of SDKs and other products for implementing haptic effects on electronic devices.
  • SDKs and other products include Immersion's TouchSense Force SDK, Immersion Video SDK (“IVSDK”), Asynchronous Haptic Media SDK (“AHMSDK”), Aurora SDK, Unified Haptic Library (“UHL”), and JavaScript Player.
  • the haptic effect pattern may include a list of haptic effects for rendering in conjunction with the rendering of corresponding media objects.
  • the haptic effect objects are generated for each haptic effect listed in the haptic effect pattern.
  • multiple haptic effect objects may be stored or otherwise referenced within a linked list.
  • the haptic effect pattern may reference a haptic effect object at 530 ms, and reference another haptic effect object at 800 ms.
  • The haptic effect objects are executed sequentially, iterating through the linked list: the first effect's vibration pattern is executed, the embodiments then move to the next effect in the linked list, the second effect's vibration pattern is executed, and so on.
  • a linked list is described as an example implementation, other configurations are readily feasible including software arrays, queues, double linked lists, and the like.
  • the embodiments of the present invention call one or more executable modules to generate and execute the haptic effect pattern.
  • the embodiments start tracking the reference time at which the rendering of the media object was requested, and the elapsed time of the media object.
  • the reference time is the current time (typically described in milliseconds), while the elapsed time is the time that has passed since the beginning of the haptic effect.
  • the elapsed time is also used in connection with update and seek functions.
  • the timing of the haptic effects may be more accurately rendered. This enables both synchronous and asynchronous haptic playback.
  • In this manner, asynchronous playback is achieved. For synchronous playback, the actual times at which playback is requested, resumed, or updated, as well as the elapsed time and the reference time, are used.
  • FIG. 6 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to an example embodiment of the present invention.
  • FIG. 6 depicts pause timeline 610 and resume timeline 620 which relate to the pausing and resuming of media objects and their corresponding haptic objects.
  • each executable module, media and haptic may be removed from the processing queue.
  • the media object and corresponding haptic objects are paused and further removed from the processing queue at time 500 ms.
  • the elapsed time variable is updated to the render time of the media object.
  • a running variable is set to FALSE when the media object is paused.
  • the execution times of the haptic objects are offset by the elapsed time.
  • the running variable is also set to TRUE. For example, if the pause function is selected at time 500 ms which is in advance of the haptic object scheduled to be executed at time 620 ms, then the execution time of the haptic object is offset by the elapsed time of 500 ms. Accordingly, once the resume function is selected, the haptic object executes at time 120 ms (620 ms-500 ms).
  • the pause and resume functionality may be implemented according to the following pseudocode:
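  • The pseudocode itself is not reproduced in this text. The Java sketch below reflects the pause/resume behavior described above (an elapsed-time counter, a running flag, removal of pending haptic objects from the processing queue on pause, and re-scheduling offset by the elapsed time on resume); all class, field, and method names are assumptions made for the sketch.

```java
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

final class HapticPlayback {
    record HapticObject(long startMs, Runnable render) {}

    private final List<HapticObject> effects; // sorted by start time
    private ScheduledExecutorService queue;   // pending haptic objects
    private long referenceTimeMs;             // wall-clock time when rendering was (re)started
    private long elapsedTimeMs;               // media time already rendered
    private boolean running;

    HapticPlayback(List<HapticObject> effects) { this.effects = effects; }

    void play() {
        referenceTimeMs = System.currentTimeMillis();
        running = true;
        scheduleFrom(elapsedTimeMs);
    }

    void pause() {
        if (!running) return;
        // Remember how far into the media object we were, mark playback as not
        // running, and drop any pending haptic objects from the processing queue.
        elapsedTimeMs += System.currentTimeMillis() - referenceTimeMs;
        running = false;
        if (queue != null) queue.shutdownNow();
    }

    void resume() {
        if (running) return;
        // Re-schedule remaining effects offset by the elapsed time, e.g. an effect
        // at 620 ms fires 120 ms after resuming from a pause made at 500 ms.
        referenceTimeMs = System.currentTimeMillis();
        running = true;
        scheduleFrom(elapsedTimeMs);
    }

    private void scheduleFrom(long elapsedMs) {
        queue = Executors.newSingleThreadScheduledExecutor();
        for (HapticObject effect : effects) {
            if (effect.startMs() >= elapsedMs) {
                queue.schedule(effect.render(), effect.startMs() - elapsedMs, TimeUnit.MILLISECONDS);
            }
        }
    }
}
```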
  • FIG. 7 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to another example embodiment of the present invention.
  • FIG. 7 depicts seek origin timeline 710 and seek target timeline 720 which relate to the seeking function that enables a user to scroll through the timeline of media objects and their corresponding haptic objects.
  • the seeking functionality When the seeking functionality is executed, the user selects to seek from a first point of the media object, say 500 ms, to a second point in the media object, say 800 ms, as shown.
  • each executable module may be removed from the processing queue.
  • the seeking function is called at 500 ms, and any subsequent media object rendering and corresponding haptic objects are paused and further removed from the processing queue at time 500 ms.
  • the seeking functionality resets the elapsed time based on the selected target time and further determines which haptic objects remain to be executed.
  • the seeking functionality may be implemented according to the following pseudocode:
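  • The seek pseudocode is likewise not reproduced in this text. Extending the HapticPlayback sketch above, a seek drops pending haptic objects from the processing queue, resets the elapsed time to the selected target, and re-schedules only the haptic objects that remain; the names are again assumptions.

```java
// Additional method for the HapticPlayback sketch above.
void seek(long targetMs) {
    if (queue != null) queue.shutdownNow(); // drop anything still scheduled
    elapsedTimeMs = targetMs;               // e.g. jump from 500 ms to 800 ms
    referenceTimeMs = System.currentTimeMillis();
    scheduleFrom(elapsedTimeMs);            // only effects at or after the target remain
}
```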
  • FIG. 8 illustrates a timeline of haptic effects rendered according to a haptic effect pattern according to yet another example embodiment of the present invention.
  • a misalignment between the media and haptic effects may occur. For example, slow network connections, lost media frames, and other events may cause such misalignments.
  • the haptic objects may be realigned with the corresponding media object.
  • the embodiments may periodically call an application programming interface (“API”) to update the current time. Such updates may occur every 1.5 seconds, for example.
  • the embodiments may calculate a time delta between the media object and the reference time, and compare the time delta to a predetermined threshold. In the event that the time delta exceeds the predetermined threshold, then the user may sense that the haptic effects are out of sync with the media object.
  • the synchronization functionality is executed.
  • the API is called and it is determined that the reference time is 1 second, but that the media object is still at 850 ms.
  • the time delta of 150 ms (1 s-850 ms) is greater than the predetermined threshold which may be 100 ms.
  • the elapsed time of the haptic object is offset by the time delta.
  • the synchronization functionality may be implemented according to the following pseudocode:
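  • The synchronization pseudocode is also not reproduced in this text. Extending the same HapticPlayback sketch, a periodic check (every 1.5 seconds, for example) compares the haptic timeline against the media object's reported position and, when the delta exceeds a threshold such as 100 ms, offsets the haptic elapsed time by the delta and re-schedules the remaining effects; the names and the exact bookkeeping are assumptions.

```java
// Additional method for the HapticPlayback sketch above; mediaPositionMs is the
// playback position currently reported by the media object (e.g. 850 ms).
private static final long SYNC_THRESHOLD_MS = 100;

void resynchronize(long mediaPositionMs) {
    long hapticPositionMs = elapsedTimeMs + (System.currentTimeMillis() - referenceTimeMs);
    long deltaMs = hapticPositionMs - mediaPositionMs; // e.g. 1000 ms - 850 ms = 150 ms
    if (Math.abs(deltaMs) > SYNC_THRESHOLD_MS) {
        // The haptics have drifted noticeably from the media object: shift the
        // haptic timeline onto the media position and re-schedule remaining effects.
        elapsedTimeMs = mediaPositionMs;
        referenceTimeMs = System.currentTimeMillis();
        if (queue != null) queue.shutdownNow();
        scheduleFrom(elapsedTimeMs);
    }
}
```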
  • a haptic effect pattern is used to identify one or more haptic effects according to a variety of haptic parameters. Additionally, the haptic effect pattern ignores haptic-free periods (i.e., silent periods). As a result, the embodiments reduce processor computations and power consumption.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present invention enable novel methods, non-transitory mediums, and systems for encoding and generating haptic effects. According to the various embodiments, a media object is retrieved. The media object is analyzed to determine one or more time periods for rendering haptic effects. The haptic effects for rendering during the time periods are determined. The haptic effects are encoded as a haptic effect pattern that identifies a start time and duration for each of the haptic effects.

Description

    FIELD OF INVENTION
  • The embodiments of the present invention are generally directed to electronic devices, and more particularly, to electronic devices that encode and render haptic effects.
  • BACKGROUND
  • Electronic device manufacturers strive to produce a rich interface for users. Conventional devices use visual and auditory cues to provide feedback to a user. In some interface devices, kinesthetic feedback (e.g., active and resistive force feedback) and/or tactile feedback (e.g., vibration, texture, temperature variation, and the like) are provided to the user. In general, such feedback is collectively known as “haptic feedback” or “haptic effects.” Haptic feedback provides cues that intuitively enhance and simplify a user's interaction with an electronic device. For example, the haptic effects may provide cues to the user of the electronic device to alert the user to specific events, or provide realistic feedback to generate greater sensory immersion within a simulated or virtual environment.
  • Haptic feedback has also been increasingly incorporated in a variety of portable electronic devices, such as cellular telephones, smart phones, tablets, portable gaming devices, and a variety of other portable electronic devices. In addition, some known devices modify or generate haptic effects in real-time or based on an audio file.
  • SUMMARY OF THE INVENTION
  • Embodiments of the present invention are directed toward electronic devices configured to produce haptic effects that substantially improve upon the related art.
  • Features and advantages of the embodiments are set forth in the description which follows, or will be apparent from the description, or may be learned by practice of the invention.
  • In one example, the methods, non-transitory mediums, and systems for encoding and generating haptic effects include retrieving a media object, analyzing the media object to determine one or more time periods for rendering haptic effects, determining the haptic effects for rendering during the time periods, encoding the haptic effects as a haptic effect pattern that identifies the start time and duration of each of the haptic effects, and rendering the haptic effects according to the haptic pattern.
  • It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not intended to limit the invention to the described examples.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Further embodiments, details, advantages, and modifications will become apparent from the following detailed description of the preferred embodiments, which is to be taken in conjunction with the accompanying drawings.
  • FIG. 1 is a simplified block diagram of a haptically-enabled system/device according to an example embodiment of the present invention.
  • FIG. 2 is a simplified block diagram illustrating a system for generating a haptic pattern according to an example embodiment of the present invention.
  • FIG. 3 illustrates a flow diagram of a functionality for encoding and rendering haptic effects based on a haptic effect pattern according to an example embodiment of the present invention.
  • FIGS. 4A-4C illustrate a haptic effect pattern and haptic effect timeline according to example embodiments of the present invention.
  • FIG. 5 illustrates a system environment for generating and rendering haptic feedback according to an example embodiment of the present invention.
  • FIG. 6 illustrates haptic effect timelines of haptic effects rendered according to a haptic effect pattern according to an example embodiment of the present invention.
  • FIG. 7 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to another example embodiment of the present invention.
  • FIG. 8 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to yet another example embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments. Wherever possible, like reference numbers will be used for like elements.
  • The embodiments of the present invention enable novel methods, non-transitory mediums, and systems for rendering haptic effects. According to the various embodiments, a haptic effect pattern is used to identify one or more haptic effects according to a variety of haptic parameters. In particular, the haptic effect pattern ignores haptic-free periods. As a result, more efficient use of device resources is provided. For example, the embodiments reduce processor computations and power consumption.
  • FIG. 1 is a simplified block diagram of a haptically-enabled system/device 10 according to an example embodiment of the present invention.
  • System 10 includes a touch sensitive surface 11, such as a touchscreen, or other type of user interface mounted within a housing 15, and may include mechanical keys/buttons 13 and a speaker 28. Internal to system 10 is a haptic feedback system that generates haptic effects on system 10 and includes a processor 12. Coupled to processor 12 is a memory 20, and a haptic drive circuit 16 which is coupled to an actuator 18 or other haptic output device.
  • Processor 12 can determine which haptic effects are rendered and the order in which the haptic effects are rendered based on an encoded haptic effect pattern. In general, the haptic effect pattern defines one or more haptic rendering time periods for each haptic effect. In particular, the haptic effect pattern tracks and stores the start time and duration for each haptic effect. Additional high level parameters, such as the magnitude, frequency, and type of haptic effect may also be specified. A haptic effect may be considered “dynamic” if it includes some variation of these parameters when the haptic effect is rendered or a variation of these parameters based on a user's interaction. Examples of such dynamic effects include ramp-up, ramp-down, spatial, and other haptic effects. The haptic feedback system, in one embodiment, generates vibrations 30, 31 or other types of haptic effects on system 10.
  • Processor 12 outputs the control signals to haptic drive circuit 16, which includes electronic components and circuitry used to supply actuator 18 with the required electrical current and voltage (i.e., “motor signals”) to render the desired haptic effects. System 10 may include more than one actuator 18 or other haptic output device, and each actuator may include a separate drive circuit 16, all coupled to a common processor 12.
  • Haptic drive circuit 16 is configured to drive actuator 18. For example, haptic drive circuit 16 may attenuate the haptic drive signal at and around the resonance frequency (e.g., +/−20 Hz, 30 Hz, 40 Hz, etc.) of actuator 18. In certain embodiments, haptic drive circuit 16 may comprise a variety of signal processing stages, each stage defining a subset of the signal processing stages applied to modify the haptic drive signal.
  • Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire system 10, or may be a separate processor.
  • Memory 20 may include a variety of computer-readable media that may be accessed by processor 12. In the various embodiments, memory 20 and other memory devices described herein may include volatile and nonvolatile media, and removable and non-removable media. For example, memory 20 may include any combination of random access memory (“RAM”), dynamic RAM (“DRAM”), static RAM (“SRAM”), read only memory (“ROM”), flash memory, cache memory, and/or any other type of non-transitory computer-readable medium. Memory 20 stores instructions executed by processor 12. Among the instructions, memory 20 includes media haptic simulation module 22, which comprises instructions that, when executed by processor 12, generate the haptic effects using actuator 18 in combination with touch sensitive surface 11 and/or speaker 28, and by encoding haptic effects as discussed below. Memory 20 may also be located internal to processor 12, or any combination of internal and external memory.
  • Actuator 18 may be any type of actuator or haptic output device that can generate a haptic effect. In general, an actuator is an example of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, electrostatic friction haptic effects, temperature variation, and/or deformation haptic effects, in response to a drive signal. Although the term actuator may be used throughout the detailed description, the embodiments of the invention may be readily applied to a variety of haptic output devices. Actuator 18 may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a harmonic ERM motor (“HERM”), a linear resonance actuator (“LRA”) a solenoid resonance actuator (“SRA”), a piezoelectric actuator, a macro fiber composite (“MFC”) actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, an ultrasonic vibration generator, or the like. In some instances, the actuator itself may include a haptic drive circuit.
  • In addition to, or in place of, actuator 18, system 10 may include other types of haptic output devices (not shown) that may be non-mechanical or non-vibratory devices such as devices that use electrostatic friction (“ESF”), ultrasonic surface friction (“USF”), devices that induce acoustic radiation pressure with an ultrasonic haptic transducer, devices that use a haptic substrate and a flexible or deformable surface or shape changing devices and that may be attached to a user's body, devices that provide projected haptic output such as a puff of air using an air jet, etc.
  • In general, an actuator may be characterized as a standard definition (“SD”) actuator that generates vibratory haptic effects at a single frequency. Examples of an SD actuator include ERM and LRA. By contrast to an SD actuator, a high definition (“HD”) actuator or high fidelity actuator such as a piezoelectric actuator or an EAP actuator is capable of generating high bandwidth/definition haptic effects at multiple frequencies. HD actuators are characterized by their ability to produce wide bandwidth tactile effects with variable amplitude and with a fast response to transient drive signals.
  • System 10 may be any type of portable electronic device, such as a cellular telephone, personal digital assistant (“PDA”), smartphone, computer tablet, gaming console, remote control, or any other type of device that includes a haptic effect system that includes one or more actuators. In multiple actuator configurations, respective haptic effect patterns may be linked with each of the actuators. System 10 may be a wearable device such as wrist bands, headbands, eyeglasses, rings, leg bands, arrays integrated into clothing, etc., or any other type of device that a user may wear on a body or can be held by a user and that is haptically enabled, including furniture or a vehicle steering wheel. Further, some of the elements or functionality of system 10 may be remotely located or may be implemented by another device that is in communication with the remaining elements of system 10.
  • FIG. 2 is a simplified block diagram illustrating a system 200 for generating a haptic pattern according to an example embodiment of the present invention.
  • As shown in FIG. 2, processor 212 may execute various programs, such as an application 210. As part of its functionality, application 210 generates media objects including audio and/or video streams, such as media stream 211. Media stream 211 may be sampled, by sampler 215 or alternatively by processor 212, to generate haptic streams, such as haptic stream 218. For example, a 40 second media stream 211 may be sampled at a predetermined rate, such as 200 samples per second. As a result, the example 40 second media stream 211 is represented by 8000 sampled values. In conventional systems, a haptic value is assigned to each of the 8000 sampled values of media stream 211. In turn, each of the 8000 assigned haptic values, including null values, is processed by such conventional systems. Although a wide variety of haptic effects may be generated from audio, an excessive amount of processing power is required. Excessive use of the processor can significantly reduce the battery life span of portable electronic devices.
  • By contrast, in embodiments of the present invention, processor 212 converts or encodes haptic stream 218 into a haptic effect pattern 219 by analyzing the sampled values or waveforms of media stream 211. In particular, haptic effect pattern 219 is used to identify the start time and duration of each haptic effect. By identifying the start time of each haptic effect, processor 212 needs only to process the haptic effects during haptically-active time periods. As a result, processor 212 may disengage from processing haptic effects when no haptic effects are scheduled for rendering. For example, if haptic effect pattern 219 corresponds to a media stream 211 having a duration of 10 seconds, and includes a haptic effect that starts at a time of 9 seconds with a duration of 1 second, then processor 212 may begin to process and render the haptic effect at the start time of the haptic effect, that is 9 seconds, and during the haptic effect duration, which is 1 second.
  • In addition to reducing the load of processor 212, haptic effect pattern 219 may be stored in memory 220 instead of haptic stream 218. As a result, memory usage within memory 220 is reduced. Haptic effect pattern 219 also may specify other haptic parameters, such as haptic effect type, magnitude, frequency, etc. In some instances, processor 212 also may adjust the start times and durations of the haptic effects to provide either synchronous or asynchronous haptic effects.
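  • For illustration only, the following is a minimal sketch of how such a haptic effect pattern might be represented in code. The field names and types are assumptions made for this sketch; the embodiments only require that each entry record a start time and duration (and, optionally, high-level parameters such as magnitude and frequency), with haptic-free periods simply absent.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical representation of one entry in a haptic effect pattern: only the
// start time, duration, and optional high-level parameters of a haptically-active
// period are stored; haptic-free periods have no entry at all.
final class HapticEffectEntry {
    final long startTimeMs;   // when the effect begins, relative to the media object
    final long durationMs;    // how long the effect lasts
    final double magnitude;   // optional high-level parameter (0.0 to 1.0)
    final double frequencyHz; // optional high-level parameter

    HapticEffectEntry(long startTimeMs, long durationMs, double magnitude, double frequencyHz) {
        this.startTimeMs = startTimeMs;
        this.durationMs = durationMs;
        this.magnitude = magnitude;
        this.frequencyHz = frequencyHz;
    }
}

// A pattern is just the list of active entries, kept sorted by start time so a
// renderer can walk them in order; a mostly silent media object yields a small list.
final class HapticEffectPattern {
    final List<HapticEffectEntry> entries = new ArrayList<>();
}
```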
  • In an alternative embodiment, haptic data, such as haptic stream 218, may be incorporated directly within media stream 211. Such a configuration enables the connection of haptic objects to virtual reality objects in the media object. For example, the haptic intensity or magnitude may be varied depending on the user's distance from a virtual reality object. In such virtual reality contexts, a 360 degree view may be split into a plurality of haptic tracks that are rendered simultaneously while muting the ones not in the view of the user. Depending on the user's location or view, haptic tracks may be mixed together to give a more accurate haptic representation. In another alternative embodiment, the haptic effect pattern may be predetermined and transmitted to the electronic device.
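  • As a rough sketch of the view-dependent behavior just described (not a prescribed implementation), the example below scales a haptic track's magnitude by the user's distance from its virtual object and mutes tracks whose objects fall outside the current view; the +/-60 degree view cone and the 1/(1+distance) attenuation curve are arbitrary assumptions for the sketch.

```java
// One haptic track tied to a virtual object in a 360 degree scene.
final class HapticTrack {
    final double objectAzimuthDeg; // direction of the associated virtual object
    final double objectDistance;   // distance from the user, arbitrary units
    double renderMagnitude;        // magnitude actually sent to the actuator

    HapticTrack(double objectAzimuthDeg, double objectDistance) {
        this.objectAzimuthDeg = objectAzimuthDeg;
        this.objectDistance = objectDistance;
    }
}

final class ViewBasedMixer {
    // Mute tracks outside an assumed +/-60 degree view cone and attenuate by distance.
    static void mix(Iterable<HapticTrack> tracks, double userAzimuthDeg) {
        for (HapticTrack t : tracks) {
            double diff = t.objectAzimuthDeg - userAzimuthDeg;
            double delta = Math.abs(((diff % 360.0) + 540.0) % 360.0 - 180.0); // angular distance
            boolean inView = delta <= 60.0;
            t.renderMagnitude = inView ? 1.0 / (1.0 + t.objectDistance) : 0.0;
        }
    }
}
```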
  • FIG. 3 illustrates a flow diagram of a functionality 300 for encoding and rendering haptic effects based on a haptic effect pattern according to an example embodiment of the present invention. In some instances, the functionality of the flow diagram of FIG. 3 is implemented by software stored in memory or other computer readable or tangible media, and executed by a processor. In other instances, the functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software.
  • At 310, functionality 300 receives one or more media objects as input. The media objects may include one or more audio, video, other media files (e.g., animation objects), or any combination thereof. The media object may include predetermined media objects or media objects rendered “on the fly” based on actions of a user (e.g., within a gaming application).
  • Next, at 320, functionality 300 samples the media object to generate a haptic object, such as a haptic stream. Typically, the media object is sampled at a predetermined rate, such as 200 samples per second. For example, a 40 second media object may be sampled at 200 samples per second. In this example, the 40 second media object may be represented by 8000 sample values. As in conventional systems, a haptic value is assigned to each of the 8000 sample values of the media object.
  • At 330, functionality 300 iterates through the media object to identify time periods suitable for rendering haptic effects. Here, the sampled values or the wave form of the media object is analyzed to identify one or more haptically-relevant events. For example, the tempo of the media object may indicate that weak or strong haptic effects should be rendered (e.g., weaker haptic effects when slow, stronger haptic effects when fast). Alternatively, or additionally, the haptic effects may be selected depending on other events (e.g., crash, explosion, etc.) detected within the media object.
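  • A toy illustration of the tempo heuristic mentioned above, assuming event onsets have already been detected in the media object: a shorter average gap between onsets (faster content) maps to a stronger haptic magnitude, and a longer gap to a weaker one. The interval bounds and magnitude range are arbitrary values chosen for the sketch.

```java
final class TempoToStrength {
    // Map the average inter-onset interval to a magnitude in [0.3, 1.0]:
    // fast content (short intervals) -> strong effects, slow content -> weak effects.
    static double magnitudeFor(long[] onsetTimesMs) {
        if (onsetTimesMs.length < 2) {
            return 0.3; // too little activity to judge tempo; default to a weak effect
        }
        long total = 0;
        for (int i = 1; i < onsetTimesMs.length; i++) {
            total += onsetTimesMs[i] - onsetTimesMs[i - 1];
        }
        double avgIntervalMs = (double) total / (onsetTimesMs.length - 1);
        // 200 ms or less between onsets -> full strength; 2000 ms or more -> weakest.
        double clamped = Math.max(200.0, Math.min(2000.0, avgIntervalMs));
        return 1.0 - 0.7 * (clamped - 200.0) / 1800.0;
    }
}
```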
  • Subsequently, at 340, one or more haptic instructions are generated for rendering haptic effects. Based on the analysis of the media object, at 330, one or more haptic instructions are generated to cause haptic effects to be rendered by one or more haptic output devices (e.g., actuator 18 of FIG. 1). For example, the haptic instructions may cause haptic effects to be retrieved from a haptic library.
  • At 350, functionality 300 encodes the haptic object as a haptic pattern. In particular, the embodiments of the present invention store, within the haptic pattern, the haptic effect start times, durations, and effect data. For example, if the haptic pattern corresponds to a media object having a duration of 10 seconds, and includes a haptic effect that starts at a time of 9 seconds with a duration of 1 second, then the data segment of the haptic pattern specifies the start time of the haptic effect, that is 9 seconds, the haptic effect duration, which is 1 second, and the effect data, which is 1 second of haptic pattern data.
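  • A minimal sketch of this encoding step, assuming the haptic object is simply an array of per-sample magnitudes at a known rate (200 samples per second in the running example): runs of zero-magnitude samples are skipped entirely, and each remaining run becomes one pattern entry holding its start time, duration, and effect data. For the 10-second example above, only a single entry (start 9000 ms, duration 1000 ms, 200 samples of effect data) would need to be stored.

```java
import java.util.ArrayList;
import java.util.List;

final class HapticPatternEncoder {
    record Entry(long startMs, long durationMs, double[] effectData) {}

    static List<Entry> encode(double[] sampledHapticValues, int samplesPerSecond) {
        final double msPerSample = 1000.0 / samplesPerSecond;
        List<Entry> pattern = new ArrayList<>();
        int i = 0;
        while (i < sampledHapticValues.length) {
            if (sampledHapticValues[i] == 0.0) { i++; continue; } // silent period: not stored
            int start = i;
            while (i < sampledHapticValues.length && sampledHapticValues[i] != 0.0) i++;
            double[] data = new double[i - start];
            System.arraycopy(sampledHapticValues, start, data, 0, data.length);
            pattern.add(new Entry(Math.round(start * msPerSample),
                                  Math.round((i - start) * msPerSample), data));
        }
        return pattern;
    }
}
```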
  • Although .hapt files (a haptic file type from Immersion Corp.) may be utilized in conjunction with the embodiments of the present invention, the file size of the .hapt files is substantially reduced. Here, only haptically-active time periods are stored and time periods without haptic effects are not stored or otherwise processed. Other parameters, such as frequency, magnitude, and haptic effect type, also may be stored within the haptic effect pattern.
  • Lastly, at 360, functionality 300 renders the haptic effects according to the haptic effect pattern. Once the haptic effect pattern is encoded in the .hapt file, it may be used to render haptic effects. Using the haptic pattern, the haptic effects are rendered by scheduling events to occur at the assigned start times of the haptic effects. Once the haptic effects are triggered, the haptic effect pattern is used to render the haptic effects. Returning to the previous example, the haptic output is scheduled to occur 9 seconds after rendering of the haptic file begins. In other words, after 9 seconds, the processor fetches the haptic instruction and renders the corresponding haptic effects.
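  • One possible realization of this scheduling, sketched with a standard ScheduledExecutorService; the pattern entry shape and the renderEffect callback are assumptions. In the running example a single task fires 9 seconds after playback begins, and no haptics-related work is performed in between.

```java
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

final class HapticScheduler {
    record Entry(long startMs, long durationMs) {}

    // Schedule each haptic effect to fire at its encoded start time.
    static ScheduledExecutorService play(List<Entry> pattern, Consumer<Entry> renderEffect) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        for (Entry e : pattern) {
            scheduler.schedule(() -> renderEffect.accept(e), e.startMs(), TimeUnit.MILLISECONDS);
        }
        return scheduler; // caller keeps this to cancel or shut down playback
    }
}
```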
  • FIGS. 4A-4C illustrate a haptic effect pattern 410, 430 and a haptic effect timeline 420 according to example embodiments of the present invention.
  • As shown in FIG. 4A, example haptic effect pattern 410 includes a list of haptic effects 411A-D, and a plurality of corresponding fields for each haptic effect including time field 412, duration field 413, and pattern field 414. Haptic effects 411A-D may be stored in a list that is sorted by start time (e.g., start times 0, 340, 610, and 9100), as indicated within time field 412. Optional duration field 413 indicates the total duration of the pattern stored within corresponding pattern field 414. Duration field 413 may be used to more readily provide more advanced or dynamic haptic effects, such as ramp-up, ramp down, and spatial haptic effects. Pattern field 414 includes the duration times for alternating actuator OFF and actuator ON time periods.
  • When each haptic effect 411A-D is triggered (i.e., the start time of the respective haptic effect 411A-D is reached), the corresponding pattern field 414 is rendered at the target haptic output device (e.g., actuator 18 of FIG. 1). In this example, starting at time 0 seconds, the pattern for haptic effect 411A is OFF for 0 seconds, ON for 20 seconds, OFF for 20 seconds, ON for 40 seconds, and OFF for 30 seconds. The pattern for haptic effect 411B, starting at time 340 seconds, is OFF for 0 seconds, ON for 20 seconds, OFF for 40 seconds, ON for 50 seconds, OFF for 50 seconds, ON for 30 seconds, and OFF for 30 seconds. The pattern for haptic effect 411C, starting at time 610 seconds, is OFF for 0 seconds, ON for 30 seconds, OFF for 30 seconds, ON for 30 seconds, and OFF for 30 seconds. The pattern for haptic effect 411D, starting at time 9100 seconds, is OFF for 0 seconds, ON for 20 seconds, OFF for 20 seconds, ON for 20 seconds, and OFF for 20 seconds.
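  • As a sketch of how such a pattern field might be walked once its haptic effect is triggered, the example below alternates the actuator between OFF and ON for the listed durations; setActuatorOn stands in for whatever call the target drive circuit actually exposes and is purely hypothetical.

```java
import java.util.function.Consumer;

final class PatternPlayer {
    // pattern holds alternating OFF/ON durations (expressed in milliseconds for this
    // sketch), always beginning with an OFF segment, e.g. the values of haptic
    // effect 411A: {0, 20, 20, 40, 30}.
    static void render(long[] pattern, Consumer<Boolean> setActuatorOn) throws InterruptedException {
        boolean on = false;
        for (long segmentMs : pattern) {
            setActuatorOn.accept(on);
            if (segmentMs > 0) Thread.sleep(segmentMs);
            on = !on;
        }
        setActuatorOn.accept(false); // leave the actuator off once the pattern ends
    }
}
```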
  • FIG. 4B illustrates a haptic effect timeline 420 of the haptic effects rendered according to haptic effect pattern 410. In other words, the haptic effect timeline 420 visually represents haptic effect pattern 410 which is depicted in FIG. 4A.
  • Although ON/OFF patterns are described above, alternative configurations are also feasible. For example, duration/strength patterns can also be used. As shown in FIG. 4C, example haptic effect pattern 430 includes haptic effect 431, and a plurality of corresponding fields for each haptic effect, including time field 432, duration field 433, and pattern field 434. Pattern field 434 includes the duration times and corresponding strengths for the actuator. In this example, starting at time 0 ms, the pattern for haptic effect 431 is a strength of 0.5 (e.g., half strength) for 20 ms, a strength of 0 (i.e., OFF) for 20 ms, a strength of 1.0 (e.g., full strength) for 20 ms, a strength of 0.5 for 40 ms, and a strength of 1.0 for 30 ms.
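  • A corresponding sketch for a duration/strength pattern, again for illustration only (the setStrength() call and sleepMS() helper are hypothetical), treats each entry as a {duration, strength} pair:
    // Illustrative only: render a pattern of {duration in ms, normalized strength} pairs
    void renderStrengthPattern(double[][] pattern, Actuator actuator) {
        for (double[] segment : pattern) {
            actuator.setStrength(segment[1]);   // hypothetical magnitude control: 0.0 = OFF, 1.0 = full strength
            sleepMS((long) segment[0]);         // hold that strength for the segment's duration
        }
        actuator.setStrength(0.0);              // leave the actuator off when the pattern ends
    }

    // Pattern for haptic effect 431 from FIG. 4C
    double[][] pattern431 = {{20, 0.5}, {20, 0.0}, {20, 1.0}, {40, 0.5}, {30, 1.0}};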
  • Turning now to FIG. 5, an embodiment of a system environment for generating and rendering haptic feedback is provided. As shown in FIG. 5, a system 500 includes a media object 502 stored on one or more media server(s) 504 and haptic instructions 508, such as a haptic library, stored on one or more haptic media server(s) 510. As shown in system 500, each of media server(s) 504 and haptic media server(s) 510 may include one or more server(s) with standard components known in the art, e.g., processor, memory, data storage, network connection(s), and software configured to store and access data stored on the server. Both media server(s) 504 and haptic media server(s) 510 are coupled to a cloud or Internet connection 506(a) or 506(b). Although shown as separate servers, media server(s) 504 and haptic media server(s) 510 may be configured as part of a single server. Connections 506(a) and 506(b) comprise wired and/or wireless Internet connections as is known in the art.
  • As shown in system 500, media object 502 may be transmitted separately from haptic instructions 508. As described above, haptic instructions 508 may be retrieved from a haptic library after one or more haptic effects are identified and/or otherwise assigned to media object 502. An application, such as a publisher application 512 (e.g., a haptic-enabled Android application or haptic media software development kit (“SDK”)), may be accessed to synchronize and/or otherwise render the media and haptic objects.
  • The embodiments may be configured to be used in conjunction with a variety of SDKs and other products for implementing haptic effects on electronic devices. Examples of such SDKs and other products include Immersion's TouchSense Force SDK, Immersion Video SDK (“IVSDK”), Asynchronous Haptic Media SDK (“AHMSDK”), Aurora SDK, Unified Haptic Library (“UHL”), and JavaScript Player.
  • As discussed above, the haptic effect pattern may include a list of haptic effects for rendering in conjunction with the rendering of corresponding media objects. Upon execution of the media object, a haptic effect object is generated for each haptic effect listed in the haptic effect pattern. In other words, multiple haptic effect objects may be stored or otherwise referenced within a linked list. For example, the haptic effect pattern may reference a haptic effect object at 530 ms, and reference another haptic effect object at 800 ms. The haptic effect objects are executed sequentially by iterating through the linked list: the first effect's vibration pattern is executed, the embodiments then move to the next effect in the linked list, the second effect's vibration pattern is executed, and so on. Although a linked list is described as an example implementation, other configurations are readily feasible, including software arrays, queues, doubly linked lists, and the like.
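  • A minimal sketch of such a linked list, using node accessors consistent with the pseudocode later in this description (the Node fields and the execute() call are illustrative only):
    // Illustrative only: each node holds one haptic effect's pattern and a link to the next node
    class Node {
        Pattern pattern;    // e.g., the effect referenced at 530 ms, then the one at 800 ms
        Node next;
        Pattern getPattern() { return pattern; }
        Node getNext() { return next; }
    }

    // Sequential execution: walk the list and execute each effect's vibration pattern in turn
    void executeAll(Node head) {
        Node current = head;
        while (current != null) {
            execute(current.getPattern());   // hypothetical call that renders this pattern
            current = current.getNext();
        }
    }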
  • When the rendering of the media object is requested, the embodiments of the present invention call one or more executable modules to generate and execute the haptic effect pattern. Here, the embodiments start tracking the reference time at which the rendering of the media object was requested, and the elapsed time of the media object. The reference time is the current time (typically described in milliseconds), while the elapsed time is the time that has passed since the beginning of the haptic effect. The reference time is updated whenever an action (e.g., play or pause) is applied to the haptic effect. For example, when a pause function is selected, the elapsed time is calculated as: elapsed time = current time − reference time. The elapsed time is also used in connection with update and seek functions.
  • By using the variables described herein, such as the elapsed time and the reference time, the timing of the haptic effects may be rendered more accurately. This enables both synchronous and asynchronous haptic playback. Asynchronous playback is achieved using the haptic effect pattern and the time variables alone. For synchronous playback, the actual times at which playback is requested, resumed, or updated are used in addition to the elapsed time and the reference time.
  • Haptic effects contained within the haptic effect pattern also may be scheduled and rendered according to the following pseudocode which shows the tracking of reference time:
  • // Post time is the time an effect starts relative to its beginning
    void startRenderer() {
        // Schedule a runnable for each haptic effect in the linked list,
        // delayed by that effect's post time
        Node current = hapticEffect.getHeadNode();
        while (current) {
            mHandler.postDelayed(mRunnable,
                current.getPattern().getPostTime());
            current = current.getNext();
        }
        // Record the time at which rendering was requested
        mEffect.setReferenceTime(getCurrentTimeInMS());
        mIsRunning = true;
    }
  • FIG. 6 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to an example embodiment of the present invention. In particular, FIG. 6 depicts pause timeline 610 and resume timeline 620 which relate to the pausing and resuming of media objects and their corresponding haptic objects. When the media object is paused, each executable module, media and haptic, may be removed from the processing queue. In the example depicted in FIG. 6, the media object and corresponding haptic objects are paused and further removed from the processing queue at time 500 ms.
  • Based on the execution of the pause function, the elapsed time variable is updated to the render time of the media object. In addition, a running variable is set to FALSE when the media object is paused. Subsequently, if the resume function is selected, the execution times of the haptic objects are offset by the elapsed time, and the running variable is set to TRUE. For example, if the pause function is selected at time 500 ms, which is in advance of a haptic object scheduled to be executed at time 620 ms, then the execution time of the haptic object is offset by the elapsed time of 500 ms. Accordingly, once the resume function is selected, the haptic object executes 120 ms later (620 ms − 500 ms).
  • The pause and resume functionality may be implemented according to the following pseudocode:
  • void pauseRenderer() {
        mHandler.remove(mRunnable); // remove all scheduled runnables
        // Accumulate the time elapsed since the last reference time
        mEffect.setElapsedTime(mEffect.getElapsedTime() +
            (getCurrentTimeInMS() - mEffect.getReferenceTime()));
        mIsRunning = false;
    }

    void resumeRenderer() {
        // Re-schedule the remaining effects, offsetting each post time by the elapsed time
        Node current = mNode;
        while (current) {
            mHandler.postDelayed(mRunnable,
                current.getPattern().getPostTime() -
                current.getPattern().getElapsedTime());
            current = current.getNext();
        }
        mEffect.setReferenceTime(getCurrentTimeInMS());
        mIsRunning = true;
    }
  • FIG. 7 illustrates timelines of haptic effects rendered according to a haptic effect pattern according to another example embodiment of the present invention. In particular, FIG. 7 depicts seek origin timeline 710 and seek target timeline 720 which relate to the seeking function that enables a user to scroll through the timeline of media objects and their corresponding haptic objects. When the seeking functionality is executed, the user selects to seek from a first point of the media object, say 500 ms, to a second point in the media object, say 800 ms, as shown.
  • To achieve the seeking functionality, each executable module may be removed from the processing queue. In the example depicted in FIG. 7, the seeking function is called at 500 ms, and any subsequent media object rendering and corresponding haptic objects are paused and further removed from the processing queue at time 500 ms. Subsequently, when the user selects a target time, 800 ms in the example shown in FIG. 7, then the seeking functionality resets the elapsed time based on the selected target time and further determines which haptic objects remain to be executed.
  • The seeking functionality may be implemented according to the following pseudocode:
  • void seekRenderer(long positionMS) {
        mEffect.setElapsedTime(positionMS); // update expected position
        if (mIsRunning) {
            mHandler.remove(mRunnable); // cancel any scheduled runnables
        }
        // Walk the list to the first effect at or after the seek target
        Node current = mHead;
        while (!correspondsToSeekedPosition(current)) {
            current = current.getNext();
        }
        mNode = current;
        mEffect.setElapsedTime(positionMS);
    }
  • FIG. 8 illustrates a timeline of haptic effects rendered according to a haptic effect pattern according to yet another example embodiment of the present invention. During the rendering of media and haptic objects, a misalignment between the media and haptic effects may occur. For example, slow network connections, lost media frames, and other events may cause such misalignments.
  • As shown in FIG. 8, the haptic objects may be realigned with the corresponding media object. In an example implementation, the embodiments may periodically call an application programming interface (“API”) to update the current time. Such updates may occur every 1.5 seconds, for example. In turn, the embodiments may calculate a time delta between the media object and the reference time, and compare the time delta to a predetermined threshold. If the time delta exceeds the predetermined threshold, the user may sense that the haptic effects are out of sync with the media object, and the synchronization functionality is executed. In the example depicted in FIG. 8, the API is called and it is determined that the reference time is 1 second, but the media object is still at 850 ms. In this example, the time delta of 150 ms (1 s − 850 ms) is greater than the predetermined threshold, which may be 100 ms. In turn, the elapsed time of the haptic object is offset by the time delta.
  • The synchronization functionality may be implemented according to the following pseudocode:
  • void updateRenderer(long positionMS) {
        // Bring the elapsed time up to date relative to the last reference time
        mEffect.setElapsedTime(mEffect.getElapsedTime() +
            (getCurrentTimeInMS() - mEffect.getReferenceTime()));
        // Compare the haptic elapsed time against the media position
        long delta = mEffect.getElapsedTime() - positionMS;
        if (delta > THRESHOLD || delta < -THRESHOLD) {
            // Out of sync: re-schedule the remaining effects relative to the media position
            mHandler.remove(mRunnable);
            mEffect.setElapsedTime(positionMS);
            Node current = mNode;
            while (current) {
                mHandler.postDelayed(mRunnable,
                    current.getPattern().getPostTime() -
                    mEffect.getElapsedTime());
                current = current.getNext();
            }
        }
        mEffect.setReferenceTime(getCurrentTimeInMS());
    }
  • Thus, the example embodiments described herein provide more efficient techniques for encoding, rendering, and manipulating haptic effects. According to the various embodiments, a haptic effect pattern is used to identify one or more haptic effects according to a variety of haptic parameters. Additionally, the haptic effect pattern ignores haptic-free periods (i.e., silent periods). As a result, the embodiments reduce processor computations and power consumption.
  • Several embodiments have been specifically illustrated and/or described. However, it will be appreciated that modifications and variations of the disclosed embodiments are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention. The embodiments described herein are only some of the many possible implementations. Furthermore, the embodiments may be readily applied to various actuator types and other haptic output devices.

Claims (20)

1. A method for generating haptic effects, the method comprising:
retrieving a media object;
generating a haptic stream for storing the haptic effects, the haptic stream corresponding to the media object;
analyzing the media object to determine one or more time periods for rendering the haptic effects;
determining the haptic effects for rendering during the time periods;
encoding the haptic effects of the haptic stream as a haptic effect pattern that identifies a start time and a duration of each of the haptic effects;
engaging a processor to render the haptic effects of the haptic stream according to the haptic effect pattern; and
disengaging the processor from processing the haptic stream according to the haptic effect pattern.
2. The method according to claim 1, wherein the haptic effect pattern includes one or more parameters for each of the haptic effects, the parameters including magnitude, frequency, and/or type of haptic effect.
3. The method according to claim 1, wherein the media object includes an audio object or a video object.
4. The method according to claim 1, wherein the haptic effect pattern only includes data relating to haptically-active time periods.
5. The method according to claim 1, wherein the haptic effect pattern includes a plurality of duration times that indicate alternating actuator OFF and actuator ON time periods.
6. The method according to claim 1, wherein the media object and the haptic effect pattern are synchronized.
7. The method according to claim 1, further comprising:
adjusting an elapsed time variable of the haptic effect pattern in connection with execution of one of the following functions: pause, resume, and seek.
8. A device comprising:
a processor; and
a memory storing one or more programs for execution by the processor, the one or more programs including instructions for:
retrieving a media object;
generating a haptic stream for storing haptic effects, the haptic stream corresponding to the media object;
analyzing the media object to determine one or more time periods for rendering the haptic effects;
determining the haptic effects for rendering during the time periods;
encoding the haptic effects of the haptic stream as a haptic effect pattern that identifies a start time and a duration of each of the haptic effects;
engaging the processor to render the haptic effects of the haptic stream according to the haptic effect pattern; and
disengaging the processor from processing the haptic stream according to the haptic effect pattern.
9. The device according to claim 8, wherein the haptic effect pattern includes one or more parameters for each of the haptic effects, the parameters including magnitude, frequency, and/or type of haptic effect.
10. The device according to claim 8, wherein the media object includes an audio object or a video object.
11. The device according to claim 8, wherein the haptic effect pattern only includes data relating to haptically-active time periods.
12. The device according to claim 8, wherein the haptic effect pattern includes a plurality of duration times that indicate alternating actuator OFF and actuator ON time periods.
13. The device according to claim 8, wherein the media object and the haptic effect pattern are synchronized.
14. The device according to claim 8, further comprising instructions for:
adjusting an elapsed time variable of the haptic effect pattern in connection with execution of one of the following functions: pause, resume, and seek.
15. A non-transitory computer readable storage medium storing one or more programs configured to be executed by a processor, the one or more programs comprising instructions for:
retrieving a media object;
generating a haptic stream for storing haptic effects, the haptic stream corresponding to the media object;
analyzing the media object to determine one or more time periods for rendering the haptic effects;
determining the haptic effects for rendering during the time periods;
encoding the haptic effects of the haptic stream as a haptic effect pattern that identifies a start time and a duration of each of the haptic effects;
engaging a processor to render the haptic effects of the haptic stream according to the haptic effect pattern; and
disengaging the processor from processing the haptic stream according to the haptic effect pattern.
16. The non-transitory computer readable storage medium according to claim 15, wherein the haptic effect pattern includes one or more parameters for each of the haptic effects, the parameters including magnitude, frequency, and/or type of haptic effect.
17. The non-transitory computer readable storage medium according to claim 15, wherein the media object includes an audio object or a video object.
18. The non-transitory computer readable storage medium according to claim 15, wherein the haptic effect pattern only includes data relating to haptically-active time periods.
19. The non-transitory computer readable storage medium according to claim 15, wherein the haptic effect pattern includes a plurality of duration times that indicate alternating actuator OFF and actuator ON time periods.
20. The non-transitory computer readable storage medium according to claim 15, wherein the media object and the haptic effect pattern are synchronized.
US15/668,125 2017-08-03 2017-08-03 Haptic effect encoding and rendering system Abandoned US20190041987A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US15/668,125 US20190041987A1 (en) 2017-08-03 2017-08-03 Haptic effect encoding and rendering system
EP18180409.7A EP3438792A1 (en) 2017-08-03 2018-06-28 Haptic effect encoding and rendering system
KR1020180081549A KR102622570B1 (en) 2017-08-03 2018-07-13 Haptic effect encoding and rendering system
JP2018136552A JP7278037B2 (en) 2017-08-03 2018-07-20 Coding and rendering system for haptic effects
CN201810814925.8A CN109388234B (en) 2017-08-03 2018-07-24 Haptic effect encoding and rendering system
US17/329,222 US11579697B2 (en) 2017-08-03 2021-05-25 Haptic effect encoding and rendering system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/668,125 US20190041987A1 (en) 2017-08-03 2017-08-03 Haptic effect encoding and rendering system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/329,222 Continuation US11579697B2 (en) 2017-08-03 2021-05-25 Haptic effect encoding and rendering system

Publications (1)

Publication Number Publication Date
US20190041987A1 true US20190041987A1 (en) 2019-02-07

Family

ID=62816374

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/668,125 Abandoned US20190041987A1 (en) 2017-08-03 2017-08-03 Haptic effect encoding and rendering system
US17/329,222 Active 2037-11-16 US11579697B2 (en) 2017-08-03 2021-05-25 Haptic effect encoding and rendering system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/329,222 Active 2037-11-16 US11579697B2 (en) 2017-08-03 2021-05-25 Haptic effect encoding and rendering system

Country Status (5)

Country Link
US (2) US20190041987A1 (en)
EP (1) EP3438792A1 (en)
JP (1) JP7278037B2 (en)
KR (1) KR102622570B1 (en)
CN (1) CN109388234B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230205317A1 (en) * 2021-12-28 2023-06-29 Industrial Technology Research Institute Embedded system and vibration driving method

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190041987A1 (en) * 2017-08-03 2019-02-07 Immersion Corporation Haptic effect encoding and rendering system
US20220113801A1 (en) * 2019-04-26 2022-04-14 Hewlett-Packard Development Company, L.P. Spatial audio and haptics
WO2024034336A1 (en) * 2022-08-09 2024-02-15 ソニーグループ株式会社 Information processing device, information processing method, and program
US20240127680A1 (en) * 2022-10-18 2024-04-18 Tencent America LLC Method and apparatus for timed referenced access unit packetization of haptics elementary streams

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100066512A1 (en) * 2001-10-09 2010-03-18 Immersion Corporation Haptic Feedback Sensations Based on Audio Output From Computer Devices
US20140000234A1 (en) * 2012-06-29 2014-01-02 Tex-Ray Industrial. Co., Ltd. Ply yarn having twisted hollow fiber and heat retention fiber
US20140002346A1 (en) * 2012-06-27 2014-01-02 Immersion Corporation Haptic feedback control system
US20140030045A1 (en) * 2012-07-24 2014-01-30 Alan Beck Pipe pick-up and lay down apparatus
US20140300454A1 (en) * 2013-04-09 2014-10-09 Immersion Corporation Offline haptic conversion system

Family Cites Families (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FI113519B (en) 2001-03-02 2004-04-30 Nokia Corp Method and apparatus for combining the characteristics of a mobile station
US6963762B2 (en) 2001-05-23 2005-11-08 Nokia Corporation Mobile phone using tactile icons
JP2003244654A (en) * 2002-02-21 2003-08-29 Canon Inc Image processing apparatus, image processing method, and storage medium
US9948885B2 (en) 2003-12-12 2018-04-17 Kurzweil Technologies, Inc. Virtual encounters
US7765333B2 (en) 2004-07-15 2010-07-27 Immersion Corporation System and method for ordering haptic effects
US8700791B2 (en) * 2005-10-19 2014-04-15 Immersion Corporation Synchronization of haptic effect data in a media transport stream
US9370704B2 (en) 2006-08-21 2016-06-21 Pillar Vision, Inc. Trajectory detection and feedback system for tennis
US8098234B2 (en) 2007-02-20 2012-01-17 Immersion Corporation Haptic feedback system with stored effects
US9019087B2 (en) * 2007-10-16 2015-04-28 Immersion Corporation Synchronization of haptic effect data in a media stream
JP5016117B2 (en) 2008-01-17 2012-09-05 アーティキュレイト テクノロジーズ インコーポレーティッド Method and apparatus for intraoral tactile feedback
US9370459B2 (en) 2009-06-19 2016-06-21 Andrew Mahoney System and method for alerting visually impaired users of nearby objects
US9251721B2 (en) 2010-04-09 2016-02-02 University Of Florida Research Foundation, Inc. Interactive mixed reality system and uses thereof
US9462262B1 (en) 2011-08-29 2016-10-04 Amazon Technologies, Inc. Augmented reality environment with environmental condition control
US10852093B2 (en) 2012-05-22 2020-12-01 Haptech, Inc. Methods and apparatuses for haptic systems
FR2999741B1 (en) 2012-12-17 2015-02-06 Centre Nat Rech Scient HAPTIC SYSTEM FOR NON-CONTACT INTERACTING AT LEAST ONE PART OF THE BODY OF A USER WITH A VIRTUAL ENVIRONMENT
US9367136B2 (en) 2013-04-12 2016-06-14 Microsoft Technology Licensing, Llc Holographic object feedback
EP3399763A1 (en) * 2013-05-24 2018-11-07 Immersion Corporation Method and system for haptic data encoding
US9908048B2 (en) 2013-06-08 2018-03-06 Sony Interactive Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display
US9811854B2 (en) 2013-07-02 2017-11-07 John A. Lucido 3-D immersion technology in a virtual store
EP4083758A1 (en) 2013-07-05 2022-11-02 Rubin, Jacob A. Whole-body human-computer interface
US9158379B2 (en) * 2013-09-06 2015-10-13 Immersion Corporation Haptic warping system that transforms a haptic signal into a collection of vibrotactile haptic effect patterns
US9619980B2 (en) 2013-09-06 2017-04-11 Immersion Corporation Systems and methods for generating haptic effects associated with audio signals
US9630105B2 (en) 2013-09-30 2017-04-25 Sony Interactive Entertainment Inc. Camera based safety mechanisms for users of head mounted displays
US9213408B2 (en) * 2013-10-08 2015-12-15 Immersion Corporation Generating haptic effects while minimizing cascading
EP3095023A1 (en) 2014-01-15 2016-11-23 Sony Corporation Haptic notification on wearables
US9551873B2 (en) 2014-05-30 2017-01-24 Sony Interactive Entertainment America Llc Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content
CN106796451B (en) 2014-07-28 2020-07-21 Ck高新材料有限公司 Tactile information providing module
US9645646B2 (en) 2014-09-04 2017-05-09 Intel Corporation Three dimensional contextual feedback wristband device
US9799177B2 (en) 2014-09-23 2017-10-24 Intel Corporation Apparatus and methods for haptic covert communication
US10166466B2 (en) 2014-12-11 2019-01-01 Elwha Llc Feedback for enhanced situational awareness
US20160170508A1 (en) 2014-12-11 2016-06-16 Toyota Motor Engineering & Manufacturing North America, Inc. Tactile display devices
US9870718B2 (en) 2014-12-11 2018-01-16 Toyota Motor Engineering & Manufacturing North America, Inc. Imaging devices including spacing members and imaging devices including tactile feedback devices
US9922518B2 (en) 2014-12-11 2018-03-20 Elwha Llc Notification of incoming projectiles
US10073516B2 (en) 2014-12-29 2018-09-11 Sony Interactive Entertainment Inc. Methods and systems for user interaction within virtual reality scene using head mounted display
US9746921B2 (en) 2014-12-31 2017-08-29 Sony Interactive Entertainment Inc. Signal generation and detector systems and methods for determining positions of fingers of a user
US9843744B2 (en) 2015-01-13 2017-12-12 Disney Enterprises, Inc. Audience interaction projection system
US10322203B2 (en) 2015-06-26 2019-06-18 Intel Corporation Air flow generation for scent output
US9778746B2 (en) 2015-09-25 2017-10-03 Oculus Vr, Llc Transversal actuator for haptic feedback
US20170103574A1 (en) 2015-10-13 2017-04-13 Google Inc. System and method for providing continuity between real world movement and movement in a virtual/augmented reality experience
US20170131775A1 (en) 2015-11-10 2017-05-11 Castar, Inc. System and method of haptic feedback by referral of sensation
US10055948B2 (en) 2015-11-30 2018-08-21 Nike, Inc. Apparel with ultrasonic position sensing and haptic feedback for activities
US10310804B2 (en) 2015-12-11 2019-06-04 Facebook Technologies, Llc Modifying haptic feedback provided to a user to account for changes in user perception of haptic feedback
US10324530B2 (en) 2015-12-14 2019-06-18 Facebook Technologies, Llc Haptic devices that simulate rigidity of virtual objects
US10096163B2 (en) 2015-12-22 2018-10-09 Intel Corporation Haptic augmented reality to reduce noxious stimuli
US10065124B2 (en) 2016-01-15 2018-09-04 Disney Enterprises, Inc. Interacting with a remote participant through control of the voice of a toy device
US9846971B2 (en) 2016-01-19 2017-12-19 Disney Enterprises, Inc. Systems and methods for augmenting an appearance of a hilt to simulate a bladed weapon
US11351472B2 (en) 2016-01-19 2022-06-07 Disney Enterprises, Inc. Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon
TWI688879B (en) 2016-01-22 2020-03-21 宏達國際電子股份有限公司 Method, virtual reality system, and computer-readable recording medium for real-world interaction in virtual reality environment
US9933851B2 (en) 2016-02-22 2018-04-03 Disney Enterprises, Inc. Systems and methods for interacting with virtual objects using sensory feedback
US10555153B2 (en) 2016-03-01 2020-02-04 Disney Enterprises, Inc. Systems and methods for making non-smart objects smart for internet of things
US20170352185A1 (en) 2016-06-02 2017-12-07 Dennis Rommel BONILLA ACEVEDO System and method for facilitating a vehicle-related virtual reality and/or augmented reality presentation
US10155159B2 (en) 2016-08-18 2018-12-18 Activision Publishing, Inc. Tactile feedback systems and methods for augmented reality and virtual reality systems
US20180053351A1 (en) 2016-08-19 2018-02-22 Intel Corporation Augmented reality experience enhancement method and apparatus
US10779583B2 (en) 2016-09-20 2020-09-22 Facebook Technologies, Llc Actuated tendon pairs in a virtual reality device
US10372213B2 (en) 2016-09-20 2019-08-06 Facebook Technologies, Llc Composite ribbon in a virtual reality device
US10300372B2 (en) 2016-09-30 2019-05-28 Disney Enterprises, Inc. Virtual blaster
US10281982B2 (en) 2016-10-17 2019-05-07 Facebook Technologies, Llc Inflatable actuators in virtual reality
US10088902B2 (en) 2016-11-01 2018-10-02 Oculus Vr, Llc Fiducial rings in virtual reality
US20170102771A1 (en) 2016-12-12 2017-04-13 Leibs Technology Limited Wearable ultrasonic haptic feedback system
US20190041987A1 (en) * 2017-08-03 2019-02-07 Immersion Corporation Haptic effect encoding and rendering system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100066512A1 (en) * 2001-10-09 2010-03-18 Immersion Corporation Haptic Feedback Sensations Based on Audio Output From Computer Devices
US20140002346A1 (en) * 2012-06-27 2014-01-02 Immersion Corporation Haptic feedback control system
US20140000234A1 (en) * 2012-06-29 2014-01-02 Tex-Ray Industrial. Co., Ltd. Ply yarn having twisted hollow fiber and heat retention fiber
US20140030045A1 (en) * 2012-07-24 2014-01-30 Alan Beck Pipe pick-up and lay down apparatus
US20140300454A1 (en) * 2013-04-09 2014-10-09 Immersion Corporation Offline haptic conversion system

Also Published As

Publication number Publication date
CN109388234B (en) 2024-03-26
CN109388234A (en) 2019-02-26
JP7278037B2 (en) 2023-05-19
KR102622570B1 (en) 2024-01-10
US20210278903A1 (en) 2021-09-09
JP2019029012A (en) 2019-02-21
US11579697B2 (en) 2023-02-14
EP3438792A1 (en) 2019-02-06
KR20190015096A (en) 2019-02-13

Similar Documents

Publication Publication Date Title
US11579697B2 (en) Haptic effect encoding and rendering system
US10429933B2 (en) Audio enhanced simulation of high bandwidth haptic effects
US10074246B2 (en) Sound to haptic effect conversion system using multiple actuators
CN104049743B (en) For making touch feedback call synchronous system and method
US9508236B2 (en) Haptic warping system that transforms a haptic signal into a collection of vibrotactile haptic effect patterns
US10395489B1 (en) Generation and braking of vibrations
CN108845673B (en) Sound-to-haptic effect conversion system using mapping
US8866601B2 (en) Overdrive voltage for an actuator to generate haptic effects
US20170090577A1 (en) Haptic effects design system
US20220014123A1 (en) Linear resonant device, and braking method for same
US20150323994A1 (en) Dynamic haptic effect modification
US20200043491A1 (en) Method and system for multimodal interaction with sound device connected to network
EP3434009A1 (en) Interactive audio metadata handling
JP2019220160A (en) Reference signal variation for generating crisp haptic effects
US11645896B2 (en) Systems, devices, and methods for providing actuator braking
WO2020176383A1 (en) Audio data with embedded tags for rendering customized haptic effects

Legal Events

Date Code Title Description
AS Assignment

Owner name: IMMERSION CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ASFOUR, SHADI;GERVAIS, ERIC;OLIVER, HUGUES-ANTOINE;AND OTHERS;REEL/FRAME:043188/0819

Effective date: 20170802

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STPP Information on status: patent application and granting procedure in general

Free format text: TC RETURN OF APPEAL

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION