US20190267043A1 - Automated haptic effect accompaniment

Automated haptic effect accompaniment

Info

Publication number
US20190267043A1
US20190267043A1 (application US16/288,686)
Authority
US
United States
Prior art keywords
media
haptic
playing
audio
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/288,686
Inventor
Robert A. Lacroix
Paul Norris
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immersion Corp filed Critical Immersion Corp
Priority to US16/288,686
Assigned to IMMERSION CORPORATION. Assignors: LACROIX, ROBERT A.; NORRIS, PAUL (assignment of assignors' interest; see document for details).
Publication of US20190267043A1
Current legal status: Abandoned

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier, used signal is digitally coded
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60 Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60 Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/63 Querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user

Abstract

A method of playing media on a device includes initiating the playing of audio/video (“A/V”) media on the device, and then identifying the A/V media. The method further includes selecting a pre-defined haptic track that corresponds to the identified A/V media, and playing the selected pre-defined haptic track in synchrony with the playing of the A/V media, where the playing of the selected pre-defined haptic track generates haptic effects on the device.

Description

    PRIORITY APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 14/619,590, filed on Feb. 11, 2015, which is incorporated herein by reference in its entirety.
  • FIELD
  • One embodiment is directed generally to haptic effects, and in particular to automated accompaniment of haptic effects to audio and/or video.
  • BACKGROUND INFORMATION
  • Electronic device manufacturers strive to produce a rich interface for users. Conventional devices use visual and auditory cues to provide feedback to a user. In some interface devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user; such feedback is known collectively as “haptic feedback” or “haptic effects”. Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.
  • An increasing number of devices, such as smartphones and tablets, include hardware, such as actuators, for generating haptic effects. Haptic effects, in particular, can enhance the viewing of audio and/or audio/video on these devices. For example, haptic effect accompaniment to an audio/video track can allow a viewer to “feel” an engine roaring in a car, explosions, collisions, and the shimmering feeling of sunlight.
  • SUMMARY
  • One embodiment is a method of playing media on a device. The method includes initiating the playing of audio/video (“A/V”) media on the device, and then identifying the A/V media. The method further includes selecting a pre-defined haptic track that corresponds to the identified A/V media, and playing the selected pre-defined haptic track in synchrony with the playing of the A/V media, where the playing of the selected pre-defined haptic track generates haptic effects on the device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a haptically-enabled system/device in accordance with one embodiment of the present invention.
  • FIG. 2 is an overview diagram that includes the system of FIG. 1 and other network elements in accordance with one embodiment of the invention.
  • FIG. 3 is a flow diagram of the functionality of the system of FIG. 1 in accordance with an embodiment of the invention.
  • DETAILED DESCRIPTION
  • Embodiments allow for the automatic recognition and identification of an audio/video track that is playing or that may be played on a device. Based on the identification, a corresponding haptic effect track is selected and retrieved, and then played on a haptic output device during the playing of the audio/video track. As a result, the audio/video playback is automatically enhanced with haptic effects.
  • FIG. 1 is a block diagram of a haptically-enabled system/device 10 in accordance with one embodiment of the present invention. System 10 includes a touch sensitive surface 11 or other type of user interface mounted within a housing 15, and may include mechanical keys/buttons 13.
  • Internal to system 10 is a haptic feedback system that generates haptic effects on system 10 and includes a processor or controller 12. Coupled to processor 12 is a memory 20, and an actuator drive circuit 16 which is coupled to an actuator 18. Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire system 10, or may be a separate processor. Processor 12 can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters. In general, the high level parameters that define a particular haptic effect include magnitude, frequency and duration. Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect. A haptic effect may be considered “dynamic” if it includes some variation of these parameters when the haptic effect is generated or a variation of these parameters based on a user's interaction. The haptic feedback system in one embodiment generates vibrations 26, 27, or other types of haptic effects on system 10.
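  • As a minimal illustration of these high level parameters, the following Python sketch models an effect by its magnitude, frequency and duration, and derives a “dynamic” variant from a user interaction. The class and function names are illustrative assumptions, not part of this disclosure:

        from dataclasses import dataclass

        @dataclass
        class HapticEffect:
            magnitude: float    # relative strength, 0.0 to 1.0
            frequency: float    # vibration frequency in Hz
            duration_ms: int    # effect length in milliseconds

        def dynamic_variant(effect: HapticEffect, pressure: float) -> HapticEffect:
            # Vary the magnitude with the user's interaction (e.g., touch pressure),
            # which is what makes the effect "dynamic" in the sense used above.
            return HapticEffect(
                magnitude=min(1.0, effect.magnitude * pressure),
                frequency=effect.frequency,
                duration_ms=effect.duration_ms,
            )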
  • Processor 12 outputs the control signals to actuator drive circuit 16, which includes electronic components and circuitry used to supply actuator 18 with the required electrical current and voltage (i.e., “motor signals”) to cause the desired haptic effects. System 10 may include more than one actuator 18, and each actuator may include a separate drive circuit 16, all coupled to a common processor 12.
  • Memory 20 can be any type of storage device or computer-readable medium, such as random access memory (“RAM”) or read-only memory (“ROM”). Memory 20 stores instructions executed by processor 12. Among the instructions, memory 20 includes haptic effect accompaniment module 22, which comprises instructions that, when executed by processor 12, automatically select and generate haptic effects that accompany audio or audio/video media, as disclosed in more detail below. Memory 20 may also be located internal to processor 12, or any combination of internal and external memory.
  • System 10 may be any type of handheld/mobile device, such as a cellular telephone, personal digital assistant (“PDA”), smartphone, computer tablet, gaming console, remote control, or any other type of device that includes a haptic effect system that includes one or more actuators or any other type of haptic output device. System 10 further includes an audio/visual system (not shown) that is capable of playing video (with audio) or audio only. System 10 may be a wearable device such as wrist bands, headbands, eyeglasses, rings, leg bands, arrays integrated into clothing, etc., or any other type of device that a user may wear on a body or can be held by a user and that is haptically enabled, including furniture, a game controller, or a vehicle steering wheel. Further, some of the elements or functionality of system 10 may be remotely located or may be implemented by another device that is in communication with the remaining elements of system 10.
  • In addition to, or in place of, actuator 18, system 10 may include other types of haptic output devices (not shown) that may be non-mechanical or non-vibratory devices such as devices that use electrostatic friction (“ESF”), ultrasonic surface friction (“USF”), devices that induce acoustic radiation pressure with an ultrasonic haptic transducer, devices that use a haptic substrate and a flexible or deformable surface or shape changing devices and that may be attached to a user's body, devices that provide projected haptic output such as a puff of air using an air jet, etc.
  • FIG. 2 is an overview diagram that includes system 10 of FIG. 1 and other network elements in accordance with one embodiment of the invention. System 10 is coupled to audio/visual (“A/V”) media servers 204, and haptic accompaniment servers 206 through the internet 202, or through any other communications method. Further, media servers 204 and haptic servers 206 can be local to system 10, or the functionality of servers 204 and 206 can be provided by system 10 itself. Associated with haptic accompaniment servers 206 is a media recognizer 35. Media recognizer 35 “observes” A/V media playback or other A/V media characteristics and uniquely identifies the A/V media, and in conjunction with media listener 32 described below, finds/selects/determines a haptic effect media/track that corresponds to the identified media.
  • System 10, as shown in the embodiment of FIG. 2, includes an A/V player 31, a media listener 32, and a haptic player 33. A/V player 31 plays audio and/or video media on system 10. Haptic player 33 plays haptic effects on system 10, and includes actuator drive circuit 16 and actuator 18 of FIG. 1, or any type of haptic output device in other embodiments. Media listener 32 communicates with or otherwise uses media recognizer 35 and, by itself or in conjunction with media recognizer 35, automatically generates the file association between haptic media and A/V media. In contrast, prior art solutions require an explicit association element in the system, such as metadata, that explicitly tells a playback system where a corresponding haptic track is located.
  • In general, media listener 32 functions as a “matchmaker” by automatically matching pre-existing or pre-defined haptic tracks to A/V media using some type of A/V recognition data. Media recognizer 35 can receive the A/V recognition data and in turn generate an identity of the A/V media. In one embodiment, the input to media recognizer 35 that functions as the A/V recognition data is the content of the A/V media itself, which can be some combination of audio and video, or video only, or audio only, and is used to uniquely identify the A/V media. The audio and video input in this embodiment is original content in that it has not been modified to assist in recognition for haptic matching, such as by the addition of metadata.
  • In another embodiment, identifying information surrounding the A/V media, rather than the A/V media itself, is used to uniquely identify the A/V media by media recognizer 35 and functions as A/V recognition data. The identifying information can include, for example, a file name, a uniform resource locator (“URL”), MPEG-4 metadata, etc. As with the above, the A/V media and the identifying information in one embodiment have not been modified to assist in recognition for haptic matching.
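  • A hedged sketch of this identifier-based recognition: the recognizer can be as simple as a catalog keyed by file name or URL. The catalog contents and helper name below are hypothetical:

        from typing import Optional

        # Hypothetical catalog mapping A/V identifying information (URL or
        # file name) to the location of a pre-defined haptic track.
        HAPTIC_CATALOG = {
            "https://video.example.com/kitten-fight.mp4": "haptics/kitten-fight.hapt",
            "explosion_scene.mp4": "haptics/explosion_scene.hapt",
        }

        def recognize_by_identifier(identifier: str) -> Optional[str]:
            # Map the identifying information directly to its haptic track, if any.
            return HAPTIC_CATALOG.get(identifier)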
  • In another embodiment, the A/V media is re-encoded or transcoded to inject human-perceptible or non-human-perceptible audio and/or video watermarks or signatures into the A/V media in order to assist in recognition for haptic matching; the injected watermarks or signatures function as the A/V recognition data. Therefore, the media is slightly modified from its original state before being stored on a media server for later streaming or downloading.
  • In another embodiment, the A/V media is re-encoded, transcoded, or otherwise modified to inject metadata into the A/V media in order to assist in recognition for haptic matching; the injected metadata functions as the A/V recognition data. Therefore, the media is slightly modified from its original state before being stored on a media server for later streaming or downloading.
  • In one embodiment, during operation of system 10, media listener 32 monitors A/V recognition data as it is being passed to, or is being output from, a media player system such as media server 204. A/V media may be requested by a user of system 10 by, for example, selecting a YouTube video stored on media server 204 to be streamed via internet 202 and played by A/V player 31. Further, in one embodiment, media listener 32 may pre-process the A/V media in order to create the A/V recognition data.
  • In one embodiment, media listener 32 may continuously stream A/V recognition data to media recognizer 35, or it may have the ability to discriminate at what times it is necessary to transmit A/V recognition data. As a result, the A/V recognition data may be a continuous stream of data, an occasional discontinuous stream of data, or some discrete number of data packets.
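  • One way the listener might discriminate when to transmit, sketched under the assumption that recognition only needs to re-run when playback state changes; the event names are invented for illustration:

        # Send A/V recognition data only on events that can change the media
        # identity or playback position, rather than streaming continuously.
        TRANSMIT_EVENTS = {"playback_started", "seek_completed", "media_changed"}

        def should_transmit(event: str) -> bool:
            return event in TRANSMIT_EVENTS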
  • Media recognizer 35 uses the A/V recognition data to attempt to match the data to an existing haptic media or haptic track, possibly among a large set of pre-defined haptic tracks stored on server 206. Media recognizer 35, upon finding a match, provides the location of the haptic track to media listener 32. The functionality of media recognizer 35 can be implemented by system 10 itself, or remote from system 10 as shown in FIG. 2.
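  • The matching step can be pictured as an index lookup keyed by a content fingerprint. The sketch below stands in a trivial hash for the robust perceptual fingerprint a real recognizer would use (a raw hash would not survive re-encoding), and the index contents are hypothetical:

        import hashlib
        from typing import Optional

        def fingerprint(audio_window: bytes) -> str:
            # Stand-in for a robust perceptual audio/video fingerprint.
            return hashlib.sha1(audio_window).hexdigest()[:16]

        # Server-side index: fingerprint -> location of the matching haptic track.
        FINGERPRINT_INDEX = {
            "3f2a9c1d8b7e4a05": "https://haptics.example.com/tracks/0042.hapt",  # hypothetical entry
        }

        def match_haptic_track(audio_window: bytes) -> Optional[str]:
            # Return the haptic track location on a match, else None.
            return FINGERPRINT_INDEX.get(fingerprint(audio_window))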
  • In one embodiment, one or both of the media listener 32 and media recognizer 35 determine the current playback location of the A/V media and the corresponding haptic track. Media listener 32 instructs haptic player 33 of the location of the haptic track, and the current playback location of the A/V media.
One or both of media listener 32 and haptic player 33 render the haptic track according to the current playback location of the A/V media so as to ensure reasonable synchrony between the A/V media and the haptic media. The haptic track is rendered, in one embodiment, by sending the corresponding haptic effect signal to actuator drive circuit 16, which causes actuator 18 to generate a vibratory haptic effect. The haptic track can start upon initiation of the playing of the A/V media, or at some point prior to or after initiation.
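  • A sketch of that rendering loop, treating the A/V player as the master clock and re-seeking the haptic track whenever drift exceeds a threshold; the player interfaces are assumptions for illustration:

        import time

        def render_in_sync(av_player, haptic_player, haptic_track,
                           tick_s: float = 0.02, max_drift_ms: int = 50) -> None:
            while av_player.is_playing():
                av_pos = av_player.position_ms()          # master clock
                if abs(haptic_player.position_ms() - av_pos) > max_drift_ms:
                    haptic_player.seek_ms(av_pos)         # correct accumulated drift
                haptic_player.render(haptic_track, at_ms=av_pos)
                time.sleep(tick_s)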
  • The elements shown in FIG. 2 can be implemented in many different configurations. In one embodiment, system 10 can be a mobile device with a haptic output system via haptic player 33, as well as A/V media playback capabilities via A/V player 31, and media listener 32 may be a service running on the mobile device, such as an application/“app”. Media listener 32 communicates the A/V recognition data through internet 202, wired or wirelessly, to server 206 that is running a process that instantiates media recognizer 35, and to the same, or a different server 204, that hosts one or more haptic media that are associated with specific A/V media.
  • In another embodiment, system 10 can be a set-top box with a remote control unit that is equipped with a haptic output system that could be in the form of a gamepad, where the set-top box has the ability to transport A/V media to audio and video data sinks (e.g., a television and an audio sound system). In this embodiment, media listener 32 is a service running on the set-top box, and communicates with servers 204 and 206 over internet 202. In another embodiment, the set-top box is a gaming console.
  • In another embodiment, the set-top box hosts both media listener 32 and media recognizer 35, and the haptic media is accessible locally on the set-top box (e.g., in volatile memory, non-volatile memory, or physical disk), or the haptic media is accessible remotely by residing on media server 204 that is accessible via internet 202.
  • In another embodiment, the A/V content is rendered through a web browser on a device (e.g., Smart TV, mobile device, set-top box, etc.) or directly on the device, and media listener 32 can access the A/V recognition data through known software/hardware methods.
  • FIG. 3 is a flow diagram of the functionality of system 10 of FIG. 1 in accordance with an embodiment of the invention. In one embodiment, the functionality of the flow diagram of FIG. 3 is implemented by software stored in memory or other computer readable or tangible medium, and executed by a processor. In other embodiments, the functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software.
  • At 301, A/V media (which includes audio and video, only audio, or only video media) begins playing or is initially selected. The A/V media includes A/V recognition data, which can include the A/V media itself.
  • At 302, the A/V media is recognized based on the A/V recognition data.
  • At 303, a haptic track that corresponds to the recognized A/V media is identified and retrieved/selected. The haptic track would include a haptic effect signal that, when applied to a haptic effect output device, would generate haptic effects.
  • At 304, the haptic track is played on system 10 in synchronized fashion with the playing of the A/V media. As a result, the viewing of the A/V media is enhanced.
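  • Tying the four blocks of FIG. 3 together, a hypothetical end-to-end pipeline is sketched below; each call corresponds to one numbered step, and the helpers are either the sketches in this description (render_in_sync above, select_haptic_track below) or assumed equivalents such as recognize:

        def play_with_haptics(av_media, device):
            device.av_player.start(av_media)                       # 301: playback begins
            identity = recognize(av_media)                         # 302: identify the A/V media
            haptic_track = select_haptic_track(identity, device)   # 303: retrieve/select track
            render_in_sync(device.av_player, device.haptic_player,
                           haptic_track)                           # 304: play in synchrony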
  • Although in the above embodiment, the A/V media is first recognized, and then a haptic track is retrieved/selected, in another embodiment the reverse may happen. In other words, in one embodiment a haptic track is initiated and identified, and then the system selects and plays back A/V media in a synchronized fashion.
  • In one embodiment, the selection of the haptic track at 303 can depend on the capabilities of the target device on which the haptic effect will be generated. For example, some devices are equipped with standard definition (“SD”) actuators, such as linear resonant actuators (“LRAs”), and other devices are equipped with high definition (“HD”) actuators, such as piezo actuators. For a particular A/V media, there may be both a corresponding SD and HD haptic track. The selection of SD or HD will depend on the capabilities of the target device. Media listener 32 in one embodiment would have knowledge of the haptic effect playback capabilities of system 10.
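  • A sketch of that capability-dependent selection, assuming each recognized title offers both an SD and an HD track and the device advertises its actuator class; lookup_tracks and the field names are illustrative assumptions:

        def select_haptic_track(identity, device):
            tracks = lookup_tracks(identity)   # assumed helper, e.g. {"SD": ..., "HD": ...}
            # Prefer the HD track on devices with high definition actuators
            # (e.g., piezo); otherwise fall back to the SD track authored for LRAs.
            if getattr(device, "actuator_class", "SD") == "HD" and "HD" in tracks:
                return tracks["HD"]
            return tracks["SD"]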
  • As an example of a use of embodiments of the invention, assume a user has a smartphone that has the media listener service running. The user starts up the YouTube application, and proceeds to watch a kitten fight video for which someone has previously created a corresponding haptic track. The media listener sends processed (or unprocessed) data to a media recognizer, which can reside on the smartphone or an internet server. The recognizer finds a suitable haptic media for the kitten fight video. The media listener service starts retrieving the haptic media to play it back in synchrony with the kitten fight video, creating haptic sensations for every kitten swipe and knockdown.
  • As disclosed, embodiments observe A/V media playback, and in observing it, uniquely identify the media. Embodiments then find a corresponding haptic media, and then synchronize the haptic media playback to the A/V media playback on a personal computing device, or any other type of haptically enabled device.
  • Several embodiments are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations of the disclosed embodiments are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.

Claims (1)

What is claimed is:
1. A method of playing media on a device, the method comprising:
initiating the playing of audio/video (A/V) media on the device;
identifying the A/V media;
selecting a pre-defined haptic track that corresponds to the identified A/V media; and
playing the selected pre-defined haptic track in synchrony with the playing of the A/V media, wherein the playing of the selected pre-defined haptic track generates haptic effects on the device.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/288,686 US20190267043A1 (en) 2015-02-11 2019-02-28 Automated haptic effect accompaniment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/619,590 US10269392B2 (en) 2015-02-11 2015-02-11 Automated haptic effect accompaniment
US16/288,686 US20190267043A1 (en) 2015-02-11 2019-02-28 Automated haptic effect accompaniment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/619,590 Continuation US10269392B2 (en) 2015-02-11 2015-02-11 Automated haptic effect accompaniment

Publications (1)

Publication Number Publication Date
US20190267043A1 true US20190267043A1 (en) 2019-08-29

Family

ID=54707501

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/619,590 Expired - Fee Related US10269392B2 (en) 2015-02-11 2015-02-11 Automated haptic effect accompaniment
US16/288,686 Abandoned US20190267043A1 (en) 2015-02-11 2019-02-28 Automated haptic effect accompaniment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/619,590 Expired - Fee Related US10269392B2 (en) 2015-02-11 2015-02-11 Automated haptic effect accompaniment

Country Status (5)

Country Link
US (2) US10269392B2 (en)
EP (1) EP3056971A1 (en)
JP (1) JP2016149121A (en)
KR (1) KR20160098956A (en)
CN (1) CN105867604A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10871826B2 (en) 2016-03-01 2020-12-22 DISH Technologies L.L.C. Haptic feedback remote control systems and methods
US10075251B2 (en) * 2017-02-08 2018-09-11 Immersion Corporation Haptic broadcast with select haptic metadata based on haptic playback capability
US10360774B1 (en) * 2018-01-05 2019-07-23 Immersion Corporation Method and device for enabling pitch control for a haptic effect
US20190324538A1 (en) * 2018-04-20 2019-10-24 Immersion Corporation Haptic-enabled wearable device for generating a haptic effect in an immersive reality environment
US20200012347A1 (en) * 2018-07-09 2020-01-09 Immersion Corporation Systems and Methods for Providing Automatic Haptic Generation for Video Content
US11016568B2 (en) * 2018-07-24 2021-05-25 The Nielsen Company (Us), Llc Methods and apparatus to monitor haptic vibrations of touchscreens
WO2020219073A1 (en) * 2019-04-26 2020-10-29 Hewlett-Packard Development Company, L.P. Spatial audio and haptics

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110188832A1 (en) * 2008-09-22 2011-08-04 Electronics And Telecommunications Research Institute Method and device for realising sensory effects

Family Cites Families (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001297086A (en) * 2000-04-13 2001-10-26 Sony Corp Retrieval system, retrieval device, retrieval method, input device and input method
JP4438191B2 (en) * 2000-07-21 2010-03-24 ソニー株式会社 Information processing apparatus and method, information processing system, and recording medium
US6990453B2 (en) 2000-07-31 2006-01-24 Landmark Digital Services Llc System and methods for recognizing sound and music signals in high noise and distortion
US7623114B2 (en) * 2001-10-09 2009-11-24 Immersion Corporation Haptic feedback sensations based on audio output from computer devices
US9948885B2 (en) 2003-12-12 2018-04-17 Kurzweil Technologies, Inc. Virtual encounters
US8700791B2 (en) * 2005-10-19 2014-04-15 Immersion Corporation Synchronization of haptic effect data in a media transport stream
JP2009521170A (en) * 2005-12-22 2009-05-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Script synchronization method using watermark
JP2007241942A (en) * 2006-03-13 2007-09-20 Sharp Corp Content reproduction device, content reproduction method, content reproduction program and storage medium
US8000825B2 (en) * 2006-04-13 2011-08-16 Immersion Corporation System and method for automatically producing haptic events from a digital audio file
US9370704B2 (en) 2006-08-21 2016-06-21 Pillar Vision, Inc. Trajectory detection and feedback system for tennis
US8315652B2 (en) * 2007-05-18 2012-11-20 Immersion Corporation Haptically enabled messaging
US9019087B2 (en) * 2007-10-16 2015-04-28 Immersion Corporation Synchronization of haptic effect data in a media stream
US8740622B2 (en) 2008-01-17 2014-06-03 Articulate Technologies, Inc. Methods and devices for intraoral tactile feedback
KR101459766B1 (en) 2008-02-12 2014-11-10 삼성전자주식회사 Method for recognizing a music score image with automatic accompaniment in a mobile device
JP4600548B2 (en) * 2008-08-27 2010-12-15 ソニー株式会社 REPRODUCTION DEVICE, REPRODUCTION METHOD, AND PROGRAM
US9370459B2 (en) 2009-06-19 2016-06-21 Andrew Mahoney System and method for alerting visually impaired users of nearby objects
US8902050B2 (en) 2009-10-29 2014-12-02 Immersion Corporation Systems and methods for haptic augmentation of voice-to-text conversion
US9251721B2 (en) 2010-04-09 2016-02-02 University Of Florida Research Foundation, Inc. Interactive mixed reality system and uses thereof
US9462262B1 (en) 2011-08-29 2016-10-04 Amazon Technologies, Inc. Augmented reality environment with environmental condition control
US10013857B2 (en) * 2011-12-21 2018-07-03 Qualcomm Incorporated Using haptic technologies to provide enhanced media experiences
US20130198625A1 (en) * 2012-01-26 2013-08-01 Thomas G Anderson System For Generating Haptic Feedback and Receiving User Inputs
WO2013168732A1 (en) * 2012-05-08 2013-11-14 株式会社ニコン Electronic device
US10852093B2 (en) 2012-05-22 2020-12-01 Haptech, Inc. Methods and apparatuses for haptic systems
US8842969B2 (en) * 2012-09-28 2014-09-23 Intel Corporation Haptic playback of video
FR2999741B1 (en) 2012-12-17 2015-02-06 Centre Nat Rech Scient HAPTIC SYSTEM FOR NON-CONTACT INTERACTING AT LEAST ONE PART OF THE BODY OF A USER WITH A VIRTUAL ENVIRONMENT
US9128523B2 (en) * 2012-12-20 2015-09-08 Amazon Technologies, Inc. Dynamically generating haptic effects from audio data
US8754757B1 (en) 2013-03-05 2014-06-17 Immersion Corporation Automatic fitting of haptic effects
US9992491B2 (en) * 2013-03-15 2018-06-05 Immersion Corporation Method and apparatus for encoding and decoding haptic information in multi-media files
US9367136B2 (en) 2013-04-12 2016-06-14 Microsoft Technology Licensing, Llc Holographic object feedback
US9908048B2 (en) 2013-06-08 2018-03-06 Sony Interactive Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display
US9811854B2 (en) 2013-07-02 2017-11-07 John A. Lucido 3-D immersion technology in a virtual store
EP4083758A1 (en) 2013-07-05 2022-11-02 Rubin, Jacob A. Whole-body human-computer interface
US9630105B2 (en) 2013-09-30 2017-04-25 Sony Interactive Entertainment Inc. Camera based safety mechanisms for users of head mounted displays
WO2015107386A1 (en) 2014-01-15 2015-07-23 Sony Corporation Haptic notification on wearables
US9551873B2 (en) 2014-05-30 2017-01-24 Sony Interactive Entertainment America Llc Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content
CN111998027B (en) 2014-07-28 2022-05-27 Ck高新材料有限公司 Tactile information providing method
US9645646B2 (en) 2014-09-04 2017-05-09 Intel Corporation Three dimensional contextual feedback wristband device
US9667907B2 (en) * 2014-09-13 2017-05-30 Vicente Diaz System and method for haptic integration and generation in multimedia devices
US9799177B2 (en) 2014-09-23 2017-10-24 Intel Corporation Apparatus and methods for haptic covert communication
US10166466B2 (en) 2014-12-11 2019-01-01 Elwha Llc Feedback for enhanced situational awareness
US9870718B2 (en) 2014-12-11 2018-01-16 Toyota Motor Engineering & Manufacturing North America, Inc. Imaging devices including spacing members and imaging devices including tactile feedback devices
US9922518B2 (en) 2014-12-11 2018-03-20 Elwha Llc Notification of incoming projectiles
US20160170508A1 (en) 2014-12-11 2016-06-16 Toyota Motor Engineering & Manufacturing North America, Inc. Tactile display devices
US10073516B2 (en) 2014-12-29 2018-09-11 Sony Interactive Entertainment Inc. Methods and systems for user interaction within virtual reality scene using head mounted display
US9746921B2 (en) 2014-12-31 2017-08-29 Sony Interactive Entertainment Inc. Signal generation and detector systems and methods for determining positions of fingers of a user
US9843744B2 (en) 2015-01-13 2017-12-12 Disney Enterprises, Inc. Audience interaction projection system
US10322203B2 (en) 2015-06-26 2019-06-18 Intel Corporation Air flow generation for scent output
US9851799B2 (en) 2015-09-25 2017-12-26 Oculus Vr, Llc Haptic surface with damping apparatus
US20170103574A1 (en) 2015-10-13 2017-04-13 Google Inc. System and method for providing continuity between real world movement and movement in a virtual/augmented reality experience
US20170131775A1 (en) 2015-11-10 2017-05-11 Castar, Inc. System and method of haptic feedback by referral of sensation
WO2017095951A1 (en) 2015-11-30 2017-06-08 Nike Innovate C.V. Apparel with ultrasonic position sensing and haptic feedback for activities
US10310804B2 (en) 2015-12-11 2019-06-04 Facebook Technologies, Llc Modifying haptic feedback provided to a user to account for changes in user perception of haptic feedback
US10324530B2 (en) 2015-12-14 2019-06-18 Facebook Technologies, Llc Haptic devices that simulate rigidity of virtual objects
US10096163B2 (en) 2015-12-22 2018-10-09 Intel Corporation Haptic augmented reality to reduce noxious stimuli
US10065124B2 (en) 2016-01-15 2018-09-04 Disney Enterprises, Inc. Interacting with a remote participant through control of the voice of a toy device
US9846971B2 (en) 2016-01-19 2017-12-19 Disney Enterprises, Inc. Systems and methods for augmenting an appearance of a hilt to simulate a bladed weapon
US11351472B2 (en) 2016-01-19 2022-06-07 Disney Enterprises, Inc. Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon
TWI688879B (en) 2016-01-22 2020-03-21 宏達國際電子股份有限公司 Method, virtual reality system, and computer-readable recording medium for real-world interaction in virtual reality environment
US9933851B2 (en) 2016-02-22 2018-04-03 Disney Enterprises, Inc. Systems and methods for interacting with virtual objects using sensory feedback
US10555153B2 (en) 2016-03-01 2020-02-04 Disney Enterprises, Inc. Systems and methods for making non-smart objects smart for internet of things
US20170352185A1 (en) 2016-06-02 2017-12-07 Dennis Rommel BONILLA ACEVEDO System and method for facilitating a vehicle-related virtual reality and/or augmented reality presentation
US10155159B2 (en) 2016-08-18 2018-12-18 Activision Publishing, Inc. Tactile feedback systems and methods for augmented reality and virtual reality systems
US20180053351A1 (en) 2016-08-19 2018-02-22 Intel Corporation Augmented reality experience enhancement method and apparatus
US10372213B2 (en) 2016-09-20 2019-08-06 Facebook Technologies, Llc Composite ribbon in a virtual reality device
US10779583B2 (en) 2016-09-20 2020-09-22 Facebook Technologies, Llc Actuated tendon pairs in a virtual reality device
US10300372B2 (en) 2016-09-30 2019-05-28 Disney Enterprises, Inc. Virtual blaster
US10281982B2 (en) 2016-10-17 2019-05-07 Facebook Technologies, Llc Inflatable actuators in virtual reality
US10088902B2 (en) 2016-11-01 2018-10-02 Oculus Vr, Llc Fiducial rings in virtual reality
US20170102771A1 (en) 2016-12-12 2017-04-13 Leibs Technology Limited Wearable ultrasonic haptic feedback system

Also Published As

Publication number Publication date
US20160232943A1 (en) 2016-08-11
KR20160098956A (en) 2016-08-19
JP2016149121A (en) 2016-08-18
CN105867604A (en) 2016-08-17
EP3056971A1 (en) 2016-08-17
US10269392B2 (en) 2019-04-23

Similar Documents

Publication Publication Date Title
US20190267043A1 (en) Automated haptic effect accompaniment
US10429933B2 (en) Audio enhanced simulation of high bandwidth haptic effects
US20200245038A1 (en) Second screen haptics
US10241580B2 (en) Overlaying of haptic effects
US10320501B2 (en) Haptic broadcast with select haptic metadata based on haptic playback capability
JP6538305B2 (en) Methods and computer readable media for advanced television interaction
US9245429B2 (en) Haptic warping system
JP2019195181A (en) Method and system for providing haptic effects based on information supplementing multimedia content
JP2020126685A (en) Feedback reduction for user input element associated with haptic output device
EP3031502B1 (en) Video gameplay haptics
US20150054727A1 (en) Haptically enabled viewing of sporting events
WO2020031497A1 (en) Preemptive driving of tactile feedback presentation device
US10216277B2 (en) Modifying haptic effects for slow motion

Legal Events

Date Code Title Description
AS Assignment

Owner name: IMMERSION CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LACROIX, ROBERT A.;NORRIS, PAUL;REEL/FRAME:048467/0966

Effective date: 20150211

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION