US20180210552A1 - Haptic conversion system using segmenting and combining - Google Patents
- Publication number: US20180210552A1
- Authority: US (United States)
- Prior art keywords: haptic, signal, signals, input, input sub
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L21/00—Speech or voice signal processing techniques to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
- G10L21/06—Transformation of speech into a non-audible representation, e.g. speech visualisation or speech processing for tactile aids
- G10L21/16—Transforming into a non-visible representation
Definitions
- One embodiment is directed generally to a device, and more particularly, to a device that produces haptic effects.
- Haptics is a tactile and force feedback technology that takes advantage of a user's sense of touch by applying haptic feedback effects (i.e., “haptic effects”), such as forces, vibrations, and motions, to the user.
- Devices such as mobile devices, touchscreen devices, and personal computers, can be configured to generate haptic effects.
- calls to embedded hardware capable of generating haptic effects can be programmed within an operating system (“OS”) of the device. These calls specify which haptic effect to play. For example, when a user interacts with the device using, for example, a button, touchscreen, lever, joystick, wheel, or some other control, the OS of the device can send a play command through control circuitry to the embedded hardware. The embedded hardware then produces the appropriate haptic effect.
- Devices can be configured to coordinate the output of haptic effects with the output of other content, such as audio, so that the haptic effects are incorporated into the other content.
- an audio effect developer can develop audio effects that can be output by the device, such as machine gun fire, explosions, or car crashes.
- other types of content such as video effects, can be developed and subsequently output by the device.
- a haptic effect developer can subsequently author a haptic effect for the device, and the device can be configured to output the haptic effect along with the other content.
- such a process generally requires the individual judgment of the haptic effect developer to author a haptic effect that correctly complements the audio effect, or other type of content.
- a poorly-authored haptic effect that does not complement the audio effect, or other type of content, can produce an overall dissonant effect where the haptic effect does not “mesh” with the audio effect or other content. This type of user experience is generally not desired.
- One embodiment is a system that converts an input into one or more haptic effects using segmenting and combining.
- the system receives an input.
- the system further segments the input into a plurality of input sub-signals.
- the system further converts the plurality of input sub-signals into a single haptic signal, or multiple haptic signals that can either be played separately on different haptic output devices, or mixed into a single haptic signal.
- the system further generates the one or more haptic effects based on the haptic signal.
- FIG. 1 illustrates a block diagram of a system in accordance with one embodiment of the invention.
- FIG. 2 illustrates a flow diagram of haptic conversion functionality performed by a system, according to an embodiment of the invention.
- FIG. 3 illustrates a plurality of input sub-signals that are analyzed and converted into one or more haptic effects using frequency band analysis and prioritization, according to an embodiment of the invention.
- FIG. 4 illustrates a flow diagram of haptic conversion functionality performed by a system, according to another embodiment of the invention.
- FIG. 5 illustrates an example of shifting a frequency of an input signal, according to an embodiment of the invention.
- FIG. 6 illustrates a block diagram of a haptic mixer configured to mix a plurality of haptic sub-signals, according to an embodiment of the invention.
- FIG. 7 illustrates a flow diagram of the functionality of a haptic conversion module, according to an embodiment of the invention.
- One embodiment is a system that can convert an input, such as an audio signal, into a haptic signal that can be used to generate haptic effects.
- the system can filter the input into multiple frequency bands, where each frequency band includes a sub-signal of the input, and the system can further prioritize the multiple frequency bands based on one or more parameters of analysis.
- the system can create a prioritized list of the frequency bands.
- the system can further select one or more frequency bands from the prioritized list, and use the selected frequency band(s) to generate the haptic signal, where the haptic signal is based upon, at least in part, a combination of the selected frequency band(s).
- the system can band-pass filter an audio signal into four frequency bands, and can create a haptic signal based on the frequency band that contains an audio sub-signal with the highest magnitude.
- the haptic conversion functionality of the system can be implemented either as offline functionality that provides an output file that can be played back on a device, or as an algorithm that performs the processing at playback time.
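The four-band example above can be sketched as follows. This is an illustrative sketch only, assuming a numpy environment: FFT masking stands in for whatever band-pass filters an implementation actually uses, and the band edges are arbitrary choices, not values specified by the patent.

```python
import numpy as np

def band_pass(signal, fs, f_lo, f_hi):
    """Keep only frequency content in [f_lo, f_hi) by masking the FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < f_lo) | (freqs >= f_hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

def haptic_from_strongest_band(audio, fs, edges=(0, 500, 1000, 2000, 4000)):
    """Split the audio into four complementary frequency bands and build
    the haptic signal from the band whose sub-signal has the highest
    peak magnitude."""
    subs = [band_pass(audio, fs, lo, hi)
            for lo, hi in zip(edges[:-1], edges[1:])]
    return max(subs, key=lambda s: np.max(np.abs(s)))
```

For an input dominated by a 200 Hz tone, the returned sub-signal is essentially that tone, so the resulting haptic effect tracks the loudest band.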
- Another embodiment is a system that can convert an input, such as an audio signal, into a haptic signal that can be used to generate haptic effects.
- the system can read a multimedia file (e.g., audio file or video file), and extract an input signal, such as an audio signal, from the multimedia file.
- the system can filter the input signal into one or more input sub-signals. For example, the system can use different band pass filters with complementary cutoff frequencies to segment the input signal into different complementary input sub-signals.
- the system can then create a haptic sub-signal for each input sub-signal.
- the system can further mix, or otherwise combine, the haptic sub-signals into an overall haptic signal that corresponds to the original input signal.
- the haptic conversion functionality of the system can be implemented as a software module, a mobile application, or a plug-in for audio/video players and editing tools.
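The segment-convert-mix pipeline of this embodiment can be sketched as below. The per-band conversion (simple rectification) and the sum-then-normalize mixing policy are assumptions made for illustration; the patent leaves both open.

```python
import numpy as np

def band_pass(signal, fs, f_lo, f_hi):
    """Keep only frequency content in [f_lo, f_hi) by masking the FFT."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < f_lo) | (freqs >= f_hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

def convert(input_signal, fs, edges=(0, 500, 2000, 4000)):
    """Segment the input into complementary bands with band-pass filters,
    derive a haptic sub-signal for each band (here: the rectified band),
    then mix the sub-signals back into one overall haptic signal."""
    subs = [band_pass(input_signal, fs, lo, hi)
            for lo, hi in zip(edges[:-1], edges[1:])]
    haptic_subs = [np.abs(s) for s in subs]   # toy per-band conversion
    mix = np.sum(haptic_subs, axis=0)
    peak = np.max(np.abs(mix))
    return mix / peak if peak > 0 else mix    # normalize into [0, 1]
```

Alternatively, the per-band haptic sub-signals could be routed to different haptic output devices instead of being mixed, as the overview above notes.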
- FIG. 1 illustrates a block diagram of a system 10 in accordance with one embodiment of the invention.
- system 10 is part of a mobile device, and system 10 provides a haptic conversion functionality for the mobile device.
- system 10 is part of a wearable device, and system 10 provides a haptic conversion functionality for the wearable device.
- wearable devices include wrist bands, headbands, eyeglasses, rings, leg bands, arrays integrated into clothing, or any other type of device that a user may wear on the body or hold.
- Some wearable devices can be “haptically enabled,” meaning they include mechanisms to generate haptic effects.
- system 10 is separate from the device (e.g., a mobile device or a wearable device), and remotely provides the haptic conversion functionality for the device.
- system 10 includes a bus 12 or other communication mechanism for communicating information, and a processor 22 coupled to bus 12 for processing information.
- Processor 22 may be any type of general or specific purpose processor.
- System 10 further includes a memory 14 for storing information and instructions to be executed by processor 22 .
- Memory 14 can be comprised of any combination of random access memory (“RAM”), read only memory (“ROM”), static storage such as a magnetic or optical disk, or any other type of computer-readable medium.
- a computer-readable medium may be any available medium that can be accessed by processor 22 and may include both a volatile and nonvolatile medium, a removable and non-removable medium, a communication medium, and a storage medium.
- a communication medium may include computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any other form of an information delivery medium known in the art.
- a storage medium may include RAM, flash memory, ROM, erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), registers, hard disk, a removable disk, a compact disk read-only memory (“CD-ROM”), or any other form of a storage medium known in the art.
- memory 14 stores software modules that provide functionality when executed by processor 22 .
- the modules include an operating system 15 that provides operating system functionality for system 10 , as well as the rest of a mobile device in one embodiment.
- the modules further include a haptic conversion module 16 that converts an input into one or more haptic effects using segmenting and combining, as disclosed in more detail below.
- haptic conversion module 16 can comprise a plurality of modules, where each module provides specific individual functionality for converting an input into one or more haptic effects using segmenting and combining.
- System 10 will typically include one or more additional application modules 18 that provide additional functionality, such as Integrator™ software by Immersion Corporation.
- System 10 in embodiments that transmit and/or receive data from remote sources, further includes a communication device 20 , such as a network interface card, to provide mobile wireless network communication, such as infrared, radio, Wi-Fi, or cellular network communication.
- communication device 20 provides a wired network connection, such as an Ethernet connection or a modem.
- Processor 22 is further coupled via bus 12 to a display 24 , such as a Liquid Crystal Display (“LCD”), for displaying a graphical representation or user interface to a user.
- the display 24 may be a touch-sensitive input device, such as a touch screen, configured to send and receive signals from processor 22 , and may be a multi-touch touch screen.
- System 10 in one embodiment, further includes an actuator 26 .
- Processor 22 may transmit a haptic signal associated with a generated haptic effect to actuator 26 , which in turn outputs haptic effects such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects.
- Actuator 26 includes an actuator drive circuit.
- Actuator 26 may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a linear resonant actuator (“LRA”), a piezoelectric actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, or an ultrasonic vibration generator.
- system 10 can include one or more additional actuators, in addition to actuator 26 (not illustrated in FIG. 1 ).
- Actuator 26 is an example of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects, in response to a drive signal.
- actuator 26 can be replaced by some other type of haptic output device.
- system 10 may not include actuator 26 , and a separate device from system 10 includes an actuator, or other haptic output device, that generates the haptic effects, and system 10 sends generated haptic signals to that device through communication device 20 .
- System 10 in one embodiment, further includes a speaker 28 .
- Processor 22 may transmit an audio signal to speaker 28 , which in turn outputs audio effects.
- Speaker 28 may be, for example, a dynamic loudspeaker, an electrodynamic loudspeaker, a piezoelectric loudspeaker, a magnetostrictive loudspeaker, an electrostatic loudspeaker, a ribbon and planar magnetic loudspeaker, a bending wave loudspeaker, a flat panel loudspeaker, a Heil air motion transducer, a plasma arc speaker, or a digital loudspeaker.
- system 10 can include one or more additional speakers, in addition to speaker 28 (not illustrated in FIG. 1 ). Further, in other alternate embodiments, system 10 may not include speaker 28 , and a separate device from system 10 includes a speaker that outputs the audio effects, and system 10 sends audio signals to that device through communication device 20 .
- System 10 further includes a sensor 30 .
- Sensor 30 can be configured to detect a form of energy, or other physical property, such as, but not limited to, sound, movement, acceleration, bio signals, distance, flow, force/pressure/strain/bend, humidity, linear position, orientation/inclination, radio frequency, rotary position, rotary velocity, manipulation of a switch, temperature, vibration, or visible light intensity.
- Sensor 30 can further be configured to convert the detected energy, or other physical property, into an electrical signal, or any signal that represents virtual sensor information.
- Sensor 30 can be any device, such as, but not limited to, an accelerometer, an electrocardiogram, an electroencephalogram, an electromyograph, an electrooculogram, an electropalatograph, a galvanic skin response sensor, a capacitive sensor, a hall effect sensor, an infrared sensor, an ultrasonic sensor, a pressure sensor, a fiber optic sensor, a flexion sensor (or bend sensor), a force-sensitive resistor, a load cell, a LuSense CPS 2 155, a miniature pressure transducer, a piezo sensor, a strain gage, a hygrometer, a linear position touch sensor, a linear potentiometer (or slider), a linear variable differential transformer, a compass, an inclinometer, a magnetic tag (or radio frequency identification tag), a rotary encoder, a rotary potentiometer, a gyroscope, an on-off switch, or a temperature sensor (such as a thermometer, thermocouple, resistance temperature detector, …).
- system 10 can include one or more additional sensors, in addition to sensor 30 (not illustrated in FIG. 1 ).
- sensor 30 and the one or more additional sensors may be part of a sensor array, or some other type of collection of sensors.
- system 10 may not include sensor 30 , and a separate device from system 10 includes a sensor that detects a form of energy, or other physical property, and converts the detected energy, or other physical property, into an electrical signal, or other type of signal that represents virtual sensor information. The device can then send the converted signal to system 10 through communication device 20 .
- FIG. 2 illustrates a flow diagram of haptic conversion functionality performed by a system, according to an embodiment of the invention.
- the functionality of FIG. 2 as well as the functionality of FIG. 4 and the functionality of FIG. 7 , are each implemented by software stored in memory or other computer-readable or tangible media, and executed by a processor.
- each functionality may be performed by a haptic conversion module (such as haptic conversion module 16 of FIG. 1 ).
- each functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software.
- the haptic conversion functionality can include receiving one or more “chunks” (also identified as segments) of an input signal, processing each chunk of the input signal, and playing back the modified chunk of the input signal on a haptic output device, such as an actuator.
- the input can be an audio signal, or other type of audio input, that includes audio data.
- the input can be a video signal, or other type of video input, that includes video data.
- the input can be an acceleration signal, or other type of acceleration input, that includes acceleration data.
- the input can be an orientation signal that includes orientation data, an ambient light signal that includes ambient light data, or another type of signal that can be related to a media file, and that can also be sensed by a sensor, such as sensor 30 .
- the output of the sensor can be recorded beforehand, and can be provided along with the media file.
- the sensor may be, or may not be, attached to the system.
- the flow begins at 210 , where an input signal chunk is received.
- an input signal chunk is a segment of an input signal.
- the input signal chunk can include the entire input signal. The flow proceeds to 220 .
- the input signal chunk is filtered (e.g., band-pass filtered) to create a plurality of input sub-signals (also identified as frequency bands). More specifically, one or more filters (e.g., band-pass filters) can be applied to the input signal chunk to remove segments of the input signal chunk, so that the remaining segment of the input signal chunk includes one or more frequencies within a specific frequency band. The segment of the input signal chunk that remains after the application of the filter is identified as an input sub-signal, or a frequency band.
- each filter can correspond to a specific frequency band
- the plurality of filters can be applied to the input signal chunk in multiple passes (where a different filter is applied to the input signal chunk in each pass), and each filter can create a segment of the input signal chunk that includes one or more frequencies within a frequency band that corresponds to the filter.
- an input sub-signal can include the entire input signal chunk.
- a first filter can represent a low frequency band
- a second filter can represent a medium frequency band
- a third filter can represent a high frequency band.
- a first filter can be applied to an input signal chunk, and a first input sub-signal can be created, where the first input sub-signal includes one or more frequencies within a low frequency band.
- a second filter can then be applied to the input signal chunk, and a second input sub-signal can be created, where the second input sub-signal includes one or more frequencies within a medium frequency band.
- a third filter can then be applied to an input signal chunk, and a third input sub-signal can be created, where the third input sub-signal includes one or more frequencies within a high frequency band.
- the input sub-signals are illustrated in FIG. 2 as frequency bands 221 , 222 , 223 , and 224 .
- any number of input sub-signals can be created, and each input sub-signal can be defined to include one or more frequencies within any type of frequency band.
- An example implementation of input sub-signals is further described below in conjunction with FIG. 3 .
- the flow then proceeds to 230 .
- the input sub-signals are prioritized based on an analysis parameter. More specifically, one or more characteristics of each input sub-signal are analyzed, where the analysis parameter defines the one or more characteristics. Examples of the characteristics can include: frequency, duration, envelope, density, and magnitude.
- the input sub-signals can then be ordered (i.e., prioritized) based on the analyzed characteristics of each input sub-signal. For example, a prioritized list of the one or more input sub-signals can be generated, where the one or more input sub-signals are prioritized within the list based on the analyzed characteristics of each input sub-signal.
- an analysis parameter can be defined as a magnitude of an input sub-signal.
- each input sub-signal can be analyzed in order to determine a maximum magnitude value.
- the maximum magnitude value for each input sub-signal can be compared, and the input sub-signals can be prioritized based on the corresponding maximum magnitude values.
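Prioritization under a magnitude analysis parameter can be sketched as follows (a minimal illustration, assuming numpy; descending order by peak magnitude is the only policy the text implies):

```python
import numpy as np

def prioritize_by_magnitude(sub_signals):
    """Return the input sub-signals as a prioritized list, ordered by
    their maximum absolute magnitude, highest priority first."""
    return sorted(sub_signals,
                  key=lambda s: np.max(np.abs(s)),
                  reverse=True)
```

Other analysis parameters (frequency, duration, envelope, density) would simply substitute a different key function.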
- a haptic signal is calculated and generated based on one or more input sub-signals that are selected from the prioritized input sub-signals. More specifically, one or more input sub-signals are first selected from the prioritized input sub-signals. For example, an input sub-signal with the highest priority (or a plurality of input sub-signals with the highest priorities) can be selected from a prioritized list of the input sub-signals. Subsequently, a haptic signal is calculated based on the selected input sub-signal(s). More specifically, the haptic signal is calculated to include one or more characteristics of the selected input sub-signal(s).
- the haptic signal can be calculated to include all the characteristics of the selected input sub-signal(s). In embodiments where there are a plurality of input sub-signals, the haptic signal can be calculated, at least in part, based on a combination of the input sub-signals. The haptic signal is subsequently generated.
- an input sub-signal with a highest maximum magnitude value can be selected.
- the maximum magnitude value of the input sub-signal can be used to calculate a haptic signal.
- This haptic signal can subsequently be used to generate a haptic effect that is based on the input sub-signal with the highest magnitude.
- the three input sub-signals with the three highest maximum magnitude values can be selected.
- the maximum magnitude values of the three input sub-signals can be used to calculate a haptic signal.
- an average, or other calculation, of the three maximum magnitude values can be calculated. This average value, or other calculated value, can be used to generate a haptic signal.
- This haptic signal can subsequently be used to generate a haptic effect that is based on the three input sub-signals with the three highest magnitudes.
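The three-band averaging described above might be sketched as below; the flat average is just one of the "other calculations" the text allows, and k = 3 follows the example.

```python
import numpy as np

def haptic_magnitude_from_top_bands(sub_signals, k=3):
    """Average the k highest per-band peak magnitudes to obtain a single
    magnitude for the generated haptic signal."""
    peaks = sorted((np.max(np.abs(s)) for s in sub_signals), reverse=True)
    return float(np.mean(peaks[:k]))
```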
- the flow then proceeds to 250 .
- the generated haptic signal is “warped” into a “warped haptic signal.” More specifically, the generated haptic signal is input into a “warping” function, where a warping function can “warp” or transform the data contained within an input signal to create an output signal, where the output signal is also identified as a “warped signal.” Thus, by inputting the generated haptic signal into the warping function, the warping function can warp the data contained within the generated haptic signal.
- the warping of the data contained within the generated haptic signal can ultimately transform the generated haptic signal into the “warped haptic signal.”
- the warped haptic signal is more suitable for a specific haptic output device, and thus, the warped haptic signal can be played on the specific haptic output device in order to generate one or more haptic effects.
- the warping function can envelope an input haptic signal, or use a maximum magnitude value of the input haptic signal, to calculate a magnitude of an output haptic signal that is generated, where the magnitude of the output haptic signal correlates to the magnitude of a haptic effect generated by the output haptic signal.
- the warping function can play back the input haptic signal as a waveform, or perform another type of algorithm to convert the input haptic signal into an output haptic signal.
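A minimal warping function along these lines rescales the haptic signal so its peak matches a target actuator drive level; the linear rescale is an assumed, illustrative transform, not the patent's specified warping algorithm.

```python
import numpy as np

def warp(haptic_signal, actuator_peak=1.0):
    """Rescale the haptic signal so its maximum magnitude equals the
    target actuator drive level; a silent signal is returned unchanged."""
    peak = np.max(np.abs(haptic_signal))
    if peak == 0.0:
        return haptic_signal
    return haptic_signal * (actuator_peak / peak)
```

A device-specific warping function could instead shape the envelope, remap frequencies toward the actuator's resonance, or apply any other conversion suitable for the target haptic output device.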
- 250 can be omitted. The flow then proceeds to 260 .
- the haptic signal (either the warped haptic signal if 250 is performed, or the generated haptic signal if 250 is omitted) is sent to a haptic output device, such as actuator 26 of FIG. 1 , and the haptic output device generates one or more haptic effects based on the haptic signal.
- the one or more haptic effects can be generated based on the selected input sub-signal(s). The flow then ends.
- the haptic conversion functionality illustrated in FIG. 2 can be performed in real-time on a device configured to output haptic effects, such as a mobile device or touchscreen device.
- the haptic conversion functionality illustrated in FIG. 2 can be performed offline, by a computer or other type of computing machine, and a resulting haptic signal can be sent to the device that is configured to output the haptic effects.
- FIG. 3 illustrates a plurality of input sub-signals that are analyzed and converted into one or more haptic effects using frequency band analysis and prioritization, according to an embodiment of the invention.
- an input signal 301 can be filtered to create a plurality of input sub-signals using one or more filters, where the plurality of input sub-signals includes input sub-signals 302 , 303 , and 304 .
- Input sub-signal 302 represents a 200 Hz center frequency band-pass of an input signal, and can be created using a first filter.
- Input sub-signal 303 represents a 1000 Hz center frequency band-pass of an input signal, and can be created using a second filter.
- Input sub-signal 304 represents a 5000 Hz center frequency band-pass of an input signal, and can be created using a third filter.
- each input sub-signal can be divided into a plurality of segments or windows.
- example windows 310 , 320 , and 330 are illustrated, where windows 310 , 320 , and 330 each include a segment of input sub-signals 302 , 303 , and 304 .
- an input sub-signal can include other windows not illustrated in FIG. 3 .
- the windows of an input sub-signal can be in a sequential arrangement, where a subsequent window starts at a position when a preceding window ends.
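The sequential windowing described above can be produced as follows (window length is a free parameter; keeping a shorter final window is one reasonable convention, assumed here):

```python
def split_into_windows(signal, window_len):
    """Split a signal into sequential, non-overlapping windows; each
    window starts where the preceding one ends, and a shorter final
    window is retained."""
    return [signal[i:i + window_len]
            for i in range(0, len(signal), window_len)]
```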
- one or more characteristics of each input sub-signal can be analyzed based on an analysis parameter.
- one or more audio characteristics of input sub-signal 302 can be analyzed based on an analysis parameter.
- one or more audio characteristics of input sub-signals 303 and 304 can also be analyzed based on an analysis parameter.
- input sub-signals 302 , 303 , and 304 can be prioritized based on the analysis.
- an input sub-signal can be selected from input sub-signals 302 , 303 , and 304 based on the prioritization, and the selected input sub-signal can be converted into a haptic signal.
- analysis parameters can include: frequency, duration, envelope, density, and magnitude.
- an analysis parameter can be a magnitude.
- the magnitude of input sub-signals 302 , 303 , and 304 can be analyzed. Further, for window 310 , input sub-signal 302 can be selected because input sub-signal 302 has the highest magnitude. Input sub-signal 302 can subsequently be used to generate a haptic signal for window 310 .
- the magnitude of input sub-signals 302 , 303 , and 304 can be analyzed. Further, for window 320 , input sub-signal 302 can be selected because input sub-signal 302 has the highest magnitude.
- Input sub-signal 302 can subsequently be used to generate a haptic signal for window 320 .
- the magnitude of input sub-signals 302 , 303 , and 304 can be analyzed.
- input sub-signal 303 can be selected because input sub-signal 303 has the highest magnitude.
- Input sub-signal 303 can subsequently be used to generate a haptic signal for window 330 .
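The window-by-window magnitude selection walked through above can be sketched as below; the rectified segment standing in for the per-window haptic conversion is a toy assumption.

```python
import numpy as np

def select_per_window(band_signals, window_len):
    """For each window position, pick the band whose segment has the
    highest peak magnitude, and build the haptic signal from that
    segment (here: its rectified samples, as a toy conversion)."""
    n = min(len(b) for b in band_signals)
    out = np.empty(n)
    for start in range(0, n, window_len):
        end = min(start + window_len, n)
        segments = [b[start:end] for b in band_signals]
        best = max(segments, key=lambda s: np.max(np.abs(s)))
        out[start:end] = np.abs(best)
    return out
```

As in FIG. 3, the selected band can change from one window to the next as the dominant frequency content of the input shifts.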
- an analysis parameter can be a density.
- a density can be a power, or energy, of a signal, distributed over the different frequencies of the signal.
- the density of input sub-signals 302 , 303 , and 304 can be analyzed.
- input sub-signal 304 can be selected because input sub-signal 304 has the highest density.
- Input sub-signal 304 can subsequently be used to generate a haptic signal for window 310 .
- the density of input sub-signals 302 , 303 , and 304 can be analyzed.
- input sub-signal 304 can be selected because input sub-signal 304 has the highest density. Input sub-signal 304 can subsequently be used to generate a haptic signal for window 320 .
- the density of input sub-signals 302 , 303 , and 304 can be analyzed.
- input sub-signal 303 can be selected because input sub-signal 303 has the highest density. Input sub-signal 303 can subsequently be used to generate a haptic signal for window 330 .
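The window-by-window prioritization described above can be sketched in Python. This is an illustration only: the function name, window handling, and exact scoring formulas are assumptions, not taken from the patent; "magnitude" is scored as the peak absolute value in the window, and "density" as the window's power summed over its frequency bins.

```python
import numpy as np

def select_sub_signal(sub_signals, window_size, parameter="magnitude"):
    """For each time window, return the index of the sub-signal that scores
    highest on the analysis parameter (peak magnitude or spectral density)."""
    n = min(len(s) for s in sub_signals)
    choices = []
    for start in range(0, n, window_size):
        scores = []
        for s in sub_signals:
            win = np.asarray(s[start:start + window_size], dtype=float)
            if parameter == "magnitude":
                scores.append(np.max(np.abs(win)))
            else:  # "density": power distributed over the window's frequencies
                scores.append(np.sum(np.abs(np.fft.rfft(win)) ** 2) / len(win))
        choices.append(int(np.argmax(scores)))
    return choices
```

The selected index per window would then drive which sub-signal is converted into the haptic signal for that window.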
- FIG. 4 illustrates a flow diagram of haptic conversion functionality performed by a system, according to another embodiment of the invention.
- the haptic conversion functionality can be performed by a software program or algorithm that receives a multimedia file, such as an audio file or video file, as input, and generates a haptic signal as output.
- the software program or algorithm can be a stand-alone software program, a mobile application, or a plug-in for other audio/video editing tools, such as ProTools.
- the haptic conversion functionality can be performed by a multimedia player that receives a multimedia file and outputs the content of the multimedia file, where the content is augmented with one or more haptic effects.
- the multimedia player can be adapted for mobile devices or computers.
- the functionality may be performed by a haptic conversion module (such as haptic conversion module 16 of FIG. 1 ).
- Multimedia file 410 is any computer file that includes multimedia data.
- multimedia file 410 is a computer file that includes audio data.
- multimedia file 410 is a computer file that includes video data.
- multimedia file 410 is a computer file that includes both audio and video data.
- multimedia file 410 is a computer file that includes some other type of data.
- Multimedia file 410 can be streamed online or provided on a physical storage medium, such as a memory, disk, or other type of non-transitory computer-readable medium. Audio signal 420 (identified in FIG. 4 as audio track 420 ) is then extracted from multimedia file 410 .
- Audio signal 420 is an example of an input signal that can be extracted from multimedia file 410 .
- audio signal 420 can be replaced by another type of input signal, such as a video signal, an acceleration signal, an orientation signal, an ambient light signal, or another type of signal that can include data captured with a sensor.
- Audio signal 420 is subsequently filtered (e.g., band-pass filtered) to create a plurality of audio sub-signals, where an audio sub-signal is a type of input sub-signal (i.e., frequency band). More specifically, one or more filters (e.g., band-pass filters) can be applied to audio signal 420 to remove segments of audio signal 420 , so that the remaining segment of audio signal 420 includes one or more frequencies within a specific frequency band. The segment of audio signal 420 that remains after the application of the filter is identified as an audio sub-signal.
- one or more filters (e.g., band-pass filters) can be applied to audio signal 420 , where each filter can correspond to a specific frequency band
- the plurality of filters can be applied to audio signal 420 in multiple passes (where a different filter is applied to audio signal 420 in each pass), and each filter can create a segment of audio signal 420 that includes one or more frequencies within a frequency band that corresponds to the filter.
- the one or more filters are represented by band pass filters 430 , 440 , and 450 .
- any number of filters can be used.
- any type of filter can be used.
- an audio sub-signal can include the entirety of audio signal 420 .
- the cutoff frequencies defined for each filter can be chosen to produce contiguous, complementary input sub-signals (i.e., frequency bands) that cover most or all of the frequencies present in the original input signal.
- the choice of the number and bandwidths of the different filters can be relative to the nature of the input signal (e.g., audio signal 420 ), and can affect the creation of a haptic signal.
- the resonant frequency of a haptic output device can be used to define the number and bandwidths of the filters.
- three filters can be used to produce three input sub-signals, where the first input sub-signal includes content with frequencies lower than the resonant frequency of the haptic output device, the second input sub-signal includes content with frequencies around the resonant frequency of the haptic output device, and the third input sub-signal includes content with frequencies greater than the resonant frequency of the haptic output device.
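One way to realize the three-band split around the resonant frequency is sketched below. For simplicity this uses ideal FFT "brick-wall" masks in place of the band-pass filters the text describes (a real design might use, e.g., Butterworth filters); the function name, band width, and cutoff placement are assumptions.

```python
import numpy as np

def split_into_bands(signal, fs, resonant_hz, width_hz=50.0):
    """Split `signal` into three contiguous, complementary sub-signals:
    content below, around, and above the haptic output device's resonant
    frequency. Ideal FFT masks stand in for the patent's band-pass filters."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    lo, hi = resonant_hz - width_hz, resonant_hz + width_hz
    bands = []
    for mask in (freqs < lo,
                 (freqs >= lo) & (freqs <= hi),
                 freqs > hi):
        bands.append(np.fft.irfft(spectrum * mask, n=len(signal)))
    return bands  # [below, around, above]; they sum back to the input
```

Because the three masks partition the spectrum, the sub-signals are complementary in the sense described above: adding them reconstructs the original input.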
- audio sub-signals, or other types of input sub-signals, are converted into haptic sub-signals using a plurality of haptic conversion algorithms.
- the haptic conversion algorithms are represented by algorithms 431 , 441 , and 451 .
- any number of haptic conversion algorithms can be used.
- each audio sub-signal is converted into a haptic signal using a unique haptic conversion algorithm.
- the plurality of haptic conversion algorithms can convert the plurality of audio sub-signals, or other types of input sub-signals, into a plurality of haptic sub-signals.
- Example haptic conversion algorithms can include: (a) multiplying the audio sub-signal by a pre-determined factor and a sine carrier waveform executed at a haptic output device's resonant frequency; (b) multiplying the audio sub-signal by a pre-determined factor; or (c) shifting a frequency content of the audio sub-signal from a frequency band to another frequency band that surrounds the haptic output device's resonant frequency, and multiplying the shifted audio sub-signal by the original audio sub-signal to preserve the shape of the original audio sub-signal. Examples of frequency shifting algorithms are further described in conjunction with FIG. 5 .
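Algorithm (a), multiplying the sub-signal by a factor and a sine carrier at the actuator's resonant frequency, can be sketched as follows. The function name and parameters are illustrative assumptions; the idea is simply that modulating onto the resonant frequency places the drive energy where the actuator responds best.

```python
import numpy as np

def to_haptic_carrier(sub_signal, fs, resonant_hz, factor=1.0):
    """Sketch of algorithm (a): scale the sub-signal and modulate it onto a
    sine carrier running at the haptic output device's resonant frequency."""
    t = np.arange(len(sub_signal)) / fs
    carrier = np.sin(2.0 * np.pi * resonant_hz * t)
    return factor * np.asarray(sub_signal, dtype=float) * carrier
```

Algorithm (b) is the degenerate case with no carrier (scaling only), and algorithm (c) replaces the carrier with a frequency-shifted copy of the sub-signal itself.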
- the haptic sub-signals are subsequently mixed into a haptic signal using haptic mixer 460 .
- Haptic mixer 460 can mix the haptic sub-signals according to one of a number of mixing techniques. Example mixing techniques are further described in conjunction with FIG. 6 .
- the haptic signal can be normalized to 1 using its maximum absolute value (not illustrated in FIG. 4 ). In other embodiments, the normalizing can be omitted.
- one or more “noisy vibrations” can be cleaned from the haptic signal (not illustrated in FIG. 4 ).
- a “noisy vibration” is a segment of a haptic signal that includes a deviation from a pre-defined value, where the deviation is below a pre-defined threshold. This segment can be identified as “noise,” and can be removed from the haptic signal. This can cause the haptic signal to produce a “cleaner” (i.e., more adequate and compelling) haptic effect, when sent to a haptic output device.
- Different techniques can be used to clean the haptic signal of one or more "noisy vibrations" that can be identified as "noise."
- One technique takes a plurality of "chunks" or "windows" of samples from the haptic signal, calculates an average absolute value (or a maximum absolute value in another implementation) of the samples in each chunk, and identifies a chunk as "noise" if its calculated value is lower than a pre-defined threshold. All the samples in chunks identified as "noise" then have their values reduced to 0.
- Another technique uses interleaving time windows. For each time window, normalized values of the haptic signal are checked within two larger time windows: (a) the time window itself and the preceding time window; and (b) the time window itself and the succeeding time window.
- if the average normalized absolute value (or the maximum normalized absolute value in another implementation) is lower than a pre-defined threshold, the content in the time window is identified as "noise," and its value is reduced to 0.
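The first cleaning technique, chunk-wise thresholding of the mean absolute value, can be sketched as below. The function name, window size, and threshold are illustrative assumptions; the input signal is left untouched and a cleaned copy is returned.

```python
import numpy as np

def clean_noisy_vibrations(haptic_signal, window_size, threshold):
    """Zero out any window of the haptic signal whose mean absolute value
    falls below the threshold, i.e., treat it as a "noisy vibration"."""
    cleaned = np.array(haptic_signal, dtype=float)  # work on a copy
    for start in range(0, len(cleaned), window_size):
        window = cleaned[start:start + window_size]  # view into `cleaned`
        if np.mean(np.abs(window)) < threshold:
            window[:] = 0.0
    return cleaned
```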
- the cleaning of the one or more noisy vibrations from the haptic signal can be omitted.
- the haptic signal is sent to a device 470 , where device 470 is configured to generate and output one or more haptic effects based on the received haptic signal.
- Device 470 is further configured to receive an input signal from multimedia file 410 , where the input signal can be an audio signal, a video signal, a signal that contains both audio data and video data, or some other type of input signal.
- Device 470 can be further configured to generate and output one or more effects, such as audio effects, video effects, or other type of effects, based on the input signal. Further, device 470 can be further configured to generate the one or more haptic effects so that they “complement” the audio effects, video effects, or other type of effects.
- Device 470 can play different haptic effects and/or signals simultaneously given its configuration (i.e., number and position of haptic output devices).
- FIG. 5 illustrates an example of shifting a frequency of an input signal, according to an embodiment of the invention.
- a haptic conversion algorithm can convert an input signal into a haptic signal by shifting a frequency content of the input signal from a frequency band to another frequency band that surrounds a haptic output device's resonant frequency, and then multiplying the shifted input signal by the original input signal to preserve the shape of the original input signal.
- an input signal 510 is frequency-shifted into a haptic signal 520 . More specifically, input signal 510 includes frequency content within a frequency band of 300 Hz to 500 Hz.
- haptic signal 520 includes frequency content within the frequency band of 100 Hz to 200 Hz.
- the frequency-shifting is achieved using a fast Fourier transform of input signal 510 .
- one of multiple frequency-shifting techniques can be used, depending on the sizes of a "shift-from" frequency band (e.g., an original frequency band of input signal 510 ) and a "shift-to" frequency band (e.g., a shifted frequency band of haptic signal 520 ). This is because, depending on the sizes of the shift-from frequency band and the shift-to frequency band, either multiple frequencies within the shift-from frequency band are replaced with a single frequency within the shift-to frequency band, or a single frequency within the shift-from frequency band is replaced with multiple frequencies within the shift-to frequency band.
- each frequency of the shift-to frequency band represents multiple frequencies of the shift-from frequency band.
- the first frequency-shifting technique is to average the fast Fourier transform values of the multiple frequencies of the shift-from frequency band, and to assign the average to the corresponding frequency of the shift-to frequency band.
- the second frequency-shifting technique is to sum the fast Fourier transform values of the multiple frequencies of the shift-from frequency band, and to assign the sum to the corresponding frequency of the shift-to frequency band.
- the third frequency-shifting technique is to select a frequency of the shift-from frequency band that has the largest absolute fast Fourier transform value, ignore the other frequencies of the shift-from frequency band, and to assign the largest absolute fast Fourier transform value to the corresponding frequency of the shift-to frequency band.
- each frequency of the shift-from frequency band is represented by multiple frequencies of the shift-to frequency band.
- the first frequency-shifting technique is to assign the fast Fourier transform value of the frequency of the shift-from frequency band to the multiple frequencies of the shift-to frequency band.
- the second frequency-shifting technique is to assign the fast Fourier transform value of the frequency of the shift-from frequency band to a single frequency (e.g., a lowest frequency) of the shift-to frequency band.
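The first ("averaging") technique for the many-to-one case can be sketched as follows: each bin of the narrower shift-to band receives the average of the FFT values of the group of shift-from bins it represents. The function name, the use of `np.array_split` to form the groups, and the band arguments are illustrative assumptions; the sum- and max-based variants differ only in the reduction applied to each group.

```python
import numpy as np

def shift_band_average(signal, fs, src_band, dst_band):
    """Shift frequency content from src_band to dst_band via the FFT,
    averaging groups of source bins into each destination bin."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    src = np.where((freqs >= src_band[0]) & (freqs <= src_band[1]))[0]
    dst = np.where((freqs >= dst_band[0]) & (freqs <= dst_band[1]))[0]
    shifted = np.zeros_like(spectrum)
    groups = np.array_split(src, len(dst))  # several source bins per target bin
    for target, group in zip(dst, groups):
        shifted[target] = spectrum[group].mean()
    return np.fft.irfft(shifted, n=len(signal))
```

With the FIG. 5 example (content in 300-500 Hz shifted to 100-200 Hz), a 400 Hz tone lands near the middle of the shift-to band.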
- FIG. 6 illustrates a block diagram of a haptic mixer 610 configured to mix a plurality of haptic sub-signals, according to an embodiment of the invention.
- after an input signal, such as audio signal 620 , is converted into a plurality of haptic sub-signals, haptic mixer 610 can mix the haptic sub-signals into a haptic signal.
- Haptic mixer 610 can mix the haptic sub-signals according to one of a number of mixing techniques, where three example mixing techniques are illustrated in FIG. 6 at 630 .
- the haptic sub-signals are summed, and the sum of the haptic sub-signals is used to calculate a haptic signal.
- the haptic signal is subsequently normalized.
- each haptic sub-signal is segmented into a plurality of time windows.
- one or more dominant frequencies in the corresponding input signal are identified.
- a power spectrum density (“PSD”) value is calculated for each frequency in the original input signal and N frequencies having the highest PSD values are identified as being the “dominant frequencies,” where N is any number of frequencies. These N frequencies belong to the different bands represented by the different input sub-signals.
- the band having the highest number of frequencies in the group of the N dominant frequencies is identified as the dominant band. Its corresponding haptic sub-signal values are assigned to the resulting output haptic signal at the specific time window.
- each input sub-signal is segmented into a plurality of time windows.
- a PSD value is calculated per frequency band, and a PSD percentage contribution is also calculated per frequency band for each input sub-signal. More specifically, for each time window, a PSD value for each input sub-signal is calculated, an overall PSD value for the time window is calculated, and a ratio of an input sub-signal PSD value to the overall PSD value is calculated for each input sub-signal. The ratio of the input sub-signal PSD value to the overall PSD value is the PSD percentage contribution for that specific input sub-signal.
- each haptic sub-signal is weighted according to its corresponding input sub-signal PSD percentage contribution, the weighted haptic sub-signals are summed, and the haptic signal is calculated based on the weighted sum of haptic sub-signals of each time window.
- the third mixing technique stems from the fact that the frequency content can change significantly throughout an input sub-signal, and thus, can change significantly throughout a haptic sub-signal.
- if each haptic sub-signal is given equal weight, the mixing may not take into consideration situations in which an original input sub-signal had only a slight influence on the original input content.
- each haptic sub-signal will contribute to the haptic signal similar to its corresponding input sub-signal's contribution to the original input signal.
- the resulting haptic signal for a specific time window is thus the sum of all the haptic sub-signals, where the haptic sub-signals are each weighted by the contribution of its corresponding input sub-signal to the original input signal for the specific time window.
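The third (PSD-weighted) mixing technique can be sketched as below. As a simplification, each window's power is estimated as the sum of its squared samples, which by Parseval's theorem is proportional to the window's integrated PSD; the function name and windowing scheme are illustrative assumptions.

```python
import numpy as np

def mix_psd_weighted(input_subs, haptic_subs, window_size):
    """Per window, weight each haptic sub-signal by its input sub-signal's
    share of the total power in that window, then sum the weighted values."""
    n = len(haptic_subs[0])
    mixed = np.zeros(n)
    for start in range(0, n, window_size):
        end = min(start + window_size, n)
        powers = np.array([np.sum(np.square(s[start:end])) for s in input_subs])
        total = powers.sum()
        weights = powers / total if total > 0 else np.zeros_like(powers)
        for w, h in zip(weights, haptic_subs):
            mixed[start:end] += w * h[start:end]
    return mixed
```

A sub-signal that dominates the input in a given window thus dominates the haptic output for that window, and one with negligible input power contributes almost nothing.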
- FIG. 7 illustrates a flow diagram of the functionality of a haptic conversion module (such as haptic conversion module 16 of FIG. 1 ), according to an embodiment of the invention.
- the flow begins and proceeds to 710 .
- an input is received.
- a segment of an input signal can be received.
- a multimedia file can be received, and an input signal can be extracted from the multimedia file.
- the input signal can be an audio signal.
- the input signal can be a video signal.
- the input signal can be an acceleration signal.
- the flow then proceeds to 720 .
- the input is segmented into a plurality of input sub-signals.
- the input can be filtered using one or more filters, where each input sub-signal can include a frequency band. Further, in some of these embodiments, the one or more filters include at least one band-pass filter. The flow then proceeds to 730 .
- the input sub-signals are converted into a haptic signal.
- the input sub-signals can be prioritized based on an analysis parameter.
- One or more input sub-signals can be selected, and the haptic signal can be generated based on the selected input sub-signals.
- the analysis parameter can include a characteristic of the input sub-signals.
- the characteristic of the input sub-signals can include one of: a frequency, a duration, an envelope, a density, or a magnitude.
- the haptic signal can be warped into a warped haptic signal, where one or more haptic effects can be generated based on the warped haptic signal.
- the input sub-signals can be converted into haptic sub-signals.
- the input sub-signals can be converted into haptic sub-signals by at least one of: multiplying an input sub-signal by a factor and a sine carrier waveform; multiplying the input sub-signal by a factor; or shifting frequency content from a first frequency band of the input sub-signal to a second frequency band of the input sub-signal.
- each input sub-signal can be converted into a haptic sub-signal using a unique haptic conversion algorithm.
- a fast Fourier transform of the input sub-signal can be performed.
- the shifting can include at least one of: averaging a plurality of fast Fourier transform values for a plurality of frequencies within the first frequency band of the input sub-signal, and assigning the average to a frequency within the second frequency band of the input sub-signal; summing a plurality of fast Fourier transform values for a plurality of frequencies within the first frequency band of the input sub-signal, and assigning the sum to a frequency within the second frequency band of the input sub-signal; selecting a fast Fourier transform value that has a highest absolute value from a plurality of fast Fourier transform values for a plurality of frequencies within the first frequency band of the input sub-signal, and assigning the selected fast Fourier transform value to a frequency within the second frequency band of the input sub-signal; assigning a fast Fourier transform value for a frequency within the first frequency band of the input sub-signal to a plurality of frequencies within the second frequency band of the input sub-signal; or assigning a fast Fourier transform value for a frequency within the first frequency band of the input sub-signal to a single frequency within the second frequency band of the input sub-signal.
- the haptic sub-signals can then be mixed into the haptic signal.
- the mixing can include one of: summing the plurality of haptic sub-signals into the haptic signal and normalizing the haptic signal; segmenting the input signal into one or more time-windows, analyzing a plurality of frequencies of the plurality of input sub-signals for each time-window, and selecting a haptic sub-signal from the plurality of haptic sub-signals as the haptic signal for each time-window, where the selected haptic sub-signal includes a frequency of the plurality of frequencies; or segmenting the haptic signal into one or more time windows, calculating a power spectrum density percentage contribution for each input sub-signal of the plurality of input sub-signals for each time window, and calculating a weighted combination of the plurality of haptic sub-signals as the haptic signal for each time-window, where a weight of each haptic sub-signal is the power spectrum density percentage contribution of its corresponding input sub-signal.
- the haptic signal can be normalized to 1 using its maximum absolute value. Further, in certain embodiments, one or more noisy vibrations can be cleaned from the haptic signal. In some of these embodiments, one or more sample haptic sub-signals can be selected from the haptic signal. A mean absolute value can be calculated for each selected sample haptic sub-signal of the one or more selected sample haptic sub-signals. Each mean absolute value can be compared with a threshold. A sample haptic sub-signal can be removed from the haptic signal when its corresponding mean absolute value is less than the threshold. The flow then proceeds to 740 .
- one or more haptic effects are generated based on the haptic signal.
- the haptic signal can be sent to a haptic output device to generate the one or more haptic effects.
- the haptic output device can be an actuator. The flow then ends.
- a system can filter an input into one or more frequency bands, analyze and prioritize the one or more frequency bands based on one or more pre-determined analysis parameters, select at least one of the frequency bands based on the prioritization, and can use the selected frequency band(s) to generate a haptic signal that is ultimately used to generate one or more haptic effects.
- This can produce a haptic effect that is more "customized" to the input, rather than a haptic effect produced by merely selecting a frequency band of the input without first analyzing the multiple frequency bands of the input.
- the system can further “perfect” the haptic effect that can be generated to “complement” the input.
- such a solution can be an elegant solution that can be extended by future haptic conversion algorithms.
- a system can filter an input into one or more frequency bands, convert each frequency band into a haptic sub-signal, and mix the haptic sub-signals into a haptic signal that is ultimately used to generate one or more haptic effects.
- the system can create a compelling haptic effect that “complements” the input without the need of any human intervention, such as authored effects.
- Such a system can be implemented on any mobile device, such as a tablet or smartphone, and can use the processing power of the device as well as a haptic playback component of the device to deliver a richer experience, such as a richer video viewing experience or a richer music listening experience.
- the system can attempt to create a haptic signal based on all of the input, rather than a specific segment of the input, such as a low frequency content of the input. As previously described, this can be accomplished by regrouping the content of the input given the existing frequencies, and processing each group adequately according to the group's frequency profile, and according to the haptic playback device. Such an approach can produce a more "perfect" haptic effect that "complements" the input.
Description
- This application is a continuation of U.S. patent application Ser. No. 14/020,461, filed on Sep. 6, 2013, the specification of which is hereby incorporated by reference.
- One embodiment is directed generally to a device, and more particularly, to a device that produces haptic effects.
- Haptics is a tactile and force feedback technology that takes advantage of a user's sense of touch by applying haptic feedback effects (i.e., “haptic effects”), such as forces, vibrations, and motions, to the user. Devices, such as mobile devices, touchscreen devices, and personal computers, can be configured to generate haptic effects. In general, calls to embedded hardware capable of generating haptic effects (such as actuators) can be programmed within an operating system (“OS”) of the device. These calls specify which haptic effect to play. For example, when a user interacts with the device using, for example, a button, touchscreen, lever, joystick, wheel, or some other control, the OS of the device can send a play command through control circuitry to the embedded hardware. The embedded hardware then produces the appropriate haptic effect.
- Devices can be configured to coordinate the output of haptic effects with the output of other content, such as audio, so that the haptic effects are incorporated into the other content. For example, an audio effect developer can develop audio effects that can be output by the device, such as machine gun fire, explosions, or car crashes. Further, other types of content, such as video effects, can be developed and subsequently output by the device. A haptic effect developer can subsequently author a haptic effect for the device, and the device can be configured to output the haptic effect along with the other content. However, such a process generally requires the individual judgment of the haptic effect developer to author a haptic effect that correctly complements the audio effect, or other type of content. A poorly-authored haptic effect that does not complement the audio effect, or other type of content, can produce an overall dissonant effect where the haptic effect does not "mesh" with the audio effect or other content. This type of user experience is generally not desired.
- One embodiment is a system that converts an input into one or more haptic effects using segmenting and combining. The system receives an input. The system further segments the input into a plurality of input sub-signals. The system further converts the plurality of input sub-signals into a single haptic signal, or multiple haptic signals that can either be played separately on different haptic output devices, or mixed into a single haptic signal. The system further generates the one or more haptic effects based on the haptic signal.
- Further embodiments, details, advantages, and modifications will become apparent from the following detailed description of the preferred embodiments, which is to be taken in conjunction with the accompanying drawings.
- FIG. 1 illustrates a block diagram of a system in accordance with one embodiment of the invention.
- FIG. 2 illustrates a flow diagram of haptic conversion functionality performed by a system, according to an embodiment of the invention.
- FIG. 3 illustrates a plurality of input sub-signals that are analyzed and converted into one or more haptic effects using frequency band analysis and prioritization, according to an embodiment of the invention.
- FIG. 4 illustrates a flow diagram of haptic conversion functionality performed by a system, according to another embodiment of the invention.
- FIG. 5 illustrates an example of shifting a frequency of an input signal, according to an embodiment of the invention.
- FIG. 6 illustrates a block diagram of a haptic mixer configured to mix a plurality of haptic sub-signals, according to an embodiment of the invention.
- FIG. 7 illustrates a flow diagram of the functionality of a haptic conversion module, according to an embodiment of the invention.
- One embodiment is a system that can convert an input, such as an audio signal, into a haptic signal that can be used to generate haptic effects. The system can filter the input into multiple frequency bands, where each frequency band includes a sub-signal of the input, and the system can further prioritize the multiple frequency bands based on one or more parameters of analysis. In general, the system can create a prioritized list of the frequency bands. The system can further select one or more frequency bands from the prioritized list, and use the selected frequency band(s) to generate the haptic signal, where the haptic signal is based upon, at least in part, a combination of the selected frequency band(s). For example, the system can band-pass filter an audio signal into four frequency bands, and can create a haptic signal based on the frequency band that contains an audio sub-signal with the highest magnitude. The haptic conversion functionality of the system can be implemented either as offline functionality that provides an output file that can be played back on a device, or as an algorithm that performs the processing at playback time.
- Another embodiment is a system that can convert an input, such as an audio signal, into a haptic signal that can be used to generate haptic effects. The system can read a multimedia file (e.g., audio file or video file), and extract an input signal, such as an audio signal, from the multimedia file. The system can filter the input signal into one or more input sub-signals. For example, the system can use different band pass filters with complementary cutoff frequencies to segment the input signal into different complementary input sub-signals. The system can then create a haptic sub-signal for each input sub-signal. The system can further mix, or otherwise combine, the haptic sub-signals into an overall haptic signal that corresponds to the original input signal. The haptic conversion functionality of the system can be implemented as a software module, a mobile application, or a plug-in for audio/video players and editing tools.
-
FIG. 1 illustrates a block diagram of asystem 10 in accordance with one embodiment of the invention. In one embodiment,system 10 is part of a mobile device, andsystem 10 provides a haptic conversion functionality for the mobile device. In another embodiment,system 10 is part of a wearable device, andsystem 10 provides a haptic conversion functionality for the wearable device. Examples of wearable devices include wrist bands, headbands, eyeglasses, rings, leg bands, arrays integrated into clothing, or any other type of device that a user may wear on a body or can be held by a user. Some wearable devices can be “haptically enabled,” meaning they include mechanisms to generate haptic effects. In another embodiment,system 10 is separate from the device (e.g., a mobile device or a wearable device), and remotely provides the haptic conversion functionality for the device. Although shown as a single system, the functionality ofsystem 10 can be implemented as a distributed system.System 10 includes abus 12 or other communication mechanism for communicating information, and aprocessor 22 coupled tobus 12 for processing information.Processor 22 may be any type of general or specific purpose processor.System 10 further includes amemory 14 for storing information and instructions to be executed byprocessor 22.Memory 14 can be comprised of any combination of random access memory (“RAM”), read only memory (“ROM”), static storage such as a magnetic or optical disk, or any other type of computer-readable medium. - A computer-readable medium may be any available medium that can be accessed by
processor 22 and may include both a volatile and nonvolatile medium, a removable and non-removable medium, a communication medium, and a storage medium. A communication medium may include computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any other form of an information delivery medium known in the art. A storage medium may include RAM, flash memory, ROM, erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), registers, hard disk, a removable disk, a compact disk read-only memory (“CD-ROM”), or any other form of a storage medium known in the art. - In one embodiment,
memory 14 stores software modules that provide functionality when executed byprocessor 22. The modules include anoperating system 15 that provides operating system functionality forsystem 10, as well as the rest of a mobile device in one embodiment. The modules further include ahaptic conversion module 16 that converts an input into one or more haptic effects using segmenting and combining, as disclosed in more detail below. In certain embodiments,haptic conversion module 16 can comprise a plurality of modules, where each module provides specific individual functionality for converting an input into one or more haptic effects using segmenting and combining.System 10 will typically include one or moreadditional application modules 18 to include additional functionality, such as Integrator™ software by Immersion Corporation. -
System 10, in embodiments that transmit and/or receive data from remote sources, further includes acommunication device 20, such as a network interface card, to provide mobile wireless network communication, such as infrared, radio, Wi-Fi, or cellular network communication. In other embodiments,communication device 20 provides a wired network connection, such as an Ethernet connection or a modem. -
Processor 22 is further coupled via bus 12 to a display 24, such as a Liquid Crystal Display (“LCD”), for displaying a graphical representation or user interface to a user. Display 24 may be a touch-sensitive input device, such as a touch screen, configured to send and receive signals from processor 22, and may be a multi-touch touch screen. -
System 10, in one embodiment, further includes an actuator 26. Processor 22 may transmit a haptic signal associated with a generated haptic effect to actuator 26, which in turn outputs haptic effects such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects. Actuator 26 includes an actuator drive circuit. Actuator 26 may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a linear resonant actuator (“LRA”), a piezoelectric actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, or an ultrasonic vibration generator. In alternate embodiments, system 10 can include one or more additional actuators, in addition to actuator 26 (not illustrated in FIG. 1). Actuator 26 is an example of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, electrostatic friction haptic effects, or deformation haptic effects, in response to a drive signal. In alternate embodiments, actuator 26 can be replaced by some other type of haptic output device. Further, in other alternate embodiments, system 10 may not include actuator 26, and a separate device from system 10 includes an actuator, or other haptic output device, that generates the haptic effects, and system 10 sends generated haptic signals to that device through communication device 20. -
System 10, in one embodiment, further includes a speaker 28. Processor 22 may transmit an audio signal to speaker 28, which in turn outputs audio effects. Speaker 28 may be, for example, a dynamic loudspeaker, an electrodynamic loudspeaker, a piezoelectric loudspeaker, a magnetostrictive loudspeaker, an electrostatic loudspeaker, a ribbon and planar magnetic loudspeaker, a bending wave loudspeaker, a flat panel loudspeaker, a Heil air motion transducer, a plasma arc speaker, or a digital loudspeaker. In alternate embodiments, system 10 can include one or more additional speakers, in addition to speaker 28 (not illustrated in FIG. 1). Further, in other alternate embodiments, system 10 may not include speaker 28, and a separate device from system 10 includes a speaker that outputs the audio effects, and system 10 sends audio signals to that device through communication device 20. -
System 10, in one embodiment, further includes a sensor 30. Sensor 30 can be configured to detect a form of energy, or other physical property, such as, but not limited to, sound, movement, acceleration, bio signals, distance, flow, force/pressure/strain/bend, humidity, linear position, orientation/inclination, radio frequency, rotary position, rotary velocity, manipulation of a switch, temperature, vibration, or visible light intensity. Sensor 30 can further be configured to convert the detected energy, or other physical property, into an electrical signal, or any signal that represents virtual sensor information. Sensor 30 can be any device, such as, but not limited to, an accelerometer, an electrocardiogram, an electroencephalogram, an electromyograph, an electrooculogram, an electropalatograph, a galvanic skin response sensor, a capacitive sensor, a Hall effect sensor, an infrared sensor, an ultrasonic sensor, a pressure sensor, a fiber optic sensor, a flexion sensor (or bend sensor), a force-sensitive resistor, a load cell, a LuSense CPS2 155, a miniature pressure transducer, a piezo sensor, a strain gage, a hygrometer, a linear position touch sensor, a linear potentiometer (or slider), a linear variable differential transformer, a compass, an inclinometer, a magnetic tag (or radio frequency identification tag), a rotary encoder, a rotary potentiometer, a gyroscope, an on-off switch, a temperature sensor (such as a thermometer, thermocouple, resistance temperature detector, thermistor, or temperature-transducing integrated circuit), a microphone, a photometer, an altimeter, a bio monitor, a camera, or a light-dependent resistor. In alternate embodiments, system 10 can include one or more additional sensors, in addition to sensor 30 (not illustrated in FIG. 1). In some of these embodiments, sensor 30 and the one or more additional sensors may be part of a sensor array, or some other type of collection of sensors. 
Further, in other alternate embodiments, system 10 may not include sensor 30, and a separate device from system 10 includes a sensor that detects a form of energy, or other physical property, and converts the detected energy, or other physical property, into an electrical signal, or other type of signal that represents virtual sensor information. The device can then send the converted signal to system 10 through communication device 20. -
FIG. 2 illustrates a flow diagram of haptic conversion functionality performed by a system, according to an embodiment of the invention. In one embodiment, the functionality of FIG. 2, as well as the functionality of FIG. 4 and the functionality of FIG. 7, are each implemented by software stored in memory or other computer-readable or tangible media, and executed by a processor. In this embodiment, each functionality may be performed by a haptic conversion module (such as haptic conversion module 16 of FIG. 1). In other embodiments, each functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software. - The haptic conversion functionality can include receiving one or more “chunks” (also identified as segments) of an input signal, processing each chunk of the input signal, and playing back the modified chunk of the input signal on a haptic output device, such as an actuator. In certain embodiments, the input can be an audio signal, or other type of audio input, that includes audio data. In other alternate embodiments, the input can be a video signal, or other type of video input, that includes video data. In yet other alternate embodiments, the input can be an acceleration signal, or other type of acceleration input, that includes acceleration data. In yet other alternate embodiments, the input can be an orientation signal that includes orientation data, an ambient light signal that includes ambient light data, or another type of signal that can be related to a media file, and that can also be sensed by a sensor, such as
sensor 30. The output of the sensor can be recorded beforehand, and can be provided along with the media file. Thus, the sensor may or may not be attached to the system. - According to the embodiment, the flow begins at 210, where an input signal chunk is received. As previously described, an input signal chunk is a segment of an input signal. In one embodiment, the input signal chunk can include the entire input signal. The flow proceeds to 220.
- At 220, the input signal chunk is filtered (e.g., band-pass filtered) to create a plurality of input sub-signals (also identified as frequency bands). More specifically, one or more filters (e.g., band-pass filters) can be applied to the input signal chunk to remove segments of the input signal chunk, so that the remaining segment of the input signal chunk includes one or more frequencies within a specific frequency band. The segment of the input signal chunk that remains after the application of the filter is identified as an input sub-signal, or a frequency band. In embodiments involving a plurality of filters, each filter can correspond to a specific frequency band, the plurality of filters can be applied to the input signal chunk in multiple passes (where a different filter is applied to the input signal chunk in each pass), and each filter can create a segment of the input signal chunk that includes one or more frequencies within a frequency band that corresponds to the filter. In certain embodiments, an input sub-signal can include the entire input signal chunk.
- For example, a first filter can represent a low frequency band, a second filter can represent a medium frequency band, and a third filter can represent a high frequency band. The first filter can be applied to an input signal chunk to create a first input sub-signal, where the first input sub-signal includes one or more frequencies within the low frequency band. The second filter can then be applied to the input signal chunk to create a second input sub-signal, where the second input sub-signal includes one or more frequencies within the medium frequency band. The third filter can then be applied to the input signal chunk to create a third input sub-signal, where the third input sub-signal includes one or more frequencies within the high frequency band.
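As an illustration only, and not the patent's implementation, the three-band split described above can be sketched in Python with a naive DFT-based band-pass; the sample rate, cutoff frequencies, and test tones are all assumed for the example:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2); fine for short chunks)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def band_pass(chunk, sample_rate, low_hz, high_hz):
    """Keep only the DFT bins whose frequency falls in [low_hz, high_hz)."""
    n = len(chunk)
    X = dft(chunk)
    for k in range(n):
        freq = k * sample_rate / n
        if freq > sample_rate / 2:          # mirror the negative-frequency bins
            freq = sample_rate - freq
        if not (low_hz <= freq < high_hz):
            X[k] = 0
    return idft(X)

# One chunk containing a 250 Hz and a 1000 Hz tone, split into three bands.
rate = 8000
chunk = [math.sin(2 * math.pi * 250 * t / rate) +
         math.sin(2 * math.pi * 1000 * t / rate) for t in range(64)]
low = band_pass(chunk, rate, 0, 500)        # keeps the 250 Hz content
medium = band_pass(chunk, rate, 500, 2000)  # keeps the 1000 Hz content
high = band_pass(chunk, rate, 2000, 4000)   # nearly silent for this chunk
```

A production implementation would more likely use time-domain IIR band-pass filters or an optimized FFT; the point here is only that each filter pass keeps the frequencies of one band and removes the rest.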
- The input sub-signals (i.e., frequency bands) that are created at 220 are illustrated in
FIG. 2 as frequency bands 221-224, and example input sub-signals are further described below in conjunction with FIG. 3. The flow then proceeds to 230. - At 230, the input sub-signals (e.g., frequency bands 221-224) are prioritized based on an analysis parameter. More specifically, one or more characteristics of each input sub-signal are analyzed, where the analysis parameter defines the one or more characteristics. Examples of the characteristics can include: frequency, duration, envelope, density, and magnitude. The input sub-signals can then be ordered (i.e., prioritized) based on the analyzed characteristics of each input sub-signal. For example, a prioritized list of the one or more input sub-signals can be generated, where the one or more input sub-signals are prioritized within the list based on the analyzed characteristics of each input sub-signal.
- As an example, an analysis parameter can be defined as a magnitude of an input sub-signal. A plurality of input sub-signals (i.e., frequency bands) can be created, and each input sub-signal can be analyzed in order to determine a maximum magnitude value. The maximum magnitude value for each input sub-signal can be compared, and the input sub-signals can be prioritized based on the corresponding maximum magnitude values.
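A minimal sketch of this prioritization, assuming sub-signals are plain lists of samples and the analysis parameter is maximum magnitude (the function and band names are illustrative, not from the patent):

```python
def prioritize(sub_signals):
    """Order input sub-signals by maximum absolute magnitude, highest first.
    `sub_signals` maps a frequency-band label to its list of samples."""
    return sorted(sub_signals,
                  key=lambda band: max(abs(s) for s in sub_signals[band]),
                  reverse=True)

bands = {
    "low":    [0.1, -0.9, 0.4],   # maximum magnitude 0.9
    "medium": [0.2,  0.3, -0.1],  # maximum magnitude 0.3
    "high":   [0.5, -0.6, 0.2],   # maximum magnitude 0.6
}
print(prioritize(bands))  # ['low', 'high', 'medium']
```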
- An example implementation of analyzing and prioritizing input sub-signals using an analysis parameter is further described below in conjunction with
FIG. 3. The flow then proceeds to 240. - At 240, a haptic signal is calculated and generated based on one or more input sub-signals that are selected from the prioritized input sub-signals. More specifically, one or more input sub-signals are first selected from the prioritized input sub-signals. For example, an input sub-signal with the highest priority (or a plurality of input sub-signals with the highest priorities) can be selected from a prioritized list of the input sub-signals. Subsequently, a haptic signal is calculated based on the selected input sub-signal(s). More specifically, the haptic signal is calculated to include one or more characteristics of the selected input sub-signal(s). In certain embodiments, the haptic signal can be calculated to include all the characteristics of the selected input sub-signal(s). In embodiments where there are a plurality of input sub-signals, the haptic signal can be calculated, at least in part, based on a combination of the input sub-signals. The haptic signal is subsequently generated.
- As an example, an input sub-signal with a highest maximum magnitude value can be selected. The maximum magnitude value of the input sub-signal can be used to calculate a haptic signal. This haptic signal can subsequently be used to generate a haptic effect that is based on the input sub-signal with the highest magnitude. As another example, the three input sub-signals with the three highest maximum magnitude values can be selected. The maximum magnitude values of the three input sub-signals can be used to calculate a haptic signal. For example, an average, or other calculation, of the three maximum magnitude values can be calculated. This average value, or other calculated value, can be used to generate a haptic signal. This haptic signal can subsequently be used to generate a haptic effect that is based on the three input sub-signals with the three highest magnitudes. The flow then proceeds to 250.
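Under the same assumption that sub-signals are lists of samples, the second example above — selecting the three highest maximum magnitudes and averaging them — could be sketched as follows (names and values are invented for the example):

```python
def haptic_magnitude(sub_signals, n=3):
    """Average the maximum magnitudes of the n highest-magnitude sub-signals."""
    maxima = sorted((max(abs(s) for s in sig) for sig in sub_signals),
                    reverse=True)
    top = maxima[:n]
    return sum(top) / len(top)

subs = [[0.2, -0.8], [0.1, 0.4], [0.6, -0.5], [0.05, 0.0]]
# The three highest maxima are 0.8, 0.6, and 0.4, so the average is 0.6.
print(round(haptic_magnitude(subs), 6))  # 0.6
```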
- At 250, the generated haptic signal is “warped” into a “warped haptic signal.” More specifically, the generated haptic signal is input into a “warping” function, where a warping function can “warp” or transform the data contained within an input signal to create an output signal, where the output signal is also identified as a “warped signal.” Thus, by inputting the generated haptic signal into the warping function, the warping function can warp the data contained within the generated haptic signal. The warping of the data contained within the generated haptic signal can ultimately transform the generated haptic signal into the “warped haptic signal.” In certain embodiments, the warped haptic signal is more suitable for a specific haptic output device, and thus, the warped haptic signal can be played on the specific haptic output device in order to generate one or more haptic effects.
- In one example, for an LRA or an ERM, the warping function can envelope an input haptic signal, or use a maximum magnitude value of the input haptic signal, to calculate a magnitude of an output haptic signal that is generated, where the magnitude of the output haptic signal correlates to the magnitude of a haptic effect generated by the output haptic signal. In another example, for a piezoelectric actuator, a high bandwidth actuator, or an EAP actuator, the warping function can play back the input haptic signal as a waveform, or perform another type of algorithm to convert the input haptic signal into an output haptic signal.
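For the enveloping case, one hedged sketch (the window size and sample values are invented for the example) replaces each window of the haptic signal with its maximum absolute value, yielding a magnitude profile that an ERM or LRA drive circuit could play:

```python
def warp_for_magnitude_actuator(haptic_signal, window=4):
    """Replace each window of the input haptic signal with its maximum
    absolute value, producing a magnitude envelope per window."""
    out = []
    for start in range(0, len(haptic_signal), window):
        chunk = haptic_signal[start:start + window]
        out.extend([max(abs(s) for s in chunk)] * len(chunk))
    return out

sig = [0.1, -0.7, 0.2, 0.0,   # first window peaks at 0.7
       0.5, -0.2, 0.9, 0.3]   # second window peaks at 0.9
print(warp_for_magnitude_actuator(sig))
# [0.7, 0.7, 0.7, 0.7, 0.9, 0.9, 0.9, 0.9]
```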
- In certain embodiments, 250 can be omitted. The flow then proceeds to 260.
- At 260, the haptic signal (either the warped haptic signal if 250 is performed, or the generated haptic signal if 250 is omitted) is sent to a haptic output device, such as
actuator 26 of FIG. 1, and the haptic output device generates one or more haptic effects based on the haptic signal. Thus, the one or more haptic effects can be generated based on the selected input sub-signal(s). The flow then ends. - In certain embodiments, the haptic conversion functionality illustrated in
FIG. 2 can be performed in real-time on a device configured to output haptic effects, such as a mobile device or touchscreen device. In other alternate embodiments, the haptic conversion functionality illustrated in FIG. 2 can be performed offline, by a computer or other type of computing machine, and a resulting haptic signal can be sent to the device that is configured to output the haptic effects. -
FIG. 3 illustrates a plurality of input sub-signals that are analyzed and converted into one or more haptic effects using frequency band analysis and prioritization, according to an embodiment of the invention. As previously described, an input signal 301 can be filtered to create a plurality of input sub-signals using one or more filters, where the plurality of input sub-signals includes input sub-signals 302, 303, and 304. Input sub-signal 302 represents a 200 Hz center frequency band-pass of the input signal, and can be created using a first filter. Input sub-signal 303 represents a 1000 Hz center frequency band-pass of the input signal, and can be created using a second filter. Input sub-signal 304 represents a 5000 Hz center frequency band-pass of the input signal, and can be created using a third filter. - According to the embodiment, each input sub-signal can be divided into a plurality of segments or windows. In the illustrated embodiment,
example windows 310, 320, and 330 are illustrated for each of input sub-signals 302, 303, and 304 of FIG. 3. Further, in certain embodiments, the windows of an input sub-signal can be in a sequential arrangement, where a subsequent window starts at a position when a preceding window ends. - According to the embodiment, for each window, one or more characteristics of each input sub-signal can be analyzed based on an analysis parameter. Thus, in the illustrated embodiment, for each window of
windows 310, 320, and 330, one or more characteristics of input sub-signals 302, 303, and 304 can be analyzed, and for each of windows 310, 320, and 330, one of input sub-signals 302, 303, and 304 can be selected based on the analyzed characteristics. - In one example, an analysis parameter can be a magnitude. In this example, for
window 310, the magnitude of input sub-signals 302, 303, and 304 can be analyzed. For window 310, input sub-signal 302 can be selected because input sub-signal 302 has the highest magnitude. Input sub-signal 302 can subsequently be used to generate a haptic signal for window 310. Likewise, for window 320, the magnitude of input sub-signals 302, 303, and 304 can be analyzed. For window 320, input sub-signal 302 can be selected because input sub-signal 302 has the highest magnitude. Input sub-signal 302 can subsequently be used to generate a haptic signal for window 320. Similarly, for window 330, the magnitude of input sub-signals 302, 303, and 304 can be analyzed. For window 330, input sub-signal 303 can be selected because input sub-signal 303 has the highest magnitude. Input sub-signal 303 can subsequently be used to generate a haptic signal for window 330. - In one example, an analysis parameter can be a density. In one embodiment, a density can be a power, or energy, of a signal, distributed over the different frequencies of the signal. In this example, for
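The per-window, magnitude-based selection described above can be sketched as follows; the window length and sample values are invented, and the list indices 0-2 stand in for input sub-signals 302-304:

```python
def select_per_window(sub_signals, window):
    """For each time window, return the index of the sub-signal with the
    highest maximum magnitude in that window (magnitude analysis parameter)."""
    n = len(sub_signals[0])
    picks = []
    for start in range(0, n, window):
        peaks = [max(abs(s) for s in sig[start:start + window])
                 for sig in sub_signals]
        picks.append(peaks.index(max(peaks)))
    return picks

sub_302 = [0.9, 0.8, 0.7, 0.6, 0.1, 0.1]   # dominates the first two windows
sub_303 = [0.2, 0.1, 0.3, 0.2, 0.8, 0.9]   # dominates the last window
sub_304 = [0.1, 0.1, 0.2, 0.1, 0.2, 0.3]
print(select_per_window([sub_302, sub_303, sub_304], window=2))  # [0, 0, 1]
```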
window 310, the density of input sub-signals 302, 303, and 304 can be analyzed. For window 310, input sub-signal 304 can be selected because input sub-signal 304 has the highest density. Input sub-signal 304 can subsequently be used to generate a haptic signal for window 310. Likewise, for window 320, the density of input sub-signals 302, 303, and 304 can be analyzed. For window 320, input sub-signal 304 can be selected because input sub-signal 304 has the highest density. Input sub-signal 304 can subsequently be used to generate a haptic signal for window 320. Similarly, for window 330, the density of input sub-signals 302, 303, and 304 can be analyzed. For window 330, input sub-signal 303 can be selected because input sub-signal 303 has the highest density. Input sub-signal 303 can subsequently be used to generate a haptic signal for window 330. -
FIG. 4 illustrates a flow diagram of haptic conversion functionality performed by a system, according to another embodiment of the invention. According to an embodiment, the haptic conversion functionality can be performed by a software program or algorithm that receives a multimedia file, such as an audio file or video file, as input, and generates a haptic signal as output. The software program or algorithm can be a stand-alone software program, a mobile application, or a plug-in for other audio/video editing tools, such as ProTools. In another embodiment, the haptic conversion functionality can be performed by a multimedia player that receives a multimedia file and outputs the content of the multimedia file, where the content is augmented with one or more haptic effects. The multimedia player can be adapted for mobile devices or computers. In one embodiment, the functionality may be performed by a haptic conversion module (such as haptic conversion module 16 of FIG. 1). - According to the embodiment, the flow begins, and
multimedia file 410 is received. Multimedia file 410 is any computer file that includes multimedia data. In one example embodiment, multimedia file 410 is a computer file that includes audio data. In another example embodiment, multimedia file 410 is a computer file that includes video data. In another example embodiment, multimedia file 410 is a computer file that includes both audio and video data. In yet another example embodiment, multimedia file 410 is a computer file that includes some other type of data. Multimedia file 410 can be streamed online or provided on a physical medium, such as a memory, disk, or other type of non-transitory computer-readable medium. Audio signal 420 (identified in FIG. 4 as audio track 420) is then extracted from multimedia file 410. Audio signal 420 is an example of an input signal that can be extracted from multimedia file 410. In alternate embodiments, audio signal 420 can be replaced by another type of input signal, such as a video signal, an acceleration signal, an orientation signal, an ambient light signal, or another type of signal that can include data captured with a sensor. -
Audio signal 420 is subsequently filtered (e.g., band-pass filtered) to create a plurality of audio sub-signals, where an audio sub-signal is a type of input sub-signal (i.e., frequency band). More specifically, one or more filters (e.g., band-pass filters) can be applied to audio signal 420 to remove segments of audio signal 420, so that the remaining segment of audio signal 420 includes one or more frequencies within a specific frequency band. The segment of audio signal 420 that remains after the application of the filter is identified as an audio sub-signal. In embodiments involving a plurality of filters, each filter can correspond to a specific frequency band, the plurality of filters can be applied to audio signal 420 in multiple passes (where a different filter is applied to audio signal 420 in each pass), and each filter can create a segment of audio signal 420 that includes one or more frequencies within a frequency band that corresponds to the filter. In the illustrated embodiment, the one or more filters are represented by band-pass filters 430, 440, and 450. However, any number of filters can be used. Further, any type of filter can be used. In certain embodiments, an audio sub-signal can include the entirety of audio signal 420. - In certain embodiments, the choice of cutoff frequencies that are defined for each filter can be done in a way to have contiguous complementary input sub-signals (i.e., frequency bands) that cover most or all of the frequencies present in the original input signal. The choice of the number and bandwidths of the different filters can be relative to the nature of the input signal (e.g., audio signal 420), and can affect the creation of a haptic signal. In some embodiments, the resonant frequency of a haptic output device can be used to define the number and bandwidths of the filters. 
For example, three filters can be used to produce three input sub-signals, where the first input sub-signal includes content with frequencies lower than the resonant frequency of the haptic output device, the second input sub-signal includes content with frequencies around the resonant frequency of the haptic output device, and the third input sub-signal includes content with frequencies greater than the resonant frequency of the haptic output device.
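The below-resonance, around-resonance, and above-resonance band layout described above can be computed as in the following sketch; the 175 Hz resonance, ±50 Hz band half-width, and 4 kHz Nyquist limit are illustrative assumptions, not values from the patent:

```python
def bands_around_resonance(resonant_hz, nyquist_hz, half_width_hz=50):
    """Three contiguous, complementary (cutoff_low, cutoff_high) bands:
    below, around, and above the actuator's resonant frequency."""
    return [
        (0, resonant_hz - half_width_hz),                            # below
        (resonant_hz - half_width_hz, resonant_hz + half_width_hz),  # around
        (resonant_hz + half_width_hz, nyquist_hz),                   # above
    ]

# A typical LRA resonates near 175 Hz; assume a 4 kHz Nyquist frequency.
print(bands_around_resonance(175, 4000))
# [(0, 125), (125, 225), (225, 4000)]
```

Because the bands share their cutoff frequencies, they are contiguous and complementary, covering all frequencies of the input up to the Nyquist limit.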
- Next, audio sub-signals, or other types of input sub-signals, are converted into haptic sub-signals using a plurality of haptic conversion algorithms. In the illustrated embodiment, the haptic conversion algorithms are represented by
algorithms shown in FIG. 4. - Example haptic conversion algorithms can include: (a) multiplying the audio sub-signal by a pre-determined factor and a sine carrier waveform executed at a haptic output device's resonant frequency; (b) multiplying the audio sub-signal by a pre-determined factor; or (c) shifting a frequency content of the audio sub-signal from a frequency band to another frequency band that surrounds the haptic output device's resonant frequency, and multiplying the shifted audio sub-signal by the original audio sub-signal to preserve the shape of the original audio sub-signal. Examples of frequency-shifting algorithms are further described in conjunction with
FIG. 5. - The haptic sub-signals are subsequently mixed into a haptic signal using
haptic mixer 460. Haptic mixer 460 can mix the haptic sub-signals according to one of a number of mixing techniques. Example mixing techniques are further described in conjunction with FIG. 6. - In certain embodiments, the haptic signal can be normalized to 1 using its maximum absolute value (not illustrated in
FIG. 4). In other embodiments, the normalizing can be omitted. - Further, in certain embodiments, one or more “noisy vibrations” can be cleaned from the haptic signal (not illustrated in
FIG. 4). A “noisy vibration” is a segment of a haptic signal that includes a deviation from a pre-defined value, where the deviation is below a pre-defined threshold. This segment can be identified as “noise,” and can be removed from the haptic signal. This can cause the haptic signal to produce a “cleaner” (i.e., more adequate and compelling) haptic effect, when sent to a haptic output device. - Different techniques can be used to clean the haptic signal of one or more “noisy vibrations” that can be identified as “noise.” One technique takes a plurality of “chunks” or “windows” of samples from the haptic signal, calculates an average absolute value (or a maximum absolute value in another implementation) of the samples in each chunk, and identifies the samples in a chunk as “noise” if the calculated value for that chunk is lower than a pre-defined threshold. All the samples identified as “noise” will then have their value reduced to 0. Another technique uses interleaving time windows. For each time window, normalized values of the haptic signal are checked within two larger time windows: (a) the time window itself and the preceding time window; and (b) the time window itself and the succeeding time window. If, in these two time windows, the average normalized absolute value (or the maximum normalized absolute value in another implementation) is lower than a pre-defined threshold, the content in the time window is identified as “noise,” and its value is reduced to 0. In other embodiments, the cleaning of the one or more noisy vibrations from the haptic signal can be omitted.
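The first cleaning technique — windowed average absolute value compared against a threshold — can be sketched as follows; the window size, threshold, and sample values are assumptions for the example:

```python
def clean_noise(haptic_signal, window=4, threshold=0.1):
    """Zero out windows whose average absolute value falls below a
    pre-defined threshold (first cleaning technique)."""
    out = list(haptic_signal)
    for start in range(0, len(out), window):
        chunk = out[start:start + window]
        if sum(abs(s) for s in chunk) / len(chunk) < threshold:
            for i in range(start, start + len(chunk)):
                out[i] = 0  # samples identified as "noise" are reduced to 0
    return out

sig = [0.5, -0.4, 0.6, 0.3,     # strong window: kept (average 0.45)
       0.02, -0.01, 0.03, 0.0]  # weak window: average 0.015 < 0.1, zeroed
print(clean_noise(sig))
# [0.5, -0.4, 0.6, 0.3, 0, 0, 0, 0]
```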
- Subsequently, the haptic signal is sent to a
device 470, where device 470 is configured to generate and output one or more haptic effects based on the received haptic signal. Device 470 is further configured to receive an input signal from multimedia file 410, where the input signal can be an audio signal, a video signal, a signal that contains both audio data and video data, or some other type of input signal. Device 470 can be further configured to generate and output one or more effects, such as audio effects, video effects, or other types of effects, based on the input signal. Further, device 470 can be configured to generate the one or more haptic effects so that they “complement” the audio effects, video effects, or other types of effects. Device 470 can play different haptic effects and/or signals simultaneously given its configuration (i.e., number and position of haptic output devices). -
FIG. 5 illustrates an example of shifting a frequency of an input signal, according to an embodiment of the invention. As previously described, a haptic conversion algorithm can convert an input signal into a haptic signal by shifting a frequency content of the input signal from a frequency band to another frequency band that surrounds a haptic output device's resonant frequency, and, in some cases, by multiplying the shifted input signal by the original input signal to preserve the shape of the original input signal. In the example illustrated in FIG. 5, an input signal 510 is frequency-shifted into a haptic signal 520. More specifically, input signal 510 includes frequency content within a frequency band of 300 Hz to 500 Hz. However, when input signal 510 is frequency-shifted into haptic signal 520, the frequency content of input signal 510 is shifted from the frequency band of 300 Hz to 500 Hz to a frequency band of 100 Hz to 200 Hz. Thus, haptic signal 520 includes frequency content within the frequency band of 100 Hz to 200 Hz. - In certain embodiments, the frequency-shifting is achieved using a fast Fourier transform of
input signal 510. Further, one of multiple frequency-shifting techniques can be used, depending on the sizes of a “shift-from” frequency band (e.g., an original frequency band of input signal 510) and a “shift-to” frequency band (e.g., a shifted frequency band of haptic signal 520). This is because, depending on the sizes of the shift-from frequency band and the shift-to frequency band, either multiple frequencies within the shift-from frequency band are replaced with a single frequency within the shift-to frequency band, or a single frequency within the shift-from frequency band is replaced with multiple frequencies within the shift-to frequency band. - In scenarios where a shift-from frequency band is larger than a shift-to frequency band, each frequency of the shift-to frequency band represents multiple frequencies of the shift-from frequency band. To accomplish this, one of three frequency-shifting techniques can be used, according to certain embodiments. The first frequency-shifting technique is to average the fast Fourier transform values of the multiple frequencies of the shift-from frequency band, and to assign the average to the corresponding frequency of the shift-to frequency band. The second frequency-shifting technique is to sum the fast Fourier transform values of the multiple frequencies of the shift-from frequency band, and to assign the sum to the corresponding frequency of the shift-to frequency band. The third frequency-shifting technique is to select a frequency of the shift-from frequency band that has the largest absolute fast Fourier transform value, ignore the other frequencies of the shift-from frequency band, and to assign the largest absolute fast Fourier transform value to the corresponding frequency of the shift-to frequency band.
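The first (averaging) technique for the many-to-one case can be sketched as follows; the FFT magnitudes and band sizes are invented for the example, and a real implementation would operate on complex FFT bins of the actual signal:

```python
def shift_bins_by_averaging(from_values, to_size):
    """Many-to-one frequency shift: group the shift-from band's FFT values
    and assign each group's average to one bin of the shift-to band."""
    group = len(from_values) / to_size
    shifted = []
    for k in range(to_size):
        lo, hi = int(k * group), int((k + 1) * group)
        members = from_values[lo:hi]
        shifted.append(sum(members) / len(members))
    return shifted

# Eight FFT values from a 300-500 Hz band squeezed into four 100-200 Hz bins.
print(shift_bins_by_averaging([2, 4, 6, 8, 1, 3, 5, 7], to_size=4))
# [3.0, 7.0, 2.0, 6.0]
```

The sum and maximum-value techniques differ only in replacing the average with `sum(members)` or the member with the largest absolute value.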
- On the other hand, in scenarios where a shift-to frequency band is larger than a shift-from frequency band, each frequency of the shift-from frequency band is represented by multiple frequencies of the shift-to frequency band. To accomplish this, one of two frequency-shifting techniques can be used. The first frequency-shifting technique is to assign the fast Fourier transform value of the frequency of the shift-from frequency band to the multiple frequencies of the shift-to frequency band. The second frequency-shifting technique is to assign the fast Fourier transform value of the frequency of the shift-from frequency band to a single frequency (e.g., a lowest frequency) of the shift-to frequency band.
-
FIG. 6 illustrates a block diagram of a haptic mixer 610 configured to mix a plurality of haptic sub-signals, according to an embodiment of the invention. As previously described, an input signal, such as audio signal 620, can be converted into a plurality of haptic sub-signals (illustrated in FIG. 6 as haptic tracks), where haptic mixer 610 can mix the haptic sub-signals into a haptic signal. Haptic mixer 610 can mix the haptic sub-signals according to one of a number of mixing techniques, where three example mixing techniques are illustrated in FIG. 6 at 630. - According to the first mixing technique, at 631, the haptic sub-signals are summed, and the sum of the haptic sub-signals is used to calculate a haptic signal. In certain embodiments, the haptic signal is subsequently normalized.
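The first mixing technique, summing then normalizing to 1 by the maximum absolute value, can be sketched as follows (sub-signals as equal-length sample lists, values invented for the example):

```python
def mix_by_sum(haptic_sub_signals):
    """First mixing technique: sum the haptic sub-signals sample-wise,
    then normalize the result using its maximum absolute value."""
    summed = [sum(samples) for samples in zip(*haptic_sub_signals)]
    peak = max(abs(s) for s in summed)
    return [s / peak for s in summed] if peak else summed

subs = [[0.5, -0.5, 0.0],
        [0.5,  0.0, 0.25]]
print(mix_by_sum(subs))  # [1.0, -0.5, 0.25]
```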
- According to the second mixing technique, at 632, each haptic sub-signal is segmented into a plurality of time windows. At 633, for each time window, one or more dominant frequencies in the corresponding input signal are identified. For each time window, a power spectral density (“PSD”) value is calculated for each frequency in the original input signal, and the N frequencies having the highest PSD values are identified as the “dominant frequencies,” where N is any number of frequencies. These N frequencies belong to the different bands represented by the different input sub-signals. At 634, the band having the highest number of frequencies in the group of the N dominant frequencies is identified as the dominant band. Its corresponding haptic sub-signal values are assigned to the resulting output haptic signal at the specific time window.
- According to the third mixing technique, at 635, each input sub-signal is segmented into a plurality of time windows. At 636, for each time window, a PSD value is calculated per frequency band, and a PSD percentage contribution is also calculated per frequency band for each input sub-signal. More specifically, for each time window, a PSD value for each input sub-signal is calculated, an overall PSD value for the time window is calculated, and a ratio of an input sub-signal PSD value to the overall PSD value is calculated for each input sub-signal. The ratio of the input sub-signal PSD value to the overall PSD value is the PSD percentage contribution for that specific input sub-signal. At 637, for each time window, each haptic sub-signal is weighted according to its corresponding input sub-signal PSD percentage contribution, the weighted haptic sub-signals are summed, and the haptic signal is calculated based on the weighted sum of haptic sub-signals of each time window.
- The third mixing technique stems from the fact that the frequency content can change significantly throughout an input sub-signal, and thus, can change significantly through a haptic sub-signal. When haptic sub-signals are simply added together, each haptic sub-signal is given equal weight. However, this fails to account for situations in which an input sub-signal contributed only slightly to the original input content. By giving a weight related to a PSD percentage contribution, each haptic sub-signal contributes to the haptic signal in proportion to its corresponding input sub-signal's contribution to the original input signal. The resulting haptic signal for a specific time window is thus the sum of all the haptic sub-signals, each weighted by the contribution of the corresponding input sub-signal to the original input signal for that time window.
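The PSD-weighted mix of steps 635–637 can be sketched as follows; as with the previous sketch, non-overlapping rectangular windows and a squared-magnitude PSD estimate are simplifying assumptions.

```python
import numpy as np

def psd_weighted_mix(input_subs, haptic_subs, win):
    """Third mixing technique (sketch): per time window, weight each haptic
    sub-signal by its input sub-signal's share of the overall PSD, then sum
    the weighted haptic sub-signals into the output haptic signal."""
    n = len(input_subs[0])
    out = np.zeros(n)
    for start in range(0, n - win + 1, win):
        # PSD of each input sub-signal over this window
        psds = np.array([np.sum(np.abs(np.fft.rfft(s[start:start + win])) ** 2)
                         for s in input_subs])
        total = psds.sum()
        weights = psds / total if total > 0 else np.zeros_like(psds)
        for w, h in zip(weights, haptic_subs):
            out[start:start + win] += w * h[start:start + win]
    return out
```

A band whose input sub-signal is silent in a window receives weight 0 there, so its haptic track drops out of that window entirely.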
-
FIG. 7 illustrates a flow diagram of the functionality of a haptic conversion module (such as haptic conversion module 16 of FIG. 1 ), according to an embodiment of the invention. The flow begins and proceeds to 710. At 710, an input is received. In certain embodiments, a segment of an input signal can be received. In other embodiments, a multimedia file can be received, and an input signal can be extracted from the multimedia file. In certain embodiments, the input signal can be an audio signal. In other embodiments, the input signal can be a video signal. In other embodiments, the input signal can be an acceleration signal. The flow then proceeds to 720. - At 720, the input is segmented into a plurality of input sub-signals. In certain embodiments, the input can be filtered using one or more filters, where each input sub-signal can include a frequency band. Further, in some of these embodiments, the one or more filters include at least one band-pass filter. The flow then proceeds to 730.
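Step 720 can be sketched with an FFT mask standing in for the band-pass filters the patent describes (a real implementation would more likely use IIR or FIR band-pass filters); `band_edges` is a hypothetical list of `(low_hz, high_hz)` tuples chosen for the illustration.

```python
import numpy as np

def split_into_bands(signal, fs, band_edges):
    """Segment an input signal into frequency-band sub-signals (step 720).
    Each sub-signal keeps only the spectrum between its band edges."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    subs = []
    for lo, hi in band_edges:
        mask = (freqs >= lo) & (freqs < hi)      # keep only this band's bins
        subs.append(np.fft.irfft(spectrum * mask, n=len(signal)))
    return subs
```

When the bands tile the whole spectrum, the sub-signals sum back to the original input, which is what lets the later mixing stage reflect all of the input rather than one slice of it.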
- At 730, the input sub-signals are converted into a haptic signal. In certain embodiments, the input sub-signals can be prioritized based on an analysis parameter. One or more input sub-signals can be selected, and the haptic signal can be generated based on the selected input sub-signals. In some of these embodiments, the analysis parameter can include a characteristic of the input sub-signals. Further, the characteristic of the input sub-signals can include one of: a frequency, a duration, an envelope, a density, or a magnitude. Even further, in some of these embodiments, the haptic signal can be warped into a warped haptic signal, where one or more haptic effects can be generated based on the warped haptic signal.
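The prioritize-and-select step can be sketched as below, using mean magnitude as the analysis parameter (one of the listed characteristics; the patent does not commit to a specific scoring function, and `k` is a hypothetical selection count).

```python
import numpy as np

def select_by_priority(input_subs, k=1):
    """Prioritize input sub-signals by an analysis parameter (here, mean
    absolute magnitude) and keep the top k bands for haptic conversion."""
    scores = [np.mean(np.abs(s)) for s in input_subs]
    order = np.argsort(scores)[::-1]     # highest-scoring bands first
    return [input_subs[i] for i in order[:k]]
```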
- In other embodiments, the input sub-signals can be converted into haptic sub-signals. In some of these embodiments, the input sub-signals can be converted into haptic sub-signals by at least one of: multiplying an input sub-signal by a factor and a sine carrier waveform; multiplying the input sub-signal by a factor; or shifting frequency content from a first frequency band of the input sub-signal to a second frequency band of the input sub-signal. Further, in some of these embodiments, each input sub-signal can be converted into a haptic sub-signal using a unique haptic conversion algorithm. Even further, in some embodiments, a fast Fourier transform of the input sub-signal can be performed.
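The first two conversion options listed above (multiplying by a factor, optionally together with a sine carrier waveform) can be sketched as follows; `carrier_hz` is a hypothetical carrier frequency, such as an actuator's resonant frequency, and is not specified by the patent.

```python
import numpy as np

def to_haptic_sub(input_sub, fs, factor=1.0, carrier_hz=None):
    """Convert an input sub-signal into a haptic sub-signal: scale by a
    factor, and optionally multiply by a sine carrier waveform."""
    scaled = factor * input_sub
    if carrier_hz is None:
        return scaled                    # factor-only conversion
    t = np.arange(len(input_sub)) / fs
    return scaled * np.sin(2 * np.pi * carrier_hz * t)
```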
- Furthermore, in embodiments where frequency content is shifted from a first frequency band of the input sub-signal to a second frequency band of the input sub-signal, the shifting can include at least one of: averaging a plurality of fast Fourier transform values for a plurality of frequencies within the first frequency band of the input sub-signal, and assigning the average to a frequency within the second frequency band of the input sub-signal; summing a plurality of fast Fourier transform values for a plurality of frequencies within the first frequency band of the input sub-signal, and assigning the sum to a frequency within the second frequency band of the input sub-signal; selecting a fast Fourier transform value that has a highest absolute value from a plurality of fast Fourier transform values for a plurality of frequencies within the first frequency band of the input sub-signal, and assigning the selected fast Fourier transform value to a frequency within the second frequency band of the input sub-signal; assigning a fast Fourier transform value for a frequency within the first frequency band of the input sub-signal to a plurality of frequencies within the second frequency band of the input sub-signal; or assigning a fast Fourier transform value for a frequency within the first frequency band of the input sub-signal to a lowest frequency within the second frequency band of the input sub-signal.
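The first three options, which collapse a wider band onto a single target frequency, can be sketched as below (the expansion cases are the remaining two options); the function and mode names are illustrative only.

```python
import numpy as np

def collapse_shift(fft_vals, src_idx, dst_bin, mode="average"):
    """Collapse the FFT values of a first frequency band onto one bin of a
    second band by averaging, summing, or picking the value of largest
    absolute magnitude."""
    vals = fft_vals[src_idx]
    if mode == "average":
        shifted = vals.mean()
    elif mode == "sum":
        shifted = vals.sum()
    else:  # "max_abs"
        shifted = vals[np.argmax(np.abs(vals))]
    out = np.zeros_like(fft_vals)
    out[dst_bin] = shifted
    return out
```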
- In certain embodiments, the haptic sub-signals can then be mixed into the haptic signal. In some of these embodiments, the mixing can include one of: summing the plurality of haptic sub-signals into the haptic signal and normalizing the haptic signal; segmenting the input signal into one or more time-windows, analyzing a plurality of frequencies of the plurality of input sub-signals for each time-window, and selecting a haptic sub-signal from the plurality of haptic sub-signals as the haptic signal for each time-window, where the selected haptic sub-signal includes a frequency of the plurality of frequencies; or segmenting the haptic signal into one or more time windows, calculating a power spectrum density percentage contribution for each input sub-signal of the plurality of input sub-signals for each time window, and calculating a weighted combination of the plurality of haptic sub-signals as the haptic signal for each time-window, where a weight of each haptic sub-signal is based on the power spectrum density percentage contribution of the corresponding input sub-signal.
- In certain embodiments, the haptic signal can be normalized to 1 using its maximum absolute value. Further, in certain embodiments, one or more noisy vibrations can be cleaned from the haptic signal. In some of these embodiments, one or more sample haptic sub-signals can be selected from the haptic signal. A mean absolute value can be calculated for each selected sample haptic sub-signal of the one or more selected sample haptic sub-signals. Each mean absolute value can be compared with a threshold. A sample haptic sub-signal can be removed from the haptic signal when its corresponding mean absolute value is less than the threshold. The flow then proceeds to 740.
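The noise-cleaning step can be sketched as follows, interpreting "removed" as zeroing the sample window (an assumption; the patent only says the sample is removed from the haptic signal), with the window length and threshold left as hypothetical parameters.

```python
import numpy as np

def clean_noisy_segments(haptic, win, threshold):
    """Clean noisy vibrations: zero out any sample window of the haptic
    signal whose mean absolute value falls below the threshold."""
    out = haptic.copy()
    for start in range(0, len(haptic) - win + 1, win):
        if np.mean(np.abs(out[start:start + win])) < threshold:
            out[start:start + win] = 0.0   # too weak: treated as noise
    return out
```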
- At 740, one or more haptic effects are generated based on the haptic signal. In some embodiments, the haptic signal can be sent to a haptic output device to generate the one or more haptic effects. In some of these embodiments, the haptic output device can be an actuator. The flow then ends.
- Thus, in one embodiment, a system can filter an input into one or more frequency bands, analyze and prioritize the one or more frequency bands based on one or more pre-determined analysis parameters, select at least one of the frequency bands based on the prioritization, and can use the selected frequency band(s) to generate a haptic signal that is ultimately used to generate one or more haptic effects. By analyzing multiple frequency bands of an input and selecting one or more frequency bands based on a prioritization, an opportunity is given for the entire input to come through in the haptic signal. More specifically, an entire frequency spectrum of the input can be encompassed, and one or more specific frequency bands can be selected, such as one or more frequency bands that are mostly in the foreground from the perspective of the user. This can lead to a haptic effect that is more “customized” to the input, rather than merely selecting a frequency band of the input without first analyzing the multiple frequency bands of the input. By filtering the input and prioritizing the multiple frequency bands of the input, the system can further “perfect” the haptic effect that can be generated to “complement” the input. Further, such a solution can be an elegant solution that can be extended by future haptic conversion algorithms.
- Further, in another embodiment, a system can filter an input into one or more frequency bands, convert each frequency band into a haptic sub-signal, and mix the haptic sub-signals into a haptic signal that is ultimately used to generate one or more haptic effects. The system can create a compelling haptic effect that “complements” the input without the need for any human intervention, such as authored effects. Such a system can be implemented on any mobile device, such as a tablet or smartphone, and can use the processing power of the device as well as a haptic playback component of the device to deliver a richer experience, such as a richer video viewing experience or a richer music listening experience. Further, unlike previous solutions, the system can attempt to create a haptic signal based on all of the input, rather than a specific segment of the input, such as a low frequency content of the input. As previously described, this can be accomplished by regrouping the content of the input given the existing frequencies, and processing each group adequately according to the group's frequency profile, and according to the haptic playback device. Such an approach can produce a more “perfect” haptic effect that “complements” the input.
- The features, structures, or characteristics of the invention described throughout this specification may be combined in any suitable manner in one or more embodiments. For example, the usage of “one embodiment,” “some embodiments,” “certain embodiment,” “certain embodiments,” or other similar language, throughout this specification refers to the fact that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present invention. Thus, appearances of the phrases “one embodiment,” “some embodiments,” “a certain embodiment,” “certain embodiments,” or other similar language, throughout this specification do not necessarily all refer to the same group of embodiments, and the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
- One having ordinary skill in the art will readily understand that the invention as discussed above may be practiced with steps in a different order, and/or with elements in configurations which are different than those which are disclosed. Therefore, although the invention has been described based upon these preferred embodiments, it would be apparent to those of skill in the art that certain modifications, variations, and alternative constructions would be apparent, while remaining within the spirit and scope of the invention. In order to determine the metes and bounds of the invention, therefore, reference should be made to the appended claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/863,261 US20180210552A1 (en) | 2013-09-06 | 2018-01-05 | Haptic conversion system using segmenting and combining |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/020,461 US9898085B2 (en) | 2013-09-06 | 2013-09-06 | Haptic conversion system using segmenting and combining |
US15/863,261 US20180210552A1 (en) | 2013-09-06 | 2018-01-05 | Haptic conversion system using segmenting and combining |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/020,461 Continuation US9898085B2 (en) | 2013-09-06 | 2013-09-06 | Haptic conversion system using segmenting and combining |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180210552A1 true US20180210552A1 (en) | 2018-07-26 |
Family
ID=51357693
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/020,461 Expired - Fee Related US9898085B2 (en) | 2013-09-06 | 2013-09-06 | Haptic conversion system using segmenting and combining |
US15/863,261 Abandoned US20180210552A1 (en) | 2013-09-06 | 2018-01-05 | Haptic conversion system using segmenting and combining |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/020,461 Expired - Fee Related US9898085B2 (en) | 2013-09-06 | 2013-09-06 | Haptic conversion system using segmenting and combining |
Country Status (5)
Country | Link |
---|---|
US (2) | US9898085B2 (en) |
EP (2) | EP2846218B1 (en) |
JP (2) | JP6567809B2 (en) |
KR (1) | KR20150028732A (en) |
CN (2) | CN110244850A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021074107A1 (en) * | 2019-10-14 | 2021-04-22 | Lofelt Gmbh | Systems and methods for authoring an audio signal and for transforming the authored audio signal into a haptic data file |
WO2021142162A1 (en) * | 2020-01-07 | 2021-07-15 | Neosensory, Inc. | Method and system for haptic stimulation |
US11497675B2 (en) | 2020-10-23 | 2022-11-15 | Neosensory, Inc. | Method and system for multimodal stimulation |
US11644900B2 (en) | 2016-09-06 | 2023-05-09 | Neosensory, Inc. | Method and system for providing adjunct sensory information to a user |
US11862147B2 (en) | 2021-08-13 | 2024-01-02 | Neosensory, Inc. | Method and system for enhancing the intelligibility of information for a user |
Families Citing this family (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8717152B2 (en) | 2011-02-11 | 2014-05-06 | Immersion Corporation | Sound to haptic effect conversion system using waveform |
US9715276B2 (en) | 2012-04-04 | 2017-07-25 | Immersion Corporation | Sound to haptic effect conversion system using multiple actuators |
US9368005B2 (en) | 2012-08-31 | 2016-06-14 | Immersion Corporation | Sound to haptic effect conversion system using mapping |
US9997032B2 (en) | 2013-04-09 | 2018-06-12 | Immersion Corporation | Offline haptic conversion system |
US9519346B2 (en) | 2013-05-17 | 2016-12-13 | Immersion Corporation | Low-frequency effects haptic conversion system |
US9443401B2 (en) * | 2013-09-06 | 2016-09-13 | Immersion Corporation | Automatic remote sensing and haptic conversion system |
US9514620B2 (en) | 2013-09-06 | 2016-12-06 | Immersion Corporation | Spatialized haptic feedback based on dynamically scaled values |
US9245429B2 (en) | 2013-09-06 | 2016-01-26 | Immersion Corporation | Haptic warping system |
US9164587B2 (en) | 2013-11-14 | 2015-10-20 | Immersion Corporation | Haptic spatialization system |
US9619029B2 (en) | 2013-11-14 | 2017-04-11 | Immersion Corporation | Haptic trigger control system |
JP2015170174A (en) * | 2014-03-07 | 2015-09-28 | ソニー株式会社 | Information processor, information processing system, information processing method and program |
KR102373337B1 (en) | 2014-09-02 | 2022-03-11 | 애플 인크. | Semantic framework for variable haptic output |
US10185396B2 (en) | 2014-11-12 | 2019-01-22 | Immersion Corporation | Haptic trigger modification system |
JP2017182496A (en) * | 2016-03-30 | 2017-10-05 | ソニー株式会社 | Controller, control method and program |
US20170285774A1 (en) * | 2016-04-01 | 2017-10-05 | Kunjal Parikh | Characterization and simulation of writing instruments |
DK201670737A1 (en) | 2016-06-12 | 2018-01-22 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Providing Haptic Feedback |
DK179823B1 (en) | 2016-06-12 | 2019-07-12 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
US10328345B2 (en) * | 2016-07-26 | 2019-06-25 | Nintendo Co., Ltd. | Vibration control system, vibration control method, and non-transitory computer-readable storage medium with executable vibration control program stored thereon |
US10556176B2 (en) * | 2016-07-26 | 2020-02-11 | Nintendo Co., Ltd. | Vibration control system, vibration control method, and non-transitory computer-readable storage medium with executable vibration control program stored thereon |
EP3293611B1 (en) * | 2016-09-06 | 2019-05-15 | Apple Inc. | Devices, methods, and graphical user interfaces for haptic mixing |
DK179278B1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, methods and graphical user interfaces for haptic mixing |
DK201670720A1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs |
DK179082B1 (en) * | 2016-09-06 | 2017-10-16 | Apple Inc | Devices, methods and graphical user interfaces for haptic mixing |
EP3291054B8 (en) * | 2016-09-06 | 2019-07-24 | Apple Inc. | Devices, methods, and graphical user interfaces for haptic mixing |
KR102669181B1 (en) * | 2016-11-30 | 2024-05-27 | 삼성전자주식회사 | Method for Producing Haptic Signal and the Electronic Device supporting the same |
US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
DK201770372A1 (en) | 2017-05-16 | 2019-01-08 | Apple Inc. | Tactile feedback for locked device user interfaces |
US11259121B2 (en) | 2017-07-21 | 2022-02-22 | Cirrus Logic, Inc. | Surface speaker |
JP7094088B2 (en) * | 2017-10-19 | 2022-07-01 | 株式会社デンソーテン | Operation input device |
CA3092689A1 (en) | 2017-10-23 | 2019-05-02 | Patent Holding Company 001, Llc | Communication devices, methods, and systems |
KR101899538B1 (en) * | 2017-11-13 | 2018-09-19 | 주식회사 씨케이머티리얼즈랩 | Apparatus and method for providing haptic control signal |
US10503261B2 (en) * | 2017-12-15 | 2019-12-10 | Google Llc | Multi-point feedback control for touchpads |
US10455339B2 (en) | 2018-01-19 | 2019-10-22 | Cirrus Logic, Inc. | Always-on detection systems |
US10620704B2 (en) | 2018-01-19 | 2020-04-14 | Cirrus Logic, Inc. | Haptic output systems |
US11139767B2 (en) | 2018-03-22 | 2021-10-05 | Cirrus Logic, Inc. | Methods and apparatus for driving a transducer |
US10795443B2 (en) | 2018-03-23 | 2020-10-06 | Cirrus Logic, Inc. | Methods and apparatus for driving a transducer |
WO2019181955A1 (en) * | 2018-03-23 | 2019-09-26 | 日本電産株式会社 | Sound/vibration conversion apparatus |
US10667051B2 (en) | 2018-03-26 | 2020-05-26 | Cirrus Logic, Inc. | Methods and apparatus for limiting the excursion of a transducer |
US10820100B2 (en) | 2018-03-26 | 2020-10-27 | Cirrus Logic, Inc. | Methods and apparatus for limiting the excursion of a transducer |
US10832537B2 (en) * | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US11069206B2 (en) | 2018-05-04 | 2021-07-20 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
CN109242136A (en) * | 2018-07-17 | 2019-01-18 | 广东工业大学 | A kind of micro-capacitance sensor wind power Chaos-Genetic-BP neural network prediction technique |
US11269415B2 (en) | 2018-08-14 | 2022-03-08 | Cirrus Logic, Inc. | Haptic output systems |
GB201817495D0 (en) | 2018-10-26 | 2018-12-12 | Cirrus Logic Int Semiconductor Ltd | A force sensing system and method |
US10726683B1 (en) | 2019-03-29 | 2020-07-28 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using a two-tone stimulus |
US10828672B2 (en) | 2019-03-29 | 2020-11-10 | Cirrus Logic, Inc. | Driver circuitry |
US10992297B2 (en) | 2019-03-29 | 2021-04-27 | Cirrus Logic, Inc. | Device comprising force sensors |
US20200313529A1 (en) | 2019-03-29 | 2020-10-01 | Cirrus Logic International Semiconductor Ltd. | Methods and systems for estimating transducer parameters |
US12035445B2 (en) | 2019-03-29 | 2024-07-09 | Cirrus Logic Inc. | Resonant tracking of an electromagnetic load |
US10955955B2 (en) | 2019-03-29 | 2021-03-23 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors |
US11509292B2 (en) | 2019-03-29 | 2022-11-22 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter |
US11644370B2 (en) | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load |
EP3748474B1 (en) * | 2019-06-06 | 2022-09-28 | Goodix Technology (HK) Company Limited | Audio-haptic processor, method, system and computer program product |
US10976825B2 (en) | 2019-06-07 | 2021-04-13 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system |
US11150733B2 (en) | 2019-06-07 | 2021-10-19 | Cirrus Logic, Inc. | Methods and apparatuses for providing a haptic output signal to a haptic actuator |
CN114008569A (en) | 2019-06-21 | 2022-02-01 | 思睿逻辑国际半导体有限公司 | Method and apparatus for configuring a plurality of virtual buttons on a device |
US11408787B2 (en) | 2019-10-15 | 2022-08-09 | Cirrus Logic, Inc. | Control methods for a force sensor system |
US11380175B2 (en) | 2019-10-24 | 2022-07-05 | Cirrus Logic, Inc. | Reproducibility of haptic waveform |
JP7476516B2 (en) * | 2019-11-11 | 2024-05-01 | Toppanホールディングス株式会社 | Signal processing device, signal processing system, program, and signal processing method |
WO2021109092A1 (en) * | 2019-12-05 | 2021-06-10 | 瑞声声学科技(深圳)有限公司 | Method and apparatus for implementing tactile signals, terminal and storage medium |
US11545951B2 (en) | 2019-12-06 | 2023-01-03 | Cirrus Logic, Inc. | Methods and systems for detecting and managing amplifier instability |
JP7377093B2 (en) * | 2019-12-16 | 2023-11-09 | 日本放送協会 | Program, information processing device, and information processing method |
WO2021171791A1 (en) * | 2020-02-25 | 2021-09-02 | ソニーグループ株式会社 | Information processing device for mixing haptic signals |
US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator |
US11934583B2 (en) | 2020-10-30 | 2024-03-19 | Datafeel Inc. | Wearable data communication apparatus, kits, methods, and systems |
DE112022001118T5 (en) | 2021-02-18 | 2024-01-18 | Sony Group Corporation | RECEIVING DEVICE, TRANSMISSION DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM |
US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters |
US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system |
US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive |
US11552649B1 (en) | 2021-12-03 | 2023-01-10 | Cirrus Logic, Inc. | Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths |
US20230418416A1 (en) * | 2022-06-22 | 2023-12-28 | Microsoft Technology Licensing, Llc | Touchscreen sensor calibration using adaptive noise classification |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030067440A1 (en) * | 2001-10-09 | 2003-04-10 | Rank Stephen D. | Haptic feedback sensations based on audio output from computer devices |
US20030068053A1 (en) * | 2001-10-10 | 2003-04-10 | Chu Lonny L. | Sound data output and manipulation using haptic feedback |
US20070242040A1 (en) * | 2006-04-13 | 2007-10-18 | Immersion Corporation, A Delaware Corporation | System and method for automatically producing haptic events from a digital audio signal |
US20110068657A1 (en) * | 2009-09-18 | 2011-03-24 | Murata Manufacturing Co., Ltd. | Piezoelectric actuator driver circuit |
US20110115709A1 (en) * | 2009-11-17 | 2011-05-19 | Immersion Corporation | Systems And Methods For Increasing Haptic Bandwidth In An Electronic Device |
US20120206246A1 (en) * | 2011-02-11 | 2012-08-16 | Immersion Corporation | Sound to haptic effect conversion system using amplitude value |
US20140176415A1 (en) * | 2012-12-20 | 2014-06-26 | Amazon Technologies, Inc. | Dynamically generating haptic effects from audio data |
US20160294654A1 (en) * | 2013-11-07 | 2016-10-06 | Snecma | Method and device for characterising a signal |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5684722A (en) | 1994-09-21 | 1997-11-04 | Thorner; Craig | Apparatus and method for generating a control signal for a tactile sensation generator |
US6577739B1 (en) | 1997-09-19 | 2003-06-10 | University Of Iowa Research Foundation | Apparatus and methods for proportional audio compression and frequency shifting |
DE20080209U1 (en) | 1999-09-28 | 2001-08-09 | Immersion Corp | Control of haptic sensations for interface devices with vibrotactile feedback |
US7138989B2 (en) * | 2000-09-15 | 2006-11-21 | Silicon Graphics, Inc. | Display capable of displaying images in response to signals of a plurality of signal formats |
JP2003099177A (en) | 2001-09-21 | 2003-04-04 | Fuji Xerox Co Ltd | Method for preparing haptic information and method for presenting haptic information and its device |
JP2004214843A (en) * | 2002-12-27 | 2004-07-29 | Alpine Electronics Inc | Digital amplifier and gain adjustment method thereof |
US8378964B2 (en) | 2006-04-13 | 2013-02-19 | Immersion Corporation | System and method for automatically producing haptic events from a digital audio signal |
US8000825B2 (en) | 2006-04-13 | 2011-08-16 | Immersion Corporation | System and method for automatically producing haptic events from a digital audio file |
WO2008127316A1 (en) | 2006-11-22 | 2008-10-23 | Chornenky T E | Security and monitoring apparatus |
US8502679B2 (en) | 2008-10-08 | 2013-08-06 | The Board Of Regents Of The University Of Texas System | Noninvasive motion and respiration monitoring system |
DE102009035386B4 (en) * | 2009-07-30 | 2011-12-15 | Cochlear Ltd. | Hörhilfeimplantat |
BR112012010049A2 (en) * | 2009-10-29 | 2016-05-24 | New Transducers Ltd | touch device |
US8902050B2 (en) * | 2009-10-29 | 2014-12-02 | Immersion Corporation | Systems and methods for haptic augmentation of voice-to-text conversion |
US8717152B2 (en) * | 2011-02-11 | 2014-05-06 | Immersion Corporation | Sound to haptic effect conversion system using waveform |
US9083821B2 (en) | 2011-06-03 | 2015-07-14 | Apple Inc. | Converting audio to haptic feedback in an electronic device |
JP6081705B2 (en) * | 2012-02-03 | 2017-02-15 | イマージョン コーポレーションImmersion Corporation | Sound-tactile effect conversion system using waveform |
EP2629178B1 (en) | 2012-02-15 | 2018-01-10 | Immersion Corporation | High definition haptic effects generation using primitives |
US9368005B2 (en) * | 2012-08-31 | 2016-06-14 | Immersion Corporation | Sound to haptic effect conversion system using mapping |
US9196134B2 (en) | 2012-10-31 | 2015-11-24 | Immersion Corporation | Method and apparatus for simulating surface features on a user interface with haptic effects |
-
2013
- 2013-09-06 US US14/020,461 patent/US9898085B2/en not_active Expired - Fee Related
-
2014
- 2014-07-28 JP JP2014152610A patent/JP6567809B2/en not_active Expired - Fee Related
- 2014-08-06 EP EP14002751.7A patent/EP2846218B1/en not_active Not-in-force
- 2014-08-06 EP EP19169501.4A patent/EP3567454A1/en not_active Withdrawn
- 2014-09-03 KR KR20140116717A patent/KR20150028732A/en active IP Right Grant
- 2014-09-04 CN CN201910525367.8A patent/CN110244850A/en active Pending
- 2014-09-04 CN CN201410447604.0A patent/CN104423707B/en not_active Expired - Fee Related
-
2018
- 2018-01-05 US US15/863,261 patent/US20180210552A1/en not_active Abandoned
-
2019
- 2019-05-29 JP JP2019100033A patent/JP2019145171A/en active Pending
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030067440A1 (en) * | 2001-10-09 | 2003-04-10 | Rank Stephen D. | Haptic feedback sensations based on audio output from computer devices |
US20030068053A1 (en) * | 2001-10-10 | 2003-04-10 | Chu Lonny L. | Sound data output and manipulation using haptic feedback |
US20070242040A1 (en) * | 2006-04-13 | 2007-10-18 | Immersion Corporation, A Delaware Corporation | System and method for automatically producing haptic events from a digital audio signal |
US20110068657A1 (en) * | 2009-09-18 | 2011-03-24 | Murata Manufacturing Co., Ltd. | Piezoelectric actuator driver circuit |
US20110115709A1 (en) * | 2009-11-17 | 2011-05-19 | Immersion Corporation | Systems And Methods For Increasing Haptic Bandwidth In An Electronic Device |
US20120206246A1 (en) * | 2011-02-11 | 2012-08-16 | Immersion Corporation | Sound to haptic effect conversion system using amplitude value |
US20140176415A1 (en) * | 2012-12-20 | 2014-06-26 | Amazon Technologies, Inc. | Dynamically generating haptic effects from audio data |
US20160294654A1 (en) * | 2013-11-07 | 2016-10-06 | Snecma | Method and device for characterising a signal |
Non-Patent Citations (1)
Title |
---|
Cruz-Hernandez, U.S. Pub. No. 2011/0115709, hereinafter Cruz-Hernandez * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11644900B2 (en) | 2016-09-06 | 2023-05-09 | Neosensory, Inc. | Method and system for providing adjunct sensory information to a user |
WO2021074107A1 (en) * | 2019-10-14 | 2021-04-22 | Lofelt Gmbh | Systems and methods for authoring an audio signal and for transforming the authored audio signal into a haptic data file |
US11468750B2 (en) * | 2019-10-14 | 2022-10-11 | Lofelt Gmbh | Authoring an immersive haptic data file using an authoring tool |
US20230215247A1 (en) * | 2019-10-14 | 2023-07-06 | Meta Platforms, Inc. | Authoring an immersive haptic data file using an authoring tool |
WO2021142162A1 (en) * | 2020-01-07 | 2021-07-15 | Neosensory, Inc. | Method and system for haptic stimulation |
US11079854B2 (en) | 2020-01-07 | 2021-08-03 | Neosensory, Inc. | Method and system for haptic stimulation |
US11614802B2 (en) | 2020-01-07 | 2023-03-28 | Neosensory, Inc. | Method and system for haptic stimulation |
US11497675B2 (en) | 2020-10-23 | 2022-11-15 | Neosensory, Inc. | Method and system for multimodal stimulation |
US11877975B2 (en) | 2020-10-23 | 2024-01-23 | Neosensory, Inc. | Method and system for multimodal stimulation |
US11862147B2 (en) | 2021-08-13 | 2024-01-02 | Neosensory, Inc. | Method and system for enhancing the intelligibility of information for a user |
Also Published As
Publication number | Publication date |
---|---|
EP2846218B1 (en) | 2019-04-17 |
EP3567454A1 (en) | 2019-11-13 |
CN110244850A (en) | 2019-09-17 |
JP2019145171A (en) | 2019-08-29 |
JP2015053037A (en) | 2015-03-19 |
US9898085B2 (en) | 2018-02-20 |
JP6567809B2 (en) | 2019-08-28 |
EP2846218A1 (en) | 2015-03-11 |
KR20150028732A (en) | 2015-03-16 |
CN104423707B (en) | 2019-07-09 |
US20150070260A1 (en) | 2015-03-12 |
CN104423707A (en) | 2015-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180210552A1 (en) | Haptic conversion system using segmenting and combining | |
JP6739571B2 (en) | Haptic transformation system using frequency shift | |
US10416774B2 (en) | Automatic remote sensing and haptic conversion system | |
EP2846221B1 (en) | Method, system and computer program product for transforming haptic signals | |
US9946348B2 (en) | Automatic tuning of haptic effects | |
US9508236B2 (en) | Haptic warping system that transforms a haptic signal into a collection of vibrotactile haptic effect patterns | |
US10162416B2 (en) | Dynamic haptic conversion system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: IMMERSION CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SABOUNE, JAMAL;CRUZ-HERNANDEZ, JUAN MANUEL;BHATIA, SATVIR SINGH;SIGNING DATES FROM 20130909 TO 20130910;REEL/FRAME:044557/0759 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCV | Information on status: appeal procedure |
Free format text: NOTICE OF APPEAL FILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |