CN112041790A - Method and apparatus for outputting haptic signals to a haptic transducer - Google Patents
- Publication number
- CN112041790A (application number CN201980029423.7A)
- Authority
- CN
- China
- Legal status: Pending (an assumption, not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B6/00—Tactile signalling systems, e.g. personal calling systems
Abstract
Embodiments described herein provide methods and apparatus for outputting haptic signals to a haptic transducer. One method comprises: storing a representation of a portion of a haptic signal, the representation including a first information point indicating a first amplitude and at least one first frequency of the portion of the haptic signal at a first time, wherein the representation is associated with a user experience; and, in response to receiving an indication of an occurrence of the user experience, determining the haptic signal based on the first information point such that the portion of the haptic signal has the first amplitude and the at least one first frequency at the first time. The method may further include outputting the haptic signal to the haptic transducer.
Description
Technical Field
Embodiments of the present disclosure relate to methods and apparatus for outputting haptic signals to a haptic transducer. In particular, representations of portions of haptic signals are stored in association with corresponding user experiences.
Background
Vibrotactile devices, such as Linear Resonant Actuators (LRAs), have been used in devices such as mobile devices to generate vibratory feedback for user interaction. Among the various forms of vibratory feedback, tactile alert tones and vibratory ringtones are an important type of vibratory notification.
Playback of a haptic signal to generate a haptic effect that can be sensed by the user is in some ways similar to playback of music or an audio ringtone. The haptic signal used to create the haptic effect may be stored using methods similar to those used for storing ringtones, for example as a Pulse Code Modulation (PCM) waveform. However, as haptic content grows longer in duration and a greater variety of haptic effects is required, more memory is needed for both generation and storage of the haptic signal.
Disclosure of Invention
According to some embodiments, a method for outputting a haptic signal to a haptic transducer is provided. The method comprises the following steps: storing a representation of a portion of a haptic signal, the representation including a first information point indicating a first amplitude and at least one first frequency of the portion of the haptic signal at a first time, wherein the representation is associated with a user experience; in response to receiving an indication of an occurrence of the user experience, determining the haptic signal based on the first information point such that the portion of the haptic signal has the first amplitude and the at least one first frequency at the first time; and outputting the haptic signal to the haptic transducer.
In some embodiments, the first time is defined relative to a start time of the portion of the haptic signal.
In some embodiments, the representation of the portion of the haptic signal further includes a second information point indicating a second amplitude and a second frequency of the portion of the haptic signal at a second time. In some embodiments, the second time is defined relative to the first time.
In some embodiments, the method further comprises generating the haptic signal based on the second information point such that the portion of the haptic signal has the second amplitude and the second frequency at the second time. In some embodiments, the method further comprises generating the haptic signal such that the amplitude of the portion of the haptic signal increases from the first amplitude to the second amplitude between the first time and the second time.
In some embodiments, the method further comprises generating the haptic signal such that the frequency of the portion of the haptic signal increases from the first frequency to the second frequency between the first time and the second time.
In some embodiments, the representation of the portion of the haptic signal further includes a repetition time. The haptic signal may be generated such that the portion of the haptic signal is repeated after the repetition time.
In some embodiments, the representation of the portion of the haptic signal further includes an indication of a number of repetitions, X, where X is an integer value, and the method includes generating the haptic signal such that the portion of the haptic signal is repeated X times at intervals of the repetition time.
According to some embodiments, a method of generating a haptic signal for output to a haptic transducer is provided. The method comprises the following steps: in response to receiving an indication of an occurrence of a user experience, generating a first portion of the haptic signal based on a stored representation of the first portion of the haptic signal, the stored representation of the first portion of the haptic signal including information related to a first amplitude of the first portion of the haptic signal; and generating a second portion of the haptic signal based on a stored representation of the second portion of the haptic signal, the stored representation of the second portion of the haptic signal including information related to a second amplitude of the second portion of the haptic signal, wherein the representation of the first portion of the haptic signal and the representation of the second portion of the haptic signal are associated with the user experience.
The second portion of the haptic signal may be generated at a desired latency time after the first portion of the haptic signal ends.
The stored representation of the first portion of the haptic signal may include a pulse code modulation of the first portion of the haptic signal. The stored representation of the second portion of the haptic signal may include a pulse code modulation of the second portion of the haptic signal. In some embodiments, the stored representation of the first portion of the haptic signal includes a first information point indicating a first amplitude and at least one first frequency of the first portion of the haptic signal at a first time. The stored representation of the second portion of the haptic signal may include a second information point indicating a second amplitude and at least one second frequency of the second portion of the haptic signal at a second time. Any combination of different types of stored representations may be used for the first portion of the haptic signal and the second portion of the haptic signal.
In some embodiments, the method includes receiving an audio signal for output to a speaker; wherein the step of receiving an indication of the occurrence of a user experience comprises detecting the user experience in the audio signal.
The stored representation of the first portion of the haptic signal and the stored representation of the second portion of the haptic signal may be associated with the user experience as part of a stored code associated with the user experience. The stored code may include: an indication of the stored representation of the first portion of the haptic signal; an indication of the stored representation of the second portion of the haptic signal; and an indication of an elapsed time between the stored representation of the first portion of the haptic signal and the stored representation of the second portion of the haptic signal.
In some embodiments, the stored code further comprises an indication of a magnitude of the representation for playing back the stored first portion of the haptic signal.
According to some embodiments, a haptic signal generator for outputting a haptic signal to a haptic transducer is provided. The haptic signal generator includes: a memory configured to store a representation of a portion of the haptic signal, the representation including a first information point indicating a first amplitude and at least one first frequency of the portion of the haptic signal at a first time, wherein the representation is associated with a user experience; and processing circuitry configured to, in response to receiving an indication of an occurrence of the user experience, determine the haptic signal based on the first information point such that the portion of the haptic signal has the first amplitude and the at least one first frequency at the first time, and output the haptic signal to the haptic transducer.
According to some embodiments, a haptic signal generator is provided that generates a haptic signal for output to a haptic transducer. The haptic signal generator comprises processing circuitry configured to generate, in response to receiving an indication of an occurrence of a user experience, a first portion of the haptic signal based on a stored representation of the first portion of the haptic signal, the stored representation of the first portion of the haptic signal comprising information related to a first amplitude of the first portion of the haptic signal; and generate a second portion of the haptic signal based on a stored representation of the second portion of the haptic signal, the stored representation of the second portion of the haptic signal including information related to a second amplitude of the second portion of the haptic signal, wherein the representation of the first portion of the haptic signal and the representation of the second portion of the haptic signal are associated with the user experience.
According to some embodiments, an electronic device is provided that includes a haptic signal generator for outputting a haptic signal to a haptic transducer. The haptic signal generator may be as described above. The electronic device may comprise at least one of: a portable device; a battery-powered device; a computing device; a communication device; a gaming device; a mobile phone; a personal media player; a laptop, tablet, or notebook computing device; or a smart home device.
According to some embodiments, an electronic device is provided that includes a haptic signal generator that generates a haptic signal for output to a haptic transducer. The haptic signal generator may be as described above. The electronic device may comprise at least one of: a portable device; a battery-powered device; a computing device; a communication device; a gaming device; a mobile phone; a personal media player; a laptop, tablet, or notebook computing device; or a smart home device.
Drawings
For a better understanding of embodiments of the present disclosure, and to show how the same may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings in which:
fig. 1 is an example schematic diagram illustrating a portion of a device having a tactile output transducer according to some embodiments;
fig. 2 is a flow diagram illustrating an example method for outputting a haptic signal to a haptic transducer, according to some embodiments;
fig. 3 illustrates an example haptic signal including haptic atoms, according to some embodiments;
FIG. 4 illustrates a haptic signal including haptic atoms, according to some embodiments;
fig. 5 is a flow diagram illustrating an example method for outputting a haptic signal to a haptic transducer, according to some embodiments;
fig. 6A illustrates an example haptic signal including a first haptic atom and a second haptic atom, according to some embodiments;
FIG. 6B illustrates an example user experience including an audio event, according to some embodiments;
fig. 7 illustrates an example system having a processor configured to output haptic signals to a haptic transducer, according to some embodiments.
Detailed Description
The following description sets forth example embodiments according to the present disclosure. Other exemplary embodiments and implementations will be apparent to those of ordinary skill in the art. Further, those of ordinary skill in the art will recognize that a variety of equivalent techniques may be employed in place of or in combination with the embodiments discussed below, and all such equivalents are to be considered as encompassed by the present disclosure.
Fig. 1 is an example schematic diagram illustrating a portion of a device 100 having a haptic output transducer 104. The device 100 may comprise any electronic device, such as: a portable device; a battery-powered device; a computing device; a communication device; a gaming device; a mobile phone; a personal media player; a laptop, tablet, or notebook computing device; a smart watch; a Virtual Reality (VR) or Augmented Reality (AR) device; or a smart home device. For clarity, fig. 1 does not show elements of device 100 not relevant to the present disclosure, but those skilled in the art will appreciate that device 100 may include other elements and components in addition to those shown in fig. 1.
The device 100 includes a processor 106, which processor 106 may be, for example, an application processor. The processor 106 interfaces with a signal processor 108, which signal processor 108 may be, for example, a Digital Signal Processor (DSP). The signal processor 108 may interface with an audio output amplifier 110, which audio output amplifier 110 may be configured to output an audio signal to drive the audio output transducer 102. The signal processor 108 also interfaces with a haptic output amplifier 112, the haptic output amplifier 112 configured to output haptic signals to drive the haptic transducer 104.
The signal processor 108 may also interface with any other output device (e.g., a screen) capable of providing a sensory output to the user.
The processor 106 runs the operating environment of the device 100 to allow software applications to be executed by the device 100. Such an application may receive user input. The user input may include one or more of the following: touch user input and/or gesture user input that may be detected by a touch sensitive surface (e.g., a touchscreen) (not shown) of the device; a kinetic user input (such as a rotating device or a tilting device) that may be detected by a sensor (such as an accelerometer or gyroscope of the device) (also not shown); and audio user input (such as spoken commands) that may be detected by a sensor (such as a microphone of the device) (also not shown). In response to detecting the user input, the application is operable to generate an appropriate output at the device. For example, the application may be operative to cause an image displayed on a display (not shown) of the device to be updated and to cause appropriate audio effects to be output by the audio output transducer 102. The application is also operable to cause the appropriate tactile output to be provided by the tactile output transducer 104 in response to detecting the user input. All types of these user inputs may be described as user experiences.
The sensory output provided to the user via any of the output components of the device, such as the display (not shown) and the audio output transducer 102, may also be described as a user experience.
In some embodiments, the signal processor 108 may be configured to actuate the haptic output transducer 104 to cause the device 100 to vibrate in conjunction with a sensory-output user experience (e.g., an image displayed on the screen is updated and/or an audio effect is output by the audio output transducer 102) to provide additional sensory information to the user.
A haptic effect, such as a haptic ringtone, may contain a series of shorter components. The haptic effect may be actuated along with some user experience, for example, a user input such as a button press or touch of a particular location on a touch screen, or a sensory output such as playback of some audio content to the user. Playback of the haptic signals for providing the haptic effects along with the user experience generates a composite sensory experience for the user.
By driving the haptic output transducer 104 with haptic signals having different shapes, frequencies, and amplitudes, a variety of different haptic effects are possible. Each haptic effect may produce a different sensation to the user.
As previously described, a Pulse Code Modulation (PCM) waveform may be used to store haptic signals that are used to drive the transducer to produce haptic effects. However, it may be difficult to store long (e.g., greater than 500ms) PCM waveforms in memory (e.g., in random access memory of a haptic driver integrated circuit or in a corresponding Digital Signal Processor (DSP)).
In some cases, the haptic signal contains "quiet" periods during which no haptic effect is actually produced. A PCM waveform used to store such a haptic signal must also encode these "quiet" periods, which still consume memory space despite carrying no haptic content.
Fig. 2 is a flow diagram illustrating an example method for outputting a haptic signal to a haptic transducer. The method may be performed by a signal processor (such as processor 108 in fig. 1) and/or may be performed by any other system operable to perform the method. In certain embodiments, the method of fig. 2 may be partially or fully implemented in software and/or firmware embodied in a computer-readable medium.
In step 201, the method includes storing a representation of a portion of the haptic signal, the representation including a first information point indicating a first amplitude and at least one first frequency of the portion of the haptic signal at a first time.
In other words, the haptic signal may be deconstructed into multiple portions, which may be referred to herein as "haptic atoms". These haptic atoms are parts of the haptic signal. A haptic atom may be represented by information related to the spectral content, duration, and amplitude of that portion of the haptic signal. The representation may include sufficient information to allow the signal processor to reconstruct the haptic atom from the information it contains.
In particular, where the haptic signal includes a silent period (i.e., where the haptic transducer is not outputting haptic effects), the haptic signal may be deconstructed into haptic atoms that include non "silent" portions of the haptic signal.
The representation of the haptic atom may be associated with a user experience. For example, the first representation may be associated with a particular user input (e.g., a button click). The second representation may be associated with playback of a particular audio event or alert. The representation may be stored with an indication of the associated user experience.
In step 202, the method includes, in response to receiving an indication of an occurrence of a user experience, determining a haptic signal based on the first information point such that a portion of the haptic signal has a first amplitude and at least one first frequency at a first time.
In other words, in response to receiving an indication that a user experience is occurring (e.g., a notification that a button press is occurring, or a particular audio event is detected in an audio signal to be output to an output audio transducer), the method includes generating a haptic signal from a stored representation of a haptic atom associated with the user experience.
In step 203, the method includes outputting the haptic signal to the haptic transducer.
A representation of a portion of a haptic signal may be stored in a piecewise linear envelope (PWLE) format. A PWLE includes one or more information points, each of which may include an amplitude value and at least one frequency value at a particular time in the portion of the haptic signal.
For example, table 1 illustrates an example PWLE of a first haptic atom. In this embodiment, the PWLE includes four information points.
Table 1: table 1 illustrates PWLE of the first tactile atom 310.
The first information point comprises a first time of 120 ms. This time value may, for example, represent 120 ms after the start of the portion of the haptic signal (the haptic atom). The first information point also includes a first amplitude value of 0.16234. This amplitude value is expressed as a fraction of the full scale (FFS) voltage available to the haptic transducer, which in this embodiment is 12.32 V. However, it should be understood that the amplitude values may be expressed in any suitable manner. The first information point further comprises a first frequency value of 200 Hz.
In effect, this first information point conveys that the first haptic atom has a voltage of 0.16234 FFS and a frequency of 200 Hz at a time 120 ms after the start of the first haptic atom.
The second information point comprises a second time of 200 ms. This second time may, for example, represent a time 200 ms after the start of the portion of the haptic signal (the first haptic atom). In some embodiments, the second time may be defined relative to the first time. The second information point also includes a second amplitude value of 0.07305. This amplitude value is expressed as a fraction of the full scale voltage available to the haptic transducer, which in this embodiment is 12.32 V. However, it should be understood that the amplitude values may be expressed in any suitable manner. The second information point further comprises a second frequency value of 200 Hz.
In effect, this second information point conveys that the first haptic atom has a voltage of 0.07305 FFS and a frequency of 200 Hz at a time 200 ms after the start of the first haptic atom.
The third information point includes a third time of 500 ms. This third time may, for example, represent a time 500 ms after the start of the portion of the haptic signal. The third information point also includes a third amplitude value of 0.07305. This amplitude value is expressed as a fraction of the full scale voltage available to the haptic transducer, which in this embodiment is 12.32 V. However, it should be understood that the amplitude values may be expressed in any suitable manner. The third information point further comprises a third frequency value of 200 Hz.
In effect, this third information point conveys that the first haptic atom has a voltage of 0.07305 FFS and a frequency of 200 Hz at a time 500 ms after the start of the first haptic atom.
The fourth information point includes a fourth time of 540 ms. This fourth time may, for example, represent a time 540 ms after the start of the portion of the haptic signal (the first haptic atom). The fourth information point also includes a fourth amplitude value of 0. This amplitude value is expressed as a fraction of the full scale voltage available to the haptic transducer, which in this embodiment is 12.32 V. However, it should be understood that the amplitude values may be expressed in any suitable manner. The fourth information point further comprises a fourth frequency value of 200 Hz.
In effect, this fourth information point conveys that the first haptic atom has a voltage of 0 FFS and a frequency of 200 Hz at a time 540 ms after the start of the first haptic atom.
FIG. 3 illustrates an example haptic signal 300, the example haptic signal 300 including a first haptic atom 310 generated by the PWLE illustrated in Table 1.
The information contained in the representation of the first tactile atom 310 illustrated in table 1 may be used to recreate the first tactile atom as illustrated in fig. 3.
In fig. 3, point 301 corresponds to the first information point, point 302 to the second information point, point 303 to the third information point, and point 304 to the fourth information point.
In this embodiment, the haptic signal including the first haptic atom is generated such that the amplitude of the first haptic atom decreases from the first amplitude to the second amplitude between the first time (i.e., 120 ms after the start) and the second time (i.e., 200 ms after the start). In this embodiment, the amplitude decreases linearly between the first time and the second time. However, in some embodiments, different rules may be defined for how the first haptic atom is generated between information points. For example, to produce a square-wave type haptic atom, the amplitude and/or frequency may be switched at each information point rather than interpolated.
In some embodiments, the haptic atom may include a default start having an amplitude of 0 and a default start frequency of F1. The amplitude may then increase linearly between the start and the first time indicated in the first information point.
In the embodiment illustrated in fig. 3, the frequency of the first haptic atom remains constant throughout the atom and only one frequency is used. However, it should be understood that multiple frequencies may be used and that the frequency may vary between information points. For example, in some embodiments, the haptic signal may be generated such that the frequency of the first haptic atom increases from the first frequency to the second frequency between the first time and the second time.
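The envelope rules described above lend themselves to a straightforward rendering sketch. The following Python is illustrative only: the 8 kHz sample rate, the sine carrier, the default zero-amplitude start point, and the name `render_pwle` are assumptions for the sketch, not details of the disclosure.

```python
import math

def render_pwle(points, sample_rate=8000):
    """Render a haptic atom from PWLE points given as (time_s, amplitude_ffs, freq_hz).

    Amplitude and frequency are linearly interpolated between points; the atom
    is assumed to start at amplitude 0 at t=0 (the default start).
    """
    points = [(0.0, 0.0, points[0][2])] + list(points)  # assumed default start point
    samples = []
    phase = 0.0
    for (t0, a0, f0), (t1, a1, f1) in zip(points, points[1:]):
        n = int((t1 - t0) * sample_rate)
        for i in range(n):
            frac = i / n
            amp = a0 + (a1 - a0) * frac    # linear amplitude ramp between points
            freq = f0 + (f1 - f0) * frac   # linear frequency sweep between points
            phase += 2 * math.pi * freq / sample_rate
            samples.append(amp * math.sin(phase))
    return samples

# The PWLE of Table 1: times in seconds, amplitudes in FFS, frequencies in Hz
atom = render_pwle([(0.120, 0.16234, 200),
                    (0.200, 0.07305, 200),
                    (0.500, 0.07305, 200),
                    (0.540, 0.0, 200)])
```

Rendered this way, the four information points of Table 1 expand to 4320 samples whose envelope rises from 0 to 0.16234 FFS, decays to 0.07305 FFS, holds, and falls back to 0, matching the shape of fig. 3.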
By storing the representation of the first haptic atom in PWLE format rather than a PCM waveform, the memory required to store the haptic signal is reduced.
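The saving can be made concrete with rough numbers. The encodings below are assumptions chosen for illustration: 16-bit PCM samples at an 8 kHz haptic sample rate versus four PWLE information points of three 4-byte fields (time, amplitude, frequency) each.

```python
# Storage for the 540 ms atom of Table 1 as 16-bit PCM at an assumed 8 kHz rate
pcm_bytes = int(0.540 * 8000) * 2   # 4320 samples * 2 bytes each

# Same atom as a PWLE: 4 information points * 3 fields * an assumed 4 bytes per field
pwle_bytes = 4 * 3 * 4
```

Under these assumptions the PCM waveform needs 8640 bytes while the PWLE needs 48 bytes, a reduction of more than two orders of magnitude for this atom.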
Table 2: table 2 illustrates one embodiment of PWLE of a tactile atom 310.
FIG. 4 illustrates a haptic signal 400 including a haptic atom 310 generated by the PWLE illustrated in Table 2.
The PWLE in Table 2 includes the same information points as the PWLE in Table 1. However, the PWLE in Table 2 also includes a repetition time, which in this embodiment is 260 ms.
Accordingly, the haptic signal 400 may be generated such that the haptic atom 310 is repeated after a repetition time.
In some embodiments, the representation of the portion of the haptic signal further includes an indication of a number of repetitions X, where X is an integer value, and the method includes: the haptic signal is generated such that a portion of the haptic signal is repeated X times at intervals of the repetition time.
In the embodiments illustrated in Table 2 and fig. 4, the number of repetitions is 1 (i.e., X = 1). However, it should be understood that any number of repetitions may be used.
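The repetition mechanics can be sketched as follows. Interpreting the repetition time as a silent gap inserted between successive playbacks of the atom is an assumption (the disclosure leaves the exact timing convention open), as are all names here.

```python
def repeat_atom(atom, repeat_gap_s, repeats, sample_rate=8000):
    """Play the atom once, then `repeats` more times.

    Assumed interpretation: the repetition time is a silent gap of
    `repeat_gap_s` seconds inserted before each repetition.
    """
    gap = [0.0] * int(repeat_gap_s * sample_rate)
    out = list(atom)
    for _ in range(repeats):
        out += gap + list(atom)
    return out

# A 100 ms placeholder atom at a 1 kHz rate, repeated once after a 260 ms gap
signal = repeat_atom([0.1] * 100, repeat_gap_s=0.260, repeats=1, sample_rate=1000)
```

With X repetitions the stored PWLE stays the same size; only the rendered output grows, which is the point of storing the repetition parameters rather than the repeated waveform.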
In some embodiments, the haptic signal may include a plurality of haptic atoms, each haptic atom having a separate representation. For example, the haptic signal may include: creating a first haptic atom of a short beep haptic effect; and a second haptic atom that creates a longer and softer haptic effect. The two haptic atoms of the haptic signal may be represented and stored separately in separate representations. In some embodiments, different portions of the haptic signal (i.e., the haptic atoms) may be stored in different types of representations. For example, PWLE may be used to store one haptic atom, and PCM waveforms may be used to store a different haptic atom.
Fig. 5 is a flow diagram illustrating an example method for outputting a haptic signal to a haptic transducer. The method of fig. 5 may be implemented by the signal processor 108 illustrated in fig. 1, and/or may be implemented by any other system operable to implement the method. In certain embodiments, the method of fig. 5 may be partially or fully implemented in software and/or firmware embodied in a computer-readable medium.
In step 501, the method includes, in response to receiving an indication of an occurrence of a user experience, generating a first portion of a haptic signal based on a stored representation of the first portion of the haptic signal, the representation including information related to a first amplitude of the first portion of the haptic signal.
In this embodiment, the representation of the first portion (or first haptic atom) of the haptic signal may be any representation that includes information related to the first amplitude of the first portion of the haptic signal. For example, the representation may be a PWLE representation or a PCM waveform.
The PCM waveform representation may be used for short haptic effects, such as sharp high intensity vibration spikes. The short duration of these haptic effects may naturally reduce storage requirements in the time domain. Depending on the characteristics of the haptic transducer (e.g., Linear Resonant Actuator (LRA)), the PCM waveform representation may need to be pre-processed to adjust the PCM waveform so that it is suitable for reproduction on the haptic transducer.
The PWLE representation of a haptic atom can be used for longer continuous-tone haptic effects, because it may have higher storage efficiency than a PCM waveform. In particular, a sequence of haptic atoms may be represented by a plurality of different PWLEs.
In step 502, the method includes generating a second portion of the haptic signal based on a stored representation of the second portion of the haptic signal, the representation including information related to a second amplitude of the second portion of the haptic signal.
Again, in this embodiment, the representation of the second portion (or second haptic atom) of the haptic signal may be any representation that includes information related to the second amplitude of the second portion of the haptic signal. For example, the representation of the second portion of the haptic signal may be a PWLE representation or a PCM waveform.
The representation of the first portion of the haptic signal and the representation of the second portion of the haptic signal are stored such that they are both stored in association with the user experience.
FIG. 6A illustrates an example haptic signal 600 including a first haptic atom 610 and a second haptic atom 612.
In this embodiment, the haptic signal is generated in response to detecting a particular audio event in the audio signal. In other words, the user experience in this embodiment includes the audio event illustrated in FIG. 6B.
In other words, in some embodiments, the method of fig. 5 may include receiving an audio signal for output to a speaker. In this embodiment, receiving an indication of the presence of a user experience in step 501 may comprise detecting a user experience in the audio signal. In other words, the user experience may include an audio event, and an indication of the occurrence of the user experience may be detected in the received audio signal.
In this embodiment, first tactile atom 610 is associated with two segments of the user experience. This first tactile atom 610 may be stored in a number of different ways. For example, a first haptic atom may be stored as being associated with user experience 614 between times t1 and t2 and times t3 and t 4. Alternatively, haptic atom 610 may be stored with a repetition period value of t5, with a number of repetitions of 1.
In this embodiment, user experience 614 is also associated with second haptic atom 612. The second haptic atom may be generated at a desired wait time t6 after the end of the first haptic atom. This desired latency may be stored with the first portion of the haptic signal associated with the user experience. In other words, referring to the embodiment in figs. 6A and 6B, the second portion of the haptic signal may be associated with the segment of the user experience between times t7 and t8.
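One way to picture the stored fields described above (repetition period, repetition count, and the wait time before the next atom) is a small record per atom; the field names and the duration calculation below are illustrative assumptions, not taken from the patent:

```python
# Minimal sketch of a stored haptic atom and the resulting signal duration.
# Assumption: the repetition period is treated as the blank gap between
# successive plays of the same atom.
from dataclasses import dataclass

@dataclass
class StoredAtom:
    atom_id: str          # e.g. "x" or "y"
    amplitude_pct: float  # playback amplitude as % of designed amplitude
    repeat_period: float  # seconds of gap between repeats (e.g. t5)
    repeat_count: int     # number of additional repetitions
    wait_after: float     # silence before the next atom (e.g. t6)

def total_duration(atoms, atom_lengths):
    """Total signal duration, given each atom's playback length in seconds."""
    total = 0.0
    for a in atoms:
        length = atom_lengths[a.atom_id]
        plays = 1 + a.repeat_count
        total += plays * length + a.repeat_count * a.repeat_period + a.wait_after
    return total
```

For example, an atom "x" (0.05 s) played twice with a 0.1 s gap and a 0.2 s wait, followed by an atom "y" (0.03 s), yields a 0.43 s signal.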
In some embodiments, a sequence of haptic atoms (or a sequence of portions of a haptic signal) may be associated with a user experience. For example, the method may include storing a sequence of haptic atoms associated with a user experience.
For example, for a user experience such as user experience 614, a code may be used to store the sequence of haptic atoms to be used to generate haptic signal 600. For example, first haptic atom 610 may be represented by an "x" in the code and second haptic atom 612 by a "y" in the code.
The following code may then be used to represent haptic signal 600:
x.100,t5,x.100,t6,y.40
this code may be understood as instructions to the signal processor to perform the following:
1) play back the first haptic atom "x" at 100% of its designed amplitude,
2) follow the first haptic atom "x" with a blank time of t5 seconds,
3) play back the first haptic atom "x" again at 100% of its designed amplitude,
4) follow the first haptic atom "x" with a blank time of t6 seconds,
5) play back the second haptic atom "y" at 40% of its designed amplitude.
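The interpretation of the code string above can be sketched as a small parser. The token format is inferred from the example: "&lt;atom&gt;.&lt;amplitude %&gt;" is a playback instruction and a bare time symbol (resolved through a caller-supplied lookup table) is a blank period; both the function and the step tuples are hypothetical, not the patent's actual encoding:

```python
# Hedged sketch of parsing the sequence code "x.100,t5,x.100,t6,y.40"
# into playback instructions for a signal processor.

def parse_sequence(code, times):
    """Return a list of ('play', atom, amplitude) / ('wait', seconds) steps."""
    steps = []
    for token in code.split(","):
        token = token.strip()
        if "." in token:  # playback token, e.g. "x.100"
            atom, pct = token.split(".")
            steps.append(("play", atom, float(pct) / 100.0))
        else:             # blank-time token, e.g. "t5"
            steps.append(("wait", times[token]))
    return steps
```

With t5 = 0.5 s and t6 = 0.25 s, `parse_sequence("x.100,t5,x.100,t6,y.40", {"t5": 0.5, "t6": 0.25})` yields the five steps 1) to 5) listed above.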
In other words, the first and second haptic atoms are associated with the user experience as part of the stored code associated with the user experience. The stored code may include an indication of the first haptic atom (in this embodiment, "x"); an indication of the second haptic atom (in this embodiment, "y"); and an indication of the elapsed time between the first haptic atom and the second haptic atom (t6 in this embodiment).
In some embodiments, the stored code may further include an indication of the amplitude used to play back the first haptic atom (in this embodiment, 100% of the designed amplitude, although it will be understood that other methods of defining the amplitude may be used).
Thus, in this embodiment, when the user experience in fig. 6B is detected, the method of fig. 5 may include retrieving the stored sequence of haptic atoms associated with the detected user experience, and generating the haptic signal from the code, e.g., by retrieving each representation of each individual haptic atom from storage and generating each portion of the haptic signal as indicated in the code.
When using a haptic driver integrated circuit (IC) (or other amplifier plus DSP) to play back a haptic signal comprising a plurality of haptic atoms separated by quiet periods, such as that illustrated in fig. 6A, the firmware of the IC may issue an alert signal each time a haptic atom completes playback, notifying the corresponding driver (or microcontroller unit (MCU) or similar device) that waveform playback has finished. The driver may then use a clock to determine when the user-specified silence period has ended so that it can play back the next haptic atom/component in the queue. The above procedure may also be performed exclusively at the firmware level.
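The driver/MCU-side loop just described can be sketched as follows. The IC interface (`play`, `wait_for_alert`) is a hypothetical abstraction standing in for the real driver API, which the patent does not specify:

```python
# Illustrative sketch of driver-side queue playback: start an atom on the IC,
# block until the firmware's completion alert, then clock out the
# user-specified silence before the next queued atom.
import time

def play_queue(ic, queue):
    """queue: iterable of (waveform_id, silence_after_seconds) pairs."""
    for waveform_id, silence in queue:
        ic.play(waveform_id)    # start waveform playback on the IC
        ic.wait_for_alert()     # block until firmware signals completion
        if silence > 0:
            time.sleep(silence) # silent period timed by the driver's clock
```

When the procedure runs entirely in firmware, the same loop would live inside the IC, with the alert replaced by an internal playback-complete flag.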
Considering the embodiment illustrated in fig. 6A, it will be appreciated that if the synthesized haptic signal 600 had been stored as a single PCM waveform, the "quiet" periods between t2 and t3 and between t4 and t7 would also have had to be stored as PCM data. Representing the haptic signal using haptic atoms, as described with reference to figs. 5 and 6A, therefore frees the storage space that would otherwise be used for these quiet periods.
Fig. 7 illustrates an example haptic signal generator 700, the haptic signal generator 700 including a processing circuit 701 configured to output a haptic signal to a haptic transducer 702. The processing circuit 701 may be, for example, the signal processor 108 illustrated in fig. 1. Haptic signal generator 700 may be configured to perform a method as described above with reference to fig. 2 and 5.
The processing circuitry 701 may include an input for receiving an indication of the occurrence of a user experience. As explained previously, this indication may be a notification that a user input (e.g., button press, touch screen activation, tilting the device, or voice activation) occurred. The indication may alternatively be the detection of a sensory event (e.g. an audio event) in a sensory output signal (e.g. an audio signal) to be output to the user.
The haptic signal generator 700 further comprises a memory 703. The memory 703 may be configured to store representations of haptic atoms as described above with reference to figs. 2 to 6B. In particular, the memory 703 may be configured to store a representation of a portion of the haptic signal, the representation comprising a first point of information indicating a first amplitude and at least one first frequency of the portion of the haptic signal at a first time, wherein the representation is associated with a user experience.
The processing circuitry 701 may be configured to: in response to receiving the indication of the user experience, a haptic signal is determined based on the first point of information such that a portion of the haptic signal has a first amplitude and at least one first frequency at a first time. For example, the processing circuit 701 may be configured to retrieve a representation of a portion of the haptic signal from the memory 703 by sending an indication of the user experience to the memory 703 and in response receiving the representation of the portion of the haptic signal.
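The retrieval interaction between processing circuit 701 and memory 703 amounts to a keyed lookup: send the user-experience indication, receive back the stored representation. The class, method names, and (time, amplitude, frequency) point structure below are assumptions chosen to mirror the description, not an actual interface:

```python
# Minimal sketch of memory 703 (keyed store of atom representations) and the
# lookup performed by processing circuit 701 on a user-experience indication.

class AtomMemory:
    def __init__(self):
        self._store = {}

    def associate(self, experience_id, representation):
        """Store a representation in association with a user experience."""
        self._store[experience_id] = representation

    def retrieve(self, experience_id):
        """Return the representation for this experience, or None."""
        return self._store.get(experience_id)

def determine_signal(memory, experience_id):
    """On an indication of the user experience, fetch the information points
    (time, amplitude, frequency) used to synthesize the haptic signal."""
    return memory.retrieve(experience_id)
```

In hardware this lookup would typically be an address or index into non-volatile storage rather than a dictionary, but the contract is the same.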
The processing circuit 701 may then be further configured to output the haptic signal to the haptic transducer.
In some embodiments, the processing circuit 701 may be configured to generate the second portion of the haptic signal based on a stored representation of the second portion of the haptic signal, the representation including information related to a second amplitude of the second portion of the haptic signal, wherein the representation of the first portion of the haptic signal and the representation of the second portion of the haptic signal are associated with the user experience in the memory 703. In some embodiments, the representation of the second portion of the haptic signal may comprise a PWLE representation or a PCM waveform.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. The word "comprising" does not exclude the presence of elements or steps other than those listed in a claim, "a" or "an" does not exclude a plurality, and a single feature or other unit may fulfil the functions of several units recited in the claims. Any reference signs or references in the claims shall not be construed as limiting the scope of these claims. Terms such as amplification or gain include the possibility of applying a scaling factor of less than 1 to the signal.
Of course, it should be understood that various embodiments of the analog conditioning circuit as described above, or various blocks or portions thereof, may be integrated together with other blocks or portions thereof, or with other functionality of the host device, on an integrated circuit such as a smart codec.
Accordingly, those skilled in the art will recognize that some aspects of the apparatus and methods described above may be embodied as processor control code, for example on a non-volatile carrier medium such as a magnetic disk, CD-ROM or DVD-ROM, in programmed memory such as read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier. For many applications, embodiments of the invention will be implemented on a DSP (digital signal processor), an ASIC (application specific integrated circuit), or an FPGA (field programmable gate array). Thus, the code may comprise conventional program code or microcode or, for example, code for setting up or controlling an ASIC or FPGA. The code may also include code for dynamically configuring a reconfigurable device, such as a re-programmable array of logic gates. Similarly, the code may comprise code for a hardware description language such as Verilog™ or VHDL (very high speed integrated circuit hardware description language). As will be appreciated by those skilled in the art, code may be distributed among a plurality of coupled components in communication with each other. Where appropriate, the embodiments may also be implemented using code running on a field-(re)programmable analog array or similar device to configure analog hardware.
It should be understood that various operations described herein, particularly with respect to the figures, may be performed by other circuits or other hardware components, as would be understood by one of ordinary skill in the art with the benefit of this disclosure. The order in which each operation of a given method is performed can be varied, and various elements of the systems illustrated herein can be added, reordered, combined, omitted, modified, etc. It is intended that the present disclosure includes all such modifications and alterations and, accordingly, the above description should be taken as illustrative and not restrictive.
Similarly, while the present disclosure makes reference to particular embodiments, certain modifications and changes may be made to those embodiments without departing from the scope and coverage of the present disclosure. Moreover, no benefit, advantage, or solution to the problem described herein with respect to a particular embodiment is intended to be construed as a critical, required, or essential feature of the element.
Likewise, other embodiments having the benefit of this disclosure will be apparent to those of ordinary skill in the art and such embodiments are to be considered as encompassed herein.
Claims (40)
1. A method for outputting a haptic signal to a haptic transducer, the method comprising:
storing a representation of a portion of the haptic signal, the representation including a first point of information indicating a first amplitude and at least one first frequency of the portion of the haptic signal at a first time, wherein the representation is associated with a user experience;
in response to receiving an indication of an occurrence of the user experience, determining the haptic signal based on the first point of information such that the portion of the haptic signal has the first amplitude and the at least one first frequency at the first time; and
outputting the haptic signal to the haptic transducer.
2. The method of claim 1, wherein the first time is defined relative to a start time of the portion of the haptic signal.
3. The method of claim 1, wherein the representation of the portion of the haptic signal further comprises a second information point indicating a second amplitude and a second frequency of the portion of the haptic signal at a second time.
4. The method of claim 3, wherein the second time is defined relative to the first time.
5. The method of claim 3, further comprising:
generating the haptic signal based on the second information point such that the portion of the haptic signal has the second amplitude and the second frequency at the second time.
6. The method of claim 5, further comprising:
generating the haptic signal such that an amplitude of the portion of the haptic signal increases from the first amplitude to the second amplitude between the first time and the second time.
7. The method of claim 5, further comprising:
generating the haptic signal such that a frequency of the portion of the haptic signal increases from the first frequency to the second frequency between the first time and the second time.
8. The method of claim 1, wherein the representation of the portion of the haptic signal further comprises a repetition time.
9. The method of claim 8, further comprising:
generating the haptic signal such that the portion of the haptic signal is repeated after the repetition time.
10. The method of claim 8, wherein the representation of the portion of the haptic signal further comprises an indication of a number of repetitions, X, wherein X is an integer value, and the method comprises:
generating the haptic signal such that the portion of the haptic signal is repeated X times at intervals of the repetition time.
11. A method of generating a haptic signal for output to a haptic transducer, the method comprising:
in response to receiving an indication of an occurrence of a user experience, generating a first portion of the haptic signal based on a stored representation of the first portion of the haptic signal, the stored representation of the first portion of the haptic signal including information related to a first amplitude of the first portion of the haptic signal; and
generating a second portion of the haptic signal based on the stored representation of the second portion of the haptic signal, the stored representation of the second portion of the haptic signal including information related to a second magnitude of the second portion of the haptic signal, wherein the representation of the first portion of the haptic signal and the representation of the second portion of the haptic signal are associated with the user experience.
12. The method of claim 11, further comprising:
generating a second portion of the haptic signal at a desired latency time after the first portion of the haptic signal ends.
13. The method of claim 11, wherein the stored representation of the first portion of the haptic signal comprises a pulse code modulation of the first portion of the haptic signal.
14. The method of claim 11, wherein the stored representation of the second portion of the haptic signal comprises a pulse code modulation of the second portion of the haptic signal.
15. The method of claim 11, wherein the stored representation of the first portion of the haptic signal comprises a first information point indicating a first amplitude and at least one first frequency of the first portion of the haptic signal at a first time.
16. The method of claim 11, wherein the stored representation of the second portion of the haptic signal comprises a second information point indicating a second amplitude and at least one second frequency of the second portion of the haptic signal at a second time.
17. The method of claim 11, further comprising:
receiving an audio signal for output to a speaker; wherein the step of receiving an indication of the occurrence of a user experience comprises detecting the user experience in the audio signal.
18. The method of claim 11, wherein the stored representation of the first portion of the haptic signal and the stored representation of the second portion of the haptic signal are associated with the user experience as part of a stored code associated with the user experience.
19. The method of claim 18, wherein the stored code comprises:
an indication of the stored representation of the first portion of the haptic signal;
an indication of the stored representation of the second portion of the haptic signal; and
an indication of an elapsed time between the stored representation of the first portion of the haptic signal and the stored representation of the second portion of the haptic signal.
20. The method of claim 19, wherein the stored code further comprises an indication of a magnitude for playing back the stored representation of the first portion of the haptic signal.
21. A haptic signal generator for outputting a haptic signal to a haptic transducer, the haptic signal generator comprising:
a memory configured to store a representation of a portion of the haptic signal, the representation including a first point of information indicating a first amplitude and at least one first frequency of the portion of the haptic signal at a first time, wherein the representation is associated with a user experience; and
processing circuitry configured to determine the haptic signal based on the first point of information such that the portion of the haptic signal has the first amplitude and the at least one first frequency at the first time in response to receiving an indication of an occurrence of the user experience; and
output the haptic signal to the haptic transducer.
22. The haptic signal generator of claim 21, wherein the first time is defined relative to a start time of the portion of the haptic signal.
23. The haptic signal generator of claim 21, wherein the representation of the portion of the haptic signal further comprises a second information point indicating a second amplitude and a second frequency of the portion of the haptic signal at a second time.
24. The haptic signal generator of claim 23, wherein the second time is defined relative to the first time.
25. The haptic signal generator of claim 23, wherein the processing circuit is further configured to generate the haptic signal based on the second information point such that the portion of the haptic signal has the second amplitude and the second frequency at the second time.
26. The haptic signal generator of claim 25, wherein the processing circuit is further configured to:
generating the haptic signal such that an amplitude of the portion of the haptic signal increases from the first amplitude to the second amplitude between the first time and the second time.
27. The haptic signal generator of claim 25, wherein the processing circuit is further configured to:
generating the haptic signal such that a frequency of the portion of the haptic signal increases from the first frequency to the second frequency between the first time and the second time.
28. The haptic signal generator of claim 21, wherein the representation of the portion of the haptic signal further comprises a repetition time.
29. The haptic signal generator of claim 28, wherein the processing circuit is further configured to:
generating the haptic signal such that the portion of the haptic signal is repeated after the repetition time.
30. The haptic signal generator of claim 28, wherein the representation of the portion of the haptic signal further comprises an indication of a number of repetitions, X, wherein X is an integer value, and the processing circuit is further configured to:
generating the haptic signal such that the portion of the haptic signal is repeated X times at intervals of the repetition time.
31. A haptic signal generator to generate a haptic signal for output to a haptic transducer, the haptic signal generator comprising processing circuitry configured to:
in response to receiving an indication of an occurrence of a user experience, generating a first portion of the haptic signal based on a stored representation of the first portion of the haptic signal, the representation including information related to a first amplitude of the first portion of the haptic signal; and
generating a second portion of the haptic signal based on the stored representation of the second portion of the haptic signal, the stored representation of the second portion of the haptic signal including information related to a second magnitude of the second portion of the haptic signal, wherein the representation of the first portion of the haptic signal and the representation of the second portion of the haptic signal are associated with the user experience.
32. The haptic signal generator of claim 31, wherein the processing circuit is further configured to generate the second portion of the haptic signal at a desired latency after the first portion of the haptic signal ends.
33. The haptic signal generator of claim 31, wherein the stored representation of the first portion of the haptic signal comprises a pulse code modulation of the first portion of the haptic signal.
34. The haptic signal generator of claim 31, wherein the stored representation of the second portion of the haptic signal comprises a pulse code modulation of the second portion of the haptic signal.
35. The haptic signal generator of claim 31, wherein the stored representation of the first portion of the haptic signal comprises a first information point indicating a first amplitude and at least one first frequency of the first portion of the haptic signal at a first time.
36. The haptic signal generator of claim 31, wherein the stored representation of the second portion of the haptic signal comprises a second information point indicating a second amplitude and at least one second frequency of the second portion of the haptic signal at a second time.
37. The haptic signal generator of claim 31, wherein the processing circuit is further configured to receive an audio signal for output to a speaker; wherein the indication of the occurrence of the user experience comprises the processing circuit detecting the user experience in the audio signal.
38. The haptic signal generator of claim 31, wherein the stored representation of the first portion of the haptic signal and the stored representation of the second portion of the haptic signal are associated with the user experience as part of a stored code associated with the user experience.
39. The haptic signal generator of claim 38, wherein the stored code comprises:
an indication of the stored representation of the first portion of the haptic signal;
an indication of the stored representation of the second portion of the haptic signal; and
an indication of an elapsed time between the stored representation of the first portion of the haptic signal and the stored representation of the second portion of the haptic signal.
40. The haptic signal generator of claim 39, wherein the stored code further comprises an indication of a magnitude for playing back the stored representation of the first portion of the haptic signal.
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862667009P | 2018-05-04 | 2018-05-04 | |
US62/667,009 | 2018-05-04 | ||
US201862670325P | 2018-05-11 | 2018-05-11 | |
US62/670,325 | 2018-05-11 | ||
US16/280,256 | 2019-02-20 | ||
US16/280,256 US11069206B2 (en) | 2018-05-04 | 2019-02-20 | Methods and apparatus for outputting a haptic signal to a haptic transducer |
PCT/GB2019/051189 WO2019211591A1 (en) | 2018-05-04 | 2019-04-30 | Methods and apparatus for outputting a haptic signal to a haptic transducer |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112041790A true CN112041790A (en) | 2020-12-04 |
Family
ID=68385431
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980029423.7A Pending CN112041790A (en) | 2018-05-04 | 2019-04-30 | Method and apparatus for outputting haptic signals to a haptic transducer |
Country Status (5)
Country | Link |
---|---|
US (1) | US11069206B2 (en) |
EP (1) | EP3788458A1 (en) |
KR (1) | KR20210002716A (en) |
CN (1) | CN112041790A (en) |
WO (1) | WO2019211591A1 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
US10832537B2 (en) | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US11269415B2 (en) | 2018-08-14 | 2022-03-08 | Cirrus Logic, Inc. | Haptic output systems |
GB201817495D0 (en) | 2018-10-26 | 2018-12-12 | Cirrus Logic Int Semiconductor Ltd | A force sensing system and method |
US10992297B2 (en) | 2019-03-29 | 2021-04-27 | Cirrus Logic, Inc. | Device comprising force sensors |
US11509292B2 (en) | 2019-03-29 | 2022-11-22 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter |
US10955955B2 (en) | 2019-03-29 | 2021-03-23 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors |
US11644370B2 (en) | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load |
US11283337B2 (en) | 2019-03-29 | 2022-03-22 | Cirrus Logic, Inc. | Methods and systems for improving transducer dynamics |
US10976825B2 (en) | 2019-06-07 | 2021-04-13 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system |
WO2020254788A1 (en) | 2019-06-21 | 2020-12-24 | Cirrus Logic International Semiconductor Limited | A method and apparatus for configuring a plurality of virtual buttons on a device |
US11408787B2 (en) | 2019-10-15 | 2022-08-09 | Cirrus Logic, Inc. | Control methods for a force sensor system |
US11380175B2 (en) | 2019-10-24 | 2022-07-05 | Cirrus Logic, Inc. | Reproducibility of haptic waveform |
US11545951B2 (en) | 2019-12-06 | 2023-01-03 | Cirrus Logic, Inc. | Methods and systems for detecting and managing amplifier instability |
US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator |
US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters |
US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system |
US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive |
US11552649B1 (en) | 2021-12-03 | 2023-01-10 | Cirrus Logic, Inc. | Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths |
Family Cites Families (193)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3686927A (en) | 1967-03-24 | 1972-08-29 | Bolt Beranek & Newman | Vibration testing method and apparatus |
JPS54131890A (en) | 1978-04-05 | 1979-10-13 | Toshiba Corp | Semiconductor device |
JPS58169960A (en) | 1983-02-18 | 1983-10-06 | Nec Corp | Integrated circuit containing capacity element |
DE3743131A1 (en) | 1987-10-26 | 1989-05-03 | Siemens Ag | ARRANGEMENT FOR HIGH-RESOLUTION SPECTROSCOPY |
US5684722A (en) | 1994-09-21 | 1997-11-04 | Thorner; Craig | Apparatus and method for generating a control signal for a tactile sensation generator |
JP3295564B2 (en) | 1994-11-24 | 2002-06-24 | 株式会社テラテック | Analog-to-digital converter |
US5748578A (en) | 1995-01-25 | 1998-05-05 | Discovision Associates | Colpitts type oscillator having reduced ringing and improved optical disc system utilizing same |
KR19990037725A (en) | 1995-09-02 | 1999-05-25 | 헨리 에이지마 | Display means combined with loudspeakers |
US5857986A (en) | 1996-05-24 | 1999-01-12 | Moriyasu; Hiro | Interactive vibrator for multimedia |
JP3525015B2 (en) | 1996-10-14 | 2004-05-10 | 愛三工業株式会社 | Oscillator driving device and powder supply device |
US6278790B1 (en) | 1997-11-11 | 2001-08-21 | Nct Group, Inc. | Electroacoustic transducers comprising vibrating panels |
US6519346B1 (en) | 1998-01-16 | 2003-02-11 | Sony Corporation | Speaker apparatus and electronic apparatus having a speaker apparatus enclosed therein |
JP3397116B2 (en) | 1998-01-27 | 2003-04-14 | ヤマハ株式会社 | Sound effect imparting device |
US6762745B1 (en) | 1999-05-10 | 2004-07-13 | Immersion Corporation | Actuator control providing linear and continuous force output |
DE20080209U1 (en) | 1999-09-28 | 2001-08-09 | Immersion Corp | Control of haptic sensations for interface devices with vibrotactile feedback |
JP3337669B2 (en) | 1999-12-27 | 2002-10-21 | 株式会社半導体理工学研究センター | Semiconductor integrated circuit |
US20020018578A1 (en) | 2000-08-03 | 2002-02-14 | Paul Burton | Bending wave loudspeaker |
US6906697B2 (en) | 2000-08-11 | 2005-06-14 | Immersion Corporation | Haptic sensations for tactile feedback interface devices |
US7084854B1 (en) | 2000-09-28 | 2006-08-01 | Immersion Corporation | Actuator for providing tactile sensations and device for directional tactile sensations |
US7154470B2 (en) | 2001-07-17 | 2006-12-26 | Immersion Corporation | Envelope modulator for haptic feedback devices |
US6661410B2 (en) | 2001-09-07 | 2003-12-09 | Microsoft Corporation | Capacitive sensing and data input device power management |
US7623114B2 (en) | 2001-10-09 | 2009-11-24 | Immersion Corporation | Haptic feedback sensations based on audio output from computer devices |
US6703550B2 (en) | 2001-10-10 | 2004-03-09 | Immersion Corporation | Sound data output and manipulation using haptic feedback |
US6683437B2 (en) | 2001-10-31 | 2004-01-27 | Immersion Corporation | Current controlled motor amplifier system |
US7158122B2 (en) | 2002-05-17 | 2007-01-02 | 3M Innovative Properties Company | Calibration of force based touch panel systems |
AU2003286504A1 (en) | 2002-10-20 | 2004-05-13 | Immersion Corporation | System and method for providing rotational haptic feedback |
US7742036B2 (en) | 2003-12-22 | 2010-06-22 | Immersion Corporation | System and method for controlling haptic devices having multiple operational modes |
US7791588B2 (en) | 2003-12-22 | 2010-09-07 | Immersion Corporation | System and method for mapping instructions associated with haptic feedback |
US7765333B2 (en) | 2004-07-15 | 2010-07-27 | Immersion Corporation | System and method for ordering haptic effects |
JP2006048302A (en) | 2004-08-03 | 2006-02-16 | Sony Corp | Piezoelectric complex unit, its manufacturing method, its handling method, its control method, input/output device and electronic equipment |
JP4997114B2 (en) | 2004-11-30 | 2012-08-08 | イマージョン コーポレイション | System and method for controlling a resonant device to generate vibrotactile haptic effects |
US20060284856A1 (en) | 2005-06-10 | 2006-12-21 | Soss David A | Sensor signal conditioning in a force-based touch device |
WO2007003984A1 (en) | 2005-06-30 | 2007-01-11 | Freescale Semiconductor, Inc. | Device and method for arbitrating between direct memory access task requests |
US8700791B2 (en) | 2005-10-19 | 2014-04-15 | Immersion Corporation | Synchronization of haptic effect data in a media transport stream |
US7979146B2 (en) | 2006-04-13 | 2011-07-12 | Immersion Corporation | System and method for automatically producing haptic events from a digital audio signal |
US9097639B2 (en) | 2012-12-28 | 2015-08-04 | General Electric Company | Systems for analysis of fluids |
WO2008083315A2 (en) | 2006-12-31 | 2008-07-10 | Personics Holdings Inc. | Method and device configured for sound signature detection |
US8136952B2 (en) | 2007-02-20 | 2012-03-20 | Canon Kabushiki Kaisha | Image capturing apparatus |
US8098234B2 (en) | 2007-02-20 | 2012-01-17 | Immersion Corporation | Haptic feedback system with stored effects |
JP2008219202A (en) | 2007-02-28 | 2008-09-18 | National Institute Of Information & Communication Technology | Acoustic vibration reproducing device |
US20080293453A1 (en) | 2007-05-25 | 2008-11-27 | Scott J. Atlas | Method and apparatus for an audio-linked remote indicator for a wireless communication device |
US8621348B2 (en) | 2007-05-25 | 2013-12-31 | Immersion Corporation | Customizing haptic effects on an end user device |
US8659208B1 (en) | 2007-06-14 | 2014-02-25 | Misonix, Inc. | Waveform generator for driving electromechanical device |
US8988359B2 (en) | 2007-06-19 | 2015-03-24 | Nokia Corporation | Moving buttons |
US9654104B2 (en) | 2007-07-17 | 2017-05-16 | Apple Inc. | Resistive force sensor with capacitive discrimination |
US10126942B2 (en) | 2007-09-19 | 2018-11-13 | Apple Inc. | Systems and methods for detecting a press on a touch-sensitive surface |
US20090079690A1 (en) | 2007-09-21 | 2009-03-26 | Sony Computer Entertainment America Inc. | Method and apparatus for enhancing entertainment software through haptic insertion |
US20090088220A1 (en) | 2007-10-01 | 2009-04-02 | Sony Ericsson Mobile Communications Ab | Cellular terminals and other electronic devices and methods using electroactive polymer transducer indicators |
US9019087B2 (en) | 2007-10-16 | 2015-04-28 | Immersion Corporation | Synchronization of haptic effect data in a media stream |
US8325144B1 (en) | 2007-10-17 | 2012-12-04 | Immersion Corporation | Digital envelope modulator for haptic feedback devices |
US20090102805A1 (en) | 2007-10-18 | 2009-04-23 | Microsoft Corporation | Three-dimensional object simulation using audio, visual, and tactile feedback |
KR100941638B1 (en) | 2007-12-18 | 2010-02-11 | Electronics and Telecommunications Research Institute | Touching behavior recognition system and method |
KR101956999B1 (en) | 2008-07-15 | 2019-03-11 | Immersion Corporation | Systems and methods for transmitting haptic messages |
US9400555B2 (en) | 2008-10-10 | 2016-07-26 | Internet Services, Llc | System and method for synchronization of haptic data and media data |
US20100141408A1 (en) | 2008-12-05 | 2010-06-10 | Anthony Stephen Doy | Audio amplifier apparatus to drive a panel to produce both an audio signal and haptic feedback |
US7843277B2 (en) | 2008-12-16 | 2010-11-30 | Immersion Corporation | Haptic feedback generation based on resonant frequency |
US8068025B2 (en) | 2009-05-28 | 2011-11-29 | Simon Paul Devenyi | Personal alerting device and method |
KR20110019144A (en) | 2009-08-19 | 2011-02-25 | LG Electronics Inc. | Apparatus and method for generating vibration pattern |
JP2011057000A (en) | 2009-09-07 | 2011-03-24 | Yamaha Corp | Acoustic resonance device |
US8902050B2 (en) | 2009-10-29 | 2014-12-02 | Immersion Corporation | Systems and methods for haptic augmentation of voice-to-text conversion |
US8633916B2 (en) | 2009-12-10 | 2014-01-21 | Apple, Inc. | Touch pad with force sensors and actuator feedback |
KR101642149B1 (en) | 2010-01-05 | 2016-07-25 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling haptic feedback in portable terminal having touch-screen |
US8432368B2 (en) | 2010-01-06 | 2013-04-30 | Qualcomm Incorporated | User interface methods and systems for providing force-sensitive input |
JP2013517548A (en) | 2010-01-13 | 2013-05-16 | イーロ・タッチ・ソリューションズ・インコーポレイテッド | Noise reduction in electronic devices with touch-sensitive surfaces |
US20110187651A1 (en) | 2010-02-03 | 2011-08-04 | Honeywell International Inc. | Touch screen having adaptive input parameter |
JP5841713B2 (en) | 2010-07-27 | 2016-01-13 | Kyocera Corporation | Tactile sensation presentation apparatus and control method for tactile sensation presentation apparatus |
US9329721B1 (en) | 2010-08-05 | 2016-05-03 | Amazon Technologies, Inc. | Reduction of touch-sensor interference from stable display |
US9262002B2 (en) | 2010-11-03 | 2016-02-16 | Qualcomm Incorporated | Force sensing touch screen |
US20120112894A1 (en) | 2010-11-08 | 2012-05-10 | Korea Advanced Institute Of Science And Technology | Haptic feedback generator, portable device, haptic feedback providing method using the same and recording medium thereof |
US8797830B2 (en) | 2011-02-02 | 2014-08-05 | General Monitors, Inc. | Explosion-proof acoustic source for hazardous locations |
US8717152B2 (en) | 2011-02-11 | 2014-05-06 | Immersion Corporation | Sound to haptic effect conversion system using waveform |
US9448626B2 (en) | 2011-02-11 | 2016-09-20 | Immersion Corporation | Sound to haptic effect conversion system using amplitude value |
DK2487780T3 (en) | 2011-02-14 | 2020-03-02 | Siemens Ag | Control unit for a power converter and method of operation thereof |
EP2489442A1 (en) | 2011-02-18 | 2012-08-22 | Aernnova Engineering Solutions Iberica | Integrated phased array transducer, system and methodology for structural health monitoring of aerospace structures |
US20120229264A1 (en) | 2011-03-09 | 2012-09-13 | Analog Devices, Inc. | Smart linear resonant actuator control |
KR20120126446A (en) | 2011-05-11 | 2012-11-21 | LG Electronics Inc. | An apparatus for generating the vibrating feedback from input audio signal |
US9083821B2 (en) | 2011-06-03 | 2015-07-14 | Apple Inc. | Converting audio to haptic feedback in an electronic device |
US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
US11262253B2 (en) | 2017-08-14 | 2022-03-01 | Sentons Inc. | Touch input detection using a piezoresistive sensor |
US20130141382A1 (en) | 2011-12-01 | 2013-06-06 | Martin John Simmons | Touch Sensor With Force Sensing |
GB201200587D0 (en) | 2012-01-13 | 2012-02-29 | Hiwave Technologies Uk Ltd | Haptic feedback and pressure sensing |
US10632040B2 (en) * | 2012-02-29 | 2020-04-28 | Frederick Muench | Systems, devices, components and methods for triggering or inducing resonance or high amplitude oscillations in a cardiovascular system of a patient |
US9715276B2 (en) | 2012-04-04 | 2017-07-25 | Immersion Corporation | Sound to haptic effect conversion system using multiple actuators |
US20130275058A1 (en) | 2012-04-13 | 2013-10-17 | Google Inc. | Apparatus and method for a pressure sensitive device interface |
US9117449B2 (en) | 2012-04-26 | 2015-08-25 | Nuance Communications, Inc. | Embedded system for construction of small footprint speech recognition with user-definable constraints |
WO2013166439A1 (en) | 2012-05-04 | 2013-11-07 | Setem Technologies, Llc | Systems and methods for source signal separation |
US9977499B2 (en) | 2012-05-09 | 2018-05-22 | Apple Inc. | Thresholds for determining feedback in computing devices |
US8847741B2 (en) | 2012-05-16 | 2014-09-30 | Immersion Corporation | System and method for display of multiple data channels on a single haptic display |
WO2013186845A1 (en) | 2012-06-11 | 2013-12-19 | Fujitsu Limited | Electronic device, vibration generation program, and system using vibration patterns |
US9063570B2 (en) | 2012-06-27 | 2015-06-23 | Immersion Corporation | Haptic feedback control system |
US9030428B2 (en) | 2012-07-11 | 2015-05-12 | Immersion Corporation | Generating haptic effects for dynamic events |
WO2014018086A1 (en) | 2012-07-26 | 2014-01-30 | Changello Enterprise Llc | Force correction on multiple sense elements |
KR101589421B1 (en) | 2012-08-16 | 2016-01-27 | Action Research Co., Ltd. | Vibration processing device and method |
WO2014031756A2 (en) | 2012-08-21 | 2014-02-27 | Immerz, Inc. | Systems and methods for a vibrating input device |
US9368005B2 (en) | 2012-08-31 | 2016-06-14 | Immersion Corporation | Sound to haptic effect conversion system using mapping |
US9092059B2 (en) | 2012-10-26 | 2015-07-28 | Immersion Corporation | Stream-independent sound to haptic effect conversion system |
US20140119244A1 (en) | 2012-11-01 | 2014-05-01 | Research In Motion Limited | Cognitive radio rf front end |
US8947216B2 (en) | 2012-11-02 | 2015-02-03 | Immersion Corporation | Encoding dynamic haptic effects |
US9122330B2 (en) | 2012-11-19 | 2015-09-01 | Disney Enterprises, Inc. | Controlling a user's tactile perception in a dynamic physical environment |
KR102141044B1 (en) | 2012-12-03 | 2020-08-04 | Samsung Electronics Co., Ltd. | Apparatus having a plurality of touch screens and method for sound output thereof |
KR102091077B1 (en) | 2012-12-14 | 2020-04-14 | Samsung Electronics Co., Ltd. | Mobile terminal and method for controlling feedback of an input unit, and the input unit and method therefor |
WO2014094283A1 (en) | 2012-12-20 | 2014-06-26 | Intel Corporation | Touchscreen including force sensors |
US9128523B2 (en) | 2012-12-20 | 2015-09-08 | Amazon Technologies, Inc. | Dynamically generating haptic effects from audio data |
US9117347B2 (en) | 2013-02-25 | 2015-08-25 | Nokia Technologies Oy | Method and apparatus for a flexible housing |
CN103165328B (en) | 2013-02-25 | 2016-06-08 | Darfon Electronics (Suzhou) Co., Ltd. | Force feedback keyboard structure |
US9489047B2 (en) | 2013-03-01 | 2016-11-08 | Immersion Corporation | Haptic device with linear resonant actuator |
US9715300B2 (en) | 2013-03-04 | 2017-07-25 | Microsoft Technology Licensing, Llc | Touch screen interaction using dynamic haptic feedback |
US8754757B1 (en) | 2013-03-05 | 2014-06-17 | Immersion Corporation | Automatic fitting of haptic effects |
US11393461B2 (en) | 2013-03-12 | 2022-07-19 | Cerence Operating Company | Methods and apparatus for detecting a voice command |
KR101666393B1 (en) | 2013-03-27 | 2016-10-14 | Electronics and Telecommunications Research Institute | Apparatus and method for reproducing haptic effect using sound effect |
US9519346B2 (en) | 2013-05-17 | 2016-12-13 | Immersion Corporation | Low-frequency effects haptic conversion system |
US9274603B2 (en) | 2013-05-24 | 2016-03-01 | Immersion Corporation | Method and apparatus to provide haptic feedback based on media content and one or more external parameters |
US9196135B2 (en) | 2013-06-28 | 2015-11-24 | Immersion Corporation | Uniform haptic actuator response with a variable supply voltage |
DE102013012811B4 (en) | 2013-08-01 | 2024-02-22 | Wolfgang Klippel | Arrangement and method for identifying and correcting the nonlinear properties of electromagnetic transducers |
US10162416B2 (en) | 2013-09-06 | 2018-12-25 | Immersion Corporation | Dynamic haptic conversion system |
US9898085B2 (en) | 2013-09-06 | 2018-02-20 | Immersion Corporation | Haptic conversion system using segmenting and combining |
US9619980B2 (en) | 2013-09-06 | 2017-04-11 | Immersion Corporation | Systems and methods for generating haptic effects associated with audio signals |
US9520036B1 (en) | 2013-09-18 | 2016-12-13 | Amazon Technologies, Inc. | Haptic output generation with dynamic feedback control |
US9213408B2 (en) | 2013-10-08 | 2015-12-15 | Immersion Corporation | Generating haptic effects while minimizing cascading |
US9164587B2 (en) * | 2013-11-14 | 2015-10-20 | Immersion Corporation | Haptic spatialization system |
CN105745031A (en) | 2013-12-06 | 2016-07-06 | Fujitsu Limited | Drive device, electronic equipment, drive control program, and drive signal-generating method |
US9248840B2 (en) | 2013-12-20 | 2016-02-02 | Immersion Corporation | Gesture based input system in a vehicle with haptic feedback |
US9946348B2 (en) | 2014-03-21 | 2018-04-17 | Immersion Corporation | Automatic tuning of haptic effects |
US9959744B2 (en) | 2014-04-25 | 2018-05-01 | Motorola Solutions, Inc. | Method and system for providing alerts for radio communications |
US9928728B2 (en) | 2014-05-09 | 2018-03-27 | Sony Interactive Entertainment Inc. | Scheme for embedding a control signal in an audio signal using pseudo white noise |
KR102229137B1 (en) | 2014-05-20 | 2021-03-18 | Samsung Display Co., Ltd. | Display apparatus |
US9588586B2 (en) | 2014-06-09 | 2017-03-07 | Immersion Corporation | Programmable haptic devices and methods for modifying haptic strength based on perspective and/or proximity |
CN204903757U (en) | 2014-07-11 | 2015-12-23 | 菲力尔系统公司 | Sonar system |
KR101641418B1 (en) | 2014-07-25 | 2016-07-20 | POSTECH Academy-Industry Foundation | Method for haptic signal generation based on auditory saliency and apparatus therefor |
US9921678B2 (en) | 2014-08-05 | 2018-03-20 | Georgia Tech Research Corporation | Self-powered, ultra-sensitive, flexible tactile sensors based on contact electrification |
KR102143310B1 (en) * | 2014-09-02 | 2020-08-28 | Apple Inc. | Haptic notifications |
JP6501487B2 (en) | 2014-10-27 | 2019-04-17 | Canon Inc. | Ultrasonic motor and drive device using ultrasonic motor |
KR102292385B1 (en) | 2014-11-19 | 2021-08-23 | Samsung SDI Co., Ltd. | Positive active material for rechargeable lithium battery, method of preparing the same, and rechargeable lithium battery including the same |
US9846484B2 (en) * | 2014-12-04 | 2017-12-19 | Immersion Corporation | Systems and methods for controlling haptic signals |
US10613628B2 (en) | 2014-12-23 | 2020-04-07 | Immersion Corporation | Media driven haptics |
WO2016138144A2 (en) * | 2015-02-25 | 2016-09-01 | Immersion Corporation | Systems and methods for providing context-sensitive haptic notification frameworks |
US9612685B2 (en) | 2015-04-09 | 2017-04-04 | Microsoft Technology Licensing, Llc | Force-sensitive touch sensor compensation |
MX2018000964A (en) | 2015-08-05 | 2018-05-22 | Ford Global Tech Llc | System and method for sound direction detection in a vehicle. |
DK3148214T3 (en) | 2015-09-15 | 2022-01-03 | Oticon As | HEARING DEVICE INCLUDING AN IMPROVED FEEDBACK CANCELLATION SYSTEM |
US9842476B2 (en) | 2015-09-25 | 2017-12-12 | Immersion Corporation | Programmable haptic devices and methods for modifying haptic effects to compensate for audio-haptic interference |
US9971407B2 (en) | 2015-09-30 | 2018-05-15 | Apple Inc. | Haptic feedback for rotary inputs |
US9740245B2 (en) | 2015-10-05 | 2017-08-22 | Microsoft Technology Licensing, Llc | Locking mechanism |
US20170153760A1 (en) | 2015-12-01 | 2017-06-01 | Apple Inc. | Gain-based error tracking for force sensing |
EP3179335B1 (en) | 2015-12-10 | 2020-03-04 | Nxp B.V. | Haptic feedback controller |
CN105446646B (en) | 2015-12-11 | 2019-01-11 | Xiaomi Inc. | Content input method and device based on a virtual keyboard, and touch control device |
CN105630021B (en) | 2015-12-31 | 2018-07-31 | Goertek Inc. | Tactile vibration control system and method for an intelligent terminal |
CN105511514B (en) | 2015-12-31 | 2019-03-15 | Goertek Inc. | Tactile vibration control system and method for an intelligent terminal |
US20170220197A1 (en) | 2016-02-02 | 2017-08-03 | Fujitsu Ten Limited | Input device, system, method of manufacturing input device and display device |
US9881467B2 (en) | 2016-02-22 | 2018-01-30 | Immersion Corporation | Haptic effects conflict avoidance |
US10198125B2 (en) | 2016-03-22 | 2019-02-05 | Synaptics Incorporated | Force sensor recalibration |
US10534643B2 (en) | 2016-05-09 | 2020-01-14 | Oracle International Corporation | Correlation of thread intensity and heap usage to identify heap-hoarding stack traces |
KR101790892B1 (en) | 2016-05-17 | 2017-10-26 | CK Materials Lab Co., Ltd. | Method of transforming a sound signal into a tactile signal, and haptic device using the same |
US9965092B2 (en) | 2016-05-18 | 2018-05-08 | Apple Inc. | Managing power consumption of force sensors |
US10073525B2 (en) | 2016-06-16 | 2018-09-11 | Immersion Corporation | Systems and methods for a low profile haptic actuator |
US9886829B2 (en) | 2016-06-20 | 2018-02-06 | Immersion Corporation | Systems and methods for closed-loop control for haptic feedback |
US10304298B2 (en) | 2016-07-27 | 2019-05-28 | Immersion Corporation | Braking characteristic detection system for haptic actuator |
US20180082673A1 (en) | 2016-07-28 | 2018-03-22 | Theodore Tzanetos | Active noise cancellation for defined spaces |
US9697450B1 (en) | 2016-07-29 | 2017-07-04 | Alpha And Omega Semiconductor Incorporated | Magnetic stripe data transmission system and method for reliable data transmission and low power consumption |
US9921609B2 (en) | 2016-08-02 | 2018-03-20 | Immersion Corporation | Systems and methods for deformation and haptic effects |
US10890973B2 (en) | 2016-08-31 | 2021-01-12 | Apple Inc. | Electronic device including multi-phase driven linear haptic actuator and related methods |
CN106438890B (en) | 2016-09-05 | 2018-08-28 | Nanjing University of Aeronautics and Astronautics | Macro-micro combined infinitely variable speed transmission using an electromagnet and ultrasonic transducer, and method therefor |
DK201670728A1 (en) | 2016-09-06 | 2018-03-19 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Providing Feedback During Interaction with an Intensity-Sensitive Button |
DK201670720A1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs |
WO2018049355A1 (en) | 2016-09-09 | 2018-03-15 | Sensel Inc. | System for detecting and characterizing inputs on a touch sensor |
JP2020502607A (en) | 2016-09-14 | 2020-01-23 | ソニックセンソリー、インコーポレイテッド | Multi-device audio streaming system with synchronization |
US10469971B2 (en) | 2016-09-19 | 2019-11-05 | Apple Inc. | Augmented performance synchronization |
US9929703B1 (en) | 2016-09-27 | 2018-03-27 | Cirrus Logic, Inc. | Amplifier with configurable final output stage |
US10942596B2 (en) | 2016-10-03 | 2021-03-09 | Carnegie Mellon University | Touch-sensing system |
KR20180062174A (en) | 2016-11-30 | 2018-06-08 | Samsung Electronics Co., Ltd. | Method for producing a haptic signal and electronic device supporting the same |
GB201620746D0 (en) | 2016-12-06 | 2017-01-18 | Dialog Semiconductor Uk Ltd | An apparatus and method for controlling a haptic actuator |
US10341767B2 (en) | 2016-12-06 | 2019-07-02 | Cirrus Logic, Inc. | Speaker protection excursion oversight |
JP6588421B2 (en) * | 2016-12-28 | 2019-10-09 | Nintendo Co., Ltd. | Information processing system, information processing program, information processing apparatus, and information processing method |
US10261685B2 (en) | 2016-12-29 | 2019-04-16 | Google Llc | Multi-task machine learning for predicted touch interpretations |
US20180196567A1 (en) | 2017-01-09 | 2018-07-12 | Microsoft Technology Licensing, Llc | Pressure sensitive virtual keyboard |
CN106950832B (en) | 2017-03-08 | 2020-01-31 | Hangzhou Dianzi University | Ultrasonic dispersion control device using cavitation intensity feedback |
US10032550B1 (en) | 2017-03-30 | 2018-07-24 | Apple Inc. | Moving-coil haptic actuator for electronic devices |
US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
US9964732B1 (en) | 2017-05-15 | 2018-05-08 | Semiconductor Components Industries, Llc | Methods and apparatus for actuator control |
DK201770372A1 (en) | 2017-05-16 | 2019-01-08 | Apple Inc. | Tactile feedback for locked device user interfaces |
GB2563460B (en) | 2017-06-15 | 2021-07-14 | Cirrus Logic Int Semiconductor Ltd | Temperature monitoring for loudspeakers |
US10601355B2 (en) | 2017-09-29 | 2020-03-24 | Apple Inc. | Closed-loop control of linear resonant actuator using back EMF and inertial compensation |
US10110152B1 (en) | 2017-09-29 | 2018-10-23 | Apple Inc. | Integrated driver and controller for haptic engine |
GB201801661D0 (en) | 2017-10-13 | 2018-03-21 | Cirrus Logic International Uk Ltd | Detection of liveness |
KR102430582B1 (en) | 2017-11-28 | 2022-08-08 | LG Display Co., Ltd. | Display Device |
US10546585B2 (en) | 2017-12-29 | 2020-01-28 | Comcast Cable Communications, Llc | Localizing and verifying utterances by audio fingerprinting |
US10264348B1 (en) | 2017-12-29 | 2019-04-16 | Nvf Tech Ltd | Multi-resonant coupled system for flat panel actuation |
US10620704B2 (en) | 2018-01-19 | 2020-04-14 | Cirrus Logic, Inc. | Haptic output systems |
US10782785B2 (en) | 2018-01-29 | 2020-09-22 | Cirrus Logic, Inc. | Vibro-haptic design and automatic evaluation of haptic stimuli |
US11139767B2 (en) | 2018-03-22 | 2021-10-05 | Cirrus Logic, Inc. | Methods and apparatus for driving a transducer |
US10795443B2 (en) | 2018-03-23 | 2020-10-06 | Cirrus Logic, Inc. | Methods and apparatus for driving a transducer |
US10667051B2 (en) | 2018-03-26 | 2020-05-26 | Cirrus Logic, Inc. | Methods and apparatus for limiting the excursion of a transducer |
US10832537B2 (en) | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US10707828B2 (en) | 2018-05-04 | 2020-07-07 | Samsung Electro-Mechanics Co., Ltd. | Filter including bulk acoustic wave resonator |
- 2019-02-20: US application US16/280,256, published as US11069206B2 (status: Active)
- 2019-04-30: KR application KR1020207034844A, published as KR20210002716A (status: Not active, application discontinued)
- 2019-04-30: EP application EP19722673.1A, published as EP3788458A1 (status: Not active, ceased)
- 2019-04-30: CN application CN201980029423.7A, published as CN112041790A (status: Active, pending)
- 2019-04-30: WO application PCT/GB2019/051189, published as WO2019211591A1 (status: Active, application filing)
Also Published As
Publication number | Publication date |
---|---|
US11069206B2 (en) | 2021-07-20 |
US20190340895A1 (en) | 2019-11-07 |
KR20210002716A (en) | 2021-01-08 |
WO2019211591A1 (en) | 2019-11-07 |
EP3788458A1 (en) | 2021-03-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112041790A (en) | | Method and apparatus for outputting haptic signals to a haptic transducer |
JP7061143B2 (en) | | Tactile notification |
JP7240347B2 (en) | | Devices, methods, and graphical user interfaces that provide haptic feedback |
US10741189B2 (en) | | Low-frequency effects haptic conversion system |
JP6096092B2 (en) | | Tactile feedback system with memorized effect |
CN111919381A (en) | | Method and apparatus for outputting haptic signals to a haptic transducer |
EP2136286B1 (en) | | System and method for automatically producing haptic events from a digital audio file |
EP2703951B1 (en) | | Sound to haptic effect conversion system using mapping |
CN108762506B (en) | | Stream independent sound to haptic effect conversion system |
US20170131776A1 (en) | | Systems and Methods for Multi-Pressure Interaction on Touch-Sensitive Surfaces |
JPWO2013186846A1 (en) | | Programs and electronic devices |
EP3674849A1 (en) | | Haptic effect signal processing |
KR102011771B1 (en) | | Method and apparatus for providing user interface |
KR101215099B1 (en) | | Method for storing vibration pattern of vibrator |
EP4236285A1 (en) | | Enhanced vibration prompting method and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||