CN108803869A - Encode dynamic haptic effects - Google Patents
- Publication number
- CN108803869A (application CN201810371898.1A)
- Authority
- CN
- China
- Prior art keywords
- haptic effect
- haptic
- dynamic
- value
- key frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B6/00—Tactile signalling systems, e.g. personal calling systems
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Complex Calculations (AREA)
Abstract
This disclosure relates to encoding dynamic haptic effects. A system for encoding one or more dynamic haptic effects is provided. The system defines a dynamic haptic effect as including a plurality of key frames, where each key frame includes an interpolant value and a corresponding haptic effect. The interpolant value specifies where the interpolation occurs. The system generates a haptic effect file and stores the dynamic haptic effect within the haptic effect file.
Description
This application is a divisional application of Chinese patent application 201310471458.0, filed October 11, 2013, and entitled "Encoding Dynamic Haptic Effects."
Technical field
One embodiment is directed generally to haptic effects, and more particularly, to encoding dynamic haptic effects.
Background
Electronic device manufacturers strive to produce rich interfaces for users. Conventional devices use visual and auditory cues to provide feedback to the user. In some interface devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user, more generally known collectively as "haptic feedback" or "haptic effects." Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or to provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.
Haptic feedback is increasingly incorporated into portable electronic devices, referred to as "handheld devices" or "portable devices," such as cellular telephones, personal digital assistants ("PDAs"), smartphones, and portable gaming devices. For example, some portable gaming applications are capable of vibrating in a manner similar to control devices (e.g., joysticks, etc.) used with larger-scale gaming systems that are configured to provide haptic feedback. Additionally, devices such as cellular telephones and smartphones are capable of providing various alerts to users by way of vibrations. For example, a cellular telephone can alert a user to an incoming telephone call by vibrating. Similarly, a smartphone can alert a user to a scheduled calendar item, or provide a user with a reminder for a "to do" list item or calendar appointment. Further, haptic effects can be used to simulate "real world" dynamic events, such as the feel of a bouncing ball in a video game.
Summary of the invention
One embodiment is a system that encodes a haptic signal. The system receives one or more key frames, where each key frame includes an interpolant value and a haptic effect. The system further generates a haptic effect signal using the one or more key frames. The system further stores the haptic effect signal within a haptic effect file.
Another embodiment is a system that encodes a dynamic haptic effect. The system defines the dynamic haptic effect as including one or more key frames, where each key frame includes an interpolant value and a corresponding haptic effect, and where the interpolant value specifies, for the corresponding haptic effect, where an interpolation occurs. The system further generates a haptic effect file. The system further stores the dynamic haptic effect within the haptic effect file.
Brief description of the drawings
Further embodiments, details, advantages, and modifications will become apparent from the following detailed description of the preferred embodiments, which is to be taken in conjunction with the accompanying drawings.
Fig. 1 illustrates a block diagram of a system in accordance with one embodiment of the invention.
Fig. 2 illustrates an example dynamic haptic effect definition, according to an embodiment of the invention.
Fig. 3 illustrates an example key frame definition, according to an embodiment of the invention.
Fig. 4 illustrates an example basic haptic effect storage block, according to an embodiment of the invention.
Fig. 5 illustrates an example frame list block, according to an embodiment of the invention.
Fig. 6 illustrates a flow diagram of the functionality of a haptic encoding module, according to an embodiment of the invention.
Fig. 7 illustrates a flow diagram of the functionality of a haptic encoding module, according to another embodiment of the invention.
Detailed description
As described herein, a "dynamic haptic effect" refers to a haptic effect that evolves over time as it responds to one or more input parameters. Dynamic haptic effects are haptic or vibrotactile effects displayed on haptic devices to represent a change in state of a given input signal. The input signal can be a signal captured by sensors on the device with haptic feedback, such as position, acceleration, pressure, orientation, or proximity, or a signal captured by other devices and sent to the haptic device to influence the generation of the haptic effect.
A dynamic effect signal can be any type of signal, but does not necessarily have to be complex. For example, a dynamic effect signal may be a simple sine wave that has some property, such as phase, frequency, or amplitude, that changes over time or reacts in real time according to a mapping schema that maps an input parameter onto a changing property of the effect signal. An input parameter may be any type of input capable of being provided by a device, and typically may be any type of signal, such as a device sensor signal. A device sensor signal may be generated by any means, and typically may be generated by capturing a user gesture with a device. Dynamic effects may be very useful for gesture interfaces, but the use of gestures or sensors is not necessarily required to create a dynamic signal.
One common scenario in which a dynamic effect does not involve gestures is defining the dynamic haptic behavior of an animated widget. For example, when a user scrolls a list, it is usually not the haptification of the gesture that feels most intuitive, but rather the motion of the widget in response to the gesture. In the scroll-list example, gently sliding the list may generate dynamic haptic feedback that changes according to the scroll velocity, but flinging the scroll bar may produce dynamic haptics even after the gesture has ended. This creates the illusion that the widget has some physical properties, and it provides the user with information about the state of the widget, such as its velocity or whether it is in motion.
A gesture is any movement of the body that conveys meaning or user intent. It will be recognized that simple gestures may be combined to form more complex gestures. For example, bringing a finger into contact with a touch-sensitive surface may be referred to as a "finger on" gesture, while removing a finger from a touch-sensitive surface may be referred to as a "finger off" gesture. If the time between the "finger on" and "finger off" gestures is relatively short, the combined gesture may be referred to as "tapping"; if the time between the "finger on" and "finger off" gestures is relatively long, the combined gesture may be referred to as "long tapping"; if the distance between the two-dimensional (x, y) positions of the "finger on" and "finger off" gestures is relatively large, the combined gesture may be referred to as "swiping"; if the distance between the two-dimensional (x, y) positions of the "finger on" and "finger off" gestures is relatively small, the combined gesture may be referred to as "smearing," "smudging," or "flicking." Any number of two-dimensional or three-dimensional simple or complex gestures may be combined in any manner to form any number of other gestures, including, but not limited to, multiple finger contacts, palm or fist contact, or proximity to the device. A gesture can also be any form of hand movement recognized by a device having an accelerometer, gyroscope, or other motion sensor, and converted to electronic signals. Such electronic signals can activate a dynamic effect, such as shaking virtual dice, where the sensor captures the user intent that generates the dynamic effect.
One embodiment is a system that can encode one or more dynamic haptic effects on a disk, memory, or any computer-readable storage medium. One type of dynamic haptic effect is a haptic effect that can be generated by interpolating a first haptic effect and a second haptic effect based on a dynamic value that is between a first interpolant value and a second interpolant value. A dynamic value equal to either the first interpolant value or the second interpolant value is considered to be "between the first interpolant value and the second interpolant value." More specifically, a value for each parameter of the dynamic haptic effect can be calculated by interpolating a value of the parameter of the first haptic effect with a value of the parameter of the second haptic effect using an interpolation function. The interpolation of each parameter value of the dynamic haptic effect can be based on where the dynamic value falls between the first interpolant value and the second interpolant value. Dynamic haptic effects are further described in U.S. patent application Ser. No. 13/546,351, filed on July 11, 2012, entitled "GENERATING HAPTIC EFFECTS FOR DYNAMIC EVENTS" (the contents of which are hereby incorporated by reference). A dynamic haptic effect can be encoded using a haptic effect signal, where the haptic effect signal is a representation of the dynamic haptic effect. The haptic effect signal can be persisted on a disk, memory, or any computer-readable storage medium.
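The per-parameter interpolation described above can be sketched as follows. Linear interpolation is assumed here purely for illustration; the text only requires some interpolation function that positions the result according to where the dynamic value falls between the two interpolant values.

```python
def interpolate_parameter(dynamic_value, interp1, value1, interp2, value2):
    """Interpolate one haptic-effect parameter (e.g. a magnitude).

    value1 is the parameter of the first haptic effect (at interpolant
    value interp1); value2 is the parameter of the second haptic effect
    (at interp2). dynamic_value is expected to fall between interp1 and
    interp2, inclusive. Linear interpolation is an illustrative choice.
    """
    if interp2 == interp1:
        return value1
    t = (dynamic_value - interp1) / (interp2 - interp1)
    return value1 + t * (value2 - value1)
```

A dynamic value equal to either endpoint simply reproduces that endpoint's parameter value, matching the inclusive definition of "between" above.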
According to the embodiment, the system can define each dynamic haptic effect as one or more key frames, where each key frame can include a haptic effect and a corresponding interpolant value. The system can thus generate one or more dynamic haptic effect definitions. The system can store the one or more dynamic haptic effect definitions within a haptic effect file. The system can further retrieve the one or more dynamic haptic effect definitions from the haptic effect file. The system can further receive a dynamic value and, based on the received dynamic value, interpret the one or more dynamic haptic effect definitions in order to generate one or more dynamic haptic effects.
According to another embodiment, the system can receive one or more key frames, where each key frame can include a haptic effect and a value. The system can generate a haptic effect signal using the one or more key frames. The system can further store the haptic effect signal within a haptic effect file. The system can further retrieve the haptic effect signal from the haptic effect file. The system can further apply a drive signal to a haptic output device according to the haptic effect signal. The system can further generate the drive signal using the haptic output device. In such an embodiment, the one or more key frames can include one or more input parameters of the dynamic haptic effect generated by the haptic effect signal.
Fig. 1 illustrates a block diagram of a system 10 in accordance with one embodiment of the invention. In one embodiment, system 10 is part of a device, and system 10 provides haptic encoding functionality for the device. Although shown as a single system, the functionality of system 10 can be implemented as a distributed system. System 10 includes a bus 12 or other communication mechanism for communicating information, and a processor 22 coupled to bus 12 for processing information. Processor 22 may be any type of general or specific purpose processor. System 10 further includes a memory 14 for storing information and instructions to be executed by processor 22. Memory 14 can be comprised of any combination of random access memory ("RAM"), read-only memory ("ROM"), static storage such as a magnetic or optical disk, or any other type of computer-readable medium.
A computer-readable medium may be any available medium that can be accessed by processor 22, and may include both volatile and nonvolatile media, removable and non-removable media, communication media, and storage media. Communication media may include computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any other form of information delivery medium known in the art. Storage media may include RAM, flash memory, ROM, erasable programmable read-only memory ("EPROM"), electrically erasable programmable read-only memory ("EEPROM"), registers, a hard disk, a removable disk, a compact disc read-only memory ("CD-ROM"), or any other form of storage medium known in the art.
In one embodiment, memory 14 stores software modules that provide functionality when executed by processor 22. The modules include an operating system 15 that provides operating system functionality for system 10, as well as for the rest of a mobile device. The modules further include a haptic encoding module 16 that encodes a dynamic haptic effect, as disclosed in more detail below. In certain embodiments, haptic encoding module 16 can comprise a plurality of modules, where each individual module provides specific individual functionality for encoding a dynamic haptic effect. System 10 will typically include one or more additional application modules 18 to provide additional functionality, such as the Integrator™ application by Immersion Corporation.
In embodiments that transmit and/or receive data from remote sources, system 10 further includes a communication device 20, such as a network interface card, to provide mobile wireless network communication, such as infrared, radio, Wi-Fi, or cellular network communication. In other embodiments, communication device 20 provides a wired network connection, such as an Ethernet connection or a modem.
Processor 22 is further coupled via bus 12 to a display 24, such as a liquid crystal display ("LCD"), for displaying a graphical representation or user interface to the user. Display 24 may be a touch-sensitive input device configured to send and receive signals from processor 22, such as a touch screen, and may be a multi-touch touch screen. Processor 22 may be further coupled to a keyboard or cursor control 28 that allows the user to interact with system 10, such as a mouse or a stylus.
System 10, in one embodiment, further includes an actuator 26. Processor 22 may transmit a haptic signal associated with a generated haptic effect to actuator 26, which in turn outputs haptic effects such as vibrotactile haptic effects. Actuator 26 includes an actuator drive circuit. Actuator 26 may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor ("ERM"), a linear resonant actuator ("LRA"), a piezoelectric actuator, a high-bandwidth actuator, an electroactive polymer ("EAP") actuator, an electrostatic friction display, or an ultrasonic vibration generator. In alternate embodiments, system 10 can include one or more additional actuators, in addition to actuator 26 (not illustrated in Fig. 1). In other embodiments, a device separate from system 10 includes an actuator that generates the haptic effects, and system 10 sends generated haptic effect signals to that device through communication device 20. Actuator 26 is an example of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, in response to a drive signal.
System 10 can further be operably coupled to a database 30, where database 30 can be configured to store data used by modules 16 and 18. Database 30 can be an operational database, an analytical database, a data warehouse, a distributed database, an end-user database, an external database, a navigational database, an in-memory database, a document-oriented database, a real-time database, a relational database, an object-oriented database, or any other database known in the art.
Fig. 2 illustrates an example dynamic haptic effect definition 200, according to an embodiment of the invention. According to an embodiment, a dynamic haptic effect can be defined to include one or more key frames. A key frame can be a representation of a basic haptic effect used to define the dynamic haptic effect. Further, according to an embodiment, a haptic effect signal can be generated using one or more key frames, where the haptic effect signal is a signal that can store the one or more key frames. By generating the haptic effect signal using the one or more key frames, the dynamic haptic effect is generated and subsequently stored within the haptic effect signal. The haptic effect signal can be stored within, and retrieved from, a haptic effect file.
A key frame can include a basic haptic effect definition. A basic haptic effect is a haptic effect that can include one or more parameters that define characteristics of the haptic effect (more specifically, characteristics of the kinesthetic feedback and/or tactile feedback generated by the haptic effect), where the haptic effect can be, for example, a vibrotactile haptic effect. Examples of the one or more parameters can include a magnitude parameter, a frequency parameter, and a duration parameter. Examples of basic haptic effects can include a "MagSweep haptic effect" and a "Periodic haptic effect." A MagSweep haptic effect is a haptic effect that generates kinesthetic feedback and/or tactile feedback (such as a vibration). A Periodic haptic effect is a haptic effect that generates repeating kinesthetic feedback and/or tactile feedback (such as a vibration pattern). Examples of repeating patterns include repeating pulses of certain shapes, such as sinusoidal, rectangular, triangular, sawtooth-up, and sawtooth-down.
A key frame can also include an interpolant value. An interpolant value is a value that specifies where a corresponding interpolation occurs. In one embodiment, an interpolant value can be an integer value from a minimum value to a maximum value. As an example, an interpolant value can be from 0 to 10000. However, this is merely an example, and an interpolant value can be any value from any minimum value to any maximum value. For example, in other embodiments, an interpolant value can be a fixed-point or floating-point value. An interpolant value can be stored within one or more bits.
A key frame can also optionally include a repeat gap value. A repeat gap value is a value that indicates a period of time between two consecutive instances of a basic haptic effect when the basic haptic effect is played consecutively. In one embodiment, a repeat gap can indicate a number of milliseconds between two consecutive instances of the basic haptic effect.
In the illustrated embodiment, dynamic haptic effect definition 200 includes four key frames: key frames 210, 220, 230, and 240. However, this is merely an example embodiment, and in alternate embodiments, a dynamic haptic effect definition can include any number of key frames. Key frame 210 includes a basic haptic effect reference "Periodic 1," an interpolant value "0," and a repeat gap value "10 ms." The basic haptic effect reference "Periodic 1" references basic haptic effect 260, which is also included within dynamic haptic effect definition 200. Thus, key frame 210 defines basic haptic effect 260 as the basic haptic effect for interpolant value "0." Key frame 210 further indicates that, when basic haptic effect 260 is played consecutively, there is a time period of 10 ms between each consecutive instance of basic haptic effect 260. Similarly, key frame 220 includes a basic haptic effect reference "Periodic 3," an interpolant value "10," and a repeat gap value "15 ms." The basic haptic effect reference "Periodic 3" references basic haptic effect 270, which is also included within dynamic haptic effect definition 200. Thus, key frame 220 defines basic haptic effect 270 as the basic haptic effect for interpolant value "10." Key frame 220 further indicates that, when basic haptic effect 270 is played consecutively, there is a time period of 15 ms between each consecutive instance of basic haptic effect 270.

Likewise, key frame 230 includes a basic haptic effect reference "Periodic 1," an interpolant value "20," and a repeat gap value "5 ms." As previously described, the basic haptic effect reference "Periodic 1" references basic haptic effect 260, which is also included within dynamic haptic effect definition 200. Thus, key frame 230 defines basic haptic effect 260 as the basic haptic effect for interpolant value "20." This illustrates that a basic haptic effect can be defined as the basic haptic effect for more than one interpolant value. Key frame 230 further indicates that, when basic haptic effect 260 is played consecutively, there is a time period of 5 ms between each consecutive instance of basic haptic effect 260. Similarly, key frame 240 includes a basic haptic effect reference "Periodic 2," an interpolant value "30," and a repeat gap value "20 ms." The basic haptic effect reference "Periodic 2" references basic haptic effect 280, which is also included within dynamic haptic effect definition 200. Thus, key frame 240 defines basic haptic effect 280 as the basic haptic effect for interpolant value "30." Key frame 240 further indicates that, when basic haptic effect 280 is played consecutively, there is a time period of 20 ms between each consecutive instance of basic haptic effect 280.
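The key frames of the Fig. 2 example can be modeled as simple records. The sketch below is hypothetical Python, not the patent's format; the bracketing lookup is one plausible way an interpreting device might select the pair of key frames surrounding a received dynamic value.

```python
from dataclasses import dataclass

@dataclass
class KeyFrame:
    effect_ref: str      # reference to a basic haptic effect
    interpolant: int     # interpolant value of this key frame
    repeat_gap_ms: int   # gap between consecutive instances

# The four key frames of dynamic haptic effect definition 200 (Fig. 2).
definition_200 = [
    KeyFrame("Periodic 1", 0, 10),
    KeyFrame("Periodic 3", 10, 15),
    KeyFrame("Periodic 1", 20, 5),
    KeyFrame("Periodic 2", 30, 20),
]

def bracketing_key_frames(frames, dynamic_value):
    """Find the pair of key frames whose interpolant values bracket the
    received dynamic value (assumes frames are sorted by interpolant)."""
    for lo, hi in zip(frames, frames[1:]):
        if lo.interpolant <= dynamic_value <= hi.interpolant:
            return lo, hi
    raise ValueError("dynamic value outside key-frame range")
```

For a dynamic value of 12, for instance, this selects the "Periodic 3" and second "Periodic 1" key frames, whose effects would then be interpolated.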
According to an embodiment, a dynamic haptic effect can further be defined to include an indication that the dynamic haptic effect is terminated. The indication that the dynamic haptic effect is terminated indicates that the dynamic haptic effect does not include any additional key frames. As described below in greater detail, a device that interprets a dynamic haptic effect definition can be configured to interpret the contents of the dynamic haptic effect definition in sequence. Thus, the indication can indicate to the device an end of the dynamic haptic effect definition. In one embodiment, the indication that the dynamic haptic effect is terminated can be considered an additional key frame. In the illustrated embodiment, dynamic haptic effect definition 200 includes an end of dynamic haptic effect definition 250, which indicates an end of dynamic haptic effect definition 200.
Fig. 3 illustrates an example key frame definition 300, according to an embodiment of the invention. As previously described, a dynamic haptic effect definition includes one or more key frames. According to the embodiment, a key frame definition can include one or more properties, where each property of the one or more properties can include a value.

A key frame definition can include a type property. In one embodiment, the type property is the first property of a key frame definition. The type property can indicate whether the key frame is a key frame that includes a basic haptic effect used to define the dynamic haptic effect, or a key frame that indicates an end of the dynamic haptic effect definition. In the illustrated embodiment, key frame definition 300 includes type property 310, which indicates a type of the key frame defined by key frame definition 300.
A key frame definition can also include a basic haptic effect property. The basic haptic effect property can store a reference to a basic haptic effect used for the key frame. In the illustrated embodiment, key frame definition 300 includes basic haptic effect property 320 (identified in Fig. 3 as "effect name"), which includes a reference to the basic haptic effect used for the key frame defined by key frame definition 300.
A key frame definition can also include an interpolant property. The interpolant property can store an interpolant value, where the interpolant value specifies where a corresponding interpolation occurs. In one embodiment, the interpolant value can be an integer value from a minimum value to a maximum value. As an example, the interpolant value can be from 0 to 10000. The interpolant value can be stored within one or more bits. In the illustrated embodiment, key frame definition 300 includes interpolant property 330, which includes an interpolant value for the key frame defined by key frame definition 300.
A key frame definition can also optionally include a repeat gap property (not illustrated in Fig. 3). The repeat gap property can store a repeat gap value, where the value indicates a period of time between two consecutive instances of the basic haptic effect when the basic haptic effect is played consecutively. In one embodiment, the repeat gap can indicate a number of milliseconds between two consecutive instances of the basic haptic effect for the key frame.
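A key frame definition of this shape can be sketched as an ordered list of (property, value) pairs with the type property first, as in Fig. 3. The property names used here are illustrative, not the patent's exact tags, and the optional repeat gap is only appended when supplied.

```python
def make_key_frame_definition(effect_name, interpolant, repeat_gap_ms=None):
    """Build a key frame definition as an ordered list of
    (property, value) pairs, with the type property first."""
    props = [
        ("type", "key_frame"),         # first property: type
        ("effect_name", effect_name),  # basic haptic effect reference
        ("interpolant", interpolant),  # where the interpolation occurs
    ]
    if repeat_gap_ms is not None:      # repeat gap property is optional
        props.append(("repeat_gap", repeat_gap_ms))
    return props
```

An end-of-definition marker could be represented the same way, as a definition containing only a type property with a distinct value.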
In one embodiment, a haptic effect file is a computer file configured to store one or more dynamic haptic effects, where the haptic effect file can be persisted on a disk, memory, or any computer-readable storage medium. According to the embodiment, the haptic effect file can store one or more dynamic haptic effect definitions using a basic haptic effect storage block and a frame list block. The basic haptic effect storage block can be used to store one or more basic haptic effects that the dynamic haptic effect can reference. The frame list block can be used to store one or more key frame definitions that correspond to the dynamic haptic effect definition. The basic haptic effect storage block and the frame list block are now described in greater detail.
Fig. 4 illustrates an example basic haptic effect storage block 400, according to an embodiment of the invention. As previously described, a dynamic haptic effect definition can include one or more basic haptic effects, where at least one stored basic haptic effect is referenced by at least one key frame of the dynamic haptic effect definition. In one embodiment, the one or more basic haptic effects can be stored within a basic haptic effect storage block, such as basic haptic effect storage block 400, where the basic haptic effect storage block is stored within the dynamic haptic effect definition.

According to the embodiment, the one or more basic haptic effects can be stored as a message stream within basic haptic effect storage block 400. One example message format is the "codename z2" protocol message format. In the illustrated embodiment, a basic haptic effect is defined by a SetPeriodic message, optionally preceded by a SetPeriodicModifier message. Thus, when a basic haptic effect has an associated envelope, a SetPeriodicModifier message can appear before a SetPeriodic message within the block. Otherwise, only a SetPeriodic message will appear within the block. Thus, according to the embodiment, when stored within a basic haptic effect storage block (such as basic haptic effect storage block 400 of Fig. 4), a basic haptic effect can occupy either: (a) 8 bytes of memory within a single SetPeriodic message (assuming a default envelope); or (b) 16 bytes of memory within a SetPeriodicModifier message followed by a subsequent SetPeriodic message.
According to the embodiment, a basic haptic effect storage block (such as basic haptic effect storage block 400 of Fig. 4) can include one or more basic haptic effect definitions, where each basic haptic effect definition corresponds to a basic haptic effect. The one or more basic haptic effect definitions can each be sequential within the basic haptic effect storage block, and can each be associated with an index.

In the illustrated embodiment, basic haptic effect storage block 400 includes five basic haptic effects: Effect0, Effect1, Effect2, Effect3, and Effect4. Effect0 is the first basic haptic effect located within basic haptic effect storage block 400, Effect1 is the second, Effect2 is the third, Effect3 is the fourth, and Effect4 is the fifth. Each of the five basic haptic effects (i.e., Effect0, Effect1, Effect2, Effect3, and Effect4) includes a basic haptic effect definition comprising either a single SetPeriodic message, or a combination of a SetPeriodicModifier message and a SetPeriodic message.
Fig. 5 illustrates an example frame list block 500, according to an embodiment of the invention. As previously described, a dynamic haptic effect definition can include one or more key frames, where each key frame can reference a basic haptic effect. In one embodiment, the one or more key frames can be stored within a frame list block, such as frame list block 500, where the frame list block is stored within the dynamic haptic effect definition.

According to the embodiment, a frame list block, such as frame list block 500, includes a type property for a first key frame definition. Depending on the type property, the frame list block further includes one or more properties associated with the first key frame definition, such as a basic haptic effect property, an interpolant property, a repeat gap property, or a combination therein. The frame list block further includes a type property for a second key frame definition, which indicates an end of the first key frame definition. Depending on the type property, the frame list block further includes one or more properties associated with the second key frame definition, such as a basic haptic effect property, an interpolant property, a repeat gap property, or a combination therein. This continues for each key frame definition of the frame list block. The frame list block further includes a type property that indicates an end of the dynamic haptic effect. According to the embodiment, the key frame definitions of the frame list block are sequential in order. In other words, the events of the frame list block are processed in the order in which they are positioned within the frame list block.
According to this embodiment, one or more attributes of frame list block can utilize single header byte, and being followed by can
Selection of land data byte encodes, and the example codes strategy of one or more attributes of frame list block is as follows:
Key frame type attribute
Byte # | Bits 7-0 | Meaning
0 | 0xC1 | Type = key frame. No data is associated with this attribute.
End-of-dynamic-haptic-effect type attribute
EffectNameAsOffSetU8 attribute
InterpolantU16 attribute
Byte # | Bits 7-0 | Meaning
0 | 0xE6 | Interpolant, stored as a 16-bit unsigned integer
1 | TIME15_8 | MSByte of the time offset value (TimeOffset)
2 | TIME7_0 | LSByte of the time offset value (TimeOffset)
RepeatGapU16 attribute
According to the embodiment, the key frame type attribute and the end-of-dynamic-haptic-effect type attribute correspond to the type attribute of a key frame definition, the EffectNameAsOffSetU8 attribute corresponds to the basic haptic effect attribute of a key frame definition, the InterpolantU16 attribute corresponds to the interpolant attribute of a key frame definition, and the RepeatGapU16 attribute corresponds to the repeat gap attribute of a key frame definition.
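The InterpolantU16 layout in the table above (header byte 0xE6 followed by the MSByte and then the LSByte of a 16-bit unsigned value) can be sketched as a short encoding routine. This is an illustrative sketch based only on the table, not code from the patent:

```python
# Encode the InterpolantU16 attribute: header byte 0xE6, then the
# most significant byte, then the least significant byte of the value.
def encode_interpolant_u16(value):
    if not 0 <= value <= 0xFFFF:
        raise ValueError("interpolant must fit in an unsigned 16-bit integer")
    return bytes([0xE6, (value >> 8) & 0xFF, value & 0xFF])
```

For example, `encode_interpolant_u16(0x1234)` yields the three bytes `0xE6 0x12 0x34`, matching the header/MSByte/LSByte ordering of the table.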
In the illustrated embodiment, frame list block 500 includes key frame definitions 510, 520, and 530. Key frame definitions 510 and 520 each define a basic haptic effect key frame. Key frame definition 530 is an indication of the end of the dynamic haptic effect stored within the frame list block. The left column of frame list block 500 indicates the byte stream found in memory for each of key frame definitions 510, 520, and 530. The right column of frame list block 500 indicates the meaning of each attribute for each of key frame definitions 510, 520, and 530.
According to the illustrated embodiment, key frame definition 510 includes a key frame type attribute indicating the start of key frame definition 510 ("KeyFrame event" as illustrated in Fig. 5). Key frame definition 510 further includes a basic haptic effect attribute ("EffectNameAsOffSetU8" as illustrated in Fig. 5) that stores a reference to the basic haptic effect for key frame definition 510, where the basic haptic effect attribute includes a header byte and an offset byte. Key frame definition 510 further includes an interpolant attribute ("InterpolantU16" as illustrated in Fig. 5) that stores an interpolant value specifying where the interpolation occurs, where the interpolant attribute includes a header byte, a most significant byte ("MSB"), and a least significant byte ("LSB"). Key frame definition 510 further includes a repeat gap attribute ("RepeatGapU16" as illustrated in Fig. 5) that stores a repeat gap value, which indicates a time period between two consecutive instances of the basic haptic effect for the key frame, where the repeat gap attribute includes a header byte, an MSB, and an LSB.
Similarly, key frame definition 520 includes a key frame type attribute indicating the start of key frame definition 520 ("KeyFrame event" as illustrated in Fig. 5). Key frame definition 520 further includes a basic haptic effect attribute ("EffectNameAsOffSetU16" as illustrated in Fig. 5) that stores a reference to the basic haptic effect for key frame definition 520, where the basic haptic effect attribute includes a header byte, a basic haptic effect definition MSB, and a basic haptic effect definition LSB. Key frame definition 520 further includes an interpolant attribute ("InterpolantU16" as illustrated in Fig. 5) that stores an interpolant value specifying where the interpolation occurs, where the interpolant attribute includes a header byte, an MSB, and an LSB. As illustrated in Fig. 5, in contrast to key frame definition 510, key frame definition 520 does not include a repeat gap attribute. Finally, key frame definition 530 includes an end-of-dynamic-haptic-effect type attribute ("EndOfDynamicHapticEffect" as illustrated in Fig. 5) indicating the end of the dynamic haptic effect definition.
According to one embodiment, a dynamic haptic effect definition (for example, dynamic haptic effect definition 200 of Fig. 2) can be stored within a haptic effect file. As previously described, a haptic effect file is a computer file configured to store one or more dynamic haptic effects. The dynamic haptic effect definition can be stored within the haptic effect file, and the haptic effect file can be persisted on a computer-readable medium, such as a disk or memory. The dynamic haptic effect definition can subsequently be retrieved from the haptic effect file and interpreted. Based on the interpretation of the dynamic haptic effect definition, a dynamic haptic effect can be generated by interpolating a first haptic effect and a second haptic effect based on a dynamic value, where the dynamic value is a value that lies between a first interpolant value and a second interpolant value. More specifically, the value of each parameter of the dynamic haptic effect can be calculated by interpolating the parameter value of the first haptic effect with the parameter value of the second haptic effect using an interpolation function. The interpolation of each parameter value of the dynamic haptic effect is based on where the dynamic value falls between the first interpolant value and the second interpolant value. For example, when the first interpolant value is "0" and the second interpolant value is "100", a dynamic value of "50" can cause the first haptic effect associated with the first interpolant value "0" to be interpolated with the second haptic effect associated with the second interpolant value "100", producing the dynamic haptic effect. Each parameter value of the first haptic effect can be interpolated with the corresponding parameter value of the second haptic effect based on the interpolation function, so that each parameter value of the dynamic haptic effect is based on both the parameter value of the first haptic effect and the parameter value of the second haptic effect. Additional details regarding generating dynamic haptic effects are described in U.S. Patent Application Serial No. 13/546,351, filed on July 11, 2012, and entitled "GENERATING HAPTIC EFFECTS FOR DYNAMIC EVENTS".
Moreover, according to one embodiment, a dynamic haptic effect definition (for example, dynamic haptic effect definition 200 of Fig. 2) can be used to generate a haptic effect signal, where the haptic effect signal can be stored within a haptic effect file. The haptic effect signal can subsequently be retrieved from the haptic effect file. Further, a drive signal can be applied to a haptic output device according to the haptic effect signal, and the drive signal can further be generated using the haptic output device.
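The per-parameter interpolation described above can be sketched as follows, under the assumptions that a haptic effect is a dictionary of numeric parameters (magnitude, frequency, duration) and that the interpolation function is linear; the names here are illustrative, not the patent's API:

```python
# Linearly interpolate each parameter of two haptic effects based on
# where the dynamic value falls between the two interpolant values.
def interpolate_effects(effect1, effect2, interp1, interp2, dynamic_value):
    # Weight in [0, 1] reflecting the position of the dynamic value.
    weight = (dynamic_value - interp1) / (interp2 - interp1)
    return {
        name: effect1[name] + weight * (effect2[name] - effect1[name])
        for name in effect1
    }

effect1 = {"magnitude": 0, "frequency": 100, "duration": 500}
effect2 = {"magnitude": 10000, "frequency": 300, "duration": 500}
# Dynamic value 50 between interpolant values 0 and 100 gives the midpoint.
result = interpolate_effects(effect1, effect2, 0, 100, 50)
```

With the dynamic value halfway between the interpolant values, every parameter of the resulting dynamic haptic effect is halfway between the corresponding parameters of the two basic effects.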
The haptic effect file can be one of many different formats. In certain embodiments, the haptic effect file can have an extensible markup language ("XML") format. In certain other embodiments, the haptic effect file can have a binary format.
One example of a haptic effect file with an XML format is an Immersion vibration source ("IVS") haptic effect file. An example IVS haptic effect file that includes a dynamic haptic effect definition is presented below:
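The XML listing itself does not appear in this text. The following is a hypothetical reconstruction consistent with the tag-by-tag description that follows; the tag and attribute names, the effect-name placeholders, and the effect's own name are inferred, and the third key frame's repeat-gap parameter is omitted here because its value is not stated:

```xml
<interpolated-effect name="dynamic-effect-name">
  <key-frame interpolant="0" effect="basic-effect-0-name" repeat-gap="0"/>
  <key-frame interpolant="0.5" effect="basic-effect-1-name" repeat-gap="15"/>
  <key-frame interpolant="1" effect="basic-effect-2-name"/>
</interpolated-effect>
```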
In the example IVS haptic effect file provided above, the first tag, <interpolated-effect name>, identifies the name of the dynamic haptic effect stored within the IVS haptic effect file. The next three tags in the example IVS haptic effect file represent the key frames of the dynamic haptic effect. Each key frame tag can include two parameters: a key frame interpolant parameter and an effect parameter. The key frame interpolant parameter represents the interpolant value for the key frame. The effect parameter represents the name of the basic haptic effect for the key frame. In addition, each key frame tag can optionally include a third parameter, a repeat gap parameter. The repeat gap parameter represents the repeat gap for the key frame. If a key frame tag does not include a repeat gap parameter, then the key frame does not include a repeat gap, which means that the basic haptic effect for the key frame does not repeat. In the example IVS haptic effect file, the first key frame tag includes a key frame interpolant parameter with a value of "0", an effect parameter with a value of "basic-effect-0-name", and a repeat gap parameter with a value of "0". The second key frame tag includes a key frame interpolant parameter with a value of "0.5", an effect parameter with a value of "basic-effect-1-name", and a repeat gap parameter with a value of "15". The third key frame tag includes a key frame interpolant parameter with a value of "1" and an effect parameter with a value of "basic-effect-2-name". The subsequent tag, "</interpolated-effect>", identifies the end of the dynamic haptic effect stored within the IVS haptic effect file.
According to one embodiment, a device can read an IVS haptic effect file, such as the example IVS haptic effect file provided above, interpret the dynamic haptic effect definition stored within the IVS haptic effect file, and generate a dynamic haptic effect using the values stored within the IVS haptic effect file. For example, the device can provide an interpolant value at runtime, and the device can interpret the dynamic haptic effect definition stored within the IVS haptic effect file and determine which key frames the interpolant value falls between. The device can then interpolate between the basic haptic effects of those two key frames, generating the dynamic haptic effect.
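The runtime step described above can be sketched as follows. This is an illustrative sketch, assuming key frames are kept as (interpolant value, effect name) pairs sorted by interpolant value; the names are hypothetical:

```python
# Find the pair of key frames whose interpolant values bracket the
# runtime interpolant; their basic haptic effects are then interpolated.
def bracketing_key_frames(key_frames, interpolant):
    for lower, upper in zip(key_frames, key_frames[1:]):
        if lower[0] <= interpolant <= upper[0]:
            return lower, upper
    raise ValueError("interpolant outside the key frame range")

frames = [(0.0, "basic-effect-0-name"),
          (0.5, "basic-effect-1-name"),
          (1.0, "basic-effect-2-name")]
# A runtime interpolant of 0.75 falls between the second and third key frames.
lower, upper = bracketing_key_frames(frames, 0.75)
```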
An example of a haptic effect file with a binary format is an Immersion vibration target ("IVT") haptic effect file. An example IVT haptic effect file that includes a dynamic haptic effect definition is presented below:
0xC1, // KeyFrame event for key frame 0
0xE0, // EffectIndexU8 event attribute
0,    // use basic effect 0
0xE6, // InterpolantU8 event attribute
0,    // interpolant value of key frame 0
0xE2, // TimeOffsetMs16 event attribute. Note: this is optional
50,   // repeat gap of key frame 0, LSB
0,    // repeat gap of key frame 0, MSB
0xC1, // KeyFrame event for key frame 1
0xE0, // EffectIndexU8 event attribute
1,    // use basic effect 1
0xE6, // InterpolantU8 event attribute
100,  // interpolant value of key frame 1
0xE2, // TimeOffsetMs16 event attribute. Note: this is optional
150,  // repeat gap of key frame 1, LSB
0,    // repeat gap of key frame 1, MSB
0xCF  // end event of the LERP effect
The example IVT haptic effect file provided above includes two key frame definitions, where each key frame definition is identified by the hexadecimal value "0xC1". According to one embodiment, a hexadecimal value of the form "0xCy", where y is any non-zero hexadecimal digit, can identify the start of a key frame stored within an IVT haptic effect file. Further, according to the embodiment, if the hexadecimal value "0xCy" is the first such hexadecimal value within the IVT haptic effect file, the hexadecimal value can also indicate the start of a dynamic haptic effect stored within the IVT haptic effect file, where the dynamic haptic effect includes one or more key frames. In the example IVT haptic effect file provided above, the first instance of "0xC1" indicates the start of the stored dynamic haptic effect, and also indicates the start of the first key frame ("key frame 0") within the stored dynamic haptic effect. Moreover, in the example IVT haptic effect file, the second instance of "0xC1" indicates the start of the second key frame ("key frame 1") within the stored dynamic haptic effect.
For each key frame, the example IVT haptic effect file further includes data that defines the key frame. For example, according to one embodiment, the example IVT haptic effect file further includes a hexadecimal value that identifies a basic haptic effect attribute for the key frame, such as "0xE0" or "0xE1", and a value for the basic haptic effect attribute. In the example IVT haptic effect file, for the first key frame, an instance of "0xE0" identifies the basic haptic effect attribute for the first key frame, and the value "0" identifies the first basic haptic effect as the value of the basic haptic effect attribute for the first key frame. Likewise, for the second key frame, an instance of "0xE0" identifies the basic haptic effect attribute for the second key frame, and the value "1" identifies the second basic haptic effect as the value of the basic haptic effect attribute for the second key frame.
As part of the data that defines a key frame, according to the embodiment, the example IVT haptic effect file further includes a hexadecimal value that identifies an interpolant attribute for the key frame, such as "0xE6", and a value for the interpolant attribute. In the example IVT haptic effect file, for the first key frame, an instance of "0xE6" identifies the interpolant attribute for the first key frame, and the value "0" identifies the interpolant value for the interpolant attribute of the first key frame. Likewise, for the second key frame, an instance of "0xE6" identifies the interpolant attribute for the second key frame, and the value "100" identifies the interpolant value for the interpolant attribute of the second key frame.
As part of the data that defines a key frame, according to the embodiment, the example IVT haptic effect file further includes a hexadecimal value that identifies a repeat gap attribute for the key frame, such as "0xE2", and one or more values for the repeat gap attribute. In the example IVT haptic effect file, for the first key frame, an instance of "0xE2" identifies the repeat gap attribute for the first key frame, the value "50" identifies the least significant byte ("LSB") of the repeat gap value for the first key frame, and the value "0" identifies the most significant byte ("MSB") of the repeat gap value for the first key frame. Likewise, for the second key frame, an instance of "0xE2" identifies the repeat gap attribute for the second key frame, the value "150" identifies the LSB of the repeat gap value for the second key frame, and the value "0" identifies the MSB of the repeat gap value for the second key frame.
The example IVT haptic effect file further includes a hexadecimal value, such as "0xCF", that identifies the end of the dynamic haptic effect stored within the IVT haptic effect file. In the example IVT haptic effect file, the instance of "0xCF" identifies the end of the stored dynamic haptic effect.
According to one embodiment, a device can read an IVT haptic effect file, such as the example IVT haptic effect file provided above, interpret the dynamic haptic effect definition stored within the IVT haptic effect file, and generate a dynamic haptic effect using the values stored within the IVT haptic effect file. For example, the device can provide an interpolant value at runtime, and the device can interpret the dynamic haptic effect definition stored within the IVT haptic effect file and determine which key frames the interpolant value falls between. The device can then interpolate between the basic haptic effects of those two key frames, generating the dynamic haptic effect.
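The byte-level structure of the example IVT file above can be sketched with a hypothetical parser; the attribute meanings (0xC1 key frame, 0xE0 effect index, 0xE6 interpolant, 0xE2 repeat gap with LSB then MSB, 0xCF end) follow the description in the text, and everything else is an illustrative assumption:

```python
# Parse the dynamic haptic effect portion of an IVT-style byte stream
# into a list of key frame dictionaries.
def parse_ivt_dynamic_effect(data):
    frames, i = [], 0
    current = None
    while i < len(data):
        byte = data[i]
        if 0xC1 <= byte <= 0xCE:        # 0xCy header: start of a key frame
            current = {}
            frames.append(current)
            i += 1
        elif byte == 0xE0:              # effect index: one data byte
            current["effect"] = data[i + 1]
            i += 2
        elif byte == 0xE6:              # interpolant: one data byte here
            current["interpolant"] = data[i + 1]
            i += 2
        elif byte == 0xE2:              # repeat gap: LSB then MSB
            current["repeat_gap"] = data[i + 1] | (data[i + 2] << 8)
            i += 3
        elif byte == 0xCF:              # end of the dynamic haptic effect
            break
        else:
            raise ValueError(f"unknown attribute byte 0x{byte:02X}")
    return frames

data = [0xC1, 0xE0, 0, 0xE6, 0, 0xE2, 50, 0,
        0xC1, 0xE0, 1, 0xE6, 100, 0xE2, 150, 0,
        0xCF]
key_frames = parse_ivt_dynamic_effect(data)
```

Run against the example byte stream, this recovers two key frames: basic effect 0 at interpolant 0 with repeat gap 50, and basic effect 1 at interpolant 100 with repeat gap 150.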
In one embodiment, in addition to a dynamic haptic effect definition, an IVT haptic effect file can also include definitions of other haptic effects, such as basic haptic effects. For example, an IVT haptic effect file can include a MagSweep haptic effect definition, a periodic haptic effect definition, or another type of basic haptic effect definition, such as a "waveform haptic effect" definition, where a "waveform haptic effect" is a haptic effect that produces feedback based on a specific signal, resulting in more precisely controlled feedback. According to the embodiment, a hexadecimal value can identify the start of a dynamic haptic effect, as distinguished from the start of a basic haptic effect. Thus, when a device reads an IVT haptic effect file, the device can distinguish a dynamic haptic effect definition from a basic haptic effect definition.
In an example embodiment, a hexadecimal value such as "0xF1", "0xF2", or "0xFF" can identify the start of a "timeline haptic effect", where a "timeline haptic effect" can include one or more basic haptic effects mapped out over time. In addition, hexadecimal values such as "0x20", "0x30", "0x40", and "0x50" can identify the start of a basic haptic effect, such as a MagSweep haptic effect, a periodic haptic effect, or a waveform haptic effect. Moreover, as previously described, a hexadecimal value of the form "0xCy", where y is a non-zero hexadecimal digit, can identify the start of a dynamic haptic effect. Thus, when a device reads an IVT haptic effect file, the device can read a hexadecimal value stored within the IVT haptic effect file and determine whether the defined haptic effect is a dynamic haptic effect or some other haptic effect. Example pseudo-code for determining whether a defined haptic effect is a dynamic haptic effect or some other haptic effect is presented below:
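The pseudo-code listing itself does not survive in this text; the following is a hypothetical sketch of the classification it describes, using only the header-byte ranges named in the preceding paragraph:

```python
# Classify a haptic effect definition by its first header byte.
def classify_effect(header_byte):
    if header_byte == 0xCF:             # end marker of a dynamic effect
        return "end of dynamic haptic effect"
    high, low = header_byte >> 4, header_byte & 0x0F
    if high == 0xC and low != 0:        # 0xCy with non-zero y
        return "dynamic haptic effect"
    if header_byte in (0xF1, 0xF2, 0xFF):
        return "timeline haptic effect"
    if header_byte in (0x20, 0x30, 0x40, 0x50):
        return "basic haptic effect"
    return "unknown"
```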
However, this example pseudo-code is only one example of code that can implement the functionality described above; code that implements this functionality can take other forms and still be within the scope of the invention. In addition, although two example formats for haptic effect files have been described (that is, the IVS haptic effect file and the IVT haptic effect file), a haptic effect file can also be of a different format and still be within the scope of the invention.
Fig. 6 illustrates a flow diagram of the functionality of a haptic encoding module (for example, haptic encoding module 16 of Fig. 1), according to one embodiment of the invention. In one embodiment, the functionality of Fig. 6, as well as the functionality of Fig. 7, is each implemented by software stored in memory or another computer-readable or tangible medium, and executed by a processor. In other embodiments, each functionality can be performed by hardware (for example, through the use of an application-specific integrated circuit ("ASIC"), a programmable gate array ("PGA"), a field-programmable gate array ("FPGA"), etc.), or by any combination of hardware and software. In addition, in alternative embodiments, each functionality can be performed by hardware using analog components.
The flow begins and proceeds to 610. At 610, a dynamic haptic effect is defined to include a first key frame and a second key frame. The first key frame includes a first interpolant value and a corresponding first haptic effect. The second key frame includes a second interpolant value and a corresponding second haptic effect. The first interpolant value can be a value that specifies where the interpolation of the corresponding first haptic effect occurs, and the second interpolant value can be a value that specifies where the interpolation of the corresponding second haptic effect occurs. The first key frame and the second key frame can each include a repeat gap value. The first haptic effect and the second haptic effect can each be a vibrotactile haptic effect, and each can include a plurality of parameters, where the plurality of parameters can include a magnitude parameter, a frequency parameter, and a duration parameter. The dynamic haptic effect can further be defined to include an indication of the end of the dynamic haptic effect. The dynamic haptic effect can further be defined to include a haptic effect memory block, where the first haptic effect and the second haptic effect can be stored within the haptic effect memory block. The dynamic haptic effect can include additional key frames. In the illustrated embodiment, the dynamic haptic effect is defined to include two key frames, where each key frame includes a haptic effect. However, this is only an example embodiment; in alternative embodiments, the dynamic haptic effect can be defined to include three or more key frames, where each key frame includes a haptic effect. The flow proceeds to 620.
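The structures defined at 610 can be sketched as a minimal data model. The class and field names here are illustrative assumptions, not the patent's own API:

```python
from dataclasses import dataclass

@dataclass
class HapticEffect:
    magnitude: int   # strength of the vibrotactile effect
    frequency: int   # vibration frequency
    duration: int    # effect duration

@dataclass
class KeyFrame:
    interpolant: int       # where this key frame's effect is interpolated
    effect: HapticEffect   # corresponding basic haptic effect
    repeat_gap: int = 0    # optional gap between consecutive instances

# A dynamic haptic effect as a list of key frames.
dynamic_effect = [
    KeyFrame(interpolant=0,
             effect=HapticEffect(magnitude=0, frequency=100, duration=500),
             repeat_gap=50),
    KeyFrame(interpolant=100,
             effect=HapticEffect(magnitude=10000, frequency=300, duration=500)),
]
```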
At 620, a haptic effect file is generated. The format of the haptic effect file can be a binary format. Alternatively, the format of the haptic effect file can be an XML format. The flow proceeds to 630.
At 630, the dynamic haptic effect is stored within the haptic effect file. The dynamic haptic effect can be stored within the haptic effect file along with one or more other haptic effects. At least one of the other haptic effects can be another dynamic haptic effect. The flow proceeds to 640.
At 640, the dynamic haptic effect is retrieved from the haptic effect file. A device can read the haptic effect file and retrieve the dynamic haptic effect. When the haptic effect file includes one or more other haptic effects, the device can identify the dynamic haptic effect based on a value included within the dynamic haptic effect. The flow proceeds to 650.
At 650, the dynamic haptic effect is interpreted. A device can interpret the dynamic haptic effect by sequentially reading the first key frame and the second key frame included within the dynamic haptic effect. When the dynamic haptic effect includes additional key frames, the device can further interpret the dynamic haptic effect by sequentially reading the additional key frames. The device can further interpret the dynamic haptic effect by receiving a dynamic value and selecting, based on the received dynamic value, two basic haptic effects stored within the dynamic haptic effect. The device can further select the two basic haptic effects stored within the dynamic haptic effect based on the fact that the received dynamic value falls between two interpolant values stored within the dynamic haptic effect, where the two interpolant values correspond to the two basic haptic effects. The flow proceeds to 660.
At 660, the dynamic haptic effect is generated. The dynamic haptic effect can be generated based on the two selected basic haptic effects. More specifically, the dynamic haptic effect can be generated by interpolating the two selected basic haptic effects based on the received dynamic value. In certain embodiments, the value of each parameter of the dynamic haptic effect can be calculated by interpolating the parameter value of the first selected basic haptic effect with the parameter value of the second selected basic haptic effect using an interpolation function. The interpolation of each parameter value of the dynamic haptic effect can be based on where the received dynamic value falls between a first interpolant value corresponding to the first selected basic haptic effect and a second interpolant value corresponding to the second selected basic haptic effect. In the illustrated embodiment, the dynamic haptic effect is generated by interpolating two basic haptic effects, and this interpolation can be a linear interpolation. However, this is only an example embodiment; in alternative embodiments, the dynamic haptic effect can be generated by interpolating three or more basic haptic effects. Such an interpolation can be a spline interpolation, where a spline interpolation is a form of interpolation in which the interpolation function is a specific type of piecewise polynomial called a spline, and where the interpolation function is a function that can map an interpolant value to a dynamic haptic effect using two or more key frames. The dynamic haptic effect can further be generated by an actuator. The flow then ends.
Fig. 7 illustrates a flow diagram of the functionality of a haptic encoding module, according to another embodiment of the invention. The flow begins and proceeds to 710. At 710, a first key frame is received, where the first key frame includes a first interpolant value and a first haptic effect. The first interpolant value can be a value that specifies where the interpolation of the first haptic effect occurs. The first key frame can also include a repeat gap value. The first haptic effect can be a vibrotactile haptic effect, and can include a plurality of parameters, where the plurality of parameters can include a magnitude parameter, a frequency parameter, and a duration parameter. The flow proceeds to 720.
At 720, a second key frame is received, where the second key frame includes a second interpolant value and a second haptic effect. The second interpolant value can be a value that specifies where the interpolation of the second haptic effect occurs. The second key frame can also include a repeat gap value. The second haptic effect can be a vibrotactile haptic effect, and can include a plurality of parameters, where the plurality of parameters can include a magnitude parameter, a frequency parameter, and a duration parameter. The flow proceeds to 730.
At 730, a haptic effect signal is generated using the first key frame and the second key frame. The haptic effect signal can further include an indication of the end of the haptic effect signal. The haptic effect signal can further include a haptic effect memory block, where the first haptic effect and the second haptic effect can be stored within the haptic effect memory block. The haptic effect signal can further include additional key frames. In the illustrated embodiment, the haptic effect signal is generated using two key frames, where each key frame includes a haptic effect. However, this is only an example embodiment; in alternative embodiments, the haptic effect signal can be generated using three or more key frames, where each key frame includes a haptic effect. The flow proceeds to 740.
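Step 730 can be sketched as serializing the received key frames into a byte stream, reusing the header bytes from the example IVT file earlier in this description (0xC1 key frame, 0xE0 effect index, 0xE6 interpolant, 0xE2 repeat gap, 0xCF end). Treat these constants and names as illustrative assumptions rather than a normative encoding:

```python
# Serialize key frames into an IVT-style haptic effect signal.
def encode_signal(key_frames):
    out = []
    for frame in key_frames:
        out += [0xC1, 0xE0, frame["effect"], 0xE6, frame["interpolant"]]
        if "repeat_gap" in frame:       # repeat gap is optional: LSB, then MSB
            gap = frame["repeat_gap"]
            out += [0xE2, gap & 0xFF, (gap >> 8) & 0xFF]
    out.append(0xCF)                    # end of the haptic effect signal
    return bytes(out)

signal = encode_signal([
    {"effect": 0, "interpolant": 0, "repeat_gap": 50},
    {"effect": 1, "interpolant": 100, "repeat_gap": 150},
])
```

Applied to the two key frames of the earlier example, this reproduces the byte stream of the example IVT haptic effect file.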
At 740, the haptic effect signal is stored within a haptic effect file. The format of the haptic effect file can be a binary format. Alternatively, the format of the haptic effect file can be an XML format. The haptic effect signal can be stored within the haptic effect file along with one or more other haptic effect signals. At least one of the other haptic effect signals can include two or more key frames. The flow proceeds to 750.
At 750, the haptic effect signal is retrieved from the haptic effect file. A device can read the haptic effect file and retrieve the haptic effect signal. When the haptic effect file includes one or more other haptic effect signals, the device can identify the haptic effect signal based on a value included within the haptic effect signal. The flow proceeds to 760.
At 760, a drive signal is applied to a haptic output device according to the haptic effect signal. This can cause the haptic output device to produce a dynamic haptic effect consistent with the haptic effect signal. The flow proceeds to 770.
At 770, the drive signal is generated using the haptic output device. The flow then ends.
Thus, in one embodiment, a system can be provided that can store one or more dynamic haptic effects on a disk, in memory, or on any computer-readable storage medium. The system can further retrieve the one or more dynamic haptic effects and output the one or more dynamic haptic effects. A dynamic haptic effect can be based on a plurality of haptic effects. Thus, the system can store information about only a small number of haptic effects, yet output hundreds or thousands of dynamic haptic effects based on that small number of haptic effects. This can save storage space and allow for a more efficient mechanism for storing and retrieving dynamic haptic effects. In addition, the format of the haptic effect file that stores the one or more dynamic haptic effects can be flexible, so that haptic effect designers are not limited to a small number of file formats. This allows effect designers to create dynamic haptic effects more flexibly.
The features, structures, or characteristics of the invention described throughout this specification may be combined in any suitable manner in one or more embodiments. For example, the use of "one embodiment", "some embodiments", "certain embodiment", "certain embodiments", or other similar language throughout this specification refers to the fact that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present invention. Thus, appearances of the phrases "one embodiment", "some embodiments", "certain embodiment", "certain embodiments", or other similar language throughout this specification do not necessarily all refer to the same group of embodiments, and the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
One of ordinary skill in the art will readily understand that the invention as discussed above may be practiced with the steps in a different order, and/or with the elements in configurations different from those disclosed. Therefore, although the invention has been described based upon these preferred embodiments, it will be apparent to those skilled in the art that certain modifications, variations, and alternative constructions are possible, while remaining within the spirit and scope of the invention. In order to determine the metes and bounds of the invention, therefore, reference should be made to the appended claims.
Claims (20)
1. A method of generating a haptic effect, comprising:
retrieving a dynamic haptic effect from a haptic effect file, the dynamic haptic effect comprising a first key frame and a second key frame, wherein the first key frame comprises a first interpolant value and a corresponding first haptic effect, and the second key frame comprises a second interpolant value and a corresponding second haptic effect, wherein the first interpolant value is a value that specifies where the interpolation of the corresponding first haptic effect occurs, and wherein the second interpolant value is a value that specifies where the interpolation of the corresponding second haptic effect occurs;
interpreting, by a processor, the dynamic haptic effect; and
generating, by the processor, the dynamic haptic effect by interpolating the first haptic effect and the second haptic effect.
2. The method according to claim 1,
wherein retrieving the dynamic haptic effect from the haptic effect file further comprises retrieving a haptic effect signal from the haptic effect file, wherein the haptic effect signal comprises the dynamic haptic effect; and
wherein generating the dynamic haptic effect further comprises applying a drive signal to a haptic output device according to the haptic effect signal.
3. according to the method described in claim 1, it further includes being generated using haptic output devices to generate the dynamic haptic effects
Drive signal.
4. terminating according to the method described in claim 2, the wherein described haptic effect signal further includes the haptic effect signal
Instruction.
5. The method of claim 2, wherein the haptic effect signal further comprises a haptic effect storage block; and
wherein the first haptic effect and the second haptic effect are stored within the haptic effect storage block.
6. The method of claim 1, wherein the first haptic effect and the second haptic effect are each a vibratory haptic effect, and each comprise a plurality of parameters.
7. The method of claim 6, wherein the plurality of parameters comprises a magnitude parameter, a frequency parameter, and a duration parameter.
8. The method of claim 1, wherein the first key frame and the second key frame each further comprise a repeat gap value.
9. The method of claim 1, wherein a format of the haptic effect file is a binary format.
10. The method of claim 1, wherein a format of the haptic effect file is an extensible markup language format.
11. A computer-readable medium having instructions stored thereon that, when executed by a processor, cause the processor to generate a haptic effect according to the method of any one of claims 1-10.
12. A system for encoding a haptic effect signal, the system comprising a haptic output device, a memory, and a processor, the processor being configured to generate a haptic effect according to the method of any one of claims 1-10.
13. A method, executed by a processor, of generating a haptic effect, the method comprising:
detecting a user input using a pressure sensor;
generating an input signal in response to detecting the user input;
displaying a user interface; and
generating the haptic effect based on the input signal, wherein the haptic effect is associated with one or more elements of the user interface,
wherein the haptic effect comprises a first key frame and a second key frame, the first key frame comprising a first interpolant value and a first haptic effect, and the second key frame comprising a second interpolant value and a second haptic effect, wherein the first interpolant value is a value that specifies where an interpolation for the corresponding first haptic effect occurs, and wherein the second interpolant value is a value that specifies where an interpolation for the corresponding second haptic effect occurs.
14. The method of claim 13, wherein the pressure sensor detects a swipe motion.
15. The method of claim 13, wherein the pressure sensor detects a two-dimensional motion or a three-dimensional motion.
16. The method of claim 13, wherein the user interface comprises a scrolling list.
17. The method of claim 13, further comprising:
generating a dynamic haptic effect by interpolating between the first haptic effect and the second haptic effect.
18. The method of claim 13, wherein generating the haptic effect comprises generating a drive signal for a haptic output device.
19. A computer-readable medium having instructions stored thereon that, when executed by a processor, cause the processor to generate a haptic effect according to the method of any one of claims 13-18.
20. A system for encoding a haptic effect signal, the system comprising a haptic output device, a memory, and a processor, the processor being configured to generate a haptic effect according to the method of any one of claims 13-18.
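The interpolation scheme the claims describe — key frames pairing an interpolant value with a haptic effect, and a dynamic effect produced by interpolating the effects' parameters between two key frames — can be sketched as follows. This is an illustrative reading, not the patent's implementation: the class names and the choice of linear interpolation are assumptions, and the parameters (magnitude, frequency, duration) follow claim 7.

```python
from dataclasses import dataclass

@dataclass
class HapticEffect:
    """A vibratory haptic effect with the parameters named in claim 7."""
    magnitude: float   # drive strength, 0.0-1.0
    frequency: float   # vibration frequency in Hz
    duration: float    # effect length in milliseconds

@dataclass
class KeyFrame:
    """A key frame pairs an interpolant value with a haptic effect (claim 1)."""
    interpolant: float
    effect: HapticEffect

def interpolate_dynamic_effect(kf1: KeyFrame, kf2: KeyFrame, x: float) -> HapticEffect:
    """Produce the dynamic effect for input interpolant x by linearly
    blending each parameter between the two key frames' effects."""
    # Normalize x into [0, 1] across the span between the two key frames.
    t = (x - kf1.interpolant) / (kf2.interpolant - kf1.interpolant)
    t = max(0.0, min(1.0, t))  # clamp inputs outside the key-frame span
    lerp = lambda a, b: a + t * (b - a)
    return HapticEffect(
        magnitude=lerp(kf1.effect.magnitude, kf2.effect.magnitude),
        frequency=lerp(kf1.effect.frequency, kf2.effect.frequency),
        duration=lerp(kf1.effect.duration, kf2.effect.duration),
    )

# Example: an input halfway between the key frames yields parameters halfway between.
kf_soft = KeyFrame(0.0, HapticEffect(magnitude=0.2, frequency=80.0, duration=50.0))
kf_hard = KeyFrame(1.0, HapticEffect(magnitude=0.8, frequency=160.0, duration=50.0))
mid = interpolate_dynamic_effect(kf_soft, kf_hard, 0.5)
print(mid.magnitude, mid.frequency)  # 0.5 120.0
```

A real encoder would serialize a list of such key frames into the haptic effect file (binary per claim 9, or XML per claim 10) and a player would evaluate this interpolation at runtime against a live input such as the pressure-sensor signal of claim 13.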
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/667,003 US8947216B2 (en) | 2012-11-02 | 2012-11-02 | Encoding dynamic haptic effects |
US13/667,003 | 2012-11-02 | ||
CN201310471458.0A CN103809960B (en) | 2012-11-02 | 2013-10-11 | Encode dynamic haptic effects |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310471458.0A Division CN103809960B (en) | 2012-11-02 | 2013-10-11 | Encode dynamic haptic effects |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108803869A true CN108803869A (en) | 2018-11-13 |
Family
ID=49162024
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810371898.1A Pending CN108803869A (en) | 2012-11-02 | 2013-10-11 | Encode dynamic haptic effects |
CN201310471458.0A Active CN103809960B (en) | 2012-11-02 | 2013-10-11 | Encode dynamic haptic effects |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310471458.0A Active CN103809960B (en) | 2012-11-02 | 2013-10-11 | Encode dynamic haptic effects |
Country Status (5)
Country | Link |
---|---|
US (4) | US8947216B2 (en) |
EP (2) | EP3495924B1 (en) |
JP (2) | JP6359820B2 (en) |
KR (1) | KR102161541B1 (en) |
CN (2) | CN108803869A (en) |
Families Citing this family (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7765333B2 (en) | 2004-07-15 | 2010-07-27 | Immersion Corporation | System and method for ordering haptic effects |
US9898084B2 (en) * | 2012-12-10 | 2018-02-20 | Immersion Corporation | Enhanced dynamic haptic effects |
JP6351964B2 (en) * | 2013-12-11 | 2018-07-04 | 株式会社東海理化電機製作所 | Input device |
EP3099030A1 (en) * | 2015-05-26 | 2016-11-30 | Thomson Licensing | Method and device for encoding/decoding a packet comprising data representative of a haptic effect |
EP3332310B1 (en) * | 2015-08-05 | 2019-05-29 | Dolby Laboratories Licensing Corporation | Low bit rate parametric encoding and transport of haptic-tactile signals |
CN107735749A (en) * | 2015-09-22 | 2018-02-23 | 意美森公司 | Tactile based on pressure |
US9947187B2 (en) | 2016-01-07 | 2018-04-17 | International Business Machines Corporation | Haptic notification system with rules for notification that can be altered to increase effectiveness |
US10732714B2 (en) | 2017-05-08 | 2020-08-04 | Cirrus Logic, Inc. | Integrated haptic system |
US11259121B2 (en) | 2017-07-21 | 2022-02-22 | Cirrus Logic, Inc. | Surface speaker |
CN107566847B (en) * | 2017-09-18 | 2020-02-14 | 浙江大学 | Method for encoding touch data into video stream for storage and transmission |
US10455339B2 (en) | 2018-01-19 | 2019-10-22 | Cirrus Logic, Inc. | Always-on detection systems |
US10620704B2 (en) | 2018-01-19 | 2020-04-14 | Cirrus Logic, Inc. | Haptic output systems |
US11139767B2 (en) | 2018-03-22 | 2021-10-05 | Cirrus Logic, Inc. | Methods and apparatus for driving a transducer |
US10795443B2 (en) | 2018-03-23 | 2020-10-06 | Cirrus Logic, Inc. | Methods and apparatus for driving a transducer |
US10667051B2 (en) | 2018-03-26 | 2020-05-26 | Cirrus Logic, Inc. | Methods and apparatus for limiting the excursion of a transducer |
US10820100B2 (en) | 2018-03-26 | 2020-10-27 | Cirrus Logic, Inc. | Methods and apparatus for limiting the excursion of a transducer |
US10832537B2 (en) | 2018-04-04 | 2020-11-10 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US10684689B2 (en) | 2018-04-20 | 2020-06-16 | Immersion Corporation | Cross-platform dynamic haptic effect design tool for augmented or virtual reality environments |
US10572017B2 (en) * | 2018-04-20 | 2020-02-25 | Immersion Corporation | Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments |
US11069206B2 (en) | 2018-05-04 | 2021-07-20 | Cirrus Logic, Inc. | Methods and apparatus for outputting a haptic signal to a haptic transducer |
US11269415B2 (en) | 2018-08-14 | 2022-03-08 | Cirrus Logic, Inc. | Haptic output systems |
EP3629128A1 (en) * | 2018-09-25 | 2020-04-01 | Vestel Elektronik Sanayi ve Ticaret A.S. | User device and method for generating haptic feedback in a user device |
GB201817495D0 (en) | 2018-10-26 | 2018-12-12 | Cirrus Logic Int Semiconductor Ltd | A force sensing system and method |
CN109947244B (en) * | 2019-03-12 | 2022-11-04 | 上海天马微电子有限公司 | Display device and tactile feedback method, device and equipment |
US11644370B2 (en) | 2019-03-29 | 2023-05-09 | Cirrus Logic, Inc. | Force sensing with an electromagnetic load |
US11509292B2 (en) | 2019-03-29 | 2022-11-22 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using least-mean-squares filter |
US10955955B2 (en) | 2019-03-29 | 2021-03-23 | Cirrus Logic, Inc. | Controller for use in a device comprising force sensors |
US11283337B2 (en) | 2019-03-29 | 2022-03-22 | Cirrus Logic, Inc. | Methods and systems for improving transducer dynamics |
US10992297B2 (en) | 2019-03-29 | 2021-04-27 | Cirrus Logic, Inc. | Device comprising force sensors |
US10726683B1 (en) | 2019-03-29 | 2020-07-28 | Cirrus Logic, Inc. | Identifying mechanical impedance of an electromagnetic load using a two-tone stimulus |
US10828672B2 (en) | 2019-03-29 | 2020-11-10 | Cirrus Logic, Inc. | Driver circuitry |
US10976825B2 (en) | 2019-06-07 | 2021-04-13 | Cirrus Logic, Inc. | Methods and apparatuses for controlling operation of a vibrational output system and/or operation of an input sensor system |
US11150733B2 (en) | 2019-06-07 | 2021-10-19 | Cirrus Logic, Inc. | Methods and apparatuses for providing a haptic output signal to a haptic actuator |
GB2604215B (en) | 2019-06-21 | 2024-01-31 | Cirrus Logic Int Semiconductor Ltd | A method and apparatus for configuring a plurality of virtual buttons on a device |
US11408787B2 (en) | 2019-10-15 | 2022-08-09 | Cirrus Logic, Inc. | Control methods for a force sensor system |
US10984638B1 (en) * | 2019-10-17 | 2021-04-20 | Immersion Corporation | Systems, devices, and methods for encoding haptic tracks |
US11380175B2 (en) | 2019-10-24 | 2022-07-05 | Cirrus Logic, Inc. | Reproducibility of haptic waveform |
US11545951B2 (en) | 2019-12-06 | 2023-01-03 | Cirrus Logic, Inc. | Methods and systems for detecting and managing amplifier instability |
US11662821B2 (en) | 2020-04-16 | 2023-05-30 | Cirrus Logic, Inc. | In-situ monitoring, calibration, and testing of a haptic actuator |
US11933822B2 (en) | 2021-06-16 | 2024-03-19 | Cirrus Logic Inc. | Methods and systems for in-system estimation of actuator parameters |
US11765499B2 (en) | 2021-06-22 | 2023-09-19 | Cirrus Logic Inc. | Methods and systems for managing mixed mode electromechanical actuator drive |
US11908310B2 (en) | 2021-06-22 | 2024-02-20 | Cirrus Logic Inc. | Methods and systems for detecting and managing unexpected spectral content in an amplifier system |
US11552649B1 (en) | 2021-12-03 | 2023-01-10 | Cirrus Logic, Inc. | Analog-to-digital converter-embedded fixed-phase variable gain amplifier stages for dual monitoring paths |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002149312A (en) * | 2000-08-08 | 2002-05-24 | Ntt Docomo Inc | Portable electronic equipment, electronic equipment, oscillation generator, reporting method by oscillation, and report control method |
US20020163498A1 (en) * | 1997-04-25 | 2002-11-07 | Chang Dean C. | Design of force sensations for haptic feedback computer interfaces |
US20050134562A1 (en) * | 2003-12-22 | 2005-06-23 | Grant Danny A. | System and method for controlling haptic devices having multiple operational modes |
CN101059717A (en) * | 2006-04-21 | 2007-10-24 | 佳能株式会社 | Information-processing method and device for presenting haptics received from a virtual object |
WO2006019389A3 (en) * | 2004-07-15 | 2009-09-24 | Immersion Corporation | System and method for ordering haptic effects |
WO2010105004A1 (en) * | 2009-03-12 | 2010-09-16 | Immersion Corporation | Systems and methods for using multiple actuators to realize textures |
CN101910978A (en) * | 2007-12-31 | 2010-12-08 | 苹果公司 | Tactile feedback in an electronic device |
CN102292696A (en) * | 2008-12-05 | 2011-12-21 | 平蛙实验室股份公司 | A touch sensing apparatus and method of operating the same |
CN102591518A (en) * | 2010-12-02 | 2012-07-18 | 英默森公司 | A device and a method for providing haptic feedback and haptic feedback device |
Family Cites Families (114)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0559708A1 (en) * | 1990-11-30 | 1993-09-15 | Cambridge Animation Systems Limited | Image synthesis and processing |
JPH0816820A (en) * | 1994-04-25 | 1996-01-19 | Fujitsu Ltd | Three-dimensional animation generation device |
US5774386A (en) * | 1995-09-08 | 1998-06-30 | Eastman Kodak Company | Method and apparatus for performing function evaluation using a cache |
JP4131278B2 (en) | 1996-10-18 | 2008-08-13 | ヤマハ株式会社 | Force control device for keyboard instruments |
US6108011A (en) * | 1996-10-28 | 2000-08-22 | Pacific Data Images, Inc. | Shape interpolation for computer-generated geometric models using independent shape parameters for parametric shape interpolation curves |
US6337678B1 (en) | 1999-07-21 | 2002-01-08 | Tactiva Incorporated | Force feedback computer input and output device with coordinated haptic elements |
US6449019B1 (en) * | 2000-04-07 | 2002-09-10 | Avid Technology, Inc. | Real-time key frame effects using tracking information |
US6864877B2 (en) | 2000-09-28 | 2005-03-08 | Immersion Corporation | Directional tactile feedback for haptic feedback interface devices |
US7623114B2 (en) | 2001-10-09 | 2009-11-24 | Immersion Corporation | Haptic feedback sensations based on audio output from computer devices |
US7199805B1 (en) * | 2002-05-28 | 2007-04-03 | Apple Computer, Inc. | Method and apparatus for titling |
JP2004310518A (en) * | 2003-04-08 | 2004-11-04 | Fuji Xerox Co Ltd | Picture information processor |
KR20050054731A (en) | 2003-12-05 | 2005-06-10 | 한국전자통신연구원 | Haptic simulation system and method for providing real-time haptic interaction in virtual simulation |
US9948885B2 (en) | 2003-12-12 | 2018-04-17 | Kurzweil Technologies, Inc. | Virtual encounters |
WO2005085981A1 (en) * | 2004-02-03 | 2005-09-15 | Nokia Corporation | Method and device for implementing vibration output commands in mobile terminal devices |
JP2005332063A (en) | 2004-05-18 | 2005-12-02 | Sony Corp | Input device with tactile function, information inputting method, and electronic device |
JP2006058973A (en) * | 2004-08-17 | 2006-03-02 | Sony Corp | Tactile information creation apparatus and tactile information creation method |
US7728823B2 (en) | 2004-09-24 | 2010-06-01 | Apple Inc. | System and method for processing raw data of track pad device |
JP2008515052A (en) | 2004-09-24 | 2008-05-08 | アップル インコーポレイテッド | Raw data track pad device and system |
JP4617893B2 (en) * | 2005-01-18 | 2011-01-26 | ソニー株式会社 | Vibration transmission structure, input / output device with tactile function, and electronic equipment |
US7919945B2 (en) * | 2005-06-27 | 2011-04-05 | Coactive Drive Corporation | Synchronized vibration device for haptic feedback |
US8700791B2 (en) * | 2005-10-19 | 2014-04-15 | Immersion Corporation | Synchronization of haptic effect data in a media transport stream |
US9370704B2 (en) | 2006-08-21 | 2016-06-21 | Pillar Vision, Inc. | Trajectory detection and feedback system for tennis |
JP2008123429A (en) * | 2006-11-15 | 2008-05-29 | Sony Corp | Touch panel display device, electronic equipment and game machine |
US8098234B2 (en) * | 2007-02-20 | 2012-01-17 | Immersion Corporation | Haptic feedback system with stored effects |
JP2008257295A (en) * | 2007-03-30 | 2008-10-23 | Tokyo Institute Of Technology | Method for presenting tactile stimulus |
US8621348B2 (en) | 2007-05-25 | 2013-12-31 | Immersion Corporation | Customizing haptic effects on an end user device |
CN101355746B (en) * | 2007-07-27 | 2012-05-16 | 深圳富泰宏精密工业有限公司 | Radio communication device |
US8035535B2 (en) * | 2007-11-21 | 2011-10-11 | Nokia Corporation | Apparatus and method providing transformation for human touch force measurements |
US7911328B2 (en) * | 2007-11-21 | 2011-03-22 | The Guitammer Company | Capture and remote reproduction of haptic events in synchronous association with the video and audio capture and reproduction of those events |
GB2468811B (en) | 2008-01-17 | 2012-12-19 | Articulate Technologies Inc | Methods and devices for intraoral tactile feedback |
JP2009181261A (en) | 2008-01-30 | 2009-08-13 | Panasonic Corp | Bidirectional communication system |
KR100927009B1 (en) * | 2008-02-04 | 2009-11-16 | 광주과학기술원 | Haptic interaction method and system in augmented reality |
US9513704B2 (en) | 2008-03-12 | 2016-12-06 | Immersion Corporation | Haptically enabled user interface |
US20090303175A1 (en) * | 2008-06-05 | 2009-12-10 | Nokia Corporation | Haptic user interface |
JP2010015514A (en) | 2008-07-07 | 2010-01-21 | Sony Corp | Input device, control method thereof, and electronic apparatus |
KR20100066036A (en) * | 2008-12-09 | 2010-06-17 | 삼성전자주식회사 | Operation method and apparatus for portable device |
KR101114603B1 (en) * | 2008-12-12 | 2012-03-05 | 삼성전자주식회사 | Haptic feedback device for portable terminal |
US8686952B2 (en) | 2008-12-23 | 2014-04-01 | Apple Inc. | Multi touch with multi haptics |
US8077021B2 (en) * | 2009-03-03 | 2011-12-13 | Empire Technology Development Llc | Dynamic tactile interface |
US10564721B2 (en) | 2009-03-12 | 2020-02-18 | Immersion Corporation | Systems and methods for using multiple actuators to realize textures |
KR101628782B1 (en) | 2009-03-20 | 2016-06-09 | 삼성전자주식회사 | Apparatus and method for providing haptic function using multi vibrator in portable terminal |
JP2010278727A (en) | 2009-05-28 | 2010-12-09 | Kddi Corp | Portable terminal with vibration function |
US9370459B2 (en) | 2009-06-19 | 2016-06-21 | Andrew Mahoney | System and method for alerting visually impaired users of nearby objects |
JP5197521B2 (en) * | 2009-07-29 | 2013-05-15 | 京セラ株式会社 | Input device |
JP4633183B1 (en) * | 2009-07-29 | 2011-02-23 | 京セラ株式会社 | Input device and control method of input device |
JP4942801B2 (en) * | 2009-08-27 | 2012-05-30 | 京セラ株式会社 | Input device |
US8451238B2 (en) * | 2009-09-02 | 2013-05-28 | Amazon Technologies, Inc. | Touch-screen user interface |
US8619044B2 (en) * | 2009-09-30 | 2013-12-31 | Blackberry Limited | Electronic device including tactile touch-sensitive display and method of controlling same |
JP5704428B2 (en) | 2009-11-18 | 2015-04-22 | 株式会社リコー | Touch panel device and control method of touch panel device |
JP5635274B2 (en) * | 2010-01-27 | 2014-12-03 | 京セラ株式会社 | Tactile sensation presentation apparatus and tactile sensation presentation method |
JP5360499B2 (en) | 2010-02-01 | 2013-12-04 | 国立大学法人東北大学 | Haptic presentation method and haptic presentation device |
CA2731708A1 (en) * | 2010-02-15 | 2011-08-15 | Research In Motion Limited | Electronic device including touch-sensitive display and actuator for providing tactile feedback |
US9417695B2 (en) * | 2010-04-08 | 2016-08-16 | Blackberry Limited | Tactile feedback method and apparatus |
US9251721B2 (en) | 2010-04-09 | 2016-02-02 | University Of Florida Research Foundation, Inc. | Interactive mixed reality system and uses thereof |
US8736559B2 (en) * | 2010-04-23 | 2014-05-27 | Blackberry Limited | Portable electronic device and method of controlling same |
US8451255B2 (en) * | 2010-05-14 | 2013-05-28 | Arnett Ryan Weber | Method of providing tactile feedback and electronic device |
DE112010005736B4 (en) * | 2010-07-13 | 2020-03-26 | Lg Electronics Inc. | Mobile terminal and configuration method for a idle screen of the same |
US8352643B2 (en) * | 2010-09-30 | 2013-01-08 | Immersion Corporation | Haptically enhanced interactivity with interactive content |
US20120081337A1 (en) | 2010-10-04 | 2012-04-05 | Sony Ericsson Mobile Communications Ab | Active Acoustic Multi-Touch and Swipe Detection for Electronic Devices |
BR112013011300A2 (en) | 2010-11-09 | 2019-09-24 | Koninl Philips Electronics Nv | user interface, method of providing the user with tactile feedback that touches an interaction surface (s) with a set of drivers and apparatus comprising a user interface |
JP5587759B2 (en) * | 2010-12-24 | 2014-09-10 | 京セラ株式会社 | Tactile sensation presentation apparatus, program used for the apparatus, and tactile sensation presentation method |
US8624857B2 (en) | 2011-02-09 | 2014-01-07 | Texas Instruments Incorporated | Haptics effect controller architecture and instruction set |
US9483085B2 (en) * | 2011-06-01 | 2016-11-01 | Blackberry Limited | Portable electronic device including touch-sensitive display and method of controlling same |
KR20130007738A (en) * | 2011-07-11 | 2013-01-21 | 삼성전자주식회사 | Key input device |
US9122311B2 (en) * | 2011-08-24 | 2015-09-01 | Apple Inc. | Visual feedback for tactile and non-tactile user interfaces |
US9462262B1 (en) | 2011-08-29 | 2016-10-04 | Amazon Technologies, Inc. | Augmented reality environment with environmental condition control |
US8947126B2 (en) | 2011-10-10 | 2015-02-03 | Infineon Technologies Austria Ag | System, drivers for switches and methods for synchronizing measurements of analog-to-digital converters |
US8711118B2 (en) | 2012-02-15 | 2014-04-29 | Immersion Corporation | Interactivity model for shared feedback on mobile devices |
US10852093B2 (en) | 2012-05-22 | 2020-12-01 | Haptech, Inc. | Methods and apparatuses for haptic systems |
JP5831635B2 (en) * | 2012-06-11 | 2015-12-09 | 富士通株式会社 | DRIVE DEVICE, ELECTRONIC DEVICE, AND DRIVE CONTROL PROGRAM |
US8860563B2 (en) | 2012-06-14 | 2014-10-14 | Immersion Corporation | Haptic effect conversion system using granular synthesis |
US9030428B2 (en) | 2012-07-11 | 2015-05-12 | Immersion Corporation | Generating haptic effects for dynamic events |
US9898084B2 (en) | 2012-12-10 | 2018-02-20 | Immersion Corporation | Enhanced dynamic haptic effects |
FR2999741B1 (en) | 2012-12-17 | 2015-02-06 | Centre Nat Rech Scient | HAPTIC SYSTEM FOR NON-CONTACT INTERACTING AT LEAST ONE PART OF THE BODY OF A USER WITH A VIRTUAL ENVIRONMENT |
US9367136B2 (en) | 2013-04-12 | 2016-06-14 | Microsoft Technology Licensing, Llc | Holographic object feedback |
US9908048B2 (en) | 2013-06-08 | 2018-03-06 | Sony Interactive Entertainment Inc. | Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display |
US9811854B2 (en) | 2013-07-02 | 2017-11-07 | John A. Lucido | 3-D immersion technology in a virtual store |
EP4083758A1 (en) | 2013-07-05 | 2022-11-02 | Rubin, Jacob A. | Whole-body human-computer interface |
US9630105B2 (en) | 2013-09-30 | 2017-04-25 | Sony Interactive Entertainment Inc. | Camera based safety mechanisms for users of head mounted displays |
WO2015107386A1 (en) | 2014-01-15 | 2015-07-23 | Sony Corporation | Haptic notification on wearables |
US9551873B2 (en) | 2014-05-30 | 2017-01-24 | Sony Interactive Entertainment America Llc | Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content |
CN106796451B (en) | 2014-07-28 | 2020-07-21 | Ck高新材料有限公司 | Tactile information providing module |
US9645646B2 (en) | 2014-09-04 | 2017-05-09 | Intel Corporation | Three dimensional contextual feedback wristband device |
US9799177B2 (en) | 2014-09-23 | 2017-10-24 | Intel Corporation | Apparatus and methods for haptic covert communication |
US10166466B2 (en) | 2014-12-11 | 2019-01-01 | Elwha Llc | Feedback for enhanced situational awareness |
US9870718B2 (en) | 2014-12-11 | 2018-01-16 | Toyota Motor Engineering & Manufacturing North America, Inc. | Imaging devices including spacing members and imaging devices including tactile feedback devices |
US20160170508A1 (en) | 2014-12-11 | 2016-06-16 | Toyota Motor Engineering & Manufacturing North America, Inc. | Tactile display devices |
US9922518B2 (en) | 2014-12-11 | 2018-03-20 | Elwha Llc | Notification of incoming projectiles |
US10073516B2 (en) | 2014-12-29 | 2018-09-11 | Sony Interactive Entertainment Inc. | Methods and systems for user interaction within virtual reality scene using head mounted display |
US9746921B2 (en) | 2014-12-31 | 2017-08-29 | Sony Interactive Entertainment Inc. | Signal generation and detector systems and methods for determining positions of fingers of a user |
US9843744B2 (en) | 2015-01-13 | 2017-12-12 | Disney Enterprises, Inc. | Audience interaction projection system |
US10322203B2 (en) | 2015-06-26 | 2019-06-18 | Intel Corporation | Air flow generation for scent output |
US9778746B2 (en) | 2015-09-25 | 2017-10-03 | Oculus Vr, Llc | Transversal actuator for haptic feedback |
US20170103574A1 (en) | 2015-10-13 | 2017-04-13 | Google Inc. | System and method for providing continuity between real world movement and movement in a virtual/augmented reality experience |
US20170131775A1 (en) | 2015-11-10 | 2017-05-11 | Castar, Inc. | System and method of haptic feedback by referral of sensation |
US10055948B2 (en) | 2015-11-30 | 2018-08-21 | Nike, Inc. | Apparel with ultrasonic position sensing and haptic feedback for activities |
US10310804B2 (en) | 2015-12-11 | 2019-06-04 | Facebook Technologies, Llc | Modifying haptic feedback provided to a user to account for changes in user perception of haptic feedback |
US10324530B2 (en) | 2015-12-14 | 2019-06-18 | Facebook Technologies, Llc | Haptic devices that simulate rigidity of virtual objects |
US10096163B2 (en) | 2015-12-22 | 2018-10-09 | Intel Corporation | Haptic augmented reality to reduce noxious stimuli |
US10065124B2 (en) | 2016-01-15 | 2018-09-04 | Disney Enterprises, Inc. | Interacting with a remote participant through control of the voice of a toy device |
US11351472B2 (en) | 2016-01-19 | 2022-06-07 | Disney Enterprises, Inc. | Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon |
US9846971B2 (en) | 2016-01-19 | 2017-12-19 | Disney Enterprises, Inc. | Systems and methods for augmenting an appearance of a hilt to simulate a bladed weapon |
TWI688879B (en) | 2016-01-22 | 2020-03-21 | 宏達國際電子股份有限公司 | Method, virtual reality system, and computer-readable recording medium for real-world interaction in virtual reality environment |
US9933851B2 (en) | 2016-02-22 | 2018-04-03 | Disney Enterprises, Inc. | Systems and methods for interacting with virtual objects using sensory feedback |
US10555153B2 (en) | 2016-03-01 | 2020-02-04 | Disney Enterprises, Inc. | Systems and methods for making non-smart objects smart for internet of things |
US20170352185A1 (en) | 2016-06-02 | 2017-12-07 | Dennis Rommel BONILLA ACEVEDO | System and method for facilitating a vehicle-related virtual reality and/or augmented reality presentation |
US10155159B2 (en) | 2016-08-18 | 2018-12-18 | Activision Publishing, Inc. | Tactile feedback systems and methods for augmented reality and virtual reality systems |
US20180053351A1 (en) | 2016-08-19 | 2018-02-22 | Intel Corporation | Augmented reality experience enhancement method and apparatus |
US10372213B2 (en) | 2016-09-20 | 2019-08-06 | Facebook Technologies, Llc | Composite ribbon in a virtual reality device |
US10779583B2 (en) | 2016-09-20 | 2020-09-22 | Facebook Technologies, Llc | Actuated tendon pairs in a virtual reality device |
US10300372B2 (en) | 2016-09-30 | 2019-05-28 | Disney Enterprises, Inc. | Virtual blaster |
US10281982B2 (en) | 2016-10-17 | 2019-05-07 | Facebook Technologies, Llc | Inflatable actuators in virtual reality |
US10088902B2 (en) | 2016-11-01 | 2018-10-02 | Oculus Vr, Llc | Fiducial rings in virtual reality |
US20170102771A1 (en) | 2016-12-12 | 2017-04-13 | Leibs Technology Limited | Wearable ultrasonic haptic feedback system |
- 2012
  - 2012-11-02 US US13/667,003 patent/US8947216B2/en active Active
- 2013
  - 2013-09-12 EP EP18208324.6A patent/EP3495924B1/en active Active
  - 2013-09-12 EP EP13184109.0A patent/EP2728443A3/en not_active Ceased
  - 2013-10-11 CN CN201810371898.1A patent/CN108803869A/en active Pending
  - 2013-10-11 CN CN201310471458.0A patent/CN103809960B/en active Active
  - 2013-10-28 KR KR1020130128528A patent/KR102161541B1/en active IP Right Grant
  - 2013-10-31 JP JP2013226909A patent/JP6359820B2/en active Active
- 2014
  - 2014-12-18 US US14/574,957 patent/US9396630B2/en active Active
- 2016
  - 2016-07-15 US US15/211,463 patent/US9958944B2/en active Active
- 2018
  - 2018-03-15 US US15/922,257 patent/US10248212B2/en active Active
  - 2018-06-21 JP JP2018117601A patent/JP6734325B2/en active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020163498A1 (en) * | 1997-04-25 | 2002-11-07 | Chang Dean C. | Design of force sensations for haptic feedback computer interfaces |
JP2002149312A (en) * | 2000-08-08 | 2002-05-24 | Ntt Docomo Inc | Portable electronic equipment, electronic equipment, oscillation generator, reporting method by oscillation, and report control method |
US20050134562A1 (en) * | 2003-12-22 | 2005-06-23 | Grant Danny A. | System and method for controlling haptic devices having multiple operational modes |
WO2006019389A3 (en) * | 2004-07-15 | 2009-09-24 | Immersion Corporation | System and method for ordering haptic effects |
CN101059717A (en) * | 2006-04-21 | 2007-10-24 | 佳能株式会社 | Information-processing method and device for presenting haptics received from a virtual object |
CN101910978A (en) * | 2007-12-31 | 2010-12-08 | 苹果公司 | Tactile feedback in an electronic device |
CN102292696A (en) * | 2008-12-05 | 2011-12-21 | 平蛙实验室股份公司 | A touch sensing apparatus and method of operating the same |
WO2010105004A1 (en) * | 2009-03-12 | 2010-09-16 | Immersion Corporation | Systems and methods for using multiple actuators to realize textures |
CN102591518A (en) * | 2010-12-02 | 2012-07-18 | 英默森公司 | A device and a method for providing haptic feedback and haptic feedback device |
Also Published As
Publication number | Publication date |
---|---|
JP6734325B2 (en) | 2020-08-05 |
US20160320845A1 (en) | 2016-11-03 |
JP2014093091A (en) | 2014-05-19 |
US20180267610A1 (en) | 2018-09-20 |
EP3495924B1 (en) | 2023-06-07 |
US9396630B2 (en) | 2016-07-19 |
KR102161541B1 (en) | 2020-10-05 |
KR20140057160A (en) | 2014-05-12 |
JP2018163691A (en) | 2018-10-18 |
EP2728443A2 (en) | 2014-05-07 |
US20140125467A1 (en) | 2014-05-08 |
CN103809960B (en) | 2018-05-25 |
CN103809960A (en) | 2014-05-21 |
JP6359820B2 (en) | 2018-07-18 |
EP2728443A3 (en) | 2016-06-08 |
US10248212B2 (en) | 2019-04-02 |
US8947216B2 (en) | 2015-02-03 |
US9958944B2 (en) | 2018-05-01 |
EP3495924A1 (en) | 2019-06-12 |
US20150102918A1 (en) | 2015-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103809960B (en) | Encode dynamic haptic effects | |
JP6598915B2 (en) | Context-sensitive haptic confirmation system | |
US10775895B2 (en) | Systems and methods for multi-pressure interaction on touch-sensitive surfaces | |
JP6479148B2 (en) | Enhanced dynamic haptic effect | |
EP2674835B1 (en) | Haptic effect conversion system using granular synthesis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20181113 ||