US20190272039A1 - Enhanced dynamic haptic effects - Google Patents

Enhanced dynamic haptic effects

Info

Publication number
US20190272039A1
Authority
US
United States
Prior art keywords
haptic effect
value
dynamic
key frame
interpolant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/419,603
Inventor
Henry DA COSTA
Eric Gervais
Satvir Singh BHATIA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immersion Corp
Priority to US16/419,603
Assigned to IMMERSION CORPORATION reassignment IMMERSION CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BHATIA, SATVIR SINGH, Da Costa, Henry, GERVAIS, ERIC
Publication of US20190272039A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/80Technologies aiming to reduce greenhouse gasses emissions common to all road transportation technologies
    • Y02T10/82Elements for improving aerodynamics

Definitions

  • One embodiment is directed generally to haptic effects, and more particularly, to generating dynamic haptic effects.
  • Haptic feedback can include kinesthetic feedback, such as active and resistive force feedback, and tactile feedback, such as vibration, texture, and heat.
  • Haptic feedback can provide cues that enhance and simplify the user interface.
  • vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or to provide realistic feedback that creates greater sensory immersion within a simulated or virtual environment.
  • Haptic feedback has also been increasingly incorporated in portable electronic devices, referred to as “handheld devices” or “portable devices,” such as cellular telephones, personal digital assistants (“PDAs”), smartphones, and portable gaming devices.
  • portable gaming devices are capable of vibrating in a manner similar to control devices (e.g., joysticks, etc.) used with larger-scale gaming systems that are configured to provide haptic feedback.
  • devices such as cellular telephones and smartphones are capable of providing various alerts to users by way of vibrations. For example, a cellular telephone can alert a user to an incoming telephone call by vibrating.
  • a smartphone can alert a user to a scheduled calendar item or provide a user with a reminder for a “to do” list item or calendar appointment.
  • haptic effects can be used to simulate “real world” dynamic events, such as the feel of a bouncing ball in a video game.
  • One embodiment is a system that generates a dynamic haptic effect.
  • the system receives a first key frame including a first interpolant value and a first haptic effect.
  • the system further receives a second key frame including a second interpolant value and a second haptic effect.
  • the system further receives an interpolant value, where the interpolant value is between the first interpolant value and the second interpolant value (or equal to either the first interpolant value or the second interpolant value).
  • the system further determines the dynamic haptic effect from the interpolant value, the first key frame, and the second key frame.
  • the system further distributes the dynamic haptic effect among a plurality of actuators.
  • Another embodiment is a system that generates the dynamic haptic effect.
  • the system receives a plurality of key frames, where each key frame includes a key frame interpolant value, a haptic effect, and a direction value.
  • the system further receives an interpolant value, where the interpolant value is between at least two key frame interpolant values (or equal to one of the at least two key frame interpolant values).
  • the system further determines a direction for the dynamic haptic effect.
  • the system further selects one or more key frames from the plurality of key frames, where each selected key frame includes a direction value that is equal to the direction.
  • the system further determines the dynamic haptic effect from the interpolant value and the direction, where the determining includes interpolating the dynamic haptic effect from at least two haptic effects of at least two selected key frames.
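The direction-based selection and interpolation described above can be sketched as follows. This is an illustrative model only, not the patented implementation; the `KeyFrame` fields, the single `magnitude` parameter, and the numeric direction encoding are assumptions:

```python
from dataclasses import dataclass

@dataclass
class KeyFrame:
    interpolant: int
    magnitude: int   # one representative haptic effect parameter
    direction: int   # e.g., 0 = increasing interpolant, 1 = decreasing

def dynamic_magnitude(frames, interpolant, direction):
    """Select the key frames whose direction value equals `direction`,
    then linearly interpolate magnitude between the two selected frames
    whose interpolant values bracket `interpolant`."""
    selected = sorted((f for f in frames if f.direction == direction),
                      key=lambda f: f.interpolant)
    for lo, hi in zip(selected, selected[1:]):
        if lo.interpolant <= interpolant <= hi.interpolant:
            t = (interpolant - lo.interpolant) / (hi.interpolant - lo.interpolant)
            return lo.magnitude + t * (hi.magnitude - lo.magnitude)
    raise ValueError("interpolant outside selected key frame range")
```

With separate key frames for each direction, the same interpolant value can produce a different effect depending on whether the input is increasing or decreasing.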
  • FIG. 1 illustrates a block diagram of a system in accordance with one embodiment of the invention.
  • FIG. 2 illustrates an example dynamic haptic effect definition, according to an embodiment of the invention.
  • FIG. 3 illustrates an example key frame definition, according to an embodiment of the invention.
  • FIG. 4 illustrates an example basis haptic effect storage block, according to an embodiment of the invention.
  • FIG. 5 illustrates an example frame list block, according to an embodiment of the invention.
  • FIG. 6 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to an embodiment of the invention.
  • FIG. 7 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention.
  • FIG. 8 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention.
  • FIG. 9 illustrates an example key frame definition that includes a direction property, according to an embodiment of the invention.
  • FIG. 10 illustrates a flow diagram of the functionality of a haptic effect generation module, according to one embodiment of the invention.
  • FIG. 11 illustrates a flow diagram of the functionality of a haptic effect generation module, according to another embodiment of the invention.
  • a “dynamic haptic effect” refers to a haptic effect that evolves over time as it responds to one or more input parameters. Dynamic haptic effects are haptic or vibrotactile effects displayed on haptic devices to represent a change in state of a given input signal.
  • the input signal can be a signal captured by sensors on the device with haptic feedback, such as position, acceleration, pressure, orientation, or proximity, or signals captured by other devices and sent to the haptic device to influence the generation of the haptic effect.
  • a dynamic effect signal can be any type of signal, but does not necessarily have to be complex.
  • a dynamic effect signal may be a simple sine wave that has some property such as phase, frequency, or amplitude that is changing over time or reacting in real time according to a mapping schema which maps an input parameter onto a changing property of the effect signal.
  • An input parameter may be any type of input capable of being provided by a device, and typically may be any type of signal such as a device sensor signal.
  • a device sensor signal may be generated by any means, and typically may be generated by capturing a user gesture with a device. Dynamic effects may be very useful for gesture interfaces, but the use of gestures or sensors is not necessarily required to create a dynamic signal.
  • One common scenario that does not involve gestures directly is defining the dynamic haptic behavior of an animated widget. For example, when a user scrolls a list, it is not typically the haptification of the gesture that will feel most intuitive, but instead the motion of the widget in response to the gesture. In the scroll list example, gently sliding the list may generate a dynamic haptic feedback that changes according to the speed of the scrolling, but flinging the scroll bar may produce dynamic haptics even after the gesture has ended. This creates the illusion that the widget has some physical properties and it provides the user with information about the state of the widget such as its velocity or whether it is in motion.
  • a gesture is any movement of the body that conveys meaning or user intent. It will be recognized that simple gestures may be combined to form more complex gestures. For example, bringing a finger into contact with a touch sensitive surface may be referred to as a “finger on” gesture, while removing a finger from a touch sensitive surface may be referred to as a separate “finger off” gesture.
  • the combined gesture may be referred to as “tapping”; if the time between the “finger on” and “finger off” gestures is relatively long, the combined gesture may be referred to as “long tapping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively large, the combined gesture may be referred to as “swiping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively small, the combined gesture may be referred to as “smearing”, “smudging” or “flicking”.
  • gestures may be combined in any manner to form any number of other gestures, including, but not limited to, multiple finger contacts, palm or fist contact, or proximity to the device.
  • a gesture can also be any form of hand movement recognized by a device having an accelerometer, gyroscope, or other motion sensor, and converted to electronic signals. Such electronic signals can activate a dynamic effect, such as shaking virtual dice, where the sensor captures the user intent that generates a dynamic effect.
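The time- and distance-based gesture distinctions above can be illustrated with a toy classifier. The threshold values and the zero-distance test are made-up assumptions for illustration; the patent does not specify thresholds:

```python
def classify_gesture(dt_ms, dx, dy, long_ms=500, swipe_px=50):
    """Classify a "finger on"/"finger off" pair by elapsed time (ms) and
    travel distance (pixels). Thresholds are illustrative assumptions."""
    dist = (dx * dx + dy * dy) ** 0.5
    if dist >= swipe_px:
        return "swiping"        # relatively large travel distance
    if dist > 0:
        return "flicking"       # small travel; also "smearing"/"smudging"
    # no travel: distinguish by elapsed time between finger on and off
    return "long tapping" if dt_ms >= long_ms else "tapping"
```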
  • One embodiment is a system that can generate one or more dynamic haptic effects at a plurality of actuators, where the one or more dynamic haptic effects can be distributed among the plurality of actuators.
  • the system can define two or more key frames for a dynamic haptic effect.
  • the system can allow each key frame to target a specific actuator of the plurality of actuators using an actuator value, interpolate the separate actuator values, generate the dynamic haptic effect by interpolating two or more haptic effects stored within two or more key frames, and then distribute the dynamic haptic effect among the targeted actuators based on the interpolated actuator value.
  • the system can allow each key frame to target a specific actuator of the plurality of actuators, group together the key frames that target the same actuator, and generate the dynamic haptic effect by independently interpolating the haptic effects stored within the grouped key frames for each actuator.
  • the system can determine actuator distribution information that indicates how to distribute the dynamic haptic effect among a plurality of actuators. The system can then generate the dynamic haptic effect by interpolating two or more haptic effects stored within two or more key frames, and then distribute the dynamic haptic effect among the plurality of actuators using the actuator distribution information.
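The distribution of a dynamic haptic effect among a plurality of actuators might be sketched as follows. The weight-based distribution information is a hypothetical representation, not the patent's encoding:

```python
def distribute_effect(magnitude, weights):
    """Split one dynamic haptic effect magnitude among several actuators.

    `weights` are per-actuator distribution fractions (hypothetical actuator
    distribution information). They are normalized so that the per-actuator
    portions sum to the original magnitude."""
    total = sum(weights)
    return [magnitude * w / total for w in weights]
```

For example, weights of 3 and 1 send three quarters of the effect magnitude to the first actuator and one quarter to the second.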
  • Another embodiment is a system that can generate a dynamic haptic effect using one or more key frames, where each key frame includes a direction property.
  • the direction property can indicate that the key frame is to be used for a specific direction of a dynamic haptic effect.
  • one or more key frames with a direction property that is equal to the determined direction can be used to generate the dynamic haptic effect.
  • One type of a dynamic haptic effect is a haptic effect that can be generated by interpolating a first haptic effect and a second haptic effect based on a dynamic value that is a value between a first interpolant value and a second interpolant value.
  • a dynamic value that is equal to either the first interpolant value or the second interpolant value is considered “between the first interpolant value and the second interpolant value.”
  • a value for each parameter of the dynamic haptic effect is calculated by interpolating a value of the parameter of the first haptic effect with a value of the parameter of the second haptic effect, using an interpolation function.
  • each parameter value of the dynamic haptic effect can be based upon where the dynamic value falls between the first interpolant value and the second interpolant value.
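The parameter-by-parameter interpolation described above can be sketched with a linear interpolation function. This is a minimal illustration under the assumption of linear interpolation; the patent allows any interpolation function, and the parameter names are examples only:

```python
def interpolate_parameter(v1, v2, interp, interp1, interp2):
    """Linearly interpolate one effect parameter between two key frames.

    `interp` must lie between `interp1` and `interp2` (inclusive, matching
    the document's definition of "between")."""
    if not (min(interp1, interp2) <= interp <= max(interp1, interp2)):
        raise ValueError("interpolant outside key frame range")
    if interp1 == interp2:
        return v1
    t = (interp - interp1) / (interp2 - interp1)
    return v1 + t * (v2 - v1)

def interpolate_effect(effect1, effect2, interp, interp1, interp2):
    """Interpolate every shared parameter (e.g., magnitude, duration)."""
    return {name: interpolate_parameter(effect1[name], effect2[name],
                                        interp, interp1, interp2)
            for name in effect1}
```

For instance, with the dynamic value halfway between the two interpolant values, each parameter of the dynamic haptic effect falls halfway between the corresponding parameters of the two basis effects.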
  • Dynamic haptic effects are further described in U.S. patent application Ser. No. 13/546,351, filed on Jul. 11, 2012, entitled “GENERATING HAPTIC EFFECTS FOR DYNAMIC EVENTS” (the contents of which are herein incorporated by reference), and in U.S. patent application Ser. No. 13/667,003, filed on Nov. 2, 2012, entitled “ENCODING DYNAMIC HAPTIC EFFECTS” (the contents of which are herein incorporated by reference).
  • the dynamic haptic effect can be encoded using a haptic effect signal, where the haptic effect signal is a representation of the dynamic haptic effect.
  • the haptic effect signal can be persisted on a disk, memory, or any computer-readable storage medium.
  • the two or more key frames associated with a dynamic haptic effect can comprise the one or more input parameters of a dynamic haptic effect produced by a haptic effect signal.
  • FIG. 1 illustrates a block diagram of a system 10 in accordance with one embodiment of the invention.
  • system 10 is part of a device, and system 10 provides a haptic effect generation functionality for the device.
  • System 10 includes a bus 12 or other communication mechanism for communicating information, and a processor 22 coupled to bus 12 for processing information.
  • Processor 22 may be any type of general or specific purpose processor.
  • System 10 further includes a memory 14 for storing information and instructions to be executed by processor 22 .
  • Memory 14 can be comprised of any combination of random access memory (“RAM”), read only memory (“ROM”), static storage such as a magnetic or optical disk, or any other type of computer-readable medium.
  • a computer-readable medium may be any available medium that can be accessed by processor 22 and may include both a volatile and nonvolatile medium, a removable and non-removable medium, a communication medium, and a storage medium.
  • a communication medium may include computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any other form of an information delivery medium known in the art.
  • a storage medium may include RAM, flash memory, ROM, erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), registers, hard disk, a removable disk, a compact disk read-only memory (“CD-ROM”), or any other form of a storage medium known in the art.
  • memory 14 stores software modules that provide functionality when executed by processor 22 .
  • the modules include an operating system 15 that provides operating system functionality for system 10 , as well as the rest of a mobile device in one embodiment.
  • the modules further include a haptic effect generation module 16 that generates a dynamic haptic effect, as disclosed in more detail below.
  • haptic effect generation module 16 can comprise a plurality of modules, where each individual module provides specific individual functionality for generating a dynamic haptic effect.
  • System 10 will typically include one or more additional application modules 18 to include additional functionality, such as the Integrator™ application by Immersion Corporation.
  • System 10 in embodiments that transmit and/or receive data from remote sources, further includes a communication device 20 , such as a network interface card, to provide mobile wireless network communication, such as infrared, radio, Wi-Fi, or cellular network communication.
  • communication device 20 provides a wired network connection, such as an Ethernet connection or a modem.
  • Processor 22 is further coupled via bus 12 to a display 24 , such as a Liquid Crystal Display (“LCD”), for displaying a graphical representation or user interface to a user.
  • the display 24 may be a touch-sensitive input device, such as a touchscreen, configured to send and receive signals from processor 22 , and may be a multi-touch touchscreen.
  • Processor 22 may be further coupled to a keyboard or cursor control 28 that allows a user to interact with system 10 , such as a mouse or a stylus.
  • System 10 further includes an actuator 26 A.
  • Processor 22 may transmit a haptic signal associated with a generated haptic effect to actuator 26 A, which in turn outputs haptic effects such as vibrotactile haptic effects.
  • Actuator 26 A includes an actuator drive circuit.
  • Actuator 26 A may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a linear resonant actuator (“LRA”), a piezoelectric actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, or an ultrasonic vibration generator.
  • system 10 can include one or more additional actuators, in addition to actuator 26 A.
  • system 10 includes actuator 26 B, in addition to actuator 26 A.
  • a separate device from system 10 includes an actuator that generates the haptic effects, and system 10 sends generated haptic effect signals to that device through communication device 20 .
  • Actuators 26 A and 26 B are examples of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, in response to a drive signal.
  • FIG. 2 illustrates an example dynamic haptic effect definition 200 , according to an embodiment of the invention.
  • a dynamic haptic effect can be defined to include one or more key frames.
  • a key frame is a representation of a basis haptic effect that can be used to define the dynamic haptic effect.
  • a haptic effect signal can be generated using the one or more key frames, where the haptic effect signal is a signal that can store one or more key frames. By generating the haptic effect signal using the one or more key frames, the one or more key frames are generated, and subsequently stored within the haptic effect signal.
  • the haptic effect signal can be stored within, and retrieved from, a haptic effect file.
  • a key frame can include a basis haptic effect definition.
  • a basis haptic effect is a haptic effect that can include one or more parameters that define the characteristics of the haptic effect (more specifically, the characteristics of the kinesthetic feedback and/or tactile feedback produced by the haptic effect), where the haptic effect can be a vibratory haptic effect, for example.
  • the one or more parameters can include a magnitude parameter, a frequency parameter, a period parameter, and a duration parameter.
  • basis haptic effects can include a “MagSweep haptic effect”, and a “periodic haptic effect.”
  • a MagSweep haptic effect is a haptic effect that produces kinesthetic feedback and/or tactile feedback (such as a vibration).
  • a periodic haptic effect is a haptic effect that produces a repeating kinesthetic feedback and/or tactile feedback (such as a vibration pattern).
  • An example of a repeating pattern includes repeating pulses of certain shapes, such as sinusoidal, rectangular, triangular, sawtooth-up and sawtooth-down.
  • a key frame can include an interpolant value.
  • An interpolant value is a value that specifies where a current interpolation is occurring.
  • an interpolant value can be an integer value from a minimum value to a maximum value.
  • an interpolant value can be from 0 to 10,000.
  • an interpolant value can be a fixed-point or floating-point numeric value.
  • An interpolant value can be stored within one or more bits.
  • a key frame can optionally also include a repeat gap value.
  • a repeat gap value is a value that indicates a time period between two consecutive instances of a basis haptic effect when the basis haptic effect is played consecutively.
  • a repeat gap can indicate a number of milliseconds between two consecutive instances of the basis haptic effect.
  • dynamic haptic effect definition 200 includes four key frames, key frames 210 , 220 , 230 , and 240 .
  • a dynamic haptic effect definition can include any number of key frames.
  • Key frame 210 includes a basis haptic effect reference of “Periodic1,” an interpolant value of “0,” and a repeat gap value of “10 ms”.
  • the basis haptic effect reference “Periodic1” refers to basis haptic effect 260 , which is also included within dynamic haptic effect definition 200 .
  • key frame 210 defines basis haptic effect 260 as the basis haptic effect for the interpolant value of “0.” Key frame 210 further indicates that when basis haptic effect 260 is played consecutively, there is a time period of 10 ms between each consecutive instance of basis haptic effect 260 .
  • key frame 220 includes a basis haptic effect reference of “Periodic3,” an interpolant value of “10,” and a repeat gap value of “15 ms”.
  • the basis haptic effect reference “Periodic3” refers to basis haptic effect 270 , which is also included within dynamic haptic effect definition 200 .
  • key frame 220 defines basis haptic effect 270 as the basis haptic effect for the interpolant value of “10.”
  • Key frame 220 further indicates that when basis haptic effect 270 is played consecutively, there is a time period of 15 ms between each consecutive instance of basis haptic effect 270 .
  • key frame 230 includes a basis haptic effect reference of “Periodic1,” an interpolant value of “20,” and a repeat gap value of “5 ms”.
  • the basis haptic effect reference “Periodic1” refers to basis haptic effect 260 , which is also included within dynamic haptic effect definition 200 .
  • key frame 230 defines basis haptic effect 260 as the basis haptic effect for the interpolant value of “20.” This illustrates that a basis haptic effect can be defined as a basis haptic effect for more than one interpolant value.
  • Key frame 230 further indicates that when basis haptic effect 260 is played consecutively, there is a time period of 5 ms between each consecutive instance of basis haptic effect 260 .
  • key frame 240 includes a basis haptic effect reference of “Periodic2,” an interpolant value of “30,” and a repeat gap value of “20 ms”.
  • the basis haptic effect reference “Periodic2” refers to basis haptic effect 280 , which is also included within dynamic haptic effect definition 200 .
  • key frame 240 defines basis haptic effect 280 as the basis haptic effect for the interpolant value of “30.” Key frame 240 further indicates that when basis haptic effect 280 is played consecutively, there is a time period of 20 ms between each consecutive instance of basis haptic effect 280 .
  • a dynamic haptic effect can be defined to also include an indication of an end of the dynamic haptic effect.
  • the indication of the end of the dynamic haptic effect indicates that the dynamic haptic effect does not include any additional key frames.
  • a device that interprets a dynamic haptic effect definition can be configured to interpret the contents of the dynamic haptic effect definition sequentially.
  • the indication can indicate to a device the end of the dynamic haptic effect definition.
  • the indication of an end of the dynamic haptic effect can be considered an additional key frame.
  • dynamic haptic effect definition 200 includes end of dynamic haptic effect definition 250 which indicates the end of dynamic haptic effect definition 200 .
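Dynamic haptic effect definition 200 can be modeled as an ordered list of key frames followed by an end-of-definition marker. The tuple layout and the `None` sentinel are illustrative representations, not the patent's storage format:

```python
# Key frames of dynamic haptic effect definition 200 from the text:
# (interpolant value, basis haptic effect reference, repeat gap in ms)
DEFINITION_200 = [
    (0,  "Periodic1", 10),
    (10, "Periodic3", 15),
    (20, "Periodic1", 5),
    (30, "Periodic2", 20),
    None,  # end-of-definition marker (end of dynamic haptic effect 250)
]

def bracketing_frames(definition, interpolant):
    """Return the pair of consecutive key frames whose interpolant values
    bracket `interpolant`; the trailing None marks the definition's end."""
    frames = [f for f in definition if f is not None]
    for lo, hi in zip(frames, frames[1:]):
        if lo[0] <= interpolant <= hi[0]:
            return lo, hi
    raise ValueError("interpolant outside definition range")
```

An interpolant value of 12, for example, selects key frames 220 and 230, so the dynamic haptic effect would be interpolated from basis haptic effects “Periodic3” and “Periodic1.”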
  • FIG. 3 illustrates an example key frame definition 300 , according to an embodiment of the invention.
  • a dynamic haptic effect definition includes one or more key frames.
  • a key frame definition can include one or more properties. Each property of the one or more properties can include a value.
  • a key frame definition can include a type property.
  • the type property is the first property of the key frame definition.
  • the type property can indicate whether the key frame is a key frame that contains a basis haptic effect for a dynamic haptic effect definition or a key frame that indicates an end of the dynamic haptic effect definition.
  • key frame definition 300 includes type property 310 which indicates the type of key frame defined by key frame definition 300 .
  • a key frame definition can also include a basis haptic effect property.
  • the basis haptic effect property can store a reference to a basis haptic effect for the key frame.
  • key frame definition 300 includes basis haptic effect property 320 (identified in FIG. 3 as “effect name”) which includes a reference to a basis haptic effect for the key frame defined by key frame definition 300 .
  • a key frame definition can also include an interpolant property.
  • the interpolant property can store an interpolant value, where the interpolant value specifies where a current interpolation is occurring.
  • an interpolant value can be an integer value from a minimum value to a maximum value.
  • an interpolant value can be from 0 to 10,000.
  • the interpolant value can be stored in one or more bits.
  • key frame definition 300 includes interpolant property 330 which includes an interpolant value for the key frame defined by key frame definition 300 .
  • a key frame definition can also optionally include a repeat gap property (not illustrated in FIG. 3 ).
  • the repeat gap property can store a repeat gap value which indicates a time period between two consecutive instances of a basis haptic effect for a key frame when the basis haptic effect is played consecutively.
  • a repeat gap can indicate a number of milliseconds between two consecutive instances of the basis haptic effect for the key frame.
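The timing implied by a repeat gap can be illustrated as follows; the effect duration used here is a hypothetical basis haptic effect parameter:

```python
def instance_start_times(duration_ms, repeat_gap_ms, count):
    """Start times (ms) of consecutive instances of a basis haptic effect
    played repeatedly, with `repeat_gap_ms` of silence between the end of
    one instance and the start of the next."""
    return [i * (duration_ms + repeat_gap_ms) for i in range(count)]
```

A 50 ms effect with a 10 ms repeat gap, for example, restarts every 60 ms.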
  • a haptic effect file is a computer file configured to store one or more dynamic haptic effects, where the haptic effect file can be persisted on a disk, memory, or any computer-readable storage medium.
  • a haptic effect file can store one or more dynamic haptic effect definitions using a basis haptic effect storage block and a frame list block.
  • a basis haptic effect storage block can be used to store one or more basis haptic effects that a dynamic haptic effect can reference.
  • a frame list block can be used to store one or more key frame definitions that correspond to a dynamic haptic effect definition.
  • FIG. 4 illustrates an example basis haptic effect storage block 400 , according to an embodiment of the invention.
  • a dynamic haptic effect definition can include one or more basis haptic effects, where at least one stored basis haptic effect is referenced by at least one key frame of the dynamic haptic effect definition.
  • the one or more basis haptic effects can be stored within a basis haptic effect storage block, such as basis haptic effect storage block 400 , where the basis haptic effect storage block is stored within the dynamic haptic effect definition.
  • one or more basis haptic effects can be stored as message streams within basis haptic effect storage block 400 .
  • An example messaging format is a “codename z2” protocol messaging format.
  • a basis haptic effect is defined by a SetPeriodic message optionally preceded by a SetPeriodicModifier message.
  • a SetPeriodicModifier message can appear before a SetPeriodic message in the block. Otherwise, only a SetPeriodic message can appear in the block.
  • a basis haptic effect as stored in a basis haptic effect storage block (such as basis haptic effect storage block 400 of FIG. 4 ) can either take up: (a) 8 bytes of memory in a single SetPeriodic message (assuming a default envelope); or (b) 16 bytes of memory in a first SetPeriodicModifier message followed by a subsequent SetPeriodic message.
  • a basis haptic effect storage block (such as basis haptic effect storage block 400 of FIG. 4 ) can include one or more basis haptic effect definitions, where each basis haptic effect definition corresponds to a basis haptic effect.
  • the one or more basis haptic effect definitions can be sequential within the basis haptic effect storage block, and can each be associated with an index.
  • basis haptic effect storage block 400 includes five basis haptic effects: Effect0, Effect1, Effect2, Effect3, and Effect4.
  • Effect0 is the first basis haptic effect located in basis haptic effect storage block 400
  • Effect1 is the second basis haptic effect located in basis haptic effect storage block 400
  • Effect2 is the third basis haptic effect located in basis haptic effect storage block 400
  • Effect3 is the fourth basis haptic effect located in basis haptic effect storage block 400
  • Effect4 is the fifth basis haptic effect located in basis haptic effect storage block 400 .
  • Each of the five basis haptic effects includes a basis haptic effect definition that either includes a single SetPeriodic message or a combination of a SetPeriodicModifier message and a SetPeriodic message.
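Because each definition occupies either 8 or 16 bytes, the byte offset of each basis haptic effect in a storage block follows from the sizes of the definitions before it. A minimal sketch (illustrative only):

```python
def effect_offsets(sizes):
    """Compute the byte offset of each basis haptic effect definition in a
    basis haptic effect storage block, given each definition's size:
    8 bytes for a lone SetPeriodic message, or 16 bytes for a
    SetPeriodicModifier message followed by a SetPeriodic message."""
    offsets, pos = [], 0
    for size in sizes:
        assert size in (8, 16), "definitions are 8 or 16 bytes"
        offsets.append(pos)
        pos += size
    return offsets
```

Every offset is therefore a multiple of 8, which is what allows a key frame to reference an effect compactly by its offset divided by 8.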
  • FIG. 5 illustrates an example frame list block 500 , according to an embodiment of the invention.
  • a dynamic haptic effect definition can include one or more key frames, where each key frame can reference a basis haptic effect.
  • the one or more key frames can be stored within a frame list block, such as frame list block 500 , where the frame list block is stored within the dynamic haptic effect definition.
  • a frame list block such as frame list block 500 , includes a type property for a first key frame definition.
  • the frame list block further includes one or more properties associated with the first key frame definition, such as a basis haptic effect property, an interpolant property, a repeat gap property, or a combination thereof.
  • the frame list block further includes a type property for a second key frame definition, which indicates the end of the first key frame definition.
  • the frame list block further includes one or more properties associated with the second key frame definition, such as a basis haptic effect property, an interpolant property, a repeat gap property, or a combination thereof. This continues for each key frame definition of the frame list block.
  • the frame list block further includes a type property that indicates an end of a dynamic haptic effect.
  • the key frame definitions of the frame list block are in sequential order. In other words, the events of the frame list block are processed in the order in which they are located within the frame list block.
  • one or more properties of the frame list block can be encoded using a single header byte, followed by optional data bytes.
  • An example encoding scheme of one or more properties of a frame list block is as follows:
  • EffectNameAsOffsetU8 property:
    Byte 0 (bits 7-0): 0xE0. EffectName is specified as a memory offset from the base of the basis haptic effect storage block.
    Byte 1 (bits 7-0): OFFSET11_3. The offset from the base of the basis haptic effect storage block where the 8- or 16-byte haptic effect definition can be found. The OFFSET11_3 value can be multiplied by 8 to obtain the actual address offset.
  • InterpolantU16 property:
    Byte 0 (bits 7-0): 0xE6. The interpolant is stored as a 16-bit unsigned integer.
    Byte 1: TIME15_8. The MSByte of the TimeOffset value.
    Byte 2: TIME7_0. The LSByte of the TimeOffset value.
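The encoding scheme above can be sketched in code. The Python helpers below are illustrative only (the function names are not from the patent); they pack and unpack an InterpolantU16 property, and convert an OFFSET11_3 field into an address offset using the multiply-by-8 rule described above.

```python
def encode_interpolant_u16(time_offset):
    """Encode an InterpolantU16 property: header byte 0xE6, followed by
    the MSByte and LSByte of the 16-bit TimeOffset value."""
    assert 0 <= time_offset <= 0xFFFF
    return bytes([0xE6, (time_offset >> 8) & 0xFF, time_offset & 0xFF])

def decode_interpolant_u16(data):
    """Decode the three-byte InterpolantU16 property back to an integer."""
    assert data[0] == 0xE6, "not an InterpolantU16 header byte"
    return (data[1] << 8) | data[2]

def effect_offset_from_u8(offset11_3):
    """An EffectNameAsOffsetU8 property stores OFFSET11_3; the actual
    address offset from the base of the basis haptic effect storage
    block is OFFSET11_3 multiplied by 8."""
    return offset11_3 * 8

# A TimeOffset of 1000 (0x03E8) encodes as the byte stream E6 03 E8.
packed = encode_interpolant_u16(1000)
```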
  • the key frame type property and the end of dynamic haptic effect type property correspond to a type property of a key frame definition
  • the EffectNameAsOffSetU8 property corresponds to a basis haptic effect property of the key frame definition
  • the InterpolantU16 property corresponds to an interpolant property of the key frame definition
  • the RepeatGapU16 property corresponds to a repeat gap property of the key frame definition.
  • frame list block 500 includes key frame definitions 510 , 520 , and 530 .
  • Key frame definitions 510 and 520 are each a definition for a basis haptic effect key frame.
  • Key frame definition 530 is an indication of an end of the dynamic haptic effect stored within frame list block 500 .
  • the left column of frame list block 500 indicates a byte stream that is found in memory for each of key frame definitions 510 , 520 , and 530 .
  • the right column of frame list block 500 indicates a meaning of each property for each of key frame definitions 510 , 520 , and 530 .
  • key frame definition 510 includes a key frame type property (“KeyFrame Event” as illustrated in FIG. 5 ) that indicates a start of key frame definition 510 .
  • Key frame definition 510 further includes a basis haptic effect property (“EffectNameAsOffsetU8” as illustrated in FIG. 5 ) that stores a reference to a basis haptic effect for key frame definition 510 , where the basis haptic effect property includes a header byte and an offset byte.
  • Key frame definition 510 further includes an interpolant property (“InterpolantU16” as illustrated in FIG. 5 ) that stores an interpolant value for key frame definition 510 , where the interpolant property includes a header byte, an MSB, and an LSB.
  • Key frame definition 510 further includes a repeat gap property (“RepeatGapU16” as illustrated in FIG. 5 ) that stores a repeat gap value which indicates a time period between two consecutive instances of a basis haptic effect for a key frame, where the repeat gap property includes a header byte, an MSB, and an LSB.
  • key frame definition 520 also includes a key frame type property (“KeyFrame Event” as illustrated in FIG. 5 ) that indicates a start of key frame definition 520 .
  • Key frame definition 520 further includes a basis haptic effect property (“EffectNameAsOffsetU16” as illustrated in FIG. 5 ) that stores a reference to a basis haptic effect for key frame definition 520 , where the basis haptic effect property includes a header byte, a basis haptic effect definition MSB, and a basis haptic effect definition LSB.
  • Key frame definition 520 further includes an interpolant property (“InterpolantU16” as illustrated in FIG. 5 ) that stores an interpolant value for key frame definition 520 , where the interpolant property includes a header byte, an MSB, and an LSB.
  • key frame definition 530 includes an end of dynamic haptic effect type property (“EndofDynamicHapticEffect” as illustrated in FIG. 5 ) that indicates an end of the dynamic haptic effect definition.
  • a dynamic haptic effect definition (such as dynamic haptic effect definition 200 of FIG. 2 ) can be stored within a haptic effect file.
  • a haptic effect file is a computer file configured to store one or more dynamic haptic effects.
  • the dynamic haptic effect definition can be stored within the haptic effect file, and the haptic effect file can be persisted within a computer-readable medium, such as a disk or memory.
  • the dynamic haptic effect definition can subsequently be retrieved from the haptic effect file and interpreted.
  • a dynamic haptic effect can be generated by interpolating a first haptic effect and a second haptic effect based on a dynamic value that is a value between a first interpolant value and a second interpolant value. More specifically, a value for each parameter of the dynamic haptic effect can be calculated by interpolating a value of the parameter of the first haptic effect with a value of the parameter of the second haptic effect, using an interpolation function. The interpolation of each parameter value of the dynamic haptic effect can be based upon where the dynamic value falls between the first interpolant value and the second interpolant value.
  • a dynamic value of “50” can cause a first haptic effect associated with the first interpolant value of “0” to be interpolated with a second haptic effect associated with the second interpolant value of “100” to create the dynamic haptic effect.
  • Each parameter value of the first haptic effect can be interpolated with a parameter value of the second haptic effect based on an interpolation function, so that the parameter values of the dynamic haptic effect are based both on the parameter values of the first haptic effect and the parameter values of the second haptic effect.
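As a concrete sketch of the interpolation described above, the snippet below assumes a simple linear interpolation function (the patent leaves the interpolation function unspecified) and hypothetical magnitude and duration parameters:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b, with t in [0, 1]."""
    return a + (b - a) * t

def interpolate_effect(effect1, effect2, i1, i2, dynamic_value):
    # Normalize where the dynamic value falls between the two interpolant values.
    t = (dynamic_value - i1) / (i2 - i1)
    # Interpolate every parameter the two haptic effects share.
    return {p: lerp(effect1[p], effect2[p], t) for p in effect1}

# A dynamic value of 50 between interpolant values 0 and 100 blends halfway.
blended = interpolate_effect({"magnitude": 20, "duration": 100},
                             {"magnitude": 100, "duration": 200},
                             0, 100, 50)
# blended == {"magnitude": 60.0, "duration": 150.0}
```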
  • a dynamic haptic effect definition (such as dynamic haptic effect definition 200 of FIG. 2 ) can be used to generate a haptic effect signal.
  • a haptic effect signal can be stored within a haptic effect file.
  • the haptic effect signal can subsequently be retrieved from the haptic effect file.
  • a drive signal can be applied to a haptic output device according to the haptic effect signal.
  • the haptic effect can be further generated using the haptic output device according to the drive signal.
  • a system can generate a dynamic haptic effect and distribute the haptic effect among a plurality of actuators.
  • the generation and distribution of the dynamic haptic effect can be implemented using different techniques. Example implementations are described below.
  • FIG. 6 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to an embodiment of the invention.
  • a dynamic haptic effect can include one or more key frames, and each key frame can include an actuator value that identifies an actuator.
  • the actuator identified by the actuator value is an actuator that the key frame targets.
  • the system can also interpolate each actuator value stored in the two or more key frames to generate an interpolated actuator value.
  • the system can then use the interpolated actuator value to distribute the generated dynamic haptic effect among two or more actuators identified by the two or more actuator values.
  • dynamic haptic effect 610 includes key frames 611 and 612 .
  • Key frame 611 includes an interpolant value of “40%,” a magnitude value of “20%,” a duration value of “100 ms,” and an actuator value (identified in FIG. 6 as an “actuator index”) of “1.”
  • the magnitude value of “20%,” and the duration value of “100 ms” collectively represent a basis haptic effect that is stored within key frame 611 .
  • a basis haptic effect can include other parameters (such as a frequency parameter and a period parameter) that are stored within a key frame.
  • Key frame 612 includes an interpolant value of “80%,” a magnitude value of “100%,” a duration value of “200 ms,” and an actuator value of “2.” According to the embodiment, the magnitude value of “100%,” and the duration value of “200 ms,” collectively represent a basis haptic effect that is stored within key frame 612 .
  • the system can receive an interpolant value of “70%” at effect interpolator 620 , and interpolate the basis haptic effect of key frame 611 (i.e., the magnitude value of “20%,” and the duration value of “100 ms,”) with the basis haptic effect of key frame 612 (i.e., the magnitude value of “100%,” and the duration value of “200 ms”).
  • the system can further interpolate the actuator value of key frame 611 (i.e., the actuator value “1”) with the actuator value of key frame 612 (i.e., the actuator value “2”).
  • the result of these two interpolations is interpolated parameters 630 , where interpolated parameters 630 represent dynamic haptic effect 610 .
  • interpolated parameters 630 include an interpolated magnitude value of “80%,” an interpolated duration value of “175 ms,” and an interpolated actuator value (identified in FIG. 6 as an “actuator index”) of “1.75.”
  • the interpolated magnitude value of “80%” is interpolated from the two magnitude values of “20%” and “100%,” based on an interpolation function.
  • the interpolated duration value of “175 ms” is interpolated from the two duration values of “100 ms” and “200 ms,” also based on an interpolation function.
  • the interpolated actuator value of “1.75” is interpolated from the two actuator values of “1,” and “2,” also based on an interpolation function.
  • the system can distribute interpolated parameters 630 , which represent dynamic haptic effect 610 , among a plurality of actuators based on the interpolated actuator value, where the system generates targeted parameters for each actuator of the plurality of actuators based on the interpolated actuator value.
  • the targeted parameters for each actuator represent a portion of dynamic haptic effect 610 that is distributed to the actuator.
  • the system generates targeted parameters 650 and targeted parameters 660 .
  • Targeted parameters 650 include a magnitude value of “20%,” a duration value of “175 ms,” and an actuator value of “1.”
  • Targeted parameters 660 include a magnitude value of “60%,” a duration value of “175 ms,” and an actuator value of “2.”
  • the system can use the interpolated actuator value “1.75” of interpolated parameters 630 to distribute 25% of dynamic haptic effect 610 to actuator “1,” and to distribute 75% of dynamic haptic effect 610 to actuator “2.”
  • the strength parameters of dynamic haptic effect 610 , such as a magnitude parameter and a period parameter, are distributed among the plurality of actuators.
  • the time-based parameters, such as a frequency parameter and a duration parameter, remain the same for the plurality of actuators.
  • the interpolated magnitude value “80%” of interpolated parameters 630 is distributed into a magnitude value of “20%” for targeted parameters 650 , and a magnitude value of “60%” for targeted parameters 660 .
  • the duration value “175 ms” of interpolated parameters 630 is included in both targeted parameters 650 and targeted parameters 660 . While the illustrated embodiment involves two key frames, where each key frame includes an actuator value, and thus, involves two actuators, one of ordinary skill in the art would readily appreciate that, in alternate embodiments, the above implementation can involve any number of key frames, where each key frame includes an actuator value, and thus, can include any number of actuators.
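The FIG. 6 implementation can be sketched as follows, again assuming linear interpolation; the dictionary field names are illustrative. The fractional part of the interpolated actuator index determines how the strength (magnitude) parameter is split, while the time-based duration parameter is copied to both actuators:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def interpolate_keyframes(kf1, kf2, interpolant):
    """Interpolate magnitude, duration, and actuator index between two key frames."""
    t = (interpolant - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
    return {"magnitude": lerp(kf1["magnitude"], kf2["magnitude"], t),
            "duration": lerp(kf1["duration"], kf2["duration"], t),
            "actuator": lerp(kf1["actuator"], kf2["actuator"], t)}

def distribute(params):
    """Split the magnitude by the fractional actuator index; keep the
    time-based duration parameter the same for both actuators."""
    lo, frac = int(params["actuator"]), params["actuator"] % 1
    return {lo:     {"magnitude": params["magnitude"] * (1 - frac),
                     "duration": params["duration"]},
            lo + 1: {"magnitude": params["magnitude"] * frac,
                     "duration": params["duration"]}}

kf611 = {"interpolant": 40, "magnitude": 20, "duration": 100, "actuator": 1}
kf612 = {"interpolant": 80, "magnitude": 100, "duration": 200, "actuator": 2}
params = interpolate_keyframes(kf611, kf612, 70)
# params: magnitude 80.0, duration 175.0, actuator index 1.75
targeted = distribute(params)
# targeted[1]: magnitude 20.0; targeted[2]: magnitude 60.0; both duration 175.0
```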
  • a first actuator can be on a left side of a device, and a second actuator can be on a right side of a device.
  • a first key frame of a dynamic haptic effect can be played at the first actuator.
  • a second key frame of a dynamic haptic effect can be played at the second actuator.
  • FIG. 7 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention.
  • a dynamic haptic effect can include one or more key frames, and each key frame can include an actuator value that identifies an actuator.
  • the actuator identified by the actuator value is an actuator that the key frame targets.
  • a system can then group the one or more key frames by each actuator identified by each actuator value. More specifically, the system can create one or more groups of key frames, where each key frame of a group targets the same actuator. For each actuator, the system can independently interpolate each haptic effect stored in the one or more key frames of the group associated with the actuator, in order to generate the dynamic haptic effect.
  • dynamic haptic effect 710 includes key frames 711 , 712 , 713 , and 714 .
  • Key frame 711 includes an interpolant value of “40%,” a magnitude value of “20%,” a duration value of “100 ms,” and an actuator value (identified in FIG. 7 as an “actuator index”) of “1.”
  • the magnitude value of “20%,” and the duration value of “100 ms,” collectively represent a basis haptic effect that is stored within key frame 711 .
  • a basis haptic effect can include other parameters (such as a frequency parameter and a period parameter) that are stored within a key frame.
  • Key frame 712 includes an interpolant value of “30%,” a magnitude value of “80%,” a duration value of “200 ms,” and an actuator value of “2.”
  • the magnitude value of “80%,” and the duration value of “200 ms,” collectively represent a basis haptic effect that is stored within key frame 712 .
  • Key frame 713 includes an interpolant value of “80%,” a magnitude value of “100%,” a duration value of “200 ms,” and an actuator value of “1.”
  • the magnitude value of “100%,” and the duration value of “200 ms,” collectively represent a basis haptic effect that is stored within key frame 713 .
  • Key frame 714 includes an interpolant value of “90%,” a magnitude value of “50%,” a duration value of “50 ms,” and an actuator value of “2.”
  • the magnitude value of “50%,” and the duration value of “50 ms,” collectively represent a basis haptic effect that is stored within key frame 714 .
  • the system can group together key frames 711 and 713 based on the actuator value of “1” stored within key frames 711 and 713 , and associate the group with actuator 721 (i.e., actuator 1 ). Further, at effect grouper 720 , the system can group together key frames 712 and 714 based on the actuator value of “2” stored within key frames 712 and 714 , and associate the group with actuator 722 (i.e., actuator 2 ).
  • the system can receive an interpolant value of “70%” at effect interpolator 730 , and interpolate the basis haptic effect of key frame 711 (i.e., the magnitude value of “20%,” and the duration value of “100 ms,”) with the basis haptic effect of key frame 713 (i.e., the magnitude value of “100%,” and the duration value of “200 ms”).
  • the result of this interpolation is interpolated parameters 750 , where interpolated parameters 750 represent a portion of dynamic haptic effect 710 that can be output at actuator 721 .
  • interpolated parameters 750 include an interpolated magnitude value of “80%,” and an interpolated duration value of “175 ms”.
  • the interpolated magnitude value of “80%,” is interpolated from the two magnitude values of “20%” and “100%,” based on an interpolation function.
  • the interpolated duration value of “175 ms” is interpolated from the two duration values of “100 ms” and “200 ms,” also based on an interpolation function.
  • the system can independently receive an interpolant value of “70%” at effect interpolator 740 , and independently interpolate the basis haptic effect of key frame 712 (i.e., the magnitude value of “80%,” and the duration value of “200 ms,”) with the basis haptic effect of key frame 714 (i.e., the magnitude value of “50%,” and the duration value of “50 ms”).
  • the result of this interpolation is interpolated parameters 760 , where interpolated parameters 760 represent a portion of dynamic haptic effect 710 that can be output at actuator 722 .
  • interpolated parameters 760 include an interpolated magnitude value of “60%,” and an interpolated duration value of “100 ms”.
  • the interpolated magnitude value of “60%,” is interpolated from the two magnitude values of “80%” and “50%,” based on an interpolation function.
  • the interpolated duration value of “100 ms” is interpolated from the two duration values of “200 ms” and “50 ms,” also based on an interpolation function. While the illustrated embodiment involves four key frames, and involves two actuators, one of ordinary skill in the art would readily appreciate that, in alternate embodiments, the above implementation can involve any number of key frames, and can include any number of actuators.
  • the independent interpolations can also be synchronized to output dynamic haptic effect 710 at actuators 721 and 722 .
  • actuators 721 and 722 can each output a respective portion of dynamic haptic effect 710 in a synchronized manner.
  • the same interpolant value is provided to effect interpolators 730 and 740 .
  • effect interpolators 730 and 740 can each be provided a different interpolant value.
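The grouping implementation of FIG. 7 can be sketched as follows (linear interpolation assumed, field names illustrative): key frames are first grouped by actuator index, then each group is interpolated independently at the same interpolant value.

```python
from collections import defaultdict

def lerp(a, b, t):
    return a + (b - a) * t

def group_and_interpolate(keyframes, interpolant):
    """Group key frames by actuator index, then independently interpolate
    each group's basis haptic effect parameters."""
    groups = defaultdict(list)
    for kf in keyframes:
        groups[kf["actuator"]].append(kf)
    result = {}
    for actuator, frames in groups.items():
        frames.sort(key=lambda kf: kf["interpolant"])
        kf1, kf2 = frames[0], frames[1]  # assumes two key frames per actuator
        t = (interpolant - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
        result[actuator] = {"magnitude": lerp(kf1["magnitude"], kf2["magnitude"], t),
                            "duration": lerp(kf1["duration"], kf2["duration"], t)}
    return result

keyframes = [
    {"interpolant": 40, "magnitude": 20,  "duration": 100, "actuator": 1},  # key frame 711
    {"interpolant": 30, "magnitude": 80,  "duration": 200, "actuator": 2},  # key frame 712
    {"interpolant": 80, "magnitude": 100, "duration": 200, "actuator": 1},  # key frame 713
    {"interpolant": 90, "magnitude": 50,  "duration": 50,  "actuator": 2},  # key frame 714
]
per_actuator = group_and_interpolate(keyframes, 70)
# actuator 1: magnitude 80.0, duration 175.0
# actuator 2: magnitude approx. 60, duration approx. 100 ms
```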
  • FIG. 8 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention.
  • a dynamic haptic effect can include one or more key frames.
  • the system can also determine actuator distribution information that indicates how to distribute the dynamic haptic effect among a plurality of actuators. The system can then interpolate each haptic effect stored in the two or more key frames to generate the dynamic haptic effect, and can then use the actuator distribution information to distribute the generated dynamic haptic effect among the plurality of actuators.
  • dynamic haptic effect 810 includes key frames 811 and 812 .
  • Key frame 811 includes an interpolant value of “40%,” a magnitude value of “20%,” and a duration value of “100 ms.”
  • the magnitude value of “20%,” and the duration value of “100 ms,” collectively represent a basis haptic effect that is stored within key frame 811 .
  • a basis haptic effect can include other parameters (such as a frequency parameter and a period parameter) that are stored within a key frame.
  • Key frame 812 includes an interpolant value of “80%,” a magnitude value of “100%,” and a duration value of “200 ms”. According to the embodiment, the magnitude value of “100%,” and the duration value of “200 ms,” collectively represent a basis haptic effect that is stored within key frame 812 .
  • Dynamic haptic effect 810 further includes actuator distribution information 813 (identified in FIG. 8 as “actuator distribution 813 ”). Actuator distribution information 813 indicates how to distribute dynamic haptic effect 810 among a plurality of actuators.
  • distribution information 813 indicates that 25% of dynamic haptic effect 810 is to be distributed to a first actuator (i.e., Actuator 1 ), and that 75% of dynamic haptic effect 810 is to be distributed to a second actuator (i.e., Actuator 2 ).
  • the system can receive an interpolant value of “70%” at effect interpolator 820 , and interpolate the basis haptic effect of key frame 811 (i.e., the magnitude value of “20%,” and the duration value of “100 ms,”) with the basis haptic effect of key frame 812 (i.e., the magnitude value of “100%,” and the duration value of “200 ms”).
  • the result of this interpolation is interpolated parameters 830 , where interpolated parameters 830 represent dynamic haptic effect 810 .
  • interpolated parameters 830 include an interpolated magnitude value of “80%,” and an interpolated duration value of “175 ms”.
  • the interpolated magnitude value of “80%,” is interpolated from the two magnitude values of “20%” and “100%,” based on an interpolation function.
  • the interpolated duration value of “175 ms” is interpolated from the two duration values of “100 ms” and “200 ms,” also based on an interpolation function.
  • the system can distribute interpolated parameters 830 , which represent dynamic haptic effect 810 , among a plurality of actuators based on actuator distribution information 813 , where the system generates targeted parameters for each actuator of the plurality of actuators based on actuator distribution information 813 .
  • the targeted parameters for each actuator represent a portion of dynamic haptic effect 810 that is distributed to the actuator.
  • the system generates targeted parameters 850 and targeted parameters 860 .
  • Targeted parameters 850 include a magnitude value of “20%,” a duration value of “175 ms,” and an actuator value (identified in FIG. 8 as an “actuator index”) of “1.”
  • Targeted parameters 860 include a magnitude value of “60%,” a duration value of “175 ms,” and an actuator value of “2.”
  • the system can use actuator distribution information 813 to distribute 25% of dynamic haptic effect 810 to actuator “1,” and to distribute 75% of dynamic haptic effect 810 to actuator “2.”
  • the strength parameters of dynamic haptic effect 810 , such as a magnitude parameter and a period parameter, are distributed among the plurality of actuators.
  • the time-based parameters, such as a frequency parameter and a duration parameter, remain the same for the plurality of actuators.
  • the interpolated magnitude value “80%” of interpolated parameters 830 is distributed into a magnitude value of “20%” for targeted parameters 850 , and a magnitude value of “60%” for targeted parameters 860 .
  • the interpolated duration value “175 ms” of interpolated parameters 830 is included in both targeted parameters 850 and targeted parameters 860 . While the illustrated embodiment involves two key frames, and involves two actuators, one of ordinary skill in the art would readily appreciate that, in alternate embodiments, the above implementation can involve any number of key frames, and can include any number of actuators.
  • actuator distribution information 813 can be stored within a haptic effect file that dynamic haptic effect 810 is stored within. In another embodiment, actuator distribution information 813 can be stored within one or more key frames of dynamic haptic effect 810 . In yet another embodiment, actuator distribution information 813 can be stored within a basis haptic effect that is referenced by one or more of the key frames of dynamic haptic effect 810 . In yet another embodiment, actuator distribution information 813 can be determined by the system at run time.
  • the system can disregard the one or more actuator values stored within the one or more key frames of dynamic haptic effect 810 , and can distribute dynamic haptic effect 810 among a plurality of actuators based on actuator distribution information 813 .
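The FIG. 8 distribution step can be sketched as follows, with the 25%/75% weights standing in for actuator distribution information 813 (field names illustrative):

```python
def distribute_by_weights(interpolated, weights):
    """Split the strength parameter (magnitude) among actuators using
    actuator distribution weights; the time-based duration parameter
    is kept unchanged for every actuator."""
    return {actuator: {"magnitude": interpolated["magnitude"] * w,
                       "duration": interpolated["duration"]}
            for actuator, w in weights.items()}

interpolated = {"magnitude": 80, "duration": 175}
# Distribution information: 25% to actuator 1, 75% to actuator 2.
targeted = distribute_by_weights(interpolated, {1: 0.25, 2: 0.75})
# targeted[1]["magnitude"] == 20.0, targeted[2]["magnitude"] == 60.0
```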
  • FIG. 9 illustrates an example key frame definition 900 that includes a direction property, according to an embodiment of the invention.
  • a dynamic haptic effect can be defined to include one or more key frames.
  • a key frame can include a basis haptic effect definition, an interpolant value, and optionally, a repeat gap value.
  • a key frame can also include a direction value.
  • a direction value is a value that specifies a direction of a dynamic haptic effect.
  • a direction of a dynamic haptic effect is an ordinal direction of a received interpolant value for the dynamic haptic effect, as compared to a previously received interpolant value for the dynamic haptic effect. For example, if a received interpolant value for a dynamic haptic effect is the value “100,” and the previously received interpolant value for the dynamic haptic effect is “50,” a direction of the dynamic haptic effect can be classified as “upwards,” because the received interpolant value is greater than the previously received interpolant value.
  • a direction value of “UP” can specify the “upwards” direction of the dynamic haptic effect.
  • if a received interpolant value for a dynamic haptic effect is the value “100,” and the previously received interpolant value for the dynamic haptic effect is “200,” a direction of the dynamic haptic effect can be classified as “downwards,” because the received interpolant value is less than the previously received interpolant value.
  • a direction value of “DOWN” can specify the “downwards” direction of the dynamic haptic effect.
  • a direction value can be a string value.
  • a direction value can be a fixed-point or floating-point numeric value.
  • a direction value can be stored within one or more bits.
  • a key frame can optionally also include a category value.
  • a category value is a value that specifies a category of a dynamic haptic effect.
  • a category of a dynamic haptic effect is a classification of the dynamic haptic effect.
  • a category of a dynamic haptic effect can be determined by a category value that can be received along with an interpolant value.
  • a category of the dynamic haptic effect can be determined based on a received interpolant value.
  • a category value can be a string value.
  • a category value can be a fixed-point or floating-point numeric value.
  • a category value can be stored within one or more bits.
  • dynamic haptic effect definition 900 includes four key frames, key frames 910 , 920 , 930 , and 940 .
  • a dynamic haptic effect definition can include any number of key frames.
  • Key frame 910 includes a basis haptic effect reference of “Periodic1,” an interpolant value of “0,” and a repeat gap value of “10 ms”.
  • the basis haptic effect reference “Periodic1” refers to basis haptic effect 960 , which is also included within dynamic haptic effect definition 900 .
  • key frame 910 defines basis haptic effect 960 as the basis haptic effect for the interpolant value of “0.” Key frame 910 further indicates that when basis haptic effect 960 is played consecutively, there is a time period of 10 ms between each consecutive instance of basis haptic effect 960 .
  • key frame 920 includes a basis haptic effect reference of “Periodic2,” an interpolant value of “80,” and a repeat gap value of “15 ms.”
  • the basis haptic effect reference “Periodic2” refers to basis haptic effect 970 , which is also included within dynamic haptic effect definition 900 .
  • key frame 920 defines basis haptic effect 970 as the basis haptic effect for the interpolant value of “80.” Key frame 920 further indicates that when basis haptic effect 970 is played consecutively, there is a time period of 15 ms between each consecutive instance of basis haptic effect 970 .
  • key frame 930 includes a basis haptic effect reference of “Periodic3,” an interpolant value of “90,” and a repeat gap value of “5 ms.”
  • the basis haptic effect reference “Periodic3” refers to basis haptic effect 980 , which is also included within dynamic haptic effect definition 900 .
  • key frame 930 defines basis haptic effect 980 as the basis haptic effect for the interpolant value of “90.”
  • Key frame 930 further indicates that when basis haptic effect 980 is played consecutively, there is a time period of 5 ms between each consecutive instance of basis haptic effect 980 .
  • key frame 940 includes a basis haptic effect reference of “Periodic3,” an interpolant value of “100,” and a repeat gap value of “20 ms”.
  • the basis haptic effect reference “Periodic3” refers to basis haptic effect 980 , which is also included within dynamic haptic effect definition 900 .
  • key frame 940 defines basis haptic effect 980 as the basis haptic effect for the interpolant value of “100.”
  • Key frame 940 further indicates that when basis haptic effect 980 is played consecutively, there is a time period of 20 ms between each consecutive instance of basis haptic effect 980 .
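As an illustration of the repeat gap values above, the sketch below computes the start times of consecutive instances of a basis haptic effect, assuming the gap is measured from the end of one instance to the start of the next; the 100 ms duration is hypothetical, since the durations of basis haptic effects 960, 970, and 980 are not given.

```python
def instance_start_times(duration_ms, repeat_gap_ms, count):
    """Start times of consecutive instances of a basis haptic effect:
    each new instance begins after the previous instance's duration
    plus the key frame's repeat gap."""
    return [i * (duration_ms + repeat_gap_ms) for i in range(count)]

# A hypothetical 100 ms basis effect with key frame 940's 20 ms repeat gap:
# consecutive instances start at 0, 120, and 240 ms.
starts = instance_start_times(100, 20, 3)
```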
  • key frames 910 , 920 , 930 , and 940 each also include a direction value. More specifically, key frames 910 and 930 each include a direction value of “UP,” and key frames 920 and 940 each include a direction value of “DOWN.” Thus, key frames 910 and 930 indicate that they are key frames to be used (and thus, their respective basis haptic effects are to be used) when a direction of dynamic haptic effect definition 900 is an “upwards” direction. Further, key frames 920 and 940 indicate that they are key frames to be used (and thus, their respective basis haptic effects are to be used) when a direction of dynamic haptic effect definition 900 is a “downwards” direction.
  • key frames 910 , 920 , 930 , and 940 can each optionally also include a category value (not illustrated in FIG. 9 ).
  • key frames 910 , 920 , 930 , and 940 further indicate that each respective key frame (and thus, each respective basis haptic effect) is to be used when a category of dynamic haptic effect definition 900 is a category that is equal to the category value of the respective key frame
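The direction-based key frame selection described for FIG. 9 can be sketched as follows; the dictionary layout is illustrative, not the patent's storage format:

```python
def classify_direction(previous, current):
    """Ordinal direction of the received interpolant value relative to
    the previously received interpolant value."""
    return "UP" if current > previous else "DOWN"

def select_key_frames(key_frames, previous, current):
    """Keep only the key frames whose direction value matches the
    current direction of the dynamic haptic effect."""
    direction = classify_direction(previous, current)
    return [kf for kf in key_frames if kf["direction"] == direction]

key_frames = [
    {"name": 910, "interpolant": 0,   "direction": "UP"},
    {"name": 920, "interpolant": 80,  "direction": "DOWN"},
    {"name": 930, "interpolant": 90,  "direction": "UP"},
    {"name": 940, "interpolant": 100, "direction": "DOWN"},
]
# The interpolant rose from 50 to 100, so the "UP" key frames 910 and 930
# (and their basis haptic effects) are used for interpolation.
selected = select_key_frames(key_frames, 50, 100)
```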
  • a dynamic haptic effect can be defined to also include an indication of an end of the dynamic haptic effect.
  • the indication of the end of the dynamic haptic effect indicates that the dynamic haptic effect does not include any additional key frames.
  • a device that interprets a dynamic haptic effect definition can be configured to interpret the contents of the dynamic haptic effect definition sequentially.
  • the indication can indicate to a device the end of the dynamic haptic effect definition.
  • the indication of an end of the dynamic haptic effect can be considered an additional key frame.
  • dynamic haptic effect definition 900 includes end of dynamic haptic effect definition 950 which indicates the end of dynamic haptic effect definition 900 .
  • a dynamic haptic effect can be designed for a user interface software module that displays a glow along an edge of a user interface of a device.
  • the dynamic haptic effect can include a plurality of key frames, where each key frame includes an interpolant value. According to the embodiment, there may be many more interpolant values associated with decreasing the glow along the edge of the user interface, in comparison with the interpolant values associated with increasing the glow.
  • By storing a direction value within each key frame, where each direction value includes either a value of “GROW” or “DECAY,” each key frame (and thus, each interpolant value of the key frame) can be associated with a specific direction of the dynamic haptic effect (i.e., a “growing” direction of the dynamic haptic effect, or a “decaying” direction of the dynamic haptic effect).
  • FIG. 10 illustrates a flow diagram of the functionality of a haptic effect generation module (such as haptic effect generation module 16 of FIG. 1 ), according to one embodiment of the invention.
  • the functionality of FIG. 10 , as well as the functionality of FIG. 11 , is implemented by software stored in memory or another computer-readable or tangible medium, and executed by a processor.
  • each functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software.
  • each functionality may be performed by hardware using analog components.
  • a first key frame is received, where the first key frame includes a first interpolant value and a first haptic effect.
  • the first interpolant value can be a value that specifies where an interpolation occurs for the first haptic effect.
  • the first key frame can include a repeat gap value.
  • the first haptic effect can be a vibratory haptic effect, and can include a plurality of parameters.
  • the plurality of parameters can include a magnitude parameter, a frequency parameter, a period parameter, and a duration parameter.
  • a second key frame is received, where the second key frame includes a second interpolant value and a second haptic effect.
  • the second interpolant value can be a value that specifies where an interpolation occurs for the second haptic effect.
  • the second key frame can include a repeat gap value.
  • the second haptic effect can be a vibratory haptic effect, and can include a plurality of parameters.
  • the plurality of parameters can include a magnitude parameter, a frequency parameter, a period parameter, and a duration parameter.
  • a dynamic haptic effect can be defined as including two key frames, where each key frame includes a haptic effect.
  • a dynamic haptic effect can alternately be defined as including three or more key frames, where each key frame includes a haptic effect.
  • an interpolant value is received, where the interpolant value is between the first interpolant value and the second interpolant value.
  • the flow then proceeds to 1040.
  • a dynamic haptic effect is determined from the interpolant value. The determining of the dynamic haptic effect is further described in greater detail according to different embodiments.
  • the flow then proceeds to 1050.
  • the dynamic haptic effect is distributed among a plurality of actuators. The distributing of the dynamic haptic effect is also further described in greater detail according to different embodiments. The flow then ends.
  • the determining of the dynamic haptic effect can be performed according to the following functionality.
  • the dynamic haptic effect can be interpolated from the first haptic effect and the second haptic effect.
  • a value for each parameter of the dynamic haptic effect can be calculated by interpolating a value of the parameter of the first haptic effect with a value of the parameter of the second haptic effect, using an interpolation function.
  • the interpolation of each parameter value of the dynamic haptic effect can be based upon where the received interpolant value falls between the first interpolant value that corresponds to the first haptic effect and the second interpolant value that corresponds to the second haptic effect.
  • a dynamic haptic effect can be generated by interpolating two basis haptic effects.
  • Such an interpolation can be a linear interpolation.
  • a dynamic haptic effect can alternately be generated by interpolating three or more haptic effects based on the abovementioned functionality.
  • Such an interpolation can be a spline interpolation, where the interpolating function is a special type of piecewise polynomial called a spline, and where the interpolating function can map an interpolant value to a dynamic haptic effect using two or more key frames.
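As an illustrative sketch only (not the patent's implementation, and with hypothetical names such as `interpolate_effect` and the dictionary layout), the parameter-wise linear interpolation described above can be expressed in Python as follows:

```python
# Hypothetical sketch of parameter-wise linear interpolation between
# two key frames; names and data layout are illustrative only.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_effect(kf1, kf2, interpolant):
    """Compute a dynamic haptic effect for an interpolant value that
    falls between the two key frames' interpolant values."""
    # Normalize where the received interpolant falls between the
    # first and second key frame interpolant values.
    t = (interpolant - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
    effect1, effect2 = kf1["effect"], kf2["effect"]
    # Interpolate each parameter (e.g. magnitude, frequency, duration).
    return {param: lerp(effect1[param], effect2[param], t) for param in effect1}

kf1 = {"interpolant": 0, "effect": {"magnitude": 5000, "duration": 100}}
kf2 = {"interpolant": 10, "effect": {"magnitude": 10000, "duration": 200}}
print(interpolate_effect(kf1, kf2, 5))  # halfway between the two effects
```

An interpolant value equal to either key frame's interpolant value simply reproduces that key frame's haptic effect, consistent with "between" as defined above.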
  • the distributing the dynamic haptic effect can be based on the following functionality.
  • a first actuator value can be received.
  • the first actuator value can correspond to a first actuator of the plurality of actuators, and the first actuator value can be stored within the first key frame.
  • a second actuator value can also be received.
  • the second actuator value can correspond to a second actuator of the plurality of actuators, and the second actuator value can be stored within the second key frame.
  • one or more additional actuator values can be received.
  • An interpolated actuator value can be interpolated from the first actuator value and the second actuator value.
  • the interpolated actuator value can be calculated by interpolating the first actuator value with the second actuator value, using an interpolation function.
  • an interpolated actuator value can alternately be generated by interpolating three or more actuator values based on the abovementioned functionality.
  • the dynamic haptic effect can then be distributed among the first actuator and the second actuator based on the interpolated actuator value.
  • a dynamic haptic effect can alternately be distributed among three or more actuators based on the aforementioned functionality.
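The actuator-value weighting described above can be sketched as follows; this Python fragment is a hypothetical illustration, and the convention that an interpolated actuator value of 0.0 targets the first actuator and 1.0 targets the second is an assumption made for the example:

```python
# Hypothetical sketch of distributing a dynamic haptic effect between
# two actuators using an interpolated actuator value; illustrative only.

def lerp(a, b, t):
    return a + (b - a) * t

def distribute(effect_magnitude, kf1, kf2, interpolant):
    """Split an effect's magnitude between two actuators, weighted by
    an actuator value interpolated from the two key frames."""
    t = (interpolant - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
    # Assumed convention: 0.0 -> all on actuator 1, 1.0 -> all on actuator 2.
    actuator_value = lerp(kf1["actuator_value"], kf2["actuator_value"], t)
    return {
        "actuator_1": effect_magnitude * (1.0 - actuator_value),
        "actuator_2": effect_magnitude * actuator_value,
    }

kf1 = {"interpolant": 0, "actuator_value": 0.0}   # effect fully on actuator 1
kf2 = {"interpolant": 10, "actuator_value": 1.0}  # effect fully on actuator 2
print(distribute(10000, kf1, kf2, 5))  # split evenly at the midpoint
```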
  • a third key frame and fourth key frame are received, where the third key frame includes a third interpolant value and a third haptic effect, and the fourth key frame includes a fourth interpolant value and fourth haptic effect.
  • the distributing the dynamic haptic effect can be based on the following functionality.
  • a first actuator value can be received.
  • the first actuator value can correspond to a first actuator of the plurality of actuators, and the first actuator value can be stored within the first key frame and the third key frame.
  • a second actuator value can be received.
  • the second actuator value can correspond to a second actuator of the plurality of actuators, and the second actuator value can be stored within the second key frame and the fourth key frame.
  • the first key frame and the third key frame can be grouped together.
  • the second key frame and the fourth key frame can be grouped together.
  • a dynamic haptic effect can alternately be distributed among three or more actuators based on the aforementioned functionality.
  • the determining of the dynamic haptic effect can be performed according to the following functionality.
  • a first dynamic haptic effect for the first actuator can be interpolated from the first haptic effect and the third haptic effect.
  • a second dynamic haptic effect for the second actuator can be interpolated from the second haptic effect and the fourth haptic effect.
  • each dynamic haptic effect can alternately be generated by interpolating three or more haptic effects based on the abovementioned functionality.
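The grouping of key frames by target actuator can be sketched as follows; the data layout and the assumption of exactly two key frames per actuator are illustrative simplifications, not the patent's format:

```python
# Hypothetical sketch: group key frames by their target actuator and
# interpolate a separate dynamic effect per actuator; illustrative only.
from collections import defaultdict

def lerp(a, b, t):
    return a + (b - a) * t

def per_actuator_effects(key_frames, interpolant):
    """Group key frames by actuator, then interpolate a dynamic effect
    independently within each group."""
    groups = defaultdict(list)
    for kf in key_frames:
        groups[kf["actuator"]].append(kf)
    effects = {}
    for actuator, kfs in groups.items():
        kfs.sort(key=lambda kf: kf["interpolant"])
        kf1, kf2 = kfs[0], kfs[1]  # simplification: two key frames per group
        t = (interpolant - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
        effects[actuator] = lerp(kf1["magnitude"], kf2["magnitude"], t)
    return effects

key_frames = [
    {"actuator": 1, "interpolant": 0, "magnitude": 0},       # first key frame
    {"actuator": 2, "interpolant": 0, "magnitude": 10000},   # second key frame
    {"actuator": 1, "interpolant": 10, "magnitude": 10000},  # third key frame
    {"actuator": 2, "interpolant": 10, "magnitude": 0},      # fourth key frame
]
print(per_actuator_effects(key_frames, 5))
```

Here the first and third key frames form one group and the second and fourth form the other, mirroring the grouping described above.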
  • the determining of the dynamic haptic effect can be performed according to the following functionality.
  • the dynamic haptic effect can be interpolated from the first haptic effect and the second haptic effect, as previously described.
  • a dynamic haptic effect can alternately be generated by interpolating three or more haptic effects based on the abovementioned functionality.
  • the determining of the dynamic haptic effect can be performed according to the following functionality.
  • Actuator distribution information can be received, where the actuator distribution information indicates how to distribute the dynamic haptic effect among the plurality of actuators.
  • the dynamic haptic effect can then be distributed among the plurality of actuators based on the actuator distribution information.
  • the actuator distribution information can be stored within a haptic effect file that the dynamic haptic effect is stored within.
  • the actuator distribution information can be stored within one or more key frames of the dynamic haptic effect.
  • the actuator distribution information can be stored within a haptic effect that is referenced by one or more of the key frames of the dynamic haptic effect.
  • the actuator distribution information can be determined at run time.
  • FIG. 11 illustrates a flow diagram of the functionality of a haptic effect generation module (such as haptic effect generation module 16 of FIG. 1), according to another embodiment of the invention.
  • the flow begins and proceeds to 1110.
  • a plurality of key frames is received. Each key frame includes a key frame interpolant value, a haptic effect, and a direction value.
  • the flow then proceeds to 1120.
  • an interpolant value is received.
  • the interpolant value is between at least two key frame interpolant values.
  • the flow then proceeds to 1130.
  • a direction is determined for a dynamic haptic effect.
  • the flow then proceeds to 1140.
  • one or more key frames are selected from the plurality of key frames.
  • Each selected key frame includes a direction value that is equal to the direction.
  • the flow then proceeds to 1150.
  • the dynamic haptic effect is determined from the interpolant value and the direction.
  • the determining can include interpolating the dynamic haptic effect from at least two haptic effects of at least two selected key frames. The flow then ends.
  • each key frame can also include a category value.
  • a category can be determined for the dynamic haptic effect.
  • One or more key frames can be further selected from the selected one or more key frames.
  • Each further selected key frame can include a category value that is equal to the category.
  • the dynamic haptic effect can be determined from the interpolant value, the direction, and the category.
  • the determining can include interpolating the dynamic haptic effect from at least two haptic effects of at least two further selected key frames.
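The selection-by-direction flow described above can be sketched as follows; determining the direction by comparing the received interpolant value with a previous one is an assumption made for illustration (the patent does not fix how the direction is determined), and all names in the fragment are hypothetical:

```python
# Hypothetical sketch of the direction-based flow: select only the key
# frames whose direction value matches the determined direction, then
# interpolate between the bracketing pair; illustrative only.

def lerp(a, b, t):
    return a + (b - a) * t

def dynamic_effect(key_frames, interpolant, previous_interpolant):
    # Assumption: direction is inferred from how the interpolant changes.
    direction = "GROW" if interpolant >= previous_interpolant else "DECAY"
    # Select key frames whose direction value equals the direction.
    selected = sorted(
        (kf for kf in key_frames if kf["direction"] == direction),
        key=lambda kf: kf["interpolant"],
    )
    # Find the pair of selected key frames that brackets the interpolant.
    for kf1, kf2 in zip(selected, selected[1:]):
        if kf1["interpolant"] <= interpolant <= kf2["interpolant"]:
            t = (interpolant - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
            return lerp(kf1["magnitude"], kf2["magnitude"], t)
    raise ValueError("interpolant outside selected key frame range")

key_frames = [
    {"direction": "GROW", "interpolant": 0, "magnitude": 0},
    {"direction": "GROW", "interpolant": 10, "magnitude": 10000},
    {"direction": "DECAY", "interpolant": 0, "magnitude": 0},
    {"direction": "DECAY", "interpolant": 10, "magnitude": 4000},
]
print(dynamic_effect(key_frames, 5, previous_interpolant=2))  # growing direction
```

Because only matching key frames are selected, the same interpolant value can produce different effects depending on whether the dynamic haptic effect is growing or decaying.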
  • a system can be provided that generates one or more dynamic haptic effects at a plurality of actuators, where the one or more dynamic haptic effects can be distributed among the plurality of actuators.
  • the system can animate spatial characteristics of the dynamic haptic effect by moving the dynamic haptic effect among different actuators. This can allow the system to add a spatial location as a parameter of a dynamic haptic effect, and can provide for a more robust haptic experience.
  • a system can be provided that generates a dynamic haptic effect using one or more key frames, where each key frame includes a direction property. This can allow a dynamic haptic effect to be modified differently depending on the direction, and thus, can further enhance a haptic experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A system is provided that generates a dynamic haptic effect that includes one or more key frames, where each key frame includes an interpolant value and a haptic effect. The system further receives an interpolant value, where the interpolant value is between at least two interpolant values of at least two key frames. The system further determines the dynamic haptic effect from the interpolant value. The system further distributes the dynamic haptic effect among a plurality of actuators.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 15/791,862 filed on Oct. 24, 2017, which is a continuation of U.S. patent application Ser. No. 13/709,157 filed on Dec. 10, 2012, which issued as U.S. Pat. No. 9,898,084 on Feb. 20, 2018, both of which have been incorporated herein by reference in their entirety.
  • FIELD
  • One embodiment is directed generally to haptic effects, and more particularly, to generating dynamic haptic effects.
  • BACKGROUND
  • Electronic device manufacturers strive to produce a rich interface for users. Conventional devices use visual and auditory cues to provide feedback to a user. In some interface devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user, more generally known collectively as “haptic feedback” or “haptic effects.” Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.
  • Haptic feedback has also been increasingly incorporated in portable electronic devices, referred to as “handheld devices” or “portable devices,” such as cellular telephones, personal digital assistants (“PDAs”), smartphones, and portable gaming devices. For example, some portable gaming applications are capable of vibrating in a manner similar to control devices (e.g., joysticks, etc.) used with larger-scale gaming systems that are configured to provide haptic feedback. Additionally, devices such as cellular telephones and smartphones are capable of providing various alerts to users by way of vibrations. For example, a cellular telephone can alert a user to an incoming telephone call by vibrating. Similarly, a smartphone can alert a user to a scheduled calendar item or provide a user with a reminder for a “to do” list item or calendar appointment. Further, haptic effects can be used to simulate “real world” dynamic events, such as the feel of a bouncing ball in a video game.
  • SUMMARY
  • One embodiment is a system that generates a dynamic haptic effect. The system receives a first key frame including a first interpolant value and a first haptic effect. The system further receives a second key frame including a second interpolant value and a second haptic effect. The system further receives an interpolant value, where the interpolant value is between the first interpolant value and the second interpolant value (or equal to either the first interpolant value or the second interpolant value). The system further determines the dynamic haptic effect from the interpolant value, the first key frame, and the second key frame. The system further distributes the dynamic haptic effect among a plurality of actuators.
  • Another embodiment is a system that generates the dynamic haptic effect. The system receives a plurality of key frames, where each key frame includes a key frame interpolant value, a haptic effect, and a direction value. The system further receives an interpolant value, where the interpolant value is between at least two key frame interpolant values (or equal to one of the at least two key frame interpolant values). The system further determines a direction for the dynamic haptic effect. The system further selects one or more key frames from the plurality of key frames, where each selected key frame includes a direction value that is equal to the direction. The system further determines the dynamic haptic effect from the interpolant value and the direction, where the determining includes interpolating the dynamic haptic effect from at least two haptic effects of at least two selected key frames.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Further embodiments, details, advantages, and modifications will become apparent from the following detailed description of the preferred embodiments, which is to be taken in conjunction with the accompanying drawings.
  • FIG. 1 illustrates a block diagram of a system in accordance with one embodiment of the invention.
  • FIG. 2 illustrates an example dynamic haptic effect definition, according to an embodiment of the invention.
  • FIG. 3 illustrates an example key frame definition, according to an embodiment of the invention.
  • FIG. 4 illustrates an example basis haptic effect storage block, according to an embodiment of the invention.
  • FIG. 5 illustrates an example frame list block, according to an embodiment of the invention.
  • FIG. 6 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to an embodiment of the invention.
  • FIG. 7 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention.
  • FIG. 8 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention.
  • FIG. 9 illustrates an example key frame definition that includes a direction property, according to an embodiment of the invention.
  • FIG. 10 illustrates a flow diagram of the functionality of a haptic effect generation module, according to one embodiment of the invention.
  • FIG. 11 illustrates a flow diagram of the functionality of a haptic effect generation module, according to another embodiment of the invention.
  • DETAILED DESCRIPTION
  • As described below, a “dynamic haptic effect” refers to a haptic effect that evolves over time as it responds to one or more input parameters. Dynamic haptic effects are haptic or vibrotactile effects displayed on haptic devices to represent a change in state of a given input signal. The input signal can be a signal captured by sensors on the device with haptic feedback, such as position, acceleration, pressure, orientation, or proximity, or signals captured by other devices and sent to the haptic device to influence the generation of the haptic effect.
  • A dynamic effect signal can be any type of signal, but does not necessarily have to be complex. For example, a dynamic effect signal may be a simple sine wave that has some property such as phase, frequency, or amplitude that is changing over time or reacting in real time according to a mapping schema which maps an input parameter onto a changing property of the effect signal. An input parameter may be any type of input capable of being provided by a device, and typically may be any type of signal such as a device sensor signal. A device sensor signal may be generated by any means, and typically may be generated by capturing a user gesture with a device. Dynamic effects may be very useful for gesture interfaces, but the use of gestures or sensors are not necessarily required to create a dynamic signal.
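Such a mapping schema can be sketched as follows; the identity mapping of the input parameter onto amplitude and the 150 Hz carrier are arbitrary illustrative choices, not values from the patent:

```python
# Hypothetical sketch of a simple dynamic effect signal: a sine wave
# whose amplitude is driven by an input parameter (e.g. a normalized
# sensor value); the mapping schema shown is illustrative only.
import math

def effect_signal(time_s, input_param, frequency_hz=150.0):
    """Map input_param (0.0..1.0) onto the amplitude of a sine wave."""
    amplitude = input_param  # trivial mapping schema: identity
    return amplitude * math.sin(2.0 * math.pi * frequency_hz * time_s)

# A stronger gesture (larger input parameter) yields a stronger signal.
print(effect_signal(0.001, input_param=0.25))
print(effect_signal(0.001, input_param=1.0))
```

Any other mapping (e.g. mapping a sensor value onto frequency or phase instead of amplitude) would fit the same pattern.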
  • One common scenario that does not involve gestures directly is defining the dynamic haptic behavior of an animated widget. For example, when a user scrolls a list, it is not typically the haptification of the gesture that will feel most intuitive, but instead the motion of the widget in response to the gesture. In the scroll list example, gently sliding the list may generate a dynamic haptic feedback that changes according to the speed of the scrolling, but flinging the scroll bar may produce dynamic haptics even after the gesture has ended. This creates the illusion that the widget has some physical properties and it provides the user with information about the state of the widget such as its velocity or whether it is in motion.
  • A gesture is any movement of the body that conveys meaning or user intent. It will be recognized that simple gestures may be combined to form more complex gestures. For example, bringing a finger into contact with a touch sensitive surface may be referred to as a “finger on” gesture, while removing a finger from a touch sensitive surface may be referred to as a separate “finger off” gesture. If the time between the “finger on” and “finger off” gestures is relatively short, the combined gesture may be referred to as “tapping”; if the time between the “finger on” and “finger off” gestures is relatively long, the combined gesture may be referred to as “long tapping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively large, the combined gesture may be referred to as “swiping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively small, the combined gesture may be referred to as “smearing”, “smudging” or “flicking”. Any number of two dimensional or three dimensional simple or complex gestures may be combined in any manner to form any number of other gestures, including, but not limited to, multiple finger contacts, palm or fist contact, or proximity to the device. A gesture can also be any form of hand movement recognized by a device having an accelerometer, gyroscope, or other motion sensor, and converted to electronic signals. Such electronic signals can activate a dynamic effect, such as shaking virtual dice, where the sensor captures the user intent that generates a dynamic effect.
  • One embodiment is a system that can generate one or more dynamic haptic effects at a plurality of actuators, where the one or more dynamic haptic effects can be distributed among the plurality of actuators. The system can define two or more key frames for a dynamic haptic effect. In one embodiment, the system can allow each key frame to target a specific actuator of the plurality of actuators using an actuator value, interpolate the separate actuator values, generate the dynamic haptic effect by interpolating two or more haptic effects stored within two or more key frames, and then distribute the dynamic haptic effect among the targeted actuators based on the interpolated actuator value. In another embodiment, the system can allow each key frame to target a specific actuator of the plurality of actuators, group together the key frames that target the same actuator, and generate the dynamic haptic effect by independently interpolating the haptic effects stored within the grouped key frames for each actuator. In another embodiment, the system can determine actuator distribution information that indicates how to distribute the dynamic haptic effect among a plurality of actuators. The system can then generate the dynamic haptic effect by interpolating two or more haptic effects stored within two or more key frames, and then distribute the dynamic haptic effect among the plurality of actuators using the actuator distribution information.
  • Another embodiment is a system that can generate a dynamic haptic effect using one or more key frames, where each key frame includes a direction property. The direction property can indicate that the key frame is to be used for a specific direction of a dynamic haptic effect. Based on a determined direction of the dynamic haptic effect, one or more key frames with a direction property that is equal to the determined direction can be used to generate the dynamic haptic effect.
  • One type of a dynamic haptic effect is a haptic effect that can be generated by interpolating a first haptic effect and a second haptic effect based on a dynamic value that is a value between a first interpolant value and a second interpolant value. A dynamic value that is equal to either the first interpolant value or the second interpolant value is considered “between the first interpolant value and the second interpolant value.” More specifically, a value for each parameter of the dynamic haptic effect is calculated by interpolating a value of the parameter of the first haptic effect with a value of the parameter of the second haptic effect, using an interpolation function. The interpolation of each parameter value of the dynamic haptic effect can be based upon where the dynamic value falls between the first interpolant value and the second interpolant value. Dynamic haptic effects are further described in U.S. patent application Ser. No. 13/546,351, filed on Jul. 11, 2012, entitled “GENERATING HAPTIC EFFECTS FOR DYNAMIC EVENTS” (the contents of which are herein incorporated by reference), and in U.S. patent application Ser. No. 13/667,003, filed on Nov. 2, 2012, entitled “ENCODING DYNAMIC HAPTIC EFFECTS” (the contents of which are herein incorporated by reference). The dynamic haptic effect can be encoded using a haptic effect signal, where the haptic effect signal is a representation of the dynamic haptic effect. The haptic effect signal can be persisted on a disk, memory, or any computer-readable storage medium. In the aforementioned embodiments, the two or more key frames associated with a dynamic haptic effect can comprise the one or more input parameters of a dynamic haptic effect produced by a haptic effect signal.
  • FIG. 1 illustrates a block diagram of a system 10 in accordance with one embodiment of the invention. In one embodiment, system 10 is part of a device, and system 10 provides a haptic effect generation functionality for the device. Although shown as a single system, the functionality of system 10 can be implemented as a distributed system. System 10 includes a bus 12 or other communication mechanism for communicating information, and a processor 22 coupled to bus 12 for processing information. Processor 22 may be any type of general or specific purpose processor. System 10 further includes a memory 14 for storing information and instructions to be executed by processor 22. Memory 14 can be comprised of any combination of random access memory (“RAM”), read only memory (“ROM”), static storage such as a magnetic or optical disk, or any other type of computer-readable medium.
  • A computer-readable medium may be any available medium that can be accessed by processor 22 and may include both a volatile and nonvolatile medium, a removable and non-removable medium, a communication medium, and a storage medium. A communication medium may include computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any other form of an information delivery medium known in the art. A storage medium may include RAM, flash memory, ROM, erasable programmable read-only memory (“EPROM”), electrically erasable programmable read-only memory (“EEPROM”), registers, hard disk, a removable disk, a compact disk read-only memory (“CD-ROM”), or any other form of a storage medium known in the art.
  • In one embodiment, memory 14 stores software modules that provide functionality when executed by processor 22. The modules include an operating system 15 that provides operating system functionality for system 10, as well as the rest of a mobile device in one embodiment. The modules further include a haptic effect generation module 16 that generates a dynamic haptic effect, as disclosed in more detail below. In certain embodiments, haptic effect generation module 16 can comprise a plurality of modules, where each individual module provides specific individual functionality for generating a dynamic haptic effect. System 10 will typically include one or more additional application modules 18 to include additional functionality, such as the Integrator™ application by Immersion Corporation.
  • System 10, in embodiments that transmit and/or receive data from remote sources, further includes a communication device 20, such as a network interface card, to provide mobile wireless network communication, such as infrared, radio, Wi-Fi, or cellular network communication. In other embodiments, communication device 20 provides a wired network connection, such as an Ethernet connection or a modem.
  • Processor 22 is further coupled via bus 12 to a display 24, such as a Liquid Crystal Display (“LCD”), for displaying a graphical representation or user interface to a user. The display 24 may be a touch-sensitive input device, such as a touchscreen, configured to send and receive signals from processor 22, and may be a multi-touch touchscreen. Processor 22 may be further coupled to a keyboard or cursor control 28 that allows a user to interact with system 10, such as a mouse or a stylus.
  • System 10, in one embodiment, further includes an actuator 26A. Processor 22 may transmit a haptic signal associated with a generated haptic effect to actuator 26A, which in turn outputs haptic effects such as vibrotactile haptic effects. Actuator 26A includes an actuator drive circuit. Actuator 26A may be, for example, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), a linear resonant actuator (“LRA”), a piezoelectric actuator, a high bandwidth actuator, an electroactive polymer (“EAP”) actuator, an electrostatic friction display, or an ultrasonic vibration generator. In alternate embodiments, system 10 can include one or more additional actuators, in addition to actuator 26A. In the illustrated embodiment, system 10 includes actuator 26B, in addition to actuator 26A. However, this is only an example embodiment, and in other embodiments, system 10 can include additional actuators not illustrated in FIG. 1, or can only include actuator 26A. In other embodiments, a separate device from system 10 includes an actuator that generates the haptic effects, and system 10 sends generated haptic effect signals to that device through communication device 20. Actuators 26A and 26B are examples of a haptic output device, where a haptic output device is a device configured to output haptic effects, such as vibrotactile haptic effects, in response to a drive signal.
  • FIG. 2 illustrates an example dynamic haptic effect definition 200, according to an embodiment of the invention. According to an embodiment, a dynamic haptic effect can be defined to include one or more key frames. A key frame is a representation of a basis haptic effect that can be used to define the dynamic haptic effect. Also according to an embodiment, a haptic effect signal can be generated using the one or more key frames, where the haptic effect signal is a signal that can store one or more key frames. By generating the haptic effect signal using the one or more key frames, the one or more key frames are generated, and subsequently stored within the haptic effect signal. The haptic effect signal can be stored within, and retrieved from, a haptic effect file.
  • A key frame can include a basis haptic effect definition. A basis haptic effect is a haptic effect that can include one or more parameters that define the characteristics of the haptic effect (more specifically, the characteristics of the kinesthetic feedback and/or tactile feedback produced by the haptic effect), where the haptic effect can be a vibratory haptic effect, for example. Examples of the one or more parameters can include a magnitude parameter, a frequency parameter, a period parameter, and a duration parameter. Examples of basis haptic effects can include a “MagSweep haptic effect”, and a “periodic haptic effect.” A MagSweep haptic effect is a haptic effect that produces kinesthetic feedback and/or tactile feedback (such as a vibration). A periodic haptic effect is a haptic effect that produces a repeating kinesthetic feedback and/or tactile feedback (such as a vibration pattern). An example of a repeating pattern includes repeating pulses of certain shapes, such as sinusoidal, rectangular, triangular, sawtooth-up and sawtooth-down.
  • A key frame can include an interpolant value. An interpolant value is a value that specifies where a current interpolation is occurring. In an embodiment, an interpolant value can be an integer value from a minimum value to a maximum value. As an example, an interpolant value can be from 0 to 10,000. In other embodiments, an interpolant value can be a fixed-point or floating-point numeric value. An interpolant value can be stored within one or more bits.
  • A key frame can optionally also include a repeat gap value. A repeat gap value is a value that indicates a time period between two consecutive instances of a basis haptic effect when the basis haptic effect is played consecutively. In one embodiment, a repeat gap can indicate a number of milliseconds between two consecutive instances of the basis haptic effect.
  • In the illustrated embodiment, dynamic haptic effect definition 200 includes four key frames, key frames 210, 220, 230, and 240. However, this is merely an example embodiment, and in alternate embodiments, a dynamic haptic effect definition can include any number of key frames. Key frame 210 includes a basis haptic effect reference of “Periodic1,” an interpolant value of “0,” and a repeat gap value of “10 ms”. The basis haptic effect reference “Periodic1” refers to basis haptic effect 260, which is also included within dynamic haptic effect definition 200. Thus, key frame 210 defines basis haptic effect 260 as the basis haptic effect for the interpolant value of “0.” Key frame 210 further indicates that when basis haptic effect 260 is played consecutively, there is a time period of 10 ms between each consecutive instance of basis haptic effect 260. Similarly, key frame 220 includes a basis haptic effect reference of “Periodic3,” an interpolant value of “10,” and a repeat gap value of “15 ms”. The basis haptic effect reference “Periodic3” refers to basis haptic effect 270, which is also included within dynamic haptic effect definition 200. Thus, key frame 220 defines basis haptic effect 270 as the basis haptic effect for the interpolant value of “10.” Key frame 220 further indicates that when basis haptic effect 270 is played consecutively, there is a time period of 15 ms between each consecutive instance of basis haptic effect 270.
  • Likewise, key frame 230 includes a basis haptic effect reference of “Periodic1,” an interpolant value of “20,” and a repeat gap value of “5 ms”. As previously described, the basis haptic effect reference “Periodic1” refers to basis haptic effect 260, which is also included within dynamic haptic effect definition 200. Thus, key frame 230 defines basis haptic effect 260 as the basis haptic effect for the interpolant value of “20.” This illustrates that a basis haptic effect can be defined as a basis haptic effect for more than one interpolant value. Key frame 230 further indicates that when basis haptic effect 260 is played consecutively, there is a time period of 5 ms between each consecutive instance of basis haptic effect 260. Similarly, key frame 240 includes a basis haptic effect reference of “Periodic2,” an interpolant value of “30,” and a repeat gap value of “20 ms”. The basis haptic effect reference “Periodic2” refers to basis haptic effect 280, which is also included within dynamic haptic effect definition 200. Thus, key frame 240 defines basis haptic effect 280 as the basis haptic effect for the interpolant value of “30.” Key frame 240 further indicates that when basis haptic effect 280 is played consecutively, there is a time period of 20 ms between each consecutive instance of basis haptic effect 280.
  • According to an embodiment, a dynamic haptic effect can be defined to also include an indication of an end of the dynamic haptic effect. The indication of the end of the dynamic haptic effect indicates that the dynamic haptic effect does not include any additional key frames. As described below in greater detail, a device that interprets a dynamic haptic effect definition can be configured to interpret the contents of the dynamic haptic effect definition sequentially. Thus, the indication can indicate to a device the end of the dynamic haptic effect definition. In one embodiment, the indication of an end of the dynamic haptic effect can be considered an additional key frame. In the illustrated embodiment, dynamic haptic effect definition 200 includes end of dynamic haptic effect definition 250 which indicates the end of dynamic haptic effect definition 200.
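To make the key frame structure concrete, the dynamic haptic effect definition described above can be modeled as a small data structure. The sketch below is illustrative only; the class and field names (KeyFrame, DynamicHapticEffect, repeat_gap_ms) are assumptions and not part of any file format described here:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class KeyFrame:
    effect_name: str                      # reference to a basis haptic effect, e.g. "Periodic1"
    interpolant: int                      # interpolant value of this key frame
    repeat_gap_ms: Optional[int] = None   # optional gap between consecutive instances

@dataclass
class DynamicHapticEffect:
    key_frames: List[KeyFrame]
    basis_effects: Dict[str, object]      # maps effect names to basis haptic effect data

# The four key frames of dynamic haptic effect definition 200:
effect_200 = DynamicHapticEffect(
    key_frames=[
        KeyFrame("Periodic1", 0, 10),
        KeyFrame("Periodic3", 10, 15),
        KeyFrame("Periodic1", 20, 5),
        KeyFrame("Periodic2", 30, 20),
    ],
    basis_effects={"Periodic1": None, "Periodic2": None, "Periodic3": None},  # message data omitted
)
```

Note that "Periodic1" is referenced by two key frames, illustrating that one basis haptic effect can serve more than one interpolant value.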
  • FIG. 3 illustrates an example key frame definition 300, according to an embodiment of the invention. As previously described, a dynamic haptic effect definition includes one or more key frames. According to the embodiment, a key frame definition can include one or more properties. Each property of the one or more properties can include a value.
  • A key frame definition can include a type property. In one embodiment, the type property is the first property of the key frame definition. The type property can indicate whether the key frame is a key frame that contains a basis haptic effect for a dynamic haptic effect definition or a key frame that indicates an end of the dynamic haptic effect definition. In the illustrated embodiment, key frame definition 300 includes type property 310 which indicates the type of key frame defined by key frame definition 300.
  • A key frame definition can also include a basis haptic effect property. The basis haptic effect property can store a reference to a basis haptic effect for the key frame. In the illustrated embodiment, key frame definition 300 includes basis haptic effect property 320 (identified in FIG. 3 as “effect name”) which includes a reference to a basis haptic effect for the key frame defined by key frame definition 300.
  • A key frame definition can also include an interpolant property. The interpolant property can store an interpolant value, where the interpolant value specifies where a current interpolation is occurring. In an embodiment, an interpolant value can be an integer value from a minimum value to a maximum value. As an example, an interpolant value can be from 0 to 10,000. The interpolant value can be stored in one or more bits. In the illustrated embodiment, key frame definition 300 includes interpolant property 330 which includes an interpolant value for the key frame defined by key frame definition 300.
  • A key frame definition can also optionally include a repeat gap property (not illustrated in FIG. 3). The repeat gap property can store a repeat gap value which indicates a time period between two consecutive instances of a basis haptic effect for a key frame when the basis haptic effect is played consecutively. In one embodiment, a repeat gap can indicate a number of milliseconds between two consecutive instances of the basis haptic effect for the key frame.
  • In one embodiment, a haptic effect file is a computer file configured to store one or more dynamic haptic effects, where the haptic effect file can be persisted on a disk, memory, or any computer-readable storage medium. According to the embodiment, a haptic effect file can store one or more dynamic haptic effect definitions using a basis haptic effect storage block and a frame list block. A basis haptic effect storage block can be used to store one or more basis haptic effects that a dynamic haptic effect can reference. A frame list block can be used to store one or more key frame definitions that correspond to a dynamic haptic effect definition. A basis haptic effect storage block and a frame list block are now described in greater detail.
  • FIG. 4 illustrates an example basis haptic effect storage block 400, according to an embodiment of the invention. As previously described, a dynamic haptic effect definition can include one or more basis haptic effects, where at least one stored basis haptic effect is referenced by at least one key frame of the dynamic haptic effect definition. In one embodiment, the one or more basis haptic effects can be stored within a basis haptic effect storage block, such as basis haptic effect storage block 400, where the basis haptic effect storage block is stored within the dynamic haptic effect definition.
  • According to the embodiment, one or more basis haptic effects can be stored as message streams within basis haptic effect storage block 400. An example messaging format is a “codename z2” protocol messaging format. In the illustrated embodiment, a basis haptic effect is defined by a SetPeriodic message optionally preceded by a SetPeriodicModifier message. Thus, when a basis haptic effect has an associated envelope, a SetPeriodicModifier message can appear before a SetPeriodic message in the block. Otherwise, only a SetPeriodic message can appear in the block. Thus, according to the embodiment, a basis haptic effect, as stored in a basis haptic effect storage block (such as basis haptic effect storage block 400 of FIG. 4), can either take up: (a) 8 bytes of memory in a single SetPeriodic message (assuming a default envelope); or (b) 16 bytes of memory in a first SetPeriodicModifier message followed by a subsequent SetPeriodic message.
  • According to the embodiment, a basis haptic effect storage block (such as basis haptic effect storage block 400 of FIG. 4) can include one or more basis haptic effect definitions, where each basis haptic effect definition corresponds to a basis haptic effect. The one or more basis haptic effect definitions can be sequential within the basis haptic effect storage block, and can each be associated with an index.
  • In the illustrated embodiment, basis haptic effect storage block 400 includes five basis haptic effects: Effect0, Effect1, Effect2, Effect3, and Effect4, which are located first through fifth, respectively, within basis haptic effect storage block 400. Each of the five basis haptic effects includes a basis haptic effect definition that either includes a single SetPeriodic message or a combination of a SetPeriodicModifier message and a SetPeriodic message.
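The sequential 8- and 16-byte layout can be sketched as follows. The helper name pack_storage_block, the tuple input format, and the zeroed placeholder bytes standing in for the actual message contents are all assumptions for illustration:

```python
# Lay out basis haptic effect definitions sequentially in a storage block:
# 8 bytes for a bare SetPeriodic message, 16 bytes when a SetPeriodicModifier
# message (envelope) precedes it. Message contents are zeroed placeholders.

def pack_storage_block(effects):
    """effects: list of (name, has_envelope) tuples. Returns (block, offsets)."""
    block = bytearray()
    offsets = {}
    for name, has_envelope in effects:
        offsets[name] = len(block)       # byte offset from the base of the block
        if has_envelope:
            block += bytes(8)            # SetPeriodicModifier message (placeholder)
        block += bytes(8)                # SetPeriodic message (placeholder)
    return bytes(block), offsets

block, offsets = pack_storage_block([
    ("Effect0", False), ("Effect1", True), ("Effect2", False),
    ("Effect3", True), ("Effect4", False),
])
# Effect0 at offset 0, Effect1 at 8, Effect2 at 24, Effect3 at 32, Effect4 at 48
```

Every offset produced this way is a multiple of 8, which is consistent with the OFFSET11_3 encoding of the frame list block, where the stored byte is multiplied by 8 to obtain the actual address offset.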
  • FIG. 5 illustrates an example frame list block 500, according to an embodiment of the invention. As previously described, a dynamic haptic effect definition can include one or more key frames, where each key frame can reference a basis haptic effect. In one embodiment, the one or more key frames can be stored within a frame list block, such as frame list block 500, where the frame list block is stored within the dynamic haptic effect definition.
  • According to the embodiment, a frame list block, such as frame list block 500, includes a type property for a first key frame definition. Depending on the type property, the frame list block further includes one or more properties associated with the first key frame definition, such as a basis haptic effect property, an interpolant property, a repeat gap property, or a combination thereof. The frame list block further includes a type property for a second key frame definition, which indicates the end of the first key frame definition. Depending on the type property, the frame list block further includes one or more properties associated with the second key frame definition, such as a basis haptic effect property, an interpolant property, a repeat gap property, or a combination thereof. This continues for each key frame definition of the frame list block. The frame list block further includes a type property that indicates an end of a dynamic haptic effect. According to the embodiment, the key frame definitions of the frame list block are in sequential order. In other words, the events of the frame list block are processed in the order in which they are located within the frame list block.
  • According to the embodiment, one or more properties of the frame list block can be encoded using a single header byte, followed by optional data bytes. An example encoding scheme of one or more properties of a frame list block is as follows:
  • Key Frame Type Property

    Byte #  Bits 7-0  Meaning
    0       0xC1      Type = Key Frame. No data associated with this property.

  • End of Dynamic Haptic Effect Type Property

    Byte #  Bits 7-0  Meaning
    0       0xCF      Type = End of Dynamic Haptic Effect. No data associated with this property.

  • EffectNameAsOffsetU8 Property

    Byte #  Bits 7-0    Meaning
    0       0xE0        EffectName is specified as a memory offset from the base of the basis haptic effect storage block.
    1       OFFSET11_3  The offset from the base of the basis haptic effect storage block where the 8- or 16-byte basis haptic effect definition can be found. The OFFSET11_3 value can be multiplied by 8 to obtain the actual address offset.

  • InterpolantU16 Property

    Byte #  Bits 7-0  Meaning
    0       0xE6      Interpolant is stored as a 16-bit unsigned integer.
    1       TIME15_8  The MSByte of the interpolant value.
    2       TIME7_0   The LSByte of the interpolant value.

  • RepeatGapU16 Property

    Byte #  Bits 7-0    Meaning
    0       0xE2        Repeat gap value is stored as an unsigned 16-bit value, in milliseconds.
    1       PERIOD15_8  The MSByte of the repeat gap value.
    2       PERIOD7_0   The LSByte of the repeat gap value.
  • According to the embodiment, the key frame type property and the end of dynamic haptic effect type property correspond to a type property of a key frame definition, the EffectNameAsOffsetU8 property corresponds to a basis haptic effect property of the key frame definition, the InterpolantU16 property corresponds to an interpolant property of the key frame definition, and the RepeatGapU16 property corresponds to a repeat gap property of the key frame definition.
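A minimal decoder for this encoding scheme might look as follows. The header-byte values (0xC1, 0xCF, 0xE0, 0xE6, 0xE2) are taken from the tables above; the function name and the dictionary-based key frame representation are illustrative assumptions:

```python
# Decode a frame list block byte stream into a list of key frame dictionaries.
def decode_frame_list(data):
    frames, frame, i = [], None, 0
    while i < len(data):
        header = data[i]; i += 1
        if header == 0xC1:                        # key frame type property: start a key frame
            if frame is not None:
                frames.append(frame)
            frame = {}
        elif header == 0xCF:                      # end of dynamic haptic effect type property
            if frame is not None:
                frames.append(frame)
            break
        elif header == 0xE0:                      # EffectNameAsOffsetU8: OFFSET11_3 scaled by 8
            frame["effect_offset"] = data[i] * 8
            i += 1
        elif header == 0xE6:                      # InterpolantU16: MSByte then LSByte
            frame["interpolant"] = (data[i] << 8) | data[i + 1]
            i += 2
        elif header == 0xE2:                      # RepeatGapU16: milliseconds, MSByte then LSByte
            frame["repeat_gap_ms"] = (data[i] << 8) | data[i + 1]
            i += 2
        else:
            raise ValueError(f"unknown header byte 0x{header:02X}")
    return frames

# Two key frames followed by an end-of-dynamic-haptic-effect marker:
stream = bytes([0xC1, 0xE0, 0x01, 0xE6, 0x00, 0x0A, 0xE2, 0x00, 0x0F,
                0xC1, 0xE0, 0x03, 0xE6, 0x00, 0x14,
                0xCF])
frames = decode_frame_list(stream)
# frames[0]: effect at offset 8, interpolant 10, repeat gap 15 ms
# frames[1]: effect at offset 24, interpolant 20, no repeat gap
```

As in the embodiment, the stream is interpreted strictly sequentially, and the second 0xC1 header both starts the second key frame and implicitly ends the first.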
  • In the illustrated embodiment, frame list block 500 includes key frame definitions 510, 520, and 530. Key frame definitions 510 and 520 are each a definition for a basis haptic effect key frame. Key frame definition 530 is an indication of an end of the dynamic haptic effect stored within frame list block 500. The left column of frame list block 500 indicates a byte stream that is found in memory for each of key frame definitions 510, 520, and 530. The right column of frame list block 500 indicates a meaning of each property for each of key frame definitions 510, 520, and 530.
  • According to the illustrated embodiment, key frame definition 510 includes a key frame type property (“KeyFrame Event” as illustrated in FIG. 5) that indicates a start of key frame definition 510. Key frame definition 510 further includes a basis haptic effect property (“EffectNameAsOffsetU8” as illustrated in FIG. 5) that stores a reference to a basis haptic effect for key frame definition 510, where the basis haptic effect property includes a header byte and an offset byte. Key frame definition 510 further includes an interpolant property (“InterpolantU16” as illustrated in FIG. 5) that stores an interpolant value specifying where a current interpolation is occurring, where the interpolant property includes a header byte, a most significant byte (“MSByte”), and a least significant byte (“LSByte”). Key frame definition 510 further includes a repeat gap property (“RepeatGapU16” as illustrated in FIG. 5) that stores a repeat gap value which indicates a time period between two consecutive instances of a basis haptic effect for a key frame, where the repeat gap property includes a header byte, an MSByte, and an LSByte.
  • Further, key frame definition 520 also includes a key frame type property (“KeyFrame Event” as illustrated in FIG. 5) that indicates a start of key frame definition 520. Key frame definition 520 further includes a basis haptic effect property (“EffectNameAsOffsetU16” as illustrated in FIG. 5) that stores a reference to a basis haptic effect for key frame definition 520, where the basis haptic effect property includes a header byte, a basis haptic effect definition MSByte, and a basis haptic effect definition LSByte. Key frame definition 520 further includes an interpolant property (“InterpolantU16” as illustrated in FIG. 5) that stores an interpolant value specifying where a current interpolation is occurring, where the interpolant property includes a header byte, an MSByte, and an LSByte. As illustrated in FIG. 5, in contrast to key frame definition 510, key frame definition 520 does not include a repeat gap property. Finally, key frame definition 530 includes an end of dynamic haptic effect type property (“EndofDynamicHapticEffect” as illustrated in FIG. 5) that indicates an end of the dynamic haptic effect definition.
  • According to an embodiment, a dynamic haptic effect definition (such as dynamic haptic effect definition 200 of FIG. 2) can be stored within a haptic effect file. As previously described, a haptic effect file is a computer file configured to store one or more dynamic haptic effects. The dynamic haptic effect definition can be stored within the haptic effect file, and the haptic effect file can be persisted within a computer-readable medium, such as a disk or memory. The dynamic haptic effect definition can subsequently be retrieved from the haptic effect file and interpreted. Based on the interpretation of the dynamic haptic effect definition, a dynamic haptic effect can be generated by interpolating a first haptic effect and a second haptic effect based on a dynamic value that is a value between a first interpolant value and a second interpolant value. More specifically, a value for each parameter of the dynamic haptic effect can be calculated by interpolating a value of the parameter of the first haptic effect with a value of the parameter of the second haptic effect, using an interpolation function. The interpolation of each parameter value of the dynamic haptic effect can be based upon where the dynamic value falls between the first interpolant value and the second interpolant value. For example, where a first interpolant value is “0” and a second interpolant value is “100,” a dynamic value of “50” can cause a first haptic effect associated with the first interpolant value of “0” to be interpolated with a second haptic effect associated with the second interpolant value of “100” to create the dynamic haptic effect. Each parameter value of the first haptic effect can be interpolated with a parameter value of the second haptic effect based on an interpolation function, so that the parameter values of the dynamic haptic effect are based both on the parameter values of the first haptic effect and the parameter values of the second haptic effect.
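The per-parameter interpolation described in this paragraph can be sketched as follows. Linear interpolation is assumed here for concreteness; the embodiment only requires some interpolation function, and the parameter names are illustrative:

```python
# Interpolate every parameter of two haptic effects based on where a dynamic
# value falls between their two interpolant values (linear interpolation assumed).
def interpolate_effect(effect1, effect2, interp1, interp2, dynamic_value):
    t = (dynamic_value - interp1) / (interp2 - interp1)
    return {param: effect1[param] + t * (effect2[param] - effect1[param])
            for param in effect1}

# First haptic effect at interpolant value 0, second at interpolant value 100;
# a dynamic value of 50 falls halfway between them:
result = interpolate_effect(
    {"magnitude": 0.2, "duration_ms": 100},
    {"magnitude": 1.0, "duration_ms": 200},
    interp1=0, interp2=100, dynamic_value=50,
)
# magnitude is roughly 0.6; duration_ms is 150.0
```

Each parameter of the result depends on both input effects, matching the description above.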
  • Also according to an embodiment, a dynamic haptic effect definition (such as dynamic haptic effect definition 200 of FIG. 2) can be used to generate a haptic effect signal, where the haptic effect signal can be stored within a haptic effect file. The haptic effect signal can subsequently be retrieved from the haptic effect file. Further, a drive signal can be applied to a haptic output device according to the haptic effect signal, and the drive signal can be generated using the haptic output device.
  • As previously described, a system can generate a dynamic haptic effect and distribute the haptic effect among a plurality of actuators. The generation and distribution of the dynamic haptic effect can be implemented using different techniques. Example implementations are described below.
  • FIG. 6 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to an embodiment of the invention. According to the embodiment, a dynamic haptic effect can include one or more key frames, and each key frame can include an actuator value that identifies an actuator. The actuator identified by the actuator value is an actuator that the key frame targets. When a system interpolates each haptic effect stored in two or more key frames to generate the dynamic haptic effect, the system can also interpolate each actuator value stored in the two or more key frames to generate an interpolated actuator value. The system can then use the interpolated actuator value to distribute the generated dynamic haptic effect among two or more actuators identified by the two or more actuator values.
  • For example, as illustrated in FIG. 6, dynamic haptic effect 610 includes key frames 611 and 612. Key frame 611 includes an interpolant value of “40%,” a magnitude value of “20%,” a duration value of “100 ms,” and an actuator value (identified in FIG. 6 as an “actuator index”) of “1.” According to the embodiment, the magnitude value of “20%” and the duration value of “100 ms” collectively represent a basis haptic effect that is stored within key frame 611. Of course, the illustrated embodiment is only an example embodiment, and, in alternate embodiments, a basis haptic effect can include other parameters (such as a frequency parameter and a period parameter) that are stored within a key frame. Key frame 612 includes an interpolant value of “80%,” a magnitude value of “100%,” a duration value of “200 ms,” and an actuator value of “2.” According to the embodiment, the magnitude value of “100%” and the duration value of “200 ms” collectively represent a basis haptic effect that is stored within key frame 612.
  • According to the embodiment, the system can receive an interpolant value of “70%” at effect interpolator 620, and interpolate the basis haptic effect of key frame 611 (i.e., the magnitude value of “20%,” and the duration value of “100 ms,”) with the basis haptic effect of key frame 612 (i.e., the magnitude value of “100%,” and the duration value of “200 ms”). The system can further interpolate the actuator value of key frame 611 (i.e., the actuator value “1”) with the actuator value of key frame 612 (i.e., the actuator value “2”). The result of these two interpolations is interpolated parameters 630, where interpolated parameters 630 represent dynamic haptic effect 610.
  • According to the illustrated embodiment, interpolated parameters 630 include an interpolated magnitude value of “80%,” an interpolated duration value of “175 ms,” and an interpolated actuator value (identified in FIG. 6 as an “actuator index”) of “1.75.” According to the embodiment, the interpolated magnitude value of “80%,” is interpolated from the two magnitude values of “20%” and “100%,” based on an interpolation function. Further, the interpolated duration value of “175 ms” is interpolated from the two duration values of “100 ms” and “200 ms,” also based on an interpolation function. Additionally, the interpolated actuator value of “1.75” is interpolated from the two actuator values of “1,” and “2,” also based on an interpolation function.
  • Further, according to the embodiment, at effect distributor 640, the system can distribute interpolated parameters 630, which represent dynamic haptic effect 610, among a plurality of actuators based on the interpolated actuator value, where the system generates targeted parameters for each actuator of the plurality of actuators based on the interpolated actuator value. The targeted parameters for each actuator represent a portion of dynamic haptic effect 610 that is distributed to the actuator. In the illustrated embodiment, at effect distributor 640, the system generates targeted parameters 650 and targeted parameters 660. Targeted parameters 650 include a magnitude value of “20%,” a duration value of “175 ms,” and an actuator value of “1.” Targeted parameters 660 include a magnitude value of “60%,” a duration value of “175 ms,” and an actuator value of “2.” According to the embodiment, the system can use the interpolated actuator value “1.75” of interpolated parameters 630 to distribute 25% of dynamic haptic effect 610 to actuator “1,” and to distribute 75% of dynamic haptic effect 610 to actuator “2.” In one embodiment, the strength parameters of dynamic haptic effect 610, such as a magnitude parameter and a period parameter, are distributed among the plurality of actuators. According to the embodiment, the time-based parameters, such as a frequency parameter and a duration parameter, remain the same for the plurality of actuators.
  • Thus, in the illustrated embodiment, the interpolated magnitude value “80%” of interpolated parameters 630 is distributed into a magnitude value of “20%” for targeted parameters 650, and a magnitude value of “60%” for targeted parameters 660. Further, in the illustrated embodiment, the duration value “175 ms” of interpolated parameters 630 is included in both targeted parameters 650 and targeted parameters 660. While the illustrated embodiment involves two key frames, where each key frame includes an actuator value, and thus, involves two actuators, one of ordinary skill in the art would readily appreciate that, in alternate embodiments, the above implementation can involve any number of key frames, where each key frame includes an actuator value, and thus, can include any number of actuators.
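Under the assumption of linear interpolation, the FIG. 6 pipeline (effect interpolator 620 followed by effect distributor 640) can be sketched as follows; the function and field names are illustrative:

```python
# Interpolate effect parameters and the actuator index together, then use the
# fractional position of the interpolated actuator index to split the strength
# (magnitude) between the two actuators. Time-based parameters (duration) are
# kept the same for both actuators, as described above.

def lerp(a, b, t):
    return a + t * (b - a)

def interpolate_and_distribute(kf1, kf2, dynamic_value):
    t = (dynamic_value - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
    magnitude = lerp(kf1["magnitude"], kf2["magnitude"], t)
    duration = lerp(kf1["duration_ms"], kf2["duration_ms"], t)
    actuator = lerp(kf1["actuator"], kf2["actuator"], t)   # e.g. 1.75
    frac = actuator - kf1["actuator"]                      # share for the second actuator
    return {
        kf1["actuator"]: {"magnitude": magnitude * (1 - frac), "duration_ms": duration},
        kf2["actuator"]: {"magnitude": magnitude * frac, "duration_ms": duration},
    }

# Key frames 611 and 612 with an interpolant value of 70%:
targets = interpolate_and_distribute(
    {"interpolant": 40, "magnitude": 20, "duration_ms": 100, "actuator": 1},
    {"interpolant": 80, "magnitude": 100, "duration_ms": 200, "actuator": 2},
    dynamic_value=70,
)
```

With these inputs the interpolated actuator index is 1.75, so actuator 1 receives 25% of the 80% interpolated magnitude (20%) and actuator 2 receives 75% (60%), matching targeted parameters 650 and 660.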
  • Thus, in another example, a first actuator can be on a left side of a device, and a second actuator can be on a right side of a device. A first key frame of a dynamic haptic effect can be played at the first actuator. A second key frame of a dynamic haptic effect can be played at the second actuator. As a system interpolates from the first key frame to the second key frame to generate the dynamic haptic effect, a user of the device can feel the dynamic haptic effect move from the left actuator to the right actuator.
  • FIG. 7 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention. According to the embodiment, a dynamic haptic effect can include one or more key frames, and each key frame can include an actuator value that identifies an actuator. The actuator identified by the actuator value is an actuator that the key frame targets. A system can then group the one or more key frames by each actuator identified by each actuator value. More specifically, the system can create one or more groups of key frames, where each key frame of a group targets the same actuator. For each actuator, the system can independently interpolate each haptic effect stored in the one or more key frames of the group associated with the actuator, in order to generate the dynamic haptic effect.
  • For example, as illustrated in FIG. 7, dynamic haptic effect 710 includes key frames 711, 712, 713, and 714. Key frame 711 includes an interpolant value of “40%,” a magnitude value of “20%,” a duration value of “100 ms,” and an actuator value (identified in FIG. 7 as an “actuator index”) of “1.” The magnitude value of “20%” and the duration value of “100 ms” collectively represent a basis haptic effect that is stored within key frame 711. Of course, the illustrated embodiment is only an example embodiment, and, in alternate embodiments, a basis haptic effect can include other parameters (such as a frequency parameter and a period parameter) that are stored within a key frame. Key frame 712 includes an interpolant value of “30%,” a magnitude value of “80%,” a duration value of “200 ms,” and an actuator value of “2.” The magnitude value of “80%” and the duration value of “200 ms” collectively represent a basis haptic effect that is stored within key frame 712. Key frame 713 includes an interpolant value of “80%,” a magnitude value of “100%,” a duration value of “200 ms,” and an actuator value of “1.” The magnitude value of “100%” and the duration value of “200 ms” collectively represent a basis haptic effect that is stored within key frame 713. Key frame 714 includes an interpolant value of “90%,” a magnitude value of “50%,” a duration value of “50 ms,” and an actuator value of “2.” The magnitude value of “50%” and the duration value of “50 ms” collectively represent a basis haptic effect that is stored within key frame 714.
  • According to the embodiment, at effect grouper 720, the system can group together key frames 711 and 713 based on the actuator value of “1” stored within key frames 711 and 713, and associate the group with actuator 721 (i.e., actuator 1). Further, at effect grouper 720, the system can group together key frames 712 and 714 based on the actuator value of “2” stored within key frames 712 and 714, and associate the group with actuator 722 (i.e., actuator 2).
  • Further, according to the embodiment, the system can receive an interpolant value of “70%” at effect interpolator 730, and interpolate the basis haptic effect of key frame 711 (i.e., the magnitude value of “20%,” and the duration value of “100 ms,”) with the basis haptic effect of key frame 713 (i.e., the magnitude value of “100%,” and the duration value of “200 ms”). The result of this interpolation is interpolated parameters 750, where interpolated parameters 750 represent a portion of dynamic haptic effect 710 that can be output at actuator 721. According to the illustrated embodiment, interpolated parameters 750 include an interpolated magnitude value of “80%,” and an interpolated duration value of “175 ms”. According to the embodiment, the interpolated magnitude value of “80%,” is interpolated from the two magnitude values of “20%” and “100%,” based on an interpolation function. Further, the interpolated duration value of “175 ms” is interpolated from the two duration values of “100 ms” and “200 ms,” also based on an interpolation function.
  • Also according to the embodiment, the system can independently receive an interpolant value of “70%” at effect interpolator 740, and independently interpolate the basis haptic effect of key frame 712 (i.e., the magnitude value of “80%,” and the duration value of “200 ms,”) with the basis haptic effect of key frame 714 (i.e., the magnitude value of “50%,” and the duration value of “50 ms”). The result of this interpolation is interpolated parameters 760, where interpolated parameters 760 represent a portion of dynamic haptic effect 710 that can be output at actuator 722. According to the illustrated embodiment, interpolated parameters 760 include an interpolated magnitude value of “60%,” and an interpolated duration value of “100 ms”. According to the embodiment, the interpolated magnitude value of “60%,” is interpolated from the two magnitude values of “80%” and “50%,” based on an interpolation function. Further, the interpolated duration value of “100 ms” is interpolated from the two duration values of “200 ms” and “50 ms,” also based on an interpolation function. While the illustrated embodiment involves four key frames, and involves two actuators, one of ordinary skill in the art would readily appreciate that, in alternate embodiments, the above implementation can involve any number of key frames, and can include any number of actuators.
  • According to the embodiment, while the interpolation of the basis haptic effects of key frames 711 and 713, and the interpolation of the basis haptic effect of key frames 712 and 714, are performed independently, the independent interpolations can also be synchronized to output dynamic haptic effect 710 at actuators 721 and 722. Thus, actuators 721 and 722 can each output a respective portion of dynamic haptic effect 710 in a synchronized manner. In the illustrated embodiment, the same interpolant value is provided to effect interpolators 730 and 740. However, in alternate embodiments, effect interpolators 730 and 740 can each be provided a different interpolant value.
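Under the same linear-interpolation assumption, the FIG. 7 approach (effect grouper 720 followed by independent effect interpolators 730 and 740) can be sketched as follows; the function and field names are illustrative:

```python
# Group key frames by the actuator they target, then interpolate each group
# independently at the same interpolant value (linear interpolation assumed).

def lerp(a, b, t):
    return a + t * (b - a)

def interpolate_group(kf1, kf2, dynamic_value):
    t = (dynamic_value - kf1["interpolant"]) / (kf2["interpolant"] - kf1["interpolant"])
    return {"magnitude": lerp(kf1["magnitude"], kf2["magnitude"], t),
            "duration_ms": lerp(kf1["duration_ms"], kf2["duration_ms"], t)}

# Key frames 711-714 of dynamic haptic effect 710:
key_frames = [
    {"interpolant": 40, "magnitude": 20, "duration_ms": 100, "actuator": 1},   # 711
    {"interpolant": 30, "magnitude": 80, "duration_ms": 200, "actuator": 2},   # 712
    {"interpolant": 80, "magnitude": 100, "duration_ms": 200, "actuator": 1},  # 713
    {"interpolant": 90, "magnitude": 50, "duration_ms": 50, "actuator": 2},    # 714
]

# Group key frames by targeted actuator (effect grouper 720):
groups = {}
for kf in sorted(key_frames, key=lambda k: k["interpolant"]):
    groups.setdefault(kf["actuator"], []).append(kf)

# Interpolate each group independently at the same interpolant value of 70%:
per_actuator = {act: interpolate_group(g[0], g[1], 70) for act, g in groups.items()}
# actuator 1: magnitude 80%, duration 175 ms; actuator 2: magnitude ~60%, duration ~100 ms
```

The two interpolations are independent, but feeding both the same interpolant value (as in the illustrated embodiment) keeps the portions of the dynamic haptic effect synchronized across the actuators.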
  • FIG. 8 illustrates a block diagram of an example implementation of generating a dynamic haptic effect at multiple actuators, according to another embodiment of the invention. According to the embodiment, a dynamic haptic effect can include one or more key frames. When a system interpolates each haptic effect stored in two or more key frames to generate the dynamic haptic effect, the system can also determine actuator distribution information that indicates how to distribute the dynamic haptic effect among a plurality of actuators. The system can then interpolate each haptic effect stored in the two or more key frames to generate the dynamic haptic effect, and can then use the actuator distribution information to distribute the generated dynamic haptic effect among the plurality of actuators.
  • For example, as illustrated in FIG. 8, dynamic haptic effect 810 includes key frames 811 and 812. Key frame 811 includes an interpolant value of “40%,” a magnitude value of “20%,” and a duration value of “100 ms.” According to the embodiment, the magnitude value of “20%” and the duration value of “100 ms” collectively represent a basis haptic effect that is stored within key frame 811. Of course, the illustrated embodiment is only an example embodiment, and, in alternate embodiments, a basis haptic effect can include other parameters (such as a frequency parameter and a period parameter) that are stored within a key frame. Key frame 812 includes an interpolant value of “80%,” a magnitude value of “100%,” and a duration value of “200 ms”. According to the embodiment, the magnitude value of “100%” and the duration value of “200 ms” collectively represent a basis haptic effect that is stored within key frame 812. Dynamic haptic effect 810 further includes actuator distribution information 813 (identified in FIG. 8 as “actuator distribution 813”). Actuator distribution information 813 indicates how to distribute dynamic haptic effect 810 among a plurality of actuators. In the illustrated embodiment, actuator distribution information 813 indicates that 25% of dynamic haptic effect 810 is to be distributed to a first actuator (i.e., Actuator 1), and that 75% of dynamic haptic effect 810 is to be distributed to a second actuator (i.e., Actuator 2).
  • According to the embodiment, the system can receive an interpolant value of “70%” at effect interpolator 820, and interpolate the basis haptic effect of key frame 811 (i.e., the magnitude value of “20%,” and the duration value of “100 ms,”) with the basis haptic effect of key frame 812 (i.e., the magnitude value of “100%,” and the duration value of “200 ms”). The result of this interpolation is interpolated parameters 830, where interpolated parameters 830 represent dynamic haptic effect 810. According to the illustrated embodiment, interpolated parameters 830 include an interpolated magnitude value of “80%,” and an interpolated duration value of “175 ms”. According to the embodiment, the interpolated magnitude value of “80%” is interpolated from the two magnitude values of “20%” and “100%,” based on an interpolation function. Further, the interpolated duration value of “175 ms” is interpolated from the two duration values of “100 ms” and “200 ms,” also based on an interpolation function.
  • Further, according to the embodiment, at effect distributor 840, the system can distribute interpolated parameters 830, which represent dynamic haptic effect 810, among a plurality of actuators based on actuator distribution information 813, where the system generates targeted parameters for each actuator of the plurality of actuators based on actuator distribution information 813. The targeted parameters for each actuator represent a portion of dynamic haptic effect 810 that is distributed to the actuator. In the illustrated embodiment, at effect distributor 840, the system generates targeted parameters 850 and targeted parameters 860. Targeted parameters 850 include a magnitude value of “20%,” a duration value of “175 ms,” and an actuator value (identified in FIG. 8 as an “actuator index”) of “1.” Targeted parameters 860 include a magnitude value of “60%,” a duration value of “175 ms,” and an actuator value of “2.” According to the embodiment, the system can use actuator distribution information 813 to distribute 25% of dynamic haptic effect 810 to actuator “1,” and to distribute 75% of dynamic haptic effect 810 to actuator “2.” In one embodiment, the strength parameters of dynamic haptic effect 810, such as a magnitude parameter and a period parameter, are distributed among the plurality of actuators. According to the embodiment, the time-based parameters, such as a frequency parameter and a duration parameter, remain the same for the plurality of actuators.
  • Thus, in the illustrated embodiment, the interpolated magnitude value “80%” of interpolated parameters 830 is distributed into a magnitude value of “20%” for targeted parameters 850, and a magnitude value of “60%” for targeted parameters 860. Further, in the illustrated embodiment, the interpolated duration value “175 ms” of interpolated parameters 830 is included in both targeted parameters 850 and targeted parameters 860. While the illustrated embodiment involves two key frames, and involves two actuators, one of ordinary skill in the art would readily appreciate that, in alternate embodiments, the above implementation can involve any number of key frames, and can include any number of actuators.
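  • The distribution step above can be sketched as follows. This is an illustrative reconstruction under the stated assumptions: strength parameters (here, only magnitude) are scaled by each actuator's share, time-based parameters (here, only duration) are copied unchanged, and the names `distribute_effect` and the dict fields are hypothetical, not taken from the patent's file format.

```python
def distribute_effect(interpolated, distribution):
    """Split interpolated parameters into per-actuator targeted parameters.

    interpolated is a dict with 'magnitude' and 'duration' entries;
    distribution maps an actuator index to that actuator's fraction
    of the dynamic haptic effect.
    """
    targeted = []
    for actuator, share in distribution.items():
        targeted.append({
            "actuator": actuator,
            # Strength parameter: scaled by the actuator's share
            "magnitude": interpolated["magnitude"] * share,
            # Time-based parameter: identical for every actuator
            "duration": interpolated["duration"],
        })
    return targeted

# Interpolated parameters 830 and actuator distribution information 813 of FIG. 8
params_830 = {"magnitude": 80, "duration": 175}
distribution_813 = {1: 0.25, 2: 0.75}

targets = distribute_effect(params_830, distribution_813)
# -> actuator 1: magnitude 20.0 (%), duration 175 (ms)
#    actuator 2: magnitude 60.0 (%), duration 175 (ms)
```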
  • In one embodiment, actuator distribution information 813 can be stored within a haptic effect file that dynamic haptic effect 810 is stored within. In another embodiment, actuator distribution information 813 can be stored within one or more key frames of dynamic haptic effect 810. In yet another embodiment, actuator distribution information 813 can be stored within a basis haptic effect that is referenced by one or more of the key frames of dynamic haptic effect 810. In yet another embodiment, actuator distribution information 813 can be determined by the system at run time.
  • In one embodiment, even if one or more key frames of dynamic haptic effect 810 includes an actuator value, the system can disregard the one or more actuator values stored within the one or more key frames of dynamic haptic effect 810, and can distribute dynamic haptic effect 810 among a plurality of actuators based on actuator distribution information 813.
  • FIG. 9 illustrates an example dynamic haptic effect definition 900 that includes a direction property, according to an embodiment of the invention. According to an embodiment, as previously described, a dynamic haptic effect can be defined to include one or more key frames. As also previously described, a key frame can include a basis haptic effect definition, an interpolant value, and optionally, a repeat gap value.
  • According to the embodiment, a key frame can also include a direction value. A direction value is a value that specifies a direction of a dynamic haptic effect. A direction of a dynamic haptic effect is an ordinal direction of a received interpolant value for the dynamic haptic effect, as compared to a previously received interpolant value for the dynamic haptic effect. For example, if a received interpolant value for a dynamic haptic effect is the value “100,” and the previously received interpolant value for the dynamic haptic effect is “50,” a direction of the dynamic haptic effect can be classified as “upwards,” because the received interpolant value is greater than the previously received interpolant value. Thus, as an example, a direction value of “UP” can specify the “upwards” direction of the dynamic haptic effect. As another example, if a received interpolant value for a dynamic haptic effect is the value “100,” and the previously received interpolant value for the dynamic haptic effect is “200,” a direction of the dynamic haptic effect can be classified as “downwards,” because the received interpolant value is less than the previously received interpolant value. Thus, as an example, a direction value of “DOWN” can specify the “downwards” direction of the dynamic haptic effect. In an embodiment, a direction value can be a string value. In another embodiment, a direction value can be a fixed-point or floating-point numeric value. A direction value can be stored within one or more bits.
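  • The direction classification described above can be sketched as a small comparison; the function name `effect_direction` and the behavior for equal values are assumptions for illustration, not specified by the patent.

```python
def effect_direction(received, previous):
    """Classify the ordinal direction of a dynamic haptic effect.

    Returns "UP" when the received interpolant value exceeds the
    previously received one, and "DOWN" when it is smaller. Handling
    of equal values is not specified in the text; returning None
    here is an assumption.
    """
    if received > previous:
        return "UP"
    if received < previous:
        return "DOWN"
    return None

# Examples from the text: 100 after 50 is "upwards"; 100 after 200 is "downwards"
effect_direction(100, 50)   # -> "UP"
effect_direction(100, 200)  # -> "DOWN"
```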
  • Also according to the embodiment, a key frame can optionally also include a category value. A category value is a value that specifies a category of a dynamic haptic effect. A category of a dynamic haptic effect is a classification of the dynamic haptic effect. In one embodiment, a category of a dynamic haptic effect can be determined by a category value that can be received along with an interpolant value. In another embodiment, a category of the dynamic haptic effect can be determined based on a received interpolant value. In an embodiment, a category value can be a string value. In another embodiment, a category value can be a fixed-point or floating-point numeric value. A category value can be stored within one or more bits.
  • In the illustrated embodiment, dynamic haptic effect definition 900 includes four key frames, key frames 910, 920, 930, and 940. However, this is merely an example embodiment, and in alternate embodiments, a dynamic haptic effect definition can include any number of key frames. Key frame 910 includes a basis haptic effect reference of “Periodic1,” an interpolant value of “0,” and a repeat gap value of “10 ms”. The basis haptic effect reference “Periodic1” refers to basis haptic effect 960, which is also included within dynamic haptic effect definition 900. Thus, key frame 910 defines basis haptic effect 960 as the basis haptic effect for the interpolant value of “0.” Key frame 910 further indicates that when basis haptic effect 960 is played consecutively, there is a time period of 10 ms between each consecutive instance of basis haptic effect 960. Similarly, key frame 920 includes a basis haptic effect reference of “Periodic2,” an interpolant value of “80,” and a repeat gap value of “15 ms.” The basis haptic effect reference “Periodic2” refers to basis haptic effect 970, which is also included within dynamic haptic effect definition 900. Thus, key frame 920 defines basis haptic effect 970 as the basis haptic effect for the interpolant value of “80.” Key frame 920 further indicates that when basis haptic effect 970 is played consecutively, there is a time period of 15 ms between each consecutive instance of basis haptic effect 970.
  • Likewise, key frame 930 includes a basis haptic effect reference of “Periodic3,” an interpolant value of “90,” and a repeat gap value of “5 ms.” The basis haptic effect reference “Periodic3” refers to basis haptic effect 980, which is also included within dynamic haptic effect definition 900. Thus, key frame 930 defines basis haptic effect 980 as the basis haptic effect for the interpolant value of “90.” Key frame 930 further indicates that when basis haptic effect 980 is played consecutively, there is a time period of 5 ms between each consecutive instance of basis haptic effect 980. Similarly, key frame 940 includes a basis haptic effect reference of “Periodic3,” an interpolant value of “100,” and a repeat gap value of “20 ms”. As previously described, the basis haptic effect reference “Periodic3” refers to basis haptic effect 980, which is also included within dynamic haptic effect definition 900. Thus, key frame 940 defines basis haptic effect 980 as the basis haptic effect for the interpolant value of “100.” Key frame 940 further indicates that when basis haptic effect 980 is played consecutively, there is a time period of 20 ms between each consecutive instance of basis haptic effect 980.
  • According to the illustrated embodiment, key frames 910, 920, 930, and 940 each also include a direction value. More specifically, key frames 910 and 930 each include a direction value of “UP,” and key frames 920 and 940 each include a direction value of “DOWN.” Thus, key frames 910 and 930 indicate that they are key frames to be used (and thus, their respective basis haptic effects are to be used) when a direction of dynamic haptic effect definition 900 is an “upwards” direction. Further, key frames 920 and 940 indicate that they are key frames to be used (and thus, their respective basis haptic effects are to be used) when a direction of dynamic haptic effect definition 900 is a “downwards” direction. In an alternate embodiment, key frames 910, 920, 930, and 940 can each optionally also include a category value (not illustrated in FIG. 9). In this alternate embodiment, key frames 910, 920, 930, and 940 further indicate that each respective key frame (and thus, each respective basis haptic effect) is to be used when a category of dynamic haptic effect definition 900 is a category that is equal to the category value of the respective key frame.
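  • The four key frames of FIG. 9 can be written out as a small data structure, with the direction-based selection described above as a filter. The representation below is hypothetical — the field names and list-of-dicts layout are illustrative, not the patent's actual file format — but the interpolant, repeat gap, basis effect reference, and direction values are those of definition 900.

```python
# Hypothetical in-memory representation of dynamic haptic effect definition 900
definition_900 = [
    {"effect": "Periodic1", "interpolant": 0,   "repeat_gap_ms": 10, "direction": "UP"},
    {"effect": "Periodic2", "interpolant": 80,  "repeat_gap_ms": 15, "direction": "DOWN"},
    {"effect": "Periodic3", "interpolant": 90,  "repeat_gap_ms": 5,  "direction": "UP"},
    {"effect": "Periodic3", "interpolant": 100, "repeat_gap_ms": 20, "direction": "DOWN"},
]

def frames_for_direction(key_frames, direction):
    """Select the key frames whose direction value equals the given direction."""
    return [kf for kf in key_frames if kf["direction"] == direction]

up_frames = frames_for_direction(definition_900, "UP")
# -> key frames 910 and 930: the frames referencing Periodic1 (interpolant 0)
#    and Periodic3 (interpolant 90)
```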
  • According to an embodiment, a dynamic haptic effect can be defined to also include an indication of an end of the dynamic haptic effect. The indication of the end of the dynamic haptic effect indicates that the dynamic haptic effect does not include any additional key frames. As previously described, a device that interprets a dynamic haptic effect definition can be configured to interpret the contents of the dynamic haptic effect definition sequentially. Thus, the indication can indicate to a device the end of the dynamic haptic effect definition. In one embodiment, the indication of an end of the dynamic haptic effect can be considered an additional key frame. In the illustrated embodiment, dynamic haptic effect definition 900 includes end of dynamic haptic effect definition 950 which indicates the end of dynamic haptic effect definition 900.
  • In one example embodiment, a dynamic haptic effect can be designed for a user interface software module that displays a glow along an edge of a user interface of a device. The dynamic haptic effect can include a plurality of key frames, where each key frame includes an interpolant value. According to the embodiment, there may be many more interpolant values associated with decreasing the glow along the edge of the user interface, in comparison with the interpolant values associated with increasing the glow. By storing a direction value within each key frame, where each direction value either includes a value of “GROW” or “DECAY,” each key frame (and thus, each interpolant value of the key frame) can be associated with a specific direction of the dynamic haptic effect (i.e., a “growing” direction of the dynamic haptic effect, or a “decaying” direction of the dynamic haptic effect).
  • FIG. 10 illustrates a flow diagram of the functionality of a haptic effect generation module (such as haptic effect generation module 16 of FIG. 1), according to one embodiment of the invention. In one embodiment, the functionality of FIG. 10, as well as the functionality of FIG. 11, are each implemented by software stored in memory or another computer-readable or tangible medium, and executed by a processor. In other embodiments, each functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software. Furthermore, in alternate embodiments, each functionality may be performed by hardware using analog components.
  • The flow begins and proceeds to 1010. At 1010, a first key frame is received, where the first key frame includes a first interpolant value and a first haptic effect. The first interpolant value can be a value that specifies where an interpolation occurs for the first haptic effect. The first key frame can include a repeat gap value. The first haptic effect can be a vibratory haptic effect, and can include a plurality of parameters. The plurality of parameters can include a magnitude parameter, a frequency parameter, a period parameter, and a duration parameter. The flow then proceeds to 1020.
  • At 1020, a second key frame is received, where the second key frame includes a second interpolant value and a second haptic effect. The second interpolant value can be a value that specifies where an interpolation occurs for the second haptic effect. The second key frame can include a repeat gap value. The second haptic effect can be a vibratory haptic effect, and can include a plurality of parameters. The plurality of parameters can include a magnitude parameter, a frequency parameter, a period parameter, and a duration parameter. In the illustrated embodiment, a dynamic haptic effect can be defined as including two key frames, where each key frame includes a haptic effect. However, a dynamic haptic effect can alternately be defined as including three or more key frames, where each key frame includes a haptic effect. The flow then proceeds to 1030.
  • At 1030, an interpolant value is received, where the interpolant value is between the first interpolant value and the second interpolant value. The flow then proceeds to 1040. At 1040, a dynamic haptic effect is determined from the interpolant value. The determining of the dynamic haptic effect is further described in greater detail according to different embodiments. The flow then proceeds to 1050. At 1050, the dynamic haptic effect is distributed among a plurality of actuators. The distributing of the dynamic haptic effect is also further described in greater detail according to different embodiments. The flow then ends.
  • In one embodiment, the determining of the dynamic haptic effect can be performed according to the following functionality. The dynamic haptic effect can be interpolated from the first haptic effect and the second haptic effect. According to the embodiment, a value for each parameter of the dynamic haptic effect can be calculated by interpolating a value of the parameter of the first haptic effect with a value of the parameter of the second haptic effect, using an interpolation function. The interpolation of each parameter value of the dynamic haptic effect can be based upon where the received interpolant value falls between the first interpolant value that corresponds to the first haptic effect and the second interpolant value that corresponds to the second haptic effect. In the illustrated embodiment, a dynamic haptic effect can be generated by interpolating two basis haptic effects. Such an interpolation can be a linear interpolation. However, a dynamic haptic effect can alternately be generated by interpolating three or more haptic effects based on the abovementioned functionality. Such an interpolation can be a spline interpolation, where a spline interpolation is a form of interpolation where an interpolating function is a special type of piecewise polynomial called a spline, and where the interpolating function is a function that can map an interpolant value to a dynamic haptic effect using two or more key frames.
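  • For the linear case, interpolating a parameter across any number of key frames reduces to locating the pair of key frames that bracket the received interpolant value and interpolating between them. The sketch below illustrates that piecewise-linear case only; the spline interpolation the embodiment also allows is not implemented here, and the function and field names are assumptions.

```python
def interpolate_parameter(key_frames, interpolant, param):
    """Piecewise-linear interpolation of one parameter over key frames.

    key_frames is a list of dicts, each with an 'interpolant' entry and
    the named parameter, sorted by ascending interpolant value.
    """
    for lo, hi in zip(key_frames, key_frames[1:]):
        if lo["interpolant"] <= interpolant <= hi["interpolant"]:
            # Where the received interpolant falls between the two frames
            t = (interpolant - lo["interpolant"]) / (hi["interpolant"] - lo["interpolant"])
            return lo[param] + (hi[param] - lo[param]) * t
    raise ValueError("interpolant outside the key frame range")

frames = [
    {"interpolant": 0,   "magnitude": 10},
    {"interpolant": 50,  "magnitude": 40},
    {"interpolant": 100, "magnitude": 100},
]
interpolate_parameter(frames, 75, "magnitude")  # -> 70.0
```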
  • Further, according to the embodiment, the distributing the dynamic haptic effect can be based on the following functionality. A first actuator value can be received. The first actuator value can correspond to a first actuator of the plurality of actuators, and the first actuator value can be stored within the first key frame. A second actuator value can also be received. The second actuator value can correspond to a second actuator of the plurality of actuators, and the second actuator value can be stored within the second key frame. In alternate embodiments, one or more additional actuator values can be received. An interpolated actuator value can be interpolated from the first actuator value and the second actuator value. According to the embodiment, the interpolated actuator value can be calculated by interpolating the first actuator value with the second actuator value, using an interpolation function. However, an interpolated actuator value can alternately be generated by interpolating three or more actuator values based on the abovementioned functionality. The dynamic haptic effect can then be distributed among the first actuator and the second actuator based on the interpolated actuator value. However, a dynamic haptic effect can alternately be distributed among three or more actuators based on the aforementioned functionality.
  • In another embodiment, a third key frame and fourth key frame are received, where the third key frame includes a third interpolant value and a third haptic effect, and the fourth key frame includes a fourth interpolant value and fourth haptic effect. In this embodiment, the distributing the dynamic haptic effect can be based on the following functionality. A first actuator value can be received. The first actuator value can correspond to a first actuator of the plurality of actuators, and the first actuator value can be stored within the first key frame and the third key frame. A second actuator value can be received. The second actuator value can correspond to a second actuator of the plurality of actuators, and the second actuator value can be stored within the second key frame and the fourth key frame. The first key frame and the third key frame can be grouped together. The second key frame and the fourth key frame can be grouped together. However, a dynamic haptic effect can alternately be distributed among three or more actuators based on the aforementioned functionality.
  • Further, according to the embodiment, the determining of the dynamic haptic effect can be performed according to the following functionality. A first dynamic haptic effect for the first actuator can be interpolated from the first haptic effect and the third haptic effect. A second dynamic haptic effect for the second actuator can be interpolated from the second haptic effect and the fourth haptic effect. However, each dynamic haptic effect can alternately be generated by interpolating three or more haptic effects based on the abovementioned functionality.
  • In another embodiment, the determining of the dynamic haptic effect can be performed according to the following functionality. The dynamic haptic effect can be interpolated from the first haptic effect and the second haptic effect, as previously described. However, a dynamic haptic effect can alternately be generated by interpolating three or more haptic effects based on the abovementioned functionality.
  • Further, according to the embodiment, the distributing of the dynamic haptic effect can be performed according to the following functionality. Actuator distribution information can be received, where the actuator distribution information indicates how to distribute the dynamic haptic effect among the plurality of actuators. The dynamic haptic effect can then be distributed among the plurality of actuators based on the actuator distribution information. In one embodiment, the actuator distribution information can be stored within a haptic effect file that the dynamic haptic effect is stored within. In another embodiment, the actuator distribution information can be stored within one or more key frames of the dynamic haptic effect. In yet another embodiment, the actuator distribution information can be stored within a haptic effect that is referenced by one or more of the key frames of the dynamic haptic effect. In yet another embodiment, the actuator distribution information can be determined at run time.
  • FIG. 11 illustrates a flow diagram of the functionality of a haptic effect generation module (such as haptic effect generation module 16 of FIG. 1), according to another embodiment of the invention. The flow begins and proceeds to 1110. At 1110, a plurality of key frames is received. Each key frame includes a key frame interpolant value, a haptic effect, and a direction value. The flow then proceeds to 1120. At 1120, an interpolant value is received. The interpolant value is between at least two key frame interpolant values. The flow then proceeds to 1130. At 1130, a direction is determined for a dynamic haptic effect. The flow then proceeds to 1140. At 1140, one or more key frames are selected from the plurality of key frames. Each selected key frame includes a direction value that is equal to the direction. The flow then proceeds to 1150. At 1150, the dynamic haptic effect is determined from the interpolant value and the direction. According to the embodiment, the determining can include interpolating the dynamic haptic effect from at least two haptic effects of at least two selected key frames. The flow then ends.
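  • The flow of FIG. 11 can be sketched end to end for a simple single-parameter, linear case. This is an illustrative reconstruction: the direction is derived by comparing the received interpolant to the previously received one, only matching-direction key frames participate in the interpolation, and all names and data shapes are assumptions.

```python
def determine_dynamic_effect(key_frames, interpolant, previous_interpolant):
    """Sketch of the FIG. 11 flow for a single magnitude parameter."""
    # 1130: determine the direction from the received and previous interpolants
    direction = "UP" if interpolant > previous_interpolant else "DOWN"
    # 1140: select the key frames whose direction value equals the direction
    selected = sorted(
        (kf for kf in key_frames if kf["direction"] == direction),
        key=lambda kf: kf["interpolant"],
    )
    # 1150: interpolate between the two selected frames bracketing the interpolant
    for lo, hi in zip(selected, selected[1:]):
        if lo["interpolant"] <= interpolant <= hi["interpolant"]:
            t = (interpolant - lo["interpolant"]) / (hi["interpolant"] - lo["interpolant"])
            return {"magnitude": lo["magnitude"] + (hi["magnitude"] - lo["magnitude"]) * t}
    return None  # no pair of selected key frames brackets the interpolant

frames = [
    {"interpolant": 0,   "magnitude": 0,   "direction": "UP"},
    {"interpolant": 90,  "magnitude": 90,  "direction": "UP"},
    {"interpolant": 80,  "magnitude": 50,  "direction": "DOWN"},
    {"interpolant": 100, "magnitude": 100, "direction": "DOWN"},
]
determine_dynamic_effect(frames, 45, 10)  # rising input: only the "UP" frames are used
```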
  • In an alternate embodiment, each key frame can also include a category value. A category can be determined for the dynamic haptic effect. One or more key frames can be further selected from the selected one or more key frames. Each further selected key frame can include a category value that is equal to the category. The dynamic haptic effect can be determined from the interpolant value, the direction, and the category. According to the alternate embodiment, the determining can include interpolating the dynamic haptic effect from at least two haptic effects of at least two further selected key frames.
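  • The two-stage selection of this alternate embodiment — first by direction, then by category — can be sketched as a pair of filters. The names and data layout below are hypothetical, chosen only to illustrate the narrowing of the candidate key frames.

```python
def select_frames(key_frames, direction, category=None):
    """Filter key frames by direction, then optionally further by category."""
    selected = [kf for kf in key_frames if kf["direction"] == direction]
    if category is not None:
        # Further selection: keep only frames whose category value matches
        selected = [kf for kf in selected if kf.get("category") == category]
    return selected

frames = [
    {"interpolant": 0,   "direction": "UP",   "category": "A"},
    {"interpolant": 50,  "direction": "UP",   "category": "B"},
    {"interpolant": 100, "direction": "DOWN", "category": "A"},
]
select_frames(frames, "UP", category="A")  # -> only the first key frame
```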
  • Thus, according to an embodiment, a system can be provided that generates one or more dynamic haptic effects at a plurality of actuators, where the one or more dynamic haptic effects can be distributed among the plurality of actuators. Thus, the system can animate spatial characteristics of the dynamic haptic effect by moving the dynamic haptic effect among different actuators. This can allow the system to add a spatial location as a parameter of a dynamic haptic effect, and can provide for a more robust haptic experience. Further, according to another embodiment, a system can be provided that generates a dynamic haptic effect using one or more key frames, where each key frame includes a direction property. This can allow a dynamic haptic effect to be modified differently depending on the direction, and thus, can further enhance a haptic experience.
  • The features, structures, or characteristics of the invention described throughout this specification may be combined in any suitable manner in one or more embodiments. For example, the usage of “one embodiment,” “some embodiments,” “certain embodiment,” “certain embodiments,” or other similar language, throughout this specification refers to the fact that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present invention. Thus, appearances of the phrases “one embodiment,” “some embodiments,” “a certain embodiment,” “certain embodiments,” or other similar language, throughout this specification do not necessarily all refer to the same group of embodiments, and the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
  • One having ordinary skill in the art will readily understand that the invention as discussed above may be practiced with steps in a different order, and/or with elements in configurations which are different from those which are disclosed. Therefore, although the invention has been described based upon these preferred embodiments, certain modifications, variations, and alternative constructions would be apparent to those of skill in the art, while remaining within the spirit and scope of the invention. In order to determine the metes and bounds of the invention, therefore, reference should be made to the appended claims.

Claims (15)

We claim:
1. A computer-readable medium having instructions stored thereon that, when executed by a processor, cause the processor to generate a dynamic haptic effect, the generating the dynamic haptic effect comprising:
receiving a plurality of key frames, where each key frame comprises a key frame interpolant value, a haptic effect, and a direction value;
receiving an interpolant value, wherein the interpolant value is between at least two key frame interpolant values;
determining a direction for the dynamic haptic effect;
selecting one or more key frames from the plurality of key frames, wherein each
selected key frame comprises a direction value that is equal to the direction; and
determining the dynamic haptic effect from the interpolant value and the direction, wherein the determining comprises interpolating the dynamic haptic effect from at least two haptic effects of at least two selected key frames.
2. The computer-readable medium of claim 1, wherein each key frame comprises a category value;
the generating the dynamic haptic effect further comprising:
determining a category for the dynamic haptic effect;
further selecting one or more key frames from the selected one or more key frames, wherein each further selected key frame comprises a category value equal to the category; and
determining the dynamic haptic effect from the interpolant value, the direction, and the category, wherein the determining comprises interpolating the dynamic haptic effect from at least two haptic effects of at least two further selected key frames.
3. The computer-readable medium of claim 2, wherein the dynamic haptic effect is a vibratory haptic effect and comprises a plurality of parameters.
4. The computer-readable medium of claim 3, wherein the plurality of parameters comprise a duration parameter, a magnitude parameter, a period parameter, and a frequency parameter.
5. A computer-implemented method for generating a dynamic haptic effect, the computer-implemented method comprising:
receiving a plurality of key frames, where each key frame comprises a key frame interpolant value, a haptic effect, and a direction value;
receiving an interpolant value, wherein the interpolant value is between at least two key frame interpolant values;
determining a direction for the dynamic haptic effect;
selecting one or more key frames from the plurality of key frames, wherein each selected key frame comprises a direction value that is equal to the direction; and
determining the dynamic haptic effect from the interpolant value and the direction, wherein the determining comprises interpolating the dynamic haptic effect from at least two haptic effects of at least two selected key frames.
6. The computer-implemented method of claim 5, wherein each key frame comprises a category value;
the computer-implemented method further comprising:
determining a category for the dynamic haptic effect.
7. The computer-implemented method of claim 6, wherein the category is determined based on the interpolant value or the category value received along with the interpolant value.
8. The computer-implemented method of claim 6, further comprising:
further selecting one or more key frames from the one or more selected key frames, wherein each further selected key frame comprises a category value equal to the category; and
determining the dynamic haptic effect from the interpolant value, the direction, and the category, wherein the determining comprises interpolating the dynamic haptic effect from at least two haptic effects of at least two further selected key frames.
9. The computer-implemented method of claim 5, wherein the dynamic haptic effect is a vibratory haptic effect and comprises a plurality of parameters.
10. The computer-implemented method of claim 9, wherein the plurality of parameters comprise a duration parameter, a magnitude parameter, a period parameter, and a frequency parameter.
11. A system for generating a dynamic haptic effect, the system comprising:
a memory configured to store a haptic effect generation module; and
a processor configured to execute the haptic effect generation module stored on the memory;
wherein the haptic effect generation module is configured to receive a plurality of key frames, each key frame comprising a key frame interpolant value, a haptic effect, and a direction value;
wherein the haptic effect generation module is further configured to receive an interpolant value, the interpolant value being between at least two key frame interpolant values;
wherein the haptic effect generation module is further configured to determine a direction for the dynamic haptic effect;
wherein the haptic effect generation module is further configured to select one or more key frames from the plurality of key frames, wherein each selected key frame comprises a direction value that is equal to the direction; and
wherein the haptic effect generation module is further configured to determine the dynamic haptic effect from the interpolant value and the direction, wherein the determining comprises interpolating the dynamic haptic effect from at least two haptic effects of at least two selected key frames.
12. The system of claim 11,
wherein each key frame comprises a category value; and
wherein the haptic effect generation module is further configured to determine a category for the dynamic haptic effect.
13. The system of claim 12, wherein the category is determined based on the interpolant value or the category value received along with the interpolant value.
14. The system of claim 12,
wherein the haptic effect generation module is further configured to further select one or more key frames from the one or more selected key frames, each further selected key frame comprising a category value equal to the category; and
wherein the haptic effect generation module is further configured to determine the dynamic haptic effect from the interpolant value, the direction, and the category, wherein the determining comprises interpolating the dynamic haptic effect from at least two haptic effects of at least two further selected key frames.
15. The system of claim 11, further comprising an actuator configured to output one or more haptic effects based on the dynamic haptic effect.
US16/419,603 2012-12-10 2019-05-22 Enhanced dynamic haptic effects Abandoned US20190272039A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/419,603 US20190272039A1 (en) 2012-12-10 2019-05-22 Enhanced dynamic haptic effects

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/709,157 US9898084B2 (en) 2012-12-10 2012-12-10 Enhanced dynamic haptic effects
US15/791,862 US10359851B2 (en) 2012-12-10 2017-10-24 Enhanced dynamic haptic effects
US16/419,603 US20190272039A1 (en) 2012-12-10 2019-05-22 Enhanced dynamic haptic effects

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/791,862 Continuation US10359851B2 (en) 2012-12-10 2017-10-24 Enhanced dynamic haptic effects

Publications (1)

Publication Number Publication Date
US20190272039A1 true US20190272039A1 (en) 2019-09-05

Family

ID=49552214

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/709,157 Active 2034-04-17 US9898084B2 (en) 2012-12-10 2012-12-10 Enhanced dynamic haptic effects
US15/791,862 Active US10359851B2 (en) 2012-12-10 2017-10-24 Enhanced dynamic haptic effects
US16/419,603 Abandoned US20190272039A1 (en) 2012-12-10 2019-05-22 Enhanced dynamic haptic effects

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/709,157 Active 2034-04-17 US9898084B2 (en) 2012-12-10 2012-12-10 Enhanced dynamic haptic effects
US15/791,862 Active US10359851B2 (en) 2012-12-10 2017-10-24 Enhanced dynamic haptic effects

Country Status (5)

Country Link
US (3) US9898084B2 (en)
EP (1) EP2741174A3 (en)
JP (2) JP6258023B2 (en)
KR (1) KR102207669B1 (en)
CN (2) CN108803878A (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8947216B2 (en) 2012-11-02 2015-02-03 Immersion Corporation Encoding dynamic haptic effects
US9898084B2 (en) * 2012-12-10 2018-02-20 Immersion Corporation Enhanced dynamic haptic effects
US9645646B2 (en) * 2014-09-04 2017-05-09 Intel Corporation Three dimensional contextual feedback wristband device
KR101606791B1 (en) * 2015-09-08 2016-03-28 박재성 System providing Real Time Vibration according to Frequency variation and Method providing the vibration
WO2017053761A1 (en) * 2015-09-25 2017-03-30 Immersion Corporation Haptic effects design system
KR102188157B1 (en) * 2015-12-11 2020-12-07 코오롱인더스트리 주식회사 Tactile stimulation device and driving method thereof
JP6383765B2 (en) * 2016-08-25 2018-08-29 株式会社ファセテラピー Haptic content generation device, tactile content generation method, and tactile content use device
DK201670728A1 (en) 2016-09-06 2018-03-19 Apple Inc Devices, Methods, and Graphical User Interfaces for Providing Feedback During Interaction with an Intensity-Sensitive Button
US10684689B2 (en) 2018-04-20 2020-06-16 Immersion Corporation Cross-platform dynamic haptic effect design tool for augmented or virtual reality environments

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080223627A1 (en) * 2005-10-19 2008-09-18 Immersion Corporation, A Delaware Corporation Synchronization of haptic effect data in a media transport stream
US9898084B2 (en) * 2012-12-10 2018-02-20 Immersion Corporation Enhanced dynamic haptic effects

Family Cites Families (116)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU8932191A (en) 1990-11-30 1992-06-25 Cambridge Animation Systems Limited Image synthesis and processing
JPH0816820A (en) 1994-04-25 1996-01-19 Fujitsu Ltd Three-dimensional animation generation device
US5774386A (en) 1995-09-08 1998-06-30 Eastman Kodak Company Method and apparatus for performing function evaluation using a cache
JP4131278B2 (en) 1996-10-18 2008-08-13 ヤマハ株式会社 Force control device for keyboard instruments
US6108011A (en) 1996-10-28 2000-08-22 Pacific Data Images, Inc. Shape interpolation for computer-generated geometric models using independent shape parameters for parametric shape interpolation curves
US7091948B2 (en) 1997-04-25 2006-08-15 Immersion Corporation Design of force sensations for haptic feedback computer interfaces
US6337678B1 (en) * 1999-07-21 2002-01-08 Tactiva Incorporated Force feedback computer input and output device with coordinated haptic elements
US6449019B1 (en) 2000-04-07 2002-09-10 Avid Technology, Inc. Real-time key frame effects using tracking information
JP3949912B2 (en) 2000-08-08 2007-07-25 株式会社エヌ・ティ・ティ・ドコモ Portable electronic device, electronic device, vibration generator, notification method by vibration and notification control method
US6864877B2 (en) 2000-09-28 2005-03-08 Immersion Corporation Directional tactile feedback for haptic feedback interface devices
US7623114B2 (en) 2001-10-09 2009-11-24 Immersion Corporation Haptic feedback sensations based on audio output from computer devices
US7199805B1 (en) 2002-05-28 2007-04-03 Apple Computer, Inc. Method and apparatus for titling
JP2004310518A (en) 2003-04-08 2004-11-04 Fuji Xerox Co Ltd Picture information processor
KR20050054731A (en) 2003-12-05 2005-06-10 한국전자통신연구원 Haptic simulation system and method for providing real-time haptic interaction in virtual simulation
US9948885B2 (en) 2003-12-12 2018-04-17 Kurzweil Technologies, Inc. Virtual encounters
CN1914583A (en) 2004-02-03 2007-02-14 诺基亚公司 Method and device for implementing vibration output commands in mobile terminal devices
JP2005332063A (en) * 2004-05-18 2005-12-02 Sony Corp Input device with tactile function, information inputting method, and electronic device
US7765333B2 (en) * 2004-07-15 2010-07-27 Immersion Corporation System and method for ordering haptic effects
JP2006058973A (en) 2004-08-17 2006-03-02 Sony Corp Tactile information creation apparatus and tactile information creation method
US7728823B2 (en) * 2004-09-24 2010-06-01 Apple Inc. System and method for processing raw data of track pad device
CN101027631B (en) 2004-09-24 2014-09-03 苹果公司 Raw data track pad device and system
JP4617893B2 (en) 2005-01-18 2011-01-26 ソニー株式会社 Vibration transmission structure, input / output device with tactile function, and electronic equipment
JP5275025B2 (en) 2005-06-27 2013-08-28 コアクティヴ・ドライヴ・コーポレイション Synchronous vibrator for tactile feedback
US9370704B2 (en) 2006-08-21 2016-06-21 Pillar Vision, Inc. Trajectory detection and feedback system for tennis
JP2008123429A (en) 2006-11-15 2008-05-29 Sony Corp Touch panel display device, electronic equipment and game machine
US8098234B2 (en) 2007-02-20 2012-01-17 Immersion Corporation Haptic feedback system with stored effects
JP2008257295A (en) 2007-03-30 2008-10-23 Tokyo Institute Of Technology Method for presenting tactile stimulus
US8621348B2 (en) 2007-05-25 2013-12-31 Immersion Corporation Customizing haptic effects on an end user device
CN101355746B (en) 2007-07-27 2012-05-16 深圳富泰宏精密工业有限公司 Radio communication device
US7911328B2 (en) 2007-11-21 2011-03-22 The Guitammer Company Capture and remote reproduction of haptic events in synchronous association with the video and audio capture and reproduction of those events
US8035535B2 (en) 2007-11-21 2011-10-11 Nokia Corporation Apparatus and method providing transformation for human touch force measurements
GB2468811B (en) 2008-01-17 2012-12-19 Articulate Technologies Inc Methods and devices for intraoral tactile feedback
JP2009181261A (en) 2008-01-30 2009-08-13 Panasonic Corp Bidirectional communication system
KR100927009B1 (en) 2008-02-04 2009-11-16 광주과학기술원 Haptic interaction method and system in augmented reality
US9513704B2 (en) 2008-03-12 2016-12-06 Immersion Corporation Haptically enabled user interface
US20090303175A1 (en) * 2008-06-05 2009-12-10 Nokia Corporation Haptic user interface
JP2010015514A (en) 2008-07-07 2010-01-21 Sony Corp Input device, control method thereof, and electronic apparatus
KR20100066036A (en) 2008-12-09 2010-06-17 삼성전자주식회사 Operation method and apparatus for portable device
KR101114603B1 (en) 2008-12-12 2012-03-05 삼성전자주식회사 Haptic feedback device for portable terminal
US8686952B2 (en) * 2008-12-23 2014-04-01 Apple Inc. Multi touch with multi haptics
US8077021B2 (en) 2009-03-03 2011-12-13 Empire Technology Development Llc Dynamic tactile interface
US10564721B2 (en) * 2009-03-12 2020-02-18 Immersion Corporation Systems and methods for using multiple actuators to realize textures
KR101628782B1 (en) * 2009-03-20 2016-06-09 삼성전자주식회사 Apparatus and method for providing haptic function using multi vibrator in portable terminal
JP2010278727A (en) 2009-05-28 2010-12-09 Kddi Corp Portable terminal with vibration function
US9370459B2 (en) 2009-06-19 2016-06-21 Andrew Mahoney System and method for alerting visually impaired users of nearby objects
JP5197521B2 (en) 2009-07-29 2013-05-15 京セラ株式会社 Input device
JP4633183B1 (en) 2009-07-29 2011-02-23 京セラ株式会社 Input device and control method of input device
JP4942801B2 (en) 2009-08-27 2012-05-30 京セラ株式会社 Input device
US8451238B2 (en) 2009-09-02 2013-05-28 Amazon Technologies, Inc. Touch-screen user interface
US8619044B2 (en) 2009-09-30 2013-12-31 Blackberry Limited Electronic device including tactile touch-sensitive display and method of controlling same
JP5704428B2 (en) * 2009-11-18 2015-04-22 株式会社リコー Touch panel device and control method of touch panel device
JP5635274B2 (en) 2010-01-27 2014-12-03 京セラ株式会社 Tactile sensation presentation apparatus and tactile sensation presentation method
JP5360499B2 (en) 2010-02-01 2013-12-04 国立大学法人東北大学 Haptic presentation method and haptic presentation device
CA2731708A1 (en) 2010-02-15 2011-08-15 Research In Motion Limited Electronic device including touch-sensitive display and actuator for providing tactile feedback
US9417695B2 (en) 2010-04-08 2016-08-16 Blackberry Limited Tactile feedback method and apparatus
WO2011127379A2 (en) 2010-04-09 2011-10-13 University Of Florida Research Foundation Inc. Interactive mixed reality system and uses thereof
US8736559B2 (en) 2010-04-23 2014-05-27 Blackberry Limited Portable electronic device and method of controlling same
US8451255B2 (en) 2010-05-14 2013-05-28 Arnett Ryan Weber Method of providing tactile feedback and electronic device
WO2012008628A1 (en) 2010-07-13 2012-01-19 엘지전자 주식회사 Mobile terminal and configuration method for standby screen thereof
US8352643B2 (en) 2010-09-30 2013-01-08 Immersion Corporation Haptically enhanced interactivity with interactive content
US20120081337A1 (en) 2010-10-04 2012-04-05 Sony Ericsson Mobile Communications Ab Active Acoustic Multi-Touch and Swipe Detection for Electronic Devices
RU2596994C2 (en) 2010-11-09 2016-09-10 Конинклейке Филипс Электроникс Н.В. User interface with tactile feedback
JP5587759B2 (en) 2010-12-24 2014-09-10 京セラ株式会社 Tactile sensation presentation apparatus, program used for the apparatus, and tactile sensation presentation method
US8624857B2 (en) 2011-02-09 2014-01-07 Texas Instruments Incorporated Haptics effect controller architecture and instruction set
EP3306449B1 (en) * 2011-03-04 2022-03-09 Apple Inc. Linear vibrator providing localized and generalized haptic feedback
US9483085B2 (en) 2011-06-01 2016-11-01 Blackberry Limited Portable electronic device including touch-sensitive display and method of controlling same
KR20130007738A (en) 2011-07-11 2013-01-21 삼성전자주식회사 Key input device
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9462262B1 (en) 2011-08-29 2016-10-04 Amazon Technologies, Inc. Augmented reality environment with environmental condition control
US8711118B2 (en) 2012-02-15 2014-04-29 Immersion Corporation Interactivity model for shared feedback on mobile devices
US10852093B2 (en) 2012-05-22 2020-12-01 Haptech, Inc. Methods and apparatuses for haptic systems
WO2013186847A1 (en) 2012-06-11 2013-12-19 富士通株式会社 Drive device, electronic device, and drive control program
US8860563B2 (en) * 2012-06-14 2014-10-14 Immersion Corporation Haptic effect conversion system using granular synthesis
US9030428B2 (en) 2012-07-11 2015-05-12 Immersion Corporation Generating haptic effects for dynamic events
US8947216B2 (en) * 2012-11-02 2015-02-03 Immersion Corporation Encoding dynamic haptic effects
FR2999741B1 (en) 2012-12-17 2015-02-06 Centre Nat Rech Scient HAPTIC SYSTEM FOR NON-CONTACT INTERACTING AT LEAST ONE PART OF THE BODY OF A USER WITH A VIRTUAL ENVIRONMENT
US9367136B2 (en) 2013-04-12 2016-06-14 Microsoft Technology Licensing, Llc Holographic object feedback
US9908048B2 (en) 2013-06-08 2018-03-06 Sony Interactive Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display
US9811854B2 (en) 2013-07-02 2017-11-07 John A. Lucido 3-D immersion technology in a virtual store
DK3014394T3 (en) 2013-07-05 2022-07-11 Jacob A Rubin WHOLE BODY HUMAN COMPUTER INTERFACE
US9630105B2 (en) 2013-09-30 2017-04-25 Sony Interactive Entertainment Inc. Camera based safety mechanisms for users of head mounted displays
EP3095023A1 (en) 2014-01-15 2016-11-23 Sony Corporation Haptic notification on wearables
US9551873B2 (en) 2014-05-30 2017-01-24 Sony Interactive Entertainment America Llc Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content
EP3176676B1 (en) 2014-07-28 2020-01-08 CK Materials Lab Co., Ltd. Haptic information providing module
US9645646B2 (en) 2014-09-04 2017-05-09 Intel Corporation Three dimensional contextual feedback wristband device
US9799177B2 (en) 2014-09-23 2017-10-24 Intel Corporation Apparatus and methods for haptic covert communication
US9922518B2 (en) 2014-12-11 2018-03-20 Elwha Llc Notification of incoming projectiles
US9870718B2 (en) 2014-12-11 2018-01-16 Toyota Motor Engineering & Manufacturing North America, Inc. Imaging devices including spacing members and imaging devices including tactile feedback devices
US10166466B2 (en) 2014-12-11 2019-01-01 Elwha Llc Feedback for enhanced situational awareness
US20160170508A1 (en) 2014-12-11 2016-06-16 Toyota Motor Engineering & Manufacturing North America, Inc. Tactile display devices
US10073516B2 (en) 2014-12-29 2018-09-11 Sony Interactive Entertainment Inc. Methods and systems for user interaction within virtual reality scene using head mounted display
US9746921B2 (en) 2014-12-31 2017-08-29 Sony Interactive Entertainment Inc. Signal generation and detector systems and methods for determining positions of fingers of a user
US9843744B2 (en) 2015-01-13 2017-12-12 Disney Enterprises, Inc. Audience interaction projection system
US10322203B2 (en) 2015-06-26 2019-06-18 Intel Corporation Air flow generation for scent output
US9851799B2 (en) 2015-09-25 2017-12-26 Oculus Vr, Llc Haptic surface with damping apparatus
US20170103574A1 (en) 2015-10-13 2017-04-13 Google Inc. System and method for providing continuity between real world movement and movement in a virtual/augmented reality experience
US20170131775A1 (en) 2015-11-10 2017-05-11 Castar, Inc. System and method of haptic feedback by referral of sensation
US10055948B2 (en) 2015-11-30 2018-08-21 Nike, Inc. Apparel with ultrasonic position sensing and haptic feedback for activities
US10310804B2 (en) 2015-12-11 2019-06-04 Facebook Technologies, Llc Modifying haptic feedback provided to a user to account for changes in user perception of haptic feedback
US10324530B2 (en) 2015-12-14 2019-06-18 Facebook Technologies, Llc Haptic devices that simulate rigidity of virtual objects
US10096163B2 (en) 2015-12-22 2018-10-09 Intel Corporation Haptic augmented reality to reduce noxious stimuli
US10065124B2 (en) 2016-01-15 2018-09-04 Disney Enterprises, Inc. Interacting with a remote participant through control of the voice of a toy device
US9846971B2 (en) 2016-01-19 2017-12-19 Disney Enterprises, Inc. Systems and methods for augmenting an appearance of a hilt to simulate a bladed weapon
US11351472B2 (en) 2016-01-19 2022-06-07 Disney Enterprises, Inc. Systems and methods for using a gyroscope to change the resistance of moving a virtual weapon
TWI688879B (en) 2016-01-22 2020-03-21 宏達國際電子股份有限公司 Method, virtual reality system, and computer-readable recording medium for real-world interaction in virtual reality environment
US9933851B2 (en) 2016-02-22 2018-04-03 Disney Enterprises, Inc. Systems and methods for interacting with virtual objects using sensory feedback
US10555153B2 (en) 2016-03-01 2020-02-04 Disney Enterprises, Inc. Systems and methods for making non-smart objects smart for internet of things
US20170352185A1 (en) 2016-06-02 2017-12-07 Dennis Rommel BONILLA ACEVEDO System and method for facilitating a vehicle-related virtual reality and/or augmented reality presentation
US10155159B2 (en) 2016-08-18 2018-12-18 Activision Publishing, Inc. Tactile feedback systems and methods for augmented reality and virtual reality systems
US20180053351A1 (en) 2016-08-19 2018-02-22 Intel Corporation Augmented reality experience enhancement method and apparatus
US10779583B2 (en) 2016-09-20 2020-09-22 Facebook Technologies, Llc Actuated tendon pairs in a virtual reality device
US10372213B2 (en) 2016-09-20 2019-08-06 Facebook Technologies, Llc Composite ribbon in a virtual reality device
US10300372B2 (en) 2016-09-30 2019-05-28 Disney Enterprises, Inc. Virtual blaster
US10281982B2 (en) 2016-10-17 2019-05-07 Facebook Technologies, Llc Inflatable actuators in virtual reality
US10088902B2 (en) 2016-11-01 2018-10-02 Oculus Vr, Llc Fiducial rings in virtual reality
US20170102771A1 (en) 2016-12-12 2017-04-13 Leibs Technology Limited Wearable ultrasonic haptic feedback system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080223627A1 (en) * 2005-10-19 2008-09-18 Immersion Corporation, A Delaware Corporation Synchronization of haptic effect data in a media transport stream
US9898084B2 (en) * 2012-12-10 2018-02-20 Immersion Corporation Enhanced dynamic haptic effects

Also Published As

Publication number Publication date
KR20140074833A (en) 2014-06-18
JP2018037112A (en) 2018-03-08
EP2741174A2 (en) 2014-06-11
CN103869969A (en) 2014-06-18
JP2014115999A (en) 2014-06-26
JP6258023B2 (en) 2018-01-10
EP2741174A3 (en) 2016-05-18
US20180046252A1 (en) 2018-02-15
US20140160034A1 (en) 2014-06-12
CN103869969B (en) 2018-06-29
CN108803878A (en) 2018-11-13
JP6479148B2 (en) 2019-03-06
KR102207669B1 (en) 2021-01-25
US9898084B2 (en) 2018-02-20
US10359851B2 (en) 2019-07-23

Similar Documents

Publication Publication Date Title
US10248212B2 (en) Encoding dynamic haptic effects
US10359851B2 (en) Enhanced dynamic haptic effects
US9030428B2 (en) Generating haptic effects for dynamic events
EP2680107B1 (en) Haptic feedback control system
WO2013164351A1 (en) Device and method for processing user input
JP2019192238A (en) Cross-platform dynamic haptic effect design tool for augmented or virtual reality environments

Legal Events

Date Code Title Description
AS Assignment

Owner name: IMMERSION CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DA COSTA, HENRY;GERVAIS, ERIC;BHATIA, SATVIR SINGH;REEL/FRAME:049256/0481

Effective date: 20121207

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION