EP3598273A1 - Adaptive haptic effect rendering based on dynamic system identification - Google Patents

Adaptive haptic effect rendering based on dynamic system identification

Info

Publication number
EP3598273A1
Authority
EP
European Patent Office
Prior art keywords
haptic
dynamic system
user
parameters
enabled apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19184327.5A
Other languages
German (de)
English (en)
Inventor
Colin Swindells
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Immersion Corp filed Critical Immersion Corp
Publication of EP3598273A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/25 Output arrangements for video game devices
    • A63F 13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F 13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/013 Force feedback applied to a game

Definitions

  • Haptics is a tactile and force feedback technology that engages a user's sense of touch through haptic effects such as vibrations, motions, and other forces and stimuli.
  • Devices such as mobile devices, gaming devices, touchscreen devices, and personal computers can be configured to generate haptic effects.
  • Haptic feedback can provide kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, vibrotactile feedback, texture, heat, etc.) to a user.
  • Haptic effects may be useful to alert the user to specific events or to provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.
  • Although a haptic enabled device can render consistent haptic effects, such intended haptic effects may be perceived differently by different users. They also may be perceived differently by the same user, depending on factors such as how the user interacts with the haptic enabled device, and/or various physical properties of the users, the haptic enabled device, and/or the surrounding environment. It may be desirable to enable the user to feel haptic effects as consistent and more similar to the intended sensation of such haptic effects.
  • the present disclosure generally relates to adaptive haptic effect rendering based on identification of a dynamic system.
  • Various aspects are described in this disclosure, which include, but are not limited to, the following aspects.
  • One aspect is a method for generating a haptic effect.
  • the method includes receiving, in real time, a dynamic system parameter signal from an input device of a haptic enabled apparatus, the dynamic system parameter signal representative of one or more dynamic system parameters of a dynamic system; determining a haptic parameter modification value based on the dynamic system parameter signal; modifying at least one of the haptic data parameters based on the haptic parameter modification value; generating a modified haptic signal based on the haptic data parameters; and applying the modified haptic signal to a haptic actuator, thereby providing a haptic effect adapted to the dynamic system.
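  • As an illustrative sketch only (not the disclosure's implementation), this claimed sequence might look like the following Python loop; the function names, the grip-to-gain mapping, and the 100 Hz monitoring rate are all assumptions:

```python
import time

def read_parameter_signal():
    """Hypothetical input device read: returns dynamic system parameters
    (e.g., a normalized grip strength) monitored in real time."""
    return {"grip_strength": 0.7}

def apply_to_actuator(haptic_signal):
    """Hypothetical stand-in for handing the signal to the actuator drive circuit."""
    pass

def determine_modification_value(params):
    # Map monitored dynamic system parameters to a haptic parameter
    # modification value; this linear policy is an assumption.
    return 1.0 + 0.5 * params.get("grip_strength", 0.0)

def render_loop(base_amplitude=0.4, base_frequency=170.0):
    while True:
        params = read_parameter_signal()            # receive, in real time
        k = determine_modification_value(params)    # determine modification value
        amplitude = min(base_amplitude * k, 1.0)    # modify a haptic data parameter
        modified_signal = {"amplitude": amplitude,  # generate the modified signal
                           "frequency": base_frequency}
        apply_to_actuator(modified_signal)          # drive the haptic actuator
        time.sleep(0.01)                            # e.g., monitor at 100 Hz
```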
  • Another aspect is a haptic enabled apparatus for generating a haptic effect. The apparatus includes an actuator, an actuator drive circuit configured to operate the actuator, an input device configured to monitor a dynamic system, and a processing device connected to the actuator drive circuit and the input device.
  • the processing device operates to receive, in real time, a dynamic system parameter signal from the input device, the dynamic system parameter signal representative of one or more dynamic system parameters of the dynamic system; determine a haptic parameter modification value based on the dynamic system parameter signal; modify at least one of the haptic data parameters based on the haptic parameter modification value; generate a modified haptic signal based on the haptic data parameters; and transmit the modified haptic signal to the actuator drive circuit, the modified haptic signal enabling the actuator drive circuit to control the actuator, thereby providing a haptic effect adapted to the dynamic system.
  • Yet another aspect is a computer-readable storage medium comprising software instructions that, when executed, cause a haptic enabled apparatus to, while the haptic enabled apparatus is in use, receive a dynamic system parameter signal from an input device of the haptic enabled apparatus, the dynamic system parameter signal representative of one or more dynamic system parameters of a dynamic system; determine a haptic parameter modification value based on the dynamic system parameter signal; modify at least one of the haptic data parameters based on the haptic parameter modification value; generate a modified haptic signal based on the haptic data parameters; and operate a haptic actuator using the modified haptic signal, thereby providing a haptic effect adapted to the dynamic system.
  • Yet another aspect is a method for generating a haptic effect.
  • the method includes: generating a dynamic system characterization model representative of a dynamic system, the dynamic system indicative of inputs received through an input device of a haptic enabled apparatus; receiving, in real time, an input device signal from the input device, the input device signal representative of dynamic system parameters of the dynamic system; updating the dynamic system characterization model based on the input device signal; modifying a haptic effect rendering model based on the updated dynamic system characterization model; generating a haptic signal based on the modified haptic effect rendering model; and controlling a haptic actuator using the haptic signal, thereby providing a haptic effect adapted to the dynamic system.
  • the method may further include: storing the dynamic system parameters in the haptic enabled apparatus.
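  • A minimal sketch of this model-based variant, assuming an exponentially weighted estimator and a mass-ratio amplitude rule (both are assumptions made for illustration):

```python
class DynamicSystemCharacterizationModel:
    """Toy characterization model tracking two dynamic system parameters."""

    def __init__(self, effective_mass=0.15, damping=0.05):
        self.effective_mass = effective_mass  # kg (assumed initial estimate)
        self.damping = damping                # N*s/m (assumed initial estimate)

    def update(self, input_device_signal, alpha=0.2):
        # Blend the latest sensed values into the running estimates.
        self.effective_mass += alpha * (input_device_signal["mass"] - self.effective_mass)
        self.damping += alpha * (input_device_signal["damping"] - self.damping)

def modify_rendering_model(base_amplitude, model, nominal_mass=0.15):
    # A heavier effective dynamic system attenuates vibration at the skin,
    # so scale the rendered amplitude with the estimated mass ratio.
    return min(base_amplitude * model.effective_mass / nominal_mass, 1.0)

model = DynamicSystemCharacterizationModel()
model.update({"mass": 0.22, "damping": 0.08})   # user grips the device
amplitude = modify_rendering_model(0.5, model)  # -> slightly above 0.5
```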
  • Yet another aspect is a haptic enabled apparatus. The apparatus includes an actuator, an actuator drive circuit configured to operate the actuator, an input device configured to monitor a dynamic system, and a processing device connected to the actuator drive circuit and the input device.
  • the processing device is configured to generate a dynamic system characterization model representative of the dynamic system; receive, in real time, an input device signal from the input device, the input device signal representative of dynamic system parameters of the dynamic system; update the dynamic system characterization model based on the input device signal; modify a haptic effect rendering model based on the updated dynamic system characterization model; generate a haptic effect signal based on the modified haptic effect rendering model; and transmit the haptic effect signal to the actuator drive circuit, the haptic effect signal enabling the actuator drive circuit to control the actuator, thereby providing a haptic effect adapted to the dynamic system.
  • the processing device may be further configured to store the dynamic system parameters in the haptic enabled apparatus.
  • Yet another aspect is a computer-readable storage medium comprising software instructions that, when executed, cause a haptic enabled apparatus to: store a dynamic system characterization model representative of a dynamic system; store a haptic effect rendering model; while the haptic enabled apparatus is in use, receive an input device signal from an input device associated with the haptic enabled apparatus, the input device signal representative of dynamic system parameters of the dynamic system; update the dynamic system characterization model based on the input device signal; store the updated dynamic system characterization model; modify the haptic effect rendering model based on the updated dynamic system characterization model; generate a haptic effect signal based on the modified haptic effect rendering model; and operate a haptic actuator using the haptic effect signal, thereby providing a haptic effect adapted to the dynamic system.
  • the dynamic system parameters include object parameters representative of physical characteristics of the haptic enabled apparatus.
  • the object parameters may include position, velocity and acceleration effects associated with the haptic enabled apparatus.
  • the dynamic system parameters include user parameters associated with a user's behavioral and physiological properties with respect to the haptic enabled apparatus.
  • the user parameters may include a user grip strength, a user grip pattern, and a user skin sensitivity with respect to the haptic enabled apparatus.
  • the dynamic system parameters include environmental parameters representative of physical characteristics of an environment surrounding the haptic enabled apparatus.
  • the present disclosure relates to systems and methods for generating haptic effects adapted to a dynamic system.
  • the dynamic system includes various properties associated with a user's dynamic interaction with a haptic enabled apparatus.
  • the dynamic system may further include a dynamic change in an environment surrounding the haptic enabled apparatus and the user thereof.
  • a user's perception of haptic rendering for a particular haptic enabled apparatus can change as the dynamic system constantly varies.
  • the effective mass and other dynamic properties of the dynamic system, including at least the user's hand and the smartphone, can change the user's feel of the haptic effect generated from the smartphone.
  • the systems and methods of the present disclosure operate to monitor a dynamic change in the dynamic system and automatically modify haptic rendering in real time so that the user can feel consistent haptic effects even though the factors that affect the haptic effect may change.
  • the haptic signal controls the haptic effect adapted to change in the dynamic system. As such, the haptic rendering is dynamically updated in response to a status of the dynamic system.
  • FIG. 1 illustrates a haptic enabled system 100 in accordance with an exemplary embodiment of the present disclosure.
  • the system 100 includes a haptic enabled apparatus 102.
  • the haptic enabled apparatus 102 operates to generate haptic effects.
  • An example of the haptic enabled apparatus 102 is further described in more detail herein, including the description with reference to FIG. 2 .
  • the haptic enabled apparatus 102 includes an adaptive haptic effect rendering device 104.
  • the adaptive haptic effect rendering device 104 operates to monitor a dynamic system 106 and generate a haptic effect which is dynamically adapted to a change in the dynamic system 106, thereby providing a consistent, effective haptic effect to the user's perception.
  • the dynamic system 106 indicates one or more inputs received through an input device of the haptic enabled apparatus 102, such as an input device 162 and/or a dynamic system monitoring device 152 illustrated in FIG. 2 .
  • Such inputs can include signals which are generated from the input device of the haptic enabled apparatus 102 and define a user's dynamic interaction with the haptic enabled apparatus 102.
  • the inputs from the input device of the haptic enabled apparatus 102 can define the status of an environment 108 which surrounds the haptic enabled apparatus 102 and a user U who interacts with the apparatus 102 in real time.
  • the inputs from the input device of the haptic enabled apparatus 102 can define the user's sensation (also referred to herein as feel or perception) of the haptic rendering when interacting with the apparatus, and/or the user's behavior and/or physiological conditions that may or may not be associated with the haptic enabled apparatus 102.
  • the dynamic system 106 changes as physical properties of the haptic enabled apparatus 102, the user's interaction with the apparatus 102, and/or the environment of the apparatus 102 vary.
  • a resonant frequency of the apparatus 102 may also change because of, for example, the user or other surrounding elements that are in contact with, or arranged adjacent to, the apparatus 102.
  • the changed resonant frequency can determine a method of adjusting an operation of a haptic actuator (e.g., how to change a frequency of the haptic data) to provide haptic rendering adapted to the change in the dynamic system.
  • the dynamic system 106 may be represented with a plurality of dynamic system parameters 120, as illustrated in FIG. 4 .
  • the dynamic system parameters 120 are updated as the dynamic system 106 changes, and such an update or change in the dynamic system parameters 120 is detected in real time.
  • the dynamic system parameters 120 include object parameters 122, user parameters 124, and environmental parameters 126.
  • the dynamic system parameters 120 may include fewer than all of the object parameters 122, the user parameters 124, and the environmental parameters 126.
  • the dynamic system parameters 120 may include additional parameters.
  • the object parameters 122 include parameters representative of physical characteristics of the haptic enabled apparatus 102. Such physical characteristics of the haptic enabled apparatus 102 can influence the user's sensation of a haptic effect generated from the haptic enabled apparatus 102.
  • the haptic enabled apparatus 102 includes one or more devices (e.g., attachment devices) attached or coupled to the apparatus 102, such as a head-mounted display, a controller, or a screen. Therefore, the object parameters 122 also can indicate physical characteristics of the haptic enabled apparatus 102 and other devices associated with the apparatus 102. For example, any device, such as a controller or a screen, coupled to the haptic enabled apparatus 102 can determine the object parameters 122.
  • the object parameters 122 include position, velocity, and acceleration of the haptic enabled apparatus 102. Further, the object parameters 122 include stiffness, damping, effective mass, acceleration, friction, or any other physical properties associated with the haptic enabled apparatus 102. In some embodiments, the position, velocity, and acceleration effects are associated with stiffness, damping, and inertia. In addition or alternatively, the object parameters 122 include a location, an arrangement, an orientation, and any other positional information associated with the haptic enabled apparatus 102.
  • the user may feel a haptic effect generated from the haptic enabled apparatus 102 differently depending on where the haptic enabled apparatus 102 is placed, such as when the apparatus 102 is laid on a table or when the apparatus 102 is placed in a plastic case.
  • the object parameters 122 include a shape or any other structural properties of the haptic enabled apparatus 102.
  • the object parameters 122 include product specifications of the haptic enabled apparatus 102 (including any auxiliary devices attachable to the apparatus 102, such as a head-mounted display or a controller as illustrated in FIGS. 7 and 8 ).
  • the user parameters 124 include parameters associated with the user's behavior with respect to the haptic enabled apparatus 102, and/or the user's physiological properties in contact with the haptic enabled apparatus 102. Similar to the object parameters 122, the user parameters 124 can influence the user's sensation of a haptic effect generated from the haptic enabled apparatus 102. For example, the user parameters 124 can indicate physical properties of the user's body in contact with the haptic enabled apparatus 102.
  • the user parameters 124 include a user's grip strength or force with respect to the apparatus 102, a user's grip pattern with respect to the apparatus 102 (e.g., holding with one hand or two hands), a user's skin sensitivity, a hand size, a finger size, a user's height, a user's posture (e.g., the user's sitting or standing while using the apparatus 102), a user's movement (e.g., the user's walking or running while using the apparatus 102), presence of a glove in hand, or any other user related properties which may influence the user's perception of haptic rendering.
  • the user parameters 124 can further include the user's medical information, such as age, disability, illness, or any other information that may affect the user's feeling of haptic rendering.
  • the environmental parameters 126 include parameters associated with the environment 108 that surrounds the apparatus 102 and/or the user U using the apparatus 102.
  • the environmental parameters 126 can influence the user's sensation of a haptic effect generated from the haptic enabled apparatus 102.
  • Examples of environmental parameters 126 include weather information (e.g., temperature, humidity, precipitation, cloud cover, etc.), darkness (e.g., amount of light), loudness, geographic information (e.g., elevation and altitude), atmospheric pressure, or any other environmental factors that may affect the user's sensation of haptic rendering.
  • FIG. 2 illustrates a block diagram of one of many possible embodiments of a haptic enabled apparatus 102 as illustrated in FIG. 1 .
  • the haptic enabled apparatus 102 can be of various configurations.
  • the haptic enabled apparatus 102 can be any type of device that can be used to deliver haptic effects, such as a cellular phone, a smart phone, a personal digital assistant (PDA), a portable music player, a portable video player, a game system, a virtual reality (VR) system, a virtual reality headset, a 360-degree video headset, an automotive system, a navigation system, a desktop, a laptop computer, electronic appliances (e.g., a television, an oven, a washer, a dryer, a refrigerator, or a lighting system), a movie theater such as an IMAX™ theater with seats, headsets, or other devices having haptic actuators, and any other electronic or computing devices capable of processing information as well as providing haptic feedback.
  • the haptic enabled apparatus 102 can be a single device. In other embodiments, the haptic enabled apparatus 102 can collectively be a set of devices connected together.
  • the haptic enabled apparatus 102 includes a bus 140, a processor 142, an input/output (I/O) controller 144, memory 146, a network interface controller (NIC) 148, a user interface 150, a dynamic system monitoring device 152, an actuator drive circuit 154, a haptic actuator 156, and a dynamic system characterization database 158.
  • the elements, devices, and components of the apparatus 102 are incorporated into a single device, which can be worn or carried by a user.
  • at least one of the illustrated elements, devices, and components is separately arranged from the others and connected to each other either wirelessly or by wire.
  • the bus 140 includes conductors or transmission lines for providing a path to transfer data between the components in the apparatus 102 including the processor 142, the I/O controller 144, the memory 146, the NIC 148, the dynamic system monitoring device 152, and the actuator drive circuit 154.
  • the bus 140 typically comprises a control bus, address bus, and data bus.
  • the bus 140 can be any bus or combination of busses, suitable to transfer data between components in the apparatus 102.
  • the processor 142 can be any circuit configured to process information and can include any suitable analog or digital circuit.
  • the processor 142 also can include a programmable circuit that executes instructions. Examples of programmable circuits include microprocessors, microcontrollers, application specific integrated circuits (ASIC), programmable logic arrays (PLA), field programmable gate arrays (FPGA), or any other processor or hardware suitable for executing instructions.
  • the processor 142 can be a single unit or a combination of two or more units. If the processor 142 includes two or more units, the units can be physically located in a single controller or in separate devices.
  • the processor 142 may be the same processor that operates the entire apparatus 102, or may be a separate processor.
  • the processor 142 can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters.
  • the high level parameters that define a particular haptic effect include magnitude, frequency and duration.
  • Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect.
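  • For illustration only (the class and field names below are assumptions, not the disclosure's data structures), such high level parameters might be grouped as:

```python
from dataclasses import dataclass

@dataclass
class HighLevelHapticEffect:
    magnitude: float  # normalized 0.0 - 1.0
    frequency: float  # Hz
    duration: float   # seconds

# The processor can decide which effects to play and in what order,
# e.g., a sharp click followed by a longer, softer rumble.
effect_queue = [
    HighLevelHapticEffect(magnitude=0.9, frequency=200.0, duration=0.02),
    HighLevelHapticEffect(magnitude=0.5, frequency=60.0, duration=0.30),
]
```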
  • the processor 142 receives signals or data from the input device 162 and outputs control signals to drive the actuator drive circuit 154.
  • Data received by the processor 142 can be any type of parameters, instructions, flags, or other information that is processed by the processors, program modules, and other hardware disclosed herein.
  • the I/O controller 144 is circuitry that monitors operation of the apparatus 102 and peripheral or external devices such as the user interface 150.
  • the I/O controller 144 also manages data flow between the apparatus 102 and the peripheral devices and frees the processor 142 from details associated with monitoring and controlling the peripheral devices. Examples of other peripheral or external devices with which the I/O controller 144 can interface include external storage devices, monitors, input devices such as controllers, keyboards and pointing devices, external computing devices, antennas, other articles worn by a person, and any other remote devices.
  • the memory 146 can be any type of storage device or computer-readable medium such as random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory, magnetic memory, optical memory, or any other suitable memory technology.
  • the memory 146 also can include a combination of volatile and nonvolatile memory.
  • the memory 146 stores instructions executed by the processor 142.
  • the memory 146 may also be located internal to the processor 142, or any combination of internal and external memory.
  • the network interface controller (NIC) 148 is in electrical communication with a network to provide communication (either wireless or wired) between the apparatus 102 and remote devices. Communication can be according to any wireless transmission techniques including standards such as Bluetooth, cellular standards (e.g., CDMA, GPRS, GSM, 2.5G, 3G, 3.5G, 4G), WiGig, IEEE 802.11a/b/g/n/ac, IEEE 802.16 (e.g., WiMax).
  • the NIC 148 also can provide wired communication between the apparatus 102 and remote devices through wired connections using any suitable port and connector for transmitting data and according to any suitable standards such as RS-232, USB, FireWire, Ethernet, MIDI, eSATA, or Thunderbolt.
  • the user interface 150 can include an input device 162 and an output device 164.
  • the input device 162 includes any device or mechanism through which a user can input parameters, commands, and other information into the apparatus 102.
  • the input device 162 is configured to monitor or detect one or more events associated with the haptic enabled apparatus 102 or a user of the haptic enabled apparatus 102, or one or more events performed by the user, of which the user can be informed with a haptic feedback.
  • the input device 162 is any device that inputs a signal into the processor 142.
  • Examples of the input device 162 include touchscreens, touch sensitive surfaces, cameras, mechanical inputs such as buttons and switches, and other types of input components, such as a mouse, touchpad, mini-joystick, scroll wheel, trackball, game pads or game controllers.
  • Other examples of the input device 162 include a control device such as a key, button, switch or other type of user interfaces.
  • Yet other examples of the input device 162 include a transducer that inputs a signal into the processor 142. Examples of transducers that can be used as the input device 162 include one or more antennas and sensors.
  • the input device 162 includes the dynamic system monitoring device 152 as described herein. In other examples, the dynamic system monitoring device 152 includes the input device 162.
  • Yet other examples of the input device 162 include removable memory readers for portable memory, such as flash memory, magnetic memory, optical memory, or any other suitable memory technology.
  • the output device 164 includes any device or mechanism that presents information to a user in various formats, such as visual and audible formats. Examples of output device 164 include display screens, speakers, lights, and other types of output components. The output device 164 can also include removable memory readers. In one embodiment, the input device 162 and the output device 164 are integrally formed, such as a touch-sensitive display screen.
  • the dynamic system monitoring device 152 operates to monitor characteristics of the dynamic system 106.
  • the dynamic system monitoring device 152 detects the dynamic system parameters 120 in real time when the apparatus 102 is in use or manipulated by a user U.
  • the dynamic system monitoring device 152 includes one or more sensors of various types, which may be incorporated in the apparatus 102 or connected to the apparatus 102.
  • the dynamic system monitoring device 152 can include the input device 162 of the apparatus 102.
  • the dynamic system monitoring device 152 can also be referred to as the input device.
  • Sensors can be any instruments or other devices that output signals in response to receiving stimuli.
  • the sensors can be hardwired to the processor or can be connected to the processor wirelessly.
  • the sensors can be used to detect or sense a variety of different conditions, events, environmental conditions, the operation or condition of the apparatus 102, the presence of other people or objects, or any other condition or thing capable of stimulating sensors.
  • Examples of sensors include acoustical or sound sensors such as microphones; vibration sensors; chemical and particle sensors such as breathalyzers, carbon monoxide and carbon dioxide sensors, and Geiger counters; electrical and magnetic sensors such as voltage detectors or hall-effect sensors; flow sensors; navigational sensors or instruments such as GPS receivers, altimeters, gyroscopes, magnetometers or accelerometers; position, proximity, and movement-related sensors such as piezoelectric materials, rangefinders, odometers, speedometers, shock detectors; imaging and other optical sensors such as charge-coupled devices (CCD), CMOS sensors, infrared sensors, and photodetectors; pressure sensors such as barometers, piezometers, and tactile sensors; force sensors such as piezoelectric sensors and strain gauges; temperature and heat sensors such as thermometers, calorimeters, thermistors, thermocouples, and pyrometers; and proximity and presence sensors such as motion detectors, triangulation sensors, radars, photo cells, sonars, and hall-effect sensors.
  • Various embodiments can include a single sensor, or two or more sensors of the same or different types.
  • the actuator drive circuit 154 is a circuit that receives a haptic signal (which is also referred to herein as a control signal) from the actuator drive module 178.
  • the haptic signal embodies haptic data associated with haptic effects, and the haptic data defines parameters the actuator drive circuit 154 uses to generate an actuator drive signal.
  • such parameters relate to, or are associated with, electrical characteristics. Examples of electrical characteristics that can be defined by the haptic data include frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event.
  • the actuator drive signal is applied to the actuator 156 to cause one or more haptic effects.
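  • A hedged sketch of how haptic data of this kind could be turned into a drive waveform (the linear attack/fade envelope and the 8 kHz sample rate are assumptions, not the disclosure's drive circuit behavior):

```python
import numpy as np

def synthesize_drive_signal(amplitude, frequency, duration,
                            attack_time, fade_time, sample_rate=8000):
    """Build a vibrotactile drive waveform from haptic data parameters:
    amplitude, frequency, duration, attack time, and fade time."""
    t = np.arange(int(duration * sample_rate)) / sample_rate
    carrier = np.sin(2.0 * np.pi * frequency * t)
    envelope = np.ones_like(t)
    attack = t < attack_time
    envelope[attack] = t[attack] / max(attack_time, 1e-9)            # ramp up
    fade = t > (duration - fade_time)
    envelope[fade] = (duration - t[fade]) / max(fade_time, 1e-9)     # ramp down
    return amplitude * envelope * carrier

# 250 ms effect at 170 Hz with a 20 ms attack and a 50 ms fade.
drive_signal = synthesize_drive_signal(0.8, 170.0, 0.25, 0.02, 0.05)
```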
  • the actuator 156, which is also referred to herein as a haptic actuator or a haptic output device, operates to generate haptic effects.
  • the actuator 156 is controlled by the processor 142 that executes the actuator drive module 178, which sends a haptic signal to the actuator drive circuit 154.
  • the actuator drive circuit 154 then generates and applies an actuator drive signal to the actuator 156 to drive the actuator 156.
  • an actuator drive signal causes the actuator 156 to generate haptic effects by activating and braking the actuator 156.
  • the actuator 156 can be of various types.
  • the actuator is a resonant actuator, such as a linear resonant actuator (LRA), in which a mass attached to a spring is driven back and forth, or a solenoid resonant actuator (SRA).
  • Actuators 156 also broadly include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on.
  • the apparatus 102 may include more than one actuator 156, and each actuator may include a separate actuator drive circuit 154, all coupled to the processor 142. In embodiments with more than one actuator, each actuator can have a different output capability in order to create a wide range of haptic effects on the device.
  • the dynamic system characterization database 158 operates to store various data from the dynamic system characterization module 174 and/or the haptic effect rendering module 176.
  • the dynamic system characterization database 158 stores data of characteristics or properties of the dynamic system 106, such as dynamic system characterization model 200 and/or dynamic system parameters 120 (as shown in FIG. 4 ).
  • the dynamic system characterization database 158 is updated as the dynamic system characterization model 200 is updated.
  • the dynamic system parameters 120 are revised or updated in the database 158 as any change to the dynamic system parameters 120 is detected.
  • the dynamic system characterization database 158 is configured as a secondary storage device (such as a hard disk drive, flash memory cards, digital video disks, compact disc read only memories, digital versatile disk read only memories, random access memories, or read only memories) for storing digital data.
  • the secondary storage device is connected to the bus 140.
  • the secondary storage devices and their associated computer readable media provide nonvolatile storage of computer readable instructions (including application programs and program modules), data structures, and other data for the apparatus 102.
  • Although the secondary storage device for the database 158 is described as included in the apparatus 102, it is understood that the secondary storage device is a separate device from the apparatus 102 in other embodiments.
  • the database 158 is included in the memory 146.
  • the memory 146 can store a number of program modules for execution by the processor 142, including a user input acquisition module 172, a dynamic system characterization module 174, a haptic effect rendering module 176, an actuator drive module 178, and a communication module 180.
  • Each module is a collection of data, routines, objects, calls, and other instructions that perform one or more particular tasks.
  • the various instructions and tasks described herein can be performed by a single module, different combinations of modules, modules other than those disclosed herein, or modules executed by remote devices that are in communication, either wirelessly or by wire, with the apparatus 102.
  • the user input acquisition module 172 includes instructions that, when executed by the processor 142, cause the processor 142 to receive user inputs of one or more parameters associated with haptic effects or haptic effect modifiers.
  • the user input acquisition module 172 can communicate with the input device 162 of the user interface 150 and enable a user to input such parameters through the input device 162.
  • the user input acquisition module 172 provides a graphical user interface on a display screen (i.e., the input device 162) that allows a user to enter or select one or more parameters for haptic effects.
  • the dynamic system characterization module 174 includes instructions that, when executed by the processor 142, cause the processor 142 to receive signals from the dynamic system monitoring device 152 and obtain the dynamic system parameters 120 based on the received signals.
  • the dynamic system characterization module 174 further operates to generate and update a dynamic system characterization model 200 ( FIG. 4 ) based on the dynamic system parameters 120.
  • An example of the dynamic system characterization model 200 is further described and illustrated herein, for example with reference to FIGS. 3 and 4 .
  • the haptic effect rendering module 176 includes instructions that, when executed by the processor 142, cause the processor 142 to render haptic effects on the haptic enabled apparatus 102.
  • the haptic effect rendering module 176 generates haptic data, or a haptic effect rendering model 210 ( FIG. 5 ), which defines the haptic data parameters 202 ( FIGS. 4 and 5 ) the actuator drive circuit 154 uses to generate an actuator drive signal.
  • haptic data parameters 202 relate to, or are associated with, characteristics of the haptic drive signals, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event.
  • the actuator drive signal is applied to the actuator 156 to cause one or more haptic effects.
  • the actuator drive module 178 include instructions that, when executed by the processor 142, cause the processor 142 to generate control signals for the actuator drive circuit 154.
  • the actuator drive module 178 can also determine feedback from the actuator 156 and adjust the control signals accordingly.
  • the communication module 180 facilitates communication between the apparatus 102 and remote devices.
  • remote devices include computing devices, sensors, actuators, networking equipment such as routers and hotspots, vehicles, exercise equipment, and smart appliances.
  • computing devices include servers, desktop computers, laptop computers, tablets, smartphones, home automation computers and controllers, and any other device that is programmable.
  • the communication can take any form suitable for data communication including communication over wireless or wired signal or data paths.
  • the communication module may configure the apparatus 102 as a centralized controller of the system 100 or other remote devices, as a peer that communicates with other computing devices or other remote devices, or as a hybrid centralized controller and peer such that the controller can operate as a centralized controller in some circumstances and as a peer in other circumstances.
  • program modules are possible. For example, some alternative embodiments might have more or fewer program modules than the modules illustrated in FIG. 2 . In some possible embodiments, one or more of the program modules are in remote devices such as remote computing devices or other wearable articles.
  • the adaptive haptic effect rendering device 104 as illustrated in FIG. 1 may include one or more of the elements, devices, and components of the haptic enabled apparatus 102, as described with reference to FIG. 2 .
  • the adaptive haptic effect rendering device 104 can include the processor 142, the memory 146 including at least one of the modules 172, 174, 176, 178, and 180, the dynamic system monitoring device 152, the actuator drive circuit 154, and the actuator 156.
  • the adaptive haptic effect rendering device 104 can include more or fewer elements, devices, and components than illustrated in FIG. 2 .
  • the adaptive haptic effect rendering device 104 can be configured separately from the haptic enabled apparatus 102.
  • the adaptive haptic effect rendering device 104 is configured as part of a server computing device that communicates with the haptic enabled apparatus 102 via a network.
  • FIG. 3 is a flowchart of an example method 300 for generating adaptive haptic effects.
  • the haptic enabled apparatus 102 including the adaptive haptic effect rendering device 104 generally performs the method 300. Therefore, the method 300 is primarily described below as being performed by the haptic enabled apparatus 102.
  • the adaptive haptic effect rendering device 104 is implemented in a separate device from the haptic enabled apparatus 102 and configured to communicate with the apparatus 102 to perform at least some of the steps in the method 300.
  • the haptic enabled apparatus 102 obtains a dynamic system characterization model 200.
  • the haptic enabled apparatus 102 operates to generate a dynamic system characterization model 200.
  • the dynamic system characterization model 200 is generated using a separate computing device (e.g., a server computing device) and provided to the haptic enabled apparatus 102.
  • the dynamic system characterization model 200 is configured to represent the dynamic system 106.
  • the dynamic system characterization model 200 is built based on the dynamic system parameters 120 and can be configured in the form of a transformation matrix that correlates the dynamic system parameters 120 with the haptic data parameters 202 of the haptic data, as further illustrated in FIG. 4 .
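  • As an illustration of such a transformation matrix (every number below is invented for the sketch), monitored dynamic system parameters can be mapped linearly onto adjustments of the haptic data parameters:

```python
import numpy as np

# Rows: haptic data parameter adjustments (amplitude gain delta, frequency delta in Hz);
# columns: dynamic system parameters (grip strength 0-1, effective mass deviation in kg).
T = np.array([
    [0.6,    0.3],  # amplitude responds mostly to grip strength
    [0.0, -200.0],  # drive frequency shifts down as effective mass grows
])

dynamic_params = np.array([0.8, 0.05])  # tight grip, +50 g of effective mass
adjustment = T @ dynamic_params         # -> [gain delta, frequency delta]

amplitude = min(0.5 * (1.0 + adjustment[0]), 1.0)  # 0.5 is the standard amplitude
frequency = 170.0 + adjustment[1]                  # shifted drive frequency
```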
  • the characterization data of the dynamic system 106, which include the dynamic system parameters 120, can be stored in a database, such as the database 158 ( FIG. 2 ), and updated as changes to the dynamic system 106 are detected.
  • the dynamic system characterization model 200 is configured to render a standard haptic effect representation by default.
  • the dynamic system characterization model 200 is configured such that standard haptic effects are rendered until any change to the dynamic system parameters 120 of the dynamic system 106 is detected and thereby the dynamic system characterization model 200 is updated.
  • the haptic enabled apparatus 102 operates to detect characteristics of the dynamic system 106.
  • the haptic enabled apparatus 102 monitors a change to the dynamic system parameters 120 in real time when the haptic enabled apparatus 102 is used or manipulated by a user U.
  • the haptic enabled apparatus 102 monitors the dynamic system parameters 120 periodically (or at predetermined intervals).
  • the haptic enabled apparatus 102 obtains the dynamic system parameters 120 when a change to the dynamic system parameters 120 is detected.
  • the haptic enabled apparatus 102 detects the dynamic system parameters 120 at random times.
  • the haptic enabled apparatus 102 operates the dynamic system monitoring device 152 to dynamically monitor the dynamic system parameters 120 of the dynamic system 106.
  • the dynamic system monitoring device 152 operates to detect the dynamic system parameters 120 or a change thereto, and generates a sensor signal (also referred to herein as an input device signal) representative of the dynamic system parameters 120.
  • the haptic enabled apparatus 102 operates to receive the sensor signal from the dynamic system monitoring device 152 and process the sensor signal to obtain the dynamic system parameters 120.
  • the haptic enabled apparatus 102 actively uses one or more sensors to monitor changes in the dynamic system 106, and, as described below, implements an algorithm to constantly re-characterize the dynamic system 106.
  • the sensors include accelerometers to parameterize mass, cameras to parameterize stiffness, and force sensors to measure grip strength and biometrics (e.g., to monitor muscle tension).
  • the sensed properties (e.g., the dynamic system parameters) of the dynamic system 106 can include physical properties (e.g., mass, friction, and damping parameters), abstract system characteristic properties (e.g., amplitude and controller settings), and user characteristics (e.g., the user's grip strength (such as weak grips or tight grips), the user's likely sensitivity level to vibrations at different frequencies at different body sites, and the user's skin conductance level (e.g., a change in conductance as a user touches a touch screen with a bare finger or a gloved finger)).
  • the dynamic system associated with the user is created and/or modified such that lower frequency vibrotactile effects are amplified more than higher frequency vibrotactile effects when the haptic effect is to be generated at the user's right hand.
  • such user characteristics can be identified or detected in various manners, such as by using sensing devices of the haptic enabled apparatus or by using the user's medical records.
  • the haptic enabled apparatus 102 operates to update the dynamic system characterization model 200 based on the detected dynamic system parameters 120.
  • the dynamic system characterization model 200 can be updated based on the received sensor signal representative of the detected dynamic system parameters 120.
  • the updated dynamic system characterization model 200 is used to transform standard haptic data (i.e., a standard haptic rendering model), thereby rendering haptic effects more appropriate to the changed dynamic system.
  • standard haptic data can also be referred to herein as base haptic data or universal haptic data.
  • the database 158 can be updated according to the detected dynamic system parameters 120.
  • the haptic enabled apparatus 102 operates to modify the haptic effect rendering model 210 (also referred to herein as the haptic data) based on the updated dynamic system characterization model 200.
  • the haptic effect rendering model 210 can define the haptic data parameters 202.
  • Some examples of haptic data parameters 202 relate to, or are associated with, characteristics of the haptic drive signals, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event.
  • As described herein and illustrated in FIG. 4 , the dynamic system characterization model 200 correlates the characteristics (e.g., the dynamic system parameters 120) of the dynamic system 106 with the haptic data (e.g., the haptic data parameters 202). Therefore, when the dynamic system characterization model 200 is updated with the monitored dynamic system parameters 120, the haptic data parameters 202 are updated correspondingly, and thus the haptic effect rendering model 210 is also modified accordingly.
  • a standard haptic data can be modified to increase the amplitude of haptic rendering as a user tightens the grip on the haptic enabled apparatus, so that the user can perceive the haptic rendering as consistent and more similar to the haptic designer's intended sensation or feel.
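  • A sketch of that grip-dependent modification (the linear gain and the 30 N full-grip reference are assumptions):

```python
def compensate_amplitude(base_amplitude, grip_force_n, full_grip_n=30.0):
    """A tighter grip damps vibration at the skin, so the standard haptic
    data is scaled up with the measured grip force (in newtons)."""
    gain = 1.0 + 0.5 * min(grip_force_n / full_grip_n, 1.0)
    return min(base_amplitude * gain, 1.0)

compensate_amplitude(0.5, 5.0)   # light, resting grip -> ~0.54
compensate_amplitude(0.5, 30.0)  # tight grip          -> 0.75
```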
  • Equation (1) is used for a rotary control, such as a torque control or a radian control, and its output is a torque applied to the actuator (units are in mNm, radians, and seconds unless noted otherwise). In some embodiments, the haptic effect may be rendered using Equation (1).
  • the output of Equation (1) (e.g., a torque value) is used to generate a haptic signal of various characteristics, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event. The haptic signal is then applied to the actuator to cause one or more haptic effects.
  • similarly, the output of Equation (2) (e.g., a force value) is used to generate a haptic signal of various characteristics, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event. The haptic signal is then applied to the actuator to cause one or more haptic effects.
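  • Equations (1) and (2) are not reproduced above. Assuming the standard stiffness-damping-inertia model that the disclosure associates with position, velocity, and acceleration effects, they plausibly take forms such as the following (an assumed reconstruction, not the patent's verbatim equations):

```latex
% Assumed reconstruction. Equation (1), rotary control:
% torque \tau in mNm, angle \theta in radians, time in seconds.
\tau = k\,\theta + b\,\dot{\theta} + j\,\ddot{\theta} \tag{1}

% Equation (2), linear control: the output F is a force value.
F = k\,x + b\,\dot{x} + m\,\ddot{x} \tag{2}
```

  • Here k is a stiffness, b a damping coefficient, and j (or m) a rotary inertia (or effective mass), i.e., the position, velocity, and acceleration related dynamic system parameters discussed above.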
  • the haptic enabled apparatus 102 operates to generate a haptic signal 212 based on the modified haptic effect rendering model 210.
  • the haptic effect rendering model 210 is used to convert the haptic data parameters 202 to a haptic signal 212 usable to drive the actuator 156.
  • the haptic enabled apparatus 102 operates to control the haptic actuator 156 using the haptic signal 212.
  • the haptic signal 212 is provided to the actuator drive circuit 154, which then generates and applies an actuator drive signal to the haptic actuator 156, thereby driving the haptic actuator 156.
  • the haptic actuator 156 is driven by the actuator drive signal to provide a haptic effect adapted to the monitored dynamic system parameters 120 of the dynamic system 106.
  • FIG. 6 illustrates two example situations, such as Situation A and Situation B, where the haptic enabled apparatus 102 generates different haptic effects to a user U who grips the haptic enabled apparatus 102 in different manners.
  • In Situation A (a left diagram of FIG. 6 ), the user U is holding a haptic enabled apparatus 102 (e.g., a smartphone) that includes a high dynamic range vibrotactile actuator configured to render high definition haptic feedback, such as button clicks, emoji actions, and other haptic effects, as the user sends messages.
  • the dynamic system monitoring device 152 of the apparatus 102 operates to detect the user's very light, resting grips on the apparatus 102 and dynamically updates the haptic renderings to accommodate the user's grip style in this situation.
  • the user's grip changes to Situation B (a right diagram of FIG. 6 ) where the user U holds the apparatus 102 tightly in her left hand while jogging.
  • the user U interacts with a music streaming application using only her left hand and thumb, still gripping tightly.
  • the dynamic system monitoring device 152 of the apparatus 102 operates to detect changes to the dynamic system parameters, such as an increased effective mass, stiffness, and damping, which are caused by the user's different grip and manipulation of the apparatus 102.
  • the dynamic system 106 includes at least the apparatus 102 and the user's hand holding the apparatus 102.
  • the frequency dynamics and the intensity profiles of the haptic renderings of the music streaming application running on the apparatus 102 can be adapted on the apparatus 102 to account for the user's stronger grip.
  • FIG. 7 illustrates two example situations, such as Situation A and Situation B, where the haptic enabled apparatus 102 generates different haptic effects depending on physical properties surrounding the apparatus 102.
  • In Situation A (a left diagram of FIG. 7 ), the user U holds the apparatus 102 (e.g., a smartphone) in hand. In Situation B (a right diagram of FIG. 7 ), the apparatus 102 is mounted to a head-mounted display (HMD) 402.
  • the apparatus 102 operates to detect that the apparatus 102 is now physically coupled to the HMD 402, instead of the user's hand.
  • the dynamic system monitoring device 152 of the apparatus 102 operates to detect changes to the dynamic system 106, such as the changed physical properties of the apparatus 102 coupled to the HMD 402, such as its total mass and damping associated with the HMD 402 (e.g., a foam padding of the HMD).
  • sensors in the apparatus 102, as well as look-up tables associated with the physical characteristics of the HMD 402 can be used to automatically perform a system characterization of the physical dynamics of the dynamic system 106 including the HMD 402 mounting the apparatus 102 and the user's head.
  • the dynamic system characterization model 200 which can include a transformation matrix in some examples, can be created and/or updated to account for the changed characteristics of the dynamic system 106 while the user U interacts with the apparatus 102 in the HMD 402.
  • the frequency distribution of the vibrotactile acceleration profiles of the haptic renderings for the user's virtual reality games running on the apparatus 102 coupled to the HMD 402 can automatically adjust to optimally render haptic effects adapted to the user U.
  • FIG. 8 illustrates three example situations, such as Situation A, Situation B, and Situation C, wherein the apparatus 102 generates different haptic effects depending on different modes of operation.
  • the apparatus 102 includes various components of the Nintendo Switch console, available from Nintendo Co., Ltd. (Kyoto, Japan).
  • In Situation A, a user U is playing with the apparatus 102 in a configuration called "Handheld Mode with Joy-Cons Attached" where controllers 410 and 412 ("Joy-Cons") are attached to opposite sides of a portable display screen 414.
  • the user U changes the configuration of the apparatus 102 to "TV Mode with Joy-Con Grip" where the controllers 410 and 412 are joined together into a grip device 416 ("Joy-Con Grip").
  • Haptic effects relayed to the controllers 410 and 412 are automatically modified according to the system and method described herein in order to account for the differences in dynamic system parameters, such as stiffness, mass, etc., in different configurations.
  • the user U changes the controller configuration to a pair of separated controllers 410 and 412 ("Individual Joy-Cons") in Situation C.
  • the haptic effects are adapted to the physical change to the controller configuration (e.g., lower mass and stiffness) and the user characteristics (e.g., modified grip strength with more skin surface contact).
  • haptic effects can be rendered differently to different users to accommodate their different gripping forces, stiffness, and mass with respect to the controllers.
  • the haptic effects are adapted (e.g., magnitude increased) to accommodate the first user's hands and controller configuration which have been actively monitored and updated as described herein.
  • Referring to FIGS. 9 and 10 , examples of the dynamic system characterization model 200 are described and illustrated.
  • FIG. 9 illustrates an example of the dynamic system characterization model 200 in a decision tree format, and FIG. 10 illustrates an example of the dynamic system characterization model 200 in a lookup table format.
  • the dynamic system characterization model 200 can have one or more predetermined ranges for each of the dynamic system parameters 120, which determine adjustment of one or more haptic data parameters 202 of the haptic effect rendering model 210.
  • the decision tree for the dynamic system characterization model 200 includes decision nodes which represent dynamic system parameters 120 and can branch out to a plurality of end nodes for setting up haptic profiles (e.g., Value 1, 2, 3, ...) depending on the ranges for the dynamic system parameters.
  • a decision node for a grip strength can be categorized into three end nodes depending on the range of the grip strength, and each end node indicates a haptic profile to be set up for the haptic data parameter 202.
  • other decision nodes can be used for other dynamic system parameters, such as effective mass and temperature in the illustration of FIG. 9 .
  • the haptic profile being set up can be a mass value or a value related thereto.
  • the haptic profile being set up can be a temperature value or a value related thereto, such as heat flux, temperature, or other thermal related feedback.
  • humidity is another environmental condition that may be associated with one or more particular haptic profiles.
  • the lookup table for the dynamic system characterization model 200 can be used to implement a similar algorithm as illustrated in FIG. 9 .
  • the lookup table for the dynamic system characterization model 200 includes columns for dynamic system parameters, ranges, and haptic data parameters. For each dynamic system parameter, different values (e.g., Values 1, 2, 3, ...) are set up for haptic profiles depending on different ranges of the dynamic system parameter.
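  • A sketch of such a range-based lookup (all ranges and profile values below are invented for illustration):

```python
# (lower bound, upper bound, haptic profile) per dynamic system parameter range;
# here the profile is an amplitude gain keyed to grip strength in newtons.
GRIP_STRENGTH_TABLE = [
    (0.0, 10.0, 1.0),           # Value 1: light grip, standard amplitude
    (10.0, 20.0, 1.25),         # Value 2: medium grip
    (20.0, float("inf"), 1.5),  # Value 3: tight grip, strongest compensation
]

def lookup_haptic_profile(table, parameter_value):
    for low, high, profile in table:
        if low <= parameter_value < high:
            return profile
    raise ValueError("parameter value outside all configured ranges")

gain = lookup_haptic_profile(GRIP_STRENGTH_TABLE, 22.0)  # -> 1.5
```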
  • the haptic profile can represent one or more characteristics of the haptic drive signals, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event.
  • the haptic profile can be one or a combination of various characteristics of the haptic drive signals, thereby mapping different haptic responses for different system configurations.
  • the system configurations can change due to various factors, examples of which include the apparatus 102 being held by a user hand with variable parameters, such as grip strength, effective mass, and temperature.
  • the haptic profile being set up can be a haptic amplitude.
  • different amplitudes can compensate for different grip strengths.
  • a higher amplitude haptic effect can be rendered to compensate for the reduced vibration intensity that results when a user holds a device tightly.
  • the unit of amplitude can be a unitless scale from 0 to 1, a voltage (e.g., for driving a haptic actuator), or any other suitable characteristic.
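As a rough illustration of such compensation, the sketch below scales a base amplitude up with measured grip strength and clamps the result to the unitless 0-1 scale; the gain constant is an assumption for the example.

```python
# Sketch of grip-strength compensation on the unitless 0-1 amplitude
# scale: a tighter grip damps the perceived vibration, so the drive
# amplitude is scaled up. The gain constant is an illustrative assumption.

def compensated_amplitude(base_amplitude: float,
                          grip_strength_n: float,
                          gain_per_newton: float = 0.02) -> float:
    """Raise amplitude in proportion to grip strength, clamped to [0, 1]."""
    return min(1.0, base_amplitude * (1.0 + gain_per_newton * grip_strength_n))
```

For example, a base amplitude of 0.5 with a 20 N grip becomes 0.5 × (1 + 0.4) = 0.7.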
  • the haptic profile being set up can be a haptic frequency.
  • the haptic frequency can map to a resonant frequency of the apparatus 102 (which can change in the context of the dynamic system 106; in this sense, a resonant frequency may also be referred to as a resonant frequency of the dynamic system 106).
  • Such mapping of the haptic frequency to the resonant frequency of the apparatus 102 can help convey a stronger haptic effect.
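One way to picture this mapping: if the hand-plus-controller system is approximated (purely as an assumption for illustration) by a single mass-spring system, its resonance is f = √(k/m) / 2π, and the drive frequency can track that value as the identified stiffness and effective mass change.

```python
import math

# Sketch of mapping the haptic drive frequency to the estimated resonance
# of the dynamic system, modeled here (an assumption) as a single
# mass-spring system: f_res = sqrt(k / m) / (2 * pi).

def resonant_drive_frequency_hz(stiffness_n_per_m: float,
                                effective_mass_kg: float) -> float:
    """Drive frequency (Hz) matched to the identified resonance."""
    return math.sqrt(stiffness_n_per_m / effective_mass_kg) / (2.0 * math.pi)

# e.g. k = 4000 N/m and m = 0.25 kg give a resonance near 20 Hz,
# so the actuator would be driven at roughly 20 Hz.
```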
  • the dynamic system characterization model 200 incorporates a machine learning algorithm configured to learn the dynamic system 106 (e.g., a user behavior) with respect to the haptic enabled apparatus 102.
  • a machine learning algorithm allows the system to anticipate a user's behavior when interacting with the haptic enabled apparatus 102 and to update the dynamic system characterization model 200 accordingly, thereby providing an improved user experience with haptic effects adapted to the anticipated change in user behavior.
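As a stand-in for a full machine learning pipeline, the sketch below tracks a user's grip strength with an exponentially weighted moving average so the characterization model can be updated before the next effect is rendered; using an EWMA in place of a learned model is an assumption made for brevity.

```python
# Sketch of a lightweight "learned" estimate of user behavior: an
# exponentially weighted moving average of grip strength that the
# characterization model can consult before rendering the next effect.
# The EWMA stands in for a full machine learning model (an assumption).

class GripStrengthPredictor:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha       # smoothing factor, 0 < alpha <= 1
        self.estimate = None     # anticipated grip strength (N)

    def update(self, measured_grip_n: float) -> float:
        """Fold the latest sensor reading into the running estimate."""
        if self.estimate is None:
            self.estimate = measured_grip_n
        else:
            self.estimate = (self.alpha * measured_grip_n
                             + (1.0 - self.alpha) * self.estimate)
        return self.estimate
```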
  • the systems and methods for generating adaptive haptic effects in accordance with the present disclosure can also be applied to kinesthetic and temperature haptic feedback conditions.
  • an adjustment in force feedback can be made depending on a monitored stiffness of a user's muscles: if tense muscles are monitored, more force feedback is provided, and if relaxed muscles are monitored, less force feedback is provided.
  • a temperature adjustment can be made depending on the temperature of a haptic enabled device or the environment thereof. By way of example, if a warm climate is detected, a higher rendered temperature is generated and provided, and if a cool climate is detected, a lower rendered temperature is generated and provided.
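A compact sketch of both adjustments follows; the threshold, gains, and offsets are illustrative assumptions rather than values from this disclosure.

```python
# Sketch of the kinesthetic and thermal adaptations described above.
# The threshold, gains, and offsets are illustrative assumptions.

def force_feedback_gain(muscle_stiffness: float,
                        tense_threshold: float = 0.7) -> float:
    """More force feedback for tense muscles, less for relaxed ones
    (stiffness normalized to a 0-1 scale)."""
    return 1.5 if muscle_stiffness >= tense_threshold else 0.75

def rendered_temperature_c(ambient_c: float,
                           warm_threshold_c: float = 20.0,
                           offset_c: float = 10.0) -> float:
    """Render warmer than ambient in a warm climate, cooler in a cool one."""
    if ambient_c >= warm_threshold_c:
        return ambient_c + offset_c
    return ambient_c - offset_c
```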
  • the systems and methods of the present disclosure can be used for physics-based models involving velocity-based and acceleration-based effects, such as friction and mass, respectively.
  • the mathematical stability of physics-based models can be sensitive to the accuracy of the dynamic system identification.
  • a step for calibrating the dynamic system identification can be provided to improve the accuracy.
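One simple form such a calibration step could take is sketched below; the single-measurement interface and the Hooke's-law model are both illustrative assumptions.

```python
# Sketch of one possible calibration step: apply a known actuator force,
# measure the resulting displacement, and estimate stiffness via Hooke's
# law (k = F / x). The corrected stiffness can then replace the identified
# value before it is fed to a physics-based rendering model.

def calibrate_stiffness_n_per_m(applied_force_n: float,
                                measured_displacement_m: float) -> float:
    """Single-point Hooke's-law stiffness estimate."""
    if measured_displacement_m <= 0.0:
        raise ValueError("displacement must be positive for calibration")
    return applied_force_n / measured_displacement_m
```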

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/037,648 US20200026354A1 (en) 2018-07-17 2018-07-17 Adaptive haptic effect rendering based on dynamic system identification

Publications (1)

Publication Number Publication Date
EP3598273A1 (fr) 2020-01-22

Family

ID=67180595

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19184327.5A Withdrawn EP3598273A1 (fr) 2018-07-17 2019-07-04 Rendu à effet haptique adaptatif basé sur l'identification de système dynamique

Country Status (5)

Country Link
US (1) US20200026354A1 (fr)
EP (1) EP3598273A1 (fr)
JP (1) JP2020013549A (fr)
KR (1) KR20200008946A (fr)
CN (1) CN110727342A (fr)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016124275A1 (de) * 2016-12-13 2018-06-14 Brose Fahrzeugteile Gmbh & Co. Kommanditgesellschaft, Bamberg Method for controlling a motorized closure element arrangement of a motor vehicle
US11489958B2 (en) * 2018-05-02 2022-11-01 Telefonaktiebolaget Lm Ericsson (Publ) User equipment, method by a user equipment, and network node for generating a grip pattern notification through a user interface based on radio characteristics
GB2578454A (en) * 2018-10-28 2020-05-13 Cambridge Mechatronics Ltd Haptic feedback generation
KR102645332B1 (ko) * 2018-12-19 2024-03-11 Samsung Electronics Co., Ltd. Method and electronic device for interacting between a plurality of regions of a display
US11921923B2 (en) * 2019-07-30 2024-03-05 Maxim Integrated Products, Inc. Oscillation reduction in haptic vibrators by minimization of feedback acceleration
CN112631426A (zh) * 2020-12-21 2021-04-09 Ruisheng New Energy Development (Changzhou) Co., Ltd., Science and Education City Branch Method, apparatus, device, and storage medium for generating a dynamic haptic effect
WO2022178792A1 (fr) * 2021-02-26 2022-09-01 BOE Technology Group Co., Ltd. Haptic rendering system and control method
US11775071B1 (en) 2022-03-08 2023-10-03 Microsoft Technology Licensing, Llc Haptic feedback from a computing accessory
WO2023215975A1 (fr) * 2022-05-09 2023-11-16 D-Box Technologies Inc. Method and system for adaptive motion simulation in a game

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108883335A (zh) * 2015-04-14 2018-11-23 John James Daniels Wearable electronic multisensory interface for human-machine or human-human interaction
DK179823B1 (en) * 2016-06-12 2019-07-12 Apple Inc. DEVICES, METHODS, AND GRAPHICAL USER INTERFACES FOR PROVIDING HAPTIC FEEDBACK

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998049614A1 (fr) * 1997-04-25 1998-11-05 Immersion Corporation Method and apparatus for designing and controlling force sensations transmitted in force feedback computer applications
US20150035780A1 (en) * 2012-02-15 2015-02-05 Immersion Corporation Interactivity model for shared feedback on mobile devices
EP2778850A1 (fr) * 2013-03-15 2014-09-17 Immersion Corporation Systems and methods for parameter modification of haptic effects
EP3293621A2 (fr) * 2016-09-09 2018-03-14 Immersion Corporation Compensated haptic rendering for flexible electronic devices
WO2018113952A1 (fr) * 2016-12-21 2018-06-28 Telefonaktiebolaget Lm Ericsson (Publ) Method and arrangement for handling haptic feedback

Also Published As

Publication number Publication date
US20200026354A1 (en) 2020-01-23
KR20200008946A (ko) 2020-01-29
CN110727342A (zh) 2020-01-24
JP2020013549A (ja) 2020-01-23

Similar Documents

Publication Publication Date Title
EP3598273A1 (fr) Adaptive haptic effect rendering based on dynamic system identification
US10974138B2 (en) Haptic surround functionality
EP3588250A1 (fr) Real-world haptic interactions for a virtual reality user
US10564730B2 (en) Non-collocated haptic cues in immersive environments
CN110096131B (zh) 触感交互方法、装置、以及触感可穿戴设备
KR101666096B1 (ko) 강화된 제스처 기반 상호작용 시스템 및 방법
US9041647B2 (en) User interface device provided with surface haptic sensations
EP3364272A1 (fr) Automatic localized haptics generation system
EP3293621A2 (fr) Compensated haptic rendering for flexible electronic devices
US10477298B2 (en) Rendering haptics on headphones with non-audio data
US20180011538A1 (en) Multimodal haptic effects
US11880528B2 (en) Stimulus transmission device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200723