WO2012145264A1 - Electro-vibrotactile display - Google Patents
- Publication number
- WO2012145264A1 (PCT/US2012/033743)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- haptic effect
- interface device
- change
- generating
- electrostatic
- Prior art date
- 2011-04-22
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04809—Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
Definitions
- the system may also have a sensor that measures the simulated coefficient of friction. This may be the same sensor that measures the impedance, described above, or it may be a different sensor.
- the sensor may measure the simulated coefficient based on a measured pressure that the surface of the screen 110 is receiving, such as from an object touching the screen, and on the movement of the object at the surface. Movement of the object may be measured based on how the pressure at the surface changes over time and over locations on the surface. For example, the sensor may calculate a value representing the simulated coefficient of friction based on an acceleration of a user's finger on the screen 110 and on the pressure that the surface receives from the user's finger.
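The coefficient calculation described in the preceding item can be made concrete. Below is a minimal sketch, assuming the Coulomb relation (lateral force divided by normal force), an invented 10 g effective fingertip mass, and hypothetical sensor readings; none of the names or constants come from the patent:

```python
def simulated_friction_coefficient(pressure_pa, contact_area_m2,
                                   finger_accel_ms2, finger_mass_kg=0.01):
    """Estimate the simulated coefficient of friction at the surface.

    Hypothetical model: the lateral (friction) force is inferred from the
    finger's deceleration (F = m * a) and the normal force from the
    measured pressure times the contact area.
    """
    normal_force = pressure_pa * contact_area_m2            # newtons
    friction_force = finger_mass_kg * abs(finger_accel_ms2)  # newtons
    if normal_force <= 0.0:
        return 0.0
    return friction_force / normal_force

# Example: 5 kPa over 1 cm^2 of skin while the finger decelerates at 2 m/s^2.
print(simulated_friction_coefficient(5e3, 1e-4, -2.0))  # -> 0.04
```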
- the haptic effects and the sensors may be controlled by a controller.
- the controller may analyze the impedance, the simulated coefficient of friction, the surface pressure, a rate of movement measured at the surface, and other factors to determine whether there has been a triggering condition for a haptic effect or how forceful a haptic effect should be. Details of the controller are discussed below.
- FIG. 1B shows the haptic region 130 with a simulated friction coefficient or texture.
- the actuators and electrostatic device may generate a haptic effect that simulates a friction force corresponding to the simulated friction coefficient.
- the area of a haptic region can be much smaller or much bigger than that shown in FIGS. 1A and 1B, and its associated coefficient of friction or texture may change.
- FIG. 2 shows an embodiment of a haptic effect system 200.
- the system 200 is configured to provide a user interface through a display screen 210.
- the display screen 210 has a haptic region 230 that may simulate the coarse texture of bricks as a user's finger 140 moves across the region.
- the display screen 210 also has another haptic region 232 that simulates the bumpy texture of rocks as the user's finger 140 moves across the region.
- the system may generate haptic effects that simulate other textures, such as stickiness, roughness, or abrasiveness.
- the haptic effects may incorporate the heating or cooling from a thermal device to simulate both the texture and temperature.
- FIG. 3 shows another form of interaction that the haptic effect system 200 can provide.
- the system 200 can generate haptic effects to represent data elements such as file folders.
- the screen 210 may have three haptic regions 330, 332, and 334 that simulate different textures or coefficients of friction for different types of folders.
- the system 200 may provide a less forceful or no haptic effect when a user's finger 140 moves across region 330, which corresponds to folders representing visual data.
- the system 200 may provide a more forceful haptic effect when a user's finger 140 moves across region 332, which corresponds to a folder representing audio data, and finally may provide an even more forceful haptic effect when a user's finger moves across region 334, which corresponds to a folder representing protected data.
- the haptic regions can correspond to many different types of data elements, such as buttons, checkboxes, dropdown boxes, other form elements, icons, cursors, and windows.
- the system 200 may change the texture or friction coefficient of a region based on time. For example, if a haptic region corresponds to a button, the system 200 may change the texture or friction coefficient of the region between the first time and the second time that a user's finger 140 moves across the region. This change can reflect, for example, that the button has been depressed.
- a haptic region 430 corresponds to a slide bar (or scroll bar).
- the system 200 can generate a haptic effect that simulates a texture or friction coefficient so long as the finger 140 is still in the region.
- the system 200 may localize the haptic effect to the region on the slide bar where the finger is located.
- the system 200 may have a plurality of actuators at different locations under the screen 210, and it may activate only the actuator nearest to the finger 140.
- the system 200 may also be configured to adjust the haptic effect to adjust the simulated texture or coefficient of friction.
- a user may configure the system 200 to have a high simulated coefficient of friction in order to achieve more control while scrolling down the screen with the slide bar.
- the system may also automatically adjust the haptic effect based on, for example, the motion of the finger 140. For example, if the user's finger is attempting to scroll quickly, the system 200 may accommodate the user by sensing a high rate of motion on the screen and generating a less forceful haptic effect to simulate a lower coefficient of friction.
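A sketch of the two behaviors just described, localizing the effect to the nearest actuator and easing off the simulated friction as scrolling speeds up; the actuator layout and the 500 px/s roll-off are invented:

```python
def nearest_actuator(finger_xy, actuator_positions):
    """Return the actuator closest to the finger so only it is driven,
    conserving power; the (x, y) layout is hypothetical."""
    fx, fy = finger_xy
    return min(actuator_positions,
               key=lambda p: (p[0] - fx) ** 2 + (p[1] - fy) ** 2)

def slider_friction_level(scroll_speed_px_s, user_preference=1.0):
    """Lower the simulated friction as scrolling speeds up, scaled by a
    user-configured preference."""
    return user_preference / (1.0 + scroll_speed_px_s / 500.0)

# A fast scroll (2000 px/s) yields one fifth of the configured friction.
print(slider_friction_level(2000.0))  # -> 0.2
```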
- FIG. 5 shows an example that corresponds a haptic region to properties of text.
- the particular embodiment shows a haptic region 530 that corresponds to highlighting of text.
- the haptic region may instead correspond to underlined, italicized, bolded text, to text of a certain font, or any other property that can be attributed to the text.
- the haptic region may be assigned or modified as property is applied to the text. For example, the system may generate a haptic effect as the user is in the process of highlighting text.
- the haptic region can correspond not to just visible properties of text, but also to invisible properties, such as metadata.
- the system 200 may assign a haptic region to text that is a hyperlink.
- the system 200 may access an XML file to identify certain text that belongs to a certain section and assign a haptic region to that section.
- the friction coefficient of a region can change based on time or on movement on the screen. In this instance, when a user's finger 140 scrolls text to a different location on the screen 210, the system 200 is configured to effectively move the haptic region to the text's new location by setting a new simulated friction coefficient at the new location and, for example, setting the simulated friction coefficient at the old location to zero or some default value.
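The region relocation described above might look like the following sketch, which assumes haptic regions are stored in a dictionary keyed by screen rectangles (an invented representation):

```python
def move_haptic_region(regions, old_rect, new_rect, default_mu=0.0):
    """Give the text's new location its old simulated friction
    coefficient and reset the vacated location to zero or some default,
    as described above.  regions is a dict keyed by (x, y, w, h)."""
    if old_rect == new_rect:
        return regions
    regions[new_rect] = regions.get(old_rect, default_mu)
    regions[old_rect] = default_mu
    return regions
```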
- FIG. 6 shows yet another example that corresponds a haptic region to a location.
- haptic regions 632 may correspond to locations that have no particular relevance to a user and have a zero or some default simulated coefficient of friction.
- Haptic regions 630 and 634 may have different simulated textures or coefficients of friction that distinguish two different locations. The friction coefficients may act as an identifier for the location.
- the system 200 may also generate a haptic effect that produces a texture or friction force that simulates the terrain at the location.
- the haptic effect at haptic region 630 may simulate the texture of a sand lot, while the haptic effect at haptic region 634 may simulate the texture of a gravel track.
- the haptic effect is not limited to simulating the texture of terrain.
- the screen 210 may display garments, textiles, foods, and other materials and products.
- the system 200 can assign a haptic region that corresponds to the material or product displayed and generate a haptic effect that produces a texture or friction force that simulates that material or product.
- the haptic effect may be generated to simulate textures in drawings in general. For example, if the user's finger 140 were drawing on a drawing program rendered on the display screen 210, the system may generate a haptic effect that produces a texture that simulates, for example, the texture of crosshatches when a user's finger 140 is drawing crosshatches on the screen or a grainier texture when a user's finger 140 is drawing spray paint on the screen.
- FIG. 7 shows a simulation of a surface feature, specifically a raised edge.
- the simulated raised edge 731 may represent three-dimensional aspects of areas depicted on a screen, such as buttons, textboxes, icons, or other user input objects. For example, users who perceive the simulated raised edge of a button may know that their finger has moved over the top of the button without having to look at the screen.
- the simulated surface features may more generally convey height or depth at the user interface.
- the raised edge 731 may be defined by a time domain profile that applies a more intense haptic effect when a user's finger 140 is over haptic region 732, a less intense haptic effect when the user's finger 140 is over haptic region 730, and no haptic effect when the user's finger 140 is over haptic region 734.
- a haptic effect system may simulate a high friction coefficient when the user's finger 140 moves over region 732 in order to represent the rising slope of the raised edge 731.
- the haptic system may simulate a second, lower friction coefficient when the user's finger 140 moves over region 730 in order to represent the flat top of an object.
- the haptic system may simulate an even lower coefficient of friction (e.g., zero) when the finger 140 moves over region 734 to represent a down slope.
- the system can vary the haptic effect to simulate a higher coefficient for the rise and a lower coefficient for the descent.
- in this profile, time is not an independent variable. Rather, the timing of the haptic effect depends on the position of the finger 140. Therefore, the haptic effects in the time domain profile are not always applied at fixed times, but may depend on how quickly a user's finger is approaching the haptic regions 730, 732, and 734.
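A position-driven profile of this kind could be sketched as below, with the three regions of FIG. 7 reduced to intervals along one axis and invented intensity levels of 1.0, 0.5, and 0.0:

```python
def edge_profile_intensity(x, rise_start, top_start, top_end):
    """Intensity of the haptic effect as a function of finger position
    across the raised edge 731: strongest on the rising slope (region
    732), weaker on the flat top (region 730), none past the far edge
    (region 734)."""
    if rise_start <= x < top_start:
        return 1.0   # rising slope: most intense effect
    if top_start <= x < top_end:
        return 0.5   # flat top: less intense effect
    return 0.0       # past (or before) the edge: no effect
```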
- a topography map may be used to assign haptic regions and friction coefficients to locations on the user interface.
- the haptic regions are not limited to two simulated friction coefficients and may have more or fewer coefficients. More coefficients may be used to adjust the simulated friction based on the direction in which a user's finger is moving. For example, if a user's finger is approaching region 730 from a direction that corresponds to north, the system may apply one friction coefficient, whereas it may apply a different friction coefficient if a user's finger is approaching from a direction that corresponds to west. Having multiple coefficients can effectively subdivide the region 732 into separate sub-regions that correspond to different slopes. The system may also be configured to calculate the slope based on, for example, a topographical map by dividing the height difference between two levels in the map by the horizontal distance between the two levels.
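The slope calculation stated above is straightforward to work through; the friction mapping added here (a base value plus a gain times the slope) is an invented illustration, not the patent's formula:

```python
def slope_between_levels(height_a_m, height_b_m, horizontal_dist_m):
    """Slope between two contour levels of a topographical map: height
    difference divided by horizontal distance, as stated above."""
    return (height_b_m - height_a_m) / horizontal_dist_m

def friction_for_slope(slope, base_mu=0.2, gain=0.6):
    """Map steeper uphill slopes to higher simulated friction and
    downhill slopes to lower friction."""
    return max(0.0, base_mu + gain * slope)

# A 10 m rise over 50 m of horizontal distance gives a slope of 0.2.
print(friction_for_slope(slope_between_levels(100.0, 110.0, 50.0)))  # -> 0.32
```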
- the system may also be configured to calculate height differences in the image based on, for example, lighting differences, location of shadows, and other visual factors that can be used to measure height differences in an image.
- the system may incorporate height and terrain information to generate a haptic effect that simulates both the topography and texture of an object, of a terrain, or of some other type of image shown on the user interface.
- Actuators may also be used to change the perception of height or depth. For example, they may generate vibrations to signal a change from a region representing a slope to a region representing a plateau. Alternatively, the actuators may actually create height or depth by using, for example, rods or pins to change the surface relief.
- Haptic effects may be generated based not only on the measured surface pressure and rate of movement from an object at the surface, but more specifically on recognized gestures.
- FIGS. 8A and 8B show a haptic effect being generated in response to a gesture for flipping a page.
- the haptic effect system 800 has a display screen 810 that depicts a stack of pages 820 on the screen. Rather than assign a haptic region to a location on the screen 810, the system 800 may generate a haptic effect when it recognizes that the user's finger 140 is flipping a page in the stack 820. For example, the system may detect whether the user's finger starts at the lower right corner of the screen 810 and whether it moves to the left.
- the system 800 may detect whether the user's finger 140 follows the corner of the page. If it does, the system 800 may recognize this as a gesture for flipping a page and generate a haptic effect.
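A minimal sketch of the page-flip recognizer described above, assuming touch samples with a top-left origin, a 15% corner zone, and a purely leftward-motion test (all invented thresholds):

```python
def is_page_flip(touch_path, screen_w, screen_h, corner_frac=0.15):
    """Recognize a page-flip gesture: the finger starts near the
    lower-right corner of the screen and moves left.  touch_path is a
    list of (x, y) samples with the origin at the top-left."""
    if len(touch_path) < 2:
        return False
    (x0, y0), (x1, _) = touch_path[0], touch_path[-1]
    started_in_corner = (x0 > screen_w * (1.0 - corner_frac) and
                         y0 > screen_h * (1.0 - corner_frac))
    moved_left = x1 < x0
    return started_in_corner and moved_left

print(is_page_flip([(780, 580), (400, 500)], 800, 600))  # -> True
```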
- the system can be configured to recognize other gestures, such as a scrolling gesture, a zooming gesture, or any other finger movement that can be recognized.
- the system 800 may be configured to recognize the movement of more than one finger on the screen 810.
- the system 800 may also recognize other gestures based on two, three, four, or more fingers, such as a pinching gesture or rotate gesture.
- the system may also accommodate finger movements from more than one user.
- a haptic effect may also be initiated based on a video, graphic, or audio content shown at a user interface.
- FIG. 9 shows a haptic effect system 900 that generates a haptic effect to accompany a visual and audio indication of an incoming phone call.
- the system 900 may be configured to generate a haptic effect that, for example, simulates a friction coefficient or texture on the screen 910 when it displays the visual indication of a phone call.
- haptic effects can be generated on user interface surfaces other than display screens. Because controls for a device can be located on surfaces other than the display screen, the screen does not need to be able to receive user input. Instead, the screen may merely output visual and audio content and haptic effects while a slide bar 930 on a nondisplay surface of the system in FIG. 9, for example, may receive user input.
- the slide bar may be a control that answers an incoming call.
- the system may generate a haptic effect without a triggering event.
- an electrostatic device may apply a constant voltage beneath the slide bar regardless of the presence of a user or triggering event. Alternatively, it may generate a haptic effect only upon a triggering event, like an incoming call.
- FIG. 10 shows another nondisplay surface 1020.
- the surface may belong to a trackpad or a touchpad that provides input to a computer display screen 1010.
- the computer system 1000 may correspond regions on the screen 1010 to haptic regions on the touchpad 1020.
- the touchpad may generate an output signal that is received by the computing system's display 1010 and reflected as a cursor or pointer moving across region 1040.
- the touchpad 1020 may also receive an input signal from the computing system 1000 that causes a haptic effect to be generated at haptic region 1030.
- the touchpad can detect the movement and output a signal to the system 1000 that causes it to show a cursor or pointer to move across corresponding regions 1042 and 1044 on the display screen 1010.
- the system may send a signal to the touchpad 1020 that causes it to generate a first haptic effect as the user's finger 140 is moving across region 1032 and a second haptic effect as the user's finger 140 is moving across region 1034.
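The touchpad-to-screen correspondence could be sketched as follows, assuming a simple absolute scaling between the two coordinate spaces (a real trackpad may instead report relative motion) and an invented rectangle-to-effect table:

```python
def touchpad_to_screen(xy_pad, pad_size, screen_size):
    """Scale a touchpad coordinate into display coordinates so touchpad
    haptic regions can mirror on-screen regions."""
    (xp, yp), (pw, ph), (sw, sh) = xy_pad, pad_size, screen_size
    return (xp * sw / pw, yp * sh / ph)

def effect_for_position(xy_screen, region_effects):
    """Look up the haptic effect for the screen region under the mapped
    position; region_effects maps (x, y, w, h) -> an effect id."""
    x, y = xy_screen
    for (rx, ry, rw, rh), effect in region_effects.items():
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return effect
    return None
```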
- FIG. 11 shows another example where a haptic region may be in the form of a border region.
- the embodiment shows a file region 1140 containing one file 1144 and a file region 1142 containing no documents.
- the file regions may be windows or file folders depicted on the display screen 1010.
- the system 1000 may assign a haptic region 1130 that corresponds to the border of file region 1140 and may assign a haptic region 1132 that corresponds to the border of file region 1142.
- the file regions may be locked such that a file cannot be transferred from one region or cannot be written into another region.
- the system 1000 of FIG. 11 may provide a haptic effect when it detects that a user's finger 140, for example, is moving the file out of region 1140 or into region 1142. That is, the system may generate a haptic effect when the user's finger touches or comes near the edges of regions 1130 or 1132.
- the system 1000 may initiate the haptic effect only if the user is trying to drag a file. For example, the system 1000 may monitor motion at the touchpad 1020 to detect whether a user's finger began its motion at region 1134. If the user's finger began its movement at some other part of the touchpad 1020, the system 1000 may decide against generating a haptic effect.
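A sketch of that drag gating, assuming rectangles given as (x, y, w, h) and that the file's touchpad region is the hypothetical region 1134:

```python
def _inside(point, rect):
    x, y = point
    rx, ry, rw, rh = rect
    return rx <= x < rx + rw and ry <= y < ry + rh

def should_signal_border(drag_start, drag_now, file_region, border_region):
    """Fire the border haptic effect only when a drag that began on the
    file's region reaches a locked border; any touch that began
    elsewhere on the touchpad is ignored, as described above."""
    return _inside(drag_start, file_region) and _inside(drag_now, border_region)
```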
- FIG. 12 shows yet another example where the touchpad 1020 may simulate the friction or texture of locations corresponding to terrain on a screen 1010.
- the haptic effect in this embodiment simulates texture on a nondisplay surface like the touchpad 1020.
- the system 1000 may show a map with a clay tennis court at location 1240 and a hard court at location 1242.
- the system may assign a haptic region 1230 to correspond with the screen depiction of the clay court 1240, and similarly a haptic region 1232 for the depicted hard court 1242.
- the system may monitor surface movement and generate a haptic effect to produce a friction force or texture resembling a clay court when a user's finger moves across region 1230, and a friction force or texture resembling a hard court when the finger moves across region 1232.
- a haptic effect may be generated even for features that are not displayed on the display screen 1010.
- the system 1000 may assign a haptic region 1234 that may correspond to an underground tunnel that is not displayed on the display screen 1010 or is not visible to the user.
- the haptic regions are drawn in FIG. 12 for illustrative purposes, and may not actually be visible.
- the system 1000 may display a cursor, pointer, icon, or avatar on the screen to allow a user to see how his movement on the touchpad corresponds to locations on the display screen 1010.
- FIG. 13 shows yet another nondisplay surface, this time on a game remote control 1320, that may output a haptic effect.
- the game remote control 1320 may use actuators, electrostatic devices, thermal devices, or some combination thereof to generate a haptic effect at the surface of the control.
- the haptic effect may be triggered by an event depicted on a display screen 1310, by an acceleration or velocity of the remote control, by the position or orientation of the control 1320 relative to the screen 1310, by a sound, by the light level, or by some other trigger.
- such triggers may be used for any of the embodiments discussed.
- other examples of triggers include the temperature, humidity, lighting, other ambient conditions, and the surface contact area (e.g., with another object). These factors may serve not only as triggers, but also as determinants of how forceful a haptic effect is. For example, if a sensor detects dark or low lighting conditions, a more forceful haptic effect may be generated to compensate for the poorer visibility in those conditions.
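The lighting compensation mentioned above might reduce to something like this sketch, where the 50 lux threshold and the 1.5x boost are invented values:

```python
def compensate_for_ambient(base_intensity, ambient_lux,
                           dark_lux=50.0, boost=1.5):
    """Make the haptic effect more forceful in dark conditions to
    compensate for poorer visibility."""
    return base_intensity * (boost if ambient_lux < dark_lux else 1.0)
```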
- a haptic effect can be outputted on the surface of a switch, a knob, some other control instrument, a dashboard, some other board, or any other surface that can output a haptic effect.
- embodiments of the present invention may be used with deformable surfaces, such as surfaces that are adapted for gross deformations.
- FIG. 14 illustrates an embodiment of a module 1400 for generating a haptic effect.
- the module 1400 may be included in any of the embodiments of the haptic effect systems described herein.
- the module 1400 may contain a haptic device 1430 that generates one or more haptic effects, and may adjust the effect based on an impedance or simulated friction coefficient measured by a sensor 1440.
- the sensor data may be analyzed by a controller 1450 and stored in a memory 1420.
- the controller 1450 may be included as part of a processor 1410, or the controller may be a separate logic circuit.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
A system for generating a haptic effect with an electrostatic device, alone or in combination with actuators or other devices configured to generate a haptic effect. The system may generate the haptic effect to simulate a coefficient of friction or texture on the surface of a user interface, such as a display screen or touchpad. The simulated coefficient of friction may be adjusted to convey other properties on the user interface surface. The system may also include sensors that measure the simulated coefficient of friction and the impedance at the surface.
Description
ELECTRO-VIBROTACTILE DISPLAY
CROSS REFERENCE TO RELATED APPLICATIONS
(01) This application claims priority to U.S. Patent Application Serial No. 13/092,269, filed on April 22, 2011, the entire content of which is incorporated herein by reference.
FIELD
(02) The invention relates to an electro-vibrotactile display.
BACKGROUND
(03) Systems that provide haptic effects to users in conjunction with visual and/or audio content are known. It is generally understood that a haptic effect may enhance one or more aspects of the experience of the users associated with the content. Conventional haptic systems generally provide a stimulus only in the form of a mechanical vibration, which may limit the system's interaction with the user.
SUMMARY
(04) According to an aspect of the present invention, there is provided an interface device for providing an overall haptic effect. The interface device comprises a surface configured to output the overall haptic effect. The interface device further comprises an electrostatic device that is coupled to the surface and configured to create a first haptic effect at the surface. The interface device further comprises an actuator that is configured to create a second haptic effect at the surface. The overall haptic effect comprises the first haptic effect and the second haptic effect.
(05) According to an aspect of the present invention, there is provided a method of providing an overall haptic effect. The method comprises generating with an electrostatic device a first haptic effect at a surface of an interface device and generating with an actuator a second haptic effect at the surface. The overall haptic effect comprises the first haptic effect and the second haptic effect.
(06) According to an aspect of the present invention, there is provided a system configured to provide a haptic effect through an electrostatic display. The electrostatic display is configured to output a haptic effect that may simulate a level of friction or texture on the surface of the display, even when the surface is smooth. The electrostatic display can output different haptic effects to simulate different coefficients of friction or texture in order to convey a topography depicted on the display. In an embodiment, the haptic effect may simulate a coefficient of friction or texture that conveys the terrain of a material. In an embodiment, the haptic effect may simulate a coefficient of friction or texture that conveys a height or depth of a location of the display. In an embodiment, the haptic effect may convey a property of data at a location on the display. A sensor may measure the simulated coefficient of friction and may adjust the haptic effect based on the measured coefficient.
(07) According to an aspect of the present invention, there is provided a system that changes the haptic effect based on the impedance measured at the surface of the electrostatic display. In an embodiment, the system measures differences in the impedance of users' fingers due to differences in users' skin thicknesses, finger sizes, or moisture levels on their fingers. The system adjusts the haptic effect for these differences in order to create stimuli that the system may expect to be experienced consistently by two or more users.
(08) According to an aspect of the present invention, there is provided an electrostatic display with actuators, with a non-moving effect generator, such as a thermal device, or with both.
(09) An aspect of the invention relates to generating a haptic effect on surfaces other than the display screen of the electrostatic display. In an embodiment, a haptic effect may be generated on the side or back of the display device, where controls such as a slide bar may be located. Other surfaces include those of, for example, a steering wheel, a dashboard, a joystick, a touchpad (or trackpad), and a remote control. In an embodiment, a haptic effect may be generated at locations on a touchpad that correspond to locations on a display, in order to convey features shown on the display.
(10) These and other aspects, features, and characteristics of the present invention, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of "a", "an", and "the" include plural referents unless the context clearly dictates otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
(11) FIG. 1A illustrates a system configured to generate a haptic effect with actuators and an electrostatic device, according to one or more embodiments of the invention.
(12) FIG. 1B illustrates another view of the system of FIG. 1A.
(13) FIG. 2 illustrates a system configured to generate a haptic effect to simulate a texture, according to one or more embodiments of the invention.
(14) FIG. 3 illustrates a system configured to generate a haptic effect corresponding to one or more data elements, in accordance with one or more embodiments of the invention.
(15) FIG. 4 illustrates a system configured to generate a haptic effect corresponding to one or more data elements, in accordance with one or more embodiments of the invention.
(16) FIG. 5 illustrates a system configured to generate a haptic effect corresponding to one or more data elements, in accordance with one or more embodiments of the invention.
(17) FIG. 6 illustrates a system configured to generate a haptic effect corresponding to map locations, according to one or more embodiments of the invention.
(18) FIG. 7 illustrates a depiction of how height may be simulated with a haptic effect, in accordance with one or more embodiments of the invention.
(19) FIGS. 8A and 8B illustrate a system configured to generate a haptic effect based on a user gesture, in accordance with one or more embodiments of the invention.
(20) FIG. 9 illustrates a system configured to generate a haptic effect based on a video or audio trigger, in accordance with one or more embodiments of the invention.
(21) FIG. 10 illustrates a system configured to generate a haptic effect on a nondisplay surface, where the haptic effect conveys a texture, in accordance with one or more embodiments of the invention.
(22) FIG. 11 illustrates a system configured to generate a haptic effect on a nondisplay surface, in accordance with one or more embodiments of the invention.
(23) FIG. 12 illustrates a system configured to generate a haptic effect on a nondisplay surface to simulate texture, in accordance with one or more embodiments of the invention.
(24) FIG. 13 illustrates a system configured to generate a haptic effect on a nondisplay surface, in accordance with one or more embodiments of the invention.
(25) FIG. 14 illustrates a block diagram of a system and its circuits configured to generate a haptic effect, in accordance with one or more embodiments of the invention.
DETAILED DESCRIPTION
(26) FIG. 1 illustrates one embodiment of a system 100 that provides a haptic effect which simulates a friction coefficient at a user interface. A haptic effect refers to a stimulus or force, including but not limited to a vibration, an attractive or repulsive force, a voltage or current, some other mechanical or electromagnetic force, heating or cooling, or some other stimulus. The haptic effect may comprise one or a combination of the forces or stimuli described herein. A plurality of haptic effects may be combined to form an overall haptic effect. The haptic effect may be outputted at a user interface to provide feedback to a user or object interacting with the interface. It may provide the feedback through mechanical movement, such as through vibrations of a solid object, vibrations of fluids, or actuating objects like pins or rods to touch the user. The pins or rods may deform the surface by changing the surface relief or contour. The overall haptic effect may also provide feedback through electrostatic interactions, either to generate a force on an object, like a finger at the user interface, or to send an electric signal to an object that can perceive the signal, like a nerve of the finger or a sensor in a stylus.
(27) The system may be, for example, part of a music player, a video player, a graphic display, an e-book reader, some combination of the devices, or may be some other general device with a user interface. The system 100 in this embodiment interfaces with the user through a display screen 110 that is configured to sense an object that is touching the screen 110. The object may be a user's finger 140, a palm of the user's hand, or any other part of the user's body that can sense a haptic effect. The object may also be a stylus or some other device whose presence can be sensed on the screen 110. The screen may sense the presence of the object through capacitive, resistive, or inductive coupling, but is not limited to those techniques.
(28) The system 100 may provide a haptic effect at the surface of the display screen 110 through one or more actuators 112, 114, 116, through an electrostatic device 120, or through combinations thereof. The actuators 112, 114, and 116 are configured to generate mechanical motion that may translate into vibrations at the surface of the screen 110. The actuators may be implemented as piezoelectric actuators, voice coils, magnetic actuators such as solenoids, pneumatic actuators, ultrasonic energy actuators, an eccentric mass actuator, an electroactive polymer actuator, a shape memory alloy, or some other type of actuator. The actuators may rely on motors that convert torque into vibrations, on fluid pressure, on changes in the shape of a material, or on other forces that generate motion. For example, the actuators may use the electrostatic attraction between two objects, such as a conductive layer 120 and insulating layer 122 discussed below, or between layers in an electrostatic surface actuator, to generate motion. Further, the actuators are not limited to generating vibrations, but may instead generate lateral motion, up and down motion, rotational motion, or some combinations thereof, or some other motion. For the embodiment in FIG. 1A, actuator 116 may be a piezoelectric or a voice coil actuator that generates vibrations to generate a haptic effect, actuator 112 may be a solenoid that generates up and down motion to generate a haptic effect, and actuator 114 may be a pneumatic actuator that generates lateral motion to generate a haptic effect. The actuators may all be activated when a haptic effect is desired, or only one may be activated to conserve power or to generate different haptic effects. Further, a particular actuator may be positioned and be configured to generate a haptic effect for the entire system 100, for only the display screen 110 that interfaces with the user, for only a portion of the display screen, or on some other part of the system 100. For example, the actuator 116 can be configured to generate vibrations for only its corner of the display screen 110 by keeping the level of vibrations low enough so that vibration amplitude outside of its corner is less than a threshold amplitude.
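As a worked illustration of the corner localization at the end of paragraph (28), the sketch below assumes vibration amplitude decays exponentially with distance from the actuator; the decay constant and function names are invented, not taken from the patent:

```python
import math

def amplitude_at(distance_m, source_amplitude, decay_per_m=200.0):
    """Vibration amplitude a given distance from a corner actuator,
    assuming exponential spatial decay."""
    return source_amplitude * math.exp(-decay_per_m * distance_m)

def max_corner_drive(threshold, corner_radius_m, decay_per_m=200.0):
    """Largest drive amplitude that keeps the vibration below the
    threshold everywhere outside the actuator's corner."""
    return threshold * math.exp(decay_per_m * corner_radius_m)

# With this drive level, amplitude at the corner boundary equals the threshold.
drive = max_corner_drive(threshold=0.1, corner_radius_m=0.02)
print(amplitude_at(0.02, drive))  # -> 0.1
```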
(29) The system 100 also provides a haptic effect through an electrostatic device. The electrostatic device may be an electrovibrotactile display or any other device that applies voltages and currents instead of mechanical motion to generate a haptic effect. The electrostatic device in this embodiment has at least a conductive layer 120 and an insulating layer 122. The conducting layer 120 may be any semiconductor or other conductive material, such as copper, aluminum, gold, or silver. The insulating layer 122 may be glass, plastic, polymer, or any other insulating material. The system 100 may operate the electrostatic device by applying an electric signal to the conducting layer 120. The electric signal may be an AC signal that, in this embodiment, capacitively couples the conducting layer with an object near or touching the display screen 110. The AC signal may be generated by a high-voltage amplifier. The system 100 may also rely on principles other than capacitive coupling to generate a haptic effect. The capacitive coupling may simulate a friction coefficient or texture on the surface of the display screen 110. A coefficient of friction is a simulated one in that while the display screen 110 can be smooth, the capacitive coupling may produce an attractive force between an object near the screen 110 and the conducting layer 120. The attractive force increases the friction on the surface even when the structure of the material at the surface has not changed. Varying the levels of attraction between the object and the conducting layer can vary the friction on an object moving across the display screen 110. Varying the friction force simulates a change in the coefficient of friction. The simulated coefficient of friction may be changed by the actuators as well. For example, the actuators may increase the friction force by generating vibrations, or by changing the surface relief of the display screen 110 to change the actual coefficient of friction.
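A minimal sketch of driving the conducting layer as paragraph (29) describes, assuming a sinusoidal AC carrier whose amplitude is scaled by the desired simulated friction level; the 100 Hz carrier and 400 V ceiling are placeholders, not values from the patent:

```python
import math

def electrostatic_drive(t_s, friction_level, carrier_hz=100.0, v_max=400.0):
    """Instantaneous voltage on the conducting layer 120: an AC carrier
    whose amplitude scales with the desired simulated friction level
    (stronger attraction -> more drag on the moving finger)."""
    amplitude = v_max * max(0.0, min(friction_level, 1.0))
    return amplitude * math.sin(2.0 * math.pi * carrier_hz * t_s)
```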
(30) The capacitive coupling may also generate a haptic effect by stimulating parts of the object near or touching the display screen 110, such as corpuscles in the skin of a user's finger 140 or components in a stylus that can respond to the coupling. The corpuscles in the skin, for example, may be stimulated and sense the capacitive coupling as a vibration or some more specific sensation. For example, an AC voltage signal can be applied to the conducting layer 120 so that it couples with conductive parts of the user's finger 140. As the user touches the display screen 110 and moves his or her finger 140 on the screen, the user may sense a texture of prickliness, graininess, bumpiness, roughness, stickiness, or some other texture. The user's skin corpuscles may also be stimulated to produce a more general sensation as the finger 140 moves across the screen 110. Therefore, the capacitive coupling may be used to simulate a coefficient of friction or texture by generating a signal that couples with an object near or touching the screen.
(31) To provide the same attractive force, or the same level of stimulus, across many different objects or persons, the system 100 may also include a sensor that can measure the impedance at the surface of the display screen 110. The sensor may measure the impedance by applying a pulse across the surface and measuring the surface voltage, or by measuring the strength of the capacitive coupling. The sensor may use other known techniques for measuring impedance, and may compensate for varying ambient conditions such as the moisture in the air or the temperature. The haptic effect can be adjusted based on the impedance of a person. For example, a more forceful haptic effect may be applied to an object with a higher impedance and a less forceful haptic effect to one with a lower impedance.
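A minimal sketch of such impedance compensation (the probe values, the reference impedance, and the linear scaling are assumptions for illustration only):

```python
# Estimate surface impedance from a probe pulse and scale the effect so that
# users with different impedances feel a comparable stimulus.
def estimate_impedance(v_probe, i_probe):
    """Ohm's-law estimate from a probe pulse: Z = V / I."""
    return v_probe / i_probe

def compensated_amplitude(base_amplitude, z_measured, z_reference=50e3):
    """Higher measured impedance maps to a more forceful effect."""
    return base_amplitude * (z_measured / z_reference)

amp = compensated_amplitude(1.0, estimate_impedance(5.0, 80e-6))
```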
(32) In an embodiment, a user interface does not have an insulating layer, so that an object can directly touch the conducting layer. A haptic effect can be generated by passing an electrical current from the conducting layer to the object. This embodiment may alternatively use an insulating layer but include one or more electrodes in the insulating layer that can pass current to objects touching the electrodes as they move across the insulating layer.
(33) In an embodiment, a haptic effect system may include a thermal device 118 whose output may be combined with that of the actuators and the electrostatic device. The thermal device 118 may generate a haptic effect by directly or indirectly heating or cooling an object interacting with the system.
(34) The haptic effects can be generated by the actuators and the electrostatic device one at a time, or can be combined. For example, a voltage may be applied to the conducting layer 120 at a level high enough both to attract the skin of a finger 140 touching the screen 110 and to stimulate corpuscles within the skin. Simultaneously with this electro-vibrotactile haptic effect, electrostatic forces may be produced on the conducting layer 120 and the insulating layer 122 to create mechanical motion in those layers. The haptic effects may be combined with motions generated by one or a combination of the actuators 112, 114, and 116. The devices may work together to simulate the coefficient of friction or texture on the surface of the screen. The actuators may generate vibrations, for example, to further simulate changes in the surface friction or texture.
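A minimal sketch of driving the devices one at a time or together (the channel class and names are hypothetical stand-ins for the electrostatic device and the actuators 112, 114, and 116):

```python
# Compose an overall haptic effect from several devices; channels may be
# driven together or individually, e.g. to conserve power.
class HapticChannel:
    def __init__(self, name):
        self.name = name
    def play(self, intensity):
        print(f"{self.name}: intensity {intensity:.2f}")

def play_overall_effect(channels, intensities):
    for channel, intensity in zip(channels, intensities):
        if intensity > 0.0:          # skip inactive channels
            channel.play(intensity)

channels = [HapticChannel("electrostatic"), HapticChannel("solenoid"),
            HapticChannel("pneumatic")]
play_overall_effect(channels, [0.8, 0.3, 0.0])
```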
(35) The devices can generate different simulated textures or coefficients of friction as an object, such as a user's finger 140, moves across the surface of the screen 110. FIG. 1A shows a first haptic region 130, a second haptic region 132, and a third haptic region 134, but other embodiments may have one, two, or more than three haptic regions. As the user's finger moves over the first region 130, the actuators and the electrostatic device may simulate a first texture or friction coefficient. As the user's finger moves on to the second and third regions 132, 134, the actuators and the electrostatic device may simulate second and third textures or friction coefficients, respectively, which can be different from the first texture or friction coefficient.
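A minimal sketch of the region lookup this implies (the rectangles and friction values are invented for illustration):

```python
# Map a touch position to a haptic region and its texture parameters,
# loosely modeled on regions 130, 132, and 134 of FIG. 1A.
REGIONS = [
    {"name": "region_130", "rect": (0, 0, 100, 100), "mu": 0.2},
    {"name": "region_132", "rect": (100, 0, 200, 100), "mu": 0.5},
    {"name": "region_134", "rect": (200, 0, 300, 100), "mu": 0.8},
]

def region_at(x, y):
    for region in REGIONS:
        x0, y0, x1, y1 = region["rect"]
        if x0 <= x < x1 and y0 <= y < y1:
            return region
    return None

print(region_at(150, 50)["mu"])  # 0.5: the second texture
```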
(36) The system may also have a sensor that measures the simulated coefficient of friction. This may be the same sensor as the one described above that measures impedance, or it may be a different sensor. The sensor may measure the simulated coefficient based on a measured pressure that the surface of the screen 110 receives, such as from an object touching the screen, and on the movement of the object at the surface. Movement of the object may be measured based on how the pressure at the surface changes over time and across locations on the surface. For example, the sensor may calculate a value representing the simulated coefficient of friction based on an acceleration of a user's finger on the screen 110 and on the pressure that the surface receives from the user's finger.
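A minimal sketch of that calculation, assuming the normal force is inferred from pressure over an assumed contact area and the tangential force from an assumed effective finger mass (both constants are invented):

```python
# Estimate a simulated coefficient of friction from finger acceleration
# and measured surface pressure: mu ~ tangential force / normal force.
def simulated_mu(accel_ms2, pressure_pa, contact_area_m2=1e-4,
                 finger_mass_kg=0.01):
    normal_force = pressure_pa * contact_area_m2
    tangential_force = finger_mass_kg * abs(accel_ms2)
    return tangential_force / normal_force if normal_force > 0 else 0.0

print(simulated_mu(accel_ms2=2.0, pressure_pa=2000.0))  # -> 0.1
```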
(37) The haptic effects and the sensors may be controlled by a controller. The controller may analyze the impedance, the simulated coefficient of friction, the surface pressure, a rate of movement measured at the surface, and other factors to determine whether there has been a triggering condition for a haptic effect or how forceful a haptic effect should be. Details of the controller are discussed below.
(38) A frontal view of the haptic effect system of FIG. 1A is illustrated in FIG. 1B, which shows the haptic region 130 with a simulated friction coefficient or texture. As the user moves a finger 140 over the region, the actuators and the electrostatic device may generate a haptic effect that simulates a friction force corresponding to the simulated friction coefficient. The area of a haptic region can be much smaller or much larger than that shown in FIGS. 1A and 1B, and its associated coefficient of friction or texture may change.
(39) FIG. 2 shows an embodiment of a haptic effect system 200. The system 200 is configured to provide a user interface through a display screen 210. The display screen 210 has a haptic region 230 that may simulate the coarse texture of bricks as a user's finger 140 moves across the region. The display screen 210 also has another haptic region 232 that simulates the bumpy texture of rocks as the user's finger 140 moves across the region. The system may generate haptic effects that simulate other textures, such as stickiness, roughness, or abrasiveness. The haptic effects may incorporate the heating or cooling from a thermal device to simulate both the texture and temperature.
(40) FIG. 3 shows another form of interaction that the haptic effect system 200 can provide. The system 200 can generate haptic effects to represent data elements such as file folders. The screen 210 may have three haptic regions 330, 332, and 334 that simulate different textures or coefficients of friction for different types of folders. For example, the system 200 may provide a less forceful or no haptic effect when a user's finger 140 moves across region 330, which corresponds to folders representing visual data. The system 200 may provide a more forceful haptic effect when a user's finger 140 moves across region 332, which corresponds to a folder representing audio data, and finally may provide an even more forceful haptic effect when a user's finger moves across region 334, which corresponds to a folder representing protected data. The haptic regions can correspond to many different types of data elements, such as buttons, checkboxes, dropdown boxes, other form elements, icons, cursors, and windows. The system 200 may change the texture or friction coefficient of a region based on time. For example, if a haptic region corresponds to a button, the system 200 may change the texture or friction coefficient of the region between the first time and the second time that a user's finger 140 moves across the region. This change can reflect, for example, that the button has been depressed.
(41) Another example of a data element is shown in FIG. 4. Here, a haptic region 430 corresponds to a slide bar (or scroll bar). As the user's finger 140 moves across the region, the system 200 can generate a haptic effect that simulates a texture or friction coefficient so long as the finger 140 is still in the region. The system 200 may localize the haptic effect to the region on the slide bar where the finger is located. For example, the system 200 may have a plurality of actuators at different locations under the screen 210, and it may activate only the actuator nearest to the finger 140.
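A minimal sketch of selecting the actuator nearest the finger (the actuator positions are invented):

```python
# Localize the haptic effect by activating only the actuator closest to
# the finger, assuming known actuator positions under the screen.
ACTUATOR_POSITIONS = {"a1": (40, 240), "a2": (160, 240), "a3": (280, 240)}

def nearest_actuator(finger_xy):
    fx, fy = finger_xy
    return min(ACTUATOR_POSITIONS,
               key=lambda a: (ACTUATOR_POSITIONS[a][0] - fx) ** 2 +
                             (ACTUATOR_POSITIONS[a][1] - fy) ** 2)

print(nearest_actuator((150, 235)))  # -> "a2": only this one is activated
```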
(42) The system 200 may also be configured to adjust the haptic effect to adjust the simulated texture or coefficient of friction. That is, a user may configure the system 200 to have a high simulated coefficient of friction in order to achieve more control while scrolling down the screen with the slide bar. The system may also automatically adjust the haptic effect based on, for example, the motion of the finger 140. For example, if the user's finger is attempting to scroll quickly, the system 200 may accommodate the user by sensing a high rate of motion on the screen and generating a less forceful haptic effect to simulate a lower coefficient of friction.
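A minimal sketch of such speed-based adjustment (the speed threshold and scaling are assumed tuning constants, not values from the specification):

```python
# Lower the simulated friction as the finger moves faster, so quick
# scrolling meets less resistance.
def friction_for_speed(base_mu, speed_px_s, fast_px_s=800.0, min_scale=0.25):
    scale = max(min_scale, 1.0 - speed_px_s / fast_px_s * (1.0 - min_scale))
    return base_mu * scale

print(friction_for_speed(0.6, speed_px_s=100.0))  # near base friction
print(friction_for_speed(0.6, speed_px_s=900.0))  # clamped low for fast scroll
```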
(43) FIG. 5 shows an example that corresponds a haptic region to properties of text. The particular embodiment shows a haptic region 530 that corresponds to highlighting of text. The haptic region may instead correspond to underlined, italicized, or bolded text, to text of a certain font, or to any other property that can be attributed to the text. Further, the haptic region may be assigned or modified as the property is applied to the text. For example, the system may generate a haptic effect while the user is in the process of highlighting text.
(44) The haptic region can correspond not just to visible properties of text, but also to invisible properties, such as metadata. For example, the system 200 may assign a haptic region to text that is a hyperlink. In another example, the system 200 may access an XML file to identify certain text that belongs to a certain section and assign a haptic region to that section. As discussed earlier, the friction coefficient of a region can change based on time or on movement on the screen. In this instance, when a user's finger 140 scrolls text to a different location on the screen 210, the system 200 is configured to effectively move the haptic region to the text's new location by setting a new simulated friction coefficient at the new location and, for example, setting the simulated friction coefficient at the old location to zero or to some default value.
(45) FIG. 6 shows yet another example that corresponds a haptic region to a location. In the embodiment, haptic regions 632 may correspond to locations that have no particular relevance to a user and have a zero or some default simulated coefficient of friction. Haptic regions 630 and 634 may have different simulated textures or coefficients of friction that distinguish two different locations. The friction coefficients may act as an identifier for the location. The system 200 may also generate a haptic effect that produces a texture or friction force that simulates the terrain at the location. For example, the haptic effect at haptic region 630 may
simulate the texture of a sand lot, while the haptic effect at haptic region 634 may simulate the texture of a gravel track. The haptic effect is not limited to simulating the texture of terrain. For example, the screen 210 may display garments, textiles, foods, and other materials and products. The system 200 can assign a haptic region that corresponds to the material or product displayed and generate a haptic effect that produces a texture or friction force that simulates that material or product.
(46) The haptic effect may be generated to simulate textures in drawings in general. For example, if the user's finger 140 is drawing in a drawing program rendered on the display screen 210, the system may generate a haptic effect that produces a texture simulating, for example, the texture of crosshatches when the user's finger 140 is drawing crosshatches on the screen, or a grainier texture when the user's finger 140 is drawing spray paint on the screen.
(47) FIG. 7 shows a simulation of a surface feature, specifically a raised edge. In the embodiment, the simulated raised edge 731 may represent three-dimensional aspects of areas depicted on a screen, such as buttons, textboxes, icons, or other user input objects. For example, users who perceive the simulated raised edge of a button may know that their finger has moved over the top of the button without having to look at the screen. The simulated surface features may more generally convey height or depth at the user interface.
(48) The raised edge 731 may be defined by a time domain profile that applies a more intense haptic effect when a user's finger 140 is over haptic region 732, a less intense haptic effect when the user's finger 140 is over haptic region 730, and no haptic effect when the user's finger 140 is over haptic region 734. For example, a haptic effect system may simulate a high friction coefficient when the user's finger 140 moves over region 732 in order to represent the rising slope of the raised edge 731. The haptic system may simulate a second, lower friction coefficient when the user's finger 140 moves over region 730 in order to represent the flat top of an object. The haptic system may simulate an even lower coefficient of friction (e.g., zero) when the finger 140 moves over region 734 to represent a down slope. By detecting whether a user's finger is moving toward region 730, which corresponds to a rising edge, or away from it, which corresponds to a descending edge, the system can vary the haptic effect to simulate a higher coefficient for the rise and a lower coefficient for the descent. Although the embodiment describes a time domain profile that varies the haptic effect over time, time is not an independent variable. Rather, the timing of the haptic effect depends on the position of the finger 140. Therefore, the haptic effects in the time domain profile are not always applied at fixed times, but may depend on how quickly a user's finger approaches the haptic regions 730, 732, and 734.
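A minimal sketch of such a position- and direction-dependent profile (the region bounds and coefficients are illustrative assumptions):

```python
# Friction depends on which region the finger is in and whether it is
# moving toward or away from the flat top (region 730).
def edge_friction(x, vx, rise=(0, 20), top=(20, 80), fall=(80, 100)):
    if rise[0] <= x < rise[1]:           # region 732: the slope
        return 0.8 if vx > 0 else 0.3    # climbing feels harder than sliding back
    if top[0] <= x < top[1]:             # region 730: flat top
        return 0.4
    if fall[0] <= x < fall[1]:           # region 734: down slope
        return 0.0
    return 0.1                           # default off-edge friction

print(edge_friction(x=10, vx=+50))  # approaching the top: high coefficient
print(edge_friction(x=90, vx=+50))  # descending: zero coefficient
```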
(49) To more generally convey height or depth at the user interface, a topography map may be used to assign haptic regions and friction coefficients to locations on the user interface. The haptic regions are not limited to two simulated friction
coefficients, but may have more or fewer coefficients. More coefficients may be used to adjust the simulated friction based on the direction in which a user's finger is moving. For example, if a user's finger is approaching region 730 from a direction that corresponds to north, the system may apply one friction coefficient, whereas it may apply a different friction coefficient if a user's finger is approaching from a direction that corresponds to west. Having multiple coefficients can effectively subdivide the region 732 into separate sub-regions that correspond to different slopes. The system may also be configured to calculate the slope based on, for example, a topographical map by dividing the height difference between two levels in the map by the horizontal distance between the two levels. For images displayed on a screen, the system may also be configured to calculate height differences in the image based on, for example, lighting differences, location of shadows, and other visual factors that can be used to measure height differences in an image. The system may incorporate height and terrain information to generate a haptic effect that simulates both the topography and texture of an object, of a terrain, or of some other type of image shown on the user interface.
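A minimal sketch of the slope calculation described above, on an invented height grid (the map values and cell spacing are assumptions):

```python
# Slope between two map levels: height difference divided by the
# horizontal distance between them.
HEIGHTMAP = [
    [0.0, 0.0, 1.0, 2.0],
    [0.0, 1.0, 2.0, 3.0],
    [0.0, 1.0, 2.0, 2.0],
]
CELL_METERS = 10.0

def slope(row, col_a, col_b):
    dh = HEIGHTMAP[row][col_b] - HEIGHTMAP[row][col_a]
    dx = abs(col_b - col_a) * CELL_METERS
    return dh / dx

print(slope(1, 0, 3))  # 3.0 m rise over 30 m -> slope 0.1
```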
(50) Actuators may also be used to change the perception of height or depth. For example, they may generate vibrations to signal a change from a region representing a slope to a region representing a plateau. Alternatively, the actuators may actually create height or depth by using, for example, rods or pins to change the surface relief.
(51) Haptic effects may be generated based not only on the measured surface pressure and rate of movement of an object at the surface, but more specifically on recognized gestures. FIGS. 8A and 8B show a haptic effect being generated in response to a gesture for flipping a page. The haptic effect system 800 has a display screen 810 that depicts a stack of pages 820 on the screen. Rather than assign a haptic region to a location on the screen 810, the system 800 may generate a haptic effect when it recognizes that the user's finger 140 is flipping a page in the stack 820. For example, the system may detect whether the user's finger starts at the lower right corner of the screen 810 and whether it moves to the left. Once the system 800 depicts the top page being flipped, the system 800 may detect whether the user's finger 140 follows the corner of the page. If it does, the system 800 may recognize this as a gesture for flipping a page and generate a haptic effect. The system can be configured to recognize other gestures, such as a scrolling gesture, a zooming gesture, or any other finger movement that can be recognized. For the zooming gesture, the system 800 may be configured to recognize the movement of more than one finger on the screen 810. The system 800 may also recognize other gestures based on two, three, four, or more fingers, such as a pinching gesture or a rotation gesture. The system may also accommodate finger movements from more than one user.
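A minimal sketch of the corner-start, leftward-motion test described above (the screen size and thresholds are assumed values):

```python
# Recognize a page-flip: the touch must start near the lower-right corner
# of the screen and move left by a meaningful fraction of the width.
SCREEN_W, SCREEN_H = 320, 480

def is_page_flip(touch_path):
    (x0, y0), (x1, _) = touch_path[0], touch_path[-1]
    starts_at_corner = x0 > SCREEN_W * 0.8 and y0 > SCREEN_H * 0.8
    moves_left = x1 < x0 - SCREEN_W * 0.3
    return starts_at_corner and moves_left

path = [(300, 460), (250, 440), (180, 430), (120, 425)]
if is_page_flip(path):
    print("page-flip gesture: generate haptic effect")
```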
(52) A haptic effect may also be initiated based on video, graphic, or audio content shown at a user interface. FIG. 9 shows a haptic effect system 900 that generates a haptic effect to accompany a visual and audio indication of an incoming phone call. The system 900 may be configured to generate a haptic effect that, for example, simulates a friction coefficient or texture on the screen 910 when it displays the visual indication of a phone call.
(53) Further, haptic effects can be generated on user interface surfaces other than display screens. Because controls for a device can be located on surfaces other than the display screen, the screen does not need to be able to receive user input. Instead, the screen may merely output visual and audio content and haptic effects while a slide bar 930 on a nondisplay surface of the system in FIG. 9, for example, may receive user input. The slide bar may be a control that answers an incoming call. The system may generate a haptic effect without a triggering event. For example, an electrostatic device may apply a constant voltage beneath the slide bar regardless of the presence of a user or triggering event. Alternatively, it may generate a haptic effect only upon a triggering event, like an incoming call.
(54) FIG. 10 shows another nondisplay surface 1020. In one example, the surface may belong to a trackpad or touchpad that provides input to a computer display screen 1010. The computer system 1000 may correspond regions on the screen 1010 to haptic regions on the touchpad 1020. When a user's finger 140 moves across region 1030, for example, the touchpad may generate an output signal that is received by the computing system 1000 and reflected on the display screen 1010 as a cursor or pointer moving across region 1040. The touchpad 1020 may also receive an input signal from the computing system 1000 that causes a haptic effect to be generated at haptic region 1030. As the user's finger moves onto regions 1032 and 1034, the touchpad can detect the movement and output a signal to the system 1000 that causes it to show a cursor or pointer moving across the corresponding regions 1042 and 1044 on the display screen 1010. At the same time, the system may send a signal to the touchpad 1020 that causes it to generate a first haptic effect as the user's finger 140 moves across region 1032 and a second haptic effect as the user's finger 140 moves across region 1034.
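A minimal sketch of that touchpad-to-display coupling (the mapping table and effect labels are invented):

```python
# Couple touchpad regions to display regions and haptic effects, loosely
# modeled on regions 1030/1040, 1032/1042, and 1034/1044 above.
PAD_TO_DISPLAY = {
    "pad_1030": ("display_1040", "effect_none"),
    "pad_1032": ("display_1042", "effect_first"),
    "pad_1034": ("display_1044", "effect_second"),
}

def on_pad_touch(pad_region):
    display_region, effect = PAD_TO_DISPLAY[pad_region]
    print(f"move cursor to {display_region}; play {effect} on touchpad")

on_pad_touch("pad_1032")
```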
(55) FIG. 11 shows another example, where a haptic region may take the form of a border region. The embodiment shows a file region 1140 containing one file 1144 and a file region 1142 containing no documents. The file regions may be windows or file folders depicted on the display screen 1010. The system 1000 may assign a haptic region 1130 that corresponds to the border of file region 1140 and may assign a haptic region 1132 that corresponds to the border of file region 1142. The file regions may be locked such that a file cannot be transferred out of one region or cannot be written into another region. The system may provide a haptic effect when it detects that a user's finger 140, for example, is moving the file out of region 1140 or into region 1142. That is, the system may generate a haptic effect when the user's finger touches or comes near the edges of regions 1130 or 1132, to indicate to the user that a prohibited action is being attempted. The system 1000 may initiate the haptic effect only if the user is trying to drag a file. For example, the system 1000 may monitor motion at the touchpad 1020 to detect whether a user's finger began its motion at region 1134. If the user's finger began its movement at some other part of the touchpad 1020, the system 1000 may decide against generating a haptic effect.
(56) FIG. 12 shows yet another example where the touchpad 1020 may simulate the friction or texture of locations corresponding to terrain on a screen 1010. Unlike the embodiment shown in FIG. 6, the haptic effect in this embodiment simulates texture on a nondisplay surface like the touchpad 1020. Here, the system 1000 may show a map with a clay tennis court at location 1240 and a hard court at location 1242. The system may assign a haptic region 1230 to correspond with the screen depiction of the clay court 1240, and similarly a haptic region 1232 for the depicted
hard court 1242. The system may monitor surface movement and generate a haptic effect to produce a friction force or texture resembling a clay court when a user's finger moves across region 1230, and a friction force or texture resembling a hard court when the finger moves across region 1232. A haptic effect may be generated even for features that are not displayed on the display screen 1010. For example, the system 1000 may assign a haptic region 1234 that may correspond to an underground tunnel that is not displayed on the display screen 1010 or is not visible to the user.
(57) The haptic regions are drawn in FIG. 12 for illustrative purposes and may not actually be visible. The system 1000 may display a cursor, pointer, icon, or avatar on the screen to allow a user to see how his or her movement on the touchpad corresponds to movement on the screen.
(58) FIG. 13 shows yet another nondisplay surface, this time on a game remote control 1320, that may output a haptic effect. The game remote control 1320 may use actuators, electrostatic devices, thermal devices, or some combination thereof to generate a haptic effect at the surface of the control. The haptic effect may be triggered by an event depicted on a display screen 1310, by an acceleration or velocity of the remote control, by the position or orientation of the control 1320 relative to the screen 1310, by a sound, by the light level, or by some other trigger.
(59) Other triggers may be used for any of the embodiments discussed. Other examples of triggers include the temperature, the humidity, the lighting, other ambient conditions, and the surface contact area (e.g., with another object). These factors may serve not only as triggers, but also as determinants of how forceful a haptic effect is. For example, if a sensor detects dark or low lighting conditions, a more forceful haptic effect may be generated to compensate for the poorer visibility in those conditions.
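A minimal sketch of scaling intensity with ambient light (the lux threshold and boost factor are assumed tuning values):

```python
# Strengthen the haptic channel in dark conditions to compensate for
# poorer visibility, as described above.
def ambient_scaled(base, ambient_lux, dark_lux=50.0, boost=1.5):
    return base * boost if ambient_lux < dark_lux else base

print(ambient_scaled(0.5, ambient_lux=20.0))  # darker room -> stronger effect
```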
(60) Further, other nondisplay surfaces may be used. For example, a haptic effect can be outputted on the surface of a switch, a knob, some other control instrument, a dashboard, some other board, or any other surface that can output a haptic effect.
In addition, embodiments of the present invention may be used with deformable surfaces, such as surfaces that are adapted for gross deformations.
(61) FIG. 14 illustrates an embodiment of a module 1400 for generating a haptic effect. The module 1400 may be included in any of the embodiments of the haptic effect systems described herein. The module 1400 may contain a haptic device 1430 that generates one or more haptic effects, and may adjust the effect based on an impedance or a simulated friction coefficient measured by a sensor 1440. The sensor data may be analyzed by a controller 1450 and stored in a memory 1420. The controller 1450 may be included as part of a processor 1410, or the controller may be a separate logic circuit.
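A minimal sketch of the sensor-controller-device loop of module 1400 (all classes are hypothetical stand-ins for the sensor 1440, controller 1450, memory 1420, and haptic device 1430; the readings and correction rule are invented):

```python
# The controller reads the sensor, stores the reading, and adjusts the
# haptic device toward a target simulated friction coefficient.
class Sensor:
    def read(self):
        return {"impedance_ohm": 48e3, "mu_measured": 0.42}

class HapticDevice:
    def set_intensity(self, value):
        print(f"haptic intensity -> {value:.2f}")

class Controller:
    def __init__(self, sensor, device):
        self.sensor, self.device, self.memory = sensor, device, []

    def step(self, mu_target):
        reading = self.sensor.read()
        self.memory.append(reading)                       # stored in memory
        error = mu_target - reading["mu_measured"]
        self.device.set_intensity(max(0.0, 0.5 + error))  # simple correction

Controller(Sensor(), HapticDevice()).step(mu_target=0.6)
```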
(62) Although the invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.
Claims
1. An interface device for providing an overall haptic effect, the interface device comprising:
a surface configured to output the overall haptic effect;
an electrostatic device coupled to the surface and configured to create a first haptic effect at the surface; and
an actuator configured to create a second haptic effect at the surface, the overall haptic effect comprising the first haptic effect and the second haptic effect.
2. The interface device of claim 1, wherein the overall haptic effect comprises a change in a simulated coefficient of friction, a vibration, a change in a surface relief at the surface, a texture, or any combination thereof.
3. The interface device of claim 1, wherein the electrostatic device is configured to simulate a coefficient of friction or texture by generating the first haptic effect.
4. The interface device of claim 3, wherein the texture is roughness, bumpiness, stickiness, fineness, coarseness, smoothness, or any combination thereof.
5. The interface device of claim 3, wherein the actuator is configured to change the overall haptic effect by generating the second haptic effect as a vibration or a change in a surface relief of the surface.
6. The interface device of claim 5, further comprising a sensor configured to receive an input from the surface that is correlated to the simulated coefficient of friction and the surface relief of the surface.
7. The interface device of claim 6, wherein the input correlates to a surface pressure or changes in the surface pressure over time and over locations on the surface.
8. The interface device of claim 1, wherein the electrostatic device is configured to simulate height or depth at the surface by generating the first haptic effect to simulate a coefficient of friction.
9. The interface device of claim 8, wherein the actuator is configured to change the simulated height or depth at the surface by generating the second haptic effect to change a surface relief of the surface.
10. The interface device of claim 1, wherein the electrostatic device is configured to simulate height or depth at the surface by generating the first haptic effect that simulates a coefficient of friction.
11. The interface device of claim 10, wherein the second haptic effect is a vibration that corresponds to an edge crossing.
12. The interface device of claim 1, wherein the electrostatic device is configured to change the first haptic effect based on a location of an input at the surface.
13. The interface device of claim 12, wherein the electrostatic device is configured to change the first haptic effect based on time.
14. The interface device of claim 12, wherein the electrostatic device is configured to change the first haptic effect based on a change of the location of the input at the surface.
15. The interface device of claim 14, wherein the electrostatic device is configured to recognize an input of a gesture based on the change of the location of the input at the surface.
16. The interface device of claim 1, wherein the electrostatic device is configured to change the first haptic effect based on pressure on the surface, size of a surface contact area, humidity, temperature, amount of ambient light, or any combination thereof.
17. The interface device of claim 1, wherein the electrostatic device is configured to create the first haptic effect based on a trigger selected from a velocity of the interface device, an acceleration of the interface device, visual content displayed at the surface, a sound at the surface, or any combination thereof.
18. The interface device of claim 1, further comprising a sensor configured to measure an impedance at the surface.
19. The interface device of claim 18, wherein the electrostatic device is configured to change the first haptic effect based on the impedance at the surface.
20. The interface device of claim 18, wherein the sensor is configured to measure the impedance by applying a pulse to the surface.
21. The interface device of claim 1, wherein at least part of the surface is a nondisplay surface, and wherein the electrostatic device is configured to create the first haptic effect at the nondisplay surface and the actuator is configured to create the second haptic effect at the nondisplay surface.
22. The interface device of claim 21, wherein the nondisplay surface is part of a switch, knob, trackpad, remote control, or dashboard.
23. The interface device of claim 21, further comprising a slide bar on the nondisplay surface.
24. The interface device of claim 21, further comprising a second surface configured as a display surface, wherein the electrostatic device is configured to create the first haptic effect at the nondisplay surface based on visual or audio content displayed at the second surface.
25. The interface device of claim 1, further comprising a thermal device configured to provide a third haptic effect at the surface.
26. A method of providing an overall haptic effect, the method comprising:
generating with an electrostatic device a first haptic effect at a surface of an interface device; and
generating with an actuator a second haptic effect at the surface, the overall haptic effect comprising the first haptic effect and the second haptic effect.
27. The method of claim 26, wherein the overall haptic effect comprises a change in a simulated coefficient of friction, a vibration, a change in a surface relief at the surface, a texture, or any combination thereof.
28. The method of claim 26, wherein the first haptic effect simulates a coefficient of friction or texture at the surface.
29. The method of claim 28, wherein the texture is roughness, bumpiness, stickiness, fineness, coarseness, smoothness, or any combination thereof.
30. The method of claim 28, wherein the second haptic effect is a vibration or a change of a surface relief of the surface.
31. The method of claim 28, further comprising changing the first haptic effect with the electrostatic device or the second haptic effect with the actuator to change the simulated coefficient of friction.
32. The method of claim 26, further comprising measuring an impedance at the surface.
33. The method of claim 32, wherein the measuring the impedance comprises applying a pulse to the surface.
34. The method of claim 32, wherein the generating the first haptic effect is based on the impedance at the surface.
35. The method of claim 26, wherein the generating the first haptic effect is based on pressure on the surface, size of a surface contact area, humidity, temperature, amount of ambient light, or any combination thereof.
36. The method of claim 31, wherein the changing the first haptic effect with the electrostatic device is based on a location of an input received at the surface or a change in the location of the input at the surface.
37. The method of claim 30, further comprising simulating a height or depth with the electrostatic device based on the location of an input received at the surface.
38. The method of claim 37, wherein the simulating the height or depth is based on a topographical profile.
39. The method of claim 37, further comprising changing the simulated height or depth with the actuator based on generating the second haptic effect to change the surface relief of the surface.
40. The method of claim 37, wherein the second haptic effect is a vibration that corresponds to an edge crossing.
41. The method of claim 40, further comprising changing with the actuator the simulated crossing of the edge based on generating the second haptic effect to generate a vibration.
42. The method of claim 31, wherein the changing the first haptic effect with the electrostatic device is based on visual content displayed at the surface.
43. The method of claim 31, wherein the changing the first haptic effect with the electrostatic device is based on pressure on the surface or changes in the pressure on the surface over time and over locations on the surface.
44. The method of claim 26, further comprising generating with a thermal device a third haptic effect at the surface.
45. The method of claim 26, wherein the surface is a nondisplay surface.
46. The method of claim 45, wherein the generating the first haptic effect with the electrostatic device is based on visual or audio content displayed at a second surface.
47. The method of claim 46, wherein the generating the first haptic effect with the electrostatic device is based on the visual content displayed at a location on the second surface corresponding to a location of an input at the surface.
48. The method of claim 45, wherein the nondisplay surface is part of a switch, knob, trackpad, remote control, or dashboard.
49. The method of claim 45, wherein the generating the first haptic effect with the electrostatic device comprises generating the haptic effect on a slide bar on the nondisplay surface.
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP19169687.1A EP3531251A1 (en) | 2011-04-22 | 2012-04-16 | Electro-vibrotactile display |
| JP2014506471A JP6234364B2 (en) | 2011-04-22 | 2012-04-16 | Electric vibration type tactile display |
| KR1020137031069A KR102027307B1 (en) | 2011-04-22 | 2012-04-16 | Electro-vibrotactile display |
| KR1020197028083A KR20190112192A (en) | 2011-04-22 | 2012-04-16 | Electro-vibrotactile display |
| CN201280025757.5A CN103562827B (en) | 2011-04-22 | 2012-04-16 | Electro-vibrotactile display |
| EP12773915.9A EP2702468B1 (en) | 2011-04-22 | 2012-04-16 | Electro-vibrotactile display |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/092,269 | 2011-04-22 | | |
| US13/092,269 US9448713B2 (en) | 2011-04-22 | 2011-04-22 | Electro-vibrotactile display |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012145264A1 (en) | 2012-10-26 |
Family
ID=47020937
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2012/033743 WO2012145264A1 (en) | Electro-vibrotactile display | 2011-04-22 | 2012-04-16 |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US9448713B2 (en) |
| EP (2) | EP2702468B1 (en) |
| JP (3) | JP6234364B2 (en) |
| KR (2) | KR20190112192A (en) |
| CN (2) | CN103562827B (en) |
| WO (1) | WO2012145264A1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103793051A (en) * | 2012-10-31 | 2014-05-14 | 英默森公司 | Method and apparatus for simulating surface features on user interface with haptic effect |
CN103838421A (en) * | 2012-11-20 | 2014-06-04 | 英默森公司 | Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction |
JP2014102829A (en) * | 2012-11-20 | 2014-06-05 | Immersion Corp | System and method for feedforward and feedback with haptic effects |
JP2014102830A (en) * | 2012-11-20 | 2014-06-05 | Immersion Corp | System and method for simulated physical interactions with haptic effects |
JP2014102649A (en) * | 2012-11-19 | 2014-06-05 | Aisin Aw Co Ltd | Operation support system, operation support method, and computer program |
KR20140105303A (en) * | 2013-02-22 | 2014-09-01 | 삼성전자주식회사 | Method for displaying user input applying texture of background image and apparatus for the same |
JP2014206970A (en) * | 2013-03-15 | 2014-10-30 | イマージョン コーポレーションImmersion Corporation | User interface device provided with surface haptic sensations |
JP2015015027A (en) * | 2013-07-02 | 2015-01-22 | イマージョン コーポレーションImmersion Corporation | Systems and methods for perceptual normalization of haptic effects |
JP2018185823A (en) * | 2012-11-20 | 2018-11-22 | イマージョン コーポレーションImmersion Corporation | System and method for providing mode or state awareness with programmable surface texture |
US11747907B2 (en) | 2020-01-07 | 2023-09-05 | Mitsubishi Electric Corporation | Tactile presentation panel, tactile presentation touch panel, and tactile presentation touch display |
Families Citing this family (192)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8405618B2 (en) * | 2006-03-24 | 2013-03-26 | Northwestern University | Haptic device with indirect haptic feedback |
US9696803B2 (en) | 2009-03-12 | 2017-07-04 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US9746923B2 (en) | 2009-03-12 | 2017-08-29 | Immersion Corporation | Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction |
US8487759B2 (en) | 2009-09-30 | 2013-07-16 | Apple Inc. | Self adapting haptic device |
US9501145B2 (en) * | 2010-05-21 | 2016-11-22 | Disney Enterprises, Inc. | Electrovibration for touch surfaces |
US10013058B2 (en) | 2010-09-21 | 2018-07-03 | Apple Inc. | Touch-based user interface with haptic feedback |
US10108288B2 (en) | 2011-05-10 | 2018-10-23 | Northwestern University | Touch interface device and method for applying controllable shear forces to a human appendage |
US9122325B2 (en) | 2011-05-10 | 2015-09-01 | Northwestern University | Touch interface device and method for applying controllable shear forces to a human appendage |
US8681130B2 (en) | 2011-05-20 | 2014-03-25 | Sony Corporation | Stylus based haptic peripheral for touch screen and tablet devices |
US8773403B2 (en) | 2011-05-20 | 2014-07-08 | Sony Corporation | Haptic device for position detection |
US8749533B2 (en) * | 2011-05-20 | 2014-06-10 | Sony Corporation | Haptic device for carving and molding objects |
US8956230B2 (en) | 2011-05-20 | 2015-02-17 | Sony Corporation | Haptic device for 3-D gaming |
US9710061B2 (en) * | 2011-06-17 | 2017-07-18 | Apple Inc. | Haptic feedback device |
EP2754008A4 (en) * | 2011-06-21 | 2015-04-22 | Univ Northwestern | Touch interface device and method for applying lateral forces on a human appendage |
WO2013036614A1 (en) | 2011-09-06 | 2013-03-14 | Immersion Corporation | Haptic output device and method of generating a haptic effect in a haptic output device |
US9411423B2 (en) * | 2012-02-08 | 2016-08-09 | Immersion Corporation | Method and apparatus for haptic flex gesturing |
JP2013228936A (en) * | 2012-04-26 | 2013-11-07 | Kyocera Corp | Electronic device and method for controlling electronic device |
US9075460B2 (en) * | 2012-08-10 | 2015-07-07 | Blackberry Limited | Method of momentum based zoom of content on an electronic device |
US9116546B2 (en) | 2012-08-29 | 2015-08-25 | Immersion Corporation | System for haptically representing sensor input |
US9178509B2 (en) | 2012-09-28 | 2015-11-03 | Apple Inc. | Ultra low travel keyboard |
US9202350B2 (en) * | 2012-12-19 | 2015-12-01 | Nokia Technologies Oy | User interfaces and associated methods |
US9880623B2 (en) * | 2013-01-24 | 2018-01-30 | Immersion Corporation | Friction modulation for three dimensional relief in a haptic device |
CN103970310A (en) * | 2013-01-24 | 2014-08-06 | 宏碁股份有限公司 | Touch control device and touch control method |
WO2014125857A1 (en) * | 2013-02-13 | 2014-08-21 | Necカシオモバイルコミュニケーションズ株式会社 | Input device and control method therefor, and program |
US10578499B2 (en) | 2013-02-17 | 2020-03-03 | Microsoft Technology Licensing, Llc | Piezo-actuated virtual buttons for touch surfaces |
US9189098B2 (en) | 2013-03-14 | 2015-11-17 | Immersion Corporation | Systems and methods for syncing haptic feedback calls |
US9547366B2 (en) * | 2013-03-14 | 2017-01-17 | Immersion Corporation | Systems and methods for haptic and gesture-driven paper simulation |
US9939900B2 (en) * | 2013-04-26 | 2018-04-10 | Immersion Corporation | System and method for a haptically-enabled deformable surface |
US9405369B2 (en) | 2013-04-26 | 2016-08-02 | Immersion Corporation, Inc. | Simulation of tangible user interface interactions and gestures using array of haptic cells |
US20140340316A1 (en) * | 2013-05-14 | 2014-11-20 | Microsoft Corporation | Feedback for Gestures |
US10120447B2 (en) | 2013-06-24 | 2018-11-06 | Northwestern University | Haptic display with simultaneous sensing and actuation |
US10359857B2 (en) | 2013-07-18 | 2019-07-23 | Immersion Corporation | Usable hidden controls with haptic feedback |
WO2015020663A1 (en) | 2013-08-08 | 2015-02-12 | Honessa Development Laboratories Llc | Sculpted waveforms with no or reduced unforced response |
US9711014B2 (en) | 2013-09-06 | 2017-07-18 | Immersion Corporation | Systems and methods for generating haptic effects associated with transitions in audio signals |
US9619980B2 (en) | 2013-09-06 | 2017-04-11 | Immersion Corporation | Systems and methods for generating haptic effects associated with audio signals |
US9514620B2 (en) * | 2013-09-06 | 2016-12-06 | Immersion Corporation | Spatialized haptic feedback based on dynamically scaled values |
US9652945B2 (en) * | 2013-09-06 | 2017-05-16 | Immersion Corporation | Method and system for providing haptic effects based on information complementary to multimedia content |
US9576445B2 (en) | 2013-09-06 | 2017-02-21 | Immersion Corp. | Systems and methods for generating haptic effects associated with an envelope in audio signals |
US9779592B1 (en) | 2013-09-26 | 2017-10-03 | Apple Inc. | Geared haptic feedback element |
CN105579928A (en) | 2013-09-27 | 2016-05-11 | 苹果公司 | Band with haptic actuators |
WO2015047343A1 (en) | 2013-09-27 | 2015-04-02 | Honessa Development Laboratories Llc | Polarized magnetic actuators for haptic response |
US10126817B2 (en) | 2013-09-29 | 2018-11-13 | Apple Inc. | Devices and methods for creating haptic effects |
US10236760B2 (en) | 2013-09-30 | 2019-03-19 | Apple Inc. | Magnetic actuators for haptic response |
US9921649B2 (en) | 2013-10-07 | 2018-03-20 | Immersion Corporation | Electrostatic haptic based user input elements |
US9317118B2 (en) | 2013-10-22 | 2016-04-19 | Apple Inc. | Touch surface for simulating materials |
US20150123913A1 (en) * | 2013-11-06 | 2015-05-07 | Andrew Kerdemelidis | Apparatus and method for producing lateral force on a touchscreen |
US9639158B2 (en) * | 2013-11-26 | 2017-05-02 | Immersion Corporation | Systems and methods for generating friction and vibrotactile effects |
WO2015088491A1 (en) | 2013-12-10 | 2015-06-18 | Bodhi Technology Ventures Llc | Band attachment mechanism with haptic response |
JP2015130168A (en) * | 2013-12-31 | 2015-07-16 | イマージョン コーポレーションImmersion Corporation | Friction augmented control, and method to convert buttons of touch control panels to friction augmented controls |
JP6644466B2 (en) | 2013-12-31 | 2020-02-12 | イマージョン コーポレーションImmersion Corporation | System and method for providing tactile notification |
US9448631B2 (en) | 2013-12-31 | 2016-09-20 | Microsoft Technology Licensing, Llc | Input device haptics and pressure sensing |
US9501912B1 (en) | 2014-01-27 | 2016-11-22 | Apple Inc. | Haptic feedback device with a rotating mass of variable eccentricity |
EP3108344A4 (en) | 2014-02-21 | 2017-11-22 | Northwestern University | Haptic display with simultaneous sensing and actuation |
CN103793062B (en) * | 2014-03-05 | 2016-09-28 | 吉林大学 | The self adaptation multiple spot electrostatic force tactile representation device of application impedance detection and method |
JP6381240B2 (en) * | 2014-03-14 | 2018-08-29 | キヤノン株式会社 | Electronic device, tactile sensation control method, and program |
US9594429B2 (en) | 2014-03-27 | 2017-03-14 | Apple Inc. | Adjusting the level of acoustic and haptic output in haptic devices |
WO2015163842A1 (en) | 2014-04-21 | 2015-10-29 | Yknots Industries Llc | Apportionment of forces for multi-touch input devices of electronic devices |
KR101846256B1 (en) * | 2014-05-09 | 2018-05-18 | 삼성전자주식회사 | Tactile feedback apparatus and method for providing tactile feeling |
DE102015209639A1 (en) | 2014-06-03 | 2015-12-03 | Apple Inc. | Linear actuator |
US9606624B2 (en) * | 2014-07-02 | 2017-03-28 | Immersion Corporation | Systems and methods for surface elements that provide electrostatic haptic effects |
US9696806B2 (en) * | 2014-07-02 | 2017-07-04 | Immersion Corporation | Systems and methods for multi-output electrostatic haptic effects |
US9886090B2 (en) | 2014-07-08 | 2018-02-06 | Apple Inc. | Haptic notifications utilizing haptic input devices |
JP6639771B2 (en) * | 2014-07-09 | 2020-02-05 | ユニバーシティ・オブ・タンペレUniversity of Tampere | Tactile imaging system |
US9710063B2 (en) | 2014-07-21 | 2017-07-18 | Immersion Corporation | Systems and methods for determining haptic effects for multi-touch input |
JP6283280B2 (en) * | 2014-08-05 | 2018-02-21 | 株式会社Nttドコモ | Electronic book browsing apparatus and electronic book browsing method |
US9690381B2 (en) | 2014-08-21 | 2017-06-27 | Immersion Corporation | Systems and methods for shape input and output for a haptically-enabled deformable surface |
KR102373337B1 (en) | 2014-09-02 | 2022-03-11 | 애플 인크. | Semantic framework for variable haptic output |
WO2016036671A2 (en) | 2014-09-02 | 2016-03-10 | Apple Inc. | Haptic notifications |
WO2016038677A1 (en) * | 2014-09-09 | 2016-03-17 | 三菱電機株式会社 | Tactile sensation control system and tactile sensation control method |
JP6498403B2 (en) * | 2014-09-10 | 2019-04-10 | 三菱電機株式会社 | Tactile control system |
US9927786B2 (en) | 2014-10-02 | 2018-03-27 | Anne Dewitte | Expandable and collapsible shape element for a programmable shape surface |
US10146308B2 (en) * | 2014-10-14 | 2018-12-04 | Immersion Corporation | Systems and methods for impedance coupling for haptic devices |
KR102357599B1 (en) * | 2014-11-24 | 2022-02-03 | 엘지디스플레이 주식회사 | Portable electronic device |
US9535550B2 (en) | 2014-11-25 | 2017-01-03 | Immersion Corporation | Systems and methods for deformation-based haptic effects |
US9971406B2 (en) * | 2014-12-05 | 2018-05-15 | International Business Machines Corporation | Visually enhanced tactile feedback |
WO2016123351A1 (en) | 2015-01-30 | 2016-08-04 | Immersion Corporation | Electrostatic haptic actuator and user interface with an electrostatic haptic actuator |
KR101554256B1 (en) * | 2015-02-16 | 2015-09-18 | 박동현 | Method of displaying characters for the blind using haptic patterns, a touchscreen the method applied thereto, and a display using the touchscreen |
US10353467B2 (en) | 2015-03-06 | 2019-07-16 | Apple Inc. | Calibration of haptic devices |
US9750322B2 (en) | 2015-03-08 | 2017-09-05 | Apple Inc. | Co-molded ceramic and polymer structure |
AU2016100399B4 (en) | 2015-04-17 | 2017-02-02 | Apple Inc. | Contracting and elongating materials for providing input and output for an electronic device |
US10703680B2 (en) | 2015-05-25 | 2020-07-07 | Apple Inc. | Fiber-reinforced ceramic matrix composite for electronic devices |
US10222889B2 (en) | 2015-06-03 | 2019-03-05 | Microsoft Technology Licensing, Llc | Force inputs and cursor control |
US10416799B2 (en) | 2015-06-03 | 2019-09-17 | Microsoft Technology Licensing, Llc | Force sensing and inadvertent input control of an input device |
KR101630430B1 (en) * | 2015-06-29 | 2016-06-13 | 박동현 | Method of displaying characters for the blind using haptic patterns |
MA42907A (en) * | 2015-07-07 | 2018-05-16 | Mankiewicz Gebr & Co Gmbh & Co Kg | COATINGS WITH MICROSTRUCTURED SURFACES AND THEIR USE IN DASHBOARDS, SWITCHING CONSOLES AND CONTROL PANELS |
US20170024010A1 (en) | 2015-07-21 | 2017-01-26 | Apple Inc. | Guidance device for the sensory impaired |
KR101650099B1 (en) * | 2015-07-23 | 2016-08-23 | 린츠 주식회사 | A wearable haptic pattern display device for blind |
JP6618734B2 (en) * | 2015-08-28 | 2019-12-11 | 株式会社デンソーテン | Input device and display device |
WO2017044618A1 (en) | 2015-09-08 | 2017-03-16 | Apple Inc. | Linear actuators for use in electronic devices |
US10013060B2 (en) | 2015-09-18 | 2018-07-03 | Immersion Corporation | Systems and methods for providing haptic effects in response to deformation of a cover for an electronic device |
US11104616B2 (en) | 2015-09-30 | 2021-08-31 | Apple Inc. | Ceramic having a residual compressive stress for use in electronic devices |
US10664053B2 (en) * | 2015-09-30 | 2020-05-26 | Apple Inc. | Multi-transducer tactile user interface for electronic devices |
KR102422461B1 (en) * | 2015-11-06 | 2022-07-19 | 삼성전자 주식회사 | Method for providing a haptic and electronic device supporting the same |
US10152125B2 (en) | 2015-11-20 | 2018-12-11 | Immersion Corporation | Haptically enabled flexible devices |
US9990078B2 (en) | 2015-12-11 | 2018-06-05 | Immersion Corporation | Systems and methods for position-based haptic effects |
US9895607B2 (en) | 2015-12-15 | 2018-02-20 | Igt Canada Solutions Ulc | Haptic feedback on a gaming terminal display |
WO2017100901A1 (en) * | 2015-12-15 | 2017-06-22 | Igt Canada Solutions Ulc | Haptic feedback on a gaming terminal display |
US9875625B2 (en) | 2015-12-18 | 2018-01-23 | Immersion Corporation | Systems and methods for multifunction haptic output devices |
US10976819B2 (en) * | 2015-12-28 | 2021-04-13 | Microsoft Technology Licensing, Llc | Haptic feedback for non-touch surface interaction |
CN105700795A (en) * | 2015-12-30 | 2016-06-22 | 宇龙计算机通信科技(深圳)有限公司 | Method and device for setting vibration intensity of mobile terminal, and mobile terminal |
WO2017119107A1 (en) | 2016-01-07 | 2017-07-13 | 富士通株式会社 | Electronic device and drive control method for electronic device |
US10061385B2 (en) | 2016-01-22 | 2018-08-28 | Microsoft Technology Licensing, Llc | Haptic feedback for a touch input device |
US10039080B2 (en) | 2016-03-04 | 2018-07-31 | Apple Inc. | Situationally-aware alerts |
US9898903B2 (en) | 2016-03-07 | 2018-02-20 | Immersion Corporation | Systems and methods for haptic surface elements |
US10268272B2 (en) | 2016-03-31 | 2019-04-23 | Apple Inc. | Dampening mechanical modes of a haptic actuator using a delay |
CN105892664B (en) * | 2016-03-31 | 2021-05-18 | 联想(北京)有限公司 | Information processing method and electronic equipment |
US11604514B2 (en) | 2016-04-14 | 2023-03-14 | Apple Inc. | Substrate having a visually imperceptible texture for providing variable coefficients of friction between objects |
JP2019514139A (en) * | 2016-04-21 | 2019-05-30 | アップル インコーポレイテッドApple Inc. | Tactile user interface for electronic devices |
US10585480B1 (en) | 2016-05-10 | 2020-03-10 | Apple Inc. | Electronic device with an input device having a haptic engine |
US10444838B2 (en) * | 2016-05-17 | 2019-10-15 | Immersion Corporation | Thermally activated haptic output device |
US9983675B2 (en) * | 2016-06-10 | 2018-05-29 | Immersion Corporation | Systems and methods for monitoring insulation integrity for electrostatic friction |
DK201670737A1 (en) | 2016-06-12 | 2018-01-22 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Providing Haptic Feedback |
DK179823B1 (en) | 2016-06-12 | 2019-07-12 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
US10649529B1 (en) | 2016-06-28 | 2020-05-12 | Apple Inc. | Modification of user-perceived feedback of an input device using acoustic or haptic output |
JP2019519856A (en) | 2016-07-08 | 2019-07-11 | イマージョン コーポレーションImmersion Corporation | Multimodal haptic effect |
US10845878B1 (en) | 2016-07-25 | 2020-11-24 | Apple Inc. | Input device with tactile feedback |
US10416771B2 (en) | 2016-08-03 | 2019-09-17 | Apple Inc. | Haptic output system for user input surface |
US10264690B2 (en) | 2016-09-01 | 2019-04-16 | Apple Inc. | Ceramic sintering for uniform color for a housing of an electronic device |
US11088718B2 (en) | 2016-09-06 | 2021-08-10 | Apple Inc. | Multi-colored ceramic housings for an electronic device |
EP3674871A1 (en) * | 2016-09-06 | 2020-07-01 | Apple Inc. | Devices, methods, and graphical user interfaces for providing haptic feedback |
DK201670720A1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs |
DK179278B1 (en) | 2016-09-06 | 2018-03-26 | Apple Inc | Devices, methods and graphical user interfaces for haptic mixing |
US10372214B1 (en) | 2016-09-07 | 2019-08-06 | Apple Inc. | Adaptable user-selectable input area in an electronic device |
US10420226B2 (en) | 2016-09-21 | 2019-09-17 | Apple Inc. | Yttria-sensitized zirconia |
US10261586B2 (en) | 2016-10-11 | 2019-04-16 | Immersion Corporation | Systems and methods for providing electrostatic haptic effects via a wearable or handheld device |
JP6794763B2 (en) * | 2016-10-17 | 2020-12-02 | コニカミノルタ株式会社 | Display system |
US11678445B2 (en) | 2017-01-25 | 2023-06-13 | Apple Inc. | Spatial composites |
JP2018132929A (en) * | 2017-02-15 | 2018-08-23 | 株式会社デンソーテン | Control device and control method |
US10437359B1 (en) | 2017-02-28 | 2019-10-08 | Apple Inc. | Stylus with external magnetic influence |
WO2018164321A1 (en) * | 2017-03-09 | 2018-09-13 | 한양대학교 산학협력단 | Tactile sensation providing device and tactile display device using ultrasoinc wave |
CN115657850A (en) | 2017-03-29 | 2023-01-31 | 苹果公司 | Device with integrated interface system |
FR3065548B1 (en) * | 2017-04-24 | 2022-02-04 | Commissariat Energie Atomique | TACTILE STIMULATION INTERFACE BY TIME REVERSAL OFFERING ENRICHED SENSATIONS |
DK201770372A1 (en) | 2017-05-16 | 2019-01-08 | Apple Inc. | Tactile feedback for locked device user interfaces |
US10627906B2 (en) | 2017-06-02 | 2020-04-21 | International Business Machines Corporation | Tactile display using microscale electrostatic accelerators |
ES2863276T3 (en) * | 2017-06-28 | 2021-10-11 | Ericsson Telefon Ab L M | Flexible communication device and method to change the shape of the device |
US10622538B2 (en) | 2017-07-18 | 2020-04-14 | Apple Inc. | Techniques for providing a haptic output and sensing a haptic input using a piezoelectric body |
US10775889B1 (en) | 2017-07-21 | 2020-09-15 | Apple Inc. | Enclosure with locally-flexible regions |
KR102494625B1 (en) * | 2017-08-28 | 2023-02-01 | 삼성디스플레이 주식회사 | Display device |
CN107688416A (en) * | 2017-08-30 | 2018-02-13 | 京东方科技集团股份有限公司 | Touch base plate, touch-screen, electronic equipment and method of toch control |
US10768747B2 (en) | 2017-08-31 | 2020-09-08 | Apple Inc. | Haptic realignment cues for touch-input displays |
US10416772B2 (en) * | 2017-09-06 | 2019-09-17 | Apple Inc. | Electrical haptic output array |
US11054932B2 (en) | 2017-09-06 | 2021-07-06 | Apple Inc. | Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module |
US10556252B2 (en) | 2017-09-20 | 2020-02-11 | Apple Inc. | Electronic device having a tuned resonance haptic actuation system |
US10694014B2 (en) | 2017-09-22 | 2020-06-23 | Apple Inc. | Haptic locomotion using wide-band actuator |
US10768738B1 (en) | 2017-09-27 | 2020-09-08 | Apple Inc. | Electronic device having a haptic actuator with magnetic augmentation |
US10775890B2 (en) | 2017-09-27 | 2020-09-15 | Apple Inc. | Electronic device having a piezoelectric body for friction haptics |
US10585482B2 (en) | 2017-09-27 | 2020-03-10 | Apple Inc. | Electronic device having a hybrid conductive coating for electrostatic haptics |
US10248211B1 (en) | 2017-09-28 | 2019-04-02 | Apple Inc. | Ground-shifted touch input sensor for capacitively driving an electrostatic plate |
KR102413568B1 (en) | 2017-09-29 | 2022-06-27 | 애플 인크. | multi-part device enclosure |
US10440848B2 (en) | 2017-12-20 | 2019-10-08 | Immersion Corporation | Conformable display with linear actuator |
CN108536280B (en) * | 2018-01-30 | 2021-01-01 | Jilin University | Electrostatic force and vibration fused tactile reproduction device and method applied to the finger pad
US10649531B2 (en) * | 2018-02-01 | 2020-05-12 | Microsoft Technology Licensing, Llc | Haptic effect on a touch input surface |
US11402669B2 (en) | 2018-04-27 | 2022-08-02 | Apple Inc. | Housing surface with tactile friction features |
WO2019226191A1 (en) * | 2018-05-25 | 2019-11-28 | Apple Inc. | Portable computer with dynamic display interface |
US10620705B2 (en) * | 2018-06-01 | 2020-04-14 | Google Llc | Vibrating the surface of an electronic device to raise the perceived height at a depression in the surface |
US10942571B2 (en) | 2018-06-29 | 2021-03-09 | Apple Inc. | Laptop computing device with discrete haptic regions |
US11112827B2 (en) | 2018-07-20 | 2021-09-07 | Apple Inc. | Electronic device with glass housing member |
US11258163B2 (en) | 2018-08-30 | 2022-02-22 | Apple Inc. | Housing and antenna architecture for mobile device |
US10705570B2 (en) | 2018-08-30 | 2020-07-07 | Apple Inc. | Electronic device housing with integrated antenna |
US10936071B2 (en) | 2018-08-30 | 2021-03-02 | Apple Inc. | Wearable electronic device with haptic rotatable input |
GB2577079B (en) | 2018-09-12 | 2021-05-12 | Sony Interactive Entertainment Inc | Portable device and system |
US10613678B1 (en) | 2018-09-17 | 2020-04-07 | Apple Inc. | Input device with haptic feedback |
US10966007B1 (en) | 2018-09-25 | 2021-03-30 | Apple Inc. | Haptic output system |
US10691211B2 (en) | 2018-09-28 | 2020-06-23 | Apple Inc. | Button providing force sensing and/or haptic output |
US10599223B1 (en) | 2018-09-28 | 2020-03-24 | Apple Inc. | Button providing force sensing and/or haptic output |
KR20210074344A (en) * | 2018-10-12 | 2021-06-21 | Vestel Elektronik Sanayi ve Ticaret A.S. | Navigation device and method for operating a navigation device
US20200192480A1 (en) * | 2018-12-18 | 2020-06-18 | Immersion Corporation | Systems and methods for providing haptic effects based on a user's motion or environment |
US11691912B2 (en) | 2018-12-18 | 2023-07-04 | Apple Inc. | Chemically strengthened and textured glass housing member |
US11199929B2 (en) | 2019-03-21 | 2021-12-14 | Apple Inc. | Antireflective treatment for textured enclosure components |
DE102019108971B4 (en) | 2019-04-05 | 2024-09-26 | Technische Universität Dresden | Display system, touchscreen with display system and method for operating a display system and computer program product |
CN114399014A (en) | 2019-04-17 | 2022-04-26 | Apple Inc. | Wireless locatable tag
US11372137B2 (en) | 2019-05-29 | 2022-06-28 | Apple Inc. | Textured cover assemblies for display applications |
US10827635B1 (en) * | 2019-06-05 | 2020-11-03 | Apple Inc. | Electronic device enclosure having a textured glass component |
US11192823B2 (en) | 2019-06-05 | 2021-12-07 | Apple Inc. | Electronic devices including laser-textured glass cover members |
US11109500B2 (en) | 2019-06-05 | 2021-08-31 | Apple Inc. | Textured glass component for an electronic device enclosure |
US11380470B2 (en) | 2019-09-24 | 2022-07-05 | Apple Inc. | Methods to control force in reluctance actuators based on flux related parameters |
US12013982B2 (en) * | 2019-09-26 | 2024-06-18 | Mitsubishi Electric Corporation | Tactile sensation presentation panel, tactile sensation presentation touch panel, and tactile sensation presentation touch display |
US12009576B2 (en) | 2019-12-03 | 2024-06-11 | Apple Inc. | Handheld electronic device |
CN111158474B (en) * | 2019-12-19 | 2021-10-22 | Vivo Mobile Communication Co., Ltd. | Interaction method and electronic device
DE102020110153A1 (en) | 2020-04-14 | 2021-10-14 | Bayerische Motoren Werke Aktiengesellschaft | Operating device with active and passive haptics and steering device with such an operating device |
US11281881B2 (en) * | 2020-05-05 | 2022-03-22 | Qualcomm Incorporated | Device including an ultrasonic fingerprint sensor and a coulomb force apparatus |
US11024135B1 (en) | 2020-06-17 | 2021-06-01 | Apple Inc. | Portable electronic device having a haptic button assembly |
DE102020004365A1 (en) | 2020-07-20 | 2022-01-20 | Daimler Ag | Method for generating haptic feedback |
US11897809B2 (en) | 2020-09-02 | 2024-02-13 | Apple Inc. | Electronic devices with textured glass and glass ceramic components |
CN112333329B (en) * | 2020-10-28 | 2022-04-26 | Vivo Mobile Communication Co., Ltd. | Unread information reminder method and apparatus, and electronic device
CN112596612A (en) | 2020-12-28 | 2021-04-02 | Beijing Xiaomi Mobile Software Co., Ltd. | Tactile feedback generation method, tactile feedback generation device, and storage medium
US11977683B2 (en) | 2021-03-12 | 2024-05-07 | Apple Inc. | Modular systems configured to provide localized haptic feedback using inertial actuators |
US20240272718A1 (en) * | 2021-05-31 | 2024-08-15 | Beijing Boe Technology Development Co., Ltd. | Display device |
US11809631B2 (en) | 2021-09-21 | 2023-11-07 | Apple Inc. | Reluctance haptic engine for an electronic device |
EP4202606A1 (en) * | 2021-12-21 | 2023-06-28 | Nokia Technologies Oy | User-device interaction |
WO2023118958A1 (en) * | 2021-12-23 | 2023-06-29 | Bosch Car Multimedia Portugal, S.A. | Haptic device |
WO2023126645A1 (en) * | 2021-12-27 | 2023-07-06 | Bosch Car Multimedia Portugal S.A | Interface and alert device with haptic and thermal feedback for autonomous vehicle |
Family Cites Families (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4680429A (en) * | 1986-01-15 | 1987-07-14 | Tektronix, Inc. | Touch panel |
WO1997020305A1 (en) * | 1995-11-30 | 1997-06-05 | Virtual Technologies, Inc. | Tactile feedback man-machine interface device |
US6211861B1 (en) | 1998-06-23 | 2001-04-03 | Immersion Corporation | Tactile mouse device |
JPH11203020A (en) | 1998-01-16 | 1999-07-30 | Fuji Xerox Co Ltd | Touch sense display device |
EP1717679B1 (en) * | 1998-01-26 | 2016-09-21 | Apple Inc. | Method for integrating manual input |
US6429846B2 (en) | 1998-06-23 | 2002-08-06 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
US6822635B2 (en) * | 2000-01-19 | 2004-11-23 | Immersion Corporation | Haptic interface for laptop computers and other portable devices |
JP2003216294A (en) * | 2002-01-25 | 2003-07-31 | Victor Co Of Japan Ltd | User interface device |
JP2003288158A (en) | 2002-01-28 | 2003-10-10 | Sony Corp | Mobile apparatus having tactile feedback function |
JP3852368B2 (en) | 2002-05-16 | 2006-11-29 | ソニー株式会社 | Input method and data processing apparatus |
US7456823B2 (en) | 2002-06-14 | 2008-11-25 | Sony Corporation | User interface apparatus and portable information apparatus |
JP3937982B2 (en) | 2002-08-29 | 2007-06-27 | ソニー株式会社 | INPUT / OUTPUT DEVICE AND ELECTRONIC DEVICE HAVING INPUT / OUTPUT DEVICE |
JP2004319255A (en) * | 2003-04-16 | 2004-11-11 | Equos Research Co Ltd | Pseudo tactile presentation device |
DE10340188A1 (en) * | 2003-09-01 | 2005-04-07 | Siemens Ag | Screen with a touch-sensitive user interface for command input |
WO2005121939A2 (en) * | 2004-06-10 | 2005-12-22 | Koninklijke Philips Electronics N.V. | Generating control signals using the impedance of parts of a living body for controlling a controllable device |
JP2006048302A (en) | 2004-08-03 | 2006-02-16 | Sony Corp | Piezoelectric complex unit, its manufacturing method, its handling method, its control method, input/output device and electronic equipment |
KR100682901B1 (en) * | 2004-11-17 | 2007-02-15 | Samsung Electronics Co., Ltd. | Apparatus and method for providing fingertip haptics of visual information using electro-active polymer in an image displaying device
JP2006163206A (en) * | 2004-12-09 | 2006-06-22 | NTT Docomo, Inc. | Tactile sense presentation device
US7542816B2 (en) | 2005-01-27 | 2009-06-02 | Outland Research, Llc | System, method and computer program product for automatically selecting, suggesting and playing music media files |
JP4360497B2 (en) * | 2005-03-09 | 2009-11-11 | The University of Tokyo | Electric tactile presentation device and electric tactile presentation method
JP4626376B2 (en) | 2005-04-25 | 2011-02-09 | ソニー株式会社 | Music content playback apparatus and music content playback method |
JP4736605B2 (en) * | 2005-08-01 | 2011-07-27 | ソニー株式会社 | Display device, information processing device, and control method thereof |
US20070118043A1 (en) | 2005-11-23 | 2007-05-24 | Microsoft Corporation | Algorithms for computing heart rate and movement speed of a user from sensor data |
US7509588B2 (en) * | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
JP2007188597A (en) | 2006-01-13 | 2007-07-26 | Sony Corp | Content reproduction device and content reproduction method, and program |
US8405618B2 (en) * | 2006-03-24 | 2013-03-26 | Northwestern University | Haptic device with indirect haptic feedback |
US8780053B2 (en) * | 2007-03-21 | 2014-07-15 | Northwestern University | Vibrating substrate for haptic interface |
US10152124B2 (en) * | 2006-04-06 | 2018-12-11 | Immersion Corporation | Systems and methods for enhanced haptic effects |
JP2007287005A (en) | 2006-04-19 | 2007-11-01 | Sony Corp | Information input/output device, information processing method and computer program |
JP4305671B2 (en) | 2006-08-22 | 2009-07-29 | ソニー株式会社 | HEALTH EXERCISE SUPPORT SYSTEM, PORTABLE MUSIC REPRODUCTION DEVICE, SERVICE INFORMATION PROVIDING DEVICE, INFORMATION PROCESSING DEVICE, HEALTH EXERCISE SUPPORT METHOD |
US20080068334A1 (en) | 2006-09-14 | 2008-03-20 | Immersion Corporation | Localized Haptic Feedback |
US20100315345A1 (en) * | 2006-09-27 | 2010-12-16 | Nokia Corporation | Tactile Touch Screen |
US20080097633A1 (en) | 2006-09-29 | 2008-04-24 | Texas Instruments Incorporated | Beat matching systems |
US7890863B2 (en) * | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
JP4346652B2 (en) * | 2007-01-26 | 2009-10-21 | NTT Docomo, Inc. | Information terminal and input method thereof
US20090015560A1 (en) * | 2007-07-13 | 2009-01-15 | Motorola, Inc. | Method and apparatus for controlling a display of a device |
FI20085475A0 (en) | 2008-05-19 | 2008-05-19 | Senseg Oy | Touch Device Interface |
EP2203797B1 (en) * | 2007-09-18 | 2014-07-09 | Senseg OY | Method and apparatus for sensory stimulation |
GB0721475D0 (en) * | 2007-11-01 | 2007-12-12 | Asquith Anthony | Virtual buttons enabled by embedded inertial sensors |
US9357052B2 (en) * | 2008-06-09 | 2016-05-31 | Immersion Corporation | Developing a notification framework for electronic device events |
US9733704B2 (en) * | 2008-06-12 | 2017-08-15 | Immersion Corporation | User interface impact actuator |
US10019061B2 (en) * | 2008-07-15 | 2018-07-10 | Immersion Corporation | Systems and methods for haptic message transmission |
KR101625360B1 (en) * | 2008-08-12 | 2016-05-30 | Koninklijke Philips N.V. | Motion detection system
US8749495B2 (en) | 2008-09-24 | 2014-06-10 | Immersion Corporation | Multiple actuation handheld device |
JP2010086471A (en) * | 2008-10-02 | 2010-04-15 | Sony Corp | Operation feeling providing device, and operation feeling feedback method, and program |
KR20100039024A (en) * | 2008-10-07 | 2010-04-15 | LG Electronics Inc. | Mobile terminal and method for controlling display thereof
US8823542B2 (en) | 2008-11-26 | 2014-09-02 | Nokia Corporation | Apparatus and methods relevant to electronic devices |
US9746923B2 (en) | 2009-03-12 | 2017-08-29 | Immersion Corporation | Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction |
EP3467624A1 (en) * | 2009-03-12 | 2019-04-10 | Immersion Corporation | System and method for interfaces featuring surface-based haptic effects |
US9696803B2 (en) | 2009-03-12 | 2017-07-04 | Immersion Corporation | Systems and methods for friction displays and additional haptic effects |
US9927873B2 (en) * | 2009-03-12 | 2018-03-27 | Immersion Corporation | Systems and methods for using textures in graphical user interface widgets |
JP5779508B2 (en) * | 2009-03-12 | 2015-09-16 | Immersion Corporation | System and method for a texture engine
US8570291B2 (en) * | 2009-05-21 | 2013-10-29 | Panasonic Corporation | Tactile processing device |
US8378797B2 (en) * | 2009-07-17 | 2013-02-19 | Apple Inc. | Method and apparatus for localization of haptic feedback |
US8441465B2 (en) * | 2009-08-17 | 2013-05-14 | Nokia Corporation | Apparatus comprising an optically transparent sheet and related methods |
US10254824B2 (en) | 2009-10-16 | 2019-04-09 | Immersion Corporation | Systems and methods for output of content based on sensing an environmental factor |
US8902050B2 (en) | 2009-10-29 | 2014-12-02 | Immersion Corporation | Systems and methods for haptic augmentation of voice-to-text conversion |
- 2011
  - 2011-04-22 US US13/092,269 patent/US9448713B2/en active Active
- 2012
  - 2012-04-16 KR KR1020197028083A patent/KR20190112192A/en not_active Application Discontinuation
  - 2012-04-16 CN CN201280025757.5A patent/CN103562827B/en not_active Expired - Fee Related
  - 2012-04-16 CN CN201710189757.3A patent/CN106940622A/en active Pending
  - 2012-04-16 JP JP2014506471A patent/JP6234364B2/en not_active Expired - Fee Related
  - 2012-04-16 KR KR1020137031069A patent/KR102027307B1/en active IP Right Grant
  - 2012-04-16 EP EP12773915.9A patent/EP2702468B1/en active Active
  - 2012-04-16 WO PCT/US2012/033743 patent/WO2012145264A1/en unknown
  - 2012-04-16 EP EP19169687.1A patent/EP3531251A1/en not_active Withdrawn
- 2016
  - 2016-09-23 JP JP2016186095A patent/JP2017033586A/en not_active Ceased
- 2018
  - 2018-07-26 JP JP2018140510A patent/JP2018195335A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5149918A (en) * | 1990-10-29 | 1992-09-22 | International Business Machines Corporation | Touch sensitive overlay |
US7522152B2 (en) | 2004-05-27 | 2009-04-21 | Immersion Corporation | Products and processes for providing haptic feedback in resistive interface devices |
US20100231539A1 (en) | 2009-03-12 | 2010-09-16 | Immersion Corporation | Systems and Methods for Interfaces Featuring Surface-Based Haptic Effects |
Non-Patent Citations (1)
Title |
---|
See also references of EP2702468A4 * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103793051A (en) * | 2012-10-31 | 2014-05-14 | Immersion Corporation | Method and apparatus for simulating surface features on a user interface with haptic effects
US10591994B2 (en) | 2012-10-31 | 2020-03-17 | Immersion Corporation | Method and apparatus for simulating surface features on a user interface with haptic effects |
CN103793051B (en) * | 2012-10-31 | 2019-07-23 | Immersion Corporation | Method and apparatus for simulating surface features on a user interface with haptic effects
US10139912B2 (en) | 2012-10-31 | 2018-11-27 | Immersion Corporation | Method and apparatus for simulating surface features on a user interface with haptic effects |
JP2014102649A (en) * | 2012-11-19 | 2014-06-05 | Aisin Aw Co Ltd | Operation support system, operation support method, and computer program |
US9836150B2 (en) | 2012-11-20 | 2017-12-05 | Immersion Corporation | System and method for feedforward and feedback with haptic effects |
JP2018110000A (en) * | 2012-11-20 | 2018-07-12 | Immersion Corporation | System and method for feedforward and feedback with haptic effects
JP2018136958A (en) * | 2012-11-20 | 2018-08-30 | Immersion Corporation | System and method for simulated physical interactions with haptic effects
US10078384B2 (en) | 2012-11-20 | 2018-09-18 | Immersion Corporation | Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction |
JP2018185823A (en) * | 2012-11-20 | 2018-11-22 | Immersion Corporation | System and method for providing mode or state awareness with programmable surface texture
JP2014102830A (en) * | 2012-11-20 | 2014-06-05 | Immersion Corp | System and method for simulated physical interactions with haptic effects |
JP2014102829A (en) * | 2012-11-20 | 2014-06-05 | Immersion Corp | System and method for feedforward and feedback with haptic effects |
CN103838421A (en) * | 2012-11-20 | 2014-06-04 | Immersion Corporation | Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction
KR20140105303A (en) * | 2013-02-22 | 2014-09-01 | Samsung Electronics Co., Ltd. | Method for displaying user input applying texture of background image and apparatus for the same
KR102048015B1 (en) * | 2013-02-22 | 2019-11-25 | Samsung Electronics Co., Ltd. | Method for displaying user input applying texture of background image and apparatus for the same
JP2014206970A (en) * | 2013-03-15 | 2014-10-30 | Immersion Corporation | User interface device provided with surface haptic sensations
JP2015015027A (en) * | 2013-07-02 | 2015-01-22 | Immersion Corporation | Systems and methods for perceptual normalization of haptic effects
US11747907B2 (en) | 2020-01-07 | 2023-09-05 | Mitsubishi Electric Corporation | Tactile presentation panel, tactile presentation touch panel, and tactile presentation touch display |
Also Published As
Publication number | Publication date |
---|---|
CN103562827B (en) | 2017-04-26 |
EP2702468A4 (en) | 2015-02-25 |
KR20190112192A (en) | 2019-10-02 |
US9448713B2 (en) | 2016-09-20 |
JP2017033586A (en) | 2017-02-09 |
KR102027307B1 (en) | 2019-11-04 |
JP2014512619A (en) | 2014-05-22 |
CN106940622A (en) | 2017-07-11 |
CN103562827A (en) | 2014-02-05 |
KR20140040134A (en) | 2014-04-02 |
US20120268412A1 (en) | 2012-10-25 |
EP2702468B1 (en) | 2019-06-12 |
JP2018195335A (en) | 2018-12-06 |
EP2702468A1 (en) | 2014-03-05 |
JP6234364B2 (en) | 2017-11-22 |
EP3531251A1 (en) | 2019-08-28 |
Similar Documents
Publication | Title |
---|---|
US9448713B2 (en) | Electro-vibrotactile display |
EP2876528B1 (en) | Systems and methods for generating friction and vibrotactile effects |
JP6581173B2 (en) | Tactile output device and method for generating a haptic effect in a tactile output device |
US10551924B2 (en) | Mobile device configured to receive squeeze input |
CN103838421B (en) | Method and apparatus for providing haptic cues for guidance and alignment with electrostatic friction |
EP2626775A2 (en) | Method and apparatus for haptic flex gesturing |
KR101202734B1 (en) | Touch screen and operating method of the same |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12773915; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2014506471; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 20137031069; Country of ref document: KR; Kind code of ref document: A |