CN102349038A - Systems and methods for a texture engine - Google Patents

Systems and methods for a texture engine

Info

Publication number
CN102349038A
CN102349038A CN2010800117439A CN201080011743A
Authority
CN
China
Prior art keywords
haptic effect
processor
touch
actuator
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2010800117439A
Other languages
Chinese (zh)
Other versions
CN102349038B (en)
Inventor
Juan Manuel Cruz-Hernandez
Danny A. Grant
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/696,900 external-priority patent/US9696803B2/en
Priority claimed from US12/697,042 external-priority patent/US10564721B2/en
Priority claimed from US12/697,010 external-priority patent/US9874935B2/en
Priority claimed from US12/696,908 external-priority patent/US10007340B2/en
Priority claimed from US12/696,893 external-priority patent/US9746923B2/en
Priority claimed from US12/697,037 external-priority patent/US9927873B2/en
Priority to CN201610662488.3A priority Critical patent/CN106339169B/en
Application filed by Immersion Corp filed Critical Immersion Corp
Publication of CN102349038A publication Critical patent/CN102349038A/en
Application granted granted Critical
Publication of CN102349038B publication Critical patent/CN102349038B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B06GENERATING OR TRANSMITTING MECHANICAL VIBRATIONS IN GENERAL
    • B06BMETHODS OR APPARATUS FOR GENERATING OR TRANSMITTING MECHANICAL VIBRATIONS OF INFRASONIC, SONIC, OR ULTRASONIC FREQUENCY, e.g. FOR PERFORMING MECHANICAL WORK IN GENERAL
    • B06B1/00Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency
    • B06B1/02Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency making use of electrical energy
    • B06B1/06Methods or apparatus for generating mechanical vibrations of infrasonic, sonic, or ultrasonic frequency making use of electrical energy operating with piezoelectric effect or with electrostriction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1601Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/04Texture mapping
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02NELECTRIC MACHINES NOT OTHERWISE PROVIDED FOR
    • H02N2/00Electric machines in general using piezoelectric effect, electrostriction or magnetostriction
    • H02N2/02Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing linear motion, e.g. actuators; Linear positioners ; Linear motors
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02NELECTRIC MACHINES NOT OTHERWISE PROVIDED FOR
    • H02N2/00Electric machines in general using piezoelectric effect, electrostriction or magnetostriction
    • H02N2/02Electric machines in general using piezoelectric effect, electrostriction or magnetostriction producing linear motion, e.g. actuators; Linear positioners ; Linear motors
    • H02N2/06Drive circuits; Control arrangements or methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/014Force feedback applied to GUI
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B6/00Tactile signalling systems, e.g. personal calling systems

Abstract

Systems and methods for a texture engine are disclosed. For example, one disclosed system includes: a processor configured to receive a display signal including a plurality of pixels, determine a haptic effect comprising a texture, and transmit a haptic signal associated with the haptic effect to an actuator in communication with the processor, the actuator configured to receive the haptic signal and output the haptic effect.

Description

Systems and methods for a texture engine
Cross-Reference to Related Applications
This patent application claims priority to U.S. Provisional Patent Application No. 61/159,482, entitled "Locating Features Using a Friction Display," filed March 12, 2009, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Provisional Patent Application No. 61/262,041, entitled "System and Method for Increasing Haptic Bandwidth in an Electronic Device," filed November 17, 2009, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Provisional Patent Application No. 61/262,038, entitled "Friction Rotary Device for Haptic Feedback," filed November 17, 2009, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Utility Patent Application No. 12/696,893, entitled "Systems And Methods For Providing Features In A Friction Display," filed January 29, 2010, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Utility Patent Application No. 12/696,900, entitled "Systems And Methods For Friction Displays And Additional Haptic Effects," filed January 29, 2010, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Utility Patent Application No. 12/696,908, entitled "Systems And Methods For Interfaces Featuring Surface-Based Haptic Effects," filed January 29, 2010, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Utility Patent Application No. 12/697,010, entitled "Systems And Methods For A Texture Engine," filed January 29, 2010, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Utility Patent Application No. 12/697,037, entitled "Systems And Methods For Using Textures In Graphical User Interface Widgets," filed January 29, 2010, the entirety of which is hereby incorporated herein by reference.
This patent application claims priority to U.S. Utility Patent Application No. 12/697,042, entitled "Systems And Methods For Using Multiple Actuators To Realize Textures," filed January 29, 2010, the entirety of which is hereby incorporated herein by reference.
Technical Field
The present invention relates generally to haptic feedback, and more particularly to systems and methods for a texture engine.
Background
Over the past several years, the use of handheld devices of all types has grown exponentially. These devices are used as portable organizers, telephones, music players, and gaming systems. Many modern handheld devices now incorporate some type of haptic feedback. As haptic technology improves, devices may incorporate haptic feedback that simulates textures. Accordingly, there is a need for a haptic texture engine.
Summary of the Invention
Embodiments of the present invention provide systems and methods for a texture engine. For example, in one embodiment, a system for a texture engine comprises: a processor configured to receive a display signal comprising a plurality of pixels, determine a haptic effect comprising a texture, and transmit a haptic signal associated with the haptic effect to an actuator in communication with the processor, the actuator configured to receive the haptic signal and output the haptic effect.
This illustrative embodiment is mentioned not to limit or define the invention, but to provide an example to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, which provides further description of the invention. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
Brief Description of the Drawings
These and other features, aspects, and advantages of the present invention are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:
Fig. 1 is a block diagram of a system for a texture engine according to one embodiment of the present invention;
Fig. 2 is an illustration of a system for a texture engine according to one embodiment of the present invention;
Fig. 3a is an illustration of a system for a texture engine according to one embodiment of the present invention;
Fig. 3b is an illustration of a system for a texture engine according to one embodiment of the present invention;
Fig. 4 is a flow chart of a method for a texture engine according to one embodiment of the present invention;
Fig. 5a is an illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention;
Fig. 5b is another illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention;
Fig. 5c is another illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention;
Fig. 5d is another illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention;
Fig. 5e is another illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention;
Fig. 5f is another illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention;
Fig. 5g is another illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention; and
Fig. 5h is another illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention.
Detailed Description
Embodiments of the present invention provide systems and methods for a texture engine.
Illustrative Embodiment of a Texture Engine
One illustrative embodiment of the present invention comprises a messaging device, such as a mobile phone. In the illustrative embodiment, the messaging device comprises the Samsung Haptic Phone (SCH-W420) equipped with Immersion Corporation's TouchSense 3000, TouchSense 4000, or TouchSense 5000 vibrotactile feedback systems (formerly known as Immersion Corporation's VibeTonz vibrotactile feedback system). In other embodiments, different messaging devices and haptic feedback systems may be utilized.
The illustrative messaging device comprises a display, a speaker, a network interface, a memory, and a processor in communication with each of these elements. The illustrative messaging device also comprises a touch-sensitive interface and an actuator, both of which are in communication with the processor. The touch-sensitive interface is configured to sense a user's interaction with the messaging device, and the actuator is configured to output a haptic effect. The illustrative messaging device may further comprise a manipulandum configured to detect a user interaction and transmit an interface signal associated with the user interaction to the processor.
In the illustrative messaging device, the display is configured to display a graphical user interface to the user. The graphical user interface may comprise virtual objects, for example icons, buttons, or a virtual keyboard. The illustrative messaging device further comprises a touch-sensitive interface, such as a touch-screen, mounted overtop of the display. The touch-sensitive interface allows the user to interact with the virtual objects displayed in the graphical user interface. For example, in one embodiment, the graphical user interface may comprise a virtual keyboard; in such an embodiment, the touch-sensitive interface allows the user to touch a key on the virtual keyboard to input the alphanumeric character associated with that key. This functionality may be used to type messages or otherwise interact with objects in the graphical user interface.
In the illustrative messaging device, the processor is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to an actuator configured to output the haptic effect. In the illustrative messaging device, this haptic effect simulates a texture that the user feels on the surface of the touch-sensitive interface. The simulated texture may be associated with the user interface shown on the display. For example, the display may show an icon comprising the shape of a rock. In such an embodiment, the processor may determine a haptic effect configured to simulate the texture of a rock on the surface of the touch-sensitive interface. The processor then transmits the haptic signal to the actuator configured to output the haptic effect. When the actuator receives the haptic signal, it will output a haptic effect, such as a vibration, at a frequency configured to cause the surface of the touch-sensitive interface to approximate the texture of a rock.
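The frequency-based vibration described above can be sketched minimally as follows. This is an illustration only, not the patent's implementation; the function name, sample rate, and any particular frequency choice are assumptions.

```python
import math

def vibration_samples(frequency_hz, duration_s, sample_rate=8000):
    """Generate normalized drive samples (-1.0 to 1.0) for a vibrotactile
    actuator, vibrating at a frequency chosen to approximate a texture."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * frequency_hz * t / sample_rate)
            for t in range(n)]
```

A texture such as the rock in the example above would correspond to one particular frequency (and amplitude) choice fed to a routine like this.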
In the illustrative embodiment, the processor may implement a haptic map to determine the haptic effect. For example, in the illustrative embodiment, the processor may receive a display signal comprising a plurality of pixels, each of the pixels associated with a color. For example, in the illustrative embodiment, each pixel in the display signal may be associated with the color red, green, or blue, and may further be associated with an intensity for each color. In the illustrative embodiment, the processor assigns a haptic value to each color and further assigns a haptic intensity associated with the intensity of each color. The processor then transmits a haptic signal comprising the haptic values and haptic intensities to the actuator configured to output the haptic effect.
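The haptic-mapping step described above can be sketched roughly as follows. The function and constant names are hypothetical, and selecting the dominant color channel of each pixel is an illustrative assumption, not a detail from the specification.

```python
# Hypothetical haptic values assigned to each color channel.
HAPTIC_VALUE_BY_CHANNEL = {"red": 1, "green": 2, "blue": 3}

def haptic_map(pixels):
    """Convert a display signal (a list of (R, G, B) tuples, 0-255) into a
    haptic signal: a list of (haptic_value, haptic_intensity) pairs."""
    signal = []
    for r, g, b in pixels:
        # Pick the pixel's dominant channel; its intensity (normalized to
        # 0.0-1.0) becomes the haptic intensity.
        channel, level = max(
            (("red", r), ("green", g), ("blue", b)), key=lambda c: c[1]
        )
        signal.append((HAPTIC_VALUE_BY_CHANNEL[channel], level / 255.0))
    return signal
```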
In the illustrative embodiment, the processor may further determine the haptic effect based on an external trigger. For example, in the illustrative embodiment, the processor is configured to receive an interface signal from a touch-sensitive interface configured to detect a user interaction. Then, in the illustrative embodiment, the processor will determine the haptic effect based at least in part on the interface signal. For example, the processor may modify the haptic value or haptic intensity based at least in part on the interface signal. In the illustrative embodiment, if the touch-sensitive interface detects a high-speed or high-pressure user interaction, the processor will determine a higher-intensity haptic effect.
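A minimal sketch of scaling haptic intensity from the interface signal might look like this; the thresholds, boost factor, and function name are assumptions for illustration, not values from the specification.

```python
def adjust_intensity(base_intensity, speed, pressure,
                     speed_threshold=50.0, pressure_threshold=0.8,
                     boost=1.5):
    """Return a higher haptic intensity (capped at 1.0) when the
    touch-sensitive interface reports a high-speed or high-pressure
    user interaction; otherwise return the base intensity unchanged."""
    if speed > speed_threshold or pressure > pressure_threshold:
        return min(1.0, base_intensity * boost)
    return base_intensity
```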
The illustrative messaging device may output haptic effects for a multitude of purposes. For example, in one embodiment, the haptic effect may serve as a confirmation that the processor has received an interface signal associated with a user interaction. For example, the graphical user interface may comprise a button, and the touch-sensitive interface may detect a user interaction associated with pressing the button and transmit an interface signal to the processor. In response, the processor may determine a haptic effect to confirm receipt of the interface signal. In such an embodiment, the haptic effect may cause the user to feel a texture on the surface of the touch-sensitive interface. In the illustrative embodiment, the processor may further determine haptic effects for other purposes. For example, the illustrative messaging device may output a texture to alert the user to boundaries on the display, or as an identification of objects such as icons on the surface of the display.
This illustrative example is given to introduce the reader to the general subject matter discussed herein. The invention is not limited to this example. The following sections describe various additional non-limiting embodiments and examples of systems and methods for a texture engine.
Illustrative Systems for a Texture Engine
Referring now to the drawings, in which like numerals indicate like elements throughout the several figures, Fig. 1 is a block diagram of a system for a texture engine according to one embodiment of the present invention. As shown in Fig. 1, the system 100 comprises a messaging device 102, such as a mobile phone, portable digital assistant (PDA), portable media player, portable computer, portable gaming device, or some other mobile device. In some embodiments, the messaging device 102 may comprise a laptop, tablet, desktop PC, or other similar device. In still other embodiments, the messaging device may comprise an external monitor for use with a PC or some other device. The messaging device 102 comprises a processor 110 in communication with a network interface 112, a touch-sensitive interface 114, a display 116, an actuator 118, a speaker 120, and a memory 122.
The processor 110 is configured to execute computer-executable program instructions stored in the memory 122. For example, the processor 110 may execute one or more computer programs for messaging or for generating haptic feedback. The processor 110 may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), or state machines. The processor 110 may further comprise a programmable electronic device such as a programmable logic controller (PLC), a programmable interrupt controller (PIC), a programmable logic device (PLD), a programmable read-only memory (PROM), an electronically programmable read-only memory (EPROM or EEPROM), or other similar devices.
The memory 122 comprises a computer-readable medium that stores instructions which, when executed by the processor 110, cause it to perform various steps, such as those described herein. Embodiments of computer-readable media may comprise, but are not limited to, electronic, optical, magnetic, or other storage or transmission devices capable of providing the processor 110 with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. In addition, various other devices may include computer-readable media, such as a router, private or public network, or other transmission device. The processor 110 and the processing described may be in one or more structures, and may be dispersed throughout one or more structures.
The processor 110 is in communication with the network interface 112. The network interface 112 may comprise one or more methods of mobile communication, such as infrared, radio, Wi-Fi, or cellular network communication. In other variations, the network interface 112 comprises a wired network interface, such as Ethernet. The messaging device 102 can be configured to exchange messages or virtual message objects with other devices (not shown) over networks such as a cellular network and/or the Internet. Embodiments of messages exchanged between devices may comprise voice messages, text messages, data messages, or other forms of digital messages.
The processor 110 is also in communication with one or more touch-sensitive interfaces 114. In some embodiments, the touch-sensitive interface 114 may comprise a touch-screen or a touch-pad. For example, in some embodiments, the touch-sensitive interface 114 may comprise a touch-screen mounted overtop of a display configured to receive a display signal and output an image to the user. In other embodiments, the touch-sensitive interface 114 may comprise an optical sensor or another type of sensor. In one embodiment, the touch-sensitive interface may comprise an LED detector. For example, in one embodiment, the touch-sensitive interface 114 may comprise an LED finger detector mounted on the side of the display 116. In some embodiments, the processor is in communication with a single touch-sensitive interface 114; in other embodiments, the processor is in communication with a plurality of touch-sensitive interfaces, for example, a first touch-screen and a second touch-screen. The touch-sensitive interface 114 is configured to detect user interaction and, based on the user interaction, transmit signals to the processor 110. In some embodiments, the touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, the touch-sensitive interface 114 may detect the speed and pressure of a user interaction and incorporate this information into the interface signal.
In the embodiment shown in Fig. 1, the processor 110 is also in communication with the display 116. The processor 110 can be configured to generate a graphical representation of a user interface to be shown on the display 116, then transmit a display signal comprising the graphical representation to the display 116. In other embodiments, the display 116 is configured to receive a display signal from another device. For example, in some embodiments, the display 116 may comprise an external display, such as a computer monitor. The display 116 is configured to receive a display signal and output an image associated with that display signal. In some embodiments, the display signal may comprise a VGA, HDMI, SVGA, video, S-video, or other type of display signal known in the art. In some embodiments, the display 116 comprises a flat-screen display, such as a liquid crystal display (LCD) or plasma screen display. In other embodiments, the display 116 comprises a cathode ray tube (CRT) or other type of display known in the art. In still other embodiments, the display 116 may comprise the touch-sensitive interface 114; for example, the display 116 may comprise a touch-screen LCD. In still other embodiments, the display 116 may comprise a flexible screen or flexible display. For example, in some embodiments, the display 116 may comprise a haptic substrate mounted underneath its surface. In such an embodiment, the display 116 is made of a flexible material, and in response to signals received from the processor 110, the haptic substrate flexes, forming ridges, troughs, or other features on the surface of the display 116. In some embodiments, the haptic substrate may comprise a plasma actuator, a piezoelectric actuator, an electro-active polymer, a micro-electro-mechanical system, a shape memory alloy, or a grid of fluid- or gas-filled cells.
In some embodiments, the processor 110 receives signals from the touch-sensitive interface 114 that are associated with an interaction with the graphical user interface shown on the display 116. For example, in one embodiment, the touch-sensitive interface 114 may comprise a touch-screen, and the graphical user interface on the display 116 may comprise a virtual keyboard. In such an embodiment, when the user interacts with a section of the touch-screen that overlays one of the keys of the virtual keyboard, the touch-screen will send an interface signal corresponding to that user interaction to the processor 110. Based on the interface signal, the processor 110 will determine that the user pressed a key on the virtual keyboard. This functionality allows the user to interact with other icons and virtual objects on the display 116. For example, in some embodiments, the user may touch the touch-screen to move a virtual ball or turn a virtual knob.
As shown in Fig. 1, the processor 110 is also in communication with an actuation system comprising one or more actuators 118, a suspension for each actuator, and power and control wiring for each actuator. In some embodiments, the messaging device 102 comprises more than one actuation system. The processor 110 is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to the actuator 118. In some embodiments, the haptic effect comprises a vibrotactile texture felt on the surface of the display 116, the touch-sensitive interface 114, or the housing of the messaging device 102. In some embodiments, determining a haptic effect may comprise performing a series of calculations. In other embodiments, determining a haptic effect may comprise accessing a lookup table. In still other embodiments, determining a haptic effect may comprise a combination of a lookup table and an algorithm.
In some embodiments, determining the haptic effect may comprise a haptic map. In such an embodiment, determining the haptic effect may comprise mapping the display signal to the actuators. For example, the display signal may comprise a plurality of pixels, each of the pixels associated with a color. In such an embodiment, each pixel may be associated with the color red, green, or blue; each color may further be associated with an intensity, for example, an intensity of 1-8. In such an embodiment, determining the haptic effect may comprise assigning a haptic effect to each color. In some embodiments, the haptic effect may comprise a direction and intensity of operation; for example, in one embodiment, the haptic signal may be configured to cause a rotary actuator to rotate clockwise at one-half power. In some embodiments, the intensity of operation may be associated with the intensity of the color. Once the processor 110 determines a haptic effect, it transmits a haptic signal comprising the haptic effect. In some embodiments, the processor 110 may assign a haptic effect to only some of the pixels in the display signal. For example, in such an embodiment, the haptic effect may be associated with only a portion of the display signal.
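The haptic map described above can be pictured as a per-color lookup applied to each pixel of the display signal. The following is a minimal illustrative sketch, not the claimed implementation: the color-to-operation assignment, the function name, and the rule that a color intensity of 1-8 scales actuator power from 0.0 to 1.0 are hypothetical examples consistent with the rotary-actuator example in the text.

```python
# Hypothetical assignment of an operation (a direction of rotation) to each
# color; colors not listed receive no haptic effect, since only a portion
# of the display signal need be mapped.
COLOR_TO_OPERATION = {
    "red": "rotate clockwise",
    "green": "rotate counter-clockwise",
}

def haptic_map(pixels):
    """Map (color, color_intensity) pixels to (operation, power) pairs,
    where a color intensity of 1-8 scales actuator power to 0.0-1.0."""
    mapped = []
    for color, intensity in pixels:
        operation = COLOR_TO_OPERATION.get(color)
        if operation is None:
            mapped.append(None)  # pixel not assigned a haptic effect
        else:
            mapped.append((operation, intensity / 8.0))
    return mapped

# A red pixel of intensity 4 maps to clockwise rotation at one-half power,
# mirroring the example in the text; blue is unmapped here.
print(haptic_map([("red", 4), ("blue", 2)]))
# -> [('rotate clockwise', 0.5), None]
```

As in the text, unmapped colors simply produce no effect, so the haptic map may cover only a portion of the display signal.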
In some embodiments, the processor 110 may utilize a haptic map to determine the haptic effect and then output the display signal to the display 116. In other embodiments, the processor 110 may determine the haptic effect using a haptic map and then not transmit the display signal to the display 116. In such an embodiment, the display 116 may stay dark or off while the actuator 118 outputs the haptic effect. For example, in such an embodiment, the processor 110 may receive a display signal from a digital camera associated with the messaging device 102. In some embodiments, in order to conserve battery power, the user may have deactivated the display 116. In such an embodiment, the processor may utilize a haptic map to provide the user with a haptic effect simulating a texture on the surface of the display. This texture may be used to alert the user when the camera is in focus or when some other event has occurred. For example, the processor 110 may use facial recognition software to determine haptic effects simulating textures at locations on the display 116 that would be associated with faces if the display 116 were activated.
In some embodiments, the processor 110 may determine the haptic effect based at least in part on a user interaction or trigger. In such an embodiment, the processor 110 receives an interface signal from the touch-sensitive interface 114 and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, the processor 110 may determine the haptic effect based on the location of the user interaction detected by the touch-sensitive interface 114. For example, in such an embodiment, the processor 110 may determine a haptic effect simulating the texture of a virtual object that the user is touching on the display 116. In other embodiments, the processor 110 may determine the intensity of the haptic effect based at least in part on the interface signal. For example, if the touch-sensitive interface 114 detects a high-pressure user interaction, the processor 110 may determine a high-intensity haptic effect. In another embodiment, if the touch-sensitive interface 114 detects a low-pressure user interaction, the processor 110 may determine a low-intensity haptic effect. In still other embodiments, the processor 110 may determine the intensity of the haptic effect based at least in part on the speed of the user interaction. For example, in one embodiment, the processor 110 may determine a low-intensity haptic effect when the touch-sensitive interface 114 detects a low-speed user interaction. In still other embodiments, the processor 110 may determine no haptic effect unless it receives an interface signal associated with a user interaction from the touch-sensitive interface 114.
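The pressure- and speed-dependent determinations described above amount to scaling the haptic intensity with measured interaction parameters. A minimal sketch under stated assumptions: the normalized 0.0-1.0 pressure and speed inputs, the `max` combination rule, and the 1-8 output scale are invented for illustration and are not the claimed method.

```python
def determine_intensity(pressure, speed):
    """Map normalized pressure and speed (0.0-1.0) from an interface
    signal to a haptic intensity on a hypothetical 1-8 scale; return 0
    (no haptic effect) when no interaction is detected."""
    if pressure == 0.0 and speed == 0.0:
        return 0  # no interface signal received -> no haptic effect
    # Higher pressure or higher speed yields a higher-intensity effect.
    return max(1, round(max(pressure, speed) * 8))

print(determine_intensity(0.9, 0.1))  # high-pressure interaction -> 7
print(determine_intensity(0.1, 0.1))  # low-pressure, low-speed -> 1
print(determine_intensity(0.0, 0.0))  # no interaction -> 0
```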
Once the processor 110 determines the haptic effect, it transmits a haptic signal associated with the haptic effect to the actuator 118. The actuator 118 is configured to receive the haptic signal from the processor 110 and generate the haptic effect. The actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA). In some embodiments, the actuator 118 may comprise a plurality of actuators, for example an ERM and an LRA.
In one embodiment of the present invention, the haptic effect generated by the actuator 118 is configured to simulate a texture that the user feels on the surface of the touch-sensitive interface 114 or the display 116. This texture may be associated with the graphical user interface shown on the display 116. For example, the display 116 may show an icon comprising the shape of a rock. In such an embodiment, the processor 110 may determine a haptic effect configured to simulate the texture of the rock on the surface of the touch-sensitive interface 114. Then, the processor 110 transmits a haptic signal associated with the haptic effect to the actuator 118, which outputs the haptic effect. For example, when the actuator 118 receives the haptic signal, it may output a vibration at a frequency configured to cause the surface of the touch-sensitive interface to take on the texture of a rock. In other embodiments, the actuator 118 may be configured to output a vibration at a frequency that causes the surface of the display 116 or the touch-sensitive interface 114 to take on the texture of water, ice, leather, sand, gravel, snow, skin, fur, or some other surface. In some embodiments, the haptic effect may be output onto a different portion of the messaging device 102, for example onto its housing. In some embodiments, the actuator 118 may output a multitude of vibrations configured to render multiple textures at the same time. For example, the actuator 118 may output a vibration configured to cause the surface of the display 116 to take on the texture of sand, and the actuator 118 may also be configured to output additional vibrations configured to cause the user to feel the texture of rocks in the sand.
The processor 110 may determine a haptic effect for a variety of reasons. For example, in some embodiments, the processor 110 may output a haptic effect corresponding to the texture of an object shown on the display 116. In such an embodiment, the display may show a plurality of objects, and as the user moves his/her finger from object to object, the processor may determine a different haptic effect for each, thereby simulating a different texture for each object. In some embodiments, the haptic effect may serve as a confirmation that the processor 110 has received a signal associated with a user interaction. For example, in one embodiment, the graphical user interface may comprise a button, and the touch-sensitive interface 114 may detect a user interaction associated with pressing the button. When the touch-sensitive interface 114 transmits an interface signal associated with the user interaction to the processor 110, the processor 110 may determine a haptic effect to confirm receipt of the interface signal. In such an embodiment, the haptic effect may cause the user to feel a texture on the surface of the touch-sensitive interface 114. For example, the processor may output a haptic effect simulating the texture of sand to confirm that the processor 110 has received the user input. In other embodiments, the processor may determine a different texture, for example the texture of water, ice, oil, rocks, or skin. In some embodiments, the haptic effect may serve a different purpose, for example alerting the user of boundaries on the display 116 or providing the user with haptic information about the image on the display 116. For example, in some embodiments, each icon on the display 116 may comprise a different texture, and as the user moves his/her finger from one icon to another, the processor will determine a haptic effect simulating the texture of each icon. In further embodiments, the processor may change the texture when the user's finger moves from contact with an icon to contact with the background of the display, thus alerting the user that he/she is no longer touching the icon.
As shown in Fig. 1, the processor 110 is also in communication with a speaker 120. The speaker 120 is configured to receive audio signals from the processor 110 and output them to the user. In some embodiments, the audio signals may be associated with the haptic effect output by the actuator 118 or with the image output by the display 116. In other embodiments, the audio signals may not correspond to the haptic effect or the image.
In some embodiments, the processor 110 may further be in communication with one or more sensors, for example a GPS sensor, an imaging sensor, an accelerometer, a location sensor, a rotary velocity sensor, a light sensor, a camera, a microphone, or some other type of sensor. The sensor may be configured to detect changes in acceleration, inclination, inertia, or location. For example, the messaging device 102 may comprise an accelerometer configured to measure the acceleration of the messaging device. The sensor is configured to transmit sensor signals to the processor 110.
The sensor signals may comprise one or more parameters associated with a position, a movement, an acceleration, or a "jerk" (i.e., the derivative of acceleration) of the messaging device 102. For example, in one embodiment, the sensor may generate and transmit a sensor signal comprising a plurality of parameters, each parameter associated with a movement along or about one measured translational or rotational axis. In some embodiments, the sensor outputs voltages or currents that the processor 110 is programmed to interpret to indicate movement along one or more axes.
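Because jerk is defined above as the derivative of acceleration, a processor receiving sampled accelerometer values can estimate it with a finite difference. A minimal sketch assuming evenly spaced samples; the function name and the 0.1 s sample interval are hypothetical:

```python
def estimate_jerk(accel_samples, dt):
    """Estimate jerk (the derivative of acceleration) from evenly
    spaced accelerometer samples using a forward finite difference."""
    return [(a1 - a0) / dt for a0, a1 in zip(accel_samples, accel_samples[1:])]

# Acceleration rising 0.5 m/s^2 per 0.1 s sample -> constant jerk of 5.0 m/s^3.
print(estimate_jerk([0.0, 0.5, 1.0, 1.5], dt=0.1))
# -> [5.0, 5.0, 5.0]
```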
In some embodiments, the processor 110 receives a sensor signal and determines that it should activate a virtual workspace and interpret sensed movement of the messaging device 102 in an X, Y, or Z direction as corresponding to a virtual movement "within" the virtual workspace. The user may then move the device 102 within the virtual workspace to select functions or files by gesturing in the virtual space. For example, the user may select a function within the virtual workspace by moving the messaging device 102 in the Z-axis. In some embodiments, the user may use gestures within the virtual workspace to modify the haptic effects output by the messaging device 102.
Fig. 2 is an illustration of a system for a texture engine according to one embodiment of the present invention. Fig. 2 comprises a messaging device 200, such as a mobile phone, PDA, portable media player, portable gaming device, or mobile computer. The messaging device 200 is configured to send and receive signals, such as voicemail, text messages, and other data messages, over a network such as a cellular network or the Internet. The messaging device 200 may comprise a wireless network interface and/or a wired network interface (not shown in Fig. 2). Although the device 200 is illustrated as a handheld messaging device in Fig. 2, other embodiments may comprise different devices, such as video game systems and/or personal computers.
As shown in Fig. 2, the messaging device 200 comprises a housing 202 and a display 216. In some embodiments, the display 216 may comprise an LCD display. In other embodiments, the display 216 may comprise a plasma display or other type of display known in the art. The display 216 is configured to receive a display signal and output an image associated with that display signal. In some embodiments, the display signal may comprise a VGA, HDMI, SVGA, video, S-Video, or other type of display signal known in the art. In the embodiment shown in Fig. 2, the display 216 comprises a textured ball 204. The display 216 further comprises texture selection icons 206, which comprise the textures of rocks, sand, and water.
Referring still to Fig. 2, the messaging device 200 further comprises a manipulandum 214. In the embodiment shown in Fig. 2, the manipulandum 214 comprises a roller ball and buttons. The messaging device 200 also comprises a touch-sensitive interface 218. In the embodiment shown in Fig. 2, the touch-sensitive interface 218 comprises a touch-screen positioned on top of the display 216. In some embodiments, the display 216 and the touch-screen may comprise a single integrated component, such as a touch-screen display.
The manipulandum 214 and the touch-sensitive interface 218 are configured to detect user interaction and transmit interface signals corresponding to the user interaction to the processor. In some embodiments, the user interaction is associated with a graphical user interface shown on the display 216. In such an embodiment, the processor receives the interface signal and, based at least in part on the interface signal, manipulates the graphical user interface. For example, in the embodiment shown in Fig. 2, the user may use either the manipulandum 214 or the touch-sensitive interface 218 to select one of the texture selection icons 206. Once the user has selected a texture for the textured ball 204, its appearance on the screen may change to correspond to that texture. For example, if the user selects the sand texture icon, the processor may manipulate the display to give the textured ball 204 the appearance of a sandy surface, and may further determine a haptic effect that causes the user to feel a sandy texture when interacting with the textured ball 204. Or, in another embodiment, if the user selects the rock texture icon, the processor may determine a haptic effect that causes the user to feel the texture of rocks when the user interacts with the textured ball 204.
The messaging device 200 further comprises an actuator configured to receive a haptic signal and output a haptic effect (not shown in Fig. 2). In some embodiments, the haptic effect comprises a vibrotactile texture felt by the user of the messaging device 200. The processor 110 is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to the actuator. In some embodiments, determining a haptic effect may comprise performing a series of calculations to determine the haptic effect. In other embodiments, determining the haptic effect may comprise accessing a lookup table to determine the appropriate haptic effect. In still other embodiments, determining the haptic effect may comprise using a combination of a lookup table and an algorithm. Once the processor 110 determines the haptic effect, it transmits a haptic signal associated with the haptic effect to the actuator. The actuator receives the haptic signal from the processor 110 and generates the haptic effect. The user may feel the haptic effect via the surface of the display 216 or through some other part of the messaging device 200, for example via the manipulandum 214 or the housing 202. In some embodiments, as the user moves his/her finger over the surface of the textured ball 204, the processor may modify the haptic effect to simulate changes in texture.
Illustrations of Systems for a Texture Engine
Fig. 3a is an illustration of a system for a texture engine according to one embodiment of the present invention. Fig. 3a comprises a messaging device 300, such as a mobile phone, PDA, portable media player, portable gaming device, or mobile computer. The messaging device 300 is configured to send and receive signals comprising messages, such as voicemail, text messages, and other data messages, over a network such as a cellular network or the Internet. The messaging device 300 may comprise a wireless network interface and/or a wired network interface (not shown in Fig. 3a). Although the device 300 is illustrated as a handheld messaging device in Fig. 3a, other embodiments may comprise different devices, such as video game systems and/or personal computers.
As shown in Fig. 3a, the messaging device 300 comprises a display 316. The display 316 is configured to receive a display signal and output an image based at least in part on that display signal. The messaging device 300 further comprises a processor (not shown in Fig. 3a) configured to transmit the display signal to the display 316. The messaging device 300 further comprises a touch-sensitive interface 314 mounted on top of the display 316. The touch-sensitive interface 314 is configured to detect a user interaction and transmit an interface signal corresponding to the user interaction to the processor. The display 316 comprises two icons 302 and 304. When the user interacts with one of the icons 302 and 304, the touch-sensitive interface 314 detects the user interaction and transmits a corresponding interface signal to the processor. Based on this interface signal, the processor may determine that the user has opened a file linked to one of the icons or performed some other action known in the art.
As shown in Fig. 3a, each of the icons 302 and 304 comprises a texture. In the embodiment shown, the icon 302 comprises the texture of bricks and the icon 304 comprises the texture of rocks. In other embodiments, different textures may be used, for example the texture of sand, water, oil, grass, fur, skin, leather, ice, wood, or some other texture known in the art. When the user interacts with the section of the display 316 associated with an icon, as shown by finger 306 in Fig. 3a, the processor will determine a haptic effect configured to simulate the texture of that icon. The processor then outputs a signal associated with the haptic effect to an actuator (not shown in Fig. 3a) configured to output the haptic effect. For example, in the embodiment shown in Fig. 3a, when the user interacts with the section of the display 316 associated with the icon 302, the processor determines a haptic effect associated with the texture of bricks. This haptic effect may be characterized by a random signal punctuated by high-powered pulses as the user's finger 306 crosses the mortar. In other embodiments, other haptic effects may be used to simulate different textures corresponding to the image shown on the display 316.
Fig. 3b is an illustration of a system for a texture engine according to one embodiment of the present invention. In the embodiment shown in Fig. 3b, determining the haptic effect comprises mapping the display signal to the actuator. The embodiment shown in Fig. 3b comprises a magnified section of a display 350. The display 350 is configured to receive a display signal from the processor. The display signal comprises a plurality of pixels, each of the pixels associated with a color and an intensity of that color. The display 350 receives this display signal and outputs an image associated with the display signal. In the embodiment shown in Fig. 3b, the magnified section of the display 350 comprises six pixels: 351, 352, 353, 354, 355, and 356. Each pixel is associated with a color and an intensity for that color on a scale of 1-10. For example, pixel 355 is associated with the color green and a color intensity of 3 out of 10. Thus, the display 350 will output the color green at an intensity of 3 at the location of pixel 355.
In the embodiment shown in Fig. 3b, the processor will determine the haptic effect based at least in part on the display signal and on an interface signal received from a touch-sensitive interface mounted on top of the display 350 (not shown in Fig. 3b). For example, in the embodiment shown in Fig. 3b, the processor uses the display signal to associate, or "map," a haptic effect with each pixel. For example, in the embodiment shown in Fig. 3b, the processor may determine a different-frequency haptic effect for each color. The processor may further associate the intensity of the haptic effect at each pixel with the intensity of the color at that pixel. For example, the processor may determine that a pixel with a color intensity of 8 will also have a haptic intensity of 8. When the processor receives an interface signal associated with a user interaction over a pixel on the display, the processor will output a haptic signal associated with the pixel the user is interacting with. This haptic effect is configured to cause the user to feel a texture on the surface of the display.
For example, in the embodiment shown in Fig. 3b, the processor may determine that blue pixels are associated with a knocking haptic effect, red pixels with a pulsing vibration, and green pixels with a clicking haptic effect. In such an embodiment, when the touch-sensitive interface detects that the user's finger has passed over pixel 351, the processor will determine a knock with an intensity of 1. Then, as the user's finger moves over pixel 352, the processor will determine a pulsing vibration with an intensity of 5. And as the user's finger continues to move across the display 350 to pixel 353, the processor may determine a clicking effect with an intensity of 3.
These haptic effects are configured to cause the user to feel a texture on the surface of the display 350 as the user moves his/her finger over its surface. In some embodiments, the processor may be in communication with more than one actuator, and each color may be associated with its own actuator. In other embodiments, different combinations of colors, intensities, and haptic effects may be used to cause the user to feel a texture on the surface of the display.
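The walkthrough above, a knock of intensity 1 at pixel 351, a pulsing vibration of intensity 5 at pixel 352, and a click of intensity 3 at pixel 353, can be sketched as a lookup driven by the finger's path. The colors, intensities, and color-to-effect assignment below follow the example in the text; the data-structure and function names are hypothetical.

```python
# Hypothetical slice of the magnified section of display 350:
# pixel id -> (color, color intensity), as in the example of Fig. 3b.
PIXELS = {
    351: ("blue", 1),
    352: ("red", 5),
    353: ("green", 3),
}

# Color -> haptic effect mapping from the example in the text.
COLOR_TO_EFFECT = {"blue": "knock", "red": "pulsing vibration", "green": "click"}

def effects_along_path(path):
    """Return the (effect, intensity) sequence produced as the finger
    crosses the given pixels; the haptic intensity equals the color
    intensity, as in the mapping described above."""
    out = []
    for pixel in path:
        color, intensity = PIXELS[pixel]
        out.append((COLOR_TO_EFFECT[color], intensity))
    return out

print(effects_along_path([351, 352, 353]))
# -> [('knock', 1), ('pulsing vibration', 5), ('click', 3)]
```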
Fig. 4 is a flow chart of a method for a texture engine according to one embodiment of the present invention, which is discussed with respect to the device shown in Fig. 1. As shown in Fig. 4, the method 400 begins when the processor 110 receives a display signal comprising a plurality of pixels 402. The display signal may comprise a VGA, HDMI, SVGA, video, S-Video, or other type of display signal known in the art. The display signal may comprise a graphical user interface or other image that the messaging device will show to the user via the display 116.
Then, the touch-sensitive interface 114 transmits an interface signal to the processor 110, which receives the interface signal 404. In some embodiments, the touch-sensitive interface 114 may comprise a touch-screen or a touch-pad. For example, in some embodiments, the touch-sensitive interface 114 may comprise a touch-screen mounted on top of a display configured to receive a display signal and output an image to the user. In other embodiments, the touch-sensitive interface may comprise a button, switch, scroll wheel, roller ball, or some other type of physical device interface known in the art. In some embodiments, the processor 110 is in communication with a single touch-sensitive interface 114. In other embodiments, the processor 110 is in communication with a plurality of touch-sensitive interfaces 114, for example a touch-screen and a roller ball. The touch-sensitive interface 114 is configured to detect user interaction and, based at least in part on the user interaction, transmit signals to the processor. In some embodiments, the touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, the touch-sensitive interface 114 may detect the speed and pressure of a user interaction and incorporate this information into the interface signal.
Next, the processor 110 determines a haptic effect comprising a texture 406. The haptic effect may comprise a vibration that the user can feel through the surface of a touch-sensitive interface or a manipulandum. In some embodiments, this vibration may cause the user to feel a texture on the surface of the touch-sensitive interface, for example the texture of leather, snow, sand, ice, skin, or some other surface. In some embodiments, determining the haptic effect may comprise a series of calculations to determine the haptic effect. In other embodiments, determining the haptic effect may comprise accessing a lookup table to determine the appropriate haptic effect. In still other embodiments, determining the haptic effect may comprise a combination of a lookup table and an algorithm.
In some embodiments, determining the haptic effect may comprise a haptic map. In such an embodiment, determining the haptic effect may comprise mapping the display signal to the actuators. For example, the display signal may comprise a plurality of pixels, each of the pixels associated with a color. In such an embodiment, determining the haptic effect may comprise assigning a haptic effect to each color. The processor 110 then outputs a haptic signal comprising the haptic effect. In some embodiments, the processor 110 may assign a haptic effect to only some of the pixels in the display signal. For example, in such an embodiment, the haptic effect may be associated with only a portion of the display signal.
In some embodiments, the processor 110 may determine the haptic effect based at least in part on a user interaction or trigger. In such an embodiment, the processor 110 receives an interface signal from the touch-sensitive interface 114 and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, the processor 110 may determine haptic effects of different intensities based on the interface signal received from the touch-sensitive interface 114. For example, if the touch-sensitive interface 114 detects a high-pressure user interaction, the processor 110 may determine a high-intensity haptic effect. In another embodiment, if the touch-sensitive interface 114 detects a low-pressure user interaction, the processor 110 may determine a low-intensity haptic effect. In still other embodiments, the processor 110 may determine a low-intensity haptic effect when the touch-sensitive interface 114 detects a low-speed user interaction, and a high-intensity haptic effect when the touch-sensitive interface 114 detects a high-speed user interaction. In still other embodiments, the processor 110 may determine no haptic effect unless it receives an interface signal comprising a user interaction from the touch-sensitive interface 114.
Finally, the processor 110 transmits a haptic signal associated with the haptic effect to the actuator 118, which is configured to receive the haptic signal and output the haptic effect 408. The actuator 118 is configured to receive the haptic signal from the processor 110 and generate the haptic effect. The actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA).
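Taken together, the steps of method 400, receiving a display signal (402), receiving an interface signal (404), determining a haptic effect comprising a texture (406), and transmitting a haptic signal to the actuator (408), can be sketched as a single pass through a pipeline. This is an illustrative sketch only; the region lookup table, the signal dictionaries, and the pressure-based intensity rule are hypothetical, with the icon names borrowed from Fig. 3a.

```python
# Hypothetical lookup table: region of the display -> texture name.
# Step 406 may use a lookup table, an algorithm, or a combination of both.
TEXTURE_TABLE = {"icon_302": "bricks", "icon_304": "rocks", "background": None}

def texture_engine_step(display_signal, interface_signal):
    """One pass through method 400: return the haptic signal to send to
    the actuator (step 408), or None when no effect is determined."""
    # Steps 402/404: locate the interaction within the display signal.
    region = display_signal["region_at"](interface_signal["x"], interface_signal["y"])
    texture = TEXTURE_TABLE.get(region)  # step 406: lookup table
    if texture is None:
        return None  # no haptic effect for this region
    # Step 406 continued: scale intensity with interaction pressure.
    return {"texture": texture, "intensity": interface_signal["pressure"]}

display_signal = {"region_at": lambda x, y: "icon_302" if x < 50 else "background"}
print(texture_engine_step(display_signal, {"x": 10, "y": 20, "pressure": 0.7}))
# -> {'texture': 'bricks', 'intensity': 0.7}
```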
Fig. 5a is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. The embodiment shown in Fig. 5a comprises bricks. The texture of bricks is characterized by the rough, irregular feel of bricks, punctuated by the feel of gritty valleys from the mortar. A system for a texture engine may generate the rough, irregular texture of bricks by driving an actuator, such as an LRA, LPA, or FPA, with a random signal of medium to high maximum variance while the user's finger is moving. In some embodiments, this variance may be adjusted for different degrees of roughness. In some embodiments, the transition from brick to mortar may be rendered by a high-duration pop created by an ERM. Additionally, if the mortar is sufficiently thick, a fine texture may be rendered by driving the actuator with a lower-magnitude signal having a higher variance than the signal used to output the texture of the bricks.
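The brick texture above is rendered by a random drive signal whose variance sets the perceived roughness, output only while the finger moves. A minimal sketch of such a generator under invented assumptions (zero-mean Gaussian samples at a normalized amplitude, hypothetical variance values):

```python
import random

def texture_drive_signal(finger_moving, n_samples, variance, seed=None):
    """Generate a zero-mean random drive signal whose variance controls
    the perceived roughness; emit nothing while the finger is still."""
    if not finger_moving:
        return []
    rng = random.Random(seed)
    sigma = variance ** 0.5  # standard deviation from the desired variance
    return [rng.gauss(0.0, sigma) for _ in range(n_samples)]

rough_brick = texture_drive_signal(True, 2000, variance=0.8, seed=1)   # rougher
smooth_brick = texture_drive_signal(True, 2000, variance=0.3, seed=2)  # smoother
assert texture_drive_signal(False, 2000, variance=0.8) == []           # finger still
```

Adjusting `variance` changes the roughness, as described above; the mortar-transition pop and the finer mortar texture would be layered on top as separate signals.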
Fig. 5b is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. The embodiment shown in Fig. 5b comprises rocks. The texture of rocks is characterized by smooth surfaces punctuated by transitions as the user moves from rock to rock. To output the texture of rocks, an actuator such as an FPA is used to create patches of low friction. Individual rocks may be rendered by a non-visual edge map of the displayed image, outputting a high-magnitude haptic signal to an actuator, such as an LPA or ERM, when the touch-sensitive interface detects the user's movement. For example, a haptic effect is output whenever the touch-sensitive interface detects that the user's finger is transitioning from one rock to another.
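The rock rendering above combines low-friction patches with a pulse whenever the finger crosses an edge in the non-visual edge map. A minimal sketch under invented assumptions (a boolean edge grid standing in for the edge map, and hypothetical function names):

```python
# Hypothetical non-visual edge map: True marks a boundary between rocks.
EDGE_MAP = [
    [False, True, False, False],
    [False, True, False, True],
]

def rock_transition_effect(prev_pos, new_pos):
    """Output a high-magnitude pulse when the moving finger lands on an
    edge between rocks; otherwise the smooth, low-friction rock surface
    produces no pulse."""
    row, col = new_pos
    if EDGE_MAP[row][col] and new_pos != prev_pos:
        return "high-magnitude pulse"
    return None  # still on the same smooth rock face

print(rock_transition_effect((0, 0), (0, 1)))  # crossing an edge -> pulse
print(rock_transition_effect((0, 2), (0, 3)))  # same rock -> None
```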
Fig. 5c is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. The embodiment shown in Fig. 5c comprises sand or sandpaper. Sand is characterized by a rough, gritty feel, as well as by the sensation of a pile of sand particles building up in front of the user's finger. To output the rough, gritty texture of sand, an actuator such as an LRA, LPA, or FPA is driven with a random signal of high maximum variance while the user's finger is moving. In some embodiments, the processor may adjust the variance of the signal to create different degrees of roughness. To create the feeling of sand piling up, an actuator such as an FPA may be used. In such an embodiment, as the user moves his/her finger across the touch-screen, the processor drives the actuator with a signal that starts at low intensity and builds as the user moves his/her finger in one direction.
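The piling-up sensation above can be modeled as a drive intensity that ramps with distance traveled in one direction. The sketch below additionally resets the ramp when the finger reverses, which is an assumption beyond the text; the ramp rate and normalized intensity scale are likewise invented.

```python
def sand_pile_intensity(positions, ramp_rate=0.05, max_intensity=1.0):
    """Return a drive intensity for each finger position: intensity
    builds with distance traveled in one direction (sand piling up in
    front of the finger) and restarts when the finger reverses."""
    intensities = []
    intensity, prev, direction = 0.0, None, 0
    for x in positions:
        if prev is not None:
            step = x - prev
            new_dir = (step > 0) - (step < 0)
            if new_dir != 0 and new_dir != direction:
                intensity, direction = 0.0, new_dir  # reversal: pile restarts
            intensity = min(max_intensity, intensity + abs(step) * ramp_rate)
        intensities.append(intensity)
        prev = x
    return intensities

print(sand_pile_intensity([0, 10, 20, 15]))
# -> [0.0, 0.5, 1.0, 0.25] (builds while moving one way, restarts on reversal)
```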
In another embodiment, the texture shown in Fig. 5c may comprise sandpaper. Sandpaper is characterized by a rough, gritty feel. To create the rough, gritty feel, the processor drives an actuator, such as an LRA, LPA, or FPA, with a random signal of high maximum variance. In some embodiments, this signal is output only while the user's finger is moving across the surface of the touch-sensitive interface. In some embodiments, the processor may adjust the variance of the signal to change the level of roughness.
Fig. 5d is an illustration of one of the textures that a texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Fig. 5d, the texture comprises the texture of grass. Grass is characterized by a periodic light sensation that barely tickles the user's finger. To create the sensation of grass, the processor may drive an actuator such as an FPA with a signal configured to create patches of low friction overlaid with patches of grass. In some embodiments, the processor may render individual grass blades by outputting a low-magnitude signal to an actuator, such as an LPA or ERM, based on a non-visual edge map of the displayed image, when the user interface detects a user interaction.
Fig. 5e is an illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Fig. 5e, the texture comprises the texture of fabric. Fabric is characterized by a light, smooth sensation. To create the sensation of the texture of fabric, the processor may drive an actuator such as an LPA or LRA with a low-magnitude, high-frequency signal as the user's finger moves across the surface of the touch-sensitive interface.
Fig. 5f is an illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Fig. 5f, the texture comprises the texture of water or molasses. Water is characterized by almost no sensation; however, water that is disturbed may splash around and strike the user's finger. To imitate the texture of water, the processor may drive an actuator such as an FPA to reduce the friction on the surface of the touch-sensitive interface. To imitate the splashing of water, the processor may output the haptic signal only while the user is touching the screen. To imitate the texture of a more viscous fluid, such as molasses or oil, the processor may drive the actuator with a signal configured to increase the friction on the user's finger as it moves across the surface of the touch-sensitive interface.
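The water-versus-molasses behavior described above can be sketched as a single friction-command function. This is an illustrative sketch, not the patented implementation; the `friction_command` helper, its `None`-when-untouched convention, and the linear viscosity mapping are assumptions made for the example.

```python
def friction_command(touching, viscosity):
    """Normalized friction command for a variable-friction actuator such
    as an FPA: None when the screen is not touched (no haptic output), a
    negative value (reduced friction) for water-like low viscosity, and a
    positive value (increased friction) for molasses- or oil-like fluids."""
    if not touching:
        return None  # output the haptic signal only while the user touches the screen
    # Map a normalized viscosity in [0, 1] onto a friction command in [-1, 1].
    return 2.0 * viscosity - 1.0
```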
Fig. 5g is an illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Fig. 5g, the texture comprises the texture of leather. Leather is characterized by an overall smooth feel that comprises the bumps and valleys of the surface of the leather. To create the sensation of the texture of leather, the processor may drive an actuator such as an FPA with a signal configured to output a haptic effect that reduces friction as the user's finger moves across the surface of the touch-sensitive interface. The processor may output the cracks and bumps by driving the actuator with very short, low-magnitude haptic signals when the touch-sensitive interface detects that the user's finger is moving.
Fig. 5h is an illustration of one of the textures that the texture engine may generate according to one embodiment of the present invention. In the embodiment shown in Fig. 5h, the texture comprises the texture of wood. Wood may be characterized by an irregular bumpy texture punctuated by sharp transitions as the user moves from board to board. To create the irregular bumpy texture, the processor may drive an actuator such as an LRA, LPA, or FPA using a non-visual edge map of the displayed image, repeatedly driving the actuator with very short, low-magnitude signals as the user's finger moves. To output the transition from board to board, the processor may output a haptic signal configured to cause the actuator to generate a high-magnitude, short-duration pop.
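The board-to-board behavior described above can be sketched as follows. This is an illustrative sketch only; the `wood_signal` helper, the fixed `board_width`, and the pulse magnitudes are assumptions chosen for the example, not values from the specification.

```python
def wood_signal(positions, board_width=40, pulse=0.15, pop=1.0):
    """Drive values for a wood texture: very short low-magnitude pulses
    for the irregular grain within a board, and a brief high-magnitude
    'pop' whenever the finger crosses from one board to the next."""
    out = []
    prev_board = None
    for x in positions:
        board = x // board_width  # which plank the finger is over
        if prev_board is not None and board != prev_board:
            out.append(pop)    # sharp transition between boards
        else:
            out.append(pulse)  # bumpy texture within a board
        prev_board = board
    return out
```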
In other embodiments, haptic effects associated with different textures may be output. For example, in one embodiment the processor may transmit a haptic signal configured to cause the actuator to output a haptic effect that causes the user to feel a texture associated with the texture of ice. Ice is characterized by low friction; in some embodiments, ice has a completely smooth texture, while in other embodiments ice comprises a fine, low-magnitude gritty texture. To create the texture of ice, the processor may determine a haptic signal configured to cause the actuator to reduce the friction as much as possible while the user moves a finger across the surface of the touch-sensitive interface. In another embodiment, the processor may drive an actuator such as an LPA or LRA with a haptic signal configured to output low-magnitude effects as the user moves a finger. These low-magnitude effects may be associated with imperfections or grit on the surface of the ice.
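The ice behavior described above combines maximum friction reduction with occasional fine pulses. A minimal sketch, assuming a hypothetical `ice_output` helper and a one-dimensional list of grit positions:

```python
def ice_output(finger_x, grit_positions, tolerance=1, grit_pulse=0.1):
    """Output for an ice texture: friction is reduced as much as possible
    everywhere (command -1.0), with an optional fine low-magnitude pulse
    where a surface imperfection (grit) is located."""
    friction = -1.0  # maximum friction reduction for a smooth, icy feel
    hit = any(abs(finger_x - g) <= tolerance for g in grit_positions)
    return friction, (grit_pulse if hit else 0.0)
```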
In another embodiment, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of lizard skin. Lizard skin is characterized by an overall smooth sensation punctuated by transitions from bump to bump on the skin. To implement a haptic effect comprising the texture of lizard skin, the processor may drive the actuator with a haptic signal configured to cause the actuator to create patches of low friction on the touch-sensitive interface. The processor may render cracks on the surface of the skin by periodically outputting high-magnitude haptic signals when the touch-sensitive interface detects that the user's finger is moving across its surface. These high-magnitude signals may approximate the cracks in the surface of the skin.
In another embodiment, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of fur. Fur is characterized by a periodic light sensation that is very soft to the touch. To implement a haptic effect comprising the texture of fur, the processor may drive the actuator with a haptic signal configured to cause the actuator to output a haptic effect that reduces the friction the user feels on the surface of the touch-sensitive interface. The processor may further render individual hairs by outputting low-magnitude pulsing haptic signals when the touch-sensitive interface detects the user's movement.
In another embodiment, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating the texture of metal. Metal is characterized by a smooth, low-friction surface that, in some embodiments, includes light grit. To implement a haptic effect comprising the texture of metal, the processor may drive the actuator with a signal configured to reduce the friction the user feels on the surface of the touch-sensitive interface. In some embodiments, the processor may render individual bumps by outputting brief high-magnitude haptic signals when the touch-sensitive interface detects that the user is moving over its surface. These brief high-magnitude signals may approximate grit on the surface of the metal.
In still other embodiments, the processor may drive the actuator with a signal configured to cause the actuator to output a haptic effect approximating another sensation, for example, heat. In such an embodiment, when the user touches an element of the display that is associated with heat, the processor may output a haptic signal configured to cause the actuator to output a high-frequency vibrating effect.
Advantages of Systems and Methods for a Texture Engine
Systems and methods for a texture engine provide several advantages. For example, a texture engine adds previously unused haptic effects to a mobile device. These new effects provide a new avenue for the user to receive information from the mobile device without having to look at its display. For example, a texture engine may allow the user to assign different textures to different icons, buttons, or other components of a display. Thus, the user may be able to determine which icon they are touching without having to look at that icon. This can increase the usability of the device and may make the device more useful to the visually impaired.
Further, because systems and methods for a texture engine provide the user with more information without distracting the user from other tasks, they may reduce user errors. For example, users will be less likely to hit the wrong icon or press the wrong button if they are utilizing a texture engine. This functionality may serve both to increase user satisfaction and to increase the adoption rate for technology that incorporates systems and methods for a texture engine.
General Considerations
The use of "adapted to" or "configured to" herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of "based on" is meant to be open and inclusive, in that a process, step, calculation, or other action "based on" one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
Embodiments in accordance with aspects of the present subject matter can be implemented in digital electronic circuitry, in computer hardware, firmware, or software, or in combinations of the preceding. In one embodiment, a computer may comprise a processor or processors. The processor comprises, or has access to, a computer-readable medium such as a random access memory (RAM) coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs including a sensor sampling routine, a haptic effect selection routine, and suitable programming to produce signals to generate the selected haptic effects as noted above.
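The three routines named above (sensor sampling, haptic effect selection, signal generation) form a simple pipeline. The following sketch of one pass through that pipeline is purely illustrative; the function names and the callback-style structure are assumptions for the example, not the patented implementation.

```python
def haptic_step(sample_sensor, select_effect, generate_signal, transmit):
    """One pass through the routines described above: sample the touch
    sensor, select a haptic effect from the reading, generate the
    corresponding drive signal, and transmit it to the actuator."""
    reading = sample_sensor()          # sensor sampling routine
    effect = select_effect(reading)    # haptic effect selection routine
    signal = generate_signal(effect)   # signal generation
    transmit(signal)                   # hand the signal to the actuator
    return signal
```

In a real device this step would run repeatedly inside the processor's event loop, with the actual sensor driver and actuator interface supplied as the callbacks.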
Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
Such processors may comprise, or may be in communication with, media (for example, tangible computer-readable media) that can store instructions that, when executed by the processor, cause the processor to perform the steps described herein as carried out or assisted by a processor. Embodiments of computer-readable media may comprise, but are not limited to, all electronic, optical, magnetic, or other storage devices capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. Also, various other devices may include computer-readable media, such as a router, private or public network, or other transmission device. The processor, and the processing described, may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude the inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.

Claims (24)

1. A system comprising:
a processor, said processor configured to:
receive a display signal comprising a plurality of pixels;
determine a haptic effect comprising a texture; and
transmit a haptic signal associated with said haptic effect; and
an actuator in communication with said processor, said actuator configured to receive said haptic signal and output said haptic effect.
2. The system of claim 1, wherein said texture is a vibrotactile effect.
3. The system of claim 1, wherein said texture comprises the texture of: sand, lizard skin, or brick.
4. The system of claim 1, wherein said actuator comprises one of: an eccentric rotating mass motor, a linear resonant actuator, a shape memory alloy, an electroactive polymer, or a piezoelectric actuator.
5. The system of claim 1, wherein said haptic effect is determined based at least in part on said display signal.
6. The system of claim 5, wherein each of said plurality of pixels is associated with a color, and wherein determining said haptic effect comprises assigning a haptic value to each color.
7. The system of claim 6, wherein determining said haptic effect comprises assigning haptic values to only some of said plurality of pixels.
8. The system of claim 6, wherein each color comprises an intensity, and determining said haptic effect further comprises adjusting said haptic value to correspond to said intensity.
9. The system of claim 1, further comprising a display in communication with said processor, said display configured to receive said display signal and output an image.
10. The system of claim 9, wherein said texture is output onto the surface of said display.
11. The system of claim 9, wherein said actuator is coupled to said display.
12. The system of claim 1, further comprising a housing, said housing configured to enclose said actuator and said processor.
13. The system of claim 12, wherein said housing comprises a mobile device housing.
14. The system of claim 12, wherein said actuator is coupled to said housing.
15. The system of claim 1, further comprising a touch-sensitive interface configured to detect a user interaction and transmit a sensor signal to said processor based at least in part on said user interaction.
16. The system of claim 15, wherein said processor is further configured to determine said haptic effect based at least in part on said sensor signal.
17. The system of claim 16, wherein said touch-sensitive interface is configured to detect the speed of said user interaction, and wherein determining said haptic effect comprises adjusting said haptic effect to correspond to the speed of said user interaction.
18. The system of claim 16, wherein said touch-sensitive interface is configured to detect the pressure of said user interaction, and wherein determining said haptic effect comprises adjusting the intensity of said haptic effect to correspond to the pressure of said user interaction.
19. A method for outputting a haptic effect comprising:
receiving a display signal comprising a plurality of pixels;
determining a haptic effect comprising a texture; and
transmitting a haptic signal associated with said haptic effect to an actuator, said actuator configured to receive said haptic signal and output said haptic effect.
20. The method of claim 19, wherein said haptic effect is determined based at least in part on said display signal.
21. The method of claim 20, wherein each of said plurality of pixels is associated with a color, and wherein determining said haptic effect comprises assigning a haptic value to each color.
22. The method of claim 20, wherein each color comprises an intensity, and determining said haptic effect further comprises associating said haptic value with said intensity.
23. The method of claim 20, further comprising receiving an interface signal from a touch-sensitive interface, and wherein said haptic effect is determined based at least in part on said interface signal.
24. A system comprising:
a touch-sensitive interface configured to detect a user interaction and transmit a signal corresponding to said user interaction, said touch-sensitive interface configured to detect the speed and pressure of said user interaction;
a processor in communication with said touch-sensitive interface, said processor configured to:
receive a display signal comprising a plurality of pixels, wherein each pixel comprises a color and an intensity;
determine a haptic effect based at least in part on the color and intensity of each pixel and the speed and pressure of said user interaction; and
transmit a haptic signal associated with said haptic effect; and
an actuator in communication with said processor, said actuator configured to receive said haptic signal and output said haptic effect.
CN201080011743.9A 2009-03-12 2010-03-11 Systems and methods for a texture engine Active CN102349038B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610662488.3A CN106339169B (en) 2009-03-12 2010-03-11 Systems and methods for a texture engine

Applications Claiming Priority (19)

Application Number Priority Date Filing Date Title
US15948209P 2009-03-12 2009-03-12
US61/159,482 2009-03-12
US26204109P 2009-11-17 2009-11-17
US26203809P 2009-11-17 2009-11-17
US61/262,041 2009-11-17
US61/262,038 2009-11-17
US12/697,042 US10564721B2 (en) 2009-03-12 2010-01-29 Systems and methods for using multiple actuators to realize textures
US12/696,908 2010-01-29
US12/697,010 US9874935B2 (en) 2009-03-12 2010-01-29 Systems and methods for a texture engine
US12/697,037 2010-01-29
US12/697,042 2010-01-29
US12/696,908 US10007340B2 (en) 2009-03-12 2010-01-29 Systems and methods for interfaces featuring surface-based haptic effects
US12/696,893 US9746923B2 (en) 2009-03-12 2010-01-29 Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction
US12/696,900 US9696803B2 (en) 2009-03-12 2010-01-29 Systems and methods for friction displays and additional haptic effects
US12/697,037 US9927873B2 (en) 2009-03-12 2010-01-29 Systems and methods for using textures in graphical user interface widgets
US12/697,010 2010-01-29
US12/696,893 2010-01-29
US12/696,900 2010-01-29
PCT/US2010/026909 WO2010105012A1 (en) 2009-03-12 2010-03-11 Systems and methods for a texture engine

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201610662488.3A Division CN106339169B (en) 2009-03-12 2010-03-11 Systems and methods for a texture engine

Publications (2)

Publication Number Publication Date
CN102349038A true CN102349038A (en) 2012-02-08
CN102349038B CN102349038B (en) 2016-08-24

Family

ID=73451201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201080011743.9A Active CN102349038B (en) 2009-03-12 2010-03-11 System and method for grain engine

Country Status (5)

Country Link
EP (1) EP2406704A1 (en)
JP (1) JP5779508B2 (en)
KR (2) KR102051180B1 (en)
CN (1) CN102349038B (en)
WO (1) WO2010105012A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050057528A1 (en) * 2003-09-01 2005-03-17 Martin Kleen Screen having a touch-sensitive user interface for command input
US20060119586A1 (en) * 2004-10-08 2006-06-08 Immersion Corporation, A Delaware Corporation Haptic feedback for button and scrolling action simulation in touch input devices
US20060209037A1 (en) * 2004-03-15 2006-09-21 David Wang Method and system for providing haptic effects
EP1748350A2 (en) * 2005-07-28 2007-01-31 Avago Technologies General IP (Singapore) Pte. Ltd Touch device and method for providing tactile feedback
US20080068348A1 (en) * 1998-06-23 2008-03-20 Immersion Corporation Haptic feedback for touchpads and other touch controls

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001290572A (en) * 2000-04-05 2001-10-19 Fuji Xerox Co Ltd Information processor
JP2003099177A (en) * 2001-09-21 2003-04-04 Fuji Xerox Co Ltd Method for preparing haptic information and method for presenting haptic information and its device
US6703924B2 (en) * 2001-12-20 2004-03-09 Hewlett-Packard Development Company, L.P. Tactile display apparatus
JP3852368B2 (en) * 2002-05-16 2006-11-29 ソニー株式会社 Input method and data processing apparatus
JP2004310518A (en) * 2003-04-08 2004-11-04 Fuji Xerox Co Ltd Picture information processor
US20060024647A1 (en) * 2004-07-30 2006-02-02 France Telecom Method and apparatus for communicating graphical information to a visually impaired person using haptic feedback
KR100791379B1 (en) * 2006-01-02 2008-01-07 삼성전자주식회사 System and method for user interface
US20080122589A1 (en) * 2006-11-28 2008-05-29 Ivanov Yuri A Tactile Output Device
US9170649B2 (en) * 2007-12-28 2015-10-27 Nokia Technologies Oy Audio and tactile feedback based on visual environment
WO2009097866A1 (en) * 2008-02-04 2009-08-13 Nokia Corporation Device and method for providing tactile information

US10775999B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10168826B2 (en) 2012-05-09 2019-01-01 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10592041B2 (en) 2012-05-09 2020-03-17 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10481690B2 (en) 2012-05-09 2019-11-19 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10884591B2 (en) 2012-05-09 2021-01-05 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects
CN102662477A (en) * 2012-05-10 2012-09-12 孙晓颖 Touch representation device based on electrostatic force
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity threshold
US10101887B2 (en) 2012-12-29 2018-10-16 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10185491B2 (en) 2012-12-29 2019-01-22 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or enlarge content
US10175879B2 (en) 2012-12-29 2019-01-08 Apple Inc. Device, method, and graphical user interface for zooming a user interface while performing a drag operation
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US10915243B2 (en) 2012-12-29 2021-02-09 Apple Inc. Device, method, and graphical user interface for adjusting content selection
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
CN109582150A (en) * 2013-04-26 2019-04-05 意美森公司 Utilize the simulation Tangible User Interfaces interaction of haptic unit array and gesture
CN105556423A (en) * 2013-06-11 2016-05-04 意美森公司 Systems and methods for pressure-based haptic effects
US9939904B2 (en) 2013-06-11 2018-04-10 Immersion Corporation Systems and methods for pressure-based haptic effects
CN105556423B (en) * 2013-06-11 2019-01-15 意美森公司 System and method for the haptic effect based on pressure
US10488931B2 (en) 2013-06-11 2019-11-26 Immersion Corporation Systems and methods for pressure-based haptic effects
CN104750309A (en) * 2013-12-31 2015-07-01 意美森公司 Friction augmented controls and method to convert buttons of touch control panels to friction augmented controls
CN104750411B (en) * 2013-12-31 2020-01-21 意美森公司 System and method for providing haptic notifications
CN104750309B (en) * 2013-12-31 2019-12-03 意美森公司 The button of touch panel is converted into rubbing the method and system of enhanced control
CN104750411A (en) * 2013-12-31 2015-07-01 意美森公司 System and method for providing haptic notifications
CN105589557A (en) * 2014-11-12 2016-05-18 乐金显示有限公司 Method of modeling haptic signal from haptic object, display apparatus, and driving method thereof
US9984479B2 (en) 2014-11-12 2018-05-29 Lg Display Co., Ltd. Display apparatus for causing a tactile sense in a touch area, and driving method thereof
CN105589557B (en) * 2014-11-12 2018-06-08 乐金显示有限公司 Modeling method, display equipment and its driving method of the haptic signal of haptic object
US10268342B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10180772B2 (en) 2015-03-08 2019-01-15 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11112957B2 (en) 2015-03-08 2021-09-07 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10860177B2 (en) 2015-03-08 2020-12-08 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10387029B2 (en) 2015-03-08 2019-08-20 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10613634B2 (en) 2015-03-08 2020-04-07 Apple Inc. Devices and methods for controlling media presentation
US10067645B2 (en) 2015-03-08 2018-09-04 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10338772B2 (en) 2015-03-08 2019-07-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10268341B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
CN105955520B (en) * 2015-03-08 2019-02-19 苹果公司 For controlling the device and method of media presentation
CN105955520A (en) * 2015-03-08 2016-09-21 苹果公司 Devices and Methods for Controlling Media Presentation
US10402073B2 (en) 2015-03-08 2019-09-03 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US11054990B2 (en) 2015-03-19 2021-07-06 Apple Inc. Touch input cursor manipulation
US10599331B2 (en) 2015-03-19 2020-03-24 Apple Inc. Touch input cursor manipulation
US11550471B2 (en) 2015-03-19 2023-01-10 Apple Inc. Touch input cursor manipulation
US10222980B2 (en) 2015-03-19 2019-03-05 Apple Inc. Touch input cursor manipulation
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US11240424B2 (en) 2015-06-07 2022-02-01 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10303354B2 (en) 2015-06-07 2019-05-28 Apple Inc. Devices and methods for navigating between user interfaces
US11835985B2 (en) 2015-06-07 2023-12-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11681429B2 (en) 2015-06-07 2023-06-20 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10841484B2 (en) 2015-06-07 2020-11-17 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11231831B2 (en) 2015-06-07 2022-01-25 Apple Inc. Devices and methods for content preview based on touch input intensity
US10705718B2 (en) 2015-06-07 2020-07-07 Apple Inc. Devices and methods for navigating between user interfaces
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10455146B2 (en) 2015-06-07 2019-10-22 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11182017B2 (en) 2015-08-10 2021-11-23 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10963158B2 (en) 2015-08-10 2021-03-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10884608B2 (en) 2015-08-10 2021-01-05 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10754542B2 (en) 2015-08-10 2020-08-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10203868B2 (en) 2015-08-10 2019-02-12 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11327648B2 (en) 2015-08-10 2022-05-10 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10209884B2 (en) 2015-08-10 2019-02-19 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10698598B2 (en) 2015-08-10 2020-06-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11740785B2 (en) 2015-08-10 2023-08-29 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
CN111052060B (en) * 2017-10-24 2023-11-24 微芯片技术股份有限公司 Touch-sensitive user interface including configurable virtual widgets
CN111052060A (en) * 2017-10-24 2020-04-21 微芯片技术股份有限公司 Touch sensitive user interface including configurable virtual widgets
CN113168083A (en) * 2018-12-10 2021-07-23 环球城市电影有限责任公司 Animated window system

Also Published As

Publication number Publication date
JP2012520137A (en) 2012-09-06
JP5779508B2 (en) 2015-09-16
KR20110130469A (en) 2011-12-05
WO2010105012A1 (en) 2010-09-16
KR102051180B1 (en) 2019-12-02
CN102349038B (en) 2016-08-24
KR20160110547A (en) 2016-09-21
KR102003426B1 (en) 2019-07-24
EP2406704A1 (en) 2012-01-18

Similar Documents

Publication Publication Date Title
CN102349038A (en) Systems and methods for a texture engine
US10198077B2 (en) Systems and methods for a texture engine
JP6588951B2 (en) System and method using multiple actuators to achieve texture
CN105892921B (en) System and method for implementing texture using multiple actuators
CN102362246A (en) Systems and methods for using multiple actuators to realize textures
CN101910977B (en) Audio and tactile feedback based on visual environment
JP2019050003A (en) Simulation of tangible user interface interactions and gestures using array of haptic cells
KR20130050251A (en) Systems and methods for multi-pressure interaction on touch-sensitive surfaces
EP3333674A1 (en) Systems and methods for compliance simulation with haptics
KR101992070B1 (en) Systems and methods for a texture engine

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C53 Correction of patent of invention or patent application
CB02 Change of applicant information

Address after: California, United States

Applicant after: IMMERSION CORPORATION

Address before: California, United States

Applicant before: Immersion Corp

COR Change of bibliographic data

Free format text: CORRECT: APPLICANT; FROM: IMMERSION CORP. TO: YIMEISEN CO., LTD.

C14 Grant of patent or utility model
GR01 Patent grant