US20120056839A1 - Haptic feedback for touchpads and other touch controls

Info

Publication number
US20120056839A1
Authority
United States
Prior art keywords
touchpad
haptic feedback
computer
region
graphical object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/295,947
Inventor
Louis B. Rosenberg
James R. Riegel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersion Corp
Original Assignee
Immersion Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (see https://patents.darts-ip.com/?family=23936915). “Global patent litigation dataset” by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Priority claimed from US09/103,281 (now U.S. Pat. No. 6,088,019)
Priority claimed from US09/156,802 (now U.S. Pat. No. 6,184,868 B1)
Priority claimed from US09/253,132 (now U.S. Pat. No. 6,243,078 B1)
Priority claimed from US09/467,309 (now U.S. Pat. No. 6,563,487 B2)
Application filed by Immersion Corp
Priority to US13/295,947
Assigned to Immersion Corporation (Assignors: Rosenberg, Louis B.; Riegel, James R.)
Publication of US20120056839A1
Priority to US13/747,389 (now U.S. Pat. No. 9,280,205 B2)
Priority to US15/054,693 (now U.S. Pat. No. 9,740,290 B2)
Legal status: Abandoned

Classifications

    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • A63F13/20: Input arrangements for video game devices
    • A63F13/2145: Input arrangements for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A63F13/218: Input arrangements using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A63F13/245: Constructional details of game controllers specially adapted to a particular type of game, e.g. steering wheels
    • A63F13/285: Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426: Mapping of input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/803: Special adaptations for driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A63F13/92: Video game devices specially adapted to be hand-held while playing
    • G06F1/1616: Portable computers with several enclosures having relative motions, with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration
    • G06F1/1626: Portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/169: Integrated I/O peripherals, the peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G06F1/1692: Integrated pointing device being a secondary touch screen used as control interface, e.g. virtual buttons or sliders
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0338: Pointing devices with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G06F3/03543: Mice or pucks
    • G06F3/03545: Pens or stylus
    • G06F3/03547: Touch pads, in which fingers can move on a surface
    • G06F3/0362: Pointing devices with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/04142: Digitisers using force sensing means to determine a position, the force sensing means being located peripherally, e.g. disposed at the corners or at the side of a touch sensing plate
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/04855: Interaction with scrollbars
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A63F2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006: Input arrangements having additional degrees of freedom
    • A63F2300/1037: Input arrangements specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A63F2300/1056: Input arrangements involving pressure sensitive buttons
    • A63F2300/1062: Input arrangements specially adapted to a type of game, e.g. steering wheel
    • A63F2300/1075: Input arrangements specially adapted to detect the point of contact of the player on a surface, using a touch screen
    • A63F2300/204: Game platform being a handheld device
    • A63F2300/6045: Methods for mapping control signals received from the input arrangement into game commands
    • A63F2300/8017: Games specially adapted for driving on land or water, or flying
    • G06F2203/013: Force feedback applied to a game
    • G06F2203/014: Force feedback applied to GUI
    • G06F2203/015: Force feedback applied to a joystick
    • H01H2003/008: Mechanisms for operating contacts with a haptic or tactile feedback controlled by electrical means, e.g. a motor or magnetofriction
    • H01H2215/052: Tactile feedback, electromechanical, piezoelectric
    • H04L69/329: Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions in the application layer [OSI layer 7]

Definitions

  • The subject matter described relates generally to interfacing between a user and computer and mechanical devices, and more particularly to devices used to interface with computer systems and electronic devices and which provide haptic feedback to the user.
  • One such application is interacting with computer-generated environments such as games, simulations, and application programs.
  • Computer input devices such as mice and trackballs are often used to control a cursor within a graphical environment and provide input in these applications.
  • In some interface devices, force feedback or tactile feedback is also provided to the user, collectively known herein as “haptic feedback.”
  • For example, haptic versions of joysticks, mice, gamepads, steering wheels, or other types of devices can output forces to the user based on events or interactions occurring within the graphical environment, such as in a game or other application program.
  • touchpads are small rectangular, planar pads provided near the keyboard of the computer.
  • the touchpad senses the location of a pointing object by any of a variety of sensing technologies, such as capacitive sensors or pressure sensors that detect pressure applied to the touchpad.
  • the user contacts the touchpad most commonly with a fingertip and moves his or her finger on the pad to move a cursor displayed in the graphical environment.
  • the user can operate a stylus in conjunction with the touchpad by pressing the stylus tip on the touchpad and moving the stylus.
  • One problem with existing touchpads is that there is no haptic feedback provided to the user. The user of a touchpad is therefore not able to experience haptic sensations that assist and inform the user of targeting and other control tasks within the graphical environment. The touchpads of the prior art also cannot take advantage of existing haptic-enabled software run on the portable computer.
  • An embodiment is directed to a haptic feedback planar touch control used to provide input to a computer system.
  • the control can be a touchpad provided on a portable computer, or can be a touch screen found on a variety of devices.
  • the haptic sensations output on the touch control enhance interactions and manipulations in a displayed graphical environment or when controlling an electronic device.
  • the embodiment relates to a haptic feedback touch control for inputting signals to a computer and for outputting forces to a user of the touch control.
  • the control includes a touch input device including an approximately planar touch surface operative to input a position signal to a processor of said computer based on a location of user contact on the touch surface.
  • the computer positions a cursor in a graphical environment displayed on a display device based at least in part on the position signal.
  • At least one actuator is also coupled to the touch input device and outputs a force on the touch input device to provide a haptic sensation to the user contacting the touch surface.
  • the actuator outputs the force based on force information output by the processor to the actuator.
  • the touch input device can be a touchpad separate from a display screen of the computer, or can be included in a display screen of the computer as a touch screen.
  • the touch input device can be integrated in a housing of the computer or handheld device, or provided in a housing that is separate from the computer.
  • the user contacts the touch surface with a finger, a stylus, or other object.
  • the force is preferably a linear force output approximately perpendicularly to a plane of the touch surface of the touch input device, and the actuator can include a piezo-electric actuator, a voice coil actuator, a pager motor, a solenoid, or other type of actuator.
  • the actuator is coupled between the touch input device and a grounded surface.
  • the actuator is coupled to an inertial mass, wherein said actuator outputs an inertial force on the touch input device approximately along an axis perpendicular to the planar touch surface.
  • a touch device microprocessor separate from the main processor of the computer can receive force information from the host computer and provide control signals based on the force information to control the actuator.
  • the haptic sensations are preferably output in accordance with an interaction of a controlled cursor with a graphical object in the graphical environment.
  • For example, a pulse can be output when the cursor is moved between menu elements in a menu, moved over an icon, or moved over a hyperlink.
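  • As a rough illustration (not part of the patent disclosure), a host-side driver could map such cursor events to pulse commands along the lines of the Python sketch below; the event names, effect parameters, and the send_effect transport stub are assumptions made for the example:

        # Illustrative sketch: mapping GUI cursor events to haptic pulse commands.
        # Event names and effect parameters are hypothetical, not from the patent.
        from dataclasses import dataclass

        @dataclass
        class PulseEffect:
            magnitude: float   # 0.0 .. 1.0, fraction of full actuator drive
            duration_ms: int   # pulse length

        # Hypothetical mapping from GUI interaction events to pulse effects.
        EVENT_EFFECTS = {
            "menu_item_change": PulseEffect(magnitude=0.4, duration_ms=10),
            "icon_enter":       PulseEffect(magnitude=0.6, duration_ms=15),
            "hyperlink_enter":  PulseEffect(magnitude=0.3, duration_ms=8),
        }

        def send_effect(effect: PulseEffect) -> None:
            # Placeholder for the host-to-touchpad transport (e.g. a USB report).
            print(f"pulse magnitude={effect.magnitude} duration={effect.duration_ms}ms")

        def on_cursor_event(event_name: str) -> None:
            effect = EVENT_EFFECTS.get(event_name)
            if effect is not None:
                send_effect(effect)

        on_cursor_event("icon_enter")   # outputs a single pulse when the cursor crosses an icon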
  • the touch input device can include multiple different regions, where at least one of the regions provides the position signal and at least one other region provides a signal that is used by the computer to control a different function, such as a rate control function of a value or a button press. Different regions and borders between regions can be associated with different haptic sensations.
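  • One possible way to represent such regions in software is sketched below in Python; the region rectangles, their assigned roles, and the border pulse are illustrative assumptions only (the patent's multi-region embodiment is described with respect to FIG. 7):

        # Illustrative sketch of a region map for a multi-function touchpad.
        # Region coordinates, roles, and sensations are assumed for the example.
        REGIONS = [
            {"name": "cursor", "rect": (0.0, 0.0, 0.8, 1.0), "role": "position"},
            {"name": "scroll", "rect": (0.8, 0.0, 1.0, 1.0), "role": "rate"},   # rate control of a value
        ]

        def region_at(x: float, y: float):
            for region in REGIONS:
                x0, y0, x1, y1 = region["rect"]
                if x0 <= x < x1 and y0 <= y < y1:
                    return region
            return None

        def handle_motion(prev_xy, new_xy, emit_pulse):
            prev_region = region_at(*prev_xy)
            new_region = region_at(*new_xy)
            if prev_region is not new_region:
                emit_pulse()                 # distinct sensation when crossing a region border
            if new_region and new_region["role"] == "rate":
                return {"scroll_rate": new_xy[1] - 0.5}   # offset from centre sets the rate
            return {"cursor_delta": (new_xy[0] - prev_xy[0], new_xy[1] - prev_xy[1])}

        print(handle_motion((0.5, 0.5), (0.9, 0.7), emit_pulse=lambda: print("border pulse")))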
  • An embodiment advantageously provides haptic feedback to a planar touch control device of a computer, such as a touchpad or touch screen.
  • the haptic feedback can assist and inform the user of interactions and events within a graphical user interface or other environment and ease cursor targeting tasks.
  • an embodiment allows portable computer devices having such touch controls to take advantage of existing haptic feedback enabled software.
  • the haptic touch devices disclosed herein are also inexpensive, compact and consume low power, allowing them to be easily incorporated into a wide variety of portable and desktop computers and electronic devices.
  • FIG. 1 is a perspective view of a haptic touchpad
  • FIG. 2 is a perspective view of a remote control device including the touchpad
  • FIG. 3 is a perspective view of a first embodiment of the touchpad including one or more actuators coupled to the underside of the touchpad;
  • FIG. 4 is a side elevational view of a first embodiment in which a piezo-electric actuator is directly coupled to the touchpad;
  • FIG. 5 is a side elevational view of a second embodiment of the touchpad including a linear actuator
  • FIG. 6 is a side elevational view of a third embodiment of the touchpad having an inertial mass
  • FIG. 7 is a top plan view of an example of a touchpad having different control regions.
  • FIGS. 8 a and 8 b are top plan and side cross sectional views, respectively, of a touch screen embodiment.
  • FIG. 1 is a perspective view of a portable computer 10 including a haptic touchpad.
  • Computer 10 is preferably a portable or “laptop” computer that can be carried or otherwise transported by the user and may be powered by batteries or other portable energy source in addition to other more stationary power sources.
  • Computer 10 preferably runs one or more host application programs with which a user is interacting via peripherals.
  • Computer 10 may include the various input and output devices as shown, including a display device 12 for outputting graphical images to the user, a keyboard 14 for providing character or toggle input from the user to the computer, and a touchpad 16 .
  • Display device 12 can be any of a variety of types of display devices; flat-panel displays are most common on portable computers.
  • Display device 12 can display a graphical environment 18 based on application programs and/or operating systems that are running, such as a graphical user interface (GUI), that can include a cursor 20 that can be moved by user input, as well as windows 22 , icons 24 , and other graphical objects well known in GUI environments.
  • Other devices may also be incorporated or coupled to the computer 10 , such as storage devices (hard disk drive, DVD-ROM drive, etc.), network server or clients, game controllers, etc.
  • the computer 10 can take a wide variety of forms, including computing devices that rest on a tabletop or other surface, stand-up arcade game machines, other portable devices or devices worn on the person, handheld or used with a single hand of the user, etc.
  • host computer 10 can be a video game console, personal computer, workstation, a television “set top box” or a “network computer”, or other computing or electronic device.
  • Touchpad device 16 preferably appears externally to be similar to the touchpads of the prior art.
  • Pad 16 includes a planar, rectangular smooth surface that can be positioned below the keyboard 14 on the housing of the computer 10 , as shown, or may be positioned at other areas of the housing.
  • the user may conveniently place a fingertip or other object on the touchpad 16 and move the fingertip to correspondingly move cursor 20 in the graphical environment 18 .
  • the touchpad 16 inputs coordinate data to the main microprocessor(s) of the computer 10 based on the sensed location of an object on (or near) the touchpad.
  • touchpad 16 can be capacitive, resistive, or use a different type of sensing.
  • Some existing touchpad embodiments are disclosed, for example, in U.S. Pat. No. 5,521,336 and U.S. Pat. No. 5,943,044.
  • Capacitive touchpads typically sense the location of an object on or near the surface of the touchpad based on capacitive coupling between capacitors in the touchpad and the object.
  • Resistive touchpads are typically pressure-sensitive, detecting the pressure of a finger, stylus, or other object against the pad, where the pressure causes conductive layers, traces, switches, etc. in the pad to electrically connect. Some resistive or other types of touchpads can detect the amount of pressure applied by the user and can use the degree of pressure for proportional or variable input to the computer 10 . Resistive touchpads typically are at least partially deformable, so that when a pressure is applied to a particular location, the conductors at that location are brought into electrical contact. Such deformability can be useful since it can potentially amplify the magnitude of output forces such as pulses or vibrations on the touchpad.
  • Capacitive touchpads and other types of touchpads that do not require significant contact pressure may be better suited in many embodiments, since excessive pressure on the touchpad may in some cases interfere with the motion of the touchpad for haptic feedback.
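  • As a hedged illustration of the pressure-proportional input mentioned above, the Python sketch below maps a normalised pressure reading to a scrolling rate; the thresholds and mapping are assumed values, not taken from the patent:

        # Illustrative sketch: using the degree of applied pressure as a proportional input,
        # e.g. to set a scrolling rate. The pressure range and mapping are assumed values.
        def pressure_to_scroll_rate(pressure, p_min=0.1, p_max=1.0, max_lines_per_s=40.0):
            """Map a normalised pressure reading (0..1) to a scroll rate in lines per second."""
            if pressure < p_min:
                return 0.0                                   # light touch: no scrolling
            span = max(p_max - p_min, 1e-6)
            fraction = min((pressure - p_min) / span, 1.0)
            return fraction * max_lines_per_s                # harder press scrolls faster

        for p in (0.05, 0.3, 0.7, 1.0):
            print(p, pressure_to_scroll_rate(p))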
  • Other types of sensing technologies can also be used in the touchpad.
  • the term “touchpad” preferably includes the surface of the touchpad 16 as well as any sensing apparatus included in the touchpad unit.
  • Touchpad 16 preferably operates similarly to existing touchpads, where the speed of the fingertip on the touchpad correlates to the distance that the cursor is moved in the graphical environment. For example, if the user moves his or her finger quickly across the pad, the cursor is moved a greater distance than if the user moves the fingertip more slowly. If the user's finger reaches the edge of the touchpad before the cursor reaches a desired destination in that direction, then the user can simply move his or her finger off the touchpad, reposition the finger away from the edge, and continue moving the cursor. This is an “indexing” function similar to lifting a mouse off a surface to change the offset between mouse position and cursor.
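  • The speed-scaled cursor mapping and the "indexing" behaviour described above can be sketched as follows; the gain values and lift-off handling are assumptions for illustration, not the patent's algorithm:

        # Illustrative sketch of relative cursor mapping with "indexing":
        # while the finger is down, cursor motion is scaled by finger speed;
        # lifting the finger and putting it down elsewhere moves the cursor not at all.
        class RelativeMapper:
            def __init__(self, base_gain=1.0, accel_gain=2.0):
                self.base_gain = base_gain       # gain at low finger speed (assumed value)
                self.accel_gain = accel_gain     # extra gain applied at higher speeds (assumed)
                self.last_pos = None             # None means the finger is lifted

            def finger_up(self):
                self.last_pos = None             # re-indexing: the next touch sets a new origin

            def finger_move(self, x, y, dt):
                if self.last_pos is None:
                    self.last_pos = (x, y)
                    return (0.0, 0.0)            # first contact after a lift moves nothing
                dx, dy = x - self.last_pos[0], y - self.last_pos[1]
                speed = (dx * dx + dy * dy) ** 0.5 / max(dt, 1e-6)
                gain = self.base_gain + self.accel_gain * speed
                self.last_pos = (x, y)
                return (gain * dx, gain * dy)    # cursor displacement

        m = RelativeMapper()
        print(m.finger_move(0.10, 0.10, 0.01))   # first sample: no motion
        print(m.finger_move(0.14, 0.10, 0.01))   # fast stroke: larger cursor step
        m.finger_up()                            # lift to re-index near the pad edge
        print(m.finger_move(0.90, 0.10, 0.01))   # new origin, cursor does not jump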
  • touchpads can be provided with particular regions that are each assigned to particular functions that can be unrelated to cursor positioning. Such an embodiment is described in greater detail below with respect to FIG. 7 .
  • the touchpad 16 may also allow a user to “tap” the touchpad (rapidly touch and remove the object from the pad) in a particular location to provide a command. For example, the user can tap or “double tap” the pad with a finger while the controlled cursor is over an icon to select that icon.
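  • A tap and double-tap detector of the kind described could look roughly like the following Python sketch; the timing thresholds are assumed values:

        # Illustrative tap / double-tap detector. Thresholds are assumed, not from the patent.
        TAP_MAX_S = 0.20          # contact shorter than this counts as a tap
        DOUBLE_TAP_GAP_S = 0.30   # two taps closer than this count as a double tap

        class TapDetector:
            def __init__(self):
                self.touch_down_t = None
                self.last_tap_t = None

            def touch_down(self, t):
                self.touch_down_t = t

            def touch_up(self, t):
                if self.touch_down_t is None or t - self.touch_down_t > TAP_MAX_S:
                    self.touch_down_t = None
                    return None                      # a drag or hold, not a tap
                self.touch_down_t = None
                if self.last_tap_t is not None and t - self.last_tap_t < DOUBLE_TAP_GAP_S:
                    self.last_tap_t = None
                    return "double_tap"              # e.g. select the icon under the cursor
                self.last_tap_t = t
                return "tap"

        d = TapDetector()
        d.touch_down(0.00); print(d.touch_up(0.10))  # -> tap
        d.touch_down(0.25); print(d.touch_up(0.33))  # -> double_tap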
  • the touchpad 16 is provided with the ability to output haptic feedback such as tactile sensations to the user who is physically contacting the touchpad 16 .
  • the forces output on the touchpad are linear (or approximately linear) and oriented along the z-axis, approximately perpendicular to the surface of the touchpad 16 and the top surface of computer 10 .
  • forces can be applied to the touchpad 16 to cause side-to-side (e.g., x-y) motion of the pad in the plane of its surface in addition to or instead of z-axis motion, although such motion is not preferred.
  • haptic sensations can be output to the user who is contacting the pad. For example, jolts, vibrations (varying or constant amplitude), and textures can be output. Forces output on the pad can be at least in part based on the location of the finger on the pad or the state of a controlled object in the graphical environment of the host computer 10 , and/or independent of finger position or object state. Such forces output on the touchpad 16 are considered “computer-controlled” since a microprocessor or other electronic controller is controlling the magnitude and/or direction of the force output of the actuator(s) using electronic signals.
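  • In simulation, the pulse, vibration, and texture sensations named above can be expressed as computer-controlled drive waveforms; the waveform shapes, sample rate, and parameters in the Python sketch below are illustrative assumptions rather than actuator specifications from the patent:

        # Illustrative drive-signal generators for the sensations named above.
        # Sample rate and envelopes are assumed; real hardware uses actuator-specific drive electronics.
        import math

        def pulse(magnitude=1.0, duration_s=0.01, rate_hz=1000):
            # A single short jolt: constant drive for a brief interval.
            n = int(duration_s * rate_hz)
            return [magnitude] * n

        def vibration(freq_hz=250.0, magnitude=0.5, duration_s=0.1, rate_hz=1000):
            # A periodic vibration; magnitude may also be varied over time for "varying amplitude".
            n = int(duration_s * rate_hz)
            return [magnitude * math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

        def texture(spacing=0.05, magnitude=0.7):
            # A spatial texture: a pulse is emitted each time the finger crosses a "bump" boundary.
            def sample(finger_x, prev_x):
                crossed = int(finger_x / spacing) != int(prev_x / spacing)
                return pulse(magnitude) if crossed else []
            return sample

        bumps = texture()
        print(len(pulse()), len(vibration()), len(bumps(0.12, 0.09)))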
  • the entire pad 16 is provided with haptic sensations as a single unitary member; in other embodiments, individually-moving portions of the pad can each be provided with its own haptic feedback actuator and related transmissions so that haptic sensations can be provided for only a particular portion.
  • some embodiments may include a touchpad having different portions that may be flexed or otherwise moved with respect to other portions of the pad.
  • the touchpad 16 can be provided in a separate housing that is connected to a port of the computer 10 via a cable or via wireless transmission and which receives force information from and sends position information to the computer 10 .
  • For example, a Universal Serial Bus (USB), Firewire, or standard serial bus can connect such a touchpad to the computer 10.
  • the computer 10 can be any desktop or stationary computer or device and need not be a portable device.
  • buttons 26 can also be provided on the housing of the computer 10 to be used in conjunction with the touchpad 16 .
  • the user's hands have easy access to the buttons, each of which may be pressed by the user to provide a distinct input signal to the host computer 10.
  • each button 26 corresponds to a similar button found on a mouse input device, so that a left button can be used to select a graphical object (click or double click), a right button can bring up a context menu, etc.
  • one or more of the buttons 26 can be provided with tactile feedback as described in U.S. Pat. No. 6,184,868 and U.S. Pat. No. 6,563,487. Other features of these disclosures may also be used.
  • one or more moveable portions 28 of the housing of the computer device 10 can be included, which are contacted by the user when the user operates the touchpad 16 and which can provide haptic feedback.
  • Having a moveable portion of a housing for haptic feedback is described in U.S. Pat. No. 6,184,868 and U.S. Pat. No. 6,088,019.
  • both the housing (e.g., through the use of an eccentric rotating mass on a motor coupled to the housing) and the touchpad 16 can provide separate haptic feedback.
  • a vibration of a low frequency can be conveyed through the housing to the user and a higher frequency vibration can be conveyed to the user through the touchpad 16 .
  • Each button or other control provided with haptic feedback can also provide tactile feedback independently from the other controls.
  • the host application program(s) and/or operating system preferably displays graphical images of the environment on display device 12 .
  • the software and environment running on the host computer 10 may be of a wide variety.
  • the host application program can be a word processor, spreadsheet, video or computer game, drawing program, operating system, graphical user interface, simulation, Web page or browser that implements HTML or VRML instructions, scientific analysis program, virtual reality training program or application, or other application program that utilizes input from the touchpad 16 and outputs force feedback commands to the touchpad 16 .
  • many games and other application programs include force feedback functionality and may communicate with the touchpad 16 using a standard protocol/drivers such as I-Force®, FEELit®, or TouchSense™ available from Immersion Corporation of San Jose, Calif.
  • the touchpad 16 can include circuitry necessary to report control signals to the microprocessor of the host computer 10 and to process command signals from the host's microprocessor. For example, appropriate sensors (and related circuitry) are used to report the position of the user's finger on the touchpad 16 .
  • the touchpad device also includes circuitry that receives signals from the host and outputs tactile sensations in accordance with the host signals using one or more actuators.
  • a separate, local microprocessor can be provided for the touchpad 16 to report touchpad sensor data to the host and/or to carry out force commands received from the host, such commands including, for example, the type of haptic sensation and parameters describing the commanded haptic sensation.
  • the touchpad microprocessor can simply pass streamed data from the main processor to the actuators.
  • the term “force information” can include both commands/parameters and streamed data.
  • the touchpad microprocessor can implement haptic sensations independently after receiving a host command by controlling the touchpad actuators; or, the host processor can maintain a greater degree of control over the haptic sensations by controlling the actuators more directly.
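  • The split between high-level host commands and streamed force data could be handled by a local controller roughly as sketched below in Python; the message format, field names, and actuator interface are hypothetical, not the patent's protocol:

        # Illustrative sketch of a local touchpad controller that accepts either
        # high-level haptic commands or streamed force samples from the host.
        # Message formats and field names are hypothetical.
        import math

        class TouchpadController:
            def __init__(self, drive_actuator):
                self.drive_actuator = drive_actuator  # callable taking a drive level -1.0 .. 1.0
                self.playing = []                     # pending output samples

            def handle_host_message(self, msg):
                if msg["kind"] == "command":
                    # High-level command: the local controller synthesises the waveform itself.
                    if msg["effect"] == "vibration":
                        n = int(msg["duration_s"] * 1000)
                        self.playing = [msg["magnitude"] * math.sin(2 * math.pi * msg["freq_hz"] * i / 1000)
                                        for i in range(n)]
                    elif msg["effect"] == "pulse":
                        self.playing = [msg["magnitude"]] * int(msg["duration_s"] * 1000)
                elif msg["kind"] == "stream":
                    # Streamed data: the host keeps direct control, samples pass straight through.
                    self.playing.extend(msg["samples"])

            def tick(self):
                # Called once per control period (e.g. 1 kHz) to drive the actuator.
                level = self.playing.pop(0) if self.playing else 0.0
                self.drive_actuator(level)

        ctrl = TouchpadController(drive_actuator=lambda level: None)
        ctrl.handle_host_message({"kind": "command", "effect": "pulse", "magnitude": 0.8, "duration_s": 0.01})
        for _ in range(5):
            ctrl.tick()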
  • logic circuitry such as state machines provided for the touchpad 16 can handle haptic sensations as directed by the host main processor. Architectures and control methods that can be used for reading sensor signals and providing haptic feedback for a device are described in greater detail in U.S. Pat. No. 5,734,373 and co-pending application Nos. 60/156,354, 60/133,208, Ser. No. 09/376,649, U.S. Pat. No. 6,639,581 and 60/160,401.
  • FIG. 2 is a perspective view of another embodiment of a device which can include the active touchpad 16 .
  • the device can be a handheld remote control device 30, which the user grasps in one hand and manipulates controls to remotely access the functions of an electronic device or appliance (such as a television, video cassette recorder or DVD player, audio/video receiver, Internet or network computer connected to a television, etc.).
  • buttons 32 can be included on the remote control device 30 to manipulate functions of the controlled apparatus.
  • a touchpad 16 can also be provided to allow the user to provide more sophisticated directional input.
  • a controlled apparatus may have a selection screen in which a cursor may be moved, and the touchpad 16 can be manipulated to control the cursor in two dimensions.
  • the touchpad 16 includes the ability to output haptic sensations to the user as described herein, based on a controlled value or event. For example, a volume level passing a mid-point or reaching a maximum level can cause a pulse to be output to the touchpad and to the user.
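  • Such controlled-value events (e.g., a volume level crossing its mid-point or reaching the maximum) could trigger pulses as in the following sketch; the thresholds and pulse magnitudes are assumed for illustration:

        # Illustrative sketch: pulse the touchpad when a controlled value crosses notable thresholds.
        # The mid-point/maximum thresholds and pulse strengths are assumed values.
        def volume_feedback(old_volume, new_volume, emit_pulse, vol_max=100):
            mid = vol_max / 2
            if (old_volume < mid) != (new_volume < mid):
                emit_pulse(magnitude=0.4)                    # crossed the mid-point
            if new_volume >= vol_max and old_volume < vol_max:
                emit_pulse(magnitude=0.8)                    # reached the maximum level

        volume_feedback(49, 51, lambda magnitude: print("pulse", magnitude))
        volume_feedback(97, 100, lambda magnitude: print("pulse", magnitude))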
  • the controlled apparatus can be a computer system such as Web-TV from Microsoft Corp. or other computing device which displays a graphical user interface and/or web pages accessed over a network such as the Internet.
  • the user can control the direction of the cursor by moving a finger (or other object) on the touchpad 16 .
  • the cursor can be used to select and/or manipulate icons, windows, menu items, graphical buttons, slider bars, scroll bars, or other graphical objects in a graphical user interface or desktop interface.
  • the cursor can also be used to select and/or manipulate graphical objects on a web page, such as links, images, buttons, etc. Other force sensations associated with graphical objects are described below with reference to FIG. 7 .
  • FIG. 3 is a perspective view of a first embodiment 40 of a touchpad 16 for providing haptic feedback to the user.
  • one or more piezoelectric actuators 42 are coupled to the underside of the touchpad 16 .
  • the piezoelectric actuator 42 is driven by suitable electronics, as is well known to those skilled in the art.
  • a single piezoelectric actuator 42 is positioned at or near the center of the touchpad 16 , or off to one side if space constraints of the housing require such a position.
  • multiple piezoelectric actuators 42 can be positioned at different areas of the touchpad; the dashed lines show one configuration, where an actuator 42 is placed at each corner of the pad 16 and at the center of the pad.
  • the piezoelectric actuators 42 can each output a small pulse, vibration, or texture sensation on the touchpad 16 and to the user if the user is contacting the touchpad.
  • the entire touchpad 16 is preferably moved with the forces output by actuator(s) 42 .
  • the forces output on the touchpad are linear (or approximately linear) and along the z-axis, approximately perpendicular to the surface of the touchpad 16 and the top surface of computer 10 .
  • forces can be applied to the touchpad 16 to cause side-to-side (e.g., x-y) motion of the pad in the plane of its surface in addition to or instead of z-axis motion.
  • for example, one linear actuator can provide motion for the x-axis, and a second linear actuator can provide motion for the y-axis and/or the x-axis.
  • the frequency of a vibration output by an actuator 42 can be varied by providing different control signals to an actuator 42 . Furthermore, the magnitude of a pulse or vibration can be controlled based on the applied control signal. If multiple actuators 42 are provided, a stronger vibration can be imparted on the touchpad by activating two or more actuators simultaneously. Furthermore, if an actuator is positioned at an extreme end of the touchpad and is the only actuator that is activated, the user may experience a stronger vibration on the side of the touchpad having the actuator than on the opposite side of the touchpad. Different magnitudes and localized effects can be obtained by activating some but not all of the actuators. Since the tip of a user's finger that is touching the pad is fairly sensitive, the output forces do not have to be of a high magnitude for the haptic sensation to be effective and compelling.
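  • Choosing which of several actuators to drive, and at what amplitude, might be handled roughly as in the sketch below; the actuator layout and distance-based weighting are assumptions, not a configuration described in the patent:

        # Illustrative sketch: drive multiple touchpad actuators to vary overall magnitude
        # or to localise a sensation near the finger. Actuator positions are assumed.
        ACTUATORS = {               # normalised (x, y) positions on the pad
            "nw": (0.0, 0.0), "ne": (1.0, 0.0),
            "sw": (0.0, 1.0), "se": (1.0, 1.0),
            "centre": (0.5, 0.5),
        }

        def drive_levels(finger_xy, magnitude, localized=True):
            """Return a per-actuator drive level for one output sample."""
            if not localized:
                # Activate all actuators together for the strongest overall vibration.
                return {name: magnitude for name in ACTUATORS}
            fx, fy = finger_xy
            levels = {}
            for name, (ax, ay) in ACTUATORS.items():
                dist = ((fx - ax) ** 2 + (fy - ay) ** 2) ** 0.5
                levels[name] = magnitude * max(0.0, 1.0 - dist)   # nearer actuators driven harder
            return levels

        print(drive_levels((0.9, 0.1), magnitude=1.0))            # strongest near the "ne" corner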
  • the user may also hold other objects that directly contact the touchpad. Any haptic sensations output on the pad can be transmitted through the held object to the user's hand.
  • the user can hold a stylus having a point that contacts the touchpad 16 more precisely than a finger.
  • Other objects may also be used.
  • specialized objects can be used to enhance the haptic sensations.
  • a stylus or other object having a flexible portion or compliance may be able to magnify at least some of the touchpad haptic sensations as experienced by the user.
  • the piezoelectric actuators 42 have several advantages for the touchpad 16 . These actuators can be made very thin and small, allowing their use in compact housings that are typical for portable electronic devices. They also require very low power, and are thus suitable for devices with limited power (e.g., powered by batteries). In some embodiments described herein, power for the actuators can be drawn off a bus connecting the computer to the touchpad (or touch screen). For example, if the touchpad 16 is provided in a separate housing, a Universal Serial Bus can connect the pad to the computer and provide power from the computer to the pad as well as data (e.g. streaming force data, force commands, etc.).
  • FIG. 4 is a side elevational view of the embodiment 40 of the touchpad 16 as shown in FIG. 3 .
  • Touchpad 16 is directly coupled to a grounded piezo-electric actuator 42 which operates to produce a force on the touchpad 16 when an electrical signal is input to the actuator.
  • a piezo-electric actuator includes two layers which can move relative to each other when a current is applied to the actuator; here, the grounded portion of the actuator remains stationary with respect to the surrounding housing 41 while the moving portion of the actuator and the touchpad move with respect to the housing 41 .
  • the operation of piezo-electric actuators to output force based on an input electrical signal is well known to those skilled in the art.
  • the touchpad 16 can be coupled only to the actuator 42 , or can be additionally coupled to the housing of the computer device at other locations besides the actuators 42 .
  • the other couplings are compliant connections, using a material or element such as a spring or foam. If such connections are not made compliant, then the touchpad 16 itself preferably has some compliance to allow portions of the pad to move in response to actuator forces and to convey the haptic sensations to the user more effectively.
  • the electric signal preferably is obtained from a microprocessor and any circuitry required to convert the microprocessor signal to an appropriate signal for use with the actuator 42 .
  • FIG. 5 is a side elevational view of another embodiment 50 , in which the touchpad 16 is positioned on one or more springs 52 .
  • the springs 52 couple the touchpad 16 to the rigid housing of the computer 10 and allow the touchpad 16 to be moved along the z-axis 56 . Only a very small range of motion is required to produce effective pulses (jolts) or vibrations on the pad 16 . Stops (not shown) can be positioned to limit the travel of the touchpad 16 to a desired range along the z-axis.
  • actuator 54 is also coupled to the touchpad 16 to impart forces on the touchpad and cause the touchpad 16 to move along the z-axis.
  • actuator 54 is a linear voice coil actuator, where the moving portion (bobbin) of the actuator is directly coupled to the touchpad 16 .
  • the actuator 54 is grounded to the computer 10 housing and outputs a linear force on the touchpad 16 and thus drives the touchpad along the z-axis.
  • a short pulse or jolt can be output, or the moving portion of the actuator can be oscillated to provide a vibration having a particular desired frequency.
  • the springs 52 cause the touchpad 16 to return to a rest position after a force from the actuator causes the touchpad to move up or down.
  • the springs can also provide a compliant suspension for the touchpad 16 and allow forces output by the actuator 54 to be amplified as explained above.
  • Different types of spring elements can be used in other embodiments to couple the touchpad 16 to the rigid housing, such as leaf springs, foam, flexures, or other compliant materials.
  • the user is able to push the touchpad 16 along the z-axis to provide additional input to the computer 10 .
  • a sensor can be used to detect the position of the touchpad 16 along the z-axis, such as an optical sensor, magnetic sensor, Polhemus sensor, etc.
  • the position on the z-axis can be used to provide proportional input to the computer, for example.
  • other types of forces can be output along the z-axis, such as spring forces, damping forces, inertial forces, and other position-based forces, as disclosed in U.S. Pat. No. 6,563,487.
  • 3-D elevations can be simulated in the graphical environment by moving the pad to different elevations along the z-axis.
  • if the pad 16 is used as an analog input, where the input depends on the distance the entire pad is moved along the z-axis, and/or if kinesthetic (force) feedback is applied in the z-axis degree of freedom, then a greater range of motion for the pad 16 along the z-axis is desirable.
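  • For illustration, a minimal sketch in Python of mapping a sensed z-axis displacement of the pad to a proportional (analog) input value; the deadband and travel range are illustrative assumptions:

        def z_axis_to_analog(displacement_mm, deadband_mm=0.2, full_scale_mm=2.0):
            """Map sensed z-axis pad displacement to a 0.0-1.0 analog input value.
            Displacements inside the deadband are ignored so ordinary finger
            pressure during cursor control does not register as a press."""
            depth = abs(displacement_mm)
            if depth <= deadband_mm:
                return 0.0
            value = (depth - deadband_mm) / (full_scale_mm - deadband_mm)
            return min(value, 1.0)

        print(z_axis_to_analog(0.1))   # 0.0  (within the deadband)
        print(z_axis_to_analog(1.1))   # 0.5  (half of the usable travel)
        print(z_axis_to_analog(5.0))   # 1.0  (clamped at full scale)
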
  • An elastomeric layer can be provided if the touchpad 16 is able to be pressed by the user to close a switch and provide button or switch input to the computer 10 (e.g. using contact switches, optical switches, or the like).
  • the z-axis movement preferably requires a relatively large amount of force to move the pad, at least initially, since such z-axis movement may not be desired during normal use of the pad by the user.
  • the voice coil actuator 54 preferably includes a coil and a magnet, where a current is flowed through the coil and interacts with the magnetic field of the magnet to cause a force on the moving portion of the actuator (the coil or the magnet, depending on the implementation), as is well known to those skilled in the art and is described in U.S. Pat. No. 6,184,868.
  • Other types of actuators can also be used, such as a standard speaker, an E-core type actuator (as described in U.S. Pat. No. 6,704,001), a solenoid, a pager motor, a DC motor, moving magnet actuator (described in provisional application No. 60/133,208 and U.S. Pat. No. 6,704,001), or other type of actuator.
  • the actuator can be positioned to output linear motion along an axis perpendicular to the z-axis or along another direction different from the z-axis (rotary or linear), where a mechanism converts such output motion to linear motion along the z-axis as is well known to those skilled in the art.
  • the touchpad 16 can also be integrated with an elastomeric layer and/or a printed circuit board in a sub-assembly, where one or more actuators are coupled to the printed circuit board to provide tactile sensations to the touchpad 16 .
  • Helical springs can also be provided to engage electrical contacts.
  • multiple voice coil actuators can be positioned at different locations under the touchpad 16 .
  • FIG. 6 is a side elevational view of a third embodiment 60 of the haptic touchpad 16 .
  • the stationary portion of the actuator is coupled to the touchpad 16
  • the moving portion of the actuator is coupled to an inertial mass to provide inertial haptic sensations.
  • Touchpad 16 can be compliantly mounted to the rigid housing of the computer device similarly to the embodiments described above.
  • one or more spring elements 62 can be coupled between the touchpad and the housing.
  • These springs can be helical or leaf springs, a compliant material such as rubber or foam, flexures, etc.
  • One or more actuators 64 are coupled to the underside of the touchpad 16 .
  • a piezoelectric actuator is shown.
  • One portion 66 of each actuator 64 is coupled to the touchpad 16
  • the other portion 68 is coupled to a mass 70 .
  • the mass 70 can be any suitable object of the desired weight, such as plastic or metal material. The mass 70 is moved approximately along the z-axis and is not coupled to the housing, allowing free motion.
  • the motion of the mass 70 along the z-axis causes an inertial force that is transmitted through the actuator 64 to the touchpad 16 , and the touchpad 16 moves along the z-axis due to the compliant coupling 62 .
  • the motion of the touchpad 16 is felt by the user contacting the touchpad 16 as a haptic sensation.
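  • For illustration, a short worked example (in Python) of the inertial force such an embodiment could transmit to the pad when the mass is oscillated sinusoidally along the z-axis; the mass, amplitude, and frequency are illustrative numbers only:

        import math

        def peak_inertial_force(mass_kg, amplitude_m, frequency_hz):
            """Peak reaction force for a sinusoidally driven mass: for
            x(t) = A*sin(2*pi*f*t), peak acceleration is A*(2*pi*f)**2,
            so the peak force transmitted to the pad is m*A*(2*pi*f)**2."""
            return mass_kg * amplitude_m * (2 * math.pi * frequency_hz) ** 2

        # A 5 gram mass moved +/-0.5 mm at 100 Hz yields roughly 1 N of peak force.
        print(round(peak_inertial_force(0.005, 0.0005, 100.0), 2), "N")
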
  • a linear voice coil actuator as described for FIG. 5 can be used, in which an inertial mass is coupled to the linear-moving portion of the voice coil actuator.
  • Other actuators can also be used, such as solenoids, pager motors, moving magnet actuators, E-core actuators, etc.
  • Many actuators used for inertial haptic sensations are described in U.S. Pat. No. 6,211,861.
  • a rotary actuator can be used, where the rotary output force is converted to a linear force approximately along the z-axis.
  • the rotary force can be converted using a flexure, as described in U.S. Pat. No. 6,697,043.
  • the direction or degree of freedom that the force is applied on the touchpad with respect to the inertial mass is important. If a significant component of the force is applied in the planar workspace of the touchpad (i.e., along the X or Y axis) with respect to the inertial mass, a short pulse or vibration can interfere with the user's object motion in one or both of those planar degrees of freedom and thereby impair the user's ability to accurately guide a controlled graphical object, such as a cursor, to a given target. Since a primary function of the touchpad is accurate targeting, a tactile sensation that distorts or impairs targeting, even mildly, is undesirable.
  • the touchpad device applies inertial forces substantially along the Z axis, orthogonal to the planar X and Y axes of the touchpad surface.
  • tactile sensations can be applied at a perceptually strong level for the user without impairing the ability to accurately position a user controlled graphical object in the X and Y axes of the screen.
  • because the tactile sensations are directed in a third degree of freedom relative to the two-dimensional planar workspace and display screen, jolts or pulses output along the Z axis feel much more like three-dimensional bumps or divots to the user that come "out of" or go "into" the screen, increasing the realism of the tactile sensations and creating a more compelling interaction.
  • an upwardly-directed pulse that is output when the cursor is moved over a window border creates the illusion that the user is moving a finger or other object “over” a bump at the window border.
  • FIG. 7 is a top elevational view of the touchpad 16 .
  • Touchpad 16 can in some embodiments be used simply as a positioning device, where the entire area of the pad provides cursor control. In other embodiments, different regions of the pad can be designated for different functions. In some of these region embodiments, each region can be provided with an actuator located under the region, while other region embodiments may use a single actuator that imparts forces on the entire pad 16 . In the embodiment shown, a central cursor control region 70 is used to position the cursor.
  • the cursor control region 70 of the pad 16 can cause forces to be output on the pad based on interactions of the controlled cursor with the graphical environment and/or events in that environment.
  • the user moves a finger or other object within region 70 to correspondingly move the cursor 20 .
  • Forces are preferably associated with the interactions of the cursor with displayed graphical objects.
  • a jolt or “pulse” sensation can be output, which is a single impulse of force that quickly rises to the desired magnitude and then is turned off or quickly decays back to zero or small magnitude.
  • the touchpad 16 can be jolted in the z-axis to provide the pulse.
  • a vibration sensation can also be output, which is a time-varying force that is typically periodic. The vibration can cause the touchpad 16 or portions thereof to oscillate back and forth on the z axis, and can be output by a host or local microprocessor to simulate a particular effect that is occurring in a host application.
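  • For illustration, a minimal sketch in Python of the pulse waveform just described, which rises to the commanded magnitude and then decays back toward zero (a vibration would instead be a periodic waveform, such as a sine, output continuously); the decay constant and sample rate are illustrative assumptions:

        import math

        def pulse_waveform(magnitude=1.0, decay_s=0.02, sample_rate_hz=1000):
            """Single impulse of force: starts at the commanded magnitude and
            decays toward zero, jolting the pad once along the z-axis."""
            n = int(decay_s * sample_rate_hz)
            return [magnitude * math.exp(-5.0 * i / n) for i in range(n)]

        samples = pulse_waveform(magnitude=0.8)   # 20 samples decaying from 0.8
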
  • Another type of force sensation that can be output on the touchpad 16 is a texture force.
  • This type of force is similar to a pulse force, but depends on the position of the user's finger on the area of the touchpad and/or on the location of the cursor in the graphical environment.
  • texture bumps are output depending on whether the cursor has moved over a location of a bump in a graphical object.
  • This type of force is spatially-dependent, i.e. a force is output depending on the location of the cursor as it moves over a designated textured area; when the cursor is positioned between “bumps” of the texture, no force is output, and when the cursor moves over a bump, a force is output.
  • a separate touchpad microprocessor can be dedicated for haptic feedback with the touchpad, and the texture effect can be achieved using local control (e.g., the host sends a high level command with texture parameters and the sensation is directly controlled by the touchpad processor).
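  • For illustration, a minimal sketch in Python of such local control: the host sends a high-level texture command with parameters, and the pad's processor decides spatially whether the current position lies over a bump (the names and values here are assumptions for the example):

        from dataclasses import dataclass

        @dataclass
        class TextureParams:
            """High-level texture command a host might send to the pad processor."""
            spacing_px: int      # distance between bumps in the textured area
            bump_width_px: int   # width of each bump
            magnitude: float     # pulse strength to output while over a bump

        def on_bump(cursor_x, params):
            """Spatially dependent check: force is output only while over a bump."""
            return (cursor_x % params.spacing_px) < params.bump_width_px

        params = TextureParams(spacing_px=20, bump_width_px=4, magnitude=0.6)
        print([x for x in (0, 3, 10, 21, 25) if on_bump(x, params)])   # [0, 3, 21]
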
  • a texture can be performed by presenting a vibration to a user, the vibration being dependent upon the current velocity of the user's finger (or other object) on the touchpad. When the finger is stationary, the vibration is deactivated; as the finger is moved faster, the frequency and magnitude of the vibration is increased.
  • This sensation can be controlled locally by the touchpad processor (if present), or be controlled by the host. Local control by the pad processor may eliminate communication burden in some embodiments. Other spatial force sensations can also be output. In addition, any of the described force sensations herein can be output simultaneously or otherwise combined as desired.
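  • For illustration, a minimal sketch in Python of the velocity-dependent variant: no vibration while the finger is stationary, with frequency and magnitude growing with finger speed up to stated maxima (the scaling constants are assumptions):

        def velocity_texture(speed_mm_s, max_speed_mm_s=200.0,
                             max_frequency_hz=250.0, max_magnitude=0.8):
            """Return (frequency_hz, magnitude) for a velocity-dependent texture."""
            if speed_mm_s <= 0.0:
                return 0.0, 0.0
            scale = min(speed_mm_s / max_speed_mm_s, 1.0)
            return max_frequency_hz * scale, max_magnitude * scale

        print(velocity_texture(0.0))     # (0.0, 0.0)   finger at rest
        print(velocity_texture(100.0))   # (125.0, 0.4) moderate speed
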
  • Tactile sensations can be output on the touchpad 16 based on interaction between a cursor and a window.
  • a z-axis “bump” or pulse can be output on the touchpad to signal the user of the location of the cursor when the cursor is moved over a border of a window.
  • a texture force sensation can be output.
  • the texture can be a series of bumps that are spatially arranged within the area of the window in a predefined pattern; when the cursor moves over a designated bump area, a bump force is output on the touchpad.
  • a pulse or bump force can be output when the cursor is moved over a selectable object, such as a link in a displayed web page or an icon.
  • a vibration can also be output to signify a graphical object which the cursor is currently positioned over.
  • features of a document displayed in a window can also be associated with force sensations. For example, a pulse can be output on the touchpad when a page break in a document is scrolled past a particular area of the window; line breaks can similarly be associated with force sensations such as bumps or vibrations.
  • menu items in a displayed menu can be selected by the user after a menu heading or graphical button is selected.
  • the individual menu items in the menu can be associated with forces. For example, vertical (z-axis) bumps or pulses can be output when the cursor is moved over the border between menu items.
  • the sensations for certain menu choices can be stronger than others to indicate importance or frequency of use, i.e., the most used menu choices can be associated with higher-magnitude (stronger) pulses than the less used menu choices.
  • currently-disabled menu choices can have a weaker pulse, or no pulse, to indicate that the menu choice is not enabled at that time.
  • pulse sensations can be sent when a sub-menu is displayed. This can be very useful because users may not expect a sub-menu to be displayed when moving a cursor on a menu element.
  • Icons can be associated with textures, pulses, and vibrations similarly to the windows described above.
  • Drawing or CAD programs also have many features which can be associated with similar haptic sensations, such as displayed (or invisible) grid lines or dots, control points of a drawn object, etc.
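  • For illustration, a minimal sketch in Python of the host-side check implied by these examples: when the cursor crosses a window border (or a menu item boundary, icon edge, etc.), a pulse command is issued to the touchpad; the rectangle layout and the send_pulse hook are hypothetical:

        from dataclasses import dataclass

        @dataclass
        class Rect:
            left: int
            top: int
            right: int
            bottom: int

            def contains(self, x, y):
                return self.left <= x <= self.right and self.top <= y <= self.bottom

        def border_crossing_pulse(prev_xy, cur_xy, window, send_pulse):
            """Output a z-axis pulse when the cursor crosses the window border."""
            if window.contains(*prev_xy) != window.contains(*cur_xy):
                send_pulse(magnitude=0.5)   # feels like moving "over" a bump

        border_crossing_pulse((5, 5), (15, 15), Rect(10, 10, 100, 100),
                              lambda magnitude: print("pulse", magnitude))
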
  • a vibration can be output on the device to indicate that scrolling is in progress.
  • a pulse can be output to indicate that the end of the range has been reached.
  • Pulse sensations can be used to indicate the location of the “ticks” for discrete values or settings in the adjusted range.
  • a pulse can also be output to inform the user when the center of the range is reached.
  • Different strength pulses can also be used, larger strength indicating the more important ticks.
  • strength and/or frequency of a vibration can be correlated with the adjustment of a control to indicate current magnitude of the volume or other adjusted value.
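  • For illustration, a minimal sketch in Python tying sensations to an adjusted value: a pulse at each tick crossed, a stronger pulse at the center of the range, and a vibration magnitude proportional to the current value (the range, tick spacing, and pulse strengths are illustrative assumptions):

        def value_feedback(prev_value, value, v_min=0, v_max=100, tick=10):
            """Return (pulse_strength, vibration_magnitude) for an adjustment."""
            pulse = 0.0
            if int(prev_value // tick) != int(value // tick):
                pulse = 0.4                          # crossed a tick
            midpoint = (v_min + v_max) / 2
            if min(prev_value, value) < midpoint <= max(prev_value, value):
                pulse = 0.8                          # stronger pulse at the center
            vibration = (value - v_min) / (v_max - v_min)
            return pulse, vibration

        print(value_feedback(48, 52))   # (0.8, 0.52)  midpoint crossed
        print(value_feedback(12, 13))   # (0.0, 0.13)  no tick crossed
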
  • a vibration sensation can be used to indicate that a control function is active.
  • a user may perform a function, such as selecting, cutting, or pasting a document, where there is a delay between the button press that commands the function and the execution of the function, due to processing delays or other delays.
  • a pulse sensation can be used to indicate that the function (the cut or paste) has been executed.
  • the magnitude of output forces on the touchpad can depend on the event or interaction in the graphical environment.
  • the force pulse can be a different magnitude of force depending on the type of graphical object encountered by the cursor. For example, pulses of higher magnitude can be output when the cursor moves over windows, while pulses of lower magnitude can be output when the cursor moves over icons.
  • the magnitude of the pulses can also depend on other characteristics of graphical objects, such as an active window as distinguished from a background window, file folder icons of different priorities designated by the user, icons for games as distinguished from icons for business applications, different menu items in a drop-down menu, etc.
  • the user or developer can also preferably associate particular graphical objects with customized haptic sensations.
  • User-independent events can also be relayed to the user using haptic sensations on the touchpad.
  • An event occurring within the graphical environment such as an appointment reminder, receipt of email, explosion in a game, etc., can be signified using a vibration, pulse, or other time-based force.
  • the force sensation can be varied to signify different events of the same type.
  • vibrations of different frequency can each be used to differentiate different events or different characteristics of events, such as particular users sending email, the priority of an event, or the initiation or conclusion of particular tasks (e.g. the downloading of a document or data over a network).
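  • For illustration, a minimal sketch in Python of mapping user-independent events to distinguishable vibrations; the event names and the frequency, magnitude, and duration values are hypothetical:

        # Hypothetical mapping of events to (frequency_hz, magnitude, duration_s).
        EVENT_EFFECTS = {
            "email_received":       (120.0, 0.4, 0.15),
            "appointment_reminder": (60.0, 0.6, 0.40),
            "download_complete":    (200.0, 0.3, 0.10),
        }

        def notify(event, play_vibration):
            effect = EVENT_EFFECTS.get(event)
            if effect is not None:
                play_vibration(*effect)

        notify("email_received",
               lambda f, m, d: print(f"vibrate {f} Hz, magnitude {m}, {d} s"))
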
  • a software designer may want to allow a user to be able to select options or a software function by positioning a cursor over an area on the screen using the touchpad, but not require pressing a physical button or tapping the touchpad to actually select the option.
  • a pulse sent to the touchpad can act as that physical confirmation without the user having to press a button or other control for selection.
  • a user can position a cursor over a web page element, and once the cursor is within the desired region for a given period of time, an associated function can be executed. This is indicated to the user through a tactile pulse sent to the pad 16 .
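  • For illustration, a minimal sketch in Python of such dwell-based selection: the function executes once the cursor has rested inside the region long enough, and a tactile pulse confirms the selection in place of a button press (the dwell time, polling period, and callback names are assumptions):

        import time

        def dwell_select(region_contains, get_cursor, send_pulse, execute,
                         dwell_s=0.8, poll_s=0.05):
            """Poll the cursor; execute and pulse after it dwells in the region."""
            entered = None
            while True:
                if not region_contains(*get_cursor()):
                    entered = None                       # left the region; reset
                elif entered is None:
                    entered = time.monotonic()           # just entered the region
                elif time.monotonic() - entered >= dwell_s:
                    send_pulse(magnitude=0.5)            # tactile confirmation
                    execute()
                    return
                time.sleep(poll_s)
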
  • a vibration can be output when a user-controlled racing car is driving on a dirt shoulder of a displayed road
  • a pulse can be output when the car collides with another object
  • a varying-frequency vibration can be output when a vehicle engine starts and rumbles.
  • the magnitude of pulses can be based on the severity of a collision or explosion, the size of the controlled graphical object or entity (and/or the size of a different graphical object/entity that is interacted with), etc.
  • Force sensations can also be output based on user-independent events in the game or simulation, such as pulses when bullets are fired at the user's character.
  • haptic sensations can be similar to those described in U.S. Pat. No. 6,243,078 and U.S. Pat. No. 6,211,861.
  • Other control devices or grips that can include a touchpad 16 in its housing include a gamepad, mouse or trackball device for manipulating a cursor or other graphical objects in a computer-generated environment; or a pressure sphere or the like.
  • the touchpad 16 can be provided on the housing of a computer mouse to provide additional input to the host computer.
  • selective disturbance filtering of forces as described in U.S. Pat. No. 6,020,876, and shaping of force signals to drive the touchpad with impulse waves as described in U.S. Pat. No. 5,959,613, can be used.
  • Such impulses are also effective when driven with stored power in a battery on the computer 10 or from a bus such as USB connected to a host computer.
  • the touchpad 16 can also be provided with different control regions that provide separate input from the main cursor control region 70 .
  • the different regions can be physically marked with lines, borders, or textures on the surface of the pad 16 (and/or sounds from the computer 10 ) so that the user can visually, audibly, and/or tactilely tell which region he or she is contacting on the pad.
  • scroll or rate control regions 62 a and 62 b can be used to provide input to perform a rate control task, such as scrolling documents, adjusting a value (such as audio volume, speaker balance, monitor display brightness, etc.), or panning/tilting the view in a game or virtual reality simulation.
  • Region 62 a can be used by placing a finger (or other object) within the region, where the upper portion of the region will increase the value, scroll up, etc., and the lower portion of the region will decrease the value, scroll down, etc.
  • the amount of pressure can directly control the rate of adjustment; e.g., a greater pressure will cause a document to scroll faster.
  • the region 62 b can similarly be used for horizontal (left/right) scrolling or rate control adjustment of a different value, view, etc.
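  • For illustration, a minimal sketch in Python of such a rate-control region: the half of the region being touched sets the scroll direction, and the applied pressure sets the scroll speed (the maximum rate and region size are illustrative assumptions):

        def scroll_rate(region_y, region_height, pressure, max_lines_per_s=40.0):
            """Rate control for a vertical scroll region: the upper half scrolls
            up, the lower half scrolls down, and pressure (0.0-1.0) sets speed."""
            direction = 1 if region_y < region_height / 2 else -1
            return direction * pressure * max_lines_per_s

        print(scroll_rate(region_y=10, region_height=100, pressure=0.25))  # 10.0
        print(scroll_rate(region_y=90, region_height=100, pressure=0.8))   # -32.0
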
  • Particular haptic effects can be associated with the control regions 62 a and 62 b.
  • a vibration of a particular frequency can be output on the pad 16 .
  • an actuator placed directly under the region 62 a or 62 b can be activated to provide a more localized tactile sensation for the “active” (currently used) region.
  • pulses can be output on the pad (or region of the pad) to indicate when a page has scrolled by, a particular value has passed, etc.
  • a vibration can also be continually output while the user contacts the region 62 a or 62 b.
  • regions 64 can also be positioned on the touchpad 16 .
  • each of regions 64 provides a small rectangular area, like a button, which the user can point to in order to initiate a function associated with the pointed-to region.
  • the regions 64 can initiate such computer functions as running a program, opening or closing a window, going “forward” or “back” in a queue of web pages in a web browser, powering the computer 10 or initiating a “sleep” mode, checking mail, firing a gun in a game, cutting or pasting data from a buffer, selecting a font, etc.
  • the regions 64 can duplicate functions and buttons provided in an application program or provide new, different functions.
  • the regions 64 can each be associated with haptic sensations; for example, a region 64 can provide a pulse sensation when it has been selected by the user, providing instant feedback that the function has been selected. Furthermore, the same types of regions can be associated with similar-feeling haptic sensations. For example, each word processor related region 64 can, when pointed to, cause a pulse of a particular strength, while each game-related region can provide a pulse of different strength or a vibration. Furthermore, when the user moves the pointing object from one region 62 or 64 to another, a haptic sensation (such as a pulse) can be output on the pad 16 to signify that a region border has been crossed.
  • the regions are preferably programmable in size and shape as well as in the function with which they are associated.
  • the functions for regions 64 can change based on an active application program in the graphical environment and/or based on user preferences input to and/or stored on the computer 10 .
  • the size and location of each of the regions can be adjusted by the user or by an application program, and any or all of the regions can be completely removed if desired.
  • the user is preferably able to assign particular haptic sensations to particular areas or types of areas based on types of functions associated with those areas, as desired. Different haptic sensations can be designed in a tool such as Immersion Studio™ available from Immersion Corporation of San Jose, Calif.
  • the regions 62 and 64 need not be physical regions of the touchpad 16 . That is, the entire touchpad 16 surface need merely provide coordinates of user contact to the processor of the computer and software on the computer can designate where different regions are located.
  • the computer can interpret the coordinates and, based on the location of the user contact, can interpret the touchpad input signal as a cursor control signal or a different type of signal, such as rate control, button function, etc.
  • the local touchpad microprocessor, if present, may alternatively interpret the function associated with the user contact location and report an appropriate signal or data to the host processor (such as position coordinates or a button signal), thus keeping the host processor unaware of the lower level processing.
  • the touchpad 16 can be physically designed to output different signals to the computer based on different regions marked on the touchpad surface that are contacted by the user; e.g. each region can be sensed by a different sensor or sensor array.
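  • For illustration, a minimal sketch in Python of the software-defined approach described above, in which the pad reports raw coordinates and software decides which region, and therefore which kind of signal, the contact represents (the region layout and names are hypothetical):

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Region:
            name: str
            kind: str                               # "cursor", "rate", or "button"
            contains: Callable[[int, int], bool]

        REGIONS = [
            Region("cursor_area", "cursor", lambda x, y: y < 60),
            Region("scroll_strip", "rate",  lambda x, y: y >= 60 and x < 80),
            Region("mail_button", "button", lambda x, y: y >= 60 and x >= 80),
        ]

        def interpret_contact(x, y):
            """Translate a raw contact coordinate into the signal type to report."""
            for region in REGIONS:
                if region.contains(x, y):
                    return region.kind, region.name, (x, y)
            return "cursor", "cursor_area", (x, y)

        print(interpret_contact(30, 20))   # ('cursor', 'cursor_area', (30, 20))
        print(interpret_contact(90, 70))   # ('button', 'mail_button', (90, 70))
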
  • FIGS. 8 a and 8 b are top plan and side cross-sectional views, respectively, of another computer device embodiment 80 including a form of the haptic touchpad 16 .
  • Device 80 is in the form of a portable computer device such as a "personal digital assistant" (PDA), a "pen-based" computer, an "electronic book", or similar device (collectively referred to herein as a "personal digital assistant" or PDA).
  • Those devices which allow a user to input information by touching a display screen or readout in some fashion are primarily relevant to this embodiment.
  • Such devices can include the Palm Pilot from 3Com Corp., the Newton from Apple Computer, pocket-sized computer devices from Casio, Hewlett-Packard, or other manufacturers, cellular phones or pagers having touch screens, etc.
  • a display screen 82 typically covers a large portion of the surface of the computer device 80 .
  • Screen 82 is preferably a flat-panel display as is well known to those skilled in the art and can display text, images, animations, etc.; in some embodiments screen 82 is as functional as any personal computer screen.
  • Display screen 82 is preferably a "touch screen" that includes sensors which allow the user to input information to the computer device 80 by physically contacting the screen 82 (i.e. it is another form of planar "touch device" similar to the touchpad 16 ).
  • a transparent sensor film can be overlaid on the screen 82 , where the film can detect pressure from an object contacting the film.
  • the sensor devices for implementing touch screens are well known to those skilled in the art.
  • the user can select displayed buttons or other graphical objects by pressing a finger or a stylus to the screen 82 at the exact location where the graphical object is displayed. Furthermore, some embodiments allow the user to "draw" or "write" on the screen by displaying graphical "ink" images 85 at locations where the user has pressed a tip of a stylus, finger, or other object.
  • Handwritten characters can be recognized by software running on the device microprocessor as commands, data, or other input. In other embodiments, the user can provide input additionally or alternatively through voice recognition, where a microphone on the device inputs the user's voice which is translated to appropriate commands or data by software running on the device.
  • Physical buttons 84 can also be included in the housing of the device 80 to provide particular commands to the device 80 when the buttons are pressed.
  • Many PDAs are characterized by the lack of a standard keyboard for character input from the user; rather, an alternative input mode is used, such as using a stylus to draw characters on the screen, voice recognition, etc.
  • some PDAs also include a fully-functional keyboard as well as a touch screen, where the keyboard is typically much smaller than a standard-sized keyboard.
  • standard-size laptop computers with standard keyboards may include flat-panel touch-input display screens, and such screens (similar to screen 12 of FIG. 1 ) can be provided with haptic feedback.
  • the touch screen 82 provides haptic feedback to the user similarly to the touchpad 16 described in previous embodiments.
  • One or more actuators 86 can be coupled to the underside of the touch screen 82 to provide haptic feedback such as pulses, vibrations, and textures; for example, an actuator 86 can be positioned near each corner of the screen 82 , as shown in FIG. 8 a . Other configurations of actuators can also be used.
  • the user can experience the haptic feedback through a finger or a held object such as a stylus 87 that is contacting the screen 82 .
  • the touch screen 82 is preferably coupled to the housing 88 of the device 80 by one or more spring or compliant elements 90 , such as helical springs, leaf springs, flexures, or compliant material (foam, rubber, etc.)
  • the compliant element allows the touch screen 82 to move approximately along the z-axis, thereby providing haptic feedback similarly to the touchpad embodiments described above.
  • Actuators 86 can be piezo-electric actuators, voice coil actuators, or any of the other types of actuators described above for the touchpad embodiments.
  • the actuators 86 are directly coupled to the touch screen 82 similarly to the touchpad embodiment of FIG. 3 ; alternatively, an inertial mass can be moved to provide inertial feedback in the z-axis of the touch screen, similarly to the touchpad embodiment of FIG. 6 .
  • Other features described above for the touchpad are equally applicable to the touch screen embodiment 80 .
  • in the touch input devices described herein (touchpad and touch screen), contact of the user is detected by the touch input device. Since haptic feedback need only be output when the user is contacting the touch device, this detection allows haptic feedback to be stopped (actuators "turned off") when no objects are contacting the touch input device. This feature can conserve battery power for portable devices. If a local touch device microprocessor (or similar circuitry) is being used in the computer, such a microprocessor can turn off actuator output when no user contact is sensed, thus alleviating the host processor of additional computational burden.
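  • For illustration, a minimal sketch in Python of gating actuator power on sensed contact, so that haptic output is enabled only while something is touching the pad or screen (the callback names are assumptions):

        def update_actuator_power(contact_sensed, actuators_enabled, enable, disable):
            """Enable actuators only while contact is sensed, to save battery power."""
            if contact_sensed and not actuators_enabled:
                enable()
                return True
            if not contact_sensed and actuators_enabled:
                disable()
                return False
            return actuators_enabled

        state = update_actuator_power(False, True,
                                      enable=lambda: print("actuators on"),
                                      disable=lambda: print("actuators off"))
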

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A haptic feedback planar touch control used to provide input to a computer. A touch input device includes a planar touch surface that inputs a position signal to a processor of the computer based on a location of user contact on the touch surface. The computer can position a cursor in a displayed graphical environment based at least in part on the position signal, or perform a different function. At least one actuator is also coupled to the touch input device and outputs a force to provide a haptic sensation to the user contacting the touch surface. The touch input device can be a touchpad separate from the computer's display screen, or can be a touch screen. Output haptic sensations on the touch input device can include pulses, vibrations, and spatial textures. The touch input device can include multiple different regions to control different computer functions.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 11/405,811 entitled “Haptic Feedback for Touchpads and Other Touch Controls,” filed Apr. 17, 2006, which is a continuation of U.S. Pat. No. 7,148,875, entitled “Haptic Feedback for Touchpads and Other Touch Controls,” issued Dec. 12, 2006, which is a continuation of U.S. Pat. No. 6,429,846, entitled “Haptic Feedback for Touchpads and Other Touch Controls,” issued Aug. 6, 2002, which is a continuation-in-part of U.S. Pat. No. 6,563,487, entitled “Haptic Feedback for Directional Control Pads,” issued May 13, 2003, which is a continuation-in-part of U.S. Pat. No. 6,243,078, entitled “Low Cost Force Feedback Pointing Device,” issued Jun. 5, 2001, which is a continuation-in-part of U.S. Pat. No. 6,184,868, entitled “Haptic Feedback Control Devices,” issued Feb. 6, 2001, which is a continuation-in-part of U.S. Pat. No. 6,088,019, entitled “Low Cost Force Feedback Device with Actuator for Non-Primary Axis,” issued Jul. 11, 2000.
  • BACKGROUND
  • The subject matter described relates generally to the interfacing of a user with computer and mechanical devices, and more particularly to devices used to interface with computer systems and electronic devices that provide haptic feedback to the user.
  • Humans interface with electronic and mechanical devices in a variety of applications, and the need for a more natural, easy-to-use, and informative interface is a constant concern. In this context, humans interface with computer devices for a variety of applications. One such application is interacting with computer-generated environments such as games, simulations, and application programs. Computer input devices such as mice and trackballs are often used to control a cursor within a graphical environment and provide input in these applications.
  • In some interface devices, force feedback or tactile feedback is also provided to the user, collectively known herein as “haptic feedback.” For example, haptic versions of joysticks, mice, gamepads, steering wheels, or other types of devices can output forces to the user based on events or interactions occurring within the graphical environment, such as in a game or other application program.
  • In portable computer or electronic devices, such as laptop computers, mice typically require too large a workspace to be practical. As a result, more compact devices such as trackballs are often used. A more popular device for portable computers is the "touchpad," a small rectangular, planar pad provided near the keyboard of the computer. The touchpad senses the location of a pointing object by any of a variety of sensing technologies, such as capacitive sensors or pressure sensors that detect pressure applied to the touchpad. The user contacts the touchpad most commonly with a fingertip and moves his or her finger on the pad to move a cursor displayed in the graphical environment. In other embodiments, the user can operate a stylus in conjunction with the touchpad by pressing the stylus tip on the touchpad and moving the stylus.
  • One problem with existing touchpads is that there is no haptic feedback provided to the user. The user of a touchpad is therefore not able to experience haptic sensations that assist and inform the user of targeting and other control tasks within the graphical environment. The touchpads of the prior art also cannot take advantage of existing haptic-enabled software run on the portable computer.
  • OVERVIEW
  • An embodiment is directed to a haptic feedback planar touch control used to provide input to a computer system. The control can be a touchpad provided on a portable computer, or can be a touch screen found on a variety of devices. The haptic sensations output on the touch control enhance interactions and manipulations in a displayed graphical environment or when controlling an electronic device.
  • More specifically, the embodiment relates to a haptic feedback touch control for inputting signals to a computer and for outputting forces to a user of the touch control. The control includes a touch input device including an approximately planar touch surface operative to input a position signal to a processor of said computer based on a location of user contact on the touch surface. The computer positions a cursor in a graphical environment displayed on a display device based at least in part on the position signal. At least one actuator is also coupled to the touch input device and outputs a force on the touch input device to provide a haptic sensation to the user contacting the touch surface. The actuator outputs the force based on force information output by the processor to the actuator.
  • The touch input device can be a touchpad separate from a display screen of the computer, or can be included in a display screen of the computer as a touch screen. The touch input device can be integrated in a housing of the computer or handheld device, or provided in a housing that is separate from the computer. The user contacts the touch surface with a finger, a stylus, or other object. The force is preferably a linear force output approximately perpendicularly to a plane of the touch surface of the touch input device, and the actuator can include a piezo-electric actuator, a voice coil actuator, a pager motor, a solenoid, or other type of actuator. In one embodiment, the actuator is coupled between the touch input device and a grounded surface. In another embodiment, the actuator is coupled to an inertial mass, wherein said actuator outputs an inertial force on the touch input device approximately along an axis perpendicular to the planar touch surface. A touch device microprocessor separate from the main processor of the computer can receive force information from the host computer and provide control signals based on the force information to control the actuator.
  • The haptic sensations, such as a pulse, vibration, or spatial texture, are preferably output in accordance with an interaction of a controlled cursor with a graphical object in the graphical environment. For example, a pulse can be output when the cursor is moved between menu elements in a menu, moved over an icon, or moved over a hyperlink. The touch input device can include multiple different regions, where at least one of the regions provides the position signal and at least one other region provides a signal that is used by the computer to control a different function, such as a rate control function of a value or a button press. Different regions and borders between regions can be associated with different haptic sensations.
  • An embodiment advantageously provides haptic feedback to a planar touch control device of a computer, such as a touchpad or touch screen. The haptic feedback can assist and inform the user of interactions and events within a graphical user interface or other environment and ease cursor targeting tasks. Furthermore, an embodiment allows portable computer devices having such touch controls to take advantage of existing haptic feedback enabled software. The haptic touch devices disclosed herein are also inexpensive, compact and consume low power, allowing them to be easily incorporated into a wide variety of portable and desktop computers and electronic devices.
  • These and other advantages will become apparent to those skilled in the art upon a reading of the following specification and a study of the several figures of the drawing.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of a haptic touchpad;
  • FIG. 2 is a perspective view of a remote control device including the touchpad;
  • FIG. 3 is a perspective view of a first embodiment of the touchpad including one or more actuators coupled to the underside of the touchpad;
  • FIG. 4 is a side elevational view of a first embodiment in which a piezo-electric actuator is directly coupled to the touchpad;
  • FIG. 5 is a side elevational view of a second embodiment of the touchpad including a linear actuator;
  • FIG. 6 is a side elevational view of a third embodiment of the touchpad having an inertial mass;
  • FIG. 7 is a top plan view of an example of a touchpad having different control regions; and
  • FIGS. 8 a and 8 b are top plan and side cross sectional views, respectively, of a touch screen embodiment.
  • DETAILED DESCRIPTION
  • FIG. 1 is a perspective view of a portable computer 10 including a haptic touchpad. Computer 10 is preferably a portable or “laptop” computer that can be carried or otherwise transported by the user and may be powered by batteries or other portable energy source in addition to other more stationary power sources. Computer 10 preferably runs one or more host application programs with which a user is interacting via peripherals.
  • Computer 10 may include the various input and output devices as shown, including a display device 12 for outputting graphical images to the user, a keyboard 14 for providing character or toggle input from the user to the computer, and a touchpad 16. Display device 12 can be any of a variety of types of display devices; flat-panel displays are most common on portable computers. Display device 12 can display a graphical environment 18 based on application programs and/or operating systems that are running, such as a graphical user interface (GUI), that can include a cursor 20 that can be moved by user input, as well as windows 22, icons 24, and other graphical objects well known in GUI environments. Other devices may also be incorporated or coupled to the computer 10, such as storage devices (hard disk drive, DVD-ROM drive, etc.), network server or clients, game controllers, etc. In alternate embodiments, the computer 10 can take a wide variety of forms, including computing devices that rest on a tabletop or other surface, stand-up arcade game machines, other portable devices or devices worn on the person, handheld or used with a single hand of the user, etc. For example, host computer 10 can be a video game console, personal computer, workstation, a television “set top box” or a “network computer”, or other computing or electronic device.
  • Touchpad device 16 preferably appears externally to be similar to the touchpads of the prior art. Pad 16 includes a planar, rectangular smooth surface that can be positioned below the keyboard 14 on the housing of the computer 10, as shown, or may be positioned at other areas of the housing. When the user operates the computer 10, the user may conveniently place a fingertip or other object on the touchpad 16 and move the fingertip to correspondingly move cursor 20 in the graphical environment 18.
  • In operation, the touchpad 16 inputs coordinate data to the main microprocessor(s) of the computer 10 based on the sensed location of an object on (or near) the touchpad. As with many touchpads of the prior art, touchpad 16 can be capacitive, resistive, or use a different type of sensing. Some existing touchpad embodiments are disclosed, for example, in U.S. Pat. No. 5,521,336 and U.S. Pat. No. 5,943,044. Capacitive touchpads typically sense the location of an object on or near the surface of the touchpad based on capacitive coupling between capacitors in the touchpad and the object. Resistive touchpads are typically pressure-sensitive, detecting the pressure of a finger, stylus, or other object against the pad, where the pressure causes conductive layers, traces, switches, etc. in the pad to electrically connect. Some resistive or other types of touchpads can detect the amount of pressure applied by the user and can use the degree of pressure for proportional or variable input to the computer 10. Resistive touchpads typically are at least partially deformable, so that when a pressure is applied to a particular location, the conductors at that location are brought into electrical contact. Such deformability can be useful since it can potentially amplify the magnitude of output forces such as pulses or vibrations on the touchpad. Forces can be amplified if a tuned compliant suspension is provided between an actuator and the object that is moved, as described in U.S. Pat. No. 6,680,729. Capacitive touchpads and other types of touchpads that do not require significant contact pressure may be better suited in many embodiments, since excessive pressure on the touchpad may in some cases interfere with the motion of the touchpad for haptic feedback. Other types of sensing technologies can also be used in the touchpad. Herein, the term “touchpad” preferably includes the surface of the touchpad 16 as well as any sensing apparatus included in the touchpad unit.
  • Touchpad 16 preferably operates similarly to existing touchpads, where the speed of the fingertip on the touchpad correlates to the distance that the cursor is moved in the graphical environment. For example, if the user moves his or her finger quickly across the pad, the cursor is moved a greater distance than if the user moves the fingertip more slowly. If the user's finger reaches the edge of the touchpad before the cursor reaches a desired destination in that direction, then the user can simply move his or her finger off the touchpad, reposition the finger away from the edge, and continue moving the cursor. This is an “indexing” function similar to lifting a mouse off a surface to change the offset between mouse position and cursor. Furthermore, many touchpads can be provided with particular regions that are each assigned to particular functions that can be unrelated to cursor positioning. Such an embodiment is described in greater detail below with respect to FIG. 7. In some embodiments the touchpad 16 may also allow a user to “tap” the touchpad (rapidly touch and remove the object from the pad) in a particular location to provide a command. For example, the user can tap or “double tap” the pad with a finger while the controlled cursor is over an icon to select that icon.
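  • For illustration, a minimal sketch in Python of the speed-dependent mapping just described, in which quick strokes move the cursor farther than slow ones; the gain constants are illustrative and not taken from any particular touchpad driver:

        def cursor_delta(finger_dx, finger_dy, speed, base_gain=1.0, accel=0.02):
            """Scale finger motion into cursor motion with a speed-dependent gain."""
            gain = base_gain + accel * speed
            return finger_dx * gain, finger_dy * gain

        print(cursor_delta(5, 0, speed=10))    # (6.0, 0.0)   slow stroke
        print(cursor_delta(5, 0, speed=200))   # (25.0, 0.0)  fast stroke
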
  • The touchpad 16 is provided with the ability to output haptic feedback such as tactile sensations to the user who is physically contacting the touchpad 16. Various embodiments detailing the structure of the haptic feedback touchpad are described in greater detail below. Preferably, the forces output on the touchpad are linear (or approximately linear) and oriented along the z-axis, approximately perpendicular to the surface of the touchpad 16 and the top surface of computer 10. In a different embodiment, forces can be applied to the touchpad 16 to cause side-to-side (e.g., x-y) motion of the pad in the plane of its surface in addition to or instead of z-axis motion, although such motion is not preferred.
  • Using one or more actuators coupled to the touchpad 16, a variety of haptic sensations can be output to the user who is contacting the pad. For example, jolts, vibrations (varying or constant amplitude), and textures can be output. Forces output on the pad can be at least in part based on the location of the finger on the pad or the state of a controlled object in the graphical environment of the host computer 10, and/or independent of finger position or object state. Such forces output on the touchpad 16 are considered “computer-controlled” since a microprocessor or other electronic controller is controlling the magnitude and/or direction of the force output of the actuator(s) using electronic signals. Preferably, the entire pad 16 is provided with haptic sensations as a single unitary member; in other embodiments, individually-moving portions of the pad can each be provided with its own haptic feedback actuator and related transmissions so that haptic sensations can be provided for only a particular portion. For example, some embodiments may include a touchpad having different portions that may be flexed or otherwise moved with respect to other portions of the pad.
  • In other embodiments, the touchpad 16 can be provided in a separate housing that is connected to a port of the computer 10 via a cable or via wireless transmission and which receives force information from and sends position information to the computer 10. For example, Universal Serial Bus (USB), Firewire, or a standard serial bus can connect such a touchpad to the computer 10. In such an embodiment, the computer 10 can be any desktop or stationary computer or device and need not be a portable device.
  • One or more buttons 26 can also be provided on the housing of the computer 10 to be used in conjunction with the touchpad 16. The user's hands have easy access to the buttons, each of which may be pressed by the user to provide a distinct input signal to the host computer 10. Typically, each button 26 corresponds to a similar button found on a mouse input device, so that a left button can be used to select a graphical object (click or double click), a right button can bring up a context menu, etc. In some embodiments, one or more of the buttons 26 can be provided with tactile feedback as described in U.S. Pat. No. 6,184,868 and U.S. Pat. No. 6,563,487. Other features of these disclosures may also be used.
  • Furthermore, in some embodiments, one or more moveable portions 28 of the housing of the computer device 10 can be included which are contacted by the user when the user operates the touchpad 16 and which can provide haptic feedback. Having a moveable portion of a housing for haptic feedback is described in U.S. Pat. No. 6,184,868 and U.S. Pat. No. 6,088,019. Thus, both the housing can provide haptic feedback (e.g., through the use of an eccentric rotating mass on a motor coupled to the housing) and the touchpad 16 can provide separate haptic feedback. This allows the host to control two different tactile sensations simultaneously to the user; for example, a vibration of a low frequency can be conveyed through the housing to the user and a higher frequency vibration can be conveyed to the user through the touchpad 16. Each button or other control provided with haptic feedback can also provide tactile feedback independently from the other controls.
  • The host application program(s) and/or operating system preferably displays graphical images of the environment on display device 12. The software and environment running on the host computer 10 may be of a wide variety. For example, the host application program can be a word processor, spreadsheet, video or computer game, drawing program, operating system, graphical user interface, simulation, Web page or browser that implements HTML or VRML instructions, scientific analysis program, virtual reality training program or application, or other application program that utilizes input from the touchpad 16 and outputs force feedback commands to the touchpad 16. For example, many games and other application programs include force feedback functionality and may communicate with the touchpad 16 using a standard protocol/drivers such as I-Force®, FEELit®, or Touchsense™ available from Immersion Corporation of San Jose, Calif.
  • The touchpad 16 can include circuitry necessary to report control signals to the microprocessor of the host computer 10 and to process command signals from the host's microprocessor. For example, appropriate sensors (and related circuitry) are used to report the position of the user's finger on the touchpad 16. The touchpad device also includes circuitry that receives signals from the host and outputs tactile sensations in accordance with the host signals using one or more actuators. In some embodiments, a separate, local microprocessor can be provided for the touchpad 16 to both report touchpad sensor data to the host and/or to carry out force commands received from the host, such commands including, for example, the type of haptic sensation and parameters describing the commanded haptic sensation. Alternatively, the touchpad microprocessor can simply pass streamed data from the main processor to the actuators. The term "force information" can include both commands/parameters and streamed data. The touchpad microprocessor can implement haptic sensations independently after receiving a host command by controlling the touchpad actuators; or, the host processor can maintain a greater degree of control over the haptic sensations by controlling the actuators more directly. In other embodiments, logic circuitry such as state machines provided for the touchpad 16 can handle haptic sensations as directed by the host main processor. Architectures and control methods that can be used for reading sensor signals and providing haptic feedback for a device are described in greater detail in U.S. Pat. No. 5,734,373 and co-pending application nos. 60/156,354, 60/133,208, Ser. No. 09/376,649, U.S. Pat. No. 6,639,581 and 60/160,401.
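  • For illustration, a minimal sketch in Python of the two forms of "force information" described above: a high-level command that a local touchpad microprocessor can render on its own, versus low-level samples streamed from the host (the field names are assumptions for the example):

        from dataclasses import dataclass, field
        from typing import Dict, List

        @dataclass
        class ForceCommand:
            """High-level force information: an effect type plus its parameters."""
            effect: str                          # e.g. "pulse", "vibration", "texture"
            params: Dict[str, float] = field(default_factory=dict)

        @dataclass
        class StreamedForces:
            """Low-level force information: raw samples passed to the actuators."""
            samples: List[float] = field(default_factory=list)

        cmd = ForceCommand("vibration", {"frequency_hz": 150.0, "magnitude": 0.5})
        raw = StreamedForces([0.0, 0.3, 0.5, 0.3, 0.0])
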
  • FIG. 2 is a perspective view of another embodiment of a device which can include the active touchpad 16. The device can be a handheld remote control device 30, which the user grasps in one hand and manipulates controls to access the functions of an electronic device or appliance remotely by a user (such as a television, video cassette recorder or DVD player, audio/video receiver, Internet or network computer connected to a television, etc.). For example, several buttons 32 can be included on the remote control device 30 to manipulate functions of the controlled apparatus. A touchpad 16 can also be provided to allow the user to provide more sophisticated directional input. For example, a controlled apparatus may have a selection screen in which a cursor may be moved, and the touchpad 16 can be manipulated to control the cursor in two dimensions. The touchpad 16 includes the ability to output haptic sensations to the user as described herein, based on a controlled value or event. For example, a volume level passing a mid-point or reaching a maximum level can cause a pulse to be output to the touchpad and to the user.
  • In one application, the controlled apparatus can be a computer system such as Web-TV from Microsoft Corp. or other computing device which displays a graphical user interface and/or web pages accessed over a network such as the Internet. The user can control the direction of the cursor by moving a finger (or other object) on the touchpad 16. The cursor can be used to select and/or manipulate icons, windows, menu items, graphical buttons, slider bars, scroll bars, or other graphical objects in a graphical user interface or desktop interface. The cursor can also be used to select and/or manipulate graphical objects on a web page, such as links, images, buttons, etc. Other force sensations associated with graphical objects are described below with reference to FIG. 7.
  • FIG. 3 is a perspective view of a first embodiment 40 of a touchpad 16 for providing haptic feedback to the user. In this embodiment, one or more piezoelectric actuators 42 are coupled to the underside of the touchpad 16. The piezoelectric actuator 42 is driven by suitable electronics, as is well known to those skilled in the art. In one embodiment, a single piezoelectric actuator 42 is positioned at or near the center of the touchpad 16, or off to one side if space constraints of the housing require such a position. In other embodiments, multiple piezoelectric actuators 42 can be positioned at different areas of the touchpad; the dashed lines show one configuration, where an actuator 42 is placed at each corner of the pad 16 and at the center of the pad.
  • The piezoelectric actuators 42 can each output a small pulse, vibration, or texture sensation on the touchpad 16 and to the user if the user is contacting the touchpad. The entire touchpad 16 is preferably moved with the forces output by actuator(s) 42. Preferably, the forces output on the touchpad are linear (or approximately linear) and along the z-axis, approximately perpendicular to the surface of the touchpad 16 and the top surface of computer 10. In a different embodiment, as mentioned above, forces can be applied to the touchpad 16 to cause side-to-side (e.g., x-y) motion of the pad in the plane of its surface in addition to or instead of z-axis motion. For example, one linear actuator can provide motion for the x-axis, and a second linear actuator can provide motion for the y-axis and/or the x-axis.
  • The frequency of a vibration output by an actuator 42 can be varied by providing different control signals to an actuator 42. Furthermore, the magnitude of a pulse or vibration can be controlled based on the applied control signal. If multiple actuators 42 are provided, a stronger vibration can be imparted on the touchpad by activating two or more actuators simultaneously. Furthermore, if an actuator is positioned at an extreme end of the touchpad and is the only actuator that is activated, the user may experience a stronger vibration on the side of the touchpad having the actuator than on the opposite side of the touchpad. Different magnitudes and localized effects can be obtained by activating some but not all of the actuators. Since the tip of a user's finger that is touching the pad is fairly sensitive, the output forces do not have to be of a high magnitude for the haptic sensation to be effective and compelling.
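As one way of picturing the localized effects described above, the sketch below enables only the actuators nearest the contact point (using the four-corner-plus-center layout suggested for FIG. 3) and enables all of them for a stronger, pad-wide vibration. The coordinate layout, distance rule, and radius values are illustrative assumptions only.

```c
/* Minimal sketch: four corner actuators plus a center actuator, in
 * normalized pad coordinates.  Driving only the actuators nearest the
 * finger gives a localized effect; a large radius enables them all for a
 * stronger, global vibration. */
#include <math.h>
#include <stdio.h>

#define NUM_ACTUATORS 5
static const float act_x[NUM_ACTUATORS] = {0.f, 1.f, 0.f, 1.f, 0.5f};
static const float act_y[NUM_ACTUATORS] = {0.f, 0.f, 1.f, 1.f, 0.5f};

/* Enable every actuator within 'radius' (normalized units) of the contact
 * point at (fx, fy). */
void select_actuators(float fx, float fy, float radius,
                      int enabled[NUM_ACTUATORS])
{
    for (int i = 0; i < NUM_ACTUATORS; i++) {
        float dx = act_x[i] - fx, dy = act_y[i] - fy;
        enabled[i] = (sqrtf(dx * dx + dy * dy) <= radius);
    }
}

int main(void)
{
    int on[NUM_ACTUATORS];
    select_actuators(0.1f, 0.1f, 0.6f, on);   /* localized: near one corner */
    for (int i = 0; i < NUM_ACTUATORS; i++)
        printf("actuator %d: %s\n", i, on[i] ? "on" : "off");
    return 0;
}
```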
  • Besides using a finger to contact the touchpad, the user may also hold other objects that directly contact the touchpad. Any haptic sensations output on the pad can be transmitted through the held object to the user's hand. For example, the user can hold a stylus having a point that contacts the touchpad 16 more precisely than a finger. Other objects may also be used. In some embodiments, specialized objects can be used to enhance the haptic sensations. For example, a stylus or other object having a flexible portion or compliance may be able to magnify at least some of the touchpad haptic sensations as experienced by the user.
  • The piezoelectric actuators 42 have several advantages for the touchpad 16. These actuators can be made very thin and small, allowing their use in compact housings that are typical for portable electronic devices. They also require very low power, and are thus suitable for devices with limited power (e.g., powered by batteries). In some embodiments described herein, power for the actuators can be drawn off a bus connecting the computer to the touchpad (or touch screen). For example, if the touchpad 16 is provided in a separate housing, a Universal Serial Bus can connect the pad to the computer and provide power from the computer to the pad as well as data (e.g. streaming force data, force commands, etc.).
  • FIG. 4 is a side elevational view of the embodiment 40 of the touchpad 16 as shown in FIG. 3. Touchpad 16 is directly coupled to a grounded piezo-electric actuator 42 which operates to produce a force on the touchpad 16 when an electrical signal is input to the actuator. Typically, a piezo-electric actuator includes two layers which can move relative to each other when a current is applied to the actuator; here, the grounded portion of the actuator remains stationary with respect to the surrounding housing 41 while the moving portion of the actuator and the touchpad move with respect to the housing 41. The operation of piezo-electric actuators to output force based on an input electrical signal is well known to those skilled in the art.
  • The touchpad 16 can be coupled only to the actuator 42, or can be additionally coupled to the housing of the computer device at other locations besides the actuators 42. Preferably the other couplings are compliant connections, using a material or element such as a spring or foam. If such connections are not made compliant, then the touchpad 16 itself preferably has some compliance to allow portions of the pad to move in response to actuator forces and to convey the haptic sensations to the user more effectively.
  • Since the touchpad 16 is directly coupled to the actuator 42, any produced forces are directly applied to the touchpad 16. The electrical signal is preferably generated by a microprocessor, together with any circuitry required to convert the microprocessor signal into a signal appropriate for driving the actuator 42.
  • FIG. 5 is a side elevational view of another embodiment 50, in which the touchpad 16 is positioned on one or more springs 52. The springs 52 couple the touchpad 16 to the rigid housing of the computer 10 and allow the touchpad 16 to be moved along the z-axis 56. Only a very small range of motion is required to produce effective pulses (jolts) or vibrations on the pad 16. Stops (not shown) can be positioned to limit the travel of the touchpad 16 to a desired range along the z-axis.
  • An actuator 54 is also coupled to the touchpad 16 to impart forces on the touchpad and cause the touchpad 16 to move along the z-axis. In the present embodiment, actuator 54 is a linear voice coil actuator, where the moving portion (bobbin) of the actuator is directly coupled to the touchpad 16. The actuator 54 is grounded to the computer 10 housing and outputs a linear force on the touchpad 16 and thus drives the touchpad along the z-axis. A short pulse or jolt can be output, or the moving portion of the actuator can be oscillated to provide a vibration having a particular desired frequency. The springs 52 cause the touchpad 16 to return to a rest position after a force from the actuator causes the touchpad to move up or down. The springs can also provide a compliant suspension for the touchpad 16 and allow forces output by the actuator 54 to be amplified as explained above. Different types of spring elements can be used in other embodiments to couple the touchpad 16 to the rigid housing, such as leaf springs, foam, flexures, or other compliant materials.
  • In some embodiments, the user is able to push the touchpad 16 along the z-axis to provide additional input to the computer 10. For example, a sensor can be used to detect the position of the touchpad 16 along the z-axis, such as an optical sensor, magnetic sensor, Polhemus sensor, etc. The position on the z-axis can be used to provide proportional input to the computer, for example. In addition, other types of forces can be output along the z-axis, such as spring forces, damping forces, inertial forces, and other position-based forces, as disclosed in U.S. Pat. No. 6,563,487. In addition, 3-D elevations can be simulated in the graphical environment by moving the pad to different elevations along the z-axis. If the pad 16 can be used as an analog input depending on the distance the entire pad is moved along the z-axis, and/or if kinesthetic (force) feedback is applied in the z-axis degree of freedom, then a greater range of motion for the pad 16 along the z-axis is desirable. An elastomeric layer can be provided if the touchpad 16 is able to be pressed by the user to close a switch and provide button or switch input to the computer 10 (e.g. using contact switches, optical switches, or the like). If such z-axis movement of the pad 16 is allowed, it is preferred that the z-axis movement require a relatively large amount of force to move the pad at least initially, since such z-axis movement may not be desired during normal use of the pad by the user.
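The z-axis input described above can be read either as a proportional (analog) value over the pad's small travel or, past a deeper threshold, as a button/switch closure. The sketch below shows one such mapping; the travel range, threshold, and function names are assumptions for illustration, not values given in the patent.

```c
/* Sketch of using z-axis travel of the whole pad as an extra input: small
 * displacements give a proportional value, pressing past a threshold
 * registers as a switch closure. */
#include <stdbool.h>
#include <stdio.h>

#define Z_TRAVEL_MM  1.5f   /* assumed full mechanical travel */
#define Z_SWITCH_MM  1.2f   /* assumed switch-closure depth   */

float z_proportional(float z_mm)            /* returns 0.0 .. 1.0 */
{
    if (z_mm < 0.0f) z_mm = 0.0f;
    if (z_mm > Z_TRAVEL_MM) z_mm = Z_TRAVEL_MM;
    return z_mm / Z_TRAVEL_MM;
}

bool z_switch_closed(float z_mm)
{
    return z_mm >= Z_SWITCH_MM;
}

int main(void)
{
    const float samples[] = {0.2f, 0.8f, 1.3f};
    for (int i = 0; i < 3; i++)
        printf("z=%.1f mm  analog=%.2f  switch=%d\n",
               samples[i], z_proportional(samples[i]),
               z_switch_closed(samples[i]));
    return 0;
}
```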
  • The voice coil actuator 54 preferably includes a coil and a magnet, where a current is flowed through the coil and interacts with the magnetic field of the magnet to cause a force on the moving portion of the actuator (the coil or the magnet, depending on the implementation), as is well known to those skilled in the art and is described in U.S. Pat. No. 6,184,868. Other types of actuators can also be used, such as a standard speaker, an E-core type actuator (as described in U.S. Pat. No. 6,704,001), a solenoid, a pager motor, a DC motor, moving magnet actuator (described in provisional application No. 60/133,208 and U.S. Pat. No. 6,704,001), or other type of actuator. Furthermore, the actuator can be positioned to output linear motion along an axis perpendicular to the z-axis or along another direction different from the z-axis (rotary or linear), where a mechanism converts such output motion to linear motion along the z-axis as is well known to those skilled in the art.
  • The touchpad 16 can also be integrated with an elastomeric layer and/or a printed circuit board in a sub-assembly, where one or more actuators are coupled to the printed circuit board to provide tactile sensations to the touchpad 16. Helical springs can also be provided to engage electrical contacts. Or, multiple voice coil actuators can be positioned at different locations under the touchpad 16. These embodiments are described in U.S. Pat. No. 6,563,487. Any of the actuators described in that patent can also be used.
  • FIG. 6 is a side elevational view of a third embodiment 60 of the haptic touchpad 16. In this embodiment, the stationary portion of the actuator is coupled to the touchpad 16, and the moving portion of the actuator is coupled to an inertial mass to provide inertial haptic sensations.
  • Touchpad 16 can be compliantly mounted to the rigid housing of the computer device similarly to the embodiments described above. For example, one or more spring elements 62 can be coupled between the touchpad and the housing. These springs can be helical or leaf springs, a compliant material such as rubber or foam, flexures, etc.
  • One or more actuators 64 are coupled to the underside of the touchpad 16. In the embodiment of FIG. 6, a piezoelectric actuator is shown. One portion 66 of each actuator 64 is coupled to the touchpad 16, and the other portion 68 is coupled to a mass 70. Thus, when the portion 68 is moved relative to the portion 66, the mass 70 is moved with the portion 68. The mass 70 can be any suitable object of the desired weight, such as plastic or metal material. The mass 70 is moved approximately along the z-axis and is not coupled to the housing, allowing free motion. The motion of the mass 70 along the z-axis causes an inertial force that is transmitted through the actuator 64 to the touchpad 16, and the touchpad 16 moves along the z-axis due to the compliant coupling 62. The motion of the touchpad 16 is felt by the user contacting the touchpad 16 as a haptic sensation.
  • In different embodiments, other types of actuators can be used. For example, a linear voice coil actuator as described for FIG. 5 can be used, in which an inertial mass is coupled to the linear-moving portion of the voice coil actuator. Other actuators can also be used, such as solenoids, pager motors, moving magnet actuators, E-core actuators, etc. Many actuators used for inertial haptic sensations are described in U.S. Pat. No. 6,211,861. Furthermore, a rotary actuator can be used, where the rotary output force is converted to a linear force approximately along the z-axis. For example, the rotary force can be converted using a flexure, as described in U.S. Pat. No. 6,697,043.
  • In the preferred linear force implementation, the direction or degree of freedom that the force is applied on the touchpad with respect to the inertial mass is important. If a significant component of the force is applied in the planar workspace of the touchpad (i.e., along the X or Y axis) with respect to the inertial mass, a short pulse or vibration can interfere with the user's object motion in one or both of those planar degrees of freedom and thereby impair the user's ability to accurately guide a controlled graphical object, such as a cursor, to a given target. Since a primary function of the touchpad is accurate targeting, a tactile sensation that distorts or impairs targeting, even mildly, is undesirable. To solve this problem, the touchpad device applies inertial forces substantially along the Z axis, orthogonal to the planar X and Y axes of the touchpad surface. In such a configuration, tactile sensations can be applied at a perceptually strong level for the user without impairing the ability to accurately position a user controlled graphical object in the X and Y axes of the screen. Furthermore, since the tactile sensations are directed in a third degree of freedom relative to the two-dimensional planar workspace and display screen, jolts or pulses output along the Z axis feel much more like three-dimensional bumps or divots to the user that come “out” or go “into” the screen, increasing the realism of the tactile sensations and creating a more compelling interaction. For example, an upwardly-directed pulse that is output when the cursor is moved over a window border creates the illusion that the user is moving a finger or other object “over” a bump at the window border.
  • FIG. 7 is a top elevational view of the touchpad 16. Touchpad 16 can in some embodiments be used simply as a positioning device, where the entire area of the pad provides cursor control. In other embodiments, different regions of the pad can be designated for different functions. In some of these region embodiments, each region can be provided with an actuator located under the region, while other region embodiments may use a single actuator that imparts forces on the entire pad 16. In the embodiment shown, a central cursor control region 70 is used to position the cursor.
  • The cursor control region 70 of the pad 16 can cause forces to be output on the pad based on interactions of the controlled cursor with the graphical environment and/or events in that environment. The user moves a finger or other object within region 70 to correspondingly move the cursor 20. Forces are preferably associated with the interactions of the cursor with displayed graphical objects. For example, a jolt or “pulse” sensation can be output, which is a single impulse of force that quickly rises to the desired magnitude and then is turned off or quickly decays back to zero or small magnitude. The touchpad 16 can be jolted in the z-axis to provide the pulse. A vibration sensation can also be output, which is a time-varying force that is typically periodic. The vibration can cause the touchpad 16 or portions thereof to oscillate back and forth on the z axis, and can be output by a host or local microprocessor to simulate a particular effect that is occurring in a host application.
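The pulse and vibration sensations described above can be thought of as two simple open-loop force waveforms: an impulse that rises immediately and decays quickly, and a periodic oscillation. The sketch below evaluates both as functions of time; the decay constant, sample times, and sinusoidal shape are illustrative assumptions rather than waveforms specified by the patent.

```c
/* Sketch of the two waveforms: a "pulse" rises to the commanded magnitude
 * and decays quickly back toward zero; a "vibration" is periodic. */
#include <math.h>
#include <stdio.h>

static const double PI = 3.14159265358979323846;

double pulse_force(double t, double magnitude)
{
    const double decay_s = 0.02;            /* ~20 ms decay (assumed) */
    return (t < 0.0) ? 0.0 : magnitude * exp(-t / decay_s);
}

double vibration_force(double t, double magnitude, double freq_hz)
{
    return magnitude * sin(2.0 * PI * freq_hz * t);
}

int main(void)
{
    for (int i = 0; i < 5; i++) {
        double t = i * 0.005;               /* 5 ms steps */
        printf("t=%.3f  pulse=%.1f  vibration=%.1f\n",
               t, pulse_force(t, 100.0), vibration_force(t, 40.0, 50.0));
    }
    return 0;
}
```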
  • Another type of force sensation that can be output on the touchpad 16 is a texture force. This type of force is similar to a pulse force, but depends on the position of the user's finger on the area of the touchpad and/or on the location of the cursor in the graphical environment. Thus, texture bumps are output depending on whether the cursor has moved over a location of a bump in a graphical object. This type of force is spatially-dependent, i.e. a force is output depending on the location of the cursor as it moves over a designated textured area; when the cursor is positioned between “bumps” of the texture, no force is output, and when the cursor moves over a bump, a force is output. This can be achieved by host control (e.g., the host sends the pulse signals as the cursor is dragged over the grating). In some embodiments, a separate touchpad microprocessor can be dedicated to haptic feedback for the touchpad, and the texture effect can be achieved using local control (e.g., the host sends a high-level command with texture parameters and the sensation is directly controlled by the touchpad processor). In other cases a texture can be simulated by presenting a vibration to the user, the vibration being dependent upon the current velocity of the user's finger (or other object) on the touchpad. When the finger is stationary, the vibration is deactivated; as the finger is moved faster, the frequency and magnitude of the vibration are increased. This sensation can be controlled locally by the touchpad processor (if present), or by the host. Local control by the pad processor may reduce the communication burden in some embodiments. Other spatial force sensations can also be output. In addition, any of the described force sensations herein can be output simultaneously or otherwise combined as desired.
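Both texture approaches in the paragraph above lend themselves to a short sketch: the spatial form outputs a pulse whenever the cursor lands on a “bump” of a regular grating, while the velocity form scales vibration frequency and magnitude with finger speed and shuts off when the finger is stationary. The bump spacing, gains, and caps below are illustrative assumptions.

```c
/* Sketch of spatial and velocity-dependent texture effects. */
#include <stdbool.h>
#include <stdio.h>

/* Spatial texture: true when the cursor position falls on a bump of a
 * grating with the given spacing and bump width (pixels). */
bool on_texture_bump(int cursor_x, int spacing, int bump_width)
{
    return (cursor_x % spacing) < bump_width;
}

/* Velocity-dependent texture: vibration parameters derived from finger
 * speed; zero output when the finger is stationary. */
void velocity_texture(double speed_px_per_s, double *freq_hz, double *magnitude)
{
    if (speed_px_per_s <= 0.0) { *freq_hz = 0.0; *magnitude = 0.0; return; }
    *freq_hz   = speed_px_per_s * 0.25;             /* assumed gains */
    *magnitude = speed_px_per_s * 0.10;
    if (*freq_hz > 250.0)   *freq_hz = 250.0;       /* assumed caps  */
    if (*magnitude > 100.0) *magnitude = 100.0;
}

int main(void)
{
    for (int x = 0; x < 40; x += 8)
        printf("x=%d  on_bump=%d\n", x, on_texture_bump(x, 16, 4));
    double f, m;
    velocity_texture(400.0, &f, &m);
    printf("vibration: %.0f Hz, magnitude %.0f\n", f, m);
    return 0;
}
```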
  • Different types of graphical objects can be associated with tactile sensations. Tactile sensations can be output on the touchpad 16 based on interaction between a cursor and a window. For example, a z-axis “bump” or pulse can be output on the touchpad to signal the user of the location of the cursor when the cursor is moved over a border of a window. When the cursor is moved within the window's borders, a texture force sensation can be output. The texture can be a series of bumps that are spatially arranged within the area of the window in a predefined pattern; when the cursor moves over a designated bump area, a bump force is output on the touchpad. A pulse or bump force can be output when the cursor is moved over a selectable object, such as a link in a displayed web page or an icon. A vibration can also be output to signify a graphical object which the cursor is currently positioned over. Furthermore, features of a document displayed in a window can also be associated with force sensations. For example, a pulse can be output on the touchpad when a page break in a document is scrolled past a particular area of the window. Page breaks or line breaks in a document can similarly be associated with force sensations such as bumps or vibrations.
  • Furthermore, menu items in a displayed menu can be selected by the user after a menu heading or graphical button is selected. The individual menu items in the menu can be associated with forces. For example, vertical (z-axis) bumps or pulses can be output when the cursor is moved over the border between menu items. The sensations for certain menu choices can be stronger than others to indicate importance or frequency of use, i.e., the most used menu choices can be associated with higher-magnitude (stronger) pulses than the less used menu choices. Also, currently-disabled menu choices can have a weaker pulse, or no pulse, to indicate that the menu choice is not enabled at that time. Furthermore, when providing tiled menus in which a sub-menu is displayed after a particular menu element is selected, as in Microsoft Windows™, pulse sensations can be sent when a sub-menu is displayed. This can be very useful because users may not expect a sub-menu to be displayed when moving a cursor over a menu element. Icons can be associated with textures, pulses, and vibrations similarly to the windows described above. Drawing or CAD programs also have many features which can be associated with similar haptic sensations, such as displayed (or invisible) grid lines or dots, control points of a drawn object, etc.
  • In other related interactions, when a rate control or scrolling function is performed with the touchpad (through use of the cursor), a vibration can be displayed on the device to indicate that scrolling is in process. When reaching the end of a numerical range that is being adjusted (such as volume), a pulse can be output to indicate that the end of the range has been reached. Pulse sensations can be used to indicate the location of the “ticks” for discrete values or settings in the adjusted range. A pulse can also be output to inform the user when the center of the range is reached. Pulses of different strengths can also be used, with larger strengths indicating the more important ticks. In other instances, the strength and/or frequency of a vibration can be correlated with the adjustment of a control to indicate the current magnitude of the volume or other adjusted value. In other interactions, a vibration sensation can be used to indicate that a control function is active. Furthermore, in some cases a user performs a function, such as selecting, cutting, or pasting in a document, and there is a delay between the button press that commands the function and the execution of the function, due to processing delays or other delays. A pulse sensation can be used to indicate that the function (the cut or paste) has been executed.
  • Furthermore, the magnitude of output forces on the touchpad can depend on the event or interaction in the graphical environment. For example, the force pulse can be of a different magnitude depending on the type of graphical object encountered by the cursor. For example, pulses of higher magnitude can be output when the cursor moves over windows, while pulses of lower magnitude can be output when the cursor moves over icons. The magnitude of the pulses can also depend on other characteristics of graphical objects, such as an active window as distinguished from a background window, file folder icons of different priorities designated by the user, icons for games as distinguished from icons for business applications, different menu items in a drop-down menu, etc. The user or developer can also preferably associate particular graphical objects with customized haptic sensations.
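One simple way to realize the object-dependent magnitudes described above is a lookup from graphical-object type to pulse strength, as sketched below. The object categories and the particular magnitude values are illustrative assumptions, not mappings taken from the patent.

```c
/* Sketch: pulse magnitude chosen by the kind of graphical object the
 * cursor encounters (stronger for windows than icons, weaker or none for
 * disabled items, etc.). */
#include <stdio.h>

enum gui_object {
    OBJ_ACTIVE_WINDOW_BORDER,
    OBJ_BACKGROUND_WINDOW_BORDER,
    OBJ_ICON,
    OBJ_MENU_ITEM_DISABLED
};

unsigned pulse_magnitude(enum gui_object obj)
{
    switch (obj) {
    case OBJ_ACTIVE_WINDOW_BORDER:     return 100;  /* strongest */
    case OBJ_BACKGROUND_WINDOW_BORDER: return 60;
    case OBJ_ICON:                     return 40;
    case OBJ_MENU_ITEM_DISABLED:       return 0;    /* no pulse  */
    }
    return 0;
}

int main(void)
{
    printf("active window border: %u\n", pulse_magnitude(OBJ_ACTIVE_WINDOW_BORDER));
    printf("icon:                 %u\n", pulse_magnitude(OBJ_ICON));
    return 0;
}
```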
  • User-independent events can also be relayed to the user using haptic sensations on the touchpad. An event occurring within the graphical environment, such as an appointment reminder, receipt of email, explosion in a game, etc., can be signified using a vibration, pulse, or other time-based force. The force sensation can be varied to signify different events of the same type. For example, vibrations of different frequency can each be used to differentiate different events or different characteristics of events, such as particular users sending email, the priority of an event, or the initiation or conclusion of particular tasks (e.g. the downloading of a document or data over a network). When the host system is “thinking,” requiring the user to wait while a function is being performed or accessed (usually when a timer is displayed by the host), it is often a surprise when the function is complete. If the user takes his or her eyes off the screen, he or she may not be aware that the function is complete. A pulse sensation can be sent to indicate that the “thinking” is over.
  • A software designer may want to allow a user to select options or a software function by positioning a cursor over an area on the screen using the touchpad, without requiring a physical button press or a tap on the touchpad to actually select the option. Such selection is currently problematic because, when no physical button is pressed, the user receives no physical confirmation that the selection has been executed. A pulse sent to the touchpad can act as that physical confirmation without the user having to press a button or other control for selection. For example, a user can position a cursor over a web page element, and once the cursor is within the desired region for a given period of time, an associated function can be executed. This is indicated to the user through a tactile pulse sent to the pad 16.
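The dwell-to-select behavior described above can be sketched as a small state machine: once the cursor has remained inside the target region for the dwell time, the function executes once and a confirming pulse is commanded. The dwell threshold and function names below are illustrative assumptions.

```c
/* Sketch of dwell-based selection with a haptic confirmation pulse. */
#include <stdbool.h>
#include <stdio.h>

#define DWELL_MS 800            /* assumed dwell threshold */

struct dwell_state { int elapsed_ms; bool fired; };

/* Call once per input poll with whether the cursor is inside the region
 * and the elapsed milliseconds since the last poll; returns true exactly
 * once when the dwell selection triggers. */
bool dwell_update(struct dwell_state *s, bool inside, int dt_ms)
{
    if (!inside) { s->elapsed_ms = 0; s->fired = false; return false; }
    s->elapsed_ms += dt_ms;
    if (!s->fired && s->elapsed_ms >= DWELL_MS) {
        s->fired = true;
        printf("function executed; confirmation pulse sent to pad\n");
        return true;
    }
    return false;
}

int main(void)
{
    struct dwell_state s = {0, false};
    for (int t = 0; t < 1000; t += 100)   /* cursor held in the region */
        dwell_update(&s, true, 100);
    return 0;
}
```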
  • The above-described force sensations can also be used in games or simulations. For example, a vibration can be output when a user-controlled racing car is driving on a dirt shoulder of a displayed road, a pulse can be output when the car collides with another object, and a varying-frequency vibration can be output when a vehicle engine starts and rumbles. The magnitude of pulses can be based on the severity of a collision or explosion, the size of the controlled graphical object or entity (and/or the size of a different graphical object/entity that is interacted with), etc. Force sensations can also be output based on user-independent events in the game or simulation, such as pulses when bullets are fired at the user's character.
  • The above haptic sensations can be similar to those described in U.S. Pat. No. 6,243,078 and U.S. Pat. No. 6,211,861. Other control devices or grips that can include a touchpad 16 in their housings include a gamepad, mouse, or trackball device for manipulating a cursor or other graphical objects in a computer-generated environment, or a pressure sphere or the like. For example, the touchpad 16 can be provided on the housing of a computer mouse to provide additional input to the host computer. Furthermore, selective disturbance filtering of forces, as described in U.S. Pat. No. 6,020,876, and shaping of force signals to drive the touchpad with impulse waves, as described in U.S. Pat. No. 5,959,613, can be used. Such impulses are also effective when driven with stored power in a battery on the computer 10 or from a bus such as USB connected to a host computer.
  • The touchpad 16 can also be provided with different control regions that provide separate input from the main cursor control region 70. In some embodiments, the different regions can be physically marked with lines, borders, or textures on the surface of the pad 16 (and/or sounds from the computer 10) so that the user can visually, audibly, and/or tactilely tell which region he or she is contacting on the pad.
  • For example, scroll or rate control regions 62 a and 62 b can be used to provide input to perform a rate control task, such as scrolling documents, adjusting a value (such as audio volume, speaker balance, monitor display brightness, etc.), or panning/tilting the view in a game or virtual reality simulation. Region 62 a can be used by placing a finger (or other object) within the region, where the upper portion of the region will increase the value, scroll up, etc., and the lower portion of the region will decrease the value, scroll down, etc. In embodiments that can read the amount of pressure placed on the pad 16, the amount of pressure can directly control the rate of adjustment; e.g., a greater pressure will cause a document to scroll faster. The region 62 b can similarly be used for horizontal (left/right) scrolling or rate control adjustment of a different value, view, etc.
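For the pressure-sensitive variant described above, a simple mapping from normalized contact pressure to scroll rate (with the sign taken from which half of the region is touched) might look like the sketch below; the gain and normalization are illustrative assumptions.

```c
/* Sketch of pressure-based rate control in a scroll region: harder
 * presses scroll faster; upper half scrolls up, lower half scrolls down. */
#include <stdio.h>

/* Returns lines per second; positive scrolls up, negative scrolls down. */
float scroll_rate(float pressure, float y_in_region, float region_height)
{
    const float gain = 25.0f;                        /* assumed gain        */
    float direction = (y_in_region < region_height / 2.0f) ? 1.0f : -1.0f;
    if (pressure < 0.0f) pressure = 0.0f;
    if (pressure > 1.0f) pressure = 1.0f;            /* normalized pressure */
    return direction * gain * pressure;
}

int main(void)
{
    printf("light press, upper half: %+.1f lines/s\n", scroll_rate(0.2f, 10.f, 60.f));
    printf("hard press, lower half:  %+.1f lines/s\n", scroll_rate(0.9f, 50.f, 60.f));
    return 0;
}
```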
  • Particular haptic effects can be associated with the control regions 62 a and 62 b. For example, when using the rate control region 62 a or 62 b, a vibration of a particular frequency can be output on the pad 16. In those embodiments having multiple actuators, an actuator placed directly under the region 62 a or 62 b can be activated to provide a more localized tactile sensation for the “active” (currently used) region. As a portion of a region 62 is pressed for rate control, pulses can be output on the pad (or region of the pad) to indicate when a page has scrolled by, a particular value has passed, etc. A vibration can also be continually output while the user contacts the region 62 a or 62 b.
  • Other regions 64 can also be positioned on the touchpad 16. For example, each of regions 64 provides a small rectangular area, like a button, which the user can point to in order to initiate a function associated with the pointed-to region. The regions 64 can initiate such computer functions as running a program, opening or closing a window, going “forward” or “back” in a queue of web pages in a web browser, powering the computer 10 or initiating a “sleep” mode, checking mail, firing a gun in a game, cutting or pasting data from a buffer, selecting a font, etc. The regions 64 can duplicate functions and buttons provided in an application program or provide new, different functions.
  • Similarly to regions 62, the regions 64 can each be associated with haptic sensations; for example, a region 64 can provide a pulse sensation when it has been selected by the user, providing instant feedback that the function has been selected. Furthermore, the same types of regions can be associated with similar-feeling haptic sensations. For example, each word processor related region 64 can, when pointed to, cause a pulse of a particular strength, while each game-related region can provide a pulse of different strength or a vibration. Furthermore, when the user moves the pointing object from one region 62 or 64 to another, a haptic sensation (such as a pulse) can be output on the pad 16 to signify that a region border has been crossed.
  • In addition, the regions are preferably programmable in size and shape as well as in the function with which they are associated. Thus, the functions for regions 64 can change based on an active application program in the graphical environment and/or based on user preferences input to and/or stored on the computer 10. Preferably, the size and location of each of the regions can be adjusted by the user or by an application program, and any or all of the regions can be completely removed if desired. Furthermore, the user is preferably able to assign particular haptic sensations to particular areas or types of areas based on types of functions associated with those areas, as desired. Different haptic sensations can be designed in a tool such as Immersion Studio™ available from Immersion Corporation of San Jose, Calif.
  • It should be noted that the regions 62 and 64 need not be physical regions of the touchpad 16. That is, the entire touchpad 16 surface need merely provide coordinates of user contact to the processor of the computer, and software on the computer can designate where different regions are located. The computer can interpret the coordinates and, based on the location of the user contact, can interpret the touchpad input signal as a cursor control signal or a different type of signal, such as rate control, button function, etc. The local touchpad microprocessor, if present, may alternatively interpret the function associated with the user contact location and report an appropriate signal or data to the host processor (such as position coordinates or a button signal), thus keeping the host processor ignorant of the lower level processing. In other embodiments, the touchpad 16 can be physically designed to output different signals to the computer based on different regions marked on the touchpad surface that are contacted by the user; e.g. each region can be sensed by a different sensor or sensor array.
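Since the regions can be purely software-defined, the host (or a local pad processor) only needs a lookup from reported contact coordinates to a region, its function, and its haptic effect, as in the sketch below. The rectangle coordinates, function names, and effect labels are illustrative assumptions.

```c
/* Sketch of software-defined touchpad regions: raw contact coordinates
 * are mapped to a region with an associated function and haptic effect. */
#include <stdio.h>

struct region {
    int x0, y0, x1, y1;
    const char *function;
    const char *haptic;
};

static const struct region regions[] = {
    {   0,   0, 200, 150, "cursor control",  "object-based effects" },
    { 200,   0, 230, 150, "vertical scroll", "continuous vibration" },
    {   0, 150,  60, 180, "web back",        "selection pulse"      },
};

const struct region *lookup_region(int x, int y)
{
    for (size_t i = 0; i < sizeof regions / sizeof regions[0]; i++)
        if (x >= regions[i].x0 && x < regions[i].x1 &&
            y >= regions[i].y0 && y < regions[i].y1)
            return &regions[i];
    return NULL;   /* contact outside any designated region */
}

int main(void)
{
    const struct region *r = lookup_region(210, 40);
    if (r)
        printf("contact -> %s (%s)\n", r->function, r->haptic);
    return 0;
}
```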
  • FIGS. 8 a and 8 b are top plan and side cross-sectional views, respectively, of another computer device embodiment 80 including a form of the haptic touchpad 16. Device 80 is in the form of a portable computer device such as a “personal digital assistant” (PDA), a “pen-based” computer, an “electronic book”, or similar device (collectively referred to herein as a “personal digital assistant” or PDA). Those devices which allow a user to input information by touching a display screen or readout in some fashion are primarily relevant to this embodiment. Such devices can include the Palm Pilot from 3Com Corp., the Newton from Apple Computer, pocket-sized computer devices from Casio, Hewlett-Packard, or other manufacturers, cellular phones or pagers having touch screens, etc.
  • In one embodiment of a device 80, a display screen 82 typically covers a large portion of the surface of the computer device 80. Screen 82 is preferably a flat-panel display as is well known to those skilled in the art and can display text, images, animations, etc.; in some embodiments screen 82 is as functional as any personal computer screen. Display screen 82 is preferably a “touch screen” that includes sensors which allow the user to input information to the computer device 80 by physically contacting the screen 82 (i.e. it is another form of planar “touch device” similar to the touchpad 16). For example, a transparent sensor film can be overlaid on the screen 82, where the film can detect pressure from an object contacting the film. The sensor devices for implementing touch screens are well known to those skilled in the art.
  • The user can select graphically-displayed buttons or other graphical objects by pressing a finger or a stylus to the screen 82 at the exact location where the graphical object is displayed. Furthermore, some embodiments allow the user to “draw” or “write” on the screen by displaying graphical “ink” images 85 at locations where the user has pressed a tip of a stylus, finger, or other object. Handwritten characters can be recognized by software running on the device microprocessor as commands, data, or other input. In other embodiments, the user can provide input additionally or alternatively through voice recognition, where a microphone on the device inputs the user's voice which is translated to appropriate commands or data by software running on the device. Physical buttons 84 can also be included in the housing of the device 80 to provide particular commands to the device 80 when the buttons are pressed. Many PDA's are characterized by the lack of a standard keyboard for character input from the user; rather, an alternative input mode is used, such as using a stylus to draw characters on the screen, voice recognition, etc. However, some PDA's also include a fully-functional keyboard as well as a touch screen, where the keyboard is typically much smaller than a standard-sized keyboard. In yet other embodiments, standard-size laptop computers with standard keyboards may include flat-panel touch-input display screens, and such screens (similar to screen 12 of FIG. 1) can be provided with haptic feedback.
  • The touch screen 82 provides haptic feedback to the user similarly to the touchpad 16 described in previous embodiments. One or more actuators 86 can be coupled to the underside of the touch screen 82 to provide haptic feedback such as pulses, vibrations, and textures; for example, an actuator 86 can be positioned near each corner of the screen 82, as shown in FIG. 8 a. Other configurations of actuators can also be used. The user can experience the haptic feedback through a finger or a held object such as a stylus 87 that is contacting the screen 82.
  • As shown in FIG. 8 b, the touch screen 82 is preferably coupled to the housing 88 of the device 80 by one or more spring or compliant elements 90, such as helical springs, leaf springs, flexures, or compliant material (foam, rubber, etc.). The compliant element allows the touch screen 82 to move approximately along the z-axis, thereby providing haptic feedback similarly to the touchpad embodiments described above. Actuators 86 can be piezo-electric actuators, voice coil actuators, or any of the other types of actuators described above for the touchpad embodiments. As shown in FIG. 8 b, the actuators 86 are directly coupled to the touch screen 82 similarly to the touchpad embodiment of FIG. 3; alternatively, an inertial mass can be moved to provide inertial feedback in the z-axis of the touch screen, similarly to the touchpad embodiment of FIG. 6. Other features described above for the touchpad are equally applicable to the touch screen embodiment 80.
  • In the embodiments of touch input devices (touchpad and touch screen) described herein, it is also advantageous that contact of the user is detected by the touch input device. Since haptic feedback need only be output when the user is contacting the touch device, this detection allows haptic feedback to be stopped (actuators “turned off”) when no objects are contacting the touch input device. This feature can conserve battery power for portable devices. If a local touch device microprocessor (or similar circuitry) is being used in the computer, such a microprocessor can turn off actuator output when no user contact is sensed, thus relieving the host processor of additional computational burden.
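A minimal illustration of this power-saving gating, assuming a contact-detect flag from the touch sensor and hypothetical enable/disable hooks, is sketched below.

```c
/* Sketch: haptic output is enabled only while the sensor reports contact,
 * so actuators draw no drive power when nothing touches the pad. */
#include <stdbool.h>
#include <stdio.h>

static bool actuators_enabled = false;

void on_contact_report(bool finger_present)
{
    if (finger_present && !actuators_enabled) {
        actuators_enabled = true;
        printf("contact detected: actuator output enabled\n");
    } else if (!finger_present && actuators_enabled) {
        actuators_enabled = false;
        printf("no contact: actuator output disabled to save power\n");
    }
}

int main(void)
{
    on_contact_report(true);    /* finger touches the pad   */
    on_contact_report(false);   /* finger lifts off the pad */
    return 0;
}
```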
  • While the subject matter has been described in terms of several preferred embodiments, it is contemplated that alterations, permutations, and equivalents thereof will become apparent to those skilled in the art upon a reading of the specification and study of the drawings. For example, many different types of actuators can be used to output tactile sensations to the user. Furthermore, many of the features described in one embodiment can be used interchangeably with other embodiments. Furthermore, certain terminology has been used for the purposes of descriptive clarity, and is not intended to be limiting.

Claims (28)

What is claimed is:
1. A haptic feedback device, comprising:
a touch surface configured to display a graphical environment and to output a first signal associated with a selection of a graphical object of the graphical environment, wherein the touch surface comprises a first region associated with the selected graphical object and a second region configured to provide a second signal different from the first signal and associated with a control functionality different from the selected graphical object, and wherein the first and second regions are associated with different haptic effects; and
at least a first actuator configured to impart a first force to thereby provide a haptic effect in response to the selected graphical object or the control functionality different from the selected graphical object, the first force based on information output by a computer device.
2. The haptic feedback device of claim 1, wherein the computer device comprises a portable computer.
3. The haptic feedback device of claim 2, wherein the portable computer comprises a PDA, a pager or a cellular phone.
4. The haptic feedback device of claim 1, wherein the selected graphical object is associated with a button.
5. The haptic feedback device of claim 1, wherein the touch surface is operative to receive a selection of the graphical object from a user's finger.
6. The haptic feedback device of claim 1, wherein the touch surface is operative to receive a selection of the graphical object from a physical object held by the user.
7. The haptic feedback device of claim 6, wherein the physical object is a stylus.
8. The haptic feedback device of claim 1, wherein the touch surface is integrated in a housing of a handheld device that is capable of operation by at least one hand of the user.
9. The haptic feedback device of claim 8, wherein the handheld device is a remote control device for controlling one or more functions of an electronic device or appliance.
10. The haptic feedback device of claim 1, wherein the first actuator is a piezo-electric actuator.
11. The haptic feedback device of claim 1, wherein the first actuator is a voice coil actuator.
12. The haptic feedback device of claim 1, wherein the first actuator includes a solenoid.
13. The haptic feedback device of claim 1, wherein the first actuator outputs a continuous vibration or a pulse tactile sensation on the touch surface.
14. The haptic feedback device of claim 1, wherein the second signal is used in conjunction with a button press.
15. The haptic feedback device of claim 1, wherein the second signal is associated with: a running of a program, an opening or closing of a window, a navigation of a web browser, a powering on or off of a computer, an initiation of a sleep mode in a computer, an electronic mail function, a gaming function, or a word processing function.
16. The haptic feedback device of claim 1, wherein the first region is demarcated from the second region by: a visual demarcation, an audible demarcation, or a tactile demarcation.
17. The haptic feedback device of claim 1, wherein the selection of the graphical object is sensed by a first sensor associated with the first region and an input at the second region is sensed by a second sensor different from the first sensor.
18. The haptic feedback device of claim 1, wherein the selection of the graphical object and an input at the second region is sensed by the same sensor.
19. A method, comprising:
outputting, by a touch surface operative to display a graphical environment, a first signal associated with a selection of a graphical object of the graphical environment, wherein the touch surface comprises a first region associated with the selected graphical object and a second region configured to provide a second signal different from the first signal and associated with a control functionality different from the selected graphical object, and wherein the first and second regions are associated with different haptic effects; and
imparting, by at least a first actuator, a first force to thereby provide a haptic effect in response to the selected graphical object or the control functionality different from the selected graphical object, the first force based on information output by a computer device.
20. The method of claim 19, wherein the second signal is used in a rate control function of a value.
21. The method of claim 19, wherein the second signal is used in conjunction with a button press.
22. The method of claim 19, wherein the second signal is associated with: a running of a program, an opening or closing of a window, a navigation of a web browser, a powering on or off of a computer, an initiation of a sleep mode in a computer, an electronic mail function, a gaming function, or a word processing function.
23. The method of claim 19, wherein the first region is demarcated from the second region by: a visual demarcation, an audible demarcation, or a tactile demarcation.
24. The method of claim 19, wherein the selected graphical object is associated with a button.
25. The method of claim 19, further comprising:
sensing, by a first sensor associated with the first region, the selection of the graphical object; and
sensing, by a second sensor different from the first sensor, an input at the second region.
26. The method of claim 19, further comprising:
sensing, by a sensor associated with the first region, the selection of the graphical object; and
sensing, by the sensor, an input at the second region.
27. A haptic feedback device, comprising:
a touch surface configured to display a graphical environment and to output a first signal associated with a selection of a graphical object of the graphical environment, wherein the touch surface comprises a first region associated with the selected graphical object;
a button separate from the touch surface and associated with a command to be executed when the button is pressed; and
at least a first actuator configured to impart a first force to the touch surface or the button to thereby provide a haptic effect in response to the selected graphical object or the command, the first force based on information output by a computer device.
28. The haptic feedback device of claim 27, wherein the button is disposed adjacent to the touch surface.
US13/295,947 1998-06-23 2011-11-14 Haptic feedback for touchpads and other touch controls Abandoned US20120056839A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US13/295,947 US20120056839A1 (en) 1998-06-23 2011-11-14 Haptic feedback for touchpads and other touch controls
US13/747,389 US9280205B2 (en) 1999-12-17 2013-01-22 Haptic feedback for touchpads and other touch controls
US15/054,693 US9740290B2 (en) 1999-12-17 2016-02-26 Haptic feedback for touchpads and other touch controls

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US09/103,281 US6088019A (en) 1998-06-23 1998-06-23 Low cost force feedback device with actuator for non-primary axis
US09/156,802 US6184868B1 (en) 1998-09-17 1998-09-17 Haptic feedback control devices
US09/253,132 US6243078B1 (en) 1998-06-23 1999-02-18 Pointing device with forced feedback button
US09/467,309 US6563487B2 (en) 1998-06-23 1999-12-17 Haptic feedback for directional control pads
US09/487,737 US6429846B2 (en) 1998-06-23 2000-01-19 Haptic feedback for touchpads and other touch controls
US10/213,940 US7148875B2 (en) 1998-06-23 2002-08-06 Haptic feedback for touchpads and other touch controls
US11/405,811 US7592999B2 (en) 1998-06-23 2006-04-17 Haptic feedback for touchpads and other touch controls
US13/295,947 US20120056839A1 (en) 1998-06-23 2011-11-14 Haptic feedback for touchpads and other touch controls

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US11/405,811 Continuation US7592999B2 (en) 1998-06-23 2006-04-17 Haptic feedback for touchpads and other touch controls
US12/008,916 Continuation US8059105B2 (en) 1998-06-23 2008-01-14 Haptic feedback for touchpads and other touch controls

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/747,389 Continuation US9280205B2 (en) 1999-12-17 2013-01-22 Haptic feedback for touchpads and other touch controls

Publications (1)

Publication Number Publication Date
US20120056839A1 true US20120056839A1 (en) 2012-03-08

Family

ID=23936915

Family Applications (20)

Application Number Title Priority Date Filing Date
US09/487,737 Expired - Lifetime US6429846B2 (en) 1998-06-23 2000-01-19 Haptic feedback for touchpads and other touch controls
US10/213,940 Expired - Lifetime US7148875B2 (en) 1998-06-23 2002-08-06 Haptic feedback for touchpads and other touch controls
US10/615,986 Expired - Fee Related US7728820B2 (en) 1998-06-23 2003-07-10 Haptic feedback for touchpads and other touch controls
US11/340,997 Expired - Fee Related US7777716B2 (en) 1998-06-23 2006-01-27 Haptic feedback for touchpads and other touch controls
US11/405,811 Expired - Fee Related US7592999B2 (en) 1998-06-23 2006-04-17 Haptic feedback for touchpads and other touch controls
US11/414,122 Expired - Fee Related US7602384B2 (en) 1998-06-23 2006-04-28 Haptic feedback touchpad
US11/525,473 Expired - Fee Related US7944435B2 (en) 1998-06-23 2006-09-21 Haptic feedback for touchpads and other touch controls
US11/589,004 Abandoned US20070040815A1 (en) 1998-06-23 2006-10-27 Haptic feedback for touchpads and other touch controls
US11/805,621 Expired - Fee Related US7768504B2 (en) 1998-06-23 2007-05-23 Haptic feedback for touchpads and other touch controls
US11/805,609 Abandoned US20070229478A1 (en) 1998-06-23 2007-05-23 Haptic feedback for touchpads and other touch controls
US11/981,501 Expired - Fee Related US8031181B2 (en) 1998-06-23 2007-10-30 Haptic feedback for touchpads and other touch controls
US11/985,656 Expired - Fee Related US7982720B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch controls
US11/985,599 Expired - Fee Related US8063893B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch controls
US11/985,655 Expired - Fee Related US7978183B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch controls
US11/985,657 Expired - Fee Related US8049734B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch control
US12/008,916 Expired - Fee Related US8059105B2 (en) 1998-06-23 2008-01-14 Haptic feedback for touchpads and other touch controls
US13/295,947 Abandoned US20120056839A1 (en) 1998-06-23 2011-11-14 Haptic feedback for touchpads and other touch controls
US13/296,017 Abandoned US20120056806A1 (en) 1998-06-23 2011-11-14 Haptic feedback for touchpads and other touch controls
US13/747,389 Expired - Fee Related US9280205B2 (en) 1999-12-17 2013-01-22 Haptic feedback for touchpads and other touch controls
US15/054,693 Expired - Fee Related US9740290B2 (en) 1999-12-17 2016-02-26 Haptic feedback for touchpads and other touch controls

Family Applications Before (16)

Application Number Title Priority Date Filing Date
US09/487,737 Expired - Lifetime US6429846B2 (en) 1998-06-23 2000-01-19 Haptic feedback for touchpads and other touch controls
US10/213,940 Expired - Lifetime US7148875B2 (en) 1998-06-23 2002-08-06 Haptic feedback for touchpads and other touch controls
US10/615,986 Expired - Fee Related US7728820B2 (en) 1998-06-23 2003-07-10 Haptic feedback for touchpads and other touch controls
US11/340,997 Expired - Fee Related US7777716B2 (en) 1998-06-23 2006-01-27 Haptic feedback for touchpads and other touch controls
US11/405,811 Expired - Fee Related US7592999B2 (en) 1998-06-23 2006-04-17 Haptic feedback for touchpads and other touch controls
US11/414,122 Expired - Fee Related US7602384B2 (en) 1998-06-23 2006-04-28 Haptic feedback touchpad
US11/525,473 Expired - Fee Related US7944435B2 (en) 1998-06-23 2006-09-21 Haptic feedback for touchpads and other touch controls
US11/589,004 Abandoned US20070040815A1 (en) 1998-06-23 2006-10-27 Haptic feedback for touchpads and other touch controls
US11/805,621 Expired - Fee Related US7768504B2 (en) 1998-06-23 2007-05-23 Haptic feedback for touchpads and other touch controls
US11/805,609 Abandoned US20070229478A1 (en) 1998-06-23 2007-05-23 Haptic feedback for touchpads and other touch controls
US11/981,501 Expired - Fee Related US8031181B2 (en) 1998-06-23 2007-10-30 Haptic feedback for touchpads and other touch controls
US11/985,656 Expired - Fee Related US7982720B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch controls
US11/985,599 Expired - Fee Related US8063893B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch controls
US11/985,655 Expired - Fee Related US7978183B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch controls
US11/985,657 Expired - Fee Related US8049734B2 (en) 1998-06-23 2007-11-15 Haptic feedback for touchpads and other touch control
US12/008,916 Expired - Fee Related US8059105B2 (en) 1998-06-23 2008-01-14 Haptic feedback for touchpads and other touch controls

Family Applications After (3)

Application Number Title Priority Date Filing Date
US13/296,017 Abandoned US20120056806A1 (en) 1998-06-23 2011-11-14 Haptic feedback for touchpads and other touch controls
US13/747,389 Expired - Fee Related US9280205B2 (en) 1999-12-17 2013-01-22 Haptic feedback for touchpads and other touch controls
US15/054,693 Expired - Fee Related US9740290B2 (en) 1999-12-17 2016-02-26 Haptic feedback for touchpads and other touch controls

Country Status (5)

Country Link
US (20) US6429846B2 (en)
JP (1) JP3085481U (en)
KR (2) KR20010108361A (en)
AU (1) AU2001229543A1 (en)
WO (1) WO2001054109A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090037840A1 (en) * 2007-08-03 2009-02-05 Siemens Medical Solutions Usa, Inc. Location Determination For Z-Direction Increments While Viewing Medical Images
US20100164895A1 (en) * 2008-12-31 2010-07-01 Samsung Electronics Co., Ltd. Apparatus and method for performing scroll function in portable terminal
WO2014046390A1 (en) * 2012-09-24 2014-03-27 Lg Electronics Inc. Portable device and control method thereof
US9056244B2 (en) 2012-09-12 2015-06-16 Wms Gaming Inc. Gaming apparatus incorporating targeted haptic feedback
US9373993B2 (en) 2012-07-07 2016-06-21 Saia-Burgess, Inc. Haptic actuators
US9436341B2 (en) 2012-12-21 2016-09-06 Johnson Electric S.A. Haptic feedback devices
US10019155B2 (en) 2014-06-30 2018-07-10 Honda Motor Co., Ltd. Touch control panel for vehicle control system
US20180321753A1 (en) * 2015-03-08 2018-11-08 Apple Inc. Device, Method, and User Interface for Processing Intensity of Touch Contact

Families Citing this family (1120)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9513744B2 (en) * 1994-08-15 2016-12-06 Apple Inc. Control systems employing novel physical controls and touch screens
US8228305B2 (en) 1995-06-29 2012-07-24 Apple Inc. Method for providing human input to a computer
US8482534B2 (en) * 1995-06-29 2013-07-09 Timothy R. Pryor Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US5930744A (en) * 1995-09-15 1999-07-27 Defelsko Corporation Coating thickness gauge
US6161126A (en) * 1995-12-13 2000-12-12 Immersion Corporation Implementing force feedback over the World Wide Web and other computer networks
US6300936B1 (en) 1997-11-14 2001-10-09 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
US6731267B1 (en) * 1997-09-15 2004-05-04 Veijo Matias Tuoriniemi Single touch dual axis input device
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US6211861B1 (en) * 1998-06-23 2001-04-03 Immersion Corporation Tactile mouse device
US8020095B2 (en) 1997-11-14 2011-09-13 Immersion Corporation Force feedback system including multi-tasking graphical host environment
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US6184868B1 (en) * 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6697043B1 (en) 1999-12-21 2004-02-24 Immersion Corporation Haptic interface device and actuator assembly providing linear haptic sensations
US6417638B1 (en) * 1998-07-17 2002-07-09 Sensable Technologies, Inc. Force reflecting haptic interface
US7233321B1 (en) 1998-12-15 2007-06-19 Intel Corporation Pointing device with integrated audio input
US7469381B2 (en) * 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US6982695B1 (en) * 1999-04-22 2006-01-03 Palmsource, Inc. Method and apparatus for software control of viewing parameters
US6977808B2 (en) 1999-05-14 2005-12-20 Apple Computer, Inc. Display housing for computing device
US6357887B1 (en) 1999-05-14 2002-03-19 Apple Computers, Inc. Housing for a computing device
US6337678B1 (en) 1999-07-21 2002-01-08 Tactiva Incorporated Force feedback computer input and output device with coordinated haptic elements
DE20080209U1 (en) * 1999-09-28 2001-08-09 Immersion Corp Control of haptic sensations for interface devices with vibrotactile feedback
US7340763B1 (en) 1999-10-26 2008-03-04 Harris Scott C Internet browsing from a television
US8482535B2 (en) 1999-11-08 2013-07-09 Apple Inc. Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics
US6498601B1 (en) * 1999-11-29 2002-12-24 Xerox Corporation Method and apparatus for selecting input modes on a palmtop computer
US7089292B1 (en) * 1999-12-20 2006-08-08 Vulcan Patents, Llc Interface including non-visual display for use in browsing an indexed collection of electronic content
US20080316171A1 (en) * 2000-01-14 2008-12-25 Immersion Corporation Low-Cost Haptic Mouse Implementations
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
US6760276B1 (en) * 2000-02-11 2004-07-06 Gerald S. Karr Acoustic signaling system
US20080122799A1 (en) * 2001-02-22 2008-05-29 Pryor Timothy R Human interfaces for vehicles, homes, and other applications
US8576199B1 (en) 2000-02-22 2013-11-05 Apple Inc. Computer control systems
JP3909994B2 (en) * 2000-02-29 2007-04-25 アルプス電気株式会社 Input device
US7965276B1 (en) * 2000-03-09 2011-06-21 Immersion Corporation Force output adjustment in force feedback devices based on user contact
US6924787B2 (en) * 2000-04-17 2005-08-02 Immersion Corporation Interface for controlling a graphical image
EP1285330B1 (en) * 2000-05-11 2006-08-30 Nes Stewart Irvine Zeroclick
JP4370042B2 (en) * 2000-05-12 2009-11-25 アルプス電気株式会社 Operating device
US7159008B1 (en) * 2000-06-30 2007-01-02 Immersion Corporation Chat interface with haptic feedback functionality
FR2811499B1 (en) * 2000-07-10 2002-12-20 Cit Alcatel PORTABLE ELECTRONIC DEVICE PROVIDED WITH AN INTEGRATED RADIOCOMMUNICATION DEVICE INCLUDING AN ANTENNA FOR TRANSMITTING AND / OR RECEIVING ELECTROMAGNETIC WAVES
DE60142101D1 (en) * 2000-08-11 2010-06-24 Alps Electric Co Ltd Input device with key input operation and coordinate input operation
TW468321B (en) * 2000-08-14 2001-12-11 Inventec Corp Structure and fabrication method of mobile phone battery featured as touch pad
US7170500B2 (en) * 2000-08-29 2007-01-30 Palm, Inc. Flip-style user interface
DE10046099A1 (en) * 2000-09-18 2002-04-04 Siemens Ag Touch sensitive display with tactile feedback
US7039877B2 (en) * 2001-01-04 2006-05-02 Intel Corporation Conserving space on browser user interfaces
US6646633B1 (en) * 2001-01-24 2003-11-11 Palm Source, Inc. Method and system for a full screen user interface and data entry using sensors to implement handwritten glyphs
US6806865B2 (en) * 2001-02-05 2004-10-19 Palm, Inc. Integrated joypad for handheld computer
US6932706B1 (en) * 2001-02-06 2005-08-23 International Game Technology Electronic gaming unit with virtual object input device
US7419425B1 (en) * 2001-02-15 2008-09-02 Bally Gaming, Inc. Shared secondary game station and system
US20080024463A1 (en) * 2001-02-22 2008-01-31 Timothy Pryor Reconfigurable tactile control display applications
US7567232B2 (en) 2001-03-09 2009-07-28 Immersion Corporation Method of using tactile feedback to deliver silent status information to a user of an electronic device
US7202851B2 (en) * 2001-05-04 2007-04-10 Immersion Medical Inc. Haptic interface for palpation simulation
US7730401B2 (en) * 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
US20050176665A1 (en) * 2001-05-18 2005-08-11 Sirna Therapeutics, Inc. RNA interference mediated inhibition of hairless (HR) gene expression using short interfering nucleic acid (siNA)
US6816154B2 (en) * 2001-05-30 2004-11-09 Palmone, Inc. Optical sensor based user interface for a portable electronic device
US6904570B2 (en) * 2001-06-07 2005-06-07 Synaptics, Inc. Method and apparatus for controlling a display of data on a display screen
US7766517B2 (en) 2001-06-15 2010-08-03 Apple Inc. Active enclosure for computing device
WO2002103504A2 (en) 2001-06-15 2002-12-27 Apple Computer, Inc. Active enclosure for computing device
US7452098B2 (en) 2001-06-15 2008-11-18 Apple Inc. Active enclosure for computing device
US20030001874A1 (en) * 2001-06-27 2003-01-02 International Business Machines Corporation Method and apparatus for computer input using the skin as sensory feedback
US6937033B2 (en) * 2001-06-27 2005-08-30 Immersion Corporation Position sensor with resistive element
US7127679B2 (en) * 2001-06-29 2006-10-24 Softrek, Inc. Method for generating and navigating a plurality of menus using a database and a menu template
KR100457509B1 (en) * 2001-07-07 2004-11-17 삼성전자주식회사 Communication terminal controlled through a touch screen and a voice recognition and instruction executing method thereof
US7056123B2 (en) * 2001-07-16 2006-06-06 Immersion Corporation Interface apparatus with cable-driven force feedback and grounded actuators
US7154470B2 (en) * 2001-07-17 2006-12-26 Immersion Corporation Envelope modulator for haptic feedback devices
JP3858642B2 (en) * 2001-08-17 2006-12-20 富士ゼロックス株式会社 Operation switch device
DE10144634A1 (en) * 2001-09-11 2003-04-10 Trw Automotive Electron & Comp operating system
US6989815B2 (en) * 2001-09-13 2006-01-24 E-Book Systems Pte Ltd. Method for flipping pages via electromechanical information browsing device
DE10146470B4 (en) * 2001-09-21 2007-05-31 3Dconnexion Gmbh Selection of software and hardware functions with a force / moment sensor
US6703550B2 (en) * 2001-10-10 2004-03-09 Immersion Corporation Sound data output and manipulation using haptic feedback
JP3798287B2 (en) * 2001-10-10 2006-07-19 Smk株式会社 Touch panel input device
US7046230B2 (en) * 2001-10-22 2006-05-16 Apple Computer, Inc. Touch pad handheld device
US7345671B2 (en) 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs
US7312785B2 (en) 2001-10-22 2007-12-25 Apple Inc. Method and apparatus for accelerated scrolling
WO2003054849A1 (en) * 2001-10-23 2003-07-03 Immersion Corporation Method of using tactile feedback to deliver silent status information to a user of an electronic device
US7379053B2 (en) * 2001-10-27 2008-05-27 Vortant Technologies, Llc Computer interface for navigating graphical user interface by touch
KR20040062601A (en) * 2001-10-30 2004-07-07 임머숀 코퍼레이션 Methods and apparatus for providing haptic feedback in interacting with virtual pets
EP2793101A3 (en) 2001-11-01 2015-04-29 Immersion Corporation Method and apparatus for providing tactile feedback sensations
FI115861B (en) * 2001-11-12 2005-07-29 Myorigo Oy Method and apparatus for generating a response
US20030095105A1 (en) * 2001-11-16 2003-05-22 Johannes Vaananen Extended keyboard
FI112415B (en) * 2001-11-28 2003-11-28 Nokia Oyj Piezoelectric user interface
ATE320059T1 (en) * 2001-12-12 2006-03-15 Koninkl Philips Electronics Nv Display system with tactile guidance
US20030117378A1 (en) 2001-12-21 2003-06-26 International Business Machines Corporation Device and system for retrieving and displaying handwritten annotations
JP2003288158A (en) * 2002-01-28 2003-10-10 Sony Corp Mobile apparatus having tactile feedback function
US7176899B2 (en) * 2002-01-31 2007-02-13 Kabushiki Kaisha Tokai-Rika-Denki-Seisakusho Display screen operation device
US20030184574A1 (en) * 2002-02-12 2003-10-02 Phillips James V. Touch screen interface with haptic feedback device
JP3788942B2 (en) * 2002-02-22 2006-06-21 株式会社東芝 Information processing apparatus and computer operation support method
US7333092B2 (en) 2002-02-25 2008-02-19 Apple Computer, Inc. Touch pad for handheld device
US7181502B2 (en) * 2002-03-21 2007-02-20 International Business Machines Corporation System and method for locating on electronic documents items referenced in a physical document
US20040004741A1 (en) * 2002-04-22 2004-01-08 Fuji Xerox Co., Ltd. Information processing system and information processing method
JP3892760B2 (en) * 2002-05-31 2007-03-14 株式会社東芝 Information processing device
US6710518B2 (en) * 2002-05-31 2004-03-23 Motorola, Inc. Manually operable electronic apparatus
FI20021162A0 (en) * 2002-06-14 2002-06-14 Nokia Corp Electronic device and a method for administering its keypad
JP3880888B2 (en) * 2002-06-18 2007-02-14 Smk株式会社 Tablet device
GB2390308A (en) * 2002-07-01 2004-01-07 Green Solutions Ltd Touch sensitive pad controlled game apparatus
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
EP1378856A1 (en) * 2002-07-01 2004-01-07 Sony Ericsson Mobile Communications AB Tactile feedback method and device
US7656393B2 (en) 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
US7058902B2 (en) * 2002-07-30 2006-06-06 Microsoft Corporation Enhanced on-object context menus
US7248248B2 (en) * 2002-08-12 2007-07-24 Microsoft Corporation Pointing system for pen-based computer
US20050193351A1 (en) * 2002-08-16 2005-09-01 Myorigo, L.L.C. Varying-content menus for touch screens
US20040037016A1 (en) * 2002-08-26 2004-02-26 Norio Kaneko Complex functional device and method of manufacturing the same, and haptic information system and information input apparatus comprising that complex functional device
JP3937982B2 (en) * 2002-08-29 2007-06-27 ソニー株式会社 Input/output device and electronic device having input/output device
US7358963B2 (en) 2002-09-09 2008-04-15 Apple Inc. Mouse having an optically-based scrolling feature
US7015897B2 (en) * 2002-09-17 2006-03-21 Taiwan Tri Gem Information Co., Ltd. Pushbutton of touch pad of electronic device
EP1406150A1 (en) * 2002-10-01 2004-04-07 Sony Ericsson Mobile Communications AB Tactile feedback method and device and portable device incorporating same
US8917234B2 (en) * 2002-10-15 2014-12-23 Immersion Corporation Products and processes for providing force sensations in a user interface
AU2003286504A1 (en) 2002-10-20 2004-05-13 Immersion Corporation System and method for providing rotational haptic feedback
EP1566728B1 (en) * 2002-10-30 2018-05-23 Thomson Licensing Input device and process for manufacturing the same, portable electronic apparatus comprising input device
JP4047712B2 (en) * 2002-11-29 2008-02-13 アルプス電気株式会社 Operating device
AU2003279475A1 (en) * 2002-12-04 2004-06-23 Koninklijke Philips Electronics N.V. Graphic user interface having touch detectability
US7769417B2 (en) * 2002-12-08 2010-08-03 Immersion Corporation Method and apparatus for providing haptic feedback to off-activating area
US8803795B2 (en) 2002-12-08 2014-08-12 Immersion Corporation Haptic communication devices
US20060136630A1 (en) * 2002-12-08 2006-06-22 Immersion Corporation, A Delaware Corporation Methods and systems for providing haptic messaging to handheld communication devices
US8830161B2 (en) 2002-12-08 2014-09-09 Immersion Corporation Methods and systems for providing a virtual touch haptic effect to handheld communication devices
US8059088B2 (en) 2002-12-08 2011-11-15 Immersion Corporation Methods and systems for providing haptic messaging to handheld communication devices
US20060136631A1 (en) * 2002-12-08 2006-06-22 Immersion Corporation, A Delaware Corporation Methods and systems for providing haptic messaging to handheld communication devices
US7779166B2 (en) * 2002-12-08 2010-08-17 Immersion Corporation Using haptic effects to enhance information content in communications
KR100516886B1 (en) * 2002-12-09 2005-09-23 제일모직주식회사 Slurry Composition for Final Polishing of Silicon Wafer
EP2347657B1 (en) * 2002-12-20 2015-08-05 Marel Stork Poultry Processing B.V. Method and device for processing a carcass part of slaughtered poultry
US6819990B2 (en) * 2002-12-23 2004-11-16 Matsushita Electric Industrial Co., Ltd. Touch panel input for automotive devices
FI20022282A0 (en) * 2002-12-30 2002-12-30 Nokia Corp Method for enabling interaction in an electronic device and an electronic device
US6717075B1 (en) * 2003-01-08 2004-04-06 Hewlett-Packard Development Company, L.P. Method and apparatus for a multi-sided input device
JP2004227222A (en) * 2003-01-22 2004-08-12 Toshiba Corp Electronic equipment
US8488308B2 (en) * 2003-02-12 2013-07-16 3M Innovative Properties Company Sealed force-based touch sensor
US6775129B1 (en) * 2003-02-14 2004-08-10 Intel Corporation Convertible and detachable laptops
US7280348B2 (en) * 2003-02-14 2007-10-09 Intel Corporation Positioning mechanism for a pen-based computing system
US7336266B2 (en) * 2003-02-20 2008-02-26 Immersion Corporation Haptic pads for use with user-interface devices
JP4177142B2 (en) * 2003-03-10 2008-11-05 富士通コンポーネント株式会社 Coordinate input device and drive device
TWI226584B (en) * 2003-04-07 2005-01-11 Darfon Electronics Corp Input device and input method
US20040221243A1 (en) * 2003-04-30 2004-11-04 Twerdahl Timothy D Radial menu interface for handheld computing device
CA2468481A1 (en) * 2003-05-26 2004-11-26 John T. Forbis Multi-position rail for a barrier
DE10324579A1 (en) * 2003-05-30 2004-12-16 Daimlerchrysler Ag operating device
US7310779B2 (en) 2003-06-26 2007-12-18 International Business Machines Corporation Method for creating and selecting active regions on physical documents
JP2005044241A (en) * 2003-07-24 2005-02-17 Nec Corp Pointing device notification system and method
JP2005049994A (en) * 2003-07-30 2005-02-24 Canon Inc Method for controlling cursor
US8094127B2 (en) * 2003-07-31 2012-01-10 Volkswagen Ag Display device
US20070152977A1 (en) 2005-12-30 2007-07-05 Apple Computer, Inc. Illuminated touchpad
US20060181517A1 (en) * 2005-02-11 2006-08-17 Apple Computer, Inc. Display actuator
AU2007101094B4 (en) * 2003-08-18 2008-06-05 Apple Inc. Movable touch pad with added functionality
US7499040B2 (en) * 2003-08-18 2009-03-03 Apple Inc. Movable touch pad with added functionality
GB0319714D0 (en) * 2003-08-21 2003-09-24 Philipp Harald Anisotropic touch screen element
EP1517224A3 (en) * 2003-09-16 2007-02-21 Volkswagen Aktiengesellschaft Touch sensitive display device
JP4359757B2 (en) * 2003-09-17 2009-11-04 ソニー株式会社 Information display device
US20050062841A1 (en) * 2003-09-18 2005-03-24 Rivera-Cintron Carlos A. System and method for multi-media record, distribution and playback using wireless communication
EP2639723A1 (en) * 2003-10-20 2013-09-18 Zoll Medical Corporation Portable medical information device with dynamically configurable user interface
US20050088418A1 (en) * 2003-10-28 2005-04-28 Nguyen Mitchell V. Pen-based computer interface system
US7411576B2 (en) 2003-10-30 2008-08-12 Sensable Technologies, Inc. Force reflecting haptic interface
JP4111278B2 (en) * 2003-11-20 2008-07-02 独立行政法人産業技術総合研究所 Haptic information presentation system
US10936074B2 (en) * 2003-11-20 2021-03-02 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US10416767B2 (en) * 2003-11-20 2019-09-17 National Institute Of Advanced Industrial Science And Technology Haptic information presentation system and method
US7495659B2 (en) 2003-11-25 2009-02-24 Apple Inc. Touch pad for handheld device
US8059099B2 (en) 2006-06-02 2011-11-15 Apple Inc. Techniques for interactive input to portable electronic devices
US8164573B2 (en) 2003-11-26 2012-04-24 Immersion Corporation Systems and methods for adaptive interpretation of input from a touch-sensitive input device
US7227535B1 (en) 2003-12-01 2007-06-05 Romano Edwin S Keyboard and display for a computer
US20060066569A1 (en) * 2003-12-08 2006-03-30 Immersion Corporation, A Delaware Corporation Methods and systems for providing haptic messaging to handheld communication devices
CN1629876A (en) * 2003-12-19 2005-06-22 升达科技股份有限公司 Separated touch control board module and electronic product having the same
US7742036B2 (en) 2003-12-22 2010-06-22 Immersion Corporation System and method for controlling haptic devices having multiple operational modes
US20050136893A1 (en) * 2003-12-22 2005-06-23 Timo Ala-Lehtimaki Terminal and method for transmitting electronic message with user-defined contents
DE102005003548A1 (en) 2004-02-02 2006-02-09 Volkswagen Ag Operating unit for e.g. ground vehicle, has layer, comprising dielectric elastomer, arranged between front electrode and rear electrode, and pressure sensor measuring pressure exerted on operating surface of unit
EP1560102A3 (en) * 2004-02-02 2007-02-21 Volkswagen Aktiengesellschaft Touchscreen with spring-controlled haptic feedback
JP4667755B2 (en) * 2004-03-02 2011-04-13 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
US8842070B2 (en) * 2004-03-17 2014-09-23 Intel Corporation Integrated tracking for on screen navigation with small hand held devices
JP4046095B2 (en) * 2004-03-26 2008-02-13 ソニー株式会社 Input device with tactile function, information input method, and electronic device
US20060061545A1 (en) * 2004-04-02 2006-03-23 Media Lab Europe Limited ( In Voluntary Liquidation). Motion-activated control with haptic feedback
US7508382B2 (en) * 2004-04-28 2009-03-24 Fuji Xerox Co., Ltd. Force-feedback stylus and applications to freeform ink
US20050248549A1 (en) * 2004-05-06 2005-11-10 Dietz Paul H Hand-held haptic stylus
US20050257150A1 (en) * 2004-05-11 2005-11-17 Universite Des Sciences Et Technologies De Lille Ground-based haptic interface comprising at least two decoupled rotary finger actuators
HU0401034D0 (en) * 2004-05-24 2004-08-30 Ratai Daniel System of three dimension induting computer technology, and method of executing spatial processes
TWI236239B (en) * 2004-05-25 2005-07-11 Elan Microelectronics Corp Remote controller
DE102004026461A1 (en) * 2004-05-29 2005-12-15 Braun Gmbh Brush head for electric and / or manual toothbrushes
US7233314B2 (en) * 2004-06-02 2007-06-19 Inventec Corporation Notebook having combined touch pad and CD-ROM drive
US8281241B2 (en) 2004-06-28 2012-10-02 Nokia Corporation Electronic device and method for providing extended user interface
US7342573B2 (en) * 2004-07-07 2008-03-11 Nokia Corporation Electrostrictive polymer as a combined haptic-seal actuator
US20060007179A1 (en) * 2004-07-08 2006-01-12 Pekka Pihlaja Multi-functional touch actuation in electronic devices
JP4489525B2 (en) * 2004-07-23 2010-06-23 富士通コンポーネント株式会社 Input device
JP4439351B2 (en) * 2004-07-28 2010-03-24 アルパイン株式会社 Touch panel input device with vibration applying function and vibration applying method for operation input
EP2000894B1 (en) * 2004-07-30 2016-10-19 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US20060028428A1 (en) * 2004-08-05 2006-02-09 Xunhu Dai Handheld device having localized force feedback
US20060036947A1 (en) * 2004-08-10 2006-02-16 Jelley Kevin W User interface controller method and apparatus for a handheld electronic device
JP2008511045A (en) 2004-08-16 2008-04-10 フィンガーワークス・インコーポレーテッド Method for improving the spatial resolution of a touch sense device
DE102004040886A1 (en) * 2004-08-24 2006-03-02 Volkswagen Ag Operating device for a motor vehicle
US8013847B2 (en) * 2004-08-24 2011-09-06 Immersion Corporation Magnetic actuator for providing haptic feedback
US7428142B1 (en) * 2004-08-25 2008-09-23 Apple Inc. Lid-closed detector
JP4473685B2 (en) * 2004-09-01 2010-06-02 任天堂株式会社 GAME DEVICE AND GAME PROGRAM
US20100231506A1 (en) * 2004-09-07 2010-09-16 Timothy Pryor Control of appliances, kitchen and home
FR2875024B1 (en) * 2004-09-09 2007-06-08 Itt Mfg Enterprises Inc Touch slab including means for producing a mechanical impulse in response to a control action, and arrangement for the assembly of this slab
US7148789B2 (en) * 2004-09-09 2006-12-12 Motorola, Inc. Handheld device having multiple localized force feedback
EP1805585B1 (en) * 2004-10-08 2017-08-16 Immersion Corporation Haptic feedback for button and scrolling action simulation in touch input devices
EP1650642B1 (en) * 2004-10-20 2016-06-22 Harman Becker Automotive Systems GmbH On-board electronic system for a vehicle, vehicle multimedia system and method for configuring an on-board electronic system
US20060097996A1 (en) * 2004-11-10 2006-05-11 Alps Electric Co., Ltd. Input device
KR100682901B1 (en) * 2004-11-17 2007-02-15 삼성전자주식회사 Apparatus and method for providing fingertip haptics of visual information using electro-active polymer in a image displaying device
US7847789B2 (en) * 2004-11-23 2010-12-07 Microsoft Corporation Reducing accidental touch-sensitive device activation
JP4672347B2 (en) * 2004-12-01 2011-04-20 アルパイン株式会社 Operation input device with vibration function
EP1846811A2 (en) * 2004-12-01 2007-10-24 Koninklijke Philips Electronics N.V. Image display that moves physical objects and causes tactile sensation
US20060119585A1 (en) * 2004-12-07 2006-06-08 Skinner David N Remote control with touchpad and method
US7489306B2 (en) * 2004-12-22 2009-02-10 Microsoft Corporation Touch screen accuracy
KR100590576B1 (en) * 2004-12-28 2006-11-23 삼성전자주식회사 Apparatus and method for providing haptics of image
EP1677180A1 (en) * 2004-12-30 2006-07-05 Volkswagen Aktiengesellschaft Touchscreen capable of detecting two simultaneous touch locations
WO2006074712A2 (en) * 2004-12-30 2006-07-20 Volkswagen Aktiengesellschaft Input device and method for the operation thereof
DE102005038161A1 (en) * 2004-12-30 2006-07-13 Volkswagen Ag Input device for cockpit of land vehicle, controls e.g. brushless DC actuator as function of speed of touching movement over operating surface or quantity derived from speed
US7562117B2 (en) 2005-09-09 2009-07-14 Outland Research, Llc System, method and computer program product for collaborative broadcast media
US7542816B2 (en) 2005-01-27 2009-06-02 Outland Research, Llc System, method and computer program product for automatically selecting, suggesting and playing music media files
US20070189544A1 (en) 2005-01-15 2007-08-16 Outland Research, Llc Ambient sound responsive media player
US9275052B2 (en) 2005-01-19 2016-03-01 Amazon Technologies, Inc. Providing annotations of a digital work
DE202005001032U1 (en) * 2005-01-22 2005-05-12 Wessling, Herbert Gaming or gambling machine, has an operating unit in the form of a dielectric operating plate that has operating fields and matching capacitive sensors on its reverse side
US20100312129A1 (en) 2005-01-26 2010-12-09 Schecter Stuart O Cardiovascular haptic handle system
JP4056528B2 (en) * 2005-02-03 2008-03-05 Necインフロンティア株式会社 Electronics
JP4657748B2 (en) * 2005-02-03 2011-03-23 アルプス電気株式会社 Input device
US7892096B2 (en) * 2005-02-22 2011-02-22 Wms Gaming Inc. Gaming machine with configurable button panel
US20060227066A1 (en) * 2005-04-08 2006-10-12 Matsushita Electric Industrial Co., Ltd. Human machine interface method and device for automotive entertainment systems
US20060241864A1 (en) * 2005-04-22 2006-10-26 Outland Research, Llc Method and apparatus for point-and-send data transfer within an ubiquitous computing environment
US7825903B2 (en) * 2005-05-12 2010-11-02 Immersion Corporation Method and apparatus for providing haptic effects to a touch panel
US7385530B2 (en) * 2005-05-16 2008-06-10 Research In Motion Limited Key system for a communication device
ATE408887T1 (en) * 2005-05-16 2008-10-15 Research In Motion Ltd Button system for a communications device
DE102005025301B4 (en) * 2005-06-02 2008-12-18 Preh Keytec Gmbh Device for manual input and display of data
CN1881419A (en) * 2005-06-15 2006-12-20 光宝科技股份有限公司 Portable electronic apparatus capable of operating in accordance with sensed pressure
JP4749069B2 (en) * 2005-07-20 2011-08-17 任天堂株式会社 Game system and game machine used therefor
JP4229098B2 (en) * 2005-07-29 2009-02-25 ソニー株式会社 Touch panel display device, electronic device including touch panel display device, and camera including touch panel display device
EP1920408A2 (en) * 2005-08-02 2008-05-14 Ipifini, Inc. Input device having multifunctional keys
JP4684794B2 (en) * 2005-08-04 2011-05-18 富士通コンポーネント株式会社 Operation device, electronic book apparatus, and electronic apparatus
KR100731019B1 (en) 2005-09-13 2007-06-22 엘지전자 주식회사 Touch screen assembly, mobile communication terminal having the same and method for applying key inputs thereto
EP1752860B1 (en) 2005-08-12 2015-03-18 LG Electronics Inc. Mobile terminal with touch screen providing haptic feedback and corresponding method
US20070043725A1 (en) * 2005-08-16 2007-02-22 Steve Hotelling Feedback responsive input arrangements
KR100714725B1 (en) * 2005-08-29 2007-05-07 삼성전자주식회사 Apparatus and method for protecting exposure of inputted information
JP2007065814A (en) * 2005-08-30 2007-03-15 Sony Corp Input/output device and electronic equipment with input/output device
US7671837B2 (en) 2005-09-06 2010-03-02 Apple Inc. Scrolling input arrangements using capacitive sensors on a flexible membrane
WO2007030603A2 (en) 2005-09-08 2007-03-15 Wms Gaming Inc. Gaming machine having display with sensory feedback
DE102006033014A1 (en) * 2005-09-13 2007-04-05 Volkswagen Ag Input device for motor vehicle, has touch screen deflected such that integral of deflection of screen in one direction amounts to four times of integral of deflection of screen in another direction opposite to former direction
EP1764674B1 (en) 2005-09-14 2012-06-13 Volkswagen AG Input device
US20070057928A1 (en) * 2005-09-14 2007-03-15 Michael Prados Input device for a vehicle
US7917148B2 (en) 2005-09-23 2011-03-29 Outland Research, Llc Social musical media rating system and method for localized establishments
US8176101B2 (en) 2006-02-07 2012-05-08 Google Inc. Collaborative rejection of media for physical establishments
EP1938175A1 (en) * 2005-09-30 2008-07-02 Nokia Corporation Electronic device with touch sensitive input
DE102005047650A1 (en) * 2005-10-05 2007-04-12 Volkswagen Ag Entry device for e.g. land vehicle, has controller for adjusting slider corresponding to touch movement at touch screen, and actuator for deflecting touch screen when slider reaches preset position or is moved to preset distance
US7880729B2 (en) 2005-10-11 2011-02-01 Apple Inc. Center button isolation ring
JP4405457B2 (en) * 2005-10-28 2010-01-27 京セラ株式会社 Broadcast receiver
DE102006029506B4 (en) 2005-10-28 2018-10-11 Volkswagen Ag input device
JP5208362B2 (en) * 2005-10-28 2013-06-12 ソニー株式会社 Electronics
US9001045B2 (en) * 2005-11-08 2015-04-07 Nokia Corporation Cost efficient element for combined piezo sensor and actuator in robust and small touch screen realization and method for operation thereof
CN1964606B (en) * 2005-11-09 2010-10-27 华硕电脑股份有限公司 A display with prompt sound effect
DE102006047893A1 (en) * 2005-12-01 2007-06-06 Volkswagen Ag Input device for e.g. land vehicle, has controller provided for optical representation of operating information and operating units on display, and for detection of contact position of operating surface of touch-screen
US20070132740A1 (en) * 2005-12-09 2007-06-14 Linda Meiby Tactile input device for controlling electronic contents
US20070145680A1 (en) * 2005-12-15 2007-06-28 Outland Research, Llc Shake Responsive Portable Computing Device for Simulating a Randomization Object Used In a Game Of Chance
KR100791378B1 (en) * 2005-12-29 2008-01-07 삼성전자주식회사 User command input apparatus supporting variable input modes, and device using the input apparatus
US20070152983A1 (en) * 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
US8077147B2 (en) 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
KR100791379B1 (en) * 2006-01-02 2008-01-07 삼성전자주식회사 System and method for user interface
KR20070074086A (en) * 2006-01-06 2007-07-12 엘지전자 주식회사 Method and mobile communication terminal for modulating volume of speaker
TWI380211B (en) * 2006-02-10 2012-12-21 Forest Assets II Ltd Liability Company A system generating an input useful to an electronic device and a method of fabricating a system having multiple variable resistors
US8510666B2 (en) * 2006-03-14 2013-08-13 Siemens Enterprise Communications Gmbh & Co. Kg Systems for development and/or use of telephone user interface
US8780053B2 (en) * 2007-03-21 2014-07-15 Northwestern University Vibrating substrate for haptic interface
WO2007111909A2 (en) * 2006-03-24 2007-10-04 Northwestern University Haptic device with indirect haptic feedback
US8525778B2 (en) 2007-03-21 2013-09-03 Northwestern University Haptic device with controlled traction forces
US20070236474A1 (en) * 2006-04-10 2007-10-11 Immersion Corporation Touch Panel with a Haptically Generated Reference Key
US7233396B1 (en) * 2006-04-17 2007-06-19 Alphasniffer Llc Polarization based interferometric detector
US7978181B2 (en) 2006-04-25 2011-07-12 Apple Inc. Keystroke tactility arrangement on a smooth touch surface
US20070261002A1 (en) * 2006-05-08 2007-11-08 Mediatek Inc. System and method for controlling a portable electronic device
JP2007304666A (en) * 2006-05-08 2007-11-22 Sony Computer Entertainment Inc Information output system and information output method
JP2007310496A (en) * 2006-05-16 2007-11-29 Alps Electric Co Ltd Touch operation input device
KR100897806B1 (en) 2006-05-23 2009-05-15 엘지전자 주식회사 Method for selecting items and terminal therefor
DE102007016083A1 (en) * 2006-05-31 2007-12-06 Mizukawa, Suehiro, Settsu Method and device for bending a knife element
DE102006037725B4 (en) 2006-06-22 2018-05-30 Volkswagen Ag Motor vehicle with an input device
KR100846497B1 (en) * 2006-06-26 2008-07-17 삼성전자주식회사 Input device with display button and portable electronic device having the same
EP1876419B1 (en) * 2006-07-03 2008-10-15 Continental Automotive GmbH Watertight navigation device
US8743060B2 (en) 2006-07-06 2014-06-03 Apple Inc. Mutual capacitance touch sensing device
US9360967B2 (en) 2006-07-06 2016-06-07 Apple Inc. Mutual capacitance touch sensing device
US8022935B2 (en) * 2006-07-06 2011-09-20 Apple Inc. Capacitance sensing electrode with integrated I/O mechanism
KR100827150B1 (en) 2006-07-10 2008-05-02 삼성전자주식회사 Apparatus for driving in portable terminal having a touch pad
US20090278798A1 (en) * 2006-07-26 2009-11-12 The Research Foundation Of The State University Of New York Active Fingertip-Mounted Object Digitizer
JP2008033739A (en) * 2006-07-31 2008-02-14 Sony Corp Touch screen interaction method and apparatus based on tactile force feedback and pressure measurement
FR2905195B1 (en) * 2006-08-23 2008-10-10 Dav Sa Control module, in particular for motor vehicle
KR20090077755A (en) * 2006-09-09 2009-07-15 에프-오리진, 인크. Integrated pressure sensitive lens assembly
US7795553B2 (en) 2006-09-11 2010-09-14 Apple Inc. Hybrid button
CN104656900A (en) * 2006-09-13 2015-05-27 意美森公司 Systems and methods for casino gaming haptics
US20080068334A1 (en) * 2006-09-14 2008-03-20 Immersion Corporation Localized Haptic Feedback
US7890863B2 (en) * 2006-10-04 2011-02-15 Immersion Corporation Haptic effects with proximity sensing
US20080084384A1 (en) * 2006-10-05 2008-04-10 Immersion Corporation Multiple Mode Haptic Feedback System
US8274479B2 (en) 2006-10-11 2012-09-25 Apple Inc. Gimballed scroll wheel
US20080210474A1 (en) * 2006-10-31 2008-09-04 Volkswagen Of America, Inc. Motor vehicle having a touch screen
US8482530B2 (en) 2006-11-13 2013-07-09 Apple Inc. Method of capacitively sensing finger position
TWI349870B (en) * 2006-11-22 2011-10-01 Ind Tech Res Inst Device and method of tactile sensing for human robot interaction
US20080122589A1 (en) * 2006-11-28 2008-05-29 Ivanov Yuri A Tactile Output Device
US10915242B1 (en) * 2006-12-19 2021-02-09 Philip R. Schaefer Interface to computer and other display information
EP1936929A1 (en) * 2006-12-21 2008-06-25 Samsung Electronics Co., Ltd Haptic generation method and system for mobile phone
US8120584B2 (en) * 2006-12-21 2012-02-21 Cypress Semiconductor Corporation Feedback mechanism for user detection of reference location on a sensing device
KR20080058121A (en) * 2006-12-21 2008-06-25 삼성전자주식회사 An apparatus and a method for providing a haptic user interface in a mobile terminal
JP2008158909A (en) 2006-12-25 2008-07-10 Pro Tech Design Corp Tactile feedback controller
EP2126667B1 (en) * 2006-12-27 2020-06-24 Immersion Corporation Virtual detents through vibrotactile feedback
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
US20080168478A1 (en) 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US7844915B2 (en) 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
US20080207317A1 (en) * 2007-01-24 2008-08-28 Splitfish Gameware Inc. Game controller with tactile feedback
GB2446702A (en) * 2007-02-13 2008-08-20 Qrg Ltd Touch Control Panel with Pressure Sensor
US8098234B2 (en) * 2007-02-20 2012-01-17 Immersion Corporation Haptic feedback system with stored effects
KR101299600B1 (en) * 2007-03-08 2013-08-26 크루셜텍 (주) Haptic control method
US20080235627A1 (en) * 2007-03-21 2008-09-25 Microsoft Corporation Natural interaction by flower-like navigation
US9665529B1 (en) 2007-03-29 2017-05-30 Amazon Technologies, Inc. Relative progress and event indicators
US20080238886A1 (en) * 2007-03-29 2008-10-02 Sony Ericsson Mobile Communications Ab Method for providing tactile feedback for touch-based input device
US20080248248A1 (en) * 2007-04-04 2008-10-09 Motorola, Inc. Method and apparatus for controlling a skin texture surface on a device using a gas
US7876199B2 (en) * 2007-04-04 2011-01-25 Motorola, Inc. Method and apparatus for controlling a skin texture surface on a device using a shape memory alloy
US8761846B2 (en) * 2007-04-04 2014-06-24 Motorola Mobility Llc Method and apparatus for controlling a skin texture surface on a device
US20080248836A1 (en) * 2007-04-04 2008-10-09 Motorola, Inc. Method and apparatus for controlling a skin texture surface on a device using hydraulic control
US20080259046A1 (en) * 2007-04-05 2008-10-23 Joseph Carsanaro Pressure sensitive touch pad with virtual programmable buttons for launching utility applications
US20080251364A1 (en) * 2007-04-11 2008-10-16 Nokia Corporation Feedback on input actuator
ATE522697T1 (en) 2007-04-30 2011-09-15 Frank S Inr Inc Method and device for positioning and protecting control lines connected to a pipe string on a drilling rig
US20080280657A1 (en) * 2007-05-09 2008-11-13 Nokia Corporation Seal and actuator assembly
DE102007022085A1 (en) * 2007-05-11 2008-11-13 Continental Automotive Gmbh Touchpad arrangement
US7671269B1 (en) * 2007-05-14 2010-03-02 Leapfrog Enterprises Methods and systems for graphical actuation of a velocity and directionally sensitive sound generation application
WO2008143790A2 (en) * 2007-05-14 2008-11-27 Wms Gaming Inc. Wagering game
DE102007023066B4 (en) * 2007-05-16 2015-09-10 Continental Automotive Gmbh instrument cluster
US8315652B2 (en) 2007-05-18 2012-11-20 Immersion Corporation Haptically enabled messaging
US8700005B1 (en) 2007-05-21 2014-04-15 Amazon Technologies, Inc. Notification of a user device to perform an action
US20080303646A1 (en) * 2007-05-22 2008-12-11 Elwell James K Tactile Feedback Device for Use with a Force-Based Input Device
US8621348B2 (en) * 2007-05-25 2013-12-31 Immersion Corporation Customizing haptic effects on an end user device
US9823833B2 (en) * 2007-06-05 2017-11-21 Immersion Corporation Method and apparatus for haptic enabled flexible touch sensitive surface
US8917244B2 (en) 2007-06-11 2014-12-23 Honeywell International Inc. Stimuli sensitive display screen with multiple detect modes
CN101681212A (en) * 2007-06-14 2010-03-24 诺基亚公司 Screen assembly
US20090002199A1 (en) * 2007-06-28 2009-01-01 Nokia Corporation Piezoelectric sensing as user input means
US7750895B2 (en) * 2007-06-29 2010-07-06 Microsoft Corporation Navigating lists using input motions
US7911453B2 (en) * 2007-06-29 2011-03-22 Microsoft Corporation Creating virtual replicas of physical objects
US7741979B2 (en) 2007-07-06 2010-06-22 Pacinian Corporation Haptic keyboard systems and methods
US8199033B2 (en) 2007-07-06 2012-06-12 Pacinian Corporation Haptic keyboard systems and methods
US20090015560A1 (en) * 2007-07-13 2009-01-15 Motorola, Inc. Method and apparatus for controlling a display of a device
US9654104B2 (en) 2007-07-17 2017-05-16 Apple Inc. Resistive force sensor with capacitive discrimination
CN101765746A (en) * 2007-07-26 2010-06-30 Lg电子株式会社 Air conditioner
KR101382021B1 (en) 2007-07-27 2014-04-04 엘지전자 주식회사 Air conditioner
US20090033632A1 (en) * 2007-07-30 2009-02-05 Szolyga Thomas H Integrated touch pad and pen-based tablet input system
US20090046068A1 (en) * 2007-08-13 2009-02-19 Research In Motion Limited Tactile touchscreen for electronic device
EP2177974A1 (en) * 2007-08-13 2010-04-21 Research in Motion Limited Touchscreen for electronic device
US8094130B2 (en) * 2007-08-13 2012-01-10 Research In Motion Limited Portable electronic device and method of controlling same
US20090132093A1 (en) * 2007-08-21 2009-05-21 Motorola, Inc. Tactile Conforming Apparatus and Method for a Device
US8219936B2 (en) * 2007-08-30 2012-07-10 Lg Electronics Inc. User interface for a mobile device using a user's gesture in the proximity of an electronic device
US8432365B2 (en) * 2007-08-30 2013-04-30 Lg Electronics Inc. Apparatus and method for providing feedback for three-dimensional touchscreen
US8683378B2 (en) 2007-09-04 2014-03-25 Apple Inc. Scrolling techniques for user interfaces
US7910843B2 (en) 2007-09-04 2011-03-22 Apple Inc. Compact input device
KR100938260B1 (en) * 2007-09-20 2010-01-22 한국전자통신연구원 A device and system for providing sensation effect on touch screen
US20090091479A1 (en) * 2007-10-04 2009-04-09 Motorola, Inc. Keypad haptic communication
JP4811381B2 (en) * 2007-10-10 2011-11-09 ソニー株式会社 Reproduction device, reproduction method, and program
US8031172B2 (en) * 2007-10-12 2011-10-04 Immersion Corporation Method and apparatus for wearable remote interface device
US20090102805A1 (en) * 2007-10-18 2009-04-23 Microsoft Corporation Three-dimensional object simulation using audio, visual, and tactile feedback
TWI406551B (en) * 2007-11-06 2013-08-21 Lg Electronics Inc Mobile terminal
US20090125811A1 (en) * 2007-11-12 2009-05-14 Microsoft Corporation User interface providing auditory feedback
US8117364B2 (en) * 2007-11-13 2012-02-14 Microsoft Corporation Enhanced protocol and architecture for low bandwidth force feedback game controller
US9058077B2 (en) * 2007-11-16 2015-06-16 Blackberry Limited Tactile touch screen for electronic device
US8174508B2 (en) * 2007-11-19 2012-05-08 Microsoft Corporation Pointing and data entry input device
US8866641B2 (en) * 2007-11-20 2014-10-21 Motorola Mobility Llc Method and apparatus for controlling a keypad of a device
US10488926B2 (en) 2007-11-21 2019-11-26 Immersion Corporation Method and apparatus for providing a fixed relief touch screen with locating features using deformable haptic surfaces
US8253698B2 (en) * 2007-11-23 2012-08-28 Research In Motion Limited Tactile touch screen for electronic device
US8416198B2 (en) 2007-12-03 2013-04-09 Apple Inc. Multi-dimensional scroll wheel
DE102007058110B4 (en) * 2007-12-03 2010-01-21 Lisa Dräxlmaier GmbH switch
US8265308B2 (en) 2007-12-07 2012-09-11 Motorola Mobility Llc Apparatus including two housings and a piezoelectric transducer
EP2071433A3 (en) * 2007-12-12 2012-05-30 Advanced Digital Broadcast S.A. User interface for selecting and controlling plurality of parameters and method for selecting and controlling plurality of parameters
US8395587B2 (en) * 2007-12-21 2013-03-12 Motorola Mobility Llc Haptic response apparatus for an electronic device
US9170649B2 (en) 2007-12-28 2015-10-27 Nokia Technologies Oy Audio and tactile feedback based on visual environment
US9857872B2 (en) * 2007-12-31 2018-01-02 Apple Inc. Multi-touch display screen with localized tactile feedback
US8373549B2 (en) 2007-12-31 2013-02-12 Apple Inc. Tactile feedback in an electronic device
US20090174672A1 (en) * 2008-01-03 2009-07-09 Schmidt Robert M Haptic actuator assembly and method of manufacturing a haptic actuator assembly
US8947383B2 (en) 2008-01-04 2015-02-03 Tactus Technology, Inc. User interface system and method
US20160187981A1 (en) 2008-01-04 2016-06-30 Tactus Technology, Inc. Manual fluid actuator
US9052790B2 (en) 2008-01-04 2015-06-09 Tactus Technology, Inc. User interface and methods
US8154527B2 (en) 2008-01-04 2012-04-10 Tactus Technology User interface system
US8456438B2 (en) 2008-01-04 2013-06-04 Tactus Technology, Inc. User interface system
US8922502B2 (en) 2008-01-04 2014-12-30 Tactus Technology, Inc. User interface system
US8922510B2 (en) 2008-01-04 2014-12-30 Tactus Technology, Inc. User interface system
US9552065B2 (en) 2008-01-04 2017-01-24 Tactus Technology, Inc. Dynamic tactile interface
US8547339B2 (en) 2008-01-04 2013-10-01 Tactus Technology, Inc. System and methods for raised touch screens
US9612659B2 (en) 2008-01-04 2017-04-04 Tactus Technology, Inc. User interface system
EP2077142A1 (en) * 2008-01-04 2009-07-08 Koninklijke Philips Electronics N.V. Object, method and system for transmitting information to a user
US9423875B2 (en) 2008-01-04 2016-08-23 Tactus Technology, Inc. Dynamic tactile interface with exhibiting optical dispersion characteristics
US8928621B2 (en) 2008-01-04 2015-01-06 Tactus Technology, Inc. User interface system and method
US9557915B2 (en) 2008-01-04 2017-01-31 Tactus Technology, Inc. Dynamic tactile interface
US9720501B2 (en) 2008-01-04 2017-08-01 Tactus Technology, Inc. Dynamic tactile interface
US9128525B2 (en) 2008-01-04 2015-09-08 Tactus Technology, Inc. Dynamic tactile interface
US9013417B2 (en) 2008-01-04 2015-04-21 Tactus Technology, Inc. User interface system
US8243038B2 (en) 2009-07-03 2012-08-14 Tactus Technologies Method for adjusting the user interface of a device
US9298261B2 (en) 2008-01-04 2016-03-29 Tactus Technology, Inc. Method for actuating a tactile interface layer
US9274612B2 (en) 2008-01-04 2016-03-01 Tactus Technology, Inc. User interface system
US9430074B2 (en) 2008-01-04 2016-08-30 Tactus Technology, Inc. Dynamic tactile interface
US9588683B2 (en) 2008-01-04 2017-03-07 Tactus Technology, Inc. Dynamic tactile interface
US8570295B2 (en) 2008-01-04 2013-10-29 Tactus Technology, Inc. User interface system
US8179375B2 (en) 2008-01-04 2012-05-15 Tactus Technology User interface system and method
US8207950B2 (en) 2009-07-03 2012-06-26 Tactus Technologies User interface enhancement system
US9063627B2 (en) 2008-01-04 2015-06-23 Tactus Technology, Inc. User interface and methods
US8553005B2 (en) 2008-01-04 2013-10-08 Tactus Technology, Inc. User interface system
US8125461B2 (en) 2008-01-11 2012-02-28 Apple Inc. Dynamic input graphic display
US8004501B2 (en) 2008-01-21 2011-08-23 Sony Computer Entertainment America Llc Hand-held device with touchscreen and digital tactile pixels
US20090184932A1 (en) * 2008-01-22 2009-07-23 Apple Inc. Portable Device Capable of Initiating Disengagement from Host System
US20100127140A1 (en) * 2008-01-23 2010-05-27 Gary Smith Suspension for a pressure sensitive touch display or panel
US8820133B2 (en) 2008-02-01 2014-09-02 Apple Inc. Co-extruded materials and methods
KR100927009B1 (en) * 2008-02-04 2009-11-16 광주과학기술원 Haptic interaction method and system in augmented reality
US20090195512A1 (en) * 2008-02-05 2009-08-06 Sony Ericsson Mobile Communications Ab Touch sensitive display with tactile feedback
KR100981269B1 (en) * 2008-02-14 2010-09-10 한국표준과학연구원 Tactile feedback touch pad apparatus using tactile sensors
WO2009102992A1 (en) * 2008-02-15 2009-08-20 Pacinian Corporation Keyboard adaptive haptic response
US20090219252A1 (en) * 2008-02-28 2009-09-03 Nokia Corporation Apparatus, method and computer program product for moving controls on a touchscreen
US8205157B2 (en) * 2008-03-04 2012-06-19 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8416196B2 (en) 2008-03-04 2013-04-09 Apple Inc. Touch event model programming interface
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
BRPI0804355A2 (en) * 2008-03-10 2009-11-03 Lg Electronics Inc terminal and control method
KR100956826B1 (en) * 2008-03-10 2010-05-11 엘지전자 주식회사 Terminal and method for controlling the same
US20090227369A1 (en) * 2008-03-10 2009-09-10 Merit Entertainment Amusement Device Having a Configurable Display for Presenting Games Having Different Aspect Ratios
US8237665B2 (en) * 2008-03-11 2012-08-07 Microsoft Corporation Interpreting ambiguous inputs on a touch-screen
US9454256B2 (en) 2008-03-14 2016-09-27 Apple Inc. Sensor configurations of an input device that are switchable based on mode
US8203531B2 (en) 2008-03-14 2012-06-19 Pacinian Corporation Vector-specific haptic feedback
DE102008016017A1 (en) * 2008-03-26 2009-10-22 Continental Automotive Gmbh operating device
US9056549B2 (en) * 2008-03-28 2015-06-16 Denso International America, Inc. Haptic tracking remote control for driver information center system
US20100250071A1 (en) * 2008-03-28 2010-09-30 Denso International America, Inc. Dual function touch switch with haptic feedback
KR101032632B1 (en) * 2008-04-01 2011-05-06 한국표준과학연구원 Method for providing a user interface and the recording medium thereof
US9829977B2 (en) 2008-04-02 2017-11-28 Immersion Corporation Method and apparatus for providing multi-point haptic feedback texture systems
US20090256809A1 (en) * 2008-04-14 2009-10-15 Sony Ericsson Mobile Communications Ab Three-dimensional touch interface
US7868515B2 (en) * 2008-04-15 2011-01-11 Visteon Global Technologies, Inc. Thin laminate construction for the creation of tactile feedback
US20090262078A1 (en) * 2008-04-21 2009-10-22 David Pizzi Cellular phone with special sensor functions
KR101488796B1 (en) * 2008-05-23 2015-02-02 엘지전자 주식회사 Mobile terminal and control method thereof
US20090295739A1 (en) * 2008-05-27 2009-12-03 Wes Albert Nagara Haptic tactile precision selection
US20090303175A1 (en) * 2008-06-05 2009-12-10 Nokia Corporation Haptic user interface
US7924143B2 (en) * 2008-06-09 2011-04-12 Research In Motion Limited System and method for providing tactile feedback to a user of an electronic device
US9733704B2 (en) * 2008-06-12 2017-08-15 Immersion Corporation User interface impact actuator
US20090313020A1 (en) * 2008-06-12 2009-12-17 Nokia Corporation Text-to-speech user interface control
US20090315688A1 (en) * 2008-06-19 2009-12-24 Hongwei Kong Method and system for processing audio signals for handset vibration
US8115745B2 (en) * 2008-06-19 2012-02-14 Tactile Displays, Llc Apparatus and method for interactive display with tactile feedback
US9513705B2 (en) 2008-06-19 2016-12-06 Tactile Displays, Llc Interactive display with tactile feedback
US8217908B2 (en) * 2008-06-19 2012-07-10 Tactile Displays, Llc Apparatus and method for interactive display with tactile feedback
US8665228B2 (en) 2008-06-19 2014-03-04 Tactile Displays, Llc Energy efficient interactive display with energy regenerative keyboard
CN102124429B (en) 2008-06-20 2015-06-24 美泰有限公司 Capacitive touchpad and toy incorporating the same
US8174372B2 (en) * 2008-06-26 2012-05-08 Immersion Corporation Providing haptic feedback on a touch surface
US8625899B2 (en) * 2008-07-10 2014-01-07 Samsung Electronics Co., Ltd. Method for recognizing and translating characters in camera-based image
KR100960970B1 (en) 2008-07-14 2010-06-03 한국과학기술원 Electronic apparatus with module providing haptic feedback using impulse
EP3206381A1 (en) 2008-07-15 2017-08-16 Immersion Corporation Systems and methods for mapping message contents to virtual physical properties for vibrotactile messaging
US20100214243A1 (en) * 2008-07-15 2010-08-26 Immersion Corporation Systems and Methods For Interpreting Physical Interactions With A Graphical User Interface
US8072317B2 (en) * 2008-07-16 2011-12-06 Johnson Electric S.A. Haptic solenoid system
FR2934067B1 (en) * 2008-07-21 2013-01-25 Dav Haptic feedback control device and corresponding control method
FR2934066B1 (en) * 2008-07-21 2013-01-25 Dav Haptic feedback control device
US20100020036A1 (en) * 2008-07-23 2010-01-28 Edward Hui Portable electronic device and method of controlling same
US20100020022A1 (en) * 2008-07-24 2010-01-28 Dell Products L.P. Visual Feedback System For Touch Input Devices
DE102008034987A1 (en) * 2008-07-25 2010-01-28 Phoenix Contact Gmbh & Co. Kg Touch-sensitive front panel for a touch screen
GB2474624B (en) 2008-07-26 2013-05-22 Michael J Pelletter System for sensing human movement and methods of using the same
US8482381B2 (en) * 2008-07-31 2013-07-09 Palm, Inc. Multi-purpose detector-based input feature for a computing device
GB2462465B (en) * 2008-08-08 2013-02-13 Hiwave Technologies Uk Ltd Touch sensitive device
KR101469619B1 (en) * 2008-08-14 2014-12-08 삼성전자주식회사 Movement Control System For Display Unit And Movement Control Method using the same
TWI350958B (en) * 2008-08-26 2011-10-21 Asustek Comp Inc Notebook computer with force feedback for gaming
US20100053087A1 (en) * 2008-08-26 2010-03-04 Motorola, Inc. Touch sensors with tactile feedback
DE102008046102B4 (en) * 2008-09-05 2016-05-12 Lisa Dräxlmaier GmbH Control element with specific feedback
KR101362771B1 (en) * 2008-09-17 2014-02-14 삼성전자주식회사 Apparatus and method for displaying stereoscopic image
US8698750B2 (en) * 2008-09-18 2014-04-15 Microsoft Corporation Integrated haptic control apparatus and touch sensitive display
US8749495B2 (en) 2008-09-24 2014-06-10 Immersion Corporation Multiple actuation handheld device
US8816967B2 (en) 2008-09-25 2014-08-26 Apple Inc. Capacitive sensor having electrodes arranged on the substrate and the flex circuit
US20100079379A1 (en) * 2008-09-26 2010-04-01 Sony Ericsson Mobile Communications Ab Portable communication device having an electroluminescent driven haptic keypad
US10289199B2 (en) * 2008-09-29 2019-05-14 Apple Inc. Haptic feedback system
GB2464117B (en) * 2008-10-03 2015-01-28 Hiwave Technologies Uk Ltd Touch sensitive device
US20100085313A1 (en) * 2008-10-07 2010-04-08 Research In Motion Limited Portable electronic device and method of secondary character rendering and entry
EP2175355A1 (en) * 2008-10-07 2010-04-14 Research In Motion Limited Portable electronic device and method of secondary character rendering and entry
US8593409B1 (en) 2008-10-10 2013-11-26 Immersion Corporation Method and apparatus for providing haptic feedback utilizing multi-actuated waveform phasing
US8339250B2 (en) * 2008-10-10 2012-12-25 Motorola Mobility Llc Electronic device with localized haptic response
US7999660B2 (en) * 2008-10-10 2011-08-16 Motorola Mobility, Inc. Electronic device with suspension interface for localized haptic response
US8500732B2 (en) * 2008-10-21 2013-08-06 Hermes Innovations Llc Endometrial ablation devices and systems
WO2010047718A2 (en) * 2008-10-24 2010-04-29 Hewlett-Packard Development Company, L.P. Touchpad input device
FR2940844B1 (en) * 2008-10-24 2013-01-04 Dav Haptic feedback control device
CN101730416B (en) * 2008-10-31 2012-08-29 鸿富锦精密工业(深圳)有限公司 Electronic equipment and key thereof
US20100117809A1 (en) * 2008-11-11 2010-05-13 Motorola Inc. Display module with piezoelectric haptics
US8884884B2 (en) * 2008-11-12 2014-11-11 Immersion Corporation Haptic effect generation with an eccentric rotating mass actuator
US20100137845A1 (en) 2008-12-03 2010-06-03 Immersion Corporation Tool Having Multiple Feedback Devices
US8250143B2 (en) 2008-12-10 2012-08-21 International Business Machines Corporation Network driven actuator mapping agent and bus and method of use
US8362882B2 (en) * 2008-12-10 2013-01-29 Immersion Corporation Method and apparatus for providing Haptic feedback from Haptic textile
US20100152620A1 (en) * 2008-12-12 2010-06-17 Immersion Corporation Method and Apparatus for Providing A Haptic Monitoring System Using Multiple Sensors
KR101053627B1 (en) * 2008-12-16 2011-08-03 엘지전자 주식회사 Method for selecting items and terminal therefor
KR101030389B1 (en) * 2008-12-17 2011-04-20 삼성전자주식회사 Haptic function control method for portable terminal
US8395590B2 (en) 2008-12-17 2013-03-12 Apple Inc. Integrated contact switch and touch sensor elements
US8330732B2 (en) * 2008-12-19 2012-12-11 Honeywell International Inc. Method and apparatus for avionic touchscreen operation providing sensible feedback
US9600070B2 (en) 2008-12-22 2017-03-21 Apple Inc. User interface having changeable topography
US20100156823A1 (en) * 2008-12-23 2010-06-24 Research In Motion Limited Electronic device including touch-sensitive display and method of controlling same to provide tactile feedback
EP2202619A1 (en) * 2008-12-23 2010-06-30 Research In Motion Limited Portable electronic device including tactile touch-sensitive input device and method of controlling same
US8179027B2 (en) * 2008-12-23 2012-05-15 Research In Motion Limited Coating for actuator and method of applying coating
US8686952B2 (en) * 2008-12-23 2014-04-01 Apple Inc. Multi touch with multi haptics
US20100156814A1 (en) * 2008-12-23 2010-06-24 Research In Motion Limited Portable electronic device including tactile touch-sensitive input device and method of controlling same
US8384680B2 (en) * 2008-12-23 2013-02-26 Research In Motion Limited Portable electronic device and method of control
US8384679B2 (en) * 2008-12-23 2013-02-26 Todd Robert Paleczny Piezoelectric actuator arrangement
US8427441B2 (en) * 2008-12-23 2013-04-23 Research In Motion Limited Portable electronic device and method of control
KR20100074695A (en) * 2008-12-24 2010-07-02 삼성전기주식회사 Touch sensitive interface device
US20100167820A1 (en) * 2008-12-29 2010-07-01 Houssam Barakat Human interface device
WO2010078597A1 (en) * 2009-01-05 2010-07-08 Tactus Technology, Inc. User interface system
WO2010078596A1 (en) * 2009-01-05 2010-07-08 Tactus Technology, Inc. User interface system
CN101770283B (en) * 2009-01-05 2012-10-10 联想(北京)有限公司 Method and computer for generating feedback effect for touch operation
US9588684B2 (en) 2009-01-05 2017-03-07 Tactus Technology, Inc. Tactile interface for a computing device
US8760413B2 (en) * 2009-01-08 2014-06-24 Synaptics Incorporated Tactile surface
JP2010170388A (en) * 2009-01-23 2010-08-05 Sony Corp Input device and method, information processing apparatus and method, information processing system, and program
US8378979B2 (en) * 2009-01-27 2013-02-19 Amazon Technologies, Inc. Electronic device with haptic feedback
TWI496122B (en) * 2009-01-28 2015-08-11 Semiconductor Energy Lab Display device
GB2468275A (en) 2009-02-16 2010-09-08 New Transducers Ltd A method of making a touch-sensitive data entry screen with haptic feedback
FR2942179B1 (en) * 2009-02-17 2011-03-04 Peugeot Citroen Automobiles Sa Data display device for automobile
US8253703B2 (en) 2009-03-03 2012-08-28 Empire Technology Development Llc Elastomeric wave tactile interface
US8077021B2 (en) * 2009-03-03 2011-12-13 Empire Technology Development Llc Dynamic tactile interface
US9746923B2 (en) 2009-03-12 2017-08-29 Immersion Corporation Systems and methods for providing features in a friction display wherein a haptic effect is configured to vary the coefficient of friction
JP5343871B2 (en) * 2009-03-12 2013-11-13 株式会社リコー Touch panel device, display device with touch panel including the same, and control method for touch panel device
US10007340B2 (en) 2009-03-12 2018-06-26 Immersion Corporation Systems and methods for interfaces featuring surface-based haptic effects
US9696803B2 (en) * 2009-03-12 2017-07-04 Immersion Corporation Systems and methods for friction displays and additional haptic effects
KR20190015624A (en) * 2009-03-12 2019-02-13 임머숀 코퍼레이션 Systems and methods for interfaces featuring surface-based haptic effects, and tangible computer-readable medium
US9874935B2 (en) * 2009-03-12 2018-01-23 Immersion Corporation Systems and methods for a texture engine
KR102051180B1 (en) * 2009-03-12 2019-12-02 임머숀 코퍼레이션 Systems and methods for a texture engine
KR20170016521A (en) * 2009-03-12 2017-02-13 임머숀 코퍼레이션 Systems and methods for using multiple actuators to realize textures
KR101973918B1 (en) * 2009-03-12 2019-04-29 임머숀 코퍼레이션 Systems and methods for providing features in a friction display
KR101992070B1 (en) * 2009-03-12 2019-06-21 임머숀 코퍼레이션 Systems and methods for a texture engine
US9927873B2 (en) * 2009-03-12 2018-03-27 Immersion Corporation Systems and methods for using textures in graphical user interface widgets
KR101761082B1 (en) * 2009-03-12 2017-07-24 임머숀 코퍼레이션 Systems and methods for using textures in graphical user interface widgets
KR20190020180A (en) * 2009-03-12 2019-02-27 임머숀 코퍼레이션 Systems and methods for friction displays and additional haptic effects
US10564721B2 (en) 2009-03-12 2020-02-18 Immersion Corporation Systems and methods for using multiple actuators to realize textures
US8589374B2 (en) 2009-03-16 2013-11-19 Apple Inc. Multifunction device with integrated search and application selection
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8285499B2 (en) 2009-03-16 2012-10-09 Apple Inc. Event recognition
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US8686951B2 (en) 2009-03-18 2014-04-01 HJ Laboratories, LLC Providing an elevated and texturized display in an electronic device
JP4522475B1 (en) * 2009-03-19 2010-08-11 Smk株式会社 Operation input device, control method, and program
KR101628782B1 (en) * 2009-03-20 2016-06-09 삼성전자주식회사 Apparatus and method for providing haptic function using multi vibrator in portable terminal
US8976012B2 (en) * 2009-03-23 2015-03-10 Methode Electronics, Inc. Touch panel assembly with haptic effects and method of manufacturing thereof
US8169306B2 (en) * 2009-03-23 2012-05-01 Methode Electronics, Inc. Touch panel assembly with haptic effects and method of manufacturing thereof
EP2411896B1 (en) * 2009-03-23 2018-10-31 Methode Electronics, Inc. Touch panel assembly with haptic effects
KR101054939B1 (en) * 2009-03-23 2011-08-05 삼성전기주식회사 Tactile Interface Device
US20100245254A1 (en) * 2009-03-24 2010-09-30 Immersion Corporation Planar Suspension Of A Haptic Touch Screen
JP5197457B2 (en) * 2009-03-25 2013-05-15 三菱電機株式会社 Coordinate input display device
US8570280B2 (en) * 2009-03-25 2013-10-29 Lenovo (Singapore) Pte. Ltd. Filtering of inadvertent contact with touch pad input device
US9024907B2 (en) * 2009-04-03 2015-05-05 Synaptics Incorporated Input device with capacitive force sensor and method for constructing the same
EP2417511B1 (en) * 2009-04-09 2016-11-09 New Transducers Limited Touch sensitive device
KR101553842B1 (en) * 2009-04-21 2015-09-17 엘지전자 주식회사 Mobile terminal providing multi haptic effect and control method thereof
US9164584B2 (en) * 2009-04-21 2015-10-20 Google Technology Holdings LLC Methods and devices for consistency of the haptic response across a touch sensitive device
US8633904B2 (en) 2009-04-24 2014-01-21 Cypress Semiconductor Corporation Touch identification for multi-touch technology
DE102009020796B3 (en) 2009-04-30 2010-07-29 Technische Universität Dresden Device for processing and reproducing signals in electronic systems for electrotactile stimulation
US20100277422A1 (en) * 2009-04-30 2010-11-04 Microsoft Corporation Touchpad display
TWI490736B (en) * 2009-04-30 2015-07-01 Asustek Comp Inc Display panel apparatus and reaction apparatus
US9703411B2 (en) 2009-04-30 2017-07-11 Synaptics Incorporated Reduction in latency between user input and visual feedback
US9489046B2 (en) 2009-05-04 2016-11-08 Immersion Corporation Method and apparatus for providing haptic feedback to non-input locations
KR101615872B1 (en) * 2009-05-08 2016-04-27 삼성전자주식회사 A method for transmitting haptic function in mobile terminal and system thereof
US9354751B2 (en) 2009-05-15 2016-05-31 Apple Inc. Input device with optimized capacitive sensing
US8400410B2 (en) * 2009-05-26 2013-03-19 Microsoft Corporation Ferromagnetic user interfaces
US9372536B2 (en) * 2009-06-05 2016-06-21 Empire Technology Development Llc Touch screen with tactile feedback
US9891708B2 (en) 2009-06-09 2018-02-13 Immersion Corporation Method and apparatus for generating haptic effects using actuators
US10401961B2 (en) * 2009-06-09 2019-09-03 Immersion Corporation Method and apparatus for generating haptic effects using actuators
KR101658991B1 (en) * 2009-06-19 2016-09-22 삼성전자주식회사 Touch panel and electronic device including the touch panel
KR101667801B1 (en) * 2009-06-19 2016-10-20 삼성전자주식회사 Touch panel and electronic device including the touch panel
US20100328229A1 (en) * 2009-06-30 2010-12-30 Research In Motion Limited Method and apparatus for providing tactile feedback
EP2270627A1 (en) * 2009-06-30 2011-01-05 Research In Motion Limited Method and apparatus for providing tactile feedback
US8310458B2 (en) * 2009-07-06 2012-11-13 Research In Motion Limited Electronic device including a moveable touch-sensitive input and method of controlling same
DE102009032068A1 (en) 2009-07-07 2011-01-13 Volkswagen Ag Method for providing user interface in motor vehicle, involves assigning virtual mass to graphic object, and exercising haptic feedback when shifting graphic object on actuator, where feedback depends on virtual mass of graphic object
US8872771B2 (en) 2009-07-07 2014-10-28 Apple Inc. Touch sensing device having conductive nodes
US9737796B2 (en) 2009-07-08 2017-08-22 Steelseries Aps Apparatus and method for managing operations of accessories in multi-dimensions
US8719714B2 (en) 2009-07-08 2014-05-06 Steelseries Aps Apparatus and method for managing operations of accessories
KR20110005587A (en) * 2009-07-10 2011-01-18 삼성전자주식회사 Method and apparatus for generating vibration in portable terminal
US8378797B2 (en) * 2009-07-17 2013-02-19 Apple Inc. Method and apparatus for localization of haptic feedback
US9213776B1 (en) 2009-07-17 2015-12-15 Open Invention Network, Llc Method and system for searching network resources to locate content
US8224391B2 (en) * 2009-07-20 2012-07-17 Lg Electronics Inc. Mobile terminal having an LED backlight unit
US8649524B2 (en) * 2009-08-13 2014-02-11 Starkey Laboratories, Inc. Method and apparatus for using haptics for fitting hearing aids
US20110037706A1 (en) * 2009-08-14 2011-02-17 Research In Motion Limited Electronic device including tactile touch-sensitive input device and method of controlling same
US8441790B2 (en) * 2009-08-17 2013-05-14 Apple Inc. Electronic device housing as acoustic input device
US8654524B2 (en) 2009-08-17 2014-02-18 Apple Inc. Housing as an I/O device
AU2010284771B2 (en) 2009-08-18 2015-07-02 Airway Limited Endoscope simulator
US8878772B2 (en) * 2009-08-21 2014-11-04 Mitsubishi Electric Research Laboratories, Inc. Method and system for displaying images on moveable display devices
CN102498459A (en) 2009-08-27 2012-06-13 京瓷株式会社 Tactile sensation imparting device and control method of tactile sensation imparting device
JP2011048685A (en) * 2009-08-27 2011-03-10 Kyocera Corp Input apparatus
EP2472365B1 (en) 2009-08-27 2016-10-12 Kyocera Corporation Tactile sensation imparting device and control method of tactile sensation imparting device
US8816981B2 (en) * 2009-08-31 2014-08-26 Nissha Printing Co., Ltd. Mount structure of touch panel with vibration function
JP5278259B2 (en) * 2009-09-07 2013-09-04 ソニー株式会社 Input device, input method, and program
JP5026486B2 (en) * 2009-09-29 2012-09-12 日本写真印刷株式会社 Mounting structure of touch input device with pressure sensitive sensor
US8310350B2 (en) * 2009-09-29 2012-11-13 Visteon Global Technologies, Inc. Mounting apparatus for a haptic surface
US8310349B2 (en) * 2009-09-29 2012-11-13 Visteon Global Technologies, Inc. Haptic surface with mechanical buttons
DE102009048823A1 (en) 2009-10-09 2011-04-14 Volkswagen Ag Method for providing a user interface and operating device
US8717309B2 (en) * 2009-10-13 2014-05-06 Blackberry Limited Portable electronic device including a touch-sensitive display and method of controlling same
EP2320302A1 (en) 2009-10-13 2011-05-11 Research In Motion Limited Portable electronic device including a touch-sensitive display and method of controlling same
US8624839B2 (en) * 2009-10-15 2014-01-07 Synaptics Incorporated Support-surface apparatus to impart tactile feedback
US10068728B2 (en) * 2009-10-15 2018-09-04 Synaptics Incorporated Touchpad with capacitive force sensing
JP5358392B2 (en) * 2009-10-21 2013-12-04 アルプス電気株式会社 Input processing device
US8531485B2 (en) * 2009-10-29 2013-09-10 Immersion Corporation Systems and methods for compensating for visual distortion caused by surface features on a display
WO2011051722A2 (en) * 2009-10-29 2011-05-05 New Transducers Limited Touch sensitive device
US9035784B2 (en) * 2009-10-30 2015-05-19 Joseph Maanuel Garcia Clock(s) as a seismic wave receiver
WO2011054835A2 (en) * 2009-11-09 2011-05-12 Osram Gesellschaft mit beschränkter Haftung Display device and display method therefor
JP5668076B2 (en) * 2009-11-17 2015-02-12 イマージョン コーポレーションImmersion Corporation System and method for increasing haptic bandwidth in electronic devices
WO2011061603A1 (en) * 2009-11-20 2011-05-26 Nokia Corporation Methods and apparatuses for generating and utilizing haptic style sheets
US20110128227A1 (en) * 2009-11-30 2011-06-02 Research In Motion Limited Portable electronic device and method of controlling same to provide tactile feedback
US20110128236A1 (en) * 2009-11-30 2011-06-02 Research In Motion Limited Electronic device and method of controlling same
EP2328065A1 (en) 2009-11-30 2011-06-01 Research In Motion Limited Electronic device and method of controlling same
KR101092722B1 (en) * 2009-12-02 2011-12-09 현대자동차주식회사 User interface device for controlling multimedia system of vehicle
KR20110063297A (en) * 2009-12-02 2011-06-10 삼성전자주식회사 Mobile device and control method thereof
JPWO2011067845A1 (en) * 2009-12-03 2013-04-18 富士通株式会社 Electronics
US8633916B2 (en) 2009-12-10 2014-01-21 Apple, Inc. Touch pad with force sensors and actuator feedback
JP5529515B2 (en) * 2009-12-14 2014-06-25 京セラ株式会社 Tactile presentation device
JP5587596B2 (en) * 2009-12-14 2014-09-10 京セラ株式会社 Tactile presentation device
US8909414B2 (en) * 2009-12-14 2014-12-09 Volkswagen Ag Three-dimensional corporeal figure for communication with a passenger in a motor vehicle
US20110148607A1 (en) * 2009-12-17 2011-06-23 Charles Timberlake Zeleny System,device and method for providing haptic technology
US20110148608A1 (en) * 2009-12-18 2011-06-23 Research In Motion Limited Portable electronic device and method of control
CN102667671B (en) * 2009-12-18 2016-08-31 本田技研工业株式会社 Morphable pad for tactile control
WO2011087817A1 (en) 2009-12-21 2011-07-21 Tactus Technology User interface system
US20110148762A1 (en) * 2009-12-22 2011-06-23 Universal Electronics Inc. System and method for multi-mode command input
KR20110074333A (en) * 2009-12-24 2011-06-30 삼성전자주식회사 Method and apparatus for generating vibration in portable terminal
JP5126215B2 (en) * 2009-12-25 2013-01-23 ソニー株式会社 Input device and electronic device
US9239623B2 (en) 2010-01-05 2016-01-19 Tactus Technology, Inc. Dynamic tactile interface
KR101616875B1 (en) * 2010-01-07 2016-05-02 삼성전자주식회사 Touch panel and electronic device including the touch panel
US8334840B2 (en) * 2010-01-19 2012-12-18 Visteon Global Technologies, Inc. System and method of screen manipulation using haptic enable controller
US8624878B2 (en) 2010-01-20 2014-01-07 Apple Inc. Piezo-based acoustic and capacitive detection
WO2011090780A1 (en) * 2010-01-20 2011-07-28 Northwestern University Method and apparatus for increasing the forces applied to a bare finger on a haptic surface
KR101631892B1 (en) * 2010-01-28 2016-06-21 삼성전자주식회사 Touch panel and electronic device including the touch panel
WO2011091993A1 (en) * 2010-01-29 2011-08-04 Johnson Controls Automotive Electronics Gmbh Input device
US20110191675A1 (en) * 2010-02-01 2011-08-04 Nokia Corporation Sliding input user interface
KR101678549B1 (en) * 2010-02-02 2016-11-23 삼성전자주식회사 Method and apparatus for providing user interface using surface acoustic signal, and device with the user interface
FR2955813B1 (en) * 2010-02-02 2015-05-22 Dav HAPTIC FEEDBACK MODULE FOR INTEGRATION IN A MOTOR VEHICLE FOR A NOMADIC DEVICE AND CORRESPONDING CONTROL DEVICE
TW201205910A (en) * 2010-02-03 2012-02-01 Bayer Materialscience Ag An electroactive polymer actuator haptic grip assembly
US9870053B2 (en) 2010-02-08 2018-01-16 Immersion Corporation Systems and methods for haptic feedback using laterally driven piezoelectric actuators
US8619035B2 (en) 2010-02-10 2013-12-31 Tactus Technology, Inc. Method for assisting user input to a device
KR101097332B1 (en) * 2010-02-10 2011-12-21 삼성모바일디스플레이주식회사 Display module having haptic function
TW201128447A (en) * 2010-02-11 2011-08-16 Compal Electronics Inc Vibration module and vibration method thereof
US20110199321A1 (en) * 2010-02-12 2011-08-18 Electronics And Telecommunications Research Institute Apparatus for providing self-morphable haptic and visual information and method thereof
US20110199342A1 (en) 2010-02-16 2011-08-18 Harry Vartanian Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound
US20110205165A1 (en) * 2010-02-24 2011-08-25 Douglas Allen Pfau Tuned mass damper for improving NVH characteristics of a haptic touch panel
US20110216015A1 (en) * 2010-03-05 2011-09-08 Mckesson Financial Holdings Limited Apparatus and method for directing operation of a software application via a touch-sensitive surface divided into regions associated with respective functions
US8941600B2 (en) * 2010-03-05 2015-01-27 Mckesson Financial Holdings Apparatus for providing touch feedback for user input to a touch sensitive surface
WO2011112984A1 (en) 2010-03-11 2011-09-15 Tactus Technology User interface system
US8629954B2 (en) * 2010-03-18 2014-01-14 Immersion Corporation Grommet suspension component and system
KR101710523B1 (en) 2010-03-22 2017-02-27 삼성전자주식회사 Touch panel and electronic device including the touch panel
US9645996B1 (en) 2010-03-25 2017-05-09 Open Invention Network Llc Method and device for automatically generating a tag from a conversation in a social networking website
DE102011006448A1 (en) 2010-03-31 2011-10-06 Tk Holdings, Inc. steering wheel sensors
US8587422B2 (en) 2010-03-31 2013-11-19 Tk Holdings, Inc. Occupant sensing system
DE102011006649B4 (en) 2010-04-02 2018-05-03 Tk Holdings Inc. Steering wheel with hand sensors
FR2958424B1 (en) * 2010-04-02 2015-05-15 Thales Sa HAPTIC INTERACTION DEVICE.
US10719131B2 (en) 2010-04-05 2020-07-21 Tactile Displays, Llc Interactive display with tactile feedback
US9417695B2 (en) 2010-04-08 2016-08-16 Blackberry Limited Tactile feedback method and apparatus
US20200393907A1 (en) 2010-04-13 2020-12-17 Tactile Displays, Llc Interactive display with tactile feedback
EP2846465B1 (en) * 2010-04-14 2017-11-29 Frederick Johannes Bruwer Pressure dependent capacitive sensing circuit switch construction
WO2011133605A1 (en) 2010-04-19 2011-10-27 Tactus Technology Method of actuating a tactile interface layer
KR20110117534A (en) * 2010-04-21 2011-10-27 삼성전자주식회사 Vibration control device and method
JP2011242386A (en) * 2010-04-23 2011-12-01 Immersion Corp Transparent compound piezoelectric material aggregate of contact sensor and tactile sense actuator
US8552997B2 (en) 2010-04-23 2013-10-08 Blackberry Limited Portable electronic device including tactile touch-sensitive input device
EP2383631A1 (en) * 2010-04-27 2011-11-02 Sony Ericsson Mobile Communications AB Hand-held mobile device and method for operating the hand-held mobile device
US8416066B2 (en) * 2010-04-29 2013-04-09 Microsoft Corporation Active vibrations
US9025013B2 (en) 2010-04-30 2015-05-05 Samsung Electronics Co., Ltd. Stereoscopic display apparatus for displaying an image with reduced crosstalk and method of driving the same
JP2011238205A (en) * 2010-05-04 2011-11-24 Samsung Electro-Mechanics Co Ltd Touch screen device
KR100986681B1 (en) * 2010-05-06 2010-10-08 (주)이미지스테크놀로지 Apparatus for controlling multi actuator drive for generating touch feeling
US9057653B2 (en) * 2010-05-11 2015-06-16 Synaptics Incorporated Input device with force sensing
KR101661728B1 (en) 2010-05-11 2016-10-04 삼성전자주식회사 User's input apparatus and electronic device including the user's input apparatus
US8938753B2 (en) 2010-05-12 2015-01-20 Litl Llc Configurable computer system
US9436219B2 (en) 2010-05-12 2016-09-06 Litl Llc Remote control to operate computer system
US20110285652A1 (en) * 2010-05-21 2011-11-24 Kabushiki Kaisha Toshiba Broadcast receiving device and electronic device
US10216408B2 (en) 2010-06-14 2019-02-26 Apple Inc. Devices and methods for identifying user interface objects based on view hierarchy
KR101782639B1 (en) 2010-06-16 2017-09-27 삼성전자주식회사 Method for using a portable terminal
FR2961610B1 (en) * 2010-06-18 2013-01-18 Thales Sa HAPTIC INTERACTION DEVICE ENHANCED BY FORCE
US9132352B1 (en) 2010-06-24 2015-09-15 Gregory S. Rabin Interactive system and method for rendering an object
US8405606B2 (en) * 2010-07-02 2013-03-26 Alpha & Omega Inc. Remote control systems and methods for activating buttons of digital electronic display devices
WO2012004629A1 (en) * 2010-07-05 2012-01-12 Nokia Corporation An apparatus and a method for providing haptic feedback
FR2962566B1 (en) * 2010-07-06 2013-05-17 Commissariat Energie Atomique SIMULATION SYSTEM FOR CONTACT WITH A SURFACE BY TOUCH STIMULATION
US8947372B2 (en) * 2010-08-11 2015-02-03 Blackberry Limited Electronic device including touch-sensitive display
JP5579583B2 (en) * 2010-08-11 2014-08-27 京セラ株式会社 Tactile sensation presentation apparatus and control method of tactile sensation presentation apparatus
EP3179330B1 (en) * 2010-08-20 2020-03-18 SeeScan, Inc. Magnetic sensing user interface device
CA2808716C (en) * 2010-08-23 2018-03-06 Nokia Corporation Apparatus and method for providing haptic and audio feedback in a touch sensitive user interface
US9182820B1 (en) 2010-08-24 2015-11-10 Amazon Technologies, Inc. High resolution haptic array
JP5492023B2 (en) * 2010-08-27 2014-05-14 京セラ株式会社 Character input device, character input method, and character input program
US20120058813A1 (en) * 2010-09-08 2012-03-08 Lee Amaitis Systems and methods for interprocess communication of wagering opportunities and/or wager requests
KR20120025684A (en) * 2010-09-08 2012-03-16 삼성전자주식회사 Touch screen panel display device
FR2964761B1 (en) * 2010-09-14 2012-08-31 Thales Sa HAPTIC INTERACTION DEVICE AND METHOD FOR GENERATING HAPTIC AND SOUND EFFECTS
US8710965B2 (en) * 2010-09-22 2014-04-29 At&T Intellectual Property I, L.P. Devices, systems, and methods for tactile feedback and input
JP5642474B2 (en) * 2010-09-24 2014-12-17 ミネベア株式会社 Input device, vibration device, and input detection method
US9019207B1 (en) * 2010-09-28 2015-04-28 Google Inc. Spacebar integrated with trackpad
DE102010047261B4 (en) * 2010-10-01 2013-04-25 Trw Automotive Electronics & Components Gmbh switching device
KR101809191B1 (en) 2010-10-11 2018-01-18 삼성전자주식회사 Touch panel
KR20140037011A (en) 2010-10-20 2014-03-26 택투스 테크놀로지, 아이엔씨. User interface system
US8780060B2 (en) 2010-11-02 2014-07-15 Apple Inc. Methods and systems for providing haptic control
US9335181B2 (en) * 2010-11-10 2016-05-10 Qualcomm Incorporated Haptic based personal navigation
US9746968B1 (en) * 2010-11-10 2017-08-29 Open Invention Network Llc Touch screen display with tactile feedback using transparent actuator assemblies
KR101735715B1 (en) 2010-11-23 2017-05-15 삼성전자주식회사 Input sensing circuit and touch panel including the input sensing circuit
US10503255B2 (en) 2010-12-02 2019-12-10 Immersion Corporation Haptic feedback assisted text manipulation
FR2968786B1 (en) 2010-12-13 2012-12-14 Delphi Tech Inc HAPTIC CONTROL DEVICE HAVING SEAL GASKET
US8543168B2 (en) 2010-12-14 2013-09-24 Motorola Mobility Llc Portable electronic device
US9377876B2 (en) * 2010-12-15 2016-06-28 Hillcrest Laboratories, Inc. Visual whiteboard for television-based social network
JP5591095B2 (en) * 2010-12-20 2014-09-17 トヨタ自動車株式会社 Tactile display
US9652944B2 (en) * 2010-12-28 2017-05-16 Lg Innotek Co., Ltd Locally vibrating haptic apparatus, method for locally vibrating haptic apparatus, haptic display apparatus and vibrating panel using the same
US9921712B2 (en) 2010-12-29 2018-03-20 Mako Surgical Corp. System and method for providing substantially stable control of a surgical tool
US9411509B2 (en) 2010-12-29 2016-08-09 Microsoft Technology Licensing, Llc Virtual controller for touch display
US9119655B2 (en) 2012-08-03 2015-09-01 Stryker Corporation Surgical manipulator capable of controlling a surgical instrument in multiple modes
US8847890B2 (en) 2011-01-04 2014-09-30 Synaptics Incorporated Leveled touchsurface with planar translational responsiveness to vertical travel
US8912458B2 (en) 2011-01-04 2014-12-16 Synaptics Incorporated Touchsurface with level and planar translational travel responsiveness
US8309870B2 (en) 2011-01-04 2012-11-13 Cody George Peterson Leveled touchsurface with planar translational responsiveness to vertical travel
US8842081B2 (en) * 2011-01-13 2014-09-23 Synaptics Incorporated Integrated display and touch system with displayport/embedded displayport interface
US8988087B2 (en) 2011-01-24 2015-03-24 Microsoft Technology Licensing, Llc Touchscreen testing
US9965094B2 (en) 2011-01-24 2018-05-08 Microsoft Technology Licensing, Llc Contact geometry tests
US9417696B2 (en) * 2011-01-27 2016-08-16 Blackberry Limited Portable electronic device and method therefor
US8674961B2 (en) * 2011-01-31 2014-03-18 National Semiconductor Corporation Haptic interface for touch screen in mobile device or other device
DE102011009840A1 (en) * 2011-01-31 2012-08-02 Continental Automotive Gmbh operating device
JP5660580B2 (en) * 2011-02-09 2015-01-28 パナソニックIpマネジメント株式会社 Electronics
WO2012108203A1 (en) * 2011-02-10 2012-08-16 京セラ株式会社 Electronic device and method of controlling same
US8982061B2 (en) 2011-02-12 2015-03-17 Microsoft Technology Licensing, Llc Angular contact geometry
US9542092B2 (en) 2011-02-12 2017-01-10 Microsoft Technology Licensing, Llc Prediction-based touch contact tracking
KR101128392B1 (en) * 2011-02-15 2012-03-27 (주)펜앤프리 Apparatus and method for inputting information
US8723820B1 (en) 2011-02-16 2014-05-13 Google Inc. Methods and apparatus related to a haptic feedback drawing device
US20120218193A1 (en) * 2011-02-28 2012-08-30 Research In Motion Limited Patterned activation of piezoelectric actuators
WO2012121961A1 (en) 2011-03-04 2012-09-13 Apple Inc. Linear vibrator providing localized and generalized haptic feedback
US8773377B2 (en) 2011-03-04 2014-07-08 Microsoft Corporation Multi-pass touch contact tracking
US8735755B2 (en) 2011-03-07 2014-05-27 Synaptics Incorporated Capacitive keyswitch technologies
US20120242584A1 (en) 2011-03-22 2012-09-27 Nokia Corporation Method and apparatus for providing sight independent activity reports responsive to a touch gesture
KR101278405B1 (en) * 2011-03-23 2013-06-24 삼성전기주식회사 Piezoelectric vibration module and touch screen using the same
US8457654B1 (en) 2011-03-31 2013-06-04 Google Inc. Directional feedback
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
US8942828B1 (en) 2011-04-13 2015-01-27 Stuart Schecter, LLC Minimally invasive cardiovascular support system with true haptic coupling
KR101784436B1 (en) 2011-04-18 2017-10-11 삼성전자주식회사 Touch panel and driving device for the touch panel
US9448713B2 (en) 2011-04-22 2016-09-20 Immersion Corporation Electro-vibrotactile display
US8892162B2 (en) 2011-04-25 2014-11-18 Apple Inc. Vibration sensing system and method for categorizing portable device context and modifying device operation
US9189109B2 (en) 2012-07-18 2015-11-17 Sentons Inc. Detection of type of object used to provide a touch contact input
US10198097B2 (en) 2011-04-26 2019-02-05 Sentons Inc. Detecting touch input force
US9436281B2 (en) * 2011-04-26 2016-09-06 Blackberry Limited Electronic device and method of providing tactile feedback
US11327599B2 (en) 2011-04-26 2022-05-10 Sentons Inc. Identifying a contact type
US9477350B2 (en) 2011-04-26 2016-10-25 Sentons Inc. Method and apparatus for active ultrasonic touch devices
US9557857B2 (en) 2011-04-26 2017-01-31 Synaptics Incorporated Input device with force sensing and haptic response
US9639213B2 (en) 2011-04-26 2017-05-02 Sentons Inc. Using multiple signals to detect touch input
US20120274545A1 (en) * 2011-04-28 2012-11-01 Research In Motion Limited Portable electronic device and method of controlling same
US8612808B2 (en) 2011-05-05 2013-12-17 International Business Machines Corporation Touch-sensitive user input device failure prediction
US10108288B2 (en) 2011-05-10 2018-10-23 Northwestern University Touch interface device and method for applying controllable shear forces to a human appendage
US9122325B2 (en) 2011-05-10 2015-09-01 Northwestern University Touch interface device and method for applying controllable shear forces to a human appendage
US9218727B2 (en) 2011-05-12 2015-12-22 Apple Inc. Vibration in portable devices
US9692411B2 (en) 2011-05-13 2017-06-27 Flow Control LLC Integrated level sensing printed circuit board
US8773403B2 (en) 2011-05-20 2014-07-08 Sony Corporation Haptic device for position detection
US8956230B2 (en) 2011-05-20 2015-02-17 Sony Corporation Haptic device for 3-D gaming
US8749533B2 (en) 2011-05-20 2014-06-10 Sony Corporation Haptic device for carving and molding objects
US8681130B2 (en) 2011-05-20 2014-03-25 Sony Corporation Stylus based haptic peripheral for touch screen and tablet devices
US20120302323A1 (en) 2011-05-23 2012-11-29 Wms Gaming Inc. Haptic gaming chairs and wagering game systems and machines with a haptic gaming chair
US9142083B2 (en) 2011-06-13 2015-09-22 Bally Gaming, Inc. Convertible gaming chairs and wagering game systems and machines with a convertible gaming chair
US9710061B2 (en) 2011-06-17 2017-07-18 Apple Inc. Haptic feedback device
US10007341B2 (en) 2011-06-21 2018-06-26 Northwestern University Touch interface device and method for applying lateral forces on a human appendage
EP2538303A1 (en) * 2011-06-22 2012-12-26 Research In Motion Limited Optical navigation device with haptic feedback
US8547333B2 (en) 2011-06-22 2013-10-01 Blackberry Limited Optical navigation device with haptic feedback
CN202135103U (en) 2011-06-27 2012-02-01 瑞声声学科技(常州)有限公司 Piezoelectric vibrator
US8194036B1 (en) * 2011-06-29 2012-06-05 Google Inc. Systems and methods for controlling a cursor on a display using a trackpad input device
US8913019B2 (en) 2011-07-14 2014-12-16 Microsoft Corporation Multi-finger detection and component resolution
US8319746B1 (en) 2011-07-22 2012-11-27 Google Inc. Systems and methods for removing electrical noise from a touchpad signal
US9417754B2 (en) 2011-08-05 2016-08-16 P4tents1, LLC User interface system, method, and computer program product
DE102011112618A1 (en) * 2011-09-08 2013-03-14 Eads Deutschland Gmbh Interaction with a three-dimensional virtual scenario
CN102999155B (en) * 2011-09-09 2016-10-05 联想(北京)有限公司 Electronic terminal, touch control method and display backlight control method
US9378389B2 (en) 2011-09-09 2016-06-28 Microsoft Technology Licensing, Llc Shared item account selection
US9748952B2 (en) 2011-09-21 2017-08-29 Synaptics Incorporated Input device with integrated deformable electrode structure for force sensing
US8333657B1 (en) 2011-09-26 2012-12-18 Igt Gaming system, gaming device and method for displaying multiple concurrent games using dynamic focal points
WO2013051806A2 (en) 2011-10-06 2013-04-11 (주)하이소닉 Vibrator having a piezoelectric element mounted thereon
KR101350543B1 (en) * 2011-10-18 2014-01-14 삼성전기주식회사 Haptic feedback device and portable electronic equipment
JP2013089117A (en) * 2011-10-20 2013-05-13 Alps Electric Co Ltd Input device
WO2013058890A1 (en) 2011-10-20 2013-04-25 Alcon Research, Ltd. Haptic footswitch treadle
US9041418B2 (en) 2011-10-25 2015-05-26 Synaptics Incorporated Input device with force sensing
FR2982050B1 (en) * 2011-11-01 2014-06-20 Nantes Ecole Centrale METHOD AND DEVICE FOR REAL-TIME SIMULATION OF COMPLEX SYSTEMS AND PROCESSES
US9582178B2 (en) 2011-11-07 2017-02-28 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
US9785281B2 (en) 2011-11-09 2017-10-10 Microsoft Technology Licensing, Llc. Acoustic touch sensitive testing
US9213482B2 (en) 2011-11-11 2015-12-15 Elan Microelectronics Corporation Touch control device and method
TWI451309B (en) * 2011-11-11 2014-09-01 Elan Microelectronics Corp Touch device and its control method
JP6373758B2 (en) * 2011-11-16 2018-08-15 ボルケーノ コーポレイション Medical measurement system and method
US10235004B1 (en) 2011-11-18 2019-03-19 Sentons Inc. Touch input detector with an integrated antenna
US11340124B2 (en) 2017-08-14 2022-05-24 Sentons Inc. Piezoresistive sensor for detecting a physical disturbance
US9594450B2 (en) 2011-11-18 2017-03-14 Sentons Inc. Controlling audio volume using touch input force
KR101771896B1 (en) 2011-11-18 2017-08-28 센톤스 아이엔씨. Localized haptic feedback
US8737062B2 (en) 2011-11-22 2014-05-27 Htc Corporation Handheld electronic device
US8837143B2 (en) 2011-11-25 2014-09-16 Htc Corporation Handheld electronic device
US9756927B2 (en) 2011-11-30 2017-09-12 Apple Inc. Mounting system for portable electronic device
FR2983989B1 (en) * 2011-12-09 2014-06-20 Thales Sa HAPTIC RESONATOR DEVICE
US8904052B2 (en) 2011-12-23 2014-12-02 Apple Inc. Combined input port
WO2013099743A1 (en) * 2011-12-27 2013-07-04 株式会社村田製作所 Tactile presentation device
US9013405B2 (en) 2011-12-28 2015-04-21 Microsoft Technology Licensing, Llc Touch-scrolling pad for computer input devices
US9983757B2 (en) * 2012-01-20 2018-05-29 Microchip Technology Incorporated Inductive touch sensor using a flexible coil
US9766704B2 (en) * 2012-01-27 2017-09-19 Visteon Global Technologies, Inc. Touch surface and microprocessor assembly
US8914254B2 (en) 2012-01-31 2014-12-16 Microsoft Corporation Latency measurement
US8493354B1 (en) 2012-08-23 2013-07-23 Immersion Corporation Interactivity model for shared feedback on mobile devices
US8711118B2 (en) 2012-02-15 2014-04-29 Immersion Corporation Interactivity model for shared feedback on mobile devices
EP3321780A1 (en) 2012-02-15 2018-05-16 Immersion Corporation High definition haptic effects generation using primitives
JP2013200863A (en) * 2012-02-23 2013-10-03 Panasonic Corp Electronic device
US8913026B2 (en) 2012-03-06 2014-12-16 Industry-University Cooperation Foundation Hanyang University System for linking and controlling terminals and user terminal used in the same
US9069394B2 (en) * 2012-03-20 2015-06-30 Google Inc. Fully clickable trackpad
WO2013154720A1 (en) 2012-04-13 2013-10-17 Tk Holdings Inc. Pressure sensor including a pressure sensitive material for use with control systems and methods of using the same
WO2013157626A1 (en) * 2012-04-20 2013-10-24 株式会社ニコン Electronic device and vibration control method
EP2882107B1 (en) * 2012-05-01 2017-07-26 Kyocera Corporation Electronic device
WO2013169843A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for manipulating framed graphical objects
WO2013169865A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
WO2013169875A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying content associated with a corresponding affordance
WO2013169842A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for selecting object within a group of objects
CN104487928B (en) 2012-05-09 2018-07-06 苹果公司 Device, method, and graphical user interface for transitioning between display states in response to a gesture
CN104471521B (en) 2012-05-09 2018-10-23 苹果公司 Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
EP3185116B1 (en) 2012-05-09 2019-09-11 Apple Inc. Device, method and graphical user interface for providing tactile feedback for operations performed in a user interface
WO2013169849A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying user interface objects corresponding to an application
JP5373236B1 (en) * 2012-05-09 2013-12-18 パナソニック株式会社 Electronics
WO2013169845A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for scrolling nested regions
WO2013169846A1 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for displaying additional information in response to a user contact
EP3096218B1 (en) 2012-05-09 2018-12-26 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
JP2015519656A (en) 2012-05-09 2015-07-09 アップル インコーポレイテッド Device, method and graphical user interface for moving and dropping user interface objects
WO2013169851A2 (en) 2012-05-09 2013-11-14 Yknots Industries Llc Device, method, and graphical user interface for facilitating user interaction with controls in a user interface
US10108265B2 (en) * 2012-05-09 2018-10-23 Apple Inc. Calibration of haptic feedback systems for input devices
US8570296B2 (en) 2012-05-16 2013-10-29 Immersion Corporation System and method for display of multiple data channels on a single haptic display
US9891709B2 (en) 2012-05-16 2018-02-13 Immersion Corporation Systems and methods for content- and context specific haptic effects using predefined haptic effects
US9522096B2 (en) * 2012-05-17 2016-12-20 Zoll Medical Corporation CPR team performance
US11590053B2 (en) 2012-05-17 2023-02-28 Zoll Medical Corporation Cameras for emergency rescue
US9892357B2 (en) 2013-05-29 2018-02-13 Cardlab, Aps. Method for remotely controlling a reprogrammable payment card
US9406011B2 (en) 2012-05-29 2016-08-02 Stratos Technologies, Inc. Virtual wallet
US9286561B2 (en) 2012-05-29 2016-03-15 Stratos Technologies, Inc. Payment card and methods
US8928607B1 (en) 2012-05-29 2015-01-06 Google Inc. Handheld input device for a computer
TW201403294A (en) * 2012-06-04 2014-01-16 Compal Electronics Inc Electronic device
US10013082B2 (en) 2012-06-05 2018-07-03 Stuart Schecter, LLC Operating system with haptic interface for minimally invasive, hand-held surgical instrument
US20150109223A1 (en) 2012-06-12 2015-04-23 Apple Inc. Haptic electromagnetic actuator
US9459160B2 (en) 2012-06-13 2016-10-04 Microsoft Technology Licensing, Llc Input device sensor configuration
US9684382B2 (en) 2012-06-13 2017-06-20 Microsoft Technology Licensing, Llc Input device configuration having capacitive and pressure sensors
US9703378B2 (en) * 2012-06-13 2017-07-11 Immersion Corporation Method and apparatus for representing user interface metaphors as physical changes on a shape-changing device
EP2862040B1 (en) 2012-06-15 2021-08-18 Nokia Technologies Oy A display suspension
US9493342B2 (en) 2012-06-21 2016-11-15 Nextinput, Inc. Wafer level MEMS force dies
EP2870445A1 (en) 2012-07-05 2015-05-13 Ian Campbell Microelectromechanical load sensor and methods of manufacturing the same
US9632648B2 (en) * 2012-07-06 2017-04-25 Lg Electronics Inc. Mobile terminal, image display device and user interface provision method using the same
US9348468B2 (en) 2013-06-07 2016-05-24 Sentons Inc. Detecting multi-touch inputs
US9466783B2 (en) 2012-07-26 2016-10-11 Immersion Corporation Suspension element having integrated piezo material for providing haptic effects to a touch screen
US9245428B2 (en) 2012-08-02 2016-01-26 Immersion Corporation Systems and methods for haptic remote control gaming
US9226796B2 (en) 2012-08-03 2016-01-05 Stryker Corporation Method for detecting a disturbance as an energy applicator of a surgical instrument traverses a cutting path
KR102603224B1 (en) 2012-08-03 2023-11-16 스트리커 코포레이션 Systems and methods for robotic surgery
US9820818B2 (en) 2012-08-03 2017-11-21 Stryker Corporation System and method for controlling a surgical manipulator based on implant parameters
US9177733B2 (en) 2012-08-06 2015-11-03 Synaptics Incorporated Touchsurface assemblies with linkages
US9040851B2 (en) 2012-08-06 2015-05-26 Synaptics Incorporated Keycap assembly with an interactive spring mechanism
US9218927B2 (en) 2012-08-06 2015-12-22 Synaptics Incorporated Touchsurface assembly with level and planar translational responsiveness via a buckling elastic component
WO2014025786A1 (en) 2012-08-06 2014-02-13 Synaptics Incorporated Touchsurface assembly utilizing magnetically enabled hinge
US9317146B1 (en) * 2012-08-23 2016-04-19 Rockwell Collins, Inc. Haptic touch feedback displays having double bezel design
KR101378891B1 (en) * 2012-08-29 2014-03-28 주식회사 하이소닉 Touch motion switch
US8868199B2 (en) 2012-08-31 2014-10-21 Greatbatch Ltd. System and method of compressing medical maps for pulse generator or database storage
US10668276B2 (en) 2012-08-31 2020-06-02 Cirtec Medical Corp. Method and system of bracketing stimulation parameters on clinician programmers
US8983616B2 (en) 2012-09-05 2015-03-17 Greatbatch Ltd. Method and system for associating patient records with pulse generators
US8903496B2 (en) 2012-08-31 2014-12-02 Greatbatch Ltd. Clinician programming system and method
US9375582B2 (en) 2012-08-31 2016-06-28 Nuvectra Corporation Touch screen safety controls for clinician programmer
US9471753B2 (en) 2012-08-31 2016-10-18 Nuvectra Corporation Programming and virtual reality representation of stimulation parameter Groups
US8761897B2 (en) 2012-08-31 2014-06-24 Greatbatch Ltd. Method and system of graphical representation of lead connector block and implantable pulse generators on a clinician programmer
US9259577B2 (en) 2012-08-31 2016-02-16 Greatbatch Ltd. Method and system of quick neurostimulation electrode configuration and positioning
US9594877B2 (en) 2012-08-31 2017-03-14 Nuvectra Corporation Virtual reality representation of medical devices
US9180302B2 (en) 2012-08-31 2015-11-10 Greatbatch Ltd. Touch screen finger position indicator for a spinal cord stimulation programming device
US8812125B2 (en) 2012-08-31 2014-08-19 Greatbatch Ltd. Systems and methods for the identification and association of medical devices
US9507912B2 (en) 2012-08-31 2016-11-29 Nuvectra Corporation Method and system of simulating a pulse generator on a clinician programmer
US9615788B2 (en) 2012-08-31 2017-04-11 Nuvectra Corporation Method and system of producing 2D representations of 3D pain and stimulation maps and implant models on a clinician programmer
US9767255B2 (en) 2012-09-05 2017-09-19 Nuvectra Corporation Predefined input for clinician programmer data entry
US8757485B2 (en) 2012-09-05 2014-06-24 Greatbatch Ltd. System and method for using clinician programmer and clinician programming data for inventory and manufacturing prediction and control
US9563239B2 (en) 2012-09-10 2017-02-07 Apple Inc. Internal computer assembly features and methods
US9046925B2 (en) 2012-09-11 2015-06-02 Dell Products L.P. Method for using the GPU to create haptic friction maps
DE112013004512T5 (en) 2012-09-17 2015-06-03 Tk Holdings Inc. Single-layer force sensor
KR102058990B1 (en) * 2012-09-19 2019-12-24 엘지전자 주식회사 Mobile device and method for controlling the same
WO2014047656A2 (en) 2012-09-24 2014-03-27 Tactus Technology, Inc. Dynamic tactile interface and methods
US9405417B2 (en) 2012-09-24 2016-08-02 Tactus Technology, Inc. Dynamic tactile interface and methods
US9196134B2 (en) 2012-10-31 2015-11-24 Immersion Corporation Method and apparatus for simulating surface features on a user interface with haptic effects
US20140139451A1 (en) * 2012-11-20 2014-05-22 Vincent Levesque Systems and Methods For Providing Mode or State Awareness With Programmable Surface Texture
KR102052960B1 (en) * 2012-11-23 2019-12-06 삼성전자주식회사 Input apparatus, display apparatus and control method thereof
FR2999742B1 (en) * 2012-12-13 2018-03-30 Dav TOUCH CONTROL INTERFACE
KR102091077B1 (en) 2012-12-14 2020-04-14 삼성전자주식회사 Mobile terminal and method for controlling feedback of an input unit, and the input unit and method therefor
KR102047689B1 (en) * 2012-12-17 2019-11-22 엘지전자 주식회사 Touch sensitive device and controlling method for providing mini-map of tactile user interface
US9202350B2 (en) * 2012-12-19 2015-12-01 Nokia Technologies Oy User interfaces and associated methods
CN107831991B (en) 2012-12-29 2020-11-27 苹果公司 Device, method and graphical user interface for determining whether to scroll or select content
AU2013368441B2 (en) 2012-12-29 2016-04-14 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
EP2939098B1 (en) 2012-12-29 2018-10-10 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
WO2014105279A1 (en) 2012-12-29 2014-07-03 Yknots Industries Llc Device, method, and graphical user interface for switching between user interfaces
CN105144057B (en) 2012-12-29 2019-05-17 苹果公司 Device, method, and graphical user interface for moving a cursor according to a change in the appearance of a control icon with simulated three-dimensional characteristics
KR102000253B1 (en) 2012-12-29 2019-07-16 애플 인크. Device, method, and graphical user interface for navigating user interface hierarchies
JP2014157450A (en) * 2013-02-15 2014-08-28 Nlt Technologies Ltd Display unit with touch sensor, and control system and control method of the same
US10578499B2 (en) 2013-02-17 2020-03-03 Microsoft Technology Licensing, Llc Piezo-actuated virtual buttons for touch surfaces
US9760278B2 (en) 2013-03-01 2017-09-12 Altaba INC. Finger expressions for touch screens
US9715300B2 (en) 2013-03-04 2017-07-25 Microsoft Technology Licensing, Llc Touch screen interaction using dynamic haptic feedback
WO2014142807A1 (en) * 2013-03-12 2014-09-18 Intel Corporation Menu system and interactions with an electronic device
US9904394B2 (en) 2013-03-13 2018-02-27 Immersion Corporation Method and devices for displaying graphical user interfaces based on user contact
US9229592B2 (en) 2013-03-14 2016-01-05 Synaptics Incorporated Shear force detection using capacitive sensors
US9384919B2 (en) 2013-03-14 2016-07-05 Synaptics Incorporated Touchsurface assembly having key guides formed in a sheet metal component
US20140267139A1 (en) * 2013-03-15 2014-09-18 Motorola Mobility Llc Touch Sensitive Surface with False Touch Protection for an Electronic Device
US9415299B2 (en) 2013-03-15 2016-08-16 Steelseries Aps Gaming device
US9409087B2 (en) 2013-03-15 2016-08-09 Steelseries Aps Method and apparatus for processing gestures
US9687730B2 (en) 2013-03-15 2017-06-27 Steelseries Aps Gaming device with independent gesture-sensitive areas
EP2778852B1 (en) 2013-03-15 2017-05-03 Immersion Corporation Programmable haptic peripheral
US9604147B2 (en) 2013-03-15 2017-03-28 Steelseries Aps Method and apparatus for managing use of an accessory
US9423874B2 (en) 2013-03-15 2016-08-23 Steelseries Aps Gaming accessory with sensory feedback device
US9213372B2 (en) 2013-04-19 2015-12-15 Synaptics Incorporated Retractable keyboard keys
EP2989525B1 (en) 2013-04-26 2019-09-25 Immersion Corporation Simulation of tangible user interface interactions and gestures using array of haptic cells
US9939900B2 (en) 2013-04-26 2018-04-10 Immersion Corporation System and method for a haptically-enabled deformable surface
US9590875B2 (en) 2013-04-29 2017-03-07 International Business Machines Corporation Content delivery infrastructure with non-intentional feedback parameter provisioning
US9448628B2 (en) 2013-05-15 2016-09-20 Microsoft Technology Licensing, Llc Localized key-click feedback
US9275386B2 (en) 2013-05-29 2016-03-01 Stratos Technologies, Inc. Method for facilitating payment with a programmable payment card
CN105452992B (en) 2013-05-30 2019-03-08 Tk控股公司 Multidimensional Trackpad
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
US9937416B2 (en) 2013-06-11 2018-04-10 Microsoft Technology Licensing, Llc Adaptive touch input controls
US10591992B2 (en) * 2013-06-17 2020-03-17 Lenovo (Singapore) Pte. Ltd. Simulation of control areas on touch surface using haptic feedback
US10120447B2 (en) 2013-06-24 2018-11-06 Northwestern University Haptic display with simultaneous sensing and actuation
US9557813B2 (en) 2013-06-28 2017-01-31 Tactus Technology, Inc. Method for reducing perceived optical distortion
JP2015027661A (en) * 2013-07-01 2015-02-12 東京パーツ工業株式会社 Tactile type solenoid and attachment structure of the same
EP2830039B1 (en) * 2013-07-24 2018-10-03 Native Instruments GmbH Method, arrangement, computer program and computer-readable storage means for controlling at least one parameter or at least one object using capacity sensing input elements
US9280259B2 (en) 2013-07-26 2016-03-08 Blackberry Limited System and method for manipulating an object in a three-dimensional desktop environment
CN103474280B (en) * 2013-08-19 2015-09-02 苏州达方电子有限公司 Vibrating keyboard
US9229612B2 (en) 2013-08-27 2016-01-05 Industrial Technology Research Institute Electronic device, controlling method for screen, and program storage medium thereof
US9390598B2 (en) 2013-09-11 2016-07-12 Blackberry Limited Three dimensional haptics hybrid modeling
US10037130B2 (en) * 2013-09-13 2018-07-31 Samsung Electronics Co., Ltd. Display apparatus and method for improving visibility of the same
US9459715B1 (en) 2013-09-20 2016-10-04 Sentons Inc. Using spectral control in detecting touch input
JP2015070729A (en) * 2013-09-30 2015-04-13 日本電産コパル株式会社 Information terminal processing device and vibration generator system
CN105612476B (en) * 2013-10-08 2019-09-20 Tk控股公司 Self-calibrating tactile haptic multi-touch, multifunction switch panel
US10328344B2 (en) * 2013-10-11 2019-06-25 Valve Corporation Game controller systems and methods
US9514902B2 (en) 2013-11-07 2016-12-06 Microsoft Technology Licensing, Llc Controller-less quick tactile feedback keyboard
US9393493B2 (en) 2013-11-12 2016-07-19 Immersion Corporation Gaming device with haptic effect isolated to user input elements
US9619029B2 (en) 2013-11-14 2017-04-11 Immersion Corporation Haptic trigger control system
US9164587B2 (en) 2013-11-14 2015-10-20 Immersion Corporation Haptic spatialization system
US9213409B2 (en) 2013-11-25 2015-12-15 Immersion Corporation Dual stiffness suspension system
US9639158B2 (en) 2013-11-26 2017-05-02 Immersion Corporation Systems and methods for generating friction and vibrotactile effects
US9489048B2 (en) * 2013-12-13 2016-11-08 Immersion Corporation Systems and methods for optical transmission of haptic display parameters
EP3086207B1 (en) * 2013-12-18 2019-04-17 Panasonic Intellectual Property Management Co., Ltd. Electronic device for generating vibrations
US9448631B2 (en) 2013-12-31 2016-09-20 Microsoft Technology Licensing, Llc Input device haptics and pressure sensing
US9244532B2 (en) 2013-12-31 2016-01-26 Immersion Corporation Systems and methods for controlling multiple displays with single controller and haptic enabled user interface
JP2015130168A (en) * 2013-12-31 2015-07-16 イマージョン コーポレーションImmersion Corporation Friction augmented control, and method to convert buttons of touch control panels to friction augmented controls
US9902611B2 (en) 2014-01-13 2018-02-27 Nextinput, Inc. Miniaturized and ruggedized wafer level MEMs force sensors
US20150242037A1 (en) 2014-01-13 2015-08-27 Apple Inc. Transparent force sensor with strain relief
JP2015138416A (en) * 2014-01-22 2015-07-30 キヤノン株式会社 Electronic device, its control method and program
US9817489B2 (en) 2014-01-27 2017-11-14 Apple Inc. Texture capture stylus and method
KR102205283B1 (en) 2014-02-12 2021-01-20 삼성전자주식회사 Electro device executing at least one application and method for controlling thereof
EP3382512A1 (en) 2014-02-21 2018-10-03 Northwestern University Haptic display with simultaneous sensing and actuation
US9396629B1 (en) 2014-02-21 2016-07-19 Apple Inc. Haptic modules with independently controllable vertical and horizontal mass movements
JP5843908B2 (en) * 2014-03-07 2016-01-13 株式会社コナミデジタルエンタテインメント GAME CONTROL DEVICE, GAME SYSTEM, AND PROGRAM
US9594429B2 (en) 2014-03-27 2017-03-14 Apple Inc. Adjusting the level of acoustic and haptic output in haptic devices
US11625145B2 (en) 2014-04-28 2023-04-11 Ford Global Technologies, Llc Automotive touchscreen with simulated texture for the visually impaired
US9542801B1 (en) 2014-04-28 2017-01-10 Bally Gaming, Inc. Wearable wagering game system and methods
US10579252B2 (en) 2014-04-28 2020-03-03 Ford Global Technologies, Llc Automotive touchscreen with simulated texture for the visually impaired
US9829979B2 (en) * 2014-04-28 2017-11-28 Ford Global Technologies, Llc Automotive touchscreen controls with simulated texture for haptic feedback
US10133351B2 (en) 2014-05-21 2018-11-20 Apple Inc. Providing haptic output based on a determined orientation of an electronic device
US10133314B2 (en) 2014-05-26 2018-11-20 Apple Inc. Portable computing system
US10228721B2 (en) 2014-05-26 2019-03-12 Apple Inc. Portable computing system
US9886090B2 (en) * 2014-07-08 2018-02-06 Apple Inc. Haptic notifications utilizing haptic input devices
CN204009771U (en) * 2014-08-06 2014-12-10 胡竞韬 A tactile controller
US9690381B2 (en) 2014-08-21 2017-06-27 Immersion Corporation Systems and methods for shape input and output for a haptically-enabled deformable surface
US9838009B2 (en) 2014-08-27 2017-12-05 Continental Automotive Systems, Inc. Switch with user feedback
US9984838B2 (en) 2014-08-29 2018-05-29 Hewlett-Packard Development Company, L.P. Click pad
KR102096146B1 (en) 2014-09-02 2020-04-28 애플 인크. Semantic framework for variable haptic output
US10297119B1 (en) 2014-09-02 2019-05-21 Apple Inc. Feedback device in an electronic device
US9858751B2 (en) 2014-09-26 2018-01-02 Bally Gaming, Inc. Wagering game wearables
WO2016053901A1 (en) 2014-09-30 2016-04-07 Apple Inc Configurable force-sensitive input structures for electronic devices
CN207586791U (en) * 2014-09-30 2018-07-06 苹果公司 Portable computing system
US10466826B2 (en) 2014-10-08 2019-11-05 Joyson Safety Systems Acquisition Llc Systems and methods for illuminating a track pad system
WO2016069989A1 (en) * 2014-10-30 2016-05-06 Intuitive Surgical Operations, Inc. System and method for an articulated arm based tool guide
KR102368044B1 (en) * 2014-11-03 2022-02-25 삼성전자주식회사 User terminal device and method for controlling the user terminal device thereof
US9910493B2 (en) 2014-11-07 2018-03-06 Faurecia Interior Systems, Inc. Suspension component for a haptic touch panel assembly
US9720500B2 (en) 2014-11-07 2017-08-01 Faurecia Interior Systems, Inc Haptic touch panel assembly for a vehicle
US10185396B2 (en) 2014-11-12 2019-01-22 Immersion Corporation Haptic trigger modification system
US9174134B1 (en) * 2014-11-12 2015-11-03 Immersion Corporation Peripheral device with haptic diminishment prevention component
US9400570B2 (en) 2014-11-14 2016-07-26 Apple Inc. Stylus with inertial sensor
US9535550B2 (en) 2014-11-25 2017-01-03 Immersion Corporation Systems and methods for deformation-based haptic effects
DE102014224102A1 (en) * 2014-11-26 2016-06-02 Robert Bosch Gmbh A method for tactile interaction of a user with an electronic device and electronic device thereto
DE102014224110A1 (en) * 2014-11-26 2016-06-02 Robert Bosch Gmbh A method for tactile interaction of a user with an electronic device and electronic device thereto
US9590808B2 (en) 2014-12-08 2017-03-07 International Business Machines Corporation Obfuscated passwords
CN104393025B (en) * 2014-12-09 2017-08-11 京东方科技集团股份有限公司 Array substrate, touch display panel and touch display device
US9575573B2 (en) 2014-12-18 2017-02-21 Apple Inc. Stylus with touch sensor
US9632582B2 (en) * 2014-12-22 2017-04-25 Immersion Corporation Magnetic suspension system for touch screens and touch surfaces
US9589432B2 (en) 2014-12-22 2017-03-07 Immersion Corporation Haptic actuators having programmable magnets with pre-programmed magnetic surfaces and patterns for producing varying haptic effects
US9937839B2 (en) * 2014-12-31 2018-04-10 Harman International Industries, Incorporated Feedback by modifying stiffness
WO2016111829A1 (en) 2015-01-09 2016-07-14 Apple Inc. Features of a flexible connector in a portable computing device
US10162390B2 (en) 2015-01-16 2018-12-25 Apple Inc. Hybrid acoustic EMI foam for use in a personal computer
US9798409B1 (en) 2015-03-04 2017-10-24 Apple Inc. Multi-force input device
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
CN104731333B (en) * 2015-03-25 2018-11-09 联想(北京)有限公司 A wearable electronic device
US10613629B2 (en) 2015-03-27 2020-04-07 Chad Laurendeau System and method for force feedback interface devices
US20170045981A1 (en) 2015-08-10 2017-02-16 Apple Inc. Devices and Methods for Processing Touch Inputs Based on Their Intensities
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10126861B2 (en) 2015-05-08 2018-11-13 Synaptics Incorporated Force sensor substrate
DE102015006605B3 (en) * 2015-05-21 2016-09-22 Audi Ag Operating device and method for controlling functional units of a motor vehicle and motor vehicle
US10416799B2 (en) 2015-06-03 2019-09-17 Microsoft Technology Licensing, Llc Force sensing and inadvertent input control of an input device
US10222889B2 (en) 2015-06-03 2019-03-05 Microsoft Technology Licensing, Llc Force inputs and cursor control
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
WO2016201235A1 (en) 2015-06-10 2016-12-15 Nextinput, Inc. Ruggedized wafer level mems force sensor with a tolerance trench
EP3314381B1 (en) * 2015-06-26 2023-03-08 Microsoft Technology Licensing, LLC Passive haptics as reference for active haptics
CN107850923B (en) 2015-07-02 2021-03-12 瑟克公司 Method and system for providing mechanical movement of a surface of a touch sensor
DE102015008571B4 (en) * 2015-07-02 2017-08-24 Audi Ag Motor vehicle operating device with movable user interface
US20170024010A1 (en) 2015-07-21 2017-01-26 Apple Inc. Guidance device for the sensory impaired
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
DK179096B1 (en) * 2015-08-10 2017-10-30 Apple Inc Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
JP6625372B2 (en) * 2015-08-27 2019-12-25 株式会社デンソーテン Input device and in-vehicle device
US10481645B2 (en) 2015-09-11 2019-11-19 Lucan Patent Holdco, LLC Secondary gesture input mechanism for touchscreen devices
US10048811B2 (en) 2015-09-18 2018-08-14 Sentons Inc. Detecting touch input provided by signal transmitting stylus
US11194398B2 (en) 2015-09-26 2021-12-07 Intel Corporation Technologies for adaptive rendering using 3D sensors
US9850957B2 (en) 2015-09-30 2017-12-26 Apple Inc. Electronic device with haptic actuation stiction release after non-movement threshold time period and related methods
US10409412B1 (en) 2015-09-30 2019-09-10 Apple Inc. Multi-input element for electronic device
TWI628569B (en) 2015-09-30 2018-07-01 蘋果公司 Keyboard with adaptive input row
JP6194532B2 (en) * 2015-11-13 2017-09-13 株式会社コナミデジタルエンタテインメント GAME CONTROL DEVICE, GAME SYSTEM, AND PROGRAM
DE102015120605A1 (en) * 2015-11-27 2017-06-01 Valeo Schalter Und Sensoren Gmbh Operating device for a motor vehicle with drive device for outputting a haptic feedback and motor vehicle
US10379615B2 (en) 2015-12-09 2019-08-13 International Business Machines Corporation Providing haptic feedback to a user of a touch surface display
JP6359507B2 (en) * 2015-12-10 2018-07-18 株式会社東海理化電機製作所 Vibration presentation device
CN106371579B (en) * 2015-12-24 2019-05-21 北京智谷睿拓技术服务有限公司 Method and apparatus for controlling flexible deformation of a virtual reality interaction controller, and virtual reality interactive system
US9927887B2 (en) * 2015-12-31 2018-03-27 Synaptics Incorporated Localized haptics for two fingers
US10061385B2 (en) 2016-01-22 2018-08-28 Microsoft Technology Licensing, Llc Haptic feedback for a touch input device
US10429935B2 (en) 2016-02-08 2019-10-01 Comcast Cable Communications, Llc Tremor correction for gesture recognition
CN105536249B (en) * 2016-02-18 2023-09-01 高创(苏州)电子有限公司 game system
CN108885477A (en) * 2016-02-25 2018-11-23 瑟克公司 Touch panel system with multiple tracking modes and a mechanical force position sensor integrated with capacitive position tracking
US10152132B2 (en) 2016-02-26 2018-12-11 Immersion Corporation Method and apparatus for enabling heavy floating touchscreen haptics assemblies and passive braking system
US10772394B1 (en) 2016-03-08 2020-09-15 Apple Inc. Tactile output for wearable device
KR102456953B1 (en) 2016-03-11 2022-10-21 한국전자통신연구원 apparatus and method for providing image
US20170269687A1 (en) * 2016-03-17 2017-09-21 Google Inc. Methods and apparatus to provide haptic feedback for computing devices
JP6999567B2 (en) 2016-03-23 2022-02-10 ベーア-ヘラー サーモコントロール ゲーエムベーハー Operation unit
US10118696B1 (en) 2016-03-31 2018-11-06 Steven M. Hoffberg Steerable rotating projectile
EP3446795B1 (en) * 2016-04-19 2022-11-30 Nippon Telegraph and Telephone Corporation Pseudo tactile force generation device
US10585480B1 (en) 2016-05-10 2020-03-10 Apple Inc. Electronic device with an input device having a haptic engine
US10248781B2 (en) 2016-05-16 2019-04-02 Blackberry Limited Method of passcode unlock using force detection
US9829981B1 (en) 2016-05-26 2017-11-28 Apple Inc. Haptic output device
US10452211B2 (en) 2016-05-27 2019-10-22 Synaptics Incorporated Force sensor with uniform response in an axis
JP6784518B2 (en) * 2016-06-10 2020-11-11 株式会社ソニー・インタラクティブエンタテインメント Operation device
DK179823B1 (en) 2016-06-12 2019-07-12 Apple Inc. Devices, methods, and graphical user interfaces for providing haptic feedback
DK201670737A1 (en) 2016-06-12 2018-01-22 Apple Inc Devices, Methods, and Graphical User Interfaces for Providing Haptic Feedback
US10698485B2 (en) * 2016-06-27 2020-06-30 Microsoft Technology Licensing, Llc Augmenting text narration with haptic feedback
US10649529B1 (en) 2016-06-28 2020-05-12 Apple Inc. Modification of user-perceived feedback of an input device using acoustic or haptic output
EP3481662B1 (en) * 2016-07-08 2020-06-10 Behr-Hella Thermocontrol GmbH Operator control unit for a vehicle
US20180011548A1 (en) * 2016-07-08 2018-01-11 Apple Inc. Interacting with touch devices proximate to other input devices
US10845878B1 (en) 2016-07-25 2020-11-24 Apple Inc. Input device with tactile feedback
US10318065B2 (en) 2016-08-03 2019-06-11 Apple Inc. Input device having a dimensionally configurable input area
CN109643163A (en) * 2016-08-29 2019-04-16 索尼公司 Information processing apparatus, information processing method and program
US9870033B1 (en) 2016-08-30 2018-01-16 Apple Inc. Sensor assemblies for electronic devices
DK179278B1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, methods and graphical user interfaces for haptic mixing
DK201670720A1 (en) 2016-09-06 2018-03-26 Apple Inc Devices, Methods, and Graphical User Interfaces for Generating Tactile Outputs
EP3291054B8 (en) * 2016-09-06 2019-07-24 Apple Inc. Devices, methods, and graphical user interfaces for haptic mixing
US10372214B1 (en) 2016-09-07 2019-08-06 Apple Inc. Adaptable user-selectable input area in an electronic device
WO2018046302A1 (en) * 2016-09-09 2018-03-15 Behr-Hella Thermocontrol Gmbh Operating unit for a device, in particular for a vehicle component
US10871860B1 (en) 2016-09-19 2020-12-22 Apple Inc. Flexible sensor configured to detect user inputs
US10444887B2 (en) 2016-09-20 2019-10-15 Cypress Semiconductor Corporation Force sensing
KR20180035559A (en) * 2016-09-29 2018-04-06 엘지이노텍 주식회사 Haptic feedback apparatus
DE102016220858A1 (en) * 2016-10-24 2018-04-26 Preh Car Connect Gmbh Display device with a touch-sensitive display unit
US10908741B2 (en) 2016-11-10 2021-02-02 Sentons Inc. Touch input detection along device sidewall
US10744053B2 (en) * 2016-12-07 2020-08-18 Stryker Corporation Haptic systems and methods for a user interface of a patient support apparatus
US10296144B2 (en) 2016-12-12 2019-05-21 Sentons Inc. Touch input detection with shared receivers
WO2018112025A1 (en) 2016-12-16 2018-06-21 Mako Surgical Corp. Techniques for modifying tool operation in a surgical robotic system based on comparing actual and commanded states of the tool relative to a surgical site
KR102649009B1 (en) 2016-12-20 2024-03-20 삼성전자주식회사 Display apparatus and the controlling method thereof
US10275032B2 (en) 2016-12-22 2019-04-30 Immersion Corporation Pressure-sensitive suspension system for a haptic device
CN110431515B (en) 2017-01-04 2023-09-12 乔伊森安全系统收购有限责任公司 Switch assembly with force dependent variable scroll speed and method of use
US11001147B2 (en) 2017-02-01 2021-05-11 Behr-Hella Thermocontrol Gmbh Operating unit for a device, in particular for a vehicle component
DE102017208238B4 (en) * 2017-05-16 2018-12-13 Behr-Hella Thermocontrol Gmbh Operating unit for a device, in particular for a vehicle component
US10126877B1 (en) 2017-02-01 2018-11-13 Sentons Inc. Update of reference data for touch input detection
KR102710628B1 (en) * 2017-02-03 2024-09-27 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN116907693A (en) 2017-02-09 2023-10-20 触控解决方案股份有限公司 Integrated digital force sensor and related manufacturing method
WO2018148510A1 (en) 2017-02-09 2018-08-16 Nextinput, Inc. Integrated piezoresistive and piezoelectric fusion force sensor
DE102017103670A1 (en) * 2017-02-22 2018-08-23 Preh Gmbh Input device with an actuator-moved input part, with tuning of the mechanical natural frequencies to produce improved haptic feedback
US10585522B2 (en) 2017-02-27 2020-03-10 Sentons Inc. Detection of non-touch inputs using a signature
US10437359B1 (en) 2017-02-28 2019-10-08 Apple Inc. Stylus with external magnetic influence
US10169943B2 (en) 2017-04-05 2019-01-01 Hart Intercivic, Inc. Haptic feedback apparatus and method for an election voting system
KR102389063B1 (en) 2017-05-11 2022-04-22 삼성전자주식회사 Method and electronic device for providing haptic feedback
DK201770372A1 (en) 2017-05-16 2019-01-08 Apple Inc. Tactile feedback for locked device user interfaces
US10489567B2 (en) * 2017-05-26 2019-11-26 Visa International Service Association Accessible secure data entry
USD847052S1 (en) * 2017-06-19 2019-04-30 Biraj Ray Portable universal ground control system
US10969870B2 (en) * 2017-07-10 2021-04-06 Sharp Kabushiki Kaisha Input device
US10732743B2 (en) 2017-07-18 2020-08-04 Apple Inc. Concealable input region for an electronic device having microperforations
CN111448446B (en) 2017-07-19 2022-08-30 触控解决方案股份有限公司 Strain transferring stack in MEMS force sensor
US10980687B2 (en) * 2017-07-19 2021-04-20 Stryker Corporation Techniques for generating auditory and haptic output with a vibrational panel of a patient support apparatus
US10775889B1 (en) 2017-07-21 2020-09-15 Apple Inc. Enclosure with locally-flexible regions
US11423686B2 (en) 2017-07-25 2022-08-23 Qorvo Us, Inc. Integrated fingerprint and force sensor
WO2019023552A1 (en) 2017-07-27 2019-01-31 Nextinput, Inc. A wafer bonded piezoresistive and piezoelectric force sensor and related methods of manufacture
US10455320B2 (en) 2017-08-02 2019-10-22 Body Beats, Llc System, method and apparatus for translating, converting and/or transforming audio energy into haptic and/or visual representation
US11580829B2 (en) 2017-08-14 2023-02-14 Sentons Inc. Dynamic feedback for haptics
US10768747B2 (en) 2017-08-31 2020-09-08 Apple Inc. Haptic realignment cues for touch-input displays
DE102017215581A1 (en) 2017-09-05 2019-03-07 Zf Friedrichshafen Ag Haptic feedback for touch sensitive panel device
US10732676B2 (en) 2017-09-06 2020-08-04 Apple Inc. Illuminated device enclosure with dynamic trackpad
US11054932B2 (en) 2017-09-06 2021-07-06 Apple Inc. Electronic device having a touch sensor, force sensor, and haptic actuator in an integrated module
US10556252B2 (en) 2017-09-20 2020-02-11 Apple Inc. Electronic device having a tuned resonance haptic actuation system
US10768738B1 (en) 2017-09-27 2020-09-08 Apple Inc. Electronic device having a haptic actuator with magnetic augmentation
DE102018119590A1 (en) * 2017-10-02 2019-04-04 Preh Gmbh Control element with improved feel
US11579028B2 (en) 2017-10-17 2023-02-14 Nextinput, Inc. Temperature coefficient of offset compensation for force sensor and strain gauge
US10481262B1 (en) * 2017-10-19 2019-11-19 Facebook Technologies, Llc Optical sensor for measuring displacement of a haptic plate
WO2019090057A1 (en) 2017-11-02 2019-05-09 Nextinput, Inc. Sealed force sensor with etch stop layer
WO2019099821A1 (en) 2017-11-16 2019-05-23 Nextinput, Inc. Force attenuator for force sensor
US11132709B2 (en) * 2017-11-30 2021-09-28 International Business Machines Corporation Implementation of physical changes to devices for effective electronic content reception
US10440848B2 (en) 2017-12-20 2019-10-08 Immersion Corporation Conformable display with linear actuator
US10360774B1 (en) 2018-01-05 2019-07-23 Immersion Corporation Method and device for enabling pitch control for a haptic effect
US10216231B1 (en) * 2018-02-20 2019-02-26 Nvf Tech Ltd Moving magnet actuator for haptic alerts
KR102413936B1 (en) * 2018-02-21 2022-06-28 삼성전자주식회사 Electronic device comprisng display with switch
US11712637B1 (en) 2018-03-23 2023-08-01 Steven M. Hoffberg Steerable disk or ball
DE102018107382B3 (en) * 2018-03-28 2019-05-29 Preh Gmbh Touch-sensitive input device with improved haptic generation
JP6940698B2 (en) * 2018-05-18 2021-09-29 アルプスアルパイン株式会社 Input device
US11314410B2 (en) * 2018-05-29 2022-04-26 Asustek Computer Inc. Mobile device
US20190369792A1 (en) * 2018-05-30 2019-12-05 Wuhan China Star Optoelectronics Technology Co., Ltd. Touch display module and display device
US11504201B2 (en) 2018-05-31 2022-11-22 Covidien Lp Haptic touch feedback surgical device for palpating tissue
US10534451B2 (en) 2018-06-01 2020-01-14 Google Llc Trackpad with capacitive force sensing
US10459542B1 (en) * 2018-06-01 2019-10-29 Google Llc Trackpad with capacitive force sensing and haptic feedback
US10942571B2 (en) 2018-06-29 2021-03-09 Apple Inc. Laptop computing device with discrete haptic regions
US10936071B2 (en) 2018-08-30 2021-03-02 Apple Inc. Wearable electronic device with haptic rotatable input
US10613678B1 (en) 2018-09-17 2020-04-07 Apple Inc. Input device with haptic feedback
WO2020068876A1 (en) 2018-09-24 2020-04-02 Interlink Electronics, Inc. Multi-modal touchpad
US10966007B1 (en) 2018-09-25 2021-03-30 Apple Inc. Haptic output system
TWI687856B (en) * 2018-10-11 2020-03-11 緯創資通股份有限公司 Input device and electronic apparatus therewith
USD950565S1 (en) * 2018-12-11 2022-05-03 Intel Corporation Convertible electronic device
DE202018006400U1 (en) 2018-12-18 2020-07-27 Volkswagen Aktiengesellschaft Control device for a motor vehicle
US10635202B1 (en) * 2018-12-18 2020-04-28 Valve Corporation Dynamic sensor assignment
USD914021S1 (en) * 2018-12-18 2021-03-23 Intel Corporation Touchpad display screen for computing device
US10962427B2 (en) 2019-01-10 2021-03-30 Nextinput, Inc. Slotted MEMS force sensor
US10839636B2 (en) 2019-01-15 2020-11-17 Igt Programmable haptic force feedback sensations in electronic wagering games
KR102535015B1 (en) 2019-01-22 2023-05-22 삼성디스플레이 주식회사 Display device and method for driving the same
US10905946B2 (en) 2019-02-28 2021-02-02 Valve Corporation Continuous controller calibration
US11675438B2 (en) 2019-02-28 2023-06-13 Samsung Display Co., Ltd. Display device and sound providing method of the display device
CN113490904A (en) 2019-03-27 2021-10-08 英特尔公司 Smart display panel device and related method
DE102019112461A1 (en) * 2019-05-13 2020-11-19 Preh Gmbh Input arrangement with active haptic feedback and interference suppression
US20220202102A1 (en) 2019-05-17 2022-06-30 Philip Morris Products S.A. An aerosol-generating system and haptic output elements for an aerosol-generating system
US11379016B2 (en) 2019-05-23 2022-07-05 Intel Corporation Methods and apparatus to operate closed-lid portable computers
WO2021054936A1 (en) * 2019-09-16 2021-03-25 Hewlett-Packard Development Company, L.P. Haptic feedback for computing systems
US11543873B2 (en) 2019-09-27 2023-01-03 Intel Corporation Wake-on-touch display screen devices and related methods
US11733761B2 (en) 2019-11-11 2023-08-22 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US20220382393A1 (en) * 2019-11-19 2022-12-01 Hewlett-Packard Development Company, L.P. Input devices
CN210836042U (en) * 2019-12-13 2020-06-23 湃瑞电子科技(苏州)有限公司 Touch panel and keyboard
KR20210077813A (en) 2019-12-17 2021-06-28 삼성디스플레이 주식회사 Display device and method of haptic feedback of the same
US11809535B2 (en) 2019-12-23 2023-11-07 Intel Corporation Systems and methods for multi-modal user device authentication
US11360528B2 (en) 2019-12-27 2022-06-14 Intel Corporation Apparatus and methods for thermal management of electronic user devices based on user activity
US11422629B2 (en) 2019-12-30 2022-08-23 Joyson Safety Systems Acquisition Llc Systems and methods for intelligent waveform interruption
CN111228793B (en) * 2020-01-21 2021-11-19 腾讯科技(深圳)有限公司 Interactive interface display method and device, storage medium and electronic device
KR102339031B1 (en) * 2020-01-28 2021-12-14 한국과학기술원 Wearable tactile display device for presentation of continuous parameters using multiple skin stretch tactors and operating method thereof
US10996693B1 (en) 2020-02-17 2021-05-04 Robert Bosch Gmbh Haptic feedback actuation via open/closed loop control system
CN111208317B (en) * 2020-02-26 2021-07-02 深迪半导体(绍兴)有限公司 MEMS inertial sensor, application method and electronic equipment
US11747857B2 (en) 2020-06-02 2023-09-05 Futurisks, LLC Wearable security device and charging band, system and method
US11024135B1 (en) 2020-06-17 2021-06-01 Apple Inc. Portable electronic device having a haptic button assembly
CN112083827A (en) * 2020-08-21 2020-12-15 欧菲微电子技术有限公司 Touch control assembly control method and device, touch control assembly and terminal
CN112653791B (en) * 2020-12-21 2022-11-08 维沃移动通信有限公司 Incoming call answering method and device, electronic equipment and readable storage medium
US11543931B2 (en) * 2021-01-27 2023-01-03 Ford Global Technologies, Llc Systems and methods for interacting with a tabletop model using a mobile device
US11797091B2 (en) 2021-06-24 2023-10-24 Microsoft Technology Licensing, Llc Computing device with haptic trackpad
CN115967822A (en) * 2021-10-12 2023-04-14 北京字跳网络技术有限公司 Information display method and device, electronic equipment and storage medium
US11893156B2 (en) 2021-10-25 2024-02-06 Dell Products L.P. Information handling system touchpad with mechanical uniform touch response
GB2612856A (en) * 2021-11-11 2023-05-17 Cirrus Logic Int Semiconductor Ltd User input device
CN118435153A (en) * 2021-12-23 2024-08-02 波瑞阿斯技术公司 Touch pad system with piezoelectric actuator
US20240329765A1 (en) * 2023-03-27 2024-10-03 Cirque Corporation Pressure Capacitive Reference Fixed to a Housing

Family Cites Families (518)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US424823A (en) * 1890-04-01 Road-scraper
US2008A (en) * 1841-03-18 Gas-lamp for conducting gas from an elevated burner to one below it
US667618A (en) * 1900-07-18 1901-02-05 Eugene Louis Doyen Photographic apparatus.
US1616723A (en) * 1915-04-06 1927-02-08 Zeiss Carl Fa Finder for photographic apparatus
US3157853A (en) 1957-12-06 1964-11-17 Hirsch Joseph Tactile communication system
US2972140A (en) 1958-09-23 1961-02-14 Hirsch Joseph Apparatus and method for communication through the sense of touch
GB958325A (en) 1962-07-08 1964-05-21 Communications Patents Ltd Improvements in or relating to ground-based flight training or simulating apparatus
US3497668A (en) 1966-08-25 1970-02-24 Joseph Hirsch Tactile control system
US3517446A (en) 1967-04-19 1970-06-30 Singer General Precision Vehicle trainer controls and control loading
US3623064A (en) 1968-10-11 1971-11-23 Bell & Howell Co Paging receiver having cycling eccentric mass
US3903614A (en) 1970-03-27 1975-09-09 Singer Co Apparatus for simulating aircraft control loading
US3919691A (en) 1971-05-26 1975-11-11 Bell Telephone Labor Inc Tactile man-machine communication system
US3875488A (en) 1973-02-15 1975-04-01 Raytheon Co Inertially stabilized gimbal platform
US3902687A (en) 1973-06-25 1975-09-02 Robert E Hightower Aircraft indicator system
US3923166A (en) 1973-10-11 1975-12-02 Nasa Remote manipulator system
US4023290A (en) 1974-01-21 1977-05-17 Josephson Joseph P Chart device
US3911416A (en) 1974-08-05 1975-10-07 Motorola Inc Silent call pager
US4108164A (en) 1976-10-01 1978-08-22 Hall Sr Henry W Standard bending profile jacket
US4101884A (en) 1977-02-03 1978-07-18 Benton Jr Lawrence M Visual display with magnetic overlay
US4104603A (en) 1977-02-16 1978-08-01 Hei, Inc. Tactile feel device
US4108146A (en) 1977-05-16 1978-08-22 Theodore Alan Golden Bendable thermal pack unit
US4160508A (en) 1977-08-19 1979-07-10 Nasa Controller arm for a remotely related slave arm
US4242823A (en) * 1977-08-30 1981-01-06 John Bruno Magnetically attractive display device
US4127752A (en) 1977-10-13 1978-11-28 Sheldahl, Inc. Tactile touch switch panel
FR2411603A2 (en) 1977-12-19 1979-07-13 Zarudiansky Alain Device and method for recording, restitution and synthesis of tactile sensations
US4262549A (en) 1978-05-10 1981-04-21 Schwellenbach Donald D Variable mechanical vibrator
US4236325A (en) 1978-12-26 1980-12-02 The Singer Company Simulator control loading inertia compensator
US4484179A (en) 1980-04-16 1984-11-20 At&T Bell Laboratories Touch position sensitive surface
US4334280A (en) 1980-06-09 1982-06-08 Texas Instruments Incorporated System and method for providing an audible sound and a tactile feedback in an electronic data processing system
US4464117A (en) 1980-08-27 1984-08-07 Dr. Ing. Reiner Foerst Gmbh Driving simulator apparatus
US4342920A (en) 1980-10-15 1982-08-03 Bucknam Donald C Power plant and process utilizing gravitational force
NL8006091A (en) 1980-11-07 1982-06-01 Fokker Bv Flight simulator.
FR2494465B1 (en) 1980-11-14 1987-02-13 Epd Engineering Projectdevelop Pocket computer
US4333070A (en) 1981-02-06 1982-06-01 Barnes Robert W Motor vehicle fuel-waste indicator
US4599070A (en) 1981-07-29 1986-07-08 Control Interface Company Limited Aircraft simulator and simulated control system therefor
US4414537A (en) 1981-09-15 1983-11-08 Bell Telephone Laboratories, Incorporated Digital data entry glove interface device
US4436188A (en) 1981-11-18 1984-03-13 Jones Cecil R Controlled motion apparatus
EP0085518B1 (en) 1982-01-22 1989-08-16 British Aerospace Public Limited Company Control apparatus
US4542375A (en) 1982-02-11 1985-09-17 At&T Bell Laboratories Deformable touch sensitive surface
US4484191A (en) 1982-06-14 1984-11-20 Vavra George S Tactile signaling systems for aircraft
US4560983A (en) 1982-09-17 1985-12-24 Ampex Corporation Dynamically interactive responsive control device and system
US4477043A (en) 1982-12-15 1984-10-16 The United States Of America As Represented By The Secretary Of The Air Force Biodynamic resistant control stick
DE3366764D1 (en) 1983-01-28 1986-11-13 Ibm A stylus or pen for interactive use with a graphics input tablet
US4557275A (en) 1983-05-20 1985-12-10 Dempsey Jr Levi T Biofeedback system
GB2142711A (en) 1983-07-04 1985-01-23 Philips Electronic Associated Manually operable x-y signal generator
US4604016A (en) 1983-08-03 1986-08-05 Joyce Stephen A Multi-dimensional force-torque hand controller having force feedback
US4550221A (en) 1983-10-07 1985-10-29 Scott Mabusth Touch sensitive control device
US4581491A (en) 1984-05-04 1986-04-08 Research Corporation Wearable tactile sensory aid providing information on voice pitch and intonation patterns
US4603284A (en) 1984-06-05 1986-07-29 Unimation, Inc. Control system for manipulator apparatus with resolved compliant motion control
US4584625A (en) * 1984-09-11 1986-04-22 Kellogg Nelson R Capacitive tactile sensor
US4794384A (en) 1984-09-27 1988-12-27 Xerox Corporation Optical translator device
US4782327A (en) 1985-01-02 1988-11-01 Victor B. Kley Computer control
US4935728A (en) 1985-01-02 1990-06-19 Altra Corporation Computer control
US4791416A (en) 1985-02-05 1988-12-13 Zenith Electronics Corporation Touch control system for controllable apparatus
US4715235A (en) 1985-03-04 1987-12-29 Asahi Kasei Kogyo Kabushiki Kaisha Deformation sensitive electroconductive knitted or woven fabric and deformation sensitive electroconductive device comprising the same
JPH0537531Y2 (en) 1985-06-11 1993-09-22
US5078152A (en) 1985-06-23 1992-01-07 Loredan Biomedical, Inc. Method for diagnosis and/or training of proprioceptor feedback capabilities in a muscle and joint system of a human patient
JPH047371Y2 (en) 1985-10-02 1992-02-27
US4713007A (en) 1985-10-11 1987-12-15 Alban Eugene P Aircraft controls simulator
US5275174B1 (en) 1985-10-30 1998-08-04 Jonathan A Cook Repetitive strain injury assessment
NL8503096A (en) 1985-11-11 1987-06-01 Fokker Bv Simulator of mechanical properties of operating system.
US4891764A (en) 1985-12-06 1990-01-02 Tensor Development Inc. Program controlled force measurement and control system
US5103404A (en) 1985-12-06 1992-04-07 Tensor Development, Inc. Feedback for a manipulator
US4934694A (en) 1985-12-06 1990-06-19 Mcintosh James L Computer controlled exercise system
NL8600453A (en) 1986-02-24 1987-09-16 Tieman F J Bv Display operating device and method for manufacturing the same
JPH085018B2 (en) 1986-02-26 1996-01-24 株式会社日立製作所 Remote manipulation method and apparatus
US4757453A (en) 1986-03-25 1988-07-12 Nasiff Roger E Body activity monitor using piezoelectric transducers on arms and legs
AT387100B (en) 1986-05-06 1988-11-25 Siemens Ag Oesterreich Tactile dots or picture display
US4689449A (en) 1986-10-03 1987-08-25 Massachusetts Institute Of Technology Tremor suppressing hand controls
JPH0764723B2 (en) 1986-10-15 1995-07-12 鐘紡株式会社 Method for manufacturing enteric coated drug
NL8602624A (en) 1986-10-20 1988-05-16 Oce Nederland Bv Input device with tactile feedback.
US4771344A (en) 1986-11-13 1988-09-13 James Fallacaro System for enhancing audio and/or visual presentation
US4795296A (en) 1986-11-17 1989-01-03 California Institute Of Technology Hand-held robot end effector controller having movement and force control
US4839634A (en) 1986-12-01 1989-06-13 More Edward S Electro-optic slate for input/output of hand-entered textual and graphic information
US4763356A (en) 1986-12-11 1988-08-09 AT&T Information Systems, Inc. American Telephone and Telegraph Company Touch screen form entry system
JPH0829509B2 (en) 1986-12-12 1996-03-27 株式会社日立製作所 Control device for manipulator
US4821030A (en) * 1986-12-19 1989-04-11 Tektronix, Inc. Touchscreen feedback system
US4800721A (en) 1987-02-13 1989-01-31 Caterpillar Inc. Force feedback lever
US4794392A (en) 1987-02-20 1988-12-27 Motorola, Inc. Vibrator alert device for a communication receiver
US5986643A (en) 1987-03-24 1999-11-16 Sun Microsystems, Inc. Tactile feedback mechanism for a data processing system
US4839838A (en) 1987-03-30 1989-06-13 Labiche Mitchell Spatial input apparatus
JP2511577Y2 (en) 1987-04-14 1996-09-25 日本電気ホームエレクトロニクス株式会社 Touch panel switch
US4868549A (en) 1987-05-18 1989-09-19 International Business Machines Corporation Feedback mouse
US5849298A (en) * 1987-06-24 1998-12-15 Autoimmune Inc. Treatment of multiple sclerosis by oral administration of bovine myelin
US4851820A (en) 1987-10-30 1989-07-25 Fernandez Emilio A Paging device having a switch actuated signal strength detector
US4896554A (en) 1987-11-03 1990-01-30 Culver Craig F Multifunction tactile manipulatable control
US4823634A (en) 1987-11-03 1989-04-25 Culver Craig F Multifunction tactile manipulatable control
US5078517A (en) * 1987-11-27 1992-01-07 Oki Electric Industry Co., Ltd. Wire-dot impact printer with head gap adjustment responsive to measured wire movement
US4906843A (en) 1987-12-31 1990-03-06 Marq Technolgies Combination mouse, optical scanner and digitizer puck
JPH07113703B2 (en) 1988-05-16 1995-12-06 三菱電機株式会社 Mirror support mechanism
US5038089A (en) 1988-03-23 1991-08-06 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Synchronized computational architecture for generalized bilateral control of robot arms
US4861269A (en) 1988-03-30 1989-08-29 Grumman Aerospace Corporation Sidestick flight control simulator
US4914624A (en) 1988-05-06 1990-04-03 Dunthorn David I Virtual button for touch screen
US4885565A (en) 1988-06-01 1989-12-05 General Motors Corporation Touchscreen CRT with tactile feedback
US4926879A (en) 1988-06-13 1990-05-22 Sevrain-Tech, Inc. Electro-tactile stimulator
JPH0618341Y2 (en) 1988-06-23 1994-05-11 寿男 青木 Confusion prevention device
NL8801653A (en) 1988-06-29 1990-01-16 Stork Kwant Bv Operating system.
US4871992A (en) 1988-07-08 1989-10-03 Petersen Robert C Tactile display apparatus
US5116180A (en) 1988-07-18 1992-05-26 Spar Aerospace Limited Human-in-the-loop machine control loop
JPH0520226Y2 (en) 1988-08-29 1993-05-26
FR2638010B1 (en) 1988-10-13 1991-01-18 Acroe Modular retroactive keyboard and flat modular actuator
US4930770A (en) 1988-12-01 1990-06-05 Baker Norman A Eccentrically loaded computerized positive/negative exercise machine
US5044956A (en) 1989-01-12 1991-09-03 Atari Games Corporation Control device such as a steering wheel for video vehicle simulator with realistic feedback forces
US4949119A (en) 1989-01-12 1990-08-14 Atari Games Corporation Gearshift for a vehicle simulator using computer controlled realistic real world forces
JPH0721710B2 (en) 1989-01-25 1995-03-08 ヤマハ株式会社 Electronic keyboard instrument with pad
US5186695A (en) 1989-02-03 1993-02-16 Loredan Biomedical, Inc. Apparatus for controlled exercise and diagnosis of human performance
JPH02109714U (en) 1989-02-20 1990-09-03
US5019761A (en) 1989-02-21 1991-05-28 Kraft Brett W Force feedback control for backhoe
GB8904955D0 (en) 1989-03-03 1989-04-12 Atomic Energy Authority Uk Multi-axis hand controller
US4983901A (en) 1989-04-21 1991-01-08 Allergan, Inc. Digital electronic foot control for medical apparatus and the like
US5133076A (en) 1989-06-12 1992-07-21 Grid Systems Corporation Hand held computer
US5076517A (en) 1989-08-14 1991-12-31 United Technologies Corporation Programmable, linear collective control system for a helicopter
US5004391A (en) 1989-08-21 1991-04-02 Rutgers University Portable dextrous force feedback master for robot telemanipulation
US5121091A (en) 1989-09-08 1992-06-09 Matsushita Electric Industrial Co., Ltd. Panel switch
US4977298A (en) 1989-09-08 1990-12-11 Matsushita Electric Industrial Co., Ltd. Panel switch
US5139261A (en) 1989-09-15 1992-08-18 Openiano Renato M Foot-actuated computer game controller serving as a joystick
US5065145A (en) 1989-10-06 1991-11-12 Summagraphics Corporation Method and apparatus for producing signals corresponding to the position of a cursor
US4961038A (en) 1989-10-16 1990-10-02 General Electric Company Torque estimator for switched reluctance machines
GB2239376A (en) 1989-12-18 1991-06-26 Ibm Touch sensitive display
US5107080A (en) 1989-12-01 1992-04-21 Massachusetts Institute Of Technology Multiple degree of freedom damped hand controls
US5022407A (en) 1990-01-24 1991-06-11 Topical Testing, Inc. Apparatus for automated tactile testing
US5184319A (en) 1990-02-02 1993-02-02 Kramer James F Force feedback and textures simulating interface device
US5095303A (en) 1990-03-27 1992-03-10 Apple Computer, Inc. Six degree of freedom graphic object controller
US5035242A (en) 1990-04-16 1991-07-30 David Franklin Method and apparatus for sound responsive tactile stimulation of deaf individuals
US5172092A (en) 1990-04-26 1992-12-15 Motorola, Inc. Selective call receiver having audible and tactile alerts
US5022384A (en) 1990-05-14 1991-06-11 Capitol Systems Vibrating/massage chair
US5581243A (en) 1990-06-04 1996-12-03 Microslate Inc. Method and apparatus for displaying simulated keyboards on touch-sensitive displays
GB9014130D0 (en) 1990-06-25 1990-08-15 Hewlett Packard Co User interface
US5547382A (en) 1990-06-28 1996-08-20 Honda Giken Kogyo Kabushiki Kaisha Riding simulation system for motorcycles
US5197003A (en) 1990-08-01 1993-03-23 Atari Games Corporation Gearshift for a vehicle simulator having a solenoid for imposing a resistance force
US5165897A (en) 1990-08-10 1992-11-24 Tini Alloy Company Programmable tactile stimulator array system and method of operation
US5193963A (en) 1990-10-31 1993-03-16 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Force reflecting hand controller
JP3219761B2 (en) 1990-11-19 2001-10-15 ソニー株式会社 Remote commander
JP2563632Y2 (en) 1990-11-29 1998-02-25 マツダ株式会社 Car front body structure
NL194053C (en) 1990-12-05 2001-05-03 Koninkl Philips Electronics Nv Device with a rotationally symmetrical body.
US5159159A (en) 1990-12-07 1992-10-27 Asher David J Touch sensor and controller
FR2670635B1 (en) 1990-12-13 1993-03-19 Sextant Avionique Switching device with dual mode of operation.
DE69027778T2 (en) 1990-12-14 1997-01-23 Ibm Coordinate processor for a computer system with a pointer arrangement
US5223776A (en) 1990-12-31 1993-06-29 Honeywell Inc. Six-degree virtual pivot controller
JP2511577B2 (en) 1991-02-05 1996-06-26 株式会社紀文食品 Sustained-release preparation consisting of propylene glycol alginate
US5212473A (en) 1991-02-21 1993-05-18 Typeright Keyboard Corp. Membrane keyboard and method of using same
US5334027A (en) 1991-02-25 1994-08-02 Terry Wherlock Big game fish training and exercise device and method
US5354162A (en) 1991-02-26 1994-10-11 Rutgers University Actuator system for providing force feedback to portable master support
US5143505A (en) 1991-02-26 1992-09-01 Rutgers University Actuator system for providing force feedback to a dextrous master glove
US5240417A (en) 1991-03-14 1993-08-31 Atari Games Corporation System and method for bicycle riding simulation
JPH06507734A (en) 1991-03-21 1994-09-01 アタリ ゲームズ コーポレーション Vehicle simulator with cross-network feedback
US5203563A (en) 1991-03-21 1993-04-20 Atari Games Corporation Shaker control device
US5341459A (en) 1991-05-09 1994-08-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Generalized compliant motion primitive
WO1992021117A1 (en) 1991-05-23 1992-11-26 Atari Games Corporation Modular display simulator
US5146566A (en) 1991-05-29 1992-09-08 Ibm Corporation Input/output system for computer user interface using magnetic levitation
US5185561A (en) 1991-07-23 1993-02-09 Digital Equipment Corporation Torque motor as a tactile feedback device in a computer system
US5175459A (en) 1991-08-19 1992-12-29 Motorola, Inc. Low profile vibratory alerting device
US5186629A (en) 1991-08-22 1993-02-16 International Business Machines Corporation Virtual graphics display capable of presenting icons and windows to the blind computer user and method
US5235868A (en) 1991-10-02 1993-08-17 Culver Craig F Mechanism for generating control signals
US5262777A (en) 1991-11-16 1993-11-16 Sri International Device for generating multidimensional input signals to a computer
US5220260A (en) 1991-10-24 1993-06-15 Lex Computer And Management Corporation Actuator having electronically controllable tactile responsiveness
US5889670A (en) * 1991-10-24 1999-03-30 Immersion Corporation Method and apparatus for tactilely responsive user interface
US5271290A (en) 1991-10-29 1993-12-21 United Kingdom Atomic Energy Authority Actuator assembly
US5335557A (en) * 1991-11-26 1994-08-09 Taizo Yasutake Touch sensitive input control device
US5309140A (en) * 1991-11-26 1994-05-03 The United States Of America As Represented By The Secretary Of The Navy Feedback system for remotely operated vehicles
US5471571A (en) 1991-12-27 1995-11-28 Xerox Corporation Method and apparatus for setting a graphical object's position and orientation with viscous dragging
JP2812598B2 (en) 1992-01-21 1998-10-22 株式会社日立ビルシステム Equipment lifting device in hoistway
EP0556999B1 (en) 1992-02-18 1998-05-27 NCR International, Inc. Data processing apparatus with user input feedback
DE4205875A1 (en) * 1992-02-26 1993-09-02 Vdo Schindling Rotary selector e.g. for manual input of data into electronic equipment - has movement of rotary input knob controlled by motor and generator with positions defined by load data in memory
US5589828A (en) 1992-03-05 1996-12-31 Armstrong; Brad A. 6 Degrees of freedom controller with capability of tactile feedback
EP0563477A1 (en) 1992-03-25 1993-10-06 Visage Inc. Touch screen sensing apparatus
JP3199130B2 (en) * 1992-03-31 2001-08-13 パイオニア株式会社 3D coordinate input device
US5757358A (en) 1992-03-31 1998-05-26 The United States Of America As Represented By The Secretary Of The Navy Method and apparatus for enhancing computer-user selection of computer-displayed objects through dynamic selection area and constant visual feedback
US5302132A (en) * 1992-04-01 1994-04-12 Corder Paul R Instructional system and method for improving communication skills
US5189355A (en) 1992-04-10 1993-02-23 Ampex Corporation Interactive rotary controller system with tactile feedback
JP2677315B2 (en) * 1992-04-27 1997-11-17 株式会社トミー Driving toys
US5437607A (en) 1992-06-02 1995-08-01 Hwe, Inc. Vibrating massage apparatus
US5942733A (en) 1992-06-08 1999-08-24 Synaptics, Inc. Stylus input capacitive touchpad sensor
US5889236A (en) * 1992-06-08 1999-03-30 Synaptics Incorporated Pressure sensitive scrollbar feature
US5880411A (en) 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
JPH0618341A (en) 1992-07-02 1994-01-25 Fuji Electric Co Ltd Tactual sense transmission device
US5313230A (en) 1992-07-24 1994-05-17 Apple Computer, Inc. Three degree of freedom graphic object controller
US5296871A (en) 1992-07-27 1994-03-22 Paley W Bradford Three-dimensional mouse with tactile feedback
DE69327866T2 (en) 1992-09-09 2000-11-02 Hitachi, Ltd. Mobile communication device
JPH06139018A (en) 1992-09-10 1994-05-20 Victor Co Of Japan Ltd Display device
US5982352A (en) 1992-09-18 1999-11-09 Pryor; Timothy R. Method for providing human input to a computer
US6008800A (en) 1992-09-18 1999-12-28 Pryor; Timothy R. Man machine interfaces for entering data into a computer
US5283970A (en) 1992-09-25 1994-02-08 Strombecker Corporation Toy guns
US5264768A (en) 1992-10-06 1993-11-23 Honeywell, Inc. Active hand controller feedback loop
US5286203A (en) 1992-10-07 1994-02-15 Aai Microflite Simulation International Simulating horizontal stabilizer trimming in an aircraft
US5316017A (en) 1992-10-07 1994-05-31 Greenleaf Medical Systems, Inc. Man-machine interface for a joint measurement system
US5666473A (en) 1992-10-08 1997-09-09 Science & Technology Corporation & Unm Tactile computer aided sculpting device
JP2804937B2 (en) 1992-10-15 1998-09-30 矢崎総業株式会社 System switch device
US5790108A (en) 1992-10-23 1998-08-04 University Of British Columbia Controller
US5629594A (en) 1992-12-02 1997-05-13 Cybernet Systems Corporation Force feedback system
US6433771B1 (en) 1992-12-02 2002-08-13 Cybernet Haptic Systems Corporation Haptic device attribute control
US6131097A (en) * 1992-12-02 2000-10-10 Immersion Corporation Haptic authoring
US5907615A (en) 1992-12-02 1999-05-25 Motorola, Inc. Miniature wireless communication device
US5769640A (en) 1992-12-02 1998-06-23 Cybernet Systems Corporation Method and system for simulating medical procedures including virtual reality and control method and system for use therein
US5389865A (en) * 1992-12-02 1995-02-14 Cybernet Systems Corporation Method and system for providing a tactile virtual reality and manipulator defining an interface device therefor
FI92111C (en) * 1992-12-11 1994-09-26 Icl Personal Systems Oy Method and arrangement for moving the cursor on a computer screen
US5796575A (en) 1992-12-21 1998-08-18 Hewlett-Packard Company Portable computer with hinged cover having a window
US5451924A (en) 1993-01-14 1995-09-19 Massachusetts Institute Of Technology Apparatus for providing sensory substitution of force feedback
US5355148A (en) 1993-01-14 1994-10-11 Ast Research, Inc. Fingerpoint mouse
US5389849A (en) * 1993-01-20 1995-02-14 Olympus Optical Co., Ltd. Tactility providing apparatus and manipulating device using the same
EP0607580A1 (en) 1993-01-21 1994-07-27 International Business Machines Corporation Tactile feedback mechanism for cursor control
US5690582A (en) 1993-02-02 1997-11-25 Tectrix Fitness Equipment, Inc. Interactive exercise apparatus
US5785630A (en) 1993-02-02 1998-07-28 Tectrix Fitness Equipment, Inc. Interactive exercise apparatus
JPH06265991A (en) 1993-03-10 1994-09-22 Canon Inc Camera system
US5563632A (en) 1993-04-30 1996-10-08 Microtouch Systems, Inc. Method of and apparatus for the elimination of the effects of internal interference in force measurement systems, including touch - input computer and related displays employing touch force location measurement techniques
JP3686686B2 (en) 1993-05-11 2005-08-24 松下電器産業株式会社 Haptic device, data input device, and data input device device
US5405152A (en) * 1993-06-08 1995-04-11 The Walt Disney Company Method and apparatus for an interactive video game with physical feedback
US5396266A (en) * 1993-06-08 1995-03-07 Technical Research Associates, Inc. Kinesthetic feedback apparatus and method
US5513100A (en) * 1993-06-10 1996-04-30 The University Of British Columbia Velocity controller with force feedback stiffness control
US5466213A (en) 1993-07-06 1995-11-14 Massachusetts Institute Of Technology Interactive robotic therapist
US5436622A (en) 1993-07-06 1995-07-25 Motorola, Inc. Variable frequency vibratory alert method and structure
JPH0738627A (en) 1993-07-15 1995-02-07 Casio Comput Co Ltd Radio telephone set and radio transmitter-receiver
US5721566A (en) * 1995-01-18 1998-02-24 Immersion Human Interface Corp. Method and apparatus for providing damping force feedback
US5739811A (en) * 1993-07-16 1998-04-14 Immersion Human Interface Corporation Method and apparatus for controlling human-computer interface systems providing force feedback
US5734373A (en) * 1993-07-16 1998-03-31 Immersion Human Interface Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
US5805140A (en) 1993-07-16 1998-09-08 Immersion Corporation High bandwidth force feedback interface using voice coils and flexures
CA2167304C (en) 1993-07-16 1998-04-21 Louis B. Rosenberg Multi degree of freedom human-computer interface with tracking and forcefeedback
US5767839A (en) 1995-01-18 1998-06-16 Immersion Human Interface Corporation Method and apparatus for providing passive force feedback to human-computer interface systems
DE4323863A1 (en) 1993-07-16 1995-01-19 Andromeda Ges Fuer Computer Un Tactile information communication to the user of hand-held computers
US5731804A (en) * 1995-01-18 1998-03-24 Immersion Human Interface Corp. Method and apparatus for providing high bandwidth, low noise mechanical I/O for computer systems
JPH086493A (en) 1993-07-21 1996-01-12 Texas Instr Inc <Ti> Tangible-type display that can be electronically refreshed for braille text and braille diagram
US5491477A (en) * 1993-09-13 1996-02-13 Apple Computer, Inc. Anti-rotation mechanism for direct manipulation position input controller for computer
US5625576A (en) * 1993-10-01 1997-04-29 Massachusetts Institute Of Technology Force reflecting haptic interface
GB9321086D0 (en) 1993-10-13 1993-12-01 Univ Alberta Hand stimulator
JP2813728B2 (en) 1993-11-01 1998-10-22 インターナショナル・ビジネス・マシーンズ・コーポレイション Personal communication device with zoom / pan function
US5422656A (en) 1993-11-01 1995-06-06 International Business Machines Corp. Personal communicator having improved contrast control for a liquid crystal, touch sensitive display
EP0660258B1 (en) 1993-12-20 2000-03-08 Seiko Epson Corporation Electronic pointing device
US5473235A (en) 1993-12-21 1995-12-05 Honeywell Inc. Moment cell counterbalance for active hand controller
US5461711A (en) * 1993-12-22 1995-10-24 Interval Research Corporation Method and system for spatial accessing of time-based information
JPH07111663B2 (en) 1993-12-28 1995-11-29 コナミ株式会社 Foot pedal for operation
DE69428675T2 (en) * 1993-12-30 2002-05-08 Xerox Corp Apparatus and method for supporting an implicit structuring of free-form lists, overviews, texts, tables and diagrams in an input system and editing system based on hand signals
US5473344A (en) 1994-01-06 1995-12-05 Microsoft Corporation 3-D cursor positioning device
GB2286100A (en) * 1994-01-19 1995-08-02 Ibm Touch-sensitive display apparatus
US5577981A (en) 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US5722068A (en) 1994-01-26 1998-02-24 Oki Telecom, Inc. Imminent change warning
WO1995020788A1 (en) 1994-01-27 1995-08-03 Exos, Inc. Intelligent remote multimode sense and display system utilizing haptic information compression
CA2140164A1 (en) * 1994-01-27 1995-07-28 Kenneth R. Robertson System and method for computer cursor control
WO1995020787A1 (en) * 1994-01-27 1995-08-03 Exos, Inc. Multimode feedback display technology
FI98163C (en) * 1994-02-08 1997-04-25 Nokia Mobile Phones Ltd Coding system for parametric speech coding
JP3389314B2 (en) 1994-03-28 2003-03-24 オリンパス光学工業株式会社 Tactile transmission device
GB9406702D0 (en) 1994-04-05 1994-05-25 Binstead Ronald P Multiple input proximity detector and touchpad system
US5564004A (en) * 1994-04-13 1996-10-08 International Business Machines Corporation Method and system for facilitating the selection of icons
KR100300397B1 (en) * 1994-04-21 2001-10-22 김순택 System having touch panel and digitizer function and driving method
JP2665313B2 (en) 1994-04-22 1997-10-22 国際電業株式会社 Reaction force generator
US6004134A (en) 1994-05-19 1999-12-21 Exos, Inc. Interactive simulation including force feedback
US5521336A (en) 1994-05-23 1996-05-28 International Business Machines Corporation Simplified digital pad sensor
US5457479A (en) 1994-06-13 1995-10-10 Primax Electronics Ltd. Apparatus having dual modes for controlling cursor on display screen
US6160489A (en) 1994-06-23 2000-12-12 Motorola, Inc. Wireless communication device adapted to generate a plurality of distinctive tactile alert patterns
US5565887A (en) 1994-06-29 1996-10-15 Microsoft Corporation Method and apparatus for moving a cursor on a computer screen
US5623582A (en) * 1994-07-14 1997-04-22 Immersion Human Interface Corporation Computer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects
CN1059303C (en) 1994-07-25 2000-12-06 国际商业机器公司 Apparatus and method for marking text on a display screen in a personal communications device
US5575761A (en) 1994-07-27 1996-11-19 Hajianpour; Mohammed-Ali Massage device applying variable-frequency vibration in a variable pulse sequence
US5530455A (en) 1994-08-10 1996-06-25 Mouse Systems Corporation Roller mouse for implementing scrolling in windows applications
JPH09505426A (en) 1994-09-07 1997-05-27 フィリップス エレクトロニクス ネムローゼ フェンノートシャップ Virtual workspace with user programmable haptic feedback
US6422941B1 (en) 1994-09-21 2002-07-23 Craig Thorner Universal tactile feedback system for computer video games and simulations
US5684722A (en) 1994-09-21 1997-11-04 Thorner; Craig Apparatus and method for generating a control signal for a tactile sensation generator
JPH08125720A (en) 1994-10-28 1996-05-17 Sony Corp Receiver
US5642469A (en) 1994-11-03 1997-06-24 University Of Washington Direct-drive manipulator for pen-based force display
US5766016A (en) 1994-11-14 1998-06-16 Georgia Tech Research Corporation Surgical simulator and method for simulating surgical procedure
US5666138A (en) 1994-11-22 1997-09-09 Culver; Craig F. Interface control
JP3236180B2 (en) * 1994-12-05 2001-12-10 日本電気株式会社 Coordinate pointing device
JPH08235159A (en) * 1994-12-06 1996-09-13 Matsushita Electric Ind Co Ltd Inverse cosine transformation device
US5828364A (en) 1995-01-03 1998-10-27 Microsoft Corporation One-piece case top and integrated switch for a computer pointing device
US5591082A (en) * 1995-01-05 1997-01-07 Thrustmaster, Inc. Side-mounted throttle and weapons controller for computer video games and flight simulation
JP3169523B2 (en) 1995-01-27 2001-05-28 三菱電機株式会社 Personal communication device
JP3123383B2 (en) * 1995-02-09 2001-01-09 トヨタ自動車株式会社 Fuel supply control device for internal combustion engine
JPH08221173A (en) 1995-02-09 1996-08-30 Hitachi Ltd Input device
JPH10500516A (en) 1995-03-13 1998-01-13 フィリップス エレクトロニクス ネムローゼ フェンノートシャップ Enables true 3D input by vertical movement of mouse or trackball
US5542672A (en) 1995-03-17 1996-08-06 Meredith; Chris Fishing rod and reel electronic game controller
JP3348265B2 (en) 1995-03-27 2002-11-20 富士通株式会社 Overhead transfer control method
US5882206A (en) 1995-03-29 1999-03-16 Gillio; Robert G. Virtual surgery system
JP3510632B2 (en) 1995-05-10 2004-03-29 任天堂株式会社 Game console operating device
US5736978A (en) * 1995-05-26 1998-04-07 The United States Of America As Represented By The Secretary Of The Air Force Tactile graphics display
US5691898A (en) 1995-09-27 1997-11-25 Immersion Human Interface Corp. Safe and low cost computer peripherals with force feedback for consumer applications
US6496182B1 (en) 1995-06-07 2002-12-17 Microsoft Corporation Method and system for providing touch-sensitive screens for the visually impaired
US6166723A (en) * 1995-11-17 2000-12-26 Immersion Corporation Mouse interface device providing force feedback
US5629595A (en) * 1995-06-19 1997-05-13 The Walt Disney Company Method and apparatus for an amusement ride having an interactive guided vehicle
US5589854A (en) 1995-06-22 1996-12-31 Tsai; Ming-Chang Touching feedback device
ZA965340B (en) 1995-06-30 1997-01-27 Interdigital Tech Corp Code division multiple access (CDMA) communication system
US5724106A (en) * 1995-07-17 1998-03-03 Gateway 2000, Inc. Hand held remote control device with trigger button
US5771037A (en) 1995-07-24 1998-06-23 Altra Computer display cursor controller
DE19528457C2 (en) 1995-08-03 2001-03-08 Mannesmann Vdo Ag Control device
US5805165A (en) 1995-08-31 1998-09-08 Microsoft Corporation Method of selecting a displayed control item
US5805416A (en) 1995-09-11 1998-09-08 Norand Corporation Modular hand-held data capture terminal
US5808601A (en) 1995-09-12 1998-09-15 International Business Machines Corporation Interactive object selection pointer method and apparatus
WO1997011448A1 (en) 1995-09-18 1997-03-27 Intellinet User interface for home automation system
US6108704A (en) 1995-09-25 2000-08-22 Netspeak Corporation Point-to-point internet protocol
US5959613A (en) 1995-12-01 1999-09-28 Immersion Corporation Method and apparatus for shaping force signals for a force feedback device
US5999168A (en) 1995-09-27 1999-12-07 Immersion Corporation Haptic accelerator for force feedback computer peripherals
JPH09167050A (en) 1995-10-09 1997-06-24 Nintendo Co Ltd Operation device and image processing system using the device
KR100371456B1 (en) * 1995-10-09 2004-03-30 닌텐도가부시키가이샤 Three-dimensional image processing system
US5719561A (en) * 1995-10-25 1998-02-17 Gilbert R. Gonzales Tactile communication device and method
US5754023A (en) 1995-10-26 1998-05-19 Cybernet Systems Corporation Gyro-stabilized platforms for force-feedback applications
US5896125A (en) * 1995-11-06 1999-04-20 Niedzwiecki; Richard H. Configurable keyboard to personal computer video game controller adapter
US6473069B1 (en) 1995-11-13 2002-10-29 Cirque Corporation Apparatus and method for tactile feedback from input device
US5767457A (en) 1995-11-13 1998-06-16 Cirque Corporation Apparatus and method for audible feedback from input device
US6107997A (en) 1996-06-27 2000-08-22 Ure; Michael J. Touch-sensitive keyboard/mouse and computing device using the same
US6100874A (en) * 1995-11-17 2000-08-08 Immersion Corporation Force feedback mouse interface
US5825308A (en) 1996-11-26 1998-10-20 Immersion Human Interface Corporation Force feedback interface having isotonic and isometric functionality
US6639581B1 (en) 1995-11-17 2003-10-28 Immersion Corporation Flexure mechanism for interface device
US5877748A (en) * 1995-11-20 1999-03-02 Redlich; Sanford I. Computer control input interface system
US6061004A (en) * 1995-11-26 2000-05-09 Immersion Corporation Providing force feedback using an interface device including an indexing function
AU1328597A (en) 1995-11-30 1997-06-19 Virtual Technologies, Inc. Tactile feedback man-machine interface device
US6169540B1 (en) 1995-12-01 2001-01-02 Immersion Corporation Method and apparatus for designing force sensations in force feedback applications
US6028593A (en) * 1995-12-01 2000-02-22 Immersion Corporation Method and apparatus for providing simulated physical interactions within computer generated environments
US5956484A (en) 1995-12-13 1999-09-21 Immersion Corporation Method and apparatus for providing force feedback over a computer network
US6147674A (en) * 1995-12-01 2000-11-14 Immersion Corporation Method and apparatus for designing force sensations in force feedback computer applications
US6219032B1 (en) 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
JP3239727B2 (en) 1995-12-05 2001-12-17 トヨタ自動車株式会社 Automatic driving control device for vehicles
GB2308082A (en) 1995-12-12 1997-06-18 Dsnd Ugland As Pipe straightener
US5760764A (en) 1995-12-13 1998-06-02 Altra Computer display cursor controller with serial interface
US6161126A (en) * 1995-12-13 2000-12-12 Immersion Corporation Implementing force feedback over the World Wide Web and other computer networks
US6078308A (en) 1995-12-13 2000-06-20 Immersion Corporation Graphical click surfaces for force feedback applications to provide user selection using cursor interaction with a trigger position within a boundary of a graphical object
US6300936B1 (en) 1997-11-14 2001-10-09 Immersion Corporation Force feedback system including multi-tasking graphical host environment and interface device
US6859819B1 (en) 1995-12-13 2005-02-22 Immersion Corporation Force feedback enabled over a computer network
JPH09167541A (en) 1995-12-18 1997-06-24 Idec Izumi Corp Thin-type switch and display panel with switch
US5914705A (en) 1996-02-09 1999-06-22 Lucent Technologies Inc. Apparatus and method for providing detent-like tactile feedback
SE519661C2 (en) 1996-02-23 2003-03-25 Immersion Corp Pointing devices and method for marking graphic details on a display with sensory feedback upon finding said detail
DE19646226A1 (en) * 1996-03-19 1998-05-14 Bayerische Motoren Werke Ag Operating device for menu-controlled functions of a vehicle
US6111577A (en) * 1996-04-04 2000-08-29 Massachusetts Institute Of Technology Method and apparatus for determining forces to be applied to a user through a haptic interface
US5914708A (en) * 1996-04-04 1999-06-22 Cirque Corporation Computer input stylus method and apparatus
US6178157B1 (en) 1996-04-15 2001-01-23 Digital Papyrus Corporation Flying head with adjustable actuator load
US5823876A (en) 1996-05-03 1998-10-20 Unbehand; Erick Michael Steering wheel simulation assembly
US5802353A (en) 1996-06-12 1998-09-01 General Electric Company Haptic computer modeling system
US5699059A (en) 1996-06-28 1997-12-16 Hiller; Jeffrey H. Keyboard incorporating game player
US5748185A (en) * 1996-07-03 1998-05-05 Stratos Product Development Group Touchpad with scroll and pan regions
US6039258A (en) 1996-07-18 2000-03-21 Norand Corporation Hand-held portable data collection terminal system
US5791992A (en) 1996-07-31 1998-08-11 International Business Machines Corporation Video game system with internet cartridge
US6125385A (en) * 1996-08-01 2000-09-26 Immersion Corporation Force feedback implementation in web pages
US6084587A (en) 1996-08-02 2000-07-04 Sensable Technologies, Inc. Method and apparatus for generating and interfacing with a haptic virtual reality environment
US5943044A (en) 1996-08-05 1999-08-24 Interlink Electronics Force sensing semiconductive touchpad
US5821921A (en) 1996-08-09 1998-10-13 Osborn; John J. Cursor controller having cross-translating platforms with a cantilevered handle
JP2880963B2 (en) 1996-08-09 1999-04-12 静岡日本電気株式会社 Vibration motor holding structure
US5990869A (en) 1996-08-20 1999-11-23 Alliance Technologies Corp. Force feedback mouse
SE515663C2 (en) 1996-08-23 2001-09-17 Ericsson Telefon Ab L M Touch screen and use of touch screen
US5694013A (en) 1996-09-06 1997-12-02 Ford Global Technologies, Inc. Force feedback haptic interface for a three-dimensional CAD surface
US6024576A (en) * 1996-09-06 2000-02-15 Immersion Corporation Hemispherical, high bandwidth mechanical interface for computer systems
DE19638015A1 (en) 1996-09-18 1998-03-26 Mannesmann Vdo Ag Tactile panel for input to computer system
US6140987A (en) 1996-09-18 2000-10-31 Intellinet, Inc. User interface for home automation system
JP4295834B2 (en) 1996-09-20 2009-07-15 ソニー株式会社 Character string data processing apparatus and method
JPH114966A (en) 1996-10-01 1999-01-12 Sony Computer Entateimento:Kk Operation device for game machine and game device
US6028531A (en) 1996-10-21 2000-02-22 Wanderlich; Ronald E. Terminal units for a mobile communications system
US5828197A (en) 1996-10-25 1998-10-27 Immersion Human Interface Corporation Mechanical interface having multiple grounded actuators
GB9622556D0 (en) 1996-10-30 1997-01-08 Philips Electronics Nv Cursor control with user feedback mechanism
US6411276B1 (en) 1996-11-13 2002-06-25 Immersion Corporation Hybrid control of haptic feedback for host computer and interface device
US5884029A (en) 1996-11-14 1999-03-16 International Business Machines Corporation User interaction with intelligent virtual objects, avatars, which interact with other avatars controlled by different users
US6636197B1 (en) 1996-11-26 2003-10-21 Immersion Corporation Haptic feedback effects for control, knobs and other interface devices
US6128006A (en) 1998-03-26 2000-10-03 Immersion Corporation Force feedback mouse wheel and other control wheels
US6154201A (en) 1996-11-26 2000-11-28 Immersion Corporation Control knob with multiple degrees of freedom and force feedback
US5804780A (en) 1996-12-31 1998-09-08 Ericsson Inc. Virtual touch screen switch
US5973670A (en) * 1996-12-31 1999-10-26 International Business Machines Corporation Tactile feedback controller for computer cursor control device
IL119955A0 (en) 1997-01-01 1997-04-15 Advanced Recognition Tech An instruction and/or identification input unit
US5912661A (en) 1997-01-14 1999-06-15 Microsoft Corp. Z-encoder mechanism
GB9701793D0 (en) 1997-01-29 1997-03-19 Gay Geoffrey N W Means for inputting characters or commands into a computer
US5808603A (en) 1997-02-06 1998-09-15 Chen; Mei Yun Computer input device
EP0866592A3 (en) 1997-03-20 1999-06-16 Nortel Networks Corporation Personal communication device and call process status signalling method
US5982304A (en) 1997-03-24 1999-11-09 International Business Machines Corporation Piezoelectric switch with tactile response
US6118435A (en) * 1997-04-10 2000-09-12 Idec Izumi Corporation Display unit with touch panel
JPH10293644A (en) 1997-04-18 1998-11-04 Idec Izumi Corp Display device having touch panel
US6020876A (en) * 1997-04-14 2000-02-01 Immersion Corporation Force feedback interface with selective disturbance filter
US6110130A (en) * 1997-04-21 2000-08-29 Virtual Technologies, Inc. Exoskeleton device for directly measuring fingertip position and inferring finger joint angle
JPH10295937A (en) * 1997-04-24 1998-11-10 Sony Computer Entertainment:Kk Operation device for game machine
US6005551A (en) 1997-04-25 1999-12-21 Microsoft Corporation Offline force effect rendering
JPH114282A (en) 1997-06-13 1999-01-06 Kenwood Corp Mobile telephone with vibrator
US6071194A (en) 1997-06-19 2000-06-06 Act Labs Ltd Reconfigurable video game controller
US6081536A (en) 1997-06-20 2000-06-27 Tantivy Communications, Inc. Dynamic bandwidth allocation to transmit a wireless protocol across a code division multiple access (CDMA) radio link
US6151332A (en) 1997-06-20 2000-11-21 Tantivy Communications, Inc. Protocol conversion and bandwidth reduction technique providing multiple nB+D ISDN basic rate interface links over a wireless code division multiple access communication system
US6236647B1 (en) 1998-02-24 2001-05-22 Tantivy Communications, Inc. Dynamic frame size adjustment and selective reject on a multi-link channel to improve effective throughput and bit error rate
US6388999B1 (en) 1997-12-17 2002-05-14 Tantivy Communications, Inc. Dynamic bandwidth allocation for multiple access communications using buffer urgency factor
US6094565A (en) 1997-06-30 2000-07-25 Motorola, Inc. Closeable communication device and method of operating the same
US5953413A (en) 1997-07-14 1999-09-14 Motorola, Inc. Closeable communication device and method of operating same
US6292174B1 (en) 1997-08-23 2001-09-18 Immersion Corporation Enhanced cursor control using limited-workspace force feedback devices
FI114769B (en) 1997-09-04 2004-12-15 Nokia Corp Procedure for processing the telephone numbers in mobile station and mobile station
JP4567817B2 (en) 1997-09-11 2010-10-20 ソニー株式会社 Information processing apparatus and control method thereof
GB2329300B (en) 1997-09-16 2002-07-17 Nokia Mobile Phones Ltd Mobile telephone with handwritten data input
US6002184A (en) * 1997-09-17 1999-12-14 Coactive Drive Corporation Actuator with opposing repulsive magnetic forces
US5887995A (en) 1997-09-23 1999-03-30 Compaq Computer Corporation Touchpad overlay with tactile response
US5917906A (en) 1997-10-01 1999-06-29 Ericsson Inc. Touch pad with tactile feature
US6088019A (en) 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US6448977B1 (en) 1997-11-14 2002-09-10 Immersion Corporation Textures and other spatial sensations for a relative haptic interface device
FI104928B (en) 1997-11-27 2000-04-28 Nokia Mobile Phones Ltd Wireless Communication and a Method of Making a Wireless Communication Device
JPH11205432A (en) 1998-01-08 1999-07-30 Matsushita Electric Ind Co Ltd Portable terminal device
US6323846B1 (en) 1998-01-26 2001-11-27 University Of Delaware Method and apparatus for integrating manual input
JP3987182B2 (en) 1998-01-26 2007-10-03 Idec株式会社 Information display device and operation input device
WO1999040504A1 (en) 1998-02-03 1999-08-12 Invibro Ltd. System and method for vibro generations
US6219034B1 (en) * 1998-02-23 2001-04-17 Kristofer E. Elbing Tactile computer interface
US6177881B1 (en) 1998-03-12 2001-01-23 Motorola, Inc. Vibrator mounting assembly for a portable communication device
US6198206B1 (en) * 1998-03-20 2001-03-06 Active Control Experts, Inc. Inertial/audio unit and construction
JP3098488B2 (en) 1998-04-20 2000-10-16 埼玉日本電気株式会社 Missed call notification device and method for foldable mobile phone
US6018711A (en) 1998-04-21 2000-01-25 Nortel Networks Corporation Communication system user interface with animated representation of time remaining for input to recognizer
JP3643946B2 (en) 1998-04-21 2005-04-27 株式会社クボタ Information terminal device, agricultural machine and recording medium
JPH11338629A (en) 1998-05-27 1999-12-10 Nec Corp Pointing device
US5977867A (en) 1998-05-29 1999-11-02 Nortel Networks Corporation Touch pad panel with tactile feedback
JP3424587B2 (en) * 1998-06-18 2003-07-07 富士通株式会社 Driving method of plasma display panel
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6686901B2 (en) 1998-06-23 2004-02-03 Immersion Corporation Enhancing inertial tactile feedback in computer interface devices having increased mass
US6184868B1 (en) 1998-09-17 2001-02-06 Immersion Corp. Haptic feedback control devices
US6563487B2 (en) 1998-06-23 2003-05-13 Immersion Corporation Haptic feedback for directional control pads
US6697043B1 (en) 1999-12-21 2004-02-24 Immersion Corporation Haptic interface device and actuator assembly providing linear haptic sensations
JP2000023253A (en) 1998-06-26 2000-01-21 Nec Corp Multimode mobile radio device and multimode radio device, and incoming call information method for the same
US6262717B1 (en) 1998-07-02 2001-07-17 Cirque Corporation Kiosk touch pad
NO310748B1 (en) 1998-07-10 2001-08-20 Computouch As Method and equipment for improved communication between man and computer
US6243080B1 (en) * 1998-07-14 2001-06-05 Ericsson Inc. Touch-sensitive panel with selector
US6373463B1 (en) * 1998-10-14 2002-04-16 Honeywell International Inc. Cursor control system with tactile feedback
US6218966B1 (en) * 1998-11-05 2001-04-17 International Business Machines Corporation Tactile feedback keyboard
US6435794B1 (en) 1998-11-18 2002-08-20 Scott L. Springer Force display master interface device for teleoperation
US6332075B1 (en) 1998-12-31 2001-12-18 Nortel Networks Limited Use of distinctive ringing in a wireless communication system to inform a called party of an increased billing rate
JP3437776B2 (en) 1999-01-12 2003-08-18 株式会社ソニー・コンピュータエンタテインメント Entertainment system, entertainment apparatus and recording medium
US6469695B1 (en) 1999-01-28 2002-10-22 Ncr Corporation Method and apparatus for touch screen touch ahead capability
US6502754B1 (en) 1999-03-01 2003-01-07 Symbol Technologies, Inc. Data acquisition device
DE19911416B4 (en) 1999-03-15 2005-02-10 Siemens Ag Pocket monitor for patient cards
US6483498B1 (en) * 1999-03-17 2002-11-19 International Business Machines Corporation Liquid crystal display with integrated resistive touch sensor
US7263073B2 (en) 1999-03-18 2007-08-28 Statsignal Ipc, Llc Systems and methods for enabling a mobile user to notify an automated monitoring system of an emergency situation
JP2000278368A (en) 1999-03-19 2000-10-06 Nec Corp Radio communication apparatus and display control method for the same
JP2000299575A (en) 1999-04-12 2000-10-24 Sony Corp Input device
US6424356B2 (en) 1999-05-05 2002-07-23 Immersion Corporation Command of force sensations in a forceback system using force effect suites
US6438390B1 (en) 1999-05-06 2002-08-20 Motorola, Inc. Plural status indication control method suitable for use in a communication device
US7061466B1 (en) 1999-05-07 2006-06-13 Immersion Corporation Force feedback device including single-phase, fixed-coil actuators
US6590596B1 (en) 1999-05-21 2003-07-08 Gateway, Inc. Right click for task bar content
US7151528B2 (en) * 1999-06-22 2006-12-19 Cirque Corporation System for disposing a proximity sensitive touchpad behind a mobile phone keypad
US6982696B1 (en) 1999-07-01 2006-01-03 Immersion Corporation Moving magnet actuator for providing haptic feedback
US6337678B1 (en) * 1999-07-21 2002-01-08 Tactiva Incorporated Force feedback computer input and output device with coordinated haptic elements
US7184765B1 (en) * 1999-08-27 2007-02-27 Lucent Technologies Inc. Enhanced roaming notification of call handoffs
JP2001076582A (en) * 1999-09-01 2001-03-23 Matsushita Electric Ind Co Ltd Electronic apparatus
DE20080209U1 (en) 1999-09-28 2001-08-09 Immersion Corp Control of haptic sensations for interface devices with vibrotactile feedback
US6680729B1 (en) * 1999-09-30 2004-01-20 Immersion Corporation Increasing force transmissibility for tactile feedback interface devices
WO2001029815A1 (en) * 1999-10-21 2001-04-26 Cirque Corporation Improved kiosk touchpad
US6388655B1 (en) 1999-11-08 2002-05-14 Wing-Keung Leung Method of touch control of an input device and such a device
FR2800966B1 (en) 1999-11-10 2001-12-21 Jean Max Coudon Touch stimulation device for use by a deaf person
US6850150B1 (en) 2000-11-21 2005-02-01 Nokia Mobile Phones Ltd. Portable device
JP3395741B2 (en) 1999-11-26 2003-04-14 日本電気株式会社 Circuit and method for setting incoming call notification pattern of mobile phone
US6529122B1 (en) * 1999-12-10 2003-03-04 Siemens Technology-To-Business Center, Llc Tactile sensor apparatus and methods
GB2364471B (en) 1999-12-13 2003-06-18 Matsushita Electric Ind Co Ltd Telephone apparatus
US6414674B1 (en) 1999-12-17 2002-07-02 International Business Machines Corporation Data processing system and method including an I/O touch pad having dynamically alterable location indicators
US6509892B1 (en) * 1999-12-17 2003-01-21 International Business Machines Corporation Method, system and program for topographical interfacing
US6535201B1 (en) * 1999-12-17 2003-03-18 International Business Machines Corporation Method and system for three-dimensional topographical modeling
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
US6441599B1 (en) 2000-01-28 2002-08-27 Donald S. Kropidlowski Reference point locator for residential and commercial construction
JP3448003B2 (en) 2000-03-09 2003-09-16 株式会社東芝 Mobile communication terminal
JP2001268171A (en) 2000-03-16 2001-09-28 Denso Corp Radio communication equipment
US20010036832A1 (en) 2000-04-14 2001-11-01 Onscene, Inc. Emergency command and control system
JP3771420B2 (en) 2000-04-19 2006-04-26 富士通株式会社 Switching station apparatus, base station control apparatus, and multicall call number change method
US6445284B1 (en) 2000-05-10 2002-09-03 Juan Manuel Cruz-Hernandez Electro-mechanical transducer suitable for tactile display and article conveyance
JP3414359B2 (en) 2000-05-12 2003-06-09 日本電気株式会社 Method of transmitting perceptual information of mobile phone and mobile phone with perceptual information transmitting function
JP4420364B2 (en) 2000-06-06 2010-02-24 Smk株式会社 Touch panel pressure generating device
JP3853572B2 (en) 2000-06-07 2006-12-06 日本電気株式会社 Mobile communication terminal device and incoming call identification method used therefor
US7159008B1 (en) 2000-06-30 2007-01-02 Immersion Corporation Chat interface with haptic feedback functionality
JP2002022478A (en) 2000-07-06 2002-01-23 Seiko Instruments Inc Portable type gps receiver, navigator and navigation system
DE10034507C1 (en) 2000-07-15 2002-02-21 Schott Glas Process for the production of microstructures on glass or plastic substrates according to the hot molding technology and associated molding tool
JP3949912B2 (en) 2000-08-08 2007-07-25 株式会社エヌ・ティ・ティ・ドコモ Portable electronic device, electronic device, vibration generator, notification method by vibration and notification control method
JP2002057759A (en) 2000-08-09 2002-02-22 Nec Saitama Ltd Foldable portable telephone set
US6639582B1 (en) 2000-08-10 2003-10-28 International Business Machines Corporation System for combining haptic sensory-motor effects from two separate input devices into resultant sensory-motor effects and for feedback of such resultant effects between the input devices
JP3943876B2 (en) * 2000-08-11 2007-07-11 アルプス電気株式会社 INPUT DEVICE AND ELECTRONIC DEVICE HAVING THE SAME
US6819922B1 (en) 2000-08-14 2004-11-16 Hewlett-Packard Development Company, L.P. Personal digital assistant vehicle interface and method
JP2002111813A (en) 2000-08-24 2002-04-12 Sony Internatl Europ Gmbh Portable communication unit of radio communication system
EP1182851A1 (en) 2000-08-24 2002-02-27 Sony International (Europe) GmbH Input device for a portable communication device
JP3602040B2 (en) 2000-09-07 2004-12-15 Necアクセステクニカ株式会社 Personal digital assistant
DE10046099A1 (en) 2000-09-18 2002-04-04 Siemens Ag Touch sensitive display with tactile feedback
KR100491606B1 (en) 2000-09-29 2005-05-27 산요덴키가부시키가이샤 Folder type communication terminal device and the display control method of the same
US7533342B1 (en) 2000-09-29 2009-05-12 Palmsource, Inc. System and method of a personal computer device providing telephone capability
JP3926090B2 (en) 2000-10-04 2007-06-06 日本電気株式会社 Mobile communication terminal device and melody synchronous display control method used therefor
AU2002213017A1 (en) 2000-10-10 2002-04-22 Motorola Inc., A Corporation Of The State Of Delaware Data entry device
US6963839B1 (en) 2000-11-03 2005-11-08 At&T Corp. System and method of controlling sound in a multi-media communication application
US20020107936A1 (en) 2000-12-13 2002-08-08 Amon Thomas C. System and method for displaying emergency information on a user computer
GB2370353A (en) 2000-12-20 2002-06-26 Nokia Mobile Phones Ltd Navigation system
US6727916B1 (en) 2000-12-21 2004-04-27 Sprint Spectrum, L.P. Method and system for assisting a user to engage in a microbrowser-based interactive chat session
WO2002056272A1 (en) 2000-12-27 2002-07-18 Nokia Corporation Portable vibrating device with adjustable amplitude
US7463249B2 (en) 2001-01-18 2008-12-09 Illinois Tool Works Inc. Acoustic wave touch actuated switch with feedback
US6944482B2 (en) 2001-01-22 2005-09-13 Wildseed Ltd. Visualization supplemented wireless mobile telephony
US6418323B1 (en) 2001-01-22 2002-07-09 Wildseed, Ltd. Wireless mobile phone with Morse code and related capabilities
JP3703726B2 (en) 2001-03-02 2005-10-05 株式会社東芝 Mobile communication terminal device
JP2002259059A (en) 2001-03-05 2002-09-13 Sony Corp Input device
US7567232B2 (en) 2001-03-09 2009-07-28 Immersion Corporation Method of using tactile feedback to deliver silent status information to a user of an electronic device
US6885876B2 (en) 2001-03-12 2005-04-26 Nokia Mobile Phones Ltd. Mobile phone featuring audio-modulated vibrotactile module
US6981223B2 (en) 2001-03-19 2005-12-27 Ecrio, Inc. Method, apparatus and computer readable medium for multiple messaging session management with a graphical user interface
US9625905B2 (en) 2001-03-30 2017-04-18 Immersion Corporation Haptic remote control for toys
DE10117956B4 (en) 2001-04-10 2004-04-08 Schott Glas Touch switch with a control surface
US6834373B2 (en) 2001-04-24 2004-12-21 International Business Machines Corporation System and method for non-visually presenting multi-part information pages using a combination of sonifications and tactile feedback
US6636202B2 (en) 2001-04-27 2003-10-21 International Business Machines Corporation Interactive tactile display for computer screen
US6447069B1 (en) 2001-05-08 2002-09-10 Collins & Aikman Products Co. Quick connect/disconnect apparatus for removably securing armrests to vehicle seating assemblies
US6963762B2 (en) 2001-05-23 2005-11-08 Nokia Corporation Mobile phone using tactile icons
GB0114458D0 (en) * 2001-06-14 2001-08-08 Lucas Industries Ltd An in-vehicle display system
US20020194246A1 (en) 2001-06-14 2002-12-19 International Business Machines Corporation Context dependent calendar
US20020193125A1 (en) 2001-06-15 2002-12-19 Bryan Smith Method and apparatus for a user to avoid unintentional calls in a mobile telephone network
US20030002682A1 (en) 2001-07-02 2003-01-02 Phonex Broadband Corporation Wireless audio/mechanical vibration transducer and audio/visual transducer
US20030022701A1 (en) * 2001-07-25 2003-01-30 Aloke Gupta Buttonless communication device with touchscreen display
US20030045266A1 (en) 2001-08-08 2003-03-06 Staskal Duane J. Mobile wireless communication devices with airtime accounting and methods therefor
US20030048260A1 (en) * 2001-08-17 2003-03-13 Alec Matusis System and method for selecting actions based on the identification of user's fingers
CA2398798A1 (en) * 2001-08-28 2003-02-28 Research In Motion Limited System and method for providing tactility for an lcd touchscreen
DE10144634A1 (en) * 2001-09-11 2003-04-10 Trw Automotive Electron & Comp operating system
US7623114B2 (en) 2001-10-09 2009-11-24 Immersion Corporation Haptic feedback sensations based on audio output from computer devices
JP3798287B2 (en) * 2001-10-10 2006-07-19 Smk株式会社 Touch panel input device
US6940497B2 (en) 2001-10-16 2005-09-06 Hewlett-Packard Development Company, L.P. Portable electronic reading apparatus
US7127271B1 (en) 2001-10-18 2006-10-24 Iwao Fujisaki Communication device
US6987988B2 (en) 2001-10-22 2006-01-17 Waxess, Inc. Cordless and wireless telephone docking station with land line interface and switching mode
EP2793101A3 (en) 2001-11-01 2015-04-29 Immersion Corporation Method and apparatus for providing tactile feedback sensations
US20030095105A1 (en) 2001-11-16 2003-05-22 Johannes Vaananen Extended keyboard
US7009595B2 (en) 2002-01-03 2006-03-07 United States Of America Extended refreshable tactile graphic array for scanned tactile display
US20030128191A1 (en) 2002-01-07 2003-07-10 Strasser Eric M. Dynamically variable user operable input device
US8004496B2 (en) 2002-01-08 2011-08-23 Koninklijke Philips Electronics N.V. User interface for electronic devices for controlling the displaying of long sorted lists
JP2003288158A (en) 2002-01-28 2003-10-10 Sony Corp Mobile apparatus having tactile feedback function
US20030184574A1 (en) 2002-02-12 2003-10-02 Phillips James V. Touch screen interface with haptic feedback device
US7373120B2 (en) 2002-03-13 2008-05-13 Nokia Corporation Mobile communication terminal
US7171191B2 (en) 2002-04-08 2007-01-30 Gateway Inc. User dynamically definable centralized notification between portable devices
US7369115B2 (en) 2002-04-25 2008-05-06 Immersion Corporation Haptic devices having multiple operational modes including at least one resonant mode
US6710518B2 (en) 2002-05-31 2004-03-23 Motorola, Inc. Manually operable electronic apparatus
US20030236729A1 (en) 2002-06-21 2003-12-25 Kenneth Epstein Systems and methods of directing, customizing, exchanging, negotiating, trading and provisioning of information, goods and services to information users
US20040204049A1 (en) 2002-08-16 2004-10-14 High Tech Computer, Corp. Cover for a hand-held device
US7496631B2 (en) 2002-08-27 2009-02-24 Aol Llc Delivery of an electronic communication using a lifespan
US6990333B2 (en) 2002-11-27 2006-01-24 Microsoft Corporation System and method for timed profile changes on a mobile device
US8803795B2 (en) 2002-12-08 2014-08-12 Immersion Corporation Haptic communication devices
US7779166B2 (en) 2002-12-08 2010-08-17 Immersion Corporation Using haptic effects to enhance information content in communications
TWI221068B (en) 2003-03-27 2004-09-11 Benq Corp Communication apparatus for demonstrating non-audio message by decoded vibrations
US7363060B2 (en) 2003-05-02 2008-04-22 Nokia Corporation Mobile telephone user interface
US20050048955A1 (en) 2003-09-03 2005-03-03 Steffen Ring Method and apparatus for initiating a call from a communication device
US7791588B2 (en) 2003-12-22 2010-09-07 Immersion Corporation System and method for mapping instructions associated with haptic feedback
US7248924B2 (en) 2004-10-25 2007-07-24 Medtronic, Inc. Self limited rate response
US20060248183A1 (en) 2005-04-28 2006-11-02 Microsoft Corporation Programmable notifications for a mobile device

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090037840A1 (en) * 2007-08-03 2009-02-05 Siemens Medical Solutions Usa, Inc. Location Determination For Z-Direction Increments While Viewing Medical Images
US20100164895A1 (en) * 2008-12-31 2010-07-01 Samsung Electronics Co., Ltd. Apparatus and method for performing scroll function in portable terminal
US8860670B2 (en) * 2008-12-31 2014-10-14 Samsung Electronics Co., Ltd. Apparatus and method for performing scroll function in portable terminal
US9373993B2 (en) 2012-07-07 2016-06-21 Saia-Burgess, Inc. Haptic actuators
US9056244B2 (en) 2012-09-12 2015-06-16 Wms Gaming Inc. Gaming apparatus incorporating targeted haptic feedback
WO2014046390A1 (en) * 2012-09-24 2014-03-27 Lg Electronics Inc. Portable device and control method thereof
US9436341B2 (en) 2012-12-21 2016-09-06 Johnson Electric S.A. Haptic feedback devices
US10019155B2 (en) 2014-06-30 2018-07-10 Honda Motor Co., Ltd. Touch control panel for vehicle control system
US20180321753A1 (en) * 2015-03-08 2018-11-08 Apple Inc. Device, Method, and User Interface for Processing Intensity of Touch Contact
US10558268B2 (en) * 2015-03-08 2020-02-11 Apple Inc. Device, method, and user interface for processing intensity of touch contact
US11099679B2 (en) 2015-03-08 2021-08-24 Apple Inc. Device, method, and user interface for processing intensity of touch contacts
US11556201B2 (en) 2015-03-08 2023-01-17 Apple Inc. Device, method, and user interface for processing intensity of touch contacts

Also Published As

Publication number Publication date
US9280205B2 (en) 2016-03-08
US20030038776A1 (en) 2003-02-27
JP3085481U (en) 2002-05-10
KR20010108361A (en) 2001-12-07
US7768504B2 (en) 2010-08-03
US8031181B2 (en) 2011-10-04
US20080062122A1 (en) 2008-03-13
US9740290B2 (en) 2017-08-22
US7592999B2 (en) 2009-09-22
US20080068348A1 (en) 2008-03-20
US7978183B2 (en) 2011-07-12
US20160252959A1 (en) 2016-09-01
WO2001054109A1 (en) 2001-07-26
US20070229483A1 (en) 2007-10-04
US20120056806A1 (en) 2012-03-08
US20070013677A1 (en) 2007-01-18
US20060187215A1 (en) 2006-08-24
US7728820B2 (en) 2010-06-01
US20140002386A1 (en) 2014-01-02
US20070040815A1 (en) 2007-02-22
KR200258353Y1 (en) 2001-12-29
US20040075676A1 (en) 2004-04-22
AU2001229543A1 (en) 2001-07-31
US20010035854A1 (en) 2001-11-01
US20080068349A1 (en) 2008-03-20
US7982720B2 (en) 2011-07-19
US20060119589A1 (en) 2006-06-08
US8059105B2 (en) 2011-11-15
US20080068350A1 (en) 2008-03-20
US7777716B2 (en) 2010-08-17
US20060192771A1 (en) 2006-08-31
US6429846B2 (en) 2002-08-06
US7944435B2 (en) 2011-05-17
US7148875B2 (en) 2006-12-12
US20080068351A1 (en) 2008-03-20
US7602384B2 (en) 2009-10-13
US20070229478A1 (en) 2007-10-04
US8063893B2 (en) 2011-11-22
US20080111788A1 (en) 2008-05-15
US8049734B2 (en) 2011-11-01

Similar Documents

Publication Publication Date Title
US9740290B2 (en) Haptic feedback for touchpads and other touch controls
US8232969B2 (en) Haptic feedback for button and scrolling action simulation in touch input devices

Legal Events

Date Code Title Description
AS Assignment

Owner name: IMMERSION CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROSENBERG, LOUIS B.;RIEGEL, JAMES R.;SIGNING DATES FROM 20000509 TO 20000510;REEL/FRAME:027229/0599

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE