KR20080096854A - Haptic interface for laptop computers and other portable devices - Google Patents



Publication number
KR20080096854A (application number KR1020087025412A)
South Korea
Application number
Other languages
Korean (ko)
Louis B. Rosenberg
Erik J. Shahoian
Bruce M. Schena
Original Assignee
Immersion Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US 60/274,444 (provisional)
Priority to US 09/917,263, granted as US 6,822,635 B2
Application filed by Immersion Corporation
Publication of KR20080096854A




    • G06F1/00 Details not covered by groups G06F3/00 – G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1643 Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0362 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 1D translations or rotations of an operating part of the device, e.g. scroll wheels, sliders, knobs, rollers or belts
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry


A haptic feedback touch control to provide input to a computer. A touch input device includes a planar touch surface that provides position information to a computer based on a location of user contact. The computer can position a cursor in a displayed graphical environment based at least in part on the position information, or perform a different function. At least one actuator (336) is also coupled to the touch input device and outputs a force to provide a haptic sensation to the user. The actuator (336) can move the touchpad (332) laterally, or a separate surface member can be actuated. A flat E-core actuator, piezoelectric actuator, or other types of actuators can be used to provide forces. The touch input device can include multiple different regions to control different computer functions.


Haptic interface for laptop computers and other portable devices {HAPTIC INTERFACE FOR LAPTOP COMPUTERS AND OTHER PORTABLE DEVICES}

FIELD OF THE INVENTION The present invention relates generally to devices that interface a user with computers and mechanical devices, and more particularly to devices used for interfacing with computer systems and electronic devices that provide haptic feedback to the user.

People interface with electronic and mechanical devices in an ever-growing variety of applications, and the need for more natural, easy-to-use, and informative interfaces is a constant concern. In the context of the present invention, people interface with computer devices in a wide range of applications, including interaction with computer-generated environments such as games, simulations, and application programs. Computer input devices such as mice and trackballs are often used in these applications to provide input and to control a cursor in a graphical environment.

In some interface devices, force feedback or tactile feedback is also provided to the user, collectively referred to herein as "haptic feedback." For example, haptic versions of joysticks, mice, game pads, steering wheels, and other types of devices can output forces to the user based on events or interactions occurring in a graphical environment, such as in a game or other application program.

In portable electronic devices such as laptop computers, a mouse typically occupies too much workspace. As a result, smaller devices such as trackballs are more often used. A device commonly used in portable computers is the "touchpad," a small, rectangular, flat pad provided near the keyboard of the computer. The touchpad senses the position of a pointing object by various sensing technologies, such as capacitive sensors or pressure sensors that detect pressure applied to the touchpad. Most commonly, the user contacts the touchpad with a fingertip and moves the finger on the pad to move a cursor displayed in the graphical environment. In other embodiments, the user may press and move a stylus tip on the touchpad to provide input. Touch screens are similar devices that accept input from a sensing pad layered over a display screen, and are used in devices such as personal digital assistants (PDAs) and other portable electronic devices.

One problem with existing touchpads and touch screens is that they provide no haptic feedback to the user. The touchpad user therefore cannot feel haptic sensations that would inform and assist targeting and other control tasks in the graphical environment. Prior art touchpads also cannot take advantage of haptic-enabled software running on the computer.

The present invention relates to haptic feedback planar touch controls used to provide input to a computer system. This control may be a touchpad provided in a portable computer, or it may be a touch screen in various devices. The haptic sensory output of touch controls enhances interaction and manipulation in the displayed graphics environment or when controlling electronic devices.

More specifically, the present invention relates to a haptic feedback touch control that inputs signals to a computer and outputs forces to a user of the control. The control includes a touch input device having an approximately planar touch surface, operative to input a position signal to a processor of the computer based on the location of the user's contact on the touch surface. One or more actuators may be coupled to the touch input device to output a force that moves the touch input device laterally, approximately parallel to its surface, to provide a haptic sensation to the user contacting the touch surface. The computer may position a cursor in a graphical environment displayed on a display device based at least in part on the position signal. The touch input device may be a separate touchpad or may be included in a touch screen.

In another embodiment, the haptic feedback touch control includes a touch input device having a generally planar touch surface for inputting a position signal to a computer processor, a surface member positioned adjacent to the touch input device such that the user contacts it when operating the touch input device, and an actuator coupled to the surface member. The actuator outputs a force on the surface member to provide a haptic sensation to the user. The surface member may be moved laterally in a plane approximately parallel to the surface of the touch input device; for example, the surface member may be located over the touch input device and span the same extent as its surface. Alternatively, the surface member may be positioned to one side of the touch input device, so that the user may touch the surface member with another finger or the palm while touching the touch input device with one finger. For example, the surface member may be a physical button located adjacent to the touch input device. Contact or inertial forces may be output at the surface member.

In yet another aspect of the invention, an actuator providing a linear force output includes a ferromagnetic piece having a center pole located between two side poles, a coil wound around the center pole, a magnet positioned adjacent to the center and side poles, and a backing plate coupled to the magnet. When current flows through the coil, the backing plate and magnet move relative to the ferromagnetic piece. A roller can be placed between the ferromagnetic piece and the backing plate to allow this movement. A flexure can reduce relative motion between the plate and the ferromagnetic piece in unwanted directions and can provide a spring centering force.

In another aspect, the haptic touch device includes a piezoelectric transducer connected to ground, which includes a metal diaphragm coupled to a ceramic element, and a planar sensing element such as a touchpad. A spacer is provided between the piezoelectric transducer and the planar sensing element, with the metal diaphragm contacting the spacer. A spring element provides a spring restoring force to the planar sensing element.

In still another aspect of the present invention, a method for providing haptic feedback to a touch input device includes receiving a position signal from the touch input device indicating the location at which the user is contacting its surface, and determining in which of a plurality of regions of that surface the contact location lies. A force is output that is associated with the user moving an object over the surface of the touch input device. A function, such as moving a displayed cursor or rate control of a value, can be associated with the region in which the contact location lies. A haptic sensation may be output when the user moves the object across a boundary from one region of the touch input device into another.
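The region-based method above can be sketched as follows. This is an illustrative sketch, not code from the patent: the region names, coordinates, and the `pulse` callback are assumptions chosen to show the idea of mapping a contact location to a region and emitting a haptic event on a boundary crossing.

```python
# Illustrative sketch (not from the patent): mapping a contact location to
# one of several touchpad regions and emitting a pulse on boundary crossing.
# Region names, coordinates, and the pulse callback are assumptions.

def make_regions():
    # Each region: (name, x_min, y_min, x_max, y_max), in touchpad units.
    return [
        ("cursor",  0, 0, 40, 60),   # main cursor-control area
        ("scroll", 40, 0, 48, 60),   # right strip: rate control of a value
    ]

def locate(regions, x, y):
    """Return the name of the region containing (x, y), or None."""
    for name, x0, y0, x1, y1 in regions:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def track(regions, positions, pulse):
    """Call pulse(old, new) whenever the contact crosses a region boundary."""
    current = None
    for x, y in positions:
        region = locate(regions, x, y)
        if region != current and current is not None and region is not None:
            pulse(current, region)   # haptic sensation at the boundary
        current = region
    return current
```

In use, `pulse` would command the actuator; here it could simply record the crossing for inspection.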

The present invention advantageously provides haptic feedback for flat touch control devices of a computer, such as touchpads and touch screens. Haptic feedback can assist and inform the user of interactions and events in a graphical user interface or other environment, and can facilitate cursor targeting. The present invention also enables portable computer devices with such touch controls to take advantage of existing haptic-enabled software. The haptic touch devices disclosed herein are inexpensive, compact, and low in power consumption, and so can easily be incorporated into a wide variety of portable and desktop computers and electronic devices.

These and other advantages of the present invention will become apparent to those skilled in the art from the following detailed description of the invention and various figures.

According to embodiments of the present invention, it is possible to provide a haptic sensation that informs and assists a user's targeting and other control tasks in a graphical environment. It enables the use of haptic-enabled software running on a computer, and the haptic sensory output of touch controls enhances interaction and manipulation in the displayed graphics environment or when controlling electronic devices.

FIG. 1 is a perspective view of a portable computer 10 including a haptic touchpad of the present invention. Computer 10 may be a portable or "laptop" computer that can be carried or transported by the user, and may be powered by batteries or other portable energy sources as well as by fixed power sources. Computer 10 preferably executes one or more host application programs with which the user interacts via peripherals.

Computer 10 may include a variety of input and output devices, as shown: a display device 12 for outputting graphical images to the user, a keyboard for providing text or toggle input from the user to the computer, and a touchpad 16 of the present invention. Display device 12 may be any of a variety of display types; flat panel displays are most common in portable computers. Display device 12 may display a graphical environment 18 based on an application program and/or operating system running on a CPU of computer 10, such as a graphical user interface (GUI), which may include a cursor 20 controllable by user input, a window 22, an icon 24, and other graphical objects well known in GUI environments. Other graphical environments or images may also be displayed, such as games, movies or other presentations, spreadsheets, or other application programs.

Other devices, such as storage devices (hard disk drives, DVD-ROM drives, etc.), network servers or clients, and game controllers, can also be connected to or integrated into the computer 10. In other embodiments, the computer 10 can take many different forms, including a computer device placed on a table or other surface, a stand-up arcade game machine, another portable device, or a device worn by a person or carried and used with one hand. For example, host computer 10 may be a video game console, personal computer, workstation, television "set-top box" or "network computer," or other computing or electronic device.

The touchpad device 16 of the present invention preferably appears externally similar to prior art touchpads. In many of the embodiments disclosed herein, touchpad 16 includes a planar, rectangular (or otherwise shaped) smooth surface which, as shown, can be located below the keyboard on the computer housing, or in another area of the housing. When operating the computer, the user may conveniently place a fingertip or other object on the touchpad 16 and move the fingertip to cause the cursor 20 to move correspondingly in the graphical environment 18.

In operation, the touchpad 16 inputs coordinate data to the main microprocessor of the computer 10 based on the position of an object sensed on (or near) the touchpad. Like many prior art touchpads, the touchpad 16 can use capacitive, resistive, or other types of sensing. Capacitive touchpads typically sense the position of an object on or near the touchpad surface based on capacitive coupling between the touchpad and the object. Resistive touchpads are typically pressure sensitive: pressure applied to the pad by a finger, stylus, or other object causes conductive layers, traces, switches, etc. in the pad to make electrical contact. Some resistive and other types of touchpads can detect the amount of pressure applied by the user, and can use the degree of pressure to provide a variable or proportional input to the computer 10. Resistive touchpads are also typically at least partially deformable, so that when pressure is applied at a particular location, the conductors at that location come into electrical contact. Such deformability can be useful in the present invention, because it allows the touchpad to amplify the magnitude of output forces such as pulses or vibrations: forces can be amplified when a compliant suspension is provided between the actuator and the moved object. Capacitive touchpads and other touchpad types that do not require contact pressure may be more suitable for the present invention in some embodiments, because excessive pressure on the touchpad may in some cases interfere with the touchpad motion used for haptic feedback. Other sensing technologies can also be used in the touchpad. The term "touchpad" herein preferably includes the surface of the touchpad 16 as well as any sensing apparatus included in the touchpad unit.
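The variable, pressure-proportional input mentioned above can be sketched as a simple mapping from a raw pressure reading to a normalized input value. The threshold and full-scale constants here are illustrative assumptions, not values from the patent.

```python
# Hedged sketch: converting a sensed pressure reading into a proportional
# input value, as some resistive touchpads allow. Calibration constants
# are illustrative assumptions only.

def pressure_to_input(raw, threshold=5, full_scale=100):
    """Map a raw pressure reading to 0.0..1.0; below threshold means no contact."""
    if raw < threshold:
        return 0.0
    span = full_scale - threshold
    return min((raw - threshold) / span, 1.0)
```

A host could then scale, say, a scrolling rate by the returned value rather than treating contact as binary.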

The touchpad 16 can operate like existing touchpads, where the speed of the fingertip on the touchpad is related to the distance the cursor moves in the graphical environment. For example, if the user moves a finger quickly across the pad, the cursor moves a greater distance than if the user moves the fingertip slowly. If the user's finger reaches the edge of the touchpad before the cursor reaches its intended destination in that direction, the user can simply lift the finger from the touchpad and reposition it away from the edge to continue moving the cursor. This is an "indexing" function similar to lifting a mouse off its surface to change the offset between mouse position and cursor position. In addition, many touchpads have specific regions, each assigned to a particular function that can be independent of the cursor position. Such embodiments are described in more detail below with reference to FIG. 18. In some embodiments, the user may "tap" a particular location on the touchpad 16 (quickly touching and releasing the pad) to provide a command. For example, the user may tap or "double tap" the pad while the controlled cursor is over an icon to select that icon.
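The speed-dependent cursor mapping described above (often called "ballistics") can be sketched with a two-level gain. The gain values and speed breakpoint are assumptions chosen purely for illustration; real drivers use smoother curves.

```python
# Illustrative sketch of velocity-dependent cursor gain ("ballistics"):
# faster finger motion moves the cursor proportionally farther.
# Gain values and the speed breakpoint are illustrative assumptions.

def cursor_delta(finger_dx, finger_dy, dt, slow_gain=1.0, fast_gain=4.0,
                 speed_break=100.0):
    """Scale a finger displacement into a cursor displacement based on speed."""
    speed = (finger_dx ** 2 + finger_dy ** 2) ** 0.5 / dt  # units per second
    gain = fast_gain if speed > speed_break else slow_gain
    return finger_dx * gain, finger_dy * gain
```

Indexing falls out naturally: while the finger is lifted there are no displacement reports, so the cursor holds its position until contact resumes elsewhere on the pad.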

In the present invention, the touchpad 16 can also output haptic feedback, such as tactile sensations, to the user in physical contact with it. Various embodiments of haptic feedback touchpad structures are described in detail below. Some embodiments move the device housing or a separate movable surface rather than the touchpad itself.

Using one or more actuators coupled to the touchpad 16 or to an adjacent surface, a variety of haptic sensations can be output to the user in contact with the touchpad (or housing or separate surface). For example, jolts, vibrations (of varying or constant amplitude), and textures can be output. Forces output to the user can be based at least in part on the position of the finger on the pad or on the state of a controlled object in the graphical environment of the host computer 10, and/or can be independent of finger position or object state. Because a microprocessor or other electronic controller uses electronic signals to control the direction and/or magnitude of the actuator's force output, such force output to the user is considered "computer-controlled."
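A vibration of commanded magnitude and frequency, as described above, amounts to generating a periodic drive waveform for the actuator. This is a minimal sketch under assumed names; the sample rate and sinusoidal waveform are illustrative choices, not details from the patent.

```python
# Hedged sketch: generating drive samples for a vibration effect of given
# magnitude and frequency. Sample rate and waveform are illustrative choices.
import math

def vibration_samples(magnitude, frequency_hz, duration_s, sample_rate=1000):
    """Return a list of actuator drive values in [-magnitude, magnitude]."""
    n = int(duration_s * sample_rate)
    return [magnitude * math.sin(2 * math.pi * frequency_hz * i / sample_rate)
            for i in range(n)]
```

A jolt is simply the degenerate case: one short burst of maximum drive followed by zero output.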

In other embodiments, the touchpad 16 can be provided in a separate housing connected to a port of the computer via a cable or wireless transmission, receiving force information from the computer 10 and sending position information to the computer 10. For example, the touchpad 16 can be connected to the computer 10 via Universal Serial Bus (USB), FireWire, or a standard serial bus.

One or more buttons 26 used in conjunction with the touchpad 16 can be provided on the housing of the computer 10, within easy reach of the user's hand. Each button can be pressed by the user to provide an additional input signal to the host computer 10. Typically each button 26 corresponds to a similar button on a mouse input device, so that the left button can be used to select graphical objects (click or double click) and the right button can be used to bring up a context menu or the like. In some embodiments, one or more of the buttons 26 can be provided with haptic feedback.

In addition, some embodiments include one or more movable portions 28 of the housing of the computer device 10, which are contacted by the user when operating the touchpad 16 and which can be provided with haptic feedback. Movable housing portions for haptic feedback are described in US Pat. Nos. 6,184,868 and 6,088,019. In some embodiments, both the housing can provide haptic feedback (e.g., through an eccentric rotating mass on a motor coupled to the housing) and the touchpad 16 can provide separate haptic feedback. This allows the host to control two different haptic sensations to the user simultaneously; for example, a low-frequency vibration can be conveyed to the user through the housing while a high-frequency vibration is conveyed through the touchpad 16. Different buttons or other controls providing haptic feedback can each provide tactile feedback independently of the other controls.

The host application program and/or operating system preferably displays a graphical image environment on the display device 12. The software and environments running on the host computer 10 can vary widely. For example, the host application program can be a word processor, spreadsheet, movie, video or computer game, drawing program, operating system, graphical user interface, simulation, web page or browser that implements HTML or VRML instructions, scientific analysis program, virtual reality training program or application, or another application program that uses input from the touchpad 16 and outputs force feedback commands to the touchpad 16. For example, many game and other application programs include force feedback functionality and communicate with the touchpad 16 using standard protocols/drivers such as I-Force®, FEELit®, or TouchSense™ from Immersion Corporation of San Jose, California.

The touchpad 16 includes the circuitry necessary to report control signals to the microprocessor of the host computer 10 and to process command signals from the host's microprocessor. For example, appropriate sensors (and associated circuitry) are used to report the position of the user's finger on the touchpad 16. The touchpad device also includes circuitry that receives signals from the host and outputs tactile sensations using one or more actuators in accordance with the host signals. Some touchpads integrate several of these components and circuits on a printed circuit board (PCB). In some embodiments, a separate local microprocessor can be provided for the touchpad 16 to report touchpad sensor data to the host and to implement force commands received from the host; such commands can include, for example, parameters describing the characteristics of the commanded haptic sensation and the type of haptic effect. Alternatively, the touchpad microprocessor can simply pass streamed data from the main processor to the actuators. The term "force information" can include commands/parameters and/or streamed data. The touchpad microprocessor can implement haptic sensations independently after receiving a host command by controlling the touchpad actuators; alternatively, the host processor can control the actuators more directly to maintain a greater degree of control over the haptic sensations. In other embodiments, logic circuitry such as a state machine provided for the touchpad can handle haptic sensations as instructed by the host's main processor. Architectures and control methods that can be used to read sensor signals and provide haptic feedback in such devices are described in more detail in US Pat. No. 5,734,373.
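The host-command path above, where a local microprocessor decodes a command whose parameters describe the effect type and characteristics, can be sketched as follows. The packet format, field names, and effect behavior here are invented for illustration; they are not Immersion's actual protocol.

```python
# Illustrative sketch (names and packet format are assumptions, not
# Immersion's protocol): a local touchpad microprocessor parsing a host
# force command and locally implementing the commanded effect.
from dataclasses import dataclass

@dataclass
class ForceCommand:
    effect: str        # e.g. "pulse", "vibration", "texture"
    magnitude: float   # normalized 0.0 .. 1.0
    duration_ms: int

def parse_command(packet):
    """Decode an 'effect:magnitude:duration' text packet from the host."""
    effect, magnitude, duration = packet.split(":")
    return ForceCommand(effect, float(magnitude), int(duration))

def execute(cmd, drive_actuator):
    """Greatly simplified local implementation of a host-commanded effect."""
    if cmd.effect == "pulse":
        drive_actuator(cmd.magnitude)   # single jolt
        drive_actuator(0.0)
    elif cmd.effect == "vibration":
        for _ in range(max(cmd.duration_ms // 10, 1)):  # crude 100 Hz toggle
            drive_actuator(cmd.magnitude)
            drive_actuator(-cmd.magnitude)
```

The alternative architecture in the text, where the host streams force values directly, would bypass `parse_command` and feed `drive_actuator` sample by sample.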

The touchpad embodiments described herein can be based on existing touchpads, such as those manufactured by Synaptics Inc. The standard surface material for such touchpads is textured Mylar; the textured surface works best when operated with the user's finger, although other suitable objects on the touchpad surface can typically also be detected. The touchpad can also sense through a thin overlay. Typically there is space available for the addition of haptic feedback components; for example, more than half of the board area in a 40 × 60 mm touchpad can be used for haptic circuitry.

Many touchpads include a "palm check" feature, which allows the laptop to detect whether the user is touching the touchpad with a finger, a palm, or another part of the hand. Since the user may wish to rest the palm on the pad without providing input, the palm check feature can ignore input determined to have been provided by the user's palm. Basically, the palm check feature computes the contact area created by a conductive object (finger, palm, arm, etc.); if the contact area exceeds a certain threshold, the contact is rejected. This feature may not be operative in many embodiments.
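The palm-check heuristic just described reduces to a single area comparison. A minimal sketch, assuming an arbitrary threshold value and a pre-computed contact area (neither comes from the patent):

```python
# Minimal sketch of the "palm check" heuristic: reject contacts whose
# area exceeds a threshold. Threshold value is an illustrative assumption.

PALM_AREA_THRESHOLD = 120.0  # square touchpad units; illustrative only

def accept_contact(contact_area):
    """Return True if the contact looks like a fingertip rather than a palm."""
    return contact_area <= PALM_AREA_THRESHOLD
```

Real controllers estimate the contact area from the set of sensor cells activated by the conductive object before applying such a test.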

FIG. 2 is a perspective view of another embodiment of a device 30 that can include the active touchpad 16 of the present invention. The device may be a handheld remote control device 30, held by the user in one hand, used to remotely operate the controls of an electronic device or appliance, such as a television, video cassette recorder or DVD player, audio/video receiver, network computer, or device providing access to Internet functions. For example, several buttons 32 can be included on the remote control device 30 to operate functions of the controlled device. A touchpad 16 can also be provided to allow the user to give more sophisticated directional input. For example, the controlled device may present a selection screen on which a cursor is moved, and the touchpad can be operated to control the cursor in two dimensions. The touchpad 16 can output haptic sensations to the user, as described herein, based on a controlled value or event. For example, a pulse can be output to the user through the touchpad when a controlled volume level passes a midpoint or reaches a maximum level.
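The volume example above can be sketched as a small event function: given the old and new values of the controlled quantity, decide which haptic pulses to emit. The range, event names, and midpoint rule are assumptions for illustration.

```python
# Hedged sketch of the remote-control behavior described above: emit a
# haptic pulse when a controlled value (e.g. volume) crosses its midpoint
# or reaches its maximum. Range and event names are assumptions.

def volume_events(old, new, minimum=0, maximum=100):
    """Return haptic events triggered by a volume change from old to new."""
    events = []
    midpoint = (minimum + maximum) / 2
    if (old < midpoint <= new) or (new <= midpoint < old):
        events.append("midpoint_pulse")   # crossed the middle in either direction
    if new >= maximum > old:
        events.append("max_pulse")        # just reached the top of the range
    return events
```

Each returned event would be translated into a pulse command for the touchpad actuator.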

In one application, the controlled device may be a computer system, such as another computing device that displays web pages and / or graphical user interfaces accessed through a network such as Microsoft Corp.'s web TV or the Internet. The user may control the direction of the cursor by moving a finger (or other object) on the touch pad 16. The cursor can be used to select and / or manipulate icons, windows, menu items, graphical buttons, slide bars, scroll bars, or other graphical objects of a graphical user interface or desktop interface. Cursors can also be used to select and / or manipulate graphical objects of web pages, such as links, images, buttons, and the like. Other force sensations associated with graphical objects are described below with reference to FIG. 18.

FIG. 3 is a plan view of another computer device embodiment 50 that can include any of the haptic device embodiments of the present invention. The device may take the form of a portable computer device such as a "personal digital assistant (PDA)," "pen-based" computer, "web pad," "e-book," or similar device. In addition to devices accepting button input, devices that allow the user to enter or read information in various ways by touching the display screen are also relevant to the present invention. Such devices include the Palm Pilot® from 3Com Corp. and similar products, pocket-size computer devices from Casio, Hewlett-Packard, or other manufacturers, e-books, cellular phones or pagers having touch screens, laptop computers having touch screens, and the like.

In the embodiment of device 50, a display screen 52 mounted adjacent to the housing 54 can occupy a large portion of the surface of the computer device 50. Screen 52 is preferably a flat panel display, well known to those skilled in the art, that can display text, images, animations, and the like; in some embodiments, screen 52 can be similar to a standard personal computer screen. Display screen 52 includes a sensing device that allows the user to enter information into the computer device 50 by physically contacting the screen 52 (i.e., the screen is another type of planar "touch device," similar to the touchpad 16 of FIG. 1). For example, a transparent sensing film can be overlaid on the screen 52, where the film detects pressure from objects contacting it. Sensing devices for implementing touch screens are well known to those skilled in the art.

The user can select a graphically displayed button or other graphical object by pressing, with a finger or a stylus, the exact location at which the graphical object is displayed on screen 52. In addition, some embodiments display a graphical "ink" image 56 at the tip of a stylus, such as stylus 57, or at locations where the user presses with a finger or other object, thereby allowing the user to "draw" or "write." Handwritten characters can be recognized as commands, data, or other input by software running on the device's microprocessor. In other embodiments, the user can additionally or alternatively provide input via speech recognition, where a microphone on the device receives the user's voice, which is interpreted as appropriate commands or data by software running on the device. Physical buttons 58 can be included in the housing of the device 50 to provide particular commands to the device 50 when pressed. Many PDAs have no standard keyboard for text entry; rather, other input modes are used, such as drawing characters on the screen with a stylus or using speech recognition. However, some PDAs include full-function keyboards as well as touch screens, although such keyboards are typically smaller than standard-size keyboards. In other embodiments, a standard-size laptop computer with a standard keyboard can include a flat-panel touch-input display screen (similar to screen 12 of FIG. 1), and this screen can be provided with haptic feedback according to the present invention.

In some embodiments of the invention, the touch screen 52 can provide haptic feedback to the user, similar to the touchpad 16 described in the previous embodiments. One or more actuators can be coupled to the touch screen, or to a movable surface near it, in ways similar to the embodiments described below. The user can feel the haptic feedback through a finger or a held object, such as the stylus 57, in contact with the screen 52.

The touch screen 52 can be coupled to the housing 54 of device 50 by one or more spring or compliant elements, such as helical springs, leaf springs, flexures, or compliant material (foam, rubber, etc.), allowing the screen to move approximately along the z axis and thereby provide haptic feedback. The screen may also be coupled by flexures or other couplings that allow side-to-side (x and/or y) motion, similar to the preferred embodiments described below.
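To first order, a screen suspended on spring or compliant elements behaves like a damped spring-mass system: the actuator applies a force along z, and the suspension returns the screen to its rest position. The following sketch illustrates that behavior numerically; all parameter values and function names are illustrative assumptions, not values from the patent.

```python
# Euler integration of a damped spring-mass model of the suspended
# screen: m*a = F_actuator - k*z - c*v. All parameters are illustrative.
def simulate_screen(force_fn, m=0.05, k=2000.0, c=2.0, dt=1e-4, steps=1000):
    """Return the z-displacement history of the screen (meters)."""
    z, v = 0.0, 0.0
    history = []
    for i in range(steps):
        t = i * dt
        a = (force_fn(t) - k * z - c * v) / m   # Newton's second law
        v += a * dt
        z += v * dt
        history.append(z)
    return history

# A brief "pulse": constant actuator force for the first 5 ms, then released.
history = simulate_screen(lambda t: 1.0 if t < 0.005 else 0.0)
```

With these example values the screen deflects during the pulse, then rings down toward its rest position, which is the qualitative behavior the compliant mounting is meant to provide.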

FIG. 4 is a block diagram illustrating a haptic feedback system suitable for use with any of the embodiments described herein in accordance with the present invention. The haptic feedback system includes a host computer system 74 and an interface device 72.

The host computer system 74 preferably includes a host microprocessor 100, a clock 102, a display screen 76, and an audio output device 104. The host computer also includes other well-known components, such as random access memory (RAM), read-only memory (ROM), and input/output (I/O) electronics (not shown).

As discussed above, host computer 74 can be a personal computer, such as a laptop computer, operating under any well-known operating system. Alternatively, host computer system 74 can be one of a variety of home video game console systems commonly connected to a television set or other display, such as systems available from Nintendo, Sega, Sony, or Microsoft. In other embodiments, host computer 74 can be a "set top box" or other electronic device through which a user can provide input. Computer 74 can also be a portable hand-held computer such as a PDA, or a vehicle computer, a stand-up arcade machine, a workstation, or the like.

The host computer 74 preferably implements a host application program with which the user interacts via interface device 72, which includes haptic feedback functionality. For example, the host application program can be a video game, a word processor or spreadsheet, a browser or web page implementing HTML or VRML instructions, a scientific analysis program, a movie player, a virtual reality training program or application, or another application program that utilizes input from device 72 and outputs haptic feedback commands to device 72. For simplicity, operating systems such as Windows™, MS-DOS, MacOS, Unix, Palm OS, etc. are also referred to herein as "application programs." Computer 74 can provide a "graphical environment," which can be a graphical user interface, game, simulation, or other visual environment, and can include graphical objects such as icons, windows, and game objects. Suitable software drivers that interface such software with computer input/output (I/O) devices are available from Immersion Corporation of San Jose, California.

Display device 76 can be included in host computer 74 and can be a standard display screen (LCD, CRT, plasma, flat panel, etc.), 3-D goggles, or any other visual output device. Audio output device 104, such as a speaker, is preferably coupled to host microprocessor 100 to provide sound output to the user. Other types of peripherals, such as storage devices (hard disk drive, CD-ROM drive, floppy disk drive, etc.) and other input/output devices, can also be coupled to the host processor.

The interface device 72 is coupled to computer 74 by a bus 80, which communicates signals between device 72 and computer 74 and, in some embodiments, may also provide power to device 72. In other embodiments, signals can be exchanged between device 72 and computer 74 by wireless transmission and reception. In some embodiments, the power for the actuator can be supplemented, or supplied entirely, by a power storage device provided on the device, such as a capacitor or one or more batteries. Bus 80 can be bidirectional, sending signals in either direction between host 74 and device 72. Bus 80 can be a serial interface bus, such as an RS-232 serial interface, RS-422, Universal Serial Bus (USB), MIDI, or another protocol well known to those skilled in the art, or can be a parallel bus or a wireless link.

Device 72 can be a separate device from host 74 with its own housing, or can be integrated into the host computer housing as in the laptop computer of FIG. 1. Device 72 can include or be coupled to a dedicated local processor 110. Processor 110 is considered local to device 72, where "local" means that processor 110 is a processor separate from any host processor in host computer system 74. "Local" can also mean that processor 110 is dedicated to the haptic feedback and sensor I/O of device 72. Processor 110 can be provided with software instructions (e.g., firmware) to wait for commands or requests from computer host 74, decode those commands or requests, and handle or control input and output signals in accordance with them. In addition, processor 110 can operate independently of host computer 74 by reading sensor signals and calculating appropriate forces from the sensor signals, timing signals, and stored or relayed instructions selected in accordance with a host command. Microprocessors suitable for use as local processor 110 include low-end microprocessors as well as more sophisticated force feedback processors such as the Immersion TouchSense Processor. Processor 110 can include a single microprocessor chip, multiple processors and/or co-processor chips, a digital signal processor (DSP), and/or logic such as a state machine, an ASIC, or the like.

Processor 110 receives signals from sensors 112 and provides signals to actuator 88 in accordance with instructions provided by host computer 74 over bus 80. For example, in a local control embodiment, host computer 74 provides high-level supervisory commands (e.g., a command identifier and one or more parameters indicating a tactile sensation) to processor 110 over bus 80, and processor 110 decodes the commands and manages low-level force control loops to the sensors and actuators in accordance with the high-level commands, independently of host computer 74. This operation is described in greater detail in US Pat. No. 5,734,373. In the host control loop, force commands are output from the host computer to processor 110, instructing the processor to output a force or force sensation having specified characteristics. Local processor 110 reports data to the host computer, such as locative data describing the position of the device in one or more provided degrees of freedom; the data can also describe the state of buttons, switches, and the like. The host computer uses the locative data to update the program it executes. In the local control loop, actuator signals are provided from processor 110 to actuator 88, and sensor signals are provided from sensors 112 and other input devices 118 to processor 110. The term "tactile sensation" herein refers to a single force or a sequence of forces output by the actuator that provides a sensation to the user; for example, vibrations, jolts, or texture sensations can each be considered a tactile sensation. Processor 110 can process the input sensor signals to determine appropriate output actuator signals by following stored instructions. The processor can use the sensor signals both in the local determination of forces to be output and to derive the locative data reported to the host computer.
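The division of labor just described can be sketched in code: the host sends a high-level supervisory command once, and the local processor decodes it and runs the low-level loop on its own, reporting locative data back up. This is a minimal illustration only; the class and parameter names (`HapticCommand`, `LocalProcessor`, etc.) are assumptions, not the patent's firmware interface.

```python
import math
from dataclasses import dataclass

@dataclass
class HapticCommand:
    """High-level supervisory command: an identifier plus parameters."""
    effect: str          # e.g. "vibration"
    magnitude: float     # normalized force magnitude, 0.0 .. 1.0
    frequency_hz: float

class LocalProcessor:
    """Decodes host commands and manages the low-level force loop."""
    def __init__(self):
        self.command = None

    def receive_command(self, cmd: HapticCommand):
        self.command = cmd   # decode/store the host's supervisory command

    def control_step(self, t: float, sensor_position: tuple):
        """One iteration of the local loop: read sensors, compute the
        actuator force, and build the report sent back to the host."""
        force = 0.0
        if self.command and self.command.effect == "vibration":
            force = self.command.magnitude * math.sin(
                2 * math.pi * self.command.frequency_hz * t)
        report = {"position": sensor_position}   # locative data for the host
        return force, report

proc = LocalProcessor()
proc.receive_command(HapticCommand("vibration", magnitude=0.5, frequency_hz=50))
force, report = proc.control_step(t=0.005, sensor_position=(10, 20))
```

Note that after `receive_command`, `control_step` can be called repeatedly without further host traffic, which is the point of the local control loop.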

In yet other embodiments, other hardware can be provided locally in device 72, in place of processor 110, to provide functionality similar to that of processor 110. For example, a hardware state machine incorporating fixed logic can be used to provide signals to actuator 88, receive sensor signals from sensors 112, and output tactile signals.

In another, host-controlled embodiment, host computer 74 can provide low-level force commands over bus 80, which are transmitted directly to the actuator 88 via processor 110, or via other circuitry if no processor 110 is included. Host computer 74 thus directly controls and processes all signals to and from device 72; for example, the host computer directly controls the forces output by actuator 88 and directly receives sensor signals from sensors 112 and input devices 118. Other embodiments can employ a "hybrid" organization in which some types of force effects (such as closed-loop effects) are controlled purely by the local microprocessor, while other types of effects (such as open-loop effects) are controlled by the host.

Local memory 122, such as RAM and/or ROM, is preferably coupled to processor 110 in device 72 to store instructions for processor 110, temporary values, and other data. Memory 122 can, in addition or instead, be included within processor 110. A local clock 124 can also be coupled to processor 110 to provide timing data.

Sensors 112 sense the position or motion of an object (e.g., an object on the touchpad) in the desired degrees of freedom and provide signals to processor 110 (or host 74) containing information representative of that position or motion. Sensors suitable for detecting such motion include the capacitive or resistive sensors of touchpads, the touch sensors of touch screens, and the like; other types of sensors can also be used. An optional sensor interface 114 can be used to convert the sensor signals into signals that can be interpreted by processor 110 and/or host computer system 74.

Actuator 88 transmits forces to the housing, manipulandum, buttons, or other portions of device 72 in response to signals received from processor 110 and/or host computer 74. Device 72 preferably includes one or more actuators operative to produce forces on device 72 (or a component thereof) and thereby haptic sensations to the user. The actuator is "computer controlled"; that is, the force output from the actuator is ultimately controlled by signals originating from a controller such as a microprocessor, ASIC, or the like. Many types of actuators can be used, including rotary DC motors, voice coil actuators, moving magnet actuators, E-core actuators, pneumatic/hydraulic actuators, solenoids, speaker voice coils, piezoelectric actuators, passive actuators (brakes), and the like; some preferred actuator types are described below. Actuator interface 116 can optionally be connected between actuator 88 and processor 110 to convert signals from processor 110 into signals appropriate to drive actuator 88. Interface 116 can include power amplifiers, switches, digital-to-analog converters (DACs), analog-to-digital converters (ADCs), and other components well known to those skilled in the art.

In some embodiments herein, the actuator can apply a short-duration force sensation by moving the housing, a manipulandum, or an inertial mass of the device. This short-duration force sensation can be described as a "pulse." The "pulse" can, in some embodiments, be directed substantially along a particular direction. In some embodiments the magnitude of the "pulse" can be controlled, and the duration of the "pulse" can be controlled. A "periodic force sensation," having a magnitude and a frequency, can also be applied. For example, the periodic sensation can be selected from among a sine wave, square wave, sawtooth-up wave, sawtooth-down wave, and triangle wave; an envelope can be applied to the periodic signal so that its magnitude varies over time. The waveform can be "streamed" from the host to the device, or conveyed through a high-level command that includes parameters such as magnitude, frequency, and duration.
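The periodic-effect parameters above (waveform shape, magnitude, frequency, duration, envelope) can be sketched as simple sample generators. The parameter set mirrors the text; the function names and the linear attack/fade envelope are illustrative assumptions.

```python
import math

def periodic_sample(shape: str, magnitude: float, freq_hz: float, t: float) -> float:
    """One sample of a periodic force effect at time t (seconds)."""
    phase = (t * freq_hz) % 1.0          # position within one period, 0..1
    if shape == "sine":
        value = math.sin(2 * math.pi * phase)
    elif shape == "square":
        value = 1.0 if phase < 0.5 else -1.0
    elif shape == "sawtooth_up":
        value = 2.0 * phase - 1.0
    elif shape == "sawtooth_down":
        value = 1.0 - 2.0 * phase
    elif shape == "triangle":
        value = 4.0 * phase - 1.0 if phase < 0.5 else 3.0 - 4.0 * phase
    else:
        raise ValueError(f"unknown waveform: {shape}")
    return magnitude * value

def envelope(t: float, duration: float, attack: float, fade: float) -> float:
    """Linear attack/fade envelope scaling the effect magnitude over time."""
    if t < attack:
        return t / attack                       # ramp up
    if t > duration - fade:
        return max(0.0, (duration - t) / fade)  # ramp down
    return 1.0                                  # sustain
```

A device would multiply the two, e.g. `periodic_sample("sine", 1.0, 50.0, t) * envelope(t, 0.5, 0.1, 0.1)`, to get an effect whose magnitude varies over its duration.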

Other input devices 118 can be included in device 72 and send input signals to processor 110 or to host 74 when manipulated by the user. Such input devices can include buttons, dials, switches, scroll wheels, knobs, or other controls or mechanisms. A power supply 120 can optionally be included in device 72, coupled to actuator interface 116 and/or actuator 88, to provide electrical power to the actuator. Alternatively, power can be drawn from a power supply separate from device 72, or can be received over bus 80. The received power can also be stored and regulated by device 72 (and/or host 74) and used when actuator 88 needs to be driven.

The interface device 72 can take any of a wide variety of forms; several embodiments are described below. The touchpad or touch screen described herein can be provided on many types of devices, such as gamepads, joysticks, steering wheels, touchpads, other controllers, finger pads, handles, trackballs, remote controls, cellular phones, personal digital assistants, and more.

Detailed Embodiments

The present invention provides various embodiments in which haptic feedback is delivered to a user of a laptop computer or other portable computing device, and/or to a user of any computing device having a touchpad or similar input device.

Some embodiments are based on displacing the skin of the finger where the user's finger contacts the touchpad. These embodiments convey high-fidelity sensations because they provide a close correlation between input and output at the user's fingertip. Actuator and linkage solutions are described for driving certain lateral-translation embodiments. Other embodiments are based on stimulating the palm-rest surface of the laptop computer 10 contacted by the user; such a surface can provide haptic sensations based on lateral motion of the palm surface or on inertially coupled forces. Lateral motion of a surface in the plane of the touchpad or laptop top surface (i.e., along the X and/or Y axes) can be as effective at conveying haptic information as vibration or displacement along the Z axis (perpendicular to the touchpad or laptop top surface). This can be important given the volume constraints of laptops.

In many of the embodiments described herein, it is also useful for the touch input device to detect the user's contact. Since haptic feedback need only be output while the user is touching the touch device, this detection allows the tactile feedback to be stopped (the actuator "powered down") when no object is in contact with the touch input device. If a local touch-device microprocessor (or similar circuitry) is used, the microprocessor can turn off actuator output when no user contact is sensed and restart the actuator output when contact is detected again, relieving the host processor of haptic computations in the meantime.
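The contact-gated power-down described above amounts to a small piece of control logic: each loop tick, the commanded force is passed through only while the touch sensor reports contact. A minimal sketch, with illustrative names, under the assumption of a local controller polling the touch sensor:

```python
class ContactGatedActuator:
    """Passes commanded forces to the actuator only while the user
    is touching the pad; otherwise output is zeroed ("powered down")."""
    def __init__(self):
        self.enabled = False
        self.output = 0.0

    def update(self, contact_detected: bool, commanded_force: float) -> float:
        """Called each control-loop tick with the sensor's contact state."""
        self.enabled = contact_detected          # power down on release
        self.output = commanded_force if self.enabled else 0.0
        return self.output

pad = ContactGatedActuator()
touching = pad.update(contact_detected=True, commanded_force=0.8)
released = pad.update(contact_detected=False, commanded_force=0.8)
```

Because the gate sits in the local loop, effects resume immediately on the next tick after contact returns, without host involvement.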

In many preferred embodiments, haptic feedback can be added to a laptop computer or other device in a way that does not require the user to learn new methods of controlling the device, and does not require the manufacturer to design and build something radically different from current designs in order to provide haptic content. For example, in a laptop embodiment, when the user moves a finger on the touchpad, the cursor displayed on the laptop screen moves accordingly. A haptic effect can be output to the touchpad or to another laptop component touched by the user when the cursor moves over a graphical object or region, when an event occurs, and so on. In other applications, haptic effects can be output when an event or interaction occurs in a game or other application running on the laptop. Many types of actuators, sensors, linkages, amplification transmissions, and the like can be used with the present invention.

Currently manufactured touchpad surfaces are typically coupled to a printed circuit board (PCB), which carries the standard components and electronics needed to operate the touchpad and connect it within the laptop. Thus, when forces are applied to the touchpad, they are also applied to the PCB, which is often directly coupled to the touchpad, e.g., directly beneath it.

The embodiments herein are designed with particular guidelines and characteristics in mind. Considerations include, for example: the spatial correlation of the haptic feedback, i.e., whether the tactile content is physically localized or focused directly beneath the user's finger on the touchpad, or is generated elsewhere, such as in the case/housing of the laptop; the force magnitudes and power required for the haptic feedback; how the user contacts the device and the effect of this contact (such as the finger contact angle) on the quality and content of the feedback; and which actuators and mechanisms fit best within the laptop form factor/housing.

Preferably, existing haptic feedback software and drivers, such as Immersion Corporation's TouchSense software, can be used with the embodiments described herein. A standardized module, such as a particular touchpad, that works in many other types of products such as PDAs, laptops, cellular phones, and remote controls is also desirable.

The focus of these embodiments of the invention is primarily on tactile feedback implementations rather than on kinesthetic force feedback implementations. As described herein, two basic types of tactile feedback apply to the present invention: inertial haptic feedback and moving-contact haptic feedback. Inertial feedback is generated using inertially coupled vibrations and is based on the motion of an inertial mass coupled through a compliant flexure to the housing/user, where the motion of the mass causes vibrations at the surface contacted by the user. Moving-contact feedback involves moving a surface or member, relative to earth ground, directly against the user, and is generally produced by small displacements of the user's skin.

The difference between inertial and moving-contact feedback lies in the actual mechanism used to convey information to the user. Both inertial and direct tactile stimulation cause displacement of hand or finger tissue; inertial feedback, however, is coupled through the various enclosures, via the stiffness of the enclosure and the compliance of its mounting. Moving-contact feedback involves a more direct mechanism for stimulating the user's tissue: in this case there is a tactile point or surface engaging the skin of the finger that causes sensation by locally deforming the finger or palm tissue. This distinction is used to classify the two types of embodiments described below: inertial and surface-moving.

A new actuator, referred to as "Flat-E," is described below; it can be used in any of the embodiments herein and represents a low-profile, power-efficient, high-quality type of flat actuator. Flat-E actuators can deliver a satisfactory level of performance at reduced volume and can achieve the form factors required by laptop and other device applications.

Inertial Embodiments

These embodiments move an inertial mass to provide inertial haptic feedback to the user, typically transmitted through an enclosure or mechanism such as a housing or other surface. In many cases the inertial mass does not impact any surface during its travel, although such impacts can instead be used to provide additional haptic effects.

FIG. 5 is a perspective view of one embodiment 150 of an actuator assembly used to provide inertial haptic sensations to the touchpad and housing of a device of the present invention. Actuator assembly 150 includes a grounded flexure 160 and an actuator 155. Flexure 160 can be a single unitary piece made of a material such as polypropylene plastic ("living hinge" material) or other flexible material. Flexure 160 can be grounded to the housing of device 12, for example, at portion 161.

The actuator 155 is coupled to the flexure 160. The housing of the actuator is coupled to a container portion 162 of the flexure 160, which houses the actuator 155 as shown. The rotating shaft 164 of the actuator is coupled to the flexure 160 in a bore 165 of the flexure 160 and is rigidly coupled to a central rotating member 170. The rotating shaft 164 of the actuator rotates about axis A, which also rotates member 170 about axis A. Rotating member 170 is coupled by a flexure joint 174 to a first portion 172a of an angled member 171. The flexure joint 174 is made thin enough that it flexes as the rotating member 170 moves the first portion 172a approximately linearly. The first portion 172a is coupled by a flexure joint 178 to the grounded portion 180 of the flexure, and by another flexure joint to the second portion 172b of the angled member. The opposite end of the second portion 172b is coupled by a flexure joint 184 to the container portion 162 of the flexure.

Angled member 171, including first portion 172a and second portion 172b, moves approximately linearly along the x-axis as shown by arrow 176 (in practice, it is portions 172a and 172b that move approximately linearly). When the flexure is in its rest position, portions 172a and 172b are preferably angled with respect to the longitudinal axis, as shown. This allows the rotating member 170 to push or pull the angled member 171 along either of the directions shown by arrow 176.

The actuator 155 need operate over only a portion of its rotational range when driving the rotating member 170 in the two directions, allowing wide-bandwidth operation and the output of high-frequency pulses or vibrations. A flexure joint 192 is provided in the flexure portion between the container portion 162 and the grounded portion 180. Flexure joint 192 allows the container portion 162 (along with the actuator 155, the rotating member 170, and the second portion 172b) to move approximately linearly along the z axis in response to the motion of portions 172a and 172b. A flexure joint 190 is provided in the first portion 172a of the angled member 171 so that bending in the z direction about flexure joint 192 can occur more easily.

By quickly changing the direction of rotation of the actuator shaft 164, the actuator and container portion can be made to oscillate along the z-axis and create a vibration in the housing, with the actuator 155 itself acting as the inertial mass. In addition, the flexure joints included in flexure 160, such as flexure joint 192, act as spring members to provide a restoring force toward the rest position of the actuator 155 and container portion 162. In some embodiments, stops can be included in flexure 160 to limit the motion of the container portion 162 and actuator 155.

Other embodiments can provide other types of actuator assemblies to produce inertial sensations, such as a flexure that moves a separate inertial mass instead of the actuator itself. Alternatively, an eccentric mass coupled to the rotating shaft of an actuator can be oscillated to provide rotational inertial sensations to the housing; the eccentric mass can be driven in one direction or in both directions. Other types of actuator assemblies can also be used, such as linear voice coil actuators, solenoids, moving magnet actuators, and the like.

In one embodiment, an actuator assembly as described above can be coupled at any of a variety of locations within a laptop housing or other device housing, and used as a separately mounted actuator module to vibrate parts of the housing by transmitting vibrations through the product housing. The actuator assembly can be attached to the laptop housing or to other component areas to provide inertial haptic feedback as the inertial mass oscillates.

The user's experience can vary with the exact placement of the actuator assembly on the laptop and with the particular haptic effects output. Locations for attaching the actuator assembly include the bottom, side, or front of the housing; an area close or coupled to the surface contacted by the user's palms when operating the device; or the touchpad or touch screen. An effective location can be the touchpad itself (e.g., attaching the actuator assembly to the bottom of the touchpad). In some embodiments where the touchpad is rectangular, greater compliance can be provided along the long axis of the touchpad.

In general, by attaching the actuator assembly at a given location on the laptop, haptic content can be delivered to the user over a limited frequency range. The specific locations most effective for transmitting vibrations can be determined, for example, by experimentation.

In many cases the output of various types of haptic effects can be weak or indistinct to the user, while effects at significantly higher frequencies can be more noticeable. One effective tactile effect for these embodiments is a relatively high-frequency "ringing" effect. Sympathetic vibrations in the laptop case and touchpad can amplify such vibrations. An actuator designed to resonate at a particular ideal frequency can be used to produce a broad frequency spectrum through amplitude modulation techniques. For example, a desired 25 Hz square wave can be generated by modulating the 250 Hz natural frequency produced by a tuned resonant actuator, such as a large piezoelectric ceramic resonator (see FIGS. 8A-8B). Such a modulated design can depend on tailoring the actuator to the environment in which it is placed and to the object it must drive. Some type of vibration isolation system can be used in some embodiments to keep the energy confined to the haptic module rather than allowing it to dissipate into resonant modes of the laptop or other parts of the device.
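The amplitude-modulation idea above can be sketched as follows: the resonant actuator is driven at its natural (carrier) frequency, and the lower-frequency haptic effect is imposed as the modulation envelope. The 250 Hz carrier and 25 Hz square-wave envelope are the example values from the text; the code is an illustration of the signal shape, not a drive-circuit design.

```python
import math

def am_drive_sample(t: float, carrier_hz: float = 250.0,
                    effect_hz: float = 25.0) -> float:
    """Carrier sine gated by a square-wave envelope at the effect rate,
    so the user feels the low-frequency effect riding on the resonance."""
    envelope = 1.0 if (t * effect_hz) % 1.0 < 0.5 else 0.0
    return envelope * math.sin(2 * math.pi * carrier_hz * t)
```

During the "on" half of each 25 Hz period the actuator rings at 250 Hz near resonance (where it is efficient); during the "off" half it is silent, so the perceived effect frequency is 25 Hz.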

FIG. 6 is a perspective view of another embodiment, in which the touchpad module is suspended on a compliant support structure and can be coupled to a vibration source and oscillated along the Z axis. In this example, the touchpad 200 can move perpendicular to its surface, with an actuator assembly 202 moving the touchpad surface along the Z axis. The touchpad can be coupled to the laptop housing 204 by a layer or supports of foam, rubber, or other compliant material; as shown in FIG. 6, a strip of foam 206 is provided between the touchpad 200 and the housing 204, around the touchpad. The actuator assembly 202 can be coupled to the bottom (or other location) of the touchpad assembly, so that when the inertial mass oscillates, the vibrations are transmitted to the touchpad and cause it to move in the Z direction relative to the laptop housing 204. For example, the actuator assembly can be bonded directly beneath the touchpad PCB. Instead of moving only the touchpad module along the Z axis, a floating assembly of the touchpad and its bezel (the surface surrounding the touchpad) can be moved.

In another embodiment, a stand-alone touchpad device can be used, where the touchpad is housed in its own housing and communicates with a laptop or other device via wires or wireless transmission. In one embodiment, a stand-alone touchpad device can include an attached palm pad, with an actuator assembly coupled to the pad. When the inertial mass of the actuator assembly oscillates, inertial sensations are transmitted through the member to the touchpad; in effect, this provides inertial coupling from the actuator assembly to the touchpad surface. A foam layer (or other compliant layer) can be coupled between the touchpad and ground to provide compliance and allow the pad and touchpad to move inertially. This embodiment can feel more compelling than embodiments in which the actuator assembly is mounted near a built-in laptop touchpad, because the compliance of the foam permits stronger sensations to be output.

The entire touchpad can receive haptic sensations as a single unitary member; alternatively, in other embodiments, individually moving portions of the pad can each be provided with a haptic feedback actuator and associated transmission, so that haptic sensations are provided to only certain portions. For example, some embodiments can include a touchpad having some portions that flex or move relative to other portions of the touchpad.

In yet another embodiment, a surface peripheral or adjacent to the touchpad is coupled to a vibration source (such as an actuator assembly) and oscillated in one or more axes. For example, the palm-rest surface can be driven by an inertial actuator assembly housed within the laptop. FIG. 7 is a perspective view of one example of an inertially driven palm-rest surface. Laptop computer 210 includes a touchpad 212 that functions like a typical touchpad. A palm-rest surface 214 is positioned adjacent to the touchpad 212, and surface 214 can be attached to the laptop housing over a layer of flexible open-cell foam or other compliant material. In some embodiments, the surface 214 can be textured with divots and/or bumps to promote firmer contact between the user and the surface. An actuator assembly 216 is coupled to the palm-rest surface 214; in the illustrated embodiment, assembly 216 is coupled beneath surface 214. The assembly can be any of the actuator assemblies described above; for example, actuator assembly 216 can provide z-axis vibrations to the palm-rest surface 214.

The user preferably rests the palm and/or fingers on the palm-rest surface 214 while operating the touchpad 212 with a pointing finger, and can thereby feel haptic sensations through the palm-rest surface while the touchpad is in use. Alternatively, one of the user's hands can rest on the palm surface 214 and sense the haptic feedback while the other hand points to and operates the touchpad. Since the palm surface serves as a nearly unavoidable contact surface, the user will miss few haptic events while using the touchpad. In some embodiments, how firmly the user presses the palm against the surface can produce some difference in perceived magnitude over a useful range of travel; the result can depend on the stiffness and mass of the particular device used. The tightness of the coupling between the palm surface and the housing can be adjusted in other embodiments to achieve a particular feel.

In related embodiments, the actuator assembly can be mounted in other areas. For example, the actuator assembly can be attached beneath an extension of the palm-rest surface made of textured material mounted on a layer of flexible open-cell foam or other compliant material.

In other embodiments, the palm surface can be translated laterally in the X and/or Y directions, similar to the touchpad translations described below. An inertial actuator assembly can be used to drive the lateral translation, or other types of actuators providing high stiffness and high actuator authority in the lateral motion mode can be used, for example, to avoid distortion caused by unintended hard-stop limits. A laterally moving palm surface may be well suited to a planar actuator, so that the assembly can be integrated into the laptop housing; planar actuators are described below.

FIG. 8A is a perspective view of an actuator that can be used in another embodiment of an inertial haptic feedback device in accordance with the present invention. In this embodiment, a high-frequency mechanical oscillator is mechanically coupled to the touchpad. One example of such an implementation uses a commercially available large-diameter (e.g., 60 mm) thin piezoelectric transducer 230 with a natural frequency of about 400 Hz when supported at its periphery. The piezoelectric transducer preferably includes a thin metal diaphragm (sheet) 231. One embodiment can include an additional mass 232 at the center of the ceramic of the piezoelectric diaphragm to add inertial mass and lower the natural frequency, obtaining a stronger inertial tactile sensation from the oscillating mass. The outer perimeter of the transducer 230 can be grounded, with the mass 232 oscillating perpendicular to the disc surface to create inertial haptic sensations in the housing to which the transducer is attached.

Actuator 230 can function as a harmonic oscillator operating at a relatively high frequency, transmitting sensations to the hand through the part of the laptop to which it is attached. Amplitude modulation (envelope control) can be used to generate a wider haptic spectrum than the single fundamental drum mode. Large-diameter piezoelectric drivers are available, for example, from Kingstate of Taiwan; discs of other sizes can also be used.

In order to provide the desired haptic sensations, a large piezoelectric transducer or "buzzer" must provide a large sustainable movement of the mass (acceleration), and the carrier (vibration) frequency must be modulated with the haptic signal. Some electronic components may be needed to operate this type of actuator. The required high-voltage supply can be generated from 5 volts. For example, a self-exciting oscillator circuit can drive the device. A gating feature that starts or stops the oscillation may be provided, as well as amplitude control to enable modulation of the output envelope. An all-digital implementation can be provided by simply turning the oscillator on and off.
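The all-digital approach above can be sketched as follows. This is a minimal illustration only; the `gated_drive` helper, the sampling rate, and the carrier frequency are hypothetical choices, not taken from the patent:

```python
import math

def gated_drive(duration_s, carrier_hz, gate, sample_hz=8000):
    """All-digital piezo drive sketch: a square-wave carrier that is simply
    switched on or off by gate(t), per the gating feature described above.
    Returns drive levels in {-1.0, 0.0, 1.0}."""
    samples = []
    for n in range(int(duration_s * sample_hz)):
        t = n / sample_hz
        if gate(t):  # oscillator enabled
            samples.append(1.0 if math.sin(2 * math.pi * carrier_hz * t) >= 0 else -1.0)
        else:        # oscillator gated off
            samples.append(0.0)
    return samples

# Example: a 10 ms burst of a 400 Hz carrier, then silence
wave = gated_drive(0.02, 400.0, lambda t: t < 0.01)
```

Gating the oscillator in this way yields haptic pulses without any analog amplitude control.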

FIG. 8B is a side view of another embodiment 234 of a piezoelectric transducer that provides haptic feedback. Here the transducer applies vibrations (not inertial forces) directly along the z axis to the touchpad (or touch screen). The housing 236 of the laptop or other device includes a bezel covering the touchpad member 238, which the user physically contacts to provide input to the computer or processor. The touchpad member 238 can include the electronic components required to interface the touchpad with other electronics. The touchpad member 238 can be placed on a spacer plate 240 whose mass is selected to enable efficient haptic output. The spacer plate 240 rests on the edge of the metal diaphragm 231 that is part of the piezoelectric transducer, where an electrical lead 241 can be connected between the diaphragm 231 and a signal source 246. A piezoelectric ceramic element 242, also part of the transducer, is connected to the metal diaphragm 231, and an electrical lead 243 is connected between the element 242 and the signal source 246. A conductive electrode is plated on the ceramic element 242. A contact pad 248 is positioned between the element 242 and the lower housing 250, where the contact pad is firmly connected to both the ceramic element 242 and the housing 250. The contact pad 248 is made small to increase the curvature of the diaphragm 231, which can result in large accelerations and strong haptic effects. The lower housing 250 may include, for example, a printed circuit board (PCB). One or more preloaded spring elements 252, such as leaf springs or coil springs, connect the touchpad member 238 to the lower housing 250.

In operation, the piezoelectric transducer moves along the z axis as current from the signal source 246 flows through the diaphragm 231 and the ceramic element 242. Because the spacer plate 240 contacts only the edge of the diaphragm 231, the inner portion of the diaphragm and the ceramic element 242 are free to move; the ceramic element pushes against the lower housing 250 so that the diaphragm 231 pushes the spacer plate 240, which in turn pushes the touchpad member 238. This moves the touchpad member up, and the spring elements 252 provide a restoring force that returns the touchpad to its neutral position. When the piezoelectric transducer moves similarly in the opposite direction, as commanded by the oscillating drive signal, it moves the touchpad member 238 down toward the lower housing 250. The touchpad thus vibrates along the z axis and provides haptic sensations to a user in contact with it.

In some embodiments, the components of touchpad embodiment 234 are selected to provide more efficient haptic sensations. For example, when the piezoelectric transducer vibrates near the natural frequency of the mechanical system (including the transducer itself), stronger forces and more effective haptic sensations can be output. The natural frequency of this moving mechanical system is approximately the square root of (k1 + k2) divided by m, as shown below:

fn ≒ √((k1 + k2) / m)

where fn is the natural frequency, k1 is the spring constant of the metal diaphragm 231 of the piezoelectric transducer, k2 is the spring constant of the spring elements 252, and m is the total mass of the suspended portion, including the spacer 240 and the touchpad member 238. This mass, together with the spring constants, is chosen to provide a desirably low natural frequency of about 120 Hz or less, which results in effective haptic sensations. The spacer plate 240 also allows, for example, a plurality of piezoelectric transducers to be positioned side by side against the lower housing 250, so that the transducers can be driven together for a stronger haptic effect, or at different times to provide sensations at particular positions of the touchpad member 238.
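The tuning relation above can be checked numerically. This is a sketch only: the stiffness and mass values below are hypothetical, chosen to illustrate hitting the 120 Hz target, and the 1/(2π) factor (which converts angular frequency in rad/s to Hz) is made explicit here even though the patent's formula omits it:

```python
import math

def natural_frequency_hz(k1, k2, m):
    """Natural frequency of the suspended touchpad assembly.
    k1: spring constant of the metal diaphragm 231 (N/m)
    k2: spring constant of the spring elements 252 (N/m)
    m:  total suspended mass (kg), e.g. spacer 240 plus touchpad member 238
    The text gives fn ~ sqrt((k1 + k2) / m); dividing by 2*pi converts
    the angular frequency (rad/s) to Hz."""
    return math.sqrt((k1 + k2) / m) / (2.0 * math.pi)

# Hypothetical stiffness and mass values chosen to land near the
# desired <= 120 Hz target
fn = natural_frequency_hz(k1=4000.0, k2=1500.0, m=0.010)
```

Increasing the suspended mass m, or softening either spring, lowers fn toward the desired range.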

One way of providing the drive signal is to oscillate a carrier signal at or near the natural frequency fn, modulate the carrier signal with an effect envelope if appropriate (e.g., to provide a desired frequency or effect), drive an amplifier with the modulated signal, and drive the piezoelectric transducer with the amplifier. A square-wave carrier signal, as opposed to a sine wave or other waveform, tends to produce the strongest haptic effects in the embodiment of FIG. 8B, which is often desirable.
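The carrier-plus-envelope scheme just described can be sketched as follows. The `modulated_drive` helper, the sampling rate, and the example decaying envelope are illustrative assumptions, not details from the patent:

```python
import math

def modulated_drive(envelope, carrier_hz=120.0, duration_s=0.05, sample_hz=8000):
    """Drive-signal sketch: a square-wave carrier at (or near) the natural
    frequency fn, amplitude-modulated by an effect envelope(t) in [0, 1].
    The result would feed the amplifier driving the piezo transducer."""
    out = []
    for n in range(int(duration_s * sample_hz)):
        t = n / sample_hz
        carrier = 1.0 if math.sin(2 * math.pi * carrier_hz * t) >= 0 else -1.0
        out.append(envelope(t) * carrier)
    return out

# Example: a decaying "pop" envelope shaping the carrier
pop = modulated_drive(lambda t: math.exp(-t / 0.01))
```

Different envelopes (decays, ramps, bursts) shape the same carrier into different haptic effects.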

In other embodiments, the piezoelectric transducer and spacer may be reversed in orientation, such that the spacer 240 contacts the lower housing 250, the diaphragm 231 is placed over the spacer 240, and the ceramic element 242 is positioned above the diaphragm; the ceramic element, when vibrated by a drive signal, may then directly drive a pad or the touchpad member 238 connected to it. In yet another embodiment, the ceramic element may be directly connected to the touchpad member; however, this generally results in a less strong and less effective haptic effect.

Parallel-Moving Surface Embodiments

These embodiments provide haptic feedback by laterally translating a surface that the user contacts. The user feels the surface move through the skin, creating an immediate sensation. This type of haptic feedback is based on in-plane surface movement, or relative movement between a surface that the user's finger or hand contacts and an adjacent fixed surface. Described below are the translation of a surface in contact with or adjacent to the touchpad module surface (FIGS. 9-12) and the translation of the touchpad surface itself (FIGS. 13-14), as well as actuators that can be used in either application. The moving-surface inventions of US Pat. No. 5,184,868 may also be applied.

Small (less than 1 mm) displacements between adjacent surfaces provide good signal transmission to finger tissue. Enhanced surfaces may include physical surface textures (bumps, roughness, etc.) and may be modified to suit users of any location or orientation. In some embodiments, a small lateral translation (a displacement below the fingertip between 0.25 mm and 0.5 mm) may be more effective than out-of-plane (z-axis) vibration or remote inertial vibration just below the touchpad. For example, an inertial coupling can give a more distant, disconnected feeling in some embodiments.

Separate Surface Translation

Displacing a separate surface member located on top of the touchpad module is effective in providing highly correlated feedback. Such feedback feels fully synchronized, so that a transparent surface, such as on a PDA or touch screen, can even be translated over the visual display. In other embodiments, other surfaces adjacent to the side of the touchpad may be translated instead.

FIG. 9 is a perspective view of a first separate-surface translation embodiment 250, in which a thin surface in slidable contact with the touchpad module moves laterally under the finger. A translating surface member ("surface") 252 is positioned on and covers the top of the fixed touchpad 254. Extension members 256a and 256b connected to the translating surface 252 may extend from the touchpad 254 in the x and y directions, respectively. Actuators 258a and 258b may be connected to the associated extension members 256a and 256b, respectively. In the described embodiment, each actuator 258 is a linear actuator that outputs a linear force on the associated extension member 256, thereby moving the translating surface in the direction of the output force. For example, the actuator may be an "E"-core actuator. In actuator 258b, the two-pole magnet is hidden by the coil 260 under the ferromagnetic piece 382.

The sliding surface 252 may be any solid or substantially rigid material; for example, Kapton (polyimide) flexible printed circuit board material can be used. The extension members 256 may include stiffener sections to prevent warpage. The sliding surface 252 may have a textured top region that provides frictional engagement with the skin of a user in contact with the surface; the texture is sufficiently rough to engage the skin without feeling coarse to the touch. The underside of the moving surface can include a low-friction coating to allow sliding between the surface 252 and the touchpad 254. This layer is very thin, for example less than 0.010 inches (0.25 mm) thick.

The user points and makes contact on the moving surface 252 just as on the touchpad 254. The surface 252 is made thin enough that the touchpad 254 can detect all of the user's contacts on the surface 252, as if the user were touching the touchpad directly below the contacted position on the surface 252.

In some embodiments the actuator may be located relatively far from the moving surface (e.g., > 10 cm), in which case a more rigid extension member 256 may be needed to transmit tension and compression with as little friction or out-of-plane movement as possible. For example, glass- or carbon-fiber laminates can perform this function.

In some moving-magnet actuator embodiments, a thin piece of rare-earth magnet may be laminated onto the moving surface 252 to act as the moving magnet. For example, if the moving magnet piece is less than 1 mm thick and the "E" core and coil are located below the distal segment of the extension 256, such as mounted directly on the touchpad PCB itself, a high level of integration can be achieved. The coil wire can be soldered directly to the touchpad PCB.

FIG. 10 is a plan view of another translating-surface embodiment, in which a separate moving tactile surface is in slidable contact over the touchpad. This surface is translated relative to the touchpad by a high-travel actuator through a high-fidelity mechanical linkage.

In this embodiment, the translating surface 272 is located above the touchpad 274, similar to the embodiment of FIG. 9. An extension member 276 extends toward a rotary actuator 278 (here, in the x direction), which is grounded to the laptop housing 280. The actuator 278 may be a DC rotary motor with a rotatable shaft 282 connected to a coupling linkage 284, the opposite end of which is connected to the extension member 276. For example, a linkage portion from the actuator assembly 150 described above can be used. When the actuator 278 rotates the shaft 282 in either direction, the linkage converts the rotation into movement of the surface 272 in the corresponding direction (left or right). For example, a displacement of about ±1 mm can be achieved. The user senses the surface translation as the finger rests on the surface 272. Motor rotation can yield very clean, high-fidelity translation on the x axis. The DC motor can occupy empty space in front of or beside an enclosure in the laptop housing. A similar extension, linkage, and motor may be provided in the y direction to move the surface 272 in the y direction. User input is detected at the touchpad through the moving tactile surface.

The thin surface may be trimmed to fit inside the touchpad area with a small border at the perimeter. The rectangular extension can be cut from a larger sheet to provide a fairly rigid strip driven by the actuator. This strip must be wide enough that the actuator can push on it during operation without buckling the strip.

As in the embodiment of FIG. 9, a smooth underside may be provided on the surface 272 where it contacts the touchpad, providing a low-friction sliding interface with the touchpad's plastic covering normally touched by the user's fingers. The upper side of the moving surface 272 is made frictional to provide good user grip, for example with a texture like fine sandpaper. This can provide a good contact surface because it mechanically engages the finger surface without feeling coarse to the touch. Other embodiments may use various other types of friction surfaces. Other embodiments may also use planar actuators such as moving-magnet actuators or voice coil actuators.

Two strips 286 of plastic or other material at the rim of the touchpad's housing opening press on the moving surface 272 to keep it parallel and flat against the touchpad 274; the strips may be attached to the bezel covering the edges of the surface 272.

The embodiments of FIGS. 9 and 10 can provide compelling haptic sensations. Adding a surface over the touchpad does not substantially interfere with the touchpad's sensing operation. The user can point and receive haptic feedback simultaneously through the movement of this surface relative to the fixed touchpad. Providing input to the laptop or electronic device is simple: the user touches or moves a finger on the translating surface over the touchpad. In some embodiments, the moving surface may be held slightly above rather than in close contact with the touchpad, so some user pressure may be needed to take up the free play and engage the sensor array.

When the cursor moves around a scroll bar or desktop GUI displayed on the laptop, the user feels clear, high-fidelity sensations that are spatially well correlated with the cursor. The correlation may vary depending on whether the user moves a finger or object along the x or y axis. Thus, when moving the cursor up and down over icons, the user may feel a pop or similar tactile effect from surface movement in one direction and, unless concentrating, may not notice that the movement of the surface 272 is perpendicular to the cursor movement.

Movement of the user's finger in the direction of translation, along the x axis in the example of FIG. 10, tends to be more compelling. For example, when the user drags a finger in the x direction to move the cursor from one displayed radio button to the next, even a very small surface movement can feel as though the surface 272 is leading the user's finger to the next button. A surface translation force opposite to the direction of motion can also be effective; for example, if the user moves left over an icon, a translation force to the right will feel clear and natural. If haptic feedback is possible on only one axis, in some embodiments the y axis may be the better choice because more content is vertically aligned in GUI desktops and applications.

Short, distinct pulses can provide excellent transitions when moving the cursor from one object to another, such as between graphical buttons. Vibrations may be transmitted to the user in the surface translation embodiments by oscillating the actuator driving the surface, for example with a sine wave or other periodic waveform, so that the translating surface oscillates back and forth.

The user tends to lift the finger from the touchpad between cursor-controlling movements, and the inherent spring centering of the linkage and motor returns the moving surface (252 or 272) to a neutral (reference) position, where it can await further interaction. This return does not move the controlled cursor, because only the surface above the touchpad moves; unlike a user's finger, it does not provide input to the computer through the touchpad.

Because the surface moves in particular directions, the surface translation embodiments can act, in some embodiments, as pointing devices with localized pseudo-kinesthetic feedback. Tactile feedback is still the main type of haptic feedback provided in these embodiments, but a small movement of the surface, though only a pop, that pushes the user's finger in one direction can be dramatically perceived as a detent "slope" or spring force.

The overall rigidity of the actuator can affect the result. If the user presses the moving surface too hard while dragging or pointing, the user may drag the surface along, pushing the actuator away from its spring center. A preferred embodiment has a strong, stiff actuator that is not easily backdriven when the user presses hard.

It is desirable in some embodiments to have a certain amount of travel or compliance in the moving surface, such as about 2 mm. With strong spring centering from the motor and linkage, moving the cursor between two objects, such as buttons in a GUI, can feel very realistic: until finger pressure is reduced, the user perceives what feels like real kinesthetic force feedback as the finger snaps quickly between objects on the screen. The output haptic effect is a simple haptic pop, and no actual kinesthetic spring provides force in the x or y direction; nevertheless, the user perceives the finger being dragged to an adjacent object, such as the next button.

It should be appreciated that true kinesthetic force feedback is possible in other embodiments. For example, if the user keeps a finger in one position on the moving surface and the moving surface has a sufficiently large displacement, forces can be output in the moving surface's degrees of freedom to provide kinesthetic force feedback. The touchpad's sensor can be used to report the position of the finger/moving surface for the computation of a force, such as a spring force whose magnitude depends on the distance traveled from the spring reference. Such an embodiment is thus a dual-mode haptic system, including both tactile and kinesthetic modes.
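The kinesthetic spring described above can be sketched as a simple control computation. The gain, force limit, and `spring_force` helper are hypothetical, for illustration only:

```python
def spring_force(position_mm, reference_mm, k_n_per_mm=0.8, max_n=1.5):
    """Kinesthetic-mode sketch: the touchpad's own sensor reports the
    finger/surface position, and a restoring (spring) force proportional
    to the distance from the spring reference is commanded to the actuator.
    The gain and force limit here are hypothetical."""
    force = -k_n_per_mm * (position_mm - reference_mm)
    # clamp to the actuator's output capability
    return max(-max_n, min(max_n, force))

# Finger 1 mm to the right of the reference: force pulls back toward it
f = spring_force(11.0, 10.0)
```

Running this each sensing cycle produces the position-dependent restoring force that distinguishes the kinesthetic mode from a simple tactile pop.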

Some embodiments of the moving surface may allow sliding by the user, while others may be very stiff with little or no sliding. In many embodiments, if the maximum permissible movement of the surface is sufficient to allow traversal between two adjacent graphical targets, the kinesthetic mode can be effective, and the user may not even realize that he is moving the surface. Some embodiments may provide translation and force on two axes (x and y), enabling this kinesthetic feedback (an actual spring) in all directions of the touchpad.

If the user is not moving a finger or object on the touchpad, motion-related haptic effects may not feel the same. Steering haptic effects (such as the detent pop effect) have content and value that correspond to touchpad movement; for example, a pop effect is most effective when it is delivered as the moving finger translates the surface across the boundary between one icon or button and the next.

Many of the benefits described above for a separate translating surface also apply to the touchpad surface translation described below.

FIG. 11 is a perspective view of yet another embodiment 290 of a separate translating surface, using a moving-coil actuator. In this embodiment a frame 292 is positioned over the touchpad 294 of the device. The frame 292 is positioned directly above the touchpad 294 and includes a thin surface portion 296, sufficiently thin that a user's contact on this portion can be detected by the touchpad 294 below. The frame 292 includes an integrated voice coil 298 that is part of a voice coil actuator 300. The coil 298 may be a wire trace formed on the frame 292, which may be a PCB. The other parts of the actuator 300 are a fixed two-pole magnet 302 located above the coil 298 and grounded to the laptop housing, and a steel backplate located on the other side of the frame 292, grounded to the housing, and used as the flux return path. The steel subassembly can be attached to the touchpad PCB itself, for example.

Thus, the magnetic field of the magnet 302 and the current flowing through the coil 298 interact to produce a linear force on the frame 292, which causes the frame and portion 296 to move as shown by arrow 306. This provides haptic sensations to the user, similar to the separate translating surface embodiments described above. The housing may cover the entire frame except for an opening around the portion 296 of the frame 292. In some embodiments, the wiring from the coil 298 may be connected to the touchpad PCB using a separate flex-circuit finger extending from the moving frame 292.

FIG. 12 is a perspective view of another embodiment 310 of a separate translating surface. In this embodiment, a surface adjacent to the touchpad is translated in the x and/or y directions with respect to the touchpad surface. A thumb surface 312 is located below the touchpad and is firmly connected to a link member 316. The link member 316 is connected to a flexible link 318, which is connected to the rotatable shaft of an actuator 320 grounded to the laptop housing. When the actuator 320 rotates its shaft, the flexible link 318 moves the link member 316 linearly, as shown by arrow 322, which moves the thumb surface 312 linearly along the x axis. The thumb surface 312 is shown in slidable contact with standard buttons (not shown) directly below the surface 312.

The user may place a thumb, palm, or finger on the thumb surface 312 while operating the touchpad to feel haptic sensations. To press a button located below the thumb surface 312, the user simply presses down on the surface 312. Overall, the sensations tend to be similar to those of the other translating surfaces described above. In other embodiments, the link member 316 may be made longer to position the actuator 320 elsewhere in the housing of the laptop or other device, as desired.

One disadvantage is that no feedback is delivered unless the user's thumb, finger, or palm is on the thumb surface area. The user must move to other buttons or keys to type, and can miss the haptic experience at such times. A larger surface 312 or palm-pad extension may be used in embodiments where it is difficult to keep the thumb of the same hand on the surface 312 while using that hand to point with the touchpad.

Touch Device Translation

These embodiments translate the touchpad (or touch screen) surface itself rather than a separate surface. The user feels through the skin that the touchpad moves sideways, creating an immediate sensation. The touchpad can be moved relative to fixed surroundings, such as the laptop housing.

FIG. 13 is a perspective view of one embodiment 330 that provides a translating touchpad surface. The touchpad 332 is moved relative to the housing 334, such as a laptop or PDA housing, by an actuator 336. In the described embodiment, the actuator 336 is a rotary actuator, such as a DC motor, with a rotating shaft 338 connected to a linkage 340. The linkage 340 is connected at its opposite end to a bracket 342, and the bracket 342 is connected to the bottom of the touchpad 332 module. The linkage may include joints and/or flexibility/compliance such that the rotational movement of the shaft 338 is converted to a linear force on the bracket 342, whereby the touchpad 332 is moved sideways as shown by arrow 344. For example, the linkage may be made of polypropylene, similar to the linkage of the actuator assembly of FIG. 5. The laptop housing can act as the restraining structure for the moving touchpad module.

For example, a standard DC motor and a polypropylene linkage assembly can be used for the actuator 336 and the linkage 340. In one embodiment, the haptic feedback components may occupy the space where an optional laptop component, such as an optical disk drive, is generally placed.

In other embodiments, the actuator 336 may be positioned away from the touchpad 332. For example, it may be located in any available space in the housing instead of just below the touchpad as shown in FIG. 13. A linkage may be used to position the actuator away from the touchpad, as shown below in FIG. 14.

Translating the entire touchpad in one or two axes can be a good overall haptic approach. Very small displacements (0.2 mm < x < 0.5 mm) of the touchpad are desired to provide useful haptics. When designed within practical size limits, the power consumption of this embodiment may be less than that of currently available inertial mouse interface devices, which can receive all necessary power via an interface to a host computer, such as USB.

This type of embodiment has several distinct advantages. The feedback experience is direct, consistent with pointing, and refined. It can be implemented compactly and discreetly, and the addition of haptic components does not change how the touchpad is used. A translating surface requires less displacement than the inertial approach, which can reduce power consumption and yield manufacturing benefits. In some embodiments, the movement of the touchpad may be aligned at an angle in the x-y plane. Drawbacks of using a relatively large DC motor include the clearance required by the resilient linkage, possible friction, and relatively high power consumption.

FIG. 14 is a perspective view of yet another embodiment 350 of a moving touchpad, in which the touchpad can move in the x and y directions. The touchpad 351 is directly connected to a first linkage member 352, which is connected to the rotating shaft of an actuator 353 by a flexible member 354, such as polypropylene. The actuator 353 is grounded to the laptop housing. When the actuator 353 rotates its shaft, the flexible member 354 converts the rotational movement into linear movement, translating the linkage member 352 in the x direction and, in turn, translating the touchpad as shown by arrow 355. The distal end of the first linkage member 352 is connected to a second linkage member 356, for example by a flexible coupling.

A second actuator 358, grounded to the laptop housing, is connected to one end of the second linkage member 356 by a flexible member 357, where the axis of rotation of the shaft of the actuator 358 is substantially the same as the axis of rotation of the actuator 353. The rotational force output by the shaft of the actuator 358 is converted into a linear force by the flexible member 357. This linear force moves the second linkage member 356 linearly along its length, which in turn pivots the first linkage member 352 about its distal connection near the actuator 353 and moves the touchpad 351 approximately along the y axis. Actuators 353 and 358 can be DC motors or other types of actuators; for example, linear actuators such as those shown in FIGS. 15-17 can be used. The linkage members may be made of any suitable material, such as carbon fiber. Preferably, very little energy is absorbed by unwanted deformation of the grounded structure or linkage assembly.

Thus, the mechanism separates x and y motions: x-axis motion is provided by activating actuator 353, y-axis motion by activating actuator 358, and both x and y motion by activating both motors. The two actuators can be driven together (common mode) or oppositely (differential mode) to achieve pure x or y movement without linkage coupling. Moreover, any combination of drive currents creates a resultant force along an arbitrary axis, without coupling and with the same fidelity.
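The common/differential-mode drive can be sketched as a simple mapping from a desired force vector to two motor currents. This is an illustrative convention only, not taken from the patent; the `motor_currents` helper and its gains are hypothetical:

```python
def motor_currents(fx, fy, kx=1.0, ky=1.0):
    """Map a desired x/y force to the two motor currents, assuming (as one
    illustrative convention, not from the patent) that common-mode drive
    yields x force and differential-mode drive yields y force.
    kx and ky are hypothetical force-to-current gains."""
    common = fx / kx     # shared (common-mode) component
    diff = fy / ky       # opposing (differential-mode) component
    i1 = common + diff   # current commanded to actuator 353
    i2 = common - diff   # current commanded to actuator 358
    return i1, i2
```

Under this convention, a pure x force drives both motors identically (common mode), while a pure y force drives them oppositely (differential mode); any other force direction is a blend of the two.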

One embodiment may use firmware for rapid computation and output of x and y forces, for example software running on a local controller such as a microprocessor, or software running on the host CPU. In some embodiments such firmware may be too complex, so an electronic method of switching between the two basic feedback axes may be used instead. In one embodiment, the two DC motors may be connected in series with a switch that routes current through one of the two motors.

In the embodiment of FIG. 14, the user can feel the difference between x- and y-direction forces when a finger or object is moved in the x direction on the touchpad. There is haptic value in aligning or correlating finger/cursor movement with the tactile feedback; in some cases, the alignment can increase the haptic signal-to-noise ratio. For example, when feedback is aligned along the x axis, moving the cursor from right to left over an icon or button can give the user a better, more realistic button feel. Less power is required if the feedback is aligned with the cursor direction instead of being output in all directions or in the wrong direction. In some cases, a weaker aligned haptic effect may be more meaningful than a strong misaligned one.

In some embodiments, a touchpad surface with an enhanced texture moves relative to a fixed surrounding surface with an enhanced texture. The enhanced texture is roughened, ridged, or otherwise textured to provide stronger user contact.

In some other embodiments, the touchpad surface consists of interleaved surface features that move relative to each other in the x and/or y directions. For example, two touchpad halves can be driven by the actuator to move relative to each other.

In other embodiments, other actuators may be used to move the touchpad, touch screen or other touch device in the z direction. For example, piezoelectric actuators, voice coil actuators or moving magnet actuators may be directly connected to the touchpad or touch screen to provide direct movement of the touch surface. In addition, the touchpad surface may consist of a fixed tactile surface and a reference surface, where the reference surface may be moved along the z axis with respect to the fixed tactile surface.

FIGS. 15A and 15B are top and bottom perspective views, respectively, of another embodiment of a new "flat-E" actuator used to translate the touchpad; FIG. 15C is a side view of the actuator 360. The actuator 360 is designed to be very flat and thus may be better suited to a flat assembly that is inherently part of a touchpad, touch screen, or similar input device. The E-core actuator topology provides a good actuator with minimal magnetic material and offers good force and range of travel. One drawback of moving-magnet actuators is that they are required to be thick (the width of the "E"-core ferromagnetic piece can be traded off somewhat against its height, perhaps reducing the overall thickness of the actuator).

The actuator 360 presents a unique E-core embodiment that can be used to translate the touchpad (or a separate surface, as in the embodiments of FIGS. 9-14). The folded, flat 3-D embodiment shown in FIGS. 15A-15B includes more gaps, has non-uniform flux in the poles, and operates substantially as in the 2-D case.

The actuator 360 includes a ferromagnetic piece 362 in the shape of an "E," made of a metal such as iron or carbon steel plate, which may be a single piece or a laminated stack. A coil 364 of wire is wound around the central pole of the "E" of the ferromagnetic piece 362. A floating plastic cage 368 may be positioned above the ferromagnetic piece 362 and may hold one or more rollers 370 placed in openings of the cage, arranged to roll about axes parallel to the axis of the wound coil 364. The cage can be plastic and is floating; that is, it is not attached to other components, so the rollers can turn freely. A two-pole magnet 366 is located above the cage 368 and the poles of the ferromagnetic piece 362, leaving an air gap between the magnet and the ferromagnetic piece. The magnet 366 contacts the rollers 370 and is connected to the bottom of a backing steel piece 372 located on top (in the arrangement of the drawings; other arrangements are possible). The rollers thus maintain a very small magnetic gap between the magnet and the ferromagnetic piece 362. The backing steel piece 372 is firmly connected to the touchpad 373, as shown in FIG. 15D, so that the touchpad, steel piece 372, and magnet 366 can be translated relative to the ferromagnetic piece 362. For example, the magnet may in some embodiments be a bonded neodymium wafer with two poles, and the steel part may be stamped from sheet about 1 mm thick. Additional rollers or foam may be used to hold the ends of the ferromagnetic piece against the magnet 366. The magnet, cage, and back piece are placed over the side of the "E" poles rather than at the front edge, as in other E-core actuators; this allows the actuator of the present invention to be made very flat for laptop and other portable device applications.

In operation, current flows through the coil 364, causing magnetic flux to flow through the ferromagnetic piece in the direction of arrows 374. In reaction, the steel plate 372 moves along the axis indicated by arrow 376 (the direction depending on the direction of the current in the coil). The rollers 370 rotate to allow the steel plate 372 and the magnet 366 to translate laterally with respect to the ferromagnetic piece 362. The floating cage 368 prevents the rollers from migrating in undesired directions as they rotate. In addition, the attractive magnetic normal force generated between the ferromagnetic piece 362 and the magnet 366 acts through the rollers 370. Other flat-E embodiments may include flexures or similar suspensions against which the magnetic normal force acts while still permitting lateral movement.
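The interplay between coil current, actuator force, and a centering suspension can be sketched with a quasi-static model; all constants below (force constant, spring rate, travel limit) are illustrative assumptions, not values from this disclosure:

```python
def plate_displacement(current_a, kt=2.0, k_spring=400.0, travel_m=0.0005):
    """Quasi-static lateral displacement of the suspended plate: the
    actuator force (proportional to coil current, reversing with its
    sign) is balanced by the spring-centering suspension and clipped
    at the mechanical stops."""
    x = kt * current_a / k_spring          # force Kt*i divided by spring rate
    return max(-travel_m, min(travel_m, x))
```

Reversing the sign of the current reverses the displacement, mirroring the direction dependence noted for arrow 376.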

The flat E actuator embodiments described herein can be used to laterally translate the touchpad (or touch screen), or a separate surface member positioned over or beside the touchpad. For example, two flat E actuators can be used to drive the touchpad or surface member in two axes, x and y, in a configuration similar to that of FIG. 9.

Actuator 360 can be made very thin compared to other actuators; for example, an assembly can be made about 3 or 4 mm thick, less than half the thickness of other E-core actuators. The magnetic design can be iterated for optimal performance; linearity and stall force are traded off against thickness.

Its thin, planar shape, suitable for laptops, PDAs, and other portable devices, is also an advantage. The moving-magnet approach does not require large air gaps, which can make it more desirable for laptop haptic feedback. One "E"-core specimen measures 10 mm by 20 mm by 8 mm, smaller than most DC motors producing the same force and consuming the same power. Also, since this is a direct-drive configuration, no transmission is required between the actuator and the touchpad. Efficient, low-cost, easy-to-manufacture components allow the actuator to be made inexpensively, and it is easy to integrate the actuator as a module on current touchpad PCBs. The magnetic normal force is one drawback: it requires a suspension. Rollers and/or flexure suspensions may be used to react the magnetic normal force.

Actuator 360 generally provides a good range of motion; larger (e.g., greater than 1 mm) displacements can be achieved. The embodiment that uses foam to support the far end of the ferromagnetic piece has a restoring spring with a low spring constant, the foam suspension operating in shear mode. Audible noise can be reduced by using foam and/or rollers. As long as haptic performance is good, the surface displacement is small enough that when the user moves a finger on the touchpad to move the cursor on the desktop, the displacement does not significantly affect the cursor motion.

FIGS. 16A and 16B are perspective views of the top and bottom of another embodiment 380 of the "flat E" actuator of FIGS. 15A-15C. Ferromagnetic piece 382 (or a laminate, in other embodiments) comprises a substantially "E"-shaped structure with a coil 384 wound around the central pole 385 of the E. A two-pole magnet 386 is located across the central pole 385 so that an air gap is provided between the ferromagnetic piece 382 and the magnet, similarly to embodiment 360. A metal plate 388 (e.g., steel) is connected to the magnet 386 and is oriented parallel to the ferromagnetic piece and magnet. A cage 390 can be provided as an intermediate layer, with rollers 389 (shown in dashed lines) located in holes of the cage 390 to allow the plate 388 to slide laterally with respect to the ferromagnetic piece and magnet. The touchpad or touch screen (not shown) can be rigidly connected over the plate 388, with the piece 382 grounded. In another embodiment, the touchpad or touch screen can be connected to the piece 382, with the plate 388 grounded.

Embodiment 380 also includes a flexible suspension comprising two linkages 392 that can be connected to the middle-layer plastic cage 390 and that effectively connect the steel plate 388 and the ferromagnetic piece 382. Each linkage 392 contacts the steel plate 388 at end 394 and is connected to the cage layer 390 at end 396 (or is molded into the cage layer as a single piece of plastic). Each linkage includes a thin portion 395 and a thick portion 400.

In operation, current flows through the coil 384, and the force between the current and the magnet 386 causes the plate 388 (and touchpad) to move along axis 402. The suspension including the linkages 392 prevents the plate 388 from skewing due to the magnetic normal force and other forces. Each linkage 392 flexes to accommodate the movement of the plate 388: the thin member 398 flexes first, and the thick member 400 flexes once the bending limit of the member 398 is reached. This thin-thick construction allows spring centering to operate until the thicker (stiffer) beam engages, so that the stop at the end of travel is felt more smoothly. The final limit of motion is provided by a stop that contacts the inner edge of the plate 388.

The flexible suspension described above allows the desired lateral movement of the plate and touchpad but prevents movement in other directions. This makes the movement of the plate 388 more stable and prevents the plate 388 from drifting out of position over time. The suspension also provides a desired spring centering force on the plate 388 and the touchpad, returning the touchpad to the center of its range of motion when the user stops touching or applying force to it.

FIGS. 17A-17G illustrate another flat-E actuator touchpad embodiment 420, which miniaturizes this type of actuator and provides a surface-mount device fabricated using existing lead-frame and overmolding manufacturing technology. This small device can be wave-soldered onto a touchpad module and can operate to provide an appropriate stroke and force for lateral touchpad motion (or, in other embodiments, for z-axis force).

FIGS. 17A-17C are top views of a PCB 422 that includes a plurality of flat-E actuators 424. An actuator 424 can be located at each corner of the PCB 422, as shown; in other configurations, more or fewer actuators can be used. Using a plurality of actuators 424 provides a larger total force and allows each actuator 424 to have a lower force output and cost. In one embodiment, the PCB 422 is a separate PCB grounded to the housing of the laptop, and the touchpad (e.g., its own separate PCB) is connected to the moving portions of the actuators, such as the pad 426 shown in FIG. 17A. In another embodiment, the PCB 422 is the touchpad, and the moving portions of the actuators are connected to a grounded surface of the laptop, such as the housing. In such embodiments, the actuators 424 can be hidden from the user by an edge of the housing extending around the touchpad, with the central area of the PCB 422 exposed to the user.

FIG. 17D is a side view of one end of the PCB 422, including a flat-E actuator 424, shown separated from the touchpad. The touchpad/PCB member 428 is connected to the moving portion 430 of the actuator 424.

FIG. 17E is a perspective view of one embodiment of the bottom side of the PCB shown in FIG. 17A, where flat-E actuators 424 are surface-mounted under the PCB 422. This can be done as a hand-placed component or, preferably, using automated surface-mount placement equipment. The "E" ferromagnetic piece 432 can be grounded to the PCB 422 so that the magnet and steel back plate of the actuator move.

FIGS. 17F and 17G are perspective views of the top and bottom of a flat-E actuator 424, which has three poles and can operate similarly to the flat-E actuators described above. The ferromagnetic piece 432 is shaped like the letter "E", and a coil 434 is wound around its central pole. A flexure 436 allows the magnet 438 and the steel back plate 440 to move relative to the ferromagnetic piece 432 and the coil 434. As in the embodiment of FIGS. 17A-17E, a touchpad (not shown) can be connected to the steel backing piece 440, with the E-laminate piece 432 and coil 434 grounded. Alternatively, the backing piece 440 and magnet 438 can be grounded, with the touchpad connected to a moving ferromagnetic piece 432.

The flat-E actuators described above can be used to directly translate the touchpad module or a palm rest surface laterally. In the above embodiments, the total thickness of the flat-E actuator can be less than about 3 mm. In terms of magnetic assembly size, manufacturability, and cost, this is a preferred embodiment that can be integrated into current touchpad production lines.

In other embodiments, other moving magnet actuator designs and other voice coil actuator designs may be used, such as those described in US Pat. Nos. 6,166,723 and 6,100,874.

In other embodiments, other types of input surfaces or display screens can similarly be translated laterally using the various actuators described herein. For example, a clear surface overlaying the display screen of a personal digital assistant (PDA), or an input-sensing surface that covers a CRT monitor or touch screen, can be translated laterally (parallel to the screen surface) to provide haptic feedback. One application for such clear-screen embodiments is an ATM, where a user typically enters information on a touch screen. Haptic feedback makes these inputs more accessible and easier for people with below-average vision: along with other haptic sensations, it can identify a particular displayed button when the user's finger moves over the graphically displayed button. This is useful in many ATM applications because an ATM has no moving cursor; for example, haptic feedback can output a vibration when a button is activated to inform the user that the button press has registered. Haptic feedback can also assist the user in noisy environments where sounds may be inaudible, such as near heavy automobile traffic.

Embodiments described herein can also provide haptic feedback when a user employs a stylus or other object to enter data into a touchpad, touch screen, or input area. The haptic sensations are transmitted from the touchpad (or other moving surface) to the user through the stylus or other object.

Other features

In some embodiments, one human-factors issue associated with haptic feedback is force overload protection. Ideally, for non-inertial feedback actuator and transmission designs, such as laterally translating surfaces, the actuator should faithfully create large forces regardless of the user's load on the actuator or its position within the actuator travel. In other words, the user's finger or hand should not be able to push the actuator to the end of its travel, where half of each vibration cycle would be attenuated. For this reason, it is desirable to design the actuator and transmission mechanism to be stiff with respect to user loading. An example is an E-core actuator with very stiff spring centering from a rigid suspension, as in the embodiment of FIGS. 16A-16B. A powerful actuator can easily overcome this spring force, and the drag force of a finger on the touchpad surface accounts for only a small percentage of the total actuator output. A weaker actuator requires a more compliant suspension, which allows user loading to disturb the vibrations and produce nonlinear output.

In some embodiments, another human-factors issue of haptic feedback is audible noise. The use of moving palm rest surfaces and inertial actuator assemblies has, for example, the unavoidable side effect that the haptically driven surfaces radiate sound. Loaded surfaces, such as when the user is touching the housing or touchpad, radiate sound poorly but transmit force fairly well. Thus, in some embodiments, a load-measuring device can be used to determine when the user's hand is on the moving surface, so that force is output only at those times.

The existing audio electronics of a laptop, PDA, or other device can be used in some embodiments to reduce the cost of providing haptic functionality for laptop touchpads or other similar input devices. For example, the existing analog audio outputs (e.g., digital-to-analog converters) and audio power amplifiers can drive the haptic feedback actuators in the touchpad or other laptop components described above, without adding a microprocessor and/or additional power electronics. A notch filter or other pickoff from the audio signal can be used to derive the haptic feedback signal. For example, while audible-range signals are routed to the laptop's speakers, a haptic effect control signal can be carried in an inaudible range of the audio spectrum, and that control signal can be filtered out and provided to the haptic actuator. Alternatively, dedicated signals that are outside the audible range and not mixed with the audio signal can be filtered and routed to control the haptic feedback actuator.
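As a rough sketch of such a pickoff, the following illustrative Python model mixes an audible program tone with an inaudible control tone and recovers the control tone with a simple first-order high-pass filter. The sample rate, tone frequencies, and cutoff are assumptions for illustration, not values from the patent:

```python
import math

FS = 44100                       # assumed audio sample rate
CUTOFF_HZ = 10000.0              # assumed pickoff cutoff frequency

def highpass(x, fs, fc):
    """First-order high-pass: passes the inaudible control band and
    attenuates the audible program material."""
    rc = 1.0 / (2 * math.pi * fc)
    dt = 1.0 / fs
    alpha = rc / (rc + dt)
    y = [0.0] * len(x)
    for n in range(1, len(x)):
        y[n] = alpha * (y[n - 1] + x[n] - x[n - 1])
    return y

def rms(sig):
    """Root-mean-square amplitude of a signal."""
    return math.sqrt(sum(s * s for s in sig) / len(sig))

t = [n / FS for n in range(FS // 10)]                       # 100 ms of samples
audible = [math.sin(2 * math.pi * 300 * ti) for ti in t]    # program audio tone
control = [math.sin(2 * math.pi * 19000 * ti) for ti in t]  # haptic control tone
mixed = [a + c for a, c in zip(audible, control)]

# Pick off the control tone ahead of the actuator driver.
picked = highpass(mixed, FS, CUTOFF_HZ)
```

In practice a sharper filter (or a notch at the control frequency, as the text suggests) would be used; the first-order stage here only illustrates the separation principle.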

In addition, current software on many laptops tracks the laptop's battery power to indicate the power level, to alert the user, or to shut down the laptop to conserve battery power. This tracking software can be leveraged for haptic feedback applications. For example, if the battery power falls below a certain level, the haptic feedback software routine can reduce the magnitude of the forces output to the user, or turn them off. This can be done by reducing the magnitude of the forces or by reducing the number or types of graphical objects in the GUI that have haptic effects associated with them. It can also be done by reducing the duration of haptic effects, for example by shortening an effect that is normally 50 ms to 40 ms. Combinations of these methods can also be used. Finally, some laptop computers include different power settings that the user can select as needed, such as high power, medium power, and low power; a low power setting, for example, makes the battery last longer. The haptic feedback control can be linked to and controlled by these settings: if the user selects a low power mode, the haptic feedback controller can reduce the power requirements of the haptic effects as described above.
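A battery-aware scaling routine of the kind described might look like the following sketch; the thresholds and scale factors are illustrative assumptions:

```python
def scale_haptic_effect(magnitude, duration_ms, battery_pct,
                        low_threshold=20, critical_threshold=5):
    """Reduce or disable haptic output as battery charge drops.
    Thresholds and factors are hypothetical, not from the patent."""
    if battery_pct <= critical_threshold:
        return 0.0, 0                          # turn haptics off entirely
    if battery_pct <= low_threshold:
        # halve the force and shorten e.g. a 50 ms effect toward 40 ms
        return magnitude * 0.5, max(duration_ms * 4 // 5, 1)
    return magnitude, duration_ms
```

Reducing the count of haptically enabled GUI objects, as the text also suggests, would be handled at a higher level than this per-effect routine.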

FIG. 18 is a plan view of a touchpad 450 according to the present invention. The touchpad 450 can in some embodiments be used simply as a positioning device, where the entire area of the touchpad provides cursor control. In other embodiments, different functions can be assigned to different areas of the pad. In such area embodiments, each area can have its own actuator physically positioned beneath it or otherwise associated with it; other area embodiments can use a single actuator that applies force to the entire touchpad 450. In the illustrated embodiment, the central cursor control area 452 can be used to position a cursor or viewpoint displayed by a laptop computer or other device.

The cursor control area of the touchpad can cause forces to be output on the touchpad based on interactions of the controlled cursor with the graphical environment and/or events in that environment. The user can move a finger or other object within the area 452 to thereby move the cursor 20. The forces are preferably associated with the interaction of the cursor with displayed graphical objects. For example, a jolt or "pulse" sensation can be output, which is a single force impulse that quickly rises to a desired magnitude and is then turned off or quickly decays back to zero or a small magnitude. In an inertial haptic feedback embodiment, the pulse can be output as an inertial force in one direction, such as along the z-axis; alternatively, the touchpad can be translated in one direction, or vibrated one or more times, to provide the pulse. Vibration sensations can also be output; a vibration is typically a force varying periodically with time. The vibration can cause the touchpad 450, or portions thereof, to oscillate back and forth, and can be commanded by the host or a local microprocessor to simulate specific effects occurring in the host application.
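A pulse of this shape (fast rise, quick decay) can be sketched as a sampled force profile; the sample counts and floor value below are illustrative:

```python
def pulse_waveform(magnitude, rise_samples, decay_samples, floor=0.0):
    """A jolt is a single impulse: a fast linear rise to the target
    magnitude, followed by a quick decay back to zero (or a small
    floor value), as described for the "pulse" sensation."""
    rise = [magnitude * (n + 1) / rise_samples for n in range(rise_samples)]
    decay = [floor + (magnitude - floor) * (decay_samples - 1 - n) / decay_samples
             for n in range(decay_samples)]
    return rise + decay
```

The resulting sample list would be streamed to the actuator driver at some fixed update rate.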

Another type of force sensation that can be output on the touchpad is a texture force. This type of force is similar to the pulse force but depends on the position of the user's finger on the touchpad and/or the position of the cursor in the graphical environment. Thus, texture bumps can be output as the cursor moves over the locations of bumps in the graphical environment. This type of force is spatially dependent: as the cursor moves over a designated texture area, force is output according to the cursor's position; when the cursor is located between the "bumps" of the texture, no force is output, and force is output when the cursor moves over a bump. This can be implemented by having the host send a pulse signal whenever the cursor is moved over a grating line. Alternatively, a separate touchpad processor can be used for haptic effects, including texture effects, under local control (e.g., the host sends high-level commands with texture parameters, and the sensations are controlled directly by the touchpad processor). In other cases, a texture can be produced by presenting a vibration to the user that depends on the current velocity of the user's finger (or other object) on the touchpad: when the finger is still, the vibration is deactivated; as the finger moves faster, the frequency and magnitude of the vibration increase. This sensation can be controlled by the touchpad processor (if present) or by the host. Other spatially dependent force sensations can also be output, and any of the force sensations described can be output simultaneously or in combination, as desired.
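The spatially dependent bump logic can be sketched as follows, using a hypothetical regular grating; a pulse is commanded only when the cursor crosses a bump location, and no force is output between bumps:

```python
def texture_force(x_prev, x_now, bump_spacing, pulse=1.0):
    """Spatial texture: output a pulse each time the cursor crosses a
    bump of a regular grating. The grating geometry (uniform spacing
    along one axis) is an illustrative assumption."""
    # Index of the last bump at or before each cursor position.
    i_prev = int(x_prev // bump_spacing)
    i_now = int(x_now // bump_spacing)
    return pulse if i_now != i_prev else 0.0
```

With local control, the touchpad processor would evaluate this per motion sample from the grating parameters sent by the host.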

Other types of graphical objects can also be associated with haptic sensations, with the sensations output on the touchpad based on interactions between the cursor and windows, menus, icons, web page links, and the like. For example, when the cursor moves over the border of a window, a "bump" or pulse can be output on the touchpad to signal the cursor position to the user. In other related interactions, when a rate control or scrolling function is performed with the touchpad (through the use of the cursor), a sensation can be output in relation to the rate control function. In addition, the magnitude of the output forces on the touchpad can depend on events or interactions in the graphical environment, including events independent of the user; such force sensations can be used in games or simulations. These and other haptic sensations are described in US Pat. No. 6,211,861. Other control devices or grips that can include a touchpad of the present invention in their housings include gamepads, mouse or trackball devices, pressure-sensitive spheres, or the like, for manipulating a cursor or other graphical objects in a computer-generated environment.

Some types of touchpads and touch screens can sense the amount of pressure the user applies to the touchpad. This allows various haptic sensations to be determined based, at least in part, on the sensed pressure. For example, a periodic vibration can be output at a frequency that depends on the sensed pressure. Alternatively, the gain (magnitude) of the output haptic sensations can be adjusted based on the sensed pressure; users who habitually press the touchpad harder can select auto-scaling so that the effects remain consistent.
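Pressure-based gain adjustment might be sketched like this; the reference pressure and clamp limits are assumptions for illustration:

```python
def pressure_gain(pressure, p_ref=0.5, g_min=0.2, g_max=2.0):
    """Scale haptic output gain with sensed finger pressure
    (normalized 0-1). A user who presses harder than the reference
    pressure gets proportionally stronger effects, clamped so the
    output stays in a usable range."""
    gain = pressure / p_ref
    return max(g_min, min(g_max, gain))
```

Auto-scaling, as described in the text, would set `p_ref` from the user's typical measured pressure rather than a fixed constant.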

Other embodiments of touchpads and touch screens allow a user to enter a "gesture" or shortcut by tracing a symbol in the cursor control area or another area, which is recognized by the processor as a command or data. Haptic sensations can accompany or be associated with particular gestures; for example, a mode change can be confirmed with a particular haptic sensation when a mode-change gesture is recognized, and each character recognized from a gesture can have a specific haptic sensation associated with it. In most touchpad embodiments, a user can select a graphical object or menu item by "tapping" the touchpad. Some touchpads can recognize a "tap-and-a-half" or double tap, in which the user taps the pad, touches it again, and keeps the finger or object on the pad while moving it. Such gestures can, for example, invoke a "drag" mode in which an object can be moved with the cursor. While the user is in this drag mode, a vibration or other haptic sensation can be output to indicate that the mode is active.
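A minimal tap-and-a-half recognizer for entering drag mode might look like the following sketch; the timing window is an illustrative assumption:

```python
class DragGesture:
    """Minimal tap-and-a-half recognizer: a tap (touch then release)
    followed by a second touch within the tap window enters drag mode.
    The 0.3 s window is a hypothetical value, not from the patent."""

    def __init__(self, tap_window=0.3):
        self.tap_window = tap_window
        self.last_release = None
        self.dragging = False

    def touch_down(self, t):
        """Called with a timestamp when the finger touches the pad."""
        if self.last_release is not None and t - self.last_release <= self.tap_window:
            self.dragging = True   # start the drag-mode haptic cue here
        return self.dragging

    def touch_up(self, t):
        """Called with a timestamp when the finger leaves the pad."""
        self.last_release = t
        self.dragging = False
```

While `dragging` is true, the controller would output the sustained vibration that signals the active mode.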

As described above, the touchpad 450 can have other control areas that provide input separate from the main cursor control area 452. In some embodiments, the other areas can be physically marked with lines, borders, or textures on the surface of the touchpad 450 so that the user can tell visually, audibly, and/or tactilely which area is being contacted.

For example, scrolling or rate control areas 454a and 454b can be used to provide input for rate control tasks, such as scrolling documents, adjusting values (volume, speaker balance, monitor display brightness, etc.), or panning/tilting the view in a game or simulation. Region 454a can be used by placing a finger (or other object) within the region; the upper portion of the region increases the value or scrolls up, the lower portion decreases the value or scrolls down, and so on. In embodiments that can sense the amount of pressure applied to the touchpad, the pressure can directly control the rate of adjustment; for example, higher pressure scrolls a document faster. Region 454b can similarly be used for horizontal (left/right) scrolling or other rate control adjustments, such as values, field of view, and the like.

Particular haptic effects can be associated with the control regions 454a and 454b. For example, when the rate control area 454a or 454b is used, a vibration of a particular frequency can be output on the touchpad. In embodiments with multiple actuators, the actuator located directly under the region 454a or 454b can be activated to provide a more localized tactile sensation for the "active" (currently used) region. When a portion of a region 454 is pressed for rate control, a pulse can be output on the touchpad (or on that region of the touchpad) to indicate, for example, when a page has scrolled past or a particular value has been crossed. The vibration can continue to be output while the user contacts the region 454a or 454b.

Other areas 456 can also be provided on the touchpad 450. For example, each of the areas 456 can be a small rectangular area, like a button, which the user can point to in order to initiate the function associated with that area. The areas 456 can initiate computer functions such as running a program, opening or closing a window, going "forward" or "back" in a web browser's queue of pages, powering the computer 10 on or off, initiating a "sleep" mode, checking mail, firing a gun in a game, cutting or pasting data from a buffer, saving a file to a storage device, or selecting a font. The areas 456 can duplicate functions and buttons provided in an application program, or can provide new, different functions.

Similar to the regions 454, each of the regions 456 can be associated with a haptic sensation; for example, an area 456 can provide a pulse sensation when selected by the user, giving immediate feedback that the function was selected. A haptic sensation such as a pulse can likewise be output when the user "taps" the areas 456, 452, or 454 with a finger or object to make a selection. Similar to a physical analog button that provides a range of output based on how far the button is pressed, one or more of the regions 456 can act as analog buttons by providing a stepped or analog output proportional to the pressure the user applies to the touchpad.

In addition, areas of the same type can be associated with similar haptic sensations. For example, the regions 456 that are each associated with a word processor can each cause a pulse of a particular strength when pointed to, while game-related areas 456 provide pulses or vibrations of a different strength. Furthermore, when the user moves the pointing object from one area 454 or 456 to another, a haptic sensation (such as a pulse) can be output on the touchpad 450 to indicate that an area boundary has been crossed. For example, when the pointing object enters a designated area, a high-frequency vibration that rapidly decays to zero can be output. This can be useful because it indicates the boundaries of areas 454 and 456 that the user cannot otherwise perceive; it allows areas to be reconfigured in size and/or position, and allows the user to quickly learn a new layout by feel. Regions can be associated with an "enclosure" that defines the region in the graphical environment, with different haptic sensations output when the cursor enters, exits, or moves within a particular boundary of the enclosure.

In addition, the areas are preferably programmable in size and shape as well as in the functions associated with them. Thus, the functionality of the regions 456 can be changed based on the active application in the graphical environment and/or on user selections entered into, or stored on, the computer 10. Preferably, the size and position of each area can be adjusted by the user or by an application program, and some or all of the areas can be removed entirely if desired. Further, the user can preferably assign a particular haptic sensation to a type of area, or to a specific area, based on the type of function associated with that area. Different haptic sensations can be designed with a tool such as Immersion Studio™ from Immersion Corp. of San Jose, California.

Note that the regions 454 and 456 need not be distinct physical regions of the touchpad. That is, the entire touchpad surface can simply provide the coordinates of the user's contact to the computer's processor, with computer software determining where the different regions are located. The computer interprets the coordinates of the user's contact and translates the touchpad input signal into the appropriate type of signal, such as a rate control, button function, or cursor control signal (e.g., a driver program can provide this functionality if desired). If a touchpad microprocessor is present, it can interpret the function associated with the user's contact location and report the appropriate signal or data (such as position coordinates or a button signal) to the host processor, relieving the host processor or software of this low-level processing. In other embodiments, the touchpad 450 can be physically designed to output different signals to the computer based on different areas, physically marked on the touchpad surface, that the user contacts; for example, each area can be sensed by a different sensor or sensor array.
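The software-defined region mapping could be sketched as a simple coordinate lookup, as a host driver or touchpad microprocessor might perform it; the region names, bounds, and signal types below are hypothetical:

```python
# Hypothetical region layout: names, bounds (x0, y0, x1, y1), and signal
# types are illustrative; the patent leaves the exact mapping to software.
REGIONS = [
    ("scroll_v", (0, 0, 10, 100), "rate_control"),
    ("button_a", (80, 0, 100, 20), "button"),
]
CURSOR_AREA = ("cursor", "cursor_control")

def classify_contact(x, y):
    """Map a raw touchpad coordinate to a region name and the type of
    signal (rate control, button, or cursor control) to report."""
    for name, (x0, y0, x1, y1), signal in REGIONS:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name, signal
    return CURSOR_AREA
```

Comparing the classification of successive contact points would also detect the boundary crossings that trigger the pulse sensations described above.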

Any of the embodiments described herein that provide haptic feedback to a touchpad, or to a user's finger or object in contact with a touch screen, can be used with the regions of the touchpad 450.

While the invention has been described in terms of several preferred embodiments, it is contemplated that alterations, permutations, and equivalents thereof will become apparent to those skilled in the art upon a reading of the specification and study of the drawings. For example, many of the features described in one embodiment can be used interchangeably in other embodiments. Also, certain terminology has been used for clarity of description and not to limit the present invention.

FIG. 1 is a perspective view of a laptop computer including a haptic touchpad in accordance with the present invention.

FIG. 2 is a perspective view of a remote control device including a touchpad according to the present invention.

FIG. 3 is a plan view of an embodiment of a haptic touch screen according to the present invention.

FIG. 4 is a block diagram of a haptic system suitable for use with the present invention.

FIG. 5 is a perspective view of one embodiment of an actuator assembly suitable for use in the inertial embodiments of the present invention.

FIG. 6 is a perspective view of the actuator assembly of FIG. 5 connected to a touchpad.

FIG. 7 is a perspective view of a detachable palm rest surface providing inertial haptic sensations adjacent to the touchpad.

FIG. 8A is a perspective view of a piezoelectric transducer suitable for use in providing inertial sensations in accordance with the present invention.

FIG. 8B is a side view of a structure and piezoelectric transducer in accordance with the present invention suitable for providing haptic sensations to a touch device.

FIG. 9 is a perspective view of one embodiment of a laterally moving surface member driven by a linear actuator.

FIG. 10 is a plan view of another embodiment of a laterally moving surface member driven by a rotary actuator.

FIG. 11 is a perspective view of another embodiment of a laterally moving surface member driven by a voice coil actuator.

FIG. 12 is a perspective view of an embodiment of a laterally moving surface adjacent to the touchpad.

FIG. 13 is a perspective view of an embodiment of a touchpad that is moved laterally in one direction by a rotary actuator.

FIG. 14 is a perspective view of an embodiment of a touchpad that is moved laterally in two directions by rotary actuators.

FIGS. 15A and 15B are perspective views of a first embodiment of an E-core actuator in accordance with the present invention suitable for lateral movement of a touchpad or separate surface.

FIG. 15C is a side view of the actuator of FIGS. 15A-15B.

FIG. 15D is a perspective view of the actuator of FIGS. 15A-15B connected to a touchpad.

FIGS. 16A and 16B are top and bottom perspective views of another embodiment of a planar E-core actuator according to the present invention.

FIGS. 17A-17B are perspective and plan views of a surface-mounted E-core actuator according to the present invention.

FIGS. 17C-17G are perspective and side views of the E-core actuator of FIGS. 17A-17B.

FIG. 18 is a top plan view of an example of a haptic touchpad in accordance with the present invention having additional control areas.

Claims (5)

  1. An actuator for providing a linear force output, comprising:
    a grounded ferromagnetic piece including a central pole located between two side poles;
    a coil wound around the central pole;
    a magnet located adjacent to the central pole and the side poles, wherein an air gap is provided between the magnet and the ferromagnetic piece; and
    a back plate connected to the magnet, the back plate and the magnet moving relative to the grounded ferromagnetic piece when a current flows through the coil.
  2. The actuator of claim 1, further comprising:
    rollers positioned between the ferromagnetic piece and the back plate to allow relative movement between the back plate and the ferromagnetic piece.
  3. The actuator of claim 2, wherein:
    the rollers are retained in a cage member.
  4. The actuator of claim 1, further comprising:
    a flexure connected between the back plate and the ferromagnetic piece, wherein the flexure reduces relative movement between the back plate and the ferromagnetic piece in undesired directions and provides a spring centering force between the back plate and the ferromagnetic piece.
  5. The actuator of claim 1, wherein:
    the total thickness of the actuator is about 4 mm or less.
KR1020087025412A 2000-01-19 2002-03-08 Haptic interface for laptop computers and other portable devices KR20080096854A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US27444401P true 2001-03-09 2001-03-09
US60/274,444 2001-03-09
US09/917,263 US6822635B2 (en) 2000-01-19 2001-07-26 Haptic interface for laptop computers and other portable devices
US09/917,263 2001-07-26

Publications (1)

Publication Number Publication Date
KR20080096854A true KR20080096854A (en) 2008-11-03



Family Applications (4)

Application Number Title Priority Date Filing Date
KR1020087010077A KR100950822B1 (en) 2000-01-19 2002-03-08 Haptic interface for laptop computers and other portable devices
KR20097008856A KR101136307B1 (en) 2000-01-19 2002-03-08 Haptic interface for laptop computers and other portable devices
KR20107013713A KR101035450B1 (en) 2000-01-19 2002-03-08 haptic interface for laptop computers and other portable devices
KR1020087025412A KR20080096854A (en) 2000-01-19 2002-03-08 Haptic interface for laptop computers and other portable devices

Family Applications Before (3)

Application Number Title Priority Date Filing Date
KR1020087010077A KR100950822B1 (en) 2000-01-19 2002-03-08 Haptic interface for laptop computers and other portable devices
KR20097008856A KR101136307B1 (en) 2000-01-19 2002-03-08 Haptic interface for laptop computers and other portable devices
KR20107013713A KR101035450B1 (en) 2000-01-19 2002-03-08 haptic interface for laptop computers and other portable devices

Country Status (2)

Country Link
KR (4) KR100950822B1 (en)
CN (1) CN100426213C (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080084384A1 (en) * 2006-10-05 2008-04-10 Immersion Corporation Multiple Mode Haptic Feedback System
US20080251364A1 (en) 2007-04-11 2008-10-16 Nokia Corporation Feedback on input actuator
TWI362673B (en) 2007-12-31 2012-04-21 Htc Corp Touch sensor keypad with tactile feedback mechanisms
CN101488046B (en) 2008-01-16 2011-06-15 宏达国际电子股份有限公司 Electronic device and keyboard module thereof
US8704649B2 (en) 2009-01-21 2014-04-22 Korea Institute Of Science And Technology Vibrotactile device and method using the same
US10007340B2 (en) * 2009-03-12 2018-06-26 Immersion Corporation Systems and methods for interfaces featuring surface-based haptic effects
CN101876863B 2009-04-30 2013-02-27 华硕电脑股份有限公司 Display device with reaction apparatus
TWI490736B (en) * 2009-04-30 2015-07-01 Asustek Comp Inc Display panel apparatus and reaction apparatus
WO2010134649A1 (en) 2009-05-19 2010-11-25 한국과학기술연구원 Vibration haptic mobile apparatus and operating method thereof
US9024908B2 (en) * 2009-06-30 2015-05-05 Microsoft Technology Licensing, Llc Tactile feedback display screen overlay
US8633916B2 (en) * 2009-12-10 2014-01-21 Apple, Inc. Touch pad with force sensors and actuator feedback
CN102339123B * 2010-07-14 2014-10-29 Tcl集团股份有限公司 Method for controlling vibration of a contact area
CN102722221B * 2011-03-31 2014-10-22 宏达国际电子股份有限公司 Handheld electronic device
KR102010206B1 (en) * 2011-09-06 2019-08-12 임머숀 코퍼레이션 Haptic output device and method of generating a haptic effect in a haptic output device
US9513706B2 (en) * 2012-02-15 2016-12-06 Immersion Corporation High definition haptic effects generation using primitives
KR101379292B1 (en) * 2012-07-06 2014-04-02 한국표준과학연구원 Method using the same and recording medium thereof
EP2701033B1 (en) * 2012-08-24 2018-11-28 BlackBerry Limited Temporary keyboard having some individual keys that provide varying levels of capacitive coupling to a touch-sensitive display
KR101580685B1 * 2012-12-26 2015-12-30 신성수 Method for obtaining 3-dimensional haptics and display apparatus using the same
CN104035609A (en) * 2013-03-08 2014-09-10 联想(北京)有限公司 Information processing method and electronic device
JP5540249B1 (en) * 2013-04-01 2014-07-02 新シコー科技株式会社 Vibration device and electronic device
US10254836B2 (en) 2014-02-21 2019-04-09 Immersion Corporation Haptic power consumption management

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2511577Y2 * 1987-04-14 1996-09-25 日本電気ホームエレクトロニクス株式会社 Touch panel switch
GB2260466B (en) 1991-09-28 1995-08-16 Star Mfg Co Electroacoustic transducer
JPH06265991A (en) * 1993-03-10 1994-09-22 Canon Inc Camera system
US5424592A (en) 1993-07-01 1995-06-13 Aura Systems, Inc. Electromagnetic transducer
US5684722A (en) * 1994-09-21 1997-11-04 Thorner; Craig Apparatus and method for generating a control signal for a tactile sensation generator
DE19638015A1 (en) 1996-09-18 1998-03-26 Mannesmann Vdo Ag Tactile panel for input to computer system
US5887995A (en) 1997-09-23 1999-03-30 Compaq Computer Corporation Touchpad overlay with tactile response
GB2339336B (en) 1998-06-16 2000-08-16 Huntleigh Technology Plc Magnetic actuator
CN2363316Y (en) 1998-12-15 2000-02-09 刘中华 Touch force action point coordinate detection sensor

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100983091B1 (en) * 2009-05-15 2010-09-17 한양대학교 산학협력단 Haptic mouse
WO2012002664A2 (en) * 2010-06-30 2012-01-05 (주)하이소닉 Portable terminal having haptic module
WO2012002664A3 (en) * 2010-06-30 2012-05-03 (주)하이소닉 Portable terminal having haptic module
US9053618B2 (en) 2010-06-30 2015-06-09 Hysonic. Co., Ltd. Portable terminal having haptic module with a hinge part supporting a touch pad
WO2012134047A2 (en) * 2011-03-31 2012-10-04 (주)하이소닉 Haptic actuator
WO2012134047A3 (en) * 2011-03-31 2012-12-06 (주)하이소닉 Haptic actuator

Also Published As

Publication number Publication date
KR20090052401A (en) 2009-05-25
KR100950822B1 (en) 2010-04-02
KR20100089104A (en) 2010-08-11
KR101035450B1 (en) 2011-05-18
CN100426213C (en) 2008-10-15
KR20080039551A (en) 2008-05-07
CN1924775A (en) 2007-03-07
KR101136307B1 (en) 2012-04-25

Similar Documents

Publication Publication Date Title
US7027032B2 (en) Designing force sensations for force feedback computer applications
US9619033B2 (en) Interactivity model for shared feedback on mobile devices
JP5308531B2 (en) Movable trackpad with additional functionality
JP4997335B2 (en) Portable device with touch screen and digital tactile pixels
US8294047B2 (en) Selective input signal rejection and modification
JP3069791U (en) Mouse and positioning device
JP5694205B2 Systems and methods for friction displays and additional haptic effects
EP0995152B1 (en) Method and apparatus for designing force sensations in force feedback computer applications
US10234944B2 (en) Force feedback system including multi-tasking graphical host environment
US6750877B2 (en) Controlling haptic feedback for enhancing navigation in a graphical environment
CN104915002B Touch pad with force sensors and actuator feedback
CN104881175B (en) Multi-touch device having a dynamic haptic effect
CN101231553B (en) Method and system for providing tactile feedback sensations
CN102111496B (en) Method and apparatus for generating vibrations in a portable terminal
US7102541B2 (en) Isotonic-isometric haptic feedback interface
CN103257783B Interactivity model for shared feedback on mobile devices
US7336266B2 (en) Haptic pads for use with user-interface devices
EP1066616B1 (en) Force feedback control wheel
US6703550B2 (en) Sound data output and manipulation using haptic feedback
JP2013117951A (en) Systems and methods for multi-pressure interaction on touch-sensitive surfaces
JP4675152B2 Portable computer
EP2626775A2 (en) Method and apparatus for haptic flex gesturing
US7564444B2 (en) System and method of applying force feedback to a manipulandum wheel utilized with a graphical user interface
US8502792B2 (en) Method and apparatus for providing haptic effects to a touch panel using magnetic devices
CN101910977B (en) Audio and tactile feedback based on visual environment

Legal Events

Date Code Title Description
A107 Divisional application of patent
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application