KR101096358B1 - An apparatus and a method for selective input signal rejection and modification - Google Patents


Info

Publication number
KR101096358B1
Authority
KR
South Korea
Prior art keywords
touch
input
pick
input data
method
Prior art date
Application number
KR1020090026845A
Other languages
Korean (ko)
Other versions
KR20100066283A (en)
Inventor
Christopher Tenzin Mullens
Wayne Carl Westerman
Original Assignee
Apple Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US12/242,794 (granted as US8294047B2)
Application filed by Apple Inc.
Publication of KR20100066283A
Application granted
Publication of KR101096358B1

Classifications

    • G — PHYSICS; G06 — COMPUTING; CALCULATING; COUNTING; G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/0354 — Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543 — Mice or pucks
    • G06F3/03547 — Touch pads, in which fingers can move on a surface
    • G06F3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 — Digitisers structurally integrated in a display
    • G06F3/0414 — Digitisers using force sensing means to determine a position
    • G06F3/0416 — Control or interface arrangements specially adapted for digitisers
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0488 — GUI interaction techniques using specific features of the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F2203/04104 — Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G06F2203/04105 — Pressure sensors for measuring the pressure or force exerted on the touch surface without providing the touch position
    • G06F2203/04808 — Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Abstract

Embodiments relate to a user input device that accepts complex user input including a combination of touch and push (or "pick") input. Embodiments of the present invention provide for selectively ignoring or rejecting input received from such devices to avoid interpreting unintended user actions as commands. In addition, some input signals may be modified. The selective rejection or modification may be performed by the user interface device itself or by a computing device that includes or is attached to the user interface device. It may be performed by a module that processes the input signals, performs the necessary rejections and modifications, and sends the modified input signals to higher level modules.
Input Devices, Input Signals, Touch, Push, Pick, Touch Sensing Panel, Trackpad

Description

Device and method for selective input signal rejection and correction {AN APPARATUS AND A METHOD FOR SELECTIVE INPUT SIGNAL REJECTION AND MODIFICATION}

The present invention generally relates to processing signals from user input devices, and more particularly to selectively rejecting certain types of signals received from user input devices.

Various types of input devices, such as buttons or keys, mice, trackballs, joysticks, touch sensor panels, touch screens, and the like, are currently available for performing operations in computing systems. Touch panels in particular are becoming increasingly widespread because they are easy and flexible to operate and their price is falling. A touch screen may comprise a transparent panel having a touch-sensitive surface. A computer or other electronic device can process signals generated by the touch panel to determine how and where a user is touching it.

Multi-touch panels are an advanced type of touch panel that allows multiple touch events to be detected simultaneously. Multi-touch panels allow for more complex user interaction because they let the electronic device detect all areas of the panel that are being touched at any given time. Thus, the electronic device can obtain an "image" representing the location and shape of all the touches occurring on the panel at any given time. Moreover, the multi-touch panel and the device connected to it can track the movement of one or more touch events over time (e.g., one or more fingers moving along the surface of the panel), enabling tracking of more complex "touch gestures".

Various types of multi-touch panels can be designed. One type detects touch events based on a change in capacitance caused by a finger or other object touching the panel. An exemplary multi-touch panel of this type is discussed in US application Ser. No. 11/649,998 (publication number 20080158172), filed Jan. 3, 2007, the contents of which are hereby incorporated by reference in their entirety.

Touch sensing (single-touch or multi-touch) is undoubtedly beneficial, but in some situations it can collect too much information. For example, a user may touch the panel or move a finger along it inadvertently, or at least without intending the action to be conveyed to the computer or device. If the device responds to such unintended actions, its response may confuse the user, or the device may misinterpret the actions as commands or other communications from the user.

Embodiments of the present invention relate to user input devices that accept complex user input including a combination of touch and push (or pick) input. These devices provide much richer user input than many existing user input devices. However, this richness may lead to unintended consequences. Because the devices of the present invention can detect user actions that previous devices could not, they may detect certain user actions that the user does not intend as machine interface actions.

Therefore, embodiments of the present invention provide for selectively ignoring or rejecting input received from such devices to avoid interpreting unintended user actions as commands. Moreover, some input signals can be modified. This selective rejection or modification may be performed by the user interface device itself or by a computing device that includes or is attached to the user interface device. It may be performed by a module that processes the input signals, performs the necessary rejections and modifications, and sends the modified input signals to higher level modules.

According to embodiments of the present invention, it is possible to avoid interpreting unintended user actions as a command.

In the following description of the preferred embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the embodiments of the present invention.

This generally relates to devices providing a surface characterized by a combination of touch sensing and mechanical pick sensing. Touch sensing refers to the detection of a finger or other object that merely touches the surface, while mechanical pick sensing means that the surface registers a push that actually physically moves or deforms the surface. The touch sensing may be multi-touch or single-touch sensing. Embodiments of the present invention generally detect combinations of mechanical and touch sensing data that match patterns identified as originating from unintended user input, and modify the data to remove or reduce the effects of that unintended input.

Embodiments of the invention may be described and illustrated herein with respect to a laptop trackpad and a computer mouse, but it should be understood that embodiments of the invention are not so limited and can be applied to any input device that combines touch sensing and mechanical pick sensing. Moreover, although embodiments may be described and illustrated herein with respect to devices that perform multi-touch sensing, some embodiments may include input devices that perform only single-touch sensing.

FIG. 1 illustrates an exemplary user input device in accordance with some embodiments of the present invention. The device may include an upper surface 101. The upper surface may be a multi-touch or single-touch capable surface. The upper surface may be connected to the base 102 via a hinge 103. In the example of FIG. 1 the hinge 103 is shown at the distal ends of the top surface 101 and the base 102, but in other embodiments the hinge may instead be positioned more towards the center of the top surface and base, creating a pivot point that can provide a "rocker switch" action. The hinge may be spring loaded, or another spring or similar mechanism may be used to elastically separate the top surface from the base.

The upper surface may be elastically movable in the downward direction. A switch 104 can be placed on the base and actuated when the top surface is pushed down. The switch may be a microswitch or other actuatable device. The hinge can ensure that the top surface returns to its original position after the user stops applying pressure to push it down.

The user can interact with the touch surface simply by touching it, without depressing it to activate the switch. The user can provide multi-touch signals, for example, by using two or more fingers to touch different locations on the surface. The user may also enter gestures by moving one or more fingers along the surface. This type of input is called touch input. The user may also push the surface down to activate the microswitch; this can be used as another type of user input called a pick. Moreover, the user may combine the two types of input. For example, when the user pushes the top surface down, he may do so with his fingers in a particular configuration, perhaps at a particular location on the surface, and that configuration and location can have a specific meaning within the user interface. The user may also make gestures on the top surface while it is pushed down, and these gestures may likewise have a specific meaning within the user interface. This meaning may be the same as or different from the meaning of similar gestures made while the top surface is not pushed down.
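
The combined touch-plus-pick input described above can be modeled as a stream of frames, each pairing the set of current touches with the switch state. The following is a minimal illustrative sketch in Python; it is not part of the patent, and all names are hypothetical:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Touch:
        x: float          # contact position on the panel, in mm
        y: float          # y increases toward the top of the panel
        major_mm: float   # major radius of the contact's fitted ellipse
        minor_mm: float   # minor radius of the contact's fitted ellipse

    @dataclass
    class InputFrame:
        touches: List[Touch]  # all contacts seen in this frame
        pick: bool            # True while the surface is pushed down (switch closed)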

Device 100 may be included as a user interface device in various other devices. For example, it may be included as the trackpad of laptop computer 110. Moreover, the device 100 may be a standalone trackpad connected to a personal computer or the like, a trackpad for a standalone keypad, a trackpad for a toy or game console, or a trackpad for a vending machine, an ATM machine, an electronic kiosk, or another type of device.

FIG. 2 shows another exemplary device that includes a combination of touch and pick sensing. The device 200 may be a computer mouse, standalone trackpad, or other input device. It may include an upper surface 201, which may be a multi-touch surface. Base 202 may be attached to the top surface through one or more spring components 203. Guides (not shown) can be used to keep the top surface in position over the base. The switch 204 can be disposed on the base and actuated by the top surface when the top surface is pushed down. The device 200 may also include a position tracking module 205 that tracks the movement of the device. The position tracking module 205 may be conventional, including, for example, a ball or laser tracking system.

As with the trackpad discussed above, the user may be able to pass commands or information to a device attached to the device 200 by simply touching (without depressing) the surface of the device, by pressing the surface to actuate the switch 204, or by a combination of both. When giving multi-touch input, the user can generate multi-touch combinations by simultaneously touching different portions of the upper surface and/or generate gestures by moving one or more fingers and/or other objects along the surface.

The mouse embodiment may be used with various existing computer systems, such as computer system 210, or in any other application in which a computer mouse may be considered a useful user input device.

Other types of input devices may combine multi-touch and pick type interfaces by allowing a user to provide touch and pick input on the same surface, as discussed above. Some of these devices may feature two or more switches that allow for greater variety in pick inputs.

Devices of this type provide much richer user input than many existing input devices. However, this may lead to unintended consequences. Because the devices of the present invention can detect user actions that previous devices could not, they may detect certain user actions that the user does not intend as machine interface actions. For example, a user often rests his palm on a conventional laptop trackpad while typing, without expecting this to act as a command to the laptop. However, a version of trackpad 100 could be depressed by the user's palm and register a pick as a result.

Therefore, embodiments of the present invention provide for selectively ignoring or rejecting input received from devices 100 and 200 to avoid interpreting unintended user actions as commands. Moreover, some input signals can be modified. This selective rejection or modification may be performed by the user interface device itself (e.g., by mouse 200) or by a computing device (e.g., laptop 110 or computer 210) that includes or is attached to the user interface device. It may be performed by a module that processes the input signals, performs the necessary rejections and modifications, and sends the modified input signals to higher level modules. This is discussed in more detail below with respect to FIG. 7.

In some embodiments, pick inputs are rejected when certain types of touch inputs are present. This can be done because users may unintentionally push the top surface, and the way they push it often indicates whether the pick was intentional.

FIGS. 3 and 4 show a plurality of touch panels and possible touch combinations on them; the figures thus show the current state of the touch panels, with touches indicated by hatched areas. Embodiments of the invention may be configured to recognize certain patterns according to the manner in which such patterns are typically caused. For example, a small circle or ellipse can be recognized as a finger or fingertip, a larger ellipse as a thumb, and a still larger ellipse whose smaller radius is above a certain threshold (e.g., 11 mm) as the palm of the hand. Recognition of the finger or other hand parts is discussed in more detail in US Pat. No. 6,323,846, which is incorporated by reference herein in its entirety in all respects. This topic is also discussed in US Patent Application No. 11/619,464, "MULTI-TOUCH INPUT DISCRIMINATION," filed January 3, 2007, and US Patent Application No. 11/756,211, "MULTI-TOUCH INPUT DISCRIMINATION" (publication number 20080158185), filed May 31, 2007. These two patent applications are hereby incorporated by reference in their entirety in all respects.
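
Following the geometric heuristics above, a contact could be classified by the size and shape of its fitted ellipse. In this sketch (reusing the Touch type from the earlier sketch), the 11 mm palm threshold comes from the text; the thumb thresholds are invented for illustration:

    def classify_touch(t: Touch) -> str:
        # A contact whose smaller (minor) radius exceeds ~11 mm reads as a palm.
        if t.minor_mm > 11.0:
            return "palm"
        # A thumb tends to be larger and more elliptical than a fingertip;
        # these two thresholds are illustrative, not taken from the patent.
        if t.major_mm > 9.0 and t.major_mm / t.minor_mm > 1.5:
            return "thumb"
        return "finger"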

Referring to FIGS. 3A-3C, the panel 300 of FIG. 3A shows a pattern that may result from touching three fingers on the panel. If such a touch is detected and a pick is detected at the same time, embodiments of the present invention may allow the pick to be recognized (i.e., not reject it). The pick can be accepted because this pattern typically indicates that the user is intentionally pushing the surface with his fingers. Picks may also be allowed in similar states with a different number of finger touches and/or finger touches arranged in a different manner. In some embodiments, the pick may be ignored if more than a predefined number of fingers (e.g., eight) appear, since many fingers may indicate that the user is resting his hands on the trackpad.

The pattern of panel 301 in FIG. 3B shows a portion of thumb touch 303 appearing at the bottom of the panel. Pick events that occur while this touch is present may be allowed, since it also usually represents an intentional pick. In fact, any touch recognized as a thumb touch, or a portion of one, appearing close to an edge of the panel can allow pick events that occur during that touch. In some embodiments, any finger touch or portion thereof appearing close to an edge of the panel can allow simultaneous pick events, regardless of whether it is recognized as a thumb.

Panel 302 of FIG. 3C shows two patterns 304 and 305 that can be identified as palm touches. If a pick is registered while one or both of these patterns (or similar patterns) appear at these or similar positions (i.e., positions close to and relatively parallel to the sides of the panel), the pick can be rejected or ignored. Pick rejection is based on the fact that this pattern likely indicates that the user is simply resting his hands on the trackpad and does not intend a pick. If only portions of the patterns 304 and 305 appear at the sides of the panel, the pick may similarly be rejected as long as those positions are recognizable as such.

Panel 400 of FIG. 4A shows a portion of palm pattern 401 and thumb pattern 402. Since the pattern of panel 400 may indicate that the user is resting his hand on the trackpad, such a pattern may also cause any simultaneously detected pick to be rejected. The pattern of panel 400 may cause the pick to be rejected even though pattern 402 is not recognized as a thumb touch. In general, when a palm touch (e.g., pattern 401, or part thereof) is detected together with any finger touch in the top portion of the panel, simultaneously detected picks may be ignored. The top portion can be defined, for example, as the top fifth of the panel (as defined by line 403 in FIG. 4A). A mirror image of panel 400 can also result in pick rejection.

Panel 404 of FIG. 4B shows finger touch 406 with palm touch 405 near the side. In panel 404, the finger touch is not in the top portion of the panel. This pattern can indicate an intentional push by finger touch 406 and thus allow the pick to be registered. Picks may be accepted even when palm touch 405 is a partial palm touch. Patterns that mirror panel 404 may also result in allowance of picks.
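
Taken together, the accept/reject cases of FIGS. 3A-3C and 4A-4B behave like an ordered rule list over the classified touches. A sketch under stated assumptions (panel height normalized so that the "top fifth" is y > 0.8 · panel_h, and MAX_FINGERS standing in for the "e.g., eight" limit); it simplifies the patterns rather than reproducing them exactly:

    MAX_FINGERS = 8  # "e.g., eight": more fingers suggests resting hands

    def accept_pick(touches: List[Touch], panel_h: float) -> bool:
        kinds = [classify_touch(t) for t in touches]
        fingers = [t for t, k in zip(touches, kinds) if k == "finger"]
        palms = [t for t, k in zip(touches, kinds) if k == "palm"]

        if len(fingers) > MAX_FINGERS:
            return False  # hands resting on the pad
        if palms and not fingers:
            return False  # resting palms alone (FIG. 3C)
        if palms and any(t.y > 0.8 * panel_h for t in fingers):
            return False  # palm plus finger in the top fifth (FIG. 4A)
        return True       # fingertips or thumb pushing down: accept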

In some embodiments, if a palm touch such as palm touch 305 appears and causes a detected pick to be ignored, and then a finger such as finger 406 appears while the multi-touch surface is still pushed down, the pick can continue to be ignored. Conversely, if a finger touch such as pattern 402 first appears and causes a detected pick to be registered, and then a palm touch such as palm touch 401 appears while the multi-touch surface is pushed down, the pick may continue to be registered. More broadly, in some embodiments, if a pick occurs during a pattern that results in a decision to register or ignore the pick, and the pattern then changes while the pick is still occurring, the first decision can still dominate even if the later pattern would result in a different decision.
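
The rule that the decision made at pick onset dominates for as long as the pick is held is naturally expressed as a latch. A sketch, again with hypothetical names:

    class PickLatch:
        """Latches the accept/ignore decision made when a pick begins."""
        def __init__(self):
            self.decision = None  # None while no pick is in progress

        def update(self, frame: InputFrame, panel_h: float):
            if frame.pick and self.decision is None:
                # Decide once, from the pattern present when the pick starts;
                # later pattern changes do not revisit this decision.
                self.decision = accept_pick(frame.touches, panel_h)
            elif not frame.pick:
                self.decision = None  # pick released: the next pick decides afresh
            return self.decision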

Some embodiments may allow users to enter touch gestures by moving one or more fingers along the panel. In some embodiments, if a certain multi-touch gesture is detected as being in progress and a pick is detected during that gesture, the pick can be ignored. This can be done because the user may have unintentionally pressed the panel while attempting to make the gesture.

Other embodiments may allow gestures and picks to be detected at the same time and, in some cases, provide different behavior on such events. For example, some embodiments may allow a user to move objects around a desktop by making a pick to "pick up" an object and then making a gesture to move the object while the pick is held (i.e., while the panel is pushed down).

When processing gestures, the concept of the lowest path can be defined. In some embodiments the lowest path may simply be the lowest touch on the panel (i.e., the one with the lowest y coordinate). In other embodiments, the lowest path may be selected as a relatively low and relatively inactive touch. In one example of the latter embodiments, the lowest path may be selected based on the height and movement rate of each touch. If there are a plurality of touches on the panel at a given time, the following parameters can be measured for each touch: the height y of the touch and the distance d the touch has moved over a predefined time period. (The predefined time period can be relatively short, for example 0.1 second.) The lowest path may then be the touch for which (d + y) is minimal. In other embodiments, the expression (a·d + b·y) may be used, where a and b are predefined constants. The lowest path is typically a thumb, but may be another finger or object.
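
The lowest-path selection can be written directly from the (a·d + b·y) expression; a = b = 1 gives the (d + y) form, and d is the distance each touch moved over the short predefined period (e.g., 0.1 second). A sketch:

    def lowest_path(touches: List[Touch], dist_moved_mm: List[float],
                    a: float = 1.0, b: float = 1.0) -> Touch:
        # Minimize a*d + b*y: prefer a touch that is low on the panel
        # and nearly stationary (typically the thumb).
        scores = [a * d + b * t.y for t, d in zip(touches, dist_moved_mm)]
        return touches[scores.index(min(scores))]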

In some embodiments, touch events can also be ignored. One such example is shown in panel 407 of FIG. 4C. In the embodiments represented by that panel, a thumb resting zone may be defined in the lower portion of the panel, below line 408. The thumb rest zone can be, for example, 1 cm thick. If the lowest path (e.g., lowest path 409) appears in the thumb rest zone, any pick received may be registered, but the touch input of that lowest path may be rejected or ignored. This can be done because the user may be touching the panel only to rest his finger or to make a pick, without intending any touch input. Disregarding touches in various rest zones is discussed in more detail in the US patent application "SELECTIVE REJECTION OF TOUCH CONTACTS IN AN EDGE REGION OF A TOUCH SURFACE," filed concurrently herewith (attorney docket No. 106842017800), which is incorporated herein by reference in its entirety in all respects. If the lowest path moves and leaves the thumb rest zone, its touch input may be allowed. However, if there is another finger on the panel (e.g., finger touch 410), and that other finger touch moves more than a predefined distance (e.g., 1 cm) between the time finger touch 410 appears and the time the lowest path 409 leaves the thumb rest zone, the lowest path 409 can be ignored permanently (i.e., until the user lifts the finger), regardless of where it moves. This can be done because the user may be concentrating on making a gesture with his finger (i.e., touch 410) and may have unintentionally drifted his thumb (i.e., lowest path 409) out of the resting zone.
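
The rest-zone handling above could be tracked per lowest path roughly as follows. The 1 cm zone thickness and 1 cm travel threshold come from the text; the state bookkeeping is an assumption made for illustration:

    REST_ZONE_MM = 10.0    # thumb rest zone: bottom ~1 cm of the panel
    DRIFT_LIMIT_MM = 10.0  # other-finger travel that makes the ignore permanent

    def update_rest_zone(lowest: Touch, other_travel_mm: float, state: str) -> str:
        """state is one of 'resting', 'active', 'ignored_permanently'."""
        if state == "ignored_permanently":
            return state  # stays ignored until the user lifts the thumb
        if lowest.y <= REST_ZONE_MM:
            return "resting"  # touch input from this path is rejected
        # The lowest path has left the rest zone. If another finger has
        # meanwhile moved more than the threshold, a gesture is in progress
        # and the thumb merely drifted out of the zone: ignore it for good.
        if state == "resting" and other_travel_mm > DRIFT_LIMIT_MM:
            return "ignored_permanently"
        return "active"  # its touch input may now be allowed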

In some embodiments, if the lowest path had been moving for at least a predefined time period before another finger touched down, the lowest path is ignored from before the other finger's touch. The predefined time period may be, for example, a quarter of a second.

In some embodiments, if a pick is detected while more than two finger touches are currently detected, the lowest path is selected from the current touches and is ignored while the pick is in progress (i.e., while the user is pushing the panel down). It should be noted that in these embodiments, if only one touch is being detected when a pick is detected, that touch is not ignored.

If the lowest path is rejected under the above scenario, a memory flag may be set so that the lowest path continues to be rejected until the lowest path completely leaves the surface, or all other touches leave the surface (leaving the lowest path as the only remaining touch), or the pick is released and the lowest path is not identified as a thumb path.

A path or touch can be identified as a thumb path by examining the geometric qualities of the touch pattern and determining whether the pattern comes from a thumb touch. For example, a thumb touch can be larger and more elliptical than a typical finger touch. In the scenario discussed above, if the lowest path is identified as a thumb path and is not the only remaining touch, it will continue to be rejected after the pick is released. The lowest path behavior discussed above may be distinct from the thumb rest zone behavior discussed in connection with FIG. 4C above and need not depend on it.

It is recognized that, in some situations, the user's fingers may slide unintentionally on the surface of the panel when the user pushes the panel down (i.e., when picking) and when the user lets the panel return to its original state. This may be especially true for curved panels such as the panel of the mouse of FIG. 2, but may also occur with flat panels such as those shown in FIG. 1.

Graph 500 of FIG. 5 shows the state of the mechanical switch during the performance of a pick. Switch state 0 may indicate that the top surface is not pushed down against the switch; switch state 1 may indicate that the top surface is being pushed down, or that a pick is being performed. At time point 501, the user pushes the top surface down, causing the state to change from 0 to 1. The user keeps the top surface pushed down until time point 502, at which point the user releases the top surface so that it returns to its original position, causing the state to change back to 0.

However, the user need not remove his finger from the top surface when releasing it. He can release the upper surface merely by releasing pressure while keeping his finger touching the surface. The user may want to release the surface without removing his finger in order to perform, or continue performing, a desired touch input or gesture. However, as noted above, the user may inadvertently move his fingers along the multi-touch panel when initially pressing down and when releasing the top surface. This may occur as a result of the change in pressure between the user's fingers and the surface, and it may interfere with the touch input or gesture the user is about to make. Also, if the user is already moving his fingers along the surface, the act of pushing or releasing the touch panel can result in an unintentional change in the speed of movement. Therefore, embodiments of the present invention provide for eliminating the effects of this unintended finger movement by modifying the detected touch inputs.

This modification can be made based on a click memory variable, an internal variable referred to here as CMV. Graph 503 of FIG. 5 shows the state of the CMV over a certain time interval in accordance with some embodiments. The click memory variable may have a value between 0 and 1. It can be reset to 1 whenever a switch change occurs; thus the CMV may be set to 1 at time points 501 and 502. Once at a non-zero value, the CMV can decay exponentially over time until it reaches zero. This can be achieved, for example, by periodically performing the following calculation:

CMV_new = 0.9 × CMV_old    (1)

In different embodiments the coefficient 0.9 and the period of the calculation may vary. Because of the inherent rounding of electronic calculation, equation (1) will cause the CMV to decay to zero some time after a change in the switch, unless a new switch change resets the CMV back to 1.
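
Equation (1) amounts to resetting the CMV to 1 on every switch transition and decaying it by a constant factor on each processing tick. A sketch (the cutoff constant is an illustrative stand-in for the rounding the text mentions):

    class ClickMemory:
        DECAY = 0.9  # per-tick coefficient from equation (1)

        def __init__(self):
            self.cmv = 0.0

        def tick(self, switch_changed: bool) -> float:
            if switch_changed:
                self.cmv = 1.0          # reset on every press or release
            else:
                self.cmv *= self.DECAY  # exponential decay toward zero
                if self.cmv < 1e-6:
                    self.cmv = 0.0      # rounding eventually reaches exactly zero
            return self.cmv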

Each touch pattern forming an individual geometric object may be considered a touch object. Thus, for example, referring to panel 300, finger touch patterns 306-308 may be considered separate touch objects. Palm touch patterns 304 and 305 of panel 302 may also be considered individual touch objects. The speed of each touch object on the panel can be calculated. In some embodiments, this speed calculation need only be done when the value of the CMV is non-zero. The speed of the touch objects can then be modified according to the following equation:

V_R = sign(V_IN) · max(0, |V_IN| − K · CMV)    (2)

where V_IN is the initial (detected) speed, V_R is the resulting (modified) speed, and K is a predefined constant. A suitable value of K can be selected by experiment.

The result of equation (2) is shown in FIG. 6, which is a graph of the modified speed versus the initial speed for a given value of the CMV. As can be seen, the modified speed is generally proportional to the initial speed except within the dead zone defined by K · CMV; speed correction is thus similar to dead-zone filtering. The dead zone represents a speed range in which the initially sensed speed of the various touch objects is likely due to the unintended consequences of the user pushing or releasing the top surface.
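
As reconstructed above, the dead-zone filtering of equation (2) can be applied per velocity component, as the text notes below. A sketch, with K the experimentally chosen constant:

    import math

    def filter_speed(v_in: float, cmv: float, k: float) -> float:
        # Equation (2): speeds inside the K*CMV dead zone collapse to zero;
        # faster movements are reduced by the width of the dead zone.
        return math.copysign(max(0.0, abs(v_in) - k * cmv), v_in)

    # Applied independently to each touch object's x and y velocity components:
    # v_r_x = filter_speed(v_in_x, cmv, K)
    # v_r_y = filter_speed(v_in_y, cmv, K)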

Thus, embodiments of the present invention provide that slower touch objects may be held stationary (i.e., their speed may be set to zero), while the speed of faster touch objects may be reduced based on the value of the CMV.

The graph of FIG. 6 may represent the relationship of the velocities at a single instant in time. Over time, the value of the CMV may change, and the relationship of FIG. 6 may change with it. More specifically, if there is no new panel push or release event, the value of the CMV decays, which results in a narrowing of the dead zone. Thus, the modification of the touch object velocities diminishes as the value of the CMV decreases, or equivalently with time since the last push or release event. Eventually the CMV reaches zero, at which point the speed is no longer modified. This reflects the fact that any unintended effects of pushing or releasing the top surface decrease over time and eventually disappear.

In some embodiments, the calculation of equation (2) can be done separately for the vertical and horizontal (x and y) components of each object's velocity. Thus, V_R,x and V_R,y can be calculated from V_IN,x and V_IN,y, respectively. In some embodiments, the modified speeds of the touch objects may be sent to higher level modules. Alternatively, or in addition, the modified speeds may later be used to determine modified positions of the various touch objects, and these modified positions may be sent to higher level modules.

Speed correction is particularly useful for input devices featuring non-flat top surfaces, and can therefore be done for the input device of FIG. 2 and/or similar devices. However, it can also be done for devices featuring flat top surfaces, such as laptop trackpads.

The various input signal rejection and correction strategies discussed above may be combined if they are not mutually exclusive. Thus, any device may be characterized by a combination of one or more of the strategies discussed above.

FIG. 7 is a block diagram showing a modular representation of embodiments of the present invention. FIG. 7 may describe various embodiments, such as the laptop of FIG. 1, the computer of FIG. 2, and the like. Block 700 represents a user input device. This may be a device combining mechanical pick input and touch (or multi-touch) input as discussed above, such as the trackpad of FIG. 1, the mouse of FIG. 2, or another device that combines touch and mechanical pick sensing. Block 701 is a reject and modify module. This module can accept user input data from the user input device, modify it and/or reject various data sets as discussed above, and send the modified data to higher level modules 702-705.

The reject and modify module may be implemented in application specific hardware. Alternatively, it may be implemented as software running on a programmable processor; in that case, the reject and modify module may comprise a processor, a memory, and software stored in the memory that is read and executed by the processor. The reject and modify module need not be directly connected to the user input device; there may be one or more intervening modules between blocks 700 and 701. These may include modules for digitization, normalization, and/or compression of the input device's data, modules performing other types of error correction on the input data, or modules that otherwise process the input data, for example segmentation modules that organize raw pixel-based touch data into separate touch objects defining specific touched areas. Since rejection can be considered a type of modification, the reject and modify module may simply be called a modification module.
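
Structurally, FIG. 7 places the reject and modify module between the device and the higher level modules. A pipeline sketch reusing the earlier illustrative classes; the interfaces are hypothetical, not the patent's:

    class RejectAndModify:
        """Block 701: filters and corrects frames before higher level modules."""
        def __init__(self, downstream):
            self.downstream = downstream  # higher level modules (702-705)
            self.latch = PickLatch()
            self.memory = ClickMemory()

        def process(self, frame: InputFrame, panel_h: float, switch_changed: bool):
            pick_ok = self.latch.update(frame, panel_h)
            cmv = self.memory.tick(switch_changed)
            # ... rest-zone rejection and equation (2) speed correction
            #     (using cmv) would also be applied to frame.touches here ...
            if frame.pick and not pick_ok:
                frame = InputFrame(frame.touches, pick=False)  # drop the pick
            self.downstream.handle(frame)  # hand off to higher level modules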

The modified input data generated by the reject and modify module can be used by higher level modules. For example, higher level modules may perform further processing and modification of the input data. Alternatively, higher level modules may actually use the input data to perform user interactions; thus, some higher level modules may be applications such as web browsers, email clients, and the like. Higher level modules may also be implemented as software running on a programmable processor or as special purpose hardware. When implemented as software, higher level modules may comprise software stored in the same memory as that of the reject and modify module, or stored separately, and may run on the same processor as the reject and modify module or on a different one. The devices discussed above, whether laptop computers, desktop computers, or any other type of device, may thus be characterized by better and more intuitive interaction, providing a rich user input interface without requiring users to focus on preventing unintended input.

Some embodiments discussed above refer primarily to rectangular panels, but such embodiments may also be used with non-rectangular or curved panels. Embodiments with curved panels may still feature a "flat" or two-dimensional representation of the touch data sensed on them, and thus may relate to the panels discussed in FIGS. 3 and 4.

Various user input devices that may be used in accordance with embodiments of the present invention have been discussed above in connection with FIGS. 1 and 2. The text below and FIGS. 8-13 provide additional details of some of these types of user input devices. The invention is not limited to the user input devices discussed below.

Referring to FIG. 8, the touch-sensitive track pad 10 will be described in more detail. This track pad is generally a small (often rectangular) region comprising a protective/cosmetic shield 12 and a plurality of electrodes 14 disposed under the protective shield 12. The electrodes 14 may be located on a circuit board, for example a printed circuit board (PCB). For ease of discussion, part of the protective shield 12 has been removed to show the electrodes 14. Different electrodes 14, or combinations thereof, may represent different x, y positions. In one configuration, when a finger 16 (or alternatively a stylus, not shown) approaches the electrode grid 14, the finger may form a capacitance with one or more electrodes close to it, or may change the existing capacitances between such electrodes. Sensing circuitry (not shown) measures these capacitance changes and generates an input signal 18 that is sent to a host device 20 (e.g., a computing device) having a display screen 22. The input signal 18 is used to control the movement of the cursor 24 on the display screen 22; as shown, the input pointer moves in x and y directions similar to the detected x and y finger movements.

FIG. 9 is a simplified perspective view of the input device 30 according to an embodiment of the present invention. Input device 30 is generally configured to send information or data to an electronic device (not shown) in order to perform an action on a display screen (e.g., via a graphical user interface (GUI)), for example moving an input pointer, making a selection, or providing a command. The input device can interact with the electronic device via a wired (e.g., cable/connector) or wireless connection (e.g., IR, Bluetooth, etc.).

The input device 30 can be a standalone unit or it can be integrated into an electronic device. In the case of a standalone unit, the input device typically has its own enclosure. When integrated with an electronic device, the input device typically uses an enclosure of that electronic device. In either case, the input device can be structurally connected to the enclosure via, for example, screws, snaps, retainers, adhesives, and the like. In some cases, the input device may be removably connected to the electronic device via, for example, a docking station. The electronic device to which the input device is connected may correspond to any consumer-related electronic product. For example, the electronic device may correspond to a computer such as a desktop computer, a laptop computer or a PDA, a media player such as a music player, a communication device such as a mobile phone, another input device such as a keyboard, and the like.

As shown in FIG. 9, the input device 30 includes a frame 32 (or support structure) and a track pad 34. The frame 32 provides a structure for supporting the components of the input device. The frame 32 in the form of a housing may also enclose or contain components of the input device. Components comprising track pad 34 may correspond to electrical, optical, and / or mechanical components for operating input device 30.

Track pad 34 provides an intuitive interface configured to provide one or more control functions for controlling various applications associated with the electronic device to which it is attached. For example, a touch-initiated control function may be used to move an object or perform an action on the display screen, or to issue or select commands related to the operation of the electronic device. To implement touch-initiated control functions, the track pad 34 may be adapted to receive input from a finger (or object) moving along its surface (e.g., linearly, radially, at an angle, etc.), from a finger holding a particular position on the track pad 34, and/or from a finger tapping on a specific position of the track pad 34. As will be appreciated, the track pad 34 provides for easy one-handed operation; that is, the user may interact with the electronic device with one or more fingers.

The track pad 34 may vary. For example, it may be a conventional track pad based on the Cartesian coordinate system, or a touch pad based on a polar coordinate system. Examples of touch pads based on polar coordinates can be found in US Pat. No. 7,046,230 to Zadesky et al., "TOUCH PAD FOR HANDHELD DEVICE," filed Jul. 1, 2002, which is hereby incorporated by reference in its entirety in all respects.

Track pad 34 can be used in relative or absolute mode. In absolute mode, track pad 34 reports the absolute coordinates of where it is being touched, for example (x, y) in Cartesian coordinates or (r, θ) in polar coordinates. In relative mode, track pad 34 reports the direction and/or distance of change, for example left/right, up/down, and the like. In most cases, the signals generated by the track pad 34 direct movement on the display screen in a direction similar to the direction of the finger as the finger moves across the surface of the track pad 34.
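
The difference between the two modes is simply whether coordinates or successive differences are reported; an illustrative sketch:

    def to_relative(prev_xy, cur_xy):
        # Relative mode: report the change in position since the last sample
        # rather than the absolute touch position.
        dx = cur_xy[0] - prev_xy[0]
        dy = cur_xy[1] - prev_xy[1]
        return dx, dy  # e.g., positive dx = right, positive dy = up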

The shape of the track pad 34 may also vary. For example, it may be circular, elliptical, square, rectangular, triangular, or the like. In general, the outer perimeter of the track pad 34 defines its working boundary. In the embodiment shown, the track pad is rectangular; rectangular track pads are common in laptop computers. Circular track pads allow the user to swirl a finger continuously in a free manner: the finger can be rotated through 360 degrees without stopping. In addition, the user can rotate his finger tangentially from all sides, giving a greater range of finger positions. Both of these features can be helpful when performing scrolling functions, so circular track pads can be used in portable media players (e.g., the iPod media player manufactured by Apple Inc. of Cupertino, Calif.). The size of the track pad 34 may generally correspond to a size that allows it to be easily manipulated by a user (e.g., the size of a fingertip or larger).

Track pad 34, generally in the form of a rigid planar platform, includes a touchable outer track surface 36 for receiving a finger (or object) for manipulation of the track pad. Although not shown in FIG. 9, below the touchable outer track surface 36 is a sensor arrangement that is sensitive to such things as the pressure and/or movement of a finger on it. The sensor arrangement typically includes a plurality of sensors configured to activate when a finger rests on, taps on, or passes over them. In the simplest case, an electrical signal is generated each time a finger is positioned over a sensor. The number of signals in a given time frame may indicate the position, direction, speed, and acceleration of the finger on the track pad 34; that is, the more signals, the more the user has moved his finger. In most cases, the signals are monitored by an electronic interface that converts the number, combination, and frequency of the signals into position, direction, speed, and acceleration information. This information can then be used by the electronic device to perform a desired control function on the display screen. The sensor arrangement can vary; by way of example, the sensors may be based on resistive sensing, surface acoustic wave sensing, pressure sensing (e.g., strain gauge), infrared sensing, optical sensing, dispersive signal technology, acoustic pulse recognition, capacitive sensing, and the like.

In the illustrated embodiment, the track pad 34 is based on capacitive sensing. As is generally well known, a capacitance-based track pad is designed to detect changes in capacitance as the user moves an object such as a finger up and down the track pad. In most cases, the capacitive track pad includes a protective shield, one or more electrode layers, a circuit board, and an application specific integrated circuit (ASIC). The protective shield is disposed over the electrodes, the electrodes are mounted on the upper surface of the circuit board, and the ASIC is mounted on the bottom surface of the circuit board. The protective shield serves to protect the underlying layers and to provide a surface on which the finger can slide; the surface is generally smooth so that the finger does not stick to it when moving. The protective shield also provides an insulating layer between the finger and the electrode layers. The electrode layer comprises a plurality of spatially distinct electrodes, and any suitable number of electrodes can be used. In most cases it is desirable to increase the number of electrodes to provide higher resolution, i.e., more information that can be used for things like acceleration.

Capacitive sensing works according to the principle of capacitance. As will be appreciated, whenever two electrically conductive members come close to each other without actually touching, their electric fields interact to form a capacitance. In the configuration discussed above, the first electrically conductive member is one or more of the electrodes and the second is, for example, the user's finger. Thus, as the finger approaches the touch pad, a small capacitance forms between the finger and the electrodes close to it. The capacitance at each electrode is measured by the ASIC located on the back side of the circuit board. By detecting changes in capacitance at each of the electrodes, the ASIC can determine the position, direction, speed, and acceleration of the finger as it moves across the touch pad, and can report this information in a form usable by the electronic device.
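
Estimating a touch position from per-electrode capacitance changes is commonly done as a weighted centroid. The following sketch shows that standard technique; the patent text does not specify the ASIC's actual method:

    def touch_centroid(cap_delta):
        """cap_delta: 2-D grid of capacitance changes, one value per electrode."""
        total = x_sum = y_sum = 0.0
        for row, line in enumerate(cap_delta):
            for col, c in enumerate(line):
                total += c
                x_sum += c * col
                y_sum += c * row
        if total == 0.0:
            return None  # no touch detected
        return x_sum / total, y_sum / total  # position in electrode-pitch units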

According to one embodiment, the track pad 34 is movable relative to the frame 32 so as to initiate another set of signals (other than the track signals). By way of example, the track pad 34, in the form of a rigid planar platform, may rotate, pivot, slide, translate, or flex with respect to frame 32. The track pad 34 may be connected to the frame 32 and/or movably restrained by it. For example, the track pad 34 may be connected to the frame 32 through a screw, shaft, pin joint, slider joint, ball and socket joint, flexure joint, magnet, cushion, or the like. The track pad 34 may also float within the space of the frame (e.g., gimbaled). It should be noted that the input device 30 may include a combination of joints, such as pivot/translating joints, pivot/flexure joints, pivot/ball-and-socket joints, or translating/flexure joints, to increase the range of motion (e.g., to increase the degrees of freedom). When moved, the touch pad 34 is configured to actuate circuitry that generates one or more signals; this circuitry generally includes one or more motion indicators, such as switches, sensors, encoders, and the like. An example of a gimbaled track pad can be found in patent application Ser. No. 10/643,256, filed Aug. 13, 2003, entitled "MOVABLE TOUCH PAD WITH ADDED FUNCTIONALITY," which is incorporated herein by reference in its entirety.

In the illustrated embodiment, the track pad 34 takes the form of a depressible button that performs a "picking" action. That is, part or all of the track pad 34 acts like a single or multiple button, so that one or more additional button functions can be implemented by pressing on the track pad 34 rather than tapping on it or using separate buttons/separate zones. As shown in FIGS. 10A and 10B, according to one embodiment of the invention, the track pad 34 may move between an upright (or neutral) position (FIG. 10A) and a depressed (or activated) position (FIG. 10B) when force from a finger 38, palm, hand, or other object is applied to it. The required force should be large enough to prevent accidental activation of the button signal, but not so large as to cause user discomfort by requiring excessive pressure. The track pad 34 may typically be biased toward the upright position, for example via a flexure hinge, spring member, or magnet, and it moves to the activated position when its bias is overcome by an object pressing on it. As shown in FIG. 10C, the track pad 34 may pivot at one end, so that the activated position is slightly inclined relative to the neutral position. When the finger (or other object) is removed from the track pad 34, the biasing member urges it back toward the neutral position. A shim or other structure (not shown) may prevent the track pad from moving past its neutral position; for example, a portion of frame 32 may extend outward over a portion of the track pad 34 so as to stop it at the neutral position. In this way, the track surface can be kept flush with the frame 32 if desired; for example, in a laptop computer or handheld media device, it may be desirable to have the track pad flush with the housing of the computer or device.

As shown in FIG. 10A, in the upright/neutral position the track pad 34 generates tracking signals when an object such as a user's finger moves over its top surface in the x and y plane. Although FIG. 10A shows the neutral position as upright, the neutral position may be at any orientation. As shown in FIG. 10B, in the depressed position (z direction) the track pad 34 generates one or more button signals. The button signals may be used for various functions, including but not limited to issuing or selecting commands related to the operation of the electronic device. For example, in the case of a music player, the button functions may relate to opening a menu, playing a song, fast-forwarding a song, navigating through a menu, and the like; in the case of a laptop computer, the button functions may relate to opening a menu, selecting text, selecting an icon, and the like. As shown in FIG. 10D, the input device 30 can provide tracking signals and a button signal simultaneously, i.e., the touch pad 34 can be moved along the track surface (in the x and y directions) while being pushed down in the z direction. In other cases, the input device 30 may be arranged to provide a button signal only when the touch pad 34 is pressed down, and a track signal only when the touch pad 34 is upright.

In other words, the track pad 34 is configured to actuate one or more motion indicators, which can generate a button signal when the track pad 34 moves to the activated position. The motion indicators are typically located within frame 32 and may be connected to track pad 34 and/or frame 32. The motion indicators can be any combination of switches and sensors. Switches are generally configured to provide pulsed or binary data, such as active (on) or inactive (off). As an example, the underside of the track pad 34 may be configured to contact or engage (and thus activate) a switch when the user presses on the track pad 34. Sensors, on the other hand, are generally configured to provide continuous or analog data. By way of example, a sensor may be configured to measure the displacement or tilt of the track pad 34 relative to the frame as the user presses on the track pad 34. Any suitable mechanical, electrical, and/or optical switch or sensor can be used. For example, a tact switch, a force-sensing resistor, a pressure sensor, a proximity sensor, or the like may be used.
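
As a toy illustration of this switch/sensor distinction (again, not from the disclosure), the two kinds of motion indicators might be modeled as follows; the class names and the 0.3 mm travel threshold are assumptions.

    # Toy sketch: a switch yields binary data, a sensor yields continuous data.
    class SwitchIndicator:
        """Binary indicator: active once the pad travels far enough to engage it."""
        def __init__(self, travel_mm=0.3):
            self.travel_mm = travel_mm
        def read(self, displacement_mm):
            return displacement_mm >= self.travel_mm   # on / off

    class TiltSensorIndicator:
        """Analog indicator: reports how far (or how tilted) the pad is."""
        def read(self, displacement_mm):
            return displacement_mm                     # continuous value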

The track pads 10 and 30 shown in FIGS. 8-10 may in some embodiments be multi-touch track pads. Multi-touch denotes a touch surface (screen, table, wall, etc.) or touchpad, together with software, that recognizes multiple simultaneous touch points; this differs from standard touchscreens (e.g., computer touchpads, ATMs), which recognize only one touch point at a time. This effect is achieved through various means, including capacitive sensing, resistive sensing, surface acoustic wave sensing, heat, finger pressure, high-capture-rate cameras, infrared light, optical capture, tuned electromagnetic induction, and shadow capture. An example of a multi-touch phone is the iPhone manufactured by Apple Inc. of Cupertino, California. An example of a multi-touch media device is the iPod Touch manufactured by Apple. Examples of laptop computers with multi-touch track pads are the MacBook Air and MacBook Pro manufactured by Apple. All of the input devices described herein may employ multi-touch technology in some embodiments; alternatively, the input devices described herein may employ single-touch track pads.

FIG. 11 is a simplified block diagram of a computer system 39 in accordance with an embodiment of the present invention. The computer system generally includes an input device 40 operatively coupled to a computing device 42. By way of example, input device 40 may generally correspond to input device 30 shown in FIGS. 9 and 10, and computing device 42 may be a laptop computer, desktop computer, PDA, media player, mobile phone, smartphone, video game console, and the like. As shown, input device 40 includes a depressible track pad 44 and one or more motion indicators 46. The track pad 44 is configured to generate tracking signals, and the motion indicator 46 is configured to generate a button signal when the track pad 44 is pressed down. Although the track pad 44 can vary widely, in this embodiment it includes capacitive sensors 48 and a control system 50 that acquires position signals from the sensors 48 and supplies the signals to the computing device 42. The control system 50 may include an application-specific integrated circuit (ASIC) configured to monitor the signals from the sensors 48, to compute the position (Cartesian or angular), direction, speed, and acceleration of the monitored signals, and to report this information to the processor of the computing device 42. The motion indicator 46 may also vary widely; in this embodiment, however, the motion indicator 46 takes the form of a switch that generates a button signal when the track pad 44 is pressed down. The switch 46 may be a mechanical, electrical, or optical switch. In one particular implementation, the switch 46 is a mechanical switch that includes a protruding actuator 52 that can be pushed by the track pad 44 to generate the button signal. By way of example, the switch may be a tact switch or a tactile dome.
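
The disclosure does not specify the control system's arithmetic, but a plausible sketch of deriving position, direction, speed, and acceleration from successive samples, as the ASIC is described as doing, might look like this; the function name, the dictionary layout, and the fixed sampling interval dt are assumptions.

    # Illustrative sketch: finite-difference motion report from position samples.
    import math

    def motion_report(positions, dt):
        """positions: at least three (x, y) samples taken at a fixed interval dt (seconds)."""
        (x0, y0), (x1, y1), (x2, y2) = positions[-3:]
        vx, vy = (x2 - x1) / dt, (y2 - y1) / dt      # current velocity
        ux, uy = (x1 - x0) / dt, (y1 - y0) / dt      # previous velocity
        ax, ay = (vx - ux) / dt, (vy - uy) / dt      # acceleration
        return {
            "position": (x2, y2),                    # Cartesian position
            "direction": math.atan2(vy, vx),         # angular direction of motion
            "speed": math.hypot(vx, vy),
            "acceleration": math.hypot(ax, ay),
        }

For instance, motion_report([(0, 0), (1, 0), (3, 0)], dt=0.01) reports a speed of 200 units per second along the x axis with a direction of 0 radians.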

Both the track pad 44 and the switch 46 are operatively coupled to the computing device 42 via a communication interface 54. The communication interface provides a connection point for direct or indirect connection between the input device and the electronic device. Communication interface 54 may be wired (wires, cables, connectors) or wireless (e.g., transmitter/receiver).

Computing device 42 generally includes a processor 55 (e.g., a CPU or microprocessor) configured to execute instructions and perform operations associated with computing device 42. For example, using instructions retrieved from memory, the processor may control the reception and manipulation of input and output data between components of computing device 42. In most cases, processor 55 executes instructions under the control of an operating system or other software. Processor 55 may be a single-chip processor or may be implemented with multiple components.

Computing device 42 also includes an input/output (I/O) controller 56 that is operatively coupled to the processor 55. I/O controller 56 may be integrated with processor 55, or it may be a separate component as shown. I/O controller 56 is generally configured to control interaction with one or more I/O devices, e.g., input device 40, that may be coupled to computing device 42. I/O controller 56 generally operates by exchanging data between computing device 42 and the I/O devices that wish to communicate with computing device 42.

Computing device 42 also includes a display controller 58 that is operatively coupled to the processor 55. Display controller 58 may be integrated with processor 55, or it may be a separate component as shown. Display controller 58 is configured to process display commands to generate text and graphics on display screen 60. By way of example, display screen 60 may be a monochrome display, a color graphics adapter (CGA) display, an enhanced graphics adapter (EGA) display, a video graphics array (VGA) display, a super VGA display, a liquid crystal display (LCD) (e.g., active matrix, passive matrix, etc.), a cathode ray tube (CRT), a plasma display, a backlit light-emitting diode (LED) LCD display, and the like.

In one embodiment (not shown), the track pad 44 may include a glass surface that functions not only as a touch-sensitive surface but also as a display screen, in which case the display screen 60 shown in FIG. 11 would be integrated with the glass surface of the track pad 44. This may be useful in computing devices with touch-sensitive displays (e.g., a media player or mobile phone). An example of a media player with a touch-sensitive display is the iPod Touch manufactured by Apple Inc. of Cupertino, California. An example of a mobile phone with a touch-sensitive display is the iPhone manufactured by Apple Inc. of Cupertino, California.

In most cases, processor 55 operates with an operating system to execute computer code and generate and use data. Computer code and data may reside in a program storage area 62 that is operatively coupled to processor 55. Program storage area 62 generally provides a place for holding data being used by computing device 42. For example, the program storage area may include read-only memory (ROM), random-access memory (RAM), a hard disk drive, and the like. Computer code and data may also reside on removable program media and be loaded or installed on the computing device as needed. In one embodiment, program storage area 62 is configured to store information for controlling how tracking and button signals generated by input device 40 are used by computing device 42.

FIG. 12 shows one embodiment of an input device, shown generally at 70, comprising a track pad 72 connected to a frame 76. Frame 76 may be a housing for a standalone input device, or it may be the casing of another device incorporating track pad 72, for example, a laptop computer, desktop computer, handheld media device, PDA, mobile phone, smartphone, or the like. Track pad 72 includes various layers, including an outer touch-sensitive track surface 74 for tracking finger movements. Track surface 74 may also provide a low-friction cosmetic surface. In one embodiment, the track pad 72 is based on capacitive sensing, and as a result it includes an electrode layer 80 that can be implemented, for example, on a PCB. For capacitive sensing, track surface 74 is a dielectric material. A stiffener 84 lies under the electrode layer 80. Stiffener 84 is shown in FIGS. 12 and 13 but may be omitted in some embodiments; it may be used to compensate for the inherent flexibility of electrode layer 80. The electrode layer 80 responds to finger movements along the track surface 74 by signaling the sensor 82. For capacitive sensing, electrode layer 80 registers a change in capacitance based on finger movements, and sensor 82 is a capacitive sensor. In this way, track pad 72 incorporates a touch sensor arrangement. Although the sensor 82 is shown disposed at the bottom of the electrode layer 80, it may be located elsewhere in other embodiments. If, as in the illustrated embodiment, the sensor 82 is located in the movable portion of the track pad 72, the input device may incorporate a flexible electrical connection (not shown) that can move with the system.

A motion indicator 78 is disposed at the bottom of the track pad 72. The motion indicator 78 can vary widely, but in this embodiment it takes the form of a mechanical switch, typically disposed between the track pad 72 and the frame 76. In other embodiments, the motion indicator 78 can be a sensor, for example an electrical sensor. Motion indicator 78 may be attached to frame 76 or to track pad 72. In the illustrated embodiment, the motion indicator 78 is attached to the bottom side of the electrode layer 80. By way of example, if electrode layer 80 is located on a PCB, motion indicator 78 may be located at the bottom of the PCB. In another embodiment, the motion indicator 78 may take the form of a tact switch, and in particular may be an SMT dome switch (a dome switch packaged for surface-mount technology).

The track pad 72 is shown in the neutral position in FIG. 12, where the motion indicator 78 is not in contact with the frame 76. When the user applies downward pressure to the track surface 74, the track pad 72 may move downward, causing the motion indicator 78 to register this change in position. In the illustrated embodiment, the motion indicator 78 (a tact switch) will come into contact with the frame 76, or in this case with a set screw 88. The set screw 88 can be manually adjusted to change the distance between the neutral position and the activation position. In one embodiment (not shown), the set screw 88 may directly abut the motion indicator 78 in the neutral position so that there is no slack or pre-travel in the system. A flexure hinge 86 connects the track pad 72 and the frame 76. The flexure hinge 86 is made of a resilient material that bends when force is applied but exerts a restoring force to push the track pad 72 back toward its neutral position. In one embodiment, flexure hinge 86 may be thin spring steel.

As shown in FIG. 13, the flexure hinge 86 bends as the user pushes down on the track surface 74. Flexure 86 also pushes track pad 72 toward its neutral position, which is horizontal in the embodiment shown in FIG. 12. In this way, the user can press down virtually anywhere on the track surface 74 to cause a "pick," meaning that the motion indicator 78 will register this depression. This contrasts with conventional track pads that incorporate separate track regions and pick regions. Being able to pick anywhere on the track surface 74 provides the user with a more intuitive and enjoyable interface. For example, a user may be able to generate tracking and button signals with just one finger, without ever having to remove the finger from the track surface 74. In contrast, a user who manipulates a track pad with separate track and pick zones may, for example, use the right hand for tracking and the left hand for picking, or the forefinger for tracking and the thumb for picking.

A shoulder 90, which may be an extension of the frame 76 or a separate member, contacts a portion of the track pad 72, for example the stiffener 84, so that the track pad 72 does not move past its neutral position. In this way, the track surface 74 can be maintained at substantially the same height as the top surface of the frame 76. A shock absorber or upstop (not shown) may be integrated with the shoulder 90 to cushion the contact between the track pad 72 and the shoulder 90.

As will be appreciated, the pick generated by pressing on the track surface 74 may include selecting an item on the screen, opening a file or document, executing a command, starting a program, viewing a menu, and the like. Button functions may also include functions that make it easier to navigate through the electronic system, for example, zooming, scrolling, opening various menus, homing the input pointer, and performing keyboard-related actions such as enter, delete, insert, page up/down, and the like.

The flexure hinge 86 allows a movable track pad within the smallest vertical space possible. Minimal vertical space is achieved because the flexure hinge 86 is thin and lies generally parallel to the bottom layer of the track pad 72; as a result, the flexure hinge 86 does not add appreciably to the thickness of the track pad 72. This arrangement is therefore suitable for ultra-thin laptop computers, where vertical space is extremely limited. In the past, the size of electrical components was often the limiting factor in how small an electrical device could be made. Today, electrical components are becoming smaller and smaller, which means that mechanical components (e.g., movable track pads) can now be the critical size-limiting components. With this in mind, it is easy to understand why linear actuation (e.g., supporting a movable track pad with a coil spring or the like) is not ideal in some applications. In addition, using springs can add unnecessary complexity (increased component count, higher cost, higher failure rates, etc.) to the manufacturing process. Another disadvantage of a spring is that, in some embodiments, it can mask or distort the tactile switch force profile. In contrast, flexure 86 can convey a substantially consistent feel across track surface 74 and provide the user with a more faithful representation of the tactile switch force profile.

Referring now to FIG. 13, in accordance with an embodiment of the present invention, when the user presses the track surface 74 of the track pad 72, the track pad 72 pivots downward to actuate the switch 78 disposed below. The switch 78, when actuated, generates button signals that can be used by an electronic device connected to the input device 70. Flexure 86 may constrain track pad 72 to move substantially about only one axis. This can be achieved, for example, by using multiple flexures arranged along the axis of one side of the track pad 72, for example the rear side. In addition, if the track pad 72 is made stiff (e.g., by inserting the stiffener 84 if necessary), a leveling architecture is achieved. That is, the flexure hinge 86 pushes the track pad 72 toward its neutral position and also allows it to move substantially only about one axis, i.e., the axis along which the flexure hinge 86 is connected to the frame 76.
Therefore, according to some embodiments of the invention, a device includes a user interface device comprising a touch sensing panel configured to detect touch events and a mechanical sensor configured to detect pick events including physical deformation or displacement of the touch sensing panel. The device also includes a modification module configured to receive user input data defining touch and pick events detected from the user interface device, to examine the user input data to determine whether unintended user input may have affected the user input data, and to modify the user input data to remove the effects of unintended user input. In some embodiments, the device further includes one or more higher-level modules coupled to the modification module, and the modification module is further configured to transmit the modified user data to the one or more higher-level modules. In some embodiments, the user interface device is a trackpad and the touch sensitive panel is the top surface of the trackpad. In some embodiments, the device is a laptop computer. In some embodiments, the user interface device is a mouse and the touch sensitive panel is the top surface of the mouse. In some embodiments, the device is a desktop computer. In some embodiments, the touch sensitive panel is a multi-touch sensitive panel. In some embodiments, the modification module receives input data representing a first finger touch input that is initially located in the bottom portion of the multi-touch panel and then moves out of the bottom portion, and a second finger touch input that moves while the first finger touch input is moving out of the bottom portion; the modification module is configured to reject the first finger touch input if the second finger touch input moves a distance longer than a predefined distance. In some embodiments, the first finger touch input is a lowest path input, the lowest path input being a finger touch input selected from a plurality of concurrently occurring finger touch inputs as being relatively low and stationary. In some embodiments, the modification module receives input data indicative of a first finger touch input that is moving and a second finger touch input that is detected while the first finger touch input is moving; the modification module is configured to reject the first finger touch input if the second finger touch input appears after the first finger touch input has been moving for at least a predefined time period. In some embodiments, the first finger touch input is a lowest path input, the lowest path input being a finger touch input selected from a plurality of concurrent finger touch inputs as being relatively low and non-moving. In some embodiments, the modification module is configured to receive input data indicative of a plurality of finger touch inputs and a pick input, to select a lowest path input from the plurality of finger touch inputs, and to reject the lowest path input. In some embodiments, the lowest path input continues to be rejected while at least one of the following holds: (i) a pick input is in progress, and (ii) the lowest path is not the only detected touch. In some embodiments, the modification module then receives input data indicating that the pick input has stopped; the modification module determines whether the lowest path is a thumb path, continues to reject the lowest path if the lowest path is determined to be a thumb path, and allows the lowest path if the lowest path is determined not to be a thumb path. In some embodiments, the modification module receives input data indicative of one or more finger touch inputs (at least one of the finger touch inputs having a non-zero speed) and a pick input, and is configured to modify the speed of all finger touch inputs having a non-zero speed for a non-zero time period after detection of the pick input. In some embodiments, modifying the speed includes reducing the speed.
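
As a rough, non-authoritative sketch of the lowest-path rejection just described: the touch representation (dictionaries with "y" and "speed" fields), the helper functions, and the thumb heuristic below are all hypothetical, and the coordinate convention assumes a smaller y value means a position lower on the panel.

    # Hedged sketch of the rejection logic; the disclosure does not specify
    # the module at this level of detail.
    def select_lowest_path(touches):
        """Pick the touch that is relatively low on the panel and stationary."""
        return min(touches, key=lambda t: (t["y"], t["speed"]))

    def filter_touches(touches, pick_in_progress):
        """Reject the lowest path while a pick is in progress and it is not
        the only detected touch (see the embodiments above)."""
        if not touches:
            return touches
        lowest = select_lowest_path(touches)
        if pick_in_progress and len(touches) > 1:
            return [t for t in touches if t is not lowest]
        return touches

    def looks_like_thumb(touch):
        # Placeholder heuristic: a low, slow contact. The patent does not
        # prescribe a specific thumb test.
        return touch["y"] < 0.2 and touch["speed"] < 0.05

    def on_pick_stopped(touches):
        """After the pick stops, keep rejecting the lowest path only if it
        appears to be a resting thumb."""
        lowest = select_lowest_path(touches)
        if looks_like_thumb(lowest):
            return [t for t in touches if t is not lowest]
        return touches
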
According to another embodiment, a method of processing user interface events detected by a combined user interface device capable of detecting touch events on a touch sensitive panel and pick events including physical deformation or displacement of the touch sensitive panel includes: receiving user input data defining touch and pick events detected from the user interface device; examining the user input data to determine whether unintended user input may have affected the user input data; and modifying the user input data to remove the effect of the unintended user input. In some embodiments, the method further includes transmitting the modified user data to one or more higher-level modules. In some embodiments, the method includes receiving input data representing a first finger touch input initially located at the bottom portion of the multi-touch panel and then moving out of the bottom portion, and a second finger touch input moving while the first finger touch input is moving out of the bottom portion; determining that the second finger touch input moved a distance longer than the predefined distance before the first finger touch input exited the bottom portion; and rejecting the first finger touch input. In some embodiments, the method further comprises determining the first finger touch input to be the lowest path input, wherein the determination is made with respect to the height and movement of the first finger touch input, the first finger touch input being relatively stationary compared to the other finger touch inputs and in a low position on the multi-touch panel. In some embodiments, the method includes receiving input data indicative of a first finger touch input that is moving and a second finger touch input that is detected while the first finger touch input is moving; determining that the second finger touch input appeared after the first finger touch input had been moving for a predefined time period; and rejecting the first finger touch input. In some embodiments, the method further comprises determining the first finger touch input to be the lowest path input, wherein the determination is made with respect to the height and movement of the first finger touch input, the first finger touch input being relatively stationary compared to the other finger touch inputs and in a low position on the multi-touch panel. In some embodiments, the method includes receiving input data indicative of a plurality of finger touch inputs; determining one of the plurality of finger touch inputs to be the lowest path input; and rejecting the lowest path input while a pick input is in progress. In some embodiments, the method further includes continuing to reject the lowest path input while at least one of the following holds: (i) a pick input is in progress, and (ii) the lowest path is not the only detected touch. In some embodiments, the method includes receiving input data indicating that the pick input has stopped; determining whether the lowest path is a thumb path; and continuing to reject the lowest path if it is determined that the lowest path is a thumb path, and allowing the lowest path if it is determined that the lowest path is not a thumb path. In some embodiments, the method includes receiving input data indicative of one or more finger touch inputs (at least one of the finger touch inputs having a non-zero speed) and a pick input; and modifying the speed of all finger touch inputs having a non-zero speed for a non-zero period of time after detection of the pick input. In some embodiments, modifying the speed includes reducing the speed.
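
The speed modification also lends itself to a short worked sketch: non-zero finger speeds are scaled down immediately after a pick, and the scaling decays back to unity so the amount of modification decreases with elapsed time (matching the decreasing-modification embodiments in the claims). The 0.3-second window and 0.25 floor are invented values.

    # Hedged sketch of post-pick speed modification with a decaying correction.
    def modified_speed(speed, t_since_pick, window=0.3, floor=0.25):
        """Scale a non-zero finger speed for `window` seconds after the pick.

        At t = 0 the speed is multiplied by `floor`; the factor then ramps
        linearly back to 1.0, so the modification decreases with time.
        """
        if speed == 0 or t_since_pick >= window:
            return speed
        factor = floor + (1.0 - floor) * (t_since_pick / window)
        return speed * factor

For example, modified_speed(100.0, 0.0) returns 25.0, while modified_speed(100.0, 0.3) returns the unmodified 100.0.
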
According to yet another embodiment, a computer comprises a user interface device and a modification module. The user interface device includes a touch sensing panel configured to detect touch events and a mechanical sensor configured to detect pick events including physical deformation or displacement of the touch sensing panel. The modification module is configured to receive user input data defining touch and pick events detected from the user interface device, to examine the user input data to determine whether unintended user input may have affected the user input data, and to modify the user input data to remove the effects of unintended user input.
According to yet another embodiment, a computer mouse comprises a user interface device and a modification module. The user interface device includes a touch sensing panel configured to detect touch events and a mechanical sensor configured to detect pick events including physical deformation or displacement of the touch sensing panel. The modification module is configured to receive user input data defining touch and pick events detected from the user interface device, to examine the user input data to determine whether unintended user input may have affected the user input data, and to modify the user input data to remove the effects of unintended user input.

While embodiments of this invention have been described fully with reference to the accompanying drawings, various changes and modifications will be apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of embodiments of the invention as defined by the appended claims.

FIG. 1 illustrates an exemplary laptop trackpad in accordance with an embodiment of the present invention.

FIG. 2 illustrates an exemplary computer mouse in accordance with an embodiment of the present invention.

FIGS. 3A-3C illustrate a plurality of exemplary touch panels and touch combinations sensed thereon.

FIGS. 4A-4C illustrate a plurality of exemplary touch panels and touch combinations sensed thereon.

FIG. 5 illustrates exemplary switch state and clock memory variable graphs in accordance with an embodiment of the present invention.

FIG. 6 shows an exemplary initial velocity versus modified velocity graph in accordance with an embodiment of the present invention.

FIG. 7 illustrates an exemplary block diagram of one embodiment of the present invention.

FIG. 8 is a simplified diagram of an exemplary touch pad and display according to one embodiment of the invention.

FIG. 9 is a perspective view of an exemplary input device according to an embodiment of the present invention.

FIGS. 10A-10D are simplified side views of an exemplary input device having a button touch pad, in accordance with an embodiment of the present invention.

FIG. 11 is a simplified block diagram of an exemplary input device connected to a computing device, in accordance with an embodiment of the present invention.

FIG. 12 is a side cross-sectional view of an exemplary input device according to an embodiment of the present invention.

FIG. 13 is another side cross-sectional view of the exemplary input device of FIG. 12.

<Explanation of symbols for the main parts of the drawings>

101: multi-touch trackpad

102: hinge

104: switch

Claims (49)

  1. A device comprising:
    a user interface device comprising a touch sensing panel configured to detect touch events, and a mechanical sensor configured to detect pick events including physical deformation or displacement of the touch sensing panel; and
    a modification module configured to receive user input data defining touch and pick events detected from the user interface device,
    to examine the user input data to determine whether unintended user input caused at least a portion of the user input data, the determination being independent of the positional relationship between the user input data and an image displayed on a display, and
    to modify the user input data to remove the portion of the user input data caused by the unintended user input.
  2. delete
  3. delete
  4. delete
  5. delete
  6. delete
  7. The device of claim 1,
    wherein the touch sensitive panel is a multi-touch sensitive panel.
  8. The device of claim 7,
    wherein the modification module is configured to maintain the pick input unmodified when detecting that a pick input and a touch input coexist, and
    wherein the touch input is:
    multiple finger touch inputs without other touch inputs,
    a finger touch input at an edge of the multi-touch sensing panel, or
    a palm touch input located in proximity to a side of the multi-touch sensing panel together with a finger touch input located outside of the top of the multi-touch sensing panel.
  9. The device of claim 7,
    wherein the modification module is configured to reject the pick input when detecting that a pick input and a touch input coexist, and
    wherein the touch input is:
    multiple finger touch inputs greater than a predefined number,
    one or more palm touch inputs located proximate to a side of the multi-touch sensing panel, or
    a palm touch input located proximate to a side of the multi-touch sensing panel together with a finger touch input located above the multi-touch sensing panel.
  10. delete
  11. delete
  12. delete
  13. delete
  14. delete
  15. The device of claim 7,
    wherein the modification module is configured to reject a pick input when the pick input is received while touch inputs recognized by the modification module as part of an ongoing gesture are being input.
  16. delete
  17. delete
  18. delete
  19. delete
  20. delete
  21. delete
  22. delete
  23. The device of claim 7,
    wherein the modification module receives input data indicative of one or more finger touch inputs, at least one of the finger touch inputs having a non-zero speed, and a pick input, and is configured to modify the speed of all finger touch inputs having a non-zero speed for a non-zero time period after detection of the pick input.
  24. delete
  25. The device of claim 23,
    wherein the amount of speed modification decreases with time from the detection of the pick input.
  26. The device of claim 7,
    wherein the modification module receives input data indicative of one or more finger touch inputs, at least one of the finger touch inputs having a non-zero speed, and a stop of an ongoing pick input, and is configured to modify the speed of all finger touch inputs having a non-zero speed for a non-zero time period after the ongoing pick input stops.
  27. A method of processing user interface events detected by a combined user interface device capable of detecting touch events on a touch sensitive panel and pick events including physical deformation or displacement of the touch sensitive panel, the method comprising:
    receiving user input data defining touch and pick events sensed from the user interface device;
    examining the user input data to determine whether unintended user input caused at least a portion of the user input data, the determination being independent of the positional relationship between the user input data and an image displayed on a display; and
    modifying the user input data to remove the portion of the user input data caused by the unintended user input.
  28. delete
  29. The method of claim 27, further comprising:
    receiving user input data representing a pick input that coexists with a plurality of finger touch inputs without another touch input; and
    maintaining the pick input unmodified as a result of the received user input data.
  30. The method of claim 27, further comprising:
    receiving user input data representing a pick input that coexists with more finger touch inputs than a predefined number; and
    rejecting the pick input as a result of the received user input data.
  31. The method of claim 27, further comprising:
    receiving user input data representing a pick input coexisting with a finger touch input at an edge of the touch sensing panel; and
    maintaining the pick input unmodified as a result of the received user input data.
  32. The method of claim 27, further comprising:
    receiving user input data representing a pick input coexisting with one or more palm touch inputs located proximate to a side of the touch sensing panel; and
    rejecting the pick input as a result of the received user input data.
  33. The method of claim 27, further comprising:
    receiving user input data representing a palm touch input located proximate to a side of the touch sensing panel and a pick input coexisting with a finger touch input located above the touch sensing panel; and
    rejecting the pick input as a result of the received user input data.
  34. The method of claim 27, further comprising:
    receiving user input data representing a palm touch input located proximate to a side of the touch sensing panel and a pick input coexisting with a finger touch input not located above the touch sensing panel; and
    maintaining the pick input unmodified as a result of the received user input data.
  35. delete
  36. The method of claim 27, further comprising:
    receiving user input data representing a plurality of moving touch inputs;
    recognizing the plurality of moving touch inputs as part of an ongoing touch gesture;
    receiving user input data representing a pick input being input while the touch inputs recognized as part of the ongoing gesture are being input; and
    rejecting the pick input as a result of the received user input data.
  37. delete
  38. delete
  39. delete
  40. delete
  41. delete
  42. delete
  43. delete
  44. The method of claim 27, further comprising:
    receiving input data indicative of one or more finger touch inputs, at least one of the finger touch inputs having a non-zero speed, and a pick input; and
    modifying the speed of all finger touch inputs having a non-zero speed for a non-zero period of time after detection of the pick input.
  45. delete
  46. The method of claim 44,
    wherein the amount of speed modification decreases as time elapses from detection of the pick input.
  47. The method of claim 27, further comprising:
    receiving input data indicative of one or more finger touch inputs, at least one of the finger touch inputs having a non-zero speed, and a stop of an ongoing pick input; and
    modifying the speed of all finger touch inputs having a non-zero speed for a non-zero period of time after the ongoing pick input stops.
  48. delete
  49. delete
KR1020090026845A 2008-12-08 2009-03-30 An apparatus and a method for selective input signal rejection and modification KR101096358B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/242,794 US8294047B2 (en) 2008-12-08 2008-12-08 Selective input signal rejection and modification
US12/242,794 2008-12-08

Publications (2)

Publication Number Publication Date
KR20100066283A KR20100066283A (en) 2010-06-17
KR101096358B1 true KR101096358B1 (en) 2011-12-20

Family

ID=42229829

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020090026845A KR101096358B1 (en) 2008-12-08 2009-03-30 An apparatus and a method for selective input signal rejection and modification

Country Status (3)

Country Link
US (5) US8294047B2 (en)
JP (4) JP5259474B2 (en)
KR (1) KR101096358B1 (en)

Families Citing this family (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7561146B1 (en) 2004-08-25 2009-07-14 Apple Inc. Method and apparatus to reject accidental contact on a touchpad
US7834855B2 (en) 2004-08-25 2010-11-16 Apple Inc. Wide touchpad on a portable computer
US20070152983A1 (en) 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
US8022935B2 (en) 2006-07-06 2011-09-20 Apple Inc. Capacitance sensing electrode with integrated I/O mechanism
FR2917859B1 (en) * 2007-06-25 2009-10-02 Dav Sa Electrical control device
US20090174679A1 (en) 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
US9740341B1 (en) 2009-02-26 2017-08-22 Amazon Technologies, Inc. Capacitive sensing with interpolating force-sensitive resistor array
US10180746B1 (en) 2009-02-26 2019-01-15 Amazon Technologies, Inc. Hardware enabled interpolating sensor and display
US8294047B2 (en) * 2008-12-08 2012-10-23 Apple Inc. Selective input signal rejection and modification
US9046956B2 (en) * 2009-04-22 2015-06-02 Mitsubishi Electric Corporation Position input apparatus that detects a position where a pressure is applied
JP2010262557A (en) * 2009-05-11 2010-11-18 Sony Corp Information processing apparatus and method
US9740340B1 (en) 2009-07-31 2017-08-22 Amazon Technologies, Inc. Visually consistent arrays including conductive mesh
US9785272B1 (en) * 2009-07-31 2017-10-10 Amazon Technologies, Inc. Touch distinction
JP5325060B2 (en) * 2009-09-18 2013-10-23 株式会社バンダイナムコゲームス Program, information storage medium and image control system
EP2315186B1 (en) * 2009-10-26 2016-09-21 Lg Electronics Inc. Mobile terminal with flexible body for inputting a signal upon bending said body
KR101660842B1 (en) * 2009-11-05 2016-09-29 삼성전자주식회사 Touch input method and apparatus
US8810524B1 (en) 2009-11-20 2014-08-19 Amazon Technologies, Inc. Two-sided touch sensor
US20110134048A1 (en) * 2009-12-09 2011-06-09 Walline Erin K System for Interpretation of Gesture on a Non-All-Points-Addressable Multi-Touch Input Device Having Integrated Buttons
CN102117140A (en) * 2009-12-30 2011-07-06 联想(北京)有限公司 Touch processing method and mobile terminal
US9256304B2 (en) * 2010-05-28 2016-02-09 Lenovo (Singapore) Pte. Ltd. Systems and methods for automatic disable of input devices
TW201203017A (en) * 2010-07-08 2012-01-16 Acer Inc Input controlling method for a software keyboard and a device implementing the method
US8854316B2 (en) 2010-07-16 2014-10-07 Blackberry Limited Portable electronic device with a touch-sensitive display and navigation device and method
EP2407867B1 (en) * 2010-07-16 2015-12-23 BlackBerry Limited Portable electronic device with a touch-sensitive display and navigation device and method
KR101710657B1 (en) * 2010-08-05 2017-02-28 삼성디스플레이 주식회사 Display device and driving method thereof
KR20120015968A (en) * 2010-08-14 2012-02-22 삼성전자주식회사 Method and apparatus for preventing touch malfunction of a portable terminal
US8618428B2 (en) * 2010-12-14 2013-12-31 Synaptics Incorporated System and method for determining object information using an estimated rigid motion response
US9268390B2 (en) * 2010-12-14 2016-02-23 Microsoft Technology Licensing, Llc Human presence detection
US8660978B2 (en) * 2010-12-17 2014-02-25 Microsoft Corporation Detecting and responding to unintentional contact with a computing device
US8982045B2 (en) 2010-12-17 2015-03-17 Microsoft Corporation Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device
US8994646B2 (en) 2010-12-17 2015-03-31 Microsoft Corporation Detecting gestures involving intentional movement of a computing device
US9244545B2 (en) 2010-12-17 2016-01-26 Microsoft Technology Licensing, Llc Touch and stylus discrimination and rejection for contact sensitive computing devices
US8988398B2 (en) 2011-02-11 2015-03-24 Microsoft Corporation Multi-touch input device with orientation sensing
US9201520B2 (en) 2011-02-11 2015-12-01 Microsoft Technology Licensing, Llc Motion and context sharing for pen-based computing inputs
JP2012173950A (en) * 2011-02-21 2012-09-10 Denso Corp Continuous operation learning device and navigation device
US8982062B2 (en) * 2011-05-09 2015-03-17 Blackberry Limited Multi-modal user input device
US20120324403A1 (en) * 2011-06-15 2012-12-20 Van De Ven Adriaan Method of inferring navigational intent in gestural input systems
WO2012172543A1 (en) * 2011-06-15 2012-12-20 Bone Tone Communications (Israel) Ltd. System, device and method for detecting speech
JP5857465B2 (en) 2011-06-16 2016-02-10 ソニー株式会社 Information processing apparatus, information processing method, and program
KR101529262B1 (en) * 2011-07-01 2015-06-29 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Adaptive user interface
JP5667002B2 (en) * 2011-07-16 2015-02-12 レノボ・シンガポール・プライベート・リミテッド Computer input device and portable computer
WO2013012424A1 (en) * 2011-07-21 2013-01-24 Research In Motion Limited Electronic device including a touch-sensitive display and a navigation device and method of controlling the same
US9658715B2 (en) 2011-10-20 2017-05-23 Microsoft Technology Licensing, Llc Display mapping modes for multi-pointer indirect input devices
US9274642B2 (en) 2011-10-20 2016-03-01 Microsoft Technology Licensing, Llc Acceleration-based interaction for multi-pointer indirect input devices
US8933896B2 (en) * 2011-10-25 2015-01-13 Microsoft Corporation Pressure-based interaction for indirect touch input devices
US9367230B2 (en) 2011-11-08 2016-06-14 Microsoft Technology Licensing, Llc Interaction models for indirect interaction devices
KR101383840B1 (en) * 2011-11-17 2014-04-14 도시바삼성스토리지테크놀러지코리아 주식회사 Remote controller, system and method for controlling by using the remote controller
TWI447630B (en) * 2011-11-25 2014-08-01 Wistron Corp Processing method for touch signal and computing device thereof
US9389679B2 (en) 2011-11-30 2016-07-12 Microsoft Technology Licensing, Llc Application programming interface for a multi-pointer indirect touch input device
US8633911B2 (en) 2011-12-14 2014-01-21 Synaptics Incorporated Force sensing input device and method for determining force information
US8902181B2 (en) 2012-02-07 2014-12-02 Microsoft Corporation Multi-touch-movement gestures for tablet computing devices
US20130207913A1 (en) * 2012-02-09 2013-08-15 Sony Mobile Communications Inc. Touch panel device, portable terminal, position detecting method, and recording medium
KR101907463B1 (en) * 2012-02-24 2018-10-12 삼성전자주식회사 Composite touch screen and operating method thereof
JP5868727B2 (en) * 2012-03-02 2016-02-24 アルプス電気株式会社 Input device with movable touchpad
US9046958B2 (en) * 2012-03-15 2015-06-02 Nokia Technologies Oy Method, apparatus and computer program product for user input interpretation and input error mitigation
JP2013200797A (en) * 2012-03-26 2013-10-03 Brother Ind Ltd Input device
US9665214B2 (en) 2012-03-29 2017-05-30 Synaptics Incorporated System and methods for determining object information using selectively floated electrodes
US8970525B1 (en) 2012-06-27 2015-03-03 Google Inc. Method and system for trackpad input error mitigation
US9921692B2 (en) * 2012-08-03 2018-03-20 Synaptics Incorporated Hinged input device
US20140211396A1 (en) * 2013-01-29 2014-07-31 Kabushiki Kaisha Toshiba Electronic apparatus
US9785228B2 (en) 2013-02-11 2017-10-10 Microsoft Technology Licensing, Llc Detecting natural user-input engagement
EP2778862A1 (en) * 2013-03-13 2014-09-17 Delphi Technologies, Inc. Push-button switch with touch sensitive surface
JP2015011610A (en) * 2013-07-01 2015-01-19 アルプス電気株式会社 Button combination type touch panel input device
CN104423656B (en) * 2013-08-20 2018-08-17 南京中兴新软件有限责任公司 Mistaken touch recognition methods and device
JP6135413B2 (en) * 2013-09-09 2017-05-31 富士通株式会社 Electronic device and program
EP3049895A4 (en) 2013-09-24 2017-06-07 Hewlett-Packard Development Company, L.P. Determining a segmentation boundary based on images representing an object
US10324563B2 (en) 2013-09-24 2019-06-18 Hewlett-Packard Development Company, L.P. Identifying a target touch region of a touch-sensitive surface based on an image
US9619044B2 (en) * 2013-09-25 2017-04-11 Google Inc. Capacitive and resistive-pressure touch-sensitive touchpad
US9727094B1 (en) * 2013-09-29 2017-08-08 Apple Inc. Window button assembly
KR102115283B1 (en) * 2013-12-02 2020-05-26 엘지디스플레이 주식회사 Palm recognition method
KR101573608B1 (en) * 2014-04-04 2015-12-01 현대자동차주식회사 Sound wave touch pad
US9727161B2 (en) 2014-06-12 2017-08-08 Microsoft Technology Licensing, Llc Sensor correlation for pen and touch-sensitive computing device interaction
US9870083B2 (en) 2014-06-12 2018-01-16 Microsoft Technology Licensing, Llc Multi-device multi-user sensor correlation for pen and computing device interaction
JP2016115011A (en) 2014-12-11 2016-06-23 トヨタ自動車株式会社 Touch operation detection device
US9612703B2 (en) * 2014-12-31 2017-04-04 Synaptics Incorporated Top mount clickpad module
US10282000B2 (en) * 2015-02-26 2019-05-07 Dell Products L.P. Touchpad with multiple tactile switches
US9785275B2 (en) * 2015-03-30 2017-10-10 Wacom Co., Ltd. Contact discrimination using a tilt angle of a touch-sensitive surface
US9727151B2 (en) 2015-04-16 2017-08-08 Google Inc. Avoiding accidental cursor movement when contacting a surface of a trackpad
JP5947962B1 (en) * 2015-07-16 2016-07-06 レノボ・シンガポール・プライベート・リミテッド Input device and electronic device
US10261619B2 (en) 2015-08-31 2019-04-16 Synaptics Incorporated Estimating force applied by an input object to a touch sensor
CN106547381B (en) * 2015-09-22 2019-08-02 中移(杭州)信息技术有限公司 A kind of method and apparatus of mobile terminal false-touch prevention
DE102016208496A1 (en) 2016-05-18 2017-11-23 Heidelberger Druckmaschinen Ag Multitouch control
US20180081477A1 (en) * 2016-09-16 2018-03-22 Microsoft Technology Licensing, Llc Hinged touchpad
JP2018106434A (en) * 2016-12-27 2018-07-05 デクセリアルズ株式会社 User interface apparatus and electronic device
KR20190100761A (en) * 2018-02-21 2019-08-29 삼성전자주식회사 Electronic device comprisng display with switch

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000200141A (en) 1999-01-01 2000-07-18 Smk Corp Tablet with switch

Family Cites Families (236)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4346376B1 (en) 1980-04-16 1988-12-13
US4477797A (en) * 1980-12-12 1984-10-16 Citizen Watch Company Limited Data input device for electronic device
US4658690A (en) * 1983-05-10 1987-04-21 Synthaxe Limited Electronic musical instrument
JPS6175981A (en) 1984-09-21 1986-04-18 Nippon Tsushin Kensetsu Kk Recognizer of handwritten character
JPH028710Y2 (en) 1984-10-24 1990-03-01
US4731058A (en) * 1986-05-22 1988-03-15 Pharmacia Deltec, Inc. Drug delivery system
US4797514A (en) * 1986-06-09 1989-01-10 Elographics, Inc. Touch sensitive device with increased linearity
DE3809677C2 (en) 1987-03-19 1993-07-29 Kabushiki Kaisha Toshiba, Kawasaki, Kanagawa, Jp
US5053758A (en) * 1988-02-01 1991-10-01 Sperry Marine Inc. Touchscreen control panel with sliding touch control
JPH0773278B2 (en) 1989-01-09 1995-08-02 日本電気株式会社 Multiprocessor system
DE69016463T2 (en) * 1990-05-01 1995-09-07 Wang Laboratories Hand-free hardware keyboard.
US5119079A (en) 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
KR0133549B1 (en) 1990-12-18 1998-04-23 프랭크린 씨. 웨이스 Laptop computer with pomlest and keyboard and cursor
US6141000A (en) * 1991-10-21 2000-10-31 Smart Technologies Inc. Projection display system with touch sensing on screen, computer assisted alignment correction and network conferencing
US6597347B1 (en) * 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US5963671A (en) 1991-11-27 1999-10-05 International Business Machines Corporation Enhancement of soft keyboard operations using trigram prediction
JPH05257594A (en) 1992-01-14 1993-10-08 Sony Corp Input unit
US5483261A (en) 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US6222525B1 (en) * 1992-03-05 2001-04-24 Brad A. Armstrong Image controllers with sheet connected sensors
JP2628819B2 (en) * 1992-04-01 1997-07-09 河西工業株式会社 Unit panel for door trim
EP0574213B1 (en) 1992-06-08 1999-03-24 Synaptics, Inc. Object position detector
US5488204A (en) 1992-06-08 1996-01-30 Synaptics, Incorporated Paintbrush stylus for capacitive touch sensor pad
US5880411A (en) 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
KR940001227A (en) * 1992-06-15 1994-01-11 에프. 제이. 스미트 Touch screen devices
US5481278A (en) 1992-10-21 1996-01-02 Sharp Kabushiki Kaisha Information processing apparatus
JP2963589B2 (en) 1992-11-05 1999-10-18 シャープ株式会社 Gesture processing device and gesture processing method
JPH06289969A (en) 1993-04-06 1994-10-18 Hitachi Gazou Joho Syst:Kk Electronic equipment
US5563632A (en) * 1993-04-30 1996-10-08 Microtouch Systems, Inc. Method of and apparatus for the elimination of the effects of internal interference in force measurement systems, including touch - input computer and related displays employing touch force location measurement techniques
JP3400111B2 (en) 1993-06-30 2003-04-28 株式会社東芝 Input device for portable electronic device, input method for portable electronic device, and portable electronic device
US5764218A (en) 1995-01-31 1998-06-09 Apple Computer, Inc. Method and apparatus for contacting a touch-sensitive cursor-controlling input device to generate button values
US5591945A (en) * 1995-04-19 1997-01-07 Elo Touchsystems, Inc. Acoustic touch position sensor using higher order horizontally polarized shear wave propagation
JPH0944293A (en) 1995-07-28 1997-02-14 Sharp Corp Electronic equipment
US5996080A (en) 1995-10-04 1999-11-30 Norand Corporation Safe, virtual trigger for a portable data capture terminal
US5856822A (en) 1995-10-27 1999-01-05 02 Micro, Inc. Touch-pad digital computer pointing-device
US5767457A (en) * 1995-11-13 1998-06-16 Cirque Corporation Apparatus and method for audible feedback from input device
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5757361A (en) 1996-03-20 1998-05-26 International Business Machines Corporation Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary
AU2808697A (en) * 1996-04-24 1997-11-12 Logitech, Inc. Touch and pressure sensing method and apparatus
US5835079A (en) 1996-06-13 1998-11-10 International Business Machines Corporation Virtual pointing device for touchscreens
WO1998043202A1 (en) 1997-03-25 1998-10-01 Gateway 2000, Inc. Button wheel pointing device for notebook pcs
US6118435A (en) 1997-04-10 2000-09-12 Idec Izumi Corporation Display unit with touch panel
JPH10289061A (en) 1997-04-10 1998-10-27 Idec Izumi Corp Display device having touch panel
JPH10293644A (en) 1997-04-18 1998-11-04 Idec Izumi Corp Display device having touch panel
US5821922A (en) 1997-05-27 1998-10-13 Compaq Computer Corporation Computer having video controlled cursor system
US5864334A (en) 1997-06-27 1999-01-26 Compaq Computer Corporation Computer keyboard with switchable typing/cursor control modes
JP3856910B2 (en) * 1997-07-08 2006-12-13 富士通株式会社 Automatic transaction equipment
KR19990015738A (en) 1997-08-08 1999-03-05 윤종용 Handheld Computer with Touchpad Input Control
US5943052A (en) 1997-08-12 1999-08-24 Synaptics, Incorporated Method and apparatus for scroll bar control
US6310610B1 (en) * 1997-12-04 2001-10-30 Nortel Networks Limited Intelligent touch display
JPH11194883A (en) 1998-01-06 1999-07-21 Poseidon Technical Systems:Kk Touch operation type computer
AU759440B2 (en) * 1998-01-26 2003-04-17 Apple Inc. Method and apparatus for integrating manual input
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
JPH11327788A (en) 1998-05-20 1999-11-30 Kenwood Corp Touch panel device
US6369803B2 (en) 1998-06-12 2002-04-09 Nortel Networks Limited Active edge user interface
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
US6188391B1 (en) 1998-07-09 2001-02-13 Synaptics, Inc. Two-layer capacitive touchpad and method of making same
US6243080B1 (en) 1998-07-14 2001-06-05 Ericsson Inc. Touch-sensitive panel with selector
JP2000039964A (en) 1998-07-22 2000-02-08 Sharp Corp Handwriting inputting device
JP3267952B2 (en) 1998-09-21 2002-03-25 松下電器産業株式会社 Flat input device
US6154210A (en) 1998-11-25 2000-11-28 Flashpoint Technology, Inc. Method and system for implementing button interface compatibility in touch-screen equipped digital imaging device
JP4542637B2 (en) 1998-11-25 2010-09-15 セイコーエプソン株式会社 Portable information device and information storage medium
JP3758866B2 (en) 1998-12-01 2006-03-22 富士ゼロックス株式会社 Coordinate input device
US6560612B1 (en) 1998-12-16 2003-05-06 Sony Corporation Information processing apparatus, controlling method and program medium
US6246395B1 (en) 1998-12-17 2001-06-12 Hewlett-Packard Company Palm pressure rejection method and apparatus for touchscreens
JP2000194507A (en) 1998-12-25 2000-07-14 Tokai Rika Co Ltd Touch operation input device
US6452514B1 (en) 1999-01-26 2002-09-17 Harald Philipp Capacitive sensor and array
US6336614B1 (en) 1999-02-11 2002-01-08 Benjamin J. Kwitek Conformable portable computer hand pads
US6982695B1 (en) * 1999-04-22 2006-01-03 Palmsource, Inc. Method and apparatus for software control of viewing parameters
US6216988B1 (en) 1999-06-24 2001-04-17 International Business Machines Corporation Integrated wrist rest
US6501462B1 (en) 1999-07-01 2002-12-31 Gateway, Inc. Ergonomic touch pad
US6337678B1 (en) 1999-07-21 2002-01-08 Tactiva Incorporated Force feedback computer input and output device with coordinated haptic elements
US6459424B1 (en) 1999-08-10 2002-10-01 Hewlett-Packard Company Touch-sensitive input screen having regional sensitivity and resolution properties
US6492979B1 (en) * 1999-09-07 2002-12-10 Elo Touchsystems, Inc. Dual sensor touchscreen utilizing projective-capacitive and force touch sensors
US6504530B1 (en) * 1999-09-07 2003-01-07 Elo Touchsystems, Inc. Touch confirming touchscreen utilizing plural touch sensors
US6424338B1 (en) 1999-09-30 2002-07-23 Gateway, Inc. Speed zone touchpad
JP2001134382A (en) 1999-11-04 2001-05-18 Sony Corp Graphic processor
US6573844B1 (en) 2000-01-18 2003-06-03 Microsoft Corporation Predictive keyboard
JP4803883B2 (en) 2000-01-31 2011-10-26 キヤノン株式会社 Position information processing apparatus and method and program thereof.
US6765557B1 (en) 2000-04-10 2004-07-20 Interlink Electronics, Inc. Remote control having touch pad to screen mapping
US6756971B1 (en) 2000-05-19 2004-06-29 Steven E. Ramey Touch pad guard with optional wrist pad
US6611253B1 (en) 2000-09-19 2003-08-26 Harel Cohen Virtual input environment
US7190348B2 (en) 2000-12-26 2007-03-13 International Business Machines Corporation Method for touchscreen data input
JP2001265519A (en) * 2001-02-26 2001-09-28 Alps Electric Co Ltd Computer system
JP2002259050A (en) * 2001-02-28 2002-09-13 Denon Ltd Mouse with ten-key function
JP3988476B2 (en) * 2001-03-23 2007-10-10 セイコーエプソン株式会社 Coordinate input device and display device
JP2002287889A (en) 2001-03-23 2002-10-04 Sharp Corp Pen input device
JP4768143B2 (en) 2001-03-26 2011-09-07 株式会社リコー Information input / output device, information input / output control method, and program
JP4084582B2 (en) 2001-04-27 2008-04-30 俊司 加藤 Touch type key input device
US7088343B2 (en) 2001-04-30 2006-08-08 Lenovo (Singapore) Pte., Ltd. Edge touchpad input device
US20050024341A1 (en) 2001-05-16 2005-02-03 Synaptics, Inc. Touch screen with user interface enhancement
JP3800984B2 (en) 2001-05-21 2006-07-26 ソニー株式会社 User input device
US7068499B2 (en) 2001-06-25 2006-06-27 Chrono Data Llc. Modular computer user interface system
US6690365B2 (en) 2001-08-29 2004-02-10 Microsoft Corporation Automatic scrolling
JP2003173237A (en) 2001-09-28 2003-06-20 Ricoh Co Ltd Information input-output system, program and storage medium
US7046230B2 (en) 2001-10-22 2006-05-16 Apple Computer, Inc. Touch pad handheld device
TW528981B (en) * 2001-10-25 2003-04-21 Compal Electronics Inc Portable computer and related method for preventing input interruption by write-tracking input region
US7009599B2 (en) 2001-11-20 2006-03-07 Nokia Corporation Form factor for portable device
AUPR963001A0 (en) * 2001-12-19 2002-01-24 Canon Kabushiki Kaisha Selecting moving objects on a system
US6690387B2 (en) 2001-12-28 2004-02-10 Koninklijke Philips Electronics N.V. Touch-screen image scrolling system and method
JP2003208261A (en) 2002-01-16 2003-07-25 Toshiba Corp Electronic equipment and pointing means control method
GB2386707B (en) 2002-03-16 2005-11-23 Hewlett Packard Co Display and touch screen
US7038659B2 (en) 2002-04-06 2006-05-02 Janusz Wiktor Rajkowski Symbol encoding apparatus and method
US6943705B1 (en) 2002-05-03 2005-09-13 Synaptics, Inc. Method and apparatus for providing an integrated membrane switch and capacitive sensor
US7746325B2 (en) 2002-05-06 2010-06-29 3M Innovative Properties Company Method for improving positioned accuracy for a determined touch input
US6789049B2 (en) * 2002-05-14 2004-09-07 Sun Microsystems, Inc. Dynamically characterizing computer system performance by varying multiple input variables simultaneously
TWI313835B (en) * 2002-06-04 2009-08-21 Koninklijke Philips Electronics N.V. Method of measuring the movement of an object relative to a user's input device and related input device, mobile phone apparatus, cordless phone apparatus, laptop computer, mouse and remote control
US7023427B2 (en) * 2002-06-28 2006-04-04 Microsoft Corporation Method and system for detecting multiple touches on a touch-sensitive screen
US7102615B2 (en) * 2002-07-27 2006-09-05 Sony Computer Entertainment Inc. Man-machine interface using a deformable device
US7406666B2 (en) 2002-08-26 2008-07-29 Palm, Inc. User-interface features for computers with contact-sensitive displays
US6654001B1 (en) 2002-09-05 2003-11-25 Kye Systems Corp. Hand-movement-sensing input device
WO2004025449A2 (en) 2002-09-16 2004-03-25 Koninklijke Philips Electronics N.V. Method for inputting character and position information
JP2004127073A (en) * 2002-10-04 2004-04-22 Smk Corp Instruction input device
JP2004185258A (en) 2002-12-03 2004-07-02 Hitachi Ltd Information processor
JP3867664B2 (en) * 2002-12-12 2007-01-10 ソニー株式会社 Input device, portable information processing device, remote control device, and piezoelectric actuator drive control method in input device
JP4113440B2 (en) * 2003-02-13 2008-07-09 トヨタ自動車株式会社 Vehicle screen operation device
JP3846432B2 (en) * 2003-02-26 2006-11-15 ソニー株式会社 Display device, display method and program thereof
US7103852B2 (en) 2003-03-10 2006-09-05 International Business Machines Corporation Dynamic resizing of clickable areas of touch screen applications
JP2004295727A (en) * 2003-03-28 2004-10-21 Toshiba Corp Information processor and input control method
US7382360B2 (en) * 2003-04-15 2008-06-03 Synaptics Incorporated Methods and systems for changing the appearance of a position sensor with a light effect
US7884804B2 (en) * 2003-04-30 2011-02-08 Microsoft Corporation Keyboard with input-sensitive display device
US7148882B2 (en) 2003-05-16 2006-12-12 3M Innovative Properties Company Capacitor based force sensor
GB0312465D0 (en) 2003-05-30 2003-07-09 Therefore Ltd A data input method for a computing device
KR100510731B1 (en) 2003-05-31 2005-08-30 엘지.필립스 엘시디 주식회사 Method for Driving Touch Panel
WO2005008444A2 (en) 2003-07-14 2005-01-27 Matt Pallakoff System and method for a portable multimedia client
KR100522940B1 (en) 2003-07-25 2005-10-24 삼성전자주식회사 Touch screen system having active area setting function and control method thereof
US7499040B2 (en) 2003-08-18 2009-03-03 Apple Inc. Movable touch pad with added functionality
US9024884B2 (en) 2003-09-02 2015-05-05 Apple Inc. Touch-sensitive electronic apparatus for media applications, and methods therefor
JP4360871B2 (en) 2003-09-10 2009-11-11 富士通テン株式会社 Input device in information terminal
JP2005082806A (en) 2003-09-10 2005-03-31 Oriental Bio Kk Fucoidan originating from cladosiphon okamuranus and immunostimulator
US7176902B2 (en) 2003-10-10 2007-02-13 3M Innovative Properties Company Wake-on-touch for vibration sensing touch input devices
EP1691261A4 (en) * 2003-11-17 2011-07-06 Sony Corp Input device, information processing device, remote control device, and input device control method
US8164573B2 (en) 2003-11-26 2012-04-24 Immersion Corporation Systems and methods for adaptive interpretation of input from a touch-sensitive input device
US7982711B2 (en) * 2003-12-19 2011-07-19 Immersion Corporation Haptic profiling system and method
US7277087B2 (en) 2003-12-31 2007-10-02 3M Innovative Properties Company Touch sensing with touch down and lift off sensitivity
JP2005208991A (en) * 2004-01-23 2005-08-04 Canon Inc Position information output device and signal processing method
US7289111B2 (en) 2004-03-25 2007-10-30 International Business Machines Corporation Resistive touch pad with multiple regions of sensitivity
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
JP4903371B2 (en) 2004-07-29 2012-03-28 任天堂株式会社 Game device and game program using touch panel
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US7633076B2 (en) * 2005-09-30 2009-12-15 Apple Inc. Automated response to and sensing of user activity in portable devices
US7728821B2 (en) * 2004-08-06 2010-06-01 Touchtable, Inc. Touch detecting interactive display
US7692627B2 (en) 2004-08-10 2010-04-06 Microsoft Corporation Systems and methods using computer vision and capacitive sensing for cursor control
US7561146B1 (en) 2004-08-25 2009-07-14 Apple Inc. Method and apparatus to reject accidental contact on a touchpad
US7834855B2 (en) 2004-08-25 2010-11-16 Apple Inc. Wide touchpad on a portable computer
US7728823B2 (en) 2004-09-24 2010-06-01 Apple Inc. System and method for processing raw data of track pad device
US20060071915A1 (en) 2004-10-05 2006-04-06 Rehm Peter H Portable computer and method for taking notes with sketches and typed text
US7626569B2 (en) * 2004-10-25 2009-12-01 Graphics Properties Holdings, Inc. Movable audio/video communication interface system
US7847789B2 (en) 2004-11-23 2010-12-07 Microsoft Corporation Reducing accidental touch-sensitive device activation
US7944215B2 (en) * 2004-12-14 2011-05-17 Mark Anthony Howard Detector
US20060181517A1 (en) 2005-02-11 2006-08-17 Apple Computer, Inc. Display actuator
AU2006218381B8 (en) 2005-03-04 2012-02-16 Apple Inc. Multi-functional hand-held device
US7656393B2 (en) 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
JP4729560B2 (en) 2005-03-08 2011-07-20 日本写真印刷株式会社 Touch panel unit
US7186041B2 (en) 2005-04-08 2007-03-06 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Keyboard layout for mouse or rocker switch text entry
US7986307B2 (en) 2005-04-22 2011-07-26 Microsoft Corporation Mechanism for allowing applications to filter out or opt into tablet input
US20060256090A1 (en) * 2005-05-12 2006-11-16 Apple Computer, Inc. Mechanical overlay
US20060277466A1 (en) * 2005-05-13 2006-12-07 Anderson Thomas G Bimodal user interaction with a simulated object
WO2006130268A1 (en) * 2005-06-01 2006-12-07 Medtronic, Inc. Correlating a non-polysomnographic physiological parameter set with sleep states
US7279647B2 (en) * 2005-06-17 2007-10-09 Harald Philipp Control panel
US20070002192A1 (en) * 2005-06-29 2007-01-04 Casio Computer Co., Ltd. Liquid crystal display apparatus including touch panel
US7868874B2 (en) * 2005-11-15 2011-01-11 Synaptics Incorporated Methods and systems for detecting a position-based attribute of an object using digital codes
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US7958456B2 (en) 2005-12-23 2011-06-07 Apple Inc. Scrolling list with floating adjacent index symbols
US20070152983A1 (en) 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
CN101000529B (en) * 2006-01-13 2011-09-14 北京汇冠新技术股份有限公司 Device for detecting touch of infrared touch screen
WO2007089410A2 (en) * 2006-01-27 2007-08-09 Wms Gaming Inc. Handheld device for wagering games
US20070177804A1 (en) 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
US7770126B2 (en) * 2006-02-10 2010-08-03 Microsoft Corporation Assisting user interface element use
DE102006037156A1 (en) 2006-03-22 2007-09-27 Volkswagen Ag Interactive operating device and method for operating the interactive operating device
US8077153B2 (en) * 2006-04-19 2011-12-13 Microsoft Corporation Precise selection techniques for multi-touch screens
US7609178B2 (en) 2006-04-20 2009-10-27 Pressure Profile Systems, Inc. Reconfigurable tactile sensor input device
US7978181B2 (en) * 2006-04-25 2011-07-12 Apple Inc. Keystroke tactility arrangement on a smooth touch surface
JP4729433B2 (en) 2006-05-10 2011-07-20 アルプス電気株式会社 Input device
DE102006022610B4 (en) * 2006-05-15 2008-05-08 Siemens Ag Safety arrangement in or for a vehicle and motor vehicle
KR101327581B1 (en) 2006-05-24 2013-11-12 엘지전자 주식회사 Apparatus and Operating method of touch screen
US7880728B2 (en) 2006-06-29 2011-02-01 Microsoft Corporation Application switching via a touch screen interface
US20080040692A1 (en) 2006-06-29 2008-02-14 Microsoft Corporation Gesture input
WO2008007372A2 (en) 2006-07-12 2008-01-17 N-Trig Ltd. Hover and touch detection for a digitizer
US8686964B2 (en) 2006-07-13 2014-04-01 N-Trig Ltd. User specific recognition of intended user interaction with a digitizer
US7956849B2 (en) * 2006-09-06 2011-06-07 Apple Inc. Video manager for portable multifunction device
US7934156B2 (en) * 2006-09-06 2011-04-26 Apple Inc. Deletion gestures on a portable multifunction device
US7843427B2 (en) * 2006-09-06 2010-11-30 Apple Inc. Methods for determining a cursor position from a finger contact with a touch screen display
KR100843077B1 (en) 2006-09-28 2008-07-02 삼성전자주식회사 Apparatus and method for displaying grouped display information by variable size on touch screen
US8284165B2 (en) 2006-10-13 2012-10-09 Sony Corporation Information display apparatus with proximity detection performance and information display method using the same
JP2008140182A (en) 2006-12-01 2008-06-19 Sharp Corp Input device, transmission/reception system, input processing method and control program
US8902172B2 (en) 2006-12-07 2014-12-02 Cypress Semiconductor Corporation Preventing unintentional activation of a touch-sensor button caused by a presence of conductive liquid on the touch-sensor button
US7855718B2 (en) 2007-01-03 2010-12-21 Apple Inc. Multi-touch input discrimination
US7876310B2 (en) 2007-01-03 2011-01-25 Apple Inc. Far-field input identification
US8026903B2 (en) 2007-01-03 2011-09-27 Apple Inc. Double-sided touch sensitive panel and flex circuit bonding
US8125455B2 (en) 2007-01-03 2012-02-28 Apple Inc. Full scale calibration measurement for multi-touch surfaces
US8970501B2 (en) 2007-01-03 2015-03-03 Apple Inc. Proximity and multi-touch sensor detection and demodulation
US8130203B2 (en) 2007-01-03 2012-03-06 Apple Inc. Multi-touch input discrimination
US7956847B2 (en) 2007-01-05 2011-06-07 Apple Inc. Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
GB2446702A (en) * 2007-02-13 2008-08-20 Qrg Ltd Touch Control Panel with Pressure Sensor
US20080196945A1 (en) 2007-02-21 2008-08-21 Jason Konstas Preventing unintentional activation of a sensor element of a sensing device
KR100954594B1 (en) * 2007-02-23 2010-04-26 (주)티피다시아이 Virtual keyboard input system using pointing apparatus in digital device
EP1970799B1 (en) 2007-03-15 2017-08-16 LG Electronics Inc. Electronic device and method of controlling mode thereof and mobile communication terminal
JP4980105B2 (en) 2007-03-19 2012-07-18 シャープ株式会社 Coordinate input device and control method of coordinate input device
CN101286100A (en) * 2007-04-10 2008-10-15 鸿富锦精密工业(深圳)有限公司 Touch screen control apparatus and control method
US9423995B2 (en) 2007-05-23 2016-08-23 Google Technology Holdings LLC Method and apparatus for re-sizing an active area of a flexible display
US8681104B2 (en) 2007-06-13 2014-03-25 Apple Inc. Pinch-throw and translation gestures
GB2451267A (en) * 2007-07-26 2009-01-28 Harald Philipp Capacitive position sensor
WO2009025529A2 (en) * 2007-08-22 2009-02-26 Eui Jin Oh Piezo-electric sensing unit and data input device using piezo-electric sensing
US8125458B2 (en) 2007-09-28 2012-02-28 Microsoft Corporation Detecting finger orientation on a touch-sensitive device
EP2212764B1 (en) 2007-10-11 2017-06-14 Microsoft Technology Licensing, LLC Method for palm touch identification in multi-touch digitizing systems
US8421757B2 (en) * 2007-10-12 2013-04-16 Sony Corporation Touch sensor with a plurality of touch sensor sections
US8174508B2 (en) * 2007-11-19 2012-05-08 Microsoft Corporation Pointing and data entry input device
US8253698B2 (en) 2007-11-23 2012-08-28 Research In Motion Limited Tactile touch screen for electronic device
US8416198B2 (en) 2007-12-03 2013-04-09 Apple Inc. Multi-dimensional scroll wheel
JP4697551B2 (en) 2007-12-21 2011-06-08 ソニー株式会社 Communication device, input control method, and input control program
US9690474B2 (en) 2007-12-21 2017-06-27 Nokia Technologies Oy User interface, device and method for providing an improved text input
US20090172565A1 (en) * 2007-12-26 2009-07-02 John Clarke Jackson Systems, Devices, and Methods for Sharing Content
TWI360061B (en) 2007-12-31 2012-03-11 Htc Corp Electronic device and method for operating applications
US9720501B2 (en) 2008-01-04 2017-08-01 Tactus Technology, Inc. Dynamic tactile interface
US20090174679A1 (en) 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
US8232973B2 (en) 2008-01-09 2012-07-31 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US8766925B2 (en) * 2008-02-28 2014-07-01 New York University Method and apparatus for providing input to a processor, and a sensor pad
EP2104024B1 (en) 2008-03-20 2018-05-02 LG Electronics Inc. Portable terminal capable of sensing proximity touch and method for controlling screen using the same
EP2300899A4 (en) * 2008-05-14 2012-11-07 3M Innovative Properties Co Systems and methods for assessing locations of multiple touch inputs
TW201001258A (en) * 2008-06-23 2010-01-01 Flatfrog Lab Ab Determining the location of one or more objects on a touch surface
US20090322351A1 (en) * 2008-06-27 2009-12-31 Mcleod Scott C Adaptive Capacitive Sensing
US8698750B2 (en) * 2008-09-18 2014-04-15 Microsoft Corporation Integrated haptic control apparatus and touch sensitive display
US8421756B2 (en) 2008-09-23 2013-04-16 Sony Ericsson Mobile Communications Ab Two-thumb qwerty keyboard
US8385885B2 (en) * 2008-10-17 2013-02-26 Sony Ericsson Mobile Communications Ab Method of unlocking a mobile electronic device
US8294047B2 (en) 2008-12-08 2012-10-23 Apple Inc. Selective input signal rejection and modification
JP5484109B2 (en) 2009-02-09 2014-05-07 三菱電機株式会社 Electro-optic device
US20100201615A1 (en) * 2009-02-12 2010-08-12 David John Tupman Touch and Bump Input Control
US8924893B2 (en) 2009-10-14 2014-12-30 At&T Mobility Ii Llc Locking and unlocking of an electronic device using a sloped lock track
US9116583B1 (en) 2011-04-13 2015-08-25 Google Inc. Dampening thumb movement on a touch-sensitive input device
US8411060B1 (en) 2012-01-13 2013-04-02 Google Inc. Swipe gesture classification
US8847903B2 (en) 2012-04-26 2014-09-30 Motorola Mobility Llc Unlocking an electronic device
US9645729B2 (en) 2012-10-18 2017-05-09 Texas Instruments Incorporated Precise object selection in touch sensing systems
US8654095B1 (en) 2013-03-20 2014-02-18 Lg Electronics Inc. Foldable display device providing adaptive touch sensitive area and method for controlling the same
US8896561B1 (en) 2013-03-27 2014-11-25 Keysight Technologies, Inc. Method for making precise gestures with touch devices
US9285913B1 (en) 2014-12-31 2016-03-15 Lg Display Co., Ltd. Display device and driving method thereof

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000200141A (en) 1999-01-01 2000-07-18 Smk Corp Tablet with switch

Also Published As

Publication number Publication date
US9632608B2 (en) 2017-04-25
JP2016029601A (en) 2016-03-03
US20130229376A1 (en) 2013-09-05
JP5993785B2 (en) 2016-09-14
JP2018032443A (en) 2018-03-01
US8445793B2 (en) 2013-05-21
JP5259474B2 (en) 2013-08-07
US8294047B2 (en) 2012-10-23
US20170220165A1 (en) 2017-08-03
JP2010134895A (en) 2010-06-17
US20120019468A1 (en) 2012-01-26
US8970533B2 (en) 2015-03-03
US20150153865A1 (en) 2015-06-04
JP2013157028A (en) 2013-08-15
JP6293109B2 (en) 2018-03-14
US10452174B2 (en) 2019-10-22
KR20100066283A (en) 2010-06-17
US20100139990A1 (en) 2010-06-10

Similar Documents

Publication Title
US9870137B2 (en) Speed/positional mode translations
JP6293108B2 (en) Multi-touch device with dynamic haptic effect
US20190033996A1 (en) Touch pad for handheld device
US10114494B2 (en) Information processing apparatus, information processing method, and program
US9703435B2 (en) Touchpad combined with a display and having proximity and touch sensing capabilities to enable different functions or interfaces to be displayed
US9612674B2 (en) Movable track pad with added functionality
AU2016203222B2 (en) Touch-sensitive button with two levels
US20180129402A1 (en) Omnidirectional gesture detection
JP6429981B2 (en) Classification of user input intent
US10437360B2 (en) Method and apparatus for moving contents in terminal
US20200192490A1 (en) Touch sensitive mechanical keyboard
US8866780B2 (en) Multi-dimensional scroll wheel
US9886116B2 (en) Gesture and touch input detection through force sensing
US8686959B2 (en) Touch screen multi-control emulator
TWI608407B (en) Touch device and control method thereof
JP5295328B2 (en) User interface device capable of input by screen pad, input processing method and program
US8330061B2 (en) Compact input device
KR101661786B1 (en) Detecting touch on a curved surface
US8451236B2 (en) Touch-sensitive display screen with absolute and relative input modes
KR100954594B1 (en) Virtual keyboard input system using pointing apparatus in digital device
US20130155018A1 (en) Device and method for emulating a touch screen using force information
US8466934B2 (en) Touchscreen interface
KR100932037B1 (en) Computation system with input device and input device
EP1774429B1 (en) Gestures for touch sensitive input devices
US7884807B2 (en) Proximity sensor and method for indicating a display orientation change

Legal Events

Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment (payment date: 2014-11-26; year of fee payment: 4)
FPAY Annual fee payment (payment date: 2015-11-18; year of fee payment: 5)
FPAY Annual fee payment (payment date: 2016-11-23; year of fee payment: 6)
FPAY Annual fee payment (payment date: 2017-11-17; year of fee payment: 7)
FPAY Annual fee payment (payment date: 2018-11-15; year of fee payment: 8)