WO2005057328A2 - Touch pad for handheld device - Google Patents

Touch pad for handheld device

Info

Publication number
WO2005057328A2
WO2005057328A2 (PCT/US2004/027102, US2004027102W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch pad
native
recited
values
pad assembly
Prior art date
Application number
PCT/US2004/027102
Other languages
French (fr)
Other versions
WO2005057328A3 (en)
Inventor
Greg Marriott
Guy Bar-Nahum
Steven Bollinger
Original Assignee
Apple Computer Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Computer Inc. filed Critical Apple Computer Inc.
Priority to EP04781727A priority Critical patent/EP1687684A4/en
Priority to DE202004021283U priority patent/DE202004021283U1/en
Priority to EP10011080.8A priority patent/EP2284658B1/en
Publication of WO2005057328A2 publication Critical patent/WO2005057328A2/en
Publication of WO2005057328A3 publication Critical patent/WO2005057328A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates generally to a media player having a touch pad. More particularly, the present invention relates to improved touch pads.
  • buttons, switches, keyboards, mice, trackballs, touch pads, joy sticks, touch screens and the like are examples of input devices.
  • the input devices are generally selected from buttons and switches. Buttons and switches are generally mechanical in nature and provide limited control with regards to the movement of a cursor (or other selector) and making selections.
  • the input devices tend to utilize touch-sensitive display screens.
  • touch screen a user makes a selection on the display screen by pointing directly to objects on the screen using a stylus or finger.
  • the input devices are commonly touch pads.
  • the movement of an input pointer corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad.
  • Touch pads can also make a selection on the display screen when one or more taps are detected on the surface of the touch pad. In some cases, any portion of the touch pad may be tapped, and in other cases a dedicated portion of the touch pad may be tapped.
  • the input devices are generally selected from mice and trackballs. With a mouse, the movement of the input pointer corresponds to the relative movements of the mouse as the user moves the mouse along a surface.
  • mice and trackballs generally include one or more buttons for making selections on the display screen.
  • the input devices may also allow a user to scroll across the display screen in the horizontal or vertical directions.
  • mice may include a scroll wheel that allows a user to simply roll the scroll wheel forward or backward to perform a scroll action.
  • touch pads may provide dedicated active areas that implement scrolling when the user passes his or her finger linearly across the active area in the x and y directions.
  • Both devices may also implement scrolling via horizontal and vertical scroll bars as part of the GUI.
  • scrolling is implemented by positioning the input pointer over the desired scroll bar, selecting the desired scroll bar, and moving the scroll bar by moving the mouse or finger in the y direction (forwards and backwards) for vertical scrolling or in the x direction (left and right) for horizontal scrolling.
  • a Cartesian coordinate system is used to monitor the position of the finger, mouse and ball, respectively, as they are moved.
  • the Cartesian coordinate system is generally defined as a two dimensional coordinate system (x, y) in which the coordinates of a point (e.g., position of finger, mouse or ball) are its distances from two intersecting, often perpendicular straight lines, the distance from each being measured along a straight line parallel to the other.
  • x, y positions of the mouse, ball and finger may be monitored.
  • the x, y positions are then used to correspondingly locate and move the input pointer on the display screen.
  • touch pads generally include one or more sensors for detecting the proximity of the finger thereto.
  • the sensors are generally dispersed about the touch pad with each sensor representing an x, y position.
  • the sensors are arranged in a grid of columns and rows. Distinct x and y position signals, which control the x, y movement of a pointer device on the display screen, are thus generated when a finger is moved across the grid of sensors within the touch pad.
  • the discussion below focuses on capacitive sensing technologies. It should be noted, however, that the other technologies have similar features.
  • Capacitive sensing touch pads generally contain several layers of material.
  • the touch pad may include a protective shield, one or more electrode layers and a circuit board.
  • the protective shield typically covers the electrode layer(s), and the electrode layer(s) is generally disposed on a front side of the circuit board.
  • the protective shield is the part of the touch pad that is touched by the user to implement cursor movements on a display screen.
  • the electrode layer(s), on the other hand, is used to interpret the x, y position of the user's finger when the user's finger is resting or moving on the protective shield.
  • the electrode layer(s) typically consists of a plurality of electrodes that are positioned in columns and rows so as to form a grid array. The columns and rows are generally based on the Cartesian coordinate system and thus the rows and columns correspond to the x and y directions.
  • the touch pad may also include sensing electronics for detecting signals associated with the electrodes.
  • the sensing electronics may be adapted to detect the change in capacitance at each of the electrodes as the finger passes over the grid.
  • the sensing electronics are generally located on the backside of the circuit board.
  • the sensing electronics may include an application specific integrated circuit (ASIC) that is configured to measure the amount of capacitance in each of the electrodes and to compute the position of finger movement based on the capacitance in each of the electrodes.
  • the ASIC may also be configured to report this information to the computing device.
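The sensing electronics described above measure per-electrode capacitance and compute the finger position from those measurements, but the passage does not spell out the computation. A minimal sketch in C, assuming a hypothetical row/column electrode grid and a simple capacitance-weighted centroid (the grid size, array name and function are illustrative, not taken from the patent):

```c
#include <stdio.h>

#define ROWS 16
#define COLS 16

/* Hypothetical per-electrode capacitance deltas (baseline already subtracted).
 * In a real controller these would come from the sensing ASIC. */
static unsigned int cap[ROWS][COLS];

/* Estimate the finger's x, y position as the capacitance-weighted centroid
 * of the electrode grid. Returns 0 if no electrode is active. */
static int estimate_position(double *x, double *y)
{
    unsigned long total = 0;
    double sx = 0.0, sy = 0.0;

    for (int r = 0; r < ROWS; r++) {
        for (int c = 0; c < COLS; c++) {
            total += cap[r][c];
            sx += (double)c * cap[r][c];
            sy += (double)r * cap[r][c];
        }
    }
    if (total == 0)
        return 0;                 /* no touch detected */

    *x = sx / (double)total;      /* column index, fractional */
    *y = sy / (double)total;      /* row index, fractional */
    return 1;
}

int main(void)
{
    /* Fake a touch centered near row 5, column 9. */
    cap[5][9] = 80; cap[5][8] = 40; cap[4][9] = 30; cap[6][9] = 20;

    double x, y;
    if (estimate_position(&x, &y))
        printf("finger at x=%.2f, y=%.2f\n", x, y);
    return 0;
}
```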
  • the touch pad 2 is generally a small rectangular area that includes a protective shield 4 and a plurality of electrodes 6 disposed underneath the protective shield layer 4. For ease of discussion, a portion of the protective shield layer 4 has been removed to show the electrodes 6.
  • Each of the electrodes 6 represents a different x, y position.
  • the circuit board/sensing electronics measures capacitance and produces an x, y input signal 10 corresponding to the active electrodes 6.
  • the x, y input signal 10 is sent to a host device 12 having a display screen 14.
  • the x, y input signal 10 is used to control the movement of a cursor 16 on the display screen 14. As shown, the input pointer moves in a similar x, y direction as the detected x, y finger motion.
  • the invention relates, in one embodiment, to a touch pad assembly.
  • the touch pad assembly includes a touch pad having one or more sensors that map the touch pad plane into native sensor coordinates.
  • the touch pad assembly also includes a controller that divides the surface of the touch pad into logical device units that represent areas of the touch pad that can be actuated by a user, receives the native values of the native sensor coordinates from the sensors, adjusts the native values of the native sensor coordinates into a new value associated with the logical device units and reports the new value of the logical device units to a host device.
  • the invention relates, in another embodiment, to a method for a touch pad. The method includes mapping the touch pad into native sensor coordinates.
  • the method also includes producing native values of the native sensor coordinates when events occur on the touch pad.
  • the method further includes filtering the native values of the native sensor coordinates based on the type of events that occur on the touch pad.
  • the method additionally includes generating a control signal based on the native values of the native sensor coordinates when a desired event occurs on the touch pad.
  • the invention relates, in another embodiment, to a signal processing method.
  • the method includes receiving a current user location.
  • the method also includes determining the difference in user location by comparing the current user location to a last user location.
  • the method further includes only outputting the current user location when the difference in user location is larger than a threshold value.
  • the method additionally includes converting the outputted current user location into a logical device unit.
  • the method includes generating a message for a host device.
  • the message includes the more logical user location, which is used by the host device to move a control object in a specified manner.
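The signal processing method recited above (receive a location, compare it to the last one, report only when the change exceeds a threshold, convert the reported location to a logical device unit, and send a message) can be sketched roughly as follows. The resolutions and threshold are borrowed from the embodiments described later in the text; the function names and overall structure are assumptions, not the claimed implementation:

```c
#include <stdio.h>
#include <stdlib.h>

#define NATIVE_RES   1024   /* native sensor coordinate resolution (assumed) */
#define LOGICAL_RES   128   /* logical device unit resolution (assumed)      */
#define THRESHOLD       3   /* minimum native-coordinate change to report    */

static int last_native = -1;       /* last reported native location */

/* Convert a native sensor coordinate into a logical device unit. */
static int to_logical(int native)
{
    return native * LOGICAL_RES / NATIVE_RES;
}

/* Process one native location sample; returns 1 and fills *logical
 * when a message should be sent to the host, 0 otherwise. */
static int process_location(int native, int *logical)
{
    if (last_native >= 0 && abs(native - last_native) <= THRESHOLD)
        return 0;                  /* noise event: finger is only toggling */

    last_native = native;          /* actual event: update last location */
    *logical = to_logical(native);
    return 1;                      /* caller sends a message to the host */
}

int main(void)
{
    int samples[] = { 100, 101, 102, 140, 141, 600 };
    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++) {
        int unit;
        if (process_location(samples[i], &unit))
            printf("report logical device unit %d\n", unit);
    }
    return 0;
}
```

With these figures, finger jitter of three native counts or less never reaches the host, while a deliberate move of one logical unit (eight native counts) always does.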
  • the invention relates, in another embodiment, to a message from a touch pad assembly to a host device in a computer system that facilitates bi-directional communications between the touch pad assembly and the host device.
  • the message includes an event field identifying whether the message is a touch pad event or a button event.
  • the message also includes an event identifier field identifying at least one event parameter, each event parameter having an event value, the event value for a touch pad event parameter indicating an absolute position, the event value for a button event parameter indicating button status.
  • the invention relates, in another embodiment, to a touch pad assembly capable of transforming a user action into motion onto a display screen, the touch pad system including a touch pad having a plurality of independent and spatially distinct button zones each of which represents a different movement direction on the display screen so as to enable joystick implementations, multiple dimensional menu selection or photo image panning.
  • Fig. 1 is a simplified diagram of a touch pad and display.
  • Fig. 2 is a diagram of a computing system, in accordance with one embodiment of the present invention.
  • FIG. 3 is a flow diagram of signal processing, in accordance with one embodiment of the invention.
  • Fig. 4 is a flow diagram of touch pad processing, in accordance with one embodiment of the invention.
  • Fig. 5 is a flow diagram of a touch pad processing, in accordance with one embodiment of the invention.
  • Fig. 6 is a diagram of a communication protocol, in accordance with one embodiment of the present invention.
  • Fig. 7 is a diagram of a message format, in accordance with one embodiment of the present invention.
  • Fig. 8 is a perspective view of a media player, in accordance with one embodiment of the invention.
  • Fig. 9 is a front view of a media player, in accordance with one embodiment of the present invention.
  • FIG. 10 is a front view of a media player, in accordance with one embodiment of the present invention.
  • Figs. 11A-11D are top views of a media player in use, in accordance with one embodiment of the present invention.
  • Fig. 12 is a partially broken away perspective view of an annular capacitive touch pad, in accordance with one embodiment of the present invention.
  • Fig. 13 is a top view of a sensor arrangement of a touch pad, in accordance with another embodiment of the present invention.
  • Fig. 14 is a top view of a sensor arrangement of a touch pad, in accordance with another embodiment of the present invention.
  • Fig. 15 is a top view of a sensor arrangement of a touch pad, in accordance with another embodiment of the present invention.
  • FIG. 2 is a diagram of a computing system 20, in accordance with one embodiment of the present invention.
  • the computing system 20 includes at least a user interface 22 and a host device 24.
  • the user interface 22 is configured to provide control information for performing actions in the host device 24.
  • the actions may include making selections, opening a file or document, executing instructions, starting a program, viewing a menu, and/or the like.
  • the actions may also include moving an object such as a pointer or cursor on a display screen of the host device 24.
  • the user interface 22 may be integrated with the host device 24 (within the same housing) or it may be a separate component (different housing).
  • the user interface 22 includes one or more touch buttons 34, a touch pad 36 and a controller 38.
  • the touch buttons 34 generate button data when a user places their finger over the touch button 34.
  • the touch pad 36, on the other hand, generates position data when a user places their finger (or object) over the touch pad 36.
  • the controller 38 is configured to acquire the button data from the touch buttons 34 and the position data from the touch pad 36.
  • the controller is also configured to output control data associated with the button data and/or position data to the host device 24. In one embodiment, the controller 38 only outputs control data associated with the touch buttons when the button status has changed. In another embodiment, the controller 38 only outputs control data associated with the touch pad when the position data has changed.
  • the control data, which may include the raw data (button, position) or some form thereof, may be used to implement a control function in the host device 24.
  • the control data may be used to move an object on the display 30 of the host device 24 or to make a selection or issue a command in the host device 24.
  • the touch buttons 34 and touch pad 36 generally include one or more sensors capable of producing the button and position data.
  • the sensors of the touch buttons 34 and touch pad 36 may be distinct elements or they may be grouped together as part of a sensor arrangement, i.e., divided into sensors for the touch buttons 34 and sensors for the touch pad 36.
  • the sensors of the touch buttons 34 are configured to produce signals associated with button status (activated, not activated). For example, the button status may indicate button activation when an object is positioned over the touch button and button deactivation at other times (or vice versa).
  • the sensors of the touch pad 36 are configured to produce signals associated with the absolute position of an object on or near the touch pad 36. In most cases, the sensors of the touch pad 36 map the touch pad plane into native or physical sensor coordinates 40.
  • the native sensor coordinates 40 may be based on Cartesian coordinates or Polar coordinates (as shown). When Cartesian, the native sensor coordinates 40 typically correspond to x and y coordinates. When Polar (as shown), the native sensor coordinates typically correspond to radial and angular coordinates (r, θ).
  • the sensors may be based on resistive sensing, surface acoustic wave sensing, pressure sensing (e.g., strain gauge), optical sensing, capacitive sensing and the like.
  • the user interface 22 includes a sensor arrangement based on capacitive sensing.
  • the user interface 22 is therefore arranged to detect changes in capacitance as a finger moves, taps, or rests on the touch buttons 34 and touch pad 36.
  • the capacitive touch assembly is formed from various layers including at least a set of labels, a set of electrodes (sensors) and a printed circuit board (PCB).
  • the electrodes are positioned on the PCB, and the labels are positioned over the electrodes.
  • the labels serve to protect the electrodes and provide a surface for receiving a finger thereon.
  • the label layer also provides an insulating surface between the finger and the electrodes.
  • the controller 38 can determine button status at each of the touch buttons 34 and position of the finger on the touch pad 36 by detecting changes in capacitance. In most cases, the controller 38 is positioned on the opposite side of the PCB.
  • the controller 38 may correspond to an application specific integrated circuit (ASIC), and it may operate under the control of Firmware stored on the ASIC.
  • the controller 38 is configured to monitor the sensors of the touch buttons 34 and touch pad 36 and decide what information to report to the host device 24.
  • the decision may include filtering and/or conversion processes.
  • the filtering process may be implemented to reduce a busy data stream so that the host device 24 is not overloaded with redundant or non-essential data.
  • a busy data stream may be created when multiple signals are produced at native sensor coordinates 40 that are in close proximity to one another.
  • processing a busy data stream tends to require a lot of power, and therefore it can have a disastrous effect on portable devices such as media players that use a battery with a limited power supply.
  • the filtering process throws out redundant signals so that they do not reach the host device 24.
  • the controller 38 is configured to only output a control signal when a significant change in sensor signals is detected.
  • a significant change is one such as the user deliberately moving his or her finger to a new position, as opposed to the finger simply resting on a spot and moving ever so slightly because of finger balance (toggling back and forth).
  • the filter process may be implemented through Firmware as part of the application specific integrated circuit.
  • the conversion process is implemented to adjust the raw data into other form factors before sending or reporting them to the host device 24. That is, the controller 38 may convert the raw data into other types of data. The other types of data may have similar or different units as the raw data. In the case of the touch pad 36, the controller 38 may convert the position data into other types of position data. For example, the controller 38 may convert absolute position data to relative position data. As should be appreciated, absolute position refers to the position of the finger on the touch pad measured absolutely with respect to a coordinate system while relative position refers to a change in position of the finger relative to the finger's previous position.
  • the controller 38 may also convert multiple absolute coordinates into a single absolute coordinate, Polar coordinates into Cartesian coordinates, and/or Cartesian coordinates into Polar coordinates.
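As a concrete illustration of the conversions just mentioned, the sketch below shows absolute-to-relative conversion and Polar/Cartesian conversion. The function names and data layout are assumptions for illustration only:

```c
#include <math.h>
#include <stdio.h>

/* Convert polar coordinates (r, theta in radians) to Cartesian (x, y). */
static void polar_to_cartesian(double r, double theta, double *x, double *y)
{
    *x = r * cos(theta);
    *y = r * sin(theta);
}

/* Convert Cartesian coordinates (x, y) back to polar (r, theta in radians). */
static void cartesian_to_polar(double x, double y, double *r, double *theta)
{
    *r = sqrt(x * x + y * y);
    *theta = atan2(y, x);
}

/* Absolute-to-relative conversion: report the change from the previous sample. */
static double absolute_to_relative(double current, double last)
{
    return current - last;
}

int main(void)
{
    const double pi = 3.14159265358979323846;
    double x, y, r, theta;

    polar_to_cartesian(1.0, pi / 2.0, &x, &y);   /* r = 1, theta = 90 degrees */
    printf("polar (1, pi/2) -> cartesian (%.2f, %.2f)\n", x, y);

    cartesian_to_polar(x, y, &r, &theta);
    printf("back to polar (%.2f, %.2f rad)\n", r, theta);

    printf("relative motion: %.2f\n", absolute_to_relative(42.0, 40.0));
    return 0;
}
```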
  • the controller 38 may also convert the position data into button data. For example, the controller may generate button control signals when an object is tapped on a predetermined portion of the touch pad or other control signals when an object is moved in a predetermined manner over the touch pad (e.g., gesturing).
  • the conversion may also include placing the control signal in a format that the host device 24 can understand.
  • the controller 38 may follow a predetermined communication protocol. As is generally well known, communication protocols are a set of rules and procedures for exchanging data between two devices such as the user interface 22 and the host device 24.
  • Communication protocols typically transmit information in data blocks or packets that contain the data to be transmitted, the data required to guide the packet to its destination, and the data that corrects errors that occur along the way.
  • the controller may support a variety of communication protocols for communicating with the host device, including but not limited to, PS/2, Serial, ADB and the like. In one particular implementation, a Serial protocol is used.
  • the conversion process may include grouping at least a portion of the native coordinates 40 together to form one or more virtual actuation zones 42.
  • the controller 38 may separate the surface of the touch pad 36 into virtual actuation zones 42A-D and convert the native values of the native sensor coordinates 40 into a new value associated with the virtual actuation zones 42A-D.
  • the new value may have similar or different units as the native value.
  • the new value is typically stored at the controller 38 and subsequently passed to the host device 24.
  • the controller 38 outputs a control signal associated with a particular virtual actuation zone 42 when most of the signals are from native sensor coordinates 40 located within the particular virtual actuation zone 42.
  • the virtual actuation zones 42 generally represent a more logical range of values than the native sensor coordinates 40 themselves, i.e., the virtual actuation zones 42 represent areas of touch pad 36 that can be better actuated by a user (magnitudes larger).
  • the ratio of native sensor coordinates 40 to virtual actuation zones 42 may be between about 1024:1 and about 1:1, and more particularly about 8:1.
  • the touch pad may include 128 virtual actuation areas based on 1024 native sensor coordinates.
  • the virtual actuation zones 42 may be widely varied. For example, they may represent absolute positions on the touch pad 36 that are magnitudes larger than the native sensor coordinates 40. For example, the touch pad 36 can be broken up into larger slices than would otherwise be attainable using the native sensor coordinates 40. In one implementation, the virtual actuation zones 42 are distributed on the touch pad 36 within a range of 0 to 95 angular positions. The angular position is zero at the 12 o'clock position and progresses clockwise to 95 as it comes around to 12 o'clock again.
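One plausible way to implement the reduction described above, from native angular coordinates to a smaller range of virtual actuation zones that start at 12 o'clock and increase clockwise, is sketched below; the 1024-count native resolution and the 96-zone output (positions 0 to 95) are taken from the passage, everything else is an assumption:

```c
#include <stdio.h>

#define NATIVE_ANGULAR_RES 1024   /* native angular positions per revolution (assumed) */
#define ZONE_COUNT           96   /* virtual actuation zones 0..95, per the passage     */

/* Map a native angular coordinate (0..1023, 0 at 12 o'clock, increasing
 * clockwise) onto a virtual actuation zone (0..95). */
static int native_to_zone(int native_angle)
{
    native_angle %= NATIVE_ANGULAR_RES;        /* wrap around a full turn */
    if (native_angle < 0)
        native_angle += NATIVE_ANGULAR_RES;
    return native_angle * ZONE_COUNT / NATIVE_ANGULAR_RES;
}

int main(void)
{
    printf("12 o'clock -> zone %d\n", native_to_zone(0));
    printf("3 o'clock  -> zone %d\n", native_to_zone(NATIVE_ANGULAR_RES / 4));
    printf("6 o'clock  -> zone %d\n", native_to_zone(NATIVE_ANGULAR_RES / 2));
    return 0;
}
```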
  • the virtual actuation zones 42 may also represent areas of the touch pad that can be actuated by a user to implement specific control functions such as button or movement functions.
  • the virtual actuation zones 42 may correspond to button zones that act like touch buttons.
  • each of the virtual actuation zones 42 may correspond to different movement directions such that they act like arrow keys.
  • virtual actuation zone 42A may represent an upward movement
  • virtual actuation zone 42B may represent a downward movement
  • virtual actuation zone 42C may represent a left movement
  • virtual actuation zone 42D may represent right movement.
  • this type of touch pad configuration may enable game stick implementations, two dimensional menu selection, photo image panning and the like.
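A minimal sketch of the four-direction arrangement just described, assuming the controller has already resolved which zone was actuated; the enum and function names are illustrative:

```c
#include <stdio.h>

/* Four spatially distinct actuation zones, one per movement direction,
 * as in zones 42A-42D above. */
enum zone { ZONE_UP, ZONE_DOWN, ZONE_LEFT, ZONE_RIGHT };

/* Translate an actuated zone into an on-screen movement step. */
static void zone_to_movement(enum zone z, int *dx, int *dy)
{
    *dx = 0;
    *dy = 0;
    switch (z) {
    case ZONE_UP:    *dy = -1; break;   /* upward movement   */
    case ZONE_DOWN:  *dy = +1; break;   /* downward movement */
    case ZONE_LEFT:  *dx = -1; break;   /* left movement     */
    case ZONE_RIGHT: *dx = +1; break;   /* right movement    */
    }
}

int main(void)
{
    int dx, dy;
    zone_to_movement(ZONE_LEFT, &dx, &dy);
    printf("move control object by (%d, %d)\n", dx, dy);
    return 0;
}
```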
  • the controller 38 may also include a storage element.
  • the storage element may store a touch pad program for controlling different aspects of the user interface 22.
  • the touch pad program may contain virtual actuation zone profiles that describe how the virtual actuation zones are distributed around the touch pad relative to the native sensor coordinates and what type of value to output based on the native values of the native sensor coordinates selected and the virtual actuation zone corresponding to the selected native sensor coordinates.
  • the controller 38 receives the position data from the touch pad 36.
  • the controller 38 passes the data through a filtering process.
  • the filtering process generally includes determining if the data is based on noise events or actual events.
  • Noise events are associated with insignificant events such as when a user's finger is simply resting on a spot and moving ever so slightly because of finger balance.
  • Actual events are associated with significant events such as when a user decides to move his/her finger to a new position on the touch pad.
  • the noise events are filtered out and the actual events are passed through the controller 38.
  • the controller 38 determines if the position data should be adjusted. If not, the position data is reported to the host device 24. If so, the position data is converted into other form factors including but not limited to other position data or button data. For example, the native values of the sensor coordinates are converted into a new value associated with a selected virtual actuation zone. After the conversion, the controller 38 reports the converted data to the host device 24.
  • the controller 38 may pass the new value to a main system processor that executes the main application program running on the host device 24.
  • the host device 24 generally includes a control circuit 26.
  • the control circuit 26 is configured to execute instructions and carry out operations associated with the host device 24.
  • the control circuit 26 may control the reception and manipulation of input and output data between the components of the computing system 20.
  • the host device 24 may also include a hold switch 28 for activating or deactivating communications between the host device 24 and the user interface 22.
  • the host device may additionally include a display 30 configured to produce visual information such as text and graphics on a display screen 32 via display commands from the control circuit 26.
  • the visual information may be in the form of a graphical user interface (GUI).
  • the host device may additionally include one or more speakers or jacks that connect to headphones/speakers.
  • the control circuit may be widely varied.
  • the control circuit may include one or more processors 27 that together with an operating system operate to execute computer code and produce and use data.
  • the processor 27 can be a single-chip processor or can be implemented with multiple components.
  • the computer code and data may reside within data storage that is operatively coupled to the processor.
  • Data storage generally provides a place to hold data that is being used by the computer system 20.
  • the data storage may include Read-Only Memory (ROM), Random- Access Memory (RAM), hard disk drive and/or the like.
  • the control circuit may also include an input/output controller that is operatively coupled to the processor.
  • the input/output controller generally operates by exchanging data between the host device 24 and the I/O devices that desire to communicate with the host device 24 (e.g., touch pad assembly 22).
  • the control circuit also typically includes a display controller that is operatively coupled to the processor.
  • the display controller is configured to process display commands to produce text and graphics on the display screen 32 of the host device 24.
  • the input/output controller and display controller may be integrated with the processor or they may be separate components.
  • the control circuit 26 may be configured to perform some of the same functions as the controller 38. For example, the control circuit 26 may perform conversion processes on the data received from the controller 38. The conversion may be performed on raw data or on already converted data.
  • Fig. 3 is a flow diagram of signal processing 50, in accordance with one embodiment of the invention.
  • the signal processing 50 may be performed by the computing system shown in Fig. 2.
  • Signal processing 50 generally begins at block 52 where a user input is produced at the user interface 22.
  • the user input is typically based on signals generated by the sensor arrangement of the touch buttons and touchpad.
  • the user input may include raw data.
  • the user input may also include filtered or converted data.
  • the processing proceeds to block 54 where the user input is reported to the control circuit of the host device.
  • the user input may contain both button and position data or it may only contain button data or position data.
  • the user input is typically reported when a change is made and more particularly when a desired change is made at the user interface (filtered). For example, button data may be reported when the button status has changed and position data may be reported when the position of a finger has changed.
  • Fig. 4 is a flow diagram of touch pad processing 60, in accordance with one embodiment of the invention.
  • Touch pad processing 60 generally begins at block 62 where at least one control object is displayed on a graphical user interface.
  • the control object may be a cursor, slider bar, image or the like.
  • the GUI may be displayed on the display 30 of the host device 24.
  • the GUI is typically under the control of the processor of the host device 24.
  • an angular or radial referenced input is received.
  • the angular or radial referenced input may be produced by the user interface 22 and received by the processor of the host device 24.
  • the angular or radial referenced input may be raw data formed by the sensor arrangement or converted data formed at the controller. Furthermore, the raw or converted data may be filtered so as to reduce a busy data stream.
  • touch pad processing proceeds to block 66 where the control object is modified based on the angular or radial referenced input. For example, the direction that a control object such as a football player in a football game is moving may be changed from a first direction to a second direction or a highlight bar may be moved through multiple images in a photo library.
  • the modification is typically implemented by the processor of the host device.
  • Fig. 5 is a flow diagram of a touch pad processing 70, in accordance with one embodiment of the invention.
  • touch pad processing may be performed by the controller shown in Fig. 2.
  • it may be associated with blocks 52/54 and 62 shown in Figs. 3 and 4.
  • Touch pad processing 70 generally begins at block 72 where a current user location is received.
  • the current user location corresponds to the current location of the user's finger on the touch pad.
  • the controller may detect the changes in sensor levels at each of the native sensor coordinates and thereafter determine the current location of the user's finger on the touch pad based on the change in sensor levels at each of the native sensor coordinates.
  • the process flow proceeds to block 74 where a determination is made as to whether the current user location is within a threshold from the last user location, i.e., the user location that precedes the current user location. In some cases, the current user location is compared to the last user location to determine the difference in user location, i.e., how much movement occurred between the current and last readings. If the current user location is within the threshold then an undesired change has been made and the process flow proceeds back to block 72. If the current location is outside the threshold then a desired change has been made and the process flow proceeds to block 76.
  • the threshold may be defined as the number of sensor levels that need to change in order to report a change in the user finger location to the main system processor of the host device. In one particular implementation, the threshold is equal to about 3.
  • the threshold may be determined by the following equation:
  • Threshold (T) = C × (native sensor coordinate resolution / logical device unit resolution), where the native sensor coordinate resolution defines the maximum number of different positions that the sensors are able to detect for a specific plane coordinate system, the logical device unit resolution defines the number of values that are communicated to the main system processor of the host device for that specific plane coordinate system, and the coefficient C defines the width of the border area between the clusters of native sensor coordinates that define one logical device unit.
  • the coefficient C is generally determined by the sensitivity needed to initiate a user event to the main system processor of the host device. It customizes the threshold value to the physical limitations of the sensor technology and the expected noise of the user finger events. Larger values tend to filter more events and reduce sensitivity.
  • the system designer may pick the exact value of C by testing several values to strike optimal balance between sensitivity and stability of the user finger location.
  • the coefficient C is typically a value between 0 and 0.5, and more particularly about 0.25.
  • the threshold (T) is about 2 when the native sensor coordinate resolution is about 1024, the logical device unit resolution is about 128 and the coefficient is about 0.25.
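Plugging the example figures from this passage into the stated formula gives T = 0.25 × (1024 / 128) = 2, matching the value quoted above. The same arithmetic as a tiny sketch:

```c
#include <stdio.h>

/* Threshold T = C * (native resolution / logical resolution), per the
 * passage above; the example values are the ones given there. */
static double threshold(double c, double native_res, double logical_res)
{
    return c * (native_res / logical_res);
}

int main(void)
{
    printf("T = %.1f\n", threshold(0.25, 1024.0, 128.0));   /* prints T = 2.0 */
    return 0;
}
```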
  • a new value associated with a particular logical device unit is generated based on the changed native sensor coordinates associated with the particular logical device unit.
  • the raw number of slices in the form of native sensor coordinates are grouped into a more logical number of slices in the form of logical device units (e.g., virtual actuation zones).
  • the process flow proceeds to block 78 where the last user location is updated. That is, the last user location is set to the current user location, which then acts as the last user location for subsequent processing.
  • the process flow proceeds to block 80 where a message is sent.
  • the message is sent when the difference between the current and last user location is larger than the threshold value.
  • the message generally includes the new value associated with the selected logical device unit.
  • the touch pad may send a message to the main system processor of the host device. When received by the main system processor, the message may be used to make an adjustment in the host device, i.e., cause a control object to move in a specified manner.
  • Fig. 6 is a diagram of a communication protocol 82, in accordance with one embodiment of the present invention.
  • the communication protocol may be used by the user interface and host device of Fig. 2.
  • the user interface 22 has one dedicated input ACTIVE line that is controlled by the control circuit 26.
  • the state of the ACTIVE line signal may be set at LOW or HIGH.
  • the hold switch 28 may be used to change the state of the ACTIVE line signal (for example when the hold switch is in a first position or second position).
  • when the ACTIVE signal is set to HIGH, the user interface 22 sends a synch message to the control circuit 26 that describes the button and touch pad status (e.g., button state and touch pad position).
  • new synch messages are only sent when the button state and/or the touch pad status changes, for example, when the touch pad position has changed within a desired limit.
  • when the ACTIVE signal is set to LOW, the user interface 22 does not send a synch message to the control circuit 26.
  • when the ACTIVE signal is toggled from LOW to HIGH, the user interface 22 sends a button state and touch pad position message. This may be used on startup to initialize the state.
  • when the ACTIVE signal is toggled from HIGH to LOW, the user interface 22 does not send a synch message to the control circuit 26.
  • the user interface 22 is configured to send a two data byte message if both the button state and touch pad position have changed since the last message was sent, and a one data byte message if only the button state or the touch pad position changes.
  • Fig. 7 is a diagram of a message format 86, in accordance with one embodiment of the present invention.
  • the message format 86 may correspond to the synch message described in Fig. 6.
  • the message format 86 may form a two data byte message or a one data byte message. Each data byte is configured as an 8 bit message.
  • the upper Most Significant Bit (MSB) of the message is the event type (1 bit) and the lower Least Significant Bits (LSB) are the event value (7 bits).
  • the event value is event type specific. In Fig. 7, the event type bit is marked as E0, and the event value is marked as D0-D6.
  • the event type may indicate a touch pad position change or a button state change, with the button value indicating whether the button is being touched or not being touched.
  • the event values may correspond to different button events such as seeking forwards (D4), seeking backwards (D3), playing and pausing (D2), providing a menu (D1) and making selections (D0).
  • the event values may also correspond to touch pad events such as touch pad position (D5). For example, in a touch pad that defines the logical coordinates in polar coordinates from 0-127, the event value may correspond to an absolute touch pad position in the range of 0-127 angular positions where zero is 12 o'clock, 32 is 3 o'clock, 64 is 6 o'clock and 96 is 9 o'clock, etc., going clockwise.
  • the event values may also correspond to a reserve (D6).
  • the reserve is an unused bit that may be used to extend the API.
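The byte layout described above (the event type in the most significant bit, a 7-bit event value in the lower bits, and one- or two-byte synch messages) could be packed and unpacked roughly as follows. Which bit value denotes a touch pad event versus a button event is not stated, so the constants below are assumptions:

```c
#include <stdint.h>
#include <stdio.h>

/* One data byte: MSB = event type (E0), lower 7 bits = event value (D0-D6). */
#define EVENT_TYPE_BUTTON    0u   /* assumed encoding of the event-type bit */
#define EVENT_TYPE_TOUCHPAD  1u

static uint8_t pack_event(unsigned type, unsigned value)
{
    return (uint8_t)(((type & 0x1u) << 7) | (value & 0x7Fu));
}

static void unpack_event(uint8_t byte, unsigned *type, unsigned *value)
{
    *type  = (byte >> 7) & 0x1u;
    *value = byte & 0x7Fu;
}

int main(void)
{
    /* Two-byte synch message: a touch pad position change (absolute
     * position 32, i.e. 3 o'clock) plus a button state change. */
    uint8_t msg[2];
    msg[0] = pack_event(EVENT_TYPE_TOUCHPAD, 32);
    msg[1] = pack_event(EVENT_TYPE_BUTTON, 0x04);   /* e.g. a play/pause bit */

    for (int i = 0; i < 2; i++) {
        unsigned type, value;
        unpack_event(msg[i], &type, &value);
        printf("byte %d: type=%u value=0x%02X\n", i, type, value);
    }
    return 0;
}
```

Decoding on the host side simply reverses the shift and mask, so a one-byte and a two-byte message can share the same parser.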
  • Fig. 8 is a perspective diagram of a media player 100, in accordance with one embodiment of the present invention.
  • the media player 100 may generally correspond to the host device shown in Fig. 2.
  • the term "media player” generally refers to computing devices that are dedicated to processing media such as audio, video or other images, as for example, music players, game players, video players, video recorders, cameras, and the like.
  • in some cases the media players contain single functionality (e.g., a media player dedicated to playing music) and in other cases the media players contain multiple functionality (e.g., a media player that plays music, displays video, stores pictures and the like).
  • these devices are generally portable so as to allow a user to listen to music, play games or video, record video or take pictures wherever the user travels.
  • the media player 100 is a handheld device that is sized for placement into a pocket of the user.
  • the user does not have to directly carry the device and therefore the device can be taken almost anywhere the user travels (e.g., the user is not limited by carrying a large, bulky and often heavy device, as in a laptop or notebook computer).
  • a user may use the device while working out at the gym.
  • a user may use the device while mountain climbing.
  • the user can use the device while traveling in a car.
  • because the device may be operated by the user's hands, no reference surface such as a desktop is needed (this is shown in greater detail in the figures).
  • the media player 100 is a pocket sized hand held MP3 music player that allows a user to store a large collection of music (e.g., in some cases up to 4,000 CD-quality songs).
  • the MP3 music player may correspond to the iPod MP3 player manufactured by Apple Computer of Cupertino, CA.
  • the MP3 music player shown herein may also include additional functionality such as storing a calendar and phone lists, storing and playing games, storing photos and the like. In fact, in some cases, it may act as a highly transportable storage device.
  • the media player 100 includes a housing 102 that encloses internally various electrical components (including integrated circuit chips and other circuitry) to provide computing operations for the media player 100.
  • the housing may also define the shape or form of the media player. That is, the contour of the housing 102 may embody the outward physical appearance of the media player 100.
  • the integrated circuit chips and other circuitry contained within the housing may include a microprocessor (e.g., CPU), memory (e.g., ROM, RAM), a power supply (e.g., battery), a circuit board, a hard drive, other memory (e.g., flash) and/or various input/output (I/O) support circuitry.
  • the electrical components may also include components for inputting or outputting music or sound such as a microphone, amplifier and a digital signal processor (DSP).
  • the electrical components may also include components for capturing images such as image sensors (e.g., charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS)) or optics (e.g., lenses, splitters, filters).
  • the media player 100 includes a hard drive thereby giving the media player 100 massive storage capacity.
  • a 20GB hard drive can store up to 4000 songs or about 266 hours of music.
  • flash-based media players on average store up to 128MB, or about two hours, of music.
  • the hard drive capacity may be widely varied (e.g., 5, 10, 20 GB, etc.).
  • the media player 100 shown herein also includes a battery such as a rechargeable lithium polymer battery. These types of batteries are capable of offering about 10 hours of continuous playtime to the media player 100.
  • the media player 100 also includes a display screen 104 and related circuitry.
  • the display screen 104 is used to display a graphical user interface as well as other information to the user (e.g., text, objects, graphics).
  • the display screen 104 may be a liquid crystal display (LCD).
  • the display screen 104 corresponds to a 160-by-128-pixel high-resolution display, with a white LED backlight to give clear visibility in daylight as well as low-light conditions. As shown, the display screen 104 is visible to a user of the media player 100 through an opening 105 in the housing 102.
  • the media player 100 also includes a touch pad 110.
  • the touch pad is an intuitive interface that provides easy one-handed operation, i.e., lets a user interact with the media player 100 with one or more fingers.
  • the touch pad 110 is configured to provide one or more control functions for controlling various applications associated with the media player 100.
  • the touch initiated control function may be used to move an object on the display screen 104 or to make selections or issue commands associated with operating the media player 100.
  • the touch pad 110 may be arranged to receive input from a finger moving across the surface of the touch pad 110, from a finger holding a particular position on the touch pad and/or by a finger tapping on a particular position of the touch pad.
  • the touch pad 110 generally consists of a touchable outer surface 111 for receiving a finger for manipulation on the touch pad 110.
  • Beneath the touchable outer surface 111 is a sensor arrangement 112.
  • the sensor arrangement 112 includes one or more sensors that are configured to activate as the finger sits on, taps on or passes over them.
  • the sensor arrangement 112 may be based on a Cartesian coordinate system, a Polar coordinate system or some other coordinate system. In the simplest case, an electrical signal is produced each time the finger is positioned over a sensing coordinate of the sensor arrangement 112.
  • the number of signals in a given time frame may indicate location, direction, speed and acceleration of the finger on the touch pad, i.e., the more signals, the more the user moved his or her finger.
  • the signals are monitored by a control assembly that converts the number, combination and frequency of the signals into location, direction, speed and acceleration information and reports this information to the main system processor of the media player. This information may then be used by the media player 100 to perform the desired control function on the display screen 104.
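The control assembly described above turns a stream of sensor signals into location, direction and speed information, but the computation is not given. One plausible sketch, assuming logical angular positions 0 to 127 sampled at a fixed interval and taking wrap-around at the 12 o'clock position into account:

```c
#include <stdio.h>

#define POSITIONS 128   /* logical angular positions 0..127 (assumed) */

/* Signed angular difference between two positions, taking the shorter way
 * around the ring. Positive = clockwise, negative = counterclockwise. */
static int angular_delta(int from, int to)
{
    int d = (to - from) % POSITIONS;
    if (d >= POSITIONS / 2)
        d -= POSITIONS;
    else if (d < -POSITIONS / 2)
        d += POSITIONS;
    return d;
}

int main(void)
{
    /* Successive finger positions sampled at a fixed interval. */
    int samples[] = { 120, 126, 4, 10 };   /* crosses the 0/127 boundary */
    for (int i = 1; i < 4; i++) {
        int d = angular_delta(samples[i - 1], samples[i]);
        printf("step %d: moved %+d positions (%s)\n",
               i, d, d >= 0 ? "clockwise" : "counterclockwise");
    }
    return 0;
}
```

The per-step delta divided by the sample interval gives speed; its sign gives direction.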
  • the surface of the touch pad 110 is divided into several independent and spatially distinct actuation zones 113A-D disposed around the periphery of the touch pad 110.
  • the actuation zones generally represent a more logical range of user inputs than the sensors themselves.
  • the touch pad 110 outputs a control signal associated with a particular actuation zone 113 when most of the signals are from sensing coordinates located within the particular actuation zone 113.
  • a position signal is generated at one or more sensing coordinates.
  • the position signals generated by the one or more sensing coordinates may be used to inform the media player 100 that the object is at a specific zone 113 on the touch pad 110.
  • the actuation zones may be button zones or positional zones.
  • in the case of button zones, a button control signal is generated when an object is placed over the button zone.
  • the button control signal may be used to make selections, open a file, execute instructions, start a program, view a menu in the media player.
  • in the case of positional zones, a position control signal is generated when an object is placed over the positional zone.
  • the position signals may be used to control the movement of an object on a display screen of the media player.
  • the distribution of actuation zones may be controlled by touch pad translation software or firmware that converts physical or native coordinates into virtual representation in the form of actuation zones.
  • the touch pad translation software may be run by the control assembly of the touch pad or the main system processor of the media player.
  • the position control signals may be associated with a Cartesian coordinate system (x and y) or a Polar coordinate system (r, θ). Furthermore, the position signals may be provided in an absolute or relative mode. In absolute mode, the absolute coordinates of the point where the touch pad is being touched are used, for example x, y in the case of the Cartesian coordinate system or (r, θ) in the case of the Polar coordinate system. In relative mode, the change in position of the finger relative to the finger's previous position is used.
  • the touch pad may be configured to operate in a Cartesian-absolute mode, a Cartesian-relative mode, a Polar-absolute mode or a Polar-relative mode.
  • the mode may be controlled by the touch pad itself or by other components of the media player system.
  • a user may select which mode that they would like to operate in the media player system or the applications running on the media player system may automatically set the mode of the media player system.
  • a game application may inform the media player system to operate in an absolute mode so that the touch pad can be operated as a joystick or a list application may inform the media player system to operate in a relative mode so that the touch pad can be operated as a scroll bar.
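A small sketch of how an application might select among the modes just described; the enum values and the set_touchpad_mode hook are illustrative assumptions, not an API defined by the patent:

```c
#include <stdio.h>

/* The four operating modes named in the passage above. */
enum touchpad_mode {
    MODE_CARTESIAN_ABSOLUTE,
    MODE_CARTESIAN_RELATIVE,
    MODE_POLAR_ABSOLUTE,
    MODE_POLAR_RELATIVE
};

static enum touchpad_mode current_mode = MODE_POLAR_RELATIVE;

/* Hypothetical hook an application could call to change how the touch
 * pad reports positions (e.g., joystick-style vs. scroll-bar-style). */
static void set_touchpad_mode(enum touchpad_mode m)
{
    current_mode = m;
    printf("touch pad mode set to %d\n", (int)m);
}

int main(void)
{
    set_touchpad_mode(MODE_POLAR_ABSOLUTE);   /* a game treats the pad as a joystick   */
    set_touchpad_mode(MODE_POLAR_RELATIVE);   /* a list view treats it as a scroll bar */
    return 0;
}
```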
  • each of the zones 113 represents a different polar angle that specifies the angular position of the zone 113 in the plane of the touch pad 110.
  • the zones 113 may be positioned at 90 degree increments all the way around the touch pad 110 or something smaller as for example 2 degree increments all the way around the touch pad 110.
  • the touch pad 110 may convert 1024 physical positions in the form of sensor coordinates, to a more logical range of 0 to 127 in the form of positional zones.
  • the touch pad internal accuracy (1024 positions) is much larger than the accuracy (128 positions) needed for making movements on the display screen.
  • the position of the touch pad 110 relative to the housing 102 may be widely varied.
  • the touch pad 110 may be placed at any external surface (e.g., top, side, front, or back) of the housing 102 that is accessible to a user during manipulation of the media player 100. In most cases, the touch sensitive surface 111 of the touch pad 110 is completely exposed to the user. In the illustrated embodiment, the touch pad 110 is located in a lower, front area of the housing 102. Furthermore, the touch pad 110 may be recessed below, level with, or extend above the surface of the housing 102. In the illustrated embodiment, the touch sensitive surface 111 of the touch pad 110 is substantially flush with the external surface of the housing 102.
  • The shape of the touch pad 110 may also be widely varied. For example, the touch pad 110 may be circular, rectangular, triangular, and the like.
  • the outer perimeter of the shaped touch pad defines the working boundary of the touch pad.
  • the touch pad 110 is circular. This particular shape works well with Polar coordinates. More particularly, the touch pad is annular, i.e., shaped like or forming a ring. When annular, the inner and outer perimeter of the shaped touch pad defines the working boundary of the touch pad.
  • the media player 100 may also include one or more buttons 114. The buttons 114 are configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating the media player 100.
  • the button functions may be associated with opening a menu, playing a song, fast forwarding a song, seeking through a menu and the like.
  • the buttons 114 may be mechanical clicking buttons and/or they may be touch buttons.
  • the buttons are touch buttons that receive input from a finger positioned over the touch button.
  • the touch buttons 114 generally consist of a touchable outer surface for receiving a finger and a sensor arrangement disposed below the touchable outer surface.
  • the touch buttons and touch pad may generally correspond to the touch buttons and touch pad shown in Fig. 2.
  • the position of the touch buttons 114 relative to the touch pad 110 may be widely varied.
  • buttons 114 are placed above the touch pad 110 in a linear manner as well as in the center of the annular touch pad 110.
  • the plurality of buttons 114 may consist of a menu button, play/stop button, forward seek button, a reverse seek button, and the like.
  • the media player 100 may also include a hold switch 115.
  • the hold switch 115 is configured to activate or deactivate the touch pad and/or buttons. This is generally done to prevent unwanted commands by the touch pad and/or buttons, as for example, when the media player is stored inside a user's pocket. When deactivated, signals from the buttons and/or touch pad are not sent or are disregarded by the media player. When activated, signals from the buttons and/or touch pad are sent and therefore received and processed by the media player.
  • the media player 100 may also include one or more headphone jacks 116 and one or more data ports 118.
  • the headphone jack 116 is capable of receiving a headphone connector associated with headphones configured for listening to sound being outputted by the media device 100.
  • the data port 118 is capable of receiving a data connector/cable assembly configured for transmitting and receiving data to and from a host device such as a general purpose computer (e.g., desktop computer, portable computer).
  • the data port 118 may be used to upload or download audio, video and other images to and from the media device 100.
  • the data port may be used to download songs and play lists, audio books, ebooks, photos, and the like into the storage mechanism of the media player.
  • the data port 118 may be widely varied.
  • the data port may be a PS/2 port, a serial port, a parallel port, a USB port, a Firewire port and/or the like.
  • the data port 118 may be a radio frequency (RF) link or optical infrared (IR) link to eliminate the need for a cable.
  • the media player 100 may also include a power port that receives a power connector/cable assembly configured for delivering power to the media player 100.
  • the data port 118 may serve as both a data and power port.
  • the data port 118 is a Firewire port having both data and power capabilities.
  • the data port may include multiple data functionality, i.e., integrating the functionality of multiple data ports into a single data port.
  • the position of the hold switch, headphone jack and data port on the housing may be widely varied. That is, they are not limited to the positions shown in Fig. 2. They may be positioned almost anywhere on the housing (e.g., front, back, sides, top, bottom). For example, the data port may be positioned on the bottom surface of the housing rather than the top surface as shown. [0069] Referring to Fig. 9, the touch pad 110 will be described in greater detail.
  • the touch pad is operating in an absolute mode. That is, the touch pad reports the absolute coordinates of where it is being touched.
  • the touch pad 110 includes one or more zones 124.
  • the zones 124 represent regions of the touch pad 110 that may be actuated by a user to implement one or more actions or movements on the display screen 104.
  • the distribution of the zones 124 may be widely varied.
  • the zones 124 may be positioned almost anywhere on the touch pad 110.
  • the position of the zones 124 may depend on the coordinate system of the touch pad 110.
  • the zones 124 may have one or more radial and/or angular positions.
  • the zones 124 are positioned in multiple angular positions of the Polar coordinate system.
  • the zones 124 may be formed from almost any shape whether simple (e.g., squares, circles, ovals, triangles, rectangles, polygons, and the like) or complex (e.g., random shapes).
  • the multiple button zones 124 may have identical shapes or they may have different shapes.
  • the size of the zones 124 may vary according to the specific needs of each device. In some cases, the size of the zones 124 corresponds to a size that allows them to be easily manipulated by a user (e.g., the size of a finger tip or larger). In other cases, the size of the zones 124 is kept small so as to improve the resolution of the touch pad 110. Moreover, any number of zones 124 may be used. In the illustrated embodiment, four zones 124A-D are shown. It should be noted, however, that this is not a limitation and that the number varies according to the specific needs of each touch pad. For example, Fig. 10 shows the media player 100 with 16 button zones 124A-P.
  • the number of zones 124 generally depends on the number of sensor coordinates located within the touch pad 110 and the desired resolution of the touch pad 110.
  • the sensors are configured to sense user actions on the zones 124 and to send signals corresponding to the user action to the electronic system.
  • the sensors may be capacitance sensors that sense capacitance when a finger is in close proximity.
  • the arrangement of the sensors typically varies according to the specific needs of each device.
  • the touch pad 110 includes 1024 sensor coordinates that work together to form 128 zones.
  • the zones 124 when actuated are used to produce on screen movements 126.
  • the control signal for the on screen movements may be initiated by the touch pad electronics or by the main system processor of the media player.
  • each zone 124 may be configured to represent a particular movement on the display screen 104.
  • each of the zones 124 represents a particular direction of movement.
  • the directions may be widely varied, however, in the illustrated embodiment, the directions generally correspond to angular directions (e.g., similar to the arrow keys on the keyboard).
  • the touch pad 110 is divided into several independent and spatially distinct zones 124A-D, each of which corresponds to a particular movement direction 126A-D (as shown by arrows), respectively.
  • When zone 124A is actuated, on screen movements 126A (to the right) are implemented.
  • When zone 124B is actuated, on screen movements 126B (upwards) are implemented.
  • When zone 124C is actuated, on screen movements 126C (to the left) are implemented.
  • When zone 124D is actuated, on screen movements 126D (downwards) are implemented.
  • these embodiments are well suited for joystick implementations, two dimensional menu selection, photo image panning and the like.
  • Figs. 11A-11D show the media player 100 of Fig. 8 being used by a user 130, in accordance with one embodiment of the invention.
  • the media player 100 is being addressed for one handed operation in which the media player 100 is held in the user's hand 136 while the buttons and touch pad 110 are manipulated by the thumb 138 of the same hand 136.
  • the palm 140 and rightmost fingers 141 (or leftmost fingers if left handed) of the hand 136 are used to grip the sides of the media player 100 while the thumb 138 is used to actuate the touch pad 110.
  • the entire top surface of the touch pad 110 is accessible to the user's thumb 138.
  • the media device may be comfortably held by one hand while being comfortably addressed by the other hand.
  • This configuration generally allows the user to easily actuate the touch pad with one or more fingers.
  • the thumb and rightmost fingers (or leftmost fingers if left handed) of the first hand are used to grip the sides of the media player while a finger of the opposite hand is used to actuate the touch pad.
  • the entire top surface of the touch pad is accessible to the user's finger.
  • Fig. 12 is a partially broken away perspective view of an annular capacitive touch pad 150, in accordance with one embodiment of the present invention.
  • the annular capacitive touch pad 150 is arranged to detect changes in capacitance as the user moves, taps, or rests an object such as a finger on the touch pad 150.
  • the annular capacitive touch pad 150 is formed from various layers including at least a label layer 152, an electrode layer 154 and a circuit board 156.
  • the label layer 152 is disposed over the electrode layer 154 and the electrode layer 154 is disposed over the circuit board 156.
  • At least the label 152 and electrode layer 154 are annular such that they are defined by concentric circles, i.e., they have an inner perimeter and an outer perimeter.
  • the circuit board 156 is generally a circular piece having an outer perimeter that coincides with the outer perimeter of the label 152 and electrode layer 154. It should be noted, however, that in some cases the circuit board 156 may be annular or the label 152 and electrode layer 154 may be circular. [0077]
  • the label layer 152 serves to protect the underlayers and to provide a surface for allowing a finger to slide thereon. The surface is generally smooth so that the finger does not stick to it when moved.
  • the label layer 152 also provides an insulating layer between the finger and the electrode layer 154.
  • the electrode layer 154 includes a plurality of spatially distinct electrodes 158 that have positions based on the polar coordinate system.
  • the electrodes 158 are positioned angularly and/or radially on the circuit board 156 such that each of the electrodes 158 defines a distinct angular and/or radial position thereon. Any suitable number of electrodes 158 may be used. In most cases, it would be desirable to increase the number of electrodes 158 so as to provide higher resolution, i.e., more information can be used for things such as acceleration.
  • the electrode layer 154 is broken up into a plurality of angularly sliced electrodes 158. The angularly sliced electrodes 158 may be grouped together to form one or more distinct button zones 159. In one implementation, the electrode layer 154 includes about 1024 angularly sliced electrodes that work together to form 128 angularly sliced button zones 159.
  • the touch pad 150 provides a touch sensitive surface that works according to the principles of capacitance.
  • the first electrically conductive member is one or more of the electrodes 158 and the second electrically conductive member is the finger of the user. Accordingly, as the finger approaches the touch pad 150, a tiny capacitance forms between the finger and the electrodes 158 in close proximity to the finger. The capacitance in each of the electrodes 158 is measured by control circuitry 160 located on the backside of the circuit board 156.
  • the control circuitry 160 can determine the angular and/or radial location, direction, speed and acceleration of the finger as it is moved across the touch pad 150.
  • the control circuitry 160 can also report this information in a form that can be used by a computing device such as a media player.
  • the control circuitry may include an ASIC (application specific integrated circuit).
  • the touch pad 178 may be divided into several independent and spatially distinct button zones 180 that are positioned radially from the center 182 of the touch pad 178 to the perimeter 184 of the touch pad 178. Any number of radial zones may be used. In one embodiment, each of the radial zones 180 represents a radial position in the plane of the touch pad 178. By way of example, the zones 180 may be spaced at 5 mm increments. Like above, each of the button zones 180 has one or more electrodes 186 disposed therein for detecting the presence of an object such as a finger. In the illustrated embodiment, a plurality of radial electrodes 186 are combined to form each of the button zones 180.
  • the touch pad 188 may be divided into several independent and spatially distinct button zones 190 that are positioned both angularly and radially about the periphery of the touch pad 188 and from the center of the touch pad 188 to the perimeter of the touch pad 188. Any number of combination zones may be used.
  • each of the combination button zones 190 represents both an angular and radial position in the plane of the touch pad 188.
  • the zones may be positioned at both 2 degrees and 5 mm increments.
  • each of the combination zones 190 has one or more electrodes 192 disposed therein for detecting the presence of an object such as a finger.
  • a plurality of angular/radial electrodes 192 are combined to form each of the button zones 190.
  • the touch pad 200 may include angular and radial electrodes 202 that are broken up such that consecutive zones do not coincide exactly.
  • the touch pad 200 has an annular shape and the electrodes 202 follow a spiral path around the touch pad 200 from the center to the outer perimeter of the touch pad 200.
  • the electrodes 202 may be grouped together to form one or more distinct button zones 204.
  • although the touch pads herein are all shown as circular, they may take on other forms such as other curvilinear shapes (e.g., oval, annular and the like), rectilinear shapes (e.g., hexagon, pentagon, octagon, rectangle, square, and the like) or a combination of curvilinear and rectilinear (e.g., dome).
  • the various aspects of the inventions described above can be used alone or in various combinations.
  • the invention is preferably implemented by a combination of hardware and software, but can also be implemented in hardware or software.
  • the invention can also be embodied as computer readable code on a computer readable medium.
  • the computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, optical data storage devices, and carrier waves. The computer readable medium can also be distributed over a network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
  • the touch pad assembly may communicate with the host device via a serial interface.
  • an example of a serial interface will now be described.
  • the serial interface consists of at least four signals including a clock, ATN, DATA_IN, and DATA_OUT.
  • the clock and DATA_OUT are driven by the touch pad assembly.
  • the ATN and DATA_IN are driven by the host device.
  • packet transfers are initiated by the touch pad assembly, clocked by the touch pad assembly and done at a time convenient to the touch pad assembly.
  • the host device relies on the touch pad assembly to initiate transfers.
  • the touch pad assembly transfers a packet when it detects a change in button status or touch pad position or if it detects an ATN signal from the host. If the host wishes to send data to the touch pad assembly it asserts the ATN signal and keeps it asserted until after the packet it wants to send has been transferred.
  • the touch pad assembly monitors the ATN signal and initiates a transfer if it sees it asserted.
  • there are typically several defined packet types that the touch pad assembly can transmit.
  • the touch pad assembly sends unsolicited packets unless specifically asked by the host to send another type.
  • the unsolicited packets are sent periodically whenever the touch pad assembly detects a change in button status or touch pad position.
  • for solicited packets, the touch pad assembly typically only sends one for each request by the host and then reverts back to unsolicited packets.
  • Unsolicited packets generally have a delay between them while response packets may be sent at any time in response to the ATN signal. (A minimal sketch of this packet exchange appears at the end of this list.)
  • the touch pad may also be used as a stand-alone input device that connects to a desktop or portable computer.
  • although the touch pad has been described in terms of being actuated by a finger, it should be noted that other objects may be used to actuate it in some cases.
  • a stylus or other object may be used in some configurations of the touch pad. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.
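By way of illustration only, the following sketch (in C) models the transfer rule described in the serial interface bullets above: the touch pad assembly initiates and clocks a packet transfer when it detects a change in button status or touch pad position, or when it sees ATN asserted by the host. The structure and function names (PadState, pad_poll) are assumptions made for the example and are not taken from the interface itself.

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical state kept by the touch pad firmware; names are illustrative. */
    typedef struct {
        uint8_t buttons;    /* current button status bits          */
        uint8_t position;   /* current absolute touch pad position */
    } PadState;

    static bool atn_asserted;       /* ATN line, driven by the host       */
    static PadState last_reported;  /* last state transferred to the host */

    /* Called periodically by the touch pad assembly.  A packet transfer is
     * initiated (and clocked) by the touch pad assembly when it detects a
     * change in button status or touch pad position, or when it sees ATN
     * asserted because the host has a packet of its own to send. */
    static void pad_poll(PadState now)
    {
        bool changed = (now.buttons != last_reported.buttons) ||
                       (now.position != last_reported.position);

        if (changed || atn_asserted) {
            /* here the firmware would drive CLOCK and DATA_OUT, and sample
             * DATA_IN for the host packet when ATN is asserted */
            printf("transfer: buttons=0x%02x position=%u%s\n",
                   (unsigned)now.buttons, (unsigned)now.position,
                   atn_asserted ? " (host packet expected on DATA_IN)" : "");
            last_reported = now;
        }
    }

    int main(void)
    {
        pad_poll((PadState){0x00, 10});   /* position change -> unsolicited packet */
        pad_poll((PadState){0x00, 10});   /* no change, ATN low -> no transfer     */
        atn_asserted = true;
        pad_poll((PadState){0x00, 10});   /* host request via ATN -> transfer      */
        return 0;
    }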

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch pad system is disclosed. The system includes mapping the touch pad into native sensor coordinates. The system also includes producing native values of the native sensor coordinates when events occur on the touch pad. The system further includes filtering the native values of the native sensor coordinates based on the type of events that occur on the touch pad. The system additionally includes generating a control signal based on the native values of the native sensor coordinates when a desired event occurs on the touch pad.

Description

TOUCH PAD FOR HANDHELD DEVICE
BACKGROUND OF THE INVENTION
1. Field of the Invention
[0001] The present invention relates generally to a media player having a touch pad. More particularly, the present invention relates to improved touch pads.
2. Description of the Related Art
[0002] There exist today many styles of input devices for performing operations in a consumer electronic device. The operations generally correspond to moving a cursor and making selections on a display screen. By way of example, the input devices may include buttons, switches, keyboards, mice, trackballs, touch pads, joy sticks, touch screens and the like. Each of these devices has advantages and disadvantages that are taken into account when designing the consumer electronic device. In handheld computing devices, the input devices are generally selected from buttons and switches. Buttons and switches are generally mechanical in nature and provide limited control with regards to the movement of a cursor (or other selector) and making selections. For example, they are generally dedicated to moving the cursor in a specific direction (e.g., arrow keys) or to making specific selections (e.g., enter, delete, number, etc.). In the case of hand-held personal digital assistants (PDA), the input devices tend to utilize touch-sensitive display screens. When using a touch screen, a user makes a selection on the display screen by pointing directly to objects on the screen using a stylus or finger.
[0003] In portable computing devices such as laptop computers, the input devices are commonly touch pads. With a touch pad, the movement of an input pointer (i.e., cursor) corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad. Touch pads can also make a selection on the display screen when one or more taps are detected on the surface of the touch pad. In some cases, any portion of the touch pad may be tapped, and in other cases a dedicated portion of the touch pad may be tapped. In stationary devices such as desktop computers, the input devices are generally selected from mice and trackballs. With a mouse, the movement of the input pointer corresponds to the relative movements of the mouse as the user moves the mouse along a surface. With a trackball, the movement of the input pointer corresponds to the relative movements of a ball as the user rotates the ball within a housing. Both mice and trackballs generally include one or more buttons for making selections on the display screen. [0004] In addition to allowing input pointer movements and selections with respect to a GUI presented on a display screen, the input devices may also allow a user to scroll across the display screen in the horizontal or vertical directions. For example, mice may include a scroll wheel that allows a user to simply roll the scroll wheel forward or backward to perform a scroll action. In addition, touch pads may provide dedicated active areas that implement scrolling when the user passes his or her finger linearly across the active area in the x and y directions. Both devices may also implement scrolling via horizontal and vertical scroll bars as part of the GUI. Using this technique, scrolling is implemented by positioning the input pointer over the desired scroll bar, selecting the desired scroll bar, and moving the scroll bar by moving the mouse or fmger in the y direction (forwards and backwards) for vertical scrolling or in the x direction (left and right) for horizontal scrolling.
[0005] With regards to touch pads, mice and track balls, a Cartesian coordinate system is used to monitor the position of the finger, mouse and ball, respectively, as they are moved. The Cartesian coordinate system is generally defined as a two dimensional coordinate system (x, y) in which the coordinates of a point (e.g., position of finger, mouse or ball) are its distances from two intersecting, often perpendicular straight lines, the distance from each being measured along a straight line parallel to the other. For example, the x, y positions of the mouse, ball and finger may be monitored. The x, y positions are then used to correspondingly locate and move the input pointer on the display screen.
[0006] To elaborate further, touch pads generally include one or more sensors for detecting the proximity of the finger thereto. The sensors are generally dispersed about the touch pad with each sensor representing an x, y position. In most cases, the sensors are arranged in a grid of columns and rows. Distinct x and y position signals, which control the x, y movement of a pointer device on the display screen, are thus generated when a finger is moved across the grid of sensors within the touch pad. For brevity sake, the remaining discussion will be held to the discussion of capacitive sensing technologies. It should be noted, however, that the other technologies have similar features.
[0007] Capacitive sensing touch pads generally contain several layers of material. For example, the touch pad may include a protective shield, one or more electrode layers and a circuit board. The protective shield typically covers the electrode layer(s), and the electrode layer(s) is generally disposed on a front side of the circuit board. As is generally well known, the protective shield is the part of the touch pad that is touched by the user to implement cursor movements on a display screen. The electrode layer(s), on the other hand, is used to interpret the x, y position of the user's fmger when the user's finger is resting or moving on the protective shield. The electrode layer (s) typically consists of a plurality of electrodes that are positioned in columns and rows so as to form a grid array. The columns and rows are generally based on the Cartesian coordinate system and thus the rows and columns correspond to the x and y directions.
[0008] The touch pad may also include sensing electronics for detecting signals associated with the electrodes. For example, the sensing electronics may be adapted to detect the change in capacitance at each of the electrodes as the finger passes over the grid. The sensing electronics are generally located on the backside of the circuit board. By way of example, the sensing electronics may include an application specific integrated circuit (ASIC) that is configured to measure the amount of capacitance in each of the electrodes and to compute the position of finger movement based on the capacitance in each of the electrodes. The ASIC may also be configured to report this information to the computing device.
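The paragraph above notes that the sensing electronics compute the position of finger movement from the capacitance measured at each electrode, but does not spell out the computation. One common approach, shown here only as an illustration, is a capacitance-weighted centroid over the electrode grid; the grid size and function names below are assumptions, not details from the patent.

    #include <stdio.h>

    #define NUM_COLS 16   /* illustrative grid width, not from the patent */

    /* Estimate the x position of a finger as the capacitance-weighted
     * centroid of the column electrodes.  A common technique shown only
     * as an illustration; the text does not specify the computation. */
    static double estimate_x(const double cap[NUM_COLS])
    {
        double weighted = 0.0, total = 0.0;
        for (int col = 0; col < NUM_COLS; col++) {
            weighted += col * cap[col];
            total    += cap[col];
        }
        return (total > 0.0) ? weighted / total : -1.0;   /* -1 = no touch */
    }

    int main(void)
    {
        double cap[NUM_COLS] = {0};
        cap[6] = 0.2; cap[7] = 0.9; cap[8] = 0.3;   /* finger centered near column 7 */
        printf("x ~ %.2f\n", estimate_x(cap));      /* prints x ~ 7.07 */
        return 0;
    }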
[0009] Referring to Fig. 1, a touch pad 2 will be described in greater detail. The touch pad 2 is generally a small rectangular area that includes a protective shield 4 and a plurality of electrodes 6 disposed underneath the protective shield layer 4. For ease of discussion, a portion of the protective shield layer 4 has been removed to show the electrodes 6. Each of the electrodes 6 represents a different x, y position. In one configuration, as a finger 8 approaches the electrode grid 6, a tiny capacitance forms between the finger 8 and the electrodes 6 proximate the finger 8. The circuit board/sensing electronics measures capacitance and produces an x, y input signal 10 corresponding to the active electrodes 6. The x, y input signal 10 is sent to a host device 12 having a display screen 14. The x, y input signal 10 is used to control the movement of a cursor 16 on the display screen 14. As shown, the input pointer moves in a similar x, y direction as the detected x, y finger motion.
Summary of the Invention
[0010] The invention relates, in one embodiment, to a touch pad assembly. The touch pad assembly includes a touch pad having one or more sensors that map the touch pad plane into native sensor coordinates. The touch pad assembly also includes a controller that divides the surface of the touch pad into logical device units that represent areas of the touch pad that can be actuated by a user, receives the native values of the native sensor coordinates from the sensors, adjusts the native values of the native sensor coordinates into a new value associated with the logical device units and reports the new value of the logical device units to a host device. [0011] The invention relates, in another embodiment, to a method for a touch pad. The method includes mapping the touch pad into native sensor coordinates. The method also includes producing native values of the native sensor coordinates when events occur on the touch pad. The method further includes filtering the native values of the native sensor coordinates based on the type of events that occur on the touch pad. The method additionally includes generating a control signal based on the native values of the native sensor coordinates when a desired event occurs on the touch pad. [0012] The invention relates, in another embodiment, to a signal processing method. The method includes receiving a current user location. The method also includes determining the difference in user location by comparing the current user location to a last user location. The method further includes only outputting the current user location when the difference in user location is larger than a threshold value. The method additionally includes converting the outputted current user location into a logical device unit. Moreover, the method includes generating a message for a host device. The message including the more logical user location. The more logical user location being used by the host device to move a control object in a specified manner. [0013] The invention relates, in another embodiment, to a message from a touch pad assembly to a host device in a computer system that facilitates bi-directional communications between the touch pad assembly and the host device. The message includes an event field identifying whether the message is a touch pad event or a button event. The message also includes an event identifier field identifying at least one event parameter, each event parameter having an event value, the event value for a touch pad event parameter indicating an absolute position, the event value for a button event parameter indicating button status. [0014] The invention relates, in another embodiment, to a touch pad assembly capable of transforming a user action into motion onto a display screen, the touch pad system including a touch pad having a plurality of independent and spatially distinct button zones each of which represents a different movement direction on the display screen so as to enable joystick implementations, multiple dimensional menu selection or photo image panning. BRIEF DESCRIPTION OF THE DRAWINGS The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which: Fig. 1 is a simplified diagram of a touch pad and display. Fig. 2 is a diagram of a computing system, in accordance with one embodiment of the present invention. Fig. 
3 is a flow diagram of signal processing, in accordance with one embodiment of the invention. Fig. 4 is a flow diagram of touch pad processing, in accordance with one embodiment of the invention. Fig. 5 is a flow diagram of a touch pad processing, in accordance with one embodiment of the invention. Fig. 6 is a diagram of a communication protocol, in accordance with one embodiment of the present invention. Fig. 7 is a diagram of a message format, in accordance with one embodiment of the present invention. Fig. 8 is a perspective view of a media player, in accordance with one embodiment of the invention. Fig. 9 is a front view of a media player, in accordance with one embodiment of the present invention. Fig. 10 is a front view of a media player, in accordance with one embodiment of the present invention. Figs. 11 A- 1 ID are top views of a media player in use, in accordance with one embodiment of the present invention. Fig. 12 is a partially broken away perspective view of an annular capacitive touch pad, in accordance with one embodiment of the present invention. Fig. 13 is a top view of a sensor arrangement of a touch pad, in accordance with another embodiment of the present invention. Fig. 14 is a top view of a sensor arrangement of a touch pad, in accordance with another embodiment of the present invention. Fig. 15 is a top view of a sensor arrangement of a touch pad, in accordance with another embodiment of the present invention. DETAILED DESCRIPTION OF THE INVENTION [0015] The present invention will now be described in detail with reference to a few preferred embodiments thereof as illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order not to unnecessarily obscure the present invention. [0016] Fig. 2 is a diagram of a computing system 20, in accordance with one embodiment of the present invention. The computing system 20 includes at least a user interface 22 and a host device 24. The user interface 22 is configured to provide control information for performing actions in the host device 24. By way of example, the actions may include making selections, opening a file or document, executing instructions, starting a program, viewing a menu, and/or the like. The actions may also include moving an object such as a pointer or cursor on a display screen of the host device 24. Although not shown in Fig. 2, the user interface 22 may be integrated with the host device 24 (within the same housing) or it may be a separate component (different housing).
[0017] The user interface 22 includes one or more touch buttons 34, a touch pad 36 and a controller 38. The touch buttons 34 generate button data when a user places their finger over the touch button 34. The touch pad, on the other hand, generates position data when a user places their finger (or object) over the touch pad 36. The controller 38 is configured to acquire the button data from the touch buttons 34 and the position data from the touch pad 36. The controller is also configured to output control data associated with the button data and/or position data to the host device 24. In one embodiment, the controller 38 only outputs control data associated with the touch buttons when the button status has changed. In another embodiment, the controller 38 only outputs control data associated with the touch pad when the position data has changed. The control data, which may include the raw data (button, position) or some form of thereof, may be used to implement a control function in the host device 24. By way of example, the control data may be used to move an object on the display 30 of the host device 24 or to make a selection or issue a command in the host device 24.
[0018] The touch buttons 34 and touch pad 36 generally include one or more sensors capable of producing the button and position data. The sensors of the touch buttons 34 and touch pad 36 may be distinct elements or they may be grouped together as part of a sensor arrangement, i.e., divided into sensors for the touch buttons 34 and sensors for the touch pad 36. The sensors of the touch buttons 34 are configured to produce signals associated with button status (activated, not activated). For example, the button status may indicate button activation when an object is positioned over the touch button and button deactivation at other times (or vice versa). The sensors of the touch pad 36 are configured produce signals associated with the absolute position of an object on or near the touch pad 36. In most cases, the sensors of the touch pad 36 map the touch pad plane into native or physical sensor coordinates 40. The native sensor coordinates 40 may be based on Cartesian coordinates or Polar coordinates (as shown). When Cartesian, the native sensor coordinates 40 typically correspond to x and y coordinates. When Polar (as shown), the native sensor coordinates typically correspond to radial and angular coordinates (r, θ). By way of example, the sensors may be based on resistive sensing, surface acoustic wave sensing, pressure sensing (e.g., strain gauge), optical sensing, capacitive sensing and the like. [0019] In one embodiment, the user interface 22 includes a sensor arrangement based on capacitive sensing. The user interface 22 is therefore arranged to detect changes in capacitance as a fmger moves, taps, or rests on the touch buttons 34 and touch pad 36. The capacitive touch assembly is formed from various layers including at least a set of labels, a set of electrodes (sensors) and a printed circuit board (PCB). The electrodes are positioned on the PCB, and the labels are position over the electrodes. The labels serve to protect the electrodes and provide a surface for receiving a finger thereon. The label layer also provides an insulating surface between the finger and the electrodes. As should be appreciated, the controller 38 can determine button status at each of the touch buttons 34 and position of the finger on the touch pad 36 by detecting changes in capacitance. In most cases, the controller 38 is positioned on the opposite side of the PCB. By way of example, the controller 38 may correspond to an application specific integrated circuit (ASIC), and it may operate under the control of Firmware stored on the ASIC.
[0020] Referring to the controller 38, the controller 38 is configured to monitor the sensors of the touch buttons 34 and touch pad 36 and decide what information to report to the host device 24. The decision may include filtering and/or conversion processes. The filtering process may be implemented to reduce a busy data stream so that the host device 24 is not overloaded with redundant or non-essential data. By way of example, a busy data stream may be created when multiple signals are produced at native sensor coordinates 40 that are in close proximity to one another. As should be appreciated, processing a busy data stream tends to require a lot of power, and therefore it can have a disastrous effect on portable devices such as media players that use a battery with a limited power supply. Generally speaking, the filtering process throws out redundant signals so that they do not reach the host device 24. In one implementation, the controller 38 is configured to only output a control signal when a significant change in sensor signals is detected. A significant change corresponds to those changes that are significant, as for example, when the user decides to move his/her finger to a new position rather than when the user's finger is simply resting on a spot and moving ever so slightly because of finger balance (toggling back and forth). The filter process may be implemented through Firmware as part of the application specific integrated circuit.
[0021] The conversion process, on the other hand, is implemented to adjust the raw data into other form factors before sending or reporting them to the host device 24. That is, the controller 38 may convert the raw data into other types of data. The other types of data may have similar or different units as the raw data. In the case of the touch pad 36, the controller 38 may convert the position data into other types of position data. For example, the controller 38 may convert absolute position data to relative position data. As should be appreciated, absolute position refers to the position of the finger on the touch pad measured absolutely with respect to a coordinate system while relative position refers to a change in position of the finger relative to the finger's previous position. The controller 38 may also convert multiple absolute coordinates into a single absolute coordinate, Polar coordinates into Cartesian coordinates, and/or Cartesian coordinates into Polar coordinates. The controller 38 may also convert the position data into button data. For example, the controller may generate button control signals when an object is tapped on a predetermined portion of the touch pad or other control signals when an object is moved in a predetermined manner over the touch pad (e.g., gesturing). [0022] The conversion may also include placing the control signal in a format that the host device 24 can understand. By way of example, the controller 38 may follow a predetermined communication protocol. As is generally well known, communication protocols are a set of rules and procedures for exchanging data between two devices such as the user interface 22 and the host device 24. Communication protocols typically transmit information in data blocks or packets that contain the data to be transmitted, the data required to guide the packet to its destination, and the data that corrects errors that occur along the way. The controller may support a variety of communication protocols for communicating with the host device, including but not limited to, PS/2, Serial, ADB and the like. In one particular implementation, a Serial protocol is used.
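As a brief illustration of the coordinate conversions mentioned above (Polar to Cartesian and Cartesian to Polar), the following sketch uses the standard trigonometric relations; the function names are illustrative and no particular touch pad resolution is assumed.

    #include <math.h>
    #include <stdio.h>

    /* Convert a Polar reading (r, theta in radians) to Cartesian (x, y)
     * and back; the function names are illustrative only. */
    static void polar_to_cartesian(double r, double theta, double *x, double *y)
    {
        *x = r * cos(theta);
        *y = r * sin(theta);
    }

    static void cartesian_to_polar(double x, double y, double *r, double *theta)
    {
        *r = sqrt(x * x + y * y);
        *theta = atan2(y, x);
    }

    int main(void)
    {
        const double pi = acos(-1.0);
        double x, y, r, theta;
        polar_to_cartesian(1.0, pi / 2.0, &x, &y);   /* -> roughly (0, 1)    */
        cartesian_to_polar(x, y, &r, &theta);        /* -> roughly (1, pi/2) */
        printf("x=%.3f y=%.3f r=%.3f theta=%.3f\n", x, y, r, theta);
        return 0;
    }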
[0023] The conversion process may include grouping at least a portion of the native coordinates 40 together to form one or more virtual actuation zones 42. For example, the controller 38 may separate the surface of the touch pad 36 into virtual actuation zones 42A-D and convert the native values of the native sensor coordinates 40 into a new value associated with the virtual actuation zones 42A-D. The new value may have similar or different units as the native value. The new value is typically stored at the controller 38 and subsequently passed to the host device 24. Generally speaking, the controller 38 outputs a control signal associated with a particular virtual actuation zone 42 when most of the signals are from native sensor coordinates 40 located within the particular virtual actuation zone 42. [0024] The virtual actuation zones 42 generally represent a more logical range of values than the native sensor coordinates 40 themselves, i.e., the virtual actuation zones 42 represent areas of touch pad 36 that can be better actuated by a user (magnitudes larger). The ratio of native sensor coordinates 40 to virtual actuation zones 42 may be between about 1024: 1 to about 1 :1, and more particularly about 8:1. For example, the touch pad may include 128 virtual actuation areas based on 1024 native sensor coordinates.
[0025] The virtual actuation zones 42 may be widely varied. For example, they may represent absolute positions on the touch pad 36 that are magnitudes larger than the native sensor coordinates 40, i.e., the touch pad 36 can be broken up into larger slices than would otherwise be attainable using the native sensor coordinates 40. In one implementation, the virtual actuation zones 42 are distributed on the touch pad 36 within a range of 0 to 95 angular positions. The angular position is zero at the 12 o'clock position and progresses clockwise to 95 as it comes around to 12 o'clock again.
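A minimal sketch of that native-to-logical reduction follows, assuming (as example figures drawn from this description) on the order of 1024 native angular coordinates collapsed onto a 0 to 95 logical range; both numbers are illustrative, not fixed requirements.

    #include <stdio.h>

    #define NATIVE_RESOLUTION  1024u   /* native angular coordinates (example value) */
    #define LOGICAL_RESOLUTION   96u   /* logical angular positions 0..95 (example)  */

    /* Map a native angular coordinate (zero at 12 o'clock, increasing
     * clockwise) onto the coarser logical range used for reporting. */
    static unsigned native_to_logical(unsigned native)
    {
        return (native * LOGICAL_RESOLUTION) / NATIVE_RESOLUTION;
    }

    int main(void)
    {
        printf("%u\n", native_to_logical(0));      /* 0  (12 o'clock) */
        printf("%u\n", native_to_logical(256));    /* 24 (3 o'clock)  */
        printf("%u\n", native_to_logical(512));    /* 48 (6 o'clock)  */
        printf("%u\n", native_to_logical(1023));   /* 95              */
        return 0;
    }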
[0026] The virtual actuation zones 42 may also represent areas of the touch pad that can be actuated by a user to implement specific control functions such as button or movement functions. With regards to button functions, the virtual actuation zones 42 may correspond to button zones that act like touch buttons. With regards to movement functions, each of the virtual actuation zones 42 may correspond to different movement directions such that they act like arrow keys. For example, virtual actuation zone 42A may represent an upward movement, virtual actuation zone 42B may represent a downward movement, virtual actuation zone 42C may represent a left movement, and virtual actuation zone 42D may represent right movement. As should be appreciated, this type of touch pad configuration may enable game stick implementations, two dimensional menu selection, photo image panning and the like. [0027] Although not shown, the controller 38 may also include a storage element. The storage element may store a touch pad program for controlling different aspects of the user interface 22. For example, the touch pad program may contain virtual actuation zone profiles that describe how the virtual actuation zones are distributed around the touch pad relative to the native sensor coordinates and what type of value to output based on the native values of the native sensor coordinates selected and the virtual actuation zone corresponding to the selected native sensor coordinates. [0028] In one particular touch pad operation, the controller 38 receives the position data from the touch pad 36. The controller 38 then passes the data through a filtering process. The filtering process generally includes determining if the data is based on noise events or actual events. Noise events are associated with non significant events such as when a user's finger is simply resting on a spot and moving ever so slightly because of finger balance. Actual events are associated with significant events such as when a user decides to move his/her finger to a new position on the touch pad. The noise events are filtered out and the actual events are passed through the controller 38. [0029] With actual events, the controller 38 determines if the position data should be adjusted. If not, the position data is reported to the host device 24. If so, the position data is converted into other form factors including but not limited to other position data or button data. For example, the native values of the sensor coordinates are converted into a new value associated with a selected virtual actuation zone. After the conversion, the controller 38 reports the converted data to the host device 24. By way of example, the controller 38 may pass the new value to a main system processor that executes the main application program running on the host device 24. [0030] Referring to the host device 24, the host device 24 generally includes a control circuit 26. The control circuit 26 is configured to execute instructions and carry out operations associated with the host device 24. For example, the control circuit 26 may control the reception and manipulation of input and output data between the components of the computing system 20. The host device 24 may also include a hold switch 28 for activating or deactivating communications between the host device 24 and the user interface 22. 
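The arrow-key style behavior described in the preceding paragraph can be pictured as a small dispatch from zone to movement direction; the enumerated zone names and the sign convention (positive y taken as downward) are assumptions made only for the example.

    #include <stdio.h>

    /* Illustrative mapping of the four virtual actuation zones onto display
     * movement, the way arrow keys behave. */
    typedef enum { ZONE_42A, ZONE_42B, ZONE_42C, ZONE_42D } Zone;

    static void zone_to_motion(Zone z, int *dx, int *dy)
    {
        *dx = 0; *dy = 0;
        switch (z) {
        case ZONE_42A: *dy = -1; break;   /* upward   */
        case ZONE_42B: *dy =  1; break;   /* downward */
        case ZONE_42C: *dx = -1; break;   /* left     */
        case ZONE_42D: *dx =  1; break;   /* right    */
        }
    }

    int main(void)
    {
        int dx, dy;
        zone_to_motion(ZONE_42C, &dx, &dy);
        printf("move dx=%d dy=%d\n", dx, dy);   /* move dx=-1 dy=0 */
        return 0;
    }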
The host device may additionally include a display 30 configured to produce visual information such as text and graphics on a display screen 32 via display commands from the control circuit 26. By way of example, the visual information may be in the form of a graphical user interface (GUI). Although not shown, the host device may additionally include one or more speakers or jacks that connect to headphones/speakers.
[0031] The control circuit may be widely varied. The control circuit may include one or more processors 27 that together with an operating system operate to execute computer code and produce and use data. The processor 27 can be a single-chip processor or can be implemented with multiple components. The computer code and data may reside within data storage that is operatively coupled to the processor. Data storage generally provides a place to hold data that is being used by the computer system 20. By way of example, the data storage may include Read-Only Memory (ROM), Random- Access Memory (RAM), hard disk drive and/or the like. Although not shown, the control circuit may also include an input/output controller that is operatively coupled to the processor. The input/output controller generally operates by exchanging data between the host device 24 and the I/O devices that desire to communicate with the host device 24 (e.g., touch pad assembly 22). The control circuit also typically includes a display controller that is operatively coupled to the processor. The display controller is configured to process display commands to produce text and graphics on the display screen 32 of the host device 24. The input/output controller and display controller may be integrated with the processor or they may be separate components. [0032] It should be noted that the control circuit 26 may be configured to perform some of the same functions as the controller 38. For example, the control circuit 26 may perform conversion processes on the data received from the controller 38. The conversion may be performed on raw data or on already converted data. [0033] Fig. 3 is a flow diagram of signal processing 50, in accordance with one embodiment of the invention. By way of example, the signal processing 50 may be performed by the computing system shown in Fig. 2. Signal processing 50 generally begins at block 52 where a user input is produced at the user interface 22. The user input is typically based on signals generated by the sensor arrangement of the touch buttons and touchpad. The user input may include raw data. The user input may also include filtered or converted data.
[0034] Following block 52, the processing proceeds to block 54 where the user input is reported to the control circuit of the host device. The user input may contain both button and position data or it may only contain button data or position data. The user input is typically reported when a change is made and more particularly when a desired change is made at the user interface (filtered). For example, button data may be reported when the button status has changed and position data may be reported when the position of a finger has changed.
[0035] Following block 54, the processing proceeds to block 56 where an action is performed in the host device based on the user input. The actions are typically controlled by the control circuit of the host device. The actions may include making selections, opening a file or document, executing instructions, starting a program, viewing a menu, and/or the like. The actions may also include moving an object such as a pointer or cursor on a display screen of the host device 24. [0036] Fig. 4 is a flow diagram of touch pad processing 60, in accordance with one embodiment of the invention. Touch pad processing 60 generally begins at block 62 where at least one control object is displayed on a graphical user interface. The control object may be a cursor, slider bar, image or the like. By way of example, the GUI may be displayed on the display 30 of the host device 24. The GUI is typically under the control of the processor of the host device 24.
[0037] Following block 62, the processing proceeds to block 64 where an angular or radial referenced input is received. By way of example, the angular or radial referenced input may be produced by the user interface 22 and received by the processor of the host device 24. The angular or radial referenced input may be raw data formed by the sensor arrangement or converted data formed at the controller. Furthermore, the raw or converted data may be filtered so as to reduce a busy data stream.
[0038] Following block 64, touch pad processing proceeds to block 66 where the control object is modified based on the angular or radial referenced input. For example, the direction that a control object such as a football player in a football game is moving may be changed from a first direction to a second direction or a highlight bar may be moved through multiple images in a photo library. The modification is typically implemented by the processor of the host device.
[0039] Fig. 5 is a flow diagram of a touch pad processing 70, in accordance with one embodiment of the invention. By way of example, touch pad processing may be performed by the controller shown in Fig. 2. Furthermore, it may be associated with blocks 52/54 and 62 shown in Figs. 3 and 4. Touch pad processing 70 generally begins at block 72 where a current user location is received. The current user location corresponds to the current location of the user's finger on the touch pad. For example, the controller may detect the changes in sensor levels at each of the native sensor coordinates and thereafter determine the current location of the user's finger on the touch pad based on the change in sensor levels at each of the native sensor coordinates.
[0040] Following block 72, the process flow proceeds to block 74 where a determination is made as to whether the current user location is within a threshold from the last user location, i.e., the user location that precedes the current user location. In some cases, the current user location is compared to the last user location to determine the difference in user location, i.e., how much movement occurred between the current and last readings. If the current user location is within the threshold then an undesired change has been made and the process flow proceeds back to block 72. If the current location is outside the threshold then a desired change has been made and the process flow proceeds to block 76. By way of example:
Undesired change: |currentUserLocation - lastUserLocation| < Threshold
Desired change: |currentUserLocation - lastUserLocation| > Threshold
[0041] In one embodiment, the threshold may be defined as the number of sensor levels that need to change in order to report a change in the user finger location to the main system processor of the host device. In one particular implementation, the threshold is equal to about 3. The threshold may be determined by the following equation:
Threshold (T) = C*(native sensor coordinate resolution/logical device unit resolution), where the native sensor coordinate resolution defines the maximum number of different positions that the sensors are able to detect for a specific plane coordinate system, the logical device unit resolution defines the number of values that are communicated to the main system processor of the host device for the said specific plane coordinate system, and coefficient C defines the width border area between the clusters of native sensor coordinates that define one logical device unit. [0042] The coefficient C is generally determined by the sensitivity needed to initiate a user event to the main system processor of the host device. It customizes the threshold value to the physical limitations of the sensor technology and the expected noise of the user finger events. Larger values tend to filter more events and reduce sensitivity. The system designer may pick the exact value of C by testing several values to strike optimal balance between sensitivity and stability of the user finger location. The coefficient C is typically a value between 0 and 0.5, and more particularly about 0.25. As should be appreciated, the threshold (T) is about 2 when the native sensor coordinate resolution is about 1024, the logical device unit resolution is about 128 and the coefficient is about 0.25.
[0043] In block 76, a new value associated with a particular logical device unit is generated based on the changed native sensor coordinates associated with the particular logical device unit. In most cases, the raw number of slices in the form of native sensor coordinates are grouped into a more logical number of slices in the form of logical device units (e.g., virtual actuation zones).
[0044] Following block 76, the process flow proceeds to block 78 where the last user location is updated. That is, the last current location is changed to the current user location. The current user location now acts as the last user location for subsequent processing.
[0045] Following block 78, the process flow proceeds to block 80 where a message is sent. In most cases, the message is sent when the difference between the current and last user location is larger than the threshold value. The message generally includes the new value associated with the selected logical device unit. By way of example, the touch pad may send a message to the main system processor of the host device. When received by the main system processor, the message may be used to make an adjustment in the host device, i.e., cause a control object to move in a specified manner.
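Pulling blocks 72 through 80 together, the following sketch shows one pass of the Fig. 5 flow: compute the threshold from the formula in paragraphs [0041]-[0042], filter out changes below it, convert the surviving native location into a logical device unit, report it, and update the last user location. The constants are the example values quoted in the text, and the message call is only a stand-in for the actual report to the main system processor.

    #include <math.h>
    #include <stdio.h>

    #define NATIVE_RES   1024.0   /* example values quoted in the text */
    #define LOGICAL_RES   128.0
    #define COEFF_C         0.25

    static double last_location;   /* last reported native user location */

    /* Stand-in for the message actually sent to the main system processor. */
    static void send_message(int logical_unit)
    {
        printf("report logical device unit %d to host\n", logical_unit);
    }

    /* One pass of the Fig. 5 flow for a new native-coordinate reading. */
    static void process_location(double current)
    {
        double threshold = COEFF_C * (NATIVE_RES / LOGICAL_RES);      /* = 2 */

        if (fabs(current - last_location) > threshold) {              /* desired change    */
            int unit = (int)(current * (LOGICAL_RES / NATIVE_RES));   /* native -> logical */
            send_message(unit);
            last_location = current;                                  /* update last user location */
        }                                                             /* else: filtered as noise */
    }

    int main(void)
    {
        process_location(100.0);   /* large move -> reported as unit 12  */
        process_location(101.0);   /* jitter below threshold -> filtered */
        process_location(110.0);   /* large move -> reported as unit 13  */
        return 0;
    }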
[0046] Fig. 6 is a diagram of a communication protocol 82, in accordance with one embodiment of the present invention. By way of example, the communication protocol may be used by the user interface and host device of Fig. 2. In this particular embodiment, the user interface 22 has one dedicated input ACTIVE line that is controlled by the control circuit 26. The state of the ACTIVE line signal may be set at LOW or HIGH. The hold switch 28 may be used to change the state of the ACTIVE line signal (for example when the hold switch is in a first position or second position). As shown in Fig. 6, when the ACTIVE signal is set to HIGH, the user interface 22 sends a synch message to the control circuit 26 that describes the Button and Touch pad status (e.g., button state and touch pad position). In one embodiment, new synch messages are only sent when the Button state and/or the Touch Pad status changes. For example, when the touch pad position has changed within a desired limit. When the ACTIVE signal is set to LOW, the user interface 22 does not send a synch message to the control circuit 26. When the ACTIVE signal is toggled from LOW to HIGH, the user interface 22 sends a Button state and touch pad position message. This may be used on startup to initialize the state. When the ACTIVE signal is toggled from HIGH to LOW, the user interface 22 does not send a synch message to the control circuit 26. In one embodiment, the user interface 22 is configured to send a two data byte message if both the Buttons and touch pad positions changes since the last message was sent, and a one data byte message if only one button state or touch pad position changes.
[0047] Fig. 7 is a diagram of a message format 86, in accordance with one embodiment of the present invention. By way of example, the message format 86 may correspond to the synch message described in Fig. 6. The message format 86 may form a two data byte message or a one data byte message. Each data byte is configured as an 8 bit message. The upper Most Significant Bit (MSB) of the message is the event type (1 bit) and the lower Least Significant Bits (LSB) are the event value (7 bits). [0048] The event value is event type specific. In Fig. 7, the event type bit is marked as E0, and the event value is marked as D0-D6. As indicated in the diagram, the event type may be a touch pad position change E1 or a button state change E0 when the button is being touched or E1 when the button is not being touched. The event values may correspond to different button events such as seeking forwards (D4), seeking backwards (D3), playing and pausing (D2), providing a menu (D1) and making selections (D0). The event values may also correspond to touch pad events such as touchpad position (D5). For example, in a touch pad that defines the logical coordinates in polar coordinates from 0-127, the event value may correspond to an absolute touch pad position in the range of 0-127 angular positions where zero is 12 o'clock, 32 is 3 o'clock, 64 is 6 o'clock and 96 is 9 o'clock, etc. going clockwise. The event values may also correspond to a reserve (D6). The reserve is an unused bit that may be used to extend the API.
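A sketch of packing and unpacking one such data byte follows. The field layout (one event-type bit in the MSB, seven event-value bits) is as described above; treating a set type bit as "touch pad event" is an assumption made only for the example, since the text identifies the fields but not the polarity of the type bit.

    #include <stdint.h>
    #include <stdio.h>

    /* One data byte: MSB = event type, low 7 bits = event value. */
    #define EVENT_TYPE_BIT    0x80u
    #define EVENT_VALUE_MASK  0x7Fu

    static uint8_t encode_pad_position(uint8_t position /* 0..127 */)
    {
        return (uint8_t)(EVENT_TYPE_BIT | (position & EVENT_VALUE_MASK));
    }

    static void decode(uint8_t byte)
    {
        unsigned value = byte & EVENT_VALUE_MASK;
        if (byte & EVENT_TYPE_BIT)
            printf("touch pad event: absolute position %u\n", value);
        else
            printf("button event: status bits 0x%02x\n", value);
    }

    int main(void)
    {
        decode(encode_pad_position(32));   /* the 3 o'clock position */
        decode(0x1F);                      /* an example button byte */
        return 0;
    }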
[0049] Fig. 8 is a perspective diagram of a media player 100, in accordance with one embodiment of the present invention. By way of example, the media player 100 may generally correspond to the host device shown in Fig. 2. The term "media player" generally refers to computing devices that are dedicated to processing media such as audio, video or other images, as for example, music players, game players, video players, video recorders, cameras, and the like. In some cases, the media players contain single functionality (e.g., a media player dedicated to playing music) and in other cases the media players contain multiple functionality (e.g., a media player that plays music, displays video, stores pictures and the like). In either case, these devices are generally portable so as to allow a user to listen to music, play games or video, record video or take pictures wherever the user travels.
[0050] In one embodiment, the media player 100 is a handheld device that is sized for placement into a pocket of the user. By being pocket sized, the user does not have to directly carry the device and therefore the device can be taken almost anywhere the user travels (e.g., the user is not limited by carrying a large, bulky and often heavy device, as in a laptop or notebook computer). For example, in the case of a music player, a user may use the device while working out at the gym. In case of a camera, a user may use the device while mountain climbing. In the case of a game player, the user can use the device while traveling in a car. Furthermore, the device may be operated by the users hands, no reference surface such as a desktop is needed (this is shown in greater detail in Fig. 6). In the illustrated embodiment, the media player 100 is a pocket sized hand held MP3 music player that allows a user to store a large collection of music (e.g., in some cases up to 4,000 CD-quality songs). By way of example, the MP3 music player may correspond to the iPod MP3 player manufactured by Apple Computer of Cupertino, C A. Although used primarily for storing and playing music, the MP3 music player shown herein may also include additional functionality such as storing a calendar and phone lists, storing and playing games, storing photos and the like. In fact, in some cases, it may act as a highly transportable storage device.
[0051] As shown in Fig. 8, the media player 100 includes a housing 102 that encloses internally various electrical components (including integrated circuit chips and other circuitry) to provide computing operations for the media player 100. In addition, the housing may also define the shape or form of the media player. That is, the contour of the housing 102 may embody the outward physical appearance of the media player 100. The integrated circuit chips and other circuitry contained within the housing may include a microprocessor (e.g., CPU), memory (e.g., ROM, RAM), a power supply (e.g., battery), a circuit board, a hard drive, other memory (e.g., flash) and/or various input/output (I/O) support circuitry. The electrical components may also include components for inputting or outputting music or sound such as a microphone, amplifier and a digital signal processor (DSP). The electrical components may also include components for capturing images such as image sensors (e.g., charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS)) or optics (e.g., lenses, splitters, filters).
[0052] In the illustrated embodiment, the media player 100 includes a hard drive, thereby giving the media player 100 massive storage capacity. For example, a 20GB hard drive can store up to 4000 songs or about 266 hours of music. In contrast, flash-based media players on average store up to 128MB, or about two hours, of music. The hard drive capacity may be widely varied (e.g., 5, 10, 20 GB, etc.). In addition to the hard drive, the media player 100 shown herein also includes a battery such as a rechargeable lithium polymer battery. This type of battery is capable of offering about 10 hours of continuous playtime to the media player 100. [0053] The media player 100 also includes a display screen 104 and related circuitry. The display screen 104 is used to display a graphical user interface as well as other information to the user (e.g., text, objects, graphics). By way of example, the display screen 104 may be a liquid crystal display (LCD). In one particular embodiment, the display screen 104 corresponds to a 160-by-128-pixel high-resolution display, with a white LED backlight to give clear visibility in daylight as well as low-light conditions. As shown, the display screen 104 is visible to a user of the media player 100 through an opening 105 in the housing 102.
[0054] The media player 100 also includes a touch pad 110. The touch pad is an intuitive interface that provides easy one-handed operation, i.e., lets a user interact with the media player 100 with one or more fingers. The touch pad 110 is configured to provide one or more control functions for controlling various applications associated with the media player 100. For example, the touch-initiated control function may be used to move an object on the display screen 104 or to make selections or issue commands associated with operating the media player 100. In order to implement the touch-initiated control function, the touch pad 110 may be arranged to receive input from a finger moving across the surface of the touch pad 110, from a finger holding a particular position on the touch pad and/or by a finger tapping on a particular position of the touch pad.
[0055] The touch pad 110 generally consists of a touchable outer surface 111 for receiving a finger for manipulation on the touch pad 110. Beneath the touchable outer surface 111 is a sensor arrangement 112. The sensor arrangement 112 includes one or more sensors that are configured to activate as the finger sits on, taps on or passes over them. The sensor arrangement 112 may be based on a Cartesian coordinate system, a Polar coordinate system or some other coordinate system. In the simplest case, an electrical signal is produced each time the finger is positioned over a sensing coordinate of the sensor arrangement 112. The number of signals in a given time frame may indicate location, direction, speed and acceleration of the finger on the touch pad, i.e., the more signals, the more the user moved his or her finger. In most cases, the signals are monitored by a control assembly that converts the number, combination and frequency of the signals into location, direction, speed and acceleration information and reports this information to the main system processor of the media player. This information may then be used by the media player 100 to perform the desired control function on the display screen 104. [0056] In one embodiment, the surface of the touch pad 110 is divided into several independent and spatially distinct actuation zones 113A-D disposed around the periphery of the touch pad 110. The actuation zones generally represent a more logical range of user inputs than the sensors themselves. Generally speaking, the touch pad 110 outputs a control signal associated with a particular actuation zone 113 when most of the signals are from sensing coordinates located within the particular actuation zone 113. That is, when an object approaches a zone 113, a position signal is generated at one or more sensing coordinates. The position signals generated by the one or more sensing coordinates may be used to inform the media player 100 that the object is at a specific zone 113 on the touch pad 110.
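As one illustration of how a control assembly might turn a stream of position samples into motion information, the C sketch below estimates finger speed and direction from two consecutive logical positions on a circular pad. It is a minimal sketch under stated assumptions: the sample structure, the -1 convention for "no finger", and the 0-127 logical range with wrap-around at 128 are assumptions made for illustration, not the specific implementation described above.

```c
#include <stdint.h>

/* Hypothetical sample from one sensor scan: a logical position around the
 * pad (0-127) and the scan time in milliseconds; -1 means no finger (assumed). */
typedef struct {
    int16_t  position;
    uint32_t time_ms;
} pad_sample_t;

/* Estimate speed (and, by its sign, direction) from two consecutive samples.
 * Returned in logical units per second; positive means clockwise under the
 * clock-face numbering used elsewhere in this description. */
static int32_t estimate_speed(const pad_sample_t *prev, const pad_sample_t *cur)
{
    if (prev->position < 0 || cur->position < 0)
        return 0;                              /* finger lifted: nothing to report */

    int32_t delta = cur->position - prev->position;
    if (delta > 64)  delta -= 128;             /* take the short way around the ring */
    if (delta < -64) delta += 128;

    uint32_t dt_ms = cur->time_ms - prev->time_ms;
    if (dt_ms == 0)
        return 0;

    return (delta * 1000) / (int32_t)dt_ms;    /* acceleration would compare two speeds */
}
```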
[0057] The actuation zones may be button zones or positional zones. When configured as button zones, a button control signal is generated when an object is placed over a button zone. The button control signal may be used to make selections, open a file, execute instructions, start a program, or view a menu in the media player. When configured as positional zones, a position control signal is generated when an object is placed over a positional zone. The position signals may be used to control the movement of an object on a display screen of the media player. The distribution of actuation zones may be controlled by touch pad translation software or firmware that converts physical or native coordinates into a virtual representation in the form of actuation zones. The touch pad translation software may be run by the control assembly of the touch pad or the main system processor of the media player. In most cases, the control assembly converts the acquired signals into signals that represent the zones before sending the acquired signals to the main system processor of the media player. [0058] The position control signals may be associated with a Cartesian coordinate system (x and y) or a Polar coordinate system (r, θ). Furthermore, the position signals may be provided in an absolute or relative mode. In absolute mode, the absolute coordinates of where the touch pad is being touched are used, for example (x, y) in the case of the Cartesian coordinate system or (r, θ) in the case of the Polar coordinate system. In relative mode, the change in position of the finger relative to the finger's previous position is used. The touch pad may be configured to operate in a Cartesian-absolute mode, a Cartesian-relative mode, a Polar-absolute mode or a Polar-relative mode. The mode may be controlled by the touch pad itself or by other components of the media player system.
[0059] In either case, a user may select which mode they would like the media player system to operate in, or the applications running on the media player system may automatically set the mode of the media player system. For example, a game application may inform the media player system to operate in an absolute mode so that the touch pad can be operated as a joystick, while a list application may inform the media player system to operate in a relative mode so that the touch pad can be operated as a scroll bar.
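The absolute/relative distinction can be summarized in a short C sketch of how a controller might report positions in either mode. The function names, the -1 "no previous touch" convention, and the 0-127 logical range with wrap-around are assumptions made for illustration; the text above does not prescribe a particular implementation.

```c
#include <stdint.h>

/* Assumed reporting modes; the mode may be set by the touch pad itself or
 * requested by an application, as described above. */
typedef enum { MODE_ABSOLUTE, MODE_RELATIVE } pad_mode_t;

static pad_mode_t g_mode = MODE_RELATIVE;
static int16_t    g_last_position = -1;        /* -1: no previous touch (assumed) */

/* A game might request MODE_ABSOLUTE (joystick-like), a list MODE_RELATIVE. */
void pad_set_mode(pad_mode_t mode)
{
    g_mode = mode;
    g_last_position = -1;
}

/* Convert a raw logical position (0-127) into the value actually reported. */
int16_t pad_report(int16_t position)
{
    if (g_mode == MODE_ABSOLUTE)
        return position;                       /* report where the finger is */

    if (g_last_position < 0) {                 /* first touch: nothing to compare */
        g_last_position = position;
        return 0;
    }

    int16_t delta = position - g_last_position;
    if (delta > 64)  delta -= 128;             /* shortest way around the ring */
    if (delta < -64) delta += 128;
    g_last_position = position;
    return delta;                              /* report how far the finger moved */
}
```

In this sketch, an application switching from a list view to a game would call pad_set_mode(MODE_ABSOLUTE), after which pad_report() passes positions through unchanged.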
[0060] In one embodiment, each of the zones 113 represents a different polar angle that specifies the angular position of the zone 113 in the plane of the touch pad 110. By way of example, the zones 113 may be positioned at 90 degree increments all the way around the touch pad 110, or at smaller increments, as for example 2 degree increments all the way around the touch pad 110. In one embodiment, the touch pad 110 may convert 1024 physical positions, in the form of sensor coordinates, into a more logical range of 0 to 127 in the form of positional zones. As should be appreciated, the touch pad's internal accuracy (1024 positions) is much greater than the accuracy (128 positions) needed for making movements on the display screen. [0061] The position of the touch pad 110 relative to the housing 102 may be widely varied. For example, the touch pad 110 may be placed at any external surface (e.g., top, side, front, or back) of the housing 102 that is accessible to a user during manipulation of the media player 100. In most cases, the touch sensitive surface 111 of the touch pad 110 is completely exposed to the user. In the illustrated embodiment, the touch pad 110 is located in a lower, front area of the housing 102. Furthermore, the touch pad 110 may be recessed below the surface of the housing 102, level with it, or may extend above it. In the illustrated embodiment, the touch sensitive surface 111 of the touch pad 110 is substantially flush with the external surface of the housing 102. [0062] The shape of the touch pad 110 may also be widely varied. For example, the touch pad 110 may be circular, rectangular, triangular, and the like. In general, the outer perimeter of the shaped touch pad defines the working boundary of the touch pad. In the illustrated embodiment, the touch pad 110 is circular. This particular shape works well with Polar coordinates. More particularly, the touch pad is annular, i.e., shaped like or forming a ring. When annular, the inner and outer perimeters of the shaped touch pad define the working boundary of the touch pad. [0063] In addition to the above, the media player 100 may also include one or more buttons 114. The buttons 114 are configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating the media player 100. By way of example, in the case of an MP3 music player, the button functions may be associated with opening a menu, playing a song, fast forwarding a song, seeking through a menu and the like. The buttons 114 may be mechanical clicking buttons and/or they may be touch buttons. In the illustrated embodiment, the buttons are touch buttons that receive input from a finger positioned over the touch button. Like the touch pad 110, the touch buttons 114 generally consist of a touchable outer surface for receiving a finger and a sensor arrangement disposed below the touchable outer surface. By way of example, the touch buttons and touch pad may generally correspond to the touch buttons and touch pad shown in Fig. 2. [0064] The position of the touch buttons 114 relative to the touch pad 110 may be widely varied. For example, they may be adjacent one another or spaced apart. In the illustrated embodiment, the buttons 114 are placed above the touch pad 110 in a linear manner as well as in the center of the annular touch pad 110. By way of example, the plurality of buttons 114 may consist of a menu button, a play/stop button, a forward seek button, a reverse seek button, and the like.
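The 1024-to-128 reduction mentioned above amounts to a simple integer division, as the following C sketch illustrates. The constants and the uniform eight-to-one grouping are assumptions taken from the numbers quoted in this description; an actual controller could distribute sensor coordinates across zones differently.

```c
#include <stdint.h>

/* Resolutions quoted above: 1024 native sensor positions reduced to 128
 * logical positional zones (an assumed uniform eight-to-one grouping). */
#define NATIVE_RESOLUTION   1024u
#define LOGICAL_RESOLUTION  128u

/* Map a native sensor coordinate (0-1023) onto a logical device unit (0-127). */
static uint8_t native_to_logical(uint16_t native)
{
    return (uint8_t)((native % NATIVE_RESOLUTION) /
                     (NATIVE_RESOLUTION / LOGICAL_RESOLUTION));
}
```

With these values, native coordinates 0 through 7 all report as logical unit 0, coordinates 8 through 15 as logical unit 1, and so on around the pad.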
[0065] Moreover, the media player 100 may also include a hold switch 115. The hold switch 115 is configured to activate or deactivate the touch pad and/or buttons. This is generally done to prevent unwanted commands by the touch pad and/or buttons, as for example, when the media player is stored inside a user's pocket. When deactivated, signals from the buttons and/or touch pad are not sent or are disregarded by the media player. When activated, signals from the buttons and/or touch pad are sent and therefore received and processed by the media player. [0066] Moreover, the media player 100 may also include one or more headphone jacks 116 and one or more data ports 118. The headphone jack 116 is capable of receiving a headphone connector associated with headphones configured for listening to sound being outputted by the media device 100. The data port 118, on the other hand, is capable of receiving a data connector/cable assembly configured for transmitting and receiving data to and from a host device such as a general purpose computer (e.g., desktop computer, portable computer). By way of example, the data port 118 may be used to upload or download audio, video and other images to and from the media device 100. For example, the data port may be used to download songs and play lists, audio books, ebooks, photos, and the like into the storage mechanism of the media player.
[0067] The data port 118 may be widely varied. For example, the data port may be a PS/2 port, a serial port, a parallel port, a USB port, a Firewire port and/or the like. In some cases, the data port 118 may be a radio frequency (RF) link or an optical infrared (IR) link to eliminate the need for a cable. Although not shown in Fig. 2, the media player 100 may also include a power port that receives a power connector/cable assembly configured for delivering power to the media player 100. In some cases, the data port 118 may serve as both a data and power port. In the illustrated embodiment, the data port 118 is a Firewire port having both data and power capabilities.
[0068] Although only one data port is described, it should be noted that this is not a limitation and that multiple data ports may be incorporated into the media player. In a similar vein, the data port may include multiple data functionality, i.e., integrating the functionality of multiple data ports into a single data port. Furthermore, it should be noted that the position of the hold switch, headphone jack and data port on the housing may be widely varied. That is, they are not limited to the positions shown in Fig. 2. They may be positioned almost anywhere on the housing (e.g., front, back, sides, top, bottom). For example, the data port may be positioned on the bottom surface of the housing rather than the top surface as shown. [0069] Referring to Fig. 9, the touch pad 110 will be described in greater detail. In this particular embodiment, the touch pad is operating in an absolute mode. That is, the touch pad reports the absolute coordinates of where it is being touched. As shown, the touch pad 110 includes one or more zones 124. The zones 124 represent regions of the touch pad 110 that may be actuated by a user to implement one or more actions or movements on the display screen 104.
[0070] The distribution of the zones 124 may be widely varied. For example, the zones 124 may be positioned almost anywhere on the touch pad 110. The position of the zones 124 may depend on the coordinate system of the touch pad 110. For example, when using polar coordinates, the zones 124 may have one or more radial and/or angular positions. In the illustrated embodiment, the zones 124 are positioned in multiple angular positions of the Polar coordinate system. Further, the zones 124 may be formed from almost any shape whether simple (e.g., squares, circles, ovals, triangles, rectangles, polygons, and the like) or complex (e.g., random shapes). Multiple button zones 124 may have identical shapes or they may have different shapes. In addition, the size of the zones 124 may vary according to the specific needs of each device. In some cases, the size of the zones 124 corresponds to a size that allows them to be easily manipulated by a user (e.g., the size of a finger tip or larger). In other cases, the size of the zones 124 is small so as to improve resolution of the touch pad 110. Moreover, any number of zones 124 may be used. In the illustrated embodiment, four zones 124A-D are shown. It should be noted, however, that this is not a limitation and that the number varies according to the specific needs of each touch pad. For example, Fig. 5 shows the media player 100 with 16 button zones 124A-P.
[0071] The number of zones 124 generally depends on the number of sensor coordinates located within the touch pad 110 and the desired resolution of the touch pad 110. The sensors are configured to sense user actions on the zones 124 and to send signals corresponding to the user action to the electronic system. By way of example, the sensors may be capacitance sensors that sense capacitance when a finger is in close proximity. The arrangement of the sensors typically varies according to the specific needs of each device. In one particular embodiment, the touch pad 110 includes 1024 sensor coordinates that work together to form 128 zones. [0072] Referring to Figs. 9 and 10, the zones 124, when actuated, are used to produce on screen movements 126. The control signal for the on screen movements may be initiated by the touch pad electronics or by the main system processor of the media player. By tapping or touching the zone, an object can be moved on the display. For example, each zone 124 may be configured to represent a particular movement on the display screen 104. In the illustrated embodiments, each of the zones 124 represents a particular direction of movement. The directions may be widely varied; in the illustrated embodiment, however, the directions generally correspond to angular directions (e.g., similar to the arrow keys on a keyboard).
[0073] Referring to Fig. 9, for example, the touch pad 110 is divided into several independent and spatially distinct zones 124A-D, each of which corresponds to a particular movement direction 126A-D (as shown by arrows), respectively. When zone 124A is actuated, on screen movements 126A (to the right) are implemented. When zone 124B is actuated, on screen movements 126B (upwards) are implemented. When zone 124C is actuated, on screen movements 126C (to the left) are implemented. When zone 124D is actuated, on screen movements 126D (downwards) are implemented. As should be appreciated, these embodiments are well suited for joystick implementations, two-dimensional menu selection, photo image panning and the like.
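A host-side handler for these four zones reduces to a small lookup, as sketched below in C. The enum names and the one-unit step size are illustrative assumptions; only the zone-to-direction correspondence (124A right, 124B up, 124C left, 124D down) is taken from the description above.

```c
/* One possible host-side mapping from the four zones 124A-124D to the
 * on-screen movement directions 126A-126D. */
typedef enum { ZONE_A, ZONE_B, ZONE_C, ZONE_D } zone_t;

typedef struct { int dx; int dy; } screen_step_t;

static screen_step_t zone_to_step(zone_t zone)
{
    switch (zone) {
    case ZONE_A: return (screen_step_t){ +1,  0 };   /* 126A: right */
    case ZONE_B: return (screen_step_t){  0, -1 };   /* 126B: up    */
    case ZONE_C: return (screen_step_t){ -1,  0 };   /* 126C: left  */
    case ZONE_D: return (screen_step_t){  0, +1 };   /* 126D: down  */
    }
    return (screen_step_t){ 0, 0 };                  /* unreachable default */
}
```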
[0074] Figs. 11A-11D show the media player 100 of Fig. 8 being used by a user 130, in accordance with one embodiment of the invention. In this embodiment, the media player 100 is being addressed for one-handed operation in which the media player 100 is held in the user's hand 136 while the buttons and touch pad 110 are manipulated by the thumb 138 of the same hand 136. By way of example, the palm 140 and rightmost fingers 141 (or leftmost fingers if left handed) of the hand 136 are used to grip the sides of the media player 100 while the thumb 138 is used to actuate the touch pad 110. As shown, the entire top surface of the touch pad 110 is accessible to the user's thumb 138. Referring to Fig. 11A, on screen movements 126A to the right are implemented when the thumb 138 is placed (or tapped) on button zone 124A. Referring to Fig. 11B, on screen movements 126B upwards are implemented when the thumb 138 is placed on button zone 124B. Referring to Fig. 11C, on screen movements 126C to the left are implemented when the thumb 138 is placed on button zone 124C. Referring to Fig. 11D, on screen movements 126D downwards are implemented when the thumb 138 is placed on button zone 124D. [0075] It should be noted that the configuration shown in Figs. 11A-11D is not a limitation and that the media player may be held in a variety of ways. For example, in an alternate embodiment, the media device may be comfortably held by one hand while being comfortably addressed by the other hand. This configuration generally allows the user to easily actuate the touch pad with one or more fingers. For example, the thumb and rightmost fingers (or leftmost fingers if left handed) of the first hand are used to grip the sides of the media player while a finger of the opposite hand is used to actuate the touch pad. The entire top surface of the touch pad is accessible to the user's finger.
[0076] Fig. 12 is a partially broken away perspective view of an annular capacitive touch pad 150, in accordance with one embodiment of the present invention. The annular capacitive touch pad 150 is arranged to detect changes in capacitance as the user moves, taps or rests an object such as a finger on the touch pad 150. The annular capacitive touch pad 150 is formed from various layers including at least a label layer 152, an electrode layer 154 and a circuit board 156. The label layer 152 is disposed over the electrode layer 154 and the electrode layer 154 is disposed over the circuit board 156. At least the label 152 and electrode layer 154 are annular such that they are defined by concentric circles, i.e., they have an inner perimeter and an outer perimeter. The circuit board 156 is generally a circular piece having an outer perimeter that coincides with the outer perimeter of the label 152 and electrode layer 154. It should be noted, however, that in some cases the circuit board 156 may be annular or the label 152 and electrode layer 154 may be circular. [0077] The label layer 152 serves to protect the underlayers and to provide a surface for allowing a finger to slide thereon. The surface is generally smooth so that the finger does not stick to it when moved. The label layer 152 also provides an insulating layer between the finger and the electrode layer 154. The electrode layer 154 includes a plurality of spatially distinct electrodes 158 that have positions based on the polar coordinate system. For instance, the electrodes 158 are positioned angularly and/or radially on the circuit board 156 such that each of the electrodes 158 defines a distinct angular and/or radial position thereon. Any suitable number of electrodes 158 may be used. In most cases, it would be desirable to increase the number of electrodes 158 so as to provide higher resolution, i.e., more information can be used for things such as acceleration. In the illustrated embodiment, the electrode layer 154 is broken up into a plurality of angularly sliced electrodes 158. The angularly sliced electrodes 158 may be grouped together to form one or more distinct button zones 159. In one implementation, the electrode layer 154 includes about 1024 angularly sliced electrodes that work together to form 128 angularly sliced button zones 159.
[0078] When configured together, the touch pad 150 provides a touch sensitive surface that works according to the principles of capacitance. As should be appreciated, whenever two electrically conductive members come close to one another without actually touching, their electric fields interact to form capacitance. In this configuration, the first electrically conductive member is one or more of the electrodes 158 and the second electrically conductive member is the finger of the user. Accordingly, as the finger approaches the touch pad 150, a tiny capacitance forms between the finger and the electrodes 158 in close proximity to the finger. The capacitance in each of the electrodes 158 is measured by control circuitry 160 located on the backside of the circuit board 156. By detecting changes in capacitance at each of the electrodes 158, the control circuitry 160 can determine the angular and/or radial location, direction, speed and acceleration of the finger as it is moved across the touch pad 150. The control circuitry 160 can also report this information in a form that can be used by a computing device such as a media player. By way of example, the control circuitry may include an ASIC (application specific integrated circuit). [0079] Referring to Fig. 13, a radial touch pad 178 (rather than an angular touch pad as shown in Fig. 12) will be discussed in accordance with one embodiment. The touch pad 178 may be divided into several independent and spatially distinct button zones 180 that are positioned radially from the center 182 of the touch pad 178 to the perimeter 184 of the touch pad 178. Any number of radial zones may be used. In one embodiment, each of the radial zones 180 represents a radial position in the plane of the touch pad 178. By way of example, the zones 180 may be spaced at 5 mm increments. Like above, each of the button zones 180 has one or more electrodes 186 disposed therein for detecting the presence of an object such as a finger. In the illustrated embodiment, a plurality of radial electrodes 186 are combined to form each of the button zones 180.
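One simple way control circuitry of this kind might turn per-electrode capacitance readings into an angular location is sketched below in C. The "strongest electrode" estimate and the threshold parameter are assumptions made for illustration; real control circuitry (e.g., an ASIC) would likely interpolate across neighbouring electrodes and apply more elaborate filtering.

```c
#include <stdint.h>

#define NUM_ELECTRODES 1024u   /* angular slices, per the implementation above */
#define NUM_ZONES      128u    /* logical button zones reported onward */

/* Locate the finger from per-electrode capacitance deltas (measured value
 * minus an untouched baseline). Returns the angular zone index, or -1 when
 * no electrode exceeds the touch threshold. */
static int16_t locate_finger(const uint16_t delta[NUM_ELECTRODES],
                             uint16_t touch_threshold)
{
    uint32_t best_index = 0;
    uint16_t best_delta = 0;

    for (uint32_t i = 0; i < NUM_ELECTRODES; i++) {
        if (delta[i] > best_delta) {
            best_delta = delta[i];
            best_index = i;
        }
    }

    if (best_delta < touch_threshold)
        return -1;                                    /* no finger present */

    /* Eight consecutive electrodes form one angular zone (1024 / 128). */
    return (int16_t)(best_index / (NUM_ELECTRODES / NUM_ZONES));
}
```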
[0080] Referring to Fig. 14, a combination angular/radial touch pad 188 will be discussed in accordance with one embodiment. The touch pad 188 may be divided into several independent and spatially distinct button zones 190 that are positioned both angularly and radially about the periphery of the touch pad 188 and from the center of the touch pad 188 to the perimeter of the touch pad 188. Any number of combination zones may be used. In one embodiment, each of the combination button zones 190 represents both an angular and radial position in the plane of the touch pad 188. By way of example, the zones may be positioned at both 2 degree and 5 mm increments. Like above, each of the combination zones 190 has one or more electrodes 192 disposed therein for detecting the presence of an object such as a finger. In the illustrated embodiment, a plurality of angular/radial electrodes 192 are combined to form each of the button zones 190.
[0081] Furthermore, in order to provide higher resolution, a more complex arrangement of angular/radial electrodes may be used. For example, as shown in Fig. 15, the touch pad 200 may include angular and radial electrodes 202 that are broken up such that consecutive zones do not coincide exactly. In this embodiment, the touch pad 200 has an annular shape and the electrodes 202 follow a spiral path around the touch pad 200 from the center to the outer perimeter of the touch pad 200. The electrodes 202 may be grouped together to form one or more distinct button zones 204.
[0082] It should be noted that although the touch pads herein are all shown as circular, they may take on other forms such as other curvilinear shapes (e.g., oval, annular and the like), rectilinear shapes (e.g., hexagon, pentagon, octagon, rectangle, square, and the like) or a combination of curvilinear and rectilinear (e.g., dome). [0083] The various aspects of the inventions described above can be used alone or in various combinations. The invention is preferably implemented by a combination of hardware and software, but can also be implemented in hardware or software. The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, optical data storage devices, and carrier waves. The computer readable medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. [0084] As mentioned above, the touch pad assembly may communicate with the host device via a serial interface. An example of a serial interface will now be described. The serial interface consists of at least four signals including a clock, ATN, DATA_IN, and DATA_OUT. The clock and DATA_OUT are driven by the touch pad assembly. The ATN and DATA_IN are driven by the host device. In most cases, packet transfers are initiated by the touch pad assembly, clocked by the touch pad assembly and done at a time convenient to the touch pad assembly. The host device relies on the touch pad assembly to initiate transfers. The touch pad assembly transfers a packet when it detects a change in button status or touch pad position or if it detects an ATN signal from the host. If the host wishes to send data to the touch pad assembly, it asserts the ATN signal and keeps it asserted until after the packet it wants to send has been transferred. The touch pad assembly monitors the ATN signal and initiates a transfer if it sees it asserted.
[0085] There are typically several defined packet types that the touch pad assembly can transmit. In this example, there are at least two kinds of packets: unsolicited packets and packets sent as a response to an ATN signal. The touch pad assembly sends unsolicited packets unless specifically asked by the host to send another type. In the case of unsolicited packets, they are sent periodically whenever the touch pad assembly detects a change in button status or touch pad position. In the case of solicited packets, the touch pad assembly typically only sends one for each request by the host and then reverts to unsolicited packets. Unsolicited packets generally have a delay between them while response packets may be sent at any time in response to the ATN signal.
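A touch-pad-side polling loop for this interface might look like the C sketch below. All of the functions shown are hypothetical placeholders for hardware-specific pin handling and packet formatting; only the division of roles (the assembly drives the clock and DATA_OUT, the host drives ATN and DATA_IN, and the assembly initiates and clocks every transfer) is taken from the description above.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical low-level hooks; actual pin handling and packet formats are
 * hardware specific and not specified in the text above. */
extern bool atn_asserted(void);                   /* host has asserted ATN              */
extern bool state_changed(void);                  /* button status or position changed  */
extern void clock_in_packet(uint8_t *p, uint8_t len);        /* sample DATA_IN       */
extern void clock_out_packet(const uint8_t *p, uint8_t len); /* drive clock/DATA_OUT */
extern void build_unsolicited_packet(uint8_t *p, uint8_t *len);
extern void build_response_packet(uint8_t *p, uint8_t *len);
extern void delay_between_packets(void);

/* Touch-pad-side transfer loop: the assembly initiates and clocks every
 * transfer, either because its own state changed (unsolicited packet) or
 * because the host asserted ATN (response packet). */
void touch_pad_link_poll(void)
{
    uint8_t pkt[8];
    uint8_t len;

    if (atn_asserted()) {
        clock_in_packet(pkt, sizeof pkt);         /* take in the host's packet          */
        build_response_packet(pkt, &len);
        clock_out_packet(pkt, len);               /* responses may be sent at any time  */
    } else if (state_changed()) {
        build_unsolicited_packet(pkt, &len);
        clock_out_packet(pkt, len);
        delay_between_packets();                  /* unsolicited packets are paced      */
    }
}
```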
[0086] While this invention has been described in terms of several preferred embodiments, there are alterations, permutations, and equivalents, which fall within the scope of this invention. For example, although the invention has been described in terms of an MP3 music player, it should be appreciated that certain features of the invention may also be applied to other types of media players such as video recorders, cameras, and the like. Furthermore, the MP3 music player described herein is not limited to the MP3 music format. Other audio formats such as MP3 VBR (variable bit rate), AIFF and WAV may be used. Moreover, certain aspects of the invention are not limited to handheld devices. For example, the touch pad may also be used in other computing devices such as portable computers, personal digital assistants (PDAs), cellular phones, and the like. The touch pad may also be used as a stand-alone input device that connects to a desktop or portable computer. [0087] It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention. For example, although the touch pad has been described in terms of being actuated by a finger, it should be noted that other objects may be used to actuate it in some cases. For example, a stylus or other object may be used in some configurations of the touch pad. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.

Claims

What is claimed is:
1. A touch pad assembly, comprising: a touch pad having one or more sensors that map the touch pad plane into native sensor coordinates; and a controller that divides the surface of the touch pad into logical device units, receives the native values of the native sensor coordinates from the sensors, adjusts the native values of the native sensor coordinates into a new value associated with the logical device units and reports the new value of the logical device units to a host device, the logical device units representing areas of the touch pad that can be actuated by a user.
2. The touch pad assembly as recited in claim 1 wherein the controller passes the native values of the native sensor coordinates through a filtering process before adjusting the native values into a new value.
3. The touch pad assembly as recited in claim 2 wherein the filtering process includes determining if the native values are based on noise events or actual events.
4. The touch pad assembly as recited in claim 3 wherein the controller filters out the noise events and allows the actual events to pass through.
5. The touch pad assembly as recited in claim 1 wherein the controller further determines if a significant change has been made between the current and last received native values, and only reports the new value when a significant change has been made between the current and last received native values.
6. The touch pad assembly as recited in claim 1 wherein the native sensor coordinates are Cartesian coordinates.
7. The touch pad assembly as recited in claim 1 wherein the native sensor coordinates are Polar coordinates.
8. The touch pad assembly as recited in claim 1 wherein the logical device units are Cartesian coordinates.
9. The touch pad assembly as recited in claim 1 wherein the logical device units are Polar coordinates.
10. The touch pad assembly as recited in claim 1 wherein the new value of the logical device units is reported in an absolute mode.
11. The touch pad assembly as recited in claim 1 wherein the new value of the logical device units is reported in a relative mode.
12. The touch pad assembly as recited in claim 1 wherein the new value of the logical device units is reported in a Cartesian absolute mode, a Cartesian relative mode, a Polar absolute mode or a Polar relative mode.
13. The touch pad assembly as recited in claim 1 wherein the new value of the logical device units implements a specific control function in the host device.
14. The touch pad assembly as recited in claim 1 wherein the logical device units are angular Polar units distributed around the surface of the touch pad in a clock-like manner.
15. The touch pad assembly as recited in claim 1 wherein the ratio of native sensor coordinates to logical device units is between about 1024:1 and about 8:1.
16. The touch pad assembly as recited in claim 1 further comprising one or more touch buttons having one or more sensors, and wherein the controller receives a native value from the sensors, determines a button status from the native value, and reports the button status to a host device, the button status being used by the host device to implement a button function in the host device.
17. The touch pad assembly as recited in claim 16 wherein the controller only reports the button status to the host device when it is determined that there is a change in button status.
18. The touch pad assembly as recited in claim 1 wherein each of the logical device units represents a different movement direction on a display screen of the host device so as to enable joystick implementations, multiple dimensional menu selection or photo image panning.
19. The touch pad assembly as recited in claim 1 wherein the host device is a media player for storing and playing media such as audio, video or images, the media player including a housing that supports the touch pad assembly, a display for displaying text and graphics to a user of the media player and a CPU capable of receiving the new value of the logical device units from the controller and issuing commands based on the new value of logical device units to other components of the media player, the commands being used to at least move an object on the display.
20. A method for a touch pad, comprising: mapping the touch pad into native sensor coordinates; producing native values of the native sensor coordinates when events occur on the touch pad; filtering the native values of the native sensor coordinates based on the type of events that occur on the touch pad; and generating a control signal based on the native values of the native sensor coordinates when a desired event occurs on the touch pad.
21. The method as recited in claim 20 wherein the control signal includes the native values of the native sensor coordinates.
22. The method as recited in claim 20 further comprising: adjusting the native values of the native sensor coordinates into a new value when a desired event occurs on the touch pad, the control signal including the new value.
23. The method as recited in claim 20 wherein the new value has the same units as the native values.
24. The method as recited in claim 20 wherein the new value has different units than the native values.
25. The method as recited in claim 20 wherein the step of filtering comprises: determining if the native values are caused by noise events or actual events; and filtering out noise events and passing actual events.
26. The method as recited in claim 25 wherein the step of determining comprises: comparing a current set of native values with a last set of native values; classifying the current set of native values as noise events when the current set of native values is substantially similar to the previous set of native values; and classifying the current set of native values as actual events when the current set of native values is significantly different than the previous set of native values.
27. The method as recited in claim 25 wherein the control signal includes native values of the native sensor coordinates if it is determined that the events are actual events.
28. The method as recited in claim 25 further comprising: adjusting the native values of the native sensor coordinates into a new value if it is determined that the events are actual events, and including the new value in the control signal.
29. A signal processing method for a controller of a touch pad, comprising: receiving a current user location; determining the difference in user location by comparing the current user location to a last user location; only outputting the current user location when the difference in user location is larger than a threshold value; converting the outputted current user location into a logical device unit; and generating a message for a host device, the message including the more logical user location, the more logical user location being used by the host device to move a control object in a specified manner.
30. The method as recited in claim 29 wherein the threshold value is defined as the number of sensor levels that need to change in the touch pad in order to report a change in the user location.
31. The method as recited in claim 30 wherein the threshold is determined by the following equation: Threshold (T) = C*(native sensor resolution of the touch pad/logical device resolution of the touch pad), where the native sensor resolution defines the maximum number of different user locations that the sensors of the touch pad are able to detect over the touch pad plane, the logical device resolution defines the number of logical device units that the touch pad reports to the host device, and C defines the width of the border area between clusters of sensors of the touch pad that define one logical device unit.
32. The method as recited in claim 31 wherein the coefficient C is a value between about 0 and 0.5.
33. The method as recited in claim 31 wherein the native sensor resolution is about 1024 and the logical device resolution is about 128.
34. The method as recited in claim 29 further comprising: storing the current user location for subsequent processing, the current user location acting as the last user location in subsequent processing.
35. In a computer system that facilitates bi-directional communications between a touch pad assembly and a host device, a message from the touch pad assembly to the host device, the message comprising: an event field identifying whether the message is a touch pad event or a button event; an event identifier field identifying at least one event parameter, each event parameter having an event value, the event value for a touch pad event parameter indicating an absolute position, the event value for a button event parameter indicating button status.
36. A touch pad assembly capable of transforming a user action into motion onto a display screen, the touch pad system including a touch pad having a plurality of independent and spatially distinct button zones each of which represents a different movement direction on the display screen so as to enable joystick implementations, multiple dimensional menu selection or photo image panning.
PCT/US2004/027102 2003-11-25 2004-08-19 Touch pad for handheld device WO2005057328A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP04781727A EP1687684A4 (en) 2003-11-25 2004-08-19 Touch pad for handheld device
DE202004021283U DE202004021283U1 (en) 2003-11-25 2004-08-19 Touchpad for a portable device
EP10011080.8A EP2284658B1 (en) 2003-11-25 2004-08-19 Touch pad for a handheld device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/722,948 2003-11-25
US10/722,948 US7495659B2 (en) 2003-11-25 2003-11-25 Touch pad for handheld device

Publications (2)

Publication Number Publication Date
WO2005057328A2 true WO2005057328A2 (en) 2005-06-23
WO2005057328A3 WO2005057328A3 (en) 2006-09-21

Family

ID=34592119

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/027102 WO2005057328A2 (en) 2003-11-25 2004-08-19 Touch pad for handheld device

Country Status (7)

Country Link
US (4) US7495659B2 (en)
EP (2) EP1687684A4 (en)
CN (2) CN100369054C (en)
DE (1) DE202004021283U1 (en)
HK (1) HK1123860A1 (en)
TW (1) TWI262427B (en)
WO (1) WO2005057328A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007105151A1 (en) * 2006-03-13 2007-09-20 Koninklijke Philips Electronics N.V. Control device for controlling the hue of light emitted from a light source
GB2436135B (en) * 2006-03-09 2011-09-14 Pretorian Technologies Ltd User input device for electronic equipment
RU2719401C1 (en) * 2018-06-29 2020-04-17 Кэнон Кабусики Кайся Electronic device
US10897568B2 (en) 2018-06-29 2021-01-19 Canon Kabushiki Kaisha Electronic device

Families Citing this family (379)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6323846B1 (en) 1998-01-26 2001-11-27 University Of Delaware Method and apparatus for integrating manual input
US7614008B2 (en) * 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US7808479B1 (en) 2003-09-02 2010-10-05 Apple Inc. Ambidextrous mouse
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US7844914B2 (en) 2004-07-30 2010-11-30 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US7483967B2 (en) * 1999-09-01 2009-01-27 Ximeta Technology, Inc. Scalable server architecture based on asymmetric 3-way TCP
US7792923B2 (en) 2000-10-13 2010-09-07 Zhe Khi Pak Disk system adapted to be directly attached to network
US7030861B1 (en) * 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
US20050134578A1 (en) * 2001-07-13 2005-06-23 Universal Electronics Inc. System and methods for interacting with a control environment
AU2002319929A1 (en) 2001-07-16 2003-03-03 Han Gyoo Kim Scheme for dynamically connecting i/o devices through network
US20050149682A1 (en) * 2001-10-09 2005-07-07 Han-Gyoo Kim Virtual multiple removable media jukebox
US20070085841A1 (en) * 2001-10-22 2007-04-19 Apple Computer, Inc. Method and apparatus for accelerated scrolling
US7312785B2 (en) 2001-10-22 2007-12-25 Apple Inc. Method and apparatus for accelerated scrolling
US7345671B2 (en) * 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs
JP2003241682A (en) * 2002-01-03 2003-08-29 Samsung Electronics Co Ltd Display apparatus, rotating position detector thereof and computer
US7333092B2 (en) 2002-02-25 2008-02-19 Apple Computer, Inc. Touch pad for handheld device
US7656393B2 (en) 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
US11275405B2 (en) * 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
CA2496463A1 (en) * 2002-08-23 2004-03-04 Pfizer Products Inc. Apparatus for dispensing articles
US7358963B2 (en) * 2002-09-09 2008-04-15 Apple Inc. Mouse having an optically-based scrolling feature
US20070152977A1 (en) * 2005-12-30 2007-07-05 Apple Computer, Inc. Illuminated touchpad
US7499040B2 (en) * 2003-08-18 2009-03-03 Apple Inc. Movable touch pad with added functionality
US20060181517A1 (en) * 2005-02-11 2006-08-17 Apple Computer, Inc. Display actuator
US7457880B1 (en) * 2003-09-26 2008-11-25 Ximeta Technology, Inc. System using a single host to receive and redirect all file access commands for shared data storage device from other hosts on a network
US8059099B2 (en) 2006-06-02 2011-11-15 Apple Inc. Techniques for interactive input to portable electronic devices
US7495659B2 (en) * 2003-11-25 2009-02-24 Apple Inc. Touch pad for handheld device
US7664836B2 (en) * 2004-02-17 2010-02-16 Zhe Khi Pak Device and method for booting an operation system for a computer from a passive directly attached network device
US20050193017A1 (en) * 2004-02-19 2005-09-01 Han-Gyoo Kim Portable multimedia player/recorder that accesses data contents from and writes to networked device
US20060069884A1 (en) * 2004-02-27 2006-03-30 Han-Gyoo Kim Universal network to device bridge chip that enables network directly attached device
US7554531B2 (en) * 2004-05-18 2009-06-30 Interlink Electronics, Inc. Annular potentiometric touch sensor
US7310089B2 (en) * 2004-05-18 2007-12-18 Interlink Electronics, Inc. Annular potentiometric touch sensor
US7515431B1 (en) * 2004-07-02 2009-04-07 Apple Inc. Handheld computing device
US7724532B2 (en) * 2004-07-02 2010-05-25 Apple Inc. Handheld computing device
US7746900B2 (en) 2004-07-22 2010-06-29 Zhe Khi Pak Low-level communication layers and device employing same
US20080129707A1 (en) * 2004-07-27 2008-06-05 Pryor Timothy R Method and apparatus employing multi-functional controls and displays
JP4439351B2 (en) * 2004-07-28 2010-03-24 アルパイン株式会社 Touch panel input device with vibration applying function and vibration applying method for operation input
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
JP2008511045A (en) 2004-08-16 2008-04-10 フィンガーワークス・インコーポレーテッド Method for improving the spatial resolution of a touch sense device
US7860943B2 (en) * 2004-08-23 2010-12-28 Zhe Khi Pak Enhanced network direct attached storage controller
JP3734820B1 (en) * 2004-09-03 2006-01-11 任天堂株式会社 GAME PROGRAM, GAME DEVICE, AND INPUT DEVICE
US20100231506A1 (en) * 2004-09-07 2010-09-16 Timothy Pryor Control of appliances, kitchen and home
JP4583893B2 (en) * 2004-11-19 2010-11-17 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
TWI273497B (en) * 2004-12-14 2007-02-11 Elan Microelectronics Corp Dual-axis unequal-interval interlacing-type sensing-scan capacitance-type touch panel
US7849257B1 (en) 2005-01-06 2010-12-07 Zhe Khi Pak Method and apparatus for storing and retrieving data
PL1851943T3 (en) * 2005-02-02 2018-07-31 Audiobrax Indústria E Comércio De Produtos Eletrônicos S.A. Mobile communication device with music instrumental functions
KR20160150116A (en) * 2005-03-04 2016-12-28 애플 인크. Multi-functional hand-held device
US20060227117A1 (en) * 2005-04-07 2006-10-12 Microsoft Corporation Circular touch sensor
US9727082B2 (en) * 2005-04-26 2017-08-08 Apple Inc. Back-side interface for hand-held devices
CN100455171C (en) * 2005-06-21 2009-01-21 华硕电脑股份有限公司 Action electronic device
CH697974B1 (en) * 2005-07-01 2009-04-15 Saeco Ipr Ltd Operating device for hot beverage vending machines.
KR20070010589A (en) * 2005-07-19 2007-01-24 엘지전자 주식회사 Mobile communication terminal with turn-table and its operating method
US9122518B2 (en) 2005-08-11 2015-09-01 Pantech Co., Ltd. Method for selecting and controlling second work process during first work process in multitasking mobile terminal
US7294089B2 (en) * 2005-08-15 2007-11-13 Ford Global Technologies, Llc Multiple-speed automatic transmission
EP1758013B1 (en) * 2005-08-24 2018-07-04 LG Electronics Inc. Mobile communications terminal having a touch input unit and controlling method thereof
JP4590328B2 (en) * 2005-08-30 2010-12-01 任天堂株式会社 Input data processing program and information processing apparatus
US7825907B2 (en) * 2005-08-30 2010-11-02 Lg Electronics Inc. Touch key assembly for a mobile terminal
JP4716956B2 (en) * 2005-08-30 2011-07-06 エルジー電子株式會社 Touch key assembly and mobile communication terminal having the same
KR100652755B1 (en) * 2005-08-30 2006-12-01 엘지전자 주식회사 Portable phone of a touching and pushing type able to be backlighted
US7671837B2 (en) * 2005-09-06 2010-03-02 Apple Inc. Scrolling input arrangements using capacitive sensors on a flexible membrane
JP2009523267A (en) * 2005-09-15 2009-06-18 アップル インコーポレイテッド System and method for processing raw data of a trackpad device
JP4819467B2 (en) * 2005-10-04 2011-11-24 任天堂株式会社 Object movement control program and information processing apparatus
JP2007102664A (en) * 2005-10-07 2007-04-19 Smk Corp Method for use of rotation input device
US7880729B2 (en) 2005-10-11 2011-02-01 Apple Inc. Center button isolation ring
US7808480B2 (en) * 2005-10-28 2010-10-05 Sap Ag Method and system for secure input
US20070097089A1 (en) * 2005-10-31 2007-05-03 Battles Amy E Imaging device control using touch pad
TWM289905U (en) * 2005-11-10 2006-04-21 Elan Microelectronics Corp Touch-control shaft having pushbutton function
US7868874B2 (en) 2005-11-15 2011-01-11 Synaptics Incorporated Methods and systems for detecting a position-based attribute of an object using digital codes
US20070109275A1 (en) * 2005-11-16 2007-05-17 Chen-Ting Chuang Method for controlling a touch screen user interface and device thereof
US20070137462A1 (en) * 2005-12-16 2007-06-21 Motorola, Inc. Wireless communications device with audio-visual effect generator
US7701440B2 (en) * 2005-12-19 2010-04-20 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Pointing device adapted for small handheld devices having two display modes
CN1988374B (en) * 2005-12-20 2010-10-06 太瀚科技股份有限公司 Inductance control system and method for control sound volume
US8077147B2 (en) * 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
US20070152983A1 (en) * 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
KR101287497B1 (en) * 2006-01-06 2013-07-18 삼성전자주식회사 Apparatus and method for transmitting control command in home network system
WO2007082037A2 (en) * 2006-01-10 2007-07-19 Cirque Corporation Touchpad control of character actions in a virtual environment using gestures
US20070171188A1 (en) * 2006-01-25 2007-07-26 Nigel Waites Sensor for handheld device control illumination
JP4926494B2 (en) * 2006-02-20 2012-05-09 キヤノン株式会社 Image processing apparatus and control method
US20070268250A1 (en) * 2006-02-27 2007-11-22 Nubron Inc. Remote input device for computers
KR100746874B1 (en) * 2006-03-16 2007-08-07 삼성전자주식회사 Method and apparatus for providing of service using the touch pad in a mobile station
US20070220443A1 (en) * 2006-03-17 2007-09-20 Cranfill David B User interface for scrolling
US20070222765A1 (en) * 2006-03-22 2007-09-27 Nokia Corporation Slider input lid on touchscreen
US7787618B2 (en) 2006-03-29 2010-08-31 Nokia Corporation Portable electronic device
CN101432681B (en) * 2006-03-30 2013-01-16 塞奎公司 System and method for enabling function inspiration and operation of circular touchpad
US7538760B2 (en) * 2006-03-30 2009-05-26 Apple Inc. Force imaging input device and system
US7511702B2 (en) * 2006-03-30 2009-03-31 Apple Inc. Force and location sensitive display
US8040142B1 (en) 2006-03-31 2011-10-18 Cypress Semiconductor Corporation Touch detection techniques for capacitive touch sense systems
US8866750B2 (en) * 2006-04-10 2014-10-21 Microsoft Corporation Universal user interface device
US7978181B2 (en) 2006-04-25 2011-07-12 Apple Inc. Keystroke tactility arrangement on a smooth touch surface
US8279180B2 (en) 2006-05-02 2012-10-02 Apple Inc. Multipoint touch surface controller
US8004497B2 (en) 2006-05-18 2011-08-23 Cypress Semiconductor Corporation Two-pin buttons
KR102481798B1 (en) 2006-06-09 2022-12-26 애플 인크. Touch screen liquid crystal display
CN104965621B (en) 2006-06-09 2018-06-12 苹果公司 Touch screen LCD and its operating method
US8552989B2 (en) 2006-06-09 2013-10-08 Apple Inc. Integrated display and touch screen
KR100839696B1 (en) * 2006-06-20 2008-06-19 엘지전자 주식회사 Input device
US8022935B2 (en) 2006-07-06 2011-09-20 Apple Inc. Capacitance sensing electrode with integrated I/O mechanism
US8743060B2 (en) 2006-07-06 2014-06-03 Apple Inc. Mutual capacitance touch sensing device
US9360967B2 (en) 2006-07-06 2016-06-07 Apple Inc. Mutual capacitance touch sensing device
US20080006454A1 (en) * 2006-07-10 2008-01-10 Apple Computer, Inc. Mutual capacitance touch sensing device
US7889176B2 (en) * 2006-07-18 2011-02-15 Avago Technologies General Ip (Singapore) Pte. Ltd. Capacitive sensing in displacement type pointing devices
KR100781706B1 (en) * 2006-08-16 2007-12-03 삼성전자주식회사 Device and method for scrolling list in mobile terminal
US20100289737A1 (en) * 2006-08-25 2010-11-18 Kyocera Corporation Portable electronic apparatus, operation detecting method for the portable electronic apparatus, and control method for the portable electronic apparatus
JP5064395B2 (en) * 2006-08-25 2012-10-31 京セラ株式会社 Portable electronic device and input operation determination method
US8014760B2 (en) 2006-09-06 2011-09-06 Apple Inc. Missed telephone call management for a portable multifunction device
US20080055263A1 (en) * 2006-09-06 2008-03-06 Lemay Stephen O Incoming Telephone Call Management for a Portable Multifunction Device
US8736557B2 (en) * 2006-09-11 2014-05-27 Apple Inc. Electronic device with image based browsers
US8036766B2 (en) * 2006-09-11 2011-10-11 Apple Inc. Intelligent audio mixing among media playback and at least one other non-playback application
US7795553B2 (en) 2006-09-11 2010-09-14 Apple Inc. Hybrid button
US7729791B2 (en) * 2006-09-11 2010-06-01 Apple Inc. Portable media playback device including user interface event passthrough to non-media-playback processing
US7581186B2 (en) * 2006-09-11 2009-08-25 Apple Inc. Media manager with integrated browsers
US8564543B2 (en) * 2006-09-11 2013-10-22 Apple Inc. Media player with imaged based browsing
US7946918B2 (en) * 2006-09-11 2011-05-24 Apple Inc. Allowing media and gaming environments to effectively interact and/or affect each other
US8421602B2 (en) * 2006-09-13 2013-04-16 Savant Systems, Llc Remote control unit for a programmable multimedia controller
KR101241907B1 (en) * 2006-09-29 2013-03-11 엘지전자 주식회사 Remote controller and Method for generation of key code on remote controller thereof
KR101319871B1 (en) * 2006-09-29 2013-10-18 엘지전자 주식회사 Apparatus of coordinates cognition and Method for generation of key code on the apparatus thereof
KR101259116B1 (en) * 2006-09-29 2013-04-26 엘지전자 주식회사 Controller and Method for generation of key code on remote controller thereof
US20080088600A1 (en) * 2006-10-11 2008-04-17 Apple Inc. Method and apparatus for implementing multiple push buttons in a user input device
US20080088597A1 (en) * 2006-10-11 2008-04-17 Apple Inc. Sensor configurations in a user input device
US8274479B2 (en) 2006-10-11 2012-09-25 Apple Inc. Gimballed scroll wheel
US8090087B2 (en) * 2006-10-26 2012-01-03 Apple Inc. Method, system, and graphical user interface for making conference calls
US8482530B2 (en) 2006-11-13 2013-07-09 Apple Inc. Method of capacitively sensing finger position
US8547114B2 (en) 2006-11-14 2013-10-01 Cypress Semiconductor Corporation Capacitance to code converter with sigma-delta modulator
US20080143681A1 (en) * 2006-12-18 2008-06-19 Xiaoping Jiang Circular slider with center button
US8493330B2 (en) 2007-01-03 2013-07-23 Apple Inc. Individual channel phase delay scheme
US7639234B2 (en) * 2007-01-04 2009-12-29 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Capacitive sensing and absolute position mapping in displacement type pointing devices
US9710095B2 (en) 2007-01-05 2017-07-18 Apple Inc. Touch screen stack-ups
US7975242B2 (en) * 2007-01-07 2011-07-05 Apple Inc. Portable multifunction device, method, and graphical user interface for conference calling
US8519963B2 (en) * 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display
US10437459B2 (en) * 2007-01-07 2019-10-08 Apple Inc. Multitouch data fusion
US8607167B2 (en) * 2007-01-07 2013-12-10 Apple Inc. Portable multifunction device, method, and graphical user interface for providing maps and directions
US9001047B2 (en) 2007-01-07 2015-04-07 Apple Inc. Modal change based on orientation of a portable multifunction device
US20080165135A1 (en) * 2007-01-10 2008-07-10 Jao-Ching Lin Functional expansion system for a touch pad
US8058937B2 (en) 2007-01-30 2011-11-15 Cypress Semiconductor Corporation Setting a discharge rate and a charge rate of a relaxation oscillator circuit
US20080180399A1 (en) * 2007-01-31 2008-07-31 Tung Wan Cheng Flexible Multi-touch Screen
US20080196945A1 (en) * 2007-02-21 2008-08-21 Jason Konstas Preventing unintentional activation of a sensor element of a sensing device
FR2913272B1 (en) * 2007-03-02 2010-06-25 Dav SENSOR WITH TOUCH SURFACE
US7999789B2 (en) * 2007-03-14 2011-08-16 Computime, Ltd. Electrical device with a selected orientation for operation
TW200839587A (en) * 2007-03-16 2008-10-01 Inventec Appliances Corp Touch sensing device and method for electrical apparatus
TW200841531A (en) * 2007-04-02 2008-10-16 Asustek Comp Inc Slot device
TWI339806B (en) * 2007-04-04 2011-04-01 Htc Corp Electronic device capable of executing commands therein and method for executing commands in the same
JP4333768B2 (en) * 2007-04-06 2009-09-16 ソニー株式会社 Display device
JP2008276548A (en) * 2007-04-27 2008-11-13 Toshiba Corp Electrostatic pad apparatus and information processing apparatus
US20080270900A1 (en) * 2007-04-27 2008-10-30 Wezowski Martin M R Device, method and computer program product for switching a device between application modes
US20080273017A1 (en) * 2007-05-04 2008-11-06 Woolley Richard D Touchpad using a combination of touchdown and radial movements to provide control signals
PL1988445T3 (en) * 2007-05-04 2016-08-31 Whirlpool Co User interface and cooking oven provided with such user interface
US7911771B2 (en) * 2007-05-23 2011-03-22 Apple Inc. Electronic device with a metal-ceramic composite component
US8185839B2 (en) * 2007-06-09 2012-05-22 Apple Inc. Browsing or searching user interfaces and other aspects
US8201096B2 (en) * 2007-06-09 2012-06-12 Apple Inc. Browsing or searching user interfaces and other aspects
US9772667B2 (en) 2007-06-13 2017-09-26 Apple Inc. Integrated multi-touch surface having varying sensor granularity
US9052817B2 (en) * 2007-06-13 2015-06-09 Apple Inc. Mode sensitive processing of touch data
US8350815B2 (en) 2007-06-20 2013-01-08 Sony Mobile Communications Portable communication device including touch input with scrolling function
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
US8302033B2 (en) 2007-06-22 2012-10-30 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
US9500686B1 (en) 2007-06-29 2016-11-22 Cypress Semiconductor Corporation Capacitance measurement system and methods
US8570053B1 (en) 2007-07-03 2013-10-29 Cypress Semiconductor Corporation Capacitive field sensor with sigma-delta modulator
US8169238B1 (en) 2007-07-03 2012-05-01 Cypress Semiconductor Corporation Capacitance to frequency converter
US9654104B2 (en) 2007-07-17 2017-05-16 Apple Inc. Resistive force sensor with capacitive discrimination
TW200907768A (en) * 2007-08-09 2009-02-16 Asustek Comp Inc Portable apparatus and rapid cursor positioning method
US8683378B2 (en) 2007-09-04 2014-03-25 Apple Inc. Scrolling techniques for user interfaces
US7910843B2 (en) 2007-09-04 2011-03-22 Apple Inc. Compact input device
US20090058801A1 (en) * 2007-09-04 2009-03-05 Apple Inc. Fluid motion user interface control
US20090073130A1 (en) * 2007-09-17 2009-03-19 Apple Inc. Device having cover with integrally formed sensor
US20090096749A1 (en) * 2007-10-10 2009-04-16 Sun Microsystems, Inc. Portable device input technique
US8234262B2 (en) * 2007-10-24 2012-07-31 The Invention Science Fund I, Llc Method of selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US20090112696A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Method of space-available advertising in a mobile device
US9513699B2 (en) * 2007-10-24 2016-12-06 Invention Science Fund I, Llc Method of selecting a second content based on a user's reaction to a first content
US8001108B2 (en) * 2007-10-24 2011-08-16 The Invention Science Fund I, Llc Returning a new content based on a person's reaction to at least two instances of previously displayed content
US20090112694A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Targeted-advertising based on a sensed physiological response by a person to a general advertisement
US20090112849A1 (en) * 2007-10-24 2009-04-30 Searete Llc Selecting a second content based on a user's reaction to a first content of at least two instances of displayed content
US9582805B2 (en) * 2007-10-24 2017-02-28 Invention Science Fund I, Llc Returning a personalized advertisement
US8112407B2 (en) * 2007-10-24 2012-02-07 The Invention Science Fund I, Llc Selecting a second content based on a user's reaction to a first content
US8126867B2 (en) * 2007-10-24 2012-02-28 The Invention Science Fund I, Llc Returning a second content based on a user's reaction to a first content
US20090112693A1 (en) * 2007-10-24 2009-04-30 Jung Edward K Y Providing personalized advertising
US20090113297A1 (en) * 2007-10-24 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Requesting a second content based on a user's reaction to a first content
US20090112697A1 (en) * 2007-10-30 2009-04-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Providing personalized advertising
TWI406551B (en) * 2007-11-06 2013-08-21 Lg Electronics Inc Mobile terminal
US8416198B2 (en) 2007-12-03 2013-04-09 Apple Inc. Multi-dimensional scroll wheel
US20090146970A1 (en) * 2007-12-10 2009-06-11 Research In Motion Limited Electronic device and touch screen having discrete touch-sensitive areas
US8446371B2 (en) 2007-12-19 2013-05-21 Research In Motion Limited Method and apparatus for launching activities
TWI368161B (en) * 2007-12-21 2012-07-11 Htc Corp Electronic apparatus and input interface thereof
US8947383B2 (en) 2008-01-04 2015-02-03 Tactus Technology, Inc. User interface system and method
US8553005B2 (en) 2008-01-04 2013-10-08 Tactus Technology, Inc. User interface system
US8922510B2 (en) 2008-01-04 2014-12-30 Tactus Technology, Inc. User interface system
US8547339B2 (en) 2008-01-04 2013-10-01 Tactus Technology, Inc. System and methods for raised touch screens
US8154527B2 (en) 2008-01-04 2012-04-10 Tactus Technology User interface system
US8179375B2 (en) 2008-01-04 2012-05-15 Tactus Technology User interface system and method
US8456438B2 (en) 2008-01-04 2013-06-04 Tactus Technology, Inc. User interface system
US9052790B2 (en) 2008-01-04 2015-06-09 Tactus Technology, Inc. User interface and methods
US8207950B2 (en) * 2009-07-03 2012-06-26 Tactus Technologies User interface enhancement system
US8243038B2 (en) 2009-07-03 2012-08-14 Tactus Technologies Method for adjusting the user interface of a device
US9557915B2 (en) 2008-01-04 2017-01-31 Tactus Technology, Inc. Dynamic tactile interface
US9430074B2 (en) 2008-01-04 2016-08-30 Tactus Technology, Inc. Dynamic tactile interface
US9423875B2 (en) 2008-01-04 2016-08-23 Tactus Technology, Inc. Dynamic tactile interface with exhibiting optical dispersion characteristics
US9274612B2 (en) 2008-01-04 2016-03-01 Tactus Technology, Inc. User interface system
US8922502B2 (en) 2008-01-04 2014-12-30 Tactus Technology, Inc. User interface system
US9588683B2 (en) 2008-01-04 2017-03-07 Tactus Technology, Inc. Dynamic tactile interface
US9013417B2 (en) 2008-01-04 2015-04-21 Tactus Technology, Inc. User interface system
US9612659B2 (en) 2008-01-04 2017-04-04 Tactus Technology, Inc. User interface system
US9063627B2 (en) 2008-01-04 2015-06-23 Tactus Technology, Inc. User interface and methods
US9128525B2 (en) 2008-01-04 2015-09-08 Tactus Technology, Inc. Dynamic tactile interface
US9552065B2 (en) 2008-01-04 2017-01-24 Tactus Technology, Inc. Dynamic tactile interface
US20160187981A1 (en) 2008-01-04 2016-06-30 Tactus Technology, Inc. Manual fluid actuator
US9720501B2 (en) 2008-01-04 2017-08-01 Tactus Technology, Inc. Dynamic tactile interface
US8928621B2 (en) 2008-01-04 2015-01-06 Tactus Technology, Inc. User interface system and method
US9298261B2 (en) 2008-01-04 2016-03-29 Tactus Technology, Inc. Method for actuating a tactile interface layer
US8570295B2 (en) 2008-01-04 2013-10-29 Tactus Technology, Inc. User interface system
US8327272B2 (en) * 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US8125461B2 (en) 2008-01-11 2012-02-28 Apple Inc. Dynamic input graphic display
ES2374208T3 (en) * 2008-01-24 2012-02-14 Koninklijke Philips Electronics N.V. COLOR SELECTION INPUT DEVICE AND METHOD.
US8525798B2 (en) 2008-01-28 2013-09-03 Cypress Semiconductor Corporation Touch sensing
US8820133B2 (en) 2008-02-01 2014-09-02 Apple Inc. Co-extruded materials and methods
US8319505B1 (en) 2008-10-24 2012-11-27 Cypress Semiconductor Corporation Methods and circuits for measuring mutual and self capacitance
US8358142B2 (en) 2008-02-27 2013-01-22 Cypress Semiconductor Corporation Methods and circuits for measuring mutual and self capacitance
US9104273B1 (en) 2008-02-29 2015-08-11 Cypress Semiconductor Corporation Multi-touch sensing method
CN101533296A (en) * 2008-03-12 2009-09-16 深圳富泰宏精密工业有限公司 Touch control system and method for handheld mobile electronic device
US9454256B2 (en) 2008-03-14 2016-09-27 Apple Inc. Sensor configurations of an input device that are switchable based on mode
US9448721B2 (en) * 2008-03-19 2016-09-20 Blackberry Limited Electronic device including touch-sensitive input device and method of determining selection
US20090289917A1 (en) * 2008-03-20 2009-11-26 Saunders Samuel F Dynamic visual feature coordination in an electronic hand held device
CN101551726A (en) * 2008-04-03 2009-10-07 深圳富泰宏精密工业有限公司 Touch control system and method of electronic device
US9189102B2 (en) * 2008-05-16 2015-11-17 8631654 Canada Inc. Data filtering method and electronic device using the same
TW200949616A (en) * 2008-05-16 2009-12-01 High Tech Comp Corp Data filtering method and electronic device and readable recording medium using the same
TWI396967B (en) * 2008-05-16 2013-05-21 Htc Corp Signal filtering method and electronic device and readable recording medium using the same
TWI358029B (en) 2008-05-16 2012-02-11 Htc Corp Method for filtering out signals from touch device
KR100957836B1 (en) * 2008-06-02 2010-05-14 주식회사 애트랩 Touch panel device and contact position detection method thereof
US8217908B2 (en) 2008-06-19 2012-07-10 Tactile Displays, Llc Apparatus and method for interactive display with tactile feedback
US8665228B2 (en) 2008-06-19 2014-03-04 Tactile Displays, Llc Energy efficient interactive display with energy regenerative keyboard
US8115745B2 (en) 2008-06-19 2012-02-14 Tactile Displays, Llc Apparatus and method for interactive display with tactile feedback
US9513705B2 (en) * 2008-06-19 2016-12-06 Tactile Displays, Llc Interactive display with tactile feedback
US20090327974A1 (en) * 2008-06-26 2009-12-31 Microsoft Corporation User interface for gestural control
JP4636141B2 (en) * 2008-08-28 2011-02-23 ソニー株式会社 Information processing apparatus and method, and program
WO2010027492A2 (en) * 2008-09-04 2010-03-11 Savant Systems Llc Touch-sensitive wireless device and on screen display for remotely controlling a system
US8341557B2 (en) * 2008-09-05 2012-12-25 Apple Inc. Portable touch screen device, method, and graphical user interface for providing workout support
US8810542B2 (en) * 2008-09-10 2014-08-19 Apple Inc. Correction of parasitic capacitance effect in touch sensor panels
IT1393376B1 (en) 2008-09-12 2012-04-20 Sicam Srl BALANCING MACHINE FOR WHEEL BALANCING OF VEHICLES
US8259082B2 (en) 2008-09-12 2012-09-04 At&T Intellectual Property I, L.P. Multimodal portable communication interface for accessing video content
IT1393377B1 (en) 2008-09-12 2012-04-20 Sicam Srl BALANCING MACHINE FOR WHEEL BALANCING OF VEHICLES
US8816967B2 (en) 2008-09-25 2014-08-26 Apple Inc. Capacitive sensor having electrodes arranged on the substrate and the flex circuit
US8321174B1 (en) 2008-09-26 2012-11-27 Cypress Semiconductor Corporation System and method to measure capacitance of capacitive sensor array
US10289199B2 (en) * 2008-09-29 2019-05-14 Apple Inc. Haptic feedback system
US8368654B2 (en) * 2008-09-30 2013-02-05 Apple Inc. Integrated touch sensor and solar assembly
KR101323015B1 (en) * 2008-10-20 2013-10-29 엘지디스플레이 주식회사 Touch sensing device and method for correcting output thereof
US8477103B2 (en) 2008-10-26 2013-07-02 Microsoft Corporation Multi-touch object inertia simulation
US8466879B2 (en) * 2008-10-26 2013-06-18 Microsoft Corporation Multi-touch manipulation of application objects
KR101016221B1 (en) * 2008-11-14 2011-02-25 한국표준과학연구원 Method for Embodiment of Algorithm Using Force Sensor
US9128543B2 (en) * 2008-12-11 2015-09-08 Pixart Imaging Inc. Touch pad device and method for determining a position of an input object on the device using capacitive coupling
US9075457B2 (en) * 2008-12-12 2015-07-07 Maxim Integrated Products, Inc. System and method for interfacing applications processor to touchscreen display for reduced data transfer
US8395590B2 (en) 2008-12-17 2013-03-12 Apple Inc. Integrated contact switch and touch sensor elements
US9600070B2 (en) 2008-12-22 2017-03-21 Apple Inc. User interface having changeable topography
US8623494B2 (en) 2008-12-29 2014-01-07 Otter Products, Llc Protective cushion cover for an electronic device
WO2010078597A1 (en) * 2009-01-05 2010-07-08 Tactus Technology, Inc. User interface system
US9588684B2 (en) 2009-01-05 2017-03-07 Tactus Technology, Inc. Tactile interface for a computing device
WO2010078596A1 (en) * 2009-01-05 2010-07-08 Tactus Technology, Inc. User interface system
US8487975B2 (en) 2009-01-27 2013-07-16 Lifesize Communications, Inc. Conferencing system utilizing a mobile communication device as an interface
US20100220063A1 (en) * 2009-02-27 2010-09-02 Panasonic Corporation System and methods for calibratable translation of position
JP5392900B2 (en) * 2009-03-03 2014-01-22 現代自動車株式会社 In-vehicle device operation device
US8570280B2 (en) * 2009-03-25 2013-10-29 Lenovo (Singapore) Pte. Ltd. Filtering of inadvertent contact with touch pad input device
TWI469015B (en) * 2009-03-27 2015-01-11 Hon Hai Prec Ind Co Ltd Electronic device with sliding touch control function and control method thereof
US8154529B2 (en) 2009-05-14 2012-04-10 Atmel Corporation Two-dimensional touch sensors
US9354751B2 (en) 2009-05-15 2016-05-31 Apple Inc. Input device with optimized capacitive sensing
TWI419017B (en) * 2009-05-25 2013-12-11 Micro Nits Co Ltd Input system having a sheet-like light shield
US8549432B2 (en) * 2009-05-29 2013-10-01 Apple Inc. Radial menus
US8464182B2 (en) * 2009-06-07 2013-06-11 Apple Inc. Device, method, and graphical user interface for providing maps, directions, and location-based information
US8872771B2 (en) 2009-07-07 2014-10-28 Apple Inc. Touch sensing device having conductive nodes
US8654524B2 (en) 2009-08-17 2014-02-18 Apple Inc. Housing as an I/O device
US8965458B2 (en) 2009-08-21 2015-02-24 Otter Products, Llc Protective cushion cover for an electronic device
US9176962B2 (en) * 2009-09-07 2015-11-03 Apple Inc. Digital media asset browsing with audio cues
KR101624218B1 (en) * 2009-09-14 2016-05-25 삼성전자주식회사 Digital photographing apparatus and controlling method thereof
JP5218353B2 (en) * 2009-09-14 2013-06-26 ソニー株式会社 Information processing apparatus, display method, and program
US20110078626A1 (en) * 2009-09-28 2011-03-31 William Bachman Contextual Presentation of Digital Media Asset Collections
US9158409B2 (en) * 2009-09-29 2015-10-13 Beijing Lenovo Software Ltd Object determining method, object display method, object switching method and electronic device
TW201120689A (en) * 2009-12-10 2011-06-16 Chih-Ming Tsao Processing method of input device to perform multi-directional control.
WO2011087817A1 (en) 2009-12-21 2011-07-21 Tactus Technology User interface system
US9239623B2 (en) 2010-01-05 2016-01-19 Tactus Technology, Inc. Dynamic tactile interface
US20110163977A1 (en) * 2010-01-06 2011-07-07 Ulrich Barnhoefer Mode Dependent Configuration of Portable Electronic Device
US8736561B2 (en) * 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US8456297B2 (en) * 2010-01-06 2013-06-04 Apple Inc. Device, method, and graphical user interface for tracking movement on a map
US8862576B2 (en) * 2010-01-06 2014-10-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results
US8866791B2 (en) * 2010-01-06 2014-10-21 Apple Inc. Portable electronic device having mode dependent user input controls
US8619035B2 (en) 2010-02-10 2013-12-31 Tactus Technology, Inc. Method for assisting user input to a device
WO2011112984A1 (en) 2010-03-11 2011-09-15 Tactus Technology User interface system
US9025317B2 (en) 2010-03-17 2015-05-05 Otter Products, Llc Multi-material protective case for sliding/articulating/rotating handheld electronic devices
US10719131B2 (en) 2010-04-05 2020-07-21 Tactile Displays, Llc Interactive display with tactile feedback
TWI514127B (en) * 2010-04-16 2015-12-21 Via Tech Inc A computer system with an e-reader mode and e-book processing method thereof
WO2011133605A1 (en) 2010-04-19 2011-10-27 Tactus Technology Method of actuating a tactile interface layer
US8591334B2 (en) * 2010-06-03 2013-11-26 Ol2, Inc. Graphical user interface, system and method for implementing a game controller on a touch-screen device
CN103098483A (en) * 2010-07-26 2013-05-08 安纳科梅得泰克 Control device for audio-visual display
US8824140B2 (en) 2010-09-17 2014-09-02 Apple Inc. Glass enclosure
US9549598B2 (en) 2010-10-12 2017-01-24 Treefrog Developments, Inc. Housing for encasing an electronic device
EP2628064A2 (en) 2010-10-12 2013-08-21 Tree Frog Developments, Inc. Housing for encasing an object
KR20140037011A (en) 2010-10-20 2014-03-26 택투스 테크놀로지, 아이엔씨. User interface system
CN102169397A (en) * 2010-11-19 2011-08-31 苏州瀚瑞微电子有限公司 Wiring method of capacitance touch screen
JP5691464B2 (en) * 2010-12-09 2015-04-01 ソニー株式会社 Information processing device
US8804056B2 (en) * 2010-12-22 2014-08-12 Apple Inc. Integrated touch screens
EP2474886A1 (en) * 2011-01-05 2012-07-11 Research In Motion Limited Electronic device and method of controlling same
TWI573048B (en) * 2011-01-26 2017-03-01 奇景光電股份有限公司 Sensing device and sensing module
US9615476B2 (en) 2011-06-13 2017-04-04 Treefrog Developments, Inc. Housing for encasing a mobile device
CA2838333C (en) 2011-06-13 2021-07-20 Treefrog Developments, Inc. Housing for encasing a tablet computer
USD736777S1 (en) 2012-06-13 2015-08-18 Treefrog Developments, Inc. Case for an electronic device
US9204094B2 (en) 2011-06-28 2015-12-01 Lifesize Communications, Inc. Adjusting volume of a videoconference using touch-based gestures
US8605872B2 (en) 2011-06-28 2013-12-10 Lifesize Communications, Inc. Muting a videoconference using touch-based gestures
US8605873B2 (en) 2011-06-28 2013-12-10 Lifesize Communications, Inc. Accessing settings of a videoconference using touch-based gestures
US8194036B1 (en) 2011-06-29 2012-06-05 Google Inc. Systems and methods for controlling a cursor on a display using a trackpad input device
US8319746B1 (en) * 2011-07-22 2012-11-27 Google Inc. Systems and methods for removing electrical noise from a touchpad signal
US9047007B2 (en) 2011-07-28 2015-06-02 National Instruments Corporation Semantic zoom within a diagram of a system
US8782525B2 (en) 2011-07-28 2014-07-15 National Instruments Corporation Displaying physical signal routing in a diagram of a system
US8713482B2 (en) 2011-07-28 2014-04-29 National Instruments Corporation Gestures for presentation of different views of a system diagram
US20130143657A1 (en) * 2011-11-14 2013-06-06 Amazon Technologies, Inc. Input Mapping Regions
RU2583754C2 (en) * 2011-12-15 2016-05-10 Тойота Дзидося Кабусики Кайся Control device
WO2013106474A1 (en) 2012-01-10 2013-07-18 The Joy Factory, Inc. Protective casing providing impact absorption and water resistance for portable electronic devices
US10216286B2 (en) * 2012-03-06 2019-02-26 Todd E. Chornenky On-screen diagonal keyboard
CN103425362A (en) * 2012-05-23 2013-12-04 南京华睿川电子科技有限公司 Round projected capacitive touch screen
WO2013181644A1 (en) 2012-06-01 2013-12-05 Treefrog Developments, Inc. Housing for an electronic device with camera, microphone and flash isolation
CN103455254B (en) * 2012-06-05 2018-05-22 腾讯科技(深圳)有限公司 Interface focus movement control method and control device
US9241551B2 (en) 2012-06-13 2016-01-26 Otter Products, Llc Protective case with compartment
US8934675B2 (en) * 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US20140055368A1 (en) * 2012-08-22 2014-02-27 Ming-Hsein Yu Method and Apparatus by Using Touch Screen to Implement Functions of Touch Screen and Keypad
CN103677376B (en) * 2012-09-21 2017-12-26 联想(北京)有限公司 Information processing method and electronic device
WO2014047656A2 (en) 2012-09-24 2014-03-27 Tactus Technology, Inc. Dynamic tactile interface and methods
US9405417B2 (en) 2012-09-24 2016-08-02 Tactus Technology, Inc. Dynamic tactile interface and methods
US9557846B2 (en) 2012-10-04 2017-01-31 Corning Incorporated Pressure-sensing touch system utilizing optical and capacitive systems
GB2506676B (en) * 2012-10-08 2015-03-25 Touchnetix Ltd Touch sensors and touch sensing methods
US10130286B2 (en) 2012-10-12 2018-11-20 Medicustek Inc. Pressure-sensing device with biplanar sensor array
US9030839B2 (en) * 2012-10-18 2015-05-12 Apple Inc. Track pad acoustic features related to a portable computer
DE102013201458A1 (en) * 2013-01-30 2014-07-31 Robert Bosch Gmbh Method and device for detecting at least one signal
EP2963530A4 (en) * 2013-02-27 2016-10-26 Alps Electric Co Ltd Operation detection device
KR102092062B1 (en) * 2013-04-30 2020-03-23 인텔렉추얼디스커버리 주식회사 Input device of display system and input method thereof
CN104156147A (en) * 2013-05-15 2014-11-19 中兴通讯股份有限公司 Self-adaptive adjusting method of terminal interface display mode and terminal
WO2014189807A2 (en) 2013-05-18 2014-11-27 Otter Products, Llc Waterproof protective case for an electronic device
CN105452992B (en) * 2013-05-30 2019-03-08 Tk控股公司 Multidimensional Trackpad
US9557813B2 (en) 2013-06-28 2017-01-31 Tactus Technology, Inc. Method for reducing perceived optical distortion
EP2824426B1 (en) 2013-07-09 2018-05-09 Leica Geosystems AG Capacitative rotation angle sensor
US9300078B2 (en) 2013-08-23 2016-03-29 Otter Products, Llc Waterproof housing for mobile electronic device and waterproof adapter for accessory device
CN105612476B (en) 2013-10-08 2019-09-20 Tk控股公司 Self-calibrating tactile haptic multi-touch multifunction switch panel
US9304575B2 (en) 2013-11-26 2016-04-05 Apple Inc. Reducing touch sensor panel power consumption
CN103716672B (en) * 2013-12-19 2016-06-08 京东方科技集团股份有限公司 Remote controller, display device and remote control display system
CN104793774A (en) * 2014-01-20 2015-07-22 联发科技(新加坡)私人有限公司 Electronic device control method
US20150363026A1 (en) * 2014-06-16 2015-12-17 Touchplus Information Corp. Control device, operation mode altering method thereof, control method thereof and battery power warning method thereof
US9280228B1 (en) * 2014-08-13 2016-03-08 Anacom Medtek Patient-actuated control device for controlling an audio-visual display and ancillary functions in a hospital room
US10048754B2 (en) * 2014-08-27 2018-08-14 Grayhill, Inc. Localized haptic response
US10466826B2 (en) 2014-10-08 2019-11-05 Joyson Safety Systems Acquisition Llc Systems and methods for illuminating a track pad system
USD787553S1 (en) * 2014-11-20 2017-05-23 General Electric Company Display screen or portion thereof with icon
US9910531B2 (en) * 2015-01-12 2018-03-06 Synaptics Incorporated Circular outline single layer pattern
CN106293425A (en) * 2015-05-14 2017-01-04 冠捷投资有限公司 Display device having a touch screen adjustment menu
US9577697B2 (en) 2015-05-27 2017-02-21 Otter Products, Llc Protective case with stylus access feature
US20160356602A1 (en) 2015-06-03 2016-12-08 Reginald K. Puana E-Car Trip Planner
CN105068693B (en) * 2015-08-28 2018-01-02 京东方科技集团股份有限公司 Touch electrode structure, contact panel and display device
CN105242814B (en) * 2015-09-14 2018-11-06 友达光电(苏州)有限公司 Capacitance type touch-control structure and its touch control display apparatus
CN105094495B (en) * 2015-09-15 2018-05-18 京东方科技集团股份有限公司 Touch electrode structure, touch-screen and display device
US10496271B2 (en) * 2016-01-29 2019-12-03 Bose Corporation Bi-directional control for touch interfaces
US9960521B2 (en) 2016-02-24 2018-05-01 Otter Products, Llc Connector for fluidly sealing an aperture of a protective case
CN106020678A (en) * 2016-04-29 2016-10-12 青岛海信移动通信技术股份有限公司 Method and device for implementing touch operation in mobile equipment
US11228754B2 (en) 2016-05-06 2022-01-18 Qualcomm Incorporated Hybrid graphics and pixel domain architecture for 360 degree video
US9922679B2 (en) 2016-06-01 2018-03-20 James Tallantyre Slow motion video playback method for computing devices with touch interfaces
CN106055163A (en) * 2016-06-30 2016-10-26 北京集创北方科技股份有限公司 Touch display control method, touch display control device and touch display module
US10159320B2 (en) 2016-09-07 2018-12-25 Otter Products, Llc Protective enclosure for encasing an electronic device
CN106775216B (en) * 2016-11-30 2020-07-17 努比亚技术有限公司 Terminal and input box self-adaption method
CN107528958B (en) * 2017-08-04 2019-03-12 Oppo广东移动通信有限公司 Method and device for controlling a mobile terminal, readable storage medium, and mobile terminal
CN107661630A (en) * 2017-08-28 2018-02-06 网易(杭州)网络有限公司 Control method and device for a shooting game, storage medium, processor, and terminal
CN107741819B (en) * 2017-09-01 2018-11-23 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN107648848B (en) * 2017-09-01 2018-11-16 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
US10847330B2 (en) 2017-10-06 2020-11-24 Grayhill, Inc. No/low-wear bearing arrangement for a knob system
CN109656392A (en) * 2017-10-10 2019-04-19 芋头科技(杭州)有限公司 Multi-point multidirectional touch control operation system and method
CN107890664A (en) * 2017-10-23 2018-04-10 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
EP3735629A4 (en) 2018-01-03 2021-10-06 Grayhill, Inc. Touch encoder, touch panel, and input method editor with integrated development environment and methods thereof
US10827809B2 (en) 2018-04-05 2020-11-10 Otter Products, Llc Protective case for electronic device
US11036390B2 (en) * 2018-05-25 2021-06-15 Mpi Corporation Display method of display apparatus
DE102018120575A1 (en) * 2018-07-12 2020-01-16 Preh Gmbh Input device with a movable handle on a capacitive detection surface and capacitive coupling devices
CN109395382A (en) * 2018-09-12 2019-03-01 苏州蜗牛数字科技股份有限公司 Linear optimization method for a joystick
US10775853B2 (en) * 2018-10-16 2020-09-15 Texas Instruments Incorporated Secondary back surface touch sensor for handheld devices
JP7224874B2 (en) * 2018-11-29 2023-02-20 株式会社ジャパンディスプレイ sensor device
CN112558802A (en) * 2019-09-26 2021-03-26 深圳市万普拉斯科技有限公司 Device and method for mode switching and electronic equipment
US11422629B2 (en) 2019-12-30 2022-08-23 Joyson Safety Systems Acquisition Llc Systems and methods for intelligent waveform interruption
JP2023526928A (en) 2020-05-18 2023-06-26 アップル インコーポレイテッド USER INTERFACE FOR DISPLAYING AND IMPROVING THE CURRENT LOCATION OF AN ELECTRONIC DEVICE
US11625131B2 (en) 2021-03-12 2023-04-11 Apple Inc. Continuous touch input over multiple independent surfaces
CN113687729B (en) * 2021-08-11 2023-11-28 合肥联宝信息技术有限公司 Remote control equipment
CN113437960A (en) * 2021-08-13 2021-09-24 四川中微芯成科技有限公司 Method for realizing annular touch by capacitive touch key

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030076306A1 (en) 2001-10-22 2003-04-24 Zadesky Stephen Paul Touch pad handheld device

Family Cites Families (550)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1061578A (en) 1912-03-25 1913-05-13 Heinrich Wischhusen Push-button switch.
US2063276A (en) 1932-05-25 1936-12-08 Servel Inc Absorption type refrigerating system
GB765556A (en) 1953-04-21 1957-01-09 Castelco Great Britain Ltd Improvements in rotary electric switches
US2903229A (en) 1956-02-24 1959-09-08 Robert F Lange Device for supporting a frying pan in tilted position
US3005055A (en) 1957-10-08 1961-10-17 Bell Telephone Labor Inc Tilting dial circuit selector
US2945111A (en) 1958-10-24 1960-07-12 Thomas C Mccormick Push button electrical switch
US3996441A (en) 1973-07-09 1976-12-07 Shigeo Ohashi Switch with rocker actuator having detachable cover
US3965399A (en) 1974-03-22 1976-06-22 Walker Jr Frank A Pushbutton capacitive transducer
JPS5168726A (en) 1974-12-12 1976-06-14 Hosiden Electronics Co
US4115670A (en) 1976-03-15 1978-09-19 Geno Corporation Electrical switch assembly
US4071691A (en) 1976-08-24 1978-01-31 Peptek, Inc. Human-machine interface apparatus
US4103252A (en) 1976-11-26 1978-07-25 Xerox Corporation Capacitive touch-activated transducer system including a plurality of oscillators
US4121204A (en) 1976-12-14 1978-10-17 General Electric Company Bar graph type touch switch and display device
US4110749A (en) 1977-05-06 1978-08-29 Tektronix, Inc. Touch display to digital encoding system
US4242676A (en) 1977-12-29 1980-12-30 Centre Electronique Horloger Sa Interactive device for data input into an instrument of small dimensions
US4158216A (en) 1978-02-21 1979-06-12 General Electric Company Capacitive touch control
US4177421A (en) 1978-02-27 1979-12-04 Xerox Corporation Capacitive transducer
US4338502A (en) 1978-04-27 1982-07-06 Sharp Kabushiki Kaisha Metallic housing for an electronic apparatus with a flat keyboard
US4264903A (en) 1978-06-12 1981-04-28 General Electric Company Capacitive touch control and display
USD264969S (en) 1978-11-08 1982-06-15 Pye (Electronic Products) Limited Cabinet for electronic equipment
US4246452A (en) 1979-01-05 1981-01-20 Mattel, Inc. Switch apparatus
US4293734A (en) 1979-02-23 1981-10-06 Peptek, Incorporated Touch panel system and method
US4266144A (en) 1979-05-14 1981-05-05 Emhart Industries, Inc. Detection means for multiple capacitive sensing devices
CA1152603A (en) 1979-09-28 1983-08-23 Bfg Glassgroup Capacitive systems for touch control switching
JPS56114028A (en) 1980-02-12 1981-09-08 Kureha Chem Ind Co Ltd Capacity-type coordinate input device
DE3119495A1 (en) 1980-05-27 1982-02-25 Playmont AG, St. Gallen "APPROACH SWITCH"
US4394649A (en) 1980-07-28 1983-07-19 I/O Corporation Communication terminal providing user communication of high comprehension
JPS57152725U (en) 1981-03-20 1982-09-25
US4583161A (en) 1981-04-16 1986-04-15 Ncr Corporation Data processing system wherein all subsystems check for message errors
US4739191A (en) 1981-04-27 1988-04-19 Signetics Corporation Depletion-mode FET for the regulation of the on-chip generated substrate bias voltage
JPS5837784A (en) 1981-08-28 1983-03-05 Toshiba Corp Coordinate input device
US4604786A (en) 1982-11-05 1986-08-12 The Grigoleit Company Method of making a composite article including a body having a decorative metal plate attached thereto
US4570149A (en) 1983-03-15 1986-02-11 Koala Technologies Corporation Simplified touch tablet data device
US4866602A (en) 1983-11-02 1989-09-12 Microsoft Corporation Power supply for a computer peripheral device which positions a cursor on a computer display
US5125077A (en) 1983-11-02 1992-06-23 Microsoft Corporation Method of formatting data from a mouse
US5838304A (en) 1983-11-02 1998-11-17 Microsoft Corporation Packet-based mouse data protocol
GB8409877D0 (en) 1984-04-17 1984-05-31 Binstead Ronald Peter Capacitance effect keyboard
US4587378A (en) 1984-07-30 1986-05-06 Koala Technologies Corporation Two-layer touch tablet
CA1306539C (en) 1984-10-08 1992-08-18 Takahide Ohtani Signal reproduction apparatus including touched state pattern recognition speed control
US4752655A (en) 1984-11-16 1988-06-21 Nippon Telegraph & Telephone Corporation Coordinate input device
US4822957B1 (en) 1984-12-24 1996-11-19 Elographics Inc Electrographic touch sensor having reduced bow of equipotential field lines therein
US4644100A (en) 1985-03-22 1987-02-17 Zenith Electronics Corporation Surface acoustic wave touch panel system
US4734034A (en) 1985-03-29 1988-03-29 Sentek, Incorporated Contact sensor for measuring dental occlusion
US4856993A (en) 1985-03-29 1989-08-15 Tekscan, Inc. Pressure and contact sensor system for measuring dental occlusion
JPS6226532A (en) 1985-07-19 1987-02-04 リチヤ−ド エル.ジエンキンス Isometric controller
US4736191A (en) 1985-08-02 1988-04-05 Karl E. Matzke Touch activated control method and apparatus
US4810992A (en) 1986-01-17 1989-03-07 Interlink Electronics, Inc. Digitizer pad
US4739299A (en) 1986-01-17 1988-04-19 Interlink Electronics, Inc. Digitizer pad
US5179648A (en) 1986-03-24 1993-01-12 Hauck Lane T Computer auxiliary viewing system
DE3615742A1 (en) 1986-05-09 1987-11-12 Schoeller & Co Elektrotech Push-button film switch
US4771139A (en) 1986-06-27 1988-09-13 Desmet Gregory L Keyboard with metal cover and improved switches
US5416498A (en) 1986-10-21 1995-05-16 Ergonomics, Inc. Prehensile positioning computer keyboard
US4764717A (en) 1986-10-27 1988-08-16 Utah Scientific Advanced Development Center, Inc. Touch-sensitive potentiometer for operator control panel
US5194852A (en) * 1986-12-01 1993-03-16 More Edward S Electro-optic slate for direct entry and display and/or storage of hand-entered textual and graphic information
US4755765A (en) 1987-01-16 1988-07-05 Teradyne, Inc. Differential input selector
US4917516A (en) 1987-02-18 1990-04-17 Retter Dale J Combination computer keyboard and mouse data entry system
US5856645A (en) * 1987-03-02 1999-01-05 Norton; Peter Crash sensing switch
GB2204131B (en) 1987-04-28 1991-04-17 Ibm Graphics input tablet
US5053757A (en) * 1987-06-04 1991-10-01 Tektronix, Inc. Touch panel with adaptive noise reduction
JPS63314633A (en) 1987-06-17 1988-12-22 Gunze Ltd Method for detecting contact position of touch panel
US4990900A (en) 1987-10-01 1991-02-05 Alps Electric Co., Ltd. Touch panel
US4860768A (en) 1987-11-09 1989-08-29 The Hon Group Transducer support base with a depending annular isolation ring
US5450075A (en) 1987-11-11 1995-09-12 Ams Industries Plc Rotary control
US4831359A (en) 1988-01-13 1989-05-16 Micro Research, Inc. Four quadrant touch pad
US4914624A (en) 1988-05-06 1990-04-03 Dunthorn David I Virtual button for touch screen
US4951036A (en) 1988-08-04 1990-08-21 The Grass Valley Group, Inc. Touchpad jogger
US4849852A (en) 1988-09-30 1989-07-18 Alps Electric (U.S.A.), Inc. Variable capacitance push-button switch
US4976435A (en) 1988-10-17 1990-12-11 Will Shatford Video game control adapter
CA2002912A1 (en) * 1988-11-14 1990-05-14 William A. Clough Portable computer with touch screen and computer system employing same
JPH0322259A (en) 1989-03-22 1991-01-30 Seiko Epson Corp Small-sized data display and reproducing device
GB8914235D0 (en) 1989-06-21 1989-08-09 Tait David A G Finger operable control devices
JP2934672B2 (en) 1989-07-03 1999-08-16 直之 大纒 Capacitive detector
US5305017A (en) 1989-08-16 1994-04-19 Gerpheide George E Methods and apparatus for data input
US5036321A (en) 1989-08-31 1991-07-30 Otis Elevator Company Capacitive sensing, solid state touch button system
GB8921473D0 (en) 1989-09-22 1989-11-08 Psion Plc Input device
GB9004532D0 (en) * 1990-02-28 1990-04-25 Lucas Ind Plc Switch assembly
US5008497A (en) 1990-03-22 1991-04-16 Asher David J Touch controller
JP3301079B2 (en) 1990-06-18 2002-07-15 ソニー株式会社 Information input device, information input method, information processing device, and information processing method
US5192082A (en) 1990-08-24 1993-03-09 Nintendo Company Limited TV game machine
US5086870A (en) * 1990-10-31 1992-02-11 Division Driving Systems, Inc. Joystick-operated driving system
JP3192418B2 (en) 1990-11-30 2001-07-30 株式会社リコー Electrostatic latent image developing carrier and developer
US5159159A (en) 1990-12-07 1992-10-27 Asher David J Touch sensor and controller
DE69027778T2 (en) 1990-12-14 1997-01-23 Ibm Coordinate processor for a computer system with a pointer arrangement
US5204600A (en) 1991-02-06 1993-04-20 Hewlett-Packard Company Mechanical detent simulating system
US5841423A (en) 1991-02-15 1998-11-24 Carroll, Jr.; George L. Multifunction space bar for video screen graphics cursor control
US5479192A (en) 1991-02-15 1995-12-26 Carroll, Jr.; George L. Multifunction space bar for video screen graphics cursor control
US5272469A (en) * 1991-07-01 1993-12-21 Ncr Corporation Process for mapping high resolution data into a lower resolution depiction
US5237311A (en) 1991-08-01 1993-08-17 Picker International, Inc. Hingedly supported integrated trackball and selection device
JPH0620570A (en) 1991-12-26 1994-01-28 Nippon Kaiheiki Kogyo Kk Display-equipped push button switch
US5186646A (en) 1992-01-16 1993-02-16 Pederson William A Connector device for computers
FR2686440B1 (en) 1992-01-17 1994-04-01 Sextant Avionique DEVICE FOR MULTIMODE MANAGEMENT OF A CURSOR ON THE SCREEN OF A DISPLAY DEVICE.
US5231326A (en) 1992-01-30 1993-07-27 Essex Electronics, Inc. Piezoelectric electronic switch
JPH05233141A (en) 1992-02-25 1993-09-10 Mitsubishi Electric Corp Pointing device
JPH05258641A (en) 1992-03-16 1993-10-08 Matsushita Electric Ind Co Ltd Panel switch
US5367199A (en) 1992-05-01 1994-11-22 Triax Technologies Sliding contact control switch pad
US5543591A (en) 1992-06-08 1996-08-06 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5880411A (en) 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
DE69324067T2 (en) 1992-06-08 1999-07-15 Synaptics Inc Object position detector
US5889236A (en) * 1992-06-08 1999-03-30 Synaptics Incorporated Pressure sensitive scrollbar feature
US5543588A (en) 1992-06-08 1996-08-06 Synaptics, Incorporated Touch pad driven handheld computing device
US5861875A (en) * 1992-07-13 1999-01-19 Cirque Corporation Methods and apparatus for data input
US5508717A (en) 1992-07-28 1996-04-16 Sony Corporation Computer pointing device with dynamic sensitivity
AR247303A1 (en) 1992-08-21 1994-11-30 Gilligan Federico Gustavo Y Fa New computer keyboard.
JP3227218B2 (en) 1992-09-11 2001-11-12 キヤノン株式会社 Information processing device
JPH0696639A (en) 1992-09-14 1994-04-08 Smk Corp Membrane switch having jog function
US5907152A (en) 1992-10-05 1999-05-25 Logitech, Inc. Pointing device utilizing a photodetector array
US6084574A (en) 1992-10-05 2000-07-04 Logitech, Inc. Compact cursor pointing device utilizing photodetector array
US5703356A (en) 1992-10-05 1997-12-30 Logitech, Inc. Pointing device utilizing a photodetector array
USD349280S (en) 1992-10-06 1994-08-02 Microsoft Corporation Computer mouse
US5414445A (en) 1992-10-07 1995-05-09 Microsoft Corporation Ergonomic pointing device
US5632679A (en) 1992-10-26 1997-05-27 Tremmel; Michael Touch sensitive computer interface controller
US5561445A (en) 1992-11-09 1996-10-01 Matsushita Electric Industrial Co., Ltd. Three-dimensional movement specifying apparatus and method and observational position and orientation changing apparatus
US5339213A (en) 1992-11-16 1994-08-16 Cirque Corporation Portable computer touch pad attachment
US5521617A (en) 1993-04-15 1996-05-28 Sony Corporation Three-dimensional image special effect apparatus
JP2986047B2 (en) 1993-04-29 1999-12-06 インターナショナル・ビジネス・マシーンズ・コーポレイション Digital input display device and input processing device and method
US5424756A (en) 1993-05-14 1995-06-13 Ho; Yung-Lung Track pad cursor positioning device and method
US5408621A (en) 1993-06-10 1995-04-18 Ben-Arie; Jezekiel Combinatorial data entry system having multi-position switches, each switch having tiltable control knob
NO932270D0 (en) 1993-06-21 1993-06-21 Steinar Pedersen GUIDELINES FOR PC MARKETS
US5959610A (en) 1993-06-21 1999-09-28 Euphonix Computer-mirrored panel input device
US5581670A (en) 1993-07-21 1996-12-03 Xerox Corporation User interface having movable sheet with click-through tools
CA2124624C (en) 1993-07-21 1999-07-13 Eric A. Bier User interface having click-through tools that can be composed with other tools
CA2124505C (en) 1993-07-21 2000-01-04 William A. S. Buxton User interface having simultaneously movable tools and cursor
US5555004A (en) 1993-08-30 1996-09-10 Hosiden Corporation Input control device
AU7727694A (en) 1993-09-13 1995-04-03 David J. Asher Joystick with membrane sensor
US5956019A (en) 1993-09-28 1999-09-21 The Boeing Company Touch-pad cursor control device
US5596697A (en) * 1993-09-30 1997-01-21 Apple Computer, Inc. Method for routing items within a computer system
US5564112A (en) 1993-10-14 1996-10-08 Xerox Corporation System and method for generating place holders to temporarily suspend execution of a selected command
US5661632A (en) 1994-01-04 1997-08-26 Dell Usa, L.P. Hand held computer with dual display screen orientation capability controlled by toggle switches having first and second non-momentary positions
US5473344A (en) 1994-01-06 1995-12-05 Microsoft Corporation 3-D cursor positioning device
CA2140164A1 (en) 1994-01-27 1995-07-28 Kenneth R. Robertson System and method for computer cursor control
US5613137A (en) * 1994-03-18 1997-03-18 International Business Machines Corporation Computer system with touchpad support in operating system
EP0674288A1 (en) 1994-03-24 1995-09-27 AT&T Corp. Multidimensional mouse
MY118477A (en) 1994-04-20 2004-11-30 Sony Corp Communication terminal apparatus and control method thereof
WO1995031791A1 (en) 1994-05-12 1995-11-23 Apple Computer, Inc. Method and apparatus for noise filtering for an input device
USD362431S (en) 1994-05-18 1995-09-19 Microsoft Corporation Computer input device
US5473343A (en) 1994-06-23 1995-12-05 Microsoft Corporation Method and apparatus for locating a cursor on a computer screen
US5559943A (en) 1994-06-27 1996-09-24 Microsoft Corporation Method and apparatus customizing a dual actuation setting of a computer input device switch
US5565887A (en) 1994-06-29 1996-10-15 Microsoft Corporation Method and apparatus for moving a cursor on a computer screen
US5559301A (en) 1994-09-15 1996-09-24 Korg, Inc. Touchscreen interface having pop-up variable adjustment displays for controllers and audio processing systems
US5627531A (en) 1994-09-30 1997-05-06 Ohmeda Inc. Multi-function menu selection device
US5494157A (en) * 1994-11-14 1996-02-27 Samsonite Corporation Computer bag with side accessible padded compartments
US5495566A (en) 1994-11-22 1996-02-27 Microsoft Corporation Scrolling contents of a window
US5589893A (en) 1994-12-01 1996-12-31 Zenith Electronics Corporation On-screen remote control of a television receiver
US5805144A (en) 1994-12-14 1998-09-08 Dell Usa, L.P. Mouse pointing device having integrated touchpad
US5585823A (en) 1994-12-30 1996-12-17 Apple Computer, Inc. Multi-state one-button computer pointing device
US5828364A (en) 1995-01-03 1998-10-27 Microsoft Corporation One-piece case top and integrated switch for a computer pointing device
JP3442893B2 (en) 1995-01-27 2003-09-02 富士通株式会社 Input device
US5611060A (en) 1995-02-22 1997-03-11 Microsoft Corporation Auto-scrolling during a drag and drop operation
US6323845B1 (en) 1995-03-06 2001-11-27 Ncr Corporation Single finger controlled computer input apparatus and method
US5959611A (en) 1995-03-06 1999-09-28 Carnegie Mellon University Portable computer system with ergonomic input device
US5611040A (en) 1995-04-05 1997-03-11 Microsoft Corporation Method and system for activating double click applications with a single click
GB9507817D0 (en) * 1995-04-18 1995-05-31 Philips Electronics Uk Ltd Touch sensing devices and methods of making such
US5825353A (en) 1995-04-18 1998-10-20 Will; Craig Alexander Control of miniature personal digital assistant using menu and thumbwheel
US6122526A (en) 1997-04-24 2000-09-19 Eastman Kodak Company Cellular telephone and electronic camera system with programmable transmission capability
JPH08307954A (en) * 1995-05-12 1996-11-22 Sony Corp Device and method for coordinate input and information processor
JPH0934644A (en) 1995-07-21 1997-02-07 Oki Electric Ind Co Ltd Pointing device
JP3743458B2 (en) * 1995-07-29 2006-02-08 ソニー株式会社 Input pad device
US5790769A (en) 1995-08-04 1998-08-04 Silicon Graphics Incorporated System for editing time-based temporal digital media including a pointing device toggling between temporal and translation-rotation modes
US5751274A (en) 1995-09-14 1998-05-12 Davis; Michael Foot-operable cursor control device
US6025832A (en) * 1995-09-29 2000-02-15 Kabushiki Kaisha Toshiba Signal generating apparatus, signal inputting apparatus and force-electricity transducing apparatus
US5764066A (en) 1995-10-11 1998-06-09 Sandia Corporation Object locating system
US5884323A (en) 1995-10-13 1999-03-16 3Com Corporation Extendible method and apparatus for synchronizing files on two different computer systems
US5856822A (en) * 1995-10-27 1999-01-05 O2 Micro, Inc. Touch-pad digital computer pointing-device
US6473069B1 (en) 1995-11-13 2002-10-29 Cirque Corporation Apparatus and method for tactile feedback from input device
US6100874A (en) * 1995-11-17 2000-08-08 Immersion Corporation Force feedback mouse interface
US5964661A (en) 1995-11-24 1999-10-12 Dodge; Samuel D. Apparatus and method for timing video games
US5730165A (en) * 1995-12-26 1998-03-24 Philipp; Harald Time domain capacitive field detector
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
USD385542S (en) 1996-01-05 1997-10-28 Microsoft Corporation Pointing device
USD382550S (en) 1996-01-16 1997-08-19 Microsoft Corporation Rear portion of a pointing device
US5754890A (en) 1996-02-01 1998-05-19 Microsoft Corporation System for automatic identification of a computer data entry device interface type using a transistor to sense the voltage generated by the interface and output a matching voltage level
JP3280559B2 (en) 1996-02-20 2002-05-13 シャープ株式会社 Jog dial simulation input device
FR2745400B1 (en) 1996-02-23 1998-05-07 Asulab Sa DEVICE FOR ENTERING DATA IN ELECTRONIC MEANS FOR PROCESSING SUCH DATA
US5808602A (en) 1996-03-15 1998-09-15 Compaq Computer Corporation Rotary cursor positioning apparatus
US5721849A (en) * 1996-03-29 1998-02-24 International Business Machines Corporation Method, memory and apparatus for postponing transference of focus to a newly opened window
US5815141A (en) 1996-04-12 1998-09-29 Elo Touch Systems, Inc. Resistive touchscreen having multiple selectable regions for pressure discrimination
AU2808697A (en) 1996-04-24 1997-11-12 Logitech, Inc. Touch and pressure sensing method and apparatus
US5859629A (en) * 1996-07-01 1999-01-12 Sun Microsystems, Inc. Linear touch input device
US5748185A (en) 1996-07-03 1998-05-05 Stratos Product Development Group Touchpad with scroll and pan regions
US6009336A (en) 1996-07-10 1999-12-28 Motorola, Inc. Hand-held radiotelephone having a detachable display
US5729219A (en) * 1996-08-02 1998-03-17 Motorola, Inc. Selective call radio with contraposed touchpad
US5943044A (en) 1996-08-05 1999-08-24 Interlink Electronics Force sensing semiconductive touchpad
DE19639119A1 (en) 1996-09-24 1998-03-26 Philips Patentverwaltung Electronic device with a bidirectional rotary switch
US5812239A (en) 1996-10-22 1998-09-22 Eger; Jeffrey J. Method of and arrangement for the enhancement of vision and/or hand-eye coordination
US5883619A (en) * 1996-11-12 1999-03-16 Primax Electronics Ltd. Computer mouse for scrolling a view of an image
US6128006A (en) 1998-03-26 2000-10-03 Immersion Corporation Force feedback mouse wheel and other control wheels
US6636197B1 (en) 1996-11-26 2003-10-21 Immersion Corporation Haptic feedback effects for control, knobs and other interface devices
JPH10188720A (en) * 1996-12-26 1998-07-21 Smk Corp Keyboard switch
US5889511A (en) * 1997-01-17 1999-03-30 Tritech Microelectronics International, Ltd. Method and system for noise reduction for digitizing devices
US5907318A (en) 1997-01-17 1999-05-25 Medina; Carlos A. Foot-controlled computer mouse
US6300946B1 (en) 1997-01-29 2001-10-09 Palm, Inc. Method and apparatus for interacting with a portable computer
US6227966B1 (en) 1997-02-19 2001-05-08 Kabushiki Kaisha Bandai Simulation device for fostering a virtual creature
JP2957507B2 (en) 1997-02-24 1999-10-04 インターナショナル・ビジネス・マシーンズ・コーポレイション Small information processing equipment
US6222528B1 (en) 1997-03-07 2001-04-24 Cirque Corporation Method and apparatus for data input
US5909211A (en) 1997-03-25 1999-06-01 International Business Machines Corporation Touch pad overlay driven computer system
FI115689B (en) 1997-05-21 2005-06-15 Nokia Corp Procedure and arrangement for scrolling information presented on mobile display
US6031518A (en) 1997-05-30 2000-02-29 Microsoft Corporation Ergonomic input device
DE19722636A1 (en) 1997-06-01 1998-12-03 Kilian Fremmer Multi function mouse for control of computer system
US5953000A (en) 1997-06-02 1999-09-14 Weirich; John P. Bounded-display-surface system for the input and output of computer data and video graphics
JP4137219B2 (en) 1997-06-05 2008-08-20 アルプス電気株式会社 Data input device
US5910802A (en) 1997-06-11 1999-06-08 Microsoft Corporation Operating system for handheld computing device having taskbar auto hide
JP3820595B2 (en) 1997-06-13 2006-09-13 セイコーエプソン株式会社 Display device, electronic device using the same, and polarization separator
USD402281S (en) 1997-06-18 1998-12-08 Microsoft Corporation Positional control device
JPH1115596A (en) 1997-06-19 1999-01-22 Alps Electric Co Ltd Data input device
US6020760A (en) 1997-07-16 2000-02-01 Altera Corporation I/O buffer circuit with pin multiplexing
TW462026B (en) 1997-07-19 2001-11-01 Primax Electronics Ltd Method for applying a 3D mouse in windows software
US6166721A (en) 1997-07-25 2000-12-26 Mitsumi Electric Co., Ltd. Mouse as computer input device having additional mechanism for controlling additional function such as scrolling
KR100294260B1 (en) * 1997-08-06 2001-07-12 윤종용 Touch panel device and portable computer installing the touch panel device
KR19990015738A (en) 1997-08-08 1999-03-05 윤종용 Handheld Computer with Touchpad Input Control
JP3978818B2 (en) 1997-08-08 2007-09-19 ソニー株式会社 Manufacturing method of micro head element
US5933102A (en) 1997-09-24 1999-08-03 Tanisys Technology, Inc. Capacitive sensitive switch method and system
KR200225264Y1 (en) 1997-10-01 2001-06-01 김순택 Portable display
US6496181B1 (en) 1997-10-03 2002-12-17 Siemens Information And Communication Mobile Llc Scroll select-activate button for wireless terminals
FR2770022B1 (en) * 1997-10-20 1999-12-03 Itt Mfg Enterprises Inc MULTIPLE ELECTRIC SWITCH WITH SINGLE OPERATION LEVER
US6181322B1 (en) * 1997-11-07 2001-01-30 Netscape Communications Corp. Pointing device having selection buttons operable from movement of a palm portion of a person's hands
US6243078B1 (en) 1998-06-23 2001-06-05 Immersion Corporation Pointing device with forced feedback button
US6211861B1 (en) * 1998-06-23 2001-04-03 Immersion Corporation Tactile mouse device
JP3865169B2 (en) 1997-11-28 2007-01-10 ソニー株式会社 COMMUNICATION TERMINAL DEVICE AND COMMUNICATION TERMINAL DEVICE CONTROL METHOD
US6256011B1 (en) 1997-12-03 2001-07-03 Immersion Corporation Multi-function control device with force feedback
JP3861273B2 (en) 1997-12-18 2006-12-20 ソニー株式会社 Portable information terminal device and information display control method for portable information terminal device
JPH11184601A (en) 1997-12-22 1999-07-09 Sony Corp Portable information terminal device, screen scroll method, recording medium and microcomputer device
US5933141A (en) 1998-01-05 1999-08-03 Gateway 2000, Inc. Mutatably transparent displays
JPH11194863A (en) 1998-01-06 1999-07-21 Poseidon Technical Systems:Kk Touch input detecting method and touch input detector
JPH11194883A (en) 1998-01-06 1999-07-21 Poseidon Technical Systems:Kk Touch operation type computer
JPH11194872A (en) 1998-01-06 1999-07-21 Poseidon Technical Systems:Kk Contact operation type input device and its electronic part
GB2333215B (en) 1998-01-13 2002-05-08 Sony Electronics Inc Systems and methods for enabling manipulation of a plurality of graphic images on a display screen
US6323846B1 (en) 1998-01-26 2001-11-27 University Of Delaware Method and apparatus for integrating manual input
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US8479122B2 (en) * 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US7800592B2 (en) 2005-03-04 2010-09-21 Apple Inc. Hand held electronic device with multiple touch sensing devices
US6225980B1 (en) 1998-02-06 2001-05-01 Carnegie Mellon University Multi-functional, rotary dial input device for portable computers
US6259491B1 (en) 1998-02-06 2001-07-10 Motorola, Inc. Double sided laminated liquid crystal display touchscreen and method of making same for use in a wireless communication device
TW469379B (en) 1998-02-16 2001-12-21 Sony Computer Entertainment Inc Portable electronic device
US6313853B1 (en) * 1998-04-16 2001-11-06 Nortel Networks Limited Multi-service user interface
JPH11311523A (en) 1998-04-28 1999-11-09 Aisin Aw Co Ltd Navigation apparatus for vehicle
USD412940S (en) 1998-05-14 1999-08-17 Sega Enterprises, Ltd. Video game machine
TW541193B (en) 1998-06-01 2003-07-11 Sony Computer Entertainment Inc Portable electronic machine and entertaining system
USD437860S1 (en) * 1998-06-01 2001-02-20 Sony Corporation Selector for audio visual apparatus
US6563487B2 (en) 1998-06-23 2003-05-13 Immersion Corporation Haptic feedback for directional control pads
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6262717B1 (en) 1998-07-02 2001-07-17 Cirque Corporation Kiosk touch pad
US6452427B1 (en) 1998-07-07 2002-09-17 Wen H. Ko Dual output capacitance interface circuit
US6188391B1 (en) * 1998-07-09 2001-02-13 Synaptics, Inc. Two-layer capacitive touchpad and method of making same
US6243080B1 (en) 1998-07-14 2001-06-05 Ericsson Inc. Touch-sensitive panel with selector
TW383883U (en) 1998-08-18 2000-03-01 Ind Tech Res Inst Remote network browser with turning button selection element
JP4019515B2 (en) 1998-08-21 2007-12-12 松下電器産業株式会社 Push / turn operation type electronic component and communication terminal device using the same
US6002093A (en) 1998-08-21 1999-12-14 Dell Usa, L.P. Button with flexible cantilever
US6188393B1 (en) * 1998-10-05 2001-02-13 Sysgration Ltd. Scroll bar input device for mouse
US6198473B1 (en) * 1998-10-06 2001-03-06 Brad A. Armstrong Computer mouse with enhanced control button(s)
US6225976B1 (en) 1998-10-30 2001-05-01 Interlink Electronics, Inc. Remote computer input peripheral
US6678891B1 (en) * 1998-11-19 2004-01-13 Prasara Technologies, Inc. Navigational user interface for interactive television
GB2345193B (en) 1998-12-22 2002-07-24 Nokia Mobile Phones Ltd Metallic keys
JP2000200147A (en) 1999-01-06 2000-07-18 Fujitsu Takamisawa Component Ltd Input device
US6552719B2 (en) * 1999-01-07 2003-04-22 Microsoft Corporation System and method for automatically switching between writing and text input modes
JP2000215549A (en) 1999-01-22 2000-08-04 Sony Corp Portable audio reproducing device
WO2000044018A1 (en) 1999-01-26 2000-07-27 Harald Philipp Capacitive sensor and array
US6104790A (en) 1999-01-29 2000-08-15 International Business Machines Corporation Graphical voice response system and method therefor
US6373265B1 (en) 1999-02-02 2002-04-16 Nitta Corporation Electrostatic capacitive touch sensor
US6377530B1 (en) * 1999-02-12 2002-04-23 Compaq Computer Corporation System and method for playing compressed audio data
JP4172867B2 (en) 1999-02-22 2008-10-29 富士通コンポーネント株式会社 Mouse with wheel
SE513866C2 (en) 1999-03-12 2000-11-20 Spectronic Ab Hand- or pocket-worn electronic device and hand-controlled input device
JP2000267797A (en) 1999-03-15 2000-09-29 Seiko Epson Corp Information processor
JP2000267777A (en) 1999-03-16 2000-09-29 Internatl Business Mach Corp <Ibm> Method for inputting numerical value using touch panel and inputting device
JP2000267786A (en) 1999-03-16 2000-09-29 Ntt Docomo Inc Information communication equipment
US6338013B1 (en) 1999-03-19 2002-01-08 Bryan John Ruffner Multifunctional mobile appliance
US6147856A (en) 1999-03-31 2000-11-14 International Business Machine Corporation Variable capacitor with wobble motor disc selector
TW431607U (en) 1999-04-02 2001-04-21 Quanta Comp Inc Touch plate structure for notebook computer
USD443616S1 (en) 1999-04-06 2001-06-12 Microsoft Corporation Portion of a computer input device
USD442592S1 (en) 1999-04-06 2001-05-22 Microsoft Corporation Portion of a computer input device
JP3742529B2 (en) 1999-05-10 2006-02-08 アルプス電気株式会社 Coordinate input device
US6357887B1 (en) * 1999-05-14 2002-03-19 Apple Computer, Inc. Housing for a computing device
US6977808B2 (en) 1999-05-14 2005-12-20 Apple Computer, Inc. Display housing for computing device
US6297811B1 (en) 1999-06-02 2001-10-02 Elo Touchsystems, Inc. Projective capacitive touchscreen
JP2000353045A (en) 1999-06-09 2000-12-19 Canon Inc Portable information processor and focus movement control method
US7151528B2 (en) * 1999-06-22 2006-12-19 Cirque Corporation System for disposing a proximity sensitive touchpad behind a mobile phone keypad
US6639584B1 (en) 1999-07-06 2003-10-28 Chuang Li Methods and apparatus for controlling a portable electronic device using a touchpad
JP2001023473A (en) 1999-07-07 2001-01-26 Matsushita Electric Ind Co Ltd Mobile communication terminal unit and transparent touch panel switch for use in it
US6396523B1 (en) 1999-07-29 2002-05-28 Interlink Electronics, Inc. Home entertainment device remote control
US6677927B1 (en) * 1999-08-23 2004-01-13 Microsoft Corporation X-Y navigation input device
JP2001076582A (en) 1999-09-01 2001-03-23 Matsushita Electric Ind Co Ltd Electronic apparatus
US6492979B1 (en) 1999-09-07 2002-12-10 Elo Touchsystems, Inc. Dual sensor touchscreen utilizing projective-capacitive and force touch sensors
US6641154B1 (en) 1999-09-09 2003-11-04 Jeffrey Vey Air bladder suspension for three-wheeled vehicle
US6606244B1 (en) * 1999-09-10 2003-08-12 Saint Song Corp. Pointing device having computer host
US6865718B2 (en) 1999-09-29 2005-03-08 Microsoft Corp. Accelerated scrolling
US6424338B1 (en) 1999-09-30 2002-07-23 Gateway, Inc. Speed zone touchpad
JP4222704B2 (en) 1999-10-12 2009-02-12 株式会社ノーバス Information input device
US6757002B1 (en) 1999-11-04 2004-06-29 Hewlett-Packard Development Company, L.P. Track pad pointing device with areas of specialized function
US6844871B1 (en) 1999-11-05 2005-01-18 Microsoft Corporation Method and apparatus for computer input using six degrees of freedom
US7006077B1 (en) 1999-11-30 2006-02-28 Nokia Mobile Phones, Ltd. Electronic device having touch sensitive slide
USD430169S (en) 1999-12-15 2000-08-29 Advanced Communication Design, Inc. Interactive multimedia control panel with speakers
US6978127B1 (en) 1999-12-16 2005-12-20 Koninklijke Philips Electronics N.V. Hand-ear user interface for hand-held device
US6248017B1 (en) 1999-12-23 2001-06-19 Hasbro, Inc Hand-held electronic game with rotatable display
US6179496B1 (en) * 1999-12-28 2001-01-30 Shin Jiuh Corp. Computer keyboard with turnable knob
US20040252867A1 (en) 2000-01-05 2004-12-16 Je-Hsiung Lan Biometric sensor
US6844872B1 (en) * 2000-01-12 2005-01-18 Apple Computer, Inc. Computer mouse having side areas to maintain a depressed button position
US6373470B1 (en) 2000-01-12 2002-04-16 Apple Computer, Inc. Cursor control device having an integral top member
GB2359177A (en) 2000-02-08 2001-08-15 Nokia Corp Orientation sensitive display and selection mechanism
AU2001231524A1 (en) 2000-02-10 2001-08-20 Ergomouse Pty. Ltd. Pointing means for a computer
US6492602B2 (en) 2000-02-10 2002-12-10 Alps Electric Co., Ltd. Two-position pushbutton switch
US20010050673A1 (en) 2000-02-14 2001-12-13 Davenport Anthony G. Ergonomic fingertip computer mouse
DE10011645A1 (en) 2000-03-10 2001-09-13 Ego Elektro Geraetebau Gmbh Touch switch with an LC display
JP3754268B2 (en) 2000-04-07 2006-03-08 三洋電機株式会社 Key input device and mobile phone with the same
US6765557B1 (en) * 2000-04-10 2004-07-20 Interlink Electronics, Inc. Remote control having touch pad to screen mapping
CA2405846C (en) * 2000-04-11 2007-09-04 Cirque Corporation Efficient entry of characters into a portable information appliance
JP4325075B2 (en) * 2000-04-21 2009-09-02 ソニー株式会社 Data object management device
AU144018S (en) 2000-05-09 2001-05-24 Sony Computer Entertainment Inc Control unit
US6340800B1 (en) * 2000-05-27 2002-01-22 International Business Machines Corporation Multiplexing control device and method for electronic systems
US6640250B1 (en) 2000-05-31 2003-10-28 3Com Corporation Method and apparatus for previewing and selecting a network resource using a rotary knob for user input
US6724817B1 (en) 2000-06-05 2004-04-20 Amphion Semiconductor Limited Adaptive image data compression
JP2001350188A (en) 2000-06-06 2001-12-21 Olympus Optical Co Ltd Camera apparatus
FI108901B (en) 2000-06-26 2002-04-15 Nokia Corp Touch-sensitive electromechanical data input mechanism
JP3785902B2 (en) * 2000-07-11 2006-06-14 インターナショナル・ビジネス・マシーンズ・コーポレーション Device, device control method, pointer movement method
USD454568S1 (en) * 2000-07-17 2002-03-19 Apple Computer, Inc. Mouse
US7117136B1 (en) 2000-08-18 2006-10-03 Linden Research, Inc. Input and feedback system
JP2002077329A (en) 2000-08-31 2002-03-15 Nintendo Co Ltd Electronic device
US6497412B1 (en) 2000-09-08 2002-12-24 Peter J. Bramm Method and apparatus for playing a quiz game
US6788288B2 (en) * 2000-09-11 2004-09-07 Matsushita Electric Industrial Co., Ltd. Coordinate input device and portable information apparatus equipped with coordinate input device
JP2002107806A (en) 2000-09-29 2002-04-10 Fuji Photo Optical Co Ltd Structure of operation button part
US7667123B2 (en) * 2000-10-13 2010-02-23 Phillips Mark E System and method for musical playlist selection in a portable audio device
US6810271B1 (en) 2000-10-31 2004-10-26 Nokia Mobile Phones Ltd. Keypads for electrical devices
DE20019074U1 (en) 2000-11-09 2001-01-18 Siemens Ag Mobile electronic device with display and control element
US6897853B2 (en) * 2000-11-10 2005-05-24 Microsoft Corp. Highlevel active pen matrix
USD455793S1 (en) * 2000-12-04 2002-04-16 Legend Technology Co., Ltd. Liquid crystal display monitor for multi-media games
USD452250S1 (en) 2000-12-06 2001-12-18 Perfect Union Co., Ltd. MP3 player
US7054441B2 (en) 2000-12-12 2006-05-30 Research In Motion Limited Mobile device having a protective user interface cover
US20070018970A1 (en) * 2000-12-22 2007-01-25 Logitech Europe S.A. Optical slider for input devices
JP2002202855A (en) 2000-12-28 2002-07-19 Matsushita Electric Ind Co Ltd Touch panel and electronic equipment using the same
US6999804B2 (en) 2001-01-22 2006-02-14 Wildseed, Ltd. Interchangeable covering additions to a mobile communication device for display and key reorientation
JP2002215311A (en) 2001-01-22 2002-08-02 Sony Corp Portable terminal device, image plane information selecting method, and recording-readable medium
US20020103796A1 (en) 2001-01-31 2002-08-01 Sonicblue, Inc. Method for parametrically sorting music files
US6686904B1 (en) * 2001-03-30 2004-02-03 Microsoft Corporation Wheel reporting method for a personal computer keyboard interface
US6750803B2 (en) * 2001-02-23 2004-06-15 Interlink Electronics, Inc. Transformer remote control
US6738045B2 (en) 2001-02-26 2004-05-18 Microsoft Corporation Method and system for accelerated data navigation
US6781576B2 (en) 2001-03-14 2004-08-24 Sensation, Inc. Wireless input apparatus and method using a three-dimensional pointing device
USD450713S1 (en) 2001-03-16 2001-11-20 Sony Corporation Audio player
US6873863B2 (en) 2001-03-19 2005-03-29 Nokia Mobile Phones Ltd. Touch sensitive navigation surfaces for mobile telecommunication systems
US6879930B2 (en) * 2001-03-30 2005-04-12 Microsoft Corporation Capacitance touch slider
US6822640B2 (en) 2001-04-10 2004-11-23 Hewlett-Packard Development Company, L.P. Illuminated touch pad
US6587091B2 (en) 2001-04-23 2003-07-01 Michael Lawrence Serpa Stabilized tactile output mechanism for computer interface devices
US6608616B2 (en) 2001-04-23 2003-08-19 Silitek Corporation Ergonomic scrolling device
AU2002257217A1 (en) * 2001-04-24 2002-11-05 Broadcom Corporation Alerting system, architecture and circuitry
US6700564B2 (en) 2001-04-30 2004-03-02 Microsoft Corporation Input device including a wheel assembly for scrolling an image in multiple directions
US7239800B2 (en) 2001-05-02 2007-07-03 David H. Sitrick Portable player for personal video recorders
US7206599B2 (en) 2001-05-09 2007-04-17 Kyocera Wireless Corp. Integral navigation keys for a mobile handset
US20050024341A1 (en) * 2001-05-16 2005-02-03 Synaptics, Inc. Touch screen with user interface enhancement
US20030043121A1 (en) * 2001-05-22 2003-03-06 Richard Chen Multimedia pointing device
FI20015005A (en) 2001-05-31 2002-12-01 Nokia Corp A mobile station comprising a display element
US7113196B2 (en) 2001-06-15 2006-09-26 Apple Computer, Inc. Computing device with dynamic ornamental appearance
US7452098B2 (en) * 2001-06-15 2008-11-18 Apple Inc. Active enclosure for computing device
US7766517B2 (en) 2001-06-15 2010-08-03 Apple Inc. Active enclosure for computing device
US20020196239A1 (en) 2001-06-26 2002-12-26 Lee Siew Fei Joy-dial for providing input signals to a device
US6791533B2 (en) 2001-06-28 2004-09-14 Behavior Tech Computer Corporation Seamless mouse
JP2003015796A (en) 2001-07-02 2003-01-17 Sharp Corp Key inputting device
JP2003022057A (en) 2001-07-09 2003-01-24 Alps Electric Co Ltd Image signal driving circuit and display device equipped with image signal driving circuit
US20030050092A1 (en) * 2001-08-03 2003-03-13 Yun Jimmy S. Portable digital player--battery
KR100474724B1 (en) * 2001-08-04 2005-03-08 삼성전자주식회사 Apparatus having touch screen and external display device using method therefor
JP4485103B2 (en) 2001-08-10 2010-06-16 京セラ株式会社 Mobile terminal device
US6985137B2 (en) * 2001-08-13 2006-01-10 Nokia Mobile Phones Ltd. Method for preventing unintended touch pad input due to accidental touching
US6690365B2 (en) * 2001-08-29 2004-02-10 Microsoft Corporation Automatic scrolling
US6727889B2 (en) * 2001-09-14 2004-04-27 Stephen W. Shaw Computer mouse input device with multi-axis palm control
JP2003099198A (en) 2001-09-25 2003-04-04 Shinichi Komatsu Touch panel using four-contact input
US6703550B2 (en) * 2001-10-10 2004-03-09 Immersion Corporation Sound data output and manipulation using haptic feedback
USD469109S1 (en) * 2001-10-22 2003-01-21 Apple Computer, Inc. Media player
US20070085841A1 (en) 2001-10-22 2007-04-19 Apple Computer, Inc. Method and apparatus for accelerated scrolling
US7084856B2 (en) * 2001-10-22 2006-08-01 Apple Computer, Inc. Mouse having a rotary dial
US7312785B2 (en) * 2001-10-22 2007-12-25 Apple Inc. Method and apparatus for accelerated scrolling
US7345671B2 (en) * 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs
TWI220491B (en) 2001-11-09 2004-08-21 Prolific Technology Inc Input device and input method thereof
JP2003150303A (en) 2001-11-09 2003-05-23 Ota Kazuhiko Two-stage selection type character input device
US7009599B2 (en) 2001-11-20 2006-03-07 Nokia Corporation Form factor for portable device
KR200265059Y1 (en) 2001-11-30 2002-02-21 주식회사 성림정공 Can cap
US6825833B2 (en) 2001-11-30 2004-11-30 3M Innovative Properties Company System and method for locating a touch on a capacitive touch screen
AU2002356643A1 (en) 2001-12-11 2003-06-23 Wolfgang Fallot-Burghardt Combination consisting of a computer keyboard and mouse control device
FI20012610A (en) 2001-12-31 2003-07-01 Nokia Corp Electronic device and control element
JP2003296015A (en) 2002-01-30 2003-10-17 Casio Comput Co Ltd Electronic equipment
JP2005301322A (en) 2002-02-07 2005-10-27 Kathenas Inc Input device, cellular phone, and portable information device
US7333092B2 (en) 2002-02-25 2008-02-19 Apple Computer, Inc. Touch pad for handheld device
US6795057B2 (en) 2002-02-28 2004-09-21 Agilent Technologies, Inc. Facile ergonomic computer pointing device
US6658773B2 (en) 2002-03-11 2003-12-09 Dennis Rohne Label with luminescence inside
USD468365S1 (en) * 2002-03-12 2003-01-07 Digisette, Llc Dataplay player
US7233318B1 (en) 2002-03-13 2007-06-19 Apple Inc. Multi-button mouse
JP4175007B2 (en) 2002-03-22 2008-11-05 松下電器産業株式会社 Rotation operation type input device
JP2003280799A (en) 2002-03-25 2003-10-02 Sony Corp Information input device and electronic equipment using the same
EP1351121A3 (en) 2002-03-26 2009-10-21 Polymatech Co., Ltd. Input Device
JP4020246B2 (en) 2002-03-26 2007-12-12 ポリマテック株式会社 Touchpad device
TW564694U (en) 2002-03-28 2003-12-01 Universal Trim Supply Co Ltd Safety surface buckle capable of preventing children from biting
US7466307B2 (en) 2002-04-11 2008-12-16 Synaptics Incorporated Closed-loop sensor on a solid-state object position detector
US7111788B2 (en) 2002-04-22 2006-09-26 Nokia Corporation System and method for navigating applications using a graphical user interface
JP2003323259A (en) 2002-05-02 2003-11-14 Nec Corp Information processing apparatus
DE50308334D1 (en) 2002-05-07 2007-11-22 Schott Ag Lighting device for buttons
USD483809S1 (en) 2002-05-13 2003-12-16 Storm Electronics Company Limited System selector for electronic game console
JP4090939B2 (en) 2002-05-29 2008-05-28 ニッタ株式会社 Capacitive sensor and manufacturing method thereof
US7780463B2 (en) 2002-06-11 2010-08-24 Henry Milan Selective flash memory drive with quick connector
US7327352B2 (en) 2002-06-14 2008-02-05 3M Innovative Properties Company Linearized conductive surface
DE10228185A1 (en) 2002-06-24 2004-01-22 Völckers, Oliver Device for detecting a mechanical actuation of an input element using digital technology and method for processing and converting the digital input signal into commands for controlling a consumer
JP4147839B2 (en) 2002-06-26 2008-09-10 ポリマテック株式会社 Sliding multi-directional input key
JP4086564B2 (en) 2002-07-04 2008-05-14 キヤノン株式会社 Switch button and recording device
US7743000B2 (en) * 2002-07-16 2010-06-22 Hewlett-Packard Development Company, L.P. Printer
TW547716U (en) 2002-07-31 2003-08-11 Jia-Jen Wu Positioning structure for the cursor on a touch panel of portable computer
US7446757B2 (en) * 2002-09-17 2008-11-04 Brother Kogyo Kabushiki Kaisha Foldable display, input device provided with the display and foldable keyboard, and personal computer provided with the input device
US7196931B2 (en) 2002-09-24 2007-03-27 Sandisk Corporation Non-volatile memory and method with reduced source line bias errors
TWM243724U (en) 2002-09-26 2004-09-11 Wistron Corp Button illumination module for data processing device
US6894916B2 (en) 2002-09-27 2005-05-17 International Business Machines Corporation Memory array employing single three-terminal non-volatile storage elements
US20040080682A1 (en) 2002-10-29 2004-04-29 Dalton Dan L. Apparatus and method for an improved electronic display
JP3900063B2 (en) 2002-10-30 2007-04-04 株式会社デンソー Mobile phone case
MXPA03009945A (en) 2002-11-05 2007-04-16 Lg Electronics Inc Touch screen mounting assembly for LCD monitor
JP4205408B2 (en) 2002-11-20 2009-01-07 大日本印刷株式会社 Product information management system and product information management program
US6784384B2 (en) 2002-12-03 2004-08-31 Samsung Electronics Co., Ltd. Rotation key device for a portable terminal
US7236154B1 (en) 2002-12-24 2007-06-26 Apple Inc. Computer light adjustment
TWI237282B (en) * 2003-01-07 2005-08-01 Pentax Corp Push button device having an illuminator
US7730430B2 (en) 2003-01-24 2010-06-01 Microsoft Corporation High density cursor system and method
JP4344639B2 (en) 2003-04-11 2009-10-14 日本航空電子工業株式会社 Press operation type switch unit
US7392411B2 (en) 2003-04-25 2008-06-24 Ati Technologies, Inc. Systems and methods for dynamic voltage scaling of communication bus to provide bandwidth based on whether an application is active
USD497618S1 (en) 2003-04-25 2004-10-26 Apple Computer, Inc. Media device
US7627343B2 (en) 2003-04-25 2009-12-01 Apple Inc. Media player system
EP1621000B1 (en) 2003-05-08 2011-02-23 Nokia Corporation A mobile telephone having a rotator input device
GB0312465D0 (en) 2003-05-30 2003-07-09 Therefore Ltd A data input method for a computing device
US20040239622A1 (en) 2003-05-30 2004-12-02 Proctor David W. Apparatus, systems and methods relating to improved user interaction with a computing device
JP2004362097A (en) 2003-06-03 2004-12-24 Fujitsu Ltd Glide point device with scroll function, personal computer, keyboard and program
US20040253989A1 (en) 2003-06-12 2004-12-16 Tupler Amy M. Radio communication device having a navigational wheel
FI116548B (en) 2003-06-18 2005-12-15 Nokia Corp Digital multidirectional control switch
US9160714B2 (en) 2003-06-30 2015-10-13 Telefonaktiebolaget L M Ericsson (Publ) Using tunneling to enhance remote LAN connectivity
US7250907B2 (en) 2003-06-30 2007-07-31 Microsoft Corporation System and methods for determining the location dynamics of a portable computing device
JP2005030901A (en) 2003-07-11 2005-02-03 Alps Electric Co Ltd Capacitive sensor
US7265686B2 (en) 2003-07-15 2007-09-04 Tyco Electronics Corporation Touch sensor with non-uniform resistive band
KR100522940B1 (en) * 2003-07-25 2005-10-24 삼성전자주식회사 Touch screen system having active area setting function and control method thereof
US20050030048A1 (en) * 2003-08-05 2005-02-10 Bolender Robert J. Capacitive sensing device for use in a keypad assembly
USD489731S1 (en) 2003-08-05 2004-05-11 Tatung Co., Ltd. Portable media player
US20070152977A1 (en) * 2005-12-30 2007-07-05 Apple Computer, Inc. Illuminated touchpad
US7499040B2 (en) * 2003-08-18 2009-03-03 Apple Inc. Movable touch pad with added functionality
US20060181517A1 (en) 2005-02-11 2006-08-17 Apple Computer, Inc. Display actuator
DE212004000044U1 (en) 2003-08-21 2006-06-01 Philipp, Harald, Hamble Capacitive position sensor
US6930494B2 (en) 2003-08-29 2005-08-16 Agilent Technologies, Inc. Capacitive probe assembly with flex circuit
JP4214025B2 (en) 2003-09-04 2009-01-28 株式会社東海理化電機製作所 Monitor display control device
US20050052426A1 (en) * 2003-09-08 2005-03-10 Hagermoser E. Scott Vehicle touch input device and methods of making same
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7280346B2 (en) 2003-09-29 2007-10-09 Danger, Inc. Adjustable display for a data processing apparatus
US8068186B2 (en) 2003-10-15 2011-11-29 3M Innovative Properties Company Patterned conductor touch screen having improved optics
US7181251B2 (en) 2003-10-22 2007-02-20 Nokia Corporation Mobile communication terminal with multi orientation user interface
US7495659B2 (en) 2003-11-25 2009-02-24 Apple Inc. Touch pad for handheld device
US8059099B2 (en) 2006-06-02 2011-11-15 Apple Inc. Techniques for interactive input to portable electronic devices
US20050113144A1 (en) 2003-11-26 2005-05-26 Tupler Amy M. Pivotal display for a mobile communications device
KR100754687B1 (en) 2003-12-12 2007-09-03 삼성전자주식회사 Multi input device of wireless terminal and its control method
JP4165646B2 (en) 2003-12-25 2008-10-15 ポリマテック株式会社 Key sheet
US7307624B2 (en) 2003-12-30 2007-12-11 3M Innovative Properties Company Touch sensor with linearized response
US7085590B2 (en) 2003-12-31 2006-08-01 Sony Ericsson Mobile Communications Ab Mobile terminal with ergonomic imaging functions
CA106580S (en) 2004-01-05 2005-10-31 Apple Computer Media device
US20050162402A1 (en) 2004-01-27 2005-07-28 Watanachote Susornpol J. Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
WO2005076117A1 (en) 2004-02-10 2005-08-18 Takuya Ogihara Touch screen-type input device
KR100611182B1 (en) * 2004-02-27 2006-08-10 삼성전자주식회사 Portable electronic device for changing menu display state according to rotating degree and method thereof
US7487441B2 (en) 2004-03-11 2009-02-03 Yahoo! Inc. Method and system of enhanced messaging
US7623119B2 (en) 2004-04-21 2009-11-24 Nokia Corporation Graphical functions by gestures
ATE375544T1 (en) 2004-04-22 2007-10-15 Sony Ericsson Mobile Comm Ab CONTROL INTERFACE FOR AN ELECTRONIC DEVICE
US7310089B2 (en) 2004-05-18 2007-12-18 Interlink Electronics, Inc. Annular potentiometric touch sensor
US7382139B2 (en) 2004-06-03 2008-06-03 Synaptics Incorporated One layer capacitive sensing apparatus having varying width sensing elements
CN100483319C (en) 2004-06-17 2009-04-29 皇家飞利浦电子股份有限公司 Use of a two finger input on touch screens
JP2008511045A (en) * 2004-08-16 2008-04-10 フィンガーワークス・インコーポレーテッド Method for improving the spatial resolution of a touch sense device
US7737953B2 (en) * 2004-08-19 2010-06-15 Synaptics Incorporated Capacitive sensing apparatus having varying depth sensing elements
WO2006021211A2 (en) 2004-08-23 2006-03-02 Bang & Olufsen A/S Operating panel
DE102004043663B4 (en) 2004-09-07 2006-06-08 Infineon Technologies Ag Semiconductor sensor component with cavity housing and sensor chip and method for producing a semiconductor sensor component with cavity housing and sensor chip
US7735012B2 (en) 2004-11-04 2010-06-08 Apple Inc. Audio user interface for computing devices
FR2878646B1 (en) 2004-11-26 2007-02-09 Itt Mfg Enterprises Inc ELECTRICAL SWITCH WITH MULTIPLE SWITCHES
JP4319975B2 (en) 2004-12-21 2009-08-26 アルプス電気株式会社 Input device
EP1677182B1 (en) 2004-12-28 2014-04-23 Sony Mobile Communications Japan, Inc. Display method, portable terminal device, and display program
JP4238222B2 (en) 2005-01-04 2009-03-18 インターナショナル・ビジネス・マシーンズ・コーポレーション Object editing system, object editing method, and object editing program
US7593782B2 (en) 2005-01-07 2009-09-22 Apple Inc. Highly portable media device
US7471284B2 (en) 2005-04-15 2008-12-30 Microsoft Corporation Tactile scroll bar with illuminated document position indicator
US7466040B2 (en) 2005-04-19 2008-12-16 Frederick Johannes Bruwer Touch sensor controlled switch with intelligent user interface
US7710397B2 (en) 2005-06-03 2010-05-04 Apple Inc. Mouse with improved input mechanisms using touch sensors
US8300841B2 (en) 2005-06-03 2012-10-30 Apple Inc. Techniques for presenting sound effects on a portable media player
KR100538572B1 (en) 2005-06-14 2005-12-23 (주)멜파스 Apparatus for controlling digital device based on touch input interface capable of visual input feedback and method for the same
US7279647B2 (en) 2005-06-17 2007-10-09 Harald Philipp Control panel
US7288732B2 (en) 2005-07-06 2007-10-30 Alps Electric Co., Ltd. Multidirectional input device
JP4256866B2 (en) * 2005-09-01 2009-04-22 ポリマテック株式会社 Key sheet and key sheet manufacturing method
US7503193B2 (en) 2005-09-02 2009-03-17 Bsh Home Appliances Corporation Button apparatus and method of manufacture
US7671837B2 (en) * 2005-09-06 2010-03-02 Apple Inc. Scrolling input arrangements using capacitive sensors on a flexible membrane
US7880729B2 (en) 2005-10-11 2011-02-01 Apple Inc. Center button isolation ring
JP2007123473A (en) 2005-10-27 2007-05-17 Alps Electric Co Ltd Soft magnetic film, its manufacturing method, thin film magnetic head using the same and its manufacturing method
US8552988B2 (en) 2005-10-31 2013-10-08 Hewlett-Packard Development Company, L.P. Viewing device having a touch pad
US7839391B2 (en) 2005-11-04 2010-11-23 Electronic Theatre Controls, Inc. Segmented touch screen console with module docking
US7834850B2 (en) 2005-11-29 2010-11-16 Navisense Method and system for object control
US7788607B2 (en) 2005-12-01 2010-08-31 Navisense Method and system for mapping virtual coordinates
US20070152983A1 (en) 2005-12-30 2007-07-05 Apple Computer, Inc. Touch pad with symbols based on mode
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US7860536B2 (en) 2006-01-05 2010-12-28 Apple Inc. Telephone interface for a portable communication device
JP4463770B2 (en) 2006-01-25 2010-05-19 Ykk株式会社 Manufacturing method of physical quantity detector
KR100767686B1 (en) 2006-03-30 2007-10-17 엘지전자 주식회사 Terminal device having touch wheel and method for inputting instructions therefor
DE202007005237U1 (en) 2006-04-25 2007-07-05 Philipp, Harald, Southampton Touch-sensitive position sensor for use in control panel, has bus bars arranged at distance to substrate, and detection region with units that are arranged at distance by non-conductive openings such that current flows into region
US20070247421A1 (en) 2006-04-25 2007-10-25 Timothy James Orsley Capacitive-based rotational positioning input device
US20070252853A1 (en) 2006-04-28 2007-11-01 Samsung Electronics Co., Ltd. Method and apparatus to control screen orientation of user interface of portable device
US7996788B2 (en) 2006-05-18 2011-08-09 International Apparel Group, Llc System and method for navigating a dynamic collection of information
US8059102B2 (en) 2006-06-13 2011-11-15 N-Trig Ltd. Fingertip touch recognition for a digitizer
US20070291016A1 (en) 2006-06-20 2007-12-20 Harald Philipp Capacitive Position Sensor
US8068097B2 (en) 2006-06-27 2011-11-29 Cypress Semiconductor Corporation Apparatus for detecting conductive material of a pad layer of a sensing device
US8743060B2 (en) 2006-07-06 2014-06-03 Apple Inc. Mutual capacitance touch sensing device
US9360967B2 (en) * 2006-07-06 2016-06-07 Apple Inc. Mutual capacitance touch sensing device
US8022935B2 (en) * 2006-07-06 2011-09-20 Apple Inc. Capacitance sensing electrode with integrated I/O mechanism
US20080007529A1 (en) 2006-07-07 2008-01-10 Tyco Electronics Corporation Touch sensor
US20080006454A1 (en) * 2006-07-10 2008-01-10 Apple Computer, Inc. Mutual capacitance touch sensing device
US7688080B2 (en) * 2006-07-17 2010-03-30 Synaptics Incorporated Variably dimensioned capacitance sensor elements
US7253643B1 (en) 2006-07-19 2007-08-07 Cypress Semiconductor Corporation Uninterrupted radial capacitive sense interface
CN101110299B (en) 2006-07-21 2012-07-25 深圳富泰宏精密工业有限公司 Key structure and portable electronic device with this structure
US7645955B2 (en) * 2006-08-03 2010-01-12 Altek Corporation Metallic linkage-type keying device
US20080036473A1 (en) * 2006-08-09 2008-02-14 Jansson Hakan K Dual-slope charging relaxation oscillator for measuring capacitance
US8564544B2 (en) * 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US7795553B2 (en) 2006-09-11 2010-09-14 Apple Inc. Hybrid button
US20080069412A1 (en) 2006-09-15 2008-03-20 Champagne Katrina S Contoured biometric sensor
US7965281B2 (en) 2006-10-03 2011-06-21 Synaptics, Inc. Unambiguous capacitance sensing using shared inputs
US8786553B2 (en) 2006-10-06 2014-07-22 Kyocera Corporation Navigation pad and method of using same
US20080088600A1 (en) 2006-10-11 2008-04-17 Apple Inc. Method and apparatus for implementing multiple push buttons in a user input device
US20080088597A1 (en) 2006-10-11 2008-04-17 Apple Inc. Sensor configurations in a user input device
US8274479B2 (en) 2006-10-11 2012-09-25 Apple Inc. Gimballed scroll wheel
US7772507B2 (en) 2006-11-03 2010-08-10 Research In Motion Limited Switch assembly and associated handheld electronic device
US20080110739A1 (en) 2006-11-13 2008-05-15 Cypress Semiconductor Corporation Touch-sensor device having electronic component situated at least partially within sensor element perimeter
US8482530B2 (en) 2006-11-13 2013-07-09 Apple Inc. Method of capacitively sensing finger position
US20080143681A1 (en) 2006-12-18 2008-06-19 Xiaoping Jiang Circular slider with center button
JP5041135B2 (en) 2006-12-26 2012-10-03 ライオン株式会社 Oral composition and oral biofilm formation inhibitor
US9710095B2 (en) 2007-01-05 2017-07-18 Apple Inc. Touch screen stack-ups
WO2008090516A1 (en) 2007-01-22 2008-07-31 Nokia Corporation System and method for screen orientation in a rich media environment
GB2446702A (en) 2007-02-13 2008-08-20 Qrg Ltd Touch Control Panel with Pressure Sensor
US20080196945A1 (en) 2007-02-21 2008-08-21 Jason Konstas Preventing unintentional activation of a sensor element of a sensing device
KR100868353B1 (en) 2007-03-08 2008-11-12 한국화학연구원 Piperazinyl-propyl-pyrazole derivatives as dopamine D4 receptor antagonists, and pharmaceutical compositions containing them
US20090033635A1 (en) * 2007-04-12 2009-02-05 Kwong Yuen Wai Instruments, Touch Sensors for Instruments, and Methods of Making the Same
CN101295595B (en) 2007-04-26 2012-10-10 鸿富锦精密工业(深圳)有限公司 Key
US7742783B2 (en) 2007-05-10 2010-06-22 Virgin Mobile Usa, L.P. Symmetric softkeys on a mobile electronic device
US20090036176A1 (en) * 2007-08-01 2009-02-05 Ure Michael J Interface with and communication between mobile electronic devices
US20090058802A1 (en) 2007-08-27 2009-03-05 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Input device
US20090058801A1 (en) 2007-09-04 2009-03-05 Apple Inc. Fluid motion user interface control
US7910843B2 (en) 2007-09-04 2011-03-22 Apple Inc. Compact input device
US20090073130A1 (en) 2007-09-17 2009-03-19 Apple Inc. Device having cover with integrally formed sensor
KR100836628B1 (en) 2007-09-20 2008-06-10 삼성전기주식회사 Rotational inputting apparatus
US20090109181A1 (en) 2007-10-26 2009-04-30 Research In Motion Limited Touch screen and electronic device
JP5080938B2 (en) 2007-10-31 2012-11-21 株式会社竹中工務店 Vibration control device
US8416198B2 (en) 2007-12-03 2013-04-09 Apple Inc. Multi-dimensional scroll wheel
US8125461B2 (en) 2008-01-11 2012-02-28 Apple Inc. Dynamic input graphic display
JP5217464B2 (en) 2008-01-31 2013-06-19 株式会社ニコン Lighting device, projector, and camera
US8820133B2 (en) 2008-02-01 2014-09-02 Apple Inc. Co-extruded materials and methods
US9454256B2 (en) 2008-03-14 2016-09-27 Apple Inc. Sensor configurations of an input device that are switchable based on mode
US20100058251A1 (en) 2008-08-27 2010-03-04 Apple Inc. Omnidirectional gesture detection
US20100060568A1 (en) 2008-09-05 2010-03-11 Apple Inc. Curved surface input device with normalized capacitive sensing
JP5274956B2 (en) 2008-09-19 2013-08-28 株式会社ニューギン Game machine
US8816967B2 (en) 2008-09-25 2014-08-26 Apple Inc. Capacitive sensor having electrodes arranged on the substrate and the flex circuit
US8395590B2 (en) 2008-12-17 2013-03-12 Apple Inc. Integrated contact switch and touch sensor elements
JP5298955B2 (en) 2009-03-02 2013-09-25 日本電気株式会社 Node device, operation monitoring device, processing method, and program
US9354751B2 (en) 2009-05-15 2016-05-31 Apple Inc. Input device with optimized capacitive sensing
US8872771B2 (en) * 2009-07-07 2014-10-28 Apple Inc. Touch sensing device having conductive nodes
JP5211021B2 (en) 2009-11-26 2013-06-12 東芝テック株式会社 Product information input device and control program thereof
JP5265656B2 (en) 2010-12-27 2013-08-14 ヤフー株式会社 Clustering apparatus and clustering method
JP5205565B2 (en) 2011-03-03 2013-06-05 株式会社カラット Oil separation method and oil drain trap
JP5101741B2 (en) 2011-04-08 2012-12-19 シャープ株式会社 Semiconductor device and inverter, converter and power conversion device using the same

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030076306A1 (en) 2001-10-22 2003-04-24 Zadesky Stephen Paul Touch pad handheld device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1687684A4

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2436135B (en) * 2006-03-09 2011-09-14 Pretorian Technologies Ltd User input device for electronic equipment
WO2007105151A1 (en) * 2006-03-13 2007-09-20 Koninklijke Philips Electronics N.V. Control device for controlling the hue of light emitted from a light source
US7948394B2 (en) 2006-03-13 2011-05-24 Koninklijke Philips Electronics N.V. Control device for controlling the hue of light emitted from a light source
US8279079B2 (en) 2006-03-13 2012-10-02 Koninklijke Philips Electronics N.V. Control device for controlling the hue of light emitted from a light source
RU2719401C1 (en) * 2018-06-29 2020-04-17 Кэнон Кабусики Кайся Electronic device
US10897568B2 (en) 2018-06-29 2021-01-19 Canon Kabushiki Kaisha Electronic device

Also Published As

Publication number Publication date
HK1123860A1 (en) 2009-06-26
US20050110768A1 (en) 2005-05-26
CN1637776A (en) 2005-07-13
US7495659B2 (en) 2009-02-24
CN101201715B (en) 2012-02-15
CN100369054C (en) 2008-02-13
CN101201715A (en) 2008-06-18
US8552990B2 (en) 2013-10-08
EP1687684A4 (en) 2007-05-09
TWI262427B (en) 2006-09-21
EP2284658A2 (en) 2011-02-16
TW200517928A (en) 2005-06-01
DE202004021283U1 (en) 2007-05-24
EP2284658B1 (en) 2018-11-28
EP2284658A3 (en) 2014-09-24
US20140191990A1 (en) 2014-07-10
US20190033996A1 (en) 2019-01-31
EP1687684A2 (en) 2006-08-09
US20080012837A1 (en) 2008-01-17
WO2005057328A3 (en) 2006-09-21

Similar Documents

Publication Publication Date Title
US20190033996A1 (en) Touch pad for handheld device
US10353565B2 (en) Input apparatus and button arrangement for handheld device
US7348967B2 (en) Touch pad for handheld device
AU2004267727C1 (en) An input device for a portable media device
US8330061B2 (en) Compact input device
US20080087476A1 (en) Sensor configurations in a user input device
WO2010027803A1 (en) Omnidirectional gesture detection
WO2008045830A1 (en) Method and apparatus for implementing multiple push buttons in a user input device
WO2010028139A2 (en) Curved surface input device with normalized capacitive sensing
AU2008100398A4 (en) A portable media device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

WWE Wipo information: entry into national phase

Ref document number: 2004781727

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2004781727

Country of ref document: EP