WO2013081672A1 - Multi-touch input device - Google Patents

Multi-touch input device

Info

Publication number
WO2013081672A1
WO2013081672A1 (PCT/US2012/044056)
Authority
WO
WIPO (PCT)
Prior art keywords
input device
touch
interface
multi
layout
Prior art date
Application number
PCT/US2012/044056
Other languages
French (fr)
Inventor
Jason Giddings
David Rogers
Original Assignee
TransluSense, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US 61/565,494
Application filed by TransluSense, LLC filed Critical TransluSense, LLC
Publication of WO2013081672A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202Constructional details or processes of manufacture of the input device
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0238Programmable keyboards
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Abstract

Multi-touch input devices utilizing frustrated total internal reflection (FTIR) techniques can be provided. A FTIR multi-touch input device can comprise a base portion, a transparent interface panel connected to the base portion, an interface map attached to the interface panel, an infrared light source for emitting infrared light inside the interface panel using total internal reflection, and one or more digital cameras for detecting infrared light that is scattered from the interface panel using FTIR when the interface panel is touched. The interface map can be a custom or pre-defined interface map with a custom or pre-defined layout of keys and/or touchpad zones. The interface map can be removably attached to the interface panel. The input device can comprise an ultraviolet light source. The ultraviolet light source can provide a sterilization effect to the input device.

Description

MULTI-TOUCH INPUT DEVICE

CROSS REFERENCE TO RELATED APPLICATION

[001] This application claims priority to U.S. Provisional Application No. 61/565,494, filed on November 30, 2011, which is incorporated herein in its entirety by reference.

BACKGROUND

[002] Traditional keyboards, mice, and touchpads suffer from a number of limitations. For example, traditional keyboards with mechanical keys can collect debris, such as dirt, germs, and other types of undesirable material. Cleaning a traditional keyboard can be problematic and time consuming. Because a traditional keyboard has an uneven surface (e.g., individual key shapes and gaps between keys), wiping the surface down may not be very effective. Furthermore, cleaning between, behind, or underneath keys may be impractical or impossible (e.g., without removing the keys).

[003] Traditional keyboards, mice, and touchpads may not be suitable for environments where debris is common, such as an industrial or manufacturing environment where dirt, grease, or other debris may interfere with the proper functioning of the keyboard, mouse, or touchpad. A keyboard cover, such as a clear plastic or silicone cover, may provide some protection in such an environment, but it may also interfere with or degrade the performance of the device, such as the ability to depress single keys precisely.

[004] Traditional keyboards, mice, and touchpads may not be suitable for environments where a clean and sterile surface is needed, such as a health care or hospital environment. For example, a keyboard, mouse, or touchpad surface may have gaps, edges, crevices, or other surface areas that are difficult to clean or sanitize.

[005] In addition, traditional keyboards, mice, and touchpads provide a fixed input configuration. For example, mechanical keyboards are created with keys of fixed size and layout. Similarly, mice have fixed buttons and scroll wheels, and touchpads have a fixed input area. While software can be used to map fixed keyboard keys to new functions, it cannot change the physical size, position, or layout of the keys.

[006] Therefore, ample opportunity exists to improve technologies related to multi-touch input devices.

SUMMARY

[007] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

[008] Technologies, techniques, and tools are described for multi-touch input devices utilizing frustrated total internal reflection (FTIR) techniques. For example, a FTIR multi-touch input device can comprise a base portion, a transparent interface panel connected to the base portion, an interface map attached to the interface panel, an infrared light source for emitting infrared light inside the interface panel using total internal reflection, and one or more digital cameras, where the one or more digital cameras are configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched. The interface map can be a custom or pre-defined removably attached interface map (e.g., a transparent polycarbonate film). The interface map can depict a layout comprising one or more touch areas (e.g., one or more keys and/or one or more touchpad zones).

[009] As another example, a method can be provided for creating a custom interface map for a multi-touch input device (e.g., a frustrated total internal reflection (FTIR) multi-touch input device). The method can comprise receiving layout information for a custom interface map, where the layout information defines a plurality of touch areas, and where each touch area is one of a key and a touchpad zone, generating a layout for the custom interface map according to the layout information, and generating a configuration file from the received layout information, where the configuration file is loadable on the multi-touch input device to configure the multi-touch input device to use the custom interface map. The method can also comprise outputting the configuration file (e.g., storing the configuration file or sending the configuration file to a user) and/or providing the layout for printing on a transparent film to create the custom interface map.
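The map-creation method above can be sketched as a small script that turns layout information into a loadable configuration file. The layout fields, the JSON format, and the action names are illustrative assumptions; the source does not specify a file format.

```python
import json

# Hypothetical layout description: each touch area is either a key or a
# touchpad zone, with a position/size on the interface map and an action.
layout_info = [
    {"type": "key", "label": "A", "x": 10, "y": 40, "w": 18, "h": 18, "action": "KEY_A"},
    {"type": "key", "label": "Enter", "x": 160, "y": 40, "w": 36, "h": 18, "action": "KEY_ENTER"},
    {"type": "touchpad", "label": "cursor", "x": 10, "y": 70, "w": 120, "h": 60, "action": "CURSOR_MOVE"},
]

def generate_config(layout, units="mm"):
    """Build a configuration structure that a device could load to map
    touch coordinates to the functions defined by the layout."""
    return {
        "version": 1,
        "units": units,
        "touch_areas": [
            {
                "kind": area["type"],
                "label": area["label"],
                # bounds as [x0, y0, x1, y1] on the interface map
                "bounds": [area["x"], area["y"], area["x"] + area["w"], area["y"] + area["h"]],
                "action": area["action"],
            }
            for area in layout
        ],
    }

config = generate_config(layout_info)
config_file = json.dumps(config, indent=2)  # text ready to store or send to a user
```

The same layout data could also drive rendering of the printable map, so the printed film and the device's touch mapping stay consistent.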

[010] As another example, a FTIR multi-touch input device can comprise a base portion, a transparent interface panel connected to the base portion, an infrared light source for emitting infrared light inside the interface panel using total internal reflection, an ultraviolet light source for emitting ultra-violet light inside the transparent interface panel, and one or more digital cameras, where the one or more digital cameras are configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched. The ultraviolet light can provide a sterilization effect to the transparent interface panel.

[011] As another example, a custom interface map and configuration file can be provided for a multi-touch input device (e.g., a FTIR multi-touch input device), comprising a custom interface map that is defined by layout information, where the layout information comprises, for each of a plurality of touch areas: configuration of the touch area, location of the touch area on the custom interface map, and function performed by the touch area when touched, and comprising a configuration file corresponding to the custom interface map, where the configuration file is loadable on the multi-touch input device to configure the multi-touch input device to use the custom interface map and process touch events according to the layout information.

[012] As another example, a method, implemented at least in part by a frustrated total internal reflection (FTIR) multi-touch input device, can be provided for removing interference. The method can comprise capturing a first digital image, where the first digital image is captured when an infrared light source of the FTIR multi-touch input device is turned off, capturing a second digital image, where the second digital image is captured when the infrared light source of the FTIR multi-touch input device is turned on, and processing the first and second digital images to remove infrared interference.
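The two-image interference-removal method described above amounts to a per-pixel subtraction of the source-off frame from the source-on frame. A minimal sketch, using plain lists as tiny grayscale frames (the frame contents are invented for illustration):

```python
def remove_interference(frame_ir_off, frame_ir_on):
    """Subtract the ambient frame (IR source off) from the active frame
    (IR source on), pixel by pixel, clamping at zero.

    Frames are grayscale images as 2-D lists of 0-255 intensities."""
    return [
        [max(on - off, 0) for on, off in zip(row_on, row_off)]
        for row_on, row_off in zip(frame_ir_on, frame_ir_off)
    ]

# Ambient interference (e.g., an IR remote control) appears in both frames;
# the scattered-light touch signal appears only when the source is on.
ambient = [[5, 5, 200], [5, 5, 200]]      # IR off: interference in right column
active = [[5, 180, 200], [5, 180, 200]]   # IR on: touch blob plus interference
clean = remove_interference(ambient, active)
# The interference column cancels out; only the touch signal remains.
```

This is one reason synchronizing the light source with the camera scan rate is useful: alternating off/on frames gives the device a continuously updated interference estimate.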

[013] As another example, a frustrated total internal reflection multi-touch input device can comprise a transparent interface panel, an infrared light source for emitting infrared light inside the interface panel using total internal reflection, and a digital camera, where the digital camera is configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched. The input device can be configured to perform operations comprising detecting infrared interference based at least in part from digital images captured using the digital camera when the infrared light source is off, and compensating for the detected infrared interference in digital images captured using the digital camera when the infrared light source is on.

[014] As yet another example, a method, implemented at least in part by a frustrated total internal reflection multi-touch input device, can be provided for removing contamination from digital images. The method can comprise capturing a first digital image, where the first digital image is captured when an infrared light source of the FTIR multi-touch input device is turned on, and where the first digital image is captured when an interface panel of the FTIR multi-touch input device is not being touched, determining contamination information based at least in part upon the first digital image, where the contamination information is determined from detected infrared light present in the first digital image, capturing a plurality of additional digital images, where the plurality of additional digital images are captured when the infrared light source of the FTIR multi-touch input device is turned on, and processing the plurality of additional digital images based at least in part upon the contamination information to remove contamination from the plurality of additional digital images.
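The contamination-removal method above can be sketched the same way: record the scattered light seen with the source on and no touch, then subtract that baseline from subsequent frames. The frame contents below are invented for illustration:

```python
def remove_contamination(frames, baseline):
    """Subtract a stored contamination baseline (scattered light seen with
    the IR source on and no touch) from each subsequent frame, clamping at
    zero, so persistent smudge light is removed while touch light remains."""
    return [
        [
            [max(p - b, 0) for p, b in zip(row, row_b)]
            for row, row_b in zip(frame, baseline)
        ]
        for frame in frames
    ]

# Baseline: IR source on, panel not touched; a smudge scatters light at one pixel.
baseline = [[0, 40, 0], [0, 0, 0]]
# Later frames: the smudge persists, and in the second frame a touch appears.
frames = [
    [[0, 40, 0], [0, 0, 0]],
    [[0, 40, 0], [0, 220, 0]],
]
cleaned = remove_contamination(frames, baseline)
# cleaned[0] is dark everywhere; cleaned[1] keeps only the touch pixel.
```

In practice the baseline would be refreshed periodically (e.g., whenever no touches are detected) so newly deposited contamination is also compensated.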

[015] As described herein, a variety of other features and advantages can be incorporated into the technologies as desired.

BRIEF DESCRIPTION OF THE DRAWINGS

[016] FIG. 1 is a perspective view depicting an example multi-touch input device, which can be in the format of a keyboard.

[017] FIG. 2 is a perspective view depicting an example multi-touch input device, which can be in the format of a mouse or touchpad.

[018] FIG. 3 is a side elevation view of the multi-touch input device of Fig. 1 or Fig. 2.

[019] FIG. 4 is an exploded view depicting an example multi-touch input device.

[020] FIG. 5 is a schematic side elevation view of an example multi-touch input device in use, depicting scattered IR light from a touch.

[021] FIGS. 6A and 6B are plan views of example interface maps with a keyboard layout.

[022] FIG. 7 is a plan view of an example interface map with a mouse or touchpad layout.

[023] FIG. 8 is a flowchart of an exemplary method for creating a custom interface map for a multi-touch input device.

[024] FIGS. 9A, 9B, and 9C are diagrams depicting example digital images, including images depicting infrared interference.

[025] FIG. 10 is a flowchart of an exemplary method for removing infrared interference.

[026] FIGS. 11A, 11B, and 11C are diagrams depicting example digital images, including images depicting scattered light from contamination.

[027] FIG. 12 is a flowchart of an exemplary method for removing contamination from digital images.

[028] FIG. 13 is a system diagram depicting a generalized example of a suitable computing system in which the described innovations may be implemented.

DETAILED DESCRIPTION

[029] Before beginning a detailed description of the multi-touch input device, mention of the following is in order. When appropriate, like reference numerals and characters are used to designate identical, corresponding, or similar components in differing figure drawings. The figure drawings associated with this disclosure typically are not drawn with dimensional accuracy to scale, i.e., such drawings have been drafted with a focus on clarity of viewing and understanding rather than dimensional accuracy.

[030] The following description is directed to aspects of touch panel input devices utilizing frustrated total internal reflection (FTIR) techniques. Multi-touch input devices can be provided in the format of keyboards, mice, touchpads, or combinations thereof. For example, a multi-touch input device can be provided in the format of a traditional keyboard, comprising letter keys, number keys, function keys, special keys, etc. A multi-touch input device can be provided in the format of a mouse or touchpad, comprising a touchpad zone for cursor movement and one or more buttons. A multi-touch input device can also comprise functionality of a keyboard in addition to a mouse or touchpad. For example, a layout for a multi-touch input device can comprise keys and buttons in combination with touchpad zones (e.g., in a pre-defined or user-defined layout).

[031] In any of the examples herein, a multi-touch input device (e.g., keyboard, mouse, touchpad, and the like) can have a touch sensitive surface (e.g., an interface panel) implemented, at least in part, using FTIR techniques. For example, the touch sensitive surface can be a sheet of material, such as glass or plastic, within which light (e.g., infrared (IR), ultraviolet (UV), and/or visible light) is reflected internally (e.g., via total internal reflection). In some implementations, the interface panel is made from glass or a glass blend, such as a borosilicate type glass.

[032] The size or dimensions of a multi-touch input device can vary according to implementation details. For example, a multi-touch input device that will be used as a traditional keyboard can be sized similarly to a traditional keyboard. An input device that will be used as a touchpad with a small number of buttons can be sized accordingly (e.g., smaller than a traditional keyboard).

[033] Multi-touch input devices can detect touch by an object (e.g., a person's finger, stylus, or other type of object) using FTIR technology. Multi-touch input devices can detect touch by more than one object (e.g., multiple fingers) at once (e.g., by imaging the interface panel to detect one or more touch events as spots or blobs of scattered IR light).
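The imaging step described above, detecting touch events as spots or blobs of scattered IR light, can be sketched as a threshold-plus-connected-components pass over a grayscale frame. The threshold value and the tiny test frame are illustrative assumptions:

```python
def detect_touches(frame, threshold=128):
    """Find connected bright regions (blobs of scattered IR light) at or
    above a threshold intensity and return each blob's (x, y) centroid."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one blob of adjacent bright pixels.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    py, px = stack.pop()
                    pixels.append((py, px))
                    for ny, nx in ((py - 1, px), (py + 1, px), (py, px - 1), (py, px + 1)):
                        if 0 <= ny < h and 0 <= nx < w and frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                mean_x = sum(px for _, px in pixels) / len(pixels)
                mean_y = sum(py for py, _ in pixels) / len(pixels)
                touches.append((mean_x, mean_y))
    return touches

# A 5x6 grayscale frame containing two touch blobs (values are illustrative).
frame = [
    [0, 0, 0, 0, 0, 0],
    [0, 200, 200, 0, 0, 0],
    [0, 200, 200, 0, 0, 0],
    [0, 0, 0, 0, 200, 0],
    [0, 0, 0, 0, 200, 0],
]
touches = detect_touches(frame)
```

Because each blob is detected independently, simultaneous touches by multiple fingers naturally yield multiple centroids per frame.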

[034] In some implementations, a multi-touch keyboard, mouse, or touchpad has a glass touch area where activation of standard keys or manipulation of 1D or 2D sliders can occur (e.g., 1D volume control or zoom sliders, or 2D cursor movement zones). The multi-touch input device can be supported by a base part that houses control circuitry, cameras, communication connections, and/or a battery. The input device can have raised surface indicators (e.g., small bumps) to indicate specific locations on the surface to a user (e.g., bumps on the J and F keys to help with finger placement). The input device can provide (e.g., via software or firmware) selectable functions to create keystroke sounds for each touch. The input device can include a backlight that can be turned on or off.

[035] FTIR technology can use infrared light-emitting diodes (LEDs) placed at the edge of a panel which transmits the wavelength(s) of light produced by the LEDs. Because of the low angle at which the light impacts the internal glass surface, it is internally reflected, or "bounced around inside the glass," much like looking through a tube to see the reflections on the inside walls. When the glass is touched, the reflection is frustrated and the IR light is scattered, with some of the light being scattered downward, out of the glass, which allows a camera to see it. The device can then determine the location and send the appropriate information to a computer or other computing device.

[036] The multi-touch input devices described herein can be completely customizable. The interface map can be easily changed by affixing (e.g., placing on or adhering) clear static sheets that can be purchased or custom printed by the end user. This allows the user to adapt to specific languages or dedicate areas of the input device to anything they choose, in addition to selecting standard layouts (e.g., for users who want to use the number pad area just for trackpad function). Flexible software can allow even more versatility by providing the ability to develop special functionality for any desired interface map layout.
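The total-internal-reflection condition behind this behavior follows from Snell's law: light stays inside the panel when its angle of incidence at the surface exceeds the critical angle arcsin(n_outside / n_panel). A quick numeric check, assuming a typical glass refractive index of n ≈ 1.5 (a textbook value, not one from the source):

```python
import math

def critical_angle_deg(n_panel=1.5, n_outside=1.0):
    """Critical angle for total internal reflection, in degrees.

    Light hitting the internal surface at an angle (measured from the
    surface normal) greater than this is fully reflected back inside."""
    return math.degrees(math.asin(n_outside / n_panel))

theta_c = critical_angle_deg()  # roughly 41.8 degrees for glass-to-air
```

A fingertip pressed against the glass has a higher refractive index than air, which raises the local critical angle and "frustrates" the reflection, letting light escape at the touch point.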

[037] The multi-touch input devices described herein can provide advantages over existing FTIR technology and conventional capacitive touch screen systems. For example, the input device technology described herein can be less expensive and use less power, because the touch panel is separate from the display screen, and no projection apparatus is needed to project the interface map onto the touch panel. In addition, custom interface maps can be developed and removably applied to satisfy a variety of standard and special purpose applications. Furthermore, the touch surface can be easily cleaned and can support sterilization techniques. Other advantages of the multi-touch input device technology are described elsewhere herein.

Multi-Touch Input Devices

[038] This section describes different aspects of multi-touch input devices that utilize FTIR techniques. Different features and elements of the devices described in this section can be used separately or in combination with other features and elements described in this section and/or described elsewhere herein.

[039] Fig. 1 is a perspective view depicting an example multi-touch input device 100. For example, the multi-touch input device 100 can be in the format of a keyboard. The multi-touch input device 100 comprises a transparent interface panel 102, a base portion 104, a housing portion 107 connecting the interface panel 102 to the base portion 104, a light source (not depicted) for projecting light into the interface panel 102 (e.g., for directing IR light to an edge of the interface panel 102), and one or more digital cameras 116 positioned to scan a surface of the interface panel 102. The multi-touch input device 100 also comprises an interface map 126 (e.g., a custom or pre-defined interface map), such as a transparent film with a printed layout, affixed to the interface panel 102.

[040] In some embodiments, the digital cameras 116 are mounted in the base portion 104, distal from the interface panel 102, in order to provide an adequate field of vision. The digital cameras 116 can scan (e.g., continuously capture images or frames a number of times per second, such as 30 or 60 frames per second) at least the area of the interface panel 102 corresponding to the interface map 126. The light source can flicker at a frequency and pulse length which is synchronized with the scan rate of the corresponding digital cameras 116. Synchronizing the light source and the digital cameras 116 can help prevent interference from other potential light sources (e.g., IR light), such as IR remote control devices. Depending on implementation details, the light source and digital cameras 116 can be synchronized at a rate of, for example, 15 Hz, 30 Hz, 60 Hz, or at another rate.

[041] Fig. 2 is a perspective view depicting an example multi-touch input device 200. For example, the multi-touch input device 200 can be in the format of a touchpad or mouse. The multi-touch input device 200 comprises a transparent interface panel 202, a base portion 204, a housing portion 207 connecting the interface panel 202 to the base portion 204, a light source (not depicted) for projecting light into the interface panel 202, and one or more digital cameras 216 positioned to scan a surface of the interface panel 202. The multi-touch input device 200 also comprises an interface map 226 (e.g., a custom or pre-defined interface map) affixed to the interface panel 202.

[042] In some implementations, the multi-touch input devices depicted in Figs. 1 and 2 comprise many of the same design features. For example, the input device of Fig. 2 can be a smaller version (e.g., in width, length, and/or height) of the input device of Fig. 1. The input device of Fig. 2 may have a different interface map 226 (e.g., a touchpad style interface map instead of a keyboard style interface map) and may have fewer digital cameras 216 (e.g., only one digital camera instead of multiple digital cameras).

[043] The multi-touch input devices depicted in Figs. 1 and 2 can be configured for different applications. For example, a larger input device, such as depicted in Fig. 1, can operate as a typical keyboard (e.g., it can operate in place of a standard mechanical keyboard without any special software or drivers). A smaller input device, such as depicted in Fig. 2, can operate as a typical mouse (e.g., it can operate in place of a standard opto-mechanical mouse or touchpad without any special software or drivers). In this way, the input device can be directly connected to a computer system and operate as a keyboard or mouse without the need for additional software or drivers. In addition, while Fig. 1 generally depicts an input device with a keyboard type layout and Fig. 2 generally depicts an input device with a mouse or touchpad type layout, the input devices are not limited to such layouts. For example, the input device depicted in Fig. 2 can be configured (e.g., with a pre-defined or custom layout) with keys performing standard functions (e.g., letters or numbers, such as a number pad) or custom functions (e.g., arbitrary defined key shapes and placement that perform customizable actions when activated). Similarly, the input device depicted in Fig. 1 can be configured with touchpad zones (e.g., including a touchpad zone that performs cursor movement), separately or in combination with keys performing standard or custom functions.

[044] The multi-touch input devices depicted in Figs. 1 and 2 can detect touch events by an object (e.g., a person's finger, stylus, or other type of object) using FTIR technology. The touch events can be, for example, a touch (e.g., a touch or press by a finger with a large enough area to indicate a user's desire to activate a key or button), a rest (e.g., a touch or press by a finger for a longer duration to indicate the user does not desire to activate a key or button), or a movement (e.g., movement of a touch area over successive frames to indicate a user's desire to perform a movement action, such as cursor movement on a touchpad zone).
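The touch/rest/movement distinction described above can be sketched as a classifier over a contact's tracked per-frame positions. All thresholds below (frame period, rest duration, movement distance) are invented for illustration; the source does not give specific values:

```python
def classify_touch(samples, frame_period_ms=33, rest_ms=500, move_px=5):
    """Classify one tracked contact from its per-frame (x, y) positions.

    A short stationary contact is a 'touch' (key activation), a long
    stationary contact is a 'rest' (e.g., fingers resting on the keys),
    and a contact whose position changes is a 'movement' (e.g., cursor
    movement on a touchpad zone). Thresholds are illustrative."""
    duration_ms = len(samples) * frame_period_ms
    x0, y0 = samples[0]
    x1, y1 = samples[-1]
    displacement = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if displacement >= move_px:
        return "movement"
    return "rest" if duration_ms >= rest_ms else "touch"
```

Distinguishing rests from touches is what lets a flat FTIR keyboard tolerate fingers resting on the surface, something a mechanical keyboard handles with key travel.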

[045] The transparent interface panel (e.g., 102 or 202) can be a panel made out of a material, such as glass or plastic, within which light (e.g., infrared (IR), ultraviolet (UV), and/or visible light) is reflected internally. The panel can be a curved panel (e.g., curved in one direction).

[046] The base portion (e.g., 104 or 204) of the multi-touch input devices depicted in Figs. 1 and 2 can provide support for the input device and can house various components. For example, the base portion can house the digital cameras (e.g., 116 or 216). The digital cameras can be housed in the base portion in a position to view (e.g., capture images of) the underside of the interface panel (e.g., 102 or 202). The base portion can house components needed for communication with a computing device (e.g., a computer system, such as a desktop computer, or another type of computing device), such as a USB connection and/or wireless communication technology (e.g., Bluetooth). The base portion can house control components (e.g., processors or other types of controllers) for operating the input device (e.g., controlling the digital cameras to scan or image the underside of the interface panel, process the information, and communicate results to an associated computing device).

[047] The housing portion (e.g., 107 or 207) of the multi-touch input devices depicted in Figs. 1 and 2 can connect the base portion (e.g., 104 or 204) to the interface panel (e.g., 102 or 202). The housing portion can house a light source (e.g., IR, UV, and/or visible light sources, such as LEDs). For example, the light source (e.g., one or more LEDs) can be located in the housing portion or in the base portion, and be positioned for directing light into an edge of the interface panel (e.g., positioned to directly shine light into the edge of the interface panel or to indirectly conduct light, such as via a light pipe).

[048] The light source of the multi-touch input devices depicted in Figs. 1 and 2 can provide UV, IR, and/or visible light. In some implementations, one or more IR LEDs provide a light source within the interface panel, for detection by the digital cameras using FTIR technology. In some implementations, one or more UV LEDs provide a light source to the interface panel and provide a sterilization effect to the interface panel. UV, IR, and/or visible light sources can be used separately or in combination.

[049] The digital cameras (e.g., 116 or 216) of the multi-touch input devices depicted in Figs. 1 and 2 capture images (e.g., continuously capture a sequence of images as a video stream) of the interface panel (e.g., 102 or 202). The digital cameras can be configured to capture at a specific rate (e.g., 30, 50, or 80 pictures or frames per second (FPS)). The digital cameras can be configured to capture at a specific rate that depends on the rate of a light source. For example, if an IR light source is strobing or flashing at a specific rate, such as 15Hz or 30Hz (cycles per second), the digital camera capture rate can be set accordingly (e.g., to the same rate, such as 15Hz or 30Hz, or to a different rate, such as 30Hz or 60Hz, which may be a multiple of the flashing rate of the IR light). The digital cameras can be selected, or filtered, to be primarily sensitive to specific types of light, such as IR light. The digital images captured by the digital cameras can depict the presence and absence of infrared light. For example, the digital images can be grayscale digital images in which areas of detected infrared light are lighter (e.g., where the lighter the grayscale value, the greater the intensity of IR light) and areas with little or no detected infrared light are darker (e.g., where the darker the grayscale value, the lesser the intensity of IR light).

[050] The multi-touch input devices depicted in Figs. 1 and 2 can include an interface map (e.g., 126 or 226). The interface map depicts a layout for the various key and/or touchpad areas of the input device. For example, the interface map can depict key areas for a standard keyboard (e.g., depicting letters, numbers, special characters, arrow keys, function keys, etc.). In some implementations, the interface map is a separate sheet (e.g., a static plastic sheet) that is removably attached to the top surface of the interface panel (e.g., a separate sheet or film that is placed on the interface panel or adhered to the interface panel). Alternatively, the interface map can be attached to the bottom surface of the interface panel. In some implementations, the interface map is permanently attached to the top or bottom surface of the interface panel (e.g., printed directly on the surface of the interface panel, etched into the interface panel, or otherwise permanently applied or attached). The interface map can be a custom interface map (e.g., user-defined or selected from a number of layout options or combinations of elements) or a pre-defined interface map (e.g., a standard keyboard layout in a specific language).

[051] The multi-touch input devices depicted in Figs. 1 and 2 can include a controller (e.g., one or more processors, integrated circuits, and/or associated components). The controller can be located, for example, in the base portion (e.g., 104 or 204) of the multi-touch input device. The controller can control functions of the input device, such as communication (e.g., wired or wireless communication) with an associated computing device, operation of the digital cameras, processing of images to detect touch events, translation of touch events into specific key presses or touchpad movement (e.g., using an interface map), sterilization cleaning cycles, processing of images to remove interference and/or contamination, communication with an associated computer as a standard keyboard and/or mouse, and other functions performed by the input device.

[052] Fig. 3 is a side elevation view of the multi-touch input device 100 of Fig. 1 or 200 of Fig. 2, showing the transparent interface panel 102 or 202, the base portion 104 or 204, the housing portion 107 or 207 connecting the interface panel to the base portion, a light source (not depicted) for projecting light into the interface panel, and one or more digital cameras 116 or 216 positioned to scan a surface of the interface panel. The multi-touch input device also has the interface map 126, 226 (e.g., a visually perceptible and/or tactile perceptible interface map) attached (e.g., removably attached or permanently attached) to the top 318 (as shown) and/or bottom 320 surface of the interface panel. The interface map can be selected from a plurality of pre-existing layouts or a custom layout can be created for a particular user or purpose. The interface map can be configured in a single piece, or multiple interface map parts can be provided.

[053] As shown in Fig. 3, the base portion 104 or 204 and/or housing portion 107 or 207 can comprise various components for operating the multi-touch input device 100, 200. For example, the components can include controllers (e.g., one or more processors), one or more of the cameras 116 or 216, power supplies, interfaces to other systems (e.g., USB and/or wireless connection components for communicating with other computing devices), LEDs, light pipes, etc.

[054] From the side, the multi-touch input device has a streamlined aesthetic appearance. In the examples, the interface panel is slightly curved and attached only at one end, such that its forward edge is cantilevered over the cameras and the forward edge of the base.

[055] Fig. 4 is an exploded view depicting an example multi-touch input device 400. The multi-touch input device 400 comprises a transparent interface panel 402 (e.g., corresponding to 102 or 202), an interface map 426 (e.g., corresponding to 126 or 226), a base portion 404 (e.g., corresponding to 104 or 204) coupled to interface panel 402, a housing portion 407 (e.g., corresponding to 107 or 207) connecting the interface panel 402 to the base portion 404, and three digital cameras 416 positioned to scan a surface of the interface panel 402. Depending on implementation details, a different number of digital cameras 416 can be used.

[056] Fig. 4 also depicts components that are internal to the multi-touch input device 400 and that may be present in the multi-touch input devices depicted in Figs. 1 and 2. For example, the exploded view 400 depicts eight light sources (e.g., comprising IR and/or UV LEDs), two of which are depicted at 410. The light sources (e.g., 410) are optically coupled to an edge of the interface panel 402, such that light is internally reflected inside the interface panel 402. Light from the light sources (e.g., 410) can be optically coupled to the edge of the interface panel 402 using light pipes. Depending on implementation details, a different number of light sources (e.g., 410) can be used.

[057] Fig. 4 illustrates just one example of how the components (including internal components) of the multi-touch input device can be designed and configured.

Depending on implementation details, different configurations, selections, and/or arrangements of components can be used.

[058] Fig. 5 depicts a schematic side elevation view of an example multi-touch input device 500 in use, depicting scattered IR light from a touch. In Fig. 5, IR light is being directed to an edge of the interface panel 502 via one or more IR LEDs 510. When the interface panel 502 is touched (e.g., by a user's finger or by another object), the reflectivity at the surface of the interface panel 502 is altered, causing some of the IR light to "escape" (to be scattered outward) from the point of contact 530. The light that escapes is detected by the digital cameras 516 as a point source of IR light. The digital cameras 516 can capture images (or frames) multiple times per second, recording the presence or absence of any point sources during each capture. The captured images can be analyzed (e.g., to determine touch events, such as using blob detection) and compared to an interface map to determine the corresponding location on the interface map that has been touched. The captured images can also be analyzed to determine other types of touch events, such as a tap, hold (e.g., for repetitive key entry), multiple touches at different locations, movement, etc.
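A minimal sketch of this detection step, assuming frames arrive as 2D grayscale arrays (the function name, threshold value, and frame format are illustrative assumptions; a real device would use a tuned blob detector):

```python
def detect_touches(frame, threshold=128):
    """Find bright blobs (scattered-IR touch points) in a grayscale frame.

    `frame` is a 2D list of 0-255 intensities. Returns one (row, col)
    centroid per connected bright region, using a simple 4-connected
    flood fill over pixels at or above the threshold.
    """
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in blob) / len(blob)
                cx = sum(p[1] for p in blob) / len(blob)
                touches.append((cy, cx))
    return touches

# A single bright 2x2 region produces one centroid.
frame = [[0, 0, 0, 0],
         [0, 200, 210, 0],
         [0, 205, 215, 0],
         [0, 0, 0, 0]]
print(detect_touches(frame))  # [(1.5, 1.5)]
```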

[059] For example, each touch location (e.g., as determined by a center point of the touch location) can activate a corresponding key or other function according to the interface map and its key and/or touch zone layout.
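The mapping from a touch center point to a key can be sketched as a simple hit test (the zone format and the two-key layout below are hypothetical illustrations, not from the specification):

```python
def zone_at(x, y, zones):
    """Return the key/function of the first layout zone containing (x, y).

    Each zone is (x0, y0, width, height, function); coordinates are offsets
    from the upper-left corner of the interface map. Returns None on a miss.
    """
    for zx, zy, w, h, func in zones:
        if zx <= x < zx + w and zy <= y < zy + h:
            return func
    return None

# Hypothetical two-key layout: "A" at the origin, "S" beside it.
layout = [(0, 0, 18, 18, "A"), (20, 0, 18, 18, "S")]
print(zone_at(25, 5, layout))  # S
```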

Interface Maps

[060] A multi-touch input device (e.g., the multi-touch input devices depicted in Figs. 1 and 2) can include an interface map (e.g., 126 or 226). The interface map provides a layout for keys and/or touchpad areas on the interface panel (e.g., 102 or 202) surface.

[061] The interface map can be a plastic film that is affixed or adhered (e.g., removably affixed or adhered) to the interface panel (e.g., to the top or bottom surface of the interface panel). In some implementations, the interface map is a polycarbonate film (e.g., a transparent polycarbonate film). In some implementations, the interface map can be removably adhered to the interface panel such that the interface map can be later removed (e.g., and replaced with a different interface map). The interface map can be a transparent or semi-transparent sheet or film (e.g., transparent to at least IR wavelengths).

[062] The interface map provides a layout for touch areas (e.g., keys and/or touchpad zones) on the interface panel. The layout can be a pre-defined layout or a custom (e.g., user-defined) layout. For example, pre-defined layouts can be provided for standard keyboard keys in a variety of languages. The layout can be printed or imprinted (e.g., using various printing technologies, such as solvent inkjet printing, UV inkjet printing, laser printing, etc.) on the interface map. For example, the layout of a keyboard key can be depicted on the interface map as an outline of the key area and a symbol indicating the function performed by the key (e.g., the symbol "A" for a key that will enter the letter "A" when touched).

[063] Figs. 6A and 6B are plan views of example interface maps, 600 and 640. The first example interface map 600 in Fig. 6A is an English language keyboard layout (e.g., a pre-defined layout). The keyboard layout includes letter keys, number keys, function keys, special keys (e.g., control keys, shift keys, etc.), and a number pad 610. The second example interface map 640 is an English language keyboard layout (e.g., a pre-defined layout), which is similar to the first example layout 600, but includes a touchpad zone 650 instead of the number pad 610. The second example layout 640 also includes keys which can function as mouse buttons, a right mouse button "RMB" and a left mouse button "LMB" 652, as well as a zoom slider. For example, the second example interface map 640 can perform the functions of both a keyboard and a mouse.

[064] The example interface maps 600 and 640 can represent pre-defined layouts. For example, they can be provided as two pre-defined alternative interface maps for a multi-touch input device. The example interface maps 600 and 640 can be transparent films (e.g., polycarbonate films) on which the keyboard layouts are printed.

[065] Fig. 7 is a plan view of an example interface map 700. The example interface map 700 represents a mouse or touchpad layout (e.g., a pre-defined layout). The mouse/touchpad layout includes a touchpad zone 710 (e.g., for moving a cursor on a computer screen) and eight buttons 720. For example, the eight buttons can include mouse buttons (e.g., a right mouse button "RMB" and a left mouse button "LMB") and/or buttons that perform other functions (e.g., pre-defined or user-assignable functions).

[066] The layout of the interface map can be defined using layout information. The layout information (e.g., size and location of keys and touchpad zones) can be used when creating the interface map (e.g., by printing a layout on plastic film, according to the layout information, to create the interface map) and when processing touches to determine actions to perform (e.g., key presses or touchpad movement). For example, the layout information for keyboard keys can include size (e.g., height and width), shape (e.g., square, round, etc.), and/or function information (e.g., a key, symbol, or function corresponding to the key). Touchpad zone layout information can include size (e.g., height and width), shape (e.g., square, round, etc.), and/or function information (e.g., an indication of how touch information is to be processed, such as a 1-dimension slider or a 2-dimension area). For example, the layout information can comprise configuration of the touchpad area (e.g., size, shape, and/or type), location of the touch area on the interface map, and the function performed by the touch area when used.

[067] Table 1 below depicts a simplified example of layout information for a number of keys and a touchpad area (touchpad zone). The "type" column indicates whether the element is a key or touchpad area, the "height x width" column indicates the height and width of the key or touchpad area, the "location" column indicates an x/y offset from the upper-left corner of the interface map where the key or touchpad area is located, and the "key/function" column indicates how touches within the key or touchpad area are processed.

(The Table 1 entries appear only as an image in the published document and are not reproduced here.)

Table 1 - Example Layout Information

[068] The interface map can be a pre-defined interface map. For example, a multi-touch input device can be provided with a standard pre-defined interface map (e.g., a standard English-language keyboard interface map), or multiple standard pre-defined interface maps can be provided from which a selection can be made. For example, a purchaser of a multi-touch input device may be provided with the option of a standard keyboard layout interface map with a number pad or a standard keyboard layout interface map with a touchpad in place of the number pad. One or more standard pre-defined interface maps can be provided with an input device (e.g., pre-attached to the input device or user-attached), or purchased separately.

[069] The interface map can be a custom or user-defined interface map. For example, a custom interface map can be defined (e.g., by a user) using pre-defined elements and/or by defining the layout using individual or custom elements. Pre-defined elements can include blocks of keys (e.g., a block of letters, numbers, and associated space bar, control key, etc., a block of arrow keys, a block of function keys, a block of keys for a number pad, etc.) or pre-defined touchpad areas (e.g., a touchpad zone of a pre-determined size for use as a cursor movement or mouse area). Defining the layout using individual or custom elements can include selecting individual elements (e.g., specific keys, such as letters, numbers, function keys, arrow keys, etc., including location and/or size) and/or defining custom elements (e.g., defining the custom element by size, shape, function, etc.). For example, a custom element can be a key of a specific size, shape, and location that performs a simple or complex function (e.g., a key that types a sequence of letters or that performs one or more actions).
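Because the Table 1 entries appear only as an image in the published document, the four described columns can be illustrated with hypothetical layout records (all values below are invented examples, not the patent's actual data):

```python
# Hypothetical layout records with the four fields described for Table 1:
# type, height/width, location (x/y offset from the upper-left corner of
# the interface map), and key/function.
layout_info = [
    {"type": "key",      "height": 18, "width": 18,
     "location": (10, 40),  "function": "A"},
    {"type": "key",      "height": 18, "width": 36,
     "location": (30, 40),  "function": "shift"},
    {"type": "touchpad", "height": 80, "width": 100,
     "location": (200, 20), "function": "2-dimension cursor area"},
]

# Keys and touchpad zones can be processed differently by the controller.
keys = [entry for entry in layout_info if entry["type"] == "key"]
print(len(keys))  # 2
```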

[070] Interface maps can be created using a web site. For example, a user can create a custom interface map using a web site (e.g., using a graphical user interface to create the layout of the interface map). The user can then order or purchase the custom interface map. The custom interface map can be printed and delivered to the user. The user can then attach the custom interface map to a multi-touch input device. The user can also receive a configuration or data file (e.g., a firmware file) comprising layout information for the custom interface map (e.g., as a downloaded file or on a storage device received with the interface map). The user can install (e.g., via a firmware update) the configuration or data file on the multi-touch input device (e.g., via a USB or wireless connection from the user's computer), thus configuring the multi-touch input device to process touch events according to the custom interface map created by the user. In addition to a configuration file, a custom driver can be provided allowing a computing device (e.g., a computer) to process events from the input device (e.g., events for custom functions, other than standard key and mouse events).

[071] Interface maps can also be created locally by a user. For example, a user can create a custom interface map using software on the user's computer (e.g., a custom interface map design application). The user can then order or purchase the custom interface map (e.g., by uploading or sending the custom interface map to the manufacturer or to a third party), or the user can locally print the custom interface map. For example, the user can use a variety of printers (e.g., UV printers, inkjet printers, and solvent printers) to print the custom interface map on a sheet or film (e.g., on a polycarbonate film) and attach the custom interface map to a multi-touch input device. The user can also generate (e.g., via the software on the user's computer) a configuration or data file (e.g., a firmware file) comprising layout information for the custom interface map. The user can install (e.g., via a firmware update) the configuration or data file on the multi-touch input device (e.g., via a USB or wireless connection from the user's computer), thus configuring the multi-touch input device to process touch events according to the custom interface map created by the user (e.g., by detecting one or more touches by a finger or other object and mapping the location(s), according to the custom interface map, to generate key press and/or touchpad movement events and communicate the events to an attached computing device). In addition to a configuration file, a custom driver can be provided allowing a computing device (e.g., a computer) to process events from the input device (e.g., events for custom functions, other than standard key and mouse events).

[072] The interface maps can be attached to the input device at a specific location. For example, indicators (e.g., raised bumps, imprinted marks, etched marks, etc.) can be present on the input device interface panel for determining where to attach the interface map. The indicators can be located, for example, at or near one or more corners of the input device interface panel corresponding to one or more corners of the interface map.

[073] The interface map can be a transparent or semi-transparent film (e.g., a transparent or semi-transparent polycarbonate film). The interface map can also be colored (e.g., a transparent film with a colored tint), such as with colors that fluoresce under UV light. The layout that is printed on the interface map can also be printed in a variety of colors, including colors that fluoresce under UV light. The layout that is printed on the interface map can be printed in an outline format showing the outline of keys and/or touchpad zones, in addition to symbols indicating the function or action performed by the key and/or touchpad zone (e.g., a letter, such as "S," indicating that the key will type the letter "S" when touched).

[074] The interface map can be attached or applied (e.g., removably attached or applied) to the top or bottom surface of the interface panel. For example, the interface map can be a polycarbonate film coated with a transparent adhesive (e.g., a "static film") allowing the interface map to be attached, and later removed if needed (e.g., to replace the interface map with a new interface map having the same layout or a different layout).

[075] In some implementations the interface map can be permanently attached to the top or bottom surface of the interface panel. In yet other implementations, instead of being a separate film or sheet, the interface map can be integrated with the interface panel (e.g., printed directly on the surface of the interface panel or etched into the interface panel).

[076] In some implementations, the interface map may be changed by a user, and a corresponding configuration file provided to load a new interface map. For example, an interface map comprising a standard QWERTY keyboard printed on a static film may be applied to an interface panel with a corresponding configuration file, allowing the input device to be used as a Western-alphabet keyboard. The user could then remove and replace the interface map with a different interface map having a Chinese-character keyboard layout, load the corresponding configuration file (e.g., as a firmware update to the input device), and use the input device as a Chinese character keyboard. The user could also create her own customized interface map and configuration file, for a desired specific functionality, print the interface map on static film, and apply it to the interface panel. The interface panel may be made from extremely rugged materials, such as tempered glass, GORILLA GLASS™, acrylic, or other materials which can easily transmit IR wavelengths.

Methods for Creating Interface Maps

[077] FIG. 8 is a flowchart of an exemplary method 800 for creating a custom interface map for a frustrated total internal reflection (FTIR) multi-touch input device. At 810, layout information is received for a custom interface map. The layout information defines a plurality of touch areas. For example, the touch areas can be keys and/or touchpad zones. The layout information can be received from a user using a local software application or from a user using a remote service (e.g., via a web site).

[078] At 820, a layout is generated for the custom interface map according to the received layout information 810. For example, the layout can depict outlines and/or symbols for the various touch areas (e.g., keys and touchpad zones) defined by the layout information.

[079] At 830, a configuration file is generated from the received layout information 810. The configuration file is loadable on the FTIR multi-touch input device to configure the FTIR multi-touch input device to use the custom interface map. In addition, a custom driver can be created to support input device functions (e.g., functions that use custom processing) on an associated computing device.

[080] At 840, the configuration file is output. For example, the configuration file can be stored on a server computer or on a local computer. The configuration file can be delivered (e.g., separately or with the custom interface map) to a user to install on the multi-touch input device (e.g., via a USB or wireless firmware update of the multi-touch input device). In addition, a custom driver can be provided for storage and/or installation.
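As a sketch of step 830, layout information could be serialized into a loadable configuration file; the JSON schema, field names, and file name below are assumptions for illustration, not the patent's actual firmware format:

```python
import json

def generate_config(layout_info, path):
    """Write layout information to a JSON configuration file that an input
    device could load. The schema here is illustrative only."""
    config = {"version": 1, "zones": layout_info}
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    return config

# Hypothetical single-key layout serialized to a config file.
layout_info = [
    {"type": "key", "height": 18, "width": 18,
     "location": [10, 40], "function": "A"},
]
config = generate_config(layout_info, "custom_map.json")
print(config["zones"][0]["function"])  # A
```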

[081] The custom interface map can be created by printing the generated layout 830 on a transparent film. The interface map can be printed, for example, at a manufacturer, third party, or locally by a user.

[082] For example, an online web service can provide a design tool for receiving the layout information 810 from a user. The online web service can generate the layout 820, generate the configuration file 830, and output the configuration file (e.g., store the configuration file and/or send the configuration file to the user). The online web service can provide the layout for printing on a transparent film (e.g., at a manufacturer of the multi-touch input device or a third party printing service) and delivery to the user.

[083] A user can also print the custom interface map locally. For example, the user can use local software (e.g., running on the user's computer) or a remote service to design the custom interface map and print the layout on transparent film using a local printer. The user can also generate or download a corresponding configuration file for the custom interface map and install it on the multi-touch input device along with the transparent film with the printed custom layout (the custom interface map). The user can also generate or download a corresponding custom driver to install on the user's computing device to support the input device with the custom layout.

UV Sterilization

[084] Sterilization technology can be used with any of the multi-touch input devices described herein. For example, specific wavelengths of UV light can be used to sterilize the surface of the multi-touch input device (e.g., to sterilize the interface panel, including the interface map, of the multi-touch input device). In some implementations, the ultraviolet light is emitted at wavelengths of approximately 265nm to 280nm.

[085] In some implementations, sterilization is performed using UV light. UV light can be applied to the interface panel of a multi-touch input device. For example, UV LEDs can be used to direct UV light into the edge of the interface panel. The UV light can provide a sterilization effect to the multi-touch input device from inside the interface panel. Alternatively, UV light can be directed to the outside (e.g., top or bottom) surface of the interface panel (e.g., separately or in combination with internally-directed UV light).

[086] In an example implementation, the light source (e.g., of multi-touch input devices depicted in Figs. 1 through 5) can comprise UV LEDs (e.g., in addition to IR LEDs). The UV LEDs can be enabled when a UV sterilization effect is desired. For example, the UV LEDs can be enabled at pre-defined times (e.g., on a schedule) or at user-defined times (e.g., on a user-defined schedule or manually enabled). The UV LEDs can be enabled for a specific duration sufficient to provide a sterilization effect to the multi-touch input device (e.g., a number of seconds or minutes, such as 2-3 minutes).

[087] The UV LEDs can be enabled during a cleaning cycle of the multi-touch input device. For example, the cleaning cycle can be enabled according to a schedule (e.g., once a day when the input device is not in use, such as during the night). The cleaning cycle can be enabled when the input device is in a sleep mode (e.g., when the input device has not been used for an amount of time, such as a number of minutes or hours, and/or when a computing device connected to the input device has not been used for an amount of time). The cleaning cycle can be enabled during a period of inactivity according to an inactivity timer (e.g., when the input device has been inactive for a number of minutes or hours).

[088] The interface panel of the multi-touch input device can be made from a material (e.g., specific types of glass or glass blend) that does not attenuate (or does not significantly attenuate) the specific UV wavelengths used.
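The inactivity-timer trigger for a cleaning cycle might be sketched as follows (the function name, threshold value, and timing scheme are illustrative assumptions):

```python
import time

def should_start_cleaning(last_activity, now=None, idle_threshold=30 * 60):
    """Decide whether to start a UV cleaning cycle: true once the input
    device has been inactive for at least `idle_threshold` seconds."""
    now = time.time() if now is None else now
    return (now - last_activity) >= idle_threshold

# Inactive for an hour -> start the 2-3 minute UV cycle; one minute -> not yet.
print(should_start_cleaning(last_activity=0, now=3600))  # True
print(should_start_cleaning(last_activity=0, now=60))    # False
```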

[089] The UV cleaning cycle can be user-initiated. For example, a user can manually initiate a UV cleaning (e.g., a 2-3 minute UV cleaning cycle).

Removing Infrared Interference

[090] Infrared interference can be removed from digital images captured using any of the multi-touch input devices described herein. For example, a first digital image can be captured with an infrared light source of a multi-touch input device turned off, and a second digital image can be captured with the infrared light source turned on. Because the first digital image is captured with the infrared light source turned off, any infrared light detected in the first digital image will be the result of an infrared light source other than the infrared light source of the multi-touch input device. Such infrared interference present in the first digital image can be removed from the second digital image. For example, various image processing techniques can be applied to filter out or remove (e.g., subtract) the infrared interference during image processing (e.g., to produce a processed image with the infrared interference removed).

[091] There can be many sources of infrared interference in everyday environments. Infrared interference can interfere with the proper determination of touch events on a FTIR multi-touch input device. For example, an external infrared light source, such as an IR remote control, can be detected by the digital camera of the input device. Depending on the size, shape, and/or duration of the detected IR remote control light source, the input device could determine that a touch event has occurred and, for example, activate a corresponding keyboard key.

[092] Figs. 9A, 9B, and 9C are diagrams depicting example digital images, from which infrared interference can be detected and removed. The first digital image 900 in Fig. 9A represents a digital image that is captured by a digital camera of a multi-touch input device when an infrared light source of the multi-touch input device is turned off. Because the infrared light source of the input device is turned off, any detected infrared light will be from a source other than the infrared light source of the input device. In addition, even if an interface panel of the input device is being touched (e.g., by a person's finger), the touch location will not be detected because the infrared light source of the input device is off.

[093] The first digital image 900 depicts two locations where infrared light is detected, 902 and 904. The two locations of infrared light, 902 and 904, detected when the infrared light source is turned off are two locations of infrared interference. For example, the infrared interference could be generated by an IR remote control. The two locations of infrared light, 902 and 904, are considered to be infrared interference because they could be incorrectly interpreted as touch events.

[094] The second digital image 920 in Fig. 9B represents a digital image that is captured by a digital camera of a multi-touch input device when an infrared light source of the multi-touch input device is turned on. Because the infrared light source of the input device is turned on, infrared light will be detected from touch events (e.g., scattered infrared light from a person touching an interface panel of the input device) as well as from any other infrared light source. Therefore, the second digital image 920 will depict actual touch events in addition to any infrared interference.

[095] The second digital image 920 depicts three locations where infrared light is detected, 922, 924, and 926. The two locations of infrared light, 922 and 924, correspond to the two locations of infrared light, 902 and 904, detected in image 900. An additional location of infrared light 926 is also depicted in the second digital image 920.

[096] The third digital image 940 in Fig. 9C represents a digital image that can be generated using the first digital image 900 and the second digital image 920. The third digital image 940 depicts detected infrared light with infrared interference removed. Specifically, the third digital image 940 depicts one location of detected infrared light 946, which can be determined to be a touch event (e.g., a valid touch event). For example, the third digital image 940 can be a new digital image that is created by taking the second digital image 920 and subtracting the first digital image 900. The third digital image 940 can also represent a modified version of the second digital image 920 (e.g., modified by removing the locations of detected infrared light present in the first digital image 900 from the second digital image 920).

[097] In a first example image processing technique, the second digital image is modified to filter out (e.g., subtract) the infrared interference from the first digital image. The result of the first image processing technique is a modified second digital image.

[098] In a second example image processing technique, a new digital image is created by subtracting the first digital image from the second digital image. The result of the second image processing technique is the new digital image, which depicts detected infrared light present in the second digital image, but not the first digital image.

[099] In a third example image processing technique, locations of detected infrared light are determined from the first digital image and from the second digital image. For example, the locations can be determined based on size, shape, position, and/or intensity. Specific locations of detected infrared light that are present in both the first and second digital images can then be removed (e.g., they can be discarded from consideration as valid touch events). The result of the third image processing technique is any specific locations of detected infrared light that are present in the second digital image and that do not have any corresponding locations in the first digital image (e.g., corresponding locations 902 and 922, and 904 and 924, can be removed, leaving location 926 as a valid touch event).
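The subtraction-based techniques above can be sketched over 2D grayscale arrays (a minimal pure-Python illustration; frame format and values are invented examples):

```python
def remove_ir_interference(off_frame, on_frame):
    """Subtract the IR-source-off frame from the IR-source-on frame,
    clamping at zero, so light present in both frames (ambient IR
    interference) is removed and only touch-scattered light remains.
    Frames are 2D lists of 0-255 intensities of equal size.
    """
    return [[max(on - off, 0) for on, off in zip(on_row, off_row)]
            for on_row, off_row in zip(on_frame, off_frame)]

off = [[0, 200], [0, 0]]    # interference only, e.g. an IR remote at (0, 1)
on  = [[0, 200], [180, 0]]  # same interference plus a touch at (1, 0)
clean = remove_ir_interference(off, on)
print(clean)  # [[0, 0], [180, 0]]
```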

[0100] In order to detect infrared interference, some digital images can be captured by an FTIR multi-touch input device when the infrared light source (e.g., one or more IR LEDs) of the FTIR multi-touch input device is on, and other digital images can be captured when the infrared light source is off. For example, digital images can be captured at a first rate (e.g., a predetermined number of images per second) and the infrared light source can be configured to flicker at a second rate. By setting the first rate to be greater than the second rate, some of the digital images will be captured when the infrared light source is off. The digital images that are captured when the infrared light source is off can show infrared interference and can be used to process digital images taken when the infrared light source is on to remove the infrared interference.

[0101] In some implementations, the infrared light source is configured to switch between on and off between each successive digital image. For example, a first digital image can be captured with the IR light source on, the next digital image can be captured with the IR light source off, the next digital image can be captured with the IR light source on, and so on. For example, if the rate of digital image capture is 30 images per second, then the cycling rate of the IR light source can be set to 15 cycles per second, such that every other image is captured with the IR light source on. Alternatively, other rates of capture and IR cycling can be used (e.g., image capture at 50 images per second with IR light cycling at 25 cycles per second).

Similarly, other ratios of images captured with and without the IR light source can be used. For example, every third or every fourth digital image can be captured with the IR light source turned off.

[0102] Fig. 10 depicts an example method 1000 for removing infrared interference. At 1010, a first digital image is captured with an infrared light source turned off. At 1020, a second digital image is captured with the infrared light source turned on. The second digital image can be the next image captured after the first digital image. The first and second digital images can be captured by a digital camera of an FTIR multi-touch input device. The infrared light source can be configured to emit infrared light within an interface panel of the FTIR multi-touch input device.

[0103] At 1030, the first and second digital images can be processed to remove infrared interference. For example, infrared interference from the first captured digital image 1010 can be subtracted from the second captured digital image 1020. In some implementations, the second digital image is filtered, based at least in part upon the first digital image, to subtract infrared interference present in the first digital image from the second digital image.

[0104] In some implementations, the first digital image is analyzed to determine whether infrared interference is present (e.g., whether the size, shape, content, and/or position of infrared light is sufficient to be capable of interfering with the detection of touch events). When infrared interference is present (e.g., only when the infrared interference is present), the second digital image can be filtered to remove the infrared interference. Alternatively, a new digital image can be created by removing (e.g., subtracting) the infrared interference present in the first digital image from corresponding infrared interference present in the second digital image.

[0105] In some implementations, infrared interference is detected in digital images captured when the infrared light source is off. Compensation for the infrared interference is then performed for digital images captured when the infrared light source is on. For example, the digital images captured when the infrared light source is on can be filtered to remove the infrared interference, filtering can be performed when (e.g., only when) infrared interference is present in a corresponding digital image captured when the infrared light source is off, and/or processed digital images can be generated by subtracting the infrared interference.

Removing Contamination from Digital Images

[0106] Contamination can be removed from digital images captured using any of the multi-touch input devices described herein. For example, a digital image can be captured with an infrared light source of a multi-touch input device turned on and when an interface panel of the multi-touch input device is not being touched. The digital image can be captured, for example, when the input device is turned on (e.g., as part of an initialization or wake-up process).

[0107] The digital image can be processed to determine whether any contamination is present. For example, any infrared light detected in the digital image can be treated as contamination. Alternatively, the digital image can be compared to a reference image (e.g., a digital image captured when the input device was manufactured or during a setup process) and any difference can be treated as contamination. Contamination information can be determined from the digital image (e.g., information indicating size, shape, intensity, and/or position of infrared light corresponding to instances of contamination). One or more subsequent images can then be filtered to remove detected infrared light resulting from the contamination (e.g., to subtract instances of contamination, using the contamination information, from the subsequent images).
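The contamination-information approach can be sketched as a mask built from a touch-free reference frame and applied to later frames. This is an illustrative sketch, assuming NumPy grayscale arrays; the intensity `threshold` separating detected infrared light from background is an assumed parameter.

```python
import numpy as np

def build_contamination_mask(reference_image, threshold=20):
    """Treat any detected infrared light in an image captured while the
    panel is untouched as contamination; record where it appears."""
    return reference_image > threshold

def filter_contamination(image, mask):
    """Zero out pixels at contaminated locations in a subsequent image
    so they are not considered as touch events."""
    filtered = image.copy()
    filtered[mask] = 0
    return filtered
```

Light at masked locations (e.g., a crumb on the panel) is subtracted from every subsequent frame, while light at new locations (a touch) passes through unchanged.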

[0108] In another implementation, a first digital image can be captured with an infrared light source of a multi-touch input device turned on and when an interface panel of the multi-touch input device is not being touched. The first image can represent a "default" or "reference" image (e.g., an image captured when the device is manufactured, first activated, or at a later time such as during a setup process). A second digital image can be captured at a later time with the infrared light source turned on and when the interface panel is not being touched. The second digital image can be captured, for example, when the input device is turned on or during an automatic or user-initiated contamination detection operation. The first and second digital images can be processed to determine whether any contamination is present in the second digital image in comparison to the first digital image. For example, contamination information (e.g., a contamination mask) can be determined that indicates the size, shape, intensity, and/or position of infrared light created by the contamination. One or more subsequent images can then be filtered, using the contamination information, to remove detected infrared light corresponding to the contamination.

[0109] There can be many sources of contamination in everyday environments. For example, contamination (e.g., dust, dirt, grease, food crumbs, coffee, and other types of debris) can interfere with the proper determination of touch events on an FTIR multi-touch input device. For example, such debris can scatter infrared light being reflected within the interface panel of the input device and be detected by the digital cameras. The detected infrared light from such contamination may be difficult to distinguish from touch events. For example, depending on the size, shape, and/or duration of the detected infrared light from the contamination, the input device could determine that a touch event has occurred and activate a corresponding keyboard key.

[0110] Figs. 11A, 11B, and 11C are diagrams depicting example digital images from which infrared contamination can be detected and removed. The first digital image 1100 in Fig. 11A represents a digital image that is captured by a digital camera of a multi-touch input device when an infrared light source of the multi-touch input device is turned on, when the multi-touch input device is not being touched, and when an interface map is applied to an interface panel of the multi-touch input device. The first digital image 1100 can be captured, for example, during a startup process of the multi-touch input device (e.g., when the multi-touch input device is being powered on or waking from a sleep mode). The first digital image 1100 can also be captured, for example, during a contamination detection operation (e.g., an automatic or manual contamination detection operation). The first digital image 1100 depicts infrared light that is scattered from any contamination present on the interface panel of the multi-touch input device. Specifically, there are three locations of scattered infrared light, 1102, 1104, and 1106, from contamination depicted in the first digital image 1100. The first digital image 1100 also depicts a faint outline of scattered infrared light from the interface map. Depending on the type of interface map used, infrared light may or may not be scattered by the interface map. Depending on implementation details, scattered infrared light from the interface map may or may not be processed as contamination and removed from subsequent images.

[0111] The second digital image 1120 in Fig. 11B represents a digital image that is captured after the first digital image 1100. The second digital image 1120 can be one of a number of digital images captured during use of the multi-touch input device (e.g., when the multi-touch input device is being touched). The second digital image 1120 depicts infrared light scattered from three locations of contamination (1122, 1124, and 1126), which correspond to the three locations detected in the first digital image (1102, 1104, and 1106). The second digital image also depicts an additional location of scattered infrared light 1128.

[0112] Contamination information can be determined, at least in part, using the first digital image 1100. For example, the three locations of contamination (1102, 1104, and 1106) can be identified. The contamination information can be used to filter subsequent images. For example, the corresponding locations (1122, 1124, and 1126) in the second digital image 1120 can be filtered (e.g., removed or subtracted). Remaining locations (e.g., location 1128) can be identified as touch locations (e.g., as valid touch locations).

[0113] The third digital image 1140 in Fig. 11C represents a processed image in which the contamination present in the first digital image 1100, in addition to the interface map outline, has been removed from the second digital image 1120. The processed digital image 1140 depicts a touch location 1128 that remains after the contamination (including the interface map outline) has been removed. The remaining touch location 1128 can be identified as a valid touch location, and processing of the touch location can be performed (e.g., a button press, touchpad movement, etc.).

[0114] Fig. 12 depicts a flowchart for an example method 1200 for removing contamination from digital images. At 1210, a first digital image is captured when an infrared light source of an FTIR multi-touch input device is turned on and when an interface panel of the input device is not being touched. At 1220, contamination information is determined based at least in part upon the first digital image. In some implementations, a reference digital image is captured prior to capturing the first digital image 1210 (e.g., during manufacturing). The reference digital image can be used, for example, in determining contamination information 1220 (e.g., the reference digital image can be used to distinguish between scattered infrared light from an interface map and scattered infrared light from other sources).

[0115] At 1230, a plurality of digital images are captured after the first digital image. The plurality of additional digital images are captured when the infrared light source is turned on. The plurality of additional digital images can be captured when the input device is being touched (e.g., when the input device is being used).

[0116] At 1240, the plurality of additional digital images are processed based at least in part upon the contamination information to remove contamination from the plurality of additional digital images.

Touch and Rest

[0117] Various types of touch events can be determined using an FTIR multi-touch input device. In some situations, it can be desirable to distinguish between different types of events, including touch (single, multiple, and/or simultaneous multiple), rest, and/or movement events. For example, during use of an input device, users may rest their fingers on the device (e.g., rest their fingers on a keyboard input device in the home position in preparation for the next keystroke). In order to avoid recognizing such rest events as touches (e.g., key presses), touch events can be distinguished from rest events.

[0118] In some implementations, a touch is distinguished from a rest based at least in part upon duration. For example, a touch that lasts for more than one-half second can be determined to be a rest (e.g., and no key activation or touchpad movement events are performed when a rest is detected). Additional criteria can be used to distinguish a rest from a touch. For example, movement criteria can be used in combination with duration to distinguish between a touch and a rest. In a specific implementation, a touch that lasts for more than one-half second and that moves less than 2 mm (e.g., as determined by movement of the center of the touch area) is considered to be a rest.
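The specific implementation described (duration above one-half second, movement under 2 mm) can be expressed as a small classifier. This is a sketch under the stated thresholds; the function name and the representation of a contact as start/end centers in millimeters are assumptions for illustration.

```python
import math

def classify_touch(duration_s, start_center, end_center,
                   rest_duration_s=0.5, rest_movement_mm=2.0):
    """Classify a tracked contact as a 'touch' or a 'rest'.

    A contact lasting longer than rest_duration_s whose center moved
    less than rest_movement_mm is treated as a rest (no key activation
    or touchpad movement is performed); anything else is a touch.
    Centers are (x, y) positions in millimeters.
    """
    movement = math.dist(start_center, end_center)
    if duration_s > rest_duration_s and movement < rest_movement_mm:
        return "rest"
    return "touch"
```

A finger parked on the home row for a second classifies as a rest, while a quick tap, or a long contact that slides across the panel, classifies as a touch.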

Example Computing Systems

[0119] FIG. 13 depicts a generalized example of a suitable computing system 1300 in which the described innovations may be implemented. The computing system 1300 is not intended to suggest any limitation as to scope of use or functionality, as the innovations may be implemented in diverse general-purpose or special-purpose computing systems.

[0120] With reference to FIG. 13, the computing system 1300 includes one or more processing units 1310, 1315 and memory 1320, 1325. In FIG. 13, this basic configuration 1330 is included within a dashed line. The processing units 1310, 1315 execute computer-executable instructions. A processing unit can be a general-purpose central processing unit (CPU), a processor in an application-specific integrated circuit (ASIC), or any other type of processor. In a multi-processing system, multiple processing units execute computer-executable instructions to increase processing power. For example, FIG. 13 shows a central processing unit 1310 as well as a graphics processing unit or co-processing unit 1315. The tangible memory 1320, 1325 may be volatile memory (e.g., registers, cache, RAM), nonvolatile memory (e.g., ROM, EEPROM, flash memory, etc.), or some combination of the two, accessible by the processing unit(s). The memory 1320, 1325 stores software 1380 implementing one or more innovations described herein, in the form of computer-executable instructions suitable for execution by the processing unit(s).

[0121] A computing system may have additional features. For example, the computing system 1300 includes storage 1340, one or more input devices 1350, one or more output devices 1360, and one or more communication connections 1370. An interconnection mechanism (not shown) such as a bus, controller, or network interconnects the components of the computing system 1300. Typically, operating system software (not shown) provides an operating environment for other software executing in the computing system 1300, and coordinates activities of the components of the computing system 1300.

[0122] The tangible storage 1340 may be removable or non-removable, and includes magnetic disks, magnetic tapes or cassettes, CD-ROMs, DVDs, or any other medium which can be used to store information in a non-transitory way and which can be accessed within the computing system 1300. The storage 1340 stores instructions for the software 1380 implementing one or more innovations described herein.

[0123] The input device(s) 1350 may be a touch input device such as a keyboard, mouse, pen, or trackball, a voice input device, a scanning device, or another device that provides input to the computing system 1300. For video encoding, the input device(s) 1350 may be a camera, video card, TV tuner card, or similar device that accepts video input in analog or digital form, or a CD-ROM or CD-RW that reads video samples into the computing system 1300. The output device(s) 1360 may be a display, printer, speaker, CD-writer, or another device that provides output from the computing system 1300.

[0124] The communication connection(s) 1370 enable communication over a communication medium to another computing entity. The communication medium conveys information such as computer-executable instructions, audio or video input or output, or other data in a modulated data signal. A modulated data signal is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media can use an electrical, optical, RF, or other carrier.

[0125] The innovations can be described in the general context of computer-executable instructions, such as those included in program modules, being executed in a computing system on a target real or virtual processor. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Computer-executable instructions for program modules may be executed within a local or distributed computing system.

[0126] The terms "system" and "device" are used interchangeably herein. Unless the context clearly indicates otherwise, neither term implies any limitation on a type of computing system or computing device. In general, a computing system or computing device can be local or distributed, and can include any combination of special-purpose hardware and/or general-purpose hardware with software implementing the functionality described herein.

[0127] For the sake of presentation, the detailed description uses terms like "determine" and "use" to describe computer operations in a computing system. These terms are high-level abstractions for operations performed by a computer, and should not be confused with acts performed by a human being. The actual computer operations corresponding to these terms vary depending on implementation.

Example Implementations

[0128] Although the operations of some of the disclosed methods are described in a particular, sequential order for convenient presentation, it should be understood that this manner of description encompasses rearrangement, unless a particular ordering is required by specific language set forth below. For example, operations described sequentially may in some cases be rearranged or performed concurrently. Moreover, for the sake of simplicity, the attached figures may not show the various ways in which the disclosed methods can be used in conjunction with other methods.

[0129] Any of the disclosed methods can be implemented as computer-executable instructions or a computer program product stored on one or more computer-readable storage media and executed on a computing device (e.g., any available computing device, including smart phones or other mobile devices that include computing hardware). Computer-readable storage media are any available tangible media that can be accessed within a computing environment (e.g., non-transitory computer-readable media, such as one or more optical media discs such as DVD or CD, volatile memory components (such as DRAM or SRAM), or nonvolatile memory components (such as flash memory or hard drives)). By way of example and with reference to Fig. 13, computer-readable storage media include memory 1320 and 1325, and storage 1340. As should be readily understood, the term computer-readable storage media does not include communication connections (e.g., 1370) such as modulated data signals.

[0130] Any of the computer-executable instructions for implementing the disclosed techniques, as well as any data created and used during implementation of the disclosed embodiments, can be stored on one or more computer-readable storage media (e.g., non-transitory computer-readable media). The computer-executable instructions can be part of, for example, a dedicated software application or a software application that is accessed or downloaded via a web browser or other software application (such as a remote computing application). Such software can be executed, for example, on a single local computer (e.g., any suitable commercially available computer) or in a network environment (e.g., via the Internet, a wide-area network, a local-area network, a client-server network (such as a cloud computing network), or other such network) using one or more network computers.

[0131] For clarity, only certain selected aspects of the software-based implementations are described. Other details that are well known in the art are omitted. For example, it should be understood that the disclosed technology is not limited to any specific computer language or program. For instance, the disclosed technology can be implemented by software written in C++, Java, Perl, JavaScript, Adobe Flash, or any other suitable programming language. Likewise, the disclosed technology is not limited to any particular computer or type of hardware. Certain details of suitable computers and hardware are well known and need not be set forth in detail in this disclosure.

[0132] Furthermore, any of the software-based embodiments (comprising, for example, computer-executable instructions for causing a computer to perform any of the disclosed methods) can be uploaded, downloaded, or remotely accessed through a suitable communication means. Such suitable communication means include, for example, the Internet, the World Wide Web, an intranet, software applications, cable (including fiber optic cable), magnetic communications, electromagnetic communications (including RF, microwave, and infrared communications), electronic communications, or other such communication means.

[0133] The disclosed methods, apparatus, and systems should not be construed as limiting in any way. Instead, the present disclosure is directed toward all novel and nonobvious features and aspects of the various disclosed embodiments, alone and in various combinations and subcombinations with one another. The disclosed methods, apparatus, and systems are not limited to any specific aspect or feature or combination thereof, nor do the disclosed embodiments require that any one or more specific advantages be present or problems be solved.

Alternatives

[0134] The technologies from any example or implementation can be combined with the technologies described in any one or more of the other examples or implementations. In view of the many possible embodiments to which the principles of the disclosed technology may be applied, it should be recognized that the illustrated embodiments are examples of the disclosed technology and should not be taken as a limitation on the scope of the disclosed technology. Rather, the scope of the disclosed technology includes what is covered by the following claims. We therefore claim as our invention all that comes within the scope and spirit of the claims.

Claims

We claim:
1. A frustrated total internal reflection (FTIR) multi-touch input device, the input device comprising:
a base portion;
a transparent interface panel connected to the base portion;
an interface map attached to the interface panel;
an infrared light source for emitting infrared light inside the interface panel using total internal reflection; and
one or more digital cameras, wherein the one or more digital cameras are configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched.
2. The input device of claim 1 wherein the interface map is a transparent film that is removably attached to the interface panel.
3. The input device of claim 1 wherein the interface map is a polycarbonate film that is removably attached to the interface panel.
4. The input device of claim 1 wherein the interface map depicts a layout comprising one or more touch areas, wherein each touch area is one of a key and a touchpad zone, and wherein the layout is defined by layout information.
5. The input device of claim 4 wherein the layout information comprises, for each touch area:
configuration of the touch area;
location of the touch area on the interface map; and
function performed by the touch area when touched.
6. The input device of claim 1 wherein the interface map is a predefined interface map depicting a keyboard layout, and wherein the multi-touch input device is a multi-touch keyboard input device.
7. The input device of claim 1 wherein the interface map is a custom interface map comprising one or more touch areas defined by a user, wherein the one or more touch areas comprise one or more keys and/or one or more touchpad zones.
8. The input device of claim 1 wherein the interface map is a transparent film, wherein the interface map depicts a layout comprising one or more touch areas, and wherein the layout is printed on the transparent film.
9. The input device of claim 1 wherein the interface map depicts a layout comprising one or more touch areas, and wherein the layout is defined by layout information that is stored by the input device.
10. The input device of claim 1 further comprising:
a processing unit; and
memory;
wherein the interface map depicts a layout comprising one or more touch areas, wherein the layout is defined by layout information that is stored on the memory of the input device, and wherein the processing unit is configured to process touch events according to the layout information.
11. A method, implemented at least in part by a computing device, for creating a custom interface map for a multi-touch input device, the method comprising:
by the computing device:
receiving layout information for a custom interface map, wherein the layout information defines a plurality of touch areas, wherein each touch area is one of a key and a touchpad zone;
generating a layout for the custom interface map according to the layout information;
generating a configuration file from the received layout information, wherein the configuration file is loadable on the multi-touch input device to configure the multi-touch input device to use the custom interface map and to process touch events according to the custom interface map; and
outputting the configuration file.
12. The method of claim 11, wherein the multi-touch input device is a frustrated total internal reflection (FTIR) multi-touch input device, wherein the custom interface map is created by printing the layout on a transparent film, and wherein the transparent film is removably attachable to the FTIR multi-touch input device.
13. The method of claim 11 further comprising:
providing the layout for printing on a transparent film to create the custom interface map.
14. The method of claim 11 wherein the layout information is received from a user via a custom interface map design application.
15. The method of claim 11 wherein the layout information comprises, for each touch area:
configuration of the touch area;
location of the touch area on the interface map; and
function performed by the touch area when touched.
16. A frustrated total internal reflection (FTIR) multi-touch input device, the input device comprising:
a base portion;
a transparent interface panel connected to the base portion, wherein the transparent interface panel comprises an interface map with a layout indicating a plurality of touch zones;
an infrared light source for emitting infrared light inside the interface panel using total internal reflection;
an ultraviolet light source for emitting ultra-violet light inside the interface panel; and
one or more digital cameras, wherein the one or more digital cameras are configured to detect infrared light that is scattered from the interface panel using FTIR when the interface panel is touched.
17. The input device of claim 16, wherein the ultraviolet light is emitted at wavelengths of approximately 265 nm to 280 nm.
18. The input device of claim 16, wherein the ultraviolet light provides a sterilization effect to the transparent interface panel.
19. The input device of claim 16, wherein the ultraviolet light source comprises one or more ultraviolet light-emitting diodes (LEDs).
20. The input device of claim 16, wherein the transparent interface panel comprises a material that does not attenuate ultraviolet light.
21. The input device of claim 16, wherein the ultraviolet light source only operates during a cleaning cycle.
22. The input device of claim 16, wherein the cleaning cycle is initiated during a sleep mode of the input device.
23. A custom interface map and configuration file for a multi-touch input device, comprising:
a custom interface map, wherein the custom interface map is defined by layout information, wherein the layout information comprises, for each of a plurality of touch areas:
configuration of the touch area;
location of the touch area on the custom interface map; and
function performed by the touch area when touched; and
a configuration file corresponding to the custom interface map, wherein the configuration file is loadable on the multi-touch input device to configure the multi-touch input device to:
use the custom interface map; and
process touch events according to the layout information.
24. The custom interface map and configuration file of claim 23, wherein the multi-touch input device is a frustrated total internal reflection (FTIR) multi-touch input device, wherein the plurality of touch areas includes at least one key and at least one touchpad zone, and wherein the custom interface map is created by printing a layout according to the layout information on a transparent film, wherein the transparent film is removably attachable to the FTIR multi-touch input device.
PCT/US2012/044056 2011-11-30 2012-06-25 Multi-touch input device WO2013081672A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201161565494P true 2011-11-30 2011-11-30
US61/565,494 2011-11-30

Publications (1)

Publication Number Publication Date
WO2013081672A1 true WO2013081672A1 (en) 2013-06-06

Family

ID=48535928

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2012/044055 WO2013081671A1 (en) 2011-11-30 2012-06-25 Compensating for interference and contamination for multi-touch input devices
PCT/US2012/044056 WO2013081672A1 (en) 2011-11-30 2012-06-25 Multi-touch input device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2012/044055 WO2013081671A1 (en) 2011-11-30 2012-06-25 Compensating for interference and contamination for multi-touch input devices

Country Status (1)

Country Link
WO (2) WO2013081671A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015051024A1 (en) * 2013-10-01 2015-04-09 Vioguard LLC Touchscreen sanitizing system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016018416A1 (en) * 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Determining the location of a user input device
EP3032386A1 (en) 2014-12-10 2016-06-15 PR Electronics A/S Optical keypad for explosive locations

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5936614A (en) * 1991-04-30 1999-08-10 International Business Machines Corporation User defined keyboard entry system
US6776546B2 (en) * 2002-06-21 2004-08-17 Microsoft Corporation Method and system for using a keyboard overlay with a touch-sensitive display screen
US20060167531A1 (en) * 2005-01-25 2006-07-27 Michael Gertner Optical therapies and devices
US7176905B2 (en) * 2003-02-19 2007-02-13 Agilent Technologies, Inc. Electronic device having an image-based data input system
US7403191B2 (en) * 2004-01-28 2008-07-22 Microsoft Corporation Tactile overlay for an imaging display
US20100066690A1 (en) * 2008-05-17 2010-03-18 Darin Beamish Digitizing tablet devices, methods and systems
US20100188340A1 (en) * 2009-01-27 2010-07-29 Disney Enterprises, Inc. Touch detection system and method for use by a display panel
US20110256019A1 (en) * 2010-04-19 2011-10-20 Microsoft Corporation Self-sterilizing user input device

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US5196835A (en) * 1988-09-30 1993-03-23 International Business Machines Corporation Laser touch panel reflective surface aberration cancelling
US8026904B2 (en) * 2007-01-03 2011-09-27 Apple Inc. Periodic sensor panel baseline adjustment
US8004502B2 (en) * 2007-10-05 2011-08-23 Microsoft Corporation Correcting for ambient light in an optical touch-sensitive device
JP2012508913A (en) * 2008-11-12 2012-04-12 FlatFrog Laboratories AB Integrated touch-sensing display device and manufacturing method thereof
US20110109594A1 (en) * 2009-11-06 2011-05-12 Beth Marcus Touch screen overlay for mobile devices to facilitate accuracy and speed of data entry

Cited By (2)

Publication number Priority date Publication date Assignee Title
WO2015051024A1 (en) * 2013-10-01 2015-04-09 Vioguard LLC Touchscreen sanitizing system
US9233179B2 (en) 2013-10-01 2016-01-12 Vioguard LLC Touchscreen sanitizing system

Also Published As

Publication number Publication date
WO2013081671A1 (en) 2013-06-06

Similar Documents

Publication Publication Date Title
CN202189336U (en) Capture system for capturing and processing handwritten annotation data and capture equipment therefor
US8274484B2 (en) Tracking input in a screen-reflective interface environment
US7168047B1 (en) Mouse having a button-less panning and scrolling switch
US8463023B2 (en) Enhanced input using flashing electromagnetic radiation
US8842096B2 (en) Interactive projection system
US8314773B2 (en) Mouse having an optically-based scrolling feature
EP2724213B1 (en) Intelligent stylus
US6262717B1 (en) Kiosk touch pad
KR101070111B1 (en) Hand held electronic device with multiple touch sensing devices
US7786980B2 (en) Method and device for preventing staining of a display device
US8289292B2 (en) Electronic device with touch input function and touch input method thereof
Butler et al. SideSight: multi-touch interaction around small devices
US6791531B1 (en) Device and method for cursor motion control calibration and object selection
US20080179507A2 (en) Multi-touch sensing through frustrated total internal reflection
CN101071354B (en) Force and location sensitive display
US8130202B2 (en) Infrared touch screen gated by touch force
US20070063981A1 (en) System and method for providing an interactive interface
AU2007342094B2 (en) Back-side interface for hand-held devices
US8456447B2 (en) Touch screen signal processing
US8933876B2 (en) Three dimensional user interface session control
US8674961B2 (en) Haptic interface for touch screen in mobile device or other device
KR101363726B1 (en) Light-based finger gesture user interface
CN102934069B Apparatus and method for automatically and adaptively modifying a user interface
US20110205151A1 (en) Methods and Systems for Position Detection
Caprani et al. Touch screens for the older user

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 12852778

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 EP: PCT application not entered into the European phase

Ref document number: 12852778

Country of ref document: EP

Kind code of ref document: A1