EP2473909A1 - Methods for mapping gestures to graphical user interface commands - Google Patents

Methods for mapping gestures to graphical user interface commands

Info

Publication number
EP2473909A1
Authority
EP
European Patent Office
Prior art keywords
fingers
bunched
gesture
finger
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10813167A
Other languages
German (de)
French (fr)
Other versions
EP2473909A4 (en)
Inventor
Dax Kukulj
Andrew Kleinert
Ian Andrew Maxwell
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
RPO Pty Ltd
Original Assignee
RPO Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2009904315A external-priority patent/AU2009904315A0/en
Application filed by RPO Pty Ltd filed Critical RPO Pty Ltd
Publication of EP2473909A1 publication Critical patent/EP2473909A1/en
Publication of EP2473909A4 publication Critical patent/EP2473909A4/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to methods for mapping gestures to graphical user interface commands.
  • In particular, it relates to methods for mapping gestures performed on a touch screen with configurations of bunched fingers.
  • The invention has been developed primarily for use with infrared-type touch screens, and will be described hereinafter with reference to this application. However, it will be appreciated that the invention is not limited to this particular field of use.
  • Input devices based on touch sensing (touch screens) have long been used in electronic devices such as computers, personal digital assistants (PDAs), handheld games and point of sale kiosks, and are now appearing in other portable consumer electronics devices such as mobile phones.
  • PDAs: personal digital assistants.
  • Touch-enabled devices allow a user to interact with the device, for example by touching one or more graphical elements such as icons or keys of a virtual keyboard presented on a display, or by writing or drawing on a display.
  • Several touch-sensing technologies are known, including resistive, capacitive, projected capacitive, surface acoustic wave, optical and infrared, all of which have advantages and disadvantages in areas such as cost, reliability, ease of viewing in bright light, ability to sense different types of touch object (e.g. finger, gloved finger, stylus) and single or multi-touch capability.
  • For example, resistive touch screens are inexpensive and can sense virtually any rigid touch object, but have poor screen viewability in bright light and can only sense single touches.
  • Projected capacitive has multi-touch capability but cannot sense a non-conductive stylus or a gloved finger, and likewise has poor screen viewability in bright light.
  • Optical and infrared have good screen viewability in bright light, some multi-touch capability and are sensitive to virtually any touch object, but there is the potential for the detectors to be saturated by sunlight. Furthermore, some touch-sensing technologies, including optical, infrared and surface acoustic wave, are sensitive to near-touches as well as to actual touches, whereas other technologies such as resistive require an actual touch.
  • Gestural input, where a user places and/or moves one or more touch objects (usually fingers, with the thumb considered to be a finger) across a touch-sensitive surface, or places one or more touch objects on a touch-sensitive surface in a particular sequence, is an increasingly popular means for enhancing the power of touch input devices beyond the simple 'touch to select' function, with a large number of gestures of varying complexity for touch input devices known in the art (see for example US Patent Application Publication Nos 2006/0026535 A1, 2006/0274046 A1 and 2007/0177804 A1).
  • Touch technologies such as projected capacitive with interrogation of every node can accurately detect several simultaneous touch events and are well suited to gestural input, with gestures interpreted according to the number of fingers used.
  • US 2007/0177804 A1 discusses the concept of a 'chord' as a set of fingers contacting a multi-touch surface, and suggests the use of a gesture dictionary assigning functions to different motions of a chord.
  • For touch technologies with no multi-touch capability (e.g. resistive and surface capacitive), gestural input based on chords is of limited applicability.
  • Still other touch technologies, typically those that detect touch objects based on the occlusion of one or more light or acoustic beam paths, occupy a middle ground in that they can routinely detect the presence of multiple simultaneous touch objects, but generally cannot unambiguously determine their location.
  • This 'double touch ambiguity' is illustrated in Fig. 1, for the example of a conventional 'infrared' touch screen 2 with arrays of emitters 4 and detectors 6 along opposing edges of an input area 8.
  • Two touch objects 10 occlude two 'X' beam paths 12 and two 'Y' beam paths 14 such that their presence is detected, but their location cannot be distinguished from the two 'phantom' points 16.
  • This ambiguity, and some of its implications for gestural input, is described in more detail in International Patent Application Publication No WO 2008/138046 A1.
  • The number of gestures that can be mapped to various combinations of fingers can be increased by taking account of the distances between a user's fingers on the touch-sensitive surface.
  • US Patent No 7,030,861 discloses the concept of discriminating between 'neutral' gestures, where a user's hand is relaxed with the fingers relatively close together, and 'spread-hand' gestures, where a user's fingers are intentionally spread apart, with different functions assigned accordingly.
  • The requirement for a touch technology to resolve two closely spaced touch objects, represented by the contact points of two adjacent fingers 18 on a touch screen 2 separated by a gap 20 significantly smaller than either finger, is in many ways distinct from the requirement to detect and locate two simultaneous touch events.
  • That is, a touch technology capable of detecting and locating two widely separated touch objects will not necessarily be able to do so when the objects are closely spaced or in mutual physical contact.
  • In projected capacitive touch screens, for example, the ability to resolve two closely spaced objects can be influenced by changes in the conductivity of the touch objects (e.g. damp versus dry fingers).
  • In a first aspect, there is provided a method for mapping gestures performed on a touch screen to graphical user interface commands, comprising the steps of: detecting a gesture performed with a bunched finger configuration on a surface of a touch screen associated with said graphical user interface, said bunched finger configuration comprising two or more fingers; and generating a graphical user interface command in response to said gesture.
  • In one preferred form, the bunched finger configuration comprises two or more fingers held in mutual contact. In another preferred form, the bunched finger configuration comprises two or more fingers perceived by the touch screen to be separated by a gap less than half the size of any of the fingers. In yet another preferred form, the bunched finger configuration comprises two or more fingers perceived by the touch screen to be separated by a gap less than a quarter the size of any of the fingers.
  • The gesture preferably includes one or more taps of at least a subset of the fingers on the surface.
  • Preferably, the gesture includes movement of at least a subset of the fingers across the surface.
  • The fingers are preferably moved across the surface in unison.
  • In one preferred form, the bunched finger configuration comprises two fingers, and the gesture is mapped to a rotation command such that a graphical object displayed on the graphical user interface is rotated.
  • In another preferred form, the bunched finger configuration comprises two fingers, the gesture further comprises a tap of the two fingers prior to the movement, and the gesture is mapped to a fixed increment rotation command such that a graphical object displayed on the graphical user interface is rotated by fixed increments.
  • In yet another preferred form, the gesture further comprises a double tap of a single finger on the graphical object prior to the movement, to define a centre of rotation, and the gesture is mapped to a rotation command such that the graphical object is rotated about the centre of rotation.
  • Figure 1 illustrates the 'double touch ambiguity' experienced by many touch screen technologies
  • Figure 2 illustrates in plan view the contact of two closely spaced fingers on a touch screen
  • Figure 3 illustrates in plan view the operation of an 'optical' touch screen
  • Figure 4 illustrates in plan view the operation of a type of infrared touch screen
  • Figures 5(a) to 5(c) illustrate in side view some possible configurations for combining the infrared touch screen of Fig. 4 with a display
  • Figures 6(a) and 6(b) illustrate in plan and side view the interaction of two closely spaced touch objects with the touch screen of Fig. 4;
  • Figure 7 illustrates an example edge detection algorithm for resolving two closely spaced touch objects
  • Figure 8 illustrates an example implementation of bunched finger detection within a touch screen microcontroller
  • Figures 9(a) and 9(b) illustrate how a swipe gesture can be interpreted as a command to either rotate or translate a graphical object depending on the configuration of fingers performing the gesture;
  • Figures 10(a) to 10(c) illustrate how a swipe gesture applied to a list of items can be interpreted differently depending on the configuration of fingers performing the gesture;
  • Figures 11(a) to 11(d) illustrate the application of a bunched finger gesture in a drawing program;
  • Figure 12 illustrates an example graphical user interface implementation of the preferred embodiment.
  • Fig. 4 illustrates in plan view an infrared-style touch screen 36 with a field of substantially parallel energy paths in the form of two directional sheets of light 38 established in a touch area 40.
  • The light is preferably in the infrared portion of the spectrum so as to be invisible to a user, but in certain embodiments may be in the visible or even ultraviolet portions of the spectrum.
  • In alternative embodiments the collimation/redirection elements may include lenses, segmented reflectors or segmented lenses.
  • Light 50 emitted by a pair of optical sources (e.g. infrared LEDs) 52 is launched into the transmissive element, then collimated and re-directed by the collimation/redirection elements to produce two sheets of light 38 that propagate in front of the transmissive element towards X,Y arrays of integrated optical waveguides 54 with in-plane lenses 56 to help collect the light, then guided to a position-sensitive (i.e. multielement) detector 58 such as a line camera or a digital camera chip.
  • The optical sources and the detector are connected to a microcontroller 60 that controls the operation of the touch screen device.
  • The connections may be physical or wireless, and for clarity the connections between the microcontroller and the optical sources have been omitted from Fig. 4.
  • Preferably the two sheets of light are co-planar, but they could alternatively be in parallel planes.
  • Generally the touch input device 36 includes a display (not shown in Fig. 4) more or less coincident with the touch area, for presenting a graphical user interface with which a user can interact via touch input.
  • The type of display is not particularly important, and may for example be an LCD, an OLED display, a MEMs display or an e-paper display (also known as an e-book display or an electrophoretic display).
  • Fig. 5(a) shows in side view one possible configuration for combining an infrared touch screen 36 of the type shown in Fig. 4 with a display 62. In this configuration the planar transmissive element 44 is located in front of the display.
  • In a variant configuration shown in Fig. 5(b), a front glass sheet 64 of the display serves as the planar transmissive element 44 component of the transmissive body, with the collimation/redirection element 46 provided as a separate component.
  • Fig. 5(c) shows in side view another configuration where the planar transmissive element 44 is located behind the display 62.
  • The configurations shown in Figs 5(b) and 5(c) have the advantage of there being nothing in front of the display (and particularly no high refractive index layers of conductive material such as ITO, unlike in resistive and capacitive touch screens) that may cause problems of glare or display dimming. This is particularly advantageous for displays such as e-paper that can have relatively poor contrast and that may be viewed for extended periods.
  • Alternatively, in the configuration of Fig. 5(a) the planar transmissive element 44 can double as a display protector, and because it has no high refractive index layers its contribution to glare or screen dimming will be modest.
  • In Fig. 6(a) there is shown in plan view the shadows 66 cast by the tips of two closely spaced or physically contacting fingers, shown as contact areas 67, placed on a display associated with a touch screen of the type shown in Fig. 4.
  • For simplicity only a single-axis sheet of light 38 is shown, directed towards an array of in-plane lenses 56 with associated receive waveguides 54.
  • The spacings between adjacent in-plane lenses can be made arbitrarily small, limited only by the resolution of the technique used to pattern them, and in certain embodiments are omitted altogether such that the lenses are contiguous.
  • In the particular example shown in Fig. 6(a), lenses A, M and Y receive all of the available signal light, lenses B, L, N and X receive a reduced amount of signal light, and lenses C to K and O to W receive essentially no signal light.
  • Fig. 6(b) illustrates the situation in side view, with the two fingers 18 placed on the surface 61 of the touch screen/display 62 and cutting the sheet of light 38 located just above the display surface.
  • Provided the sheet of light is sufficiently close to the display surface, which is a relatively straightforward aspect of the transmit and receive optics design, the sheet of light intersects the gap 20 between the tips of the fingers, a gap that is present even when the fingers are held in mutual contact in the region of the last finger joint 69 as shown.
  • Fig. 7 illustrates the basics of an example edge detection algorithm that can identify the configuration of the double finger contact shown in Figs 6(a) and 6(b).
  • The points 71 representing the intensity of light collected by the lenses 56 and guided to the detector pixels are interpolated to produce a line 72, and the edges 73 of the fingers are determined by comparison with a predetermined threshold level 74, enabling calculation of the width 68 of each finger and of the gap 20 between them.
  • By way of specific example, with receive optics comprising 1 mm wide in-plane lenses 56 and associated waveguides 54 on a 1 mm pitch, the edge detection algorithm determines the width of each finger to be 10 mm and the gap to be 2 mm.
  • For a given arrangement of lenses, the received signal intensity pattern will vary with the orientation of the two fingers.
  • However, provided at least one lens (e.g. lens L, M or N in Fig. 7) receives a signal level above the predetermined threshold, the fingers can be resolved.
  • With 1 mm wide lenses, our edge detection algorithms are surprisingly robust to changes in finger orientation: when the touch screen is touched with two adjacent adult-sized fingers, there is only a narrow range of orientation angles, approximately 45° to the two sheets of light, where the algorithms fail to resolve the two fingers.
  • With touch object detection based on a Cartesian grid of light paths, the ability of the algorithms to resolve closely spaced objects is independent of the position of the objects within the input area. As explained above with reference to Fig. 3, this is to be contrasted with the situation for 'optical' touch screens, where the ability to resolve closely spaced objects is highly dependent on their position within the input area.
  • A double finger contact could also be identified by an algorithm that detects minima in the signal levels received by the detector pixels.
  • However, inspection of the signal levels 71 in Fig. 7 shows that the minima can be quite broad in situations where touch objects shadow a number of adjacent lenses, so that the determination of the position of a minimum may be inaccurate.
  • Furthermore, a minimum-detecting algorithm may be confused by small amounts of stray light that reach isolated pixels, as represented by point 75 in Fig. 7.
  • When a pattern of detected light intensity is consistent with a touch from two or more finger-sized objects separated by distances considerably less than the dimensions of each object, the controller interprets the pattern as a 'bunched finger' configuration.
  • A gesture (such as a tap or swipe) performed on the touch screen with a bunched finger configuration can then be mapped by the controller to a command to perform some action on objects displayed on a graphical user interface associated with the touch screen.
  • A bunched finger configuration is defined, for the purposes of this specification, as a configuration where a user is contacting the touch screen with a set of fingers, or finger-like objects, comprising two or more fingers that are sufficiently close together as to appear to the device controller to be held in mutual physical contact close to the point of contact with the touch screen surface, but are nonetheless resolvable as individual objects.
  • Anatomically, this generally means that the fingers are touching each other at least in the region of the last finger joint.
  • In preferred embodiments, two adjacent touch objects are taken to be 'sufficiently close' when the touch screen perceives them as being separated by a gap less than half the size of either object, preferably less than a quarter the size of either touch object.
  • In certain embodiments a touch screen controller provides a routine to calibrate the device for the finger sizes and bunched finger configurations of a particular user, e.g. an adult or child, to enable the controller to distinguish more reliably between, say, a gesture performed with fingers in a bunched configuration and a gesture performed with fingers in a 'neutral' configuration, i.e. slightly separated.
  • A bunched finger configuration will often comprise two or more fingers (including the thumb) from the same hand, e.g. index and middle fingers, but may comprise fingers from both hands, such as the two index fingers.
  • A bunched finger configuration can also include one or more isolated fingers, so long as it includes at least two fingers that are bunched together. For example a configuration comprising bunched index and middle fingers and a more distant thumb is considered to be a bunched finger configuration.
  • A bunched finger configuration can also be performed with touch objects other than fingers, although it is envisaged that it will be more convenient and intuitive for users to interact with the touch screen using their fingers.
  • A gesture performed with a bunched finger configuration may include movement of the fingers (a swipe) across the touch screen surface.
  • In preferred embodiments the bunched fingers are moved in unison, i.e. they remain bunched throughout the movement. This is preferable for gestures performed on a touch screen susceptible to the double touch ambiguity.
  • In some embodiments the fingers may be bunched at the beginning of a gesture and moved apart, while in other embodiments they may be moved towards each other such that the gesture ends with the fingers bunched.
  • Preferably the fingers are bunched at least at the beginning of the gesture to ensure the gesture is mapped to the desired command, but if the command is not executed or completed until after the gesture is finished, the gesture need only include bunched fingers at some point during its performance.
  • Similar considerations apply to gestures that include one or more taps of a user's fingers on the touch screen surface; so long as a gesture includes screen contact with at least two bunched fingers at some point during its execution, it is considered to be a 'bunched finger gesture'.
  • Bunched finger configurations are convenient to apply, because the mutual finger contact provides a user with tactile feedback on their relative finger positions, ensuring that a gesture is mapped to the desired command by the controller. They are to be distinguished from the 'neutral' gestures of the abovementioned US 7,030,861 patent that are applied with a relaxed hand.
  • To apply a bunched finger gesture a user needs actively to bunch his/her fingers together, whereas when a user initiates contact with a relaxed hand, the fingers are relatively close together but not bunched. Inspection of Fig. 6(b) shows that because the sheet of light 38 is located above the touch surface 61, an infrared touch screen will be able to detect the two fingers 18 and the gap 20 between them before the fingers actually contact the surface.
  • That is, the touch screen is also sensitive to 'near touches' of bunched finger configurations. Given the limited height of the gap between two fingers held in mutual contact, for most practical implementations a user will place their fingers in contact with the touch surface. Nevertheless, the terms 'touch', 'contact' and the like are intended to encompass near touches.
  • In Fig. 8 there is illustrated one form of implementation of the bunch detection algorithm within the microcontroller 60 of Fig. 4.
  • The microcontroller receives the sensor inputs for the row and column sensors from the position sensitive detector 58.
  • The microcontroller executes two continuous loops 120, 121.
  • A first loop 120 examines the sensor values to determine if any finger down conditions exist.
  • The row and column details of potential finger down positions are stored.
  • A second loop 121 continually examines the finger down positions, utilising the aforementioned algorithm of Fig. 7 to determine if any of the finger down positions further satisfy the bunched finger condition.
  • The finger down positions and bunch status are stored in an I/O register 122 at a high frequency rate.
  • The microcontroller interfaces with a main computer system running a predetermined operating system (e.g. Microsoft Windows, Linux, Mac OS etc).
  • The positions and bunch status are read out by device drivers and forwarded to user interface routines over a main computer I/O bus.
  • The device drivers implement bunched finger tracking for the generation of swipe gestures.
  • The finger down positions are forwarded to the second loop 121, which continuously reviews them to determine if a bunched finger condition exists.
  • Fig. 9(a) illustrates a 'two finger rotate' gesture performed with a configuration of two bunched fingers 75 on a graphical user interface 76 displaying a graphical object 77.
  • When the graphical object is touched by two bunched fingers and the fingers are moved (swiped) in unison across the screen, preferably in a roughly circular fashion 78, the object is rotated about its centre of mass, the default centre of rotation. That is, the gesture comprising a swipe of the two bunched fingers is 'mapped' to a graphical user interface command such that the device controller generates a rotation command in response to the gesture and applies it to the graphical object (an illustrative mapping sketch is given at the end of this list).
  • A swipe of a single finger, by contrast, will be mapped by the device controller to a 'pan' or 'translate' command as shown in Fig. 9(b).
  • A similar 'touch and drag' gesture performed with two fingers that are not interpreted as being bunched may also be mapped to a 'pan' command, or to some other command such as a 'fixed increment' rotation, e.g. with increments of 15, 30 or 90 degrees, which may be user-definable.
  • A number of variations on the 'two finger rotate' gesture shown in Fig. 9(a) are possible.
  • For example, the object to be rotated can be selected with a single finger tap before the rotation is commenced.
  • A 'fixed increment' rotation function can be selected by tapping the object to be rotated with the two bunched fingers before commencing the 'two finger rotate' gesture.
  • Alternatively, some other centre of rotation can be predefined with a double tap of a single finger.
  • In Figs 10(a) to 10(c), some other examples of gestures with two bunched fingers will be explained with reference to a list of items 79 partially displayed in a window 80 of a graphical user interface 76.
  • As shown in Fig. 10(a), a downwards or upwards swipe 82 with a single finger 18 is mapped to a 'scroll' command allowing further items in the list to be displayed.
  • A swipe 84 with two widely separated fingers 18 is mapped to a 'pan' command whereby the entire window 80 is translated.
  • A downwards or upwards swipe 86 with a configuration of two bunched fingers 75 is mapped to a 'move' command whereby a selected item 88 is moved to another location in the list.
  • The desired item is selected by the initial contact of the fingers with the display and 'dropped' into its new location by removing the fingers from the display.
  • Alternatively, an item can be selected by initial contact with a configuration of two bunched fingers then 'copied and moved' into a new location by a downwards or upwards swipe of one finger only. This is an example of a gesture that begins but does not end with a bunched finger configuration.
  • Figs 11(a) to 11(d) illustrate the application of a bunched finger gesture in a drawing package.
  • Two separate graphical objects 90 displayed on a graphical user interface 76 are selected with single finger contacts 18 as shown in Fig. 11(a), then panned into an overlapping arrangement as the two fingers are moved towards each other, possibly but not necessarily into a bunched configuration, as shown in Fig. 11(b).
  • A single tap with a configuration of two bunched fingers 75 on the overlap region 92 (Fig. 11(c)) is mapped to a command to join the two objects to form a single object 94 as shown in Fig. 11(d).
  • The 'merge' command, i.e. a tap with two bunched fingers, as shown in Fig. 11(d).
  • Fig. 12 illustrates one form of implementation of gesture or swipe recognition within a user interface.
  • The sensor pad outputs are continuously read out by the device drivers 126 of the user interface 125 running as part of the computer operating system.
  • The information read from the microcontroller 60 includes a bunched finger status flag, which is active when a bunched finger determination has been made.
  • The device drivers 126 store current and previous finger position information to determine whether a two finger swipe is currently active. Where bunched finger data is received at multiple different positions in a continuous stream of data values, an extended bunched finger swipe is determined to have occurred.
  • The device driver outputs are forwarded to the user interface controller 127, which controls the overall user interactive experience and includes an object store of the current objects being displayed.
  • The user interface controller 127 updates the current interactive user interface position, outputting to a display output controller 129, which includes the usual frame buffer for output to the output display via the microcontroller 60.
  • In other embodiments the detection optics may be in the form of an array of discrete photodetectors located along the edge of the input area, or the sensing light may be in the form of discrete beams generated for example by an array of transmit waveguides (as disclosed in US Patent No 5,914,709) or an array of discrete emitters (as in 'conventional' infrared touch).
  • In yet other embodiments the sensing field is in the form of acoustic beams transmitted and received by arrays of reflectors spaced along the edges of the input area, as disclosed in US Patent No 6,856,259 for example.
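
By way of illustration only, the following sketch (Python, with invented names; it is not part of the patent disclosure) shows how a controller might dispatch a swipe to different commands depending on the finger configuration, in the spirit of Figs 9 and 10: a bunched two-finger swipe maps to a rotation, a swipe with widely separated fingers maps to a pan, a single-finger swipe maps to a scroll, and a preceding bunched tap selects fixed-increment rotation. Context-dependent mappings, such as the 'move' command for a list item, are omitted for brevity.

```python
# Illustrative sketch (not the patent's implementation) of mapping a swipe to a
# command based on the finger configuration reported by the touch controller.

from typing import List, Tuple

def map_gesture(contacts: List[Tuple[float, float]], bunched: bool,
                preceded_by_bunched_tap: bool = False) -> str:
    """Return the command a swipe should generate for a given finger configuration."""
    if len(contacts) >= 2 and bunched:
        if preceded_by_bunched_tap:
            return "rotate_fixed_increment"     # bunched tap, then bunched swipe
        return "rotate"                         # Fig. 9(a): bunched two-finger swipe
    if len(contacts) >= 2:
        return "pan"                            # widely separated fingers
    return "scroll"                             # Fig. 10(a): single-finger swipe

print(map_gesture([(10, 10), (21, 10)], bunched=True))     # 'rotate'
print(map_gesture([(10, 10), (80, 10)], bunched=False))    # 'pan'
print(map_gesture([(10, 10)], bunched=False))              # 'scroll'
```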

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Methods are presented for mapping gestures to graphical user interface commands, where the gestures are performed on a touch screen using a configuration of fingers including at least two fingers bunched together. Preferably the bunched fingers are held in mutual contact, but are nonetheless resolvable as individual fingers. The gestures can include one or more taps or swipes with bunched fingers. In one particularly preferred embodiment a gesture comprising a swipe of two bunched fingers is mapped to a rotation command applied to a graphical object displayed on the graphical user interface.

Description

METHODS FOR MAPPING GESTURES TO GRAPHICAL USER INTERFACE COMMANDS
Field of the Invention
The present invention relates to methods for mapping gestures to graphical user interface commands. In particular, it relates to methods for mapping gestures performed on a touch screen with configurations of bunched fingers. The invention has been developed primarily for use with infrared-type touch screens, and will be described hereinafter with reference to this application. However, it will be appreciated that the invention is not limited to this particular field of use.
Background of the Invention
Any discussion of the prior art throughout the specification should in no way be considered as an admission that such prior art is widely known or forms part of the common general knowledge in the field.
Input devices based on touch sensing (touch screens) have long been used in electronic devices such as computers, personal digital assistants (PDAs), handheld games and point of sale kiosks, and are now appearing in other portable consumer electronics devices such as mobile phones. Generally, touch-enabled devices allow a user to interact with the device, for example by touching one or more graphical elements such as icons or keys of a virtual keyboard presented on a display, or by writing or drawing on a display.
Several touch-sensing technologies are known, including resistive, capacitive, projected capacitive, surface acoustic wave, optical and infrared, all of which have advantages and disadvantages in areas such as cost, reliability, ease of viewing in bright light, ability to sense different types of touch object (e.g. finger, gloved finger, stylus) and single or multi-touch capability. For example resistive touch screens are inexpensive and can sense virtually any rigid touch object, but have poor screen viewability in bright light and can only sense single touches. Projected capacitive has multi-touch capability but cannot sense a non-conductive stylus or a gloved finger, and likewise has poor screen viewability in bright light. Optical and infrared have good screen viewability in bright light, some multi-touch capability and are sensitive to virtually any touch object, but there is the potential for the detectors to be saturated by sunlight. Furthermore some touch-sensing technologies, including optical, infrared and surface acoustic wave, are sensitive to near-touches as well as to actual touches, whereas other technologies such as resistive require an actual touch.
The concept of gestural inputs, where a user places and/or moves one or more touch objects (usually fingers, with the thumb considered to be a finger) across a touch-sensitive surface, or places one or more touch objects on a touch-sensitive surface in a particular sequence, is an increasingly popular means for enhancing the power of touch input devices beyond the simple 'touch to select' function, with a large number of gestures of varying complexity for touch input devices known in the art (see for example US Patent Application Publication Nos 2006/0026535 A1, 2006/0274046 A1 and 2007/0177804 A1). As discussed in US 2006/0097991 A1, touch technologies such as projected capacitive with interrogation of every node can accurately detect several simultaneous touch events and are well suited to gestural input, with gestures interpreted according to the number of fingers used. US 2007/0177804 A1 discusses the concept of a 'chord' as a set of fingers contacting a multi-touch surface, and suggests the use of a gesture dictionary assigning functions to different motions of a chord. On the other hand, for touch technologies with no multi-touch capability (e.g. resistive and surface capacitive), gestural input based on chords is of limited applicability.
Still other touch technologies, typically those that detect touch objects based on the occlusion of one or more light or acoustic beam paths, occupy a middle ground in that they can routinely detect the presence of multiple simultaneous touch objects, but generally cannot unambiguously determine their location. This 'double touch ambiguity' is illustrated in Fig. 1, for the example of a conventional 'infrared' touch screen 2 with arrays of emitters 4 and detectors 6 along opposing edges of an input area 8. Two touch objects 10 occlude two 'X' beam paths 12 and two 'Y' beam paths 14 such that their presence is detected, but their location cannot be distinguished from the two 'phantom' points 16. This ambiguity, and some of its implications for gestural input, is described in more detail in International Patent Application Publication No WO 2008/138046 A1.
Essentially the same ambiguity is encountered in 'optical' touch screens that detect touch objects using cameras in two adjacent corners (US Patent No 6,943,779 and US Patent Application Publication No 2006/0232792 A1), surface acoustic wave devices (US Patent Nos 6,856,259 and 7,061,475), and projected capacitive systems with column/row interrogation (simpler and faster than interrogating every node, US Patent Application Publication No 2008/0150906 A1). The ambiguity can sometimes be resolved by various techniques described in the abovementioned documents, but in any event the touch technologies in this class are still suitable for gestural input to a limited extent. In particular, the 'double touch ambiguity' does not arise for every arrangement of two or more touch objects. For example, with reference to Fig. 1, if two touch objects occlude different 'X' beam paths 12 but the same 'Y' beam path 14, or different 'Y' beam paths but the same 'X' beam path, their locations can be determined unambiguously.
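
To make the geometry of this ambiguity concrete, the short sketch below (illustrative Python, not part of the patent text) enumerates the candidate touch points produced by a Cartesian grid of beams: two occluded X paths and two occluded Y paths yield four candidates, two of which are phantoms, whereas touches sharing a beam path are located unambiguously.

```python
# Why two occluded X beams and two occluded Y beams yield four candidate points.

from itertools import product
from typing import List, Tuple

def candidate_points(occluded_x: List[float], occluded_y: List[float]) -> List[Tuple[float, float]]:
    """Every combination of an occluded X path and an occluded Y path is a candidate."""
    return list(product(occluded_x, occluded_y))

# Two touches at (20, 30) and (60, 70) occlude X paths {20, 60} and Y paths {30, 70}:
print(candidate_points([20.0, 60.0], [30.0, 70.0]))
# -> four candidates: (20,30), (20,70), (60,30), (60,70); the real pair cannot be
#    distinguished from the 'phantom' pair without further information.

# If the touches share a Y path (e.g. both at y=30), only the two real points remain:
print(candidate_points([20.0, 60.0], [30.0]))
```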
The number of gestures that can be mapped to various combinations of fingers can be increased by taking account of the distances between a user's fingers on the touch- sensitive surface. For example US Patent No 7,030,861 discloses the concept of discriminating between 'neutral' gestures, where a user's hand is relaxed with the fingers relatively close together, and 'spread-hand' gestures, where a user's fingers are intentionally spread apart, with different functions assigned accordingly. Referring to Fig. 2, the requirement for a touch technology to resolve two closely spaced touch objects, represented by the contact points of two adjacent fingers 18 on a touch screen 2, separated by a gap 20 significantly smaller than either finger, is in many ways distinct from the requirement to detect and locate two simultaneous touch events. That is, a touch technology capable of detecting and locating two widely separated touch objects will not necessarily be able to do so when the objects are closely spaced or in mutual physical contact. For example, in projected capacitive touch screens where conductive objects (such as a user's fingers) are detected in an analogue fashion via their interaction with a grid of conductive traces, the ability to resolve two closely spaced objects can be influenced by changes in the conductivity of the touch objects (e.g. damp versus dry fingers). Another touch technology that can in principle detect multiple touch objects, but will have difficulty in resolving closely spaced objects, is the 'in-cell optical' technology described in US Patent No 7,166,966 for example, where photodetector pixels located amongst the emitter pixels of an LCD detect a touch object via shadowing of ambient light. In this case, unless the ambient light source is fortuitously located, two adjacent fingers as shown in Fig. 2 will appear as a single shadow. In the case of 'optical' (as opposed to 'infrared') touch screens, the ability to resolve closely spaced touch objects varies with position on the touch area. To explain with reference to Fig. 3, an optical touch screen 22 with co-located emitters 24 and cameras 26 in two adjacent corners, and three retro -reflective sides 28, locates a touch object 10 by triangulation of the shadowed beam paths 30. It will be appreciated that while two closely spaced objects at position 32 may be resolved (depending on the camera resolution) because the gap 20 will be seen by at least one of the cameras, as shown by the beam path 33, they will not be resolved at position 34.
Summary of the Invention
It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative. It is an object of the present invention in its preferred form to provide methods for mapping gestures performed on a touch screen to graphical user interface commands. In accordance with a first aspect of the invention there is provided a method for mapping gestures to graphical user interface commands, said method comprising the steps of: detecting a gesture performed with a bunched finger configuration on a surface of a touch screen associated with said graphical user interface, said bunched finger configuration comprising two or more fingers; and generating a graphical user interface command in response to said gesture.
In one preferred form, the bunched finger configuration comprises two or more fingers held in mutual contact. In another preferred form, the bunched finger configuration comprises two or more fingers perceived by the touch screen to be separated by a gap less than half the size of any of the fingers. In yet another preferred form, the bunched finger configuration comprises two or more fingers perceived by the touch screen to be separated by a gap less than a quarter the size of any of the fingers.
The gesture preferably includes one or more taps of at least a subset of the fingers on the surface. Preferably, the gesture includes movement of at least a subset of the fingers across the surface. The fingers are preferably moved across the surface in unison. In one preferred form, the bunched finger configuration comprises two fingers, and the gesture is mapped to a rotation command such that a graphical object displayed on the graphical user interface is rotated. In another preferred form, the bunched finger configuration comprises two fingers, the gesture further comprises a tap of the two fingers prior to the movement, and the gesture is mapped to a fixed increment rotation command such that a graphical object displayed on the graphical user interface is rotated by fixed increments. In yet another preferred form, the gesture further comprises a double tap of a single finger on the graphical object prior to the movement, to define a centre of rotation, and the gesture is mapped to a rotation command such that the graphical object is rotated about the centre of rotation.
In accordance with a second aspect of the present invention there is provided a touch screen system when used to implement a method according to the first aspect of the invention.
Brief Description of the Drawings
Preferred embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
Figure 1 illustrates the 'double touch ambiguity' experienced by many touch screen technologies;
Figure 2 illustrates in plan view the contact of two closely spaced fingers on a touch screen;
Figure 3 illustrates in plan view the operation of an 'optical' touch screen;
Figure 4 illustrates in plan view the operation of a type of infrared touch screen;
Figures 5(a) to 5(c) illustrate in side view some possible configurations for combining the infrared touch screen of Fig. 4 with a display;
Figures 6(a) and 6(b) illustrate in plan and side view the interaction of two closely spaced touch objects with the touch screen of Fig. 4;
Figure 7 illustrates an example edge detection algorithm for resolving two closely spaced touch objects;
Figure 8 illustrates an example implementation of bunched finger detection within a touch screen microcontroller;
Figures 9(a) and 9(b) illustrate how a swipe gesture can be interpreted as a command to either rotate or translate a graphical object depending on the configuration of fingers performing the gesture;
Figures 10(a) to 10(c) illustrate how a swipe gesture applied to a list of items can be interpreted differently depending on the configuration of fingers performing the gesture;
Figures 11(a) to 11(d) illustrate the application of a bunched finger gesture in a drawing program; and
Figure 12 illustrates an example graphical user interface implementation of the preferred embodiment.
Detailed Description of the Invention
Referring to the drawings, Fig. 4 illustrates in plan view an infrared-style touch screen 36 with a field of substantially parallel energy paths in the form of two directional sheets of light 38 established in a touch area 40. The light is preferably in the infrared portion of the spectrum so as to be invisible to a user, but in certain embodiments may be in the visible or even ultraviolet portions of the spectrum. As described in US Patent
Application Publication No 2008/0278460 A1, the contents of which are incorporated herein by reference, the sheets of light are generated by a transmissive body 42
comprising a planar transmissive element 44 and two collimation/redirection elements 46 that include parabolic reflectors 48. In alternative embodiments the
collimation/redirection elements may include lenses, segmented reflectors or segmented lenses. Light 50 emitted by a pair of optical sources (e.g. infrared LEDs) 52 is launched into the transmissive element, then collimated and re-directed by the
collimation/redirection elements to produce two sheets of light 38 that propagate in front of the transmissive element towards X,Y arrays of integrated optical waveguides 54 with in-plane lenses 56 to help collect the light, then guided to a position-sensitive (i.e. multielement) detector 58 such as a line camera or a digital camera chip. Methods for fabricating the optical waveguides and in-plane lenses from photo-curable polymers are described in US Patent No 7,218,812 and US Patent Application Publication No
2007/0190331A1, the contents of which are incorporated herein by reference. The optical sources and the detector are connected to a microcontroller 60 that controls the operation of the touch screen device. The connections may be physical or wireless, and for clarity the connections between the microcontroller and the optical sources have been omitted from Fig. 4. Preferably the two sheets of light are co-planar, but could alternatively be in parallel planes. When an object 10 such as a user's finger blocks portions of the sheets of light, it will be detected and its location within the touch area determined based on the detector elements that receive little or no light.
Generally the touch input device 36 includes a display (not shown in Fig. 4) more or less coincident with the touch area, for presenting a graphical user interface with which a user can interact via touch input. The type of display is not particularly important, and may for example be an LCD, an OLED display, a MEMs display or an e-paper display (also known as an e-book display or an electrophoretic display). Fig. 5(a) shows in side view one possible configuration for combining an infrared touch screen 36 of the type shown in Fig. 4 with a display 62. In this configuration the planar transmissive element 44 is located in front of the display. In a variant configuration shown in Fig. 5(b) a front glass sheet 64 of the display serves as the planar transmissive element 44 component of the transmissive body, with the collimation/redirection element 46 provided as a separate component. Fig. 5(c) shows in side view another configuration where the planar transmissive element 44 is located behind the display 62.
The configurations shown in Figs 5(b) and 5(c) have the advantage of there being nothing in front of the display (and particularly no high refractive index layers of conductive material such as ITO, unlike in resistive and capacitive touch screens) that may cause problems of glare or display dimming. This is particularly advantageous for displays such as e-paper that can have relatively poor contrast and that may be viewed for extended periods. Alternatively, in the configuration shown in Fig 5(a) the planar transmissive element 44 can double as a display protector, and because it has no high refractive index layers its contribution to glare or screen dimming will be modest.
The methods of the present invention will now be described, by way of example only, with reference to the infrared touch screen illustrated in Fig. 4, equipped with a display for presenting a graphical user interface. However the methods are also applicable to other types of touch screen capable of resolving closely spaced touch objects.
Turning now to Fig. 6(a), there is shown in plan view the shadows 66 cast by the tips of two closely spaced or physically contacting fingers, shown as contact areas 67, placed on a display associated with a touch screen of the type shown in Fig. 4. For simplicity only a single axis sheet of light 38 is shown, directed towards an array of in-plane lenses 56 with associated receive waveguides 54. Note that the spacings between adjacent in- plane lenses can be made arbitrarily small, limited only by the resolution of the technique used to pattern them, and in certain embodiments are omitted altogether such that the lenses are contiguous. In the particular example shown in Fig. 6(a), lenses A, M and Y receive all of the available amount of signal light, lenses B, L, N and X receive a reduced amount of signal light, and lenses C to K and O to W receive essentially no signal light. When the two fingers are positioned in a line 70 oriented approximately perpendicular to the propagation direction of the single axis sheet of light 38, as shown in Fig. 6(a), the other (perpendicular) sheet of light, not shown in Fig. 6(a), will see them as a single touch object. This is in fact advantageous because it avoids the 'double touch ambiguity' as mentioned above with reference to Fig. 1. The same principle applies to arrangements of three or more fingers (or other touch objects) oriented more or less in a line approximately perpendicular to the propagation direction of one or other of the sheets of light.
Fig. 6(b) illustrates the situation in side view, with the two fingers 18 placed on the surface 61 of the touch screen/display 62 and cutting the sheet of light 38 located just above the display surface. Importantly, it can be seen that provided the sheet of light is sufficiently close to the display surface, which is a relatively straightforward aspect of the transmit and receive optics design, the sheet of light intersects the gap 20 between the tips of the fingers, present even when the fingers are held in mutual contact in the region of the last finger joint 69 as shown.
Fig. 7 illustrates the basics of an example edge detection algorithm that can identify the configuration of the double finger contact shown in Figs 6(a) and 6(b). The points 71 representing the intensity of light collected by the lenses 56 and guided to the detector pixels are interpolated to produce a line 72 and the edges 73 of the fingers determined by comparison with a predetermined threshold level 74, enabling calculation of the width 68 of each finger and of the gap 20 between them. By way of specific example, with an arrangement of receive optics including 1 mm wide in-plane lenses 56 and associated waveguides 54 on a 1 mm pitch, the edge detection algorithm determines the width of each finger to be 10 mm and the gap to be 2 mm. It will be appreciated that for a given arrangement of lenses, the received signal intensity pattern will vary with the orientation of the two fingers. However provided at least one lens (e.g. lens L, M or N in Fig. 7) receives a signal level above the predetermined threshold, the fingers can be resolved. We have found that with 1 mm wide lenses our edge detection algorithms are surprisingly robust to changes in finger orientation: when the touch screen is touched with two adjacent adult-sized fingers, there is only a narrow range of orientation angles, approximately 45° to the two sheets of light, where the algorithms fail to resolve the two fingers. It will also be appreciated that with touch object detection based on a Cartesian grid of light paths, the ability of the algorithms to resolve closely spaced objects is independent of the position of the objects within the input area. As explained above with reference to Fig. 3, this is to be contrasted with the situation for 'optical' touch screens where the ability to resolve closely spaced objects is highly dependent on their position within the input area.
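
As a hedged illustration of the kind of edge detection just described (the function name, pitch and threshold below are assumptions, not taken from the patent), the following sketch interpolates between intensity samples, finds threshold crossings, and reports the width of each shadow and hence the gap between two closely spaced fingers.

```python
# Minimal sketch of a threshold-based edge detection pass of the kind described
# for Fig. 7 (names and values are illustrative).

from typing import List, Tuple

def shadow_spans(intensity: List[float], pitch_mm: float, threshold: float) -> List[Tuple[float, float]]:
    """Return (start_mm, width_mm) of each span where the received signal falls
    below the threshold, interpolating linearly to sub-pixel edge positions."""
    spans = []
    below = False
    start = 0.0
    for i in range(1, len(intensity)):
        a, b = intensity[i - 1], intensity[i]
        if not below and a >= threshold > b:          # falling edge: object begins
            start = (i - 1 + (a - threshold) / (a - b)) * pitch_mm
            below = True
        elif below and a < threshold <= b:            # rising edge: object ends
            spans.append((start, (i - 1 + (threshold - a) / (b - a)) * pitch_mm - start))
            below = False
    return spans

# Idealised signal for two ~10 mm fingers separated by a ~2 mm gap on a 1 mm pitch:
signal = [1.0] * 3 + [0.0] * 10 + [1.0] * 2 + [0.0] * 10 + [1.0] * 3
fingers = shadow_spans(signal, pitch_mm=1.0, threshold=0.5)
widths = [w for _, w in fingers]
gap = fingers[1][0] - (fingers[0][0] + fingers[0][1]) if len(fingers) == 2 else None
print(widths, gap)   # roughly [10.0, 10.0] and 2.0
```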
We note that a double finger contact could also be identified by an algorithm that detects minima in the signal levels received by the detector pixels. However, inspection of the signal levels 71 in Fig. 7 shows that the minima can be quite broad in situations where touch objects shadow a number of adjacent lenses, so that the determination of the position of a minimum may be inaccurate. Furthermore, a minimum-detecting algorithm may be confused by small amounts of stray light that reach isolated pixels, as represented by point 75 in Fig. 7.

When a pattern of detected light intensity is determined by the algorithms to be consistent with a touch from two or more finger-sized objects separated by distances considerably less than the dimensions of each object, the controller interprets this pattern as a 'bunched finger' configuration. A gesture (such as a tap or swipe) performed on the touch screen with a bunched finger configuration can then be mapped by the controller to a command to perform some action on objects displayed on a graphical user interface associated with the touch screen. For the purposes of this specification, we define a bunched finger configuration to be a configuration where a user is contacting the touch screen with a set of fingers, or finger-like objects, comprising two or more fingers that are sufficiently close together as to appear to the device controller to be held in mutual physical contact close to the point of contact with the touch screen surface, but are nonetheless resolvable as individual objects. Anatomically, this generally means that the fingers are touching each other at least in the region of the last finger joint. In preferred embodiments, two adjacent touch objects are taken to be 'sufficiently close' when the touch screen perceives them as being separated by a gap less than half the size of either object, preferably less than a quarter the size of either touch object. In certain embodiments a touch screen controller provides a routine to calibrate the device for the finger sizes and bunched finger configurations of a particular user, e.g. an adult or child, to enable the controller to distinguish more reliably between, say, a gesture performed with fingers in a bunched configuration and a gesture performed with fingers in a 'neutral' configuration, i.e. slightly separated. Note that a user's fingers need not actually be held in mutual physical contact for the controller to interpret the finger configuration as a 'bunched finger configuration', although to minimise the chance of the controller misinterpreting the finger configuration and mapping it to the wrong command, a user should preferably hold their fingers in mutual contact. A bunched finger configuration will often comprise two or more fingers (including the thumb) from the same hand, e.g. index and middle fingers, but may comprise fingers from both hands, such as the two index fingers. A bunched finger configuration can also include one or more isolated fingers, so long as it includes at least two fingers that are bunched together. For example, a configuration comprising bunched index and middle fingers and a more distant thumb is considered to be a bunched finger configuration. A bunched finger configuration can also be performed with touch objects other than fingers, although it is envisaged that it will be more convenient and intuitive for users to interact with the touch screen using their fingers.
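As a minimal sketch of the 'sufficiently close' criterion defined above, the test might be written as follows; the function name, the choice of comparing the gap against the smaller of the two objects, and the default fractions are assumptions, with the fraction exposed as a parameter so that a calibration routine of the kind just mentioned could tune it per user.

```python
def is_bunched(width_a_mm, width_b_mm, gap_mm, max_gap_fraction=0.5):
    """True if the perceived gap between two adjacent touch objects is small
    enough, relative to the smaller object, for the pair to be treated as
    bunched. max_gap_fraction=0.5 is the half-width criterion; 0.25 gives the
    stricter quarter-width variant."""
    return gap_mm < max_gap_fraction * min(width_a_mm, width_b_mm)

# Example: two ~10 mm fingers separated by a 2 mm gap are bunched under both
# the half-width and quarter-width criteria.
assert is_bunched(10.0, 10.0, 2.0)                          # gap < 5 mm
assert is_bunched(10.0, 10.0, 2.0, max_gap_fraction=0.25)   # gap < 2.5 mm
```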
A gesture performed with a bunched finger configuration may include movement of fingers (a swipe) across the touch screen surface. In preferred embodiments the bunched fingers are moved in unison, i.e. they remain bunched throughout the movement. This is preferable for gestures performed on a touch screen susceptible to the double touch ambiguity. However in other embodiments the fingers may be bunched at the beginning of a gesture and moved apart, while in other embodiments they may be moved towards each other such that the gesture ends with the fingers bunched. Preferably the fingers are bunched at least at the beginning of the gesture to ensure the gesture is mapped to the desired command, but if the command is not executed or completed until after the gesture is finished, the gesture need only include bunched fingers at some point during its performance. Similar considerations apply for gestures that include one or more taps of a user's fingers on the touch screen surface; so long as a gesture includes screen contact with at least two bunched fingers at some point during its execution, it is considered to be a 'bunched finger gesture'.
Bunched finger configurations are convenient to apply, because the mutual finger contact provides a user with tactile feedback on their relative finger positions, ensuring that a gesture is mapped to the desired command by the controller. They are to be distinguished from the 'neutral' gestures of the abovementioned US 7,030,861 patent that are applied with a relaxed hand. To apply a bunched finger gesture a user needs actively to bunch his/her fingers together, whereas when a user initiates contact with a relaxed hand, the fingers are relatively close together but not bunched. Inspection of Fig. 6(b) shows that because the sheet of light 38 is located above the touch surface 61, an infrared touch screen will be able to detect the two fingers 18 and the gap 20 between them before the fingers actually contact the surface. That is, the touch screen is also sensitive to 'near touches' of bunched finger configurations. Given the limited height of the gap between two fingers held in mutual contact, for most practical implementations a user will place their fingers in contact with the touch surface. Nevertheless, the terms 'touch', 'contact' and the like are intended to encompass near touches.
Turning now to Fig. 8, there is illustrated one form of implementation of the bunch detection algorithm within the microcontroller 60 of Fig. 4. The microcontroller receives the sensor inputs for the row and column sensors from the position sensitive detector 58 and executes two continuous loops 120, 121. A first loop 120 examines the sensor values to determine if any finger down conditions exist, and the row and column details of potential finger down positions are stored. A second loop 121 continually examines the finger down positions, utilising the aforementioned algorithm of Fig. 7 to determine if any of the finger down positions further satisfy the bunched finger condition. The finger down positions and bunch status are stored in an I/O register 122 at a high rate.
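A host-side sketch of the two-loop structure of Fig. 8 is given below for illustration; Python threads stand in for the firmware loops, and the register layout, polling intervals and callable names are assumptions rather than details of the microcontroller implementation.

```python
import threading
import time

class BunchDetector:
    """Sketch of the Fig. 8 arrangement: loop 120 locates finger-down
    positions, loop 121 re-examines them for the bunched condition, and both
    results are published to an I/O register polled by the host driver."""

    def __init__(self, read_sensors, find_touches, is_bunched_pattern):
        self.read_sensors = read_sensors              # callable -> row/column samples
        self.find_touches = find_touches              # samples -> list of touch spans
        self.is_bunched_pattern = is_bunched_pattern  # spans -> bool
        self.lock = threading.Lock()
        self.io_register = {"touches": [], "bunched": False}

    def finger_down_loop(self):                       # loop 120: locate touches
        while True:
            spans = self.find_touches(self.read_sensors())
            with self.lock:
                self.io_register["touches"] = spans
            time.sleep(0.001)

    def bunch_status_loop(self):                      # loop 121: classify touches
        while True:
            with self.lock:
                spans = list(self.io_register["touches"])
                self.io_register["bunched"] = self.is_bunched_pattern(spans)
            time.sleep(0.001)

# Usage (with suitable callables): start each loop in its own daemon thread
# and have the host-side device driver poll detector.io_register.
```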
The microcontroller interfaces with a main computer system running a predetermined operating system (e.g. Microsoft Windows, Linux, Mac OS, etc.). Depending on the operating system implementation details, the positions and bunch status are read out by device drivers and forwarded to user interface routines over a main computer I/O bus. The device drivers implement bunched finger tracking for the generation of swipe gestures.
We turn now to a description of some non-limiting examples of graphical user interface commands that can be mapped to gestures (such as taps or swipes) performed with a bunched finger configuration.
EXAMPLE 1
Fig. 9(a) illustrates a 'two finger rotate' gesture performed with a configuration of two bunched fingers 75 on a graphical user interface 76 displaying a graphical object 77. When the graphical object is touched by two bunched fingers and the fingers moved (swiped) in unison across the screen, preferably in a roughly circular fashion 78, the object is rotated about its centre of mass, the default centre of rotation. That is, the gesture comprising a swipe of the two bunched fingers is 'mapped' to a graphical user interface command such that the device controller generates a rotation command in response to the gesture and applies it to a graphical object. If, on the other hand, the graphical object 77 is touched with a single finger 18, a swipe of the finger will be mapped by the device controller to a 'pan' or 'translate' command as shown in Fig. 9(b). A similar 'touch and drag' gesture performed with two fingers that are not interpreted as being bunched (e.g. if the device controller sees them as being separated by a distance comparable to the width of either finger) may also be mapped to a 'pan' command, or to some other command such as a 'fixed increment' rotation, e.g. with increments of 15, 30 or 90 degrees that may be user-definable.
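The routing described in this example might be sketched as follows; the Gesture structure and command names are illustrative assumptions rather than an interface defined by the specification.

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    finger_count: int
    bunched: bool      # set by the bunch detection algorithm
    kind: str          # "swipe" or "tap"

def map_to_command(gesture):
    """Route the same swipe to different commands depending on how many
    fingers are down and whether they are bunched."""
    if gesture.kind == "swipe":
        if gesture.finger_count == 2 and gesture.bunched:
            return "rotate_about_centre_of_mass"
        if gesture.finger_count == 1:
            return "pan"
        if gesture.finger_count == 2:
            return "fixed_increment_rotate"   # or "pan", depending on configuration
    return "no_op"

assert map_to_command(Gesture(2, True, "swipe")) == "rotate_about_centre_of_mass"
assert map_to_command(Gesture(1, False, "swipe")) == "pan"
```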
A number of variations on the 'two finger rotate' gesture shown in Fig. 9(a) are possible. In one example, if several graphical objects are displayed, the object to be rotated can be selected with a single finger tap before the rotation is commenced. In another example, a 'fixed increment' rotation function can be selected by tapping the object to be rotated with the two bunched fingers before commencing the 'two finger rotate' gesture. In yet another example, instead of rotating the object about its centre of mass, some other centre of rotation can be predefined with a double tap with a single finger.
EXAMPLE 2
Referring now to Figs 10(a) to 10(c), some other examples of gestures with two bunched fingers will be explained with reference to a list of items 79 partially displayed in a window 80 of a graphical user interface 76. As shown in Fig. 10(a), a downwards or upwards swipe 82 with a single finger 18 is mapped to a 'scroll' command allowing further items in the list to be displayed. As shown in Fig. 10(b), a swipe 84 with two widely separated fingers 18 is mapped to a 'pan' command where the entire window 80 is translated. As shown in Fig. 10(c), a downwards or upwards swipe 86 with a configuration of two bunched fingers 75 is mapped to a 'move' command whereby a selected item 88 is moved to another location in the list. The desired item is selected by the initial contact with the fingers on the display and 'dropped' into its new location by removing the fingers from the display. In another variation, an item can be selected by initial contact with a configuration of two bunched fingers then 'copied and moved' into a new location by a downwards or upwards swipe of one finger only. This is an example of a gesture that begins but does not end with a bunched finger configuration.
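A minimal sketch of the list interaction of Figs 10(a) to 10(c) follows; the handler class, its method names and the row-based movement model are assumptions introduced purely for illustration.

```python
class ListGestureHandler:
    """Select an item with a bunched two-finger contact, move it while the
    bunched fingers swipe, and 'drop' it when the fingers are lifted.
    Single-finger and widely separated two-finger swipes go to scroll/pan."""

    def __init__(self, items):
        self.items = list(items)
        self.selected = None                  # index of the item being moved

    def touch_down(self, item_index, finger_count, bunched):
        if finger_count == 2 and bunched:
            self.selected = item_index        # bunched contact selects the item

    def touch_move(self, finger_count, bunched, rows_moved):
        if self.selected is not None and bunched:
            new_index = max(0, min(len(self.items) - 1, self.selected + rows_moved))
            self.items.insert(new_index, self.items.pop(self.selected))
            self.selected = new_index
        elif finger_count == 1:
            self.scroll(rows_moved)
        elif finger_count == 2:
            self.pan(rows_moved)

    def touch_up(self):
        self.selected = None                  # lifting the fingers 'drops' the item

    def scroll(self, rows): pass              # placeholder: scroll the list window
    def pan(self, rows): pass                 # placeholder: translate the window

# Example: move the item at index 1 down two rows with a bunched swipe.
handler = ListGestureHandler(["a", "b", "c", "d"])
handler.touch_down(1, finger_count=2, bunched=True)
handler.touch_move(finger_count=2, bunched=True, rows_moved=2)
handler.touch_up()
assert handler.items == ["a", "c", "d", "b"]
```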
EXAMPLE 3
Figs 11(a) to 11(d) illustrate the application of a bunched finger gesture in a drawing package. Two separate graphical objects 90 displayed on a graphical user interface 76 are selected with single finger contacts 18 as shown in Fig. 11(a), then panned into an overlapping arrangement as the two fingers are moved towards each other, possibly but not necessarily into a bunched configuration, as shown in Fig. 11(b). Finally, a single tap with a configuration of two bunched fingers 75 on the overlap region 92 (Fig. 11(c)) is mapped to a command to join the two objects to form a single object 94 as shown in Fig. 11(d). This is an example of a gesture that ends but does not begin with a bunched finger configuration. Depending on the separation of the two objects 90 as shown in Fig. 11(a), it may be convenient to initiate this 'merge' command by selecting and panning the objects with a finger from each hand. We note that the 'merge' command, i.e. a tap with two bunched fingers as shown in Fig. 11(c), can alternatively be performed in isolation, to merge two graphical objects that were previously moved into or created in overlapping fashion.
Fig. 12 illustrates one form of implementation of gesture or swipe recognition within a user interface. In this arrangement, the sensor pad outputs are continuously read out by the device drivers 126 of the user interface 125 running as part of the computer operating system. The information read from the microcontroller 60 includes a bunched finger status flag which is active when a bunched finger determination has been made. The device drivers 126 store current and previous finger position information to determine if a two finger swipe is currently active. Where a continuous stream of data values contains bunched finger data at successively different positions, an extended bunched finger swipe is determined to have occurred.
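The driver-side swipe tracking described for Fig. 12 might be sketched as follows; the frame format read from the I/O register and the travel threshold are assumptions made for the purpose of the sketch.

```python
def detect_bunched_swipe(frames, min_travel_mm=5.0):
    """frames: iterable of (x_mm, y_mm, bunched_flag) tuples read from the
    controller's I/O register. Returns (start, end) positions of the first
    run of consecutive bunched frames whose travel exceeds the threshold,
    i.e. an extended bunched finger swipe, else None."""
    start = prev = None
    for x, y, bunched in frames:
        if bunched:
            if start is None:
                start = (x, y)                # first frame of a bunched run
            prev = (x, y)
        else:
            start = prev = None               # run of bunched frames broken
        if start and prev:
            travel = ((prev[0] - start[0]) ** 2 + (prev[1] - start[1]) ** 2) ** 0.5
            if travel >= min_travel_mm:
                return start, prev
    return None
```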
The device driver outputs are forwarded to user interface controller 127 which controls the overall user interactive experience and includes an object store of current objects being displayed. The user interface controller 127 updates the current interactive user interface position, outputting to a display output controller 129 which includes the usual frame buffer for output to the output display via the microcontroller 60.
While the methods of the present invention have been described and exemplified for use on an infrared-style input device of the type shown in Fig. 4, the skilled person will understand that many variations are possible, provided the detection optics have sufficient resolution. For example the detection optics may be in the form of an array of discrete photodetectors located along the edge of the input area, or the sensing light may be in the form of discrete beams generated for example by an array of transmit waveguides (as disclosed in US Patent No 5,914,709) or an array of discrete emitters (as in 'conventional' infrared touch). Two sets of parallel beams are usually required to determine the X,Y coordinates of a touch object, with the sets generally being perpendicular to each other although this is not essential (see US Patent No 4,933,544 for example). In yet another variation the sensing field is in the form of acoustic beams transmitted and received by arrays of reflectors spaced along the edges of the input area, as disclosed in US Patent No 6,856,259 for example.
It will be appreciated that the illustrated embodiments expand the range of gestures that can be mapped to commands of a touch-enabled graphical user interface. Although the invention has been described with reference to specific examples, it will be appreciated by those skilled in the art that the invention may be embodied in many other forms.

Claims

We claim:
1. A method for mapping gestures to graphical user interface commands, said method comprising the steps of: detecting a gesture performed with a bunched finger configuration on a surface of a touch screen associated with said graphical user interface, said bunched finger configuration comprising two or more fingers; and generating a graphical user interface command in response to said gesture.
2. A method according to claim 1, wherein said bunched finger configuration comprises two or more fingers held in mutual contact.
3. A method according to claim 1 or claim 2, wherein said bunched finger configuration comprises two or more fingers perceived by said touch screen to be separated by a gap less than half the size of any of said fingers.
4. A method according to claim 3, wherein said bunched finger configuration comprises two or more fingers perceived by said touch screen to be separated by a gap less than a quarter the size of any of said fingers.
5. A method according to any one of the preceding claims, wherein said gesture includes one or more taps on said surface of at least a subset of said fingers.
6. A method according to any one of the preceding claims, wherein said gesture includes movement across said surface of at least a subset of said fingers.
7. A method according to claim 6, wherein said fingers are moved across said surface in unison.
8. A method according to claim 7, wherein said bunched finger configuration comprises two fingers, and said gesture is mapped to a rotation command such that a graphical object displayed on said graphical user interface is rotated.
9. A method according to claim 7, wherein said bunched finger configuration comprises two fingers, said gesture further comprises a tap of said two fingers prior to said movement, and said gesture is mapped to a fixed increment rotation command such that a graphical object displayed on said graphical user interface is rotated by fixed increments.
10. A method according to claim 8, wherein said gesture further comprises a double tap of a single finger on said graphical object prior to said movement, to define a centre of rotation, and said gesture is mapped to a rotation command such that said graphical object is rotated about said centre of rotation.
11. A touch screen system when used to implement a method according to any one of claims 1 to 10.
EP10813167.3A 2009-09-04 2010-09-03 Methods for mapping gestures to graphical user interface commands Withdrawn EP2473909A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2009904315A AU2009904315A0 (en) 2009-09-04 Methods for mapping gestures to graphical user interface commands
PCT/AU2010/001135 WO2011026186A1 (en) 2009-09-04 2010-09-03 Methods for mapping gestures to graphical user interface commands

Publications (2)

Publication Number Publication Date
EP2473909A1 true EP2473909A1 (en) 2012-07-11
EP2473909A4 EP2473909A4 (en) 2014-03-19

Family

ID=43648778

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10813167.3A Withdrawn EP2473909A4 (en) 2009-09-04 2010-09-03 Methods for mapping gestures to graphical user interface commands

Country Status (2)

Country Link
EP (1) EP2473909A4 (en)
WO (1) WO2011026186A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201011687D0 (en) * 2010-07-12 2010-08-25 Faster Imaging As User interactions
EP2708995A4 (en) * 2011-05-12 2014-10-01 Nec Casio Mobile Comm Ltd Electronic device, method for controlling same and program
EP2812784A4 (en) * 2012-02-07 2015-11-11 Blackberry Ltd Methods and devices for merging contact records
CN107003804B (en) 2014-11-21 2020-06-12 习得智交互软件开发公司 Method, system and non-transitory computer readable recording medium for providing prototype design tool
US10319408B2 (en) 2015-03-30 2019-06-11 Manufacturing Resources International, Inc. Monolithic display with separately controllable sections
US10922736B2 (en) 2015-05-15 2021-02-16 Manufacturing Resources International, Inc. Smart electronic display for restaurants
US10269156B2 (en) 2015-06-05 2019-04-23 Manufacturing Resources International, Inc. System and method for blending order confirmation over menu board background
US10055120B2 (en) 2015-07-07 2018-08-21 International Business Machines Corporation Managing content displayed on a touch screen enabled device using gestures
US10319271B2 (en) 2016-03-22 2019-06-11 Manufacturing Resources International, Inc. Cyclic redundancy check for electronic displays
WO2017210317A1 (en) 2016-05-31 2017-12-07 Manufacturing Resources International, Inc. Electronic display remote image verification system and method
US10510304B2 (en) 2016-08-10 2019-12-17 Manufacturing Resources International, Inc. Dynamic dimming LED backlight for LCD array
US11895362B2 (en) 2021-10-29 2024-02-06 Manufacturing Resources International, Inc. Proof of play for images displayed at electronic displays

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
EP1517228A2 (en) * 2003-09-16 2005-03-23 Smart Technologies, Inc. Gesture recognition method and touch system incorporating the same
US20060274046A1 (en) * 2004-08-06 2006-12-07 Hillis W D Touch detecting interactive display
US20080168403A1 (en) * 2007-01-06 2008-07-10 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US20080278460A1 (en) * 2007-05-11 2008-11-13 Rpo Pty Limited Transmissive Body
US20090138800A1 (en) * 2007-11-23 2009-05-28 Mckesson Financial Holdings Limited Apparatus, method and computer-readable storage medium for directing operation of a software application via a touch-sensitive surface

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5896126A (en) * 1996-08-29 1999-04-20 International Business Machines Corporation Selection device for touchscreen systems
EP1717679B1 (en) * 1998-01-26 2016-09-21 Apple Inc. Method for integrating manual input
JP5172334B2 (en) * 2004-06-17 2013-03-27 アドレア エルエルシー Using two-finger input on a touch screen
US8077153B2 (en) * 2006-04-19 2011-12-13 Microsoft Corporation Precise selection techniques for multi-touch screens
US20070257891A1 (en) * 2006-05-03 2007-11-08 Esenther Alan W Method and system for emulating a mouse on a multi-touch sensitive surface
US8098235B2 (en) * 2007-09-28 2012-01-17 Immersion Corporation Multi-touch device having dynamic haptic effects

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2011026186A1 *

Also Published As

Publication number Publication date
WO2011026186A1 (en) 2011-03-10
EP2473909A4 (en) 2014-03-19

Similar Documents

Publication Publication Date Title
EP2473909A1 (en) Methods for mapping gestures to graphical user interface commands
US20110012856A1 (en) Methods for Operation of a Touch Input Device
US8614664B2 (en) Multi-touch multi-dimensional mouse
US9001087B2 (en) Light-based proximity detection system and user interface
US20110069018A1 (en) Double Touch Inputs
Yee Two-handed interaction on a tablet display
CA2647561C (en) Selective rejection of touch contacts in an edge region of a touch surface
KR101365394B1 (en) Light-based finger gesture user interface
KR101471267B1 (en) Method and device for generating dynamically touch keyboard
CN107066137B (en) Apparatus and method for providing user interface
US20120218215A1 (en) Methods for Detecting and Tracking Touch Objects
US20120154313A1 (en) Multi-touch finger registration and its applications
CN102822774B (en) Presentation graphics
US20150193023A1 (en) Devices for use with computers
JP2011503709A (en) Gesture detection for digitizer
TW201112081A (en) Two-dimensional touch sensors
EP2550579A1 (en) Gesture mapping for display device
TW201322064A (en) Multi-touch mouse
US20110248946A1 (en) Multi-mode prosthetic device to facilitate multi-state touch screen detection
US20120075202A1 (en) Extending the touchable area of a touch screen beyond the borders of the screen
TW201218036A (en) Method for combining at least two touch signals in a computer system
TW201411426A (en) Electronic apparatus and control method thereof
EP2872973A1 (en) Improvements in devices for use with computers
US20130088427A1 (en) Multiple input areas for pen-based computing
US9436304B1 (en) Computer with unified touch surface for input

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120402

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20140219

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0484 20130101ALI20140213BHEP

Ipc: G06F 3/041 20060101ALI20140213BHEP

Ipc: G06F 3/0488 20130101ALI20140213BHEP

Ipc: G06F 3/042 20060101AFI20140213BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20140918