WO2009155981A1 - Gesture on touch sensitive arrangement - Google Patents

Gesture on touch sensitive arrangement

Info

Publication number
WO2009155981A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
touch
parameter
predicted
contiguous
Prior art date
Application number
PCT/EP2008/058164
Other languages
French (fr)
Inventor
Erik Sparre
Original Assignee
Uiq Technology Ab
Application filed by Uiq Technology Ab filed Critical Uiq Technology Ab
Priority to PCT/EP2008/058164 priority Critical patent/WO2009155981A1/en
Publication of WO2009155981A1 publication Critical patent/WO2009155981A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop

Definitions

  • the present invention relates generally to a method and an arrangement for detection of a gesture on a touch sensitive arrangement, e.g. a touchscreen.
  • Embodiments of the present invention relate to a method and arrangement in a portable device.
  • a button, a track ball, a thumbwheel and/or a computer mouse or similar are commonly provided as an interface between a device and a user of the device, i.e. provided as the user interface or the so-called Man-Machine Interface (MMI).
  • touch sensitive arrangements such as touchscreens or similar are frequently preferred as a user interface in small devices, e.g. in portable communication arrangements such as cell phones or similar and in other portable arrangements such as personal digital assistants (PDA) or similar. This is, inter alia, due to the fact that touch sensitive arrangements do not usually involve the intricate assembly and/or operational space etc. that is required for implementing electromechanical user interfaces such as those mentioned above.
  • Common electromechanical user interfaces - e.g. a computer mouse or similar - can be used for moving a cursor on a screen or similar, e.g. such as a Liquid Crystal Display (LCD) or similar.
  • user interfaces can be operated to indicate an object presented on the screen, e.g. by a click or a double click on the user interface.
  • Electromechanical user interfaces can also be operated to drag an object presented on the screen, e.g. by a click on the user interface to indicate the object (e.g. a click on a computer mouse) and then dragging the object by moving the user interface (e.g. moving the computer mouse).
  • a click, double click and/or a drag gesture can be performed on touch sensitive arrangements.
  • a single click or a double click may e.g. be performed by a single tap or a double tap respectively on the touch sensitive arrangement, e.g. by means of a finger or a stylus or similar.
  • a drag may be performed by a tap of a finger or a stylus or similar on the touch sensitive arrangement to indicate the object, whereupon the indicated object may be dragged by sliding the finger or stylus or similar over the surface of the touch sensitive arrangement.
  • Figure 1 a illustrates the timing of a typical single tap gesture.
  • the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap.
  • the pressure decreases and remains at or below a certain minimum value for a certain time that is more than a reference amount of time Δrel. This may be detected by a touch sensitive arrangement as a single tap corresponding to a click on a computer mouse or similar.
  • Figure 1 b illustrates the timing of a typical double tap gesture.
  • the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap.
  • the pressure decreases and remains at or below a certain minimum value for a certain time that is less than a reference amount of time Δrel.
  • the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap.
  • Figure 1 c illustrates an exemplifying timing of a typical drag gesture.
  • the pressure increases to or above a certain maximum value, where it may remain for a certain time that is less than a first reference amount of time Δd1 as schematically indicated in Fig. 1 c.
  • the drag gesture continues it is natural to lift the stylus or finger from the touch sensitive arrangement to reduce friction etc.
  • the problem is to recognize a drag gesture made on the touch sensitive arrangement even if the pressure of the touch varies during the gesture.
  • a gesture causing two or more detected movements separated by missing detections - as illustrated by the three lines in Fig. 1d separated by two sections of missing detections - should be recognized as a continuous gesture and constructed as a single gesture as illustrated by the continuous line in Fig. 1d.
  • the present invention is directed to solving the problem of providing a simple and efficient method and device for detecting a drag gesture such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
  • At least one of the problems identified above is solved according to a first aspect of the invention providing a method for recognizing at least one drag gesture detected by a touch sensitive arrangement of a portable device, which method in the portable device comprises the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
  • a second embodiment of the invention is directed to a method comprising the features of the first aspect wherein the first parameter comprises at least one first gesture position, the predicted parameter comprises at least one predicted gesture position, and the second parameter comprises at least one second gesture position, and; the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said predicted gesture position and said second gesture position is less than a predetermined value.
  • a third embodiment of the invention is directed to a method comprising the features of the first aspect or the features of the second embodiment wherein the first parameter comprises a first gesture velocity, the predicted parameter comprises a predicted gesture velocity, and the second parameter comprises a second gesture velocity; and the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said predicted gesture velocity and said second gesture velocity is less than a predetermined value.
  • a fourth embodiment of the invention is directed to a method comprising the features of the third embodiment wherein the first touch gesture and the second touch gesture are converted to a contiguous gesture when the predicted gesture velocity and said second gesture velocity are substantially equal.
  • a fifth embodiment of the invention is directed to a method comprising the features of any one of the first aspect or the second, third or fourth embodiment wherein the first parameter comprises a first gesture time stamp and the second parameter comprises a second gesture time stamp; and the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said first gesture time stamp and said second gesture time stamp is less than a predetermined value.
  • a sixth embodiment of the invention is directed to a method comprising the features of any one of the second, third, fourth or fifth embodiment, which method comprises the steps of: converting the first touch gesture and the second touch gesture to a contiguous gesture by using a filtered version of at least one second parameter indicative of the second touch gesture.
  • a seventh embodiment of the invention is directed to a method comprising the features of the sixth embodiment, which method comprises the steps of: converting the first touch gesture and the second touch gesture to a contiguous gesture by using an Alpha-Beta filter for filtering said at least one second parameter indicative of the second touch gesture.
  • An eighth embodiment of the invention is directed to a portable device that comprises a touch sensing arrangement and that is configured to perform the method according to any one of the first aspect or the second, third, fourth, fifth, sixth or seventh embodiment.
  • a ninth embodiment of the invention is directed to a computer program product stored on a computer usable medium, comprising readable program means for causing a portable device to execute, when said program means is loaded in the portable device comprising a touch sensing arrangement configured to recognize at least one drag gesture detected by a touch sensitive arrangement, the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
  • Fig. 1 a is a schematic illustration of the finger/stylus pressure and the timing at a single tap gesture
  • Fig. 1 b is a schematic illustration of the finger/stylus pressure and the timing of a double tap gesture
  • Fig. 1 c is a schematic illustration of the finger/stylus pressure and the timing at a drag gesture
  • Fig. 1d is a schematic illustration of a drag gesture detected as a series of separated detections that should be constructed as a single gesture.
  • Fig. 2 is a schematic illustration of a portable device in the form of a cell phone 10.
  • Fig. 3 is a schematic illustration of the relevant parts of the cell phone 10 in Fig. 2,
  • Fig. 4 is a flowchart of an exemplifying operation of an embodiment of the invention.
  • Fig. 5 shows a CD ROM 56 on which program code for executing the method according to the invention is provided.
  • the present invention relates to portable devices comprising a touch sensitive arrangement.
  • the invention relates to portable communication devices comprising a touchscreen or similar touch sensitive arrangement.
  • the invention is by no means limited to communication devices or touchscreens. Rather, it can be applied to any suitable portable device comprising a suitable touch sensitive arrangement.
  • FIG. 2 shows an exemplifying portable communication device according to a preferred embodiment of the invention.
  • the device is a mobile cell phone 10.
  • a cell phone is just one example of a portable device in which the invention can be implemented.
  • the invention can for instance be implemented in a PDA (Personal Digital Assistant), a palm top computer, a lap top computer or a smartphone or any other suitable portable device.
  • the cell phone 10 in Fig. 2 comprises a keypad 12, a loudspeaker 13 and a microphone 14.
  • the keypad 12 is used for entering information such as selection of functions and responding to prompts.
  • the keypad 12 may be of any suitable kind, including but not limited to keypads with suitable push-buttons or similar and/or a combination of different suitable button arrangements.
  • the keypad 12 may even be an integral part of a touch sensitive arrangement comprised by the phone 10 being described below.
  • the loudspeaker 13 is used for presenting sounds to a user and the microphone 14 is used for sensing the voice from the user or similar.
  • the cell phone 10 includes an antenna, which is used for communication with other users via a network. The antenna is in-built in the cell phone 10 and hence not shown in Fig 2.
  • the cell phone 10 in Fig. 2 comprises a touch sensitive arrangement comprising an exemplifying touchscreen 20.
  • the touchscreen 20 comprises a touch function unit arranged to operatively receive and/or sense touches made by a user on the surface of the touchscreen 20. It is also preferred that the touchscreen 20 comprises a display function unit arranged to operatively present such items as functions, prompts, still and/or moving images etc to a user.
  • a touch function unit and a display function unit are almost mandatory features of typical touchscreens and they are also well known to those skilled in the art. Exemplifying touchscreens in this category can e.g. be found in modern cell phones such as the M600i, W950i, P990i and others from Sony Ericsson. Hence, the touch function unit and display function unit of a touchscreen are well known and they need no detailed description.
  • Figure 3 shows parts of the interior of the cell phone 10 being relevant but not necessarily mandatory for the present invention.
  • the cell phone 10 comprises a keypad 12, a speaker 13, a microphone 14 and a touchscreen 20.
  • the touchscreen 20 comprises a touch function unit 22 for receiving and detecting touches from a user of the cell phone 10, and a display function unit 24 (e.g. comprising a display such as an LCD or similar) for presenting functions, prompts, still images and/or moving images etc as mentioned above.
  • the cell phone 10 is preferably provided with a memory arrangement 16 for storing such items as e.g. system files and data files etc.
  • the memory arrangement 16 may be any suitable memory or combination of memories that are commonly used in known portable devices such as e.g. cell phones or similar.
  • the cell phone 10 comprises an antenna 17 connected to a radio circuit 18 for enabling wireless communication with a cellular network.
  • control unit 40 for controlling and supervising the operation of the cell phone 10.
  • the control unit 40 may be implemented by means of hardware and/or software, and it may comprise one or several hardware units and/or software modules, e.g. one or several separate processor arrangements provided with or having access to the appropriate software and hardware required for the functions to be performed by the cell phone 10, as is well known by those skilled in the art.
  • the control unit 40 is connected to or at least arranged to operatively communicate with the keypad 12, the speaker 13, the microphone 14, the touchscreen 20, the radio unit 18 and the memory 16. This provides the control unit 40 with the ability to control and communicate with these units to e.g. exchange information and instructions with the units.
  • the control unit 40 is provided with a drag gesture control 42, which is of special interest in connection with the present invention. Being a part of the control unit 40 implies that the drag gesture control 42 can be implemented by means of hardware and/or software and it can comprise one or several hardware units and/or software modules, e.g. one or several separate processor units provided with or having access to the software and hardware appropriate for the functions required.
  • the drag gesture control 42 is arranged to operatively control the touchscreen arrangement 20 so as to comprise and/or communicate with the touch function unit of the touchscreen arrangement 20 for sensing touches received and detected by the touch function unit 22 of the touchscreen 20.
  • the drag gesture control 42 is arranged so as to operatively detect a drag gesture (e.g. a tap and drag gesture) received and detected by the touch function unit 22 such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
  • the touch function unit 22 of the touchscreen 20 may comprise any of: a resistive, a capacitive, a surface-wave-acoustic (SAW) or an infrared (IR) technique or some other suitable touch sensing technique as is well known to those skilled in the art.
  • the pressure exerted by a finger or a stylus on the touch sensing surface of the touch function unit 22 can be represented in a graph as discussed above with reference to Fig. 1 a-1d.
  • a finger or a stylus or similar applied with an increasing pressure on a resistive touch sensing arrangement will typically cause the detected signal to increase gradually, whereas a decreased pressure will typically cause the detected signal to decrease.
  • a higher pressure causes a larger area of the finger to be applied on the touch sensing arrangement and the other way around for a lower pressure, which can be detected by the touch sensing arrangement.
  • a stylus applied on a capacitive, SAW or IR touch sensing arrangement with a varying pressure may not cause the detected pressure to vary, since the area of the stylus applied on the touch sensing arrangement remains essentially the same. Rather, a constant pressure may be detected as long as the stylus remains applied on a capacitive, SAW or IR touch sensing arrangement, even if the stylus is applied with a varying pressure.
  • the attention is now directed to an exemplifying method to be performed by the cell phone 10 described above and particularly by the drag gesture control 42 in the cell phone 10.
  • the exemplifying method detects a tap and drag gesture such that the drag gesture is distinguished from other gestures, e.g. a single tap and/or a double tap.
  • a finger gesture or similar is assumed. However, the same applies mutatis mutandis for a stylus gesture or similar.
  • the method is performed by the drag gesture control 42 being arranged so as to logically implement the method between the touch screen driver software etc controlling the touch function unit 22 and the operating system level human interface input device handler etc controlling the display function unit 24 and the features displayed thereon.
  • the touch function unit 22 provides X and Y coordinates to indicate the position of a finger or similar during a tap and drag on the touchscreen arrangement 20 plus a time stamp with each event, e.g. each new sampled position.
  • the touch function unit 22 provides Z values in a similar manner to indicate the level of pressure against the touch screen. Time, X, Y, and Z positions / values can all be used by the method.
  • the method uses an Alpha-Beta filter or an Alpha-Beta-Gamma filter.
  • a person skilled in the art having the benefit of this disclosure may contemplate other filters with inferior performance and/or being more complicated and thus more costly to develop.
  • an initialization of the cell phone 10 and particularly the drag gesture control 42 is performed.
  • the initialisation may e.g. include such actions as activating the touch function unit 22, e.g. activating a resistive, capacitive, surface-wave-acoustic (SAW) or infrared touch sensing arrangement or some other suitable touch sensing technique comprised by the touch function unit 22.
  • the initialisation may also include such actions as initialising the display function unit 24 and preferably the other necessary functions and/or units of the cell phone 10.
  • a first touch gesture G1 is recorded at a first press event by the touch function unit 22 and made available to the drag gesture control 42. It is assumed that the recording of the first touch gesture G1 comprises a number of consecutive samples of X, Y, and possibly Z positions / values and a number of time differences ΔT each marking the time difference between two consecutive and adjacent samples.
  • a tap and drag gesture such as that illustrated in Fig. 1 c there will normally be movements that can be detected and sampled during the whole sequence including but not limited to Δd1 and Δd2 as schematically indicated in Fig. 1 c.
  • the first sample in the consecutive number of samples constituting the first gesture G1.
  • the first raw X, Y, and possibly Z positions / values as they are, or alternatively only as starting values, to predict and filter the next X, Y, and possibly Z positions / values (i.e. throw away the first sample).
  • Speed and acceleration may have to be assumed to be zero for the first sample.
  • a predicted gesture Gp is obtained based on the samples in the first gesture G1 and expressions 1-9 given above.
  • the prediction is based on the last X, Y, and possibly Z positions / values and the time difference ΔT between two events (e.g. between two samples).
  • ΔT may be given by the sampling rate / sampling interval.
  • a second touch gesture G2 is recorded at a second press event by the touch function unit 22 and made available to the drag gesture control 42. It is assumed that the recording of the second touch gesture G2 comprises a number of consecutive samples of X, Y, and possibly Z positions / values and a number of time differences ΔT each marking the time difference between two consecutive and adjacent samples.
  • a fifth step S5 of the exemplifying method it is preferably determined whether the first gesture G1 should be considered as a part of the same gesture as the second gesture G2, i.e. it is determined if the two gestures G1, G2 form a single gesture.
  • the prediction Gp - obtained in the third step S3 by expressions 1-9 operating on samples forming the first gesture G1 as described above - is compared to the first sample of X, Y, and possibly Z positions / values of the second gesture G2. If the first sample of X, Y and possibly Z of the second gesture G2 is measured very close to the prediction, and preferably at the same time as the time ΔTG between the first gesture G1 and the first sample of the second gesture G2 is short, then we can assume that G1 and G2 are part of the same gesture.
  • G1 and G2 may be considered parts of the same gesture when the difference (Δx, Δy) between said predicted gesture position xp, yp and said second gesture position x2, y2 is less than 0.1 millimetres, 0.5 millimetres or less than 1 millimetre.
  • the short time ΔTG between the first gesture G1 and the first sample of the second gesture G2 may e.g. be less than 0.1 seconds, 0.5 seconds or less than 1 second.
  • G1 and G2 are assumed to be part of the same gesture if Δvx and Δvy as defined by expressions 17 and 18 respectively are small, preferably at the same time as the time ΔTG lapsed between the events is short (i.e. ΔTG is small as indicated above).
  • G1 and G2 may be considered parts of the same gesture when the difference (Δvx, Δvy) between said predicted gesture velocity vp and said second gesture velocity v2 is less than 0.1 millimetres per second, 0.5 millimetres per second or less than 1 millimetre per second.
  • the method returns to the second step S2 described above. However, if it can be assumed that the gestures G1, G2 form parts of the same gesture it is preferred that the method proceeds to the next sixth step S6.
  • a sixth step S6 of the exemplifying method it is preferably determined whether the gestures G1, G2 that are assumed to form parts of the same gesture should actually be converted into a single contiguous gesture. In other words, under the assumption that the two gestures G1, G2 are parts of a single gesture it is determined whether the gestures G1, G2 should actually form a contiguous gesture.
  • first gesture G1 and the second gesture G2 are converted into a single contiguous gesture if:
  • kz , kvz and kaz are constants. If it can be assumed that the gestures G1 , G2 should not form a contiguous gesture it is preferred that the method returns to the second step S2 described above. However, if it can be assumed that the gestures G1 , G2 should form a contiguous gesture it is preferred that the method proceeds to the next seventh step S7.
  • a seventh step S7 of the exemplifying method it is preferred that the first gesture G1 and the second gesture G2 are converted to a single contiguous gesture. This is preferably accomplished by means of the expressions 23-31 above.
  • the first sample of X, Y, and possibly Z positions / values of the second gesture G2 is preferably replaced by new filtered positions / values preferably calculated as indicated by the expressions 23-31 above, i.e. calculated depending on the first sample of X, Y, and possibly Z positions / values in the second gesture G2 adjusted with a weight of the difference between the first sample of X, Y, and possibly Z positions / values in the second gesture G2 and the prediction of these positions / values.
  • the drag control unit 46 is adapted to perform the exemplifying method as described above by being provided with one or more processors having corresponding memory containing the appropriate software in the form of a program code or similar.
  • program code or similar can also be provided on a data carrier such as a CD ROM disc 56 as depicted in Fig. 5 or an insertable memory stick, which code or similar will perform the invention when loaded into a computer or into a phone having suitable processing capabilities.
  • the program code can also be downloaded remotely from a server either outside or inside the cellular network or be downloaded via a computer like a PC to which the phone is temporarily connected.

Abstract

The present invention is directed to a method for recognizing at least one drag gesture detected by a touch sensitive arrangement 20, 22, 24 of a portable device 10. The method comprises the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.

Description

GESTURE ON TOUCH SENSITIVE ARRANGEMENT
TECHNICAL FIELD
The present invention relates generally to a method and an arrangement for detection of a gesture on a touch sensitive arrangement, e.g. a touchscreen. Embodiments of the present invention relate to a method and arrangement in a portable device.
DESCRIPTION OF RELATED ART
As is well known to those skilled in the art a button, a track ball, a thumbwheel and/or a computer mouse or similar are commonly provided as an interface between a device and a user of the device, i.e. provided as the user interface or the so-called Man-Machine Interface (MMI). It is also well known that touch sensitive arrangements such as touchscreens or similar are frequently preferred as a user interface in small devices, e.g. in portable communication arrangements such as cell phones or similar and in other portable arrangements such as personal digital assistants (PDA) or similar. This is, inter alia, due to the fact that touch sensitive arrangements do not usually involve the intricate assembly and/or operational space etc. that is required for implementing electromechanical user interfaces such as those mentioned above.
Common electromechanical user interfaces - e.g. a computer mouse or similar - can be used for moving a cursor on a screen or similar, e.g. such as a Liquid Crystal Display (LCD) or similar. In addition, such user interfaces can be operated to indicate an object presented on the screen, e.g. by a click or a double click on the user interface.
Electromechanical user interfaces can also be operated to drag an object presented on the screen, e.g. by a click on the user interface to indicate the object (e.g. a click on a computer mouse) and then dragging the object by moving the user interface (e.g. moving the computer mouse). Clicking, double clicking and dragging an object on a screen as briefly described above are well known facts to those skilled in the art and they need no further description.
It is also well known that a click, double click and/or a drag gesture can be performed on touch sensitive arrangements. A single click or a double click may e.g. be performed by a single tap or a double tap respectively on the touch sensitive arrangement, e.g. by means of a finger or a stylus or similar. Similarly, a drag may be performed by a tap of a finger or a stylus or similar on the touch sensitive arrangement to indicate the object, whereupon the indicated object may be dragged by sliding the finger or stylus or similar over the surface of the touch sensitive arrangement. Clicking, double clicking and dragging an object on a touch sensitive arrangement as briefly described above are well known to those skilled in the art and they need no further description.
Nevertheless, some characteristics of a single tap, a double tap and a drag gesture on a touch sensitive arrangement will be schematically elaborated below with reference to Fig. 1 a, 1 b, 1 c and 1 d.
Figure 1 a illustrates the timing of a typical single tap gesture. As the finger or stylus or similar is detected by the touch sensitive arrangement the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap. As the finger or stylus is subsequently lifted from the touch sensitive arrangement the pressure decreases and remains at or below a certain minimum value for a certain time that is more than a reference amount of time Δrel. This may be detected by a touch sensitive arrangement as a single tap corresponding to a click on a computer mouse or similar.
Figure 1 b illustrates the timing of a typical double tap gesture. As the finger or stylus or similar is detected by the touch sensitive arrangement the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap. As the finger or stylus is subsequently lifted from the touch sensitive arrangement the pressure decreases and remains at or below a certain minimum value for a certain time that is less than a reference amount of time Δrel. As the finger or stylus or similar is once again detected by the touch sensitive arrangement the pressure increases and remains at or above a certain maximum value for a certain time that is less than a reference amount of time Δtap. As the finger or stylus is subsequently lifted from the touch sensitive arrangement the pressure decreases and remains at or below a certain minimum value for a certain time that is more than a reference amount of time Δrel. This may be detected by a touch sensitive arrangement as a double tap corresponding to a double click on a computer mouse or similar.
Figure 1 c illustrates an exemplifying timing of a typical drag gesture. As the finger or stylus or similar is detected by the touch sensitive arrangement the pressure increases to or above a certain maximum value, where it may remain for a certain time that is less than a first reference amount of time Δd1 as schematically indicated in Fig. 1 c. As the drag gesture continues it is natural to lift the stylus or finger from the touch sensitive arrangement to reduce friction etc. The pressure will then decrease to or below a certain minimum value for a certain time Δd2. A typical drag gesture as described briefly above is too often recognized by the touch sensitive arrangement as a single tap, described above with reference to Fig. 1 a. This is particularly so if Δtap ≥ Δd1 and Δd2 ≥ Δrel.
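To make the timing logic concrete, the following small Python sketch (not taken from the patent; the threshold names DELTA_TAP and DELTA_REL, the event format and the numeric values are assumptions chosen for illustration) shows how a naive recognizer driven only by press/release durations would behave, and why a drag of the kind in Fig. 1 c can be mistaken for a single tap.

```python
# Illustrative sketch only (not the patent's algorithm): a naive classifier
# based purely on the press/release timing of Fig. 1a-1c.
# DELTA_TAP, DELTA_REL and the event format are assumptions for this example.

DELTA_TAP = 0.20   # max press duration (s) still counted as a tap
DELTA_REL = 0.30   # min release duration (s) that ends a gesture

def classify(events):
    """events: list of (press_duration, release_duration) pairs in seconds."""
    if len(events) == 1:
        press, release = events[0]
        if press < DELTA_TAP and release > DELTA_REL:
            return "single tap"      # timing of Fig. 1a
    if len(events) == 2:
        (p1, r1), (p2, r2) = events
        if p1 < DELTA_TAP and r1 < DELTA_REL and p2 < DELTA_TAP and r2 > DELTA_REL:
            return "double tap"      # timing of Fig. 1b
    return "unknown"

# A drag whose initial press is shorter than DELTA_TAP and whose lift-off lasts
# longer than DELTA_REL (Fig. 1c) comes out as "single tap" here, which is
# exactly the misrecognition the invention addresses.
print(classify([(0.15, 0.5)]))   # -> "single tap", even if it was really a drag
```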
Hence, the problem is to recognize a drag gesture made on the touch sensitive arrangement even if the pressure of the touch varies during the gesture. In other words, a gesture causing two or more detected movements separated by missing detections - as illustrated by the three lines in Fig. 1d separated by two sections of missing detections - should be recognized as a continuous gesture and constructed as a single gesture as illustrated by the continuous line in Fig. 1d.
In view of the above it would be advantageous to have a simple and efficient method and arrangement for detecting a tap and drag gesture such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
SUMMARY
The present invention is directed to solving the problem of providing a simple and efficient method and device for detecting a drag gesture such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
At least one of the problems identified above is solved according to a first aspect of the invention providing a method for recognizing at least one drag gesture detected by a touch sensitive arrangement of a portable device, which method in the portable device comprises the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
A second embodiment of the invention is directed to a method comprising the features of the first aspect wherein the first parameter comprises at least one first gesture position, the predicted parameter comprises at least one predicted gesture position, and the second parameter comprises at least one second gesture position, and; the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said predicted gesture position and said second gesture position is less than a predetermined value.
A third embodiment of the invention is directed to a method comprising the features of the first aspect or the features of the second embodiment wherein the first parameter comprises a first gesture velocity, the predicted parameter comprises a predicted gesture velocity, and the second parameter comprises a second gesture velocity; and the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said predicted gesture velocity and said second gesture velocity is less than a predetermined value.
A fourth embodiment of the invention is directed to a method comprising the features of the third embodiment wherein the first touch gesture and the second touch gesture are converted to a contiguous gesture when the predicted gesture velocity and said second gesture velocity are substantially equal.
A fifth embodiment of the invention is directed to a method comprising the features of any one of the first aspect or the second, third or fourth embodiment wherein the first parameter comprises a first gesture time stamp and the second parameter comprises a second gesture time stamp; and the first touch gesture and the second touch gesture are converted to a contiguous gesture when the difference between said first gesture time stamp and said second gesture time stamp is less than a predetermined value.
A sixth embodiment of the invention is directed to a method comprising the features of any one of the second, third, fourth or fifth embodiment, which method comprises the steps of: converting the first touch gesture and the second touch gesture to a contiguous gesture by using a filtered version of at least one second parameter indicative of the second touch gesture.
A seventh embodiment of the invention is directed to a method comprising the features of the sixth embodiment, which method comprises the steps of: converting the first touch gesture and the second touch gesture to a contiguous gesture by using an Alpha-Beta filter for filtering said at least one second parameter indicative of the second touch gesture.
An eighth embodiment of the invention is directed to a portable device that comprises a touch sensing arrangement and that is configured to perform the method according to any one of the first aspect or the second, third, fourth, fifth, sixth or seventh embodiment.
A ninth embodiment of the invention is directed to a computer program product stored on a computer usable medium, comprising readable program means for causing a portable device to execute, when said program means is loaded in the portable device comprising a touch sensing arrangement configured to recognize at least one drag gesture detected by a touch sensitive arrangement, the steps of: recording and obtaining at least one first parameter indicative of a first touch gesture made on the touch sensitive arrangement; obtaining at least one predicted parameter indicative of a predicted next gesture based on said first parameter; recording and obtaining at least one second parameter indicative of a second touch gesture made on the touch sensitive arrangement; converting the first touch gesture and the second touch gesture to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
Further advantages of the present invention and embodiments thereof will appear from the following detailed description of the invention.
It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components, but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described in more detail in relation to the enclosed drawings, in which:
Fig. 1 a is a schematic illustration of the finger/stylus pressure and the timing at a single tap gesture,
Fig. 1 b is a schematic illustration of the finger/stylus pressure and the timing of a double tap gesture,
Fig. 1 c is a schematic illustration of the finger/stylus pressure and the timing at a drag gesture,
Fig. 1d is a schematic illustration of a drag gesture detected as a series of separated detections that should be constructed as a single gesture.
Fig. 2 is a schematic illustration of a portable device in the form of a cell phone 10.
Fig. 3 is a schematic illustration of the relevant parts of the cell phone 10 in Fig. 2,
Fig. 4 is a flowchart of an exemplifying operation of an embodiment of the invention,
Fig. 5 shows a CD ROM 56 on which program code for executing the method according to the invention is provided.
DETAILED DESCRIPTION OF EMBODIMENTS
Features of Embodiments
The present invention relates to portable devices comprising a touch sensitive arrangement. In particular, the invention relates to portable communication devices comprising a touchscreen or similar touch sensitive arrangement. However, the invention is by no means limited to communication devices or touchscreens. Rather, it can be applied to any suitable portable device comprising a suitable touch sensitive arrangement.
Figure 2 shows an exemplifying portable communication device according to a preferred embodiment of the invention. Preferably, the device is a mobile cell phone 10. However, as indicated above, a cell phone is just one example of a portable device in which the invention can be implemented. The invention can for instance be implemented in a PDA (Personal Digital Assistant), a palm top computer, a lap top computer or a smartphone or any other suitable portable device. The cell phone 10 in Fig. 2 comprises a keypad 12, a loudspeaker 13 and a microphone 14. The keypad 12 is used for entering information such as selection of functions and responding to prompts. The keypad 12 may be of any suitable kind, including but not limited to keypads with suitable push-buttons or similar and/or a combination of different suitable button arrangements. The keypad 12 may even be an integral part of a touch sensitive arrangement comprised by the phone 10 being described below. The loudspeaker 13 is used for presenting sounds to a user and the microphone 14 is used for sensing the voice from the user or similar. In addition, the cell phone 10 includes an antenna, which is used for communication with other users via a network. The antenna is in-built in the cell phone 10 and hence not shown in Fig 2.
Moreover, the cell phone 10 in Fig. 2 comprises a touch sensitive arrangement comprising an exemplifying touchscreen 20. The touchscreen 20 comprises a touch function unit arranged to operatively receive and/or sense touches made by a user on the surface of the touchscreen 20. It is also preferred that the touchscreen 20 comprises a display function unit arranged to operatively present such items as functions, prompts, still and/or moving images etc to a user. A touch function unit and a display function unit are almost mandatory features of typical touchscreens and they are also well known to those skilled in the art. Exemplifying touchscreens in this category can e.g. be found in modern cell phones such as the M600i, W950i, P990i and others from Sony Ericsson. Hence, the touch function unit and display function unit of a touchscreen are well known and they need no detailed description.
Figure 3 shows parts of the interior of the cell phone 10 being relevant but not necessarily mandatory for the present invention. As previously explained, it is preferred that the cell phone 10 comprises a keypad 12, a speaker 13, a microphone 14 and a touchscreen 20.
In particular, it is preferred that the touchscreen 20 comprises a touch function unit 22 for receiving and detecting touches from a user of the cell phone 10, and a display function unit 24 (e.g. comprising a display such as an LCD or similar) for presenting functions, prompts, still images and/or moving images etc as mentioned above.
In addition, the cell phone 10 is preferably provided with a memory arrangement 16 for storing such items as e.g. system files and data files etc. The memory arrangement 16 may be any suitable memory or combination of memories that are commonly used in known portable devices such as e.g. cell phones or similar. In addition, the cell phone 10 comprises an antenna 17 connected to a radio circuit 18 for enabling wireless communication with a cellular network.
Furthermore, the cell phone 10 is provided with a control unit 40 for controlling and supervising the operation of the cell phone 10. The control unit 40 may be implemented by means of hardware and/or software, and it may comprise one or several hardware units and/or software modules, e.g. one or several separate processor arrangements provided with or having access to the appropriate software and hardware required for the functions to be performed by the cell phone 10, as is well known by those skilled in the art. As can be seen in Fig. 3, it is preferred that the control unit 40 is connected to or at least arranged to operatively communicate with the keypad 12, the speaker 13, the microphone 14, the touchscreen 20, the radio unit 18 and the memory 16. This provides the control unit 40 with the ability to control and communicate with these units to e.g. exchange information and instructions with the units.
In particular, the control unit 40 is provided with a drag gesture control 42, which is of special interest in connection with the present invention. Being a part of the control unit 40 implies that the drag gesture control 42 can be implemented by means of hardware and/or software and it can comprise one or several hardware units and/or software modules, e.g. one or several separate processor units provided with or having access to the software and hardware appropriate for the functions required. The drag gesture control 42 is arranged to operatively control the touchscreen arrangement 20 so as to comprise and/or communicate with the touch function unit of the touchscreen arrangement 20 for sensing touches received and detected by the touch function unit 22 of the touchscreen 20. In particular, the drag gesture control 42 is arranged so as to operatively detect a drag gesture (e.g. a tap and drag gesture) received and detected by the touch function unit 22 such that the drag gesture is distinguished from other gestures such as e.g. a single tap and/or a double tap.
Before we proceed it should be added that the touch function unit 22 of the touchscreen 20 may comprise any of: a resistive, a capacitive, a surface-wave-acoustic (SAW) or an infrared (IR) technique or some other suitable touch sensing technique as is well known to those skilled in the art. In the exemplifying resistive, capacitive, SAW or IR techniques or similar the pressure exerted by a finger or a stylus on the touch sensing surface of the touch function unit 22 can be represented in a graph as discussed above with reference to Fig. 1 a-1d. For example, a finger or a stylus or similar applied with an increasing pressure on a resistive touch sensing arrangement will typically cause the detected signal to increase gradually, whereas a decreased pressure will typically cause the detected signal to decrease. The same is valid mutatis mutandis in case of a finger applied on a capacitive, SAW or IR touch sensing arrangement. A higher pressure causes a larger area of the finger to be applied on the touch sensing arrangement and the other way around for a lower pressure, which can be detected by the touch sensing arrangement. However, a stylus applied on a capacitive, SAW or IR touch sensing arrangement with a varying pressure may not cause the detected pressure to vary, since the area of the stylus applied on the touch sensing arrangement remains essentially the same. Rather, a constant pressure may be detected as long as the stylus remains applied on a capacitive, SAW or IR touch sensing arrangement, even if the stylus is applied with a varying pressure.
Function of Embodiments
The attention is now directed to an exemplifying method to be performed by the cell phone 10 described above and particularly by the drag gesture control 42 in the cell phone 10. The exemplifying method detects a tap and drag gesture such that the drag gesture is distinguished from other gestures, e.g. a single tap and/or a double tap. Below a finger gesture or similar is assumed. However, the same applies mutatis mutandis for a stylus gesture or similar.
First, an algorithm is needed to make a prediction of the most likely finger path, based on the most recent press-drag-release event stream. Second, a criterion for accepting or rejecting a new press-drag-release event stream must be established. Third, a method for "connecting" the first event stream with the second event stream is needed.
It is preferred that the method is performed by the drag gesture control 42 being arranged so as to logically implement the method between the touch screen driver software etc controlling the touch function unit 22 and the operating system level human interface input device handler etc controlling the display function unit 24 and the features displayed thereon. Here it is assumed that the touch function unit 22 provides X and Y coordinates to indicate the position of a finger or similar during a tap and drag on the touchscreen arrangement 20 plus a time stamp with each event, e.g. each new sampled position. It is also preferred that the touch function unit 22 provides Z values in a similar manner to indicate the level of pressure against the touch screen. Time, X, Y, and Z positions / values can all be used by the method. The touch function unit 22 may e.g. output more than 20, 40, 60, 80, 100, 120, 140, 160, 180, 200, 250, 300, 350 or more than 400 coordinates per second. Hence, we may obtain at least one of X and Y coordinate and possibly a Z value plus a sample time stamp with each such sampling event.
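As a rough illustration of the event stream described above, the samples delivered by the touch function unit could be modelled as follows. This is only a sketch; the class and field names are assumptions and not an interface defined by the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TouchSample:
    """One sampling event delivered by the touch function unit (assumed format)."""
    t: float                    # time stamp of the sample, in seconds
    x: float                    # X coordinate on the touchscreen
    y: float                    # Y coordinate on the touchscreen
    z: Optional[float] = None   # optional pressure value, if the hardware provides one

# A recorded touch gesture is then simply an ordered list of such samples,
# from which the time differences ΔT between adjacent samples can be derived.
Gesture = List[TouchSample]

def delta_ts(gesture: Gesture) -> List[float]:
    """Time differences between consecutive samples of a gesture."""
    return [b.t - a.t for a, b in zip(gesture, gesture[1:])]
```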
It is preferred that the method uses an Alpha-Beta filter or an Alpha-Beta-Gamma filter. However, a person skilled in the art having the benefit of this disclosure may contemplate other filters with inferior performance and/or being more complicated and thus more costly to develop.
Using an Alpha-Beta-Gamma filter, and assuming that we know the speed and acceleration of the finger or similar since the last x/y coordinate sample, we can predict the next coordinate (if we know the time difference ΔT between two events).
We use the well-known equations for motion:
Position (prediction)
x_predicted = x + (ΔT)vx + 1/2(ΔT)²ax   (1)
y_predicted = y + (ΔT)vy + 1/2(ΔT)²ay   (2)
z_predicted = z + (ΔT)vz + 1/2(ΔT)²az   (3)
Velocity (prediction)
vx_predicted = vx + (ΔT)ax   (4)
vy_predicted = vy + (ΔT)ay   (5)
vz_predicted = vz + (ΔT)az   (6)
Acceleration (prediction)
ax_predicted = ax   (7)
ay_predicted = ay   (8)
az_predicted = az   (9)
As can be seen from expressions 7-9 it is assumed that the acceleration is constant between two samples, which is particularly true when the sampling rate is high.
Now, it is preferred to use the latest raw x, y coordinates and possibly also the z coordinate measured by the touch function unit 22, even though these measures may be heavily influenced by noise. So we mix these measures in with the prediction to a fixed degree, which is determined by the Alpha (α), Beta (β) and Gamma (γ) factors, i.e.:
Change = Measured - Predicted   (10)
(i.e. a change is the offset of the measure from the prediction)
Expression (10) written for position, speed and acceleration:
"new position" = "position prediction" + α "position change"   (11)
"new speed" = "speed prediction" + β "speed change"   (12)
"new acceleration" = "acceleration prediction" + γ "acceleration change"   (13)
So the necessary calculations are:
Change (offset from prediction)
Δx = x_measured - (x + (ΔT)vx + 1/2(ΔT)²ax)   (14)
(i.e. Δx = x_measured - x_predicted)
Δy = y_measured - (y + (ΔT)vy + 1/2(ΔT)²ay)   (15)
Δz = z_measured - (z + (ΔT)vz + 1/2(ΔT)²az)   (16)
Δvx = Δx / ΔT   (17)
Δvy = Δy / ΔT   (18)
Δvz = Δz / ΔT   (19)
Δax = Δvx / ΔT   (20)
Δay = Δvy / ΔT   (21)
Δaz = Δvz / ΔT   (22)
x/y/z positions updated (filtered), using expressions 1-3, 11 and 14-16:
x_new = x + (ΔT)vx + 1/2(ΔT)²ax + αΔx = x_measured - Δx + αΔx, which gives:
x_new = x_measured - Δx(1 - α)   (23)
y_new = y_measured - Δy(1 - α)   (24)
z_new = z_measured - Δz(1 - α)   (25)
x/y/z speeds updated (filtered), using expressions 4-6, 12 and 17-19:
vx_new = vx + (ΔT)ax + βΔvx   (26)
vy_new = vy + (ΔT)ay + βΔvy   (27)
vz_new = vz + (ΔT)az + βΔvz   (28)
x/y/z accelerations updated (filtered), using expressions 7-9, 13 and 20-22:
ax_new = ax + γΔax   (29)
ay_new = ay + γΔay   (30)
az_new = az + γΔaz   (31)
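For readability, the per-axis arithmetic of expressions 1-31 is collected below in a small Python sketch. It is one possible rendering of the equations above, not an implementation prescribed by the patent; the state layout, the function names and the example values of α, β and γ are assumptions. The same update is applied independently to the x, y and, where available, z axes.

```python
from dataclasses import dataclass

@dataclass
class AxisState:
    """Filter state for one axis (the same code is applied to x, y and z)."""
    pos: float
    vel: float = 0.0
    acc: float = 0.0

def predict(s: AxisState, dt: float) -> AxisState:
    """Expressions 1-9: predict position, velocity and acceleration after dt."""
    return AxisState(
        pos=s.pos + dt * s.vel + 0.5 * dt * dt * s.acc,
        vel=s.vel + dt * s.acc,
        acc=s.acc,                      # acceleration assumed constant between samples
    )

def update(s: AxisState, measured: float, dt: float,
           alpha: float, beta: float, gamma: float) -> AxisState:
    """Expressions 10-31: blend the raw measurement into the prediction."""
    p = predict(s, dt)
    d_pos = measured - p.pos            # expressions 14-16
    d_vel = d_pos / dt                  # expressions 17-19
    d_acc = d_vel / dt                  # expressions 20-22
    return AxisState(
        pos=p.pos + alpha * d_pos,      # algebraically equivalent to expressions 23-25
        vel=p.vel + beta * d_vel,       # expressions 26-28
        acc=p.acc + gamma * d_acc,      # expressions 29-31
    )

# Example: filter a short, noisy 1-D sample stream (illustrative values only).
state = AxisState(pos=10.0)
for raw in (10.9, 12.1, 12.8, 14.2):
    state = update(state, raw, dt=0.01, alpha=0.85, beta=0.005, gamma=0.0001)
    print(round(state.pos, 2), round(state.vel, 1))
```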
The attention is now directed to exemplifying steps of a method according to an embodiment of the invention for detecting a drag gesture based on the predictions as described above. In this example it is assumed that the method is performed by the cell phone 10 as described above and particularly by the drag gesture control 42 in the cell phone 10. The steps of the exemplifying method will be described with reference to the exemplifying flowchart in Fig. 4. The steps of the method are preferably implemented by means of the drag gesture control 42 as schematically illustrated in Fig. 3.
In a first step S1 of an exemplifying method according to an embodiment of the present invention an initialization of the cell phone 10 and particularly the drag gesture control 42 is performed. The initialisation may e.g. include such actions as activating the touch function unit 22, e.g. activating a resistive, capacitive, surface-wave-acoustic (SAW) or infrared touch sensing arrangement or some other suitable touch sensing technique comprised by the touch function unit 22. The initialisation may also include such actions as initialising the display function unit 24 and preferably the other necessary functions and/or units of the cell phone 10.
In a second step S2 of the exemplifying method it is preferred that a first touch gesture G1 is recorded at a first press event by the touch function unit 22 and made available to the drag gesture control 42. It is assumed that the recording of the first touch gesture G1 comprises a number of consecutive samples of X, Y, and possibly Z positions / values and a number of time differences ΔT each marking the time difference between two consecutive and adjacent samples.
Here, it may be clarified that in a tap and drag gesture such as that illustrated in Fig. 1 c there will normally be movements that can be detected and sampled during the whole sequence including but not limited to Δd1 and Δd2 as schematically indicated in Fig. 1 c.
However, there may be an issue with the first sample in the consecutive number of samples constituting the first gesture G1. For the first sample we prefer to use the first raw X, Y, and possibly Z positions / values as they are, or alternatively only as starting values, to predict and filter the next X, Y, and possibly Z positions / values (i.e. throw away the first sample). Speed and acceleration may have to be assumed to be zero for the first sample.
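A hedged sketch of this first-sample convention is given below, reusing the AxisState type from the filter sketch above and the assumed TouchSample fields from earlier; the helper name is invented for the example.

```python
# Sketch of the first-sample convention described above (assumed helper name):
# the raw coordinates of the very first sample seed the filter state, and the
# speed and acceleration are simply taken to be zero until more samples arrive.

def seed_states(first_sample):
    """Build per-axis filter states from the first raw sample of a gesture."""
    states = {
        "x": AxisState(pos=first_sample.x),   # vel and acc default to 0.0
        "y": AxisState(pos=first_sample.y),
    }
    if first_sample.z is not None:            # pressure axis only if provided
        states["z"] = AxisState(pos=first_sample.z)
    return states
```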
In a third step S3 of the exemplifying method it is preferred that a predicted gesture Gp is obtained based on the samples in the first gesture G1 and expressions 1-9 given above. The prediction is based on the last X, Y, and possibly Z positions / values and the time difference ΔT between two events (e.g. between two samples). Here, ΔT may be given by the sampling rate / sampling interval.
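Expressions 1-9 are given earlier in the description; they amount to the usual constant-acceleration prediction over one sampling interval. The per-axis sketch below is a plausible reading of that prediction step, not a verbatim transcription of expressions 1-9.

```python
def predict_axis(pos: float, vel: float, acc: float, dt: float):
    """Predict position, speed and acceleration one interval ΔT ahead (one axis)."""
    pos_p = pos + dt * vel + 0.5 * dt * dt * acc   # predicted position
    vel_p = vel + dt * acc                         # predicted speed
    acc_p = acc                                    # acceleration assumed constant over ΔT
    return pos_p, vel_p, acc_p
```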
In a fourth step S4 of the exemplifying method it is preferred that a second touch gesture G2 is recorded at a second press event by the touch function unit 22 and made available to the drag gesture control 42. It is assumed that the recording of the second touch gesture G2 comprises a number of consecutive samples of X, Y, and possibly Z positions / values, and a number of time differences ΔT, each marking the time difference between two consecutive samples. In a fifth step S5 of the exemplifying method it is preferably determined whether the first gesture G1 should be considered as a part of the same gesture as the second gesture G2, i.e. it is determined whether the two gestures G1, G2 form a single gesture.
Here, it is preferred that the prediction Gp - obtained in the third step S3 by expressions 1-9 operating on the samples forming the first gesture G1 as described above - is compared to the first sample of X, Y, and possibly Z positions / values of the second gesture G2. If the first sample of X, Y and possibly Z of the second gesture G2 is measured very close to the prediction, and preferably at the same time as the time ΔTG between the first gesture G1 and the first sample of the second gesture G2 is short, then we can assume that G1 and G2 are parts of the same gesture. For example, G1 and G2 may be considered parts of the same gesture when the difference (Δx, Δy) between said predicted gesture position xp, yp and said second gesture position x2, y2 is less than 0,1 millimetres, 0,5 millimetres or less than 1 millimetre. Similarly, the short time ΔTG between the first gesture G1 and the first sample of the second gesture G2 may e.g. be less than 0,1 seconds, 0,5 seconds or less than 1 second.
It is even more preferred that G1 and G2 are assumed to be parts of the same gesture if Δvx and Δvy, as defined by expressions 17 and 18 respectively, are small, preferably at the same time as the time ΔTG lapsed between the events is short (i.e. ΔTG is small as indicated above). For example, G1 and G2 may be considered parts of the same gesture when the difference (Δvx, Δvy) between said predicted gesture velocity vp and said second gesture velocity v2 is less than 0,1 millimetres per second, 0,5 millimetres per second or less than 1 millimetre per second.
We use Δvx and Δvy as defined in expressions 17, 18 rather than Δx and Δy as defined in expressions 14, 15, since the acceptable difference should preferably depend on the duration of a "dropout", i.e. be determined by the duration between the last sample of G1 and the first sample of G2, or a similar or corresponding time difference such as e.g. ΔTG. Hence, it is more relevant to look at discrepancies in speed between the prediction and the actual measurement.
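Putting the criteria of step S5 together, a simple test could look as follows. The threshold values are only the example figures mentioned above, and the function name and argument list are assumptions made for this sketch.

```python
def probably_same_gesture(d_x: float, d_y: float, d_vx: float, d_vy: float, gap: float,
                          max_pos: float = 0.5,    # millimetres (example value from the text)
                          max_speed: float = 0.5,  # millimetres per second (example value)
                          max_gap: float = 0.5):   # seconds (example value)
    """Step S5: is the first sample of G2 close enough to the prediction Gp?

    d_x, d_y   - position offsets between Gp and the first sample of G2 (expr. 14, 15)
    d_vx, d_vy - the corresponding speed offsets (expressions 17, 18)
    gap        - the time ΔTG between the last sample of G1 and the first of G2
    """
    close_position = abs(d_x) < max_pos and abs(d_y) < max_pos
    close_speed = abs(d_vx) < max_speed and abs(d_vy) < max_speed  # preferred criterion
    return gap < max_gap and close_speed and close_position
```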
If it can be assumed that the gestures G1, G2 are not parts of the same gesture it is preferred that the method returns to the second step S2 described above. However, if it can be assumed that the gestures G1, G2 form parts of the same gesture it is preferred that the method proceeds to the sixth step S6.
In a sixth step S6 of the exemplifying method it is preferably determined whether the gestures G1, G2 that are assumed to form parts of the same gesture should actually be converted into a single contiguous gesture. In other words, under the assumption that the two gestures G1, G2 are parts of a single gesture, it is determined whether the gestures G1, G2 should actually form a contiguous gesture.
It is preferred that the first gesture G1 and the second gesture G2 are converted into a single contiguous gesture if:
Δvx < kvx    (32)
Δvy < kvy    (33)
ΔTG < kt    (34)
wherein kvx, kvy and kt are constants.
Note that while it is reasonable to hope for a good vx/vy prediction during a dropout, the same is not true for the optional Z-values. If there indeed was a dropout, Z should have decreased up until the dropout and then increased again after the dropout. We must therefore look for low Z values (low pressure) before and after the dropout. In addition, Z-speed and Z-acceleration may be considered: acceleration should be moderate and speed should be negative before the dropout (provided that more pressure means a higher Z value).
If Z-values are available, we also need these conditions to be true just before the dropout:
z < kz    (35)
vz < 0    (36)
abs(vz) < kvz    (37)
abs(az) < kaz    (38)
wherein kz, kvz and kaz are constants. If it can be assumed that the gestures G1, G2 should not form a contiguous gesture it is preferred that the method returns to the second step S2 described above. However, if it can be assumed that the gestures G1, G2 should form a contiguous gesture it is preferred that the method proceeds to the seventh step S7.
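A sketch of the step S6 decision combining expressions 32-34 with the optional Z conditions 35-38. The constants kvx, kvy, kt, kz, kvz and kaz are only stated to be constants in the text, so the default values below are placeholders, and taking the absolute value of the speed offsets is an assumption about the intended reading of expressions 32-33.

```python
from typing import Optional

def should_be_contiguous(d_vx: float, d_vy: float, gap: float,
                         z: Optional[float] = None,
                         vz: Optional[float] = None,
                         az: Optional[float] = None,
                         k_vx: float = 0.5, k_vy: float = 0.5, k_t: float = 0.5,
                         k_z: float = 10.0, k_vz: float = 50.0, k_az: float = 200.0) -> bool:
    """Step S6: should G1 and G2 be converted into a single contiguous gesture?"""
    ok = abs(d_vx) < k_vx and abs(d_vy) < k_vy and gap < k_t  # expressions 32-34
    if z is not None and vz is not None and az is not None:   # optional Z conditions
        ok = ok and z < k_z          # expression 35: low pressure just before the dropout
        ok = ok and vz < 0           # expression 36: Z was decreasing
        ok = ok and abs(vz) < k_vz   # expression 37: moderate Z speed
        ok = ok and abs(az) < k_az   # expression 38: moderate Z acceleration
    return ok
```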
In a seventh step S7 of the exemplifying method it is preferred that the first gesture G1 and the second gesture G2 are converted to a single contiguous gesture. This is preferably accomplished by means of expressions 23-31 above. In other words, the first sample of X, Y, and possibly Z positions / values of the second gesture G2 is preferably replaced by new filtered positions / values, preferably calculated as indicated by expressions 23-31 above, i.e. calculated depending on the first sample of X, Y, and possibly Z positions / values in the second gesture G2, adjusted with a weighted difference between the first sample of X, Y, and possibly Z positions / values in the second gesture G2 and the prediction of these positions / values.
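For the positions this replacement reduces, per expressions 23-24, to blending the measured first sample of G2 towards the prediction. The sketch below illustrates that blend for X and Y; the weight value is illustrative, and the speed and acceleration components would be updated analogously via expressions 26-31.

```python
def blend_first_sample(pred_x: float, pred_y: float,
                       meas_x: float, meas_y: float,
                       alpha: float = 0.85):  # illustrative weight, 0 < alpha < 1
    """Step S7 (positions only): replace the raw first sample of G2 with a
    filtered value weighted between prediction and measurement (expr. 23-24)."""
    new_x = pred_x + alpha * (meas_x - pred_x)  # = meas_x + (alpha - 1) * (meas_x - pred_x)
    new_y = pred_y + alpha * (meas_y - pred_y)
    return new_x, new_y
```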
In general, as previously explained, it is preferred that the drag control unit 46 is adapted to perform the exemplifying method described above by being provided with one or more processors and corresponding memory containing the appropriate software in the form of program code or similar. However, the program code or similar can also be provided on a data carrier such as a CD-ROM disc 56 as depicted in Fig. 5 or an insertable memory stick, which code will perform the invention when loaded into a computer or into a phone having suitable processing capabilities. The program code can also be downloaded remotely from a server either outside or inside the cellular network, or be downloaded via a computer such as a PC to which the phone is temporarily connected.
The present invention has now been described with reference to exemplifying embodiments. However, the invention is not limited to the embodiments described herein. On the contrary, the full extent of the invention is only determined by the scope of the appended claims.

Claims

1. A method for recognizing at least one drag gesture detected by a touch sensitive arrangement (20, 22, 24) of a portable device (10), which method in the portable device (10) comprises the steps of:
- recording and obtaining at least one first parameter indicative of a first touch gesture (G1) made on the touch sensitive arrangement (20),
- obtaining at least one predicted parameter indicative of a predicted next gesture (Gp) based on said first parameter,
- recording and obtaining at least one second parameter indicative of a second touch gesture (G2) made on the touch sensitive arrangement (20),
- converting the first touch gesture (G1) and the second touch gesture (G2) to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
2. The method according to claim 1, wherein the first parameter comprises at least one first gesture position (x1, y1), the predicted parameter comprises at least one predicted gesture position (xp, yp), and the second parameter comprises at least one second gesture position (x2, y2), and the first touch gesture (G1) and the second touch gesture (G2) are converted to a contiguous gesture when the difference (Δx, Δy) between said predicted gesture position (xp, yp) and said second gesture position (x2, y2) is less than a predetermined value.
3. The method according to any one of claims 1 or 2, wherein the first parameter comprises a first gesture velocity (v1x, v1y), the predicted parameter comprises a predicted gesture velocity (vpx, vpy), and the second parameter comprises a second gesture velocity (v2x, v2y), and the first touch gesture (G1) and the second touch gesture (G2) are converted to a contiguous gesture when the difference (Δvx, Δvy) between said predicted gesture velocity (vpx, vpy) and said second gesture velocity (v2x, v2y) is less than a predetermined value.
4. The method according to claim 3, wherein the first touch gesture (G1) and the second touch gesture (G2) are converted to a contiguous gesture when the predicted gesture velocity (vpx, vpy) and said second gesture velocity (v2x, v2y) are substantially equal.
5. The method according to any one of claims 1, 2, 3 or 4, wherein the first parameter comprises a first gesture time stamp (t1) and the second parameter comprises a second gesture time stamp (t2), and the first touch gesture (G1) and the second touch gesture (G2) are converted to a contiguous gesture when the difference (ΔTG) between said first gesture time stamp (t1) and said second gesture time stamp (t2) is less than a predetermined value.
6. The method according to any one of claims 2, 3, 4 or 5, which method comprises the steps of: converting the first touch gesture (G1) and the second touch gesture (G2) to a contiguous gesture by using a filtered version of at least one second parameter indicative of the second touch gesture (G2).
7. The method according to claim 6, which method comprises the steps of: converting the first touch gesture (G1) and the second touch gesture (G2) to a contiguous gesture by using an Alpha-Beta filter for filtering said at least one second parameter indicative of the second touch gesture (G2).
8. A portable device (10) comprising a touch sensing arrangement (20, 22, 24) and being configured to perform the method according to any one of claims 1-7.
9. A computer program product stored on a computer usable medium (56), comprising readable program means for causing a portable device (10), comprising a touch sensing arrangement (20, 22, 24) configured to recognize at least one drag gesture detected by a touch sensitive arrangement (20, 22, 24), to execute, when said program means is loaded in the portable device (10), the steps of:
- recording and obtaining at least one first parameter indicative of a first touch gesture (G1) made on the touch sensitive arrangement (20),
- obtaining at least one predicted parameter indicative of a predicted next gesture (Gp) based on said first parameter,
- recording and obtaining at least one second parameter indicative of a second touch gesture (G2) made on the touch sensitive arrangement (20),
- converting the first touch gesture (G1) and the second touch gesture (G2) to a contiguous gesture when the difference between said predicted parameter and said second parameter is below a predetermined threshold.
PCT/EP2008/058164 2008-06-26 2008-06-26 Gesture on touch sensitive arrangement WO2009155981A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2008/058164 WO2009155981A1 (en) 2008-06-26 2008-06-26 Gesture on touch sensitive arrangement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2008/058164 WO2009155981A1 (en) 2008-06-26 2008-06-26 Gesture on touch sensitive arrangement

Publications (1)

Publication Number Publication Date
WO2009155981A1 true WO2009155981A1 (en) 2009-12-30

Family

ID=40382000

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/058164 WO2009155981A1 (en) 2008-06-26 2008-06-26 Gesture on touch sensitive arrangement

Country Status (1)

Country Link
WO (1) WO2009155981A1 (en)

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012001208A1 (en) * 2010-06-28 2012-01-05 Nokia Corporation Haptic surface compression
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
EP0870223B1 (en) * 1994-10-07 2005-08-24 Synaptics Incorporated Method for compensating for unintended lateral motion in a tap gesture made on a touch-sensor pad

Cited By (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102971689A (en) * 2010-06-28 2013-03-13 诺基亚公司 Haptic surface compression
WO2012001208A1 (en) * 2010-06-28 2012-01-05 Nokia Corporation Haptic surface compression
US10664097B1 (en) 2011-08-05 2020-05-26 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10275087B1 (en) 2011-08-05 2019-04-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10338736B1 (en) 2011-08-05 2019-07-02 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10345961B1 (en) 2011-08-05 2019-07-09 P4tents1, LLC Devices and methods for navigating between user interfaces
US10365758B1 (en) 2011-08-05 2019-07-30 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10386960B1 (en) 2011-08-05 2019-08-20 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10540039B1 (en) 2011-08-05 2020-01-21 P4tents1, LLC Devices and methods for navigating between user interface
US10649571B1 (en) 2011-08-05 2020-05-12 P4tents1, LLC Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10656752B1 (en) 2011-08-05 2020-05-19 P4tents1, LLC Gesture-equipped touch screen system, method, and computer program product
US9753639B2 (en) 2012-05-09 2017-09-05 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US9996231B2 (en) 2012-05-09 2018-06-12 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10969945B2 (en) 2012-05-09 2021-04-06 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US9823839B2 (en) 2012-05-09 2017-11-21 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10942570B2 (en) 2012-05-09 2021-03-09 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US10908808B2 (en) 2012-05-09 2021-02-02 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US10884591B2 (en) 2012-05-09 2021-01-05 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects
US10782871B2 (en) 2012-05-09 2020-09-22 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US9886184B2 (en) 2012-05-09 2018-02-06 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10775999B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10775994B2 (en) 2012-05-09 2020-09-15 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US10175757B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for touch-based operations performed and reversed in a user interface
US11010027B2 (en) 2012-05-09 2021-05-18 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US9971499B2 (en) 2012-05-09 2018-05-15 Apple Inc. Device, method, and graphical user interface for displaying content associated with a corresponding affordance
US11023116B2 (en) 2012-05-09 2021-06-01 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US9990121B2 (en) 2012-05-09 2018-06-05 Apple Inc. Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input
US11068153B2 (en) 2012-05-09 2021-07-20 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10996788B2 (en) 2012-05-09 2021-05-04 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10592041B2 (en) 2012-05-09 2020-03-17 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10042542B2 (en) 2012-05-09 2018-08-07 Apple Inc. Device, method, and graphical user interface for moving and dropping a user interface object
US11221675B2 (en) 2012-05-09 2022-01-11 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US10496260B2 (en) 2012-05-09 2019-12-03 Apple Inc. Device, method, and graphical user interface for pressure-based alteration of controls in a user interface
US10481690B2 (en) 2012-05-09 2019-11-19 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for media adjustment operations performed in a user interface
US10073615B2 (en) 2012-05-09 2018-09-11 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US11314407B2 (en) 2012-05-09 2022-04-26 Apple Inc. Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US10095391B2 (en) 2012-05-09 2018-10-09 Apple Inc. Device, method, and graphical user interface for selecting user interface objects
US11354033B2 (en) 2012-05-09 2022-06-07 Apple Inc. Device, method, and graphical user interface for managing icons in a user interface region
US9619076B2 (en) 2012-05-09 2017-04-11 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10114546B2 (en) 2012-05-09 2018-10-30 Apple Inc. Device, method, and graphical user interface for displaying user interface objects corresponding to an application
US10126930B2 (en) 2012-05-09 2018-11-13 Apple Inc. Device, method, and graphical user interface for scrolling nested regions
US9612741B2 (en) 2012-05-09 2017-04-04 Apple Inc. Device, method, and graphical user interface for displaying additional information in response to a user contact
US11947724B2 (en) 2012-05-09 2024-04-02 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US10168826B2 (en) 2012-05-09 2019-01-01 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
US10191627B2 (en) 2012-05-09 2019-01-29 Apple Inc. Device, method, and graphical user interface for manipulating framed graphical objects
US10175864B2 (en) 2012-05-09 2019-01-08 Apple Inc. Device, method, and graphical user interface for selecting object within a group of objects in accordance with contact intensity
US9996233B2 (en) 2012-12-29 2018-06-12 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10101887B2 (en) 2012-12-29 2018-10-16 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US10185491B2 (en) 2012-12-29 2019-01-22 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or enlarge content
US10175879B2 (en) 2012-12-29 2019-01-08 Apple Inc. Device, method, and graphical user interface for zooming a user interface while performing a drag operation
US9778771B2 (en) 2012-12-29 2017-10-03 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US10915243B2 (en) 2012-12-29 2021-02-09 Apple Inc. Device, method, and graphical user interface for adjusting content selection
US9857897B2 (en) 2012-12-29 2018-01-02 Apple Inc. Device and method for assigning respective portions of an aggregate intensity to a plurality of contacts
US9959025B2 (en) 2012-12-29 2018-05-01 Apple Inc. Device, method, and graphical user interface for navigating user interface hierarchies
US9965074B2 (en) 2012-12-29 2018-05-08 Apple Inc. Device, method, and graphical user interface for transitioning between touch input to display output relationships
US10620781B2 (en) 2012-12-29 2020-04-14 Apple Inc. Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics
US10037138B2 (en) 2012-12-29 2018-07-31 Apple Inc. Device, method, and graphical user interface for switching between user interfaces
US10437333B2 (en) 2012-12-29 2019-10-08 Apple Inc. Device, method, and graphical user interface for forgoing generation of tactile output for a multi-contact gesture
US10078442B2 (en) 2012-12-29 2018-09-18 Apple Inc. Device, method, and graphical user interface for determining whether to scroll or select content based on an intensity theshold
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10613634B2 (en) 2015-03-08 2020-04-07 Apple Inc. Devices and methods for controlling media presentation
US10338772B2 (en) 2015-03-08 2019-07-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10180772B2 (en) 2015-03-08 2019-01-15 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10387029B2 (en) 2015-03-08 2019-08-20 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US11112957B2 (en) 2015-03-08 2021-09-07 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US10402073B2 (en) 2015-03-08 2019-09-03 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645709B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10268341B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10860177B2 (en) 2015-03-08 2020-12-08 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9990107B2 (en) 2015-03-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US10067645B2 (en) 2015-03-08 2018-09-04 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10048757B2 (en) 2015-03-08 2018-08-14 Apple Inc. Devices and methods for controlling media presentation
US10268342B2 (en) 2015-03-08 2019-04-23 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11054990B2 (en) 2015-03-19 2021-07-06 Apple Inc. Touch input cursor manipulation
US9785305B2 (en) 2015-03-19 2017-10-10 Apple Inc. Touch input cursor manipulation
US11550471B2 (en) 2015-03-19 2023-01-10 Apple Inc. Touch input cursor manipulation
US9639184B2 (en) 2015-03-19 2017-05-02 Apple Inc. Touch input cursor manipulation
US10222980B2 (en) 2015-03-19 2019-03-05 Apple Inc. Touch input cursor manipulation
US10599331B2 (en) 2015-03-19 2020-03-24 Apple Inc. Touch input cursor manipulation
US10067653B2 (en) 2015-04-01 2018-09-04 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10152208B2 (en) 2015-04-01 2018-12-11 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10303354B2 (en) 2015-06-07 2019-05-28 Apple Inc. Devices and methods for navigating between user interfaces
US9674426B2 (en) 2015-06-07 2017-06-06 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US9602729B2 (en) 2015-06-07 2017-03-21 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10841484B2 (en) 2015-06-07 2020-11-17 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10455146B2 (en) 2015-06-07 2019-10-22 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11835985B2 (en) 2015-06-07 2023-12-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11681429B2 (en) 2015-06-07 2023-06-20 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
US9830048B2 (en) 2015-06-07 2017-11-28 Apple Inc. Devices and methods for processing touch inputs with instructions in a web page
US11240424B2 (en) 2015-06-07 2022-02-01 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US11231831B2 (en) 2015-06-07 2022-01-25 Apple Inc. Devices and methods for content preview based on touch input intensity
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9706127B2 (en) 2015-06-07 2017-07-11 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9916080B2 (en) 2015-06-07 2018-03-13 Apple Inc. Devices and methods for navigating between user interfaces
US10705718B2 (en) 2015-06-07 2020-07-07 Apple Inc. Devices and methods for navigating between user interfaces
US10235035B2 (en) 2015-08-10 2019-03-19 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US11182017B2 (en) 2015-08-10 2021-11-23 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US10698598B2 (en) 2015-08-10 2020-06-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10754542B2 (en) 2015-08-10 2020-08-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10963158B2 (en) 2015-08-10 2021-03-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10162452B2 (en) 2015-08-10 2018-12-25 Apple Inc. Devices and methods for processing touch inputs based on their intensities
US11327648B2 (en) 2015-08-10 2022-05-10 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10209884B2 (en) 2015-08-10 2019-02-19 Apple Inc. Devices, Methods, and Graphical User Interfaces for Manipulating User Interface Objects with Visual and/or Haptic Feedback
US10248308B2 (en) 2015-08-10 2019-04-02 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures
US10203868B2 (en) 2015-08-10 2019-02-12 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US11740785B2 (en) 2015-08-10 2023-08-29 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US10884608B2 (en) 2015-08-10 2021-01-05 Apple Inc. Devices, methods, and graphical user interfaces for content navigation and manipulation
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback

Similar Documents

Publication Publication Date Title
WO2009155981A1 (en) Gesture on touch sensitive arrangement
CN103262008B (en) Intelligent wireless mouse
US8786547B2 (en) Effects of gravity on gestures
US20160162064A1 (en) Method for actuating a tactile interface layer
US20090066659A1 (en) Computer system with touch screen and separate display screen
US20100259499A1 (en) Method and device for recognizing a dual point user input on a touch based user input device
US20110187652A1 (en) Bump suppression
WO2018194719A1 (en) Electronic device response to force-sensitive interface
CN103210366A (en) Apparatus and method for proximity based input
KR20150101213A (en) Electronic device, wearable device and method for the input of the electronic device
WO2006020305A2 (en) Gestures for touch sensitive input devices
EP2188702A2 (en) Method and apparatus for manipulating a displayed image
CN110069178B (en) Interface control method and terminal equipment
CN108595044B (en) Control method of touch screen and terminal
EP3612917A1 (en) Force-sensitive user input interface for an electronic device
US8810529B2 (en) Electronic device and method of controlling same
CN110703972B (en) File control method and electronic equipment
CN107463290A (en) Response control mehtod, device, storage medium and the mobile terminal of touch operation
CN110795189A (en) Application starting method and electronic equipment
CN111273993A (en) Icon sorting method and electronic equipment
KR20080105724A (en) Communication terminal having touch panel and method for calculating touch coordinates thereof
CN103809894B (en) A kind of recognition methods of gesture and electronic equipment
EP3528103B1 (en) Screen locking method, terminal and screen locking device
CN108427534B (en) Method and device for controlling screen to return to desktop
US10733280B2 (en) Control of a mobile device based on fingerprint identification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08761387

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE