US20150103013A9 - Electronic Device and Method Using a Touch-Detecting Surface - Google Patents

Electronic Device and Method Using a Touch-Detecting Surface

Info

Publication number
US20150103013A9
Authority
US
United States
Prior art keywords
touch
touch gesture
electronic device
parameter
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/647,427
Other versions
US20130093705A1 (en)
US9141195B2 (en)
Inventor
Meng Huang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google Technology Holdings LLC
Original Assignee
Motorola Mobility LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to PCT/CN2010/072127 (published as WO2011130919A1)
Application filed by Motorola Mobility LLC
Assigned to Motorola Mobility LLC (assignor: Meng Huang)
Publication of US20130093705A1
Assigned to Google Technology Holdings LLC (assignor: Motorola Mobility LLC)
Publication of US20150103013A9
Publication of US9141195B2
Application granted
Status: Active; adjusted expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for entering handwritten data, e.g. gestures, text

Abstract

An electronic device includes a processor and one or more touch-detecting surfaces for detecting a gesture sequence including a first touch gesture and a subsequent second touch gesture. A related method includes detecting a first touch gesture applied to the touch-detecting surface; and initiating a sequential touch mode if the first touch gesture has a predetermined characteristic. When in the sequential touch mode, the method detects a second touch gesture that is subsequently applied to the touch-detecting surface, wherein the second touch gesture includes a glide movement. The method determines a first parameter and a second parameter associated with the second touch gesture, identifies a corresponding device function in accordance with the determined first parameter, and controls the execution of the identified device function in accordance with the determined second parameter.

Description

    FIELD OF THE INVENTION
  • The present invention relates to sequential touch gestures for simulating simultaneous touch gestures using a touch-detecting surface of an electronic device.
  • BACKGROUND
  • Electronic devices such as mobile phones, smart phones, and other handheld or portable electronic devices such as personal digital assistants (PDAs), audio players, headsets, etc. have become popular and ubiquitous. As more and more features have been added to such devices, there has been an increasing desire to equip them with input/output mechanisms that accommodate numerous user commands and/or react to numerous user behaviors. For example, many mobile devices are now equipped not only with various buttons and/or keypads, but also with touch-detecting surfaces such as touch screens or touch pads by which a user, simply by touching the surface of the mobile device and/or moving the user's finger along the surface of the mobile device, is able to communicate to the mobile device a variety of instructions.
  • A so-called multi-touch touch-detecting surface allows for the detection of multiple touches that occur simultaneously, while a single-touch touch-detecting surface only allows for the detection of a single touch at a time. Multi-touch surfaces are advantageous in that various gesture combinations performed using two or more simultaneous touches (such as using two fingers) are detectable, so that a richer and more varied set of device functions (such as scaling and translation) can be controlled in a straightforward manner. For example, two fingers moving apart on a multi-touch touch-detecting surface can be used to zoom out on an associated map or document or photograph, while two fingers moving together can be used to zoom in. Also, two fingers moving together across a multi-touch touch-detecting surface can be used to translate an item at twice the speed compared to moving just one finger across the touch-detecting surface. However, multi-touch touch-detecting surfaces are in general more expensive and complex than single-touch touch-detecting surfaces, so that single-touch surfaces are advantageous from a cost and simplicity standpoint. Further, in certain circumstances, it can be difficult for a user to interact with an electronic device using simultaneous touches, such as when a user has only one finger (e.g., a thumb) available for touch interaction or otherwise has limited finger mobility due to, for example, injury or arthritis.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a front view of an exemplary electronic device that employs a single-touch touch-detecting surface, wherein the electronic device is capable of emulating a multi-touch touch-detecting surface in accordance with one embodiment;
  • FIG. 2 is a block diagram showing exemplary components of the electronic device of FIG. 1;
  • FIGS. 3A, 3B, 4A, 4B, 5A, and 5B show in schematic form several exemplary gesture sequences in accordance with some embodiments; and
  • FIG. 6 is a flowchart showing exemplary steps of a method that can be performed by the electronic device of FIGS. 1 and 2.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • An electronic device with a processor and one or more touch-detecting surfaces detects sequential touches, identifies a corresponding gesture sequence, and operates to control the execution of various device functions such as a scaling function (zooming in and out), a translation function (scrolling and panning), and a rotation function, which are associated with an item displayed on a display screen of the electronic device. A corresponding method controls the execution of such electronic device functions. The functionality of an electronic device having a multi-touch touch-detecting surface and capable of detecting multiple simultaneous touches can be emulated using sequential touches on a touch-detecting surface, providing a simpler user experience because simultaneous touches are not required. The touch-detecting surface can take the form of a single-touch touch-detecting surface, thereby providing simplicity and saving cost as compared to the use of a multi-touch touch-detecting surface. The touch-detecting surface can alternately be a multi-touch touch-detecting surface that receives single-touch user interactions. The touch-detecting surfaces can be any of a variety of known touch-detecting technologies such as a resistive technology, a capacitive technology, an optical technology, or others.
  • Referring to FIG. 1, an exemplary electronic device 102 is shown as a mobile smart phone, which can include various functions such as email, messaging, and internet browsing functions, as well as various other functions. In other embodiments, the device can be one of a variety of other electronic devices such as a personal digital assistant, an audio and/or video player, a headset, a navigation device, a notebook, laptop or other computing device, or any other device that can utilize or benefit from a touch-detecting surface. The touch-detecting surface can be either a touch screen or a touch pad. As illustrated, device 102 includes a touch-detecting surface 104 which can be a light permeable panel or other technology which overlaps a display screen 106 to create a touch screen on all or a portion of the display screen 106. A touch screen is advantageous because an item being manipulated can be displayed directly underlying the touch-detecting surface on which controlling touch gestures are applied. The electronic device 102 can also include a keypad 108 having numerous keys for inputting various user commands for operation of the device.
  • Referring to FIG. 2, a block diagram 200 illustrates exemplary internal components of the mobile smart phone implementation of the electronic device 102. These components can include wireless transceivers 202, a processor 204 (e.g., a microprocessor, microcomputer, application-specific integrated circuit, or the like), memory 206, one or more output components 208, one or more input components 210, and one or more sensors 228. The device can also include a component interface 212 to provide a direct connection to auxiliary components or accessories for additional or enhanced functionality, and a power supply 214, such as a battery, for providing power to the other internal components. All of the internal components can be coupled to one another, and in communication with one another, by way of one or more internal communication links 232 such as an internal bus.
  • More specifically, the wireless transceivers 202 can include both cellular transceivers 203 and a wireless local area network (WLAN) transceiver 205. Each of the wireless transceivers 202 utilizes a wireless technology for communication, such as cellular-based communication technologies including analog communications (using AMPS), digital communications (using CDMA, TDMA, GSM, iDEN, GPRS, EDGE, etc.), and next generation communications (using UMTS, WCDMA, LTE, IEEE 802.16, etc.) or variants thereof, or peer-to-peer or ad hoc communication technologies such as HomeRF, Bluetooth and IEEE 802.11 (a, b, g or n), or other wireless communication technologies.
  • The memory 206 can encompass one or more memory devices of any of a variety of forms (e.g., read-only memory, random access memory, static random access memory, dynamic random access memory, etc.), and can be used by the processor 204 to store and retrieve data. The data that is stored by the memory portion 206 can include operating systems, applications, and informational data. Each operating system includes executable code that controls basic functions of the communication device, such as interaction among the various internal components, communication with external devices via the wireless transceivers 202 and/or the component interface 212, and storage and retrieval of applications and data to and from the memory portion 206. Each application includes executable code that utilizes an operating system to provide more specific functionality for the communication devices, such as file system service and handling of protected and unprotected data stored in the memory portion 206. Informational data is non-executable code or information that can be referenced and/or manipulated by an operating system or application for performing functions of the communication device.
  • Exemplary operation of the wireless transceivers 202 in conjunction with others of the internal components 200 of the electronic device 102 can take a variety of forms and can include, for example, operation in which, upon reception of wireless signals, the internal components detect communication signals and the transceiver 202 demodulates the communication signals to recover incoming information, such as voice and/or data, transmitted by the wireless signals. After receiving the incoming information from the transceiver 202, the processor 204 formats the incoming information for the one or more output devices 208. Likewise, for transmission of wireless signals, the processor 204 formats outgoing information, which may or may not be activated by the input devices 210, and conveys the outgoing information to one or more of the wireless transceivers 202 for modulation as communication signals. The wireless transceiver(s) 202 convey the modulated signals to a remote device, such as a cell tower or an access point (not shown).
  • The output components 208 can include a variety of visual, audio, and/or mechanical outputs. For example, the output devices 208 can include one or more visual output devices 216 including the display screen 106, which can be a liquid crystal display. One or more audio output devices 218 can include a speaker, alarm, and/or buzzer, and a mechanical output device 220 can include a vibrating mechanism, for example. Similarly, the input devices 210 can include one or more visual input devices 222 such as an optical sensor of a camera, an audio input device 224 such as a microphone, and a mechanical input device 226. In particular, the mechanical input device 226 can include, among other things, the touch-detecting surface 104 and the keypad 108 of FIG. 1. Actions that can actuate one or more input devices 210 can include, for example, opening the electronic device, unlocking the device, moving the device, and operating the device.
  • The sensors 228 can include both proximity sensors 229 and other sensors 231, such as an accelerometer, a gyroscope, or any other sensor that can provide pertinent information, such as to identify a current location or orientation of the device 102.
  • The touch-detecting surface 104 provides a signal via link 232 to the processor 204 indicative of an applied touch gesture. The processor monitors output signals from the touch-detecting surface 104 and, in conjunction therewith, can detect applied touch gestures having appropriate characteristics, and can determine a location (e.g., co-ordinates) of the applied touch on the touch-detecting surface 104. As more fully described below, the processor 204 can also be programmed to determine one or more other parameters associated with a touch gesture, such as a relative location with respect to another touch gesture, a movement amount (e.g., a touch distance), a direction, a speed, and/or a duration of a glide movement portion of the touch gesture. Further, the processor can be programmed to identify a touch gesture sequence and thus a corresponding device function to control according to one or more of the determined parameters. For example, the relative location of first and second touch gestures and/or a glide movement direction or movement pattern can be used to determine whether the touch gesture sequence is one associated with a scaling function, a rotation function, or a translation function, such that instructions can then be generated to control the execution of the corresponding device function in accordance with another of the determined parameters, such as a glide movement touch distance.
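The parameter determination described above can be illustrated with a short sketch. This is an illustrative reconstruction in Python, not code from the patent; the `(x, y, timestamp)` sample format and the function name are assumptions:

```python
import math

def glide_parameters(samples):
    """Derive glide-movement parameters from (x, y, timestamp) touch samples
    reported while the finger remains in contact with the surface."""
    (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)        # movement amount (touch distance)
    direction = math.atan2(dy, dx)       # movement direction in radians
    duration = t1 - t0                   # duration of the glide movement
    speed = distance / duration if duration > 0 else 0.0
    return {"distance": distance, "direction": direction,
            "duration": duration, "speed": speed}
```

For a glide sampled from (0, 0) to (30, 40) over half a second, this yields a touch distance of 50 units and a speed of 100 units per second.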
  • Controlling device functions of the device 102 with a sequence of single-touch touch gestures is explained with reference to FIGS. 3-5. These figures illustrate various examples of a touch gesture sequence used to control the display of an item (such as a picture, web page, map, text block, etc.) on the display screen 106 of the electronic device using the touch-detecting surface 104. Specifically, FIGS. 3(a)-3(b) illustrate a touch gesture sequence for controlling a scaling function of the electronic device 102, such as to zoom in and zoom out on a displayed item. FIGS. 4(a)-4(b) illustrate a touch gesture sequence for controlling a translation function of a displayed item on the electronic device, and FIGS. 5(a)-5(b) illustrate a touch gesture sequence for controlling a rotation function of a displayed item on the electronic device. Various other touch gesture sequences can be defined to control other electronic device functions as well.
  • In particular, as shown in FIG. 3(a), a first single-touch touch gesture can be applied on the touch-detecting surface 104, such as at location 300. As shown in FIG. 3(b), the first touch gesture can be followed by a second single-touch touch gesture beginning at an initial location 302, wherein the second touch gesture includes a glide movement as indicated by arrow 304. Because the second touch gesture begins at location 302, which is to the left of location 300 for example, this can indicate that a scaling (zoom) function is desired. In other words, the starting point of the second touch gesture relative to the first touch gesture (e.g., to the left) may indicate the desired function. Because the glide movement of the second touch gesture occurs in a linear direction away from location 300, this can indicate that a zoom out scaling function is desired. Thus, the direction of the glide movement may control the execution of the function. To zoom in, a linear glide movement could be directed toward the first touch location 300. The center of the scaling function can be defined by the location 300, by the location 302, or by a midpoint between these locations, and the amount of scaling can correspond to a touch distance of the glide movement.
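One way to turn this scaling example into numbers is sketched below. The formula and `sensitivity` constant are invented for illustration (the patent prescribes no specific mapping); per the text, gliding away from the first touch zooms out and gliding toward it zooms in:

```python
import math

def zoom_factor(first, start, end, sensitivity=0.01):
    """Map a linear glide (start -> end) relative to the first touch point
    (all arguments are (x, y) tuples) onto a scale factor. Gliding away
    from the first touch yields a factor < 1 (zoom out); gliding toward
    it yields a factor > 1 (zoom in)."""
    d_start = math.dist(first, start)
    d_end = math.dist(first, end)
    delta = d_end - d_start              # > 0: moved away; < 0: moved toward
    if delta > 0:
        return 1.0 / (1.0 + sensitivity * delta)   # zoom out
    return 1.0 - sensitivity * delta               # zoom in
```

The amount of scaling grows with the glide's touch distance, matching the text's statement that the scaling amount corresponds to the glide movement's touch distance.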
  • The touch gestures can be provided by way of touching the touch-detecting surface 104 by various means, including for example using a finger (including a thumb), fingernail, hand or portion thereof, or a stylus device. In some embodiments, the touch-detecting surface 104 can be activated by way of other types of actions, such as by swiping, pinching, and applying pressure, which actions are all considered touches. However, the touch-detecting surface 104 need not be capable of distinguishing between different pressures or forces of touches. Further, as used herein, a glide movement occurs when a finger or other object moves along the touch-detecting surface while remaining in contact with it, over a distance determined by the user and as more fully described below.
  • The touch gesture sequence of FIGS. 3(a)-(b) can have the same effect as a gesture wherein two fingers simultaneously touching a multi-touch touch-detecting surface are moved apart to zoom out on a displayed item (or moved together to zoom in on a displayed item).
  • FIGS. 4(a)-4(b) illustrate a touch gesture sequence for controlling a translation function of the device 102, such as to move (scroll, pan) a displayed item or a portion thereof such that a different portion of that item becomes viewable on the display screen 106. In particular, as shown in FIG. 4(a), a first single-touch touch gesture can be applied at a location 400 of the touch-detecting surface 104. As shown in FIG. 4(b), the first touch gesture can be followed by a second single-touch touch gesture at an initial location 402, wherein the second touch gesture includes a glide movement as indicated by arrow 404, during which a finger or other object remains in contact with the touch-detecting surface for a measurable distance. In this example, because the second touch gesture begins at location 402, which is to the right of location 400, this can indicate that a translation function is desired. In one embodiment, the second touch gesture alone (defined by location 402 and the glide movement defined by arrow 404) would result in a translation of the displayed image roughly equivalent to the speed and direction of the glide movement. However, the illustrated touch gesture sequence can be used to increase the translation speed of an item on the display screen by a predetermined factor such as two (indicated by arrow 406), such that it can have the same (or similar) effect as a multi-touch gesture wherein two fingers simultaneously touching a multi-touch touch-detecting surface are moved together along the surface to scroll or pan a displayed item at an increased speed as compared to a single-finger glide movement. Scrolling up could be controlled by gliding 404 up instead of down. Similarly, scrolling left could be controlled by gliding left, and scrolling right could be controlled by gliding right.
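The doubled translation speed in this example can be sketched as a simple multiplier on the glide displacement. The function name and signature are illustrative; the factor of two follows the text's example:

```python
def translate_offset(glide_dx, glide_dy, sequential_mode, factor=2.0):
    """Return the (dx, dy) applied to the displayed item for a glide
    displacement. In the sequential touch mode the item translates at a
    predetermined multiple (here 2x) of the single-finger rate."""
    scale = factor if sequential_mode else 1.0
    return glide_dx * scale, glide_dy * scale
```

A glide of (5, -3) display units thus moves the item (10, -6) units in the sequential touch mode, but only (5, -3) units as an ordinary single-touch gesture.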
  • FIGS. 5(a)-5(b) illustrate a touch gesture sequence for controlling a rotation function of the device 102, in order to rotate an item on the display screen 106 to effect a desired orientation of the item. Again, a first touch gesture can be applied at a location 500, followed by a subsequent second touch gesture including a glide movement, such as in an arc denoted by arrow 504, starting from an initial location 502. Because the second gesture begins at location 502, which is to the left of location 500, this can indicate, for example, that either a zoom function or a rotation function is desired. Because the subsequent glide movement is arc-like, the processor can determine that a rotation function is desired. If the glide movement had been linear, a zoom function would have been implemented.
  • The arc 504 is shown as counter-clockwise, which can control the rotation in a counter-clockwise direction. If the arc were instead in a clockwise direction, the displayed item's rotation would be controlled in a clockwise direction. The rotation of the item can occur around an axis defined by the location 500 of the first touch gesture, the starting location 502 of the second touch gesture, or a midpoint between these locations, and the amount of rotation (e.g., degrees of rotation) of the object can correspond to the amount of the glide movement. This touch gesture sequence can have the same effect as a gesture wherein two fingers are simultaneously touching a multi-touch touch-detecting surface and one is rotating around the other (or both touches are rotating about a center point) to effect rotation of a displayed item.
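A signed rotation angle about the first touch location could be computed as in the following sketch (the function name and the y-up sign convention are assumptions; a touch panel with a y-down origin would flip the sign):

```python
import math

def rotation_angle(center, start, end):
    """Signed rotation (radians) of a glide from `start` to `end` about
    `center` (e.g., the first touch location). Positive values are
    counter-clockwise in a y-up coordinate system."""
    a0 = math.atan2(start[1] - center[1], start[0] - center[0])
    a1 = math.atan2(end[1] - center[1], end[0] - center[0])
    angle = a1 - a0
    # Normalize to (-pi, pi] so a short arc gets the expected sign
    while angle <= -math.pi:
        angle += 2 * math.pi
    while angle > math.pi:
        angle -= 2 * math.pi
    return angle
```

The sign distinguishes clockwise from counter-clockwise arcs, and the magnitude can serve as the rotation amount applied to the displayed item.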
  • Referring now to FIG. 6, a flowchart 600 is shown which includes exemplary steps of operation of the electronic device 102. The process starts at a step 602 when the electronic device is turned on for use or is otherwise initialized in a known manner. Next, at a step 604, the electronic device monitors user inputs while remaining in a single-touch mode.
  • At a step 606, the electronic device detects whether a first touch gesture is applied to the touch-detecting surface 104. If not, processing returns to step 604, and if so, processing proceeds to a step 607. At step 607, it is determined whether a proper first touch gesture has occurred, that is, whether the first touch gesture has a characteristic indicating that a sequential single-touch mode should be initiated. A proper first touch gesture occurs, for example, when a touch gesture is applied to the touch-detecting surface for a duration that is greater than a predetermined duration. For this purpose, the processor can be programmed to determine a duration of the applied first touch gesture. In other embodiments, a proper first touch gesture can be one that is defined by another predetermined characteristic, such as being applied to a designated location of the touch-detecting surface.
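The step-607 check might be sketched as follows. The duration threshold and the rectangle representation of a designated area are illustrative values; the patent names no specific numbers:

```python
def is_proper_first_touch(duration_s, location=None,
                          min_duration_s=0.8, designated_area=None):
    """Decide whether a first touch should initiate the sequential touch
    mode: either it was held longer than a predetermined duration, or
    (in other embodiments) it landed inside a designated area given as
    an (x0, y0, x1, y1) rectangle."""
    if duration_s > min_duration_s:
        return True
    if designated_area is not None and location is not None:
        x0, y0, x1, y1 = designated_area
        return x0 <= location[0] <= x1 and y0 <= location[1] <= y1
    return False
```

A touch failing this check would be handled as a normal single-touch gesture, as in step 609.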
  • If a proper first touch gesture is not detected, then processing proceeds to step 609. At step 609, the touch gesture is processed as a normal (non-sequence) single-touch touch gesture. If a proper first touch gesture is detected, then processing proceeds to step 608.
  • At step 608, the electronic device determines and records the location of the first touch gesture, and continues to monitor user inputs. Processing then proceeds to a step 610, at which the electronic device determines whether a proper second touch gesture is applied to the touch-detecting surface. A proper second touch gesture can occur, for example, when a second touch gesture is applied to a location on the touch-detecting surface, such as a second designated touch area, and/or if the second touch gesture occurs within a predetermined time from the first touch gesture, and/or if a stationary portion of the second touch gesture is applied for a duration that is greater than a predetermined duration. For example, if there is no second touch gesture within a predetermined time period such as two seconds, then a proper second touch gesture is not detected.
  • If a proper second touch gesture is not detected, then processing again proceeds to step 604. If a proper second touch gesture is detected, then processing proceeds to step 612. At step 612, a first parameter and a second parameter of the second touch gesture are determined in order to identify a corresponding device function to be controlled at a step 614 and to control the identified device function in a desired way at a step 616.
  • With respect to the first parameter, this can be a relative location of the second touch gesture's initial location with respect to the first touch gesture's location. For example, with reference to FIGS. 3(a)-(b), 4(a)-(b), and 5(a)-(b), a second touch gesture that is applied to the left of an applied first touch gesture can be used to indicate that a zoom function or a rotation function is desired, and a second touch gesture that is applied to the right of an applied first touch gesture can be used to indicate that a translation function is desired. In some cases, the relative location of the first and second touch gestures can also be used in combination with the glide movement that occurs as part of the second touch gesture. For example, a second touch gesture starting to the left of a first touch gesture in combination with an arc-like glide movement can be used to indicate that a rotation function is desired, and a second touch gesture starting to the right of a first touch gesture in combination with a linear glide movement can be used to indicate that a zoom function is desired. Thus at step 614, the electronic device can identify the corresponding function to be controlled based on the determined first parameter.
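The mapping from the first parameter to a device function, following the figure examples, could be sketched as below. The string labels are invented for this sketch, and the patent also describes an alternative mapping in which right-plus-linear indicates zoom:

```python
def identify_function(relative_side, movement_shape):
    """Combine the relative starting side of the second touch with the
    glide's shape to pick the device function, per the figure examples:
    left + linear glide -> scaling, left + arc glide -> rotation,
    right -> translation."""
    if relative_side == "left":
        return "rotation" if movement_shape == "arc" else "scaling"
    if relative_side == "right":
        return "translation"
    return None  # unrecognized first parameter; no sequence function
```

The selected function would then be executed at step 616 according to the second parameter, such as the glide movement's touch distance.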
  • At step 616, the device can control the execution of the identified device function based on the determined second parameter. With respect to the second parameter, this can be a movement amount (touch distance) of the glide movement, a movement shape (e.g., linear or arcuate), a movement direction (e.g., toward or away from the first touch location; a horizontal, vertical, or diagonal direction; a clockwise or counter-clockwise direction; etc.), or a glide movement's duration or speed. A movement amount of a glide movement can be determined at various points during the glide movement and can then be used to control a corresponding function, such as to control the amount of scaling, the amount of translation, or the amount of rotation of a displayed item. After the glide movement is completed (for example when a finger is removed from the touch-detecting surface), the control of the device function can be terminated.
  • At a step 618, the device determines whether the sequential touch-detecting routine has been turned off. If not, the process returns to step 604. If so, the process ends at step 620.
  • By utilizing a touch-detecting surface to recognize two or more predefined single-touch gestures in sequence, the functionality provided by a multi-touch touch-detecting surface can be achieved, and touch gesture sequences which are easy to perform can be defined.
  • It is specifically intended that the present invention not be limited to the embodiments and illustrations contained herein, but include modified forms of those embodiments, including portions of the embodiments and combinations of elements of different embodiments as come within the scope of the following claims.

Claims (19)

We claim:
1. An electronic device comprising:
a touch-detecting surface; and
a processor in electronic communication with the touch-detecting surface, programmed to detect a first touch gesture applied to the touch-detecting surface, to initiate a sequential touch mode if the first touch gesture has a predetermined characteristic, to detect a subsequent second touch gesture including a glide movement applied to the touch-detecting surface in the sequential touch mode, and to determine a first parameter and a second parameter of the second touch gesture, wherein the first parameter is a relative location of an initial location of the second touch gesture with respect to a location of the first touch gesture;
further wherein the processor identifies a corresponding one of a plurality of device functions of the electronic device in accordance with the determined first parameter, and controls the execution of the identified device function of the electronic device in accordance with the determined second parameter.
2. The electronic device of claim 1, wherein the sequential touch mode is initiated if a duration of the first touch gesture is greater than a predetermined duration.
3. The electronic device of claim 1, further wherein the processor is programmed to determine whether an applied second touch gesture occurs within a predetermined period after the first touch gesture, and provide instructions to exit the sequential touch mode if the applied second touch gesture occurs after the predetermined period.
4. The electronic device of claim 1, further wherein the processor is programmed to determine whether an applied second touch gesture has a duration greater than a predetermined duration, and to provide instructions to exit the sequential touch mode if the applied second touch gesture has a duration greater than the predetermined duration.
5. The electronic device of claim 1, wherein a controlled device function of the electronic device includes one of a scaling function, a translation function, and a rotation function.
6. The electronic device of claim 1, wherein the controlled device function controls the display of an item on a display screen of the electronic device.
7. The electronic device of claim 1, wherein the touch-detecting surface is one of a single-touch and a multi-touch touch-detecting surface.
8. The electronic device of claim 1, wherein the touch-detecting surface forms a touchscreen.
9. The electronic device of claim 1, wherein the touch-detecting surface is a touchpad.
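Claims 1 through 4 above describe a mode-management protocol: a long first touch enters the sequential touch mode, and the mode is exited if the second gesture either starts too late or lasts too long. The following is a minimal, non-authoritative sketch of that state machine; the class name, threshold values, and method signatures are all hypothetical, since the claims deliberately leave the "predetermined" durations unspecified.

```python
import time

# Hypothetical thresholds; the claims only require that such
# "predetermined" durations exist, not what their values are.
HOLD_DURATION = 0.5        # first gesture must be held this long (claim 2)
MODE_TIMEOUT = 2.0         # second gesture must begin within this window (claim 3)
MAX_SECOND_DURATION = 3.0  # an overly long second gesture exits the mode (claim 4)

class SequentialTouchMode:
    """Tracks whether the device is in the sequential touch mode."""

    def __init__(self):
        self.active = False
        self.anchor = None      # location of the first touch gesture
        self.entered_at = None

    def on_first_gesture(self, location, duration, now=None):
        # Claim 2: enter the mode only if the hold exceeds a threshold.
        if duration > HOLD_DURATION:
            self.active = True
            self.anchor = location
            self.entered_at = time.monotonic() if now is None else now

    def on_second_gesture(self, start_time, duration):
        # Claim 3: exit the mode if the second gesture starts too late.
        if start_time - self.entered_at > MODE_TIMEOUT:
            self.active = False
            return False
        # Claim 4: exit the mode if the second gesture lasts too long.
        if duration > MAX_SECOND_DURATION:
            self.active = False
            return False
        return True
```

A short-hold first touch leaves `active` false, so ordinary taps fall through to normal handling rather than entering the mode.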
10. A method for controlling an electronic device having a touch-detecting surface, the method comprising:
detecting a first touch gesture applied to the touch-detecting surface;
initiating a sequential touch mode if the first touch gesture has a predetermined characteristic;
in the sequential touch mode, detecting a second touch gesture that is subsequently applied to the touch-detecting surface, wherein the second touch gesture includes a glide movement;
determining a first parameter and a second parameter associated with the second touch gesture, wherein the first parameter is a relative location of an initial location of the second touch gesture with respect to a location of the first touch gesture;
identifying a corresponding device function in accordance with the determined first parameter; and
controlling the execution of the identified device function in accordance with the determined second parameter.
11. The method of claim 10, further including determining a duration of the applied first touch gesture and initiating the sequential touch mode if the duration of the applied first touch gesture is greater than a predetermined duration.
12. The method of claim 10, further including determining whether an applied second touch gesture occurs within a predetermined period after the first touch gesture, and if not, then exiting the sequential touch mode.
13. The method of claim 10, further including determining whether an applied second touch gesture has a duration greater than a predetermined duration, and if so, then exiting the sequential touch mode.
14. The method of claim 10, further including controlling at least one of a scaling function, a translation function, and a rotation function.
15. The method of claim 14, wherein the controlling includes controlling the display of an item on a display screen of the electronic device.
16. The method of claim 10, wherein the second parameter is one of a touch distance, a duration, and a speed of the glide movement.
17. The method of claim 10, wherein the second parameter is a direction of the glide movement.
18. The method of claim 10, wherein the first parameter further includes a direction of the glide movement.
19. The method of claim 10, wherein the determining a first parameter includes determining a movement pattern of the glide movement and the identifying includes identifying the controlled device function in accordance with the determined movement pattern.
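The method claims above split the second gesture into two parameters: where the glide begins relative to the first touch selects the device function (claims 10 and 14), and the glide's distance or direction controls its execution (claims 16 and 17). A minimal sketch of that two-parameter mapping follows; the quadrant-based layout and the scale/rotate/translate assignments are illustrative assumptions, not taken from the patent, which names the candidate functions but does not fix a spatial layout.

```python
import math

def identify_function(anchor, glide_start):
    """First parameter (claims 10, 14): map the glide's start region,
    relative to the first touch, onto a device function."""
    dx = glide_start[0] - anchor[0]
    dy = glide_start[1] - anchor[1]
    if abs(dx) >= abs(dy):
        # Horizontal offsets: right of the anchor scales, left rotates.
        return "scale" if dx >= 0 else "rotate"
    # Vertical offsets translate.
    return "translate"

def control_function(function, glide_start, glide_end):
    """Second parameter (claims 16-17): use the glide's distance and
    direction to control execution of the identified function."""
    dx = glide_end[0] - glide_start[0]
    dy = glide_end[1] - glide_start[1]
    distance = math.hypot(dx, dy)
    direction = math.degrees(math.atan2(dy, dx))
    if function == "scale":
        # Scale factor grows with glide distance (an arbitrary gain).
        return {"function": function, "factor": 1.0 + distance / 100.0}
    if function == "rotate":
        # Rotation angle follows the glide direction.
        return {"function": function, "angle": direction}
    # Translation follows the glide vector directly.
    return {"function": function, "offset": (dx, dy)}
```

For example, a glide that begins to the right of the first touch selects scaling, and a 50-pixel glide then yields a scale factor of 1.5 under the gain assumed here.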
US13/647,427 2010-04-23 2012-10-09 Electronic device and method using a touch-detecting surface Active 2030-11-01 US9141195B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2010/072127 WO2011130919A1 (en) 2010-04-23 2010-04-23 Electronic device and method using touch-detecting surface
CNPCT/CN2010/072127 2010-04-23

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2010/072127 Continuation WO2011130919A1 (en) 2010-04-23 2010-04-23 Electronic device and method using touch-detecting surface

Publications (3)

Publication Number Publication Date
US20130093705A1 US20130093705A1 (en) 2013-04-18
US20150103013A9 true US20150103013A9 (en) 2015-04-16
US9141195B2 US9141195B2 (en) 2015-09-22

Family

ID=44833645

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/647,427 Active 2030-11-01 US9141195B2 (en) 2010-04-23 2012-10-09 Electronic device and method using a touch-detecting surface

Country Status (3)

Country Link
US (1) US9141195B2 (en)
CN (1) CN102906682B (en)
WO (1) WO2011130919A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140237401A1 (en) * 2013-02-15 2014-08-21 Flatfrog Laboratories Ab Interpretation of a gesture on a touch sensing device
US9874978B2 (en) 2013-07-12 2018-01-23 Flatfrog Laboratories Ab Partial detect mode
US10019113B2 (en) 2013-04-11 2018-07-10 Flatfrog Laboratories Ab Tomographic processing for touch detection
US10126882B2 (en) 2014-01-16 2018-11-13 Flatfrog Laboratories Ab TIR-based optical touch systems of projection-type
US10146376B2 (en) 2014-01-16 2018-12-04 Flatfrog Laboratories Ab Light coupling in TIR-based optical touch systems
US10161886B2 (en) 2014-06-27 2018-12-25 Flatfrog Laboratories Ab Detection of surface contamination
US10168835B2 (en) 2012-05-23 2019-01-01 Flatfrog Laboratories Ab Spatial resolution in touch displays
US10282035B2 (en) 2016-12-07 2019-05-07 Flatfrog Laboratories Ab Touch device
US10318074B2 (en) 2015-01-30 2019-06-11 Flatfrog Laboratories Ab Touch-sensing OLED display with tilted emitters
US10401546B2 (en) 2015-03-02 2019-09-03 Flatfrog Laboratories Ab Optical component for light coupling
US10437389B2 (en) 2017-03-28 2019-10-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10474249B2 (en) 2008-12-05 2019-11-12 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US10481737B2 (en) 2017-03-22 2019-11-19 Flatfrog Laboratories Ab Pen differentiation for touch display
US10496227B2 (en) 2015-02-09 2019-12-03 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130147810A1 (en) * 2011-12-07 2013-06-13 Nokia Corporation Apparatus responsive to at least zoom-in user input, a method and a computer program
US9728145B2 (en) 2012-01-27 2017-08-08 Google Technology Holdings LLC Method of enhancing moving graphical elements
US9448635B2 (en) * 2012-04-16 2016-09-20 Qualcomm Incorporated Rapid gesture re-engagement
KR101990039B1 (en) * 2012-11-30 2019-06-18 엘지전자 주식회사 Mobile terminal and method of controlling the same
JP2014115919A (en) * 2012-12-11 2014-06-26 Toshiba Corp Electronic device, and method and program for controlling the same
CN104182159B (en) * 2013-05-23 2018-12-25 华为终端(东莞)有限公司 A kind of method, device and equipment touching screen unlocks
CN104182166A (en) * 2013-05-28 2014-12-03 腾讯科技(北京)有限公司 Control method and device of intelligent terminal application program
US9177362B2 (en) * 2013-08-02 2015-11-03 Facebook, Inc. Systems and methods for transforming an image
US9727915B2 (en) * 2013-09-26 2017-08-08 Trading Technologies International, Inc. Methods and apparatus to implement spin-gesture based trade action parameter selection
US10101844B2 (en) 2014-03-14 2018-10-16 Lg Electronics Inc. Mobile terminal and method of controlling the same based on type of touch object used to apply touch input
US20160132181A1 (en) * 2014-11-12 2016-05-12 Kobo Incorporated System and method for exception operation during touch screen display suspend mode
CN105843594A (en) * 2015-01-13 2016-08-10 阿里巴巴集团控股有限公司 Method and device for displaying application program page of mobile terminal
US20160231772A1 (en) * 2015-02-09 2016-08-11 Mediatek Inc. Wearable electronic device and touch operation method
EP3515089A1 (en) * 2016-09-14 2019-07-24 Shenzhen Royole Technologies Co., Ltd. Earphone assembly, and headphone and head-mounted display device with earphone assembly

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130265237A1 (en) * 2012-04-04 2013-10-10 Google Inc. System and method for modifying content display size

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5260697A (en) 1990-11-13 1993-11-09 Wang Laboratories, Inc. Computer with separate display plane and user interface processor
US7142205B2 (en) 2000-03-29 2006-11-28 Autodesk, Inc. Single gesture map navigation graphical user interface for a personal digital assistant
US7030861B1 (en) 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
WO2003042804A1 (en) 2001-11-16 2003-05-22 Myorigo Oy Extended keyboard
US7743348B2 (en) 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
DE202005021492U1 (en) 2004-07-30 2008-05-08 Apple Inc., Cupertino Electronic device with touch-sensitive input device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
FR2878344B1 (en) 2004-11-22 2012-12-21 Sionnest Laurent Guyot Data controller and input device
CN100435078C (en) * 2005-06-20 2008-11-19 义隆电子股份有限公司 Object detection method for capacitance type touch panel
US7840912B2 (en) 2006-01-30 2010-11-23 Apple Inc. Multi-touch gesture dictionary
JP4208894B2 (en) 2006-05-15 2009-01-14 株式会社東芝 Light emitting element
JP2008052062A (en) * 2006-08-24 2008-03-06 Ricoh Co Ltd Display device, display method of display device, program and recording medium
US8106856B2 (en) * 2006-09-06 2012-01-31 Apple Inc. Portable electronic device for photo management
CN101529367B (en) * 2006-09-06 2016-02-17 苹果公司 For the voicemail manager of portable multifunction device
US9740386B2 (en) 2007-06-13 2017-08-22 Apple Inc. Speed/positional mode translations
US8130211B2 (en) 2007-09-24 2012-03-06 Microsoft Corporation One-touch rotation of virtual objects in virtual workspace
CN101458585B (en) * 2007-12-10 2010-08-11 义隆电子股份有限公司 Touch control panel detecting method
US20100060588A1 (en) 2008-09-09 2010-03-11 Microsoft Corporation Temporally separate touch input
JP4752900B2 (en) * 2008-11-19 2011-08-17 ソニー株式会社 Image processing apparatus, image display method, and image display program
US20100149114A1 (en) 2008-12-16 2010-06-17 Motorola, Inc. Simulating a multi-touch screen on a single-touch screen
TW201122969A (en) * 2009-12-17 2011-07-01 Weltrend Semiconductor Inc Multi-touch command detecting method for touch surface capacitive touch panel

Also Published As

Publication number Publication date
CN102906682B (en) 2016-10-26
WO2011130919A1 (en) 2011-10-27
US20130093705A1 (en) 2013-04-18
CN102906682A (en) 2013-01-30
US9141195B2 (en) 2015-09-22

Legal Events

Date Code Title Description
AS Assignment

Owner name: MOTOROLA MOBILITY LLC, ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HUANG, MENG;REEL/FRAME:029094/0737

Effective date: 20121008

AS Assignment

Owner name: GOOGLE TECHNOLOGY HOLDINGS LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MOTOROLA MOBILITY LLC;REEL/FRAME:034625/0001

Effective date: 20141028

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4