WO2014161156A1 - Method and apparatus for controlling a touch-screen device - Google Patents

Method and apparatus for controlling a touch-screen device

Info

Publication number
WO2014161156A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
touch event
determining
event
scroll
Prior art date
Application number
PCT/CN2013/073661
Other languages
French (fr)
Inventor
Mengge DUAN
Haiqing HU
Original Assignee
Motorola Solutions, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motorola Solutions, Inc. filed Critical Motorola Solutions, Inc.
Priority to PCT/CN2013/073661 priority Critical patent/WO2014161156A1/en
Publication of WO2014161156A1 publication Critical patent/WO2014161156A1/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention generally relates to touch-screen devices, and more particularly to a method and apparatus for controlling a touch-screen device.
  • Touch-sensitive displays also known as “touch screens” are well known in the art. Touch screens are used in many electronic devices to display control buttons, graphics, text, and to provide a user interface through which a user may interact with the device. A touch screen detects and responds to contact on its surface. A device may display one or more control buttons, soft keys, menus, and other user-interface elements on the touch screen. A user may interact with the device by contacting the touch screen at locations corresponding to the user-interface (UI) elements with which they wish to interact.
  • FIG. 1 is a block diagram illustrating a general operational environment, according to one embodiment of the present invention.
  • FIG. 2 illustrates multiple fingers making contact with a touch screen.
  • FIG. 3 illustrates the determination of a relative location of a touch event.
  • FIG. 4 is a flow chart showing operation of the device of FIG. 1.
  • FIG. 5 is a flow chart showing operation of the device of FIG. 1.
  • a method and apparatus for controlling a touch-screen device is provided herein.
  • a user will execute a series of touch events on the touch screen.
  • the type of touch events, the timing sequence (e.g. before, after, simultaneous) of the touch events with respect to each other, and/or the location of the touch events with respect to each other are analyzed, and a specific action is executed by the touch-screen device based on the type of touch event, the timing sequence of the touch events with respect to each other, and/or the location of the touch events with respect to each other.
  • a user can use a combination of one or more touching- and-hold behaviors with one or more tap behaviors to perform one complete input operation on the touch screen.
  • the device will interpret the user inputs according to a predefined pattern and launch the associated functionality.
  • an operation comprising a touch-and-hold may be followed with a subsequent tap.
  • the function executed by a device is decided by the relative locations of the touch-and-hold and the tap points. For example, if the tap takes place on the right side of the touch-and-hold, the device may execute a first function (e.g., increase a volume level); conversely, if the tap is on the left side of the touch-and-hold, the device may execute a second function (e.g., decrease a volume level).
  • an operation comprising a touch-and-hold may be followed with two subsequent simultaneous taps. If the two taps take place on the right side of the touch-and-hold, the device may zoom in on the displayed content; conversely, if the two taps are on the left side of the touch-and-hold, the device may zoom out of the displayed content.
  • FIG. 1 is a block diagram of a portable electronic device that preferably comprises a touch screen 126.
  • the device 100 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripherals interface 108, RF circuitry 112, audio circuitry 114, a speaker 116, a microphone 118, an input/output (I/O) subsystem 120, a touch screen 126, other input or control devices 128, and an external port 148. These components communicate over the one or more communication buses or signal lines 110.
  • the device 100 can be any portable electronic device, including but not limited to a handheld computer, a tablet computer, a mobile phone, a police radio, a media player, a personal digital assistant (PDA), or the like, including a combination of two or more of these items. It should be appreciated that the device 100 is only one example of a portable electronic device 100, and that the device 100 may have more or fewer components than shown, or a different configuration of components.
  • the various components shown in FIG. 1 may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the memory 102 may include high speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state memory devices.
  • the memory 102 may further include storage remotely located from the one or more processors 106, for instance network attached storage accessed via the RF circuitry 112 or external port 148 and a communications network (not shown) such as the Internet, intranet(s), Local Area Networks (LANs), Wide Local Area Networks (WLANs), Storage Area Networks (SANs) and the like, or any suitable combination thereof. Access to the memory 102 by other components of the device 100, such as the CPU 106 and the peripherals interface 108, may be controlled by the memory controller 104.
  • the peripherals interface 108 couples the input and output peripherals of the device to the CPU 106 and the memory 102.
  • the one or more processors 106 run various software programs and/or sets of instructions stored in the memory 102 to perform various functions for the device 100 and to process data.
  • the peripherals interface 108, the CPU 106, and the memory controller 104 may be implemented on a single chip, such as a chip 111. In some other embodiments, they may be implemented on separate chips.
  • the RF (radio frequency) circuitry 112 receives and sends electromagnetic waves.
  • the RF circuitry 112 converts electrical signals to/from electromagnetic waves and communicates with communications networks and other communications devices via the electromagnetic waves.
  • the RF circuitry 112 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth.
  • the RF circuitry 112 may communicate with the networks, such as the Internet, also referred to as the World Wide Web (WWW), an Intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication.
  • the wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • the audio circuitry 114, the speaker 116, and the microphone 118 provide an audio interface between a user and the device 100.
  • the audio circuitry 114 receives audio data from the peripherals interface 108, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 116.
  • the speaker converts the electrical signal to human-audible sound waves.
  • the audio circuitry 114 also receives electrical signals converted by the microphone 118 from sound waves.
  • the audio circuitry 114 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 108 for processing. Audio data may be retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 112 by the peripherals interface 108.
  • the audio circuitry 114 also includes a headset jack (not shown).
  • the headset jack provides an interface between the audio circuitry 114 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (headphone for one or both ears) and input (microphone).
  • the I/O subsystem 120 provides the interface between input/output peripherals on the device 100, such as the touch screen 126 and other input/control devices 128, and the peripherals interface 108.
  • the I/O subsystem 120 includes a touch-screen controller 122 and one or more input controllers 124 for other input or control devices.
  • the one or more input controllers 124 receive/send electrical signals from/to other input or control devices 128.
  • the other input/control devices 128 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, sticks, and so forth.
  • the touch screen 126 provides both an output interface and an input interface between the device and a user.
  • the touch-screen controller 122 receives/sends electrical signals from/to the touch screen 126.
  • the touch screen 126 displays visual output to the user.
  • the visual output may include text, graphics, video, and any combination thereof.
  • the touch screen 126 also accepts input from the user based on haptic and/or tactile contact.
  • the touch screen 126 forms a touch-sensitive surface that accepts user input.
  • the CPU 106, touch screen 126, and the touch screen controller 122 (along with any associated modules and/or sets of instructions in the memory 102) detect contact (and any movement or break of the contact) on the touch screen 126 and convert the detected contact into an action, such as the execution of one or more user-interface elements (e.g., soft keys), or other actions such as, but not limited to, a volume up/down, or a scroll up/down.
  • a point of contact between the touch screen 126 and the user corresponds to one or more digits of the user.
  • the touch screen 126 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments.
  • the touch screen 126 and touch screen controller 122 may detect contact and any movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 126.
  • the touch-sensitive display may be analogous to the multi-touch sensitive tablets described in U.S. Pat. Nos. 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1.
  • touch screen 126 displays visual output from the portable device, whereas touch sensitive tablets do not provide visual output.
  • the touch screen 126 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen 126 may have a resolution of approximately 168 dpi.
  • the device 100 may include a touchpad (not shown) for activating or deactivating particular functions.
  • the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output.
  • the touchpad may be a touch-sensitive surface that is separate from the touch screen 126 or an extension of the touch-sensitive surface formed by the touch screen 126.
  • the device 100 also includes a power system 130 for powering the various components.
  • the power system 130 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
  • the software components include an operating system 132, a communication module (or set of instructions) 134, an electronic contact module (or set of instructions) 138, a graphics module (or set of instructions) 140, a user interface state module (or set of instructions) 144, and one or more applications (or set of instructions) 146.
  • the operating system 132 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
  • the communication module 134 facilitates communication with other devices over one or more external ports 148 and also includes various software components for handling data received by the RF circuitry 112 and/or the external port 148.
  • the external port 148 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
  • the contact/contact module 138 detects contact with the touch screen 126, in conjunction with the touch-screen controller 122.
  • the contact/contact module 138 includes various software components for performing various operations related to detection of contact with the touch screen 126, such as determining if contact has occurred, determining a type of contact (e.g., tap, touch and hold, swipe, pinch, etc.), determining if there is movement of the contact and tracking the movement across the touch screen, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (including magnitude and/or direction) of the point of contact.
  • the contact/contact module 138 and the touch screen controller 122 also detect contact on the touchpad.
  • the graphics module 140 includes various known software components for rendering and displaying graphics on the touch screen 126.
  • graphics includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
  • the graphics module 140 includes an optical intensity module 142.
  • the optical intensity module 142 controls the optical intensity of graphical objects, such as user-interface objects, displayed on the touch screen 126. Controlling the optical intensity may include increasing or decreasing the optical intensity of a graphical object. In some embodiments, the increase or decrease may follow predefined functions.
  • the user interface state module 144 controls the user interface state of the device 100.
  • the user interface state module 144 may include a lock module 150 and an unlock module 152.
  • the lock module detects satisfaction of any of one or more conditions to transition the device 100 to a user-interface lock state and to transition the device 100 to the lock state.
  • the unlock module detects satisfaction of any of one or more conditions to transition the device to a user-interface unlock state and to transition the device 100 to the unlock state.
  • the one or more applications 146 can include any applications installed on the device 100, including without limitation, a browser, address book, contact list, email, instant messaging, word processing, keyboard emulation, widgets, JAVA-enabled applications, encryption, digital rights management, voice recognition, voice replication, location determination capability (such as that provided by the global positioning system (GPS)), a music player (which plays back recorded music stored in one or more files, such as MP3 or AAC files), or applications used to control the device, etc.
  • the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.).
  • the device 100 may, therefore, include a 36-pin connector that is compatible with the iPod.
  • the device 100 may include one or more optional optical sensors (not shown), such as CMOS or CCD image sensors, for use in imaging applications.
  • the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through the touch screen 126 and, if included on the device 100, the touchpad.
  • by using the touch screen and touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.
  • the device 100 includes the touch screen 126, the touchpad, a push button for powering the device on/off and locking the device, a volume adjustment rocker button and a slider switch for toggling ringer profiles.
  • the push button may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval, or may be used to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed.
  • the device 100 also may accept verbal input for activation or deactivation of some functions through the microphone 118.
  • the predefined set of functions that are performed exclusively through the touch screen and the touchpad include navigation between user interfaces.
  • the touchpad when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100.
  • the touchpad may be referred to as a "menu button.”
  • the menu button may be a physical push button or other physical input/control device instead of a touchpad.
  • the device 100 may have a plurality of user interface states.
  • a user interface state is a state in which the device 100 responds in a predefined manner to user input.
  • the plurality of user interface states includes a user-interface lock state and a user-interface unlock state.
  • the plurality of user interface states includes states for a plurality of applications.
  • contact module 138 will detect a user's current finger positions on touch screen 126 and also detect a type of touch event. Then contact module 138 will provide this information to CPU 106, which will then execute an assigned task.
  • the above technique makes it much easier and more time-efficient for a user to execute tasks on device 100.
  • hand 201 is using finger 202 and finger 203 to create two touch events on touch screen 126.
  • a first touch event is made on touch screen 126 by first finger 202 at a first location 204
  • a second touch event is made on touch screen 126 by second finger 203 at a second location 205.
  • a touch event occurs when a pointing apparatus, such as a user's fingertip or stylus, makes physical contact with, disengages from, or moves along the touch screen.
  • Types of touch events can include, for example, a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, a touch-and- hold, or any other type of physical motion.
  • contact module 138 will detect a user's current finger positions on touch screen 126 and a type of touch event.
  • the first touch event may comprise a touch-and-hold where finger 202 remains in contact with touch screen 126 for over a predetermined period of time
  • the second touch event may comprise a tap, where finger 203 simply taps (quick touch and release) touch screen 126.
  • contact module 138 will provide CPU 106 with a notification of the touch events.
  • CPU 106 will also be provided with a location of the touch events. When the touch events occur within a predetermined time period of each other, CPU 106 will associate the type of touch events, their timing, and their relative location with respect to each other with an appropriate action to take (e.g., execute a volume up, a scroll up, etc.).
  • first touch event is a touch and hold
  • second touch event is a tap
  • CPU 106 will then determine that a tap was executed to the right hand side of a touch and hold.
  • a first command will then be executed based on this event. For example, when a tap occurs to the right of a touch and hold on any portion of touch screen 126, a volume-up command will be issued to speaker 116.
  • a second command may be executed by CPU 106. For example, when a first tap occurs to the left of a second touch and hold, a zoom-in command may be issued.
  • CPU 106 will not attempt to discern what finger is being used in making the touch events.
  • the finger being used for the touch events is irrelevant.
  • CPU 106 only determines the relative position of the touch events with respect to each other, and bases the execution of a command on the relative positions of the two touch events, the timing sequence of the two touch events and/or the type of touch event.
  • the executed commands are not assigned to specific fingers. For example, the same command will be issued whether a first tap is performed by the index finger and a second touch-and-hold is performed by the middle finger, or a first tap is performed by the thumb and a second touch-and-hold is performed by the ring finger.
  • graphics module 140 defines the left upper corner of the layout as the origin point, and the right direction is the positive direction for the horizontal coordinate (x). Touch events located at the left-most portion of screen 126 will have a lower x value, while touch events located at the right-most portion of screen 126 will have a higher x value. Thus, determinations such as a first touch event being to the left of a second touch event are based on the x coordinate of each touch event.
  • in a similar manner, the y coordinate can be used to define touch events having a higher or lower position. This is illustrated in FIG. 3, where an origin point and X and Y axes are used to determine x and y values for three touch events, A, B, and C. The higher the x value, the further right the touch event exists, while the higher the y value, the higher the touch event exists.
  • Module 138 detects the location, type, and/or timing of the touches and CPU 106 is notified.
  • CPU 106 determines an action associated with the types of touch event(s), the timing sequence of the touch events, and a relative position of each touch event with respect to the others. The associated action is determined by accessing memory 102.
  • Button - An action may actuate a push-button as found on mechanical or electronic instruments. For example, an action may serve to control a volume function for speaker 116, or to key microphone 118.
  • Control knob - An action may take the form of turning a knob that cycles its content through two or more values, thus enabling selection of one from a group of items.
  • Switch - An action may flip a switch. For example, a UI element may take the form of an on/off switch that controls power to device 100.
  • FIG. 4 is a flow chart showing operation of device 100.
  • the logic flow begins at step 401 where CPU 106 receives a first touch event and at least a second touch event that occurred within a predetermined period of time.
  • CPU 106 determines a location of the touch events with respect to each other. For example, CPU 106 may determine that the first touch event was to the right or left of the second touch event. Similarly, CPU 106 may determine that the first touch event is above or below a second touch event.
  • the logic flow continues to step 405 where a type of touch event is determined for each received touch event. For example, the first touch event may be a touch-and- hold event while the second touch event may be a tap event.
  • CPU 106 accesses memory 102 to determine an appropriate action to take. Finally, at step 409 the appropriate action is taken.
  • FIG. 5 is a more detailed flow chart showing operation of device 100 when two touch events are detected.
  • the logic flow begins at step 501 where contact module 138 determines that a first and a second touch event have occurred within a predetermined period of time (for example, within 1 second of each other).
  • contact module 138 determines a first type of touch event for the first touch event and a second type of touch event for the second touch event.
  • module 138 determines a relative location of the first touch event with respect to the second touch event.
  • the information determined in steps 501-505 is passed to processor 106 (step 507).
  • Processor 106 determines an action based on the first type of touch event, the second type of touch event, and the relative location of the first touch event with respect to the second touch event (step 509).
  • the action is executed by processor 106 at step 511 (a sketch of this flow appears at the end of this list).
  • the first and the second types of touch events may comprise a touch event taken from the group consisting of a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, and a touch-and-hold.
  • the predetermined period of time comprises a period of time less than a second, although other periods of time may be used.
  • the relative location may comprise whether the second touch event was to the right of, to the left of, above, or below the first touch event.
  • the action may comprise, for example, a volume up or down, a scroll up or down, or a zoom in or out.
  • processor 106 determines the relative location without determining what finger is performing the touch events.
  • a user may use more touch events to launch a particular function; such as using three fingers in sequence.
  • for example, a first tap at a left-most location, a second tap to the right of the first tap, and a third tap to the right of the second tap may be used to lock or unlock a device.
  • the device may determine that at least a third touch event has occurred, determine a type of touch event for the third touch event, determine a relative location of the third touch event with respect to the first or second touch event, and determine a function (action) based additionally on the third type of touch event and the relative location of the third touch event.
  • references to specific implementation embodiments such as “circuitry” may equally be accomplished via either a general purpose computing apparatus (e.g., CPU) or a specialized processing apparatus (e.g., DSP) executing software instructions stored in non-transitory computer-readable memory.
  • some embodiments may be comprised of one or more generic or specialized processors such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein.
  • some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic.
  • an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein.
  • Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory.
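The two-event flow of FIG. 5 (steps 501-511) can be illustrated compactly. Below is a minimal, hedged sketch in Python; the function names, the 1-second window, and the contents of the action table (which stands in for the lookup against memory 102) are illustrative assumptions, not the patent's implementation:

```python
WINDOW_S = 1.0  # assumed predetermined period of time (step 501)

# (first type, second type, relative location) -> action; an illustrative
# stand-in for the association stored in memory 102 (steps 507-509)
ACTION_TABLE = {
    ("touch-and-hold", "tap", "right"): "volume up",
    ("touch-and-hold", "tap", "left"): "volume down",
    ("tap", "touch-and-hold", "left"): "zoom in",
}

def handle(first, second):
    """Each touch event is a dict: {'t': seconds, 'x': pixels, 'type': str}."""
    if abs(second["t"] - first["t"]) > WINDOW_S:   # step 501: one combined input?
        return None
    location = "right" if second["x"] > first["x"] else "left"   # step 505
    return ACTION_TABLE.get((first["type"], second["type"], location))  # step 509

hold = {"t": 0.0, "x": 100, "type": "touch-and-hold"}
tap = {"t": 0.4, "x": 180, "type": "tap"}
print(handle(hold, tap))  # -> "volume up", which would be executed at step 511
```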

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and apparatus for controlling a touch-screen device is provided herein. During operation a user will execute multiple touch events on the touch screen with multiple fingers. The type of touch event and the location of the several touch events are analyzed, and a specific action is executed by the touch-screen device based on the type of touch event and the location of the touch events.

Description

METHOD AND APPARATUS FOR CONTROLLING A TOUCH-SCREEN DEVICE
Field of the Invention
[0001] The present invention generally relates to touch-screen devices, and more particularly to a method and apparatus for controlling a touch-screen device.
Background of the Invention
[0002] Touch-sensitive displays (also known as "touch screens") are well known in the art. Touch screens are used in many electronic devices to display control buttons, graphics, text, and to provide a user interface through which a user may interact with the device. A touch screen detects and responds to contact on its surface. A device may display one or more control buttons, soft keys, menus, and other user-interface elements on the touch screen. A user may interact with the device by contacting the touch screen at locations corresponding to the user-interface (UI) elements with which they wish to interact.
[0003] One problem associated with using touch screens on portable devices is quickly and easily finding a desired user-interface element to launch the desired functionality. Considering the rich functionalities an application can provide, there may be many UI elements (e.g. buttons, knobs, etc.) on a display. A major problem is that it may be troublesome for a user to find the right UI element in a timely manner, especially in a mission critical situation. Therefore a need exists for a method and apparatus for controlling a touch-screen device that makes controlling the device easier and more time-efficient.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0004] The accompanying figures where like reference numerals refer to identical or functionally similar elements throughout the separate views, and which together with the detailed description below are incorporated in and form part of the specification, serve to further illustrate various embodiments and to explain various principles and advantages all in accordance with the present invention.
[0005] FIG. 1 is a block diagram illustrating a general operational environment, according to one embodiment of the present invention;
[0006] FIG. 2 illustrates multiple fingers making contact with a touch screen.
[0007] FIG. 3 illustrates the determination of a relative location of a touch event.
[0008] FIG. 4 is a flow chart showing operation of the device of FIG. 1.
[0009] FIG. 5 is a flow chart showing operation of the device of FIG. 1.
[0010] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present invention. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present invention. It will further be appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required.
DETAILED DESCRIPTION
[0011] In order to address the above-mentioned need, a method and apparatus for controlling a touch-screen device is provided herein. During operation a user will execute a series of touch events on the touch screen. The type of touch events, the timing sequence (e.g. before, after, simultaneous) of the touch events with respect to each other, and/or the location of the touch events with respect to each other are analyzed, and a specific action is executed by the touch-screen device based on the type of touch event, the timing sequence of the touch events with respect to each other, and/or the location of the touch events with respect to each other.
[0012] As an example, a user can use a combination of one or more touch-and-hold behaviors with one or more tap behaviors to perform one complete input operation on the touch screen. The device will interpret the user inputs according to a predefined pattern and launch the associated functionality.
[0013] As an example of the above, an operation comprising a touch-and-hold may be followed with a subsequent tap. The function executed by a device is decided by the relative locations of the touch-and-hold and the tap points. For example, if the tap takes place on the right side of the touch-and-hold, the device may execute a first function (e.g., increase a volume level); conversely, if the tap is on the left side of the touch-and-hold, the device may execute a second function (e.g., decrease a volume level). In another embodiment, an operation comprising a touch-and-hold may be followed with two subsequent simultaneous taps. If the two taps take place on the right side of the touch-and-hold, the device may zoom in on the displayed content; conversely, if the two taps are on the left side of the touch-and-hold, the device may zoom out of the displayed content.
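To make this pattern-to-function mapping concrete, here is a minimal Python sketch. The table keys, the side test, and the action names are assumptions chosen to mirror the examples above; they are not the patent's implementation:

```python
# (anchor type, follower type(s), side of followers relative to anchor) -> action.
# All names here are illustrative assumptions.
GESTURE_ACTIONS = {
    ("touch-and-hold", ("tap",), "right"): "increase volume",
    ("touch-and-hold", ("tap",), "left"): "decrease volume",
    ("touch-and-hold", ("tap", "tap"), "right"): "zoom in",
    ("touch-and-hold", ("tap", "tap"), "left"): "zoom out",
}

def dispatch(anchor_x, follower_xs, follower_types):
    """Pick an action from where the follower event(s) landed vs. the hold."""
    side = "right" if all(x > anchor_x for x in follower_xs) else "left"
    return GESTURE_ACTIONS.get(("touch-and-hold", tuple(follower_types), side))

print(dispatch(100, [180], ["tap"]))            # -> increase volume
print(dispatch(100, [40, 60], ["tap", "tap"]))  # -> zoom out
```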
[0014] Turning now to the drawings, where like numerals designate like components, FIG. 1 is a block diagram of a portable electronic device that preferably comprises a touch screen 126. The device 100 includes a memory 102, a memory controller 104, one or more processing units (CPUs) 106, a peripherals interface 108, RF circuitry 112, audio circuitry 114, a speaker 116, a microphone 118, an input/output (I/O) subsystem 120, a touch screen 126, other input or control devices 128, and an external port 148. These components communicate over the one or more communication buses or signal lines 110. The device 100 can be any portable electronic device, including but not limited to a handheld computer, a tablet computer, a mobile phone, a police radio, a media player, a personal digital assistant (PDA), or the like, including a combination of two or more of these items. It should be appreciated that the device 100 is only one example of a portable electronic device 100, and that the device 100 may have more or fewer components than shown, or a different configuration of components. The various components shown in FIG. 1 may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
[0015] The memory 102 may include high speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state memory devices. In some embodiments, the memory 102 may further include storage remotely located from the one or more processors 106, for instance network attached storage accessed via the RF circuitry 112 or external port 148 and a communications network (not shown) such as the Internet, intranet(s), Local Area Networks (LANs), Wide Local Area Networks (WLANs), Storage Area Networks (SANs) and the like, or any suitable combination thereof. Access to the memory 102 by other components of the device 100, such as the CPU 106 and the peripherals interface 108, may be controlled by the memory controller 104.
[0016] The peripherals interface 108 couples the input and output peripherals of the device to the CPU 106 and the memory 102. The one or more processors 106 run various software programs and/or sets of instructions stored in the memory 102 to perform various functions for the device 100 and to process data.
[0017] In some embodiments, the peripherals interface 108, the CPU 106, and the memory controller 104 may be implemented on a single chip, such as a chip 111. In some other embodiments, they may be implemented on separate chips.
[0018] The RF (radio frequency) circuitry 112 receives and sends electromagnetic waves. The RF circuitry 112 converts electrical signals to/from electromagnetic waves and communicates with communications networks and other communications devices via the electromagnetic waves. The RF circuitry 112 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 112 may communicate with the networks, such as the Internet, also referred to as the World Wide Web (WWW), an Intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
[0019] The audio circuitry 114, the speaker 116, and the microphone 118 provide an audio interface between a user and the device 100. The audio circuitry 114 receives audio data from the peripherals interface 108, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 116. The speaker converts the electrical signal to human-audible sound waves. The audio circuitry 114 also receives electrical signals converted by the microphone 118 from sound waves. The audio circuitry 114 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 108 for processing. Audio data may be retrieved from and/or transmitted to the memory 102 and/or the RF circuitry 112 by the peripherals interface 108. In some embodiments, the audio circuitry 114 also includes a headset jack (not shown). The headset jack provides an interface between the audio circuitry 114 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (headphone for one or both ears) and input (microphone).
[0020] The I/O subsystem 120 provides the interface between input/output peripherals on the device 100, such as the touch screen 126 and other input/control devices 128, and the peripherals interface 108. The I/O subsystem 120 includes a touch-screen controller 122 and one or more input controllers 124 for other input or control devices. The one or more input controllers 124 receive/send electrical signals from/to other input or control devices 128. The other input/control devices 128 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, sticks, and so forth.
[0021] The touch screen 126 provides both an output interface and an input interface between the device and a user. The touch-screen controller 122 receives/sends electrical signals from/to the touch screen 126. The touch screen 126 displays visual output to the user. The visual output may include text, graphics, video, and any combination thereof.
[0022] The touch screen 126 also accepts input from the user based on haptic and/or tactile contact. The touch screen 126 forms a touch-sensitive surface that accepts user input. The CPU 106, touch screen 126, and the touch screen controller 122 (along with any associated modules and/or sets of instructions in the memory 102) detect contact (and any movement or break of the contact) on the touch screen 126 and convert the detected contact into an action, such as the execution of one or more user-interface elements (e.g., soft keys), or other actions such as, but not limited to, a volume up/down, or a scroll up/down.
[0023] In an exemplary embodiment, a point of contact between the touch screen 126 and the user corresponds to one or more digits of the user. The touch screen 126 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 126 and touch screen controller 122 may detect contact and any movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 126. The touch-sensitive display may be analogous to the multi-touch sensitive tablets described in the following U.S. Pat. Nos. 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1. However, touch screen 126 displays visual output from the portable device, whereas touch sensitive tablets do not provide visual output. The touch screen 126 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen 126 may have a resolution of approximately 168 dpi. Although the following text describes contact with touch screen 126 with a user's fingers, a user may make contact with the touch screen 126 using any suitable object or appendage, such as a stylus, finger, and so forth.
[0024] In some embodiments, in addition to the touch screen, the device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 126 or an extension of the touch-sensitive surface formed by the touch screen 126.
[0025] The device 100 also includes a power system 130 for powering the various components. The power system 130 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
[0026] In some embodiments, the software components include an operating system 132, a communication module (or set of instructions) 134, an electronic contact module (or set of instructions) 138, a graphics module (or set of instructions) 140, a user interface state module (or set of instructions) 144, and one or more applications (or set of instructions) 146.
[0027] The operating system 132 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
[0028] The communication module 134 facilitates communication with other devices over one or more external ports 148 and also includes various software components for handling data received by the RF circuitry 112 and/or the external port 148. The external port 148 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
[0029] The contact/contact module 138 detects contact with the touch screen 126, in conjunction with the touch-screen controller 122. The contact/contact module 138 includes various software components for performing various operations related to detection of contact with the touch screen 126, such as determining if contact has occurred, determining a type of contact (e.g., tap, touch and hold, swipe, pinch, etc.), determining if there is movement of the contact and tracking the movement across the touch screen, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (including magnitude and/or direction) of the point of contact. In some embodiments, the contact/contact module 138 and the touch screen controller 122 also detect contact on the touchpad.
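The kind of type determination described in paragraph [0029] can be sketched as follows. The thresholds (300 ms for a hold, 10 px of movement, 0.5 px/ms for a fast slide) are invented for illustration and are not values from the patent:

```python
import math

HOLD_MS = 300          # assumed: contact at least this long, nearly stationary
MOVE_PX = 10           # assumed: total movement below this counts as stationary
FAST_PX_PER_MS = 0.5   # assumed: slide speed separating fast from slow

def classify(samples):
    """Classify one contact from (t_ms, x, y) samples, down through release."""
    t0, x0, y0 = samples[0]
    t1, x1, y1 = samples[-1]
    duration = t1 - t0
    distance = math.hypot(x1 - x0, y1 - y0)
    if distance < MOVE_PX:
        return "touch-and-hold" if duration >= HOLD_MS else "tap"
    speed = distance / max(duration, 1)   # speed is magnitude only; keeping the
    return "fast slide" if speed >= FAST_PX_PER_MS else "slow slide"  # direction would give velocity

print(classify([(0, 50, 50), (80, 52, 51)]))    # tap
print(classify([(0, 50, 50), (600, 51, 50)]))   # touch-and-hold
print(classify([(0, 50, 50), (100, 150, 50)]))  # fast slide (1.0 px/ms)
```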
[0030] The graphics module 140 includes various known software components for rendering and displaying graphics on the touch screen 126. Note that the term "graphics" includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user- interface objects including soft keys), digital images, videos, animations and the like.
[0031] In some embodiments, the graphics module 140 includes an optical intensity module 142. The optical intensity module 142 controls the optical intensity of graphical objects, such as user-interface objects, displayed on the touch screen 126. Controlling the optical intensity may include increasing or decreasing the optical intensity of a graphical object. In some embodiments, the increase or decrease may follow predefined functions.
[0032] The user interface state module 144 controls the user interface state of the device 100. The user interface state module 144 may include a lock module 150 and an unlock module 152. The lock module detects satisfaction of any of one or more conditions to transition the device 100 to a user-interface lock state and to transition the device 100 to the lock state. The unlock module detects satisfaction of any of one or more conditions to transition the device to a user-interface unlock state and to transition the device 100 to the unlock state.
[0033] The one or more applications 146 can include any applications installed on the device 100, including without limitation, a browser, address book, contact list, email, instant messaging, word processing, keyboard emulation, widgets, JAVA-enabled applications, encryption, digital rights management, voice recognition, voice replication, location determination capability (such as that provided by the global positioning system (GPS)), a music player (which plays back recorded music stored in one or more files, such as MP3 or AAC files), or applications used to control the device, etc.
[0034] In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.). The device 100 may, therefore, include a 36-pin connector that is compatible with the iPod. In some embodiments, the device 100 may include one or more optional optical sensors (not shown), such as CMOS or CCD image sensors, for use in imaging applications.
[0035] In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through the touch screen 126 and, if included on the device 100, the touchpad. By using the touch screen and touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced. In one embodiment, the device 100 includes the touch screen 126, the touchpad, a push button for powering the device on/off and locking the device, a volume adjustment rocker button and a slider switch for toggling ringer profiles. The push button may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval, or may be used to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed. In an alternative embodiment, the device 100 also may accept verbal input for activation or deactivation of some functions through the microphone 118.
[0036] The predefined set of functions that are performed exclusively through the touch screen and the touchpad include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a "menu button." In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad.
[0037] The device 100 may have a plurality of user interface states. A user interface state is a state in which the device 100 responds in a predefined manner to user input. In some embodiments, the plurality of user interface states includes a user-interface lock state and a user-interface unlock state. In some embodiments, the plurality of user interface states includes states for a plurality of applications.
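A minimal sketch of such a state arrangement, with the lock module 150 and unlock module 152 reduced to condition-triggered transitions (the state names and the shape of the condition test are assumptions for illustration):

```python
from enum import Enum

class UIState(Enum):
    LOCKED = "user-interface lock state"
    UNLOCKED = "user-interface unlock state"

class UserInterfaceStateModule:
    """Illustrative stand-in for module 144: holds the current UI state."""
    def __init__(self):
        self.state = UIState.LOCKED

    def transition(self, condition_met, target):
        # lock module 150 / unlock module 152 each watch their own conditions
        if condition_met:
            self.state = target

ui = UserInterfaceStateModule()
ui.transition(condition_met=True, target=UIState.UNLOCKED)
assert ui.state is UIState.UNLOCKED
```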
[0038] As described above, one problem associated with using touch screens 126 on portable devices is quickly and easily controlling the device 100. In particular, it may be troublesome for a user to find the right UI element to execute a desired function in a timely manner, especially in a mission critical situation. In order to address this need, contact module 138 will detect a user's current finger positions on touch screen 126 and also detect a type of touch event. Then contact module 138 will provide this information to CPU 106, which will then execute an assigned task. The above technique makes it much easier and more time-efficient for a user to execute tasks on device 100.
[0039] The above technique is illustrated in FIG. 2. As shown in FIG. 2, hand 201 is using finger 202 and finger 203 to create two touch events on touch screen 126. In particular, a first touch event is made on touch screen 126 by first finger 202 at a first location 204, and a second touch event is made on touch screen 126 by second finger 203 at a second location 205. A touch event occurs when a pointing apparatus, such as a user's fingertip or stylus, makes physical contact with, disengages from, or moves along the touch screen. Types of touch events can include, for example, a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, a touch-and-hold, or any other type of physical motion.
[0040] During operation, contact module 138 will detect a user's current finger positions on touch screen 126 and a type of touch event. For example, the first touch event may comprise a touch-and-hold, where finger 202 remains in contact with touch screen 126 for over a predetermined period of time, while the second touch event may comprise a tap, where finger 203 simply taps (quick touch and release) touch screen 126.
[0041] Regardless of the type of touch event, contact module 138 will provide CPU 106 with a notification of the touch events. In addition to the types of the touch events, CPU 106 will also be provided with a location of the touch events. When the touch events occur within a predetermined time period of each other, CPU 106 will associate the types of the touch events, their timing, and their relative location with respect to each other with an appropriate action to take (e.g., execute a volume up, a scroll up, etc.).
[0042] For example, if the first touch event is a touch-and-hold, and the second touch event is a tap, CPU 106 will then determine that a tap was executed to the right-hand side of a touch-and-hold. A first command will then be executed based on this event. For example, when a tap occurs to the right of a touch-and-hold on any portion of touch screen 126, a volume-up command will be issued to speaker 116. However, if the first touch event was a tap, and the second touch event was a touch-and-hold, then a second command may be executed by CPU 106. For example, when a first tap occurs to the left of a second touch-and-hold, a zoom-in command may be issued.
[0043] It should be noted that CPU 106 will not attempt to discern which finger is being used in making the touch events. The finger being used for the touch events is irrelevant. CPU 106 only determines the relative position of the touch events with respect to each other, and bases the execution of a command on the relative positions of the two touch events, the timing sequence of the two touch events and/or the type of touch event. Thus, the executed commands are not assigned to specific fingers. For example, the same command will be issued whether a first tap is performed by the index finger and a second touch-and-hold is performed by the middle finger, or a first tap is performed by the thumb and a second touch-and-hold is performed by the ring finger.
[0044] In order to determine a relative location of each touch event, graphics module 140 defines the upper-left corner of the layout as the origin point, with the right direction being the positive direction for the horizontal coordinate (x). Touch events located at the left-most portion of screen 126 will have a lower x value, while touch events located at the right-most portion of screen 126 will have a higher x value. Thus, determinations such as a first touch event being to the left of a second touch event are based on the x coordinate of each touch event. In a similar manner, the y coordinate can be used to define touch events having a higher or lower position. This is illustrated in FIG. 3, where an origin point and coordinate axes are used to determine x and y values for three touch events, A, B, and C. The higher the x value, the farther right the touch event lies, while the higher the y value, the higher the touch event lies.
[0045] Operation of the above during anticipated use is now described. A user contacts the touch screen at several points (although the contacts do not need to be simultaneous). Module 138 detects the location, type, and/or timing of the touches and CPU 106 is notified. CPU 106 then determines an action associated with the types of the touch event(s), the timing sequence of the touch events, and a relative position of each touch event with respect to the others. The associated action is determined by accessing memory 104. Some possible actions are described below:
[0046] Button - An action may emulate a push-button as found on mechanical or electronic instruments. For example, an action may serve to control a volume function for speaker 116, or to key microphone 118.
[0047] Control knob - An action may take the form of turning a knob that cycles its content through two or more values, thus enabling selection of one item from a group.
[0048] Switch - An action may flip a switch. For example, a UI element may take the form of an on/off switch that controls power to device 100.
[0049] The following table gives a quick example of one form a table in memory 104 may take.
[Table 1 image not reproduced in this text]
Table 1: Example commands associated with events.

[0050] FIG. 4 is a flow chart showing operation of device 100. The logic flow begins at step 401 where CPU 106 receives a first touch event and at least a second touch event that occurred within a predetermined period of time. At step 403, CPU 106 determines a location of the touch events with respect to each other. For example, CPU 106 may determine that the first touch event was to the right or left of the second touch event. Similarly, CPU 106 may determine that the first touch event is above or below the second touch event. The logic flow continues to step 405 where a type of touch event is determined for each received touch event. For example, the first touch event may be a touch-and-hold event while the second touch event may be a tap event. At step 407, CPU 106 accesses memory 104 to determine an appropriate action to take. Finally, at step 409 the appropriate action is taken.
[0051] FIG. 5 is a more detailed flow chart showing operation of device 100 when two touch events are detected. The logic flow begins at step 501 where contact module 138 determines that a first and a second touch event have occurred within a predetermined period of time (for example, within 1 second of each other). At step 503 contact module 138 determines a first type of touch event for the first touch event and a second type of touch event for the second touch event. At step 505 module 138 determines a relative location of the first touch event with respect to the second touch event. The information determined in steps 501-505 is passed to processor 106 (step 507). Processor 106 then determines an action based on the first type of touch event, the second type of touch event, and the relative location of the first touch event with respect to the second touch event (step 509). The action is executed by processor 106 at step 511.
[0052] As discussed above, the first and the second types of touch events may each comprise a touch event taken from the group consisting of a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, and a touch-and-hold. Additionally, while in one embodiment the predetermined period of time is a period of time less than a second, other periods of time may be used. As discussed above, the relative location may comprise whether or not the second touch event was:
to the right of the first touch event;
to the left of the first touch event;
above the first touch event; and/or
below the first touch event.
[0053] The action may comprise:
a volume up;
a volume down;
a scroll up;
a scroll down;
a scroll left;
a scroll right;
a zoom-in; or
a zoom-out.
[0054] Finally, as discussed, processor 106 determines the relative location without determining what finger is performing the touch events.
[0055] It should be noted that while the above description primarily used two fingers for touch events, one of ordinary skill in the art will recognize that the above technique is not limited to two tap/touch events. For example, a user may use more touch events to launch a particular function, such as using three fingers in sequence. As an example, a first tap at a left-most position, a second tap to the right of the first, and a third tap to the right of the second may be used to lock or unlock a device. Thus, the device may determine that at least a third touch event has occurred, determine a type of touch event for the third touch event, determine a relative location of the third touch event with respect to the first or second touch event, and determine a function (action) based additionally on the third type of touch event and the relative location of the third touch event.
[0056] Those skilled in the art will further recognize that references to specific implementation embodiments such as "circuitry" may equally be accomplished via either a general-purpose computing apparatus (e.g., a CPU) or a specialized processing apparatus (e.g., a DSP) executing software instructions stored in non-transitory computer-readable memory. It will also be understood that the terms and expressions used herein have the ordinary technical meaning as is accorded to such terms and expressions by persons skilled in the technical field as set forth above, except where different specific meanings have otherwise been set forth herein.
[0057] The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
[0058] Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "has," "having," "includes," "including," "contains," "containing" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises ...a", "has ...a", "includes ...a", or "contains ...a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms "a" and "an" are defined as one or more unless explicitly stated otherwise herein. The terms "substantially", "essentially", "approximately", "about" or any other version thereof are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%. The term "coupled" as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
[0059] It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or "processing devices") such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
[0060] Moreover, an embodiment can be implemented as a computer-readable storage medium having computer-readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
[0061 ] The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
What is claimed is:

Claims

1. A method for controlling a touch-screen device, the method comprising the steps of:
determining that a first and a second touch event have occurred within a predetermined period of time;
determining a first type of touch event for the first touch event;
determining a second type of touch event for the second touch event;
determining a relative location of the first touch event with respect to the second touch event;
determining an action based on the first type of touch event, the second type of touch event, and the relative location of the first touch event with respect to the second touch event; and
executing the action.
2. The method of claim 1 wherein the step of determining the first and the second types of touch events comprises the step of determining if the first and the second types of touch events comprise a touch event taken from the group consisting of a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, and a touch-and-hold.
3. The method of claim 1 wherein the predetermined period of time comprises a period of time less than a second.
4. The method of claim 1 wherein the step of determining the relative location of the first touch event with respect to the second touch event comprises the step of determining if the second touch event was:
to the right of the first touch event;
to the left of the first touch event;
above the first touch event; and/or
below the first touch event.
5. The method of claim 1 wherein the step of determining the action comprises the step of determining one of:
a volume up;
a volume down;
a scroll up;
a scroll down;
a scroll left;
a scroll right;
a zoom-in; or
a zoom-out.
6. The method of claim 1:
wherein the step of determining the first and the second types of touch events comprises the step of determining if the first and the second types of touch events comprise a touch event taken from the group consisting of a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, and a touch-and-hold;
wherein the predetermined period of time comprises a period of time less than a second;
wherein the step of determining the relative location of the first touch event with respect to the second touch event comprises the step of determining if the second touch event was:
to the right of the first touch event;
to the left of the first touch event;
above the first touch event; and/or
below the first touch event;
wherein the step of determining the action comprises the step of determining one of:
a volume up;
a volume down;
a scroll up;
a scroll down;
a scroll left;
a scroll right;
a zoom-in; or
a zoom-out.
7. The method of claim 1 wherein the step of determining the relative location of the first touch event with respect to the second touch event is made without determining what finger is performing the touch events.
8. The method of claim 1 further comprising the steps of:
determining at least a third touch event has occurred;
determining a type of touch event for the third touch event;
determining a relative location of the third touch event with respect to the first or second touch event; and
wherein the step of determining the action comprises the step of determining the action based additionally on the third type of touch event and the relative location of the third touch event.
9. A touch screen device comprising:
a contact module determining that a first and a second touch event have occurred within a predetermined period of time, determining a first type of touch event for the first touch event, determining a second type of touch event for the second touch event, and determining a relative location of the first touch event with respect to the second touch event; and
a processor determining an action based on the first type of touch event, the second type of touch event, and the relative location of the first touch event with respect to the second touch event, and executing the action.
10. The touch screen device of claim 9 wherein the first and the second types of touch events comprise a touch event taken from the group consisting of a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, and a touch-and-hold.
11. The touch screen device of claim 9 wherein the predetermined period of time comprises a period of time less than a second.
12. The touch screen device of claim 9 wherein the relative location comprises whether or not the second touch event was:
to the right of the first touch event;
to the left of the first touch event;
above the first touch event; and/or
below the first touch event.
13. The touch screen device of claim 9 wherein the action comprises:
a volume up;
a volume down;
a scroll up;
a scroll down;
a scroll left;
a scroll right;
a zoom-in; or
a zoom-out.
14. The touch screen device of claim 9:
wherein the first and the second types of touch events comprise a touch event taken from the group consisting of a single tap touch, a double tap touch, a slow slide, a fast slide, a circular movement, and a touch-and-hold;
wherein the predetermined period of time comprises a period of time less than a second;
wherein the relative location comprises whether or not the second touch event was:
to the right of the first touch event;
to the left of the first touch event;
above the first touch event; and/or
below the first touch event;
wherein the action comprises:
a volume up;
a volume down;
a scroll up;
a scroll down;
a scroll left;
a scroll right;
a zoom-in; or
a zoom-out.
15. The touch screen device of claim 9 wherein the processor determines the relative location without determining what finger is performing the touch events.
PCT/CN2013/073661 2013-04-02 2013-04-02 Method and apparatus for controlling a touch-screen device WO2014161156A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/073661 WO2014161156A1 (en) 2013-04-02 2013-04-02 Method and apparatus for controlling a touch-screen device


Publications (1)

Publication Number Publication Date
WO2014161156A1 (en) 2014-10-09

Family

ID=51657407

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/073661 WO2014161156A1 (en) 2013-04-02 2013-04-02 Method and apparatus for controlling a touch-screen device

Country Status (1)

Country Link
WO (1) WO2014161156A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090135152A1 (en) * 2007-11-23 2009-05-28 Jia-Yih Lii Gesture detection on a touchpad
US20100245282A1 (en) * 2009-03-24 2010-09-30 Hon Hai Precision Industry Co., Ltd. Touch-screen based input method and system, and electronic device using same
WO2011038655A1 (en) * 2009-09-29 2011-04-07 北京联想软件有限公司 Method and electronic device for gesture recognition
CN102591546A (en) * 2012-02-02 2012-07-18 袁海滨 Method for recognizing multi-touch gesture on four-wire resistive touch screen of electronic equipment


Similar Documents

Publication Publication Date Title
US20240078006A1 (en) Unlocking a device by performing gestures on an unlock image
US20150378502A1 (en) Method and apparatus for managing user interface elements on a touch-screen device
US7667148B2 (en) Method, device, and graphical user interface for dialing with a click wheel
US8519963B2 (en) Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display
US7480870B2 (en) Indication of progress towards satisfaction of a user input condition
US8082523B2 (en) Portable electronic device with graphical user interface supporting application switching
US20080165145A1 (en) Portable Multifunction Device, Method, and Graphical User Interface for Interpreting a Finger Swipe Gesture
US20140195943A1 (en) User interface controls for portable devices
US20160026850A1 (en) Method and apparatus for identifying fingers in contact with a touch screen
WO2016115700A1 (en) Method and apparatus for controlling user interface elements on a touch screen
US10019151B2 (en) Method and apparatus for managing user interface elements on a touch-screen device
WO2014161156A1 (en) Method and apparatus for controlling a touch-screen device
AU2008100419A4 (en) Unlocking a device by performing gestures on an unlock image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13881148

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13881148

Country of ref document: EP

Kind code of ref document: A1