EP2812778A1 - Portable electronic device and method of controlling same - Google Patents

Portable electronic device and method of controlling same

Info

Publication number
EP2812778A1
Authority
EP
European Patent Office
Prior art keywords
touch
electronic device
event handling
secondary input
sensitive display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP12868313.3A
Other languages
German (de)
French (fr)
Other versions
EP2812778A4 (en)
Inventor
Alexander Samson HIRSCH
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BlackBerry Ltd
Original Assignee
BlackBerry Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BlackBerry Ltd filed Critical BlackBerry Ltd
Publication of EP2812778A1 publication Critical patent/EP2812778A1/en
Publication of EP2812778A4 publication Critical patent/EP2812778A4/en
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present disclosure relates to electronic devices, including but not limited to, portable electronic devices having touch-sensitive displays and their control.
  • Portable electronic devices include, for example, several types of mobile stations such as simple cellular telephones, smart phones, wireless personal digital assistants (PDAs), and laptop computers with wireless 802.11 or Bluetooth capabilities.
  • PIM (personal information manager)
  • Portable electronic devices such as PDAs or smart telephones are generally intended for handheld use and ease of portability. Smaller devices are generally desirable for portability.
  • a touch-sensitive display, also known as a touchscreen display, is particularly useful on handheld devices, which are small and have limited space for user input and output. The information displayed on the touch-sensitive displays may be modified depending on the functions and operations being performed. With continued demand for decreased size of portable electronic devices, touch-sensitive displays continue to decrease in size.
  • FIG. 1 is a block diagram of a portable electronic device in accordance with the disclosure.
  • FIG. 2 is a flowchart illustrating a method of performing a function associated with a touch in accordance with the disclosure.
  • FIG. 3 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
  • FIG. 4 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
  • FIG. 5 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
  • FIG. 6 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
  • the following describes an apparatus and method of detecting a touch on a touch-sensitive display of an electronic device.
  • a first function associated with the touch is performed according to a first event handling process.
  • a second function associated with the touch is performed according to a second event handling process.
  • the disclosure generally relates to an electronic device, such as a portable electronic device as described herein.
  • electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, wirelessly enabled notebook computers, tablet computers, mobile internet devices, electronic navigation devices, and so forth.
  • the electronic device may be a portable electronic device without wireless communication capabilities, such as a handheld electronic game, digital photograph album, digital camera, media player, e-book reader, and so forth.
  • A block diagram of an example of a portable electronic device 100 is shown in FIG. 1.
  • the portable electronic device 100 includes multiple components, such as a processor 102 that controls the overall operation of the portable electronic device 100. Communication functions, including data and voice communications, are performed through a communication subsystem 104. Data received by the portable electronic device 100 is decompressed and decrypted by a decoder 106.
  • the communication subsystem 104 receives messages from and sends messages to a wireless network 150.
  • the wireless network 150 may be any type of wireless network, including, but not limited to, data wireless networks, voice wireless networks, and networks that support both voice and data communications.
  • a power source 142 such as one or more rechargeable batteries or a port to an external power supply, powers the portable electronic device 100.
  • the processor 102 interacts with other components, such as a Random Access Memory (RAM) 108, memory 110, a touch-sensitive display 118, one or more actuators 120, one or more force sensors 122, an auxiliary input/output (I/O) subsystem 124, a data port 126, a speaker 128, a microphone 130, short-range communications 132 and other device subsystems 134.
  • the touch-sensitive display 118 includes a display 112 and touch sensors 114 that are coupled to at least one controller 116 that is utilized to interact with the processor 102. Input via a graphical user interface is provided via the touch-sensitive display 118.
  • Information such as text, characters, symbols, images, icons, and other items that may be displayed or rendered on a portable electronic device, is displayed on the touch-sensitive display 118 via the processor 102.
  • the processor 102 may also interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces.
  • the portable electronic device 100 may utilize a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 for communication with a network, such as the wireless network 150.
  • SIM/RUIM (Subscriber Identity Module or Removable User Identity Module)
  • user identification information may be programmed into memory 110.
  • the portable electronic device 100 includes an operating system 146 and software programs, applications, or components 148 that are executed by the processor 102 and are typically stored in a persistent, updatable store such as the memory 110. Additional applications or programs may be loaded onto the portable electronic device 100 through the wireless network 150, the auxiliary I/O subsystem 124, the data port 126, the short-range communications subsystem 132, or any other suitable subsystem 134.
  • a received signal such as a text message, an e-mail message, or web page download is processed by the communication subsystem 104 and input to the processor 102.
  • the processor 102 processes the received signal for output to the display 112 and/or to the auxiliary I/O subsystem 124.
  • a subscriber may generate data items, for example e-mail messages, which may be transmitted over the wireless network 150 through the communication subsystem 104.
  • the speaker 128 outputs audible information converted from electrical signals.
  • the microphone 130 converts audible information into electrical signals for processing.
  • the touch-sensitive display 118 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW) touch-sensitive display, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, and so forth.
  • a capacitive touch-sensitive display includes one or more capacitive touch sensors 114.
  • the capacitive touch sensors may comprise any suitable material, such as indium tin oxide (ITO).
  • One or more touches may be detected by the touch-sensitive display 118.
  • the processor 102 may determine attributes of the touch, including a location of the touch.
  • Touch location data may include data for an area of contact or data for a single point of contact, such as a point at or near a center of the area of contact.
  • the location of a detected touch may include x and y components, e.g., horizontal and vertical components, respectively, with respect to one's view of the touch-sensitive display 118.
  • the x location component may be determined by a signal generated from one touch sensor
  • the y location component may be determined by a signal generated from another touch sensor.
  • a touch may be detected from any suitable input member, such as a finger, thumb, appendage, or other objects, for example, a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 118. Multiple simultaneous touches may be detected.
  • One or more gestures may also be detected by the touch-sensitive display 118.
  • a gesture, such as a swipe, also known as a flick, is a particular type of touch on a touch-sensitive display 118 and may begin at an origin point and continue to an end point, for example, a concluding end of the gesture.
  • a gesture may be identified by attributes of the gesture, including the origin point, the end point, the distance traveled, the duration, the velocity, and the direction, for example.
  • a gesture may be long or short in distance and/or duration. Two points of the gesture may be utilized to determine a direction of the gesture.
  • a gesture may also include a hover.
  • a hover may be a touch at a location that is generally unchanged over a period of time or is associated with the same selection item for a period of time.
  • the optional actuator(s) 120 may be depressed or activated by applying sufficient force to the touch-sensitive display 118 to overcome the actuation force of the actuator 120.
  • the actuator(s) 120 may be actuated by pressing anywhere on the touch-sensitive display 118.
  • the actuator(s) 120 may provide input to the processor 102 when actuated. Actuation of the actuator(s) 120 may result in provision of tactile feedback.
  • when force is applied, the touch-sensitive display 118 is depressible, pivotable, and/or movable. Such a force may actuate the actuator(s) 120.
  • the touch-sensitive display 118 may, for example, float with respect to the housing of the portable electronic device, i.e., the touch-sensitive display 118 may not be fastened to the housing.
  • a mechanical dome switch actuator may be utilized. In this example, tactile feedback is provided when the dome collapses due to imparted force and when the dome returns to the rest position after release of the switch.
  • the actuator 120 may comprise one or more piezoelectric (piezo) devices that provide tactile feedback for the touch-sensitive display 118.
  • Optional force sensors 122 may be disposed in conjunction with the touch-sensitive display 118 to determine or react to forces applied to the touch- sensitive display 118.
  • the force sensor 122 may be disposed in line with a piezo actuator 120.
  • the force sensors 122 may be force-sensitive resistors, strain gauges, piezoelectric or piezoresistive devices, pressure sensors, quantum tunneling composites, force-sensitive switches, or other suitable devices.
  • force information related to a detected touch may be utilized to select information, such as information associated with a location of a touch.
  • a touch that does not meet a force threshold may highlight a selection option, whereas a touch that meets a force threshold may select or input that selection option.
  • Selection options include, for example, displayed or virtual keys of a keyboard; selection boxes or windows, e.g., "cancel," "delete," or "unlock"; function buttons, such as play or stop on a music player; and so forth.
  • the touch-sensitive display 118 includes a display area 300 in which information may be displayed, and a non-display area 302 extending around the periphery of the display area 300, such as shown in the example of FIG. 3.
  • the display area 300 generally corresponds to the area of the display 112.
  • Information is not displayed in the non-display area 302 by the display, which non-display area 302 is utilized to accommodate, for example, electronic traces or electrical connections, adhesives or other sealants, and/or protective coatings around the edges of the display area 300.
  • the non-display area 302 may be referred to as an inactive area and is not part of the physical housing or frame of the electronic device.
  • a secondary display, not part of the primary display 112, may be disposed under the non-display area 302.
  • Touch sensors may be disposed in the non-display area 302, which touch sensors may be extended from the touch sensors in the display area 300 or distinct or separate touch sensors from the touch sensors in the display area 300.
  • a touch, including a gesture, may be associated with the display area 300, the non-display area 302, or both areas.
  • the touch sensors may extend across substantially the entire non-display area 302 or may be disposed in only part of the non-display area 302.
  • an event handler process, also known as an event listener or an event handler, associated with an application is notified of the event, and performs a function or action in response to the event. For example, when a touch is detected at the location of an icon, the icon may be selected, and the function associated with the icon may be executed.
  • the portable electronic device 100 may support only a limited number of touches. For example, event handling processes may be provided to respond to a simple touch, navigation gestures, such as swipes, and re-sizing gestures, such as pinch and zoom gestures.
  • the limited number of supported gestures may not provide all the functionality of a comparable desktop environment. For example, in a desktop environment, a mouse hover over a hypertext link may result in the link target address being displayed. Similarly, a mouse hover over an icon may result in the icon name, or function, being displayed.
  • A flowchart illustrating a method of performing functions associated with a touch according to a first or a second event handling process is shown in FIG. 2. The method may be carried out by software executed, for example, by the processor 102. Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description.
  • the method may contain additional or fewer processes than shown and/or described, and may be performed in a different order.
  • Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
  • a touch is detected 202 on a touch-sensitive display of the electronic device 100.
  • a first function associated with the touch is performed 206 according to a first event handling process.
  • the first event handling process may be one of a set of first event handling processes for handling different touches in a first manner.
  • the first set of event handling processes may include processes for processing a simple touch as a command to select a button or icon associated with a location of the touch, and execute a function associated with the button or icon; for processing a swipe gesture as a command to navigate from one page of a website, or document, to an adjacent page; and for processing a pinch and zoom gesture as a command to zoom in or out on a displayed page.
  • a second function associated with the touch is performed 206 according to a second event handling process.
  • the secondary input may be a second touch, such as a touch and hold gesture, a series of second touches, such as one or more taps, actuation of a key on a keyboard, actuation of a physical button or key on the portable electronic device 100, or any other secondary input that overlaps the touch at least partially in time.
  • the terms “secondary” and “second” do not indicate a time order of touches and inputs.
  • a secondary input that overlaps the touch at least partially in time may begin before, after, or at the same time the touch begins.
  • the second event handling process may be one of a set of second event handling processes for handling touches in a second manner.
  • the second set of event handling processes may include processes for processing a simple touch as a rollover command to display information about a hyperlink target address or a menu item; for processing a swipe gesture as a command to switch to another active application; and for processing a pinch and zoom gesture as a command to rotate a displayed image.
  • the electronic device may process a subsequent touch according to one of the set of second event handling processes.
  • the electronic device may process a subsequent touch according to one of the set of first event handling processes.
  • An example of detecting a touch on a portable electronic device is shown in FIG. 3 and FIG. 4.
  • when a touch at a location 304 is detected within a first region, such as display area 300, and no secondary input is detected that overlaps the touch at least partially in time, the touch is processed according to a first event handling process.
  • the location 304 may be associated with a button or icon displayed in the display area 300.
  • a first event handling process may execute a command associated with the button or icon.
  • when a touch at a location 400 is detected in the display area 300, and a secondary input, such as a second touch at location 402, that overlaps the touch at least partially in time, is detected in a second region, such as non-display area 302, a second event handling process may process the touch.
  • the location 400 may be associated with a hyperlink, and the second event handling process may perform a function, such as a rollover function to display a hypertext link associated with the hyperlink.
  • More than one second event handling process, e.g., a set of second event handling processes, may be available for processing touches. Selection of a second event handling process may be determined by the secondary input to the electronic device 100.
  • when a second touch is detected at a location 402 in the lower left area of the non-display area 302, a second event handling process, associated with a first set of second event handling processes, may process the touch and perform a rollover function as described above.
  • when a second touch is detected at a location 404 in the upper left area of the non-display area 302, a second event handling process, associated with a second set of second event handling processes, may process the touch to perform a function to highlight, or indicate selection of, text associated with the location 400.
  • in an electronic device having a physical keyboard, a second event handling process may be selected or invoked by actuating a first pre-assigned key, while another second event handling process may be selected or invoked by actuating a second pre-assigned key.
  • Additional examples of detecting a touch on a portable electronic device are shown in FIG. 5 and FIG. 6.
  • when a gesture, such as pinch gesture 500, is detected within a first region, such as display area 300, and no secondary input is detected that overlaps the gesture at least partially in time, the gesture is processed according to a first event handling process.
  • a first event handling process may perform a zoom out function, causing a displayed image to be reduced in size or resolution.
  • when a pinch gesture 600 is detected in the display area 300, and a secondary input, such as touch and hold gesture 602, is detected in a second region, such as non-display area 302, at least partially overlapping in time with the gesture 600, a second event handling process may process the gesture 600.
  • the second event handling process may perform a function, such as a rotate function to rotate an image or a page displayed at a location associated with the gesture 600.
  • the secondary input may be, for example, the depression of a pre-assigned key, such as an ALT key.
  • the touch may be processed according to a first event handling process.
  • a first event handling process may execute an operation associated with a button or icon.
  • a second event handling process may process the touch, and perform a second function, such as a rollover function.
  • the present method and apparatus increases the number of functions that may be supported through a limited number of touches or gestures. By switching, changing, or toggling between two or more event handling processes, the number of commands and processes associated with touches is multiplied without the electronic device needing to detect any new or additional gestures that may be complicated. User experience is also improved, because a limited number of gestures is provided, and more than one command or process for a given touch may be invoked by providing a secondary input, such as a second touch that overlaps the first touch at least partially in time, or a key actuation that overlaps the first touch at least partially in time.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method includes detecting, on a touch-sensitive display of an electronic device, a touch. When no secondary input to the electronic device is detected during the touch, a first function associated with the touch is performed according to a first event handling process. When a secondary input is detected that overlaps at least partially in time with the touch, a second function associated with the touch is performed according to a second event handling process.

Description

PORTABLE ELECTRONIC DEVICE AND METHOD OF CONTROLLING SAME
Field of Technology
[0001] The present disclosure relates to electronic devices, including but not limited to, portable electronic devices having touch-sensitive displays and their control.
Background
[0002] Electronic devices, including portable electronic devices, have gained widespread use and may provide a variety of functions including, for example, telephonic, electronic messaging and other personal information manager (PIM) application functions. Portable electronic devices include, for example, several types of mobile stations such as simple cellular telephones, smart phones, wireless personal digital assistants (PDAs), and laptop computers with wireless 802.11 or Bluetooth capabilities.
[0003] Portable electronic devices such as PDAs or smart telephones are generally intended for handheld use and ease of portability. Smaller devices are generally desirable for portability. A touch-sensitive display, also known as a touchscreen display, is particularly useful on handheld devices, which are small and have limited space for user input and output. The information displayed on the touch-sensitive displays may be modified depending on the functions and operations being performed. With continued demand for decreased size of portable electronic devices, touch-sensitive displays continue to decrease in size.
[0004] Improvements in devices with touch-sensitive displays are desirable.
Brief Description of the Drawings
[0005] FIG. 1 is a block diagram of a portable electronic device in accordance with the disclosure.
[0006] FIG. 2 is a flowchart illustrating a method of performing a function associated with a touch in accordance with the disclosure.
[0007] FIG. 3 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
[0008] FIG. 4 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
[0009] FIG. 5 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
[0010] FIG. 6 illustrates a touch at a location on a touch-sensitive display of an electronic device in accordance with the disclosure.
Detailed Description
[0011] The following describes an apparatus and method of detecting a touch on a touch-sensitive display of an electronic device. When no secondary input to the electronic device is detected during the touch, a first function associated with the touch is performed according to a first event handling process. When a secondary input is detected that overlaps at least partially in time with the touch, a second function associated with the touch is performed according to a second event handling process.
[0012] For simplicity and clarity of illustration, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. Numerous details are set forth to provide an understanding of the examples described herein. The examples may be practiced without these details. In other instances, well-known methods, procedures, and components are not described in detail to avoid obscuring the examples described. The description is not to be considered as limited to the scope of the examples described herein.
[0013] The disclosure generally relates to an electronic device, such as a portable electronic device as described herein. Examples of electronic devices include mobile, or handheld, wireless communication devices such as pagers, cellular phones, cellular smart-phones, wireless organizers, personal digital assistants, wirelessly enabled notebook computers, tablet computers, mobile internet devices, electronic navigation devices, and so forth. The electronic device may be a portable electronic device without wireless communication capabilities, such as a handheld electronic game, digital photograph album, digital camera, media player, e-book reader, and so forth.
[0014] A block diagram of an example of a portable electronic device 100 is shown in FIG. 1. The portable electronic device 100 includes multiple components, such as a processor 102 that controls the overall operation of the portable electronic device 100. Communication functions, including data and voice communications, are performed through a communication subsystem 104. Data received by the portable electronic device 100 is decompressed and decrypted by a decoder 106. The communication subsystem 104 receives messages from and sends messages to a wireless network 150. The wireless network 150 may be any type of wireless network, including, but not limited to, data wireless networks, voice wireless networks, and networks that support both voice and data communications. A power source 142, such as one or more rechargeable batteries or a port to an external power supply, powers the portable electronic device 100.
[0015] The processor 102 interacts with other components, such as a Random Access Memory (RAM) 108, memory 110, a touch-sensitive display 118, one or more actuators 120, one or more force sensors 122, an auxiliary input/output (I/O) subsystem 124, a data port 126, a speaker 128, a microphone 130, short-range communications 132 and other device subsystems 134. The touch-sensitive display 118 includes a display 112 and touch sensors 114 that are coupled to at least one controller 116 that is utilized to interact with the processor 102. Input via a graphical user interface is provided via the touch-sensitive display 118. Information, such as text, characters, symbols, images, icons, and other items that may be displayed or rendered on a portable electronic device, is displayed on the touch-sensitive display 118 via the processor 102. The processor 102 may also interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces.
[0016] To identify a subscriber for network access, the portable electronic device 100 may utilize a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 for communication with a network, such as the wireless network 150. Alternatively, user identification information may be programmed into memory 110.
[0017] The portable electronic device 100 includes an operating system 146 and software programs, applications, or components 148 that are executed by the processor 102 and are typically stored in a persistent, updatable store such as the memory 110. Additional applications or programs may be loaded onto the portable electronic device 100 through the wireless network 150, the auxiliary I/O subsystem 124, the data port 126, the short-range communications subsystem 132, or any other suitable subsystem 134.
[0018] A received signal such as a text message, an e-mail message, or web page download is processed by the communication subsystem 104 and input to the processor 102. The processor 102 processes the received signal for output to the display 112 and/or to the auxiliary I/O subsystem 124. A subscriber may generate data items, for example e-mail messages, which may be transmitted over the wireless network 150 through the communication subsystem 104. For voice communications, the overall operation of the portable electronic device 100 is similar. The speaker 128 outputs audible information converted from electrical signals, and the microphone 130 converts audible information into electrical signals for processing.
[0019] The touch-sensitive display 118 may be any suitable touch-sensitive display, such as a capacitive, resistive, infrared, surface acoustic wave (SAW) touch-sensitive display, strain gauge, optical imaging, dispersive signal technology, acoustic pulse recognition, and so forth. A capacitive touch-sensitive display includes one or more capacitive touch sensors 114. The capacitive touch sensors may comprise any suitable material, such as indium tin oxide (ITO).
[0020] One or more touches, also known as touch contacts or touch events, may be detected by the touch-sensitive display 118. The processor 102 may determine attributes of the touch, including a location of the touch. Touch location data may include data for an area of contact or data for a single point of contact, such as a point at or near a center of the area of contact. The location of a detected touch may include x and y components, e.g., horizontal and vertical components, respectively, with respect to one's view of the touch-sensitive display 118. For example, the x location component may be determined by a signal generated from one touch sensor, and the y location component may be determined by a signal generated from another touch sensor. A touch may be detected from any suitable input member, such as a finger, thumb, appendage, or other objects, for example, a stylus, pen, or other pointer, depending on the nature of the touch-sensitive display 118. Multiple simultaneous touches may be detected.
[0021] One or more gestures may also be detected by the touch-sensitive display 118. A gesture, such as a swipe, also known as a flick, is a particular type of touch on a touch-sensitive display 118 and may begin at an origin point and continue to an end point, for example, a concluding end of the gesture. A gesture may be identified by attributes of the gesture, including the origin point, the end point, the distance traveled, the duration, the velocity, and the direction, for example. A gesture may be long or short in distance and/or duration. Two points of the gesture may be utilized to determine a direction of the gesture. A gesture may also include a hover. A hover may be a touch at a location that is generally unchanged over a period of time or is associated with the same selection item for a period of time.
[0022] The optional actuator(s) 120 may be depressed or activated by applying sufficient force to the touch-sensitive display 118 to overcome the actuation force of the actuator 120. The actuator(s) 120 may be actuated by pressing anywhere on the touch-sensitive display 118. The actuator(s) 120 may provide input to the processor 102 when actuated. Actuation of the actuator(s) 120 may result in provision of tactile feedback. When force is applied, the touch-sensitive display 118 is depressible, pivotable, and/or movable. Such a force may actuate the actuator(s) 120. The touch-sensitive display 118 may, for example, float with respect to the housing of the portable electronic device, i.e., the touch-sensitive display 118 may not be fastened to the housing. A mechanical dome switch actuator may be utilized. In this example, tactile feedback is provided when the dome collapses due to imparted force and when the dome returns to the rest position after release of the switch. Alternatively, the actuator 120 may comprise one or more piezoelectric (piezo) devices that provide tactile feedback for the touch-sensitive display 118.
[0023] Optional force sensors 122 may be disposed in conjunction with the touch-sensitive display 118 to determine or react to forces applied to the touch-sensitive display 118. The force sensor 122 may be disposed in line with a piezo actuator 120. The force sensors 122 may be force-sensitive resistors, strain gauges, piezoelectric or piezoresistive devices, pressure sensors, quantum tunneling composites, force-sensitive switches, or other suitable devices. Force as utilized throughout the specification, including the claims, refers to force measurements, estimates, and/or calculations, such as pressure, deformation, stress, strain, force density, force-area relationships, thrust, torque, and other effects that include force or related quantities. Optionally, force information related to a detected touch may be utilized to select information, such as information associated with a location of a touch. For example, a touch that does not meet a force threshold may highlight a selection option, whereas a touch that meets a force threshold may select or input that selection option. Selection options include, for example, displayed or virtual keys of a keyboard; selection boxes or windows, e.g., "cancel," "delete," or "unlock"; function buttons, such as play or stop on a music player; and so forth. Different magnitudes of force may be associated with different functions or input. For example, a lesser force may result in panning, and a higher force may result in zooming.
[0024] The touch-sensitive display 118 includes a display area 300 in which information may be displayed, and a non-display area 302 extending around the periphery of the display area 300, such as shown in the example of FIG. 3. The display area 300 generally corresponds to the area of the display 112. Information is not displayed in the non-display area 302 by the display, which non-display area 302 is utilized to accommodate, for example, electronic traces or electrical connections, adhesives or other sealants, and/or protective coatings around the edges of the display area 300. The non-display area 302 may be referred to as an inactive area and is not part of the physical housing or frame of the electronic device. Typically, no pixels of the display are in the non-display area 302, thus no image can be displayed by the display 112 in the non-display area 302. Optionally, a secondary display, not part of the primary display 112, may be disposed under the non-display area 302. Touch sensors may be disposed in the non-display area 302, which touch sensors may be extended from the touch sensors in the display area 300 or distinct or separate touch sensors from the touch sensors in the display area 300. A touch, including a gesture, may be associated with the display area 300, the non-display area 302, or both areas. The touch sensors may extend across substantially the entire non-display area 302 or may be disposed in only part of the non-display area 302.
[0025] When a touch event is detected by the touch controller 116, an event handler process, also known as an event listener or an event handler, associated with an application is notified of the event, and performs a function or action in response to the event. For example, when a touch is detected at the location of an icon, the icon may be selected, and the function associated with the icon may be executed.
[0026] The portable electronic device 100 may support only a limited number of touches. For example, event handling processes may be provided to respond to a simple touch, navigation gestures, such as swipes, and re-sizing gestures, such as pinch and zoom gestures. The limited number of supported gestures may not provide all the functionality of a comparable desktop environment. For example, in a desktop environment, a mouse hover over a hypertext link may result in the link target address being displayed. Similarly, a mouse hover over an icon may result in the icon name, or function, being displayed. However, due to the difficulty in differentiating between a simple touch and a hover, event handling processes in a portable electronic device, such as device 100, are generally set to execute a command when a simple touch is detected, and do not differentiate between a simple touch and a hover. Accordingly, devices with touch-sensitive displays may be unable to respond, or may respond unexpectedly or inappropriately, to touches, such as a hover.
[0027] A flowchart illustrating a method of performing functions associated with a touch according to a first or a second event handling process is shown in FIG. 2. The method may be carried out by software executed, for example, by the processor 102. Coding of software for carrying out such a method is within the scope of a person of ordinary skill in the art given the present description. The method may contain additional or fewer processes than shown and/or described, and may be performed in a different order. Computer-readable code executable by at least one processor of the portable electronic device to perform the method may be stored in a computer-readable medium, such as a non-transitory computer-readable medium.
[0028] A touch is detected 202 on a touch-sensitive display of the electronic device 100. When no secondary input overlapping the touch at least partially in time is detected, also referred to as an overlapping input, a first function associated with the touch is performed 206 according to a first event handling process. The first event handling process may be one of a set of first event handling processes for handling different touches in a first manner. For example, the first set of event handling processes may include processes for processing a simple touch as a command to select a button or icon associated with a location of the touch, and execute a function associated with the button or icon; for processing a swipe gesture as a command to navigate from one page of a website, or document, to an adjacent page; and for processing a pinch and zoom gesture as a command to zoom in or out on a displayed page.
[0029] When a secondary input is detected 204, a second function associated with the touch is performed 206 according to a second event handling process. For example, the secondary input may be a second touch, such as a touch and hold gesture, a series of second touches, such as one or more taps, actuation of a key on a keyboard, actuation of a physical button or key on the portable electronic device 100, or any other secondary input that overlaps the touch at least partially in time. The terms "secondary" and "second" do not indicate a time order of touches and inputs. A secondary input that overlaps the touch at least partially in time may begin before, after, or at the same time the touch begins. A secondary input that overlaps the touch at least partially in time may end before, after, or at the same time the touch ends. The second event handling process may be one of a set of second event handling processes for handling touches in a second manner. For example, the second set of event handling processes may include processes for processing a simple touch as a rollover command to display information about a hyperlink target address or a menu item; for processing a swipe gesture as a command to switch to another active application; and for processing a pinch and zoom gesture as a command to rotate a displayed image.
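The branch described in paragraphs [0028] and [0029] amounts to choosing between two handler tables based on whether a secondary input overlaps the touch in time. The following sketch is an editorial illustration only, not part of the disclosure; all names (FIRST_HANDLERS, handle_touch, the touch and secondary dictionaries) are hypothetical.

    # Editorial illustration only (hypothetical names): dispatch a touch to a first
    # or second event handling process depending on whether a secondary input
    # overlaps the touch at least partially in time.

    FIRST_HANDLERS = {            # first set: default handling of each touch type
        "tap":   lambda t: "select item at {}".format(t["location"]),
        "swipe": lambda t: "navigate to adjacent page",
        "pinch": lambda t: "zoom displayed page",
    }

    SECOND_HANDLERS = {           # second set: alternate handling of the same touch types
        "tap":   lambda t: "show rollover info for item at {}".format(t["location"]),
        "swipe": lambda t: "switch to another active application",
        "pinch": lambda t: "rotate displayed image",
    }

    def overlaps_in_time(touch, secondary):
        """True if the secondary input overlaps the touch at least partially in time."""
        if secondary is None:
            return False
        return secondary["start"] < touch["end"] and touch["start"] < secondary["end"]

    def handle_touch(touch, secondary=None):
        """Perform the first or second function associated with the touch (FIG. 2)."""
        handlers = SECOND_HANDLERS if overlaps_in_time(touch, secondary) else FIRST_HANDLERS
        return handlers[touch["type"]](touch)

    # A tap alone selects; the same tap with an overlapping secondary input
    # (e.g. a second touch in the non-display area) triggers rollover instead.
    tap = {"type": "tap", "location": (120, 300), "start": 0.00, "end": 0.15}
    hold = {"start": -0.05, "end": 0.40}   # secondary input begun slightly before the tap
    print(handle_touch(tap))               # -> select item at (120, 300)
    print(handle_touch(tap, hold))         # -> show rollover info for item at (120, 300)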
[0030] According to an example, when the secondary input is maintained, such as by continuing to depress a pre-assigned key on a keyboard or by holding a touch, the electronic device may process a subsequent touch according to one of the set of second event handling processes. When the secondary input ends, or is terminated, such as by release of a keyboard key or the end of the touch, the electronic device may process a subsequent touch according to one of the set of first event handling processes.
[0031] An example of detecting a touch on a portable electronic device is shown in FIG. 3 and FIG. 4. When a touch at a location 304 is detected within a first region, such as display area 300, and no secondary input is detected that overlaps the touch at least partially in time, the touch is processed according to a first event handling process. For example, the location 304 may be associated with a button or icon displayed in the display area 300. When a touch is detected at the location 304 and no secondary input to the electronic device 100 is detected that overlaps the touch at least partially in time, a first event handling process may execute a command associated with the button or icon. When, as shown in FIG. 4, a touch at a location 400 is detected in the display area 300, and a secondary input, such as a second touch at location 402, that overlaps the touch at least partially in time, is detected in a second region, such as non-display area 302, a second event handling process may process the touch. For example, the location 400 may be associated with a hyperlink, and the second event handling process may perform a function, such as a rollover function to display a hypertext link associated with the hyperlink.
[0032] More than one second event handling process, e.g., a set of second event handling processes, may be available for processing touches. Selection of a second event handling process may be determined by the secondary input to the electronic device 100. For example, as shown in FIG. 4, when a second touch is detected at a location 402 in the lower left area of the non-display area 302, a second event handling process, associated with a first set of second event handling processes, may process the touch, and perform a rollover function as described above. When a second touch is detected at a location 404 in the upper left area of the non-display area 302, a second event handling process, associated with a second set of second event handling processes, may process the touch to perform a function to highlight, or indicate selection of, text associated with the location 400. By way of further example, in an electronic device having a physical keyboard, a second event handling process may be selected or invoked by actuating a first pre-assigned key, while another second event handling process may be selected or invoked by actuating a second pre-assigned key.
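Paragraph [0032] describes selecting among several second event handling processes based on where the secondary touch lands. A minimal sketch of that selection follows, assuming an invented screen geometry and hypothetical names; it is an editorial illustration only and not part of the disclosure.

    # Editorial illustration only (hypothetical names and screen geometry): the
    # location of the secondary touch in the non-display area selects which second
    # event handling process runs, as in FIG. 4 (location 402 -> rollover,
    # location 404 -> highlight).

    SECOND_PROCESSES = {
        "lower_left": lambda touch: "rollover: show link target at {}".format(touch["location"]),
        "upper_left": lambda touch: "highlight text at {}".format(touch["location"]),
    }

    def non_display_region(point, width=400, height=700, border=20):
        """Classify a secondary touch in the border (non-display) area; return None
        if the point falls inside the display area. Geometry is assumed."""
        x, y = point
        if border <= x <= width - border and border <= y <= height - border:
            return None
        vertical = "upper" if y < height / 2 else "lower"
        horizontal = "left" if x < width / 2 else "right"
        return vertical + "_" + horizontal

    def handle_with_secondary(touch, secondary_point):
        process = SECOND_PROCESSES.get(non_display_region(secondary_point))
        if process is None:
            return "fall back to a first event handling process"
        return process(touch)

    touch = {"location": (180, 350)}                 # first touch on a hyperlink
    print(handle_with_secondary(touch, (10, 650)))   # lower-left bezel -> rollover
    print(handle_with_secondary(touch, (10, 60)))    # upper-left bezel -> highlight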
[0033] Additional examples of detecting a touch on a portable electronic device are shown in FIG. 5 and FIG. 6. When a gesture, such as pinch gesture 500, is detected within a first region, such as display area 300, and no secondary input is detected that overlaps the gesture at least partially in time, the gesture is processed according to a first event handling process. For example, when a pinch gesture 500 is detected in the display area 300, without detection of a secondary input to the electronic device 100, a first event handling process may perform a zoom out function, causing a displayed image to be reduced in size or resolution. When, as shown in FIG. 6, a pinch gesture 600 is detected in the display area 300, and a secondary input, such as touch and hold gesture 602, is detected in a second region, such as non-display area 302, at least partially overlapping in time with the gesture 600, a second event handling process may process the gesture 600. For example, the second event handling process may perform a function, such as a rotate function to rotate an image or a page displayed at a location associated with the gesture 600.
[0034] In a portable electronic device 100 that includes a physical keyboard (not shown), the secondary input may be, for example, the depression of a pre-assigned key, such as an ALT key. When a touch is detected on the touch-sensitive display 118, and no pre-assigned key is actuated overlapping the touch at least partially in time, the touch may be processed according to a first event handling process. For example, when a button or icon is associated with the location of the gesture 600, the first event handling process may execute an operation associated with a button or icon. When a touch is detected on the touch-sensitive display 118, and the actuation of a pre-assigned key, overlapping the touch at least partially in time, is detected, a second event handling process may process the touch, and perform a second function, such as a rollover function.
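Paragraphs [0030] and [0034] describe a pre-assigned key acting as a modifier: while it is held, touches are routed to the second set of event handling processes, and subsequent touches revert to the first set once it is released. The sketch below is an editorial illustration under those assumptions; the class, handler, and key names are hypothetical.

    # Editorial illustration only (hypothetical names): a pre-assigned modifier key
    # held down routes touches to the second set of event handling processes;
    # releasing it makes subsequent touches use the first set again.

    class TouchRouter:
        def __init__(self, first_handlers, second_handlers, modifier_key="ALT"):
            self.first = first_handlers
            self.second = second_handlers
            self.modifier_key = modifier_key
            self.modifier_down = False

        def key_event(self, key, pressed):
            # Track whether the pre-assigned key is currently held.
            if key == self.modifier_key:
                self.modifier_down = pressed

        def touch(self, kind):
            handlers = self.second if self.modifier_down else self.first
            return handlers[kind]()

    router = TouchRouter(
        first_handlers={"tap": lambda: "execute icon function", "pinch": lambda: "zoom out"},
        second_handlers={"tap": lambda: "show rollover info", "pinch": lambda: "rotate image"},
    )

    print(router.touch("pinch"))    # -> zoom out (no secondary input)
    router.key_event("ALT", True)   # pre-assigned key held down
    print(router.touch("pinch"))    # -> rotate image
    print(router.touch("tap"))      # subsequent touch still uses the second set
    router.key_event("ALT", False)  # key released
    print(router.touch("tap"))      # -> execute icon function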
[0035] The present method and apparatus increases the number of functions that may be supported through a limited number of touches or gestures. By switching, changing, or toggling between two or more event handling processes, the number of commands and processes associated with touches is multiplied without the electronic device needing to detect any new or additional gestures that may be complicated. User experience is also improved, because a limited number of gestures is provided, and more than one command or process for a given touch may be invoked by providing a secondary input, such as a second touch that overlaps the first touch at least partially in time, or a key actuation that overlaps the first touch at least partially in time.
[0036] The present disclosure may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the disclosure is, therefore, indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims

What is claimed is:
1. A method comprising: detecting a touch on a touch-sensitive display of an electronic device; when no secondary input to the electronic device is detected during the touch, performing a first function associated with the touch, which first function is performed according to a first event handling process; when a secondary input is detected that overlaps at least partially in time with the touch, performing a second function associated with the touch, which second function is performed according to a second event handling process.
2. The method of claim 1, wherein the first event handling process is one of a set of first event handling processes.
3. The method according to claim 2, further comprising: detecting termination of the secondary input; performing a function associated with a subsequent touch according to one of the set of first event handling processes.
4. The method according to claim 1, wherein the secondary input is an actuation of a key of the electronic device.
5. The method according to claim 1, wherein: detecting the touch comprises detecting a first touch in a first region of the touch-sensitive display; wherein the secondary input is a second touch in a second region of the touch-sensitive display.
6. The method according to claim 5, wherein the second region is a non-display area.
7. The method according to claim 5, wherein the second touch is a touch and hold gesture.
8. The method of claim 1, wherein the second event handling process is one of a set of second event handling processes.
9. The method according to claim 8, further comprising: detecting a subsequent touch while the secondary input is maintained; performing a function associated with the subsequent touch according to one of the set of second event handling processes.
10. A non-transitory computer-readable medium having computer-readable code executable by at least one processor of the electronic device to perform the method of claim 1.
11. An electronic device comprising: a touch-sensitive display; a processor coupled to the touch-sensitive display and configured to: detect a touch on a touch-sensitive display of an electronic device; when no secondary input to the electronic device is detected during the touch, perform a first function associated with the touch, which first function is performed according to a first event handling process; when a secondary input is detected that overlaps at least partially in time with the touch, perform a second function associated with the touch, which second function is performed according to a second event handling process.
12. The electronic device according to claim 11, wherein the first event handling process is one of a set of first event handling processes.
13. The electronic device according to claim 12, wherein the processor is further configured to: detect termination of the secondary input; perform a function associated with a subsequent touch according to one of a set of first event handling processes.
14. The electronic device according to claim 11, further comprising a keyboard, wherein the secondary input is provided through an actuation of a key on the keyboard.
15. The electronic device according to claim 10, wherein: the touch is a first touch in a first region of the touch-sensitive display; the secondary input is a second touch in a second region of the touch-sensitive display.
16. The electronic device according to claim 14, wherein the second region is a non-display area.
17. The electronic device according to claim 14, wherein the second touch is a touch and hold gesture.
18. The electronic device according to claim 10, wherein the second event handling process is one of a set of second event handling processes.
19. The electronic device according to claim 14, wherein the processor is further configured to: detect a subsequent touch while the secondary input is maintained; perform a function associated with the subsequent touch according to one of the set of second event handling processes.
EP12868313.3A 2012-02-08 2012-02-08 Portable electronic device and method of controlling same Ceased EP2812778A4 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2012/024337 WO2013119225A1 (en) 2012-02-08 2012-02-08 Portable electronic device and method of controlling same

Publications (2)

Publication Number Publication Date
EP2812778A1 (en)
EP2812778A4 EP2812778A4 (en) 2015-12-02

Family

ID=48947858

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12868313.3A Ceased EP2812778A4 (en) 2012-02-08 2012-02-08 Portable electronic device and method of controlling same

Country Status (2)

Country Link
EP (1) EP2812778A4 (en)
WO (1) WO2013119225A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7656393B2 (en) * 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
US20090278806A1 (en) * 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US7956847B2 (en) * 2007-01-05 2011-06-07 Apple Inc. Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
US8432367B2 (en) * 2009-11-19 2013-04-30 Google Inc. Translating user interaction with a touch screen into input commands
US20110209098A1 (en) * 2010-02-19 2011-08-25 Hinckley Kenneth P On and Off-Screen Gesture Combinations
KR101641250B1 (en) * 2010-05-19 2016-07-29 엘지전자 주식회사 Mobile terminal and control method thereof
KR101764751B1 (en) * 2010-07-12 2017-08-03 엘지전자 주식회사 Mobile terminal and control method therof

Also Published As

Publication number Publication date
EP2812778A4 (en) 2015-12-02
WO2013119225A1 (en) 2013-08-15

Similar Documents

Publication Publication Date Title
US8872773B2 (en) Electronic device and method of controlling same
US20110179381A1 (en) Portable electronic device and method of controlling same
US8863020B2 (en) Portable electronic device and method of controlling same
EP2508970B1 (en) Electronic device and method of controlling same
US20120256846A1 (en) Electronic device and method of controlling same
US20110086674A1 (en) Electronic device including touch-sensitive display and method of controlling same
EP2660697B1 (en) Method and apparatus for text selection
US20120235919A1 (en) Portable electronic device including touch-sensitive display and method of controlling same
EP3211510B1 (en) Portable electronic device and method of providing haptic feedback
US20110258576A1 (en) Portable electronic device and method of controlling same
US9395901B2 (en) Portable electronic device and method of controlling same
EP2500807A1 (en) Portable electronic device including touch-sensitive display and method of entering text via virtual keyboard
EP2700000B1 (en) Text indicator method and electronic device
EP2348392A1 (en) Portable electronic device and method of controlling same
US20120007876A1 (en) Electronic device and method of tracking displayed information
CA2773818C (en) Electronic device and method of controlling same
US20140028563A1 (en) Electronic device including touch-sensitive display and method of controlling same
US20130293483A1 (en) Selectable object display method and apparatus
EP2690538A1 (en) Electronic device including touch-sensitive display and method of controlling same
EP2405333A1 (en) Electronic device and method of tracking displayed information
EP2812778A1 (en) Portable electronic device and method of controlling same
EP2660698A1 (en) Selectable object display method and apparatus
EP2466434B1 (en) Portable electronic device and method of controlling same
CA2735040C (en) Portable electronic device and method of controlling same
US20130021264A1 (en) Electronic device including a touch-sensitive display and navigation device and method of controlling same

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140808

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20151104

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0488 20130101AFI20151029BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20180103

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

APBK Appeal reference recorded

Free format text: ORIGINAL CODE: EPIDOSNREFNE

APBN Date of receipt of notice of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA2E

APBR Date of receipt of statement of grounds of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA3E

APAF Appeal reference modified

Free format text: ORIGINAL CODE: EPIDOSCREFNE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

APBT Appeal procedure closed

Free format text: ORIGINAL CODE: EPIDOSNNOA9E

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20220826