WO2012129808A1 - Recognizing touch screen inputs - Google Patents

Recognizing touch screen inputs

Info

Publication number
WO2012129808A1
WO2012129808A1 (PCT/CN2011/072341)
Authority
WO
WIPO (PCT)
Prior art keywords
target
area
touch
interface
touch display
Application number
PCT/CN2011/072341
Other languages
French (fr)
Inventor
Fan Yang
Shijun Yuan
Original Assignee
Nokia Corporation
Application filed by Nokia Corporation
Priority to PCT/CN2011/072341
Publication of WO2012129808A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • a computer program product may include at least one non-transitory computer-readable storage medium having computer-readable program instructions stored therein.
  • the computer-readable program instructions may comprise program instructions configured to cause an apparatus to perform a method comprising receiving a user input to a touch display user-interface. Further, the method may include determining, by a processor, a touch area that corresponds to the user input to the touch display user-interface and may also include determining, by a processor, a relation of the touch area to at least one target area disposed on the touch display user-interface. In addition, the method may include causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
  • an apparatus comprising means for receiving a user input to a touch display user-interface. Further, the apparatus may include means for determining a touch area that corresponds to the user input to the touch display user-interface. In addition, the apparatus may include means for determining a relation of the touch area to a representative subregion of at least one target area disposed on the touch display user-interface and means for causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
  • FIG. 1 illustrates a block diagram of an apparatus for facilitating interaction with a touch display user interface according to an example embodiment
  • FIG. 2 is a schematic block diagram of a mobile terminal according to an example embodiment
  • FIGs. 3A-3C illustrate example interaction with an example touch display user interface according to an example embodiment
  • FIG. 4 illustrates a flowchart according to an example method for facilitating interaction with a touch display user interface according to an example embodiment
  • FIG. 5 illustrates a flowchart according to an example method for facilitating interaction with a touch display user interface according to another example embodiment
  • FIG. 6 illustrates a flowchart according to an example method for facilitating interaction with a touch display user interface according to yet another example embodiment.
  • computer-readable medium refers to any medium configured to participate in providing information to a processor, including instructions for execution. Such a medium may take many forms, including, but not limited to a non-transitory computer-readable storage medium (e.g., non-volatile media, volatile media), and transmission media.
  • Transmission media include, for example, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves.
  • Signals include man-made transient variations in amplitude, frequency, phase, polarization or other physical properties transmitted through the transmission media.
  • examples of non-transitory computer-readable media include a magnetic computer-readable medium (e.g., a floppy disk, hard disk, magnetic tape, any other magnetic medium), an optical computer-readable medium (e.g., a compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a Blu-Ray disc, or the like), a random access memory (RAM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), a FLASH-EPROM, or any other non-transitory medium from which a computer can read.
  • the term computer-readable storage medium is used herein to refer to any computer-readable medium except transmission media. However, it will be appreciated that where embodiments are described to use a computer-readable storage medium, other types of computer-readable mediums may be substituted for or used in addition to the computer-readable storage medium in alternative embodiments.
  • circuitry refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present.
  • This definition of 'circuitry' applies to all uses of this term herein, including in any claims.
  • the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware.
  • the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
  • FIG. 1 illustrates a block diagram of an apparatus 102 for facilitating interaction with a user interface according to an example embodiment.
  • the apparatus 102 is provided as an example of one embodiment and should not be construed to narrow the scope or spirit of the invention in any way.
  • the scope of the disclosure encompasses many potential embodiments in addition to those illustrated and described herein.
  • FIG. 1 illustrates one example of a configuration of an apparatus for facilitating interaction with a user interface, other configurations may also be used to implement embodiments of the present invention.
  • the apparatus 102 may be embodied as a desktop computer, laptop computer, mobile terminal, mobile computer, mobile phone, mobile communication device, game device, digital camera/camcorder, audio/video player, television device, radio receiver, digital video recorder, positioning device, a chipset, a computing device comprising a chipset, any combination thereof, and/or the like.
  • the apparatus 102 may comprise any computing device that comprises or is in operative communication with a touch display capable of displaying a graphical user interface.
  • the apparatus 102 is embodied as a mobile computing device, such as the mobile terminal illustrated in FIG. 2.
  • FIG. 2 illustrates a block diagram of a mobile terminal 10 representative of one example embodiment of an apparatus 102.
  • the mobile terminal 10 illustrated and hereinafter described is merely illustrative of one type of apparatus 102 that may implement and/or benefit from various example embodiments of the invention and, therefore, should not be taken to limit the scope of the disclosure.
  • While several embodiments of the electronic device are illustrated and will be hereinafter described for purposes of example, other types of electronic devices, such as mobile telephones, mobile computers, personal digital assistants (PDAs), pagers, laptop computers, desktop computers, gaming devices, televisions, e-papers, and other types of electronic systems, may employ various embodiments of the invention.
  • the mobile terminal 10 may include an antenna 12 (or multiple antennas 12) in communication with a transmitter 14 and a receiver 16.
  • the mobile terminal 10 may also include a processor 20 configured to provide signals to and receive signals from the transmitter and receiver, respectively.
  • the processor 20 may, for example, be embodied as various means including circuitry, one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an ASIC (application specific integrated circuit) or FPGA (field programmable gate array), or some combination thereof. Accordingly, although illustrated in FIG. 2 as a single processor, in some example embodiments the processor 20 comprises a plurality of processors.
  • These signals sent and received by the processor 20 may include signaling information in accordance with an air interface standard of an applicable cellular system, and/or any number of different wireline or wireless networking techniques, comprising but not limited to Wi-Fi, wireless local area network (WLAN) techniques such as Institute of Electrical and Electronics Engineers (IEEE) 802.11, 802.16, and/or the like.
  • these signals may include speech data, user generated data, user requested data, and/or the like.
  • the mobile terminal may be capable of operating with one or more air interface standards, communication protocols, modulation types, access types, and/or the like.
  • the mobile terminal may be capable of operating in accordance with various first generation (1G), second generation (2G), 2.5G, third-generation (3G) communication protocols, fourth-generation (4G) communication protocols, Internet Protocol Multimedia Subsystem (IMS) communication protocols (e.g., session initiation protocol (SIP)), and/or the like.
  • the mobile terminal may be capable of operating in accordance with 2G wireless communication protocols IS-136 (Time Division Multiple Access (TDMA)), Global System for Mobile communications (GSM), IS-95 (Code Division Multiple Access (CDMA)), and/or the like.
  • the mobile terminal may be capable of operating in accordance with 2.5G wireless communication protocols General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), and/or the like.
  • the mobile terminal may be capable of operating in accordance with 3G wireless communication protocols such as Universal Mobile Telecommunications System (UMTS) and/or the like.
  • the mobile terminal may be additionally capable of operating in accordance with 3.9G wireless communication protocols such as Long Term Evolution (LTE) or Evolved Universal Terrestrial Radio Access Network (E-UTRAN) and/or the like. Additionally, for example, the mobile terminal may be capable of operating in accordance with fourth-generation (4G) wireless communication protocols and/or the like as well as similar wireless communication protocols that may be developed in the future.
  • the mobile terminal 10 may be capable of operating according to Wi-Fi or Worldwide Interoperability for Microwave Access (WiMAX) protocols.
  • the processor 20 may comprise circuitry for implementing audio/video and logic functions of the mobile terminal 10.
  • the processor 20 may comprise a digital signal processor device, a microprocessor device, an analog-to-digital converter, a digital-to-analog converter, and/or the like. Control and signal processing functions of the mobile terminal may be allocated between these devices according to their respective capabilities.
  • the processor may additionally comprise an internal voice coder (VC) 20a, an internal data modem (DM) 20b, and/or the like.
  • the processor may comprise functionality to operate one or more software programs, which may be stored in memory.
  • the processor 20 may be capable of operating a connectivity program, such as a web browser.
  • the connectivity program may allow the mobile terminal 10 to transmit and receive web content, such as location-based content, according to a protocol, such as Wireless Application Protocol (WAP), hypertext transfer protocol (HTTP), and/or the like.
  • the mobile terminal 10 may be capable of using a Transmission Control Protocol/Internet Protocol (TCP/IP) to transmit and receive web content.
  • the mobile terminal 10 may also comprise a user interface including, for example, an earphone or speaker 24, a ringer 22, a microphone 26, a display 28, a user input interface, and/or the like, which may be operationally coupled to the processor 20.
  • the processor 20 may comprise user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as, for example, the speaker 24, the ringer 22, the microphone 26, the display 28, and/or the like.
  • the processor 20 and/or user interface circuitry comprising the processor 20 may be configured to control one or more functions of one or more elements of the user interface through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor 20 (e.g., the volatile memory 40, the non-volatile memory 42, and/or the like).
  • the mobile terminal may comprise a battery for powering various circuits related to the mobile terminal, for example, a circuit to provide mechanical vibration as a detectable output.
  • the display 28 of the mobile terminal may be of any type appropriate for the electronic device in question with some examples including a plasma display panel (PDP), a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode display (OLED), a projector, a holographic display or the like.
  • the display 28 may, for example, comprise a three-dimensional touch display, examples of which will be described further herein below.
  • the user input interface may comprise devices allowing the mobile terminal to receive data, such as a keypad 30, a touch display (e.g., some example embodiments wherein the display 28 is configured as a touch display), a joystick (not shown), and/or other input device.
  • the keypad may comprise numeric (0-9) and related keys (#, *), and/or other keys for operating the mobile terminal.
  • the mobile terminal 10 may comprise memory, such as a subscriber identity module (SIM) 38, a removable user identity module (R-UIM), and/or the like, which may store information elements related to a mobile subscriber.
  • the mobile terminal 10 may include volatile memory 40 and/or non-volatile memory 42.
  • volatile memory 40 may include Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like.
  • Non-volatile memory 42 which may be embedded and/or removable, may include, for example, read-only memory, flash memory, magnetic storage devices (e.g., hard disks, floppy disk drives, magnetic tape, etc.), optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like.
  • the memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the mobile terminal for performing functions of the mobile terminal.
  • the memories may comprise an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying the mobile terminal 10.
  • the apparatus 102 includes various means for performing the various functions herein described. These means may comprise one or more of a processor 110, memory 112, communication interface 114, user interface 116, or user interface (UI) control circuitry 122.
  • the means of the apparatus 102 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions (e.g., software or firmware) stored on a computer-readable medium (e.g., memory 112) that is executable by a suitably configured processing device (e.g., the processor 110), or some combination thereof.
  • one or more of the means illustrated in FIG. 1 may be embodied as a chip or chip set.
  • the apparatus 102 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard).
  • the structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon.
  • the processor 110, memory 112, communication interface 114, user interface 116, and/or UI control circuitry 122 may be embodied as a chip or chip set.
  • the apparatus 102 may therefore, in some cases, be configured to or may comprise component(s) configured to implement embodiments of the present invention on a single chip or as a single "system on a chip."
  • a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein and/or for enabling user interface navigation with respect to the functionalities and/or services described herein.
  • the processor 110 may, for example, be embodied as various means including one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an ASIC (application specific integrated circuit) or FPGA (field programmable gate array), one or more other types of hardware processors, or some combination thereof.
  • the processor 110 comprises a plurality of processors.
  • the plurality of processors may be in operative communication with each other and may be collectively configured to perform one or more functionalities of the apparatus 102 as described herein.
  • the plurality of processors may be embodied on a single computing device or distributed across a plurality of computing devices collectively configured to function as the apparatus 102.
  • the apparatus 102 is embodied as a mobile terminal 10
  • the processor 110 may be embodied as or comprise the processor 20.
  • the processor 110 is configured to execute instructions stored in the memory 112 or otherwise accessible to the processor 110.
  • the processor 110 may comprise an entity capable of performing operations according to embodiments of the present invention while configured accordingly.
  • the processor 110 may comprise specifically configured hardware for conducting one or more operations described herein.
  • the processor 110 when the processor 110 is embodied as an executor of instructions, such as may be stored in the memory 112, the instructions may specifically configure the processor 110 to perform one or more algorithms and operations described herein.
  • the memory 112 may comprise, for example, volatile memory, non-volatile memory, or some combination thereof.
  • the memory 112 may comprise a non-transitory computer-readable storage medium.
  • the memory 112 may comprise a plurality of memories.
  • the plurality of memories may be embodied on a single computing device or may be distributed across a plurality of computing devices collectively configured to function as the apparatus 102.
  • the memory 112 may comprise a hard disk, random access memory, cache memory, flash memory, a compact disc read only memory (CD-ROM), digital versatile disc read only memory (DVD-ROM), an optical disc, circuitry configured to store information, or some combination thereof.
  • the memory 112 may comprise the volatile memory 40 and/or the non-volatile memory 42.
  • the memory 112 may be configured to store information, data, applications, instructions, or the like for enabling the apparatus 102 to carry out various functions in accordance with various example embodiments.
  • the memory 112 is configured to buffer input data for processing by the processor 110.
  • the memory 112 may be configured to store program instructions for execution by the processor 110.
  • the memory 112 may store information in the form of static and/or dynamic information.
  • the stored information may include, for example, images, content, media content, user data, application data, and/or the like. This stored information may be stored and/or used by the UI control circuitry 122 during the course of performing its functionalities.
  • the communication interface 114 may be embodied as any device or means embodied in circuitry, hardware, a computer program product comprising computer readable program instructions stored on a computer readable medium (e.g., the memory 112) and executed by a processing device (e.g., the processor 110), or a combination thereof that is configured to receive and/or transmit data from/to another computing device.
  • the communication interface 114 is at least partially embodied as or otherwise controlled by the processor 110.
  • the communication interface 114 may be in communication with the processor 110, such as via a bus.
  • the communication interface 114 may include, for example, an antenna, a transmitter, a receiver, a transceiver and/or supporting hardware or software for enabling communications with one or more remote computing devices.
  • the communications interface 114 may be embodied as or comprise the antenna 12, the transmitter 14 and/or the receiver 16.
  • the communication interface 114 may be configured to receive and/or transmit data using any protocol that may be used for communications between computing devices.
  • the communication interface 114 may be configured to receive and/or transmit data using any protocol that may be used for transmission of data over a wireless network, wireline network, some combination thereof, or the like.
  • the communication interface 114 may be configured to receive and/or otherwise access content (e.g., web page content, streaming media content, and/or the like) over a network from a server or other content source.
  • the communication interface 114 may additionally be in communication with the memory 112, user interface 116, and/or UI control circuitry 122, such as via a bus.
  • the user interface 116 may be in communication with the processor 110 to receive an indication of a user input and/or to provide an audible, visual, mechanical, or other output to a user.
  • the user interface 116 may include, for example, a keyboard, a mouse, a joystick, a display, a touch screen display, a microphone, a speaker, and/or other input/output mechanisms, such as the display 28 and/or keypad 30 in the embodiment of Figure 2.
  • the user interface 116 may be in communication with the memory 112,
  • the apparatus 102 comprises a user interface 116 comprising a touch display.
  • the apparatus 102 may be operatively connected with the touch display user interface 116 such that the apparatus 102 may control the touch display, receive an indication of and/or otherwise determine a user input (e.g., a touch input) to the touch display user interface 116, and/or the like.
  • the touch display user interface 116 may comprise any type of display capable of displaying a user interface, image, virtual keyboard, virtual keypad, and/or the like.
  • the touch display user interface 116 may also be configured to enable the detection of a touch input.
  • the touch display user interface 116 may comprise a capacitive touch display, which may be configured to enable detection of capacitance of a finger or other input object by which an input may be made by physically contacting the display surface.
  • the touch display may also be configured to enable the detection of a hovering gesture input.
  • a hovering gesture input may comprise a gesture input to the touch display without making physical contact with a surface of the touch display, such as a gesture made in a space some distance above/in front of the surface of the touch display.
  • the touch display may comprise a projected capacitive touch display, which may be configured to enable detection of capacitance of a finger or other input object by which a gesture may be made without physically contacting a display surface.
  • the touch display may be configured to enable detection of a hovering gesture input through use of acoustic wave touch sensor technology, electromagnetic touch sensing technology, near field imaging technology, optical sensing technology, infrared proximity sensing technology, some combination thereof, or the like.
  • the touch display user interface 116 may further be in communication with one or more of the processor 110, memory 112, communication interface 114, and/or UI control circuitry 122, such as via a bus.
  • the UI control circuitry 122 may be embodied as various means, such as circuitry, hardware, a computer program product comprising computer readable program instructions stored on a computer readable medium (e.g., the memory 112) and executed by a processing device (e.g., the processor 110), or some combination thereof and, in some embodiments, is embodied as or otherwise controlled by the processor 110.
  • the UI control circuitry 122 may be in communication with the processor 110.
  • the UI control circuitry 122 may further be in communication with one or more of the memory 112, communication interface 114, or user interface 116, such as via a bus.
  • the UI control circuitry 122 may be configured to receive an indication of a touch input to the touch display user interface 116 and/or otherwise determine a touch input to the touch display user interface 116.
  • the touch display user interface 116 may be configured to detect a touch input to the touch display and generate a signal indicative of the touch input. This signal may be received by the UI control circuitry 122, which may determine the touch input in response to receiving the signal.
  • the signal may carry information indicative of a position of the touch input.
  • the position may comprise an area of the touch display engaged by a user contacting the touch display user interface 116. Such an area may be distinguished from a single point location in that it may comprise two or more point locations disposed within the area. Examples of areas include a single continuous area (e.g., an area bounded by a single continuous perimeter), a combination of two or more discontinuous areas, and/or the like.
  • an area may comprise a plurality of point locations, such as a plurality of point locations falling within a particular perimeter and/or outside another perimeter.
  • the position may, for example, comprise a plurality of coordinate positions relative to a two-dimensional coordinate system (e.g., an X and Y axis) of the touch display user interface 116, such that the positions may be described in terms of an area of the surface of the touch display engaged by the user contacting the touch display.
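
As an illustration, a touch area reported as a plurality of coordinate positions can be reduced to a containment test and a centroid. The sketch below is illustrative only: the TouchArea type, the bounding-box approximation of the engaged area, and the sample coordinates are assumptions made for exposition, not part of the patent.

    # Illustrative sketch (assumed names): a touch area as a set of sampled
    # (x, y) contact positions in the display's two-dimensional coordinate
    # system, rather than a single point location.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class TouchArea:
        points: tuple  # sampled (x, y) contact coordinates

        def contains(self, x, y):
            # Approximate the engaged area by the bounding box of the
            # sampled points; a real controller might report a contour.
            xs = [p[0] for p in self.points]
            ys = [p[1] for p in self.points]
            return min(xs) <= x <= max(xs) and min(ys) <= y <= max(ys)

        def centroid(self):
            # Arithmetic mean of the sampled contact points.
            n = len(self.points)
            return (sum(p[0] for p in self.points) / n,
                    sum(p[1] for p in self.points) / n)

    touch = TouchArea(points=((7.2, 8.1), (8.0, 9.4), (9.1, 8.8), (8.5, 10.2)))
    print(touch.contains(8.0, 9.0))  # True
    print(touch.centroid())          # (8.2, 9.125)
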
  • the UI control circuitry 122 may accordingly be configured to determine a position of a touch input based at least in part on a received signal or other indication of a touch input.
  • the UI control circuitry 122 may be further configured to determine a relation of a position of a touch input to a user interface that may be displayed by the touch display user interface 116.
  • the user interface may comprise any image that may be displayed by the touch display.
  • the user interface may comprise a keypad interface, which is described further herein below.
  • the user interface may comprise any user interface, graphic(s) or object(s), some combination thereof, or the like, which may be displayed by the touch display user interface 116.
  • the UI control circuitry 122 may be configured to determine an element of the user interface displayed in relation to the determined position.
  • the UI control circuitry 122 may be configured to track and/or determine positions at which graphical elements of the user interface may be displayed to a user. Accordingly, the UI control circuitry 122 may determine an element of the user interface that is displayed at the determined position of the touch input. As another example, the UI control circuitry 122 may determine the element of the user interface that is displayed closest to the determined position of the touch input.
  • the UI control circuitry 122 may determine which of the two or more elements displayed is closest to the determined position of the touch input, as explained in further detail below and illustrated in the flowchart of Figure 5. Further, the UI control circuitry 122 may determine an element, which is not displayed on the user interface, but is associated with another element that is displayed on the user interface, that is displayed at the determined position of the touch input.
  • the UI control circuitry may determine an element, which is not displayed on the user interface, such as a target midpoint 255, 255', but is associated with another element that is displayed on the user interface, such as a target area 250, 250', that is displayed at the determined position of the touch input.
  • the UI control circuitry 122 may be configured to control display of a virtual keypad interface.
  • the virtual keypad may comprise a plurality of virtual numbered keys displayed on a touch display user interface 116.
  • the UI control circuitry 122 may be configured to control display of a virtual keyboard interface comprising a plurality of virtual alphanumeric keys on a touch display user interface 116.
  • the UI control circuitry 122 may be configured to determine the position of an area covered by at least one virtual key displayed on a touch display user interface 116.
  • the UI control circuitry 122 may be configured to determine a representative point or other subregion of at least one virtual key, such as a midpoint of an area covered by at least one virtual key displayed on a touch display user interface 116. As with the midpoint of the area covered by a virtual key, the representative point or other subregion is within the respective virtual key and is a subset of and smaller than the respective virtual key. Further, the UI control circuitry 122 may be further configured to determine a position of a touch input and a relation of the touch input to the representative region, such as the midpoint, of an area covered by at least one virtual key displayed on a touch display user interface 116.
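
A minimal sketch of such a representative subregion follows, assuming rectangular virtual keys whose geometric midpoint is derived from the displayed target area. The VirtualKey type and its fields are illustrative names, not the patent's.

    # Illustrative sketch: a rectangular virtual key whose representative
    # subregion is its geometric midpoint, a point that lies inside (and
    # is smaller than) the key's displayed target area.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class VirtualKey:
        label: str
        x: float       # left edge, in display units (e.g., mm)
        y: float       # top edge
        width: float
        height: float

        @property
        def midpoint(self):
            return (self.x + self.width / 2.0, self.y + self.height / 2.0)

    five = VirtualKey(label="5", x=6.0, y=6.0, width=6.0, height=6.0)
    print(five.midpoint)  # (9.0, 9.0)
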
  • FIGs. 3A-3C illustrate example interaction with an example touch display 210, 210', 210" of an apparatus 200 according to an example embodiment.
  • the apparatus 200 may further comprise a touch display 210, 210', 210" comprising a plurality of elements 220, 220', 220".
  • a touch display 210, 210', 210" comprising a plurality of elements 220, 220', 220".
  • the plurality of elements 220, 220', 220" may comprise a plurality of virtual keys, which collectively form a virtual keypad.
  • Each of the elements 220, 220', 220" may correspond to an instruction, command, or indication based on a determined position on the touch display 210, 210', 210".
  • a user may desire to engage an element 220, such as a virtual key, corresponding to a target area 250 of the touch display 210, such as a virtual keypad.
  • the small size of the virtual keys may cause an error to occur, such as the apparatus 200 receiving an indication that the "8" key was selected when the user intended to select the "5" key.
  • a UI control circuitry may be configured to be in communication with the touch display 210' and may be further configured to track and/or determine positions at which graphical elements 220' of the user interface may be displayed to a user.
  • a UI control circuitry may be configured to track and/or determine a midpoint 255 or other representative subregion for at least one of the elements 220' displayed on the touch display user interface 210'.
  • the target area 250' that a user desires to engage may include at least one target midpoint 255.
  • the UI control circuitry may be configured to receive an indication of a touch input to the touch display user interface 210", as shown in FIG. 3C.
  • the touch display user interface 210" may be configured to detect a touch input 270 and generate a signal indicative of the touch input.
  • the signal may carry information indicative of a position of the touch input 270.
  • the position of the touch input 270 may comprise an area of the touch display engaged by a user contacting the touch display user interface 210".
  • the position may comprise a plurality of coordinate positions relative to a two-dimensional coordinate system (e.g., an X and Y axis) of the touch display user interface 210", such that the positions may be described in terms of an area of the surface of the touch display engaged by the user contacting the touch display.
  • the UI control circuitry may accordingly be configured to determine a position of a touch input based at least in part on a received signal or other indication of a touch input.
  • the UI control circuitry may be configured to determine a relation of the touch input 270 to a touch display user interface 210". In one embodiment, the UI control circuitry may be configured to determine if at least one midpoint 255' or other representative subregion is disposed within a position, such as an area, of a touch input 270. In another embodiment, the UI control circuitry may be configured to determine a centroid of a positional area of a touch input 270. Further, the UI control circuitry may be configured to determine at least one distance of the centroid of a positional area of a touch input 270 to at least one midpoint 255' or other representative subregion of an element 220" displayed on the touch display user interface 210".
  • one embodiment of the present invention may comprise causing the selection of a function associated with a target area 250, 250', when the midpoint 255, 255' or other representative subregion associated with the target area is disposed within the touch area 270.
  • the processor 110, memory 112, communication interface 114, user interface 116, and/or UI control circuitry 122 may, for example, provide means for causing the selection of the function associated with the target area 250, 250', when the midpoint 255, 255' or other representative subregion associated with the target area is disposed within the touch area 270.
  • the touch display user interface may be configured to display a 3x3 grid of squares, each square measuring approximately 6mm x 6mm. As such, the touch display user interface measures approximately 18mm x 18mm. Further, the apparatus 200 may be configured to display each target area as one of the 6mm x 6mm squares. As shown in FIGs. 3A-3C, the user may desire to select the target area 250, 250' associated with the "5" key. Although the touch display user interface displays the target area as a 6mm x 6mm square, the effective target area 260 may be an area larger than the displayed 6mm x 6mm square.
  • the apparatus 200 may be configured to cause the selection of a function associated with a target area when the midpoint or other representative subregion of the target area is disposed within the touch area 270.
  • a user desiring to select the "5" key may contact the touch display user interface, and thus create a touch area, at any position so long as the midpoint 255, 255' or other representative subregion associated with the target area of the "5" key is disposed within the touch area 270.
  • an arrangement of shapes may have grid dimensions other than 3x3, may not be a grid at all, may use shapes other than squares (such as circles, rectangles, or other shapes), or may use squares with dimensions other than 6mm x 6mm.
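
To make the 3x3 example above concrete, here is a hedged sketch of midpoint-based selection on the 6mm grid. The helper names and the bounding-box touch area are assumptions; only the grid geometry comes from the example.

    # Hypothetical sketch of the 3x3 example keypad: nine 6mm x 6mm keys.
    # A key is selected when its midpoint falls inside the touch area, so
    # the effective target area exceeds the 6mm x 6mm square that is drawn.

    def key_midpoint(row, col, size=6.0):
        # Midpoint of the key at (row, col) in a grid anchored at the origin.
        return (col * size + size / 2.0, row * size + size / 2.0)

    def keys_with_midpoint_inside(touch_box, labels):
        # touch_box is (x_min, y_min, x_max, y_max), a bounding box of the
        # surface area engaged by the user contacting the display.
        x0, y0, x1, y1 = touch_box
        hits = []
        for row in range(3):
            for col in range(3):
                mx, my = key_midpoint(row, col)
                if x0 <= mx <= x1 and y0 <= my <= y1:
                    hits.append(labels[row][col])
        return hits

    labels = [["1", "2", "3"], ["4", "5", "6"], ["7", "8", "9"]]
    # A touch straddling the "5" and "8" keys but covering only the "5"
    # midpoint (9, 9) still selects "5" alone.
    print(keys_with_midpoint_inside((5.0, 7.0, 13.0, 14.0), labels))  # ['5']
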
  • FIG. 4 illustrates a flowchart according to an example method for preventing erroneous touch screen inputs made with a touch display user interface according to an example embodiment.
  • the operations illustrated in and described with respect to FIG. 4 may, for example, be performed by, with the assistance of, and/or under the control of one or more of the processor 110, memory 112, communication interface 114, user interface 116, or UI control circuitry 122.
  • Operation 302 may comprise receiving a user input to a touch display (e.g., a touch display user interface 116).
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 302.
  • Operation 304 may comprise determining a touch area corresponding to the user input.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 304.
  • Operation 306 may comprise determining a relation of the touch area to a representative subregion of at least one target area of the touch display user interface.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 306.
  • Operation 308 may comprise causing, based at least in part on the determined relation, a selection corresponding to the user input.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 308.
  • FIG. 5 illustrates a flowchart according to an example method for preventing erroneous touch screen inputs made with a touch display user interface according to another example embodiment.
  • the operations illustrated in and described with respect to FIG. 5 may, for example, be performed by, with the assistance of, and/or under the control of one or more of the processor 110, memory 112, communication interface 114, user interface 116, or UI control circuitry 122.
  • Operation 402 may comprise receiving a user input to a touch display user interface.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 402.
  • Operation 404 may comprise determining a touch area of the touch display user-interface engaged by the user contacting the touch display user interface.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 404.
  • Operation 406 may comprise determining a target midpoint of at least one target area disposed on the touch display user-interface.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 406.
  • Operation 410 may comprise determining whether at least one target midpoint is disposed within the touch area.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 410. If at least one target midpoint is disposed within the touch area, operation 420 may comprise determining whether a plurality of target midpoints is disposed within the touch area.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 420. If a single midpoint is disposed within the touch area, operation 430 may comprise causing a selection corresponding to the target area corresponding to the midpoint disposed within the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 430. If no midpoints are disposed within the touch area, as determined by operation 410, then operation 440 may comprise causing no selection corresponding to any of the target areas. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 440.
  • if a plurality of target midpoints is disposed within the touch area, operation 450 may comprise determining a centroid of the touch area.
  • operation 452 may comprise determining the shortest distance between the centroid and the plurality of target midpoints, and operation 454 may comprise causing a selection corresponding to the target area corresponding to the midpoint with the shortest distance to the centroid of the touch area.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operations 450, 452, and 454.
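
Putting the FIG. 5 operations together, a minimal sketch of the decision flow might look as follows. The function and parameter names are illustrative assumptions, and the touch area is again approximated by a bounding box of the sampled contact points.

    import math

    # Sketch of the FIG. 5 flow: no target midpoint inside the touch area
    # -> no selection (operation 440); exactly one -> select that target
    # (operation 430); several -> pick the target whose midpoint lies
    # nearest the touch-area centroid (operations 450-454).

    def select_target(touch_points, target_midpoints):
        # touch_points: (x, y) samples of the engaged surface area.
        # target_midpoints: mapping of target label -> (x, y) midpoint.
        xs = [p[0] for p in touch_points]
        ys = [p[1] for p in touch_points]
        inside = {label: m for label, m in target_midpoints.items()
                  if min(xs) <= m[0] <= max(xs) and min(ys) <= m[1] <= max(ys)}
        if not inside:
            return None                    # operation 440: no selection
        if len(inside) == 1:
            return next(iter(inside))      # operation 430: single midpoint
        cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)   # operation 450
        return min(inside, key=lambda k: math.hypot(inside[k][0] - cx,
                                                    inside[k][1] - cy))

    midpoints = {"4": (3.0, 9.0), "5": (9.0, 9.0), "8": (9.0, 15.0)}
    touch = [(6.0, 7.5), (12.0, 10.0), (8.0, 16.0)]  # covers "5" and "8"
    print(select_target(touch, midpoints))  # '5' (nearest the centroid)
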
  • FIG. 6 illustrates a flowchart according to an example method for preventing erroneous touch screen inputs made with a touch display user interface according to yet another example embodiment.
  • the operations illustrated in and described with respect to FIG. 6 may, for example, be performed by, with the assistance of, and/or under the control of one or more of the processor 110, memory 112, communication interface 114, user interface 116, or UI control circuitry 122.
  • Operation 502 may comprise receiving a user input to a touch display user interface.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 502.
  • Operation 504 may comprise determining a touch area of the touch display user-interface engaged by the user contacting the touch display user interface.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 504.
  • Operation 506 may comprise determining a target midpoint of at least one target area disposed on the touch display user-interface.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 506.
  • Operation 510 may comprise determining whether at least one target midpoint is disposed within the touch area.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 510.
  • if at least one target midpoint is disposed within the touch area, operation 520 may comprise determining whether a plurality of target midpoints is disposed within the touch area.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 520.
  • if a single midpoint is disposed within the touch area, operation 530 may comprise causing a selection corresponding to the target area corresponding to the midpoint disposed within the touch area.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 530. If no midpoints are disposed within the touch area, as determined by operation 510, then operation 540 may comprise causing no selection corresponding to any of the target areas.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 540.
  • if a plurality of target midpoints is disposed within the touch area, operation 550 may comprise causing no selection corresponding to any of the target areas.
  • the processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 550.
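
The FIG. 6 variant differs from FIG. 5 only in how ambiguity is resolved: instead of falling back to the centroid distance, a touch area containing several target midpoints selects nothing. A short sketch, with illustrative names:

    # Sketch of the FIG. 6 variant: zero or several midpoints inside the
    # touch area cause no selection (operations 540 and 550); exactly one
    # causes selection of the corresponding target (operation 530).

    def select_target_strict(labels_inside_touch_area):
        # labels_inside_touch_area: labels of targets whose midpoints fall
        # within the touch area (determined as in the FIG. 5 sketch).
        if len(labels_inside_touch_area) == 1:
            return labels_inside_touch_area[0]
        return None

    print(select_target_strict(["5"]))       # '5'
    print(select_target_strict(["5", "8"]))  # None (ambiguous touch)
    print(select_target_strict([]))          # None (no midpoint covered)
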
  • FIGs. 4-6 each illustrate a flowchart of a system, method, and computer program product according to an example embodiment. It will be understood that each block of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware and/or a computer program product comprising one or more computer-readable mediums having computer readable program instructions stored thereon. For example, one or more of the procedures described herein may be embodied by computer program instructions of a computer program product.
  • the computer program product(s) which embody the procedures described herein may be stored by one or more memory devices of a mobile terminal, server, or other computing device (for example, in the memory 112) and executed by a processor in the computing device (for example, by the processor 110).
  • the computer program instructions comprising the computer program product(s) which embody the procedures described above may be stored by memory devices of a plurality of computing devices.
  • any such computer program product may be loaded onto a computer or other programmable apparatus (for example, an apparatus 102) to produce a machine, such that the computer program product including the instructions which execute on the computer or other programmable apparatus creates means for implementing the functions specified in the flowchart block(s).
  • the computer program product may comprise one or more computer-readable memories on which the computer program instructions may be stored such that the one or more computer-readable memories can direct a computer or other programmable apparatus to function in a particular manner, such that the computer program product comprises an article of manufacture which implements the function specified in the flowchart block(s).
  • the computer program instructions of one or more computer program products may also be loaded onto a computer or other programmable apparatus (for example, an apparatus 102) to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus implement the functions specified in the flowchart block(s).
  • blocks of the flowcharts support combinations of means for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer program product(s).
  • a suitably configured processor may provide all or a portion of the elements, such as each of those shown in Figures 4-6, and, as such, may constitute means for receiving a user input to the touch display user-interface, means for determining a touch area that corresponds to the user input to the touch display user-interface, means for determining a relation of the touch area to a representative subregion of at least one target area disposed on the touch display user-interface and/or means for causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
  • the computer program product for performing the methods of an example embodiment of the invention includes a computer-readable storage medium (for example, the memory 112), such as the non-volatile storage medium, and computer-readable program code portions, such as a series of computer instructions, embodied in the computer-readable storage medium.

Abstract

Methods and apparatuses are provided for preventing erroneous touch screen inputs made on a touch display user interface. A method may include receiving a user input to a touch display user interface. The method may further include determining a touch area corresponding to the user input. The method may additionally include determining a relation of the touch area to at least one target area. The method may also include causing, based at least in part on the determined relation, a selection corresponding to the user input. Corresponding apparatuses are also provided.

Description

RECOGNIZING TOUCH SCREEN INPUTS
TECHNOLOGICAL FIELD
[0001] Example embodiments of the present invention relate generally to user interface technology and, more particularly, relate to methods and apparatuses for recognizing inputs on a touch screen.
BACKGROUND
[0002] The modern communications era has brought about a tremendous expansion of wireline and wireless networks. Wireless and mobile networking technologies have addressed related consumer demands, while providing more flexibility and immediacy of information transfer. Concurrent with the expansion of networking technologies, an expansion in computing power has resulted in development of affordable computing devices capable of taking advantage of services made possible by modern networking technologies. This expansion in computing power has led to a reduction in the size of computing devices and given rise to a new generation of mobile devices that are capable of performing functionality that only a few years ago required processing power that could be provided only by the most advanced desktop computers. Consequently, mobile computing devices having a small form factor have become ubiquitous and are used to access network applications and services by consumers of all socioeconomic backgrounds.
[0003] The evolution of power and affordability of modern computing devices has included the release of computing devices including enhanced user interface technologies, such as enhanced interactive displays. For example, touch screen displays enable a user to intuitively interact with content displayed on the display. In addition, the evolution of modern computing devices has included the release of computing devices including smaller sized touch screen displays. Further, these smaller touch screen displays include a number of virtual keys or a virtual keyboard comprising multiple keys. The desired functionality of such a virtual keyboard may require multiple keys, such as one key for each letter of the alphabet or one key for each number of a numeric keypad. Consequently, users of such computing devices may have difficulties accurately selecting and engaging a desired virtual key as the touch screen displays have become smaller in size. The smaller sized individual keys may lead to an increased number of errors when using such a virtual keyboard or virtual keypad. In one instance, a user engaging a virtual keypad comprising a number of virtual keys may inadvertently touch multiple keys instead of a singular desired virtual key.
BRIEF SUMMARY
[0004] Methods, apparatuses, and computer program products are herein provided for facilitating interaction with a touch display user interface. Methods, apparatuses, and computer program products in accordance with various embodiments may provide several advantages to application developers, computing devices, and computing device users. Some example embodiments facilitate interaction with a touch display user interface through the use of touch inputs. Further, one example embodiment may provide a method, apparatus, and computer program product for selecting a target area of the touch display user interface corresponding to a virtual key, based, in part, on a relation of the target area to a touch area, the touch area being defined, in part, by the surface of the touch display engaged by the user contacting the touch display. As such, embodiments of the present invention provide a touch display user interface that decreases the amount of erroneous tapping by a user.
[0005] In one example embodiment, a method may include receiving a user input to a touch display user-interface. The method may further include determining, by a processor, a touch area that corresponds to the user input to the touch display user-interface. The method may also include determining, by a processor, a relation of the touch area to at least one target area disposed on the touch display user-interface. In addition, the method may include causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
[0006] In another embodiment, the method may further include determining the area of the touch display user-interface engaged by the user contacting the touch display user-interface. Another embodiment may include a method comprising determining a target midpoint of at least one of the target areas disposed on the touch display user-interface and determining if the target midpoint is disposed within the touch area. Further, another embodiment may comprise a method including selecting a function associated with a target area when a midpoint corresponding to the target area is disposed within the touch area. Further still, the method may comprise selecting no functions associated with the plurality of target areas when at least two midpoints corresponding to two separate target areas are disposed within the touch area.
[0007] According to another example embodiment, the method may further include determining a centroid point of the touch area. In addition, the method may comprise determining at least one target midpoint for at least one target area disposed on the touch display user-interface and determining the distance between the centroid point and the at least one target midpoint. Further, the method may include determining if the target midpoint is disposed within the touch area. The method may also include selecting a function associated with a target area when the target midpoint corresponding to the target area is disposed within the touch area and may also include selecting a function associated with a target area having the shortest distance between the centroid point and the target midpoint corresponding to the target area when the touch area includes more than one target midpoint.
[0008] In another example embodiment, an apparatus may comprise at least one processor and at least one memory storing computer program code, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to at least receive a user input to a touch display user-interface. Further, the apparatus may be further configured to determine a touch area that corresponds to the user input to the touch display user-interface and determine a relation of the touch area to at least one target area disposed on the touch display user-interface. In addition, the apparatus may be configured to cause, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
[0009] In another example embodiment, a computer program product is provided. The computer program product of this example embodiment may include at least one non-transitory computer-readable storage medium having computer-readable program instructions stored therein. The computer-readable program instructions may comprise program instructions configured to cause an apparatus to perform a method comprising receiving a user input to a touch display user-interface. Further, the method may include determining, by a processor, a touch area that corresponds to the user input to the touch display user-interface and may also include determining, by a processor, a relation of the touch area to at least one target area disposed on the touch display user-interface. In addition, the method may include causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
[0010] In another example embodiment, an apparatus comprising means for receiving a user input to a touch display user-interface is provided. Further, the apparatus may include means for determining a touch area that corresponds to the user input to the touch display user-interface. In addition, the apparatus may include means for determining a relation of the touch area to a representative subregion of at least one target area disposed on the touch display user-interface and means for causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[0012] FIG. 1 illustrates a block diagram of an apparatus for facilitating interaction with a touch display user interface according to an example embodiment;
[0013] FIG. 2 is a schematic block diagram of a mobile terminal according to an example embodiment;
[0014] FIGs. 3A-3C illustrate example interaction with an example touch display user interface according to an example embodiment;
[0015] FIG. 4 illustrates a flowchart according to an example method for facilitating interaction with a touch display user interface according to an example embodiment; and
[0016] FIGs. 5 and 6 illustrate flowcharts according to example methods for facilitating interaction with a touch display user interface according to other example embodiments.
DETAILED DESCRIPTION
[0017] Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
[0018] As used herein, the terms "data," "content," "information" and similar terms may be used interchangeably to refer to data capable of being transmitted, received, displayed and/or stored in accordance with various example embodiments. Thus, use of any such terms should not be taken to limit the spirit and scope of the disclosure.
[0019] The term "computer-readable medium" as used herein refers to any medium configured to participate in providing information to a processor, including instructions for execution. Such a medium may take many forms, including, but not limited to, a non-transitory computer-readable storage medium (e.g., non-volatile media, volatile media), and transmission media. Transmission media include, for example, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves. Signals include man-made transient variations in amplitude, frequency, phase, polarization or other physical properties transmitted through the transmission media. Examples of non-transitory computer-readable media include a magnetic computer readable medium (e.g., a floppy disk, hard disk, magnetic tape, any other magnetic medium), an optical computer readable medium (e.g., a compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a Blu-Ray disc, or the like), a random access memory (RAM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), a FLASH-EPROM, or any other non-transitory medium from which a computer can read. The term computer-readable storage medium is used herein to refer to any computer-readable medium except transmission media. However, it will be appreciated that where embodiments are described to use a computer-readable storage medium, other types of computer-readable mediums may be substituted for or used in addition to the computer-readable storage medium in alternative embodiments.
[0020] Additionally, as used herein, the term 'circuitry' refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
[0021] FIG. 1 illustrates a block diagram of an apparatus 102 for facilitating interaction with a user interface according to an example embodiment. It will be appreciated that the apparatus 102 is provided as an example of one embodiment and should not be construed to narrow the scope or spirit of the invention in any way. In this regard, the scope of the disclosure encompasses many potential embodiments in addition to those illustrated and described herein. As such, while FIG. 1 illustrates one example of a configuration of an apparatus for facilitating interaction with a user interface, other configurations may also be used to implement embodiments of the present invention.
[0022] The apparatus 102 may be embodied as a desktop computer, laptop computer, mobile terminal, mobile computer, mobile phone, mobile communication device, game device, digital camera/camcorder, audio/video player, television device, radio receiver, digital video recorder, positioning device, a chipset, a computing device comprising a chipset, any combination thereof, and/or the like. In this regard, the apparatus 102 may comprise any computing device that comprises or is in operative communication with a touch display capable of displaying a graphical user interface. In some example embodiments, the apparatus 102 is embodied as a mobile computing device, such as the mobile terminal illustrated in FIG. 2.
[0023] In this regard, FIG. 2 illustrates a block diagram of a mobile terminal 10 representative of one example embodiment of an apparatus 102. It should be understood, however, that the mobile terminal 10 illustrated and hereinafter described is merely illustrative of one type of apparatus 102 that may implement and/or benefit from various example embodiments of the invention and, therefore, should not be taken to limit the scope of the disclosure. While several embodiments of the electronic device are illustrated and will be hereinafter described for purposes of example, other types of electronic devices, such as mobile telephones, mobile computers, personal digital assistants (PDAs), pagers, laptop computers, desktop computers, gaming devices, televisions, e-papers, and other types of electronic systems, may employ various embodiments of the invention.
[0024] As shown, the mobile terminal 10 may include an antenna 12 (or multiple antennas 12) in communication with a transmitter 14 and a receiver 16. The mobile terminal 10 may also include a processor 20 configured to provide signals to and receive signals from the transmitter and receiver, respectively. The processor 20 may, for example, be embodied as various means including circuitry, one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an ASIC (application specific integrated circuit) or FPGA (field programmable gate array), or some combination thereof. Accordingly, although illustrated in FIG. 2 as a single processor, in some embodiments the processor 20 comprises a plurality of processors. These signals sent and received by the processor 20 may include signaling information in accordance with an air interface standard of an applicable cellular system, and/or any number of different wireline or wireless networking techniques, comprising but not limited to Wi-Fi, wireless local access network (WLAN) techniques such as Institute of Electrical and Electronics Engineers (IEEE) 802.11, 802.16, and/or the like. In addition, these signals may include speech data, user generated data, user requested data, and/or the like. In this regard, the mobile terminal may be capable of operating with one or more air interface standards, communication protocols, modulation types, access types, and/or the like. More particularly, the mobile terminal may be capable of operating in accordance with various first generation (1G), second generation (2G), 2.5G, third-generation (3G) communication protocols, fourth-generation (4G) communication protocols, Internet Protocol Multimedia Subsystem (IMS) communication protocols (e.g., session initiation protocol (SIP)), and/or the like. For example, the mobile terminal may be capable of operating in accordance with 2G wireless communication protocols IS-136 (Time Division Multiple Access (TDMA)), Global System for Mobile communications (GSM), IS-95 (Code Division Multiple Access (CDMA)), and/or the like. Also, for example, the mobile terminal may be capable of operating in accordance with 2.5G wireless communication protocols General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), and/or the like. Further, for example, the mobile terminal may be capable of operating in accordance with 3G wireless communication protocols such as Universal Mobile
Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), Wideband Code Division Multiple Access (WCDMA), Time Division- Synchronous Code Division Multiple Access (TD-SCDMA), and/or the like. The mobile terminal may be additionally capable of operating in accordance with 3.9G wireless communication protocols such as Long Term Evolution (LTE) or Evolved Universal Terrestrial Radio Access Network (E-UTRAN) and/or the like. Additionally, for example, the mobile terminal may be capable of operating in accordance with fourth-generation (4G) wireless communication protocols and/or the like as well as similar wireless communication protocols that may be developed in the future.
[0025] Some Narrow-band Advanced Mobile Phone System (NAMPS), as well as Total Access Communication System (TACS), mobile terminals may also benefit from
embodiments of this invention, as should dual or higher mode phones (e.g., digital/analog or TDMA/CDMA/analog phones). Additionally, the mobile terminal 10 may be capable of operating according to Wi-Fi or Worldwide Interoperability for Microwave Access (WiMAX) protocols.
[0026] It is understood that the processor 20 may comprise circuitry for implementing audio/video and logic functions of the mobile terminal 10. For example, the processor 20 may comprise a digital signal processor device, a microprocessor device, an analog-to-digital converter, a digital-to-analog converter, and/or the like. Control and signal processing functions of the mobile terminal may be allocated between these devices according to their respective capabilities. The processor may additionally comprise an internal voice coder (VC) 20a, an internal data modem (DM) 20b, and/or the like. Further, the processor may comprise functionality to operate one or more software programs, which may be stored in memory. For example, the processor 20 may be capable of operating a connectivity program, such as a web browser. The connectivity program may allow the mobile terminal 10 to transmit and receive web content, such as location-based content, according to a protocol, such as Wireless Application Protocol (WAP), hypertext transfer protocol (HTTP), and/or the like. The mobile terminal 10 may be capable of using a Transmission Control Protocol/Internet Protocol (TCP/IP) to transmit and receive web content across the internet or other networks.
[0027] The mobile terminal 10 may also comprise a user interface including, for example, an earphone or speaker 24, a ringer 22, a microphone 26, a display 28, a user input interface, and/or the like, which may be operationally coupled to the processor 20. In this regard, the processor 20 may comprise user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as, for example, the speaker 24, the ringer 22, the microphone 26, the display 28, and/or the like. The processor 20 and/or user interface circuitry comprising the processor 20 may be configured to control one or more functions of one or more elements of the user interface through computer program
instructions (e.g., software and/or firmware) stored on a memory accessible to the processor 20 (e.g., volatile memory 40, non-volatile memory 42, and/or the like). Although not shown, the mobile terminal may comprise a battery for powering various circuits related to the mobile terminal, for example, a circuit to provide mechanical vibration as a detectable output. The display 28 of the mobile terminal may be of any type appropriate for the electronic device in question with some examples including a plasma display panel (PDP), a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode display (OLED), a projector, a holographic display or the like. The display 28 may, for example, comprise a three-dimensional touch display, examples of which will be described further herein below. The user input interface may comprise devices allowing the mobile terminal to receive data, such as a keypad 30, a touch display (e.g., some example embodiments wherein the display 28 is configured as a touch display), a joystick (not shown), and/or other input device. In embodiments including a keypad, the keypad may comprise numeric (0-9) and related keys (#, *), and/or other keys for operating the mobile terminal.
[0028] The mobile terminal 10 may comprise memory, such as a subscriber identity module (SIM) 38, a removable user identity module (R-UIM), and/or the like, which may store information elements related to a mobile subscriber. In addition to the SIM, the mobile terminal may comprise other removable and/or fixed memory. The mobile terminal 10 may include volatile memory 40 and/or non-volatile memory 42. For example, volatile memory 40 may include Random Access Memory (RAM) including dynamic and/or static RAM, on-chip or off-chip cache memory, and/or the like. Non-volatile memory 42, which may be embedded and/or removable, may include, for example, read-only memory, flash memory, magnetic storage devices (e.g., hard disks, floppy disk drives, magnetic tape, etc.), optical disc drives and/or media, non-volatile random access memory (NVRAM), and/or the like. Like volatile memory 40, non-volatile memory 42 may include a cache area for temporary storage of data. The memories may store one or more software programs, instructions, pieces of information, data, and/or the like which may be used by the mobile terminal for performing functions of the mobile terminal. For example, the memories may comprise an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying the mobile terminal 10.
[0029] Returning to FIG. 1, in an example embodiment, the apparatus 102 includes various means for performing the various functions herein described. These means may comprise one or more of a processor 110, memory 112, communication interface 114, user interface 116, or user interface (UI) control circuitry 122. The means of the apparatus 102 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions (e.g., software or firmware) stored on a computer-readable medium (e.g., memory 112) that is executable by a suitably configured processing device (e.g., the processor 110), or some combination thereof.
[0030] In some example embodiments, one or more of the means illustrated in FIG. 1 may be embodied as a chip or chip set. In other words, the apparatus 102 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. In this regard, the processor 110, memory 112, communication interface 114, user interface 116, and/or UI control circuitry 122 may be embodied as a chip or chip set. The apparatus 102 may therefore, in some cases, be configured to or may comprise component(s) configured to implement embodiments of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein and/or for enabling user interface navigation with respect to the
functionalities and/or services described herein.
[0031] The processor 110 may, for example, be embodied as various means including one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an ASIC (application specific integrated circuit) or FPGA (field programmable gate array), one or more other types of hardware processors, or some combination thereof.
Accordingly, although illustrated in FIG. 1 as a single processor, in some embodiments the processor 110 comprises a plurality of processors. The plurality of processors may be in operative communication with each other and may be collectively configured to perform one or more functionalities of the apparatus 102 as described herein. The plurality of processors may be embodied on a single computing device or distributed across a plurality of computing devices collectively configured to function as the apparatus 102. In embodiments wherein the apparatus 102 is embodied as a mobile terminal 10, the processor 110 may be embodied as or comprise the processor 20. In some example embodiments, the processor 110 is configured to execute instructions stored in the memory 112 or otherwise accessible to the processor 110. These instructions, when executed by the processor 110, may cause the apparatus 102 to perform one or more of the functionalities of the apparatus 102 as described herein. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 110 may comprise an entity capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when the processor 110 is embodied as an ASIC, FPGA or the like, the processor 110 may comprise specifically configured hardware for conducting one or more operations described herein. Alternatively, as another example, when the processor 110 is embodied as an executor of instructions, such as may be stored in the memory 112, the instructions may specifically configure the processor 110 to perform one or more algorithms and operations described herein.
[0032] The memory 112 may comprise, for example, volatile memory, non-volatile memory, or some combination thereof. In this regard, the memory 112 may comprise a non-transitory computer-readable storage medium. Although illustrated in FIG. 1 as a single memory, the memory 112 may comprise a plurality of memories. The plurality of memories may be embodied on a single computing device or may be distributed across a plurality of computing devices collectively configured to function as the apparatus 102. In various example embodiments, the memory 112 may comprise a hard disk, random access memory, cache memory, flash memory, a compact disc read only memory (CD-ROM), digital versatile disc read only memory (DVD-ROM), an optical disc, circuitry configured to store
information, or some combination thereof. In embodiments wherein the apparatus 102 is embodied as a mobile terminal 10, the memory 112 may comprise the volatile memory 40 and/or the non-volatile memory 42. The memory 112 may be configured to store information, data, applications, instructions, or the like for enabling the apparatus 102 to carry out various functions in accordance with various example embodiments. For example, in some example embodiments, the memory 112 is configured to buffer input data for processing by the processor 110. Additionally or alternatively, the memory 112 may be configured to store program instructions for execution by the processor 110. The memory 112 may store information in the form of static and/or dynamic information. The stored information may include, for example, images, content, media content, user data, application data, and/or the like. This stored information may be stored and/or used by the UI control circuitry 122 during the course of performing its functionalities.
[0033] The communication interface 114 may be embodied as any device or means embodied in circuitry, hardware, a computer program product comprising computer readable program instructions stored on a computer readable medium (e.g., the memory 112) and executed by a processing device (e.g., the processor 110), or a combination thereof that is configured to receive and/or transmit data from/to another computing device. In some example embodiments, the communication interface 114 is at least partially embodied as or otherwise controlled by the processor 110. In this regard, the communication interface 114 may be in communication with the processor 110, such as via a bus. The communication interface 114 may include, for example, an antenna, a transmitter, a receiver, a transceiver and/or supporting hardware or software for enabling communications with one or more remote computing devices. For example, in embodiments wherein the apparatus 102 is embodied as a mobile terminal 10, the communications interface 114 may be embodied as or comprise the antenna 12, the transmitter 14 and/or the receiver 16. The communication interface 114 may be configured to receive and/or transmit data using any protocol that may be used for communications between computing devices. In this regard, the communication interface 114 may be configured to receive and/or transmit data using any protocol that may be used for transmission of data over a wireless network, wireline network, some
combination thereof, or the like by which the apparatus 102 and one or more computing devices may be in communication. As an example, the communication interface 114 may be configured to receive and/or otherwise access content (e.g., web page content, streaming media content, and/or the like) over a network from a server or other content source. The communication interface 114 may additionally be in communication with the memory 112, user interface 116, and/or UI control circuitry 122, such as via a bus.
[0034] The user interface 116 may be in communication with the processor 110 to receive an indication of a user input and/or to provide an audible, visual, mechanical, or other output to a user. As such, the user interface 116 may include, for example, a keyboard, a mouse, a joystick, a display, a touch screen display, a microphone, a speaker, and/or other input/output mechanisms, such as the display 28 and/or keypad 30 in the embodiment of Figure 2. The user interface 116 may be in communication with the memory 112,
communication interface 114, and/or UI control circuitry 122, such as via a bus. In some example embodiments, the apparatus 102 comprises a user interface 116 comprising a touch display. In alternative example embodiments, such as in embodiments wherein the apparatus 102 is embodied as a chip or chipset, the apparatus 102 may be operatively connected with the touch display user interface 116 such that the apparatus 102 may control the touch display, receive an indication of and/or otherwise determine a user input (e.g., a touch input) to the touch display user interface 116, and/or the like. The touch display user interface 116 may comprise any type of display capable of displaying a user interface, image, virtual keyboard, virtual keypad, and/or the like.
[0035] The touch display user interface 116 may also be configured to enable the detection of a touch input. As an example, the touch display user interface 116 may comprise a capacitive touch display, which may be configured to enable detection of capacitance of a finger or other input object by which an input may be made by physically contacting the display surface. The touch display may also be configured to enable the detection of a hovering gesture input. A hovering gesture input may comprise a gesture input to the touch display without making physical contact with a surface of the touch display, such as a gesture made in a space some distance above/in front of the surface of the touch display. As an example, the touch display may comprise a projected capacitive touch display, which may be configured to enable detection of capacitance of a finger or other input object by which a gesture may be made without physically contacting a display surface. As another example, the touch display may be configured to enable detection of a hovering gesture input through use of acoustic wave touch sensor technology, electromagnetic touch sensing technology, near field imaging technology, optical sensing technology, infrared proximity sensing technology, some combination thereof, or the like. The touch display user interface 116 may further be in communication with one or more of the processor 110, memory 112,
communication interface 114, or UI control circuitry 122, such as via a bus.
[0036] The UI control circuitry 122 may be embodied as various means, such as circuitry, hardware, a computer program product comprising computer readable program instructions stored on a computer readable medium (e.g., the memory 112) and executed by a processing device (e.g., the processor 110), or some combination thereof and, in some embodiments, is embodied as or otherwise controlled by the processor 110. In some example embodiments wherein the UI control circuitry 122 is embodied separately from the processor 110, the UI control circuitry 122 may be in communication with the processor 110. The UI control circuitry 122 may further be in communication with one or more of the memory 112, communication interface 114, or user interface 116, such as via a bus.
[0037] The UI control circuitry 122 may be configured to receive an indication of a touch input to the touch display user interface 116 and/or otherwise determine a touch input to the touch display user interface 116. In this regard, for example, the touch display user interface 116 may be configured to detect a touch input to the touch display and generate a signal indicative of the touch input. This signal may be received by the UI control circuitry 122, which may determine the touch input in response to receiving the signal. The signal may carry information indicative of a position of the touch input. In this regard, the position may comprise an area of the touch display engaged by a user contacting the touch display user interface 116. Such an area may be distinguished from a single point location in that it may comprise two or more point locations disposed within the area. Examples of areas include a single continuous area (e.g., an area bounded by a single continuous perimeter), a discontinuous area (e.g., two or more individual areas each bounded by a single continuous perimeter), and a hollow area (e.g., a single or discontinuous area excluding one or more sub-areas located within the single or discontinuous area). In some embodiments, an area may comprise a plurality of point locations, such as a plurality of point locations falling within a particular perimeter and/or outside another perimeter.
[0038] The position may, for example, comprise a plurality of coordinate positions relative to a two-dimensional coordinate system (e.g., an X and Y axis) of the touch display user interface 116, such that the positions may be described in terms of an area of the surface of the touch display engaged by the user contacting the touch display. The UI control circuitry 122 may accordingly be configured to determine a position of a touch input based at least in part on a received signal or other indication of a touch input.
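The disclosure describes the touch position in prose rather than code; the sketch below is one hypothetical way to model it. A TouchArea holds the set of engaged coordinate positions, which accommodates the continuous, discontinuous, and hollow areas of paragraph [0037], and exposes the containment and centroid queries used in the embodiments that follow. All names and data shapes are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import FrozenSet, Tuple

Pixel = Tuple[int, int]  # an (x, y) coordinate position on the touch display


@dataclass(frozen=True)
class TouchArea:
    """A touch input's position, modeled as the set of engaged positions.

    A set of point locations (rather than a single point) can describe
    continuous, discontinuous, and hollow areas alike.
    """
    pixels: FrozenSet[Pixel]

    def contains(self, x: float, y: float) -> bool:
        # A point is disposed within the area if its nearest engaged
        # coordinate position belongs to the set.
        return (round(x), round(y)) in self.pixels

    def centroid(self) -> Tuple[float, float]:
        # The centroid point is the arithmetic mean of the engaged positions.
        n = len(self.pixels)
        return (sum(x for x, _ in self.pixels) / n,
                sum(y for _, y in self.pixels) / n)


touch = TouchArea(frozenset({(4, 4), (5, 4), (4, 5), (5, 5)}))
assert touch.contains(4.6, 4.4)        # rounds to (5, 4), which is engaged
assert touch.centroid() == (4.5, 4.5)
```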
[0039] The UI control circuitry 122 may be further configured to determine a relation of a position of a touch input to a user interface that may be displayed by the touch display user interface 116. The user interface may comprise any image that may be displayed by the touch display. By way of non-limiting example, the user interface may comprise a keypad interface, which is described further herein below. However, it will be appreciated that various embodiments are not so limited and other types of user interfaces may be displayed. In this regard, the user interface may comprise any user interface, graphic(s) or object(s), some combination thereof, or the like, which may be displayed by the touch display user interface 116.
[0040] In determining a relation of the position of a touch input to a user interface, the UI control circuitry 122 may be configured to determine an element of the user interface displayed in relation to the determined position. In this regard, the UI control circuitry 122 may be configured to track and/or determine positions at which graphical elements of the user interface may be displayed to a user. Accordingly, the UI control circuitry 122 may determine an element of the user interface that is displayed at the determined position of the touch input. As another example, the UI control circuitry 122 may determine the element of the user interface that is displayed closest to the determined position of the touch input. For example, in an instance in which the UI control circuitry 122 determines two or more elements are displayed at the determined position of the touch input, the UI control circuitry may determine which of the two or more elements displayed are closest to the determined position of the touch input, as explained in further detail below and illustrated in the flowchart of Figure 5. Further, the UI control circuitry 122 may determine an element, which is not displayed on the user interface, but is associated with another element that is displayed on the user interface, that is displayed at the determined position of the touch input. For example, the UI control circuitry may determine an element, which is not displayed on the user interface, such as a target midpoint 255, 255', but is associated with another element that is displayed on the user interface, such as a target area 250, 250', that is displayed at the determined position of the touch input.
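A hypothetical sketch of this element lookup follows; the names and the rectangle representation are assumptions, not the patent's API. The circuitry first looks for an element displayed at the determined position and, failing that, falls back to the element displayed closest to it.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Element:
    """A tracked user-interface element and the display area it covers."""
    label: str
    left: float
    top: float
    width: float
    height: float

    def covers(self, x: float, y: float) -> bool:
        return (self.left <= x < self.left + self.width
                and self.top <= y < self.top + self.height)

    def midpoint(self) -> Tuple[float, float]:
        return (self.left + self.width / 2.0, self.top + self.height / 2.0)


Tuple = tuple  # alias kept local to this sketch


def element_at(elements: List[Element], x: float, y: float) -> Optional[Element]:
    # Prefer an element displayed at the determined position itself.
    for element in elements:
        if element.covers(x, y):
            return element
    # Otherwise return the element displayed closest to the position.
    if not elements:
        return None
    return min(elements, key=lambda e: (e.midpoint()[0] - x) ** 2
                                       + (e.midpoint()[1] - y) ** 2)
```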
[0041] As previously mentioned, in some example embodiments, the UI control circuitry 122 may be configured to control display of a virtual keypad interface. The virtual keypad may comprise a plurality of virtual numbered keys displayed on a touch display user interface 116. In another embodiment, the UI control circuitry 122 may be configured to control display of a virtual keyboard interface comprising a plurality of virtual alphanumeric keys on a touch display user interface 116. In addition, the UI control circuitry 122 may be configured to determine the position of an area covered by at least one virtual key displayed on a touch display user interface 116. In another embodiment, the UI control circuitry 122 may be configured to determine a representative point or other subregion of at least one virtual key, such as a midpoint of an area covered by at least one virtual key displayed on a touch display user interface 116. As with the midpoint of the area covered by a virtual key, the representative point or other subregion is within the respective virtual key and is a subset of and smaller than the respective virtual key. Further, the UI control circuitry 122 may be further configured to determine a position of a touch input and a relation of the touch input to the representative region, such as the midpoint, of an area covered by at least one virtual key displayed on a touch display user interface 116.
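For instance, the midpoint of each virtual key can be derived directly from the rectangle the key covers. The helper below is a minimal sketch under the assumption that keys are tracked as (left, top, width, height) rectangles; the function name is hypothetical.

```python
def key_midpoints(keys):
    """Map each virtual key label to the midpoint of the area it covers.

    `keys` maps a label to a (left, top, width, height) rectangle; the
    midpoint is one choice of representative subregion within the key.
    """
    return {label: (left + width / 2.0, top + height / 2.0)
            for label, (left, top, width, height) in keys.items()}


# A key covering a 6mm x 6mm square whose top-left corner is at (6, 6)
# has its midpoint at (9, 9).
assert key_midpoints({"5": (6.0, 6.0, 6.0, 6.0)}) == {"5": (9.0, 9.0)}
```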
[0042] FIGs. 3A-3C illustrate example interaction with an example touch display 210, 210', 210" of an apparatus 200 according to an example embodiment. In the depicted embodiments, the apparatus 200 may further comprise a touch display 210, 210', 210" comprising a plurality of elements 220, 220', 220". In one embodiment of the present invention, the plurality of elements 220, 220', 220" may comprise a plurality of virtual keys, which collectively form a virtual keypad. Each of the elements 220, 220', 220" may correspond to an instruction, command, or indication based on a determined position on the touch display 210, 210', 210". With reference to FIG. 3A, a user may desire to engage an element 220, such as a virtual key, corresponding to a target area 250 of the touch display 210, such as a virtual keypad. The small size of the virtual keys, however, may cause an error to occur, such as the apparatus 200 receiving an indication that the "8" key was selected when the user intended to select the "5" key.
[0043] As described herein, and illustrated in FIG. 3B, a UI control circuitry may be configured to be in communication with the touch display 210' and may be further configured to track and/or determine positions at which graphical elements 220' of the user interface may be displayed to a user. In one embodiment, a UI control circuitry may be configured to track and/or determine a midpoint 255 or other representative subregion for at least one of the elements 220' displayed on the touch display user interface 210'. As such, the target area 250' that a user desires to engage may include at least one target midpoint 255.
[0044] Further, as previously mentioned, the UI control circuitry may be configured to receive an indication of a touch input to the touch display user interface 210", as shown in FIG. 3C. In this regard, the touch display user interface 210" may be configured to detect a touch input 270 to the touch display user interface 210" and generate a signal indicative of the touch input. The signal may carry information indicative of a position of the touch input 270. Referring to FIG. 3C, the position of the touch input 270 may comprise an area of the touch display engaged by a user contacting the touch display user interface 210". For example, the position may comprise a plurality of coordinate positions relative to a two-dimensional coordinate system (e.g., an X and Y axis) of the touch display user interface 210", such that the positions may be described in terms of an area of the surface of the touch display engaged by the user contacting the touch display. The UI control circuitry may accordingly be configured to determine a position of a touch input based at least in part on a received signal or other indication of a touch input.
[0045] In addition, the UI control circuitry may be configured to determine a relation of the touch input 270 to a touch display user interface 210". In one embodiment, the UI control circuitry may be configured to determine if at least one midpoint 255' or other representative subregion is disposed within a position, such as an area, of a touch input 270. In another embodiment, the UI control circuitry may be configured to determine a centroid of a positional area of a touch input 270. Further, the UI control circuitry may be configured to determine at least one distance of the centroid of a positional area of a touch input 270 to at least one midpoint 255' or other representative subregion of an element 220" displayed on the touch display user interface 210". As such, one embodiment of the present invention may comprise causing the selection of a function associated with a target area 250, 250', when the midpoint 255, 255' or other representative subregion associated with the target area is disposed within the touch area 270. In one embodiment, the processor 110, memory 112, communication interface 114, user interface 116, and/or UI control circuitry 122 may, for example, provide means for causing the selection of the function associated with the target area 250, 250', when the midpoint 255, 255' or other representative subregion associated with the target area is disposed within the touch area 270.
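One hypothetical way to compute this relation is sketched below; the function name and data shapes are assumptions rather than the patent's implementation. For each target, it reports both whether the target's midpoint is disposed within the touch area and the distance from the touch area's centroid to that midpoint.

```python
import math


def touch_relation(touch_pixels, midpoints):
    """Relate a touch area to each target midpoint.

    touch_pixels: set of (x, y) coordinate positions engaged by the contact.
    midpoints: mapping of target label -> (x, y) target midpoint.
    Returns label -> (midpoint_inside_touch_area, distance_from_centroid).
    """
    n = len(touch_pixels)
    cx = sum(x for x, _ in touch_pixels) / n
    cy = sum(y for _, y in touch_pixels) / n
    relation = {}
    for label, (mx, my) in midpoints.items():
        inside = (round(mx), round(my)) in touch_pixels
        relation[label] = (inside, math.hypot(mx - cx, my - cy))
    return relation
```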
[0046] In one embodiment of the present invention, the touch display user interface may be configured to display a 3x3 grid of squares, each square measuring approximately 6mm x 6mm. As such, the touch display user interface measures approximately 18mm x 18mm. Further, the apparatus 200 may be configured to display each target area as one of the 6mm x 6mm squares. As shown in FIGs. 3A-3C, the user may desire to select the target area 250, 250' associated with the "5" key. Although the touch display user interface displays the target area as a 6mm x 6mm square, the effective target area 260 may be larger, corresponding to a square having dimensions of up to, but less than, 12mm x 12mm. As discussed in further detail below, the apparatus 200 may be configured to cause the selection of a function associated with a target area when the midpoint or other representative subregion of the target area is disposed within the touch area 270. As such, a user desiring to select the "5" key may contact the touch display user interface, and thus create a touch area, at any position so long as the midpoint 255, 255' or other representative subregion associated with the target area of the "5" key is disposed within the touch area 270. This example is purely illustrative; an arrangement of shapes may use dimensions other than a 3x3 grid, may not be a grid at all, may use shapes other than squares, such as circles or rectangles, or may use squares of dimensions other than 6mm x 6mm.
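The geometry of this illustrative keypad is easy to reproduce. The sketch below builds the 3x3 grid described above and confirms the spacing of the midpoints; the labels and layout come from the example, while everything else is a hypothetical construction.

```python
# Build the hypothetical 3x3 keypad of 6mm x 6mm keys described above.
KEY_SIZE_MM = 6.0
LABELS = ["1", "2", "3", "4", "5", "6", "7", "8", "9"]

keys = {}
for index, label in enumerate(LABELS):
    row, col = divmod(index, 3)
    keys[label] = (col * KEY_SIZE_MM, row * KEY_SIZE_MM, KEY_SIZE_MM, KEY_SIZE_MM)


def midpoint(rect):
    left, top, width, height = rect
    return (left + width / 2.0, top + height / 2.0)


# The "5" midpoint sits at (9, 9); the neighboring "6" midpoint is 6mm away.
assert midpoint(keys["5"]) == (9.0, 9.0)
assert midpoint(keys["6"])[0] - midpoint(keys["5"])[0] == 6.0

# A touch can therefore engage the "5" midpoint from anywhere within an
# effective region approaching, but smaller than, 12mm x 12mm before it
# also engages a neighboring key's midpoint.
```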
[0047] FIG. 4 illustrates a flowchart according to an example method for preventing erroneous touch screen inputs made with a touch display user interface according to an example embodiment. The operations illustrated in and described with respect to FIG. 4 may, for example, be performed by, with the assistance of, and/or under the control of one or more of the processor 110, memory 112, communication interface 114, user interface 116, or UI control circuitry 122. Operation 302 may comprise receiving a user input to a touch display (e.g., a touch display user interface 116). The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 302. Operation 304 may comprise determining a touch area corresponding to the user input. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 304. Operation 306 may comprise determining a relation of the touch area to a representative subregion of at least one target area of the touch display user interface. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 306. Operation 308 may comprise causing, based at least in part on the determined relation, a selection corresponding to the user input. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 308.
[0048] FIG. 5 illustrates a flowchart according to an example method for preventing erroneous touch screen inputs made with a touch display user interface according to another example embodiment. The operations illustrated in and described with respect to FIG. 5 may, for example, be performed by, with the assistance of, and/or under the control of one or more of the processor 110, memory 112, communication interface 114, user interface 116, or UI control circuitry 122. Operation 402 may comprise receiving a user input to a touch display user interface. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 402. Operation 404 may comprise determining a touch area of the touch display user-interface engaged by the user contacting the touch display user interface. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for
performing operation 404. Operation 406 may comprise determining a target midpoint of at least one target area disposed on the touch display user-interface. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 406. Operation 410 may comprise determining whether at least one target midpoint is disposed within the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 410. If at least one target midpoint is disposed within the touch area, operation 420 may comprise determining whether a plurality of target midpoints is disposed within the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 420. If a single midpoint is disposed within the touch area, operation 430 may comprise causing a selection corresponding to the target area corresponding to the midpoint disposed within the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 430. If no midpoints are disposed within the touch area, as determined by operation 410, then operation 440 may comprise causing no selection corresponding to any of the target areas. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for
performing operation 440. If a plurality of midpoints is disposed within the touch area, then operation 450 may comprise determining a centroid of the touch area. Further, operation 452 may comprise determining the shortest distance between the centroid and the plurality of target midpoints, and operation 454 may comprise causing a selection corresponding to the target area corresponding to the midpoint with the shortest distance to the centroid of the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operations 450, 452, and 454.
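Read end to end, the FIG. 5 flowchart amounts to the following selection routine. This is a sketch under the same hypothetical data shapes as the earlier examples, not code from the disclosure.

```python
import math


def resolve_selection(touch_pixels, midpoints):
    """Selection per the FIG. 5 flowchart.

    No midpoint in the touch area -> no selection (operation 440); exactly
    one -> select its target (operation 430); several -> select the target
    whose midpoint is closest to the touch-area centroid (operations
    450, 452, and 454).
    """
    hits = [label for label, (mx, my) in midpoints.items()
            if (round(mx), round(my)) in touch_pixels]
    if not hits:
        return None
    if len(hits) == 1:
        return hits[0]
    n = len(touch_pixels)
    cx = sum(x for x, _ in touch_pixels) / n
    cy = sum(y for _, y in touch_pixels) / n
    return min(hits, key=lambda label: math.hypot(midpoints[label][0] - cx,
                                                  midpoints[label][1] - cy))
```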
[0049] FIG. 6 illustrates a flowchart according to an example method for preventing erroneous touch screen inputs made with a touch display user interface according to yet another example embodiment. The operations illustrated in and described with respect to FIG. 6 may, for example, be performed by, with the assistance of, and/or under the control of one or more of the processor 110, memory 112, communication interface 114, user interface 116, or UI control circuitry 122. Operation 502 may comprise receiving a user input to a touch display user interface. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 502. Operation 504 may comprise determining a touch area of the touch display user-interface engaged by the user contacting the touch display user interface. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 504. Operation 506 may comprise determining a target midpoint of at least one target area disposed on the touch display user-interface. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 506. Operation 510 may comprise determining whether at least one target midpoint is disposed within the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 510. If at least one target midpoint is disposed within the touch area, operation 520 may comprise determining whether a plurality of target midpoints is disposed within the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for
performing operation 520. If a single midpoint is disposed within the touch area, operation 530 may comprise causing a selection corresponding to the target area corresponding to the midpoint disposed within the touch area. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 530. If no midpoints are disposed within the touch area, as determined by operation 510, then operation 540 may comprise causing no selection corresponding to any of the target areas. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 540. If a plurality of midpoints is disposed within the touch area, then operation 550 may comprise causing no selection corresponding to any of the target areas. The processor 110, memory 112, user interface 116, and/or UI control circuitry 122 may, for example, provide means for performing operation 550.
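The FIG. 6 variant thus differs from FIG. 5 only in its final branch: an ambiguous touch produces no selection rather than a nearest-midpoint fallback. A correspondingly simpler sketch, under the same hypothetical data shapes, follows.

```python
def resolve_selection_strict(touch_pixels, midpoints):
    """Selection per the FIG. 6 flowchart.

    Exactly one target midpoint disposed within the touch area selects
    that target (operation 530); zero midpoints (operation 540) or two or
    more midpoints (operation 550) cause no selection at all.
    """
    hits = [label for label, (mx, my) in midpoints.items()
            if (round(mx), round(my)) in touch_pixels]
    return hits[0] if len(hits) == 1 else None
```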
[0050] FIGs. 4-6 each illustrate a flowchart of a system, method, and computer program product according to an example embodiment. It will be understood that each block of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware and/or a computer program product comprising one or more computer-readable mediums having computer readable program instructions stored thereon. For example, one or more of the procedures described herein may be embodied by computer program instructions of a computer program product. In this regard, the computer program product(s) which embody the procedures described herein may be stored by one or more memory devices of a mobile terminal, server, or other computing device (for example, in the memory 112) and executed by a processor in the computing device (for example, by the processor 110). In some embodiments, the computer program instructions comprising the computer program product(s) which embody the procedures described above may be stored by memory devices of a plurality of computing devices. As will be appreciated, any such computer program product may be loaded onto a computer or other programmable apparatus (for example, an apparatus 102) to produce a machine, such that the computer program product including the instructions which execute on the computer or other programmable apparatus creates means for implementing the functions specified in the flowchart block(s). Further, the computer program product may comprise one or more computer-readable memories on which the computer program instructions may be stored such that the one or more computer-readable memories can direct a computer or other programmable apparatus to function in a particular manner, such that the computer program product comprises an article of manufacture which implements the function specified in the flowchart block(s). The computer program instructions of one or more computer program products may also be loaded onto a computer or other programmable apparatus (for example, an apparatus 102) to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus implement the functions specified in the flowchart block(s).
[0051] Accordingly, blocks of the flowcharts support combinations of means for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer program product(s).
[0052] The above described functions may be carried out in many ways. For example, any suitable means for carrying out each of the functions described above may be employed to carry out embodiments of the invention. In one embodiment, a suitably configured processor (for example, the processor 110) may provide all or a portion of the elements, such as each of those shown in Figures 4-6, and, as such, may constitute means for receiving a user input to the touch display user-interface, means for determining a touch area that corresponds to the user input to the touch display user-interface, means for determining a relation of the touch area to a representative subregion of at least one target area disposed on the touch display user-interface and/or means for causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface. In another embodiment, all or a portion of the elements may be configured by and operate under control of a computer program product. The computer program product for performing the methods of an example embodiment of the invention includes a computer-readable storage medium (for example, the memory 112), such as the non-volatile storage medium, and computer-readable program code portions, such as a series of computer instructions, embodied in the computer-readable storage medium.
[0053] Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments of the invention are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the invention. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the invention. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated within the scope of the invention. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

WHAT IS CLAIMED IS:
1. A method comprising:
receiving a user input to a touch display user-interface;
determining a touch area that corresponds to the user input to the touch display user-interface;
determining, by a processor, a relation of the touch area to a representative subregion of at least one target area disposed on the touch display user-interface; and
causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
2. The method of Claim 1, wherein determining the touch area corresponding to the user input comprises:
determining an area of the touch display user-interface engaged by the user contacting the touch display user-interface.
3. The method of Claim 1, wherein determining the relation of the touch area to a representative subregion of at least one of the target areas comprises:
determining at least one target midpoint of at least one target area disposed on the touch display user-interface; and
determining if at least one target midpoint is disposed within the touch area.
4. The method of Claim 3, wherein causing the selection corresponding to the user input comprises:
causing selection of a function associated with a target area in an instance in which a target midpoint corresponding to the target area is disposed within the touch area; and
causing selection of no functions associated with at least one of the target areas in an instance in which at least two target midpoints corresponding to two separate target areas are disposed within the touch area.
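As an informal illustration of the rule recited in Claims 3 and 4, the Python sketch below selects a target's function only when exactly one target midpoint lies inside the touch area, and selects nothing when two or more do. The rectangle representation and all names are editorial assumptions, not language from the claims.

    # Sketch of the Claim 3-4 rule, assuming touch and target areas are
    # axis-aligned rectangles given as (x, y, width, height).
    from typing import List, Optional, Tuple

    Rect = Tuple[float, float, float, float]

    def midpoint(r: Rect) -> Tuple[float, float]:
        x, y, w, h = r
        return (x + w / 2.0, y + h / 2.0)

    def contains(r: Rect, p: Tuple[float, float]) -> bool:
        x, y, w, h = r
        return x <= p[0] <= x + w and y <= p[1] <= y + h

    def select_by_midpoint(touch: Rect, targets: List[Rect]) -> Optional[int]:
        """Return the index of the selected target, or None when zero or
        two-plus target midpoints fall inside the touch area (Claim 4)."""
        inside = [i for i, t in enumerate(targets) if contains(touch, midpoint(t))]
        return inside[0] if len(inside) == 1 else None

    # Two adjacent 40x40 virtual keys.
    keys = [(0, 0, 40, 40), (40, 0, 40, 40)]
    assert select_by_midpoint((10, 10, 20, 20), keys) == 0     # one midpoint captured
    assert select_by_midpoint((10, 10, 55, 20), keys) is None  # both midpoints captured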
5. The method of Claim 1, wherein determining the relation of the touch area to a representative subregion of at least one of the target areas further comprises:
determining a centroid point of the touch area; determining at least one target midpoint of at least one target area disposed on the touch display user-interface;
determining a distance between the centroid point and the at least one target midpoint; and
determining if the target midpoint is disposed within the touch area.
6. The method of Claim 5, wherein causing the selection comprises:
causing selection of a function associated with a target area in an instance in which a single target midpoint corresponding to the target area is disposed within the touch area; and
causing selection of a function associated with the target area having the shortest distance between the centroid point and the corresponding target midpoint in an instance in which the touch area comprises at least two target midpoints.
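Claims 5 and 6 refine the rule with a distance tie-break. The hedged Python sketch below (same rectangle assumption and invented names as above) picks the target whose midpoint is nearest the centroid of the touch area when two or more midpoints are captured; for the rectangular touch model used here the centroid coincides with the rectangle's midpoint, whereas the claims allow any touch-area shape.

    # Sketch of the Claim 5-6 rule: nearest midpoint to the touch centroid wins.
    import math
    from typing import List, Optional, Tuple

    Rect = Tuple[float, float, float, float]  # (x, y, width, height)

    def midpoint(r: Rect) -> Tuple[float, float]:
        x, y, w, h = r
        return (x + w / 2.0, y + h / 2.0)

    def contains(r: Rect, p: Tuple[float, float]) -> bool:
        x, y, w, h = r
        return x <= p[0] <= x + w and y <= p[1] <= y + h

    def select_by_centroid(touch: Rect, targets: List[Rect]) -> Optional[int]:
        centroid = midpoint(touch)  # the centroid of a rectangle is its midpoint
        inside = [i for i, t in enumerate(targets) if contains(touch, midpoint(t))]
        if not inside:
            return None          # no midpoint captured: no selection
        if len(inside) == 1:
            return inside[0]     # single midpoint: select that target's function
        # Two or more midpoints: shortest centroid-to-midpoint distance wins.
        return min(inside, key=lambda i: math.dist(centroid, midpoint(targets[i])))

    keys = [(0, 0, 40, 40), (40, 0, 40, 40)]
    # The touch spans both keys, but its centroid (35, 20) is nearer the
    # left key's midpoint (20, 20), so the left key is selected.
    assert select_by_centroid((5, 5, 60, 30), keys) == 0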
7. The method of Claim 1, wherein causing the selection corresponding to the user input comprises causing selection of a function associated with a target area.
8. An apparatus comprising at least one processor and at least one memory storing computer program code, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to at least:
receive a user input to a touch display user-interface;
determine a touch area that corresponds to the user input to the touch display user-interface;
determine a relation of the touch area to a representative subregion of at least one target area disposed on the touch display user-interface; and
cause, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
9. The apparatus of Claim 8, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to:
determine an area of the touch display user-interface engaged by the user contacting the touch display user-interface.
10. The apparatus of Claim 8, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to:
determine at least one target midpoint of at least one target area disposed on the touch display user-interface; and
determine if at least one target midpoint is disposed within the touch area.
11. The apparatus of Claim 10, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to:
cause selection of a function associated with a target area in an instance in which a target midpoint corresponding to the target area is disposed within the touch area; and
cause selection of no functions associated with at least one of the target areas in an instance in which at least two of the target midpoints corresponding to two separate target areas are disposed within the touch area.
12. The apparatus of Claim 8, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to cause selection of a function associated with a target area.
13. The apparatus of Claim 8, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to:
determine a centroid point of the touch area;
determine at least one target midpoint of a plurality of target areas disposed on the touch display user-interface;
determine a distance between the centroid point and the at least one target midpoint; and
determine if the target midpoint is disposed within the touch area.
14. The apparatus of Claim 13, wherein the at least one memory and stored computer program code are configured, with the at least one processor, to cause the apparatus to:
cause selection of a function associated with a target area in an instance in which a single target midpoint corresponding to the target area is disposed within the touch area; and
cause selection of a function associated with the target area having the shortest distance between the centroid point and the corresponding target midpoint in an instance in which the touch area comprises at least two target midpoints.
15. A computer program product comprising at least one non-transitory computer-readable storage medium having computer-readable program instructions stored therein, the computer-readable program instructions comprising program instructions configured to cause an apparatus to perform a method comprising:
receiving a user input to a touch display user-interface;
determining a touch area that corresponds to the user input to the touch display user-interface;
determining a relation of the touch area to a representative subregion of at least one target area disposed on the touch display user-interface; and
causing, based at least in part on the determined relation, a selection corresponding to the user input to the touch display user-interface.
16. The computer program product of Claim 15, wherein determining the touch area corresponding to the user input comprises:
determining the area of the touch display user-interface engaged by the user contacting the touch display user-interface.
17. The computer program product of Claim 15, wherein determining the relation of the touch area to a representative subregion of at least one of the target areas comprises:
determining a centroid point of the touch area;
determining at least one target midpoint of at least one target area disposed on the touch display user-interface;
determining a distance between the centroid point and the at least one target midpoint; and
determining if at least one target midpoint is disposed within the touch area.
18. The computer program product of Claim 17, wherein causing the selection corresponding to the user input comprises:
causing selection of a function associated with a target area in an instance in which a singular target midpoint corresponding to the target area is disposed within the touch area; and
causing selection of no functions associated with at least one of the target areas in an instance in which at least two target midpoints corresponding to at least two separate target areas are disposed within the touch area.
19. The computer program product of Claim 17, wherein causing the selection corresponding to the user input comprises:
causing selection of a function associated with a target area in an instance in which a singular target midpoint corresponding to the target area is disposed within the touch area; and
causing selection of a function associated with a target area having the shortest distance between the centroid point and the at least one target midpoint in an instance in which the touch area includes at least two target midpoints.
20. The computer program product of Claim 15, wherein causing the selection corresponding to the user input comprises selecting a function associated with a target area.

Priority Applications (1)

Application Number: PCT/CN2011/072341 (published as WO2012129808A1)
Priority Date: 2011-03-31
Filing Date: 2011-03-31
Title: Recognizing touch screen inputs


Publications (1)

Publication Number: WO2012129808A1 (en)
Publication Date: 2012-10-04

Family ID: 46929351

Family Applications (1)

Application Number: PCT/CN2011/072341
Title: Recognizing touch screen inputs
Priority Date: 2011-03-31
Filing Date: 2011-03-31

Country Status (1)

Country: WO
Publication: WO2012129808A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party

US20160034069A1 * (Fujitsu Limited; priority 2014-08-04, published 2016-02-04): Information processing apparatus, input control method, and computer-readable recording medium

Citations (4)

* Cited by examiner, † Cited by third party

CN101263448A (en) * (Apple Inc.; priority 2005-09-16, published 2008-09-10): Activating virtual keys of a touch-screen virtual keyboard
CN101689084A (en) * (Konami Digital Entertainment Co., Ltd.; priority 2008-02-14, published 2010-03-31): A selection determination apparatus, a selection determination method, a data recording medium, and a program
CN101957722A (en) * (Huawei Device Co., Ltd.; priority 2010-09-28, published 2011-01-26): Touch screen input control method and device as well as mobile phone
CN101968711A (en) * (Beijing Borqs Software Technology Co., Ltd.; priority 2010-09-29, published 2011-02-09): Method for accurately inputting characters based on touch screen


Similar Documents

Publication Title
US9727128B2 (en) Methods, apparatuses, and computer program products for enhancing activation of an augmented reality mode
US20120223935A1 (en) Methods and apparatuses for facilitating interaction with a three-dimensional user interface
US9191798B2 (en) Methods, apparatuses, and computer program products for saving and resuming a state of a collaborative interaction session between devices based on their positional relationship
US20130009882A1 (en) Methods and apparatuses for providing haptic feedback
US20130159930A1 (en) Displaying one or more currently active applications
US20130237147A1 (en) Methods, apparatuses, and computer program products for operational routing between proximate devices
US20120249596A1 (en) Methods and apparatuses for dynamically scaling a touch display user interface
WO2012025669A1 (en) Methods and apparatuses for facilitating content navigation
US20130159899A1 (en) Display of graphical representations
US20120280915A1 (en) Method and apparatus for facilitating interacting with a multimodal user interface
US9047008B2 (en) Methods, apparatuses, and computer program products for determination of the digit being used by a user to provide input
US9063582B2 (en) Methods, apparatuses, and computer program products for retrieving views extending a user's line of sight
US20120280899A1 (en) Methods and apparatuses for defining the active channel in a stereoscopic view by using eye tracking
EP2677413B1 (en) Method for improving touch recognition and electronic device thereof
WO2014140420A2 (en) Methods, apparatuses and computer program products for improved device and network searching
US8902180B2 (en) Methods, apparatuses, and computer program products for enabling use of remote devices with pre-defined gestures
WO2012129808A1 (en) Recognizing touch screen inputs
US20140232659A1 (en) Methods, apparatuses, and computer program products for executing functions based on hover gestures or touch gestures
US9288247B2 (en) Methods and apparatus for improved navigation of content including a representation of streaming data
US20160117294A1 (en) Methods, apparatuses, and computer program products for modification of webpage based on device data
US10425586B2 (en) Methods, apparatuses, and computer program products for improved picture taking

Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 11862601
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: PCT application non-entry in European phase
Ref document number: 11862601
Country of ref document: EP
Kind code of ref document: A1