EP3060972A1 - Apparatus and method for providing indirect touch inputs on a touchscreen display - Google Patents

Apparatus and method for providing indirect touch inputs on a touchscreen display

Info

Publication number
EP3060972A1
Authority
EP
European Patent Office
Prior art keywords
touch input
display
input
location
extrapolated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP13895993.7A
Other languages
English (en)
French (fr)
Other versions
EP3060972A4 (de)
Inventor
Kenton Lyons
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Technologies Oy
Original Assignee
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy filed Critical Nokia Technologies Oy
Publication of EP3060972A1
Publication of EP3060972A4

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0338Fingerprint track pad, i.e. fingerprint sensor used as pointing device tracking the fingertip image

Definitions

  • Example embodiments of the present invention relate generally to receipt of input via a touch screen display.
  • Touch screen displays are becoming more and more prevalent in user devices, both large and small.
  • Devices such as watches, smart phones, tablet computers, and other mobile devices, as well as other devices such as computer displays, televisions, and other non-mobile terminals now come equipped with display screens that are configured to receive input from a user as a result of the user's contact (e.g., using the user's finger, a stylus, or some other object) with the screen on which content is displayed.
  • Embodiments of the present invention thus provide mechanisms for displaying content, including selectable elements, in a main portion of a display and receiving a touch input in an input region proximate the main portion of the display, where the input region does not overlap with the main portion.
  • selection of a particular selectable element may be accomplished when the location of the extrapolated touch input coincides with the location of the selectable element.
  • an apparatus may be provided that includes at least one processor and at least one memory including computer program code.
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to at provide for presentation of a selectable element in a main portion of a display; receive a touch input in an input region proximate the main portion of the display, wherein the input region does not overlap with the main portion; determine an extrapolated touch input in the main portion of the display based on a location of the touch input in the input region; and provide for selection of the selectable element in an instance in which a location of the extrapolated touch input coincides with a location of the selectable element.
  • the touch input may comprise a first touch input and the extrapolated touch input may comprise a first extrapolated touch input.
  • the at least one memory and the computer program code may be further configured to, with the processor, cause the apparatus to receive a second touch input in the input region, determine a second extrapolated touch input in the main portion of the display based on the location of the second touch input in the input region, and provide for selection of the selectable element in an instance in which an intersection of the first extrapolated touch input and the second extrapolated touch input coincides with the location of the selectable element.
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to detect an angle associated with the touch input and to determine the location of the extrapolated touch input based on the angle of the touch input that is detected.
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to provide a visual representation of the location of the extrapolated touch input in the main portion of the display.
  • the visual representation may comprise a ray extending from the location of the touch input in the input region at least partially into the main portion of the display.
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to modify the location of the extrapolated touch input based on a change in the angle of the touch input.
  • the input region may be disposed along an edge of the display.
  • the input region may be defined by a surface that is in substantially the same plane as the display. In other cases, the input region may be defined by a surface that defines a non-zero angle with the display. In still other embodiments, the input region may be defined by the display.
  • a method and a computer program product are described for providing for presentation of a selectable element in a main portion of a display; receiving a touch input in an input region proximate the main portion of the display, wherein the input region does not overlap with the main portion; determining an extrapolated touch input in the main portion of the display based on a location of the touch input in the input region; and providing for selection of the selectable element in an instance in which a location of the extrapolated touch input coincides with a location of the selectable element.
  • the touch input comprises a first touch input and the extrapolated touch input comprises a first extrapolated touch input.
  • Embodiments of the method and computer program product may in such cases include receiving a second touch input in the input region, determining a second extrapolated touch input in the main portion of the display based on the location of the second touch input in the input region, and providing for selection of the selectable element in an instance in which an intersection of the first extrapolated touch input and the second extrapolated touch input coincides with the location of the selectable element.
  • An angle associated with the touch input may be detected in some cases, and the location of the extrapolated touch input may be determined based on the angle of the touch input that is detected.
  • a visual representation of the location of the extrapolated touch input may be provided in the main portion of the display. In some embodiments, the visual representation may comprise a ray extending from the location of the touch input in the input region at least partially into the main portion of the display. In other cases, the location of the extrapolated touch input may be modified based on a change in the angle of the touch input.
  • the input region may be disposed along an edge of the display.
  • the input region may be defined by a surface that is in substantially the same plane as the display.
  • the input region may be defined by a surface that defines a non-zero angle with the display.
  • In other example embodiments, an apparatus is provided for receiving indirect touch input to a touch screen display.
  • the apparatus may include means for providing for presentation of a selectable element in a main portion of a display; means for receiving a touch input in an input region proximate the main portion of the display, wherein the input region does not overlap with the main portion; means for determining an extrapolated touch input in the main portion of the display based on a location of the touch input in the input region; and means for providing for selection of the selectable element in an instance in which a location of the extrapolated touch input coincides with a location of the selectable element.
  • the touch input may comprise a first touch input and the extrapolated touch input may comprise a first extrapolated touch input.
  • the apparatus may further comprise means for receiving a second touch input in the input region; means for determining a second extrapolated touch input in the main portion of the display based on the location of the second touch input in the input region; and means for providing for selection of the selectable element in an instance in which an intersection of the first extrapolated touch input and the second extrapolated touch input coincides with the location of the selectable element.
  • the apparatus may further comprise means for detecting an angle associated with the touch input and means for determining the location of the extrapolated touch input based on the angle of the touch input that is detected. Additionally or alternatively, the apparatus may further comprise means for providing a visual representation of the location of the extrapolated touch input in the main portion of the display.
  • FIG. 1 illustrates one example of a communication system according to an example embodiment of the present invention
  • FIG. 2 illustrates a schematic block diagram of an apparatus for receiving indirect touch input to a touch screen display according to an example embodiment of the present invention
  • FIG. 3 illustrates a user device with a touch screen display having a main portion and an input region according to an example embodiment of the present invention
  • FIG. 4 illustrates a user device with a touch screen display having a main portion and an input region, where a ray is shown extending from the location of the touch input according to an example embodiment of the present invention
  • FIG. 5 illustrates a user device with a touch screen display having a main portion and an input region extending along two edges of the display for receiving two touch inputs according to an example embodiment of the present invention
  • FIGs. 6A and 6B illustrate a user's application of a touch input to a user device, where the input involves an angle according to an example embodiment of the present invention
  • FIGs. 7A and 7B illustrate a user's application of a touch input to a user device, where the input involves a different angle than that shown in Figs. 6A and 6B according to an example embodiment of the present invention
  • FIGs. 8A and 8B illustrate a user's application of a rocking touch input to a user device according to an example embodiment of the present invention
  • FIG. 9 illustrates a user device with a touch screen display having a main portion and an input region according to another example embodiment of the present invention.
  • FIG. 10 illustrates a flowchart of methods of receiving indirect touch input to a touch screen display according to an example embodiment of the present invention.
  • The term 'circuitry' refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present.
  • This definition of 'circuitry' applies to all uses of this term herein, including in any claims.
  • the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware.
  • the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
  • The term 'selectable element' refers to, e.g., icons for launching applications, links for navigating to a website, and other graphical or textual content that, when selected, provides for execution of a particular operation.
  • With a conventional touch screen, the user's touch input is received at a location on the screen that coincides with the location where the desired selectable element is displayed.
  • As a result, the selectable element is obscured by the object in the course of providing the input.
  • the user's finger may temporarily cover up the words or images of the selectable element as the touch input is being applied to the touch screen.
  • the object applying the touch input may also cover up content in the vicinity of the selectable element being selected, including other selectable elements and other content being consumed by the user.
  • providing a touch input that coincides with the location of the selectable element may, at least temporarily, interrupt the user's consumption of the displayed content and/or cause the user to lack a visual confirmation of the particular selectable element(s) being selected at the time the input is applied.
  • example embodiments of the present invention provide mechanisms for receiving a user's touch input without obscuring the selectable element and/or other content displayed on a main portion of the touch screen display by providing an input region proximate the main portion of the display that does not overlap with the main portion and that is configured to receive touch inputs.
  • an extrapolated touch input in the main portion of the display may be determined based on the location of the touch input in the input region, and selection of a particular selectable element may be accomplished when the location of the extrapolated touch input coincides with the location of the selectable element.
  • FIG. 1 illustrates a block diagram of a mobile terminal 10 that would benefit from embodiments of the present invention. It should be understood, however, that the mobile terminal 10 as illustrated and hereinafter described is merely illustrative of one type of device that may benefit from embodiments of the present invention and, therefore, should not be taken to limit the scope of embodiments of the present invention.
  • Mobile terminals such as portable digital assistants (PDAs), watches, mobile telephones, pagers, mobile televisions, gaming devices, laptop computers, cameras, tablet computers, touch surfaces, wearable devices, video recorders, audio/video players, radios, electronic books, positioning devices (e.g., global positioning system (GPS) devices), or any combination of the aforementioned, and other types of voice and text communications systems may readily employ embodiments of the present invention.
  • other devices including fixed (non-mobile) electronic devices may also employ some example embodiments, such as large, stand-alone touch screen displays that are configured to be mounted to a wall or are otherwise configured for providing content for viewing by a group.
  • the mobile terminal 10 may include an antenna 12 (or multiple antennas) in operable communication with a transmitter 14 and a receiver 16.
  • the mobile terminal 10 may further include an apparatus, such as a processor 20 or other processing device (e.g., processor 70 of Fig. 2), which controls the provision of signals to and the receipt of signals from the transmitter 14 and receiver 16, respectively.
  • the signals may include signaling information in accordance with the air interface standard of the applicable cellular system, and also user speech, received data and/or user generated data.
  • the mobile terminal 10 may be capable of operating in accordance with non-cellular communication mechanisms.
  • the mobile terminal 10 may be capable of communication in a wireless local area network (WLAN) or other communication networks.
  • the processor 20 may include circuitry desirable for implementing audio and logic functions of the mobile terminal 10.
  • the processor 20 may be comprised of a digital signal processor device, a microprocessor device, and various analog to digital converters, digital to analog converters, and other support circuits. Control and signal processing functions of the mobile terminal 10 are allocated between these devices according to their respective capabilities.
  • the processor 20 thus may also include the functionality to convolutionally encode and interleave message and data prior to modulation and transmission.
  • the processor 20 may additionally include an internal voice coder, and may include an internal data modem.
  • the processor 20 may include functionality to operate one or more software programs, which may be stored in memory.
  • the processor 20 may be capable of operating a connectivity program, such as a conventional Web browser. The connectivity program may then allow the mobile terminal 10 to transmit and receive Web content, such as location-based content and/or other web page content, according to a Wireless Application Protocol (WAP), Hypertext Transfer Protocol (HTTP) and/or the like, for example.
  • the mobile terminal 10 may also comprise a user interface including an output device such as a conventional earphone or speaker 24, a ringer 22, a microphone 26, a display 28, and a user input interface, all of which are coupled to the processor 20.
  • the user input interface which allows the mobile terminal 10 to receive data, may include any of a number of devices allowing the mobile terminal 10 to receive data, such as a keypad 30, a touch screen display (display 28 providing an example of such a touch screen display) or other input device.
  • the keypad 30 may include the conventional numeric (0-9) and related keys (#, *), and other hard and soft keys used for operating the mobile terminal 10.
  • the keypad 30 may include a conventional QWERTY keypad arrangement.
  • the keypad 30 may also include various soft keys with associated functions.
  • the mobile terminal 10 may include an interface device such as a joystick or other user input interface.
  • the mobile terminal 10 further includes a battery 34, such as a vibrating battery pack, for powering various circuits that are required to operate the mobile terminal 10, as well as optionally providing mechanical vibration as a detectable output.
  • the mobile terminal 10 may further include a user identity module (UIM) 38.
  • the UIM 38 is typically a memory device having a processor built in.
  • the UIM 38 may include, for example, a subscriber identity module (SIM), a universal integrated circuit card (UICC), a universal subscriber identity module (USIM), a removable user identity module (R-UIM), etc.
  • the UIM 38 typically stores information elements related to a mobile subscriber.
  • the mobile terminal 10 may be equipped with memory.
  • the mobile terminal 10 may include volatile memory 40, such as volatile Random Access Memory (RAM) including a cache area for the temporary storage of data.
  • the mobile terminal 10 may also include other non-volatile memory 42, which may be embedded and/or may be removable.
  • the memories may store any of a number of pieces of information, and data, used by the mobile terminal 10 to implement the functions of the mobile terminal 10.
  • FIG. 2 depicts certain elements of an apparatus 50 that provides for receipt of indirect touch input to a touch screen display.
  • the apparatus 50 of Fig. 2 may be employed, for example, with the mobile terminal 10 of Fig. 1.
  • the apparatus 50 of Fig. 2 may also be employed in connection with a variety of other devices, both mobile and fixed, and therefore, embodiments of the present invention should not be limited to application on devices such as the mobile terminal 10 of Fig. 1.
  • the apparatus 50 may be employed on a watch, a personal computer, a tablet, a mobile telephone, a large, stand-alone, mountable display, or other user terminal.
  • part or all of the apparatus 50 may be on a fixed device such as a server or other service platform, and the content may be presented (e.g., via a server/client relationship) on a remote device such as a user terminal (e.g., the mobile terminal 10) based on processing that occurs at the fixed device.
  • While FIG. 2 illustrates one example of a configuration of an apparatus 50 for receiving indirect touch input, numerous other configurations may also be used to implement embodiments of the present invention.
  • Where devices or elements are shown as being in communication with each other, such devices or elements should be considered to be capable of being embodied within the same device or element, and thus devices or elements shown in communication should be understood to alternatively be portions of the same device or element.
  • the apparatus 50 may include or otherwise be in communication with a processor 70, a user interface transceiver 72, a communication interface 74, and a memory device 76.
  • the processor 70 (and/or coprocessors or any other processing circuitry assisting or otherwise associated with the processor 70) may be in communication with the memory device 76 via a bus for passing information among components of the apparatus 50.
  • the memory device 76 may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device 76 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device like the processor 70).
  • the memory device 76 may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus to carry out various functions in accordance with an example embodiment of the present invention.
  • the memory device 76 could be configured to buffer input data for processing by the processor 70. Additionally or alternatively, the memory device 76 could be configured to store instructions for execution by the processor 70.
  • the apparatus 50 may, in some embodiments, be a mobile terminal (e.g., mobile terminal 10) or a fixed communication device or computing device configured to employ an example embodiment of the present invention.
  • the apparatus 50 may be embodied as a chip or chip set.
  • the apparatus 50 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard).
  • the structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon.
  • the apparatus 50 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip."
  • a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processor 70 may be embodied in a number of different ways.
  • the processor 70 may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like.
  • the processor 70 may include one or more processing cores configured to perform independently.
  • a multi-core processor may enable multiprocessing within a single physical package.
  • the processor 70 may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
  • the processor 70 may be configured to execute instructions stored in the memory device 76 or otherwise accessible to the processor 70. Alternatively or additionally, the processor 70 may be configured to execute hard-coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 70 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly. Thus, for example, when the processor 70 is embodied as an ASIC, FPGA or the like, the processor 70 may be specifically configured hardware for conducting the operations described herein.
  • the instructions may specifically configure the processor 70 to perform the algorithms and/or operations described herein when the instructions are executed.
  • the processor 70 may be a processor of a specific device (e.g., a mobile terminal or network device) adapted for employing an embodiment of the present invention by further configuration of the processor 70 by instructions for performing the algorithms and/or operations described herein.
  • the processor 70 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 70.
  • the communication interface 74 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the apparatus 50.
  • the communication interface 74 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network.
  • the communication interface 74 may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s).
  • the communication interface 74 may alternatively or also support wired communication.
  • the communication interface 74 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
  • the user interface transceiver 72 may be in communication with the processor 70 to receive an indication of a user input and/or to cause provision of an audible, visual, mechanical or other output to the user.
  • the user interface transceiver 72 may include, for example, a keyboard, a mouse, a joystick, a display, a touch screen(s), touch areas, soft keys, a microphone, a speaker, or other input/output mechanisms.
  • the processor 70 may comprise user interface circuitry configured to control at least some functions of one or more user interface elements such as, for example, a speaker, ringer, microphone, display, and/or the like.
  • the processor 70 and/or user interface circuitry comprising the processor 70 may be configured to control one or more functions of one or more user interface elements through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor 70 (e.g., memory device 76, and/or the like).
  • the apparatus 50 may include or otherwise be in communication with a touch screen display 68 (e.g., the display 28).
  • the touch screen display 68 may be a two dimensional (2D) or three dimensional (3D) display.
  • the touch screen display 68 may be embodied as any known touch screen display.
  • the touch screen display 68 could be configured to enable touch recognition by any suitable technique, such as resistive, capacitive, infrared, strain gauge, surface wave, optical imaging, dispersive signal technology, acoustic pulse recognition, and/or other techniques.
  • the user interface transceiver 72 may be in communication with the touch screen display 68 to receive touch inputs at the touch screen display 68 and to analyze and/or modify a response to such indications based on corresponding user actions that may be inferred or otherwise determined responsive to the touch inputs.
  • the apparatus 50 may include a touch screen interface 80.
  • the touch screen interface 80 may, in some instances, be a portion of the user interface transceiver 72.
  • the touch screen interface 80 may be embodied as the processor 70 or may be a separate entity controlled by the processor 70.
  • the processor 70 may be said to cause, direct or control the execution or occurrence of the various functions attributed to the touch screen interface 80 (and any components of the touch screen interface 80) as described herein.
  • the touch screen interface 80 may be any means such as a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g., processor 70 operating under software control, the processor 70 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of the touch screen interface 80 as described herein.
  • A device or circuitry (e.g., the processor 70 in one example) executing the software forms the structure associated with such means.
  • the touch screen interface 80 may be configured to receive an input in the form of a touch event at the touch screen display 68. As such, the touch screen interface 80 may be in communication with the touch screen display 68 to receive user inputs at the touch screen display 68 and to modify a response to such inputs based on corresponding user actions that may be inferred or otherwise determined responsive to the inputs. Following recognition of a touch event, the touch screen interface 80 may be configured to determine a classification of the touch event and provide a corresponding function based on the touch event in some situations.
  • the touch screen interface 80 may include a detector 82, a display manager 84, and a gesture classifier 86.
  • Each of the detector 82, the display manager 84, and the gesture classifier 86 may be any device or means embodied in either hardware or a combination of hardware and software configured to perform the corresponding functions associated with the detector 82, the display manager 84, and the gesture classifier 86, respectively, as described herein.
  • each of the detector 82, the display manager 84, and the gesture classifier 86 may be controlled by or otherwise embodied as the processor 70.
  • the detector 82 may be in communication with the touch screen display 68 to receive user inputs in order to recognize and/or determine a touch event based on each input received at the detector 82.
  • a touch event may be defined as a detection of an object, such as a stylus, finger, pen, pencil, cellular telephone, digital camera, or any other mobile device (including the mobile terminal 10 shown in Fig. 1) or object, coming into contact with a portion of the touch screen display in a manner sufficient to register as a touch.
  • a touch event could be a detection of pressure on the screen of the touch screen display 68 above a particular pressure threshold over a given area.
  • the detector 82 may be further configured to pass along the data corresponding to the touch event (e.g., location of touch, length of touch, number of objects touching, touch pressure, touch area, speed of movement, direction of movement, length of delay, frequency of touch, etc.) to the gesture classifier 86 for gesture classification.
  • the detector 82 may include or be in communication with one or more force sensors configured to measure the amount of touch pressure (e.g., force over a given area) applied as a result of a touch event, as an example.
  • the gesture classifier 86 may be configured to recognize and/or determine a corresponding classification of a touch event.
  • the gesture classifier 86 may be configured to perform gesture classification to classify the touch event as any of a number of possible gestures.
  • Some examples of recognizable gestures may include a touch, multi-touch, stroke, character, symbol, shape, pinch event (e.g., a pinch in or pinch out), and/or the like.
  • the gesture classifier 86 may also be configured to communicate detection information regarding the recognition, detection, and/or classification of a touch event to the display manager 84.
  • the display manager 84 may be configured to provide control over modifications made to that which is displayed on the touch screen display 68 based on the detection information received from the detector 82 and gesture classifications provided by the gesture classifier 86 in accordance with the responses prescribed for each respective gesture classification and implementation characteristic determined by the gesture classifier 86.
  • the display manager 84 may configure the display (e.g., with respect to the content displayed and/or the user interface effects presented relative to the content displayed) according to the gesture classification and implementation characteristic classification determined for a given touch event that may be detected at the display.
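  • The division of labor among the detector 82, the gesture classifier 86, and the display manager 84 can be pictured with a minimal sketch. The class names, method signatures, pressure threshold, and gesture labels below are illustrative assumptions only, not the patent's or any platform's API.

```python
from dataclasses import dataclass
from typing import Optional

PRESSURE_THRESHOLD = 0.2  # assumed value; the text only requires "above a particular pressure threshold"

@dataclass
class TouchEvent:
    x: float
    y: float
    pressure: float
    duration_ms: int

class Detector:
    """Registers a touch event when pressure exceeds the threshold (cf. detector 82)."""
    def detect(self, raw: dict) -> Optional[TouchEvent]:
        if raw["pressure"] < PRESSURE_THRESHOLD:
            return None
        return TouchEvent(raw["x"], raw["y"], raw["pressure"], raw["duration_ms"])

class GestureClassifier:
    """Assigns a coarse gesture label to a touch event (cf. gesture classifier 86)."""
    def classify(self, event: TouchEvent) -> str:
        return "stroke" if event.duration_ms > 300 else "touch"

class DisplayManager:
    """Decides how the display should respond to a classified event (cf. display manager 84)."""
    def apply(self, event: TouchEvent, gesture: str) -> str:
        return f"render feedback for '{gesture}' at ({event.x:.0f}, {event.y:.0f})"

# Example flow: raw sensor reading -> detection -> classification -> display update.
raw = {"x": 310.0, "y": 42.0, "pressure": 0.6, "duration_ms": 120}
event = Detector().detect(raw)
if event is not None:
    print(DisplayManager().apply(event, GestureClassifier().classify(event)))
```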
  • An apparatus 50 as shown in Fig. 2 may be provided, such as an apparatus embodied by a device 100 that includes a touch screen display 105 (such as the display 68 of Fig. 2).
  • the touch screen display 105 may be the display 28 of the mobile terminal 10 shown in Fig. 1 (e.g., a cellular phone).
  • the apparatus may comprise at least one processor (e.g., processor 70 of Fig. 2) and at least one memory (e.g., memory device 76 of Fig. 2) including computer program code.
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to provide for presentation of a selectable element in a main portion 115 of the display 105.
  • three selectable elements are presented in the main portion 115 of the display, including an icon 110a for launching an email application, an icon 110b for launching a word processing application, and an icon 110c for placing a phone call.
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to receive a touch input (represented by the oval 120) in an input region 125 proximate the main portion of the display 105 and to determine an extrapolated touch input (represented by the dashed line oval 120') in the main portion 115 of the display based on the location of the touch input 120 in the input region.
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to provide for selection of the selectable element in an instance in which a location of the extrapolated touch input coincides with a location of the selectable element. As illustrated in Fig. 3, the input region 125 does not overlap with the main portion 115 of the display 105.
  • the input region 125 may be provided at any location that is convenient to the user considering the type of device 100 and its configuration (e.g., the size of the display 105 and the intended use of the device, such as whether the device is hand-held, attached to a stand for positioning on a user's desk, or mountable on a wall, for example).
  • the input region 125 is defined by the display 105 and is disposed along an edge of the display.
  • the input region 125 is disposed along a right-hand edge of the display 105
  • the input region 125 is disposed along both a right-hand edge and a bottom edge of the display.
  • the input region 125 may comprise more than one region, which may be continuous or spaced apart, depending on the device and/or the preferences of the user.
  • the input region 125 may be defined by a surface (which may or may not be the display surface) that is in the same plane as the display 105.
  • the input region 125 may be defined by a surface that defines a non-zero angle with the display 105, such as a surface that is perpendicular to the surface of the display, as shown.
  • the apparatus is embodied by a device 100 that is a wrist watch.
  • the input region 125 is provided around the rim of the display 105 (which, in this case, serves as the watch face).
  • the input region is provided via one or more touch sensors provided on the outside edge of the device, where a digital watch's buttons may otherwise be provided.
  • the surface defining the input region 125 in embodiments such as in Fig. 9 may be configured to enable touch recognition by any suitable technique, such as resistive, capacitive, infrared, strain gauge, surface wave, optical imaging, dispersive signal technology, acoustic pulse recognition, and/or other techniques.
  • the input region 125 may be provided in an area that is not part of the touch screen display, but is in the same plane (or substantially the same plane) as the touch screen display.
  • the input region may be provided in a bezel of the display (which may be round, as shown in Fig. 9, or rectangular or another shape), such that the input region may be in substantially the same plane, but outside of where the pixels of the display are provided.
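  • For layouts such as those of Figs. 3-5, the non-overlapping split between the main portion and the input region can be modeled by carving edge strips out of the display area. The dimensions, strip widths, and function names in the sketch below are assumptions for illustration, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Strip:
    """An input region modeled as a strip along one edge of the display (illustrative)."""
    edge: str     # 'right' or 'bottom'
    width: float  # strip thickness in pixels

DISPLAY_W, DISPLAY_H = 320.0, 240.0
INPUT_REGIONS = [Strip("right", 24.0), Strip("bottom", 24.0)]

def region_hit(x: float, y: float) -> str:
    """Classify a touch as landing in an input region or in the main portion.
    The input regions do not overlap the main portion: the main portion is the
    display area left over once the edge strips are carved out."""
    for strip in INPUT_REGIONS:
        if strip.edge == "right" and x >= DISPLAY_W - strip.width:
            return "input region (right edge)"
        if strip.edge == "bottom" and y >= DISPLAY_H - strip.width:
            return "input region (bottom edge)"
    return "main portion"

print(region_hit(310, 40))   # near the right edge -> input region
print(region_hit(100, 100))  # -> main portion
```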
  • The user applies a touch input 120 (e.g., using his finger) to the input region 125 at a point along a y-axis of the display 105 that corresponds to a point y1 along the y-axis at which the desired selectable element (in this case, the icon 110a for the email application) is displayed.
  • The at least one memory and the computer program code may, with the processor (e.g., via the detector 82 of the touch screen interface 80 shown in Fig. 2), determine that the location along the y-axis of the touch input 120 corresponds to the location along the y-axis of the icon 110a.
  • the extrapolated touch input 120' may be determined as one or more points along the x-axis at the same y-coordinate as the applied touch input 120.
  • the extrapolated touch input 120' may be considered to coincide with the location of the selectable element when the location of the extrapolated touch input (e.g., the y-coordinate of the extrapolated touch input in the depicted embodiment) is within a certain distance along the y-axis (e.g., y-coordinate) of the location of the icon 110a.
  • the center of the displayed icon 110a may be considered the nominal location of the icon, and any touch input 120 resulting in an extrapolated touch input 120' that is within a certain y-distance of the icon center point (e.g., within 1 cm for a touch screen display provided on a mobile phone) may be considered to be a touch input that corresponds to the location of the icon.
  • the touch input 120 and resulting extrapolated touch input 120' may be considered to have an area (e.g., an area across which pressure from the user's finger or other object is applied to the touch screen 105), and the location of the extrapolated touch input may be considered to coincide with the location of the selectable element 110a in an instance in which the corresponding area of the extrapolated touch input overlaps with the area of the icon presented.
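  • As a concrete illustration of the y-axis example above, the sketch below extrapolates a touch received in a right-hand edge input region across the main portion and selects the icon whose center lies within a tolerance of the extrapolated y-coordinate, or whose area overlaps the extrapolated touch area. The icon layout, tolerance value, and names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Icon:
    name: str
    x: float      # center x of the icon (pixels)
    y: float      # center y of the icon (pixels)
    half: float   # half-height of the icon's displayed area

# Hypothetical layout: three icons stacked along the y-axis of the main portion.
ICONS = [Icon("email", 80, 40, 24), Icon("word", 80, 120, 24), Icon("phone", 80, 200, 24)]
Y_TOLERANCE = 30.0  # stand-in for "within a certain y-distance" (e.g., roughly 1 cm)

def extrapolate(touch_y: float) -> float:
    """The extrapolated input shares the y-coordinate of the touch in the edge region."""
    return touch_y

def select_by_distance(touch_y: float) -> Optional[Icon]:
    """Pick the icon whose center y is closest to the extrapolated y, within tolerance."""
    y = extrapolate(touch_y)
    best = min(ICONS, key=lambda icon: abs(icon.y - y))
    return best if abs(best.y - y) <= Y_TOLERANCE else None

def select_by_overlap(touch_y: float, touch_half: float) -> Optional[Icon]:
    """Alternative rule: the extrapolated touch area must overlap the icon's area."""
    y = extrapolate(touch_y)
    for icon in ICONS:
        if abs(icon.y - y) <= icon.half + touch_half:
            return icon
    return None

print(select_by_distance(45))        # -> the email icon (5 px from its center)
print(select_by_overlap(118, 10))    # -> the word-processing icon (areas overlap)
```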
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to provide a visual representation of the location of the extrapolated touch input 120' in the main portion 115 of the display.
  • the visual representation may be the dashed line oval or some other visual indication (e.g., a color, a shaded area, etc.) of the general location of the extrapolated touch input 120'. In other cases, however, the location of the extrapolated touch input 120' may not be visually apparent on the touch screen display 105.
  • the visual representation may comprise a ray 130 extending from the location of the touch input 120 in the input region 125 at least partially into the main portion 115 of the display 105, as illustrated in Fig. 4.
  • the ray 130 may, for example, be an illuminated line that extends along the x-axis from the location of the touch input 120 (e.g., from a nominal location of the touch input, such as a center of the touch input) into the main portion of the display, as shown.
  • the extrapolated touch input 120' may be considered any point along the ray 130, such that the extrapolated touch input may be deemed to coincide with the location of the selectable element 110a in an instance in which the ray 130 intersects the selectable element.
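  • Where the visual representation is a ray, the selection rule can be expressed as a ray/rectangle intersection test. The sketch below assumes a horizontal ray cast from an input region on the right-hand edge and axis-aligned icon bounds; the geometry and names are illustrative.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

def ray_hits(touch_y: float, bounds: Rect) -> bool:
    """A horizontal ray cast at touch_y from the right-hand input region intersects an
    axis-aligned rectangle whenever touch_y lies between its top and bottom edges."""
    return bounds.top <= touch_y <= bounds.bottom

def first_hit(touch_y: float, icons: List[Tuple[str, Rect]]) -> Optional[str]:
    """Return the icon nearest the input region that the ray passes through, if any."""
    hits = [(bounds.right, name) for name, bounds in icons if ray_hits(touch_y, bounds)]
    return max(hits)[1] if hits else None  # largest 'right' edge = closest to the ray's origin

icons = [("email", Rect(40, 20, 120, 70)), ("word", Rect(40, 100, 120, 150))]
print(first_hit(45, icons))   # -> 'email' (ray crosses the email icon's bounds)
print(first_hit(85, icons))   # -> None (ray passes between the two icons)
```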
  • the touch input 120 may comprise a first touch input 120-y
  • the extrapolated touch input 120' may comprise a first extrapolated touch input 120'-y.
  • the at least one memory and the computer program code may be further configured to, with the processor, cause the apparatus to receive a second touch input 120-x in the input region 125 of the display 105, determine a second extrapolated touch input 120'-x in the main portion 115 of the display 105 based on the location of the second touch input in the input region, and provide for selection of the selectable element 110a in an instance in which an intersection of the first extrapolated touch input and the second extrapolated touch input coincides with the location of the selectable element.
  • the input region 125 comprises a portion of the display extending along the y-axis and a portion of the display extending along the x-axis.
  • a touch input 120-y received via the portion extending along the y-axis may be considered a first touch input 120-y that corresponds to the y-coordinate of the desired selectable element
  • a touch input 120-x received via the portion extending along the x-axis may be considered a second touch input 120-x that corresponds to the x-coordinate of the desired selectable element.
  • the combination of the first and second touch inputs 120-x and 120-y may thus provide an x-y coordinate location (x1, y1) of the desired selectable element as shown.
  • the user may need to apply two touch inputs substantially simultaneously (e.g., in a manner such that a duration of contact of the first touch input overlaps, at least partially, with a duration of contact of the second touch input) to more accurately specify the desired selectable element.
  • the first touch input may not need to be applied simultaneously with the second touch input to be considered as designating an x- and y-coordinate of the desired selectable element.
  • the first and second touch inputs may be provided in sequence (e.g., the second touch input being applied within a certain amount of time from application of the first touch input, such as 1-5 seconds).
  • the user may be able to provide both the first and the second touch input using the same hand and/or the same finger.
  • This may be helpful, for example, when a larger display (such as a wall-mounted display) is involved, in which case it may not be physically possible (due to the larger area of the display) for the user to apply both the first and second touch inputs simultaneously.
  • the icons 110a, 110b are disposed at the same y-coordinate.
  • a single touch input applied as shown in Fig. 3, for example, may not be adequate to indicate which of the two icons 110a, 110b is desired for selection by the user.
  • the addition of a second touch input may thus represent the x-coordinate of the desired selectable element, such that the intersection of the extrapolated touch inputs 120'-x, 120'-y may pinpoint the selectable element that the user wishes to select.
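  • The two-input case of Fig. 5 amounts to combining a y-coordinate indicated along one edge with an x-coordinate indicated along the other, optionally within a time window when the inputs arrive in sequence. The sketch below is one plausible model; the window length and data shapes are assumptions rather than values from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

SEQUENTIAL_WINDOW_S = 3.0  # assumed: the second input must follow within a few seconds

@dataclass
class EdgeTouch:
    axis: str      # 'x' for the bottom-edge region, 'y' for the right-edge region
    value: float   # coordinate indicated along that axis
    t: float       # timestamp in seconds

def combine(first: EdgeTouch, second: EdgeTouch) -> Optional[Tuple[float, float]]:
    """Intersect two extrapolated inputs into an (x, y) target, if they are compatible."""
    if {first.axis, second.axis} != {"x", "y"}:
        return None                      # need one input per axis
    if abs(second.t - first.t) > SEQUENTIAL_WINDOW_S:
        return None                      # too far apart to be treated as one selection
    x = first.value if first.axis == "x" else second.value
    y = first.value if first.axis == "y" else second.value
    return (x, y)

print(combine(EdgeTouch("y", 40.0, t=0.0), EdgeTouch("x", 80.0, t=1.2)))  # -> (80.0, 40.0)
print(combine(EdgeTouch("y", 40.0, t=0.0), EdgeTouch("y", 80.0, t=0.5)))  # -> None
```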
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to detect an angle α (shown in Figs. 6B and 7B) associated with the touch input 120 and to determine the location of the extrapolated touch input based on the angle of the touch input that is detected.
  • the angle α may be detected, for example, by a sensor that senses the angle of the finger or other object with respect to the display (e.g., a pressure sensor array and/or a proximity array embedded or otherwise associated with the display).
  • the user may apply a touch input in the input region 125 to select a selectable element 110c that is not located at the y-coordinate corresponding to the y-coordinate of the touch input (e.g., not directly in line with the touch input), but rather is at an angle from the location of the applied touch input (e.g., "below" the y-position of the applied touch input).
  • the user applies his finger 135 (or other object) at an angle α to the plane P of the display 105, as shown in Fig. 6B.
  • the angle α corresponds to an angle α' that is used to determine the extrapolated touch input (represented by the ray 130 in Fig. 6A).
  • the user may apply his finger 135 using a different angle α, which may correspond to a different angle α' that is used to determine the extrapolated touch input (represented by the ray 130 in Fig. 7A).
  • although the angle α is shown in Figs. 6B and 7B as an angle in the y-z plane, in other embodiments the angle α may be in other planes, such as the x-z plane, or in multiple planes.
  • the angle α may be used to determine other characteristics of the extrapolated touch input such as, for example, a distance to the desired selectable element (e.g., with a shallower angle α corresponding to a longer distance from the user's finger 135 to the selectable element).
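  • One way to realize the angle-dependent behavior of Figs. 6A-7B is to project the extrapolated point into the main portion along the sensed finger angle, with shallower angles reaching farther from the input region. The trigonometric mapping below is only an assumed model, not a formula prescribed by the patent.

```python
import math

def extrapolated_point(touch_x: float, touch_y: float,
                       alpha_deg: float, finger_reach: float = 150.0):
    """Project a point into the main portion from a touch at (touch_x, touch_y) in a
    right-hand edge input region. alpha_deg is the sensed angle between the finger and
    the display plane: a steep angle lands near the touch row, while a shallower angle
    reaches farther "below" it. finger_reach is an assumed scaling constant."""
    alpha = math.radians(alpha_deg)
    distance = finger_reach / max(math.tan(alpha), 1e-6)   # shallower angle -> farther away
    return (touch_x - finger_reach, touch_y + distance)    # cast leftwards into the display

print(extrapolated_point(320, 40, alpha_deg=80))  # steep angle: lands close to the touch row
print(extrapolated_point(320, 40, alpha_deg=30))  # shallow angle: lands much farther down
```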
  • the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to modify the location of the extrapolated touch input based on a change in the angle α of the touch input 120.
  • the user may "rock" his finger 135 from a start position 137 at a first angle α1 to an end position 138 at a second angle α2 (shown in Fig. 8B).
  • such a rocking motion may allow the user to adjust the location of the ray 130 to select a particular selectable element.
  • a user may be able to select a selectable element in situations requiring a finer grained control, such as when numerous selectable elements are provided on the display 105 (e.g., more than the three icons shown in the figures).
  • a "sweeping" input may allow the user to select more than one selectable element (e.g., selecting all selectable elements that are within the swept area 140).
  • the user may provide a "rocking" input with respect to the outside rim of the display screen 105.
  • the user may "rock" his finger 135 from a start position 137 (shown in dashed lines) at a first angle ⁇ to an end position 138 (shown in solid lines) at a second angle ⁇ 2 .
  • a rocking motion may allow the user to adjust the location of the ray 130 to select a particular selectable element (not shown) or select more than one selectable element (e.g., selecting all selectable elements that are within a swept area).
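  • The rocking and sweeping interactions of Figs. 8A-8B can be modeled as sweeping the extrapolated ray between the start and end angles and collecting every element covered by the swept band. The sketch below reuses the assumed angle-to-distance mapping from the previous example with hypothetical element positions.

```python
import math
from typing import List, Tuple

def swept_rows(touch_y: float, alpha_start_deg: float, alpha_end_deg: float,
               finger_reach: float = 150.0) -> Tuple[float, float]:
    """Band of y-coordinates covered while the finger rocks from the start angle to the
    end angle (both measured against the display plane, as in Figs. 8A and 8B)."""
    def row(alpha_deg: float) -> float:
        return touch_y + finger_reach / max(math.tan(math.radians(alpha_deg)), 1e-6)
    lo, hi = sorted((row(alpha_start_deg), row(alpha_end_deg)))
    return lo, hi

def elements_in_sweep(elements: List[Tuple[str, float]], touch_y: float,
                      alpha_start_deg: float, alpha_end_deg: float) -> List[str]:
    """Select every element whose y-coordinate falls inside the swept band."""
    lo, hi = swept_rows(touch_y, alpha_start_deg, alpha_end_deg)
    return [name for name, y in elements if lo <= y <= hi]

elements = [("email", 70.0), ("word", 150.0), ("phone", 290.0)]  # hypothetical positions
print(elements_in_sweep(elements, touch_y=40, alpha_start_deg=80, alpha_end_deg=30))
# -> ['email', 'word', 'phone'] for this layout: the sweep covers all three rows
```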
  • Fig. 10 illustrates a flowchart of systems, methods, and computer program products according to example embodiments of the invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device of an apparatus employing an example embodiment of the present invention and executed by a processor in the apparatus.
  • any such computer program instructions may be loaded onto a computer or other programmable apparatus (e.g., hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart block(s).
  • These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the function specified in the flowchart block(s).
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart block(s).
  • blocks of the flowchart support combinations of means for performing the specified functions, combinations of operations for performing the specified functions, and program instruction means for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
  • Fig. 10 depicts an example embodiment of the method that includes providing for presentation of a selectable element in a main portion of a display at block 200 and receiving a touch input in an input region proximate the main portion of the display at block 210, where the input region does not overlap with the main portion.
  • the input region may be disposed along an edge of the display in some cases.
  • An extrapolated touch input in the main portion of the display may be determined based on a location of the touch input in the input region at block 220, as described above, and selection of the selectable element may be provided for in an instance in which a location of the extrapolated touch input coincides with a location of the selectable element at block 230.
  • the touch input may comprise a first touch input and the extrapolated touch input may comprise a first extrapolated touch input.
  • a second touch input may be received in the input region, and a second extrapolated touch input may be determined in the main portion of the display based on the location of the second touch input in the input region. Selection of the selectable element may be provided for in an instance in which an intersection of the first extrapolated touch input and the second extrapolated touch input coincides with the location of the selectable element; a sketch of this two-ray intersection appears after this list.
  • an angle associated with the touch input may be detected at block 240, and the location of the extrapolated touch input may be determined based on the angle of the touch input that is detected.
  • a visual representation of the location of the extrapolated touch input may be provided in the main portion of the display at block 250.
  • the visual representation may comprise a ray extending from the location of the touch input in the input region at least partially into the main portion of the display; a sketch of clipping such a ray to the main portion appears after this list.
  • the location of the extrapolated touch input may be modified based on a change in the angle of the touch input in some embodiments.
  • certain ones of the operations above may be modified or further amplified as described below. Furthermore, in some embodiments, additional optional operations may be included. Modifications, additions, or amplifications to the operations above may be performed in any order and in any combination.
  • an apparatus for performing the method of Fig. 10 above may comprise a processor (e.g., the processor 70 of Fig. 2) configured to perform some or each of the operations (200-250) described above.
  • the processor may, for example, be configured to perform the operations (200-250) by performing hardware implemented logical functions, executing stored instructions, or executing algorithms for performing each of the operations.
  • the apparatus may comprise means for performing each of the operations described above.
  • examples of means for performing at least portions of operations 200, 210, and 240 may comprise, for example, the processor 70, the user interface transceiver 72, and the memory device 76, and/or a device or circuit for executing instructions or executing an algorithm for processing information as described above.
  • examples of means for performing operation 220 may comprise, for example, the processor 70 and/or a device or circuit for executing instructions or executing an algorithm for processing information as described above.
  • means for performing operation 230 may comprise, for example, the processor 70, the user interface transceiver 72, communication interface 74, the memory device 76, and/or a device or circuit for executing instructions or executing an algorithm for processing information as described above.
  • Examples of means for performing operation 250 may comprise, for example, the processor 70, the user interface transceiver 72, and/or a device or circuit for executing instructions or executing an algorithm for processing information as described above.
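
The angle-based extrapolation and the coincidence test described in the bullets above can be illustrated with a short, non-authoritative sketch. The Python below is an illustration only, not the claimed implementation: the names SelectableElement, extrapolated_point and select_element, the 1/tan mapping from the detected angle α to a distance along the ray, the clamping range, and the placement of the input region along the bottom edge of the display are all assumptions made for the example.

```python
import math
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple


@dataclass
class SelectableElement:
    """Axis-aligned bounds of a selectable element in the main portion of the display."""
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)


def extrapolated_point(touch_x: float, touch_y: float,
                       alpha_deg: float, scale: float = 100.0) -> Tuple[float, float]:
    """Map a touch in the edge input region plus a detected finger angle to an
    extrapolated location in the main portion: a shallower angle yields a longer
    distance along the ray (the 1/tan mapping is an assumption of this sketch)."""
    alpha = math.radians(min(85.0, max(5.0, alpha_deg)))  # clamp to keep tan() well-behaved
    distance = scale / math.tan(alpha)
    # Input region assumed to run along the bottom edge, so the ray points "up".
    return touch_x, touch_y - distance


def select_element(touch_x: float, touch_y: float, alpha_deg: float,
                   elements: Sequence[SelectableElement]) -> Optional[SelectableElement]:
    """Provide for selection when the extrapolated location coincides with an element."""
    px, py = extrapolated_point(touch_x, touch_y, alpha_deg)
    for element in elements:
        if element.contains(px, py):
            return element
    return None


if __name__ == "__main__":
    icons = [SelectableElement("110a", 40, 160, 48, 48),
             SelectableElement("110c", 40, 40, 48, 48)]
    print(select_element(64, 300, 40.0, icons).name)  # steeper angle -> nearby icon "110a"
    print(select_element(64, 300, 22.0, icons).name)  # shallower angle -> farther icon "110c"
```

The two calls at the end merely show the intended effect under these assumptions: a steeper finger angle selects an element close to the touch location, while a shallower angle reaches an element farther into the main portion of the display.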
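
The rocking and swept-area selections can be sketched in the same way, reusing SelectableElement and extrapolated_point from the previous sketch: as the detected angle changes from a first angle α₁ to a second angle α₂, the extrapolated location is recomputed and every element crossed along the way is collected. The linear sampling of intermediate angles is purely an assumption for illustration.

```python
from typing import List, Sequence


def swept_selection(touch_x: float, touch_y: float,
                    alpha_start_deg: float, alpha_end_deg: float,
                    elements: Sequence["SelectableElement"],
                    samples: int = 50) -> List["SelectableElement"]:
    """Collect every element whose bounds contain an extrapolated location produced
    while the finger rocks from the first angle to the second angle.
    Reuses SelectableElement and extrapolated_point from the previous sketch."""
    selected: List["SelectableElement"] = []
    for i in range(samples + 1):
        t = i / samples
        alpha = alpha_start_deg + t * (alpha_end_deg - alpha_start_deg)
        px, py = extrapolated_point(touch_x, touch_y, alpha)
        for element in elements:
            if element not in selected and element.contains(px, py):
                selected.append(element)
    return selected
```

With the icons from the previous sketch, swept_selection(64, 300, 40.0, 22.0, icons) would return both icons, mirroring the idea of selecting all selectable elements within the swept area.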
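
For the two-touch variant described above (a first and a second touch input, each yielding its own extrapolated ray, with selection at their intersection), a small geometric sketch follows. It reuses SelectableElement.contains from the first sketch; the names ray_intersection and select_by_intersection, and the idea that each ray's in-plane direction is supplied by the caller (for example, derived from the detected angle or from the edge on which the touch lies), are assumptions of the example.

```python
from typing import Optional, Sequence, Tuple

Point = Tuple[float, float]


def ray_intersection(origin_a: Point, dir_a: Point,
                     origin_b: Point, dir_b: Point) -> Optional[Point]:
    """Intersection of two rays given as origin + t * direction (t >= 0),
    or None if the rays are parallel or only cross behind one of the origins."""
    (xa, ya), (dxa, dya) = origin_a, dir_a
    (xb, yb), (dxb, dyb) = origin_b, dir_b
    denom = dxa * dyb - dya * dxb
    if abs(denom) < 1e-9:
        return None
    t = ((xb - xa) * dyb - (yb - ya) * dxb) / denom
    s = ((xb - xa) * dya - (yb - ya) * dxa) / denom
    if t < 0 or s < 0:
        return None
    return xa + t * dxa, ya + t * dya


def select_by_intersection(touch_a: Point, dir_a: Point,
                           touch_b: Point, dir_b: Point,
                           elements: Sequence["SelectableElement"]) -> Optional["SelectableElement"]:
    """Select the element whose bounds contain the intersection of the two
    extrapolated touch inputs."""
    point = ray_intersection(touch_a, dir_a, touch_b, dir_b)
    if point is None:
        return None
    for element in elements:
        if element.contains(*point):
            return element
    return None


# Example (using the icons from the first sketch): one touch along the bottom edge
# casting a ray "up" and one along the left edge casting a ray "right"; their
# intersection at (64, 64) falls within icon "110c".
# hit = select_by_intersection((64, 300), (0.0, -1.0), (0, 64), (1.0, 0.0), icons)
```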
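
Finally, the visual representation of the extrapolated touch input (a ray drawn from the touch location at least partially into the main portion of the display) can be sketched as a simple clipping step; the function name and the point-sampling approach are, again, only assumptions made for illustration.

```python
from typing import List, Tuple


def ray_points_in_main_portion(touch: Tuple[float, float],
                               direction: Tuple[float, float],
                               main_portion: Tuple[float, float, float, float],
                               length: float = 400.0,
                               step: float = 2.0) -> List[Tuple[float, float]]:
    """Sample the extrapolated ray from the touch location and keep only the points
    that fall inside the main portion (x, y, width, height); a renderer could draw
    these points as the visual representation of the extrapolated touch input."""
    x0, y0, w, h = main_portion
    points = []
    d = 0.0
    while d <= length:
        px = touch[0] + d * direction[0]
        py = touch[1] + d * direction[1]
        if x0 <= px <= x0 + w and y0 <= py <= y0 + h:
            points.append((px, py))
        d += step
    return points
```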

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
EP13895993.7A 2013-10-22 2013-10-22 Vorrichtung und verfahren zur bereitstellung indirekter berührungseingaben auf einer touchscreen-anzeige Ceased EP3060972A4 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2013/066106 WO2015060824A1 (en) 2013-10-22 2013-10-22 Apparatus and method for providing for receipt of indirect touch input to a touch screen display

Publications (2)

Publication Number Publication Date
EP3060972A1 true EP3060972A1 (de) 2016-08-31
EP3060972A4 EP3060972A4 (de) 2017-10-18

Family

ID=52993276

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13895993.7A Ceased EP3060972A4 (de) 2013-10-22 2013-10-22 Vorrichtung und verfahren zur bereitstellung indirekter berührungseingaben auf einer touchscreen-anzeige

Country Status (6)

Country Link
US (1) US11360652B2 (de)
EP (1) EP3060972A4 (de)
JP (1) JP6200600B2 (de)
KR (1) KR101862954B1 (de)
CN (1) CN105659203A (de)
WO (1) WO2015060824A1 (de)

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2558984B2 (ja) * 1991-03-12 1996-11-27 松下電器産業株式会社 3次元情報会話システム
US5689667A (en) 1995-06-06 1997-11-18 Silicon Graphics, Inc. Methods and system of controlling menus with radial and linear portions
WO2000011541A1 (en) * 1998-08-18 2000-03-02 Koninklijke Philips Electronics N.V. Display device with cursor positioning means
US6727892B1 (en) * 1999-05-20 2004-04-27 Micron Technology, Inc. Method of facilitating the selection of features at edges of computer touch screens
SE9904380D0 (sv) 1999-12-02 1999-12-02 Siemens Elema Ab Method and apparatus for selecting a software item using a graphical user interface
JP2002333951A (ja) * 2001-05-08 2002-11-22 Matsushita Electric Ind Co Ltd 入力装置
US7009599B2 (en) * 2001-11-20 2006-03-07 Nokia Corporation Form factor for portable device
CN1316340C (zh) * 2002-03-08 2007-05-16 诺基亚有限公司 用于提供在电子装置上显示的应用程序的表示的方法和装置
US7731495B2 (en) * 2005-12-20 2010-06-08 3M Innovative Properties Company User interface having cross section control tool for digital orthodontics
JP2007200002A (ja) * 2006-01-26 2007-08-09 Brother Ind Ltd 表示装置および表示制御プログラム
US8077153B2 (en) * 2006-04-19 2011-12-13 Microsoft Corporation Precise selection techniques for multi-touch screens
US7860071B2 (en) * 2006-08-31 2010-12-28 Skype Limited Dual-mode device for voice communication
EP3654141A1 (de) * 2008-10-06 2020-05-20 Samsung Electronics Co., Ltd. Verfahren und vorrichtung zur anzeige einer graphischen benutzerschnittstelle in abhängigkeit vom kontaktmuster eines benutzers
US20100295799A1 (en) * 2009-05-21 2010-11-25 Sony Computer Entertainment America Inc. Touch screen disambiguation based on prior ancillary touch input
CN101901098A (zh) * 2009-05-26 2010-12-01 鸿富锦精密工业(深圳)有限公司 电子显示装置及其图标的显示方法
US8907897B2 (en) 2009-06-16 2014-12-09 Intel Corporation Optical capacitive thumb control with pressure sensor
JP5402322B2 (ja) * 2009-07-02 2014-01-29 ソニー株式会社 情報処理装置および情報処理方法
JP5429627B2 (ja) * 2009-12-04 2014-02-26 日本電気株式会社 携帯端末、携帯端末の操作方法、及び携帯端末の操作プログラム
TWI420379B (zh) * 2009-12-09 2013-12-21 Telepaq Technology Inc 觸控螢幕之選擇功能選單方法
EP2341420A1 (de) * 2010-01-04 2011-07-06 Research In Motion Limited Tragbare elektronische Vorrichtung und Steuerungsverfahren dafür
US11068149B2 (en) * 2010-06-09 2021-07-20 Microsoft Technology Licensing, Llc Indirect user interaction with desktop using touch-sensitive control surface
JP4865063B2 (ja) * 2010-06-30 2012-02-01 株式会社東芝 情報処理装置、情報処理方法およびプログラム
US20120162091A1 (en) * 2010-12-23 2012-06-28 Lyons Kenton M System, method, and computer program product for multidisplay dragging
US20130271419A1 (en) * 2011-09-30 2013-10-17 Sangita Sharma Transforming mobile device sensor interaction to represent user intent and perception
US20130093688A1 (en) * 2011-10-17 2013-04-18 Matthew Nicholas Papakipos Virtual Soft Keys in Graphic User Interface with Side Mounted Touchpad Input Device
JP5804596B2 (ja) * 2011-11-18 2015-11-04 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation 表示装置、入力方法、およびプログラム
US20130154955A1 (en) * 2011-12-19 2013-06-20 David Brent GUARD Multi-Surface Touch Sensor Device With Mode of Operation Selection
JP5874625B2 (ja) * 2012-12-20 2016-03-02 カシオ計算機株式会社 入力装置、入力操作方法及び制御プログラム並びに電子機器

Also Published As

Publication number Publication date
WO2015060824A1 (en) 2015-04-30
US20160231904A1 (en) 2016-08-11
KR101862954B1 (ko) 2018-05-31
EP3060972A4 (de) 2017-10-18
JP2016536728A (ja) 2016-11-24
JP6200600B2 (ja) 2017-09-20
CN105659203A (zh) 2016-06-08
US11360652B2 (en) 2022-06-14
KR20160075631A (ko) 2016-06-29

Similar Documents

Publication Publication Date Title
EP2812796B1 (de) Vorrichtung und verfahren zur bereitstellung für entfernte benutzerinteraktion
US9170607B2 (en) Method and apparatus for determining the presence of a device for executing operations
KR102120930B1 (ko) 포터블 디바이스의 사용자 입력 방법 및 상기 사용자 입력 방법이 수행되는 포터블 디바이스
US10073493B2 (en) Device and method for controlling a display panel
US8890825B2 (en) Apparatus and method for determining the position of user input
US20130050133A1 (en) Method and apparatus for precluding operations associated with accidental touch inputs
AU2013276998B2 (en) Mouse function provision method and terminal implementing the same
US10928948B2 (en) User terminal apparatus and control method thereof
US11112959B2 (en) Linking multiple windows in a user interface display
US9377943B2 (en) Method and apparatus for outputting display data based on a touch operation on a touch panel
US9767605B2 (en) Method and apparatus for presenting multi-dimensional representations of an image dependent upon the shape of a display
US20140240257A1 (en) Electronic device having touch-sensitive user interface and related operating method
EP2731001A2 (de) Elektronische Vorrichtung und Verfahren zur Änderung eines Objekts gemäss einem Biegezustand
CN109933252A (zh) 一种图标移动方法及终端设备
US9678608B2 (en) Apparatus and method for controlling an interface based on bending
JP2015215840A (ja) 情報処理装置及び入力方法
WO2015051521A1 (en) Method and apparatus for controllably modifying icons
US11360652B2 (en) Apparatus and method for providing for receipt of indirect touch input to a touch screen display
EP2946269B1 (de) Elektronische vorrichtung mit berührungsempfindlicher anzeige und gestenerkennung, verwendungsverfahren und computerlesbare speichervorrichtung
CN110874141A (zh) 图标移动的方法及终端设备
KR20150044757A (ko) 플로팅 입력에 따라 동작을 제어하는 전자 장치 및 그 방법
US20150242004A1 (en) Touch-sensitive input device having a logo displayed thereon for use in a mobile electronic device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160502

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0488 20130101ALI20170515BHEP

Ipc: G06F 3/048 20130101AFI20170515BHEP

Ipc: G06F 3/0484 20130101ALI20170515BHEP

Ipc: G06F 3/0354 20130101ALI20170515BHEP

A4 Supplementary search report drawn up and despatched

Effective date: 20170918

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0354 20130101ALI20170912BHEP

Ipc: G06F 3/048 20130101AFI20170912BHEP

Ipc: G06F 3/0484 20130101ALI20170912BHEP

Ipc: G06F 3/0488 20130101ALI20170912BHEP

17Q First examination report despatched

Effective date: 20180502

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: NOKIA TECHNOLOGIES OY

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20201111