WO2010131122A2 - User interface for providing enhanced control of an application program - Google Patents

User interface for providing enhanced control of an application program

Info

Publication number
WO2010131122A2
Authority
WO
WIPO (PCT)
Prior art keywords
touch input
gui
touch
input events
user
Prior art date
Application number
PCT/IB2010/001808
Other languages
English (en)
Other versions
WO2010131122A3 (fr)
Inventor
Srinivas Chervirala
Satya Mallya
Original Assignee
France Telecom
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by France Telecom filed Critical France Telecom
Publication of WO2010131122A2
Publication of WO2010131122A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present invention generally relates to mobile devices or handsets, and more specifically to mobile devices handling both touch and motion based inputs.
  • the GUI of a mobile handset is constrained with respect to the desktop: small screens and tiny keyboards are typical of mobile handsets that fit in your pocket.
  • Recent so-called smart phones have introduced the use of a touch screen in an attempt to simplify the user experience with his mobile handset. For instance, the touch interface of the iPhone® has revolutionized the mobile handset industry and brought whole new mobile user experiences.
  • touch inputs may control the AP in different ways.
  • the desktop GUI comprising a plurality of AP icons may be seen as an AP itself.
  • a user touching an AP icon will cause a control of the desktop GUI that will launch the AP corresponding to the touched icon.
  • a sliding motion across the desktop GUI, or a drag touch input will cause another control of the desktop GUI, that will display another set of AP icons hidden so far.
  • the user gets a feeling that he is browsing through pages of AP icons to select an interesting application program.
  • a prolonged touch input or clutch input on any AP icon will cause all icons to start shaking around their position.
  • the control associated with the clutch input opens the desktop GUI management. The user can then delete applications from the desktop or move them around in the desktop layout. While such a method facilitates the user experience, there is still today a lot of scope for innovation using touch interfaces of electronic devices, mobile or not.
  • the existing examples all use well-known touch inputs, such as a short touch, a clutch or a drag.
  • the touch inputs may include touch inputs from two fingers that are dragged away from or closer to each other, which can control a zoom in or a zoom out of a map GUI or a picture.
  • other touch inputs could be used to enhance the user experience with touch screens or panels.
  • a novel touch input is disclosed that allows further control of application programs.
  • the user can either follow a first touch input with the tip of his finger by pressing the finger progressively in contact with the touch panel (extended touch input) or release his finger away from the panel when the first touch input is itself an extended touch input (release touch input).
  • This new touch input differs from existing multi-touch inputs as the touched portion or region of the panel varies in size as the user moves his finger against or away from the screen.
  • the present system also relates to a mobile device for imparting control to an application program (AP) running on said mobile device, said mobile device being arranged to: render a graphical user interface (GUI) of the AP on its touch panel; capture a first touch input on the GUI corresponding to a first touched portion of said GUI; monitor further touch input events on the GUI; and impart an AP control in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI.
  • the present system also relates to an application embodied on a computer readable medium and arranged to impart control to an application program (AP) running on a mobile device, the application comprising program portions arranged to carry out the same acts.
  • FIG. 1 shows a mobile device in accordance with an embodiment of the present system
  • FIG. 2 shows an illustrative process flow diagram in accordance with an embodiment of the present system
  • FIG. 3 shows an exemplary flowchart in accordance with an embodiment of the present system
  • FIGs. 4A-4B show exemplary illustrations of an application program controlled according to an embodiment of the present system
  • FIG. 5 shows an exemplary implementation in accordance with an embodiment of the present system.
  • an operative coupling may include one or more of a wired connection and/or a wireless connection between two or more devices that enables a one and/or two-way communication path between the devices and/or portions thereof.
  • An operative coupling may also include a wired and/or wireless coupling to enable communication between a service platform, such as the profiling platform in accordance with an embodiment of the present system, and one or more user devices.
  • An operative coupling may also relate to an interaction between program portions and thereby may not describe a physical connection so much as an interaction based coupling.
  • rendering and formatives thereof as utilized herein refer to providing content, such as digital media or a graphical user interface (GUI), such that it may be perceived by at least one user sense, such as a sense of sight and/or a sense of hearing.
  • the present system may render a user interface on a display device so that it may be seen and interacted with by a user.
  • rendering may also comprise all the actions required to generate a GUI prior to the display, like e.g. a map representation generated on a server side for a browser application on a user device.
  • an electronic device provides a GUI for controlling an application program (AP) through touch inputs.
  • The present description is illustrated with a mobile device or handset; the person skilled in the art may easily apply the present teachings to any electronic device presenting a touch sensitive panel, also referred to hereafter as a touch sensitive display or screen.
  • the GUI may be rendered on the mobile device through a local application program connected to a web-based server.
  • Applications like Google Maps® are implemented today using that approach.
  • the provided visual environment may be displayed by the processor on a display device of the mobile device, namely a touch sensitive panel (touch panel in short), which a user may use to provide a number of touch inputs of different types.
  • a GUI is a type of user interface which allows a user to interact with electronic devices such as computers, hand-held devices, household appliances, office equipment and the like.
  • GUIs are typically used to render visual and textual images which describe various visual metaphors of an operating system, an application, etc., and implemented on a processor/computer including rendering on a display device.
  • GUIs can represent programs, files and operational functions with graphical images, objects, or vector representations.
  • the graphical images can include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, maps, etc.
  • Such images can be arranged in predefined layouts, or can be created dynamically (by the device itself or by a web-based server) to serve the specific actions being taken by a user.
  • the user can select and/or activate various graphical images in order to initiate functions and tasks, i.e. controls, associated therewith.
  • a user can select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular application program.
  • the GUI may present a typical user interface including a windowing environment and as such, may include menu items, pull-down menu items, icons, pop-up windows, etc., that are typical of those provided in a windowing environment, such as may be represented within a Windows™ Operating System GUI as provided by Microsoft Corporation and/or an OS X™ Operating System GUI, such as provided on an iPhone™, MacBook™, iMac™, etc., as provided by Apple, Inc., and/or another operating system.
  • an application program (AP) - or software - may be seen as any tool that functions and is operated by means of a computer, with the purpose of performing one or more functions or tasks for a user or another application program.
  • a GUI of the AP may be displayed on the mobile device display.
  • FIG. 1 is an illustration of an exemplary mobile device 110 used in the present system.
  • the mobile device 110 comprises a display device 111, a processor 112, a controller 113 of the display device, and an input device 115.
  • Mobile device 110 may be for instance a desktop or laptop computer, a mobile device, a PDA (personal digital assistant) ...
  • the user interaction with and manipulation of the application program rendered on a GUI is achieved using the display device 111, or screen, which is presently a touch panel operationally coupled to the processor 112 controlling the displayed interface.
  • Processor 112 may control the rendering and/or the display of the GUI on the display device 111 depending on the type of application program, i.e. resident or web-based. Processor 112 may also handle the user entries according to the present method. The user entries to interact with an application program may be provided through interactions with the touch panel 111.
  • the touch panel 111 can be seen as an input device allowing interactions with a finger of a user or other devices such as a stylus. Such an input device can, for example, be used to make selections of portions of the GUI of the AP.
  • the input received from a user's touch is sent to the processor 112.
  • the touch panel is configured to detect and report the (location of the) touches to the processor 112, and the processor 112 can interpret the touches in accordance with the application program and the currently displayed GUI. For example, the processor 112 can initiate a task, i.e. a control of the AP, in accordance with a particular touch.
  • the controller 113, i.e. a dedicated processor, can be used to process touches locally and reduce demand for the main processor 112 of the computer system.
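  • As a non-limiting illustration of how the processor may map a captured touch event to an AP control, the following minimal Python sketch dispatches an event type to a control handler. The event names and the DemoAP methods are assumptions made for this illustration, not taken from the present disclosure.

```python
def dispatch_touch(event_type, ap):
    """Interpret a captured touch event and initiate the corresponding AP control."""
    controls = {
        "tap": ap.launch_item,         # e.g. launch the AP whose icon was touched
        "drag": ap.scroll_view,        # e.g. browse to another page of AP icons
        "clutch": ap.open_management,  # e.g. enter desktop GUI management
    }
    handler = controls.get(event_type)
    if handler is not None:
        handler()

class DemoAP:
    def launch_item(self): print("launching touched item")
    def scroll_view(self): print("scrolling to next page of icons")
    def open_management(self): print("entering GUI management mode")

dispatch_touch("clutch", DemoAP())
```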
  • the touch panel 111 can be based on sensing technologies including but not limited to capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and the like.
  • While reference is made to a finger of the user touching panel 111, other devices such as a stylus may be used in place of the user's finger.
  • touch panel 111 can be based on single point sensing or multipoint sensing.
  • Single point sensing is capable of distinguishing only a single touch, while multipoint sensing can distinguish multiple touches that occur at the same time.
  • the captured touch input may be referred to as a touch input event (or touch event in short) that allows imparting a control on the AP.
  • the duration and/or frequency of the touch inputs may be taken into account to distinguish different types of touch events.
  • One of the touch inputs illustrated herein may be seen as touching and holding a point on the screen with a single finger, or "clutching" the screen. Clutching the screen is distinguishable from conventional touch inputs by the amount of time it takes to press the finger down on the screen and when the finger is lifted from the screen. A clutch event would only be captured if the finger has not been released from the point or portion on the screen before a given time threshold CLUTCH_THRESHOLD.
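  • A minimal sketch of the duration-based distinction between a short tap and a clutch is given below, assuming a CLUTCH_THRESHOLD value and an event interface that are illustrative only and not specified in the present disclosure.

```python
import time

CLUTCH_THRESHOLD = 0.8  # seconds; illustrative value, not specified in the disclosure

class TouchClassifier:
    """Classify a single-finger touch as a short tap or a clutch by its duration."""

    def __init__(self):
        self._down_time = None

    def on_touch_down(self, x, y):
        # Remember when the finger first came into contact with the panel.
        self._down_time = time.monotonic()

    def on_touch_up(self, x, y):
        # A clutch is only captured if the finger stayed on the touched portion
        # for at least CLUTCH_THRESHOLD seconds before being released.
        held = time.monotonic() - self._down_time
        return "clutch" if held >= CLUTCH_THRESHOLD else "tap"

classifier = TouchClassifier()
classifier.on_touch_down(120, 240)
time.sleep(0.1)
print(classifier.on_touch_up(120, 240))  # "tap" in this quick example
```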
  • the processor 112 is arranged to monitor further touch input event/events (hereinafter, simply events) on the GUI, and to impart a control of the AP in response to the monitored further touch input events when determining that said further touch input events correspond to touched portions contiguous with one another on the GUI.
  • said further touch input events may cause the first touched portion to change in size, such as grow in size.
  • the changing in size may represent a control metaphor that may be applied to other types of control and/or to other parameters. For example, with the proposed system by using the touch panel remote control, the brightness of a TV may be adjusted.
  • Other applications would readily occur to a person of ordinary skill in the art and are intended to be encompassed in the description of the present system.
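  • As one hedged illustration of that control metaphor, the size of the touched portion could be mapped linearly onto a parameter such as a brightness level. The area bounds and brightness range below are assumptions chosen for the sketch.

```python
MIN_AREA, MAX_AREA = 50.0, 2000.0   # touched area in pixels, illustrative bounds
MIN_LEVEL, MAX_LEVEL = 0, 100       # brightness range, illustrative

def area_to_level(touched_area):
    """Linearly map the size of the touched portion to a brightness level."""
    clamped = max(MIN_AREA, min(MAX_AREA, touched_area))
    ratio = (clamped - MIN_AREA) / (MAX_AREA - MIN_AREA)
    return round(MIN_LEVEL + ratio * (MAX_LEVEL - MIN_LEVEL))

print(area_to_level(120))   # fingertip only: low brightness
print(area_to_level(1500))  # near-full finger: high brightness
```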
  • Illustrations of such touch events are presented in FIGs. 4A and 4B.
  • In FIG. 4A, the user may touch a position on the map to define a neighborhood on this map with the tip of his finger 411.
  • the map based application is configured to update, either locally (local application) or remotely (using a web based server), the GUI and to display a default neighborhood 431 centered on the captured first touch input.
  • the captured first touch input corresponds to a first portion of the GUI.
  • As illustrated by the finger position 412 in FIG. 4B, the user may start pressing his finger sideways from the first touch input so as to have his first phalange come into touch with the screen. As the user's phalange comes progressively into contact with the GUI, the first touched portion will increase in size.
  • processor 112 is capturing more and more contiguous touch input events, i.e. successive touch inputs next to each other.
  • more local touch input events are captured, while previously captured touch inputs are still active. This is what causes the whole touched first portion to increase in size.
  • the new touch input of the present system can be seen as an extended touch input. This differs from multi-touch inputs where two or more fingers are used: even if the touched surface on the touch panel increases as more fingers come into touch with the screen, the touch inputs are not contiguous with one another. In some cases, when the user performed the first touch input using the tip of his finger, the tip may be released as more touch events are caused by the rest of the phalange.
  • the first touched portion may first be translated sideways, following the new touch events from the moving finger. Nevertheless, the touched portion on the GUI will at some point increase in size as the user tries to press his whole finger, or the outermost phalange, against the screen. As more touch events are detected in the present method, an AP control is imparted, as seen in FIG. 4B, where the neighborhood 432 changes in size.
  • the AP control here corresponds to an update of the GUI with an increased neighborhood size, for instance proportional to the size of the touched first portion (see the sketch below).
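  • A minimal sketch of such a proportional update follows; the default radius, the fingertip contact area and the circular neighborhood model are assumptions for this illustration only.

```python
DEFAULT_RADIUS_M = 500.0   # default neighborhood radius in metres, illustrative
FINGERTIP_AREA_PX = 80.0   # typical fingertip contact area in pixels, illustrative

def neighborhood_radius(touched_area_px):
    """Return a radius proportional to the size of the first touched portion."""
    scale = max(1.0, touched_area_px / FINGERTIP_AREA_PX)
    return DEFAULT_RADIUS_M * scale

print(neighborhood_radius(80))    # fingertip only: default 500 m neighborhood
print(neighborhood_radius(640))   # whole phalange pressed: 4000 m neighborhood
```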
  • In FIGs. 4A and 4B, reference is made to a phalange of a finger, but the portion of the finger coming into touch with the screen may be larger or smaller than the tip phalange, depending on the type of control the user may need to impart on the AP.
  • the first touch input corresponds to a touch input with the tip of the finger 411.
  • the start position may be the full finger, or a portion thereof, against the GUI, causing the GUI to be updated and to show a large default neighborhood as a large first touched portion has been detected by processor 112 on touch panel 111.
  • In that case, the touch input events actually correspond to release events from the screen.
  • the new touch input can be seen as an extended release from a full finger touch input.
  • the present system will be illustrated using a map application that can be controlled using the novel touch input of the present system.
  • the user is presented with a default neighborhood centered on a selected point of the map and may control the size of the neighborhood using the novel touch input.
  • the present teaching may be generalized to any AP that can be controlled through touch inputs.
  • FIG. 2 shows an illustrative process flow diagram in accordance with an embodiment of the present system.
  • An application program is running on the processor 1 12 of the mobile device 1 10.
  • the application program running (e.g., being executed) by the processor changes the processor into a special purpose processor for operation in accordance with the present system.
  • Such an AP may for instance be a map application, such as Google Maps™ or Yahoo Maps™.
  • the map application may run locally or be a web-based application connected to a distant geo-server hosting the main application.
  • Such an AP is illustrated in FIGs. 4A and 4B mentioned here before.
  • An optional activation phase may be carried out to trigger the monitoring of the novel touch input of the present system.
  • Such an activation phase may be useful if several types of touch inputs (simple, clutch, ...) can be handled by the touch panel. More generally, the present new touch input may only be activated provided some criteria are matched. In a first additional embodiment of the present system, this activation phase may require:
  • the new touch input may be monitored only if the initial touch input matches a predefined criterion.
  • In a second additional embodiment of the present system, this activation phase may require allowing the capture of the further touch input events when the first touch input matches a predefined criterion.
  • the new touch input may be monitored only if the first touch input matches a predefined criterion.
  • a Graphical User Interface (GUI) of the AP is rendered on the touch panel 111.
  • the GUI, as seen in FIG. 4A, renders a geographical map, i.e. a map based GUI 400.
  • the map may be for instance a default map based on the user's current location or on a user preferred location, as known from a user profile.
  • the displayed map is centered on the user office location.
  • map based GUI 400 may be updated to display the default neighborhood 431.
  • the update of the GUI 400 may be server based or provided locally as described here above depending on the AP.
  • a touch event listener may be provided so as to monitor any further touch input from the user on GUI 431.
  • In a subsequent act 230, the user may provide a first touch input that is captured by processor 112 through the touch panel 111. That first touch input corresponds to a first touched portion on the touch panel 111.
  • the touched portion may differ in size depending on the touch panel technology but is generally limited to the tip of the finger surface.
  • an activation test 240 may be performed to determine whether the first touch input is provided within the default neighborhood 431. If not (no to act 240), the monitoring of further touch input is ended in act 250.
  • the activation test of act 240 corresponds to the criterion mentioned in the optional activation phase of the second additional embodiment here above.
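  • One possible form of that activation criterion, sketched under the assumption that the default neighborhood is modelled as a circle, is a point-in-circle test on the first touch input; all names and values below are illustrative only.

```python
import math

def inside_neighborhood(touch_xy, center_xy, radius_px):
    """True when the first touch input falls within the default neighborhood."""
    dx, dy = touch_xy[0] - center_xy[0], touch_xy[1] - center_xy[1]
    return math.hypot(dx, dy) <= radius_px

first_touch = (210, 340)
default_center, default_radius = (200, 330), 60   # illustrative values
if inside_neighborhood(first_touch, default_center, default_radius):
    print("criterion met: keep monitoring further touch input events")
else:
    print("criterion not met: end the monitoring of further touch input")
```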
  • a touch input event may correspond to either a further touch input caused by the user finger, or a point on the touch panel that is no longer in contact with the user finger.
  • touch input events are contiguous. By contiguous, one may understand that consecutive touched - or no longer touched - points on the touch panel are next to each other within the detection range of the touch panel technology. In other words, when a next touch input event is captured, the previously captured touch input event is still ongoing.
  • the first touched portion on the touch panel 111 varies in size. Indeed:
  • processor 112 will check whether the further touch inputs are contiguous with one another, for example causing the first touched portion to increase in size.
  • two contiguous touch inputs may vary in distance.
  • Some touch panel technologies, e.g. surface acoustic wave sensing, may sense the touched points continuously, hereafter referred to as continuous sensing.
  • Others may be limited to a discrete number of points (e.g. capacitive sensing, resistive sensing, ...), hereafter referred to as discrete sensing.
  • processor 112 will check if the consecutive touch input events are contiguous.
  • With a continuous sensing technology, processor 112 will detect a series of touch input events contiguous with one another as the finger is moving towards or away from the touch panel. With a discrete sensing technology, processor 112 will compare the distance between two successive touch input events to the detection range of the touch panel. Provided their locations are within the detection range, processor 112 can determine that the touch input events are contiguous (see the sketch below).
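  • A minimal sketch of that distance comparison for a discrete sensing technology follows; the DETECTION_RANGE_PX value is an assumption, as an actual detection range depends on the panel hardware.

```python
import math

DETECTION_RANGE_PX = 12.0   # detection range of the panel, illustrative value

def contiguous(prev_xy, next_xy):
    """True when two successive touch input events are within the detection range."""
    return math.hypot(next_xy[0] - prev_xy[0], next_xy[1] - prev_xy[1]) <= DETECTION_RANGE_PX

print(contiguous((100, 100), (108, 104)))   # True: the finger rolled slightly
print(contiguous((100, 100), (180, 150)))   # False: e.g. a second, separate finger
```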
  • the further touch input events monitored in act 260 will cause the size of the first touched portions to vary, either increasing (extended touch) or decreasing (extended release).
  • processor 112 may associate pixels from the touch panel with the portion of the touch panel that triggered the touch input event. A possible way to ensure that the touch input events are contiguous, and cause the touched portion on the screen to increase, is to measure this at the pixel level. Provided the additional pixels associated with a further touch input event are in contact with the pixels associated with the previous touch input event, processor 112 will indeed have detected contiguous touch input events.
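  • The pixel-level check could be sketched as follows, assuming the touched portion and each event are represented as sets of pixel coordinates (a data-structure choice made for this illustration).

```python
def neighbors(pixel):
    """8-connected neighbourhood of a pixel, plus the pixel itself."""
    x, y = pixel
    return {(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

def event_is_contiguous(touched_pixels, new_event_pixels):
    """True when the new event's pixels touch the current first touched portion."""
    return any(neighbors(p) & touched_pixels for p in new_event_pixels)

touched = {(10, 10), (10, 11), (11, 10)}   # pixels of the first touched portion
new_event = {(12, 10), (12, 11)}           # further event right next to it
if event_is_contiguous(touched, new_event):
    touched |= new_event                   # the first touched portion grows in size
print(sorted(touched))
```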
  • the monitoring will end in a further act 265.
  • an AP control may be imparted by processor 112.
  • this corresponds to the neighborhood 432 varying in size.
  • an update of the map based GUI 400 may be generated to show a neighborhood varying in size, e.g. following the finger as more touch input events are detected with the present system.
  • An extended touch (as illustrated in FIG. 4B) will result in a larger neighborhood, following for instance any new touch input.
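  • Put together, an extended touch and an extended release can be distinguished by whether the monitored contiguous events make the first touched portion grow or shrink; the following sketch is illustrative only and the control messages are assumptions.

```python
def classify_and_control(areas):
    """areas: successive sizes of the first touched portion (in pixels)."""
    for previous, current in zip(areas, areas[1:]):
        if current > previous:
            print("extended touch:   enlarge the neighborhood")
        elif current < previous:
            print("extended release: shrink the neighborhood")

# Finger pressed progressively down, then lifted progressively away.
classify_and_control([80, 200, 450, 300, 120])
```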
  • the application program could either be a stand-alone application resident on the mobile device (such as its operating system for instance) or the client to a web-based application (such as a map based application using for instance a client downloaded to the mobile device to load a map).
  • FIG. 3 is an illustration of a message flow chart between different parts of the mobile device involved in imparting an AP control according to an exemplary embodiment of the present system.
  • the different parts in the present system illustrated in FIG. 3 are: - the application program AP, in the present illustration a map based application program; - a neighborhood engine provided to build a neighborhood overlay on a map GUI; and - a user interface (UI) engine arranged to render the AP GUI of the application program.
  • the exemplary flowchart of FIG. 3 corresponds to the exemplary embodiment of the present method described in relation to FIG. 2.
  • the AP will add a touch event listener to the map based GUI illustrated in FIG. 4A, through processor 112.
  • the AP will request from a neighborhood engine a default neighborhood.
  • the neighborhood engine can be part of the application program or a separate module to define neighborhood characteristics to be rendered on a map based GUI.
  • the AP GUI will be updated with the default neighborhood as seen in the illustration of FIG. 4A.
  • Another touch event listener may be added by the AP (act 220 of FIG. 2) to further capture a first touch input from the user (act 230).
  • the map based AP will check with the neighborhood engine whether the first touch input is located within the default neighborhood. If so, further touch input listeners will be added to the AP GUI (act 250 in FIG. 2, and loop 310 in FIG. 3). If the captured touch input events are contiguous (act 311 in FIG. 3, or Yes to act 260 in FIG. 2), the imparted AP control may include requesting an updated neighborhood from the neighborhood engine and updating the AP GUI accordingly, as illustrated in FIG. 4B.
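  • A non-authoritative sketch of this message flow is given below: the AP asks the neighborhood engine for a (default or resized) neighborhood and asks the UI engine to render the updated GUI whenever a touch listener fires. All class and method names are assumptions made for the illustration.

```python
class NeighborhoodEngine:
    """Builds the neighborhood overlay characteristics for the map GUI."""
    def build(self, center, touched_area):
        return {"center": center, "radius": 500.0 * max(1.0, touched_area / 80.0)}

class UiEngine:
    """Renders the AP GUI of the application program."""
    def render(self, neighborhood):
        print(f"render neighborhood overlay: {neighborhood}")

class MapAP:
    def __init__(self):
        self.engine, self.ui = NeighborhoodEngine(), UiEngine()
        self.listeners = []

    def on_first_touch(self, xy):
        # Request the default neighborhood and render it.
        self.ui.render(self.engine.build(xy, touched_area=80.0))
        # Add a listener for further (contiguous) touch input events on the GUI.
        self.listeners.append(lambda area: self.ui.render(self.engine.build(xy, area)))

    def feed_further_event(self, touched_area):
        for listener in self.listeners:
            listener(touched_area)

ap = MapAP()
ap.on_first_touch((210, 340))
ap.feed_further_event(320)   # phalange coming into contact: larger neighborhood
```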
  • FIG. 5 shows a system 500 in accordance with an embodiment of the present system.
  • the system 500 includes a user device 590 that has a processor 510 operationally coupled to a memory 520, a rendering device 530, such as one or more of a display, speaker, etc., a user input device 570, such as a sensor panel, and a connection 580 operationally coupled to the user device 590.
  • the connection 580 may be an operable connection between the device 590, as a user device, and another device that has similar elements as the device 590, such as a web server of one or more content providers.
  • the user device may be for instance a mobile phone, a smart phone, a PDA (personal digital assistant) or any type of wireless portable device.
  • the user device may be an electronic device such as a desktop computer or a server.
  • the present method is suited for a wireless device with a display panel that is also a sensor panel to offer the user an enhanced control over an application program running on the user device.
  • the memory 520 may be any type of device for storing application data, for instance data related to an application program controlled through touch inputs, to the operating system of the user device, to a browser, as well as to other application programs controllable with the present method.
  • the application data are received by the processor 510 for configuring the processor 510 to perform operation acts in accordance with the present system.
  • the processor 510 so configured becomes a special purpose machine particularly suited for performing in accordance with the present system.
  • the operation acts include rendering a GUI of the AP, capturing on the sensor panel a first touch input on the AP GUI and corresponding to a first touched portion of said GUI, and when further captured touch input events are identified as contiguous, imparting an AP control.
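  • A minimal end-to-end sketch of these operation acts, under assumptions (a stand-in contiguity predicate and placeholder prints instead of real rendering), could look as follows.

```python
def run_present_method(first_touch, further_events, contiguous, impart_control):
    print("render the GUI of the AP on the sensor panel")
    print(f"capture first touch input at {first_touch} (first touched portion)")
    previous = first_touch
    for event in further_events:
        if not contiguous(previous, event):
            print("further event not contiguous: no AP control imparted")
            return
        previous = event
    impart_control()

run_present_method(
    first_touch=(100, 100),
    further_events=[(104, 101), (108, 103), (113, 105)],
    contiguous=lambda a, b: abs(b[0] - a[0]) + abs(b[1] - a[1]) <= 10,
    impart_control=lambda: print("impart AP control: the touched portion changed in size"),
)
```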
  • the user input 570 may include the sensor panel as well as a keyboard, mouse, trackball, touchpad or other devices, which may be stand-alone or be a part of a system, such as part of a personal computer (e.g., desktop computer, laptop computer, etc.), personal digital assistant, mobile phone, converged device, or other rendering device for communicating with the processor 510 via any type of link, such as a wired or wireless link.
  • the user input device 570 is operable for interacting with the processor 510 including interaction within a paradigm of a GUI and/or other elements of the present system, such as to enable web browsing, selection of a portion of the GUI provided by a touch input.
  • the rendering device 530 may operate as a touch sensitive display for communicating with the processor 510 (e.g., providing selection of portions of the AP GUI). In this way, a user may interact with the processor 510 including interaction within a paradigm of a GUI, such as to operate the present system, device and method.
  • the user device 590, the processor 510, memory 520, rendering device 530 and/or user input device 570 may all or partly be portions of a computer system or other device, and/or be embedded in a portable device, such as a mobile telephone, personal computer (PC), personal digital assistant (PDA), converged device such as a smart telephone, etc.
  • the device 590, corresponding user interfaces and other portions of the system 500 are provided for imparting an enhanced control in accordance with the present system over application programs.
  • the methods of the present system are particularly suited to be carried out by a computer software program, such program containing modules corresponding to one or more of the individual steps or acts described and/or envisioned by the present system, such as the different engines, the application program, the user interface engine, etc.
  • Such program may of course be embodied in a computer-readable medium, such as an integrated chip, a peripheral device or memory, such as the memory 520 or other memory coupled to the processor 510.
  • the computer-readable medium and/or memory 520 may be any recordable medium (e.g., RAM, ROM, removable memory, CD-ROM, hard drives, DVD, floppy disks or memory cards) or may be a transmission medium utilizing one or more of radio frequency (RF) coupling, Bluetooth coupling, infrared coupling, etc. Any medium known or developed that can store and/or transmit information suitable for use with a computer system may be used as the computer-readable medium and/or memory 520.
  • Additional memories may also be used. These memories configure processor 510 to implement the methods, operational acts, and functions disclosed herein.
  • the operation acts may include controlling the rendering device 530 to render elements in a form of a GUI and/or controlling the rendering device 530 to render other information in accordance with the present system.
  • the term "memory" should be construed broadly enough to encompass any information able to be read from or written to an address in the addressable space accessed by a processor. With this definition, information on a network is still within memory 520, for instance, because the processor 510 may retrieve the information from the network for operation in accordance with the present system. For example, a portion of the memory as understood herein may reside as a portion of the content providers, and/or the user device.
  • the processor 510 is capable of providing control signals and/or performing operations in response to input signals from the user input device 570 and executing instructions stored in the memory 520.
  • the processor 510 may be an application-specific or general-use integrated circuit(s).
  • the processor 510 may be a dedicated processor for performing in accordance with the present system or may be a general-purpose processor wherein only one of many functions operates for performing in accordance with the present system.
  • the processor 510 may operate utilizing a program portion, multiple program segments, or may be a hardware device utilizing a dedicated or multi-purpose integrated circuit.
  • the changing in size may represent a control metaphor that may be applied to other types of control and/or to other parameters.
  • the brightness of a TV may be adjusted.
  • Other applications would readily occur to a person of ordinary skill in the art and are intended to be encompassed in the description of the present system.
  • While exemplary user interfaces are provided to facilitate an understanding of the present system, other user interfaces may be provided and/or elements of one user interface may be combined with another of the user interfaces in accordance with further embodiments of the present system.

Abstract

A method is disclosed for imparting control to an application program (AP) running on an electronic device, said electronic device comprising a touch panel and a processor for controlling said touch panel, said method being carried out by said processor and comprising the acts of: displaying a graphical user interface (GUI) of the AP on the touch screen; capturing a first touch input on the GUI, said first touch input corresponding to a first touched portion of said GUI; monitoring further touch input events on the GUI; and imparting an AP control in response to the monitored further touch input events upon determining that said further touch input events correspond to touched portions contiguous with one another on the GUI, causing the first touched portion to change in size.
PCT/IB2010/001808 2009-05-13 2010-05-11 User interface for providing enhanced control of an application program WO2010131122A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17802509P 2009-05-13 2009-05-13
US61/178,025 2009-05-13

Publications (2)

Publication Number Publication Date
WO2010131122A2 true WO2010131122A2 (fr) 2010-11-18
WO2010131122A3 WO2010131122A3 (fr) 2011-01-06

Family

ID=43003481

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/001808 WO2010131122A2 (fr) 2009-05-13 2010-05-11 User interface for providing enhanced control of an application program

Country Status (1)

Country Link
WO (1) WO2010131122A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309591A (zh) * 2012-03-09 2013-09-18 Acer Inc. Touch-controllable electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7663607B2 (en) * 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US9292111B2 (en) * 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US20070097096A1 (en) * 2006-03-25 2007-05-03 Outland Research, Llc Bimodal user interface paradigm for touch screen devices
US20080024454A1 (en) * 2006-07-31 2008-01-31 Paul Everest Three-dimensional touch pad input device
US7973778B2 (en) * 2007-04-16 2011-07-05 Microsoft Corporation Visual simulation of touch pressure


Also Published As

Publication number Publication date
WO2010131122A3 (fr) 2011-01-06


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10750164

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10750164

Country of ref document: EP

Kind code of ref document: A2