WO2013157013A1 - Selection of elements of a graphical user interface - Google Patents

Selection of elements of a graphical user interface

Info

Publication number
WO2013157013A1
WO2013157013A1
Authority
WO
WIPO (PCT)
Prior art keywords
pointer
mobile device
user input
touch
movement
Prior art date
Application number
PCT/IN2012/000270
Other languages
English (en)
Inventor
Mohit Jain
Sriganesh Madhvanath
Vimal Sharma
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/IN2012/000270
Publication of WO2013157013A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72415User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • An interactive system typically presents a graphical user interface ("UI") in a visual form.
  • Commonly known interactive systems include television sets and computers.
  • A remote control with a four-way directional pad is the most common mode of interaction with an interactive system that is at some distance from a user, such as most television sets and other devices used from a distance (for instance, a DVD player, a set-top box, etc.).
  • A remote control, however, does not always offer an intuitive way to interact with the system's UI, especially in the case of recent systems such as internet TV (iTV) or Smart TV.
  • FIG. 1 shows a flow chart of a method of selecting user interface elements of a UI, according to an embodiment.
  • FIGS. 2A and 2B are diagrams of an illustrative mechanism for selecting user interface elements of a UI, according to an embodiment.
  • FIGS. 3A and 3B are diagrams of an illustrative mechanism for selecting user interface elements of a UI, according to an embodiment.
  • FIGS. 4A and 4B are diagrams of an illustrative mechanism for selecting user interface elements of a UI, according to an embodiment.
  • FIGS. 5A and 5B are diagrams of an illustrative mechanism for selecting user interface elements of a UI, according to an embodiment.
  • FIGS. 6A and 6B are diagrams of an illustrative mechanism for selecting user interface elements of a UI, according to an embodiment.
  • FIG. 7 is a diagram of an illustrative mechanism for selecting user interface elements of a UI, according to an embodiment.
  • FIG. 8 illustrates a system for selecting user interface elements of a UI, according to an embodiment.
  • A typical infrared remote control is not amenable to a smooth interaction with a UI from a distance.
  • A UI may correspond to an internet TV or Smart TV system, or even a personal computer.
  • Smart TV systems not only enable users to view regular broadcast programs but also provide advanced services such as video on-demand, catch-up TV services, an Electronic Program Guide (EPG), etc.
  • Smart TV systems allow users to access their local or on-line content (for example, photos), browse the internet, download and launch apps, etc.
  • The user interface of these systems is far more complex than that of a regular television set.
  • Proposed is a solution that provides a more intuitive and fluid interaction with a UI, which may be at some distance from the user (for example, shown on a wall-mounted display).
  • Embodiments of the present solution provide a method and system for interaction with user interface elements of a UI viewed on a large display by using a touch-enabled handheld device. Examples described later enable selection of a user interface element in such a way that there is fluidity in the process.
  • A visible pointer is shown on the UI to identify the target being selected. But because the interaction happens from a distance, it is hard to move the pointer over the target directly, resulting in an overshooting problem.
  • A pointer is a graphical image on the UI which reflects movements of a pointing device.
  • A pointer may be used to select and move the graphical user interface elements in a UI.
  • FIG. 1 shows a flow chart of a method of selecting user interface elements of a UI, according to an embodiment.
  • The method may be implemented in a handheld device, like a mobile phone, an electronic tablet, a Personal Digital Assistant (PDA), etc., which may be coupled to an interactive system, like a television set, a set-top box or a computer.
  • The method includes receiving (110) a first user input for moving a pointer on a user interface.
  • The method envisages a scenario where a user is interacting with an interactive system via a display device, and the user input is received from a touch-enabled handheld device.
  • The display device may be present in close proximity to the user or it may be at a distance.
  • A display device is an output device for the presentation of information in visual or tactile form.
  • Some illustrative examples of a display may include a television screen, a computer monitor, and mounted displays.
  • The display device may employ different display technologies, such as Liquid Crystal Display (LCD), Light-Emitting Diode (LED) display, laser TV, etc.
  • The display device is capable of displaying a Graphical User Interface (UI) of the system being interacted with.
  • The graphical user interface may include a number of graphical user interface elements, which may be of different types. Some non-limiting illustrative examples include a window, a drop-down list, a button, a text box, a list box, a radio button, a check box, etc.
  • A pointer may be used to select and move the graphical user interface elements in a UI.
  • A pointer is a graphical image on a user interface which reflects movements of a pointing device.
  • The mobile device which is used to provide a user input to the display device may include a mobile phone, a tablet PC (personal computer), an electronic tablet, a PDA, a smart phone, a watch, a remote controller with a touch screen (or touch pad), etc.
  • Upon receipt of a first user input for moving a pointer on a display, the method recognizes (120) a first location of the pointer on the UI. The user's input to move a pointer on the UI is tracked, and the location where the pointer movement comes to an end is determined as the first location of the pointer.
  • Say an MS Word document is open on a display. A user may provide a first user input to move a pointer on the display to the "Home" menu option on the graphical user interface of the MS Word program. This positioning of the pointer on the "Home" menu option is determined as the first location of the pointer.
  • In another illustration, a graphical user interface on a display device includes first level user interface elements and second level user interface elements within each of the first level user interface elements.
  • A user may provide a first user input to move a pointer to one of the first level user interface elements.
  • The placement of the pointer on the first level user interface element would be considered the first location of the pointer.
  • A user may provide a second user input to further move the pointer on the UI.
  • The second user input may also be received (130) from a mobile device.
  • The second user input may be provided to fine-tune the first input and to refine the location of the pointer from its first location.
  • The second user input may be used to place the pointer at the location the user actually desires.
  • A user may want to further refine the pointer location in order to select a "Bold" font option, an "Italic" font option, an "Underline" text option, or any other option on the "Home" menu.
  • The second user input is provided to refine the location of the pointer from its first location.
  • The method recognizes (140) the second location of the pointer on the UI in response to the second user input.
  • The location where the pointer movement comes to an end in response to the second user input is determined as the second location of the pointer.
  • This position is identified as the second location of the pointer.
  • The second location of the pointer highlights the user interface element which a user intends to select.
  • The second user input may be provided to place the pointer on a second level user interface element within the first level user interface element. In this way, the user refines the location of the pointer from its first location to a location of his actual desire.
  • The user interface element corresponding to the second location of the pointer is selected (150).
  • The "Bold" font option on the "Home" menu would then be selected for execution by the computing system.
  • A selection may be made by providing a third user input to the computing system.
  • The third user input may be in the form of a single tap on the touch screen of the mobile device.
  • A selection may be made by other methods as well, such as, but not limited to, pressing a hardware key, a voice input, a gesture input, etc.
  • The above sequence of user inputs enables a user to select a graphical user interface element of his choice on a distant UI through a mobile device. A minimal sketch of this two-phase flow is given below.
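  • The following is a minimal, platform-neutral sketch of the sequence of steps 110-150 of FIG. 1. The event hooks, the coordinate model and the `ui.element_at` hit-testing helper are hypothetical names introduced for illustration; the patent does not prescribe any particular API.

```python
class PointerSelector:
    """Tracks a pointer through a coarse move, a refining move, and a selection."""

    def __init__(self, ui):
        self.ui = ui                  # hypothetical UI model offering element_at()
        self.pointer = (0, 0)         # current pointer position in UI coordinates
        self.first_location = None    # step 120: end of the first (coarse) move
        self.second_location = None   # step 140: end of the second (fine) move

    def on_first_input_end(self):
        # Step 120: the location where the first pointer movement comes
        # to an end is recognized as the first location of the pointer.
        self.first_location = self.pointer

    def on_second_input_end(self):
        # Step 140: the refined location reached in response to the
        # second user input is recognized as the second location.
        self.second_location = self.pointer

    def on_select(self):
        # Step 150: a third input (e.g. a single tap) selects the user
        # interface element under the second location of the pointer.
        return self.ui.element_at(self.second_location)
```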
  • In one example, a first user input may comprise a touch input on a touch screen of a mobile device followed by a movement of the mobile device in air to move (control) a pointer on a UI.
  • The first user input may be preceded by an initial touch input on the mobile device.
  • The initial touch input acts as a signal to indicate that the user may wish to use the mobile device as a pointing device (like a laser pointer).
  • The initial touch transfers the control of a pointer on a UI to the mobile device, and a subsequent motion of the mobile device in air is used to move the pointer (Fig. 2A).
  • If a user wants to control a pointer on a UI, all he needs to do is tap on the touch screen of his mobile device and point at the distant display. If a pointer is present on the display, its control would be transferred to the user, who may move it to a desired location (on the UI) by moving his mobile device.
  • The location where the pointer movement comes to an end is determined as the first location of the pointer. This is typically a location on the graphical user interface where a user would like to make a selection (of a user interface element).
  • A second user input in the above case may be provided through a movement of the user's hand across the touch screen of the mobile device.
  • It may be in the form of a continuous touch movement (an uninterrupted stroke) on the touch screen of the mobile device (Fig. 2B), wherein the user maintains constant contact with the touch screen during the movement.
  • The touch screen acts as a trackpad for moving the pointer on the display.
  • A user may make a continuous swipe movement over the touch screen to move the pointer to a location of his choice on the UI.
  • A final selection of a graphical user interface element may be made, for instance, by providing a single tap on the mobile device. A sketch of this trackpad-style refinement is given below.
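  • Below is a minimal sketch of the trackpad-style refinement of Fig. 2B, assuming touch samples arrive as (x, y) pairs. The gain constant is an assumption introduced here, not a value from the patent; a gain below 1 makes the refining phase less prone to overshooting.

```python
FINE_GAIN = 0.5   # assumed pointer-speed multiplier for the refinement phase

def on_touch_move(pointer, prev_touch, touch):
    """Apply one sample of an uninterrupted stroke to the distant pointer."""
    dx = (touch[0] - prev_touch[0]) * FINE_GAIN
    dy = (touch[1] - prev_touch[1]) * FINE_GAIN
    return (pointer[0] + dx, pointer[1] + dy)

# Example: a 40 x 20 pixel stroke moves the pointer by half that distance.
p = on_touch_move((400, 300), (100, 100), (140, 120))   # -> (420.0, 310.0)
```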
  • In another example, a first user input comprises a touch input on a touch screen of a mobile device followed by a movement of the mobile device in air to move (control) a pointer of a UI (Fig. 3A), similar to the one described in relation to Fig. 2A.
  • A second user input in this case is provided through discrete touch movement(s) across the touch screen of the mobile device.
  • A user moves his hand across the touch screen as if it were a discrete trackpad.
  • In a discrete pad mechanism, the touch screen of the mobile device acts as a trackpad, but only discrete motions are allowed.
  • To move the pointer towards the right, for instance, a user may perform a small swipe motion towards the right. The user may keep on making these discrete swipe motions until the pointer reaches the user interface element of his choice. Similarly, left, up and down movements may be made to select an interface element in the corresponding direction.
  • A user could also make longer swipe movements across the surface of the touch screen to quickly move through the interface elements.
  • The pointer could be a selection block, as illustrated in Fig. 3B. (A selection block may be considered a special type of pointer which encircles a user interface element to highlight its selection in a graphical user interface.)
  • A final selection of a graphical user interface element may be made, for instance, by providing a single tap on the mobile device. A sketch of this discrete-swipe mechanism follows.
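  • A sketch of the discrete-trackpad interpretation (Fig. 3B): each completed swipe is reduced to one of four directions and moves the selection block by exactly one element. The threshold value and the grid model are illustrative assumptions.

```python
MIN_SWIPE = 30  # assumed minimum swipe length (pixels) to count as a gesture

def classify_swipe(start, end):
    """Map a completed swipe to 'left', 'right', 'up', 'down', or None."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if max(abs(dx), abs(dy)) < MIN_SWIPE:
        return None                      # too small: ignore accidental contact
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

def move_selection(grid, row, col, direction):
    """Step the selection block one element in the swiped direction, clamped."""
    dr, dc = {"left": (0, -1), "right": (0, 1),
              "up": (-1, 0), "down": (1, 0)}[direction]
    return (min(max(row + dr, 0), len(grid) - 1),
            min(max(col + dc, 0), len(grid[0]) - 1))
```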
  • In a further example, a first user input may comprise a touch input on a mobile device, followed by a movement of the mobile device in air and a subsequent release movement wherein the user releases the touch input from the touch screen of the mobile device.
  • The initial touch input on a touch screen of the mobile device followed by a movement of the mobile device in air to move (control) a pointer on a UI is similar to the example mentioned earlier in relation to Fig. 2A. However, the difference lies in the subsequent movement.
  • The release action results in the disappearance of the regular pointer from the UI, and in its place a selection block appears on the screen (Fig. 4B). Since the appearance of the pointer changes (from cursor to selection block) upon release of the user's initial touch input, the location of the selection block may also be construed as the first location of the pointer.
  • A second user input in the above case may be provided through discrete touch inputs on the touch screen of the mobile device.
  • A user provides his inputs on the touch screen as if it were a directional pad. (In a directional pad mechanism, only discrete input taps are allowed to move a pointer. An input tap in this case may include one or multiple distinct taps, or a long-press tap.)
  • To move the cursor towards the right, for instance, a user may provide a discrete tap on the right-hand side of the touch screen. Each stroke moves the cursor to the next interface element on the user interface. The direction of a stroke determines the direction of the cursor movement.
  • A user may keep on making these discrete strokes until the pointer highlights the user interface element of his choice.
  • A final selection of a graphical user interface element may be made, for instance, by providing a single tap on the mobile device.
  • A final selection may also be made by tapping an "OK" key on the directional pad (as illustrated in Fig. 4B). A sketch of this directional-pad interpretation is given below.
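  • A sketch of the directional-pad interpretation of Fig. 4B, assuming the touch screen is divided into a central "OK" zone and four directional zones. The screen dimensions and the zone radius are illustrative assumptions, not values from the patent.

```python
SCREEN_W, SCREEN_H = 480, 800   # assumed touch-screen size in pixels
OK_RADIUS = 80                  # assumed radius of the central "OK" zone

def classify_tap(x, y):
    """Map a discrete tap to 'ok', 'left', 'right', 'up', or 'down'."""
    cx, cy = SCREEN_W / 2, SCREEN_H / 2
    dx, dy = x - cx, y - cy
    if dx * dx + dy * dy <= OK_RADIUS * OK_RADIUS:
        return "ok"                      # final selection, as in Fig. 4B
    # Otherwise the dominant axis away from the centre gives the direction.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```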
  • In yet another example, a first user input may comprise a continuous touch movement across a touch screen of the mobile device to control a pointer on the UI and a subsequent release movement wherein the touch movement is released from the touch screen of the mobile device (Fig. 5A).
  • A user may provide a first user input by performing a regular swipe motion on the touch screen of the mobile device, followed by a release of the user's hand (from the touch screen) when the swipe motion is completed.
  • A second user input in the above case may be provided through a mechanism similar to the directional pad described in relation to Fig. 4B (illustrated in Fig. 5B).
  • Discrete touch inputs are provided on the touch screen of the mobile device.
  • A user provides his inputs on the touch screen as if it were a directional pad.
  • Each stroke moves the cursor to the next interface element on the user interface, and the direction of a stroke determines the direction of the cursor movement.
  • A final selection of a graphical user interface element may be made, for instance, by providing a single tap on the mobile device.
  • In another example, a first user input may comprise a swipe movement across a touch screen of the mobile device to control a pointer on the display (Fig. 6A). This is similar to the method described in relation to Fig. 2B above. The location of the pointer on the UI, once the swipe movement is over, is identified as the first location of the pointer.
  • A second user input in the above case may be provided by moving the mobile device in air.
  • The mobile device acts as a pointing device to move the pointer from its first location on the UI.
  • The regular pointer disappears on moving the mobile device, and a selection box appears in its place.
  • A user moves the selection box from one interface element to another through discrete movements of the mobile device in air. For instance, to move a pointer from a present location to the right, a user may perform a small wrist motion towards the right (Fig. 6B). A sketch of how such discrete wrist motions might be detected follows.
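  • One way such discrete in-air motions could be detected is from gyroscope samples, sketched below. The axes, units and threshold are assumptions made for illustration; the patent does not specify a particular sensing mechanism.

```python
FLICK_THRESHOLD = 2.0   # assumed angular rate (rad/s) that counts as a flick

def detect_flick(yaw_rate, pitch_rate):
    """Reduce one gyroscope sample to a discrete direction, or None."""
    if max(abs(yaw_rate), abs(pitch_rate)) < FLICK_THRESHOLD:
        return None                          # ordinary hand tremor: ignore
    if abs(yaw_rate) >= abs(pitch_rate):
        return "right" if yaw_rate > 0 else "left"   # sideways wrist motion
    return "down" if pitch_rate > 0 else "up"        # vertical wrist motion
```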
  • In some implementations, the user interface elements are organized in a hierarchy, wherein a user interface element (or a group of user interface elements) may lead to another user interface element (or a group of user interface elements).
  • The user interface elements are organized at multiple levels, wherein a first level leads to a second level, the second level to a third level, and so on and so forth (Fig. 7).
  • In such a case, a first user input may be obtained through discrete touch movement(s) across the touch screen of the mobile device, as illustrated in Fig. 3B. In another instance, it may be obtained through discrete touch inputs on the touch screen of a mobile device, wherein the touch screen acts as a D-pad (such as illustrated in Fig. 4B).
  • A first user input may also be obtained by moving the mobile device in air, wherein a regular pointer disappears on moving the mobile device and a selection box appears in its place (as described in relation to Fig. 6B). Once a first user input is recognized, a cursor may appear inside the selected user interface element to provide access to a second level of user interface elements.
  • A second user input may be obtained by a swipe movement across a touch screen of the mobile device (similar to the method described in relation to Fig. 2B earlier) or through a touch input on a touch screen of a mobile device followed by a movement of the mobile device in air to move (control) a pointer on a UI (Fig. 2A).
  • To illustrate, a video may be selected from a collection of multimedia files by obtaining a first user input through any of the ways described above.
  • A second level user interface element (for example, play, pause, rewind, stop, etc.) may then be selected by providing a second user input through either of the input methods mentioned above. A sketch of this two-level selection is given below.
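  • A sketch of the two-level selection, using the multimedia example above. The data model (a list of video tiles, each exposing playback controls) is an assumption; the patent describes the hierarchy abstractly.

```python
first_level = [
    {"name": "video1.mp4", "controls": ["play", "pause", "rewind", "stop"]},
    {"name": "video2.mp4", "controls": ["play", "pause", "rewind", "stop"]},
]

def select_first_level(index):
    """First user input: pick a first-level element, exposing its children."""
    return first_level[index]["controls"]   # the level the cursor now moves in

def select_second_level(controls, index):
    """Second user input: pick a second-level element within the first."""
    return controls[index]

controls = select_first_level(0)            # e.g. choose a video from the grid
action = select_second_level(controls, 0)   # e.g. "play"
```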
  • Fig. 8 illustrates a system for selecting user interface elements of a UI, according to an embodiment.
  • The system 800 includes a computing device 810 connected to a display device (screen) 860.
  • The computing device 810 may be a mobile device (for example, a mobile phone, a touch pad, a personal digital assistant (PDA), a remote control, etc.), a desktop computer, a notebook computer, and the like.
  • Computing device 810 may include a processor 820 for executing machine readable instructions, a memory (storage medium) 830 for storing machine readable instructions, a touch interface 840, and a communication interface 850.
  • Processor 820 is arranged to execute machine readable instructions.
  • The machine readable instructions may be in the form of a software program.
  • In an embodiment, processor 820 executes machine readable instructions to: receive a first user input for moving a pointer on the UI; recognize a first location of the pointer on the UI in response to the first user input; receive a second user input for moving the pointer on the UI; recognize a second location of the pointer on the UI in response to the second user input; and select a user interface element on the UI based on the second location of the pointer.
  • The memory 830 may include computer system memory such as, but not limited to, SDRAM (Synchronous DRAM), DDR (Double Data Rate SDRAM), Rambus DRAM (RDRAM), Rambus RAM, etc., or storage memory media such as a floppy disk, a hard disk, a CD-ROM, a DVD, a pen drive, etc.
  • The touch interface 840 may include a touch input device, such as a touch screen. It may be used to receive a touch input from a user.
  • Communication interface 850 is used to communicate with an external device, such as the display screen 860. It may be implemented as a software program, hardware, firmware, or any combination thereof. Communication interface 850 may use a variety of communication technologies to enable communication between the computing device 810 and an external device. Some non-limiting examples of communication technologies which may be used include infrared, Bluetooth, Wi-Fi, etc. A sketch of such an event channel is given below.
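  • A sketch of how communication interface 850 might forward input events to the display side. The patent only names technologies such as infrared, Bluetooth and Wi-Fi; the TCP transport, endpoint address and JSON message shape below are assumptions chosen to keep the example self-contained.

```python
import json
import socket

DISPLAY_ADDR = ("192.168.1.50", 5000)   # hypothetical display-side endpoint

def send_event(sock, kind, payload):
    """Serialize one input event and send it, newline-delimited."""
    sock.sendall((json.dumps({"type": kind, **payload}) + "\n").encode("utf-8"))

def run():
    with socket.create_connection(DISPLAY_ADDR) as sock:
        send_event(sock, "move", {"dx": 12, "dy": -4})   # a refinement delta
        send_event(sock, "tap", {})                      # the final selection
```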
  • The computing device 810 may be a mobile device. In such a case the computing device may include additional components such as a receiver, a transmitter, an antenna, etc. for wireless communication over a voice or data communication network such as GSM, CDMA, etc.
  • The display device (screen) 860 may be any device that enables a user to receive visual feedback.
  • The display may be a liquid crystal display (LCD), a light-emitting diode (LED) display, a plasma display panel, a television, a computer monitor, and the like.
  • The display device 860 is capable of displaying a graphical user interface (UI).
  • The display may include a communication interface for communication with an external device, such as the computing device 810.
  • The communication interface may be implemented as a software program, hardware, firmware, or any combination thereof. It may use a variety of communication technologies (infrared, Bluetooth, Wi-Fi, etc.) to enable communication between the display device 860 and an external device.
  • The system components depicted in FIG. 8 are for the purpose of illustration only; the actual components may vary depending on the computing system and architecture deployed for implementation of the present solution.
  • The various components described above may be hosted on a single computing system or on multiple computer systems, including servers, connected together through suitable means.
  • Embodiments within the scope of the present solution may be implemented in the form of a computer program product including computer-executable instructions, such as program code, which may be run on any suitable computing environment in conjunction with a suitable operating system, such as a Microsoft Windows, Linux or UNIX operating system.
  • Embodiments within the scope of the present solution may also include program products comprising computer-readable media for carrying or having computer-executable instructions or data structures stored thereon.
  • Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer.
  • Such computer-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM, magnetic disk storage or other storage devices, or any other medium which can be used to carry or store desired program code in the form of computer-executable instructions and which can be accessed by a general purpose or special purpose computer.

Abstract

A method of selecting a user interface element on a graphical user interface (UI) is described. A first user input for moving a pointer on a UI is received from a touch-enabled handheld device. A first location of the pointer is recognized in response to the first user input. A second user input for moving the pointer on the UI is received from the touch-enabled handheld device. A second location of the pointer on the UI is recognized in response to the second user input. A user interface element is selected based on the second location of the pointer on the UI.
PCT/IN2012/000270 2012-04-17 2012-04-17 Selection of elements of a graphical user interface WO2013157013A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IN2012/000270 WO2013157013A1 (fr) 2012-04-17 2012-04-17 Selection of elements of a graphical user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IN2012/000270 WO2013157013A1 (fr) 2012-04-17 2012-04-17 Selection of elements of a graphical user interface

Publications (1)

Publication Number Publication Date
WO2013157013A1 (fr) 2013-10-24

Family

ID=49383026

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2012/000270 WO2013157013A1 (fr) 2012-04-17 2012-04-17 Selection of elements of a graphical user interface

Country Status (1)

Country Link
WO (1) WO2013157013A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994029788A1 (fr) * 1993-06-15 1994-12-22 Honeywell Inc. Method for utilizing a low-resolution touch screen in a high-resolution graphics environment
US20070070054A1 (en) * 2005-09-29 2007-03-29 Samsung Electronics Co., Ltd Slide-type input device, portable device having the input device and method and medium using the input device
WO2009080653A1 (fr) * 2007-12-20 2009-07-02 Purple Labs Method and system for moving a cursor and selecting objects on a touch screen using a pointing finger

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9766787B2 (en) 2008-06-27 2017-09-19 Microsoft Technology Licensing, Llc Using visual landmarks to organize diagrams
CN103645847A (zh) * 2013-12-02 2014-03-19 乐视致新电子科技(天津)有限公司 Method and system for controlling a smart TV by simulating a mouse through a mobile terminal
US10078411B2 (en) 2014-04-02 2018-09-18 Microsoft Technology Licensing, Llc Organization mode support mechanisms
GB2525945A (en) * 2014-05-09 2015-11-11 British Sky Broadcasting Ltd Television display and remote control
GB2535832A (en) * 2014-05-09 2016-08-31 Sky Cp Ltd Television display and remote control
GB2535832B (en) * 2014-05-09 2017-08-23 Sky Cp Ltd Television display and remote control
GB2525945B (en) * 2014-05-09 2019-01-30 Sky Cp Ltd Television display and remote control
US10298993B2 (en) 2014-05-09 2019-05-21 Sky Cp Limited Television user interface
CN104363495A (zh) * 2014-11-27 2015-02-18 北京奇艺世纪科技有限公司 Method and device for focus switching control through a remote control of a terminal device

Similar Documents

Publication Publication Date Title
US8271908B2 (en) Touch gestures for remote control operations
CN105278674B (zh) Radar-based gesture recognition through a wearable device
KR102044826B1 (ko) Method for providing a mouse function and terminal implementing the same
US20160170703A1 (en) System and method for linking and controlling terminals
US10175880B2 (en) Display apparatus and displaying method thereof
US20130176244A1 (en) Electronic apparatus and display control method
US20120249466A1 (en) Information processing apparatus, information processing method, program, control target device, and information processing system
EP2667627A2 (fr) Image display apparatus and method for operating the same
US20180364890A1 (en) Image display apparatus and method of operating the same
US20150339026A1 (en) User terminal device, method for controlling user terminal device, and multimedia system thereof
US20140181746A1 (en) Electrionic device with shortcut function and control method thereof
US9703577B2 (en) Automatically executing application using short run indicator on terminal device
US9595186B2 (en) Electronic device combining functions of touch screen and remote control and operation control method thereof
KR101325026B1 (ko) Method for controlling an application execution terminal based on the Android platform using a smart terminal, and computer-readable recording medium therefor
US20100162155A1 (en) Method for displaying items and display apparatus applying the same
EP2787429B1 (fr) Method and apparatus for inputting text in an electronic device having a touch screen
WO2013157013A1 (fr) Selection of elements of a graphical user interface
EP2743816A2 (fr) Method and apparatus for scrolling a screen in a display device
TW201435651A (zh) Mobile communication device and method for operating its human-machine interface
CN114779921A (zh) Method for improving instrument performance using completion of anticipated gestures
US20160085359A1 (en) Display apparatus and method for controlling the same
US10467031B2 (en) Controlling a display apparatus via a GUI executed on a separate mobile device
US20160124606A1 (en) Display apparatus, system, and controlling method thereof
KR102046181B1 (ko) System and method for linking and controlling terminals
US20150177963A1 (en) Method for selecting an electronic content to be displayed on a display of an electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12874375

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12874375

Country of ref document: EP

Kind code of ref document: A1