WO2012104288A1 - Multipoint touch surface device - Google Patents

Multipoint touch surface device

Info

Publication number
WO2012104288A1
WO2012104288A1 (PCT/EP2012/051531)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
action
sensing surface
multipoint sensing
fingers
Prior art date
Application number
PCT/EP2012/051531
Other languages
English (en)
Inventor
Robert Skog
Justus Petersson
Original Assignee
Telefonaktiebolaget L M Ericsson (Publ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget L M Ericsson (Publ)
Publication of WO2012104288A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224Touch pad or touch panel provided on the remote control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/432Content retrieval operation from a local storage medium, e.g. hard-disk
    • H04N21/4325Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
    • H04N21/4383Accessing a communication channel
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present application relates to a device comprising a multipoint sensing surface; a method in a device having a multipoint sensing surface, and a computer-readable medium.
  • Touch sensitive surfaces are popular device interfaces and can be found in the form of touch screens on smart-phones and as touch-pads on laptops. With the advent of capacitive, as opposed to resistive, touch sensors it became possible to detect multiple points of contact on the touch sensitive surface and from this the use of gestures in the user interface became more common.
  • An example of a simple gesture and action is pinch-to-zoom-out: if two contact points, such as those presented by a thumb and forefinger on the touch sensitive surface, are brought together as a pinch, the device responds by performing a zoom-out action on the currently displayed item, such as a picture or a webpage.
  • US Patent No. 7,340,077 to Gokturk et al. describes an arrangement where a camera is used to recognise a user's body position, which is used as an input into a related electronic device.
  • the terminal device includes a communication unit which receives electronic program guide (EPG) information from the Internet, a display unit that displays EPG information, and a control unit that controls the broadcast receiving apparatus to perform an operation corresponding to a selection.
  • a Touch Gesture Reference Guide by C. Villamor et al., available at www.lukew.com, describes gestures used for touch commands, using gestures to support user actions, visual representations of gestures, and outlines of how popular software platforms support touch gestures.
  • An advantage of using gestures as part of the user interface of a device is that certain actions can be performed intuitively by the user.
  • a problem with existing devices that use touch sensitive surfaces is that as the gestures increase in number and complexity, the user interface becomes less intuitive and less user friendly.
  • a device comprising a multipoint sensing surface and a gesture module.
  • the multipoint sensing surface is for receiving inputs from one or more objects.
  • the gesture module is configured to receive inputs from the multipoint sensing surface, and to match a gesture on the multipoint sensing surface to an action, the magnitude of the action determined by the number of contact points with which the gesture is made.
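The gesture-to-action matching described in this bullet can be sketched as follows. This is an illustrative sketch only: the gesture names, the action table, and the rule that the magnitude simply equals the number of contact points are assumptions, since the publication leaves the concrete mapping open.

```python
# Illustrative gesture module: match a gesture to an action and scale
# the action's magnitude by the number of contact points.
# All names and the scaling rule are hypothetical, not from the patent.

ACTIONS = {
    "swipe_right": "channel_up",
    "swipe_left": "channel_down",
}

def match_gesture(gesture: str, contact_points: int) -> tuple[str, int]:
    """Return (action, magnitude) for a recognised gesture."""
    if gesture not in ACTIONS:
        raise ValueError(f"unknown gesture: {gesture}")
    # Assumed rule: magnitude grows with the number of fingers used.
    return ACTIONS[gesture], contact_points
```

Under these assumptions, `match_gesture("swipe_right", 3)` would yield a channel-up action of magnitude 3.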
  • the device may further comprise a display.
  • the multipoint sensing surface may be mounted on the display.
  • the device may further comprise a transmission means for sending instructions to a further device.
  • the further device may be at least one of: a television, a set top box, a radio, a media player and a PC.
  • the action may be to change channel on a further device, wherein the number of fingers with which the gesture is performed determines the number of channels changed.
  • the direction of the gesture may determine the direction of channel change.
  • the action may be to skip a segment of media playback, wherein the number of fingers with which the gesture is performed determines the length of the skip.
  • the direction of the gesture may determine the direction of the skip.
  • the direction of the skip may be forwards in time or backwards in time.
  • the action may be to change the speed of media playback, wherein the number of fingers with which the gesture is performed determines the change in playback speed.
  • the direction of the gesture may determine whether the playback speed is increased or decreased.
  • the playback speed may be negative, sometimes referred to as "re-wind".
  • the action may be to move a cursor on a screen, wherein the number of fingers with which the gesture is performed determines the amount the cursor moves. In this way a multi-speed cursor is provided.
  • the method comprises receiving inputs from one or more objects in contact with the multipoint sensing surface.
  • the method further comprises matching a gesture on the multipoint sensing surface to an action.
  • the method further comprises determining the magnitude of the action by the number of contact points with which the gesture is made.
  • the multipoint sensing surface may be mounted in front of a display. Such a multipoint sensing surface is substantially transparent.
  • the method may further comprise sending instructions to a further device, the instructions relating to the action corresponding to the gesture and the number of contact points with which the gesture is made.
  • the action may be to change channel, wherein the number of fingers with which the gesture is performed determines the number of channels changed.
  • the direction in which the gesture is made may determine the direction of channel change.
  • the gesture may be at least one of a horizontal swipe, a vertical swipe or a rotational swipe, or a combination thereof.
  • a remote control for selecting and controlling an electronic device.
  • the remote control comprising: a touch-sensitive screen; a programmable digital signal processor in communication with the touch- sensitive screen; and a transmitter in communication with the programmable digital signal processor; wherein the processor is configured to determine a number of fingers touching the touch-sensitive screen, to set based on the number determined a pace corresponding to the electronic device, to identify a gesture made by the fingers on the screen, and to generate based on the gesture identified a corresponding command to be transmitted to the electronic device corresponding to the set pace.
  • Figure 1 illustrates an embodiment of a device as described herein having a multipoint sensing surface;
  • Figure 2 illustrates a plurality of gestures which may be used to instruct a command to be sent to a further device;
  • Figure 3 shows some alternative gestures which may be used herein;
  • Figure 4 illustrates a method incorporating the gestures described herein;
  • Figure 5 is a block diagram illustrating the components of the device described herein; and Figure 6 is a block diagram illustrating the components of an alternative device also described herein.
  • Figure 1 illustrates an embodiment of a device having a multipoint sensing surface.
  • the device shown in figure 1 is a remote control comprising a user interface for receiving instructions from a user and a transmitting means for transmitting commands to at least one further device.
  • Remote control 100 sends instructions to a television 110 and a radio 120. The instructions are transmitted from the remote control 100 using conventional wireless technology.
  • Remote control 100 comprises a display 102 which has a transparent touch sensitive surface overlaid to create what is commonly termed a touch-screen. Remote control 100 additionally includes a plurality of hardware buttons 103.
  • a user may use remote control 100 to send commands to the further devices 110, 120 to control their operation.
  • the remote control 100 thus allows a user to perform actions such as turn on a further device, and to change the channel the further device outputs.
  • a typical remote control has only physical buttons for receiving input from a user, but remote controls with touch sensitive surfaces such as touch screens are known.
  • Figure 2 illustrates a plurality of gestures which may be used to instruct a channel change command to be sent to one of the further devices 110, 120.
  • Which of the further devices 110, 120 the gesture relates to is determined by an aspect of the user interface of device 100.
  • the user interface of remote control 100 comprises a plurality of buttons at the top of display 102, each button relating to a further device 110, 120 that may be controlled. Before making a gesture on the touch screen 102 the user must ensure that the appropriate button is highlighted indicating the appropriate further device is selected.
  • the touch screen 102 of remote control 100 is capable of detecting multiple simultaneous contact points.
  • remote control 100 comprises an LCD display with a capacitive touch surface overlaid.
  • Figure 2A illustrates a two finger swipe to the right which is interpreted by remote control 100 as a single channel up instruction.
  • Figure 2B illustrates a two finger swipe to the left which is interpreted by remote control 100 as a single channel down instruction. While selecting a channel, switching between channels, or channel surfing, a user will typically wish to change from one channel to another channel multiple channels away in the displayed channel order. This may be achieved by the user inputting multiple consecutive channel up or channel down instructions.
  • Remote control 100 presents an intuitive and more efficient method of providing these instructions by recognising a three finger swipe to the left or to the right as a multiple channel up or a multiple channel down instruction. This is illustrated in figure 2C which shows a three finger swipe to the right indicating a five channel up instruction; and figure 2D which illustrates a three finger swipe to the left indicating a five channel down instruction.
  • the number of channels that is skipped when a multiple channel change instruction is received is an implementation detail which will vary dependent upon the total number of channels available. For example, for a television with only eight channels a five channel change instruction is unlikely to be useful whereas a three channel change instruction is more likely to be used. Similarly, for a radio with one hundred channels a ten channel change instruction may be more useful than a five channel change instruction.
  • Figure 3 shows some alternative gestures to which the multiple channel change instruction may be assigned. Each of the gestures in figure 3 is shown made by two contact points which will typically be two fingers. Each of the gestures shown in figure 3 may also be made with one, three or even four fingers.
  • Figures 3A and 3B illustrate a two finger swipe to the right and to the left respectively corresponding to the single channel change instructions in figures 2A and 2B.
  • Figure 3C shows a two finger swipe up
  • figure 3D shows a two finger swipe down
  • Figure 3E shows a two finger rotational swipe clockwise
  • Figure 3F shows a two finger rotational swipe anticlockwise.
  • the channel up and channel down instruction described above in relation to remote control 100 may alternatively be implemented with vertical swipes (figures 3C and 3D respectively) or rotational swipes (figures 3E and 3F respectively). It should be further noted that where the size of the touch screen 102 of remote control 100 is sufficient, three-speed channel change instructions may be implemented.
  • a two finger swipe may correspond to a single channel change
  • a three finger swipe may correspond to a four channel change
  • a four finger swipe may correspond to a ten channel change.
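The finger-count to channel-step mapping in the three bullets above (two fingers for one channel, three for four, four for ten) can be expressed directly. The function name and the treatment of unlisted finger counts are illustrative assumptions.

```python
# Channel-step table from the example above: 2 fingers -> 1 channel,
# 3 fingers -> 4 channels, 4 fingers -> 10 channels.
CHANNEL_STEPS = {2: 1, 3: 4, 4: 10}

def channel_delta(fingers: int, direction: str) -> int:
    """Signed channel change: positive for a rightward swipe (channel up),
    negative for a leftward swipe (channel down)."""
    step = CHANNEL_STEPS.get(fingers, 0)  # unlisted counts: no change (assumed)
    return step if direction == "right" else -step
```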
  • the technique may also be applied to scrolling or skipping through a media presentation (such as a recorded or buffered TV programme, an audio file or an audio & video file), or scrolling through a list such as an electronic programme guide (EPG) or content catalogue.
  • a two finger swipe may correspond to scrolling at twice normal speed
  • a three finger swipe may correspond to scrolling at four times normal speed.
  • a two finger swipe may correspond to an increase of the scrolling speed by one step (from 1x to 2x, or from 2x to 4x), and a three finger swipe may correspond to an increase of the scrolling speed by two steps (from 1x to 4x, from 2x to 8x, or from 4x to 16x).
  • a two finger swipe may correspond to skipping 5 seconds, whereas a three finger swipe may correspond to skipping 30 seconds.
  • a two finger swipe may correspond to moving on to the next item on the list (or time slot in an EPG) and a three finger swipe may correspond to moving on to the fourth next item on the list (or time slot on the EPG).
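The step-based speed change described above, where each step doubles the speed and a two finger swipe advances one step while a three finger swipe advances two, amounts to a power-of-two multiplier. The sketch below assumes that reading; the handling of other finger counts is also an assumption.

```python
# Step-based scroll/playback speed from the example above: each step
# doubles the speed; two fingers advance one step, three fingers two.
def next_speed(current: float, fingers: int) -> float:
    steps = {2: 1, 3: 2}.get(fingers, 0)  # other counts: unchanged (assumed)
    return current * (2 ** steps)
```

For instance, a two finger swipe takes 1x to 2x, and a three finger swipe takes 2x to 8x, matching the steps listed above.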
  • the direction of the gesture may be used to determine the direction of the skipping or scrolling.
  • single finger gestures are not used for scrolling or skipping instructions as this may be reserved for another function in the user interface of remote control 100 such as, for example, cursor movement. It should be understood that where reserving of the single finger swipe is not required, the single finger swipe may be used for single channel change, skip or scroll instructions, with additional fingers used to indicate a higher speed of channel change, skip or scroll.
  • the technique described herein may be applied to cursor movement. This is particularly relevant to devices that have a separate touch sensitive surface and screen, or to devices which have a touch screen but where the cursor controlled by the touch sensitive surface of the touch screen is displayed on a screen other than the touch screen.
  • the relation between amount of input movement and amount of cursor movement is determined by the cursor speed. For a given input movement a fast cursor moves further on the screen than a slow cursor.
  • the technique described herein may be used to provide a multi-speed cursor whereby a cursor moves at one speed in response to a one finger gesture, and at a faster speed in response to a two finger gesture.
  • the cursor may move faster still in response to a three finger gesture.
  • the cursor may move at one speed in response to a one finger gesture, and at a slower speed in response to a two finger gesture.
  • the cursor may move slower still in response to a three finger gesture. This may be particularly useful where precision cursor control is occasionally required.
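The multi-speed cursor in the bullets above scales cursor displacement by the number of fingers used. The gain values below are illustrative assumptions; gains below 1.0 would give the slower, precision mode also described above.

```python
# Multi-speed cursor sketch: displacement scales with finger count.
# Gain values are hypothetical; use gains < 1.0 for a precision mode.
CURSOR_GAINS = {1: 1.0, 2: 2.0, 3: 4.0}

def cursor_move(dx: float, dy: float, fingers: int) -> tuple[float, float]:
    gain = CURSOR_GAINS.get(fingers, 1.0)
    return dx * gain, dy * gain
```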
  • Figure 4 illustrates a method implemented by a device such as remote control 100 incorporating the gestures described above.
  • the process starts and at 410 it is determined whether or not a touch input is detected. If no touch input is detected the process proceeds to a standby mode 405 in which the device may periodically determine whether touch inputs have been made to the multipoint sensing surface. If a touch input is detected at 410 the process proceeds to 420 where a determination is made as to the number of contact points that the touch input comprises. Then the process proceeds to 430 where the touch input is monitored and any movements of the contact points are detected. Once movement is detected a gesture is identified at 440 which best correlates with the detected movement. Thus, the process establishes what gesture was used and the number of contact points used to make it. At 450 the process performs the action corresponding to the input gesture and the number of contact points. The action may comprise transmitting a control instruction to a further device.
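The flow of figure 4 (steps 410 to 450) can be sketched as a small function over sampled touch frames. The frame representation and the simple left/right gesture classifier are simplifying assumptions, not the patent's implementation.

```python
# Sketch of the figure 4 flow over a list of touch frames, where each
# frame is a list of (x, y) contact points. Frame format is assumed.
def process_touch(frames):
    if not frames or not frames[0]:
        return None                                # 405: standby, no input
    contacts = len(frames[0])                      # 420: count contact points
    start, end = frames[0][0], frames[-1][0]       # 430: monitor movement
    dx = end[0] - start[0]
    gesture = "swipe_right" if dx > 0 else "swipe_left"  # 440: identify gesture
    return gesture, contacts                       # 450: act on gesture + count
```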
  • FIG. 5 is a block diagram illustrating the components of remote control 100.
  • Remote control 100 comprises a touch screen 510, a gesture module 520, a processor 530 and a graphics module 540.
  • the gesture module 520 receives touch information from the touch sensitive surface of touch screen 510 and translates these signals into gesture information which is sent to a processor 530.
  • Processor 530 is arranged to run instructions to allow the remote control 100 to perform as required.
  • Processor 530 sends display information to a graphics module 540 which drives the display component of touch screen 510.
  • Figure 6 illustrates the structure of a device 600 which does not comprise a touch screen but has separate touch and display surfaces.
  • Device 600 comprises a touch surface 610, a gesture module 620, a processor 630, a graphics module 640, and a display 650.
  • the gesture module 620 receives touch information from the touch surface 610 and translates these signals into gesture information which is sent to a processor 630.
  • Processor 630 is arranged to run instructions to allow the device 600 to perform as required.
  • Processor 630 sends display information to a graphics module 640 which drives the display 650.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a device comprising a multipoint sensing surface and a gesture module. The multipoint sensing surface serves to receive inputs from one or more objects. The gesture module is configured to receive inputs from the multipoint sensing surface, and to match a gesture on the multipoint sensing surface to an action, the magnitude of the action being determined by the number of contact points with which the gesture is made.
PCT/EP2012/051531 2011-02-03 2012-01-31 Multipoint touch surface device WO2012104288A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161439147P 2011-02-03 2011-02-03
US61/439,147 2011-02-03

Publications (1)

Publication Number Publication Date
WO2012104288A1 true WO2012104288A1 (fr) 2012-08-09

Family

ID=45833357

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/051531 WO2012104288A1 (fr) 2011-02-03 2012-01-31 Multipoint touch surface device

Country Status (1)

Country Link
WO (1) WO2012104288A1 (fr)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080062141A1 (en) * 2006-09-11 2008-03-13 Imran Chandhri Media Player with Imaged Based Browsing
EP1942401A1 (fr) * 2007-01-05 2008-07-09 Apple Inc. Multimedia communication device with touch screen responsive to gestures for controlling, manipulating and editing media files
US20080165140A1 (en) * 2007-01-05 2008-07-10 Apple Inc. Detecting gestures on multi-event sensitive devices
US20090153288A1 (en) * 2007-12-12 2009-06-18 Eric James Hope Handheld electronic devices with remote control functionality and gesture recognition
EP2182431A1 (fr) * 2008-10-28 2010-05-05 Sony Corporation Information processing method
US20100162181A1 (en) * 2008-12-22 2010-06-24 Palm, Inc. Interpreting Gesture Input Including Introduction Or Removal Of A Point Of Contact While A Gesture Is In Progress
EP2202614A1 (fr) * 2008-12-26 2010-06-30 Brother Kogyo Kabushiki Kaisha User input device for a multifunction peripheral device
US20100214322A1 (en) * 2009-02-24 2010-08-26 Samsung Electronics Co., Ltd. Method for controlling display and device using the same


Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11907519B2 (en) 2009-03-16 2024-02-20 Apple Inc. Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate
US11893052B2 (en) 2011-08-18 2024-02-06 Apple Inc. Management of local and remote media items
US11755712B2 (en) 2011-09-29 2023-09-12 Apple Inc. Authentication with secondary approver
US20130246948A1 (en) * 2012-03-16 2013-09-19 Lenovo (Beijing) Co., Ltd. Control method and control device
US9024894B1 (en) * 2012-08-29 2015-05-05 Time Warner Cable Enterprises Llc Remote control including touch-sensing surface
WO2014099893A3 (fr) * 2012-12-17 2014-08-21 Motorola Mobility Llc Multipoint gesture for moving a media item
US11907013B2 (en) 2014-05-30 2024-02-20 Apple Inc. Continuity of applications across devices
US12001650B2 (en) 2014-09-02 2024-06-04 Apple Inc. Music user interface
WO2017081455A1 (fr) * 2015-11-09 2017-05-18 Sky Cp Limited Television user interface
CN108476338A (zh) * 2015-11-09 2018-08-31 Cp天空有限公司 Television user interface
US11523167B2 (en) 2015-11-09 2022-12-06 Sky Cp Limited Television user interface
US11900372B2 (en) 2016-06-12 2024-02-13 Apple Inc. User interfaces for transactions
US11201961B2 (en) * 2017-05-16 2021-12-14 Apple Inc. Methods and interfaces for adjusting the volume of media
US11750734B2 (en) 2017-05-16 2023-09-05 Apple Inc. Methods for initiating output of at least a component of a signal representative of media currently being played back by another device
US12107985B2 (en) 2017-05-16 2024-10-01 Apple Inc. Methods and interfaces for home media control
AU2022200515B2 (en) * 2017-05-16 2023-01-12 Apple Inc. Methods and interfaces for home media control
US11683408B2 (en) 2017-05-16 2023-06-20 Apple Inc. Methods and interfaces for home media control
WO2019127419A1 (fr) * 2017-12-29 2019-07-04 李庆远 Hand gesture method and device for multi-level rewind and fast-forward
WO2019127566A1 (fr) * 2017-12-30 2019-07-04 李庆远 Station changing method and device based on multi-level gestures
US11853646B2 (en) 2019-05-31 2023-12-26 Apple Inc. User interfaces for audio media control
US11620103B2 (en) 2019-05-31 2023-04-04 Apple Inc. User interfaces for audio media control
US11785387B2 (en) 2019-05-31 2023-10-10 Apple Inc. User interfaces for managing controllable external devices
US11714597B2 (en) 2019-05-31 2023-08-01 Apple Inc. Methods and user interfaces for sharing audio
US11755273B2 (en) 2019-05-31 2023-09-12 Apple Inc. User interfaces for audio media control
US12114142B2 (en) 2019-05-31 2024-10-08 Apple Inc. User interfaces for managing controllable external devices
US11782598B2 (en) 2020-09-25 2023-10-10 Apple Inc. Methods and interfaces for media control with dynamic feedback
US12112037B2 (en) 2020-09-25 2024-10-08 Apple Inc. Methods and interfaces for media control with dynamic feedback
US11847378B2 (en) 2021-06-06 2023-12-19 Apple Inc. User interfaces for audio routing

Similar Documents

Publication Publication Date Title
WO2012104288A1 (fr) Multipoint touch surface device
US11243615B2 (en) Systems, methods, and media for providing an enhanced remote control having multiple modes
JP6144242B2 (ja) GUI application for 3D remote controller
US8881049B2 (en) Scrolling displayed objects using a 3D remote controller in a media system
US10324612B2 (en) Scroll bar with video region in a media system
US8194037B2 (en) Centering a 3D remote controller in a media system
KR101621524B1 (ko) Display apparatus and control method thereof
US20120162101A1 (en) Control system and control method
KR101379398B1 (ko) Remote control method for smart television
US20090158222A1 (en) Interactive and dynamic screen saver for use in a media system
US20090153475A1 (en) Use of a remote controller Z-direction input mechanism in a media system
EP2682853A2 (fr) Mobile device and operation control method available for use of the touch-and-drop function
CN111897480B (zh) Playback progress adjustment method and apparatus, and electronic device
KR101515454B1 (ko) Remote control having dual touch pads and control method using the same
EP2341492B1 (fr) Electronic device including touch screen and operation control method thereof
US20100162155A1 (en) Method for displaying items and display apparatus applying the same
WO2012057179A1 (fr) Electronic device
KR102250091B1 (ko) Display apparatus and display method
JP2013143139A (ja) Input device, display device, control method thereof, and display system
KR101253168B1 (ko) Input device having a touch pad
US20240231599A9 (en) A computer a software module arrangement, a circuitry arrangement, a user equipment and a method for an improved user interface controlling multiple applications simultaneously
JP5246974B2 (ja) Input device for electronic equipment, input operation processing method, and input control program
KR101748735B1 (ko) Electronic device having touch screen and operation control method thereof
AU2015258317B2 (en) Apparatus and method for controlling motion-based user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12708772

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12708772

Country of ref document: EP

Kind code of ref document: A1