WO2012078665A1 - User interface for a remote control device - Google Patents

User interface for a remote control device

Info

Publication number
WO2012078665A1
Authority
WO
WIPO (PCT)
Prior art keywords
context
user interface
remote controller
specific user
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2011/063583
Other languages
English (en)
French (fr)
Inventor
Neil D. Hunt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netflix Inc
Original Assignee
Netflix Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netflix Inc filed Critical Netflix Inc
Priority to CA2819709A priority Critical patent/CA2819709C/en
Priority to BR112013013945-5A priority patent/BR112013013945B1/pt
Priority to JP2013543286A priority patent/JP5770856B2/ja
Priority to EP11846635.8A priority patent/EP2649501A4/en
Priority to KR1020137017617A priority patent/KR101525091B1/ko
Priority to MX2013006311A priority patent/MX2013006311A/es
Publication of WO2012078665A1 publication Critical patent/WO2012078665A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224Touch pad or touch panel provided on the remote control

Definitions

  • This disclosure generally relates to a displayed user interface for consumer electronic devices, and more specifically to activating the user interface using a remote controller including a touchpad.
  • a mouse-like pointing device is used to interact with the media content, and the content is viewed on a display coupled to a personal computer.
  • consumer electronic devices have started to be configured as connected media devices, e.g., connected Blu-Ray players, media adaptors, connected televisions, digital video recorders, and cable boxes.
  • Typically, a handheld remote control device is used that can navigate a displayed menu in a limited number of directions, such as left, right, up, and down.
  • The typical handheld remote control device includes numerous buttons to control the different operations that can be performed by the consumer electronic device. In order to press a particular button, a user may need to read a label on the button, or the buttons may require backlighting that consumes more power. Overall, the typical handheld remote control is less intuitive and more cumbersome than a typical mouse-like pointing device.
  • a system includes a display device, CE device, and a remote controller with a touchpad.
  • the CE device is configured to output a context-specific user interface for display by the display device.
  • the particular user interface varies based on the operational context of the CE device.
  • the remote controller sends signals that are detected by the CE device and control operations performed by the CE device. More specifically, the remote controller is used as a handheld pointing device to position the movable cursor and select operations through interactions with the context-specific user interface.
  • a user may apply gestures to the touchpad that are interpreted based on the context-specific user interface to control the CE device.
  • One embodiment of the invention provides a computer-implemented method for controlling a consumer electronic device.
  • the method includes receiving a signal from a remote controller that is configured to control the consumer electronic device, where the signal is generated in response to a user gesture that is applied to a touchpad of the remote controller.
  • An operational context is determined based on a function being performed by the consumer electronic device.
  • a context-specific user interface based on the operational context is output for display to a display device associated with the consumer electronic device.
  • A cursor having a position that is controlled by the signal received from the remote controller is output for display and overlaid on the context-specific user interface.
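As a rough sketch, the steps of the claimed method above might be expressed as follows. The class and method names, the example contexts and control lists, and the reduction of a touchpad gesture to a relative motion (dx, dy) are all assumptions for illustration, not taken from the patent:

```python
class CEDevice:
    """Minimal stand-in for the consumer electronic device."""

    def __init__(self, current_function="dvd_playback"):
        self.current_function = current_function  # assumed state variable
        self.cursor = (0, 0)

    def determine_context(self):
        # The operational context is the function currently being
        # performed by the device.
        return self.current_function

    def build_ui(self, context):
        # Output a context-specific control set for the attached display.
        if context == "dvd_playback":
            return ["Play", "Stop", "Pause", "FF", "RW"]
        return ["OK", "Back"]


def handle_remote_signal(device, dx, dy):
    """Receive a signal generated by a touchpad gesture, determine the
    operational context, output the context-specific UI, and output a
    cursor whose position the signal controls, overlaid on the UI."""
    context = device.determine_context()
    ui = device.build_ui(context)
    x, y = device.cursor
    device.cursor = (x + dx, y + dy)
    return ui, device.cursor
```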
  • A remote controller including a touchpad provides an inexpensive solution for controlling a CE device through interaction with a context-specific user interface displayed on an existing display on or attached to the CE device.
  • the context-specific user interface simplifies the user interaction for controlling the CE device since the user interface includes only controls that are relevant for the current operations for the CE device.
  • Figure 2 is a block diagram illustrating the consumer electronic device and display device of Figure 1, according to one embodiment of the present invention.
  • Figure 3 sets forth a flow diagram of method steps for controlling the consumer electronic device, according to one embodiment of the present invention.
  • Figures 4A, 4B, 4C, 4D, and 4E illustrate context-specific user interfaces, according to one embodiment of the present invention.
  • Figure 5 illustrates the remote controller of Figure 1, according to one embodiment of the present invention.
  • Figure 1 illustrates a system 100 that is configured to implement one or more aspects of the present invention.
  • The system 100 includes a display device 120, consumer electronic (CE) device 115, and a remote controller 110 with a touchpad 105.
  • the remote controller 110 may communicate with the CE device 115 through an infra-red optical technology or a radio-frequency wireless link based upon Bluetooth, WiFi, Z-wave, or other low-power digital wireless connection.
  • The connection may be uni-directional (from the remote controller 110 to the CE device 115) or bi-directional.
  • The touchpad 105 of the remote controller 110 may be a touch-sensitive input device using a capacitive or similar technology to detect gestures, e.g., tap, swipe, and the like, applied by a user.
  • The remote controller 110 may be packaged in a housing appropriate for a hand-held device, e.g., a package 15 cm long, 5-10 cm wide, and 1 cm thick.
  • The remote controller 110 is configured to control the CE device 115, e.g., a television, DVD player, set-top box, or other media device that includes an integrated display or is coupled to the display device 120.
  • The display device 120 may be a conventional CRT, an LCD-based monitor, a projection screen, a combination of a projector and screen, or the like.
  • The CE device 115 may comprise a set-top box configured to receive streaming audiovisual media, to convert the media to one or more output signals in a specified audiovisual format, and to provide the one or more output signals to the display device 120.
  • The CE device 115 may comprise a digital disc media player, such as a Blu-ray player, configured to receive streaming audiovisual media, to convert the media to one or more output signals in a specified audiovisual format, and to provide the one or more output signals to the display device 120.
  • The CE device 115 may comprise a personal computer, laptop computer, notebook computer, or handheld computer.
  • Embodiments also encompass a computer program product that can be loaded into a computer to perform the functions that are described herein.
  • an embodiment may comprise a computer-readable storage medium storing one or more sequences of instructions which, when executed by one or more digital processors, cause the processors to perform steps as shown and described.
  • The consumer electronic device 115 is configured to output a context-specific user interface for display on the display device 120.
  • The particular user interface varies based on the operational context of the CE device 115, where the operational context is the function being performed by the CE device 115 at the particular moment in time.
  • The operational context of the CE device 115 varies when a DVD is inserted into the CE device 115 compared with when the content of the DVD is playing.
  • The operational context of the CE device 115 is different when the CE device 115 is configured to provide a video conferencing function compared with when the CE device 115 is configured as a game box.
  • The context-specific user interface may include selection controls, e.g., a pull-down menu or radio buttons, for selecting a particular station or playlist when the CE device 115 is configured to provide an audio media playback function.
  • Other context-specific user interfaces may provide an onscreen keyboard to allow a user to enter specific words, channels, call letters, and the like, based on other operational contexts of the CE device 115.
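The idea that each operational context exposes its own control set can be sketched as a simple lookup table. The context names and control lists below are illustrative assumptions drawn loosely from the examples in the text, not an encoding specified by the patent:

```python
# Hypothetical mapping from operational context to the control set
# shown in the context-specific user interface.
CONTEXT_CONTROLS = {
    "dvd_inserted":   ["Play", "Eject", "DVD Menu"],
    "dvd_playing":    ["Play", "Stop", "Pause", "FF", "RW", "Next", "Prev"],
    "audio_playback": ["Station", "Playlist"],
    "text_entry":     ["Onscreen Keyboard"],
}


def controls_for(context):
    """Only controls relevant to the current operation are exposed;
    unrecognized contexts yield an empty control set."""
    return CONTEXT_CONTROLS.get(context, [])
```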
  • The remote controller 110 sends signals that are detected by the CE device 115 and control operations performed by the CE device 115. More specifically, the remote controller 110 is used as a handheld pointing device to select operations through interactions with the context-specific user interface. A user may apply gestures to the touchpad 105 that are interpreted by the CE device 115 based on the context-specific user interface to control the CE device 115. Importantly, the user interface is displayed on the display device 120 and not on the remote controller 110. Therefore, no display of status information is necessary on the remote controller 110. In one embodiment, the remote controller 110 includes only the touchpad 105 and no buttons or other mechanism for receiving user-provided input.
  • The remote controller 110 may consume very little power, and may be powered by two or three AA or AAA sized batteries. In one embodiment, the remote controller 110 includes a status display that is configured to highlight different control-sets on the touchpad 105 with selective backlighting.
  • The context-specific user interface that is displayed by the display device 120 simplifies the user interaction compared with controlling the CE device 115 through a conventional multi-button remote control device. In particular, only controls that are relevant to the operations that may be performed by the user are included in the context-specific user interface. For example, when the CE device 115 is configured to display web pages on the display device 120 and perform web page operations, a context-specific user interface for navigating web pages may be displayed.
  • The remote controller 110 controls the position of a cursor that is overlaid on the context-specific user interface and is used to select various operations or actions to be performed by the CE device 115 through the context-specific user interface.
  • A context-specific user interface may include transport controls, e.g., Play, Stop, Pause, FF, RW, Next, Prev, and the like.
  • A context-specific user interface may also include mode-sensitive controls, e.g., player setup controls, eject, power, and the like. Since the context-specific user interface is displayed on the display device 120, no additional illumination is needed in the room or on the remote controller 110, such as backlighting. Therefore, the remote controller 110 may be operated without additional lighting and does not require any physical buttons for providing input commands to the CE device 115.
  • the remote controller 110 When the user activates the remote controller 110 a signal is transmitted to CE device 1 15.
  • The signal may be transmitted in response to a motion sensor when the user picks up the remote controller 110 or when the user touches or strokes the touchpad 105.
  • The CE device 115 overlays a movable cursor and a context-specific user interface including graphical representations of mode-appropriate controls.
  • The user interface may appear on the screen by fading in, sliding in from an edge, or otherwise appearing, perhaps overlaid on the video, imagery, or other content on the display device 120.
  • the user strokes or swipes the touchpad 105 with directional gestures that cause the cursor to move on the display device 120.
  • Clickable controls included in the context-specific user interface may be highlighted as the cursor passes over the controls.
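The cursor movement and hover-highlighting behavior described above might be sketched as follows. The on-screen control layout, screen dimensions, and function names are assumptions for illustration:

```python
# Assumed on-screen layout: control name -> bounding box (x0, y0, x1, y1).
CONTROLS = {
    "Play":  (100, 400, 160, 440),
    "Stop":  (180, 400, 240, 440),
    "Pause": (260, 400, 320, 440),
}


def move_cursor(cursor, dx, dy, screen_w=1920, screen_h=1080):
    """Apply a relative directional gesture from the touchpad,
    clamping the cursor position to the screen."""
    x = min(max(cursor[0] + dx, 0), screen_w - 1)
    y = min(max(cursor[1] + dy, 0), screen_h - 1)
    return (x, y)


def highlighted_control(cursor):
    """Return the clickable control under the cursor, if any, so it
    can be highlighted as the cursor passes over it."""
    for name, (x0, y0, x1, y1) in CONTROLS.items():
        if x0 <= cursor[0] <= x1 and y0 <= cursor[1] <= y1:
            return name
    return None
```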
  • In contrast, a conventional remote control device includes numerous buttons that control the cursor movement and other operations. Only a subset of the buttons may be relevant for a particular operational mode of the CE device 115. Finding the correct buttons to activate on a conventional remote control device can be challenging for a user, particularly when the lighting level is low and labels on the buttons cannot be easily seen.
  • the special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination.
  • Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques.
  • the special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hardwired and/or program logic to implement the techniques.
  • Figure 2 is a block diagram illustrating the CE device 115 and display device 120 of Figure 1, according to one embodiment of the present invention.
  • the CE device 115 includes a central processing unit (CPU) 202 and a system memory 204 communicating via an interconnection path that may include a bridge 205.
  • CPU 202 may be, for example, a general purpose microprocessor and the system memory 204 may be a random access memory (RAM) or other dynamic storage device for storing information and instructions to be executed by CPU 202.
  • System memory 204 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by CPU 202.
  • Such instructions when stored in storage media accessible to CPU 202, render CE device 115 into a special- purpose machine that is customized to perform the operations specified in the instructions.
  • A device driver 203 may be stored in the system memory 204 and configured to translate signals received from the remote controller 110 into commands for the CE device 115.
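The device driver's translation step could be sketched as a lookup from raw signal codes to device commands. The codes and command names below are invented for illustration; the patent does not specify any signal encoding:

```python
# Hypothetical raw signal codes from the remote-controller interface.
RAW_TO_COMMAND = {
    0x01: "cursor_move",
    0x02: "tap",
    0x03: "double_tap",
    0x04: "wake",  # e.g., motion sensor fired when the remote was picked up
}


def translate(raw_code):
    """Device-driver step: translate a raw remote-controller signal
    into a command for the CE device; unknown codes are ignored."""
    return RAW_TO_COMMAND.get(raw_code, "ignore")
```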
  • A system disk 214, such as a magnetic disk or optical disk, is provided and coupled to the bridge 205 for storing information and instructions.
  • CE device 115 may be coupled via a bus to a display device 120 and the bridge 205 may receive user input from one or more user input devices (e.g., keyboard, physical buttons, mouse, or trackball) via a direct input interface 208.
  • The bridge 205 forwards the user input to CPU 202 and provides pixels for display to the display device 120.
  • CE device 115 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs CE device 115 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by CE device 115 in response to CPU 202 executing one or more sequences of one or more instructions contained in system memory 204. Such instructions may be read into system memory 204 from another storage medium, such as system disk 214. Execution of the sequences of instructions contained in system memory 204 causes CPU 202 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
  • Non-volatile media includes, for example, optical or magnetic disks, such as system disk 214.
  • Volatile media includes dynamic memory, such as system memory 204.
  • Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that couple components to the bridge 205 and a switch 216.
  • Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • The switch 216 provides connections between the bridge 205 and other components such as a network link 218 and various add-in cards, such as add-in card 220. Other components (not explicitly shown), including USB or other port connections, CD drives, DVD drives, film recording devices, and the like, may also be connected to the bridge 205.
  • a remote-controller interface 215 that is coupled to the switch 216 may include an infra-red detector that can receive the data carried in an infra-red signal output by the remote controller 110 and provide the signal to the CPU 202 and/or the device driver 203.
  • the switch 216 also provides a two-way data communication coupling to a network link 218 that is connected to a local network.
  • the network link 218 may include a local area network (LAN) card to provide a data communication connection to a compatible LAN.
  • Wireless links may also be implemented.
  • network link 218 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • Network link 218 typically provides data communication through one or more networks to other data devices.
  • network link 218 may provide a connection through a local network to a host computer or to data equipment operated by an Internet Service Provider (ISP).
  • An ISP in turn provides data communication services through the world wide packet data communication network now commonly referred to as the "Internet."
  • a local network and Internet both use electrical, electromagnetic or optical signals that carry digital data streams.
  • the signals through the various networks and the signals on network link 218 and through switch 216, which carry the digital data to and from CE device 115, are example forms of transmission media.
  • CE device 115 can send messages and receive data, including digital audio and video data and program code, through the network(s) and network link 218.
  • the received code may be executed by CPU 202 as it is received, and/or stored in system disk 214, or other non-volatile storage for later execution.
  • the received digital audio and video data may be displayed by display device 120 under the control of the CPU 202.
  • The connection topology, including the number and arrangement of bridges, the number of CPUs 202, and the number of display devices 120, may be modified as desired. For instance, in some embodiments, system memory 204 is connected to CPU 202 directly rather than through a bridge, and other devices communicate with system memory 204 via CPU 202.
  • the particular components shown herein are optional; for instance, any number of add-in cards or peripheral devices might be supported.
  • In other embodiments, switch 216 is eliminated, and network link 218, remote controller interface 215, and add-in card 220 connect directly to bridge 205.
  • Figure 3 sets forth a flow diagram of method steps for controlling the CE device 115, according to one embodiment of the present invention. Although the method steps are described in conjunction with the systems of Figures 1 and 2, persons skilled in the art will understand that any system configured to perform the method steps, in any order, is within the scope of the invention.
  • the method begins at step 300, where the CE device 115 receives a signal from the remote controller 110 that is configured to control the CE device 115.
  • the signal is generated in response to a user gesture that is applied to the touchpad 105 of the remote controller 110.
  • the CE device 115 determines an operational context based on a function being performed by the CE device 115.
  • The operational context may depend on the content displayed on the display device 120. For example, when a user views a webpage with a video or the CE device 115 is playing a video, the operational context is video playback. When a DVD is inserted into the CE device 115, the operational context is the DVD controls.
  • In response to the signal received from the remote controller 110, the CE device 115 outputs a context-specific user interface based on the operational context to the display device 120. Also in response to the signal received from the remote controller 110, at step 315 the CE device 115 outputs a cursor at a position that is controlled by the signal from the remote controller. The user interface is displayed by the display device 120 and the cursor is overlaid on the user interface at the position.
  • At step 320, the CE device 115 determines if the signal indicates that a gesture was applied to the touchpad 105 of the remote controller 110 by a user and, if not, at step 325 the CE device 115 determines if a timeout has expired.
  • a timeout counter is used to determine when display of the user interface and/or cursor should be discontinued due to inactivity.
  • A short duration of time, e.g., 500 ms or 2 seconds, between gestures or movement of the remote controller 110 may cause the timeout counter to expire. If, at step 325, the timeout has not expired, then the CE device 115 returns to step 320. Otherwise, the timeout has expired and, at step 335, the context-specific user interface and cursor are removed from the display device 120.
  • the context-specific user interface and cursor may fade out, slide off the screen, or otherwise disappear.
  • If, at step 320, the CE device 115 determines that the signal indicates that a gesture was applied to the touchpad 105 of the remote controller 110 by a user, then at step 330 the CE device 115 performs the operation specified by the gesture.
  • gestures include a single tap, double or multiple tap, and a directional swipe or stroke on the touchpad 105.
  • the cursor position on the display device 120 is updated based on the directional gestures.
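The Figure 3 flow, with its inactivity timeout, can be sketched as a small controller. The class and method names are assumptions; the 2-second window is one of the example durations given in the text:

```python
class UIController:
    """Sketch of the Figure 3 loop: show the overlay when a signal
    arrives, refresh the timeout on each signal, and remove the
    overlay when the timeout expires due to inactivity."""

    def __init__(self, timeout_s=2.0):
        self.timeout_s = timeout_s
        self.visible = False
        self.deadline = 0.0

    def on_signal(self, now, gesture=None):
        # Any signal displays the UI and cursor and resets the
        # inactivity timer; a gesture would also trigger its
        # operation (step 330, elided here).
        self.visible = True
        self.deadline = now + self.timeout_s

    def tick(self, now):
        # With no new gesture, check the timeout (step 325) and
        # remove the UI and cursor once it expires (step 335).
        if self.visible and now >= self.deadline:
            self.visible = False
```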
  • Figure 4A illustrates a context-specific user interface 405 that is displayed by the display device 120, according to one embodiment of the present invention.
  • the context-specific user interface 405 may include graphical elements for transport controls, e.g., Play, Stop, Pause, FF (fast forward), RW (rewind), Next, Prev (previous), and DVD Menu buttons.
  • a gesture such as a double-tap anywhere in the context-specific user interface 405 may activate the play function and a triple-tap may activate the fast-forward function.
  • the graphical elements for Stop, Play, and Pause transport controls 410 are shown in Figure 4A.
  • Figure 4B illustrates a context-specific user interface 405 that is displayed by the display device 120, according to one embodiment of the present invention.
  • Clickable controls may be highlighted as the cursor passes over the graphical elements for the control, as shown in Figure 4B by a selection indication 422 when the cursor 420 is positioned over the graphical element for the Play control.
  • because the graphical elements for controls are depicted on the display device 120, operation in a dark room requires no recall of the layout of specific buttons on the remote controller 110.
  • the graphical elements for controls are visible on the display device 120, and relative motion gestures are applied to the touchpad 105 to control the CE device 115.
  • Figure 4C illustrates a context-specific user interface 435 that is displayed by the display device 120, according to one embodiment of the present invention.
  • the context-specific user interface may include graphical elements for the player setup controls, the eject control, and the power control.
  • the user interface 435 includes setup controls 430 for selecting various soundtrack options. Functions, such as choosing a soundtrack or subtitle overlay, may be difficult to locate by navigating system submenus of conventional CE devices. Alternatively, these functions may be selected by toggling a special button on a conventional remote control device (one of dozens of special-purpose buttons). A user may easily access these functions using the remote controller 110 since the functions can be exposed much more directly by a context-specific user interface that includes the setup controls 430 as on-screen choices overlaying the media content.
  • Figure 4D illustrates a context-specific user interface 445 that is displayed by the display device 120, according to one embodiment of the present invention.
  • an alternative to including transport controls (Play, Stop, Pause, FF, RW, Next, Prev) in the context-specific user interface 445 is to include graphical elements for a timeline bar 440 on the display device 120, with a "now" icon 455 representing the current display point.
  • a user may directly drag (via the cursor 420) the now icon 455 along the timeline bar 440 to move to a different position along the timeline bar 440.
  • a gesture such as a double-tap anywhere in the context-specific user interface 455 may activate the play function and a triple-tap may activate the fast-forward function.
  • Chapter or index marks such as index mark 450 may be displayed on the timeline bar 440 to represent likely points in time to skip forward to or to rewind to.
  • Gestures applied by a user to the touchpad 105, such as dragging forward or backward along the timeline bar 440, may be modified near index marks so that the index marks exert "gravity" or "magnetism." The gravity or magnetism attracts the now icon 455, snapping the now icon 455 to an index mark when the now icon 455 is moved near to the index mark.
  • commercial content may be highlighted or otherwise distinguished graphically on the timeline bar 440 in order to allow the user to navigate past or to the commercial content.
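The "gravity" behavior of index marks can be sketched as snapping a dragged position to the nearest mark within a small radius (the radius value and function shape are assumptions; only the snapping behavior comes from the description):

```python
def snap_to_index(position, index_marks, radius=5.0):
    """Snap a dragged 'now' position (in seconds) to the nearest index
    mark within `radius` seconds; otherwise leave it unchanged.

    Hypothetical sketch of the gravity/magnetism effect on the now icon.
    """
    nearest = min(index_marks, key=lambda m: abs(m - position), default=None)
    if nearest is not None and abs(nearest - position) <= radius:
        return nearest
    return position
```

With this, dragging the now icon to within a few seconds of a chapter mark would pull it exactly onto the mark, while positions far from any mark are left as dragged.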
  • Figure 4E illustrates the context-specific user interface 445 including a parallel timeline bar, according to one embodiment of the present invention.
  • Alternative angle scenes or variations may be displayed in the user interface 445 as parallel timeline bars to the timeline bar 440.
  • a parallel timeline bar 460 branches and rejoins the timeline bar 440 to represent a time region for which different angle scenes of the digital video and audio content are available.
  • the parallel timeline bar 460 may include content such as alternative endings, or even director's and actors' commentary.
  • the parallel timeline bar 460 above the timeline bar 440 may represent the program with director's commentary. Clicking on the parallel timeline bar 460 controls the CE device 115 to output the director's commentary version of the digital video and audio content for display on the display device 120. From two seconds before until ten seconds after the beginning of a section of the timeline bar 440 with an alternative scene angle, the parallel timeline bar 460 may fade in automatically, showing the parallel timeline bar 460 branching off the primary timeline bar 440.
  • Clicking on the parallel timeline bar 460 may select an alternative scene angle and control the CE device 115 to output the alternative scene angle of the digital video and audio content for display on the display device 120.
  • Clicking back on the primary timeline bar 440 controls the CE device 115 to output the primary scene angle of the digital video and audio content for display on the display device 120.
  • a parallel timeline bar may fade into the context-specific user interface 445 so that the user can select the alternative ending for display.
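The automatic fade-in window described above, from two seconds before until ten seconds after the branch point, could be checked with a small predicate (the function itself is hypothetical; only the window bounds come from the description):

```python
def parallel_bar_visible(now, branch_start, pre=2.0, post=10.0):
    """True while the parallel timeline bar should be faded in: from
    `pre` seconds before to `post` seconds after the branch point.

    All times are in seconds; an illustrative sketch, not the patent's
    implementation.
    """
    return branch_start - pre <= now <= branch_start + post
```

The renderer would evaluate this each frame against the current playback position and fade the parallel bar in or out as the result changes.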
  • Figure 5 illustrates the remote controller 110 of Figure 1, according to one embodiment of the present invention. Some hard buttons, such as physical controls 510, may be included in the remote controller 110 in addition to the touchpad 105.
  • buttons such as Volume Up/Down and Mute are included as physical controls 510 and are easy to feel in the dark.
  • by using the physical controls 510, a user avoids having a context-specific user interface displayed that may obscure the content displayed on the display device 120.
  • Specific zones on the touchpad 105 may serve as dedicated buttons, shown as virtual controls 520.
  • the virtual controls 520 may be identified by texture imprinted on the touchpad 105, printing (silkscreen, offset, etc.) on the touchpad 105, or selective backlighting.
  • a user may stroke up or down along the right edge or other designated zone of the touchpad 105 to manage volume, e.g., upward strokes increase volume and downward strokes decrease volume.
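The right-edge volume zone could be sketched as mapping vertical strokes in a designated strip of the touchpad to volume changes (the zone width and gain constants are illustrative assumptions):

```python
def volume_delta(x, dy, pad_width, zone_frac=0.15, gain=0.5):
    """Return a volume change for a stroke at horizontal position `x`
    with vertical displacement `dy` (negative = upward, as in screen
    coordinates). Strokes outside the right-edge zone are ignored.

    Hypothetical sketch; the constants are not from the description.
    """
    in_zone = x >= pad_width * (1.0 - zone_frac)
    if not in_zone:
        return 0.0
    return -dy * gain  # upward stroke (dy < 0) increases volume
```

An upward stroke along the right edge thus yields a positive delta (volume up), a downward stroke a negative delta, and strokes elsewhere on the touchpad leave the volume untouched.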
  • a small wheel or roller on the side or top of the remote controller 110 is used to manage volume and control the timeline bar.
  • the remote controller 110 may also be configured to emit sound to indicate an error, such as when a user attempts to manipulate a control that is inappropriate for the current operational context, e.g., attempting to select "Play" when a video is already playing. Sounds may also be used for aural feedback of clicking or dragging gestures, or to indicate when the movable cursor is positioned to select a graphic element for a control.
  • the context-specific user interface includes the controls that are relevant to the operational context of the CE device 115.
  • Using a pointer device, such as the cursor that is controlled by user gestures, provides an intuitive mechanism for controlling the CE device 115.
  • the remote controller 110 is very simple and can be easily operated in the dark or with no additional lighting. Also, the remote controller 110 is “always on” and does not need to be “woken up” or enabled by entering a password or other operation to "power up” the device.
  • Another advantage of a pointing interface, such as that provided by a cursor having a position controlled by the remote controller 110, is the ability to navigate a dense, rich set of options and alternatives, in a manner similar to a website. Imagine 100,000 movie titles available to play on a streaming Internet movie player. The ability to move the cursor in any direction by applying a gesture to the touchpad 105 provides the user with an intuitive interface to browse, select categories, and pick and choose rich meta-content. Content developed for viewing on a personal computer and navigation using a pointer device may be viewed on the CE device 115 and navigated using the context-specific user interface and remote controller 110.
  • aspects of the present invention may be implemented in hardware or software or in a combination of hardware and software.
  • One embodiment of the invention may be implemented as a program product for use with a computer system.
  • the program(s) of the program product define functions of the embodiments (including the methods described herein) and can be contained on a variety of computer-readable storage media.
  • Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, flash memory, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)
  • Details Of Television Systems (AREA)
  • Position Input By Displaying (AREA)
PCT/US2011/063583 2010-12-06 2011-12-06 User interface for a remote control device Ceased WO2012078665A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CA2819709A CA2819709C (en) 2010-12-06 2011-12-06 User interface for a remote control device
BR112013013945-5A BR112013013945B1 (pt) 2010-12-06 2011-12-06 Método implementado por computador e sistema para controlar um dispositivo eletrônico de consumidor, e mídia nãotransitória legível por computador
JP2013543286A JP5770856B2 (ja) 2010-12-06 2011-12-06 遠隔制御部のためのユーザーインターフェース
EP11846635.8A EP2649501A4 (en) 2010-12-06 2011-12-06 USER INTERFACE FOR A REMOTE CONTROL DEVICE
KR1020137017617A KR101525091B1 (ko) 2010-12-06 2011-12-06 원격 제어 디바이스에 대한 사용자 인터페이스
MX2013006311A MX2013006311A (es) 2010-12-06 2011-12-06 Interfaz de usuario para un dispositivo de control remoto.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/961,387 US8963847B2 (en) 2010-12-06 2010-12-06 User interface for a remote control device
US12/961,387 2010-12-06

Publications (1)

Publication Number Publication Date
WO2012078665A1 true WO2012078665A1 (en) 2012-06-14

Family

ID=46161777

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/063583 Ceased WO2012078665A1 (en) 2010-12-06 2011-12-06 User interface for a remote control device

Country Status (8)

Country Link
US (2) US8963847B2 (en)
EP (1) EP2649501A4 (en)
JP (1) JP5770856B2 (en)
KR (1) KR101525091B1 (en)
BR (1) BR112013013945B1 (en)
CA (1) CA2819709C (en)
MX (1) MX2013006311A (en)
WO (1) WO2012078665A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015130015A (ja) * 2014-01-06 2015-07-16 コニカミノルタ株式会社 オブジェクトの停止位置制御方法、操作表示装置およびプログラム

Families Citing this family (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9239890B2 (en) 2011-05-31 2016-01-19 Fanhattan, Inc. System and method for carousel context switching
US9778818B2 (en) 2011-05-31 2017-10-03 Fanhattan, Inc. System and method for pyramidal navigation
CN103049997B (zh) * 2011-10-11 2016-01-27 Lg电子株式会社 遥控器以及多媒体设备的控制方法
US9131327B2 (en) * 2011-10-12 2015-09-08 Blackberry Limited Methods and apparatus to control accessories
EP2595399A1 (en) * 2011-11-16 2013-05-22 Thomson Licensing Method of digital content version switching and corresponding device
TWI448961B (zh) * 2011-12-16 2014-08-11 Wistron Neweb Corp 電子裝置及其操控方法
US9146616B2 (en) * 2012-01-10 2015-09-29 Fanhattan Inc. Touch-enabled remote control
KR101661526B1 (ko) * 2012-04-08 2016-10-04 삼성전자주식회사 플렉서블 디스플레이 장치 및 그 ui 방법
CN102905183A (zh) * 2012-10-11 2013-01-30 中兴通讯股份有限公司 一种实现电视分屏观看的方法、机顶盒及电视系统
KR102052960B1 (ko) * 2012-11-23 2019-12-06 삼성전자주식회사 입력장치, 디스플레이장치, 디스플레이시스템 및 그 제어 방법
US9591339B1 (en) 2012-11-27 2017-03-07 Apple Inc. Agnostic media delivery system
US9774917B1 (en) 2012-12-10 2017-09-26 Apple Inc. Channel bar user interface
US10200761B1 (en) 2012-12-13 2019-02-05 Apple Inc. TV side bar user interface
US9532111B1 (en) 2012-12-18 2016-12-27 Apple Inc. Devices and method for providing remote control hints on a display
US10521188B1 (en) 2012-12-31 2019-12-31 Apple Inc. Multi-user TV user interface
KR102044826B1 (ko) * 2013-01-02 2019-11-14 삼성전자 주식회사 마우스 기능 제공 방법 및 이를 구현하는 단말
US9143715B2 (en) * 2013-03-14 2015-09-22 Intel Corporation Remote control with capacitive touchpad
US12149779B2 (en) 2013-03-15 2024-11-19 Apple Inc. Advertisement user interface
US9858050B2 (en) 2013-07-02 2018-01-02 Youi Labs Inc. System and method for streamlining user interface development
WO2015052961A1 (ja) * 2013-10-08 2015-04-16 株式会社ソニー・コンピュータエンタテインメント 情報処理装置
GB201408258D0 (en) 2014-05-09 2014-06-25 British Sky Broadcasting Ltd Television display and remote control
AU2015280257B2 (en) 2014-06-24 2017-08-24 Apple Inc. Character recognition on a computing device
CN106415475A (zh) 2014-06-24 2017-02-15 苹果公司 用于在用户界面中导航的列界面
EP3126952B1 (en) 2014-06-24 2023-07-12 Apple Inc. Input device and user interface interactions
US9588625B2 (en) * 2014-08-15 2017-03-07 Google Inc. Interactive textiles
GB2532405A (en) * 2014-09-11 2016-05-25 Piksel Inc Configuration of user Interface
KR101635614B1 (ko) * 2014-11-27 2016-07-01 이동섭 전자기기의 제어를 위한 스마트 단말의 제어 인터페이스 설정 방법
KR20160097520A (ko) * 2015-02-09 2016-08-18 삼성전자주식회사 디스플레이 장치 및 그의 ui 디스플레이 방법
KR102250091B1 (ko) * 2015-02-11 2021-05-10 삼성전자주식회사 디스플레이 장치 및 디스플레이 방법
US9874952B2 (en) 2015-06-11 2018-01-23 Honda Motor Co., Ltd. Vehicle user interface (UI) management
AU2016100651B4 (en) 2015-06-18 2016-08-18 Apple Inc. Device, method, and graphical user interface for navigating media content
US9652125B2 (en) 2015-06-18 2017-05-16 Apple Inc. Device, method, and graphical user interface for navigating media content
US9781468B2 (en) 2015-08-25 2017-10-03 Echostar Technologies L.L.C. Dynamic scaling of touchpad/UI grid size relationship within a user interface
US9826187B2 (en) * 2015-08-25 2017-11-21 Echostar Technologies L.L.C. Combined absolute/relative touchpad navigation
US10331312B2 (en) * 2015-09-08 2019-06-25 Apple Inc. Intelligent automated assistant in a media environment
US9928029B2 (en) 2015-09-08 2018-03-27 Apple Inc. Device, method, and graphical user interface for providing audiovisual feedback
US9990113B2 (en) 2015-09-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control
GB2552273A (en) * 2015-11-09 2018-01-17 Sky Cp Ltd Television User Interface
KR102395701B1 (ko) * 2015-11-11 2022-05-10 삼성전자주식회사 전자 장치 및 전자 장치의 제어방법
US9959343B2 (en) * 2016-01-04 2018-05-01 Gracenote, Inc. Generating and distributing a replacement playlist
CN105872737B (zh) * 2016-04-18 2019-04-02 高创(苏州)电子有限公司 媒体播放的控制方法及媒体源设备、媒体播放设备及系统
DK201670581A1 (en) 2016-06-12 2018-01-08 Apple Inc Device-level authorization for viewing content
DK201670582A1 (en) 2016-06-12 2018-01-02 Apple Inc Identifying applications on which content is available
US11966560B2 (en) 2016-10-26 2024-04-23 Apple Inc. User interfaces for browsing content from multiple content applications on an electronic device
US10489106B2 (en) 2016-12-31 2019-11-26 Spotify Ab Media content playback during travel
US11514098B2 (en) 2016-12-31 2022-11-29 Spotify Ab Playlist trailers for media content playback during travel
US10747423B2 (en) 2016-12-31 2020-08-18 Spotify Ab User interface for media content playback
CN107085508B (zh) * 2017-03-27 2020-03-24 联想(北京)有限公司 一种信息处理方法及电子设备
US10585637B2 (en) 2017-03-27 2020-03-10 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
US12307082B2 (en) 2018-02-21 2025-05-20 Apple Inc. Scrollable set of content items with locking feature
DK180639B1 (en) 2018-06-01 2021-11-04 Apple Inc DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT
US11922006B2 (en) 2018-06-03 2024-03-05 Apple Inc. Media control for screensavers on an electronic device
AU2019100574B4 (en) 2018-06-03 2020-02-20 Apple Inc. Setup procedures for an electronic device
CN110958473A (zh) * 2018-09-26 2020-04-03 深圳Tcl数字技术有限公司 一种低亮度环境下的遥控方法、电视机及存储介质
EP3928194A1 (en) 2019-03-24 2021-12-29 Apple Inc. User interfaces including selectable representations of content items
US12008232B2 (en) 2019-03-24 2024-06-11 Apple Inc. User interfaces for viewing and accessing content on an electronic device
US11683565B2 (en) 2019-03-24 2023-06-20 Apple Inc. User interfaces for interacting with channels that provide content that plays in a media browsing application
EP4443850A3 (en) 2019-03-24 2024-12-04 Apple Inc. User interfaces for a media browsing application
CN120595989A (zh) 2019-05-31 2025-09-05 苹果公司 用于播客浏览和回放应用程序的用户界面
US11863837B2 (en) 2019-05-31 2024-01-02 Apple Inc. Notification of augmented reality content on an electronic device
US11843838B2 (en) 2020-03-24 2023-12-12 Apple Inc. User interfaces for accessing episodes of a content series
US11899895B2 (en) 2020-06-21 2024-02-13 Apple Inc. User interfaces for setting up an electronic device
US11720229B2 (en) 2020-12-07 2023-08-08 Apple Inc. User interfaces for browsing and presenting content
US11934640B2 (en) 2021-01-29 2024-03-19 Apple Inc. User interfaces for record labels
US11762458B2 (en) * 2021-02-15 2023-09-19 Sony Group Corporation Media display device control based on eye gaze

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050188406A1 (en) * 2004-02-23 2005-08-25 Gielow Christopher C. System and method for managing applications and media content of a wireless communication device
US20080253735A1 (en) * 2007-04-16 2008-10-16 Adobe Systems Incorporated Changing video playback rate
US20080302582A1 (en) * 2000-03-15 2008-12-11 Logitech Europe S.A. Easy to Use and Intuitive User Interface for a Remote Control
US20100107107A1 (en) * 2006-09-14 2010-04-29 Kevin Corbett Apparatus, system and method for context and language specific data entry
US20100241699A1 (en) * 2009-03-20 2010-09-23 Muthukumarasamy Sivasubramanian Device-Based Control System

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05333839A (ja) 1992-06-03 1993-12-17 Sony Corp 画像表示装置
US5682326A (en) 1992-08-03 1997-10-28 Radius Inc. Desktop digital video processing system
KR0170326B1 (ko) 1994-07-27 1999-03-30 김광호 원격제어방법 및 그 장치
US5517257A (en) * 1995-03-28 1996-05-14 Microsoft Corporation Video control user interface for interactive television systems and method for controlling display of a video movie
JPH09305305A (ja) 1996-03-15 1997-11-28 Matsushita Electric Ind Co Ltd 画像表示装置およびその遠隔制御方法
JPH10112888A (ja) * 1996-08-14 1998-04-28 Sony Corp リモートコントロール装置
US8479122B2 (en) * 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US6330592B1 (en) 1998-12-05 2001-12-11 Vignette Corporation Method, memory, product, and code for displaying pre-customized content associated with visitor data
US6538665B2 (en) 1999-04-15 2003-03-25 Apple Computer, Inc. User interface for presenting media information
WO2001069567A2 (en) 2000-03-15 2001-09-20 Glen Mclean Harris State-based remote control system
US6765557B1 (en) 2000-04-10 2004-07-20 Interlink Electronics, Inc. Remote control having touch pad to screen mapping
JP4963141B2 (ja) 2000-04-27 2012-06-27 ソニー株式会社 情報提供装置および方法、並びにプログラム格納媒体
JP2004525675A (ja) 2001-01-24 2004-08-26 インターリンク エレクトロニクス インコーポレイテッド ゲーム及びホーム・エンターテイメント・デバイス遠隔制御
KR100811339B1 (ko) * 2001-10-11 2008-03-07 엘지전자 주식회사 그래픽 유저 인터페이스가 구현되는 원격제어 시스템 및방법
US7109974B2 (en) 2002-03-05 2006-09-19 Matsushita Electric Industrial Co., Ltd. Remote control system including an on-screen display (OSD)
KR101023699B1 (ko) * 2002-12-05 2011-03-25 엘지전자 주식회사 대화형 광디스크 장치에서의 재생 제어방법
JP4446728B2 (ja) 2002-12-17 2010-04-07 株式会社リコー 複数のマルチメディア文書に格納された情報の表示法
US7053965B1 (en) 2003-06-10 2006-05-30 Fan Nong-Qiang Remote control for controlling a computer using a screen of a television
US7603689B2 (en) * 2003-06-13 2009-10-13 Microsoft Corporation Fast start-up for digital video streams
KR100533675B1 (ko) 2003-07-24 2005-12-05 삼성전자주식회사 구조화된 데이터 포맷을 이용한 원격 제어 장치 및 제어방법
EP1667485A4 (en) 2003-08-29 2012-04-18 Panasonic Corp CONTROL DEVICE AND CONTROL PROCEDURE
JP4203741B2 (ja) 2003-09-25 2009-01-07 日本電気株式会社 データ再生装置およびデータ再生方法
JP2006094210A (ja) 2004-09-24 2006-04-06 Toshiba Corp 放送受信装置
JP2006147084A (ja) * 2004-11-22 2006-06-08 Funai Electric Co Ltd 光ディスク記録再生装置
US20060150225A1 (en) * 2005-01-05 2006-07-06 Microsoft Corporation Methods and systems for retaining and displaying pause buffer indicia across channel changes
US7788592B2 (en) 2005-01-12 2010-08-31 Microsoft Corporation Architecture and engine for time line based visualization of data
JP2007048383A (ja) 2005-08-10 2007-02-22 Matsushita Electric Ind Co Ltd 情報記録媒体およびその記録装置、記録方法、記録プログラム
US8837914B2 (en) * 2005-11-30 2014-09-16 Samsung Electronics Co., Ltd. Digital multimedia playback method and apparatus
US20080002021A1 (en) * 2006-06-30 2008-01-03 Guo Katherine H Method and apparatus for overlay-based enhanced TV service to 3G wireless handsets
US8564544B2 (en) * 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US20080062137A1 (en) * 2006-09-11 2008-03-13 Apple Computer, Inc. Touch actuation controller for multi-state media presentation
KR101368713B1 (ko) * 2006-11-20 2014-03-04 삼성전자주식회사 A/v기기 및 그 표시방법
US9767681B2 (en) * 2007-12-12 2017-09-19 Apple Inc. Handheld electronic devices with remote control functionality and gesture recognition
JP4513894B2 (ja) * 2008-05-16 2010-07-28 ソニー株式会社 画像処理装置、画像処理方法、画像再生装置、画像再生方法およびプログラム
KR101789619B1 (ko) * 2010-11-22 2017-10-25 엘지전자 주식회사 멀티미디어 장치에서 음성과 제스쳐를 이용한 제어 방법 및 그에 따른 멀티미디어 장치

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080302582A1 (en) * 2000-03-15 2008-12-11 Logitech Europe S.A. Easy to Use and Intuitive User Interface for a Remote Control
US20050188406A1 (en) * 2004-02-23 2005-08-25 Gielow Christopher C. System and method for managing applications and media content of a wireless communication device
US20100107107A1 (en) * 2006-09-14 2010-04-29 Kevin Corbett Apparatus, system and method for context and language specific data entry
US20080253735A1 (en) * 2007-04-16 2008-10-16 Adobe Systems Incorporated Changing video playback rate
US20100241699A1 (en) * 2009-03-20 2010-09-23 Muthukumarasamy Sivasubramanian Device-Based Control System

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015130015A (ja) * 2014-01-06 2015-07-16 コニカミノルタ株式会社 オブジェクトの停止位置制御方法、操作表示装置およびプログラム
US10338792B2 (en) 2014-01-06 2019-07-02 Konica Minolta, Inc. Object stop position control method, action indicating device, and program

Also Published As

Publication number Publication date
US9766772B2 (en) 2017-09-19
BR112013013945B1 (pt) 2021-08-17
MX2013006311A (es) 2013-07-29
US8963847B2 (en) 2015-02-24
US20150169172A1 (en) 2015-06-18
KR20130108636A (ko) 2013-10-04
KR101525091B1 (ko) 2015-06-02
BR112013013945A2 (pt) 2016-09-27
CA2819709C (en) 2016-08-02
JP5770856B2 (ja) 2015-08-26
JP2014500558A (ja) 2014-01-09
CA2819709A1 (en) 2012-06-14
EP2649501A1 (en) 2013-10-16
EP2649501A4 (en) 2015-07-01
US20120139847A1 (en) 2012-06-07

Similar Documents

Publication Publication Date Title
US9766772B2 (en) User interface for a remote control device
JP2014500558A5 (en)
CN114302210B (zh) 用于查看和访问电子设备上的内容的用户界面
KR101364849B1 (ko) 방향성 터치 원격 제어장치
US9009594B2 (en) Content gestures
US20100026640A1 (en) Electronic apparatus and method for implementing user interface
US20130127731A1 (en) Remote controller, and system and method using the same
CN102622868B (zh) 一种遥控控制方法、显示控制装置、遥控器及系统
US20150163443A1 (en) Display apparatus, remote controller, display system, and display method
US8749426B1 (en) User interface and pointing device for a consumer electronics device
US20170180777A1 (en) Display apparatus, remote control apparatus, and control method thereof
TWI493407B (zh) Multi - function touchpad remote control and its control method
HK1167027B (en) Directional touch remote
HK1171828A (en) Audio/visual device graphical user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11846635

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2819709

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: MX/A/2013/006311

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2013543286

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2011846635

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20137017617

Country of ref document: KR

Kind code of ref document: A

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112013013945

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112013013945

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20130605