US20210326013A1 - Menu modification based on controller manipulation data - Google Patents

Menu modification based on controller manipulation data

Info

Publication number
US20210326013A1
Authority
US
United States
Prior art keywords
media
sequential movement
movement profile
data
sequential
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/302,809
Inventor
Paul Streit
Marc Stoksik
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OpenTV Inc
Original Assignee
OpenTV Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OpenTV Inc filed Critical OpenTV Inc
Priority to US17/302,809
Publication of US20210326013A1
Assigned to OPENTV, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: STOKSIK, MARC, STREIT, PAUL

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N 21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N 21/4222 Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/482 End-user interface for program selection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24 Querying
    • G06F 16/245 Query processing
    • G06F 16/2455 Query execution
    • G06F 16/24568 Data stream processing; Continuous queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N 21/258 Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N 21/25866 Management of end-user data
    • H04N 21/25891 Management of end-user data being end-user preferences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N 21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N 21/42222 Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/441 Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration

Definitions

  • the subject matter disclosed herein generally relates to the technical field of special-purpose machines that facilitate generation and presentation of graphical user interfaces, including software-configured computerized variants of such special-purpose machines and improvements to such variants, and to the technologies by which such special-purpose machines become improved compared to other special-purpose machines that facilitate generation and presentation of graphical user interfaces.
  • the present disclosure addresses systems and methods to facilitate menu modification based on controller manipulation data.
  • a machine may be configured to interact with one or more users by causing a graphical user interface to be generated, causing the graphical user interface to be presented to the one or more users, or both.
  • the graphical user interface may be or include a menu of items that are each separately selectable by the user.
  • at least a portion of an electronic programming guide may be included in a menu presented within a graphical user interface, such that the included portion lists various pieces of media content (e.g., streams of media content, pieces of media content, or both) available for selection by the user (e.g., for presentation by a media player device, for recording by a media storage device, or for indication in a media preference profile).
  • the graphical user interface may enable the user to select media content (e.g., one or more pieces of media content or streams of media content) by indicating which media content is selected or is to be selected.
  • FIG. 1 is a network diagram illustrating a network environment suitable for menu modification based on controller manipulation data, according to some example embodiments.
  • FIG. 2 is a block diagram illustrating components of a media server machine suitable for menu modification based on controller manipulation data, according to some example embodiments.
  • FIG. 3 is a block diagram illustrating components of a media device in the network environment, according to some example embodiments.
  • FIG. 4 is a block diagram illustrating components of a controller device suitable for menu modification based on controller manipulation data, according to some example embodiments.
  • FIGS. 5-7 are flowcharts illustrating operations in a method of menu modification based on controller manipulation data, according to some example embodiments.
  • FIG. 8 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.
  • Example methods facilitate menu modification based on controller manipulation data, and example systems (e.g., special-purpose machines configured by special-purpose software) are configured to facilitate menu modification based on controller manipulation data. Examples merely typify possible variations.
  • Unless explicitly stated otherwise, structures (e.g., structural components, such as modules) are optional and may be combined or subdivided, and operations (e.g., in a procedure, algorithm, or other function) may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of various example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
  • A machine (e.g., a server machine, such as a media server machine) may cause a media device (e.g., a set-top box or television set) to display at least a portion of a menu of media streams that are available to be selected (e.g., for playback by the media device).
  • the machine accesses (e.g., receives, retrieves, or reads) controller manipulation data that has been generated by a controller device (e.g., a remote control or a smart phone configured to function as a remote control) that fully or partially controls the media device.
  • the controller manipulation data indicates a sequence of physical manipulations (e.g., translational or angular movements) experienced by the controller device during operation by a user in selecting one or more media streams (e.g., for playback by the media device).
  • the machine selects a profile identifier from a set of profile identifiers. For example, the selection of the profile identifier may be based on a comparison of the controller manipulation data to a controller manipulation profile that corresponds to the profile identifier (e.g., among multiple controller manipulation profiles that each correspond to a different profile identifier among multiple profile identifiers).
  • the machine next selects a first subset (e.g., first portion) of the menu of media streams available to be selected, and the selection of the first subset may be based on the selected profile identifier.
  • the first subset of the menu indicates specific media streams to be hidden from view (e.g., visually omitted from the menu, from the graphical user interface, or from both).
  • the first subset of the menu contrasts with a second subset of the menu to be preserved in view (e.g., visually maintained in the menu, in the graphical user interface, or in both). Accordingly, the selected first subset of the menu has no overlap with the second subset of the menu.
  • the machine then causes the media device (e.g., via command or other communication) to modify the menu of media streams by omitting the first subset of the menu from the displayed portion of the menu while continuing to display the second subset of the menu in the displayed portion of the menu.
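  • To make the flow above concrete, the following minimal Python sketch illustrates one way such a machine might select a profile identifier from controller manipulation data and then compute the modified menu. It is an illustrative sketch only: the names (ManipulationProfile, similarity, select_profile_identifier, modify_menu) and the similarity measure are hypothetical assumptions, not drawn from the disclosure.

    from dataclasses import dataclass

    @dataclass
    class ManipulationProfile:
        profile_id: str
        reference_features: list  # reference sequence of manipulation features

    def similarity(recorded, reference):
        # Hypothetical score: inverse of the mean absolute difference over the
        # overlapping portion of the two feature sequences.
        n = min(len(recorded), len(reference))
        if n == 0:
            return 0.0
        mean_diff = sum(abs(a - b) for a, b in zip(recorded, reference)) / n
        return 1.0 / (1.0 + mean_diff)

    def select_profile_identifier(recorded, profiles):
        # Compare the recorded controller manipulation data against each
        # controller manipulation profile; pick the best-matching identifier.
        best = max(profiles, key=lambda p: similarity(recorded, p.reference_features))
        return best.profile_id

    def modify_menu(menu, first_subset):
        # Omit the first subset (hidden) while preserving the non-overlapping
        # second subset (everything else) in the displayed portion of the menu.
        return [item for item in menu if item not in first_subset]
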
  • FIG. 1 is a network diagram illustrating a network environment 100 suitable for menu modification based on controller manipulation data, according to some example embodiments.
  • the network environment 100 includes a media server machine 110 , a database 115 , user devices 130 and 150 , a media device 140 , and a controller device 141 .
  • the media server machine 110 , the database 115 , and the media device 140 are shown as being communicatively coupled to each other via a network 190 .
  • the controller device 141 is communicatively coupled to the media device 140 (e.g., via the network 190 or via a different network or communication path, such as infrared signaling or other wireless signaling).
  • the user devices 130 and 150 may each be communicatively coupled to the controller device 141 , the media device 140 , or both (e.g., via the network 190 or via a different network or communication path, such as infrared signaling or other wireless signaling).
  • the media server machine 110 may form all or part of a cloud 118 (e.g., a geographically distributed set of multiple machines configured to function as a single server), which may form all or part of a network-based system 105 (e.g., a cloud-based server system configured to provide one or more network-based services to the user devices 130 and 150 ).
  • the media server machine 110 , the database 115 , the media device 140 , the controller device 141 , and the user devices 130 and 150 may each be implemented in a special-purpose (e.g., specialized) computer system, in whole or in part, as described below with respect to FIG. 8 .
  • users 132 and 152 are also shown in FIG. 1 .
  • One or both of the users 132 and 152 may be a human user (e.g., a human being), a machine user (e.g., a computer configured by a software program to interact with the user device 130 or the user device 150 ), or any suitable combination thereof (e.g., a human assisted by a machine or a machine supervised by a human).
  • the user 132 is associated with the user device 130 and may be a user of the user device 130 .
  • the user device 130 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, a smart phone, or a wearable device (e.g., a smart watch, smart glasses, smart clothing, or smart jewelry) belonging to the user 132 .
  • the user 152 is associated with the user device 150 and may be a user of the user device 150 .
  • the user device 150 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, a smart phone, or a wearable device (e.g., a smart watch, smart glasses, smart clothing, or smart jewelry) belonging to the user 152 .
  • any of the systems or machines (e.g., databases and devices) shown in FIG. 1 may be, include, or otherwise be implemented in a special-purpose (e.g., specialized or otherwise non-conventional and non-generic) computer that has been modified to perform one or more of the functions described herein for that system or machine (e.g., configured or programmed by special-purpose software, such as one or more software modules of a special-purpose application, operating system, firmware, middleware, or other software program).
  • special-purpose software such as one or more software modules of a special-purpose application, operating system, firmware, middleware, or other software program.
  • a special-purpose computer system able to implement any one or more of the methodologies described herein is discussed below with respect to FIG. 8 , and such a special-purpose computer may accordingly be a means for performing any one or more of the methodologies discussed herein.
  • a special-purpose computer that has been specially modified (e.g., configured by special-purpose software) by the structures discussed herein to perform the functions discussed herein is technically improved compared to other special-purpose computers that lack the structures discussed herein or are otherwise unable to perform the functions discussed herein. Accordingly, a special-purpose machine configured according to the systems and methods discussed herein provides an improvement to the technology of similar special-purpose machines.
  • a “database” is a data storage resource and may store data structured as a text file, a table, a spreadsheet, a relational database (e.g., an object-relational database), a triple store, a hierarchical data store, or any suitable combination thereof.
  • any two or more of the systems or machines illustrated in FIG. 1 may be combined into a single system or machine, and the functions described herein for any single system or machine may be subdivided among multiple systems or machines.
  • the network 190 may be any network that enables communication between or among systems, machines, databases, and devices (e.g., between the media server machine 110 and the media device 140 ). Accordingly, the network 190 may be a wired network, a wireless network (e.g., a mobile or cellular network), or any suitable combination thereof. The network 190 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof.
  • the network 190 may include one or more portions that incorporate a local area network (LAN), a wide area network (WAN), the Internet, a mobile telephone network (e.g., a cellular network), a wired telephone network (e.g., a plain old telephone system (POTS) network), a wireless data network (e.g., a WiFi network or WiMax network), or any suitable combination thereof. Any one or more portions of the network 190 may communicate information via a transmission medium.
  • A "transmission medium" refers to any intangible (e.g., transitory) medium that is capable of communicating (e.g., transmitting) instructions for execution by a machine (e.g., by one or more processors of such a machine), and includes digital or analog communication signals or other intangible media to facilitate communication of such software.
  • FIG. 2 is a block diagram illustrating components of the media server machine 110 , according to some example embodiments (e.g., server-side implementations).
  • the media server machine 110 is shown as including a display controller 210 , a controller interface 220 , a profile selector 230 , a menu modifier 240 , and a device detector 250 , all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
  • the display controller 210 may be or include a display module or other code for controlling a display (e.g., by controlling the media device 140 or by controlling a display screen within or communicatively coupled to the media device 140 ).
  • the controller interface 220 may be or include an access module or other code for accessing one or more controller devices (e.g., the controller device 141 or another remote control device).
  • the profile selector 230 may be or include a selection module or other code for selecting a profile.
  • the menu modifier 240 may be or include a modification module or other code for modifying a menu (e.g., within a graphical user interface).
  • the device detector 250 may be or include a detection module or other code for detecting one or more devices (e.g., user devices 130 and 150 ).
  • the display controller 210 may form all or part of an application 200 (e.g., a server-side software application) that is stored (e.g., installed) on the media server machine 110 (e.g., responsive to or otherwise as a result of data being received from the database 115 , the media device 140 , the controller device 141 , the user device 130 , the user device 150 , or another data repository).
  • processors 299 may be included (e.g., temporarily or permanently) in the application 200 , the display controller 210 , the controller interface 220 , the profile selector 230 , the menu modifier 240 , the device detector 250 , or any suitable combination thereof.
  • FIG. 3 is a block diagram illustrating components of the media device 140 , according to some example embodiments (e.g., client-side implementations).
  • the media device 140 is shown as including the display controller 210 (e.g., an instance thereof), the controller interface 220 (e.g., an instance thereof), the profile selector 230 (e.g., an instance thereof), the menu modifier 240 (e.g., an instance thereof), and the device detector 250 (e.g., an instance thereof), all configured to communicate with each other (e.g., via a bus, shared memory, or a switch).
  • the display controller 210 may be or include a display module or other code for controlling a display (e.g., by controlling a display screen within or communicatively coupled to the media device 140 ).
  • the controller interface 220 may be or include an access module or other code for accessing one or more controller devices (e.g., the controller device 141 ).
  • the profile selector 230 may be or include a selection module or other code for selecting a profile.
  • the menu modifier 240 may be or include a modification module or other code for modifying a menu (e.g., within a graphical user interface presented by the media device 140 or a display screen thereof).
  • the device detector 250 may be or include a detection module or other code for detecting one or more devices (e.g., user devices 130 and 150 ).
  • the display controller 210 may form all or part of an app 300 (e.g., a mobile app or other client-side app) that is stored (e.g., installed) on the media device 140 (e.g., responsive to or otherwise as a result of data being received from the media server machine 110 , the database 115 , the controller device 141 , the user device 130 , the user device 150 , or another data repository).
  • processors 299 may be included (e.g., temporarily or permanently) in the app 300 , the display controller 210 , the controller interface 220 , the profile selector 230 , the menu modifier 240 , the device detector 250 , or any suitable combination thereof.
  • FIG. 4 is a block diagram illustrating components of the controller device 141 , according to some example embodiments.
  • the controller device 141 is shown as including control elements 410 , 420 , 430 , and 440 , as well as an accelerometer 450 and the device detector 250 (e.g., an instance thereof).
  • the control elements 410 , 420 , 430 , and 440 each may be or include an activatable control (e.g., a software or hardware button, switch, slider, dial, knob, or other manipulable user interface element) that can be operated by a user (e.g., the user 132 or 152 ).
  • For example, the control element 410 may be a graphical channel change button (e.g., channel up button or channel down button) within a graphical user interface 402; the control element 420 may be a graphical volume change button (e.g., volume up button or volume down button) within the graphical user interface 402; the control element 430 may be a hardware channel change button (e.g., channel up button or channel down button) on the exterior of the controller device 141; and the control element 440 may be a hardware volume change button (e.g., volume up button or volume down button) on the exterior of the controller device 141.
  • the control element 410 , the control element 420 , or both may form all or part of the graphical user interface 402 .
  • the control element 410 , the control element 420 , the device detector 250 , or any suitable combination thereof may form all or part of an app 400 (e.g., a mobile app or other client-side app) that is stored (e.g., installed) on the controller device 141 (e.g., in response to or otherwise as a result of data being received from the media server machine 110 , the database 115 , the media device 140 , the user device 130 , the user device 150 , or another data repository).
  • the accelerometer 450 may be or include a set of acceleration sensors (e.g., one or more hardware accelerometers).
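  • As an illustration of how controller manipulation data from such control elements and the accelerometer 450 might be represented, the following hypothetical Python records sketch one possible structure; the type and field names are assumptions for illustration, not part of the disclosure.

    from dataclasses import dataclass

    @dataclass
    class ActivationEvent:
        control_element: str  # e.g., "410" for the graphical channel change button
        timestamp_ms: int     # when the control element was activated

    @dataclass
    class AccelerometerSample:
        timestamp_ms: int
        ax: float  # acceleration along the x axis
        ay: float  # acceleration along the y axis
        az: float  # acceleration along the z axis

    @dataclass
    class ControllerManipulationData:
        activations: list    # sequence of ActivationEvent records
        accelerations: list  # sequence of AccelerometerSample records
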
  • any one or more of the components (e.g., modules) described herein may be implemented using hardware alone (e.g., one or more of the processors 299 ) or a combination of hardware and software.
  • any component described herein may physically include an arrangement of one or more of the processors 299 (e.g., a subset of or among the processors 299 ) configured to perform the operations described herein for that component.
  • any component described herein may include software, hardware, or both, that configure an arrangement of one or more of the processors 299 to perform the operations described herein for that component.
  • different components described herein may include and configure different arrangements of the processors 299 at different points in time or a single arrangement of the processors 299 at different points in time.
  • Each component (e.g., module) described herein is an example of a means for performing the operations described herein for that component.
  • any two or more components described herein may be combined into a single component, and the functions described herein for a single component may be subdivided among multiple components.
  • Moreover, components described herein as being implemented within a single system or machine (e.g., a single device) may be distributed across multiple systems or machines (e.g., multiple devices).
  • FIGS. 5-7 are flowcharts illustrating operations in a method 500 of menu modification based on controller manipulation data, according to some example embodiments.
  • Operations in the method 500 may be performed by the media server machine 110 , the media device 140 , or any suitable combination thereof, using components (e.g., modules) described above (e.g., with respect to at least FIGS. 2 and 3 ), using one or more processors (e.g., microprocessors or other hardware processors), or using any suitable combination thereof.
  • the method 500 includes operations 510 , 520 , 530 , 540 , and 550 .
  • the display controller 210 causes the media device 140 to display at least a portion of a menu of menu items (e.g., within a graphical user interface presented on a display screen fully or partially controlled by the media device 140 ).
  • the media device 140 may be caused to display a menu of media streams that are each individually selectable (e.g., for playback by the media device 140 ).
  • the controller interface 220 accesses controller manipulation data generated by the controller device 141 .
  • the controller manipulation data may be stored by the controller device 141 itself, the database 115 , the media server machine 110 , or any suitable combination thereof, and accessed therefrom.
  • the controller manipulation data indicates a sequence of multiple physical manipulations (e.g., translational or angular movements) experienced by the controller device 141 during operation of the controller device 141 (e.g., in selecting one or more menu items, such as media streams selected for playback by the media device 140 ).
  • the profile selector 230 selects (e.g., indicates, designates, specifies, or otherwise denotes as being selected) a profile identifier from a set of profile identifiers.
  • the set of profile identifiers may be stored by the controller device 141 , the database 115 , the media server machine 110 , the media device 140 , or any suitable combination thereof, and accessed therefrom.
  • the profile selector 230 may therefore access the set of profile identifiers and accordingly select a profile identifier from among them.
  • the selection of the profile identifier may be based on one or more comparisons of the controller manipulation data accessed in operation 520 to one or more controller manipulation profiles.
  • the profile identifier may be selected based on a comparison of the controller manipulation data accessed in operation 520 to a controller manipulation profile (e.g., a reference controller manipulation profile) that corresponds to the profile identifier ultimately selected.
  • the menu modifier 240 selects a first subset of the menu of menu items (e.g., the menu of media streams), and the selection of the first subset may be based on the profile identifier selected in operation 530 .
  • the selected first subset indicates menu items (e.g., media streams) to be hidden from view (e.g., omitted from presentation by the media device 140 ).
  • the selected first subset has no overlap with a second subset of the menu of menu items, since the second subset is to be preserved in view (e.g., by continuing to be presented by the media device 140 ).
  • the display controller 210 causes the media device 140 to modify the presented menu of menu items (e.g., the menu of media streams) by omitting the first subset of the menu (e.g., as selected in operation 540 ). In tandem with this omission, the display controller 210 causes the media device 140 to continue displaying the second subset of the menu, which second subset has no overlap with the first subset. Thus, the menu becomes modified by omission of the first subset while continuing to display the second subset in the displayed portion of the menu.
  • the method 500 may include one or more of operations 630 , 632 , 634 , 636 , 637 , 638 , and 639 , according to various example embodiments. Any one or more of operations 630 , 632 , 634 , 636 , 637 , 638 , and 639 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 530 , in which the profile selector 230 selects the profile identifier based on the controller manipulation data.
  • the controller manipulation data includes one or more frequencies of activation for one or more individual control elements (e.g., control element 410 , 420 , 430 , or 440 ). That is, the controller manipulation data may include a first activation frequency at which a first control element (e.g., control element 410 ) of the controller device 141 was activated during a sequence of activations of that first control element (e.g., repeated mashing of a channel change button or a volume change button).
  • the profile selector 230 compares a recorded (e.g., first) activation frequency of a single control element (e.g., control element 410 , as recorded in the controller manipulation data) to a reference (e.g., second) activation frequency for the same single control element (e.g., control element 410 , as represented in a controller manipulation profile).
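  • A minimal sketch of such a frequency comparison, assuming the hypothetical ActivationEvent records above; the relative tolerance is an illustrative assumption, not a disclosed value.

    def activation_frequency(events, element):
        # Activations per second of one control element during the recorded window.
        times = sorted(e.timestamp_ms for e in events if e.control_element == element)
        if len(times) < 2:
            return 0.0
        span_s = (times[-1] - times[0]) / 1000.0
        return (len(times) - 1) / span_s if span_s > 0 else 0.0

    def frequency_matches(recorded_hz, reference_hz, tolerance=0.25):
        # Treat the recorded frequency as matching the profile's reference
        # frequency when it falls within an assumed relative tolerance.
        return abs(recorded_hz - reference_hz) <= tolerance * reference_hz
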
  • the controller manipulation data includes patterns of activation for multiple control elements (e.g., control elements 410 , 420 , and 410 again, in that order). That is, the controller manipulation data may include a first activation pattern (e.g., control element 410 , then control element 420 , and then control element 410 again) according to which multiple control elements (e.g., control elements 410 and 420 ) were activated during operation of those control elements in accordance with the first activation pattern.
  • the profile selector 230 compares a recorded (e.g., first) activation pattern of these control elements (e.g., control elements 410 and 420 , as recorded in the controller manipulation data) to a reference (e.g., second) activation pattern for the same control elements (e.g., control elements 410 and 420 , as represented in a controller manipulation profile).
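  • One way to score such a pattern comparison is a longest-common-subsequence ratio; this sketch is illustrative only and is not the disclosed algorithm.

    def activation_pattern(events):
        # Ordered sequence of activated control elements, e.g. ["410", "420", "410"].
        return [e.control_element for e in events]

    def pattern_similarity(recorded, reference):
        # Longest common subsequence length, normalized by the reference length.
        m, n = len(recorded), len(reference)
        lcs = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m):
            for j in range(n):
                if recorded[i] == reference[j]:
                    lcs[i + 1][j + 1] = lcs[i][j] + 1
                else:
                    lcs[i + 1][j + 1] = max(lcs[i][j + 1], lcs[i + 1][j])
        return lcs[m][n] / n if n else 0.0
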
  • the controller manipulation data includes accelerometer data that indicates a series of accelerations experienced by the controller device 141 . That is, the controller manipulation data may include a first sequence of accelerometer data (e.g., generated by the accelerometer 450 ) that indicates sequential accelerations experienced by the controller device 141 during operation of the controller device 141 (e.g., during activations of one or more control elements, such as the control element 410 , either singly or in combination with other control elements, such as the control element 420 ).
  • the profile selector 230 compares a recorded (e.g., first) sequence of accelerometer data (e.g., as recorded in the controller manipulation data) to a reference (e.g., second) sequence of accelerometer data (e.g., as represented in a controller manipulation profile).
  • the recorded sequence of accelerometer data may indicate a series of movements in which the controller device 141 moves in an upward rising rightward arc (e.g., as a result of being raised to point at the media device 140 by a right-handed user) and then tilts forward slightly (e.g., 2-5 degrees as a result of the control element 410 being activated), and the reference sequence of accelerometer data may indicate a similar upward rising rightward arc followed by a similarly slight forward tilt.
  • As another example, the recorded sequence of accelerometer data may indicate a series of movements in which the controller device 141 moves in a descending rightward diagonal path with a 10-degree clockwise twist (e.g., as a result of being lowered to point at the media device 140 by a left-handed user), and the reference sequence of accelerometer data may indicate a similar descending rightward diagonal path with a similar (e.g., 8-degree) clockwise twist.
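  • A deliberately simple sketch of comparing two such accelerometer sequences (truncation to a common length plus mean Euclidean distance, standing in for a more robust alignment such as dynamic time warping); the threshold value is an assumption.

    import math

    def acceleration_distance(recorded, reference):
        # Mean Euclidean distance between paired AccelerometerSample records,
        # after truncating both sequences to a common length.
        n = min(len(recorded), len(reference))
        if n == 0:
            return float("inf")
        total = 0.0
        for r, f in zip(recorded, reference):
            total += math.sqrt((r.ax - f.ax) ** 2 + (r.ay - f.ay) ** 2 + (r.az - f.az) ** 2)
        return total / n

    def sequences_match(recorded, reference, threshold=0.5):
        # Assumed threshold, in the same units as the accelerometer readings.
        return acceleration_distance(recorded, reference) <= threshold
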
  • one or more of operations 636 , 637 , 638 , and 639 may be performed as part of operation 634 , in which the profile selector 230 compares the recorded sequence of accelerometer data to the reference sequence of accelerometer data.
  • the profile selector 230 compares a recorded sequence of motion data to a reference sequence of motion data, where each recorded or reference sequence represents translational motions of the controller device 141 (e.g., movements from one three-dimensional spatial location to another three-dimensional spatial location, such as upward movements, downward movements, leftward movements, rightward movements, forward movements, backward movements, or any suitable combination thereof).
  • Operation 637 may be performed as part of operation 636 .
  • the profile selector 230 compares a handedness of the recorded sequence with a handedness of the reference sequence.
  • For example, the recorded sequence may indicate that the user (e.g., user 132) who manipulated the controller device 141 during the recorded sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the translational motions of the controller device 141, as indicated by the recorded sequence of accelerometer data); the reference sequence may indicate that the user (e.g., user 132 or 152) who manipulated the controller device 141 during the reference sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the translational motions of the controller device 141, as indicated by the reference sequence of accelerometer data); and the profile selector 230 may compare the right-handedness of the recorded sequence to the right-handedness of the reference sequence.
  • Alternatively, where the reference sequence instead indicates a left-handed user, the profile selector 230 may compare the right-handedness of the recorded sequence to the left-handedness of the reference sequence.
  • the handedness of the reference sequence of motion data is known from previous user input (e.g., a user preference or other user profile information corresponding to the user 132 or 152 ). Accordingly, the comparison of the handedness of the recorded sequence of motion data to the handedness of the reference sequence of motion data enables the handedness of the recorded sequence to be inferred from the degree to which the handedness of the recorded sequence is similar to, or different from, the handedness of the reference sequence.
  • the profile selector 230 compares a recorded sequence of orientation data to a reference sequence of orientation data, where each recorded or reference sequence represents rotational motions of the controller device 141 (e.g., movements from one angular orientation to another angular orientation, such as upward pitch, downward pitch, leftward yaw, rightward yaw, leftward roll, rightward roll, or any suitable combination thereof).
  • Operation 639 may be performed as part of operation 638 .
  • the profile selector 230 compares a handedness of the recorded sequence with a handedness of the reference sequence.
  • For example, the recorded sequence may indicate that the user (e.g., user 132) who manipulated the controller device 141 during the recorded sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the rotational motions of the controller device 141, as indicated by the recorded sequence of accelerometer data); the reference sequence may indicate that the user (e.g., user 132 or 152) who manipulated the controller device 141 during the reference sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the rotational motions of the controller device 141, as indicated by the reference sequence of accelerometer data); and the profile selector 230 may compare the right-handedness of the recorded sequence to the right-handedness of the reference sequence.
  • Alternatively, where the reference sequence instead indicates a left-handed user, the profile selector 230 may compare the right-handedness of the recorded sequence to the left-handedness of the reference sequence.
  • the handedness of the reference sequence of orientation data is known from previous user input (e.g., a user preference or other user profile information corresponding to the user 132 or 152 ). Accordingly, the comparison of the handedness of the recorded sequence of orientation data to the handedness of the reference sequence of orientation data enables the handedness of the recorded sequence to be inferred from the degree to which the handedness of the recorded sequence is similar to, or different from, the handedness of the reference sequence.
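  • The following sketch shows one crude heuristic for such a handedness comparison; the axis convention and the idea of using net lateral motion are assumptions for illustration, not the disclosed method.

    def infer_handedness(samples):
        # Heuristic: raising the controller to point at the screen tends to
        # produce a net lateral acceleration whose sign differs between a
        # right-handed and a left-handed user (axis convention assumed).
        net_lateral = sum(s.ax for s in samples)
        return "right" if net_lateral > 0 else "left"

    def handedness_matches(recorded_samples, reference_handedness):
        # Compare the handedness inferred from the recorded sequence to the
        # handedness of the reference sequence (e.g., known from user input).
        return infer_handedness(recorded_samples) == reference_handedness
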
  • the method 500 may include one or more of operations 720 , 721 , 730 , 732 , 734 , 740 , and 742 , according to various example embodiments.
  • Operations 720 and 730 may be implemented in the example embodiments that support a device detection feature, in which detection of a nearby device (e.g., user device 130 or user device 150 ) forms a basis for the profile selector 230 in operation 530 to select the profile identifier based on the controller manipulation data.
  • the device detector 250 detects that a user device (e.g., the user device 130 that corresponds to the user 132 ) is near the media device 140 (e.g., in physical proximity to the media device 140 in comparison to a threshold distance, such as the user device 130 being able to wirelessly communicate with the media device 140 with a minimum threshold signal strength), near the controller device 141 (e.g., in physical proximity to the controller device 141 in comparison to a threshold distance, such as the user device 130 being able to wirelessly communicate with the controller device 141 with a minimum threshold signal strength), or any suitable combination thereof.
  • the detected nearness of the user device (e.g., user device 130 ) to the media device 140 , the controller device 141 , or both, may be a basis for inferring that the corresponding user (e.g., user 132 ) is the user who is manipulating the controller device 141 . Accordingly, the device detector 250 may provide the results of this detection as input to the profile selector 230 to aid in the performance of operation 530 , in which the profile identifier is selected by the profile selector 230 .
  • Operation 721 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 720 .
  • the device detector 250 detects that the user device is joining a wireless network (e.g., a Wi-Fi local area network or a Bluetooth personal area network) that the controller device 141 has also joined. Accordingly, the device detector 250 may provide the results of this detection as input to the profile selector 230 , as discussed above.
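  • A minimal sketch of such proximity-based device detection, assuming the media device or controller device can report visible devices with received signal strengths; the threshold value is illustrative, not disclosed.

    RSSI_NEAR_THRESHOLD_DBM = -60  # assumed "nearby" signal-strength threshold

    def detect_nearby_device(visible_devices):
        # visible_devices: iterable of (device_id, rssi_dbm) pairs.
        # Returns the identifier of the strongest sufficiently near device,
        # or None if no device exceeds the threshold.
        near = [(dev_id, rssi) for dev_id, rssi in visible_devices
                if rssi >= RSSI_NEAR_THRESHOLD_DBM]
        if not near:
            return None
        return max(near, key=lambda pair: pair[1])[0]
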
  • operation 730 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 530 , in which the profile selector 230 selects the profile identifier based on the controller manipulation data.
  • If the device detector 250 has detected (e.g., in operation 720) that the user device (e.g., user device 130) is near the controller device 141, the result of this detection may be communicated by the device detector 250 to the profile selector 230 and may cause the profile selector 230 to select the profile identifier (e.g., of a known user, such as the user 132) based on the detected user device.
  • the result of this detection may be or include a device identifier (e.g., a name of the device, a network address of the device, or any suitable combination thereof).
  • the profile identifier is selected based on such a device identifier (e.g., communicated by the device detector 250 to the profile selector 230 ).
  • For example, the profile selector 230 may select a profile identifier that corresponds to the user (e.g., user 132) who corresponds to the detected user device (e.g., user device 130) by virtue of corresponding to a device identifier thereof (e.g., a device name or network address that identifies the user device 130).
  • the performance of operation 530 by the profile selector 230 results in selection of an anonymous profile identifier that corresponds to an anonymous set of conditions (e.g., indicated by or inferred from the controller manipulation data), that is, a set of conditions that are not linked or associated with any known user (e.g., user 132 ).
  • the user 152 may be physically manipulating the controller device 141 for the very first time; and the corresponding set of conditions may be treated as an anonymous user profile that is identified by an anonymous profile identifier.
  • the profile selector 230 may perform operation 732 by selecting a profile identifier that corresponds to a specific, but anonymous, set of conditions.
  • the profile selector 230 may generate or otherwise obtain a new profile identifier, a new user profile, or both. In some example embodiments, the profile selector 230 may select or otherwise apply a default template in generating or otherwise obtaining a new profile identifier, a new user profile, or both. According to various example embodiments, multiple anonymous profiles (e.g., in the form of multiple, yet distinct, anonymous sets of conditions) may be supported and differentiated by the profile selector 230 .
  • the performance of operation 530 by the profile selector 230 results in selection of a known profile identifier, that is, a profile identifier that corresponds to a known, previously profiled, or otherwise non-anonymous user (e.g., user 132 ).
  • a set of one or more pre-existing profile identifiers may be stored by the media server machine 110 , the database 115 , the media device 140 , the controller device 141 , a user device that was detected in operation 720 (e.g., user device 130 ), or any suitable combination thereof, and the profile selector 230 may accordingly perform operation 734 by selecting one of these pre-existing profile identifiers (e.g., with or without selecting a corresponding pre-existing user profile).
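  • A sketch of selecting a known profile identifier where possible and falling back to an anonymous one otherwise; the lookup-table shape and the anonymous-identifier format are assumptions for illustration.

    import uuid

    def select_profile(detected_device_id, device_to_profile, manipulation_match):
        # Prefer a pre-existing profile tied to a detected user device; next,
        # a profile matched from the controller manipulation data; otherwise,
        # mint a new anonymous profile identifier.
        if detected_device_id is not None and detected_device_id in device_to_profile:
            return device_to_profile[detected_device_id]
        if manipulation_match is not None:
            return manipulation_match
        return "anon-" + uuid.uuid4().hex
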
  • One or both of operations 740 and 742 may be performed as part of operation 540 , in which the menu modifier 240 selects the first subset of the menu of menu items (e.g., the menu of media streams), which selection may be based on the profile identifier selected in operation 530 .
  • the menu modifier 240 accesses user preferences for the selected profile identifier. Such user preferences may be stored in a user profile, which may be included in a set of one or more pre-existing user profiles that are stored or otherwise made available by the media server machine 110 , the database 115 , the media device 140 , the controller device 141 , a user device that was detected in operation 720 (e.g., user device 130 ), or any suitable combination thereof.
  • the user preferences accessed in operation 740 are used as a basis for selecting the first subset of the menu, in accordance with operation 540 .
  • the menu modifier 240 selects at least part of the first subset based on the user preferences.
  • the user preferences may list, identify, specify, or otherwise indicate one or more menu items to omit (e.g., media streams to omit from view), one or more menu items to preserve (e.g., media streams to maintain in view), or any suitable combination thereof.
  • the menu of menu items can be modified by the menu modifier 240 in accordance with user preferences that correspond to a profile identifier that was selected based on the controller manipulation data accessed in operation 520 .
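  • A minimal sketch of applying such user preferences to split the menu into the hidden first subset and the preserved second subset; the preference-dictionary keys are assumptions for illustration.

    def split_menu(menu, preferences):
        # preferences: e.g., {"omit": ["Stream A"], "preserve": ["Stream B"]}.
        omit = set(preferences.get("omit", []))
        preserve = set(preferences.get("preserve", []))
        # The first subset (to hide) never overlaps the second subset (to keep).
        hidden = {item for item in menu if item in omit and item not in preserve}
        first_subset = [item for item in menu if item in hidden]
        second_subset = [item for item in menu if item not in hidden]
        return first_subset, second_subset
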
  • a modified menu can be presented or caused to be presented by the media device 140 . This may have the effect of customizing, personalizing, or otherwise targeting the modification of the menu of menu items specifically for a user (e.g., user 132 or 152 ) without necessarily making an affirmative identification of the user.
  • the controller manipulation data can be sufficient to distinguish, disambiguate, or otherwise indicate different human users of the controller device 141 , whether the human users are known or unknown (e.g., anonymous) to the media server machine 110 , the database 115 , the media device 140 , the controller device 141 , or any suitable combination thereof.
  • one or more of the methodologies described herein may facilitate menu modification based on controller manipulation data. Moreover, one or more of the methodologies described herein may facilitate modifying a menu of menu items in accordance with one or more user preferences that correspond to a profile identifier selected based on such controller manipulation data. Hence, one or more of the methodologies described herein may facilitate customization, personalization, or other targeting of menu modifications, as well as improved experiences with a graphical user interface that contains a menu modified according to the systems and methods described herein, compared to capabilities of pre-existing systems and methods.
  • one or more of the methodologies described herein may obviate a need for certain efforts or resources that otherwise would be involved in menu modification. Efforts expended by a user (e.g., user 132 or 152 ) in customizing or personalizing a menu of menu items may be reduced by use of (e.g., reliance upon) a special-purpose machine that implements one or more of the methodologies described herein. Computing resources used by one or more systems or machines (e.g., within the network environment 100 ) may similarly be reduced (e.g., compared to systems or machines that lack the structures discussed herein or are otherwise unable to perform the functions discussed herein). Examples of such computing resources include processor cycles, network traffic, computational capacity, main memory usage, graphics rendering capacity, graphics memory usage, data storage capacity, power consumption, and cooling capacity.
  • FIG. 8 is a block diagram illustrating components of a machine 800 , according to some example embodiments, able to read instructions 824 from a machine-readable medium 822 (e.g., a non-transitory machine-readable medium, a machine-readable storage medium, a computer-readable storage medium, or any suitable combination thereof) and perform any one or more of the methodologies discussed herein, in whole or in part.
  • FIG. 8 shows the machine 800 in the example form of a computer system (e.g., a computer) within which the instructions 824 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 800 to perform any one or more of the methodologies discussed herein may be executed, in whole or in part.
  • the machine 800 may operate as a standalone device or may be communicatively coupled (e.g., networked) to other machines.
  • the machine 800 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a distributed (e.g., peer-to-peer) network environment.
  • the machine 800 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a cellular telephone, a smart phone, a set-top box (STB), a personal digital assistant (PDA), a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 824 , sequentially or otherwise, that specify actions to be taken by that machine.
  • the machine 800 includes a processor 802 (e.g., one or more central processing units (CPUs), one or more graphics processing units (GPUs), one or more digital signal processors (DSPs), one or more application specific integrated circuits (ASICs), one or more radio-frequency integrated circuits (RFICs), or any suitable combination thereof), a main memory 804 , and a static memory 806 , which are configured to communicate with each other via a bus 808 .
  • the processor 802 contains solid-state digital microcircuits (e.g., electronic, optical, or both) that are configurable, temporarily or permanently, by some or all of the instructions 824 such that the processor 802 is configurable to perform any one or more of the methodologies described herein, in whole or in part.
  • a set of one or more microcircuits of the processor 802 may be configurable to execute one or more modules (e.g., software modules) described herein.
  • the processor 802 is a multicore CPU (e.g., a dual-core CPU, a quad-core CPU, an 8-core CPU, or a 128-core CPU) within which each of multiple cores behaves as a separate processor that is able to perform any one or more of the methodologies discussed herein, in whole or in part.
  • although the beneficial effects described herein may be provided by the machine 800 with at least the processor 802 , these same beneficial effects may be provided by a different kind of machine that contains no processors (e.g., a purely mechanical system, a purely hydraulic system, or a hybrid mechanical-hydraulic system), if such a processor-less machine is configured to perform one or more of the methodologies described herein.
  • the machine 800 may further include a graphics display 810 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, a cathode ray tube (CRT), or any other display capable of displaying graphics or video).
  • the machine 800 may also include an alphanumeric input device 812 (e.g., a keyboard or keypad), a pointer input device 814 (e.g., a mouse, a touchpad, a touchscreen, a trackball, a joystick, a stylus, a motion sensor, an eye tracking device, a data glove, or other pointing instrument), a data storage 816 , an audio generation device 818 (e.g., a sound card, an amplifier, a speaker, a headphone jack, or any suitable combination thereof), and a network interface device 820 .
  • the data storage 816 (e.g., a data storage device) includes the machine-readable medium 822 (e.g., a tangible and non-transitory machine-readable storage medium) on which are stored the instructions 824 embodying any one or more of the methodologies or functions described herein.
  • the instructions 824 may also reside, completely or at least partially, within the main memory 804 , within the static memory 806 , within the processor 802 (e.g., within the processor's cache memory), or any suitable combination thereof, before or during execution thereof by the machine 800 . Accordingly, the main memory 804 , the static memory 806 , and the processor 802 may be considered machine-readable media (e.g., tangible and non-transitory machine-readable media).
  • the instructions 824 may be transmitted or received over the network 190 via the network interface device 820 .
  • the network interface device 820 may communicate the instructions 824 using any one or more transfer protocols (e.g., hypertext transfer protocol (HTTP)).
  • the machine 800 may be a portable computing device (e.g., a smart phone, a tablet computer, or a wearable device), and may have one or more additional input components 830 (e.g., sensors or gauges).
  • additional input components 830 include an image input component (e.g., one or more cameras), an audio input component (e.g., one or more microphones), a direction input component (e.g., a compass), a location input component (e.g., a global positioning system (GPS) receiver), an orientation component (e.g., a gyroscope), a motion detection component (e.g., one or more accelerometers), an altitude detection component (e.g., an altimeter), and a biometric input component (e.g., a heartrate detector or a blood pressure detector).
  • Input data gathered by any one or more of these input components may be accessible and available for use by any of the modules described herein.
  • the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 822 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions.
  • the term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing the instructions 824 for execution by the machine 800 , such that the instructions 824 , when executed by one or more processors of the machine 800 (e.g., processor 802 ), cause the machine 800 to perform any one or more of the methodologies described herein, in whole or in part.
  • a “machine-readable medium” refers to a single storage apparatus or device, as well as cloud-based storage systems or storage networks that include multiple storage apparatus or devices.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more tangible and non-transitory data repositories (e.g., data volumes) in the example form of a solid-state memory chip, an optical disc, a magnetic disc, or any suitable combination thereof.
  • the instructions 824 for execution by the machine 800 may be communicated by a carrier medium.
  • Examples of such a carrier medium include a storage medium (e.g., a non-transitory machine-readable storage medium, such as a solid-state memory, being physically moved from one place to another place) and a transient medium (e.g., a propagating signal that communicates the instructions 824 ).
  • Modules may constitute software modules (e.g., code stored or otherwise embodied in a machine-readable medium or in a transmission medium), hardware modules, or any suitable combination thereof.
  • a “hardware module” is a tangible (e.g., non-transitory) physical component (e.g., a set of one or more processors) capable of performing certain operations and may be configured or arranged in a certain physical manner.
  • one or more computer systems or one or more hardware modules thereof may be configured by software (e.g., an application or portion thereof) as a hardware module that operates to perform operations described herein for that module.
  • a hardware module may be implemented mechanically, electronically, hydraulically, or any suitable combination thereof.
  • a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations.
  • a hardware module may be or include a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC.
  • a hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations.
  • a hardware module may include software encompassed within a CPU or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, hydraulically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • the phrase “hardware module” should be understood to encompass a tangible entity that may be physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
  • the phrase “hardware-implemented module” refers to a hardware module. Considering example embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module includes a CPU configured by software to become a special-purpose processor, the CPU may be configured as respectively different special-purpose processors (e.g., each included in a different hardware module) at different times.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory (e.g., a memory device) to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information from a computing resource).
  • processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein.
  • the phrase “processor-implemented module” refers to a hardware module in which the hardware includes one or more processors. Accordingly, the operations described herein may be at least partially processor-implemented, hardware-implemented, or both, since a processor is an example of hardware, and at least some operations within any one or more of the methods discussed herein may be performed by one or more processor-implemented modules, hardware-implemented modules, or any suitable combination thereof.
  • processors may perform operations in a “cloud computing” environment or as a service (e.g., within a “software as a service” (SaaS) implementation). For example, at least some operations within any one or more of the methods discussed herein may be performed by a group of computers (e.g., as examples of machines that include processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)). The performance of certain operations may be distributed among the one or more processors, whether residing only within a single machine or deployed across a number of machines.
  • the one or more processors or hardware modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or hardware modules may be distributed across a number of geographic locations.
  • a first embodiment provides a method comprising: causing, by one or more processors of a machine, a media device to display at least a portion of a menu of media streams that are selectable for playback by the media device; accessing, by one or more processors of the machine, controller manipulation data generated by a controller device and indicating a sequence of physical manipulations experienced by the controller device during operation of the controller device in selecting one or more media streams from the menu of media streams for playback by the media device; selecting, by one or more processors of the machine, a profile identifier from a set of profile identifiers based on a comparison of the accessed controller manipulation data to a controller manipulation profile that corresponds to the profile identifier; selecting, by one or more processors of the machine, a first subset of the menu of media streams based on the selected profile identifier, the selected first subset indicating media streams to be hidden from view, the first subset having no overlap with a second subset of the menu of media streams; and causing, by one or more processors of the machine, the media device to modify the menu of media streams by omitting the first subset of the menu of media streams from the displayed portion of the menu while displaying the second subset of the menu of media streams in the displayed portion of the menu.
  • the controller manipulation data may include one or more frequencies at which one or more control elements of the controller device have been activated. Accordingly, a second embodiment provides a method according to the first embodiment, wherein:
  • the accessed controller manipulation data includes a first activation frequency at which a control element of the controller device was activated during a sequence of activations of the control element; the controller manipulation profile that corresponds to the profile identifier includes a second activation frequency for the control element; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation frequencies.
  • the controller manipulation data may include one or more patterns in which multiple control elements have been activated. Accordingly, a third embodiment provides a method according to the first embodiment or the second embodiment, wherein:
  • the accessed controller manipulation data includes a first activation pattern according to which multiple control elements of the controller device were activated during operation of the multiple control elements; the controller manipulation profile that corresponds to the profile identifier includes a second activation pattern for the multiple control elements; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation patterns.
  • the controller manipulation data may include acceleration data generated by one or more accelerometers as a result of movements by the controller device. Accordingly, a fourth embodiment provides a method according to any of the first through third embodiments, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the controller device; the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • a fifth embodiment provides a method according to the fourth embodiment, wherein:
  • the sequential accelerations indicated by the first sequence of accelerometer data correspond to a handedness of a human user in physically manipulating the controller device; and in the compared controller manipulation profile, the second sequence of accelerometer data indicates the handedness.
  • the controller manipulation data may include orientation data generated by one or more accelerometers as a result of movements by the controller device. Accordingly, a sixth embodiment provides a method according to any of the first through fifth embodiments, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential orientations at which the controller device was held; the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • a seventh embodiment provides a method according to the sixth embodiment, wherein:
  • the sequential orientations indicated by the first sequence of accelerometer data correspond to a handedness of a human user in physically manipulating the controller device; and in the compared controller manipulation profile, the second sequence of accelerometer data indicates the handedness.
  • an eighth embodiment provides a method according to any of the first through seventh embodiments, further comprising:
  • detecting that a user device is within a threshold range of the media device, the user device having a device identifier that corresponds to the profile identifier; and wherein the selecting of the profile identifier from the set of profile identifiers is further based on the user device being detected within the threshold range of the media device.
  • a ninth embodiment provides a method according to the eighth embodiment, wherein:
  • the detecting that the user device is within the threshold range of the media device includes detecting that the user device has joined a wireless network to which the media device belongs.
  • the profile identifier is a basis for selecting, applying, or otherwise using a set of one or more user preferences for a known user of the controller device (e.g., a user who has previously used the controller device and whose explicit or implicit user preferences are stored by the controller device or other machine within the network-based system 105 ). Accordingly, a tenth embodiment provides a method according to any of the first through ninth embodiments, wherein:
  • the selecting of the first subset of the menu of media streams includes accessing user preferences that correspond to the selected profile identifier and selecting at least part of the first subset based on the accessed user preferences.
  • an eleventh embodiment provides a method according to the tenth embodiment, wherein:
  • the accessed user preferences that correspond to the selected profile identifier indicate at least one media stream in the first subset of the menu of media streams to be omitted from the modified menu.
  • a twelfth embodiment provides a method according to the tenth embodiment or the eleventh embodiment, wherein:
  • the accessed user preferences that correspond to the selected profile identifier indicate at least one media stream in the second subset of the menu of media streams to be maintained in the modified menu.
  • the profile identifier does not correspond to any known user of the controller device (e.g., corresponds to an anonymous user or to a new user). Accordingly, a thirteenth embodiment provides a method according to any of the first through ninth embodiments, wherein:
  • the profile identifier identifies no user of the controller device and corresponds to an anonymous set of conditions under which the controller device was operated.
  • the profile identifier is sufficient to identify a known user of the controller device (e.g., uniquely identifies the user among a set of known users). Accordingly, a fourteenth embodiment provides a method according to any of the first through twelfth embodiments, wherein:
  • the profile identifier identifies a human user of the controller device.
  • a fifteenth embodiment provides a machine-readable medium (e.g., a non-transitory machine-readable storage medium) comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
  • a media device to display at least a portion of a menu of media streams that are selectable for playback by the media device; accessing controller manipulation data generated by a controller device and indicating a sequence of physical manipulations experienced by the controller device during operation of the controller device in selecting one or more media streams from the menu of media streams for playback by the media device; selecting a profile identifier from a set of profile identifiers based on a comparison of the accessed controller manipulation data to a controller manipulation profile that corresponds to the profile identifier; selecting a first subset of the menu of media streams based on the selected profile identifier, the selected first subset indicating media streams to be hidden from view, the first subset having no overlap with a second subset of the menu of media streams; and causing the media device to modify the menu of media streams by omitting the first subset of the menu of media streams from the displayed portion of the menu while displaying the second subset of the menu of media streams in the displayed portion of the menu.
  • a sixteenth embodiment provides a machine-readable medium according to the fifteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first activation frequency at which a control element of the controller device was activated during a sequence of activations of the control element; the controller manipulation profile that corresponds to the profile identifier includes a second activation frequency for the control element; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation frequencies.
  • a seventeenth embodiment provides a machine-readable medium according to the fifteenth embodiment or the sixteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the controller device; the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • An eighteenth embodiment provides a system comprising: one or more processors;
  • a memory storing instructions that, when executed by at least one processor among the one or more processors, cause the system to perform operations comprising: causing a media device to display at least a portion of a menu of media streams that are selectable for playback by the media device; accessing controller manipulation data generated by a controller device and indicating a sequence of physical manipulations experienced by the controller device during operation of the controller device in selecting one or more media streams from the menu of media streams for playback by the media device; selecting a profile identifier from a set of profile identifiers based on a comparison of the accessed controller manipulation data to a controller manipulation profile that corresponds to the profile identifier; selecting a first subset of the menu of media streams based on the selected profile identifier, the selected first subset indicating media streams to be hidden from view, the first subset having no overlap with a second subset of the menu of media streams; and causing the media device to modify the menu of media streams by omitting the first subset of the menu of media streams from the displayed portion of the menu while displaying the second subset of the menu of media streams in the displayed portion of the menu.
  • a nineteenth embodiment provides a system according to the eighteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first activation frequency at which a control element of the controller device was activated during a sequence of activations of the control element; the controller manipulation profile that corresponds to the profile identifier includes a second activation frequency for the control element; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation frequencies.
  • a twentieth embodiment provides a system according to the eighteenth embodiment or the nineteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the controller device; the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • a twenty-first embodiment provides a carrier medium carrying machine-readable instructions for controlling a machine to carry out the method of any one of the first through fourteenth embodiments.

Abstract

A machine performs menu modification based on information that indicates how a controller device was manipulated by a user. The machine causes a media device to display a portion of a menu. The machine accesses controller manipulation data generated by a controller device in fully or partially controlling the media device, such as controller manipulation data that indicates a sequence of physical manipulations experienced by the controller device being operated by a user to select menu items. Based on the sequence of physical manipulations, the machine selects a profile identifier from a set of profile identifiers. Based on the profile identifier, the machine selects a first subset of the menu. The first subset indicates menu items to be hidden, unlike a second subset of the menu. The machine causes the media device to modify the menu by omitting the first subset while continuing to display the second subset.

Description

    PRIORITY APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 15/422,221, filed Feb. 1, 2017, the disclosure of which is incorporated herein in its entirety by reference.
  • TECHNICAL FIELD
  • The subject matter disclosed herein generally relates to the technical field of special-purpose machines that facilitate generation and presentation of graphical user interfaces, including software-configured computerized variants of such special-purpose machines and improvements to such variants, and to the technologies by which such special-purpose machines become improved compared to other special-purpose machines that facilitate generation and presentation of graphical user interfaces. Specifically, the present disclosure addresses systems and methods to facilitate menu modification based on controller manipulation data.
  • BACKGROUND
  • A machine may be configured to interact with one or more users by causing a graphical user interface to be generated, causing the graphical user interface to be presented to the one or more users, or both. In particular, the graphical user interface may be or include a menu of items that are each separately selectable by the user. For example, at least a portion of an electronic programming guide may be included in a menu presented within a graphical user interface, such that the included portion lists various pieces of media content (e.g., streams of media content, pieces of media content, or both) available for selection by the user (e.g., for presentation by a media player device, for recording by a media storage device, or for indication in a media preference profile). In such situations, the graphical user interface may enable the user to select media content (e.g., one or more pieces of media content or streams of media content) by indicating which media content is selected or is to be selected.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings.
  • FIG. 1 is a network diagram illustrating a network environment suitable for menu modification based on controller manipulation data, according to some example embodiments.
  • FIG. 2 is a block diagram illustrating components of a media server machine suitable for menu modification based on controller manipulation data, according to some example embodiments.
  • FIG. 3 is a block diagram illustrating components of a media device in the network environment, according to some example embodiments.
  • FIG. 4 is a block diagram illustrating components of a controller device suitable for menu modification based on controller manipulation data, according to some example embodiments.
  • FIGS. 5-7 are flowcharts illustrating operations in a method of menu modification based on controller manipulation data, according to some example embodiments.
  • FIG. 8 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.
  • DETAILED DESCRIPTION
  • Example methods (e.g., algorithms) facilitate menu modification based on controller manipulation data, and example systems (e.g., special-purpose machines configured by special-purpose software) are configured to facilitate menu modification based on controller manipulation data. Examples merely typify possible variations. Unless explicitly stated otherwise, structures (e.g., structural components, such as modules) are optional and may be combined or subdivided, and operations (e.g., in a procedure, algorithm, or other function) may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of various example embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
  • A machine (e.g., a server machine, such as a media server machine) may be configured (e.g., by suitable hardware, software, or both) to perform menu modification (e.g., within a graphical user interface), and such menu modification may be based on manipulation data that describes, specifies, identifies, or otherwise indicates how a controller device was manipulated by a user. The machine may cause a media device (e.g., a set-top box or television set) to display at least a portion of a menu of media streams that are available to be selected (e.g., for playback by the media device). For purposes of clarity in illustrating various example embodiments, although much of the discussion herein refers to a menu of media streams (e.g., individual data streams, such as television channels or radio stations), the systems and methods discussed herein are similarly applicable to a menu of individual pieces of media content (e.g., individual movies, television show episodes, short films, advertisements, or video clips).
  • As configured according to the systems and methods described herein, the machine accesses (e.g., receives, retrieves, or reads) controller manipulation data that has been generated by a controller device (e.g., a remote control or a smart phone configured to function as a remote control) that fully or partially controls the media device. In particular, the controller manipulation data indicates a sequence of physical manipulations (e.g., translational or angular movements) experienced by the controller device during operation by a user in selecting one or more media streams (e.g., for playback by the media device).
  • Based on the indicated sequence of physical manipulations, the machine selects a profile identifier from a set of profile identifiers. For example, the selection of the profile identifier may be based on a comparison of the controller manipulation data to a controller manipulation profile that corresponds to the profile identifier (e.g., among multiple controller manipulation profiles that each correspond to a different profile identifier among multiple profile identifiers).
  • The machine next selects a first subset (e.g., first portion) of the menu of media streams available to be selected, and the selection of the first subset may be based on the selected profile identifier. The first subset of the menu indicates specific media streams to be hidden from view (e.g., visually omitted from the menu, from the graphical user interface, or from both). The first subset of the menu contrasts with a second subset of the menu to be preserved in view (e.g., visually maintained in the menu, in the graphical user interface, or in both). Accordingly, the selected first subset of the menu has no overlap with the second subset of the menu. The machine then causes the media device (e.g., via command or other communication) to modify the menu of media streams by omitting the first subset of the menu from the displayed portion of the menu while continuing to display the second subset of the menu in the displayed portion of the menu.
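  • By way of illustration only, the following minimal sketch traces the same flow in Python: display the menu, access the manipulation data, select the best-matching profile identifier, and modify the menu by omitting the first subset while keeping the second. Every name and data shape in the sketch is a hypothetical stand-in, not the disclosed implementation.

```python
# Minimal, hypothetical sketch of the flow described above. Every function,
# variable, and data shape here is an illustrative assumption.

def select_profile_id(recorded, reference_profiles):
    """Pick the profile identifier whose stored manipulation data is closest
    to the accessed controller manipulation data (toy squared-difference)."""
    def distance(reference):
        return sum((r - s) ** 2 for r, s in zip(recorded, reference))
    return min(reference_profiles, key=lambda pid: distance(reference_profiles[pid]))

def modify_menu(menu, hidden_by_profile, profile_id):
    """Omit the first subset (streams to hide) while keeping the second."""
    first_subset = hidden_by_profile.get(profile_id, set())
    return [stream for stream in menu if stream not in first_subset]

# Example data: two stored profiles, each with a reference sensor sequence
# and a set of media streams to hide from that profile's menu.
reference_profiles = {"profile_a": [0.1, 0.4, 0.2], "profile_b": [0.9, 0.8, 0.7]}
hidden_by_profile = {"profile_a": {"News"}, "profile_b": {"Cartoons"}}
menu = ["News", "Cartoons", "Sports"]

recorded = [0.2, 0.5, 0.1]                             # accessed manipulation data
pid = select_profile_id(recorded, reference_profiles)  # -> "profile_a"
print(modify_menu(menu, hidden_by_profile, pid))       # -> ['Cartoons', 'Sports']
```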
  • FIG. 1 is a network diagram illustrating a network environment 100 suitable for menu modification based on controller manipulation data, according to some example embodiments. The network environment 100 includes a media server machine 110, a database 115, user devices 130 and 150, a media device 140, and a controller device 141. The media server machine 110, the database 115, and the media device 140 are shown as being communicatively coupled to each other via a network 190. Additionally, the controller device 141 is communicatively coupled to the media device 140 (e.g., via the network 190 or via a different network or communication path, such as infrared signaling or other wireless signaling). Moreover, the user devices 130 and 150 may each be communicatively coupled to the controller device 141, the media device 140, or both (e.g., via the network 190 or via a different network or communication path, such as infrared signaling or other wireless signaling).
  • The media server machine 110, with or without the database 115, may form all or part of a cloud 118 (e.g., a geographically distributed set of multiple machines configured to function as a single server), which may form all or part of a network-based system 105 (e.g., a cloud-based server system configured to provide one or more network-based services to the user devices 130 and 150). The media server machine 110, the database 115, the media device 140, the controller device 141, and the user devices 130 and 150 may each be implemented in a special-purpose (e.g., specialized) computer system, in whole or in part, as described below with respect to FIG. 8.
  • Also shown in FIG. 1 are users 132 and 152. One or both of the users 132 and 152 may be a human user (e.g., a human being), a machine user (e.g., a computer configured by a software program to interact with the user device 130 or the user device 150), or any suitable combination thereof (e.g., a human assisted by a machine or a machine supervised by a human). The user 132 is associated with the user device 130 and may be a user of the user device 130. For example, the user device 130 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, a smart phone, or a wearable device (e.g., a smart watch, smart glasses, smart clothing, or smart jewelry) belonging to the user 132. Likewise, the user 152 is associated with the user device 150 and may be a user of the user device 150. As an example, the user device 150 may be a desktop computer, a vehicle computer, a tablet computer, a navigational device, a portable media device, a smart phone, or a wearable device (e.g., a smart watch, smart glasses, smart clothing, or smart jewelry) belonging to the user 152.
  • Any of the systems or machines (e.g., databases and devices) shown in FIG. 1 may be, include, or otherwise be implemented in a special-purpose (e.g., specialized or otherwise non-conventional and non-generic) computer that has been modified to perform one or more of the functions described herein for that system or machine (e.g., configured or programmed by special-purpose software, such as one or more software modules of a special-purpose application, operating system, firmware, middleware, or other software program). For example, a special-purpose computer system able to implement any one or more of the methodologies described herein is discussed below with respect to FIG. 8, and such a special-purpose computer may accordingly be a means for performing any one or more of the methodologies discussed herein. Within the technical field of such special-purpose computers, a special-purpose computer that has been specially modified (e.g., configured by special-purpose software) by the structures discussed herein to perform the functions discussed herein is technically improved compared to other special-purpose computers that lack the structures discussed herein or are otherwise unable to perform the functions discussed herein. Accordingly, a special-purpose machine configured according to the systems and methods discussed herein provides an improvement to the technology of similar special-purpose machines.
  • As used herein, a “database” is a data storage resource and may store data structured as a text file, a table, a spreadsheet, a relational database (e.g., an object-relational database), a triple store, a hierarchical data store, or any suitable combination thereof. Moreover, any two or more of the systems or machines illustrated in FIG. 1 may be combined into a single system or machine, and the functions described herein for any single system or machine may be subdivided among multiple systems or machines.
  • The network 190 may be any network that enables communication between or among systems, machines, databases, and devices (e.g., between the media server machine 110 and the media device 140). Accordingly, the network 190 may be a wired network, a wireless network (e.g., a mobile or cellular network), or any suitable combination thereof. The network 190 may include one or more portions that constitute a private network, a public network (e.g., the Internet), or any suitable combination thereof. Accordingly, the network 190 may include one or more portions that incorporate a local area network (LAN), a wide area network (WAN), the Internet, a mobile telephone network (e.g., a cellular network), a wired telephone network (e.g., a plain old telephone system (POTS) network), a wireless data network (e.g., a WiFi network or WiMax network), or any suitable combination thereof. Any one or more portions of the network 190 may communicate information via a transmission medium. As used herein, “transmission medium” refers to any intangible (e.g., transitory) medium that is capable of communicating (e.g., transmitting) instructions for execution by a machine (e.g., by one or more processors of such a machine), and includes digital or analog communication signals or other intangible media to facilitate communication of such software.
  • FIG. 2 is a block diagram illustrating components of the media server machine 110, according to some example embodiments (e.g., server-side implementations). The media server machine 110 is shown as including a display controller 210, a controller interface 220, a profile selector 230, a menu modifier 240, and a device detector 250, all configured to communicate with each other (e.g., via a bus, shared memory, or a switch). The display controller 210 may be or include a display module or other code for controlling a display (e.g., by controlling the media device 140 or by controlling a display screen within or communicatively coupled to the media device 140). The controller interface 220 may be or include an access module or other code for accessing one or more controller devices (e.g., the controller device 141 or another remote control device). The profile selector 230 may be or include a selection module or other code for selecting a profile. The menu modifier 240 may be or include a modification module or other code for modifying a menu (e.g., within a graphical user interface). The device detector 250 may be or include a detection module or other code for detecting one or more devices (e.g., user devices 130 and 150).
  • As shown in FIG. 2, the display controller 210, the controller interface 220, the profile selector 230, the menu modifier 240, the device detector 250, or any suitable combination thereof may form all or part of an application 200 (e.g., a server-side software application) that is stored (e.g., installed) on the media server machine 110 (e.g., responsive to or otherwise as a result of data being received from the database 115, the media device 140, the controller device 141, the user device 130, the user device 150, or another data repository). Furthermore, one or more processors 299 (e.g., hardware processors, digital processors, or any suitable combination thereof) may be included (e.g., temporarily or permanently) in the application 200, the display controller 210, the controller interface 220, the profile selector 230, the menu modifier 240, the device detector 250, or any suitable combination thereof.
  • FIG. 3 is a block diagram illustrating components of the media device 140, according to some example embodiments (e.g., client-side implementations). The media device 140 is shown as including the display controller 210 (e.g., an instance thereof), the controller interface 220 (e.g., an instance thereof), the profile selector 230 (e.g., an instance thereof), the menu modifier 240 (e.g., an instance thereof), and the device detector 250 (e.g., an instance thereof), all configured to communicate with each other (e.g., via a bus, shared memory, or a switch). As noted above, the display controller 210 may be or include a display module or other code for controlling a display (e.g., by controlling a display screen within or communicatively coupled to the media device 140). The controller interface 220 may be or include an access module or other code for accessing one or more controller devices (e.g., the controller device 141). The profile selector 230 may be or include a selection module or other code for selecting a profile. The menu modifier 240 may be or include a modification module or other code for modifying a menu (e.g., within a graphical user interface presented by the media device 140 or a display screen thereof). The device detector 250 may be or include a detection module or other code for detecting one or more devices (e.g., user devices 130 and 150).
  • As shown in FIG. 3, the display controller 210, the controller interface 220, the profile selector 230, the menu modifier 240, the device detector 250, or any suitable combination thereof, may form all or part of an app 300 (e.g., a mobile app or other client-side app) that is stored (e.g., installed) on the media device 140 (e.g., responsive to or otherwise as a result of data being received from the media server machine 110, the database 115, the controller device 141, the user device 130, the user device 150, or another data repository). Furthermore, one or more processors 299 (e.g., hardware processors, digital processors, or any suitable combination thereof) may be included (e.g., temporarily or permanently) in the app 300, the display controller 210, the controller interface 220, the profile selector 230, the menu modifier 240, the device detector 250, or any suitable combination thereof. Although FIG. 2 and FIG. 3 depict the display controller 210, the controller interface 220, the profile selector 230, the menu modifier 240, and the device detector 250 as being included together on a single machine (e.g., executing on the media server machine 110 or executing on the media device 140), such components may be distributed between the media server machine 110 and the media device 140, according to various hybrid example embodiments.
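  • One way to picture this arrangement is to model the five components as plain classes composed into a single application object, as sketched below; the class and method names are illustrative assumptions rather than the actual module interfaces, and the same composition could live server-side, client-side, or be split between the two.

```python
# Hypothetical sketch of the component arrangement in FIGS. 2 and 3.
# Class and method names are illustrative, not the actual module interfaces.

class DisplayController:
    def show_menu(self, menu): ...            # controls what the display presents

class ControllerInterface:
    def access_manipulation_data(self): ...   # reads data from a controller device

class ProfileSelector:
    def select_profile_id(self, manipulation_data): ...

class MenuModifier:
    def select_hidden_subset(self, menu, profile_id): ...

class DeviceDetector:
    def nearby_user_devices(self): ...        # detects user devices in range

class Application:
    """Stands in for the server-side application 200 or the client-side app
    300; hybrid embodiments may split these components across machines."""
    def __init__(self):
        self.display_controller = DisplayController()
        self.controller_interface = ControllerInterface()
        self.profile_selector = ProfileSelector()
        self.menu_modifier = MenuModifier()
        self.device_detector = DeviceDetector()
```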
  • FIG. 4 is a block diagram illustrating components of the controller device 141, according to some example embodiments. The controller device 141 is shown as including control elements 410, 420, 430, and 440, as well as an accelerometer 450 and the device detector 250 (e.g., an instance thereof). The control elements 410, 420, 430, and 440 each may be or include an activatable control (e.g., a software or hardware button, switch, slider, dial, knob, or other manipulable user interface element) that can be operated by a user (e.g., the user 132 or 152). For example, the control element 410 may be a graphical channel change button (e.g., channel up button or channel down button) within a graphical user interface 402; the control element 420 may be a graphical volume change button (e.g., volume up button or volume down button) within the graphical user interface 402; the control element 430 may be a hardware channel change button (e.g., channel up button or channel down button) on the exterior of the controller device 141; and the control element 440 may be a hardware volume change button (e.g., volume up button or volume down button) on the exterior of the controller device 141.
  • As shown in FIG. 4, the control element 410, the control element 420, or both, may form all or part of the graphical user interface 402. Moreover, the control element 410, the control element 420, the device detector 250, or any suitable combination thereof may form all or part of an app 400 (e.g., a mobile app or other client-side app) that is stored (e.g., installed) on the controller device 141 (e.g., in response to or otherwise as a result of data being received from the media server machine 110, the database 115, the media device 140, the user device 130, the user device 150, or another data repository). The accelerometer 450 may be or include a set of acceleration sensors (e.g., one or more hardware accelerometers).
  • Any one or more of the components (e.g., modules) described herein (e.g., with respect to any of FIGS. 2-4) may be implemented using hardware alone (e.g., one or more of the processors 299) or a combination of hardware and software. For example, any component described herein may physically include an arrangement of one or more of the processors 299 (e.g., a subset of or among the processors 299) configured to perform the operations described herein for that component. As another example, any component described herein may include software, hardware, or both, that configure an arrangement of one or more of the processors 299 to perform the operations described herein for that component. Accordingly, different components described herein may include and configure different arrangements of the processors 299 at different points in time or a single arrangement of the processors 299 at different points in time. Each component (e.g., module) described herein is an example of a means for performing the operations described herein for that component. Moreover, any two or more components described herein may be combined into a single component, and the functions described herein for a single component may be subdivided among multiple components. Furthermore, according to various example embodiments, components described herein as being implemented within a single system or machine (e.g., a single device) may be distributed across multiple systems or machines (e.g., multiple devices).
  • FIGS. 5-7 are flowcharts illustrating operations in a method 500 of menu modification based on controller manipulation data, according to some example embodiments. Operations in the method 500 may be performed by the media server machine 110, the media device 140, or any suitable combination thereof, using components (e.g., modules) described above (e.g., with respect to at least FIGS. 2 and 3), using one or more processors (e.g., microprocessors or other hardware processors), or using any suitable combination thereof. As shown in FIG. 5, the method 500 includes operations 510, 520, 530, 540, and 550.
  • In operation 510, the display controller 210 causes the media device 140 to display at least a portion of a menu of menu items (e.g., within a graphical user interface presented on a display screen fully or partially controlled by the media device 140). For example, the media device 140 may be caused to display a menu of media streams that are each individually selectable (e.g., for playback by the media device 140).
  • In operation 520, the controller interface 220 accesses controller manipulation data generated by the controller device 141. The controller manipulation data may be stored by the controller device 141 itself, the database 115, the media server machine 110, or any suitable combination thereof, and accessed therefrom. The controller manipulation data indicates a sequence of multiple physical manipulations (e.g., translational or angular movements) experienced by the controller device 141 during operation of the controller device 141 (e.g., in selecting one or more menu items, such as media streams selected for playback by the media device 140).
  • In operation 530, the profile selector 230 selects (e.g., indicates, designates, specifies, or otherwise denotes as being selected) a profile identifier from a set of profile identifiers. The set of profile identifiers may be stored by the controller device 141, the database 115, the media server machine 110, the media device 140, or any suitable combination thereof, and accessed therefrom. The profile selector 230 may therefore access the set of profile identifiers and accordingly select a profile identifier from among them. The selection of the profile identifier may be based on one or more comparisons of the controller manipulation data accessed in operation 520 to one or more controller manipulation profiles. In particular, the profile identifier may be selected based on a comparison of the controller manipulation data accessed in operation 520 to a controller manipulation profile (e.g., a reference controller manipulation profile) that corresponds to the profile identifier ultimately selected.
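  • As a concrete, purely illustrative example of such a comparison, the sketch below scores each stored controller manipulation profile against the accessed data by combining several signals discussed below (activation frequency, activation pattern, and accelerometer samples) into one distance and picking the closest profile identifier. The field names, weights, and combination rule are assumptions, not the disclosed algorithm.

```python
# Hypothetical scoring of stored controller manipulation profiles against
# freshly accessed controller manipulation data. Field names, weights, and
# the combination rule are illustrative assumptions only.

from dataclasses import dataclass, field

@dataclass
class ManipulationProfile:
    activation_freq: float    # presses per second for one control element
    activation_pattern: list  # ordered control-element identifiers
    accel_sequence: list = field(default_factory=list)  # accelerometer samples

def profile_distance(recorded: ManipulationProfile,
                     reference: ManipulationProfile) -> float:
    freq_term = abs(recorded.activation_freq - reference.activation_freq)
    pattern_term = sum(a != b for a, b in zip(recorded.activation_pattern,
                                              reference.activation_pattern))
    accel_term = sum((r - s) ** 2 for r, s in zip(recorded.accel_sequence,
                                                  reference.accel_sequence))
    # Arbitrary example weights; a real system would tune or learn these.
    return 1.0 * freq_term + 0.5 * pattern_term + 0.1 * accel_term

def select_profile_identifier(recorded, profiles_by_id):
    """Return the identifier of the closest-matching stored profile."""
    return min(profiles_by_id,
               key=lambda pid: profile_distance(recorded, profiles_by_id[pid]))

# Example usage with made-up identifiers and values.
rec = ManipulationProfile(2.0, ["ch_up", "vol_up"], [0.1, 0.2])
refs = {"a": ManipulationProfile(2.1, ["ch_up", "vol_up"], [0.1, 0.3]),
        "b": ManipulationProfile(0.5, ["vol_down"], [0.9, 0.9])}
print(select_profile_identifier(rec, refs))  # -> "a"
```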
  • In operation 540, the menu modifier 240 selects a first subset of the menu of menu items (e.g., the menu of media streams), and the selection of the first subset may be based on the profile identifier selected in operation 530. As noted above, the selected first subset indicates menu items (e.g., media streams) to be hidden from view (e.g., omitted from presentation by the media device 140). Furthermore, as also noted above, the selected first subset has no overlap with a second subset of the menu of menu items, since the second subset is to be preserved in view (e.g., by continuing to be presented by the media device 140).
  • In operation 550, the display controller 210 causes the media device 140 to modify the presented menu of menu items (e.g., the menu of media streams) by omitting the first subset of the menu (e.g., as selected in operation 540). In tandem with this omission, the display controller 210 causes the media device 140 to continue displaying the second subset of the menu, which second subset has no overlap with the first subset. Thus, the menu becomes modified by omission of the first subset while continuing to display the second subset in the displayed portion of the menu.
  • As shown in FIG. 6, in addition to any one or more of the operations previously described, the method 500 may include one or more of operations 630, 632, 634, 636, 637, 638, and 639, according to various example embodiments. Any one or more of operations 630, 632, 634, 636, 637, 638, and 639 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 530, in which the profile selector 230 selects the profile identifier based on the controller manipulation data.
  • In example embodiments that include operation 630, the controller manipulation data includes one or more frequencies of activation for one or more individual control elements (e.g., control element 410, 420, 430, or 440). That is, the controller manipulation data may include a first activation frequency at which a first control element (e.g., control element 410) of the controller device 141 was activated during a sequence of activations of that first control element (e.g., repeated mashing of a channel change button or a volume change button). Accordingly, in operation 630, the profile selector 230 compares a recorded (e.g., first) activation frequency of a single control element (e.g., control element 410, as recorded in the controller manipulation data) to a reference (e.g., second) activation frequency for the same single control element (e.g., control element 410, as represented in a controller manipulation profile).
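  • A minimal sketch of such a single-element frequency comparison, assuming hypothetical names, timestamped presses, and an arbitrary tolerance, might look like this:

```python
# Hypothetical frequency comparison for a single control element.
# Timestamps are in seconds; the tolerance is an arbitrary example value.

def activation_frequency(timestamps):
    """Mean activations per second over a sequence of button presses."""
    if len(timestamps) < 2:
        return 0.0
    return (len(timestamps) - 1) / (timestamps[-1] - timestamps[0])

def frequencies_match(recorded_timestamps, reference_freq, tolerance=0.5):
    """True if the recorded activation frequency is near the reference one."""
    return abs(activation_frequency(recorded_timestamps) - reference_freq) <= tolerance

# Example: a channel-change button pressed five times in two seconds.
presses = [0.0, 0.5, 1.0, 1.5, 2.0]
print(frequencies_match(presses, reference_freq=2.0))  # -> True
```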
  • In example embodiments that include operation 632, the controller manipulation data includes patterns of activation for multiple control elements (e.g., control elements 410, 420, and 410 again, in that order). That is, the controller manipulation data may include a first activation pattern (e.g., control element 410, then control element 420, and then control element 410 again) according to which multiple control elements (e.g., control elements 410 and 420) were activated during operation of those control elements in accordance with the first activation pattern. Accordingly, in operation 632, the profile selector 230 compares a recorded (e.g., first) activation pattern of these control elements (e.g., control elements 410 and 420, as recorded in the controller manipulation data) to a reference (e.g., second) activation pattern for the same control elements (e.g., control elements 410 and 420, as represented in a controller manipulation profile).
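  • A sketch of such a pattern comparison, assuming each activation is encoded as an illustrative control-element identifier and using a toy similarity threshold, might look like this:

```python
# Hypothetical activation-pattern comparison. Control elements are represented
# by illustrative string identifiers; the similarity measure is a toy choice.

from difflib import SequenceMatcher

recorded_pattern = ["ch_up", "vol_up", "ch_up"]   # first activation pattern
reference_pattern = ["ch_up", "vol_up", "ch_up"]  # stored in a profile

similarity = SequenceMatcher(None, recorded_pattern, reference_pattern).ratio()
print(similarity >= 0.8)  # treat patterns as matching above a chosen threshold
```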
  • In example embodiments that include operation 634, the controller manipulation data includes accelerometer data that indicates a series of accelerations experienced by the controller device 141. That is, the controller manipulation data may include a first sequence of accelerometer data (e.g., generated by the accelerometer 450) that indicates sequential accelerations experienced by the controller device 141 during operation of the controller device 141 (e.g., during activations of one or more control elements, such as the control element 410, either singly or in combination with other control elements, such as the control element 420). Accordingly, in operation 634, the profile selector 230 compares a recorded (e.g., first) sequence of accelerometer data (e.g., as recorded in the controller manipulation data) to a reference (e.g., second) sequence of accelerometer data (e.g., as represented in a controller manipulation profile).
  • For example, the recorded sequence of accelerometer data may indicate a series of movements in which the controller device 141 moves in an upward-rising rightward arc (e.g., as a result of being raised to point at the media device 140 by a right-handed user) and then tilts forward slightly (e.g., 2-5 degrees as a result of the control element 410 being activated), and the reference sequence of accelerometer data may indicate a similar upward-rising rightward arc followed by a similarly slight forward tilt. As another example, the recorded sequence of accelerometer data may indicate a series of movements in which the controller device 141 moves in a descending rightward diagonal path with a 10-degree clockwise twist (e.g., as a result of being lowered to point at the media device 140 by a left-handed user), and the reference sequence of accelerometer data may indicate a similar descending rightward diagonal path with a similar (e.g., 8-degree) clockwise twist.
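  • One plausible reduction of operation 634 is a pointwise distance between resampled acceleration traces, as sketched below; dynamic time warping or another trajectory metric could be substituted, and the toy data merely echoes the arc example above.

```python
import math

def sequence_distance(recorded, reference):
    """Mean Euclidean distance between two equal-length sequences of
    (x, y, z) acceleration samples; lower means more similar. Both
    sequences are assumed to have been resampled to a common length."""
    assert len(recorded) == len(reference) and recorded
    total = sum(math.dist(r, s) for r, s in zip(recorded, reference))
    return total / len(recorded)

# Toy traces: a recorded rising rightward arc versus a stored reference arc.
recorded = [(0.1, 0.9, 0.0), (0.3, 0.7, 0.0), (0.5, 0.4, 0.1)]
reference = [(0.1, 0.8, 0.0), (0.3, 0.6, 0.0), (0.5, 0.5, 0.1)]
print(sequence_distance(recorded, reference))  # small value, likely a match
```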
  • As shown in FIG. 6, one or more of operations 636, 637, 638, and 639 may be performed as part of operation 634, in which the profile selector 230 compares the recorded sequence of accelerometer data to the reference sequence of accelerometer data. In operation 636, the profile selector 230 compares a recorded sequence of motion data to a reference sequence of motion data, where each recorded or reference sequence represents translational motions of the controller device 141 (e.g., movements from one three-dimensional spatial location to another three-dimensional spatial location, such as upward movements, downward movements, leftward movements, rightward movements, forward movements, backward movements, or any suitable combination thereof).
  • Operation 637 may be performed as part of operation 636. In operation 637, the profile selector 230 compares a handedness of the recorded sequence with a handedness of the reference sequence. For example, the recorded sequence may indicate that the user (e.g., user 132) who manipulated the controller device 141 during the recorded sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the translational motions of the controller device 141, as indicated by the recorded sequence of accelerometer data); the reference sequence may indicate that the user (e.g., user 132 or 152) who manipulated the controller device 141 during the reference sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the translational motions of the controller device 141, as indicated by the reference sequence of accelerometer data); and the profile selector 230 may compare the right-handedness of the recorded sequence to the right-handedness of the reference sequence. As another example, if the reference sequence indicates that the user (e.g., user 152) who manipulated the controller device 141 during the reference sequence used his or her left hand or otherwise is inferred to be a left-handed user, the profile selector 230 may compare the right-handedness of the recorded sequence to the left-handedness of the reference sequence.
  • In some example embodiments, the handedness of the reference sequence of motion data is known from previous user input (e.g., a user preference or other user profile information corresponding to the user 132 or 152). Accordingly, the comparison of the handedness of the recorded sequence of motion data to the handedness of the reference sequence of motion data enables the handedness of the recorded sequence to be inferred from the degree to which the handedness of the recorded sequence is similar to, or different from, the handedness of the reference sequence.
  • In operation 638, the profile selector 230 compares a recorded sequence of orientation data to a reference sequence of orientation data, where each recorded or reference sequence represents rotational motions of the controller device 141 (e.g., movements from one angular orientation to another angular orientation, such as upward pitch, downward pitch, leftward yaw, rightward yaw, leftward roll, rightward roll, or any suitable combination thereof).
  • Operation 639 may be performed as part of operation 638. In operation 639, the profile selector 230 compares a handedness of the recorded sequence with a handedness of the reference sequence. For example, the recorded sequence may indicate that the user (e.g., user 132) who manipulated the controller device 141 during the recorded sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the rotational motions of the controller device 141, as indicated by the recorded sequence of accelerometer data); the reference sequence may indicate that the user (e.g., user 132 or 152) who manipulated the controller device 141 during the reference sequence used his or her right hand or otherwise is inferred to be a right-handed user (e.g., based on the rotational motions of the controller device 141, as indicated by the reference sequence of accelerometer data); and the profile selector 230 may compare the right-handedness of the recorded sequence to the right-handedness of the reference sequence. As another example, if the reference sequence indicates that the user (e.g., user 152) who manipulated the controller device 141 during the reference sequence used his or her left hand or otherwise is inferred to be a left-handed user, the profile selector 230 may compare the right-handedness of the recorded sequence to the left-handedness of the reference sequence.
  • In some example embodiments, the handedness of the reference sequence of orientation data is known from previous user input (e.g., a user preference or other user profile information corresponding to the user 132 or 152). Accordingly, the comparison of the handedness of the recorded sequence of orientation data to the handedness of the reference sequence of orientation data enables the handedness of the recorded sequence to be inferred from the degree to which the handedness of the recorded sequence is similar to, or different from, the handedness of the reference sequence.
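  • As a concrete (and deliberately crude) illustration of the handedness comparisons in operations 637 and 639, the sign of the net lateral drift and net roll can serve as a proxy for right- versus left-handed manipulation. The heuristic, thresholds, and names below are assumptions of this sketch, not the disclosed inference.

```python
def infer_handedness(lateral_velocities, roll_rates, threshold=0.05):
    """Crude handedness estimate: positive mean lateral drift (rightward)
    or positive mean roll rate (clockwise) suggests a right hand; negative
    values suggest a left hand. Units: m/s and rad/s per sample."""
    drift = sum(lateral_velocities) / max(len(lateral_velocities), 1)
    roll = sum(roll_rates) / max(len(roll_rates), 1)
    score = drift + roll
    if abs(score) < threshold:
        return "unknown"
    return "right" if score > 0 else "left"

def handedness_matches(recorded_hand, reference_hand):
    """Operations 637/639 in miniature: compare the handedness inferred
    from the recorded sequence against that of the reference sequence."""
    return recorded_hand != "unknown" and recorded_hand == reference_hand

recorded = infer_handedness([0.12, 0.10, 0.08], [0.17, 0.15, 0.14])
print(recorded)                               # right
print(handedness_matches(recorded, "right"))  # True
```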
  • As shown in FIG. 7, in addition to any one or more of the operations previously described, the method 500 may include one or more of operations 720, 721, 730, 732, 734, 740, and 742, according to various example embodiments. Operations 720 and 730 may be implemented in the example embodiments that support a device detection feature, in which detection of a nearby device (e.g., user device 130 or user device 150) forms a basis for the profile selector 230 in operation 530 to select the profile identifier based on the controller manipulation data.
  • In operation 720, the device detector 250 detects that a user device (e.g., the user device 130 that corresponds to the user 132) is near the media device 140 (e.g., in physical proximity to the media device 140 in comparison to a threshold distance, such as the user device 130 being able to wirelessly communicate with the media device 140 with a minimum threshold signal strength), near the controller device 141 (e.g., in physical proximity to the controller device 141 in comparison to a threshold distance, such as the user device 130 being able to wirelessly communicate with the controller device 141 with a minimum threshold signal strength), or any suitable combination thereof. The detected nearness of the user device (e.g., user device 130) to the media device 140, the controller device 141, or both, may be a basis for inferring that the corresponding user (e.g., user 132) is the user who is manipulating the controller device 141. Accordingly, the device detector 250 may provide the results of this detection as input to the profile selector 230 to aid in the performance of operation 530, in which the profile identifier is selected by the profile selector 230.
  • Operation 721 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 720. In operation 721, as part of detecting the user device (e.g., user device 130), the device detector 250 detects that the user device is joining a wireless network (e.g., a Wi-Fi local area network or a Bluetooth personal area network) that the controller device 141 has also joined. Accordingly, the device detector 250 may provide the results of this detection as input to the profile selector 230, as discussed above.
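  • A minimal sketch of operations 720 and 721, assuming the device detector can enumerate devices on the controller's wireless network along with a received signal strength (RSSI) per device; the data shape and the -60 dBm floor are hypothetical.

```python
def detect_nearby_user_devices(network_devices, controller_network_id,
                               min_rssi_dbm=-60):
    """Return identifiers of devices considered 'near': on the same
    wireless network as the controller device and at or above a
    signal-strength floor. network_devices maps
    device_id -> (network_id, rssi_dbm)."""
    return [
        device_id
        for device_id, (network_id, rssi) in network_devices.items()
        if network_id == controller_network_id and rssi >= min_rssi_dbm
    ]

devices = {
    "phone-132": ("home-wifi", -48),     # same network, strong signal
    "tablet-152": ("home-wifi", -75),    # same network, too weak
    "laptop-999": ("office-wifi", -40),  # different network
}
print(detect_nearby_user_devices(devices, "home-wifi"))  # ['phone-132']
```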
  • In example embodiments that include operation 720, operation 730 may be performed as part (e.g., a precursor task, a subroutine, or a portion) of operation 530, in which the profile selector 230 selects the profile identifier based on the controller manipulation data. As noted above, if the device detector 250 has detected (e.g., in operation 720) that the user device (e.g., user device 130) is near the controller device 141, the result of this detection may be communicated by the device detector 250 to the profile selector 230 and may cause the profile selector 230 to select the profile identifier (e.g., of a known user, such as the user 132) based on the detected user device. In some example embodiments, the result of this detection may be or include a device identifier (e.g., a name of the device, a network address of the device, or any suitable combination thereof). Accordingly, in operation 730, the profile identifier is selected based on such a device identifier (e.g., communicated by the device detector 250 to the profile selector 230). For example, the profile selector 230 may select a profile identifier that corresponds to the user (e.g., user 132) who corresponds to the detected user device (e.g., user device 130) by virtue of corresponding to a device identifier thereof (e.g., a device name or network address that identifies the user device 130).
  • According to some example embodiments, the performance of operation 530 by the profile selector 230 results in selection of an anonymous profile identifier that corresponds to an anonymous set of conditions (e.g., indicated by or inferred from the controller manipulation data), that is, a set of conditions not linked or otherwise associated with any known user (e.g., user 132). For example, the user 152 may be physically manipulating the controller device 141 for the very first time, and the corresponding set of conditions may be treated as an anonymous user profile that is identified by an anonymous profile identifier. Accordingly, the profile selector 230 may perform operation 732 by selecting a profile identifier that corresponds to a specific, but anonymous, set of conditions. As an example, the profile selector 230 may generate or otherwise obtain a new profile identifier, a new user profile, or both. In some example embodiments, the profile selector 230 may select or otherwise apply a default template in generating or otherwise obtaining the new profile identifier, the new user profile, or both. According to various example embodiments, multiple anonymous profiles (e.g., multiple distinct anonymous sets of conditions) may be supported and differentiated by the profile selector 230.
  • As mentioned above, according to some alternative example embodiments, the performance of operation 530 by the profile selector 230 results in selection of a known profile identifier, that is, a profile identifier that corresponds to a known, previously profiled, or otherwise non-anonymous user (e.g., user 132). For example, a set of one or more pre-existing profile identifiers (e.g., with or without corresponding pre-existing user profiles) may be stored by the media server machine 110, the database 115, the media device 140, the controller device 141, a user device that was detected in operation 720 (e.g., user device 130), or any suitable combination thereof, and the profile selector 230 may accordingly perform operation 734 by selecting one of these pre-existing profile identifiers (e.g., with or without selecting a corresponding pre-existing user profile).
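  • The branch between operations 732 and 734 might look like the following sketch, in which a pre-existing profile identifier is preferred when one is keyed to the detected device, and an anonymous profile is otherwise minted from a default template; every name here is illustrative.

```python
import uuid

def select_profile_id(detected_device_id, known_profiles, default_template):
    """Return (profile_id, profile). A pre-existing profile identifier is
    selected when one is keyed to the detected device (operation 734);
    otherwise an anonymous profile is created from a default template
    (operation 732)."""
    if detected_device_id in known_profiles:
        profile_id = known_profiles[detected_device_id]
        return profile_id, {"id": profile_id, "anonymous": False}
    anon_id = "anon-" + uuid.uuid4().hex[:8]  # fresh anonymous identifier
    profile = dict(default_template, id=anon_id, anonymous=True)
    return anon_id, profile

known = {"phone-132": "profile-user-132"}
template = {"hidden_streams": [], "pinned_streams": []}
print(select_profile_id("phone-132", known, template)[0])   # profile-user-132
print(select_profile_id("tablet-new", known, template)[1]["anonymous"])  # True
```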
  • One or both of operations 740 and 742 may be performed as part of operation 540, in which the menu modifier 240 selects the first subset of the menu of menu items (e.g., the menu of media streams), which selection may be based on the profile identifier selected in operation 530. In operation 740, the menu modifier 240 accesses user preferences for the selected profile identifier. Such user preferences may be stored in a user profile, which may be included in a set of one or more pre-existing user profiles that are stored or otherwise made available by the media server machine 110, the database 115, the media device 140, the controller device 141, a user device that was detected in operation 720 (e.g., user device 130), or any suitable combination thereof.
  • In operation 742, the user preferences accessed in operation 740 are used as a basis for selecting the first subset of the menu, in accordance with operation 540. Accordingly, in operation 742, the menu modifier 240 selects at least part of the first subset based on the user preferences. For example, the user preferences may list, identify, specify, or otherwise indicate one or more menu items to omit (e.g., media streams to omit from view), one or more menu items to preserve (e.g., media streams to maintain in view), or any suitable combination thereof.
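  • Operations 740 and 742 (feeding the omission performed in operation 550) can be pictured as a filter over the menu, as in the sketch below; the preference keys "hide" and "keep" are assumptions of this example.

```python
def modify_menu(menu_items, preferences):
    """Split a menu into the first subset (streams to omit) and the
    second subset (streams to keep displaying). Items the preferences
    explicitly keep are never omitted, so the subsets never overlap."""
    hide = set(preferences.get("hide", [])) - set(preferences.get("keep", []))
    first_subset = [item for item in menu_items if item in hide]
    second_subset = [item for item in menu_items if item not in hide]
    return first_subset, second_subset

menu = ["news", "sports", "kids", "movies"]
prefs = {"hide": ["kids", "sports"], "keep": ["sports"]}  # keep wins over hide
omitted, displayed = modify_menu(menu, prefs)
print(omitted)    # ['kids']
print(displayed)  # ['news', 'sports', 'movies']
```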
  • Thus, the menu of menu items (e.g., the menu of media streams) can be modified by the menu modifier 240 in accordance with user preferences that correspond to a profile identifier that was selected based on the controller manipulation data accessed in operation 520. As noted above, such a modified menu can be presented or caused to be presented by the media device 140. This may have the effect of customizing, personalizing, or otherwise targeting the modification of the menu of menu items specifically for a user (e.g., user 132 or 152) without necessarily making an affirmative identification of the user. That is, the controller manipulation data can be sufficient to distinguish, disambiguate, or otherwise indicate different human users of the controller device 141, whether the human users are known or unknown (e.g., anonymous) to the media server machine 110, the database 115, the media device 140, the controller device 141, or any suitable combination thereof.
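  • Tying the pieces together, a hypothetical end-to-end pass over the method might read as follows; the profile matching is reduced to a nearest-reference-sequence lookup and the device API is a stand-in, so none of this should be read as the patented implementation.

```python
import math

def run_menu_modification(menu, manipulation_data, profiles, display_menu):
    """End-to-end sketch: select the profile whose reference accelerometer
    sequence is closest to the recorded one, then omit that profile's
    hidden streams from the displayed menu."""
    def distance(a, b):  # mean Euclidean distance over equal-length traces
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

    recorded = manipulation_data["accel_sequence"]
    best_id = min(profiles,
                  key=lambda pid: distance(recorded, profiles[pid]["reference"]))
    hidden = set(profiles[best_id]["preferences"].get("hide", []))
    first_subset = [m for m in menu if m in hidden]       # omitted streams
    second_subset = [m for m in menu if m not in hidden]  # still displayed
    display_menu(second_subset)  # stand-in for causing the media device to render
    return best_id, first_subset

profiles = {
    "profile-user-132": {"reference": [(0.1, 0.9, 0.0), (0.5, 0.4, 0.1)],
                         "preferences": {"hide": ["kids"]}},
    "profile-user-152": {"reference": [(-0.1, 0.8, 0.0), (-0.5, 0.3, 0.1)],
                         "preferences": {"hide": ["news"]}},
}
data = {"accel_sequence": [(0.12, 0.88, 0.0), (0.48, 0.42, 0.1)]}
pid, omitted = run_menu_modification(["news", "kids", "movies"], data,
                                     profiles, print)
print(pid, omitted)  # profile-user-132 ['kids']
```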
  • According to various example embodiments, one or more of the methodologies described herein may facilitate menu modification based on controller manipulation data. Moreover, one or more of the methodologies described herein may facilitate modifying a menu of menu items in accordance with one or more user preferences that correspond to a profile identifier selected based on such controller manipulation data. Hence, one or more of the methodologies described herein may facilitate customization, personalization, or other targeting of menu modifications, as well as improved experiences with a graphical user interface that contains a menu modified according to the systems and methods described herein, compared to capabilities of pre-existing systems and methods.
  • When these effects are considered in aggregate, one or more of the methodologies described herein may obviate a need for certain efforts or resources that otherwise would be involved in menu modification. Efforts expended by a user (e.g., user 132 or 152) in customizing or personalizing a menu of menu items may be reduced by use of (e.g., reliance upon) a special-purpose machine that implements one or more of the methodologies described herein. Computing resources used by one or more systems or machines (e.g., within the network environment 100) may similarly be reduced (e.g., compared to systems or machines that lack the structures discussed herein or are otherwise unable to perform the functions discussed herein). Examples of such computing resources include processor cycles, network traffic, computational capacity, main memory usage, graphics rendering capacity, graphics memory usage, data storage capacity, power consumption, and cooling capacity.
  • FIG. 8 is a block diagram illustrating components of a machine 800, according to some example embodiments, able to read instructions 824 from a machine-readable medium 822 (e.g., a non-transitory machine-readable medium, a machine-readable storage medium, a computer-readable storage medium, or any suitable combination thereof) and perform any one or more of the methodologies discussed herein, in whole or in part. Specifically, FIG. 8 shows the machine 800 in the example form of a computer system (e.g., a computer) within which the instructions 824 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 800 to perform any one or more of the methodologies discussed herein may be executed, in whole or in part.
  • In alternative embodiments, the machine 800 may operate as a standalone device or may be communicatively coupled (e.g., networked) to other machines. In a networked deployment, the machine 800 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a distributed (e.g., peer-to-peer) network environment. The machine 800 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a cellular telephone, a smart phone, a set-top box (STB), a personal digital assistant (PDA), a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 824, sequentially or otherwise, that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute the instructions 824 to perform all or part of any one or more of the methodologies discussed herein.
  • The machine 800 includes a processor 802 (e.g., one or more central processing units (CPUs), one or more graphics processing units (GPUs), one or more digital signal processors (DSPs), one or more application specific integrated circuits (ASICs), one or more radio-frequency integrated circuits (RFICs), or any suitable combination thereof), a main memory 804, and a static memory 806, which are configured to communicate with each other via a bus 808. The processor 802 contains solid-state digital microcircuits (e.g., electronic, optical, or both) that are configurable, temporarily or permanently, by some or all of the instructions 824 such that the processor 802 is configurable to perform any one or more of the methodologies described herein, in whole or in part. For example, a set of one or more microcircuits of the processor 802 may be configurable to execute one or more modules (e.g., software modules) described herein. In some example embodiments, the processor 802 is a multicore CPU (e.g., a dual-core CPU, a quad-core CPU, an 8-core CPU, or a 128-core CPU) within which each of multiple cores behaves as a separate processor that is able to perform any one or more of the methodologies discussed herein, in whole or in part. Although the beneficial effects described herein may be provided by the machine 800 with at least the processor 802, these same beneficial effects may be provided by a different kind of machine that contains no processors (e.g., a purely mechanical system, a purely hydraulic system, or a hybrid mechanical-hydraulic system), if such a processor-less machine is configured to perform one or more of the methodologies described herein.
  • The machine 800 may further include a graphics display 810 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, a cathode ray tube (CRT), or any other display capable of displaying graphics or video). The machine 800 may also include an alphanumeric input device 812 (e.g., a keyboard or keypad), a pointer input device 814 (e.g., a mouse, a touchpad, a touchscreen, a trackball, a joystick, a stylus, a motion sensor, an eye tracking device, a data glove, or other pointing instrument), a data storage 816, an audio generation device 818 (e.g., a sound card, an amplifier, a speaker, a headphone jack, or any suitable combination thereof), and a network interface device 820.
  • The data storage 816 (e.g., a data storage device) includes the machine-readable medium 822 (e.g., a tangible and non-transitory machine-readable storage medium) on which are stored the instructions 824 embodying any one or more of the methodologies or functions described herein. The instructions 824 may also reside, completely or at least partially, within the main memory 804, within the static memory 806, within the processor 802 (e.g., within the processor's cache memory), or any suitable combination thereof, before or during execution thereof by the machine 800. Accordingly, the main memory 804, the static memory 806, and the processor 802 may be considered machine-readable media (e.g., tangible and non-transitory machine-readable media). The instructions 824 may be transmitted or received over the network 190 via the network interface device 820. For example, the network interface device 820 may communicate the instructions 824 using any one or more transfer protocols (e.g., hypertext transfer protocol (HTTP)).
  • In some example embodiments, the machine 800 may be a portable computing device (e.g., a smart phone, a tablet computer, or a wearable device), and may have one or more additional input components 830 (e.g., sensors or gauges). Examples of such input components 830 include an image input component (e.g., one or more cameras), an audio input component (e.g., one or more microphones), a direction input component (e.g., a compass), a location input component (e.g., a global positioning system (GPS) receiver), an orientation component (e.g., a gyroscope), a motion detection component (e.g., one or more accelerometers), an altitude detection component (e.g., an altimeter), and a biometric input component (e.g., a heartrate detector or a blood pressure detector). Input data gathered by any one or more of these input components may be accessible and available for use by any of the modules described herein.
  • As used herein, the term “memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 822 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing the instructions 824 for execution by the machine 800, such that the instructions 824, when executed by one or more processors of the machine 800 (e.g., processor 802), cause the machine 800 to perform any one or more of the methodologies described herein, in whole or in part. Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as cloud-based storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more tangible and non-transitory data repositories (e.g., data volumes) in the example form of a solid-state memory chip, an optical disc, a magnetic disc, or any suitable combination thereof. A “non-transitory” machine-readable medium, as used herein, specifically does not include propagating signals per se. In some example embodiments, the instructions 824 for execution by the machine 800 may be communicated by a carrier medium. Examples of such a carrier medium include a storage medium (e.g., a non-transitory machine-readable storage medium, such as a solid-state memory, being physically moved from one place to another place) and a transient medium (e.g., a propagating signal that communicates the instructions 824).
  • Certain example embodiments are described herein as including modules. Modules may constitute software modules (e.g., code stored or otherwise embodied in a machine-readable medium or in a transmission medium), hardware modules, or any suitable combination thereof. A “hardware module” is a tangible (e.g., non-transitory) physical component (e.g., a set of one or more processors) capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems or one or more hardware modules thereof may be configured by software (e.g., an application or portion thereof) as a hardware module that operates to perform operations described herein for that module.
  • In some example embodiments, a hardware module may be implemented mechanically, electronically, hydraulically, or any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. A hardware module may be or include a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC. A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. As an example, a hardware module may include software encompassed within a CPU or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, hydraulically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity that may be physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Furthermore, as used herein, the phrase “hardware-implemented module” refers to a hardware module. Considering example embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module includes a CPU configured by software to become a special-purpose processor, the CPU may be configured as respectively different special-purpose processors (e.g., each included in a different hardware module) at different times. Software (e.g., a software module) may accordingly configure one or more processors, for example, to become or otherwise constitute a particular hardware module at one instance of time and to become or otherwise constitute a different hardware module at a different instance of time.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory (e.g., a memory device) to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information from a computing resource).
  • The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module in which the hardware includes one or more processors. Accordingly, the operations described herein may be at least partially processor-implemented, hardware-implemented, or both, since a processor is an example of hardware, and at least some operations within any one or more of the methods discussed herein may be performed by one or more processor-implemented modules, hardware-implemented modules, or any suitable combination thereof.
  • Moreover, such one or more processors may perform operations in a “cloud computing” environment or as a service (e.g., within a “software as a service” (SaaS) implementation). For example, at least some operations within any one or more of the methods discussed herein may be performed by a group of computers (e.g., as examples of machines that include processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)). The performance of certain operations may be distributed among the one or more processors, whether residing only within a single machine or deployed across a number of machines. In some example embodiments, the one or more processors or hardware modules (e.g., processor-implemented modules) may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or hardware modules may be distributed across a number of geographic locations.
  • Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and their functionality presented as separate components and functions in example configurations may be implemented as a combined structure or component with combined functions. Similarly, structures and functionality presented as a single component may be implemented as separate components and functions. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
  • Some portions of the subject matter discussed herein may be presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a memory (e.g., a computer memory or other machine memory). Such algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
  • Unless specifically stated otherwise, discussions herein using words such as “accessing,” “processing,” “detecting,” “computing,” “calculating,” “determining,” “generating,” “presenting,” “displaying,” or the like refer to actions or processes performable by a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. Furthermore, unless specifically stated otherwise, the terms “a” or “an” are herein used, as is common in patent documents, to include one or more than one instance. Finally, as used herein, the conjunction “or” refers to a non-exclusive “or,” unless specifically stated otherwise.
  • The following enumerated embodiments describe various example embodiments of methods, machine-readable media, and systems (e.g., machines, devices, or other apparatus) discussed herein.
  • A first embodiment provides a method comprising:
  • causing, by one or more processors of a machine, a media device to display at least a portion of a menu of media streams that are selectable for playback by the media device;
    accessing, by one or more processors of the machine, controller manipulation data generated by a controller device and indicating a sequence of physical manipulations experienced by the controller device during operation of the controller device in selecting one or more media streams from the menu of media streams for playback by the media device;
    selecting, by one or more processors of the machine, a profile identifier from a set of profile identifiers based on a comparison of the accessed controller manipulation data to a controller manipulation profile that corresponds to the profile identifier;
    selecting, by one or more processors of the machine, a first subset of the menu of media streams based on the selected profile identifier, the selected first subset indicating media streams to be hidden from view, the first subset having no overlap with a second subset of the menu of media streams; and
    causing, by one or more processors of the machine, the media device to modify the menu of media streams by omitting the first subset of the menu of media streams from the displayed portion of the menu while displaying the second subset of the menu of media streams in the displayed portion of the menu.
  • The controller manipulation data may include one or more frequencies at which one or more control elements of the controller device have been activated. Accordingly, a second embodiment provides a method according to the first embodiment, wherein:
  • the accessed controller manipulation data includes a first activation frequency at which a control element of the controller device was activated during a sequence of activations of the control element;
    the controller manipulation profile that corresponds to the profile identifier includes a second activation frequency for the control element; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation frequencies.
  • The controller manipulation data may include one or more patterns in which multiple control elements have been activated. Accordingly, a third embodiment provides a method according to the first embodiment or the second embodiment, wherein:
  • the accessed controller manipulation data includes a first activation pattern according to which multiple control elements of the controller device were activated during operation of the multiple control elements;
    the controller manipulation profile that corresponds to the profile identifier includes a second activation pattern for the multiple control elements; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation patterns.
  • The controller manipulation data may include acceleration data generated by one or more accelerometers as a result of movements by the controller device. Accordingly, a fourth embodiment provides a method according to any of the first through third embodiments, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the controller device;
    the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • In some situations, the acceleration data from such movements indicates whether the controller device was used by a right-handed person or a left-handed person. Accordingly, a fifth embodiment provides a method according to the fourth embodiment, wherein:
  • the sequential accelerations indicated by the first sequence of accelerometer data correspond to a handedness of a human user in physically manipulating the controller device; and
    in the compared controller manipulation profile, the second sequence of accelerometer data indicates the handedness.
  • The controller manipulation data may include orientation data generated by one or more accelerometers as a result of movements by the controller device. Accordingly, a sixth embodiment provides a method according to any of the first through fifth embodiments, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential orientations at which the controller device was held;
    the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • In some situations, the orientation data from such movements indicates whether the controller device was used by a right-handed person or a left-handed person. Accordingly, a seventh embodiment provides a method according to the sixth embodiment, wherein:
  • the sequential orientations indicated by the first sequence of accelerometer data correspond to a handedness of a human user in physically manipulating the controller device; and
    in the compared controller manipulation profile, the second sequence of accelerometer data indicates the handedness.
  • A device detection feature may additionally be implemented. Accordingly, an eighth embodiment provides a method according to any of the first through seventh embodiments, further comprising:
  • detecting that a user device is within a threshold range of the media device, the user device having a device identifier that corresponds to the profile identifier; and wherein
    the selecting of the profile identifier from the set of profile identifiers is further based on the user device being detected within the threshold range of the media device.
  • In some situations, the device detection is performed via a wireless network (e.g., Bluetooth or Wi-Fi). Accordingly, a ninth embodiment provides a method according to the eighth embodiment, wherein:
  • the detecting that the user device is within the threshold range of the media device includes detecting that the user device has joined a wireless network to which the media device belongs.
  • According to some implementations, the profile identifier is a basis for selecting, applying, or otherwise using a set of one or more user preferences for a known user of the controller device (e.g., a user who has previously used the controller device and whose explicit or implicit user preferences are stored by the controller device or other machine within the network-based system 105). Accordingly, a tenth embodiment provides a method according to any of the first through ninth embodiments, wherein:
  • the selecting of the first subset of the menu of media streams includes accessing user preferences that correspond to the selected profile identifier and selecting at least part of the first subset based on the accessed user preferences.
  • In some situations, the set of one or more user preferences specifies that a particular media stream is to be hidden. Accordingly, an eleventh embodiment provides a method according to the tenth embodiment, wherein:
  • the accessed user preferences that correspond to the selected profile identifier indicate at least one media stream in the first subset of the menu of media streams to be omitted from the modified menu.
  • In certain situations, the set of one or more user preferences specifies that a particular media stream is to be displayed. Accordingly, a twelfth embodiment provides a method according to the tenth embodiment or the eleventh embodiment, wherein:
  • the accessed user preferences that correspond to the selected profile identifier indicate at least one media stream in the second subset of the menu of media streams to be maintained in the modified menu.
  • According to certain implementations, the profile identifier does not correspond to any known user of the controller device (e.g., corresponds to an anonymous user or to a new user). Accordingly, a thirteenth embodiment provides a method according to any of the first through ninth embodiments, wherein:
  • the profile identifier identifies no user of the controller device and corresponds to an anonymous set of conditions under which the controller device was operated.
  • According to various implementations, the profile identifier is sufficient to identify a known user of the controller device (e.g., uniquely identifies the user among a set of known users). Accordingly, a fourteenth embodiment provides a method according to any of the first through twelfth embodiments, wherein:
  • the profile identifier identifies a human user of the controller device.
  • A fifteenth embodiment provides a machine-readable medium (e.g., a non-transitory machine-readable storage medium) comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
  • causing a media device to display at least a portion of a menu of media streams that are selectable for playback by the media device;
    accessing controller manipulation data generated by a controller device and indicating a sequence of physical manipulations experienced by the controller device during operation of the controller device in selecting one or more media streams from the menu of media streams for playback by the media device;
    selecting a profile identifier from a set of profile identifiers based on a comparison of the accessed controller manipulation data to a controller manipulation profile that corresponds to the profile identifier;
    selecting a first subset of the menu of media streams based on the selected profile identifier, the selected first subset indicating media streams to be hidden from view, the first subset having no overlap with a second subset of the menu of media streams; and
    causing the media device to modify the menu of media streams by omitting the first subset of the menu of media streams from the displayed portion of the menu while displaying the second subset of the menu of media streams in the displayed portion of the menu.
  • A sixteenth embodiment provides a machine-readable medium according to the fifteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first activation frequency at which a control element of the controller device was activated during a sequence of activations of the control element;
    the controller manipulation profile that corresponds to the profile identifier includes a second activation frequency for the control element; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation frequencies.
  • A seventeenth embodiment provides a machine-readable medium according to the fifteenth embodiment or the sixteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the controller device;
    the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • An eighteenth embodiment provides a system comprising: one or more processors; and
  • a memory storing instructions that, when executed by at least one processor among the one or more processors, cause the system to perform operations comprising:
    causing a media device to display at least a portion of a menu of media streams that are selectable for playback by the media device;
    accessing controller manipulation data generated by a controller device and indicating a sequence of physical manipulations experienced by the controller device during operation of the controller device in selecting one or more media streams from the menu of media streams for playback by the media device;
    selecting a profile identifier from a set of profile identifiers based on a comparison of the accessed controller manipulation data to a controller manipulation profile that corresponds to the profile identifier;
    selecting a first subset of the menu of media streams based on the selected profile identifier, the selected first subset indicating media streams to be hidden from view, the first subset having no overlap with a second subset of the menu of media streams; and
    causing the media device to modify the menu of media streams by omitting the first subset of the menu of media streams from the displayed portion of the menu while displaying the second subset of the menu of media streams in the displayed portion of the menu.
  • A nineteenth embodiment provides a system according to the eighteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first activation frequency at which a control element of the controller device was activated during a sequence of activations of the control element;
    the controller manipulation profile that corresponds to the profile identifier includes a second activation frequency for the control element; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second activation frequencies.
  • A twentieth embodiment provides a system according to the eighteenth embodiment or the nineteenth embodiment, wherein:
  • the accessed controller manipulation data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the controller device;
    the controller manipulation profile that corresponds to the profile identifier includes a second sequence of accelerometer data; and
    the comparison of the accessed controller manipulation data to the controller manipulation profile includes a comparison of the first and second sequences of accelerometer data.
  • A twenty-first embodiment provides a carrier medium carrying machine-readable instructions for controlling a machine to carry out the method of any one of the first through fourteenth embodiments.

Claims (21)

1. (canceled)
2. A method comprising:
accessing, by one or more processors, sequential movement data that indicates movements experienced by a remote device during selection of one or more media via the remote device;
selecting, by the one or more processors and based on the sequential movement data, a sequential movement profile among a set of sequential movement profiles that each indicate a different corresponding user;
determining, by the one or more processors and based on the selected sequential movement profile, a set of media to be omitted from a menu presented by a media device; and
causing, by the one or more processors, the menu presented by the media device to omit the set of media determined based on the sequential movement profile selected based on the sequential movement data.
3. The method of claim 2, wherein:
the remote device includes a remote controller of the media device, and
the sequential movement data indicates intentional movements of the remote device during intentional selection of the one or more media via the remote controller of the media device.
4. The method of claim 2, further comprising:
accessing a first activation frequency at which a control element of the remote device was activated during a sequence of activations of the control element during the selection of the one or more media; and wherein:
the sequential movement profile includes a second activation frequency for the control element; and
the selecting of the sequential movement profile is based on a comparison of the first and second activation frequencies for the control element.
5. The method of claim 2, further comprising:
accessing a first activation pattern according to which control elements of the remote device were activated during the selection of the one or more media; and wherein:
the sequential movement profile includes a second activation pattern for the control elements; and
the selecting of the sequential movement profile is based on a comparison of the first and second activation patterns for the control elements.
6. The method of claim 2, wherein:
the accessed sequential movement data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the remote device during the selection of the one or more media;
the sequential movement profile includes a second sequence of accelerometer data; and
the selecting of the sequential movement profile is based on a comparison of the first and second sequences of accelerometer data.
7. The method of claim 2, wherein:
the accessed sequential movement data includes a first sequence of accelerometer data that indicates sequential orientations at which the remote device was held during the selection of the one or more media;
the sequential movement profile includes a second sequence of accelerometer data; and
the selecting of the sequential movement profile is based on a comparison of the first and second sequences of accelerometer data.
8. The method of claim 2, further comprising:
detecting that a user device is within a threshold range of the media device, the user device corresponding to the sequential movement profile; and wherein:
the selecting of the sequential movement profile is based on the detecting of the user device within the threshold range of the media device.
9. A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing sequential movement data that indicates movements experienced by a remote device during selection of one or more media via the remote device;
based on the sequential movement data, selecting a sequential movement profile among a set of sequential movement profiles that each indicate a different corresponding user;
based on the selected sequential movement profile, determining a set of media to be omitted from a menu presented by a media device; and
causing the menu presented by the media device to omit the set of media determined based on the sequential movement profile selected based on the sequential movement data.
10. The non-transitory machine-readable storage medium of claim 9, wherein:
the remote device includes a remote controller of the media device, and
the sequential movement data indicates intentional movements of the remote device during intentional selection of the one or more media via the remote controller of the media device.
11. The non-transitory machine-readable storage medium of claim 9, wherein the operations further comprise:
accessing a first activation frequency at which a control element of the remote device was activated during a sequence of activations of the control element during the selection of the one or more media; and wherein:
the sequential movement profile includes a second activation frequency for the control element; and
the selecting of the sequential movement profile is based on a comparison of the first and second activation frequencies for the control element.
12. The non-transitory machine-readable storage medium of claim 9, wherein the operations further comprise:
accessing a first activation pattern according to which control elements of the remote device were activated during the selection of the one or more media; and wherein:
the sequential movement profile includes a second activation pattern for the control elements; and
the selecting of the sequential movement profile is based on a comparison of the first and second activation patterns for the control elements.
13. The non-transitory machine-readable storage medium of claim 9, wherein:
the accessed sequential movement data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the remote device during the selection of the one or more media;
the sequential movement profile includes a second sequence of accelerometer data; and
the selecting of the sequential movement profile is based on a comparison of the first and second sequences of accelerometer data.
14. The non-transitory machine-readable storage medium of claim 9, wherein:
the accessed sequential movement data includes a first sequence of accelerometer data that indicates sequential orientations at which the remote device was held during the selection of the one or more media;
the sequential movement profile includes a second sequence of accelerometer data; and
the selecting of the sequential movement profile is based on a comparison of the first and second sequences of accelerometer data.
15. The non-transitory machine-readable storage medium of claim 9, wherein the operations further comprise:
detecting that a user device is within a threshold range of the media device, the user device corresponding to the sequential movement profile; and wherein:
the selecting of the sequential movement profile is based on the detecting of the user device within the threshold range of the media device.
16. A system comprising:
one or more processors; and
a memory storing instructions that, when executed by at least one processor among the one or more processors, cause the system to perform operations comprising:
accessing sequential movement data that indicates movements experienced by a remote device during selection of one or more media via the remote device;
based on the sequential movement data, selecting a sequential movement profile among a set of sequential movement profiles that each indicate a different corresponding user;
based on the selected sequential movement profile, determining a set of media to be omitted from a menu presented by a media device; and
causing the menu presented by the media device to omit the set of media determined based on the sequential movement profile selected based on the sequential movement data.
17. The system of claim 16, wherein the operations further comprise:
accessing a first activation frequency at which a control element of the remote device was activated during a sequence of activations of the control element during the selection of the one or more media; and wherein:
the sequential movement profile includes a second activation frequency for the control element; and
the selecting of the sequential movement profile is based on a comparison of the first and second activation frequencies for the control element.
18. The system of claim 16, wherein the operations further comprise:
accessing a first activation pattern according to which control elements of the remote device were activated during the selection of the one or more media; and wherein:
the sequential movement profile includes a second activation pattern for the control elements; and
the selecting of the sequential movement profile is based on a comparison of the first and second activation patterns for the control elements.
19. The system of claim 16, wherein:
the accessed sequential movement data includes a first sequence of accelerometer data that indicates sequential accelerations experienced by the remote device during the selection of the one or more media;
the sequential movement profile includes a second sequence of accelerometer data; and
the selecting of the sequential movement profile is based on a comparison of the first and second sequences of accelerometer data.
20. The system of claim 16, wherein:
the accessed sequential movement data includes a first sequence of accelerometer data that indicates sequential orientations at which the remote device was held during the selection of the one or more media;
the sequential movement profile includes a second sequence of accelerometer data; and
the selecting of the sequential movement profile is based on a comparison of the first and second sequences of accelerometer data.
21. The system of claim 16, wherein the operations further comprise:
detecting that a user device is within a threshold range of the media device, the user device corresponding to the sequential movement profile; and wherein:
the selecting of the sequential movement profile is based on the detecting of the user device within the threshold range of the media device.
US17/302,809 2017-02-01 2021-05-12 Menu modification based on controller manipulation data Pending US20210326013A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/302,809 US20210326013A1 (en) 2017-02-01 2021-05-12 Menu modification based on controller manipulation data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/422,221 US11042262B2 (en) 2017-02-01 2017-02-01 Menu modification based on controller manipulation data
US17/302,809 US20210326013A1 (en) 2017-02-01 2021-05-12 Menu modification based on controller manipulation data

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/422,221 Continuation US11042262B2 (en) 2017-02-01 2017-02-01 Menu modification based on controller manipulation data

Publications (1)

Publication Number Publication Date
US20210326013A1 2021-10-21

Family

ID=60382056

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/422,221 Active 2038-08-27 US11042262B2 (en) 2017-02-01 2017-02-01 Menu modification based on controller manipulation data
US17/302,809 Pending US20210326013A1 (en) 2017-02-01 2021-05-12 Menu modification based on controller manipulation data

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/422,221 Active 2038-08-27 US11042262B2 (en) 2017-02-01 2017-02-01 Menu modification based on controller manipulation data

Country Status (6)

Country Link
US (2) US11042262B2 (en)
EP (1) EP3358851A1 (en)
CN (1) CN108391175B (en)
AU (2) AU2017268643B2 (en)
CA (1) CA2992880A1 (en)
SG (1) SG10201709504UA (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11042262B2 (en) 2017-02-01 2021-06-22 Opentv, Inc. Menu modification based on controller manipulation data
US20180357870A1 (en) * 2017-06-07 2018-12-13 Amazon Technologies, Inc. Behavior-aware security systems and associated methods
US10848832B2 (en) * 2018-09-11 2020-11-24 Opentv, Inc. Selection interface with synchronized suggestion elements
US10936861B2 (en) * 2018-09-28 2021-03-02 Aptiv Technologies Limited Object detection system of a vehicle
CN113126854B (en) * 2019-12-31 2022-06-28 Beijing Baidu Netcom Science and Technology Co., Ltd. Menu display method and device and electronic equipment
KR20210130066A (en) * 2020-04-21 2021-10-29 LG Electronics Inc. A display device and operating method thereof

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060064495A1 (en) * 2004-08-11 2006-03-23 Tu Edgar A User profile selection
US20060125968A1 (en) * 2004-12-10 2006-06-15 Seiko Epson Corporation Control system, apparatus compatible with the system, and remote controller
US20070041058A1 (en) * 2005-08-22 2007-02-22 Israel Disatnik Device and a method for identifying movement patterns
US20080252509A1 (en) * 2007-04-13 2008-10-16 Seiko Epson Corporation Remote control signal generation device and remote control system
US20090077163A1 (en) * 2007-09-14 2009-03-19 Phorm Uk, Inc. Approach for identifying and providing targeted content to a network client with reduced impact to the service provider
US20100042564A1 (en) * 2008-08-15 2010-02-18 Beverly Harrison Techniques for automatically distinguishing between users of a handheld device
US20100053458A1 (en) * 2008-08-27 2010-03-04 International Business Machines Corporation Method and System for Network Enabled Remote Controls Using Physical Motion Detection Remote control Devices
US20100083373A1 (en) * 2008-09-29 2010-04-01 Scott White Methods and apparatus for determining user authorization from motion of a gesture-based control unit
US20110050569A1 (en) * 2004-03-23 2011-03-03 Fujitsu Limited Motion Controlled Remote Controller
US20120124516A1 (en) * 2010-11-12 2012-05-17 At&T Intellectual Property I, L.P. Electronic Device Control Based on Gestures
US20120240223A1 (en) * 2004-08-11 2012-09-20 Sony Computer Entertainment, Inc. Process and apparatus for automatically identifying user of consumer electronics
US20120323521A1 (en) * 2009-09-29 2012-12-20 Commissariat A L'energie Atomique Et Aux Energies Alternatives System and method for recognizing gestures
US20150286813A1 (en) * 2014-04-04 2015-10-08 Qualcomm Incorporated Method and apparatus that facilitates a wearable identity manager

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6256019B1 (en) * 1999-03-30 2001-07-03 Eremote, Inc. Methods of using a controller for controlling multi-user access to the functionality of consumer devices
US7979880B2 (en) * 2000-04-21 2011-07-12 Cox Communications, Inc. Method and system for profiling iTV users and for providing selective content delivery
US7600245B2 (en) 2000-06-27 2009-10-06 At&T Intellectual Property I, L.P. System and methods for subscribers to view, select and otherwise customize delivery of programming over a communication system
WO2002082214A2 (en) * 2001-04-06 2002-10-17 Predictive Media Corporation Method and apparatus for identifying unique client users from user behavioral data
US7489299B2 (en) * 2003-10-23 2009-02-10 Hillcrest Laboratories, Inc. User interface devices and methods employing accelerometers
EP2337016B1 (en) * 2004-04-30 2018-01-10 IDHL Holdings, Inc. Free space pointing devices with tilt compensation and improved usability
KR100937572B1 (en) * 2004-04-30 2010-01-19 Hillcrest Laboratories, Inc. Free space pointing device and method
US8194034B2 (en) 2006-12-20 2012-06-05 Verizon Patent And Licensing Inc. Systems and methods for controlling a display
US8031175B2 (en) * 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
US8090670B2 (en) * 2008-09-19 2012-01-03 Satyam Computer Services Limited System and method for remote usage modeling
US8593576B2 (en) 2009-10-15 2013-11-26 At&T Intellectual Property I, L.P. Gesture-based remote control
KR101632077B1 (en) 2009-11-24 2016-07-01 LG Electronics Inc. A method of editing menu screen for a network television
WO2012027605A2 (en) * 2010-08-27 2012-03-01 Intel Corporation Intelligent remote control system
US10262324B2 (en) * 2010-11-29 2019-04-16 Biocatch Ltd. System, device, and method of differentiating among users based on user-specific page navigation sequence
US10037421B2 (en) * 2010-11-29 2018-07-31 Biocatch Ltd. Device, system, and method of three-dimensional spatial user authentication
US8630963B2 (en) * 2011-07-01 2014-01-14 Intel Corporation Automatic user identification from button presses recorded in a feature vector
CN103797784A (en) 2011-08-05 2014-05-14 Thomson Licensing Video Peeking
KR101868623B1 (en) 2011-10-05 2018-06-18 LG Electronics Inc. Display device and method for controlling the same
US9519909B2 (en) * 2012-03-01 2016-12-13 The Nielsen Company (Us), Llc Methods and apparatus to identify users of handheld computing devices
KR101661526B1 (en) 2012-04-08 2016-10-04 Samsung Electronics Co., Ltd. Flexible display apparatus and user interface providing method thereof
US9100708B2 (en) 2012-08-31 2015-08-04 Echostar Technologies L.L.C. Electronic program guides, systems and methods providing a collapsible channel listing
US9237292B2 (en) 2012-12-28 2016-01-12 Echostar Technologies L.L.C. Determining remote control state and user via accelerometer
US9223297B2 (en) * 2013-02-28 2015-12-29 The Nielsen Company (Us), Llc Systems and methods for identifying a user of an electronic device
US20150178374A1 (en) * 2013-12-23 2015-06-25 Trusteer Ltd. Method and system of providing user profile detection from an input device
SG10201402026RA (en) * 2014-05-04 2015-12-30 Seow Loong Tan Activity monitoring method and system
US10491960B2 (en) 2014-11-10 2019-11-26 Sony Interactive Entertainment LLC Customizable electronic program guide
US9747734B2 (en) * 2014-12-12 2017-08-29 International Business Machines Corporation Authentication of users with tremors
US20160182950A1 (en) * 2014-12-17 2016-06-23 Lenovo (Singapore) Pte. Ltd. Identification of a user for personalized media content presentation
US9590986B2 (en) * 2015-02-04 2017-03-07 Aerendir Mobile Inc. Local user authentication with neuro and neuro-mechanical fingerprints
US9577992B2 (en) * 2015-02-04 2017-02-21 Aerendir Mobile Inc. Data encryption/decryption using neuro and neuro-mechanical fingerprints
US10127371B2 (en) * 2015-12-11 2018-11-13 Roku, Inc. User identification based on the motion of a device
US20170177203A1 (en) * 2015-12-18 2017-06-22 Facebook, Inc. Systems and methods for identifying dominant hands for users based on usage patterns
US9870533B2 (en) * 2016-01-27 2018-01-16 Striiv, Inc. Autonomous decision logic for a wearable device
US10154316B2 (en) * 2016-02-26 2018-12-11 Apple Inc. Motion-based configuration of a multi-user device
KR101775829B1 (en) * 2016-05-16 2017-09-06 Humax Co., Ltd. Computer processing device and method for determining coordinate compensation and error for remote control key using user profile information based on force input
US11042262B2 (en) 2017-02-01 2021-06-22 Opentv, Inc. Menu modification based on controller manipulation data

Also Published As

Publication number Publication date
CA2992880A1 (en) 2018-08-01
EP3358851A1 (en) 2018-08-08
US20180217719A1 (en) 2018-08-02
US11042262B2 (en) 2021-06-22
CN108391175B (en) 2023-03-21
AU2017268643B2 (en) 2022-04-07
AU2022204892B2 (en) 2023-10-26
CN108391175A (en) 2018-08-10
SG10201709504UA (en) 2018-09-27
AU2022204892A1 (en) 2022-07-28
AU2017268643A1 (en) 2018-08-16

Similar Documents

Publication Publication Date Title
US20210326013A1 (en) Menu modification based on controller manipulation data
US10740978B2 (en) Surface aware lens
KR102341301B1 (en) electronic device and method for sharing screen
US10402460B1 (en) Contextual card generation and delivery
US11868590B2 (en) Interface to display shared user groups
US20180181274A1 (en) Electronic device, wearable device, and method of controlling displayed object in electronic device
EP3899865A1 (en) Virtual surface modification
US20200104466A1 (en) Collaborative public user profile
TW201535160A (en) Using proximity sensing to adjust information provided on a mobile device
JP6239755B2 (en) Wearable map and image display
US11297393B2 (en) Selection interface with synchronized suggestion elements
KR20160030640A (en) Method and apparatus for providing lockscreen
AU2015202698B2 (en) Method and apparatus for processing input using display
US11711414B2 (en) Triggering changes to real-time special effects included in a live streaming video
US11468613B2 (en) Annotating an image with a texture fill
US20220308721A1 (en) Message thread prioritization interface
US11789972B2 (en) Data synchronization for content consumed via a client application

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: OPENTV, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:STREIT, PAUL;STOKSIK, MARC;REEL/FRAME:058577/0969

Effective date: 20170201

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS