WO2007065020A2 - Multimedia systems, methods and applications - Google Patents

Multimedia systems, methods and applications

Info

Publication number
WO2007065020A2
WO2007065020A2 (PCT/US2006/046303)
Authority
WO
WIPO (PCT)
Prior art keywords
images
epg
grid
recorded
programs
Prior art date
Application number
PCT/US2006/046303
Other languages
French (fr)
Other versions
WO2007065020A3 (en)
Inventor
Charles W.K. Gritton
Negar Moshiri
Frank J. Wroblewski
Original Assignee
Hillcrest Laboratories, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hillcrest Laboratories, Inc. filed Critical Hillcrest Laboratories, Inc.
Priority to EP06844806A priority Critical patent/EP1966987A4/en
Publication of WO2007065020A2 publication Critical patent/WO2007065020A2/en
Publication of WO2007065020A3 publication Critical patent/WO2007065020A3/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4147 PVR [Personal Video Recorder]
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/426 Internal components of the client; Characteristics thereof
    • H04N21/42646 Internal components of the client for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4314 Generation of visual interfaces for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • H04N21/4316 Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/432 Content retrieval operation from a local storage medium, e.g. hard-disk
    • H04N21/4325 Content retrieval operation by playing back content from the storage medium
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334 Recording operations
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615 Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622 Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202 End-user interface for requesting content on demand, e.g. video on demand
    • H04N21/47214 End-user interface for content reservation or setting reminders; for requesting event notification, e.g. of sport results or stock market
    • H04N21/47217 End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/47815 Electronic shopping
    • H04N21/482 End-user interface for program selection
    • H04N21/4821 End-user interface for program selection using a grid, e.g. sorted out by channel and broadcast time
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/812 Monomedia components thereof involving advertisement data
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/775 Interface circuits between a recording apparatus and a television receiver
    • H04N5/78 Television signal recording using magnetic recording
    • H04N5/782 Television signal recording using magnetic recording on tape
    • H04N7/00 Television systems
    • H04N7/16 Analogue secrecy systems; Analogue subscription systems
    • H04N7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309 Transmission or handling of upstream communications
    • H04N7/17318 Direct or substantially direct transmission and handling of requests

Definitions

  • This application describes, among other things, multimedia systems, methods and applications running thereon.
  • the television was tuned to the desired channel by adjusting a tuner knob and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as "channel surfing" whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.
  • Some remote units provide soft buttons that can be programmed with the expert commands. These soft buttons sometimes have accompanying LCD displays to indicate their action. These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons.
  • In "moded" remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues, including sending commands to the wrong device and forcing the user to look at the remote to make sure that it is in the right mode, and it provides no simplification of the integration of multiple devices.
  • the most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
  • 3D pointing is used in this specification to refer to the ability of an input device to move in three (or more) dimensions in the air in front of, e.g., a display screen, and the corresponding ability of the user interface to translate those motions directly into user interface commands, e.g., movement of a cursor on the display screen.
  • the transfer of data between the 3D pointing device may be performed wirelessly or via a wire connecting the 3D pointing device to another device.
  • “3D pointing” differs from, e.g., conventional computer mouse pointing techniques which use a surface, e.g., a desk surface or mousepad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display screen.
  • An example of a 3D pointing device can be found in U.S. Patent Application No. 11/119,663, the disclosure of which is incorporated here by reference.
  • a scrollable visual directory display includes a plurality of images each associated with a selectable media item, the plurality of images arranged in a rectangular matrix with a first number of images in each row and a second number of images in each column and a scroll bar on one side of the plurality of images for scrolling said rectangular matrix.
  • an electronic program guide
  • EPG responsive to pointing inputs and having an integrated digital video recorder (DVR) function includes a grid displayed on a display screen, the grid having a plurality of program selections displayed therein, a cursor displayed as a moveable overlay on the grid, the cursor responsive to the pointing inputs to provide random access to the plurality of program selections, and wherein when the cursor is positioned over one of the plurality of program selections, a visual indication of focus is provided to the one of said plurality of program selections; and further wherein when a selection command is received by the electronic program guide, a DVR control overlay is displayed on the grid.
  • DVR digital video recorder
  • FIG. 1 depicts a conventional remote control unit for an entertainment system
  • FIG. 2 depicts an exemplary multimedia system architecture according to an exemplary embodiment of the present invention
  • FIG. 3 shows an exemplary device client software architecture according to an exemplary embodiment of the present invention
  • FIGS. 4-11 illustrate scenes from a live TV application according to an exemplary embodiment of the present invention.
  • FIGS. 12(a)-12(f) illustrate scenes from a shopping application according to an exemplary embodiment of the present invention.
  • multimedia systems and methods provide, among other things, the ability to: a) navigate entertainment choices in a way that is simple and compelling, b) unify and present disparate applications in a seamless fashion, and c) extend a consistent navigation method across many different network-connected devices using, for example, standards-based protocols and networking hardware.
  • the below-described navigation and content management systems and methods combine 3D pointing technology with a graphical presentation of content options. This approach allows, for example, consumers to select their content choices on a television screen in a manner which is similar to the way that they would use a mouse to make selections on a computer.
  • exemplary multimedia systems and methods provide for an open architecture 200 which is built as a modular, scalable, data-driven client-server system. This enables simple navigation of content (such as movies, music or TV shows) on hardware that resides in the consumer's home.
  • exemplary multimedia systems and methods are designed so that the metadata is stored at the metadata server 202 and then delivered to the client on demand.
  • Exemplary techniques for implementing a metadata server 202, as well as for processing metadata from various databases 201 to be supplied to client devices, can be found in U.S. Patent Application Serial No. 11/037,897, entitled "A Metadata Brokering Server", filed on January 18, 2005, and U.S. Patent Application Serial No. 11/140,885, entitled "Method and Apparatus for Metadata Organization for Video on Demand (VOD) Systems", filed on May 31, 2005, respectively, the disclosures of which are hereby incorporated by reference.
  • the open aspect of the architecture 200 can best be understood by examining the interfaces shown in Figure 2.
  • the HID-FSN (Human Interface Device for Free-Space Navigation) interface is shown on the right side of Figure 2.
  • all applications are controlled by commands that come in via the HID-FSN interface from the remote control 204 to the device client 206 or embedded client 208, the latter of which is part of the navigation system software provided on the client device (e.g., a set-top box, a game console, a processor associated with a television itself or the like).
  • the response(s) to those commands are then rendered on a display (such as a TV screen, not shown in Figure 2) created by the client 206, 208.
  • In response to user commands conveyed over the HID-FSN interface, the client 206, 208 requests relevant metadata from the server 202 using the MQL (Metadata Query Language) interface shown in Figure 2. After receiving the appropriate metadata in response to its MQL request, the client 206, 208 displays it in an appropriate format, e.g., on a TV screen, for the particular application which is currently being served by the multimedia system. Examples of higher level applications are provided below. Even the application itself can be divided out on demand from the server 202.
  • a virtual client 210 is shown as part of the architecture 200.
  • the virtual client 210 runs on the server side and manages delivery of application pieces. Those skilled in the art will recognize, however, that some client platforms are capable of housing the full application and, for such implementations, the virtual client 210 can reside on the client device, e.g., in combination with the embedded client 208.
  • the application pieces themselves are sent from the server to the client encoded in a Zoomable User Interface Mark-up Language (ZML). Those interested in more details regarding ZML are directed to U.S. Patent Application Serial
  • the architecture 200 also includes data flows associated with toolkits to the server 202 and virtual client 210. These data flows occur either when new applications are created using application toolkit 212 or when new metadata sources are added to the server 202 using metadata toolkit 214. In either case, the creation of the relevant code is done offline and then published to the server. If it is a new application, it is sent to the virtual client 210 on the server in, for example, ZML format. If it is a new or modified metadata source description, it is sent in, for example, MDL (Metadata Description Language) format.
  • MDL Metadata Description Language
  • the techniques described herein for providing multimedia user interfaces are based on a model of the user interface as an ordered sequence of semantic interactions.
  • Each individual interaction in a user interface is the sequence of input commands and output display responses that correspond to a particular element of a given application screen. For example, if the user interface screen includes a list box, the manipulation and display of that list box would be a single interaction. Those individual interactions within a single element are referred to herein as "semantic" because they are restricted in meaning to that specific interaction element. The entire “conversation" of the user interface across both the current screen and all application screens can therefore be seen as an ordered sequence of those semantic interactions. Exemplary multimedia systems and methods described herein use this semantic interaction model and partition applications into associated building blocks — elements referred to herein as "bricks.”
  • Bricks encompass all of the attributes and functionality associated with a particular interaction regardless of where in the overall system that interaction occurs.
  • the processing of a particular push button on a given screen can involve code in the metadata server 202 (e.g., a query), in the virtual client 210 (e.g., the display and/or transition preparation) and in the client 206, 208 (e.g., the display and interaction with the user). Since each element of this code is relevant to the specific interaction, it is bundled together in a single brick. Each brick then represents different types of user interactions which perform different functions. Since the functions and interactions for each brick differ in meaning from those in other bricks, each set of interactions embedded in a single brick is referred to herein as "semantic".
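A brick, as described above, bundles the server-side query, the virtual-client display preparation, and the client-side interaction for a single semantic element. The TypeScript sketch below is purely illustrative: the `Brick` interface, its method names, and the query string are assumptions made for this example, not anything defined in the application.

```typescript
// Illustrative sketch of a "brick" bundling the code for one semantic
// interaction across the three tiers. All names here are hypothetical.
interface Brick {
  id: string;
  // Server side: the metadata query this interaction issues (e.g., via MQL).
  query(): string;
  // Virtual-client side: prepare display text from the query results.
  prepare(results: string[]): string[];
  // Client side: handle user input and return the resulting action.
  interact(input: string): string;
}

// A hypothetical push-button brick for recording the focused program.
const recordButton: Brick = {
  id: "record-button",
  query: () => "SELECT program WHERE focus = current", // illustrative, not real MQL
  prepare: (results) => results.map((r) => `Record "${r}"?`),
  interact: (input) => (input === "select" ? "start-recording" : "ignore"),
};

console.log(recordButton.prepare(["Evening News"])[0]);
console.log(recordButton.interact("select"));
```

Because every screen that needs a record button reuses this one brick, the interaction behaves identically wherever it appears.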
  • a second aspect of modularity in multimedia systems and methods according to these exemplary embodiments is an object model that permits unique functionality required by the individual bricks to be separately organized as services associated with contextual objects in the system.
  • This abstraction enables bricks to be designed and evolved separately from the object and service model. In turn, this allows for quicker and more efficient design and porting of multimedia systems and methods according to these exemplary embodiments to different platforms.
  • Significant meta-objects associated with the object model are "visual display”, “audio output”, “human interface device”, “timer” and “database”.
  • the visual display object handles all elements of the display.
  • the audio output object handles all elements of audio output.
  • the human interface device object handles all elements of cursor and command input.
  • the timer object handles all services associated with date and time.
  • the database object handles all aspects of persistent storage, including metadata and options. For each object, significant functionality is provided through four basic operations: Get, Set, Subscribe and Notify.
  • the basic operation Get returns data while its companion operation Set stores data. With these two operations, all data manipulation can be performed.
  • the other pair of operations, Subscribe and Notify, combines to allow for event generation and monitoring. These operations allow for implementation of asynchronous events. This is a virtual abstraction that simplifies overall integration. Actual integration with a particular hardware OS and middleware can be performed at a lower level and is described below with respect to the client 206, 208.
  • the entire non-navigational functionality of systems in accordance with these exemplary embodiments can be built up from these four basic operations on the five basic meta-objects. This simplicity and clean structure enables for better design and validation of the final system.
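As a concrete illustration, the five meta-objects and four basic operations might be modeled as follows. This is a hedged sketch: the class name `MetaObject`, the callback signature, and the cursor/highlight example are invented for illustration, and Notify is shown synchronously although the text describes asynchronous event delivery.

```typescript
// Sketch of the four-operation object model (Get, Set, Subscribe, Notify).
type Callback = (key: string, value: unknown) => void;

class MetaObject {
  private data = new Map<string, unknown>();
  private subscribers = new Map<string, Callback[]>();

  // Get returns data previously stored under a key.
  get(key: string): unknown {
    return this.data.get(key);
  }

  // Set stores data and fires Notify for any subscribers to that key.
  set(key: string, value: unknown): void {
    this.data.set(key, value);
    for (const cb of this.subscribers.get(key) ?? []) {
      cb(key, value); // Notify (asynchronous in a real system)
    }
  }

  // Subscribe registers interest in changes to a key.
  subscribe(key: string, cb: Callback): void {
    const list = this.subscribers.get(key) ?? [];
    list.push(cb);
    this.subscribers.set(key, list);
  }
}

// The five basic meta-objects named in the description.
const system = {
  visualDisplay: new MetaObject(),
  audioOutput: new MetaObject(),
  humanInterfaceDevice: new MetaObject(),
  timer: new MetaObject(),
  database: new MetaObject(),
};

// Example: the HID object publishes a cursor position; the display reacts.
system.humanInterfaceDevice.subscribe("cursor", (_k, v) => {
  system.visualDisplay.set("highlight", v);
});
system.humanInterfaceDevice.set("cursor", { x: 120, y: 45 });
```

Everything non-navigational is then composed from these four operations on the five meta-objects, which is what makes the model easy to validate.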
  • a third element of modularity involves the establishment of clear interfaces between system components.
  • Each of the architectural components of multimedia systems according to these exemplary embodiments, e.g., the application and metadata toolkits (212 and 214, respectively), the metadata server 202, the virtual client 210, the client 206, 208 and the 3D remote control device 204, is connected by clearly defined application programming interfaces (APIs).
  • APIs application programming interfaces
  • These APIs facilitate each architectural element's function within the multimedia system.
  • the metadata toolkit 214 is used to describe new metadata sources so that the metadata server 202 can ingest them using an interface referred to herein as metadata description language (MDL).
  • metadata description language specifies a mapping between, e.g., the XML data source and a proprietary format (MDL) that is natively recognized by the multimedia system.
  • MDL proprietary format
  • the metadata toolkit 214 provides the application designer with the ability to specify expert rules to account for metadata exceptions and to specify links between metadata types. This linkage is what allows different applications to be tied together. For example, if a consumer is viewing information about a movie, the application may provide a link to that movie's soundtrack.
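A sketch of the kind of mapping MDL might specify, assuming a simple XML source; the tag names and mapping table below are invented for illustration and are not the actual MDL syntax.

```python
# Hypothetical MDL-style mapping: field names in an XML metadata source
# are translated to the names the system recognizes natively.
import xml.etree.ElementTree as ET

MDL_MAPPING = {"Title": "title", "Performer": "artist", "ReleaseYear": "year"}

def ingest(xml_text, mapping=MDL_MAPPING):
    """Translate one XML metadata record into the native dict format."""
    root = ET.fromstring(xml_text)
    record = {}
    for child in root:
        native_key = mapping.get(child.tag)
        if native_key is not None:  # expert rules could handle exceptions here
            record[native_key] = child.text
    return record

source = "<Movie><Title>Independence Day</Title><ReleaseYear>1996</ReleaseYear></Movie>"
record = ingest(source)  # → {'title': 'Independence Day', 'year': '1996'}
```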
  • the application toolkit 212 is used to construct applications out of bricks via a Zoomable Mark-up Language (ZML) interface, ZML being described in detail in the above incorporated by reference patent application.
  • Multimedia systems and methods provide an object-based application design method to simplify and speed up the creation of new applications.
  • an on-screen volume control would be implemented once as a brick and would be reused wherever the application designer needs to provide access to volume.
  • Each building block then becomes part of a library that can be reused as often as desired.
  • the result is that the user experience is easily made common across a wide range of applications.
  • the designer uses the application toolkit 212 to customize and assemble the bricks to form new applications, which dramatically reduces the amount of time it takes to create new applications.
  • the entire process of constructing an application can be performed in the toolkit 212.
  • the first step is to use the brick component library to add elements to the screen at the appropriate position.
  • the second step is to set all the options for the screen and each brick according to the application designer's wishes.
  • the third step is to add any necessary functionality over and above the pre-defined Bricks in the embedded JavaScript associated with this scene.
  • the fourth step is to adjust the graphics of each element and the screen itself so that the appearance of the scene is according to the application designer's wishes.
  • the last step is to run and test the candidate screen. This can be done from within the toolkit 212 so that errors can be quickly and easily caught and corrected.
  • the metadata server 202 receives the required metadata from various sources, maps them to a common structure and then responds to queries from the virtual client 210. For example, as applications run on multimedia systems according to these exemplary embodiments, they send requests for specific metadata such as "the current top ten video on demand movies" to the server using the MQL (Metadata Query Language) interface.
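A query such as "the current top ten video on demand movies" could be modeled as a parameterized filter-and-sort over the aggregated dataset. This is a hypothetical sketch, not the actual MQL syntax; the field names and catalog are invented.

```python
# Invented sample catalog standing in for the aggregated metadata set.
CATALOG = [
    {"title": "Movie A", "type": "vod", "rank": 2},
    {"title": "Movie B", "type": "vod", "rank": 1},
    {"title": "Show C", "type": "linear", "rank": 3},
]

def query(items, where, order_by, limit):
    """Searching (filter) and sorting are offered as primitive services."""
    hits = [it for it in items if all(it.get(k) == v for k, v in where.items())]
    hits.sort(key=lambda it: it[order_by])
    return hits[:limit]

# "The current top ten video on demand movies":
top = query(CATALOG, where={"type": "vod"}, order_by="rank", limit=10)
```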
  • Each application in multimedia systems according to these exemplary embodiments allows the user to view and browse specific metadata which, as described above, is information about the various content choices available to the consumer.
  • a TV application manipulates metadata about movies or TV programs and then allows the consumer to watch selected content that the metadata describes. According to exemplary embodiments, it is anticipated that the metadata will neither come from a single source nor be entirely accurate or complete.
  • One important operation of the metadata server 202 is to obtain new and/or updated metadata from external sources. This process is referred to herein as "adaptation" since its main function is format adaptation.
  • a complete adaptation process can also include normalization and validation.
  • the normalization process provides for putting all metadata in a standard format so that it can be used by the applications.
  • the validation process ensures that the metadata is real and correct.
  • the result is a dataset that feeds the aggregation process.
  • all of the metadata needed to run a complete system does not come from a single source. Therefore, another function of the metadata server 202 is to aggregate the various underlying sources together. The result is a more complete and accurate dataset which allows for better navigation.
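The adaptation steps above (normalization, validation, aggregation) can be sketched as a simple pipeline. The field conventions and merge policy here are assumptions for illustration, not the patent's actual rules.

```python
def normalize(record):
    """Put metadata in a standard format: lowercase keys, trimmed strings."""
    return {k.lower(): (v.strip() if isinstance(v, str) else v)
            for k, v in record.items()}

def validate(record):
    """Drop records that are not usable (here: missing a title)."""
    return bool(record.get("title"))

def aggregate(*sources):
    """Merge several underlying sources into one dataset keyed by title."""
    merged = {}
    for source in sources:
        for record in source:
            record = normalize(record)
            if not validate(record):
                continue
            # Later sources fill in fields missing from earlier ones.
            merged.setdefault(record["title"], {}).update(
                {k: v for k, v in record.items() if v})
    return merged

# Invented sources: a third-party licensor and the user's own library.
licensor = [{"Title": " Alias ", "Genre": "drama"}]
user_lib = [{"Title": "Alias", "Recorded": True}, {"Genre": "unknown"}]
dataset = aggregate(licensor, user_lib)  # second user record fails validation
```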
  • Metadata sources include both third party metadata licensors as well as the individual users themselves. For example, the names of all photos in a consumer's photo library would constitute a metadata source.
  • the metadata consolidated in the database is accessed by the various applications of multimedia systems and methods according to queries posed by the virtual client 210.
  • Multimedia systems according to exemplary embodiments can support a complete query language that allows for full, parameterized construction of the retrieval criteria needed by the application developer. Both searching and sorting are offered as primitive services.
  • Yet another function of the metadata server 202 is to generate cross-linking (special connections) across the metadata. In order to do this, multimedia systems according to these exemplary embodiments can be designed to filter, validate and normalize the ingested metadata and then use inference and matching rules.
  • the result is a rich, cross-linked metadata set that enables seamless browsing both within and across categories of metadata.
  • the user can examine the content related to the movie Independence Day, see that Will Smith starred in it, and notice that Smith produced a record album called Born to Reign.
  • the consumer starts at a movie and browses seamlessly through music and even into commerce if the service offers a way to buy content or related products.
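Cross-linking of this kind can be illustrated with a single matching rule: two metadata records in different categories are linked when they share a person's name. The records and the rule below are invented examples.

```python
# Invented, already-normalized records from the aggregated dataset.
records = [
    {"id": "movie:independence-day", "category": "movie",
     "title": "Independence Day", "people": {"Will Smith"}},
    {"id": "album:born-to-reign", "category": "music",
     "title": "Born to Reign", "people": {"Will Smith"}},
    {"id": "movie:other", "category": "movie",
     "title": "Other", "people": {"Someone"}},
]

def cross_link(items):
    """Link records in different categories that share a person."""
    links = set()
    for a in items:
        for b in items:
            if (a["id"] < b["id"] and a["category"] != b["category"]
                    and a["people"] & b["people"]):
                links.add((a["id"], b["id"]))
    return links

links = cross_link(records)  # movie <-> album linked through Will Smith
```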
  • the device client 206, 208, 210 is in communication with the metadata server to generate user interfaces in accordance with these exemplary embodiments.
  • the virtual client 210 builds the full application set and handles the data delivery to the remote portions 206, 208 of the device clients.
  • the client software 206, 208 processes the basic semantic interactions with the end user and requests assistance as needed from the virtual client 210.
  • the virtual client 210 preferably resides on the same hardware platform as the rest of the device client software 206, 208 but may reside separately if required, e.g., due to client hardware with relatively fewer processing resources.
  • the device client software runs the applications in multimedia systems and methods according to these exemplary embodiments and also runs any services, such as a personal video recorder (PVR) service, that are used to support those applications.
  • Some features of the device client 206-210 include visual presentation, spatial structure, semantic zooming, and low latency.
  • visual presentation: since people can process an entire visual screen in about the same time it takes to process seven to ten words, user interfaces according to exemplary embodiments make extensive use of images as selectable media objects. In multimedia systems having, e.g., tens of thousands of content items to choose from, using images to obtain this extra cognitive efficiency is beneficial.
  • these exemplary embodiments provide a spatially-organized user interface in which users find it easier to navigate and locate items. The result is that overall performance for content selection can be twice as fast as when a consumer uses plain image maps, e.g., without spatial connections between scenes.
  • not all visual, spatial user interfaces are ideal for displaying, e.g., multimedia content selection choices. If not properly designed, such user interfaces can easily result in information overload.
  • Effective navigation with large amounts of content can be implemented by considering, for example, the following constraints: constrained out-degree from any given point (small views), small diameters (short paths), good residue at connected nodes and small outlink information (interlocking sets with shared residue) as described by, for example, George W. Furnas in his article "Effective View Navigation", Human Factors in Computing Systems, CHI 97 Conference Proceedings (ACM), Atlanta, GA, March 22-27, 1997. Semantic zooming according to exemplary embodiments provides a solution to all of these constraints.
  • exemplary embodiments contemplate that a visual user interface with poor latency performance may be perceived as inferior to a user interface that is solely or primarily text-based, but which has better latency performance. Accordingly, techniques to improve latency in a visual system by the device client are also contemplated according to these exemplary embodiments.
  • one method for reducing latency is to use the transitions associated with the zooming interface to mask long latency operations. Transitions also allow the user to preserve spatial awareness and context so that she or he is never lost while navigating the application interface.
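Masking a long-latency load behind a zoom transition can be sketched as running the fetch concurrently with the animation, so the user perceives the animation time rather than the load time. The durations below are illustrative only.

```python
import threading
import time

def load_scene(result):
    """Stand-in for a long-latency operation, e.g., a metadata fetch."""
    time.sleep(0.05)
    result["scene"] = "loaded"

def play_transition():
    """Stand-in for the zoom animation that preserves spatial context."""
    time.sleep(0.08)

def navigate():
    """Start the fetch, then animate while it completes in the background."""
    result = {}
    loader = threading.Thread(target=load_scene, args=(result,))
    loader.start()        # begin the long-latency operation...
    play_transition()     # ...and mask it with the transition
    loader.join()
    return result["scene"]

scene = navigate()
```

Because the two sleeps overlap, the user waits roughly for the longer of the two rather than their sum, which is the masking effect the text describes.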
  • An exemplary client architecture 300 for providing a user interface having the afore-described characteristics is illustrated in Figure 3.
  • meta-service objects of the platform which, together, form a zoomable user interface object model (ZOM).
  • these meta-service objects include a database object 302, a handheld device object 304, a timers object 306, a screen object 308 and a speakers object 310.
  • An API boundary is provided between the client run-time software 312 and the middleware and services 302-310 provided by the platform itself.
  • client run-time software 312 shown in the middle region of Figure 3
  • Various subroutines/functions are provided in the client run-time software 312 to interact with the service objects.
  • a database manager 318 interacts with the database object 302
  • an event processor 320 interacts with the handheld object 304 and timers object 306, while the rendering engine 322 interacts with the screen object 308 and the speakers object 310.
  • the client run-time software 312 also includes a scene manager 314 which keeps track of the current camera view of the entire zoomable user interface.
  • the scene manager 314 is also responsible for adjusting the camera view according to either user events (such as the pointer moving or a button press) or external ones (such as the phone ringing or a timer expiring). More information regarding camera views associated with exemplary ZUIs can be found in the above-incorporated by reference U.S. Patent Application Serial No. 10/768,432.
  • the scene manager 314 calls upon the scene loader 316 as appropriate to navigate to another part of the ZUI (new scene) via the ZML interface.
  • the full spatial world associated with the ZUI exists at creation time, it is only brought into the client's view as needed over the ZML interface. This allows for efficient memory utilization by the client.
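The scene manager's lazy loading can be sketched as a cache that only pulls a scene's ZML the first time the camera navigates to it. Class names and the stand-in ZML strings are illustrative.

```python
class SceneLoader:
    """Loads scene definitions (stand-in for the ZML interface)."""

    def __init__(self, world):
        self.world = world   # scene name -> ZML (stand-in strings)
        self.loads = 0       # counts actual loads for demonstration

    def load(self, name):
        self.loads += 1
        return self.world[name]

class SceneManager:
    """Brings scenes into the client's view only as needed."""

    def __init__(self, loader):
        self.loader = loader
        self.cache = {}
        self.current = None

    def navigate(self, name):
        if name not in self.cache:               # load on first visit only
            self.cache[name] = self.loader.load(name)
        self.current = name
        return self.cache[name]

world = {"portal": "<zml>portal</zml>", "live-tv": "<zml>live-tv</zml>"}
mgr = SceneManager(SceneLoader(world))
mgr.navigate("portal")
mgr.navigate("live-tv")
mgr.navigate("portal")   # cached: no second load
```

Navigating back to a cached scene triggers no additional load, which is what keeps the client's memory utilization efficient.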
  • bricks can be defined by a toolkit and stored in a brick library as shown in Figure 3.
  • each application is designed to provide a specific type of entertainment, information or communications function to make it easy for integrators to create products that are unique and aligned to their business interests.
  • one advantage of multimedia systems in accordance with these exemplary embodiments is that new applications can be created using the visual development system toolkit. Three exemplary applications are described below: (1) a television application, (2) a media application, and (3) a shopping application.
  • the applications available in these multimedia systems are launched through a portal or menu screen.
  • the portal contains, for example, icons for each system application.
  • the application framework can provide on-screen navigation buttons, which reside on the portal and all navigation pages.
  • these navigation buttons include a home button, a search button, a go up one level button, and a live TV button.
  • the home navigation button takes the consumer back to the top screen of the portal regardless of their location in the application worldview.
  • the search navigation button is used to help users find desired content. According to some exemplary embodiments, it allows text entry and presents results visually.
  • Each application is a set of sub-functions organized in a hierarchy.
  • the go up one level button takes the consumer up to the main level in the current sub-function.
  • the live TV button takes the consumer to the Live TV screen associated with the TV application described next below.
  • the channel viewed is the last one selected.
  • U.S. Patent Application Serial No. 11/437,215 filed on May 19, 2006, entitled "Global Navigation Objects in User Interfaces", the disclosure of which is incorporated here by reference.
  • one application associated with exemplary embodiments is a TV application that manages the television watching experience. This application represents the basic functionality of today's living room applications — linear television, video-on-demand (VOD) and digital video recording.
  • An exemplary TV application which can be run on the afore-described hardware/software architecture can include the following features.
  • Live TV Viewing live TV is an important aspect of many multimedia systems and applications. According to these exemplary embodiments, users are freed from strict linear timetables with digital controls like pause, fast forward and rewind. This feature lets consumers receive a phone call or use a bathroom without missing any of their favorite show or sportscast. They can skip over parts they don't want to see or rewind to see important scenes again.
  • the Live TV function also offers a new type of on-screen control that makes changing channels a snap even in an environment with several hundred TV channels and thousands of video on demand options. The consumer can point-and-click at the desired choice and is free from the cumbersome up-down-left-right approach of today's remote controls. Available Live TV features include volume and channel adjustment, ad banners, rewind and fast forward controls, and play and pause.
  • some exemplary embodiments of live TV applications indicate show progress and, if available, includes linkages to other related content and services.
  • the exemplary user interface screens of Figures 4-11 which illustrate portions of an exemplary live TV application.
  • the user interface screen illustrated in Figure 4 can, for example, be accessed by selecting and actuating the live TV button from the home portal described above.
  • moving a cursor 400 over a left portion of the display screen invokes a channel control overlay 402 which is superimposed over a live video feed 404.
  • the channel control overlay 402 includes a channel bar with a movable selector 406.
  • the movable selector 406 can be dragged up and down the channel bar and its current location along the channel bar indicates which channel has the focus of the interface.
  • the location of the movable selector 406 along the channel bar is such that channel 37 has the focus, resulting in a supplemental information overlay 408 being displayed over the live TV feed 404.
  • the movable selector 406 can be controlled by, for example, rotating a scroll wheel on a pointing device or a 3D pointing device (handheld controller) which is in communication with the multimedia system.
  • To the left of the channel bar are a number of rectangular icons representing favorite channels or networks which can be selected for display on the live TV feed 404 by, e.g., pointing and clicking thereon.
  • If, for example, the cursor 400 is moved to the bottom portion of the screen, a DVR control overlay 500 can be invoked as shown in Figure 5.
  • pause 502, rewind 504 and fast-forward 506 controls can be displayed and actuated by, e.g., pointing and clicking on the desired DVR control while they are being displayed over the live TV feed.
  • a volume control overlay 600 can be displayed when, for example, a scroll wheel is depressed on the pointing device or 3D pointing device used as a remote control. Thereafter, in response to rotating the scroll wheel up or down, the volume can be increased or decreased and the slider 602 will be displayed on the screen as moving up or down to reflect the change in volume.
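The scroll-wheel-to-volume mapping can be sketched as a clamped step function; the step size and range below are assumptions for illustration.

```python
# Assumed step per wheel detent; the actual value is an implementation choice.
VOLUME_STEP = 2

def adjust_volume(volume, wheel_detents, step=VOLUME_STEP, lo=0, hi=100):
    """Positive detents (wheel up) raise volume; negative detents lower it.
    The result is clamped to the displayable slider range [lo, hi]."""
    return max(lo, min(hi, volume + wheel_detents * step))

v = adjust_volume(50, +3)   # wheel rotated up three detents -> 56
v = adjust_volume(v, -30)   # a large downward rotation clamps at 0
```

The same pattern (a bounded value driven by wheel detents) applies to the channel-bar selector 406 described above, with channel numbers in place of volume levels.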
  • the live TV application can also include a guide which features a program grid that is enhanced by the 3D pointing capabilities of these exemplary embodiments.
  • This new interaction approach increases the value of the traditional EPG because it offers random navigation of each grid element (using the pointing metaphor). It also provides fast access to content metadata and services.
  • the guide and channel banner have built-in content and service linkages that further enhance the value of the guide.
  • the digital video recorder (DVR) and on-demand content services are tightly integrated within the guide and include a robust search and filter mechanism, which works across all types of programming whether linear, on-demand or recorded. Other features include user selected favorites, reminders, filters, and linkages.
  • An example is shown in Figure 7, wherein these features and others, e.g., filters 702 for "Movies" or "Sports", enable a user to easily filter the available guide selections by pointing and clicking on the associated tabs.
  • this exemplary live TV application integrates the DVR into the guide itself. Recorded programming and services to schedule a program recording are always just a few clicks away. This feature eliminates complex hierarchies to access DVR services. DVR features include record, play (with progress bar), delete (priority-based), recorded schedule, display of space available, sort, filter, conflict resolution, and watch while record.
  • This integration enables the user to, for example, look for something to watch within the guide portion of the live TV application and to decide (without changing scenes in the ZUI) to record that show since it is on later, as shown in Figure 8.
  • selection of the entry "Judge Alex" within the guide at a time (2:30pm) which is later than the current time results in a DVR overlay being displayed directly on the guide, i.e., without taking the user to another ZUI scene.
  • the DVR overlay 800 includes some information about the selected TV show plus three selectable options - "My Shows", "More Info" and "Record" - any of which can be pointed to and selected. Selecting the "Record" button in overlay 800 will result in the selected show being queued for recording by the multimedia system.
  • Selecting the "My Shows" button in the overlay 800 will provide the user with a browsable listing of all of his or her shows which have been recorded (as well as those scheduled to be recorded) using, e.g., a visual directory of images, an example of which is illustrated in Figure 9.
  • this exemplary visual directory is scrollable vertically by use of the scroll bar provided to the right of the matrix of images.
  • horizontal scrolling could be supported by providing a scroll bar above or below the matrix, either together with or as an alternative to vertical scrolling.
  • Selecting one of the TV shows which have been recorded to view more details can be accomplished by pointing at and clicking on a corresponding TV show image. For example, selecting the TV show image of "Alias" might result in the ZUI screen shown in Figure 10 which provides ready access to recorded episodes as well as a schedule of upcoming episodes scheduled to be recorded.
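The scrollable visual directory can be sketched as a windowed grid layout, where the scroll-bar position selects which rows of images are on screen. The grid dimensions below are illustrative.

```python
def visible_images(total, columns, visible_rows, scroll_row):
    """Return the indices of the images shown for the current
    scroll-bar position in a vertically scrollable grid."""
    first = scroll_row * columns
    last = min(total, first + columns * visible_rows)
    return list(range(first, last))

# 30 recorded shows, 4 images per row, 3 rows on screen,
# scrolled down by one row:
window = visible_images(30, 4, 3, 1)
```

A horizontally scrollable variant, as mentioned above, would simply swap the roles of rows and columns in the same calculation.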
  • Yet another feature of the live TV application is the use of visual directories for on-demand services.
  • the visual directories offer the consumer the same experience as when they walk into a video store. They can see many movie covers organized into appropriate categories, thereby providing an approach that effectively scales with the breadth of available content.
  • a zoomable Visual Directory™ shows as many as 128 titles per screen and can scale using the pan and scroll features to support thousands of titles in a simple to use structure.
  • the information structure of the Visual Directory™ flattens the hierarchy of total available options and, using linkages, supports the various methods that consumers use to search for content. They can search directly or they can browse through categories or use links embedded in the listings to improve access to a wide range of content choices.
  • VOD features include scaling, filters, sorting, pay-per-view, and rental management.
  • Still another feature of the live TV application is a search capability.
  • a live TV application can include a search function (reachable by, among other techniques, a global navigation button as described above) from which users can search for specific content using keywords, names, titles and date or time information.
  • the search system provides filtered results from TV listings, the DVR manager, and video on demand database.
  • the consumer can easily point at the desired search result in a visual list of options.
  • predictive methods are employed to minimize the number of "free-space operations" required to enter the desired request.
  • An example of this latter type of search is shown in Figure 11 wherein a user is trying to determine if any movies having Tom Hanks as an actor are playing on available VOD selections.
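The predictive method can be sketched as prefix matching over the metadata titles, so that a few entered characters yield a short, pointable result list instead of requiring the full text to be entered in free space. The title list is invented for illustration.

```python
# Invented stand-in for titles drawn from TV listings, the DVR manager,
# and the video-on-demand database.
TITLES = ["Forrest Gump", "Philadelphia", "Cast Away", "Castle", "Toy Story"]

def predict(prefix, titles=TITLES, limit=5):
    """Return titles matching the typed prefix, case-insensitively,
    capped at a short, pointable list."""
    p = prefix.lower()
    return [t for t in titles if t.lower().startswith(p)][:limit]

# Typing "cast" narrows many choices to a short visual list of options:
matches = predict("cast")
```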
  • a media application can provide a comprehensive suite of personal multimedia content navigation and media management applications including music, photos and home videos that directly address convergence in the home.
  • This exemplary media application organizes digital media content from a consumer's personal collection (e.g., on a personal computer or other networked device) and integrates it with content delivered by a service provider (over a broadband connection) to present all available digital media in a consistent user interface on the TV.
  • this exemplary media application provides for the creation of photo slide shows wherein users can point-and-click on their favorite photos to create instant slide shows displayed on their televisions.
  • Custom playlists of music can be readily created by applying the Visual Directory™ to a user's personal music collection.
  • if the user wants easy access to video clips delivered by a service provider or those clips previously downloaded to a personal computer, a simple point-and-click creates a custom video playlist for playback to the TV.
  • the reader is referred to U.S. Patent Application Serial No. 11/354,329, entitled "Methods and Systems for Enhancing Television Using 3D Pointing", filed on February 14, 2006, the disclosure of which is incorporated here by reference.
  • an interactive shopping application is created that provides the experience of a virtual mall on TV which allows consumers to shop in a comfortable and secure setting while providing a community aspect to the experience.
  • This shopping application leverages the metadata created for online shopping destinations and reformats this information to present it in an interactive and visually appealing manner that is optimized for the TV.
  • Running the exemplary shopping application, a user can point at an item in a shopping list or a visual presentation of goods and either automatically add it to a shopping cart or simply buy it.
  • a number of different categories of items for sale can be depicted on the TV (or other display) using some generic phrases and images.
  • when a user pauses the cursor 1200 over a particular category, e.g., "Handbags" in Figure 12(b), that image is magnified slightly to denote that it has the current focus.
  • a zoom in can be performed on the "Handbag” category, revealing a bookshelf (visual directory) of handbags as shown in Figure 12(c).
  • the cursor's position indicates a current selection within the bookshelf, reflected by the hoverzoom of the "Smooth Leather" category of items in Figure 12(c).
  • Another zoom in can be performed, again either automatically after an elapsed period of pointing at this category or in response to a specific user input via the handheld device, resulting in a more detailed view of this category as shown in Figure 12(d).
  • Selection of an image of a particular handbag may result in a zoom in to the detailed view of Figure 12(e), e.g., using the zooming, panning and/or translating effects described above.
  • a user can easily navigate cross-links within this exemplary shopping application by pointing at the ones of interest (like other goods from same store, matching accessories, similar designers). For example, as shown in Figure 12(e) a crosslink to a shoe accessory is displayed as an image 1202. If that link is activated, the user can jump to a detailed view of that item without having to navigate through the various higher level screens to reach it, as seen in Figure 12(f).
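The dwell-triggered magnification (hoverzoom) described above can be sketched as a timer that fires when the cursor stays on one target past a threshold and resets when the cursor moves to a new target. The threshold value is an assumption.

```python
# Assumed dwell threshold; the real value is an implementation choice.
DWELL_SECONDS = 0.5

class HoverZoom:
    """Tracks cursor dwell and reports when an item should be magnified."""

    def __init__(self, threshold=DWELL_SECONDS):
        self.threshold = threshold
        self.target = None
        self.since = 0.0

    def on_pointer(self, target, now):
        """Feed pointer samples; returns the item to magnify, or None."""
        if target != self.target:
            self.target, self.since = target, now  # new target: restart dwell
            return None
        if target is not None and now - self.since >= self.threshold:
            return target                          # dwelled long enough: zoom
        return None

hz = HoverZoom()
hz.on_pointer("Handbags", 0.0)             # cursor arrives over the category
focused = hz.on_pointer("Handbags", 0.6)   # still there past the threshold
```

The same mechanism can drive the automatic zoom in "after an elapsed period of pointing" mentioned for the category views.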
  • One other component of multimedia systems according to these exemplary embodiments, in addition to the afore-described hardware/software architectures and applications, is the remote control device with which the user interacts with the various ZUI screens to select and consume content.
  • the number of zoom levels, as well as the particular information and controls provided to the user at each level may be varied.
  • the present invention provides techniques for presenting large and small sets of media items using a zoomable interface such that a user can easily search through, browse, organize and play back media items such as movies and music.
  • Graphical user interfaces according to the present invention organize media item selections on a virtual surface such that similar selections are grouped together. Initially, the interface presents a zoomed out view of the surface, and in most cases, the actual selections will not be visible at this level, but rather only their group names. As the user zooms progressively inward, more details are revealed concerning the media item groups or selections.
  • Zooming graphical user interfaces can contain categories of images nested to an arbitrary depth as well as categories of categories.
  • the media items can include content which is stored locally, broadcast by a broadcast provider, received via a direct connection from a content provider or on a peering basis.
  • the media items can be provided in a scheduling format wherein date/time information is provided at some level of the GUI.
  • frameworks and GUIs according to exemplary embodiments of the present invention can also be applied to television commerce wherein the items for selection are being sold to the user.
  • Zoomable user interfaces employ various transition effects to instill a sense of spatial positioning within the ZUI "world” as a user navigates among content selections.
  • zooming refers to, for example, the progressive scaling of a displayed object, set of objects or a portion thereof that gives the visual impression of movement of all or part of such object(s) toward or away from an observer.
  • the zooming feature in some instances, causes the display of the object or objects to change from a distant view to a close view, and vice versa, as though the end user were manipulating a telescope, a magnifying glass, or a zoom lens of a camera.
  • semantic zooming may be employed to provide a similar progressive scaling on the display, yet adding or hiding details which would not necessarily be added or hidden when using a "pure" camera zoom.
  • a panning transition refers to the progressive translating of a displayed object, set of objects or a portion thereof that gives the visual impression of lateral movement of the object(s).
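Such zoom and pan transitions can be sketched as progressive interpolation of the camera's scale and position between two views; the frame count and coordinates below are illustrative.

```python
def transition(start, end, frames):
    """Yield (scale, x, y) camera states from start to end, inclusive,
    giving the visual impression of movement toward or away from the
    observer (zoom) and lateral movement (pan)."""
    steps = []
    for i in range(frames + 1):
        t = i / frames  # interpolation parameter, 0.0 -> 1.0
        steps.append(tuple(a + (b - a) * t for a, b in zip(start, end)))
    return steps

# Zoom from the full view (scale 1 at the origin) in toward an item
# centered at (120, 80), scaling up 4x over the animation:
path = transition((1.0, 0.0, 0.0), (4.0, 120.0, 80.0), frames=4)
```

Semantic zooming, as described above, would additionally add or hide detail at chosen scale thresholds along this path rather than performing a "pure" camera zoom.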
  • Systems and methods for processing data according to exemplary embodiments of the present invention can be performed by one or more processors executing sequences of instructions contained in a memory device. Such instructions may be read into the memory device from other computer-readable mediums such as secondary data storage device(s). Execution of the sequences of instructions contained in the memory device causes the processor to operate, for example, as described above. In alternative embodiments, hardwire circuitry may be used in place of or in combination with software instructions to implement the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods according to exemplary embodiments of the present invention provide a user interface including an electronic program guide and scrollable visual directories.

Description

MULTIMEDIA SYSTEMS, METHODS AND APPLICATIONS
RELATED APPLICATIONS
[0001] This application is related to, and claims priority from, U.S. Provisional
Patent Application Serial No. 60/791,596 filed on December 2, 2005, entitled "Home Multimedia Environment" to Daniel S. Simpkins et al., the disclosure of which is incorporated here by reference and U.S. Provisional Patent Application Serial No. 60/755,819, entitled "Spontaneous Navigation System", filed on January 4, 2006, the disclosure of which is incorporated here by reference.
BACKGROUND
[0002] This application describes, among other things, multimedia systems, methods and applications running thereon.
[0003] Technologies associated with the communication of information have evolved rapidly over the last several decades. Television, cellular telephony, the Internet and optical communication techniques (to name just a few things) combine to inundate consumers with available information and entertainment options. Taking television as an example, the last three decades have seen the introduction of cable television service, satellite television service, pay-per-view movies and video-on-demand. Whereas television viewers of the 1960s could typically receive perhaps four or five over-the-air TV channels on their television sets, today's TV watchers have the opportunity to select from hundreds, thousands, and potentially millions of channels of shows and information. Video-on-demand technology, currently used primarily in hotels and the like, provides the potential for in-home entertainment selection from among thousands of movie titles. [0004] The technological ability to provide so much information and content to end users provides both opportunities and challenges to system designers and service providers. One challenge is that while end users typically prefer having more choices rather than fewer, this preference is counterweighted by their desire that the selection process be both fast and simple. Unfortunately, the development of the systems and interfaces by which end users access media items has resulted in selection processes which are neither fast nor simple. Consider again the example of television programs. When television was in its infancy, determining which program- to watch was a relatively simple process primarily due to the small number of choices. One would consult a printed guide which was formatted, for example, as series of columns and rows which showed the correspondence between (1) nearby television channels, (2) programs being transmitted on those channels and (3) date and time. 
The television was tuned to the desired channel by adjusting a tuner knob and the viewer watched the selected program. Later, remote control devices were introduced that permitted viewers to tune the television from a distance. This addition to the user-television interface created the phenomenon known as "channel surfing" whereby a viewer could rapidly view short segments being broadcast on a number of channels to quickly learn what programs were available at any given time.
[0005] Despite the fact that the number of channels and amount of viewable content has dramatically increased, the generally available user interface, control device options and frameworks for televisions have not changed much over the last 30 years. Printed guides are still the most prevalent mechanism for conveying programming information. The multiple button remote control with up and down arrows is still the most prevalent channel/content selection mechanism. The reaction of those who design and implement the TV user interface to the increase in available media content has been a straightforward extension of the existing selection procedures and interface objects. Thus, the number of rows in the printed guides has been increased to accommodate more channels. The number of buttons on the remote control devices has been increased to support additional functionality and content handling, e.g., as shown in Figure 1. However, this approach has significantly increased both the time required for a viewer to review the available information and the complexity of actions required to implement a selection. Arguably, the cumbersome nature of the existing interface has hampered commercial implementation of some services, e.g., video-on-demand, since consumers are resistant to new services that will add complexity to an interface that they view as already too slow and complex.
[0006] In addition to increases in bandwidth and content, the user interface bottleneck problem is being exacerbated by the aggregation of technologies. Consumers are reacting positively to having the option of buying integrated systems rather than a number of segregable components. An example of this trend is the combination television/VCR/DVD in which three previously independent components are frequently sold today as an integrated unit. This trend is likely to continue, potentially with an end result that most if not all of the communication devices currently found in the household will be packaged together as an integrated unit, e.g., a television/VCR/DVD/internet access/radio/stereo unit. Even those who continue to buy separate components will likely desire seamless control of, and interworking between, the separate components. With this increased aggregation comes the potential for more complexity in the user interface. For example, when so-called "universal" remote units were introduced, e.g., to combine the functionality of TV remote units and VCR remote units, the number of buttons on these universal remote units was typically more than the number of buttons on either the TV remote unit or VCR remote unit individually. This added number of buttons and functionality makes it very difficult to control anything but the simplest aspects of a TV or VCR without hunting for exactly the right button on the remote. Many times, these universal remotes do not provide enough buttons to access many levels of control or features unique to certain TVs. In these cases, the original device remote unit is still needed, and the original hassle of handling multiple remotes remains due to user interface issues arising from the complexity of aggregation. Some remote units have addressed this problem by adding "soft" buttons that can be programmed with the expert commands. These soft buttons sometimes have accompanying LCD displays to indicate their action.
These too have the flaw that they are difficult to use without looking away from the TV to the remote control. Yet another flaw in these remote units is the use of modes in an attempt to reduce the number of buttons. In these "moded" universal remote units, a special button exists to select whether the remote should communicate with the TV, DVD player, cable set-top box, VCR, etc. This causes many usability issues, including sending commands to the wrong device, forcing the user to look at the remote to make sure that it is in the right mode, and providing no real simplification of the integration of multiple devices. The most advanced of these universal remote units provide some integration by allowing the user to program sequences of commands to multiple devices into the remote. This is such a difficult task that many users hire professional installers to program their universal remote units.
[0007] Some attempts have also been made to modernize the screen interface between end users and media systems. However, these attempts typically suffer from, among other drawbacks, an inability to easily scale between large collections of media items and small collections of media items. For example, interfaces which rely on lists of items may work well for small collections of media items, but are tedious to browse for large collections of media items. Interfaces which rely on hierarchical navigation (e.g., tree structures) may be speedier to traverse than list interfaces for large collections of media items, but are not readily adaptable to small collections of media items. Additionally, users tend to lose interest in selection processes wherein the user has to move through three or more layers in a tree structure. For all of these cases, current remote units make this selection process even more tedious by forcing the user to repeatedly depress the up and down buttons to navigate the list or hierarchies. When selection skipping controls are available such as page up and page down, the user usually has to look at the remote to find these special buttons or be trained to know that they even exist. Accordingly, organizing frameworks, techniques and systems which simplify the control and screen interface between users and media systems as well as accelerate the selection process, while at the same time permitting service providers to take advantage of the increases in available bandwidth to end user equipment by facilitating the supply of a large number of media items and new services to the user have been proposed in U.S. Patent Application Serial No. 10/768,432, filed on January 30, 2004, entitled "A Control Framework with a Zoomable Graphical User Interface for Organizing, Selecting and Launching Media Items", the disclosure of which is incorporated here by reference. 
[0008] Also of particular interest for this specification are the remote devices usable to interact with such frameworks, as well as other applications, systems and methods for these remote devices for interacting with such frameworks. As mentioned in the above-incorporated application, various different types of remote devices can be used with such frameworks including, for example, trackballs, "mouse"-type pointing devices, light pens, etc. However, another category of remote devices which can be used with such frameworks (and other applications) is 3D pointing devices with scroll wheels. The phrase "3D pointing" is used in this specification to refer to the ability of an input device to move in three (or more) dimensions in the air in front of, e.g., a display screen, and the corresponding ability of the user interface to translate those motions directly into user interface commands, e.g., movement of a cursor on the display screen. The transfer of data between the 3D pointing device and another device may be performed wirelessly or via a wire connecting the 3D pointing device to that other device. Thus "3D pointing" differs from, e.g., conventional computer mouse pointing techniques which use a surface, e.g., a desk surface or mousepad, as a proxy surface from which relative movement of the mouse is translated into cursor movement on the computer display screen. An example of a 3D pointing device can be found in U.S. Patent Application No. 11/119,663, the disclosure of which is incorporated here by reference.
SUMMARY
[0009] According to one exemplary embodiment of the present invention, a scrollable visual directory display includes a plurality of images each associated with a selectable media item, the plurality of images arranged in a rectangular matrix with a first number of images in each row and a second number of images in each column and a scroll bar on one side of the plurality of images for scrolling said rectangular matrix.
[0010] According to another exemplary embodiment, an electronic program guide
(EPG) responsive to pointing inputs and having an integrated digital video recorder (DVR) function includes a grid displayed on a display screen, the grid having a plurality of program selections displayed therein, a cursor displayed as a moveable overlay on the grid, the cursor responsive to the pointing inputs to provide random access to the plurality of program selections, and wherein when the cursor is positioned over one of the plurality of program selections, a visual indication of focus is provided to the one of said plurality of program selections; and further wherein when a selection command is received by the electronic program guide, a DVR control overlay is displayed on the grid.
[0011] According to another exemplary embodiment, a scrollable visual directory display includes a plurality of images each associated with a selectable media item, the plurality of images arranged in a rectangular matrix with a first number of images in each row and a second number of images in each column, and a scroll bar on one side of the plurality of images for scrolling the rectangular matrix.

BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The accompanying drawings illustrate exemplary embodiments of the present invention, wherein:
[0013] FIG. 1 depicts a conventional remote control unit for an entertainment system;
[0014] FIG. 2 depicts an exemplary multimedia system architecture according to an exemplary embodiment of the present invention;
[0015] FIG. 3 shows an exemplary device client software architecture according to an exemplary embodiment of the present invention;
[0016] FIGS. 4-11 illustrate scenes from a live TV application according to an exemplary embodiment of the present invention; and
[0017] FIGS. 12(a)-12(f) illustrate scenes from a shopping application according to an exemplary embodiment of the present invention.
DETAILED DESCRIPTION
[0018] The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims.
[0019] According to exemplary embodiments of the present invention, multimedia systems and methods provide, among other things, the ability to: a) navigate entertainment choices in a way that is simple and compelling, b) unify and present disparate applications in a seamless fashion, and c) extend a consistent navigation method across many different network connected devices using, for example, standards-based protocols and networking hardware. The below-described navigation and content management systems and methods combine 3D pointing technology with a graphical presentation of content options. This approach allows, for example, consumers to select their content choices on a television screen in a manner which is similar to the way that they would use a mouse to make selections on a computer. However, this approach is much more powerful than today's operating systems because it interacts with one or more relational databases of available content, referred to below as residing on a metadata server, to help consumers browse for desired content by making recommendations of related content and products based on consumer usage and appropriate business rules. These and other features of multimedia systems and methods according to these exemplary embodiments will become more apparent upon reviewing some detailed, yet purely illustrative embodiments of system architecture (including hardware and software architecture) described below beginning with Figure 2.

[0020] Therein, it can be seen that exemplary multimedia systems and methods provide for an open architecture 200 which is built as a modular, scalable, data-driven client-server system. This enables simple navigation of content (such as movies, music or TV shows) on hardware that resides in the consumer's home. Since the total amount of available content can be quite large (e.g., many tens of thousands of items), the total amount of relevant metadata is very large as well.
In the most general case, the entire metadata set cannot be stored entirely at the client. Therefore, exemplary multimedia systems and methods are designed so that the metadata is stored at the metadata server 202 and then delivered to the client on demand. Exemplary techniques for implementing a metadata server 202, as well as for processing metadata from various databases 201 to be supplied to client devices, can be found in U.S. Patent Application Serial No. 11/037,897, entitled "A Metadata Brokering Server", filed on January 18, 2005 and U.S. Patent Application Serial No. 11/140,885, entitled "Method and Apparatus for Metadata Organization for Video on Demand (VOD) Systems", filed on May 31, 2005, respectively, the disclosures of which are hereby incorporated by reference.
[0021] The open aspect of the architecture 200 can best be understood by examining the interfaces shown in Figure 2. For example, the HID-FSN (Human Interface Device for Free-Space Navigation) interface is shown on the right side of Figure 2. According to an exemplary embodiment of the present invention, all applications are controlled by commands that come in via the HID-FSN interface from the remote control 204 to the device client 206 or embedded client 208, the latter of which is part of the navigation system software provided on the client device (e.g., a set-top box, a game console, a processor associated with a television itself or the like). The response(s) to those commands are then rendered by the client 206, 208 on a display (such as a TV screen, not shown in Figure 2). This user interaction drives operations in the multimedia system's applications according to these exemplary embodiments, examples of which are described below.

[0022] In response to user commands conveyed over the HID-FSN interface, the client 206, 208 requests relevant metadata from the server 202 using the MQL (Metadata Query Language) interface shown in Figure 2. After receiving the appropriate metadata in response to its MQL request, the client 206, 208 displays it in an appropriate format, e.g., on a TV screen, for the particular application which is currently being served by the multimedia system. Examples of higher level applications are provided below. Even the application itself can be parceled out on demand from the server 202. In this exemplary embodiment, a virtual client 210 is shown as part of the architecture 200. The virtual client 210 runs on the server side and manages delivery of application pieces. Those skilled in the art will recognize, however, that some client platforms are capable of housing the full application and, for such implementations, the virtual client 210 can reside on the client device, e.g., in combination with the embedded client 208.
The application pieces themselves are sent from the server to the client encoded in a Zoomable User Interface Mark-up Language (ZML). Those interested in more details regarding ZML are directed to U.S. Patent Application Serial No. , filed on the same date as the present application, entitled "Scene Transitions in a Zoomable User Interface Using Zoomable Markup Language", the disclosure of which is incorporated here by reference.
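The on-demand delivery of application pieces can be illustrated with a brief sketch. The following Python fragment is purely illustrative and not part of the claimed subject matter: the class and method names are hypothetical, and the markup strings merely stand in for ZML fragments, whose actual syntax is described in the above-referenced application rather than here.

```python
# Hypothetical sketch: a server-side virtual client parcels out scene markup,
# and the device client fetches and caches scenes only as the user navigates.

class VirtualClient:
    """Server-side component that delivers application pieces on demand."""
    def __init__(self, scenes):
        self._scenes = scenes            # scene id -> markup fragment

    def get_scene(self, scene_id):
        return self._scenes[scene_id]

class DeviceClient:
    """Client that requests scenes over the (assumed) ZML interface as needed."""
    def __init__(self, virtual_client):
        self._server = virtual_client
        self._cache = {}

    def navigate_to(self, scene_id):
        if scene_id not in self._cache:  # fetch on demand, then reuse locally
            self._cache[scene_id] = self._server.get_scene(scene_id)
        return self._cache[scene_id]

server = VirtualClient({"home": "<scene id='home'/>", "movies": "<scene id='movies'/>"})
client = DeviceClient(server)
client.navigate_to("home")    # first visit: fetched from the server
client.navigate_to("home")    # second visit: served from the local cache
```

The cache models the memory-efficiency point made above: the full spatial world exists at creation time, but only visited scenes occupy client memory.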
[0023] The architecture 200 also includes data flows associated with toolkits to the server 202 and virtual client 210. These data flows occur either when new applications are created using application toolkit 212 or when new metadata sources are added to the server 202 using metadata toolkit 214. In either case, the creation of the relevant code is done offline and then published to the server. If it is a new application, it is sent to the virtual client 210 on the server in, for example, ZML format. If it is a new or modified metadata source description, it is sent in, for example, MDL (Metadata Description Language) format. [0024] There are at least three kinds of modularity associated with multimedia systems and methods according to these exemplary embodiments. First, the techniques described herein for providing multimedia user interfaces are based on a model of the user interface as an ordered sequence of semantic interactions. Each individual interaction in a user interface is the sequence of input commands and output display responses that correspond to a particular element of a given application screen. For example, if the user interface screen includes a list box, the manipulation and display of that list box would be a single interaction. Those individual interactions within a single element are referred to herein as "semantic" because they are restricted in meaning to that specific interaction element. The entire "conversation" of the user interface across both the current screen and all application screens can therefore be seen as an ordered sequence of those semantic interactions. Exemplary multimedia systems and methods described herein use this semantic interaction model and partition applications into associated building blocks — elements referred to herein as "bricks."
[0025] Typically, television application developers either construct their software as a monolithic and integrated software application or use a markup language like HTML. In the former case, applications are highly optimized, consistent and have excellent bandwidth management. However, they are also very slow to develop, release and upgrade. In the latter case, development time is very fast but performance and consistency are not controlled very well. Distributed software construction using bricks enables multimedia systems and methods according to these exemplary embodiments to obtain the advantages of both techniques.
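As a purely illustrative sketch of this distributed construction (all names here are hypothetical and not drawn from this application), a single brick might bundle the server-side query, the display preparation and the client-side interaction for one screen element:

```python
# Hypothetical "brick" for a list box: one unit carrying the metadata query,
# the display preparation, and the user-interaction handler for that element.

class ListBoxBrick:
    def __init__(self, query):
        self.query = query                    # metadata-server portion (a query)

    def prepare(self, metadata):              # virtual-client portion (display prep)
        return [item["title"] for item in metadata]

    def on_select(self, titles, index):       # device-client portion (interaction)
        return f"selected: {titles[index]}"

brick = ListBoxBrick(query="top_movies")
titles = brick.prepare([{"title": "Movie A"}, {"title": "Movie B"}])
result = brick.on_select(titles, 1)           # -> "selected: Movie B"
```

Because all three portions travel together, the brick can be reused on any screen that needs a list box, which is the repeatability benefit described below.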
[0026] Bricks encompass all of the attributes and functionality associated with a particular interaction regardless of where in the overall system that interaction occurs. For example, the processing of a particular push button on a given screen can involve code in the metadata server 202 (e.g., a query), in the virtual client 210 (e.g., the display and/or transition preparation) and in the client 206, 208 (e.g., the display and interaction with the user). Since each element of this code is relevant to the specific interaction, it is bundled together in a single brick. Each brick then represents different types of user interactions which perform different functions. Since the functions and interactions for each brick differ in meaning from those in other bricks, each set of interactions embedded in a single brick is referred to herein as "semantic". Bundling each interaction along with the other aspects of the brick, such as the graphics used on the display, makes each brick an individual design element of a user interface language. The entire application and user interface is then built by assembling bricks together in the way the application designer wants. Using bricks in this way adds repeatability and consistency to the entire application. Finally, bricks provide the performance advantage of a monolithic application and the flexibility and creation advantage of an application developed with a markup language. For those interested in more details regarding bricks, U.S. Patent Application Serial No. 11/325,749, filed on January 5, 2006 and entitled "Distributed Software Construction for User Interfaces", is expressly incorporated here by reference.

[0027] A second aspect of modularity in multimedia systems and methods according to these exemplary embodiments is an object model that permits unique functionality required by the individual bricks to be separately organized as services associated with contextual objects in the system.
This represents an abstraction of physical devices along with their associated services and permits the complete separation of specific, non-navigational functionality from the semantic interaction encapsulation of the brick system. This abstraction enables bricks to be designed and evolved separately from the object and service model. In turn, this allows for quicker and more efficient design and porting of multimedia systems and methods according to these exemplary embodiments to different platforms. Significant meta-objects associated with the object model are "visual display", "audio output", "human interface device", "timer" and "database". The visual display object handles all elements of the display. The audio output object handles all elements of audio output. The human interface device object handles all elements of cursor and command input. The timer object handles all services associated with date and time. The database object handles all aspects of persistent storage including metadata and options.

[0028] For each object, significant services include Get, Set, Subscribe and Notify.
The basic operation Get returns data while its companion operation Set stores data. With these two operations, all data manipulation can be performed. The other pair of operations, Subscribe and Notify, combines to allow for event generation and monitoring. These operations allow for implementation of asynchronous events. This is a virtual abstraction that simplifies overall integration. Actual integration with a particular hardware OS and middleware can be performed at a lower level and is described below with respect to the client 206, 208. The entire non-navigational functionality of systems in accordance with these exemplary embodiments can be built up from these four basic operations on the five basic meta-objects. This simplicity and clean structure enables better design and validation of the final system.
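The four basic operations can be sketched as follows. This Python fragment is an illustrative assumption about how Get, Set, Subscribe and Notify might combine on a single meta-object; the event plumbing shown here is not specified by this application.

```python
# Hypothetical meta-object supporting the four basic operations:
# Get/Set for data manipulation, Subscribe/Notify for event monitoring.

class MetaObject:
    def __init__(self):
        self._data = {}
        self._subscribers = {}

    def get(self, key):                       # Get: return stored data
        return self._data.get(key)

    def set(self, key, value):                # Set: store data, then notify
        self._data[key] = value
        self._notify(key, value)

    def subscribe(self, key, callback):       # Subscribe: register for events
        self._subscribers.setdefault(key, []).append(callback)

    def _notify(self, key, value):            # Notify: deliver the event
        for callback in self._subscribers.get(key, []):
            callback(key, value)

timer = MetaObject()                          # e.g., the "timer" meta-object
events = []
timer.subscribe("alarm", lambda key, value: events.append(value))
timer.set("alarm", "07:00")                   # Notify fires the callback
```

In this sketch, any of the five meta-objects (display, audio, human interface device, timer, database) could be represented the same way, with platform-specific integration hidden below this level.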
[0029] A third element of modularity involves the establishment of clear interfaces between system components. Each of the architectural components of multimedia systems according to these exemplary embodiments, e.g., the application and metadata toolkits (212 and 214, respectively), the metadata server 202, the virtual client 210, the client 206, 208 and the 3D remote control device 204 — are connected by clearly defined application programming interfaces (APIs). These APIs facilitate each architectural element's function within the multimedia system. For example, as mentioned above, the metadata toolkit 214 is used to describe new metadata sources so that the metadata server 202 can ingest them using an interface referred to herein as metadata description language (MDL). According to one exemplary embodiment, metadata description language specifies a mapping between, e.g., the XML data source and a proprietary format (MDL) that is natively recognized by the multimedia system. In more complex applications, the metadata toolkit 214 provides the application designer with the ability to specify expert rules to account for metadata exceptions and to specify links between metadata types. This linkage is what allows different applications to be tied together. For example, if a consumer is viewing information about a movie, the application may provide a link to that movie's soundtrack.

[0030] Similarly, the application toolkit 212 is used to construct applications out of bricks via a Zoomable Mark-up Language (ZML) interface, ZML being described in detail in the above-incorporated-by-reference patent application. Multimedia systems and methods according to exemplary embodiments of the present invention provide an object-based application design method to simplify and speed up the creation of new applications.
For example, an on-screen volume control would be implemented once as a brick and would be reused wherever the application designer needs to provide access to volume. Each building block then becomes part of a library that can be reused as often as desired. The result is that the user experience is easily made common across a wide range of applications. The designer uses the application toolkit 212 to customize and assemble the bricks to form new applications, which dramatically reduces the amount of time it takes to create new applications. The entire process of constructing an application can be performed in the toolkit 212. The first step is to use the brick component library to add elements to the screen at the appropriate position. The second step is to set all the options for the screen and each brick according to the application designer's wishes. The third step is to add any necessary functionality over and above the pre-defined Bricks in the embedded JavaScript associated with this scene. The fourth step is to adjust the graphics of each element and the screen itself so that the appearance of the scene is according to the application designer's wishes. The last step is to run and test the candidate screen. This can be done from within the toolkit 212 so that errors can be quickly and easily caught and corrected. Some exemplary applications are described below.
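The assembly steps described above can be sketched in simplified form. The following Python fragment is hypothetical — the brick library, option names and script hook are invented for illustration — and merely shows how a screen might be composed from a library of pre-defined bricks:

```python
# Hypothetical brick component library (step 1 draws from this).
BRICK_LIBRARY = {"volume": {"type": "slider"}, "listbox": {"type": "list"}}

def build_screen(layout, options, script=None):
    """Assemble a candidate screen from library bricks."""
    screen = {"bricks": [], "script": script}
    for name, position in layout:                  # step 1: place bricks on screen
        brick = dict(BRICK_LIBRARY[name], position=position)
        brick.update(options.get(name, {}))        # step 2: set per-brick options
        screen["bricks"].append(brick)
    return screen                                  # steps 4-5: adjust graphics, test

screen = build_screen(
    layout=[("volume", (10, 20)), ("listbox", (10, 60))],
    options={"volume": {"range": (0, 100)}},
    script="onSelect: zoomTo('details')",          # step 3: embedded script hook
)
```

The volume brick here illustrates the reuse point above: the same library entry can be placed on any screen that needs volume access, with only its options varying.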
[0031] The metadata server 202 receives the required metadata from various sources, maps them to a common structure and then responds to queries from the virtual client 210. For example, as applications run on multimedia systems according to these exemplary embodiments, they send requests for specific metadata such as "the current top ten video on demand movies" to the server using the MQL (Metadata Query Language) interface. Each application in multimedia systems according to these exemplary embodiments allows the user to view and browse specific metadata which, as described above, is information about the various content choices available to the consumer. For example, as described below, a TV application manipulates metadata about movies or TV programs and then allows the consumer to watch selected content that the metadata describes. According to exemplary embodiments, it is anticipated that the metadata neither comes from a single source nor is it entirely accurate or complete. Therefore, a significant aspect of the overall experience associated with these multimedia methods comes from gathering, correcting, and linking or connecting the available metadata in the relational database on the metadata server 202. The following sections describe four functions of the server 202: adaptation, aggregation, queries and crosslink generation.
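Since the syntax of MQL is not set out in this specification, the following Python sketch simply represents such a parameterized query ("the current top ten video on demand movies") as plain data evaluated against an in-memory catalog; all field names are hypothetical:

```python
# Hypothetical catalog of metadata records held by the server.
CATALOG = [
    {"title": "Movie A", "category": "vod", "rank": 1},
    {"title": "Movie B", "category": "vod", "rank": 2},
    {"title": "Show C", "category": "tv", "rank": 1},
]

def run_query(source, where, order_by, limit):
    """Parameterized retrieval: filter (search), sort, and limit."""
    rows = [r for r in source if all(r[k] == v for k, v in where.items())]
    rows.sort(key=lambda r: r[order_by])           # sorting as a primitive service
    return rows[:limit]

# "the current top ten video on demand movies"
top_vod = run_query(CATALOG, where={"category": "vod"}, order_by="rank", limit=10)
```

Searching and sorting appear here as the primitive services named below; a full query language would add richer predicates, but the filter-sort-limit shape is the essential idea.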
[0032] One important operation of the metadata server 202 is to obtain new and/or updated metadata from external sources. This process is referred to herein as "adaptation" since its main function is format adaptation. A complete adaptation process can also include normalization and validation. The normalization process provides for putting all metadata in a standard format so that it can be used by the applications. The validation process ensures that the metadata is real and correct. The result is a dataset that feeds the aggregation process. Typically, all of the metadata needed to run a complete system does not come from a single source. Therefore, another function of the metadata server 202 is to aggregate the various underlying sources together. The result is a more complete and accurate dataset which allows for better navigation. Metadata sources include both third party metadata licensors as well as the individual users themselves. For example, the names of all photos in a consumer's photo library would constitute a metadata source.

[0033] The metadata consolidated in the database is accessed by the various applications of multimedia systems and methods according to queries posed by the virtual client 210. Multimedia systems according to exemplary embodiments can support a complete query language that allows for full, parameterized construction of the retrieval criteria needed by the application developer. Both searching and sorting are offered as primitive services. Yet another function of the metadata server 202 is to generate cross-linking (special connections) across the metadata. In order to do this, multimedia systems according to these exemplary embodiments can be designed to filter, validate and normalize the ingested metadata and then use inference and matching rules. The result is a rich, cross-linked metadata set that enables seamless browsing both within and across categories of metadata.
For example, the user can examine the content related to the movie Independence Day, see that Will Smith starred in it, and notice that Smith produced a record album called Born to Reign. In this hypothetical example, the consumer starts at a movie and browses seamlessly through music and even into commerce if the service offers a way to buy content or related products.
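The adaptation, aggregation and crosslink generation functions described above might be sketched as the following simplified pipeline. This Python fragment is illustrative only; the normalization and matching rules shown (lower-casing field names, linking records that share a person) are assumptions for the sketch, not the rules actually used by the metadata server:

```python
# Hypothetical server-side flow: normalize each source record, validate it,
# aggregate sources into one dataset, then crosslink by a shared field.

def normalize(record):
    """Put metadata into a standard format (here: lowercase keys, trimmed values)."""
    return {k.lower(): v.strip() for k, v in record.items()}

def validate(record):
    """Keep only records with a real title."""
    return bool(record.get("title"))

def aggregate(*sources):
    merged = [normalize(r) for src in sources for r in src]
    return [r for r in merged if validate(r)]

def crosslink(records):
    """Simple matching rule: link records that share a person."""
    by_person = {}
    for r in records:
        by_person.setdefault(r.get("person"), []).append(r)
    return {p: items for p, items in by_person.items() if len(items) > 1}

movies = [{"Title": " Independence Day ", "Person": "Will Smith"}]
albums = [{"TITLE": "Born to Reign", "PERSON": "Will Smith"}]
links = crosslink(aggregate(movies, albums))   # movie and album linked via Smith
```

The resulting link set is what lets a user browse from a movie into music, as in the Independence Day example above.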
[0034] The device client 206, 208, 210 is in communication with the metadata server to generate user interfaces in accordance with these exemplary embodiments. According to one exemplary embodiment, the virtual client 210 builds the full application set and handles the data delivery to the remote portions 206, 208 of the device clients. In such embodiments, the client software 206, 208 processes the basic semantic interactions with the end user and requests assistance as needed from the virtual client 210. The virtual client 210 preferably resides on the same hardware platform as the rest of the device client software 206, 208 but may reside separately if required, e.g., due to client hardware with relatively fewer processing resources.
[0035] Regardless of whether the device client software is split between the server side and the client side or not, it runs the applications in multimedia systems and methods according to these exemplary embodiments and also runs any services, such as a personal video recorder (PVR) service, that are used to support those applications. Some features of the device client 206-210 include visual presentation, spatial structure, semantic zooming, and low latency. Regarding visual presentation, since people can process an entire visual screen in about the same time it takes to process seven to ten words, user interfaces according to exemplary embodiments make extensive use of images as selectable media objects. In multimedia systems having, e.g., tens of thousands of content items to choose from, using images to obtain this extra cognitive efficiency is beneficial. Additionally, these exemplary embodiments provide a spatially-organized user interface in which users find it easier to navigate and to locate items. The result is that overall performance for content selection can be twice as fast as when a consumer uses plain image maps, e.g., without spatial connections between scenes.
[0036] All visual, spatial user interfaces are not necessarily ideal for displaying, e.g., multimedia content selection choices. If not properly designed, such user interfaces can easily result in information overload. Effective navigation with large amounts of content can be implemented by considering, for example, the following constraints: constrained out-degree from any given point (small views), small diameters (short paths), good residue at connected nodes and small outlink information (interlocking sets with shared residue) as described by, for example, George W. Furnas in his article "Effective View Navigation", Human Factors in Computing Systems, CHI '97 Conference Proceedings (ACM), Atlanta, GA, March 22-27, 1997. Semantic zooming according to exemplary embodiments provides a solution to all of these constraints. The user is presented only with information that is semantically relevant to him or her at that instant and that information is presented with the appropriate detail to make it useful. Combined with effective cross-linking as performed by the metadata engine, user interfaces that transition between scenes using semantic zooming provide a complete navigation solution. For example, when a list of images is shown, the user can receive an overlay of textual titles by moving the cursor over the selected picture.

[0037] Lastly, exemplary embodiments contemplate that a visual user interface with poor latency performance may be perceived as inferior to a user interface that is solely or primarily text-based, but which has better latency performance. Accordingly, techniques to improve latency in a visual system by the device client are also contemplated according to these exemplary embodiments. For example, one method for reducing latency according to these exemplary embodiments is to use the transitions associated with the zooming interface to mask long latency operations.
Transitions also allow the user to preserve spatial awareness and context so that she or he is never lost while navigating the application interface. An exemplary client architecture 300 for providing a user interface having the afore-described characteristics is illustrated in Figure 3.
[0038] Therein, at the bottom of the figure, are the meta-service objects of the platform which, together, form a zoomable user interface object model (ZOM). Specifically, these meta-service objects include a database object 302, a handheld device object 304, a timers object 306, a screen object 308 and a speakers object 310. An API boundary is provided between the client run-time software 312 and the middleware and services 302-310 provided by the platform itself. Thus the client run-time software 312, shown in the middle region of Figure 3, provides for the core event loop in the system. Various subroutines/functions are provided in the client run-time software 312 to interact with the service objects. For example, a database manager 318 interacts with the database object 302, an event processor 320 interacts with the handheld object 304 and timers object 306, while the rendering engine 322 interacts with the screen object 308 and the speakers object 310. The client run-time software 312 also includes a scene manager 314 which keeps track of the current camera view of the entire zoomable user interface. The scene manager 314 is also responsible for adjusting the camera view according to either user events (such as the pointer moving or a button press) or external ones (such as the phone ringing or a timer expiring). More information regarding camera views associated with exemplary ZUIs can be found in the above-incorporated by reference U.S. Patent Application Serial No. 10/768,432. The scene manager 314 calls upon the scene loader 316 as appropriate to navigate to another part of the ZUI (new scene) via the ZML interface. According to some exemplary embodiments, even though the full spatial world associated with the ZUI exists at creation time, it is only brought into the client's view as needed over the ZML interface. This allows for efficient memory utilization by the client. 
As mentioned above, bricks can be defined by a toolkit and stored in a brick library as shown in Figure 3.
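Purely by way of illustration, the division of labor between the core event loop and the scene manager 314 described above might be sketched as follows; the class and method names, the event representation, and the camera model are assumptions made for illustration only and are not taken from the specification:

```python
from collections import deque

class SceneManager:
    """Tracks and adjusts the current camera view over the zoomable
    world, analogous to scene manager 314."""
    def __init__(self):
        self.camera = {"x": 0.0, "y": 0.0, "zoom": 1.0}

    def on_pointer_move(self, dx, dy):
        # A pointer event pans the camera over the ZUI surface.
        self.camera["x"] += dx
        self.camera["y"] += dy

    def on_button_press(self):
        # A button press zooms in toward the focused scene.
        self.camera["zoom"] *= 2.0

class EventLoop:
    """Core event loop, analogous to client run-time 312: queues events
    from the handheld device and timers, then hands them to the scene
    manager for camera adjustment."""
    def __init__(self, scene_manager):
        self.scene_manager = scene_manager
        self.queue = deque()

    def post(self, event):
        self.queue.append(event)

    def run_once(self):
        # Drain the queue, dispatching each event by kind.
        while self.queue:
            kind, *args = self.queue.popleft()
            if kind == "pointer":
                self.scene_manager.on_pointer_move(*args)
            elif kind == "button":
                self.scene_manager.on_button_press()
```

In this sketch, external events (a timer expiring, a phone ringing) would simply be posted to the same queue, which is consistent with the single core event loop described above.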
[0039] Of course the applications which run on top of the various architectures described herein are a significant part of the entertainment system because they are the means that consumers use to get access to the content and services they want to consume. According to exemplary embodiments, each application is designed to provide a specific type of entertainment, information or communications function to make it easy for integrators to create products that are unique and aligned to their business interests. As described above, one advantage of multimedia systems in accordance with these exemplary embodiments is that new applications can be created using the visual development system toolkit. Three exemplary applications are described below: (1) a television application, (2) a media application, and (3) a shopping application.
[0040] According to exemplary embodiments of the present invention, the applications available in these multimedia systems are launched through a portal or menu screen. The portal contains, for example, icons for each system application. In addition, the application framework can provide on-screen navigation buttons, which reside on the portal and all navigation pages. According to one exemplary embodiment, these navigation buttons include a home button, a search button, a go up one level button, and a live TV button. The home navigation button takes the consumer back to the top screen of the portal regardless of their location in the application worldview. The search navigation button is used to help users find desired content. According to some exemplary embodiments, it can allow text entry and presents results visually. Each application is a set of sub-functions organized in a hierarchy. The go up one level button takes the consumer up to the main level in the current sub-function. Lastly, the live TV button takes the consumer to the Live TV screen associated with the TV application described below. The channel viewed is the last one selected. For the reader interested in more details and other examples relating to navigation buttons and an exemplary portal, she or he is directed to U.S. Patent Application Serial No. 11/437,215, filed on May 19, 2006, entitled "Global Navigation Objects in User Interfaces," the disclosure of which is incorporated here by reference. [0041] As mentioned above, one application associated with exemplary embodiments is a TV application that manages the television watching experience. This application represents the basic functionality of today's living room applications — linear television, video-on-demand (VOD) and digital video recording. An exemplary TV application which can be run on the afore-described hardware/software architecture can include the following features.
[0042] Live TV — Viewing live TV is an important aspect of many multimedia systems and applications. According to these exemplary embodiments, users are freed from strict linear timetables with digital controls like pause, fast forward and rewind. This feature lets consumers receive a phone call or use the bathroom without missing any of their favorite show or sportscast. They can skip over parts they don't want to see or rewind to see important scenes again. The Live TV function also offers a new type of on-screen control that makes changing channels a snap even in an environment with several hundred TV channels and thousands of video-on-demand options. The consumer can point-and-click at the desired choice and is free from the cumbersome up-down-left-right approach of today's remote controls. Available Live TV features include volume and channel adjustment, ad banners, rewind and fast forward controls, and play and pause. In addition, some exemplary embodiments of live TV applications indicate show progress and, if available, include linkages to other related content and services.
[0043] Consider, in this regard, the exemplary user interface screens of Figures 4-11 which illustrate portions of an exemplary live TV application. The user interface screen illustrated in Figure 4 can, for example, be accessed by selecting and actuating the live TV button from the home portal described above. Therein, moving a cursor 400 over a left portion of the display screen invokes a channel control overlay 402 which is superimposed over a live video feed 404. In this exemplary embodiment, the channel control overlay 402 includes a channel bar with a movable selector 406. The movable selector 406 can be dragged up and down the channel bar and its current location along the channel bar indicates which channel has the focus of the interface. In this example, the location of the movable selector 406 along the channel bar is such that channel 37 has the focus, resulting in a supplemental information overlay 408 being displayed over the live TV feed 404. Alternatively, the movable selector 406 (slider) can be controlled by, for example, rotating a scroll wheel on a pointing device or a 3D pointing device (handheld controller) which is in communication with the multimedia system. To the left of the channel bar are a number of rectangular icons representing favorite channels or networks which can be selected for display on the live TV feed 404 by, e.g., pointing and clicking thereon. [0044] If, for example, the cursor 400 is moved to the bottom portion of the screen, a
DVR control overlay 500 can be invoked as shown in Figure 5. Therein, pause 502, rewind 504 and fast-forward 506 controls can be displayed and actuated by, e.g., pointing and clicking on the desired DVR control while they are being displayed over the live TV feed. Similarly, a volume control overlay 600 (see Figure 6) can be displayed when, for example, a scroll wheel is depressed on the pointing device or 3D pointing device used as a remote control. Thereafter, in response to rotating the scroll wheel up or down, the volume can be increased or decreased and the slider 602 will be displayed on the screen as moving up or down to reflect the change in volume.
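Purely by way of illustration, the overlay-invocation behavior of Figures 4-6 amounts to hit-testing the cursor position against screen regions, and the volume slider to clamping a level in response to scroll-wheel rotation. The coordinate system, edge margins, and volume step size below are illustrative assumptions, not values from the specification:

```python
def active_overlay(x, y, width=1280, height=720):
    """Map a cursor position to the overlay it invokes.

    Left edge invokes the channel control (Fig. 4); bottom edge invokes
    the DVR controls (Fig. 5). The 15% edge margins are assumptions.
    """
    if x < 0.15 * width:
        return "channel_control"
    if y > 0.85 * height:
        return "dvr_control"
    return None  # cursor over the live feed: no overlay

def adjust_volume(volume, wheel_clicks, step=5):
    """Raise or lower the volume per scroll-wheel click, clamped to 0..100."""
    return max(0, min(100, volume + wheel_clicks * step))
```

The same hit-testing idea extends to any number of edge- or corner-invoked overlays, with the live feed itself occupying the unclaimed center region.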
[0045] The live TV application can also include a guide which features a program grid that is enhanced by the 3D pointing capabilities of these exemplary embodiments. This new interaction approach increases the value of the traditional EPG because it offers random navigation of each grid element (using the pointing metaphor). It also provides fast access to content metadata and services. The guide and channel banner have built-in content and service linkages that further enhance the value of the guide. The digital video recorder (DVR) and on-demand content services are tightly integrated within the guide and include a robust search and filter mechanism, which works across all types of programming whether linear, on-demand or recorded. Other features include user selected favorites, reminders, filters, and linkages. An example is shown in Figure 7, wherein these features and others, e.g., filters 702 for "Movies" or "Sports", enable a user to easily filter the available guide selections by pointing and clicking on the associated tabs. [0046] Unlike other guides that present the DVR function as a separate and standalone application, this exemplary live TV application integrates the DVR into the guide itself. Recorded programming and services to schedule a program recording are always just a few clicks away. This feature eliminates complex hierarchies to access DVR services. DVR features include record, play (with progress bar), delete (priority-based), recorded schedule, display of space available, sort, filter, conflict resolution, and watch while record. This integration enables the user to, for example, look for something to watch within the guide portion of the live TV application and to decide (without changing scenes in the ZUI) to record that show since it is on later, as shown in Figure 8. 
Therein, selection of the entry "Judge Alex" within the guide at a time (2:30pm) which is later than the current time results in a DVR overlay being displayed directly on the guide, i.e., without taking the user to another ZUI scene. In this example, the DVR overlay 800 includes some information about the selected TV show plus three selectable options - "My Shows", "More Info" and "Record" - any of which can be pointed to and selected. Selecting the "Record" button in overlay 800 will result in the selected show being queued for recording by the multimedia system. Selecting the "My Shows" button in the overlay 800 will provide the user with a browsable listing of all of his or her shows which have been recorded (as well as those scheduled to be recorded) using, e.g., a visual directory of images, an example of which is illustrated in Figure 9. Note that this exemplary visual directory is scrollable vertically by use of the scroll bar provided to the right of the matrix of images. However, according to other exemplary embodiments, horizontal scrolling could be supported by providing a scroll bar above or below the matrix, either together with or as an alternative to vertical scrolling. Selecting one of the TV shows which have been recorded to view more details can be accomplished by pointing at and clicking on a corresponding TV show image. For example, selecting the TV show image of "Alias" might result in the ZUI screen shown in Figure 10 which provides ready access to recorded episodes as well as a schedule of upcoming episodes scheduled to be recorded.
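Purely by way of illustration, the scrollable visual directory of Figure 9 can be modeled as a flat list of show images laid out as a rectangular matrix, with the vertical scroll bar selecting which rows are currently visible. The column count and page size below are illustrative assumptions:

```python
def directory_rows(items, cols=4):
    """Lay a flat list of show images out as rows of a rectangular matrix.

    The final row may be partially filled, as when the number of
    recorded shows is not a multiple of the column count.
    """
    return [items[i:i + cols] for i in range(0, len(items), cols)]

def visible_page(items, cols=4, visible_rows=3, scroll_row=0):
    """Return the rows shown for a given vertical scroll-bar position."""
    rows = directory_rows(items, cols)
    return rows[scroll_row:scroll_row + visible_rows]
```

Horizontal scrolling, as contemplated in the alternative embodiments above, would simply transpose this slicing to operate on columns rather than rows.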
[0047] Yet another feature of the live TV application according to these exemplary embodiments is the use of visual directories for on-demand services. The visual directories offer the consumer the same experience as when they walk into a video store. They can see many movie covers organized into appropriate categories, thereby providing an approach that effectively scales with the breadth of available content. According to some purely exemplary embodiments, a zoomable Visual Directory™ shows as many as 128 titles per screen and can scale using the pan and scroll features to support thousands of titles in a simple-to-use structure. The information structure of the Visual Directory™ flattens the hierarchy of total available options and, using linkages, supports the various methods that consumers use to search for content. They can search directly or they can browse through categories or use links embedded in the listings to improve access to a wide range of content choices. VOD features according to these exemplary embodiments include scaling, filters, sorting, pay-per-view, and rental management.
[0048] Still another feature of the live TV application is a search capability.
Sometimes consumers know precisely what media content they would like to consume, in which case a direct search may be more appropriate than browsing. Thus, according to these exemplary embodiments, a live TV application can include a search function (reachable by, among other techniques, a global navigation button as described above) from which users can search for specific content using keywords, names, titles and date or time information. The search system provides filtered results from TV listings, the DVR manager, and the video-on-demand database. To simplify the user experience, the consumer can easily point at the desired search result in a visual list of options. For text-based search, predictive methods are employed to minimize the number of "free-space operations" required to enter the desired request. An example of this latter type of search is shown in Figure 11 wherein a user is trying to determine if any movies having Tom Hanks as an actor are playing on available VOD selections.
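Purely by way of illustration, one simple form of such a predictive method is prefix matching against a catalog of titles and names, so that a short visual list of candidates can be pointed at after only a few entered characters. The function below is an illustrative sketch, not the actual search system of the specification:

```python
def predictive_matches(prefix, catalog, limit=8):
    """Return catalog entries having any word starting with the prefix.

    Presenting these candidates visually after a few keystrokes reduces
    the number of "free-space" text-entry operations the user performs.
    The limit caps the list to what fits in the on-screen result area.
    """
    p = prefix.lower()
    hits = [entry for entry in catalog
            if any(word.startswith(p) for word in entry.lower().split())]
    return sorted(hits)[:limit]
```

A deployed system would typically rank candidates by popularity or recency rather than alphabetically, but the prefix-filtering step would be the same.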
[0049] Having described an exemplary live TV application according to an exemplary embodiment of the present invention, a second exemplary application which can be run on the afore-described architectures is a media application. According to these exemplary embodiments, a media application can provide a comprehensive suite of personal multimedia content navigation and media management applications including music, photos and home videos that directly address convergence in the home. This exemplary media application organizes digital media content from a consumer's personal collection (e.g., on a Personal Computer or other networked device) and integrates it with content delivered by a service provider (over a broadband connection) to present all available digital media in a consistent user interface on the TV. Among other things, this exemplary media application provides for the creation of photo slide shows wherein users can point-and-click on their favorite photos to create instant slide shows displayed on their televisions. Custom playlists of music can be readily created by applying the Visual Directory™ to a user's personal music collection. Similarly, whether the user wants easy access to video clips delivered by a service provider or those clips previously downloaded to a personal computer, a simple point-and-click creates a custom video playlist for playback to the TV. For the reader interested in more detail regarding these and other aspects of an exemplary media application (as well as the afore-described live TV application and the below-described shopping application), she or he is referred to U.S. Patent Application Serial No. 11/354,329, entitled "Methods and Systems for Enhancing Television Using 3D Pointing", filed on February 14, 2006, the disclosure of which is incorporated here by reference.
[0050] Yet another type of application which can be provided to run on the afore-described architectures is a shopping application. According to exemplary embodiments of the present invention, an interactive shopping application is created that provides the experience of a virtual mall on TV which allows consumers to shop in a comfortable and secure setting while providing a community aspect to the experience. This shopping application leverages the metadata created for online shopping destinations and reformats this information to present it in an interactive and visually appealing manner that is optimized for the TV. [0051] Running the exemplary shopping application, a user can point at a shopping item list or visual goods presentation and either automatically add an item to a shopping cart or just buy it. For example, as illustrated in Figure 12(a), at a first zoom level, a number of different categories of items for sale can be depicted on the TV (or other display) using some generic phrases and images. When a user pauses the cursor 1200 over a particular category, e.g., "Handbags" in Figure 12(b), that image is magnified slightly to denote that it has the current focus. Then, either automatically or as a result of an additional user input (e.g., a button press), a zoom in can be performed on the "Handbag" category, revealing a bookshelf (visual directory) of handbags as shown in Figure 12(c). Again, the cursor's position indicates a current selection within the bookshelf, reflected by the hoverzoom of the "Smooth Leather" category of items in Figure 12(c). Another zoom in can be performed, again either automatically after an elapsed period of pointing at this category or in response to a specific user input via the handheld device, resulting in a more detailed view of this category as shown in Figure 12(d). 
Selection of an image of a particular handbag may result in a zoom in to the detailed view of Figure 12(e), e.g., using the zooming, panning and/or translating effects described above.
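Purely by way of illustration, the dwell-then-zoom behavior of Figures 12(a)-12(e) can be sketched as a small state machine in which pausing the cursor over an item first magnifies it (the hoverzoom) and continued dwell, or an explicit button press, zooms in one level. The 0.5 s and 1.5 s thresholds and the class interface are illustrative assumptions:

```python
class HoverZoom:
    """Dwell-based focus and zoom over a category of shopping items."""

    MAGNIFY_AFTER = 0.5   # seconds of hover before the item enlarges
    ZOOM_AFTER = 1.5      # seconds of hover before an automatic zoom-in

    def __init__(self):
        self.level = 0        # 0 = categories, 1 = bookshelf, 2 = detail, ...
        self.dwell = 0.0      # continuous time spent hovering the item
        self.magnified = False

    def tick(self, dt, hovering, button=False):
        """Advance the state by dt seconds of (non-)hovering input."""
        if not hovering:
            self.dwell, self.magnified = 0.0, False
            return
        self.dwell += dt
        self.magnified = self.dwell >= self.MAGNIFY_AFTER
        if button or self.dwell >= self.ZOOM_AFTER:
            self.level += 1
            self.dwell, self.magnified = 0.0, False
```

This captures both paths described above: the automatic zoom after an elapsed period of pointing, and the immediate zoom in response to a specific user input via the handheld device.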
[0052] A user can easily navigate cross-links within this exemplary shopping application by pointing at those of interest (like other goods from the same store, matching accessories, similar designers). For example, as shown in Figure 12(e) a crosslink to a shoe accessory is displayed as an image 1202. If that link is activated, the user can jump to a detailed view of that item without having to navigate through the various higher level screens to reach it, as seen in Figure 12(f). [0053] One other component, in addition to the afore-described hardware/software architectures and applications, associated with multimedia systems according to these exemplary embodiments is the remote control device with which the user interacts with the various ZUI screens to select and consume content. Various references have been made to 3D pointing devices throughout this specification which are one example of a remote control device which can be used in conjunction with these architectures and applications. In addition to the above-incorporated by reference '663 patent application, the reader interested in more information related to exemplary 3D pointing devices which can be used in conjunction with these architectures and applications is also referred to U.S. Patent Application Serial No. 11/480,662, entitled "3D Pointing Device", filed on July 3, 2006, the disclosure of which is incorporated here by reference. However, the present invention is not limited to implementations including 3D pointing devices, but could also be used in conjunction with mice, joysticks, trackballs and other pointing devices. [0054] The foregoing exemplary embodiments are purely illustrative in nature. The number of zoom levels, as well as the particular information and controls provided to the user at each level may be varied. 
Those skilled in the art will appreciate that the present invention provides techniques for presenting large and small sets of media items using a zoomable interface such that a user can easily search through, browse, organize and play back media items such as movies and music. Graphical user interfaces according to the present invention organize media item selections on a virtual surface such that similar selections are grouped together. Initially, the interface presents a zoomed out view of the surface, and in most cases, the actual selections will not be visible at this level, but rather only their group names. As the user zooms progressively inward, more details are revealed concerning the media item groups or selections. At each zoom level, different controls are available so that the user can play groups of selections, individual selections, or go to another part of the virtual surface to browse other related media items. Zooming graphical user interfaces according to exemplary embodiments of the present invention can contain categories of images nested to an arbitrary depth as well as categories of categories. The media items can include content which is stored locally, broadcast by a broadcast provider, received via a direct connection from a content provider or on a peering basis. The media items can be provided in a scheduling format wherein date/time information is provided at some level of the GUI. Additionally, frameworks and GUIs according to exemplary embodiments of the present invention can also be applied to television commerce wherein the items for selection are being sold to the user.
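Purely by way of illustration, the progressive revelation of detail as the user zooms inward can be modeled as a mapping from zoom depth to the metadata fields rendered for each media item. The field names and level thresholds below are illustrative assumptions, not taken from the specification:

```python
# Each zoom level reveals progressively more metadata about a media
# item, rather than merely scaling pixels (i.e., semantic zooming).
LEVEL_FIELDS = {
    0: ["cover_image"],                                   # zoomed out: group view
    1: ["cover_image", "title"],                          # mid zoom: title overlay
    2: ["cover_image", "title", "synopsis", "runtime"],   # close zoom: full detail
}

def visible_fields(zoom_level):
    """Return the metadata fields rendered at a given zoom depth.

    Levels deeper than the deepest defined one show full detail.
    """
    deepest = max(LEVEL_FIELDS)
    return LEVEL_FIELDS[min(zoom_level, deepest)]

def render(item, zoom_level):
    """Project an item's metadata down to what this zoom level shows."""
    return {f: item[f] for f in visible_fields(zoom_level) if f in item}
```

At level 0 only the image (or group name) is visible; hovering or zooming to level 1 adds the textual title overlay described earlier; deeper levels expose the controls and details appropriate to an individual selection.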
[0055] Zoomable user interfaces according to these exemplary embodiments employ various transition effects to instill a sense of spatial positioning within the ZUI "world" as a user navigates among content selections. These aspects of ZUIs are described in more detail in various ones of the above-incorporated by reference patent applications. Briefly, however, zooming refers to, for example, the progressive scaling of a displayed object, set of objects or a portion thereof that gives the visual impression of movement of all or part of such object(s) toward or away from an observer. In other words, the zooming feature, in some instances, causes the display of the object or objects to change from a distant view to a close view, and vice versa, as though the end user were manipulating a telescope, a magnifying glass, or a zoom lens of a camera. In other instances, semantic zooming may be employed to provide a similar progressive scaling on the display, yet adding or hiding details which would not necessarily be added or hidden when using a "pure" camera zoom. Similarly, a panning transition refers to the progressive translating of a displayed object, set of objects or a portion thereof that gives the visual impression of lateral movement of the object(s). [0056] Systems and methods for processing data according to exemplary embodiments of the present invention can be performed by one or more processors executing sequences of instructions contained in a memory device. Such instructions may be read into the memory device from other computer-readable mediums such as secondary data storage device(s). Execution of the sequences of instructions contained in the memory device causes the processor to operate, for example, as described above. In alternative embodiments, hardwire circuitry may be used in place of or in combination with software instructions to implement the present invention.
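Purely by way of illustration, the zooming and panning transitions of paragraph [0055] can both be expressed as progressive interpolation of a camera's position and scale over a sequence of frames. The minimal linear sketch below is an assumption for illustration (a production implementation would typically apply easing, and interpolate scale geometrically rather than linearly):

```python
def transition_frames(start, end, steps):
    """Interpolate camera (x, y, scale) between two scene views.

    Progressive change in (x, y) produces a pan; progressive change
    in scale produces a zoom. Returns one camera tuple per frame.
    """
    frames = []
    for i in range(1, steps + 1):
        t = i / steps  # fraction of the transition completed
        frames.append(tuple(s + (e - s) * t for s, e in zip(start, end)))
    return frames
```

Because the frames sweep continuously from the old view to the new one, such a transition both preserves the user's spatial context and, as noted above, can mask the latency of loading the destination scene.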
[0057] Numerous variations of the afore-described exemplary embodiments are contemplated. The above-described exemplary embodiments are intended to be illustrative in all respects, rather than restrictive, of the present invention. Thus the present invention is capable of many variations in detailed implementation that can be derived from the description contained herein by a person skilled in the art. All such variations and modifications are considered to be within the scope and spirit of the present invention as defined by the following claims. No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article "a" is intended to include one or more items.

Claims

WHAT IS CLAIMED IS:
1. An electronic program guide (EPG) responsive to pointing inputs and having an integrated digital video recorder (DVR) function comprising:
a grid displayed on a display screen, said grid having a plurality of program selections displayed therein;
a cursor displayed as a moveable overlay on said grid, said cursor responsive to said pointing inputs to provide random access to said plurality of program selections; and
wherein when said cursor is positioned over one of said plurality of program selections, a visual indication of focus is provided to said one of said plurality of program selections; and
further wherein when a selection command is received by said electronic program guide, a DVR control overlay is displayed on said grid.
2. The EPG of claim 1, wherein said DVR control overlay includes a record button.
3. The EPG of claim 1, wherein said grid includes a first axis corresponding to a plurality of channels and a second axis corresponding to time.
4. The EPG of claim 1, wherein said DVR control overlay includes a button which, when actuated, causes said EPG to display a user interface screen which shows at least one of previously recorded programs and programs scheduled to be recorded.
5. The EPG of claim 4, wherein said at least one of previously recorded programs and programs scheduled to be recorded are displayed as a visual directory of images.
6. The EPG of claim 5, wherein said visual directory of images includes a plurality of images each associated with one of said at least one of previously recorded programs and programs scheduled to be recorded, said plurality of images arranged in a rectangular matrix with a first number of images in each row and a second number of images, different than said first number, in each column.
7. The EPG of claim 6, wherein said visual directory includes a scroll bar for vertical scrolling of said visual directory.
8. The EPG of claim 7, wherein said scroll bar is actuable via a scroll wheel of a remote control device.
9. The EPG of claim 1, further comprising: a pair of arrow-shaped overlays superimposed on said grid which, when pointed to and actuated, scroll said EPG horizontally.
10. The EPG of claim 4, wherein said user interface screen which shows at least one of previously recorded programs and programs scheduled to be recorded further comprises two tabs, one of which is associated with display of a visual directory of said previously recorded program images and one of which is associated with display of a visual directory of said programs scheduled to be recorded.
11. A scrollable visual directory display comprising:
a plurality of images each associated with a selectable media item, said plurality of images arranged in a rectangular matrix with a first number of images in each row and a second number of images in each column; and
a scroll bar on one side of said plurality of images for scrolling said rectangular matrix.
12. The scrollable visual directory of claim 11, wherein said selectable media items are associated with at least one of previously recorded programs and programs scheduled to be recorded.
13. The scrollable visual directory of claim 11, wherein said scroll bar is on the left or right side of said rectangular matrix and is usable to vertically scroll said plurality of images up and down.
14. The scrollable visual directory of claim 11, wherein said scroll bar is above or below said rectangular matrix and is usable to horizontally scroll said plurality of images to the left and right.
15. A method for providing information in an electronic program guide (EPG) having an integrated digital video recorder (DVR) function comprising:
displaying a grid on a display screen, said grid having a plurality of program selections displayed therein;
displaying a cursor as a moveable overlay on said grid, said cursor responsive to pointing inputs to provide random access to said plurality of program selections;
providing, when said cursor is positioned over one of said plurality of program selections, a visual indication of focus to said one of said plurality of program selections; and
displaying, when a selection command is received by said electronic program guide, a DVR control overlay on said grid.
16. The method of claim 15, wherein said DVR control overlay includes a record button.
17. The method of claim 15, wherein said grid includes a first axis corresponding to a plurality of channels and a second axis corresponding to time.
18. The method of claim 15, wherein said DVR control overlay includes a button which, when actuated, causes said EPG to display a user interface screen which shows at least one of previously recorded programs and programs scheduled to be recorded.
19. The method of claim 18, wherein said at least one of previously recorded programs and programs scheduled to be recorded are displayed as a visual directory of images.
20. The method of claim 19, wherein said visual directory of images includes a plurality of images each associated with one of said at least one of previously recorded programs and programs scheduled to be recorded, said plurality of images arranged in a rectangular matrix with a first number of images in each row and a second number of images, different than said first number, in each column.
21. The method of claim 20, wherein said visual directory includes a scroll bar for vertical scrolling of said visual directory.
22. The method of claim 21, wherein said scroll bar is actuable via a scroll wheel of a remote control device.
23. The method of claim 15, further comprising: displaying a pair of arrow-shaped overlays superimposed on said grid which, when pointed to and actuated, scroll said EPG horizontally.
24. The method of claim 18, wherein said user interface screen which shows at least one of previously recorded programs and programs scheduled to be recorded further comprises two tabs, one of which is associated with display of a visual directory of said previously recorded program images and one of which is associated with display of a visual directory of said programs scheduled to be recorded.
PCT/US2006/046303 2005-12-02 2006-12-04 Multimedia systems, methods and applications WO2007065020A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP06844806A EP1966987A4 (en) 2005-12-02 2006-12-04 Multimedia systems, methods and applications

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US74159605P 2005-12-02 2005-12-02
US60/741,596 2005-12-02
US75581906P 2006-01-04 2006-01-04
US60/755,819 2006-01-04

Publications (2)

Publication Number Publication Date
WO2007065020A2 true WO2007065020A2 (en) 2007-06-07
WO2007065020A3 WO2007065020A3 (en) 2008-11-06

Family

ID=38092902

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/046303 WO2007065020A2 (en) 2005-12-02 2006-12-04 Multimedia systems, methods and applications

Country Status (3)

Country Link
US (3) US8850478B2 (en)
EP (1) EP1966987A4 (en)
WO (1) WO2007065020A2 (en)

Families Citing this family (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004100539A1 (en) * 2003-05-05 2004-11-18 Thomson Licensing S.A. Method and apparatus for indicating whether sufficient space exists for recording a program
EP1966987A4 (en) * 2005-12-02 2010-05-26 Hillcrest Lab Inc Multimedia systems, methods and applications
KR20090060311A (en) * 2006-08-29 2009-06-11 힐크레스트 래보래토리스, 인크. Television control, playlist generation and dvr systems and methods
US9124767B2 (en) * 2006-10-25 2015-09-01 Microsoft Technology Licensing, Llc Multi-DVR media content arbitration
US9305119B1 (en) * 2006-12-14 2016-04-05 Myspace Llc System, apparatus and method for determining correct metadata from community-submitted data
US11783925B2 (en) 2006-12-29 2023-10-10 Kip Prod P1 Lp Multi-services application gateway and system employing the same
US9602880B2 (en) 2006-12-29 2017-03-21 Kip Prod P1 Lp Display inserts, overlays, and graphical user interfaces for multimedia systems
US7849095B2 (en) * 2006-12-29 2010-12-07 Brooks Roger K Method for using two-dimensional dynamics in assessing the similarity of sets of data
WO2008085205A2 (en) 2006-12-29 2008-07-17 Prodea Systems, Inc. System and method for providing network support services and premises gateway support infrastructure
US8544040B2 (en) * 2006-12-29 2013-09-24 Google Inc. System and method for displaying multimedia events scheduling information
US20170344703A1 (en) 2006-12-29 2017-11-30 Kip Prod P1 Lp Multi-services application gateway and system employing the same
US11316688B2 (en) 2006-12-29 2022-04-26 Kip Prod P1 Lp Multi-services application gateway and system employing the same
US9569587B2 (en) 2006-12-29 2017-02-14 Kip Prod Pi Lp Multi-services application gateway and system employing the same
US9794310B2 (en) * 2007-01-11 2017-10-17 Samsung Electronics Co., Ltd. Meta data information providing server, client apparatus, method of providing meta data information, and method of providing content
US8972875B2 (en) 2007-04-24 2015-03-03 Google Inc. Relevance bar for content listings
US20090031367A1 (en) * 2007-07-24 2009-01-29 The Directv Group, Inc. Method and system for utilizing multiple content delivery networks for distributing content
US9104987B2 (en) * 2007-07-24 2015-08-11 The Directv Group, Inc. Method and system for associating business rules with received content in a content processing system and generating a content list therefrom
US8875209B2 (en) * 2007-07-26 2014-10-28 The Directv Group, Inc. Method and system for receiving content in a content processing system using a workflow system
US20090030941A1 (en) * 2007-07-26 2009-01-29 The Directv Group, Inc. Method and system for receiving normalizing content data in a content processing system using a workflow system
US20090133063A1 (en) * 2007-11-20 2009-05-21 General Instrument Corporation Preference Based Electronic Programming Guide
JP5264192B2 (en) * 2008-01-16 2013-08-14 三菱電機株式会社 Video distribution system, receiver
US9003465B1 (en) 2008-04-25 2015-04-07 The Directv Group, Inc. Method and system for converting content into multiple formats
WO2009137368A2 (en) 2008-05-03 2009-11-12 Mobile Media Now, Inc. Method and system for generation and playback of supplemented videos
US8473979B2 (en) 2008-09-30 2013-06-25 Echostar Technologies L.L.C. Systems and methods for graphical adjustment of an electronic program guide
US8937687B2 (en) 2008-09-30 2015-01-20 Echostar Technologies L.L.C. Systems and methods for graphical control of symbol-based features in a television receiver
US8582957B2 (en) * 2008-09-22 2013-11-12 EchoStar Technologies, L.L.C. Methods and apparatus for visually displaying recording timer information
US8763045B2 (en) 2008-09-30 2014-06-24 Echostar Technologies L.L.C. Systems and methods for providing customer service features via a graphical user interface in a television receiver
US8572651B2 (en) 2008-09-22 2013-10-29 EchoStar Technologies, L.L.C. Methods and apparatus for presenting supplemental information in an electronic programming guide
US9357262B2 (en) 2008-09-30 2016-05-31 Echostar Technologies L.L.C. Systems and methods for graphical control of picture-in-picture windows
US8793735B2 (en) 2008-09-30 2014-07-29 EchoStar Technologies, L.L.C. Methods and apparatus for providing multiple channel recall on a television receiver
US8397262B2 (en) 2008-09-30 2013-03-12 Echostar Technologies L.L.C. Systems and methods for graphical control of user interface features in a television receiver
US9100614B2 (en) 2008-10-31 2015-08-04 Echostar Technologies L.L.C. Graphical interface navigation based on image element proximity
US20100121891A1 (en) * 2008-11-11 2010-05-13 At&T Intellectual Property I, L.P. Method and system for using play lists for multimedia content
JP5470861B2 (en) 2009-01-09 2014-04-16 ソニー株式会社 Display device and display method
US8798311B2 (en) * 2009-01-23 2014-08-05 Eldon Technology Limited Scrolling display of electronic program guide utilizing images of user lip movements
US9407973B2 (en) 2009-12-02 2016-08-02 At&T Intellectual Property I, L.P. System and method to identify an item depicted when media content is displayed
US20120266069A1 (en) * 2009-12-28 2012-10-18 Hillcrest Laboratories, Inc. TV Internet Browser
US9489062B2 (en) * 2010-09-14 2016-11-08 Google Inc. User interfaces for remote management and control of network-connected thermostats
US8918219B2 (en) 2010-11-19 2014-12-23 Google Inc. User friendly interface for control unit
US10346275B2 (en) 2010-11-19 2019-07-09 Google Llc Attributing causation for energy usage and setpoint changes with a network-connected thermostat
US9256230B2 (en) 2010-11-19 2016-02-09 Google Inc. HVAC schedule establishment in an intelligent, network-connected thermostat
US9235323B2 (en) * 2010-12-20 2016-01-12 Intel Corporation Techniques for management and presentation of content
US10353566B2 (en) * 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
WO2014028068A1 (en) 2012-08-17 2014-02-20 Flextronics Ap, Llc Media center
US11368760B2 (en) 2012-08-17 2022-06-21 Flextronics Ap, Llc Applications generating statistics for user behavior
JP5889223B2 (en) * 2013-01-29 2016-03-22 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Display device and image display system
US10129585B2 (en) * 2013-03-15 2018-11-13 DISH Technologies L.L.C. Advance notification of catch-up events through broadcast metadata
CN103458277B (en) * 2013-08-26 2016-07-06 小米科技有限责任公司 A kind of method and apparatus operating direct broadcast band program
JP6405112B2 (en) * 2014-04-18 2018-10-17 キヤノン株式会社 Information processing apparatus and control method thereof
USD819042S1 (en) 2015-10-07 2018-05-29 MAQUET CARDIOPULMONARY GmbH Display screen or portion thereof with graphical user interface for a medical device
USD797132S1 (en) * 2015-10-16 2017-09-12 Biogen Ma Inc. Display screen with graphical user interface
KR102471989B1 (en) * 2016-12-07 2022-11-29 주식회사 알티캐스트 system and method for providing cloud based user interfaces
KR102438108B1 (en) 2017-11-10 2022-08-31 삼성전자주식회사 Display apparatus
USD879122S1 (en) 2017-11-30 2020-03-24 MAQUET CARDIOPULMONARY GmbH Display screen or portion thereof with graphical user interface for a clamp display of a cardiopulmonary bypass machine system
USD851114S1 (en) * 2018-01-08 2019-06-11 Modulus Media Systems, Inc. Display screen with user interface
USD892824S1 (en) * 2018-08-30 2020-08-11 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
JP7193797B2 (en) * 2018-11-06 2022-12-21 任天堂株式会社 Game program, information processing system, information processing device, and game processing method
USD942497S1 (en) * 2018-12-20 2022-02-01 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
KR102633219B1 (en) * 2020-12-03 2024-02-02 경인전자 주식회사 Input device
USD983822S1 (en) * 2021-04-06 2023-04-18 Beijing Zitiao Network Technology Co., Ltd. Display screen or portion thereof with a graphical user interface

Family Cites Families (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020056136A1 (en) * 1995-09-29 2002-05-09 Wistendahl Douglass A. System for converting existing TV content to interactive TV programs operated with a standard remote control and TV set-top box
US6323911B1 (en) * 1995-10-02 2001-11-27 Starsight Telecast, Inc. System and method for using television schedule information
WO1998006219A1 (en) * 1996-08-06 1998-02-12 Starsight Telecast, Incorporated Electronic program guide with interactive areas
US5850218A (en) * 1997-02-19 1998-12-15 Time Warner Entertainment Company L.P. Inter-active program guide with default selection control
US6097878A (en) * 1997-02-25 2000-08-01 Sony Corporation Automatic timer event entry
US6292624B1 (en) * 1997-05-13 2001-09-18 Sony Corporation System and method for selection/deselection of timer recording
US6400378B1 (en) * 1997-09-26 2002-06-04 Sony Corporation Home movie maker
US6460181B1 (en) * 1997-12-29 2002-10-01 Starsight Telecast, Inc. Channels and services display
US20050204388A1 (en) * 1998-06-11 2005-09-15 Knudson Edward B. Series reminders and series recording from an interactive television program guide
CN1867068A (en) * 1998-07-14 2006-11-22 联合视频制品公司 Client-server based interactive television program guide system with remote server recording
US6295646B1 (en) * 1998-09-30 2001-09-25 Intel Corporation Method and apparatus for displaying video data and corresponding entertainment data for multiple entertainment selection sources
US6526577B1 (en) * 1998-12-01 2003-02-25 United Video Properties, Inc. Enhanced interactive program guide
US6577350B1 (en) * 1998-12-21 2003-06-10 Sony Corporation Method and apparatus for displaying an electronic program guide
US6549929B1 (en) * 1999-06-02 2003-04-15 Gateway, Inc. Intelligent scheduled recording and program reminders for recurring events
US6640337B1 (en) * 1999-11-01 2003-10-28 Koninklijke Philips Electronics N.V. Digital television (DTV) including a smart electronic program guide (EPG) and operating methods therefor
US6690391B1 (en) * 2000-07-13 2004-02-10 Sony Corporation Modal display, smooth scroll graphic user interface and remote command device suitable for efficient navigation and selection of dynamic data/options presented within an audio/visual system
JP3600521B2 (en) * 2000-11-17 2004-12-15 株式会社東芝 Video recording system, video recording method, and storage medium
CN101257609B (en) * 2001-02-21 2014-03-19 联合视频制品公司 Systems and method for interactive program guides with personal video recording features
WO2003009126A1 (en) * 2001-07-19 2003-01-30 Digeo, Inc. System and method for managing television programs within an entertainment system
US7631331B2 (en) * 2002-03-29 2009-12-08 Starz Entertainment, Llc Cross-channel interstitial program promotion
US8255968B2 (en) * 2002-04-15 2012-08-28 Universal Electronics, Inc. System and method for adaptively controlling the recording of program material using a program guide
US20040095317A1 (en) * 2002-11-20 2004-05-20 Jingxi Zhang Method and apparatus of universal remote pointing control for home entertainment system and computer
US20040223738A1 (en) * 2003-05-07 2004-11-11 Johnson Carolynn Rae User-defined categorized display of programs stored to video recording system
US8046705B2 (en) * 2003-05-08 2011-10-25 Hillcrest Laboratories, Inc. Systems and methods for resolution consistent semantic zooming
US20040268393A1 (en) * 2003-05-08 2004-12-30 Hunleth Frank A. Control framework with a zoomable graphical user interface for organizing, selecting and launching media items
US6990637B2 (en) 2003-10-23 2006-01-24 Microsoft Corporation Graphical user interface for 3-dimensional view of a data collection based on an attribute of the data
US20050097601A1 (en) * 2003-10-31 2005-05-05 Daniel Danker Quick EPG navigation
US8650596B2 (en) * 2003-11-03 2014-02-11 Microsoft Corporation Multi-axis television navigation
US7493341B2 (en) * 2004-01-16 2009-02-17 Hillcrest Laboratories, Inc. Metadata brokering server and methods
JP2005204176A (en) * 2004-01-16 2005-07-28 Sharp Corp Program recording apparatus
US20050160461A1 (en) * 2004-01-21 2005-07-21 United Video Properties, Inc. Interactive television program guide systems with digital video recording support
CN102566751B (en) * 2004-04-30 2016-08-03 希尔克瑞斯特实验室公司 Free space pointing devices and method
CN101077005B (en) * 2004-05-28 2011-08-10 希尔克瑞斯特实验室公司 Methods and apparatuses for video on demand (VOD) metadata organization
US7747132B2 (en) * 2004-08-26 2010-06-29 Sony Corporation Method and system for displaying multiple media content instances during a single viewing session
US7461343B2 (en) * 2004-11-08 2008-12-02 Lawrence Kates Touch-screen remote control for multimedia equipment
CN101233504B (en) * 2005-01-05 2010-11-10 希尔克瑞斯特实验室公司 Distributed software construction for user interfaces
US7839385B2 (en) * 2005-02-14 2010-11-23 Hillcrest Laboratories, Inc. Methods and systems for enhancing television applications using 3D pointing
US20060262116A1 (en) * 2005-05-19 2006-11-23 Hillcrest Laboratories, Inc. Global navigation objects in user interfaces
KR101288186B1 (en) * 2005-07-01 2013-07-19 힐크레스트 래보래토리스, 인크. 3d pointing devices
EP1966987A4 (en) * 2005-12-02 2010-05-26 Hillcrest Lab Inc Multimedia systems, methods and applications
WO2007065019A2 (en) * 2005-12-02 2007-06-07 Hillcrest Laboratories, Inc. Scene transitions in a zoomable user interface using zoomable markup language

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP1966987A4 *

Also Published As

Publication number Publication date
WO2007065020A3 (en) 2008-11-06
EP1966987A4 (en) 2010-05-26
US20070199022A1 (en) 2007-08-23
US20150100985A1 (en) 2015-04-09
US8850478B2 (en) 2014-09-30
US20170223420A1 (en) 2017-08-03
EP1966987A2 (en) 2008-09-10

Similar Documents

Publication Publication Date Title
US20170223420A1 (en) Multimedia systems, methods and applications
US20200329280A1 (en) Method of Adaptive Browsing for Digital Content
US9400598B2 (en) Fast and smooth scrolling of user interfaces operating on thin clients
US7734680B1 (en) Method and apparatus for realizing personalized information from multiple information sources
JP4354973B2 (en) Query-based electronic program guide
US20060262116A1 (en) Global navigation objects in user interfaces
US20060136246A1 (en) Hierarchical program guide
WO2012094228A1 (en) Systems and methods for navigating through content in an interactive media guidance application
JP2008520121A (en) Method and system for searching for television content with reduced text input and channel using non-intrusive television interface
WO2007098206A2 (en) Systems and methods for placing advertisements
JP2013219812A (en) Systems and methods for providing interactive media guidance on wireless communications device
US12047654B2 (en) Surf mode for streamed content
KR20150117212A (en) Display apparatus and control method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2006844806

Country of ref document: EP