WO2001061996A1 - Method and apparatus for controlling the movement or changing the appearance of a three-dimensional element - Google Patents

Method and apparatus for controlling the movement or changing the appearance of a three-dimensional element Download PDF

Info

Publication number
WO2001061996A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
data
class
dimensional element
elements
Prior art date
Application number
PCT/US2001/005056
Other languages
French (fr)
Inventor
Yakov Kamen
Leon Shirman
Original Assignee
Isurftv
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Isurftv filed Critical Isurftv
Priority to AU2001238406A priority Critical patent/AU2001238406A1/en
Publication of WO2001061996A1 publication Critical patent/WO2001061996A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4345Extraction or processing of SI, e.g. extracting service information from an MPEG stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508Management of client data or end-user data
    • H04N21/4532Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/454Content or additional data filtering, e.g. blocking advertisements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4782Web browsing, e.g. WebTV
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/482End-user interface for program selection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/812Monomedia components thereof involving advertisement data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84Generation or processing of descriptive data, e.g. content descriptors

Definitions

  • This invention relates generally to creating electronic programming guides (EPGs) and more specifically to controlling the movement or the appearance of three-dimensional (3D) elements used in EPGs.
  • Television provides a vast amount of audiovisual information to a variety of audiences.
  • A user is able to determine which television program to watch by reviewing a television guide purchased at a store or an EPG that is electronically available through cable television.
  • Many users use the EPG available on cable television due to its ease of access.
  • EPGs are presented in a two-dimensional format as shown in Figure 1.
  • First column 110 provides the various channels of a broadcast on cable television.
  • Second, third, and fourth columns (120, 130, 140) present the television programs for the particular channels that will be broadcast in half-hour increments. For example, at 10:00 p.m., second column 120 indicates that the news will be broadcast on channel 2.
  • Third column 130 presents the television programs to be broadcast at 10:30 p.m.
  • Fourth column 140 presents the television programs that will be broadcast at 11:00 p.m. Because there are generally more television channels to present television programming information than there is space in columns and rows in an EPG, grid 135 scrolls at a preselected rate to allow a user time to consider all the television programs that are to be broadcast. Typically, a user is unable to modify the two-dimensional EPG.
  • Digital EPGs also use a two-dimensional format to present television programming information.
  • A user of a digital EPG, however, is capable of interacting with the digital EPG to customize the types of television programs that are presented.
  • A user may browse television programming information presented in a two-dimensional format in any order chosen by the user. For example, a user may select television programs from an on-screen menu for current or future viewing or order pay-per-view programming.
  • The two-dimensional format for presenting television programming information is problematic for some users because of difficulty viewing or distinguishing information.
  • The two-dimensional format of a conventional EPG is generally unable to be personalized to a user. For example, a user cannot modify the manner in which the information is presented, such as by moving television program information to a portion of the screen, or change the way in which the information is physically presented to make the information more easily viewed.
  • One embodiment of the invention relates to controlling the movement or changing the features (e.g., color, texture, transparency, audio etc.) associated with three-dimensional (3D) elements in an electronic programming guide (EPG) that is presented on, for example, a television.
  • Each 3D element typically represents information relevant to a television program such as a sports program.
  • Program instructions such as an event interpreter, a behavior filter, an EPG engine, and a data mapper are executed on a computer system.
  • The event interpreter receives a command from a user.
  • The event interpreter then recognizes an event or time associated with a 3D element and is able to determine whether a user is interacting with a particular 3D element.
  • The behavior filter receives the data and associates a behavior description with the data.
  • A behavior description may indicate that a 3D element is to move or that a feature associated with a 3D element is to change.
  • The behavior filter also filters the data. The filter prevents irrelevant data from being further processed.
  • Data is filtered based upon the content or the context of the data. The content of data relates to the subject matter of the data. For instance, a user may desire to view 3D elements related to sports in a certain spatial order. While data related to sports programs will pass through the behavior filter, other data, such as information pertaining to stocks, will not pass through the behavior filter.
  • Data that passes through the behavior filter is then sent to an EPG engine.
  • The EPG engine creates the EPG using a 3D graphics pipeline based upon this data and the EPG data received from the data mapper.
  • The data mapper is used to map EPG data onto 3D elements.
  • The data mapper takes a memory object (e.g., text, image, video, etc.) and attaches the memory object to a 3D element. For instance, the data mapper receives EPG data such as the ABC time slot and maps this information onto the 3D element associated with ABC. Additional features, embodiments, and benefits will be evident in view of the figures and detailed description presented herein.
  • Figure 1 illustrates an electronic version of a television guide of the prior art.
  • Figure 2 illustrates a computer system in accordance with one embodiment of the invention.
  • Figure 3 illustrates a block diagram of an electronic programming guide (EPG) formed in accordance with one embodiment of the invention.
  • Figure 4 illustrates an EPG with three-dimensional (3D) elements in accordance with one embodiment of the invention.
  • Figure 5 illustrates a flow diagram for modifying or moving a 3D element in an EPG in accordance with one embodiment of the invention.
  • One embodiment of the invention relates to a system for creating a 3D electronic programming guide (EPG) that includes a plurality of three-dimensional (3D) elements that a user may move, or whose features a user may change.
  • Changing a feature of a 3D element may include changing the color, the texture, the transparency, or audio features that are associated with the 3D element.
  • Each 3D element represents information to a user such as a television program.
  • A 3D element is defined by its geometric structure (e.g., spheres, triangles, hexagons, squares, or other suitable geometric structures), its position in a virtual 3D space, and the behavioral model applicable to the 3D element.
  • The behavioral model is a set of rules that change the surface parameters of a 3D element based upon a spatio-temporal event.
  • An event is an incident of importance. For example, an event may be a user moving a cursor closer to the 3D element, which causes the 3D element to respond by changing its color or with any of a variety of other arbitrary responses.
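The definition above (a geometric structure, a position in virtual 3D space, and a behavioral model of event-triggered rules) can be sketched in Python. This is a minimal illustration; the class, attribute, and event names are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Element3D:
    """Hypothetical 3D EPG element: geometry, position, and behavioral model."""
    geometry: str                      # e.g. "sphere", "hexagon"
    position: tuple                    # (x, y, z) in the virtual 3D space
    color: str = "blue"
    # Behavioral model: maps a spatio-temporal event name to a response rule.
    rules: dict = field(default_factory=dict)

    def on_event(self, event: str) -> None:
        """Apply the rule registered for this event, if any."""
        handler = self.rules.get(event)
        if handler:
            handler(self)

# Example rule: moving the cursor near the element changes its color.
sports = Element3D(geometry="sphere", position=(1, 2, 2))
sports.rules["cursor_near"] = lambda el: setattr(el, "color", "red")

sports.on_event("cursor_near")
print(sports.color)  # "red"
```

Each element carries its own rule table, so elements of the same class can share rules while individual elements customize them, as described below.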
  • An EPG may be personalized to the requirements established by a user.
  • The user may desire to view a set of 3D elements related to sports in a certain spatial order specified by the user or by the computer system according to a priority established by the historical viewing experience of the user.
  • The set of 3D elements related to sports (e.g., baseball, football, etc.) may be provided, for example, on the top portion of the screen.
  • Figure 2 presents the apparatus used to implement techniques of the invention whereas Figures 3 through 5 present details of moving or changing the features associated with 3D elements in a virtual 3D space.
  • Figure 2 illustrates one embodiment of a computer system 10, such as a set-top box connected to a television, for implementing the principles of the present invention.
  • Computer system 10 comprises a processor 17, a storage device 18, and an interconnect 15 such as a bus or a point-to-point link.
  • Processor 17 is coupled to storage device 18 by interconnect 15.
  • A number of user input/output devices, such as a keyboard 20 and display 25, are coupled to a chipset (not shown) that is then connected to processor 17.
  • The chipset is typically connected to processor 17 using an interconnect that is different from interconnect 15.
  • Processor 17 represents a central processing unit of any type of architecture (e.g., the Intel architecture, the Hewlett Packard architecture, the Sun Microsystems architecture, the IBM architecture, etc.), or hybrid architecture. In addition, processor 17 could be implemented on one or more chips.
  • Storage device 18 represents one or more mechanisms for storing data. Storage device 18 may include read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, and/or other machine-readable media.
  • Interconnect 15 represents one or more buses (e.g., accelerated graphics port bus, peripheral component interconnect bus, industry standard architecture bus, X-Bus, video electronics standards association related buses, etc.) and bridges (also termed bus controllers).
  • Network 30 represents one or more network connections for transmitting data over a machine readable media.
  • Figure 2 also illustrates that storage device 18 has stored therein data 35 and program instructions (e.g., software, computer program, etc.) 36.
  • Data 35 represents data stored in one or more of the formats described herein.
  • Data 35 includes existing EPG data, data that affects a 3D element or other like information.
  • Existing EPG data is data that has been previously stored in storage device 18.
  • Existing EPG data includes 3D elements that represent, for example, a television program. These 3D elements may be created from the information found in two-dimensional EPGs.
  • Program instructions 36 represents the necessary code for performing any and/or all of the techniques described with reference to Figures 3-5, such as moving a 3D element or changing a feature associated with a 3D element.
  • a feature of a 3D element includes parameters, characteristics, or other suitable data associated with a 3D element. Examples of a feature of a 3D element include color, texture, transparency, and audio elements of a 3D element.
  • Program instructions include an event interpreter, a behavior filter, a data mapper, an EPG engine, and a 3D graphics pipeline. Each of these components is described below in Figure 3.
  • Storage device 18 preferably contains additional software (not shown), which is not necessary to understanding the invention.
  • Figure 2 additionally illustrates that processor 17 includes decoder 40.
  • Decoder 40 is used for decoding instructions received by processor 17 into control signals and/or microcode entry points. In response to these control signals and/or microcode entry points, decoder 40 performs the appropriate operations.
  • Figure 3 illustrates a block diagram of program instructions 36 of computer system 10 used for representing information such as 3D elements in a 3D virtual space in accordance with one embodiment of the invention.
  • Computer system 10 includes program instructions such as event interpreter 210, behavior filter 220, data mapper 240, EPG engine 230, and 3D graphics pipeline 250. Each of these components is discussed in detail below.
  • Event interpreter 210 receives commands sent by a user to move or to change a feature of a 3D element.
  • The user inputs commands into a graphical user interface (GUI) by using, for example, input/output devices such as a remote control, a mouse, a keyboard, a voice recognition device, or other suitable devices to send commands.
  • Commands may also be automatically generated by computer system 10 to move or to change a 3D element based upon, for instance, the historical viewing experiences of the user.
  • The user may input a variety of information that may affect an event associated with a 3D element. For instance, the user may require a 3D element to move from a first position, represented by coordinates X1, Y1, Z1 (in which X1 is 1, Y1 is 2, and Z1 is 2), to a second position such as X2, Y1, Z1 (in which X2 is 4) in a 3D virtual space.
  • The user causes the 3D element to move by using an input/output device to drag the 3D element from its first position to its second position.
  • The user may also issue commands through an input/output device that require the 3D element to change a feature associated with the 3D element, such as changing the color, the texture, the transparency, or the audio of the 3D element.
  • For example, the user may send a command to processor 17 that changes the color of a 3D element from a blue to a red colored background.
  • Event interpreter 210 recognizes that the command sent from the user attempts to affect a 3D element of the EPG and transfers this information to behavior filter 220.
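A minimal sketch of the event interpreter's role, assuming a simple dictionary-based command format (all names and shapes here are illustrative, not the patent's implementation): the interpreter receives a user command, decides whether it affects a known 3D element, and, if so, produces the change to forward to the behavior filter.

```python
def interpret(command: dict, elements: dict):
    """Return (element_id, change) if the command affects a 3D element, else None."""
    target = command.get("target")
    if target not in elements:
        return None                     # command does not affect any 3D element
    if command["action"] == "move":
        # e.g. drag from (1, 2, 2) to (4, 2, 2) in the virtual 3D space
        return target, {"position": command["to"]}
    if command["action"] == "recolor":
        return target, {"color": command["color"]}
    return None

# A drag command moving the element associated with one program:
elements = {"abc_news": {"position": (1, 2, 2), "color": "blue"}}
event = interpret({"target": "abc_news", "action": "move", "to": (4, 2, 2)}, elements)
print(event)  # ('abc_news', {'position': (4, 2, 2)})
```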
  • Behavior filter 220 serves two functions. First, behavior filter 220 locates an appropriate description as to the command issued by the user and then sends this description to EPG engine 230 for creating, for example, texture maps used to form the 3D EPG. The description is based upon a rule or a set of rules related to changing the surface parameters of the 3D element. These rules are triggered by a spatio-temporal event such as a user moving the cursor close to the 3D element.
  • Behaviors that are associated with behavioral rules include causing the 3D element to shake, twist, flip, zoom in, rotate about the X, Y, or Z axis, or exhibit any other type of behavior.
  • Specific behavior rules may be associated with a class of 3D elements.
  • Figure 4 illustrates that numerous classes may be established with various behavior rules for 3D elements in a 3D virtual space in accordance with one embodiment of the invention.
  • 3D elements of the same class follow the same behavioral rules; however, individual rules may be customized for particular 3D elements.
  • Listed below are some of the classes that may be used such as a description class, a content class, a switch class, a network class, a movie preview class, an advertisement class, a time class, and a control class.
  • Description class includes 3D elements that verbally describe the content of the data, such as television programs or other information broadcast on the television.
  • Description 3D element 310 may be used to present information pertaining to a soccer game. Behavior for these 3D elements may include, for example, moving in the 3D virtual space or changing the appearance of the 3D elements (e.g., color, texture, transparency, etc.).
  • Content class includes 3D elements that present information (e.g., name of the program, logo associated with the network etc.) that guides a user to television programs.
  • Content class element 320 represents a basketball game that is to be broadcast at 10:30 p.m.
  • Content class elements are configured to move (e.g., flip, twist etc.) or change a feature (e.g., color, texture parameters, transparency) when a user moves a cursor close to this 3D element.
  • Switch class includes 3D elements used to control EPG switches including such features as recording, on/off, EPG type (e.g., headers, time, titles (content), station logos, advertisements, information etc.), or other like information.
  • Switch class elements, such as switch class element 330, may alter their appearance, such as the color or texture.
  • Network class includes 3D elements that may be used to represent a 3D-enabled web-navigator.
  • Network class element 340 points to ESPN CACF championship coverage of a football game such as the Falcons versus the Vikings and the Jets versus the Broncos. These 3D elements may change color.
  • Movie preview class includes 3D elements that may be used to represent preview video information. These 3D elements may be used to create picture-in-picture preview control in the 3D-enabled EPG. 3D element 350 provides preview information as to a movie. Behavior rules for this class may include shaking or rotating the 3D element about an axis.
  • Advertisement class includes 3D elements that may be used to present advertisement content.
  • advertisement 3D element 360 shows an automobile that is being advertised by a car manufacturer. These 3D elements may rotate about the X, Y, or Z axis.
  • Time class includes 3D elements that may be used to present time information by displaying time stamps on a 3D element.
  • time 3D element 365 shows the time associated with viewing a program. These 3D elements may rotate about the X, Y, or Z axis.
  • Control class includes 3D elements that control content description appearance such as control element 370.
  • Control 3D elements may move (e.g., rotate, navigate 3D elements). While these classes represent numerous 3D elements, it will be appreciated that a user may develop a variety of other suitable classes that include standardized or arbitrarily established behavioral rules.
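The class-based behavior rules above can be sketched as a shared rule table with per-element overrides, since elements of the same class follow the same rules but individual rules may be customized. The class names follow the patent's list; the rule contents are illustrative placeholders, not behaviors prescribed by the patent.

```python
# Shared behavioral rules per class of 3D element (illustrative contents).
CLASS_RULES = {
    "description":   {"cursor_near": "change_color"},
    "content":       {"cursor_near": "flip"},
    "switch":        {"select": "change_texture"},
    "advertisement": {"tick": "rotate_y"},
    "time":          {"tick": "rotate_x"},
}

def rules_for(element_class, overrides=None):
    """Merge the shared class rules with per-element customizations."""
    merged = dict(CLASS_RULES.get(element_class, {}))
    merged.update(overrides or {})
    return merged

# A content-class element that customizes the shared cursor_near rule:
print(rules_for("content", {"cursor_near": "zoom"}))  # {'cursor_near': 'zoom'}
```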
  • The second function of behavior filter 220 shown in Figure 3 involves filtering data received from event interpreter 210 based upon the content of the data, the context of data, or other suitable characteristics of data.
  • Content of the data relates to the subject matter (e.g., sports) whereas the context of data concerns one object related or linked to another object.
  • An example of data in context involves a video clip adjacent to two buttons. The video clip itself may not provide a description of the data of interest but the two buttons, linked to the video clip, may provide the desired description. In this manner, the two buttons in conjunction with the video clip exemplify the context of data.
  • A filter or filters used in behavior filter 220 eliminate irrelevant data that does not affect a 3D element.
  • The filter may be defined by an upper boundary and a lower boundary, or it may be defined by a single boundary for a characteristic of data.
  • A system operator, a person who ensures that computer system 10 operates efficiently, may designate an upper boundary for a characteristic such as the content of data, the context of data, or another suitable characteristic of data.
  • The user may establish a single boundary, such as an upper boundary that filters data based upon all sports, or a lower boundary may be set to filter data based upon soccer alone.
  • A user may establish a two-boundary filter for a category such as the "comedy" category.
  • A user may wish to see comedies that are no longer than 1 hour (i.e., the upper boundary) and no shorter than 0.5 hour (i.e., the lower boundary).
  • The user may establish a single-boundary filter that allows the user to see or record movies produced after a certain year, a double-boundary filter for movies produced between year 1 and year 2, or a multiple-boundary filter for movies produced between 1930 and 1933 and between 1955 and 1971.
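The single-, double-, and multiple-boundary filters described above can be sketched as interval predicates. This is a simplified model, assuming each boundary pair is a closed interval on one numeric characteristic of the data; data passes the filter if it falls inside any interval.

```python
def make_filter(*intervals):
    """Build a predicate from one or more (lower, upper) boundary pairs."""
    def passes(value):
        return any(lo <= value <= hi for lo, hi in intervals)
    return passes

# Double-boundary filter: comedies between 0.5 and 1 hour long.
comedy_length = make_filter((0.5, 1.0))
print(comedy_length(0.75))   # True
print(comedy_length(2.0))    # False

# Multiple-boundary filter: movies produced 1930-1933 or 1955-1971.
movie_year = make_filter((1930, 1933), (1955, 1971))
print(movie_year(1960))      # True
print(movie_year(1945))      # False
```

A single-boundary filter such as "movies produced after a certain year" would simply use an open-ended interval, e.g. `make_filter((1990, float("inf")))`.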
  • In this way, computer system 10 is able to quickly focus on data related to 3D elements that the user desires to move or whose features the user desires to change.
  • The operation of filters is known in the art; therefore, details as to the precise operation of filters are not presented, in order to avoid obscuring the invention.
  • EPG engine 230 receives EPG data processed by data mapper 240 which is discussed below.
  • EPG engine 230 performs the function of processing the various data to produce a texture map or maps. Texture mapping is the mapping of an image onto an object.
  • Data mapper 240 serves the function of mapping EPG data onto each 3D element.
  • Data mapper 240 is configured to associate text, images, live video, stored video, or any other suitable object with a 3D element. Details as to the manner in which data mapping and texture mapping are performed are found in Method And Apparatus For Using A General Three-Dimensional (3D) Graphics Pipeline For Cost Effective Digital Image Video Editing, Transformation, and Representation.
  • 3D graphics pipeline 250 may be the 3D graphics pipeline described in Method and Apparatus for Using a General Three-Dimensional (3D) Graphics Pipeline For Cost Effective Digital Image Video Editing, Transformation, and Representation, Serial No. , filed on by Yakov Kamen and Leon Shirman, or any conventional 3D graphics pipeline.
  • 3D graphics pipeline 250 takes a texture map(s) created by EPG engine 230 and properly maps these texture maps onto objects and displays the 3D elements in the virtual 3D space of the EPG using known techniques.
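A minimal sketch of the data mapper's role, under the assumption that a 3D element and a memory object are plain dictionaries (the function name is hypothetical): the mapper attaches the memory object to the element so the EPG engine can later build a texture map from it.

```python
def map_epg_data(element: dict, memory_object) -> dict:
    """Attach EPG data (e.g. a network's time-slot listing) to a 3D element."""
    element = dict(element)               # don't mutate the caller's element
    element["texture_source"] = memory_object
    return element

# Map the ABC time slot onto the 3D element associated with ABC:
abc_element = {"id": "ABC", "geometry": "square", "position": (0, 0, 0)}
mapped = map_epg_data(abc_element, {"time_slot": "10:30 p.m.", "title": "Basketball"})
print(mapped["texture_source"]["time_slot"])  # 10:30 p.m.
```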
  • Figure 5 illustrates a flow diagram for controlling the movement or the presentation of the 3D elements in a 3D virtual space in accordance with one embodiment of the invention.
  • A command concerning an event or time associated with a 3D element is sent to the event interpreter by, for example, a user.
  • The event interpreter recognizes the name of the event or the time of the event and associates it with a 3D element.
  • The event interpreter determines whether a user is interacting with at least one 3D element.
  • At least one 3D element is affected by the command.
  • The event interpreter then sends this 3D data to the behavior filter.
  • The behavior filter associates a behavior description with the data.
  • The behavior description may indicate that a 3D element is to move or that a feature associated with the 3D element is to be modified.
  • The behavior filter filters out irrelevant data based upon characteristics of data, such as the content or context of data designated by the user or system operator.
  • The data mapper maps the EPG data (e.g., text, image, live video, stored video, etc.) onto a 3D element.
  • For example, a 3D element may represent the ABC network.
  • The data mapper accesses the ABC time slot and maps it to the 3D element.
  • The EPG engine sends a request to a 3D graphics pipeline to generate a 3D representation, on a screen or other visual display, of the 3D elements in the virtual 3D space.
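The flow just described (interpret the command, filter irrelevant data, attach a behavior description, map EPG data onto the element, and hand off for rendering) can be sketched end to end. All data shapes, names, and rule contents here are illustrative assumptions, not the patent's implementation.

```python
def handle_command(command, elements, relevant, behaviors, epg_data):
    """Process one user command through interpreter, filter, and mapper."""
    target = command.get("target")
    if target not in elements:            # interpreter: no 3D element affected
        return None
    if not relevant(target):              # behavior filter: drop irrelevant data
        return None
    description = behaviors.get(command["action"], "none")
    element = dict(elements[target])
    element["behavior"] = description     # e.g. "move" or "change feature"
    element["texture_source"] = epg_data.get(target)   # data mapper
    return element                        # EPG engine would texture-map this

elements = {"espn": {"geometry": "hexagon", "position": (0, 1, 0)}}
result = handle_command(
    {"target": "espn", "action": "move"},
    elements,
    relevant=lambda t: t == "espn",       # user only cares about sports
    behaviors={"move": "move in virtual 3D space"},
    epg_data={"espn": "Falcons vs. Vikings, 10:00 p.m."},
)
print(result["behavior"])  # move in virtual 3D space
```

The returned element would then be passed to the EPG engine and 3D graphics pipeline for texture mapping and display.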

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention involves moving a three-dimensional (3D) element in a virtual 3D space of an electronic programming guide or changing a feature associated with the 3D element. This is accomplished by a user sending a command to an event interpreter (400). The event interpreter determines whether the user is interacting with a 3D element (410). It is then determined whether the 3D element is affected by the command sent by the user (420). The behavior filter locates an appropriate description as to the command issued by the user and then sends this description to an electronic program guide (EPG) engine for creating a 3D-enabled EPG (430). Data is filtered by a behavior filter based upon the content of the data or the context of the data (440). Data is then represented by at least one three-dimensional element in a 3D virtual space (450). The three-dimensional element is configured to move in a virtual three-dimensional space of an image or change a feature of a three-dimensional element (460).

Description

METHOD AND APPARATUS FOR CONTROLLING THE MOVEMENT OR CHANGING THE APPEARANCE OF A THREE-DIMENSIONAL ELEMENT
BACKGROUND OF THE INVENTION
This application claims the benefit of the earlier filing date of co-pending provisional application of Yakov Kamen and Leon Shirman entitled "Mechanism and Apparatus to Control Behavior of 3D-Enabled EPG Elements," Serial No. 60/182,838, filed February 16, 2000 and incorporated herein by reference.
FIELD OF THE INVENTION
This invention relates generally to creating electronic programming guides (EPGs) and more specifically to controlling the movement or the appearance of three-dimensional (3D) elements used in EPGs.
BACKGROUND
Television provides a vast amount of audiovisual information to a variety of audiences. Typically, a user is able to determine which television program to watch by reviewing a television guide purchased at a store or an EPG that is electronically available through cable television. Many users use the EPG available on cable television due to its ease of access.
EPGs are presented in a two-dimensional format as shown in Figure 1. First column 110 provides the various channels of a broadcast on cable television. Second, third, and fourth columns (120, 130, 140) present the television programs for the particular channels that will be broadcast in half-hour increments. For example, at 10:00 p.m., second column 120 indicates that the news will be broadcast on channel 2. Third column 130 presents the television programs to be broadcast at 10:30 p.m. and fourth column 140 presents the television programs that will be broadcast at 11:00 p.m. Because there are generally more television channels to present television programming information than there is space in columns and rows in an EPG, grid 135 scrolls at a preselected rate to allow a user time to consider all the television programs that are to be broadcast. Typically, a user is unable to modify the two-dimensional EPG.
Another conventional system involves digital EPGs. Digital EPGs also use a two-dimensional format to present television programming information. A user of a digital EPG, however, is capable of interacting with the digital EPG to customize the types of television programs that are presented. In interactive television, a user may browse television programming information presented in a two-dimensional format in any order chosen by the user. For example, a user may select television programs from an on-screen menu for current or future viewing or order pay-per-view programming.
The two-dimensional format for presenting television programming information is problematic for some users because of difficulty viewing or distinguishing information. Moreover, the two-dimensional format of a conventional EPG is generally unable to be personalized to a user. For example, a user cannot modify the manner in which the information is presented such as by moving television program information to a portion of the screen or change the way in which the information is physically presented to make the information more easily viewed.
Another disadvantage associated with conventional EPGs is that it takes a considerable amount of time to modify EPGs. Accordingly, it is desirable to have an EPG that addresses the disadvantages associated with conventional EPGs.
SUMMARY
One embodiment of the invention relates to controlling the movement or changing the features (e.g., color, texture, transparency, audio etc.) associated with three-dimensional (3D) elements in an electronic programming guide (EPG) that is presented on, for example, a television. Each 3D element typically represents information relevant to a television program such as a sports program.
To implement techniques of the invention, program instructions such as an event interpreter, a behavior filter, an EPG engine, and a data mapper are executed on a computer system. The event interpreter receives a command from a user. The event interpreter then recognizes an event or time associated with a 3D element and is able to determine whether a user is interacting with a particular 3D element.
If the user is interacting with a 3D element, the data is then sent to a behavior filter. The behavior filter receives the data and associates a behavior description with the data. A behavior description may indicate that a 3D element is to move or to change a feature associated with a 3D element. The behavior filter also filters the data. The filter prevents irrelevant data from being further processed. Data is filtered based upon the content or the context of the data. The content of data relates to the subject matter of the data. For instance, a user may desire to view 3D elements related to sports in a certain spatial order. While data related to sports programs will pass through the behavior filter, other data such as information pertaining to stocks, for example, will not pass through the behavior filter.
Data that passes through the behavior filter is then sent to an EPG engine. The EPG engine creates the EPG using a 3D graphics pipeline based upon this data and the EPG data received from the data mapper. The data mapper is used to map EPG data onto 3D elements. The data mapper takes a memory object (e.g., text, image, video, etc.) and attaches the memory object to a 3D element. For instance, the data mapper receives EPG data such as the ABC time slot and maps this information onto the 3D element associated with ABC. Additional features, embodiments, and benefits will be evident in view of the figures and detailed description presented herein.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention. In the drawings:
Figure 1 illustrates an electronic version of a television guide of the prior art;
Figure 2 illustrates a computer system in accordance with one embodiment of the invention;
Figure 3 illustrates a block diagram of an electronic programming guide (EPG) formed in accordance with one embodiment of the invention;
Figure 4 illustrates an EPG with three-dimensional (3D) elements in accordance with one embodiment of the invention; and
Figure 5 illustrates a flow diagram for modifying or moving a 3D element in an EPG in accordance with one embodiment of the invention.
DETAILED DESCRIPTION
One embodiment of the invention relates to a system for creating a 3D electronic programming guide (EPG) that includes a plurality of three-dimensional (3D) elements that a user may move, or whose features a user may change. Changing a feature of a 3D element may include changing the color, the texture, the transparency, or the audio features that are associated with the 3D element.
Each 3D element represents information to a user such as a television program. A 3D element is defined by its geometric structure (e.g., spheres, triangles, hexagons, squares, or other suitable geometric structures), its position in a virtual 3D space, and the behavioral model applicable to the 3D element. The behavioral model is a set of rules that change the surface parameters of a 3D element based upon a spatio-temporal event. An event is an incident of importance. For example, an event may be a user moving a cursor closer to the 3D element that causes the 3D element to respond by changing its color or any other variety of arbitrary responses.
By providing a means by which to move or change a feature of a 3D element, an EPG may be personalized to the requirements established by a user. For example, the user may desire to view a set of 3D elements related to sports in a certain spatial order specified by the user or by the computer system according to a priority established by the historical viewing experience of the user. The set of 3D elements related to sports (e.g., baseball, football, etc.) may be provided, for example, on the top portion of the screen. By allowing a user to view information in this manner, a user is able to more quickly determine the television programs that he or she may wish to view.

Referring to the figures, exemplary embodiments of the invention will now be described. The exemplary embodiments are provided to illustrate aspects of the invention and should not be construed as limiting the scope of the invention. Figure 2 presents the apparatus used to implement techniques of the invention, whereas Figures 3 through 5 present details of moving or changing the features associated with 3D elements in a virtual 3D space.
Figure 2 illustrates one embodiment of a computer system 10, such as a set-top box that is connected to a television, for implementing the principles of the present invention. Computer system 10 comprises a processor 17, storage device 18, and interconnect 15 such as a bus or a point-to-point link. Processor 17 is coupled to storage device 18 by interconnect 15. In addition, a number of user input/output devices, such as a keyboard 20 and display 25, are coupled to a chipset (not shown) that is in turn connected to processor 17. The chipset is typically connected to processor 17 using an interconnect that is different from interconnect 15.
Processor 17 represents a central processing unit of any type of architecture (e.g., the Intel architecture, the Hewlett Packard architecture, the Sun Microsystems architecture, the IBM architecture, etc.), or hybrid architecture. In addition, processor 17 could be implemented on one or more chips. Storage device 18 represents one or more mechanisms for storing data. Storage device 18 may include read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, and/or other machine-readable media. Interconnect 15 represents one or more buses (e.g., accelerated graphics port bus, peripheral component interconnect bus, industry standard architecture bus, X-Bus, Video Electronics Standards Association (VESA) buses, etc.) and bridges (also termed bus controllers).
While this embodiment is described in relation to a single processor system, the invention could be implemented in a multi-processor system such as a broadcast to a multi-processor system located in a large corporation. In addition to other devices, one or more networks 30 may be present. Network 30 represents one or more network connections for transmitting data over a machine-readable media. Figure 2 also illustrates that storage device 18 has stored therein data 35 and program instructions (e.g., software, computer program, etc.) 36. Data 35 represents data stored in one or more of the formats described herein. Data 35 includes existing EPG data, data that affects a 3D element, or other like information. Existing EPG data is data that has been previously stored in storage device 18, such as 3D elements that represent, for example, a television program. These 3D elements may be created from the information found in two-dimensional EPGs.
Program instructions 36 represent the necessary code for performing any and/or all of the techniques described with reference to Figures 3-5, such as moving a 3D element or changing a feature associated with a 3D element. A feature of a 3D element includes parameters, characteristics, or other suitable data associated with a 3D element. Examples of a feature of a 3D element include color, texture, transparency, and audio elements of a 3D element. Program instructions include an event interpreter, a behavior filter, a data mapper, an EPG engine, and a 3D graphics pipeline. Each of these components is described below with reference to Figure 3.
While some of the modules of program instructions are described herein, it will be recognized by one of ordinary skill in the art that storage device 18 preferably contains additional software (not shown), which is not necessary to understanding the invention.
Figure 2 additionally illustrates that processor 17 includes decoder 40. Decoder 40 is used for decoding instructions received by processor 17 into control signals and/or microcode entry points. In response to these control signals and/or microcode entry points, decoder 40 performs the appropriate operations.
Given the description of computer system 10, Figure 3 illustrates a block diagram of program instructions 36 of computer system 10 used for representing information such as 3D elements in a 3D virtual space in accordance with one embodiment of the invention. Computer system 10 includes program instructions such as event interpreter 210, behavior filter 220, data mapper 240, EPG engine 230, and 3D graphics pipeline 250. Each of these components is discussed in detail below.

Event interpreter 210 receives commands sent by a user to move or to change a feature of a 3D element. The user inputs commands into a graphical user interface (GUI) by using, for example, input/output devices such as a remote control, a mouse, a keyboard, a voice recognition device, or other suitable devices to send commands. Alternatively, commands may be automatically generated by computer system 10 to move or to change a 3D element, based upon, for instance, the historical viewing experiences of the user.
The user may input a variety of information that may affect an event that is represented to a user. For instance, the user may require a 3D element to move from a first position represented by coordinates X1, Y1, Z1, in which X1 is 1, Y1 is 2, and Z1 is 2, to a second position such as X2, Y1, Z1, in which X2 is 4, in a 3D virtual space. The user causes the 3D element to move by using an input/output device to drag the 3D element from its first position to its second position. Alternatively, the user may issue commands through an input/output device that require the 3D element to change an associated feature such as the color, the texture, the transparency, or the audio of the 3D element. For instance, the user may send a command to processor 17 that changes the background color of a 3D element from blue to red. Event interpreter 210 recognizes that the command sent from the user attempts to affect a 3D element of the EPG and transfers this information to behavior filter 220.
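The drag interaction described above can be sketched in ordinary program code. The following is an illustrative Python sketch, not the patent's implementation; all class, method, and element names (Element3D, EventInterpreter, "sports-cube") are assumptions made for the example.

```python
# Illustrative sketch (an assumption, not the patent's code): a minimal
# event interpreter that moves a 3D element from (X1, Y1, Z1) = (1, 2, 2)
# to (4, 2, 2), as in the drag example above.
from dataclasses import dataclass

@dataclass
class Element3D:
    name: str
    position: tuple  # (x, y, z) in the virtual 3D space

class EventInterpreter:
    """Maps a user command onto the 3D element it affects, if any."""
    def __init__(self, elements):
        self.elements = {e.name: e for e in elements}

    def handle(self, command):
        # The command names its target; commands that do not interact
        # with a known 3D element are ignored.
        element = self.elements.get(command["target"])
        if element is None:
            return None
        if command["action"] == "move":
            element.position = command["to"]
        return element

guide = [Element3D("sports-cube", (1, 2, 2))]
interp = EventInterpreter(guide)
moved = interp.handle({"target": "sports-cube", "action": "move", "to": (4, 2, 2)})
print(moved.position)  # → (4, 2, 2)
```

A feature change (color, texture, transparency, audio) could be handled analogously by adding further actions to `handle`.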
Behavior filter 220 serves two functions. First, behavior filter 220 locates an appropriate description as to the command issued by the user and then sends this description to EPG engine 230 for creating, for example, texture maps used to form the 3D EPG. The description is based upon a rule or a set of rules related to changing the surface parameters of the 3D element. These rules are triggered by a spatio-temporal event such as a user moving the cursor close to the 3D element.
These rules, either standardized or arbitrarily established, cause a 3D element to behave in a certain manner. Exemplary behaviors that are associated with behavioral rules include causing the 3D element to shake, twist, flip, zoom in on the 3D element, rotate about the X, Y, or Z axis, or any other type of behavior. Specific behavior rules may be associated with a class of 3D elements. Figure 4 illustrates that numerous classes may be established with various behavior rules for 3D elements in a 3D virtual space in accordance with one embodiment of the invention. Typically, 3D elements of the same class follow the same behavioral rules; however, individual rules may be customized for particular 3D elements. Listed below are some of the classes that may be used such as a description class, a content class, a switch class, a network class, a movie preview class, an advertisement class, a time class, and a control class.
Description class includes 3D elements that verbally describe the content of the data, such as television programs or other information broadcast on the television. Description 3D element 310 may be used to present information pertaining to a soccer game. Behavior for these 3D elements may include, for example, moving in the 3D virtual space or changing the appearance of the 3D elements (e.g., color, texture, transparency, etc.).
Content class includes 3D elements that present information (e.g., name of the program, logo associated with the network etc.) that guides a user to television programs. To illustrate, content class 320 is a basketball game that is to be broadcast at 10:30 p.m. Content class elements are configured to move (e.g., flip, twist etc.) or change a feature (e.g., color, texture parameters, transparency) when a user moves a cursor close to this 3D element.
Switch class includes 3D elements used to control EPG switches including such features as recording, on/off, EPG type (e.g., headers, time, titles (content), station logos, advertisements, information etc.), or other like information. Switch class elements such as switch class element 330 may alter its appearance such as the color, or texture.
Network class includes 3D elements that may be used to represent a 3D-enabled web-navigator. For example, network class element 340 points to ESPN CACF championship coverage of a football game such as the Falcons versus the Vikings and the Jets versus the Broncos. These 3D elements may change color.
Movie preview class includes 3D elements that may be used to represent preview video information. These 3D elements may be used to create picture-in-picture preview control in the 3D-enabled EPG. 3D element 350 provides preview information as to a movie. Behavior rules for this class may include shaking or rotating the 3D element about an axis.
Advertisement class includes 3D elements that may be used to present advertisement content. For example, advertisement 3D element 360 shows an automobile that is being advertised by a car manufacturer. These 3D elements may rotate about the X, Y, or Z axis.
Time class includes 3D elements that may be used to present time information by displaying time stamps on a 3D element. For example, time 3D element 365 shows the time associated with viewing a program. These 3D elements may rotate about the X, Y, or Z axis.
Control class includes 3D elements that control content description appearance such as control element 370. Control 3D elements may move (e.g., rotate, navigate 3D elements). While these classes represent numerous 3D elements, it will be appreciated that a user may develop a variety of other suitable classes that include standardized or arbitrarily established behavioral rules.
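The class-based behavioral rules described above — shared rules per class, with individual customization for particular 3D elements — might be organized as a simple lookup. This Python sketch is illustrative only; the class keys, rule names, and override mechanism are assumptions, not the patent's API.

```python
# Hedged sketch: class-level behavior rules with per-element overrides.
# Rule and element names here are invented for illustration.
CLASS_RULES = {
    "content": ["flip", "change_color"],
    "advertisement": ["rotate_x", "rotate_y", "rotate_z"],
    "movie_preview": ["shake", "rotate"],
}
# An individually customized rule for one particular 3D element.
ELEMENT_OVERRIDES = {"nba-game": ["zoom"]}

def behaviors_for(element_name, element_class):
    # 3D elements of the same class follow the same behavioral rules,
    # unless an individual rule has been customized for that element.
    return ELEMENT_OVERRIDES.get(element_name, CLASS_RULES.get(element_class, []))

print(behaviors_for("soccer-cube", "content"))  # → ['flip', 'change_color']
print(behaviors_for("nba-game", "content"))     # → ['zoom']
```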
The second function of behavior filter 220 shown in Figure 3 involves filtering data received from event interpreter 210 based upon the content of the data, the context of data, or other suitable characteristics of data. Content of the data relates to the subject matter (e.g., sports) whereas the context of data concerns one object related or linked to another object. An example of data in context involves a video clip adjacent to two buttons. The video clip itself may not provide a description of the data of interest but the two buttons, linked to the video clip, may provide the desired description. In this manner, the two buttons in conjunction with the video clip exemplify the context of data.
A filter or filters used in behavior filter 220 eliminate irrelevant data that does not affect a 3D element. The filter may be defined by an upper boundary and a lower boundary, or it may be defined by a single boundary for a characteristic of data. For example, a system operator, a person who ensures that computer system 10 operates efficiently, may designate an upper boundary for a characteristic such as content of data, context of data, or other suitable characteristic of data. To illustrate, if a user is solely interested in sports, and in particular soccer, the user may establish a single boundary: an upper boundary may be set to filter data based upon all sports, or a lower boundary may be set to filter data based upon soccer alone.
With respect to context of data, a user may establish a two-boundary filter for a category such as the "comedy" category. A user may wish to see comedies that are no longer than 1 hour (i.e., upper boundary) and no shorter than 0.5 hour (i.e., lower boundary). In another example, the user may establish a single boundary filter which allows the user to see or record movies produced after a certain year, a double boundary filter for movies produced between year 1 and year 2, or a multiple boundary filter for movies produced between 1930 and 1933 and between 1955 and 1971. By establishing boundaries, computer system 10 is able to quickly focus on data related to 3D elements that the user desires to move or whose features the user desires to change. The operation of filters is known in the art; therefore, details as to the precise operation of filters are not presented in order to avoid obscuring the invention.
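The single, double, and multiple boundary filters described above can be modeled as interval tests on a characteristic of the data. The sketch below is a hedged Python illustration; the function name and interval representation are invented for the example.

```python
# Sketch of the boundary filters described above: a filter is a set of
# (lower, upper) intervals, and data passes if its value for the filtered
# characteristic falls inside any interval.
def make_boundary_filter(*intervals):
    def passes(value):
        return any(lo <= value <= hi for lo, hi in intervals)
    return passes

# Multiple boundary filter: movies produced between 1930 and 1933
# and between 1955 and 1971.
year_filter = make_boundary_filter((1930, 1933), (1955, 1971))
print(year_filter(1932))  # → True
print(year_filter(1940))  # → False

# Double boundary filter: comedies between 0.5 and 1 hour long.
duration_filter = make_boundary_filter((0.5, 1.0))
print(duration_filter(0.75))  # → True
```

A single boundary filter is the degenerate case with one open end, e.g. `(certain_year, float("inf"))` for movies produced after a certain year.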
The filtered data from behavior filter 220 is then sent to EPG engine 230. EPG engine 230 also receives EPG data processed by data mapper 240 which is discussed below. EPG engine 230 performs the function of processing the various data to produce a texture map or maps. Texture mapping is the mapping of an image onto an object.
Data mapper 240, in contrast, serves the function of mapping EPG data onto each 3D element. Data mapper 240 is configured to associate text, images, live video, stored video, or any other suitable object with a 3D element. Details as to the manner in which data mapping and texture mapping are performed are found in Method And Apparatus For Using A General Three-Dimensional (3D) Graphics Pipeline For Cost Effective Digital Image Video Editing, Transformation, and Representation, Serial No. , filed on by Yakov Kamen and Leon Shirman, which is incorporated by reference.
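The data mapper's role — attaching a memory object such as text, an image, or video to the 3D element that represents it — might look roughly like the following Python sketch. The DataMapper class and the "abc-cube" element name are assumptions for illustration, not the referenced implementation.

```python
# Illustrative data mapper sketch (names are assumptions): attaches a
# memory object (text, image, video, etc.) to a 3D element so that the
# EPG engine can later build texture maps from it.
class DataMapper:
    def __init__(self):
        self.mapping = {}

    def attach(self, element_name, memory_object):
        # Associate the EPG data with the 3D element that represents it,
        # e.g. the ABC time slot with the element associated with ABC.
        self.mapping.setdefault(element_name, []).append(memory_object)

    def objects_for(self, element_name):
        return self.mapping.get(element_name, [])

mapper = DataMapper()
mapper.attach("abc-cube", {"type": "text", "data": "ABC 10:00 p.m. News"})
print(mapper.objects_for("abc-cube"))
```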
EPG engine 230 then sends a request to 3D graphics pipeline 250 to provide a screen representation or other visual display of the event or time related to the EPG data. 3D graphics pipeline 250 may be the 3D graphics pipeline described in Method and Apparatus for using a General Three-Dimensional (3D) Graphics Pipeline For Cost Effective Digital Image Video Editing, Transformation, and Representation, Serial No. , filed on by Yakov Kamen and Leon Shirman, or any conventional 3D graphics pipeline. 3D graphics pipeline 250 takes a texture map(s) created by EPG engine 230, properly maps these texture maps onto objects, and displays the 3D elements in the virtual 3D space of the EPG using known techniques.
Figure 5 illustrates a flow diagram for controlling the movement or the presentation of the 3D elements in a 3D virtual space in accordance with one embodiment of the invention. At block 400, a command concerning an event or time associated with a 3D element is sent to the event interpreter by, for example, a user. The event interpreter recognizes the name of the event or the time of the event and associates it with a 3D element. At block 410, the event interpreter determines whether a user is interacting with at least one 3D element. At block 420, it is determined whether at least one 3D element is affected by the command. The event interpreter then sends this 3D data to the behavior filter. At block 430, the behavior filter associates a behavior description with the data. The behavior description may indicate that a 3D element is to move or that a feature associated with the 3D element is to be modified. At block 440, the behavior filter filters out irrelevant data based upon characteristics of data such as the content or context of data designated by the user or system operator. At block 450, the data mapper maps the EPG data (e.g., text, image, live video, stored video, etc.) onto a 3D element. For example, a 3D element may represent the ABC network. The data mapper accesses the ABC time slot and maps it to the 3D element. At block 460, the EPG engine sends a request to a 3D graphics pipeline to generate a 3D representation on a screen or other visual display of the 3D elements in the virtual 3D space.
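The flow of blocks 400 through 460 can be summarized as a single chain of steps. This Python sketch is a hedged illustration of that flow; every function and data structure here is an assumption standing in for the program instructions of Figures 3 and 5, with the 3D graphics pipeline represented by a plain return value.

```python
# End-to-end sketch of the Figure 5 flow (blocks 400-460). Names and
# bodies are assumptions, not the patent's implementation.
def run_pipeline(command, elements, behavior_rules, data_filter, epg_data):
    # 400/410: interpret the command and find the affected 3D element.
    element = elements.get(command["target"])
    if element is None:
        return None
    # 430: attach a behavior description from the element's class rules.
    description = behavior_rules.get(element["class"], "no-op")
    # 440: drop the command if its data is filtered out as irrelevant.
    if not data_filter(command):
        return None
    # 450: map EPG data onto the element.
    element["mapped_data"] = epg_data.get(command["target"])
    # 460: hand off to the 3D graphics pipeline (a dict stands in here).
    return {"element": element, "behavior": description}

elements = {"abc-cube": {"class": "content"}}
result = run_pipeline(
    {"target": "abc-cube", "subject": "sports"},
    elements,
    {"content": "flip"},
    lambda cmd: cmd["subject"] == "sports",  # sports-only behavior filter
    {"abc-cube": "ABC 10:00 p.m. time slot"},
)
print(result["behavior"])  # → flip
```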
In the foregoing description, the invention is described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims

CLAIMS: What is claimed is:
1. A computer-implemented method for creating an electronic programming guide that includes a plurality of three-dimensional elements comprising: associating a behavior description with a command sent by a user which affects at least one three-dimensional element; filtering a first set of data using a filter to obtain a second set of data which affects the at least one three-dimensional element, the filter filters the first set of data based upon one of content of the first set of data and context of the first set of data; mapping electronic programming guide data onto the at least one three-dimensional element; and displaying the at least one three-dimensional element using a three-dimensional graphics pipeline wherein the at least one three-dimensional element is configured to perform one of moving in a virtual three-dimensional space of the electronic programming guide and changing a feature of the at least one three-dimensional element.
2. The computer-implemented method of claim 1, wherein the feature is one of a color, a texture, transparency, and audio.
3. The computer-implemented method of claim 1, wherein the at least one three-dimensional element is defined by the at least one three-dimensional element's geometric structure, its position in the three-dimensional space, and at least one behavioral rule.
4. The computer-implemented method of claim 1, further comprising: representing data by a plurality of three-dimensional elements, the plurality of three-dimensional elements are classified in at least one of an advertisement class, a network class, a movie preview class, a time class, a description class, a control class, a content class, and a switch class.
5. A system comprising: a processor; the processor is coupled to a display device and to a memory; the processor creates a three-dimensional electronic programming guide that includes at least one three-dimensional element; a first set of data is filtered to obtain a second set of data which affects the at least one three-dimensional element, the at least one three-dimensional element is configured to perform one of moving in a three-dimensional space of the electronic programming guide and changing a feature; and an interconnect coupled to the processor and to the memory to allow the at least one three-dimensional element to be transported between the memory, the processor, and the display device.
6. The system of claim 5, wherein a feature is one of a color, a texture, transparency, and audio.
7. The system of claim 5, wherein the at least one three-dimensional element is defined by one of geometric structure and position in the three- dimensional space.
8. The system of claim 5, further comprising: a three-dimensional graphics pipeline that is executed on the processor for representing data by a plurality of three-dimensional elements, the plurality of three-dimensional elements are classified in one of an advertisement class, a network class, a control class, a time class, a description class, and a switch class.
9. A machine readable storage media containing executable program instructions which when executed cause a digital processing system to perform a method comprising: associating a behavior description with a command sent by a user which affects at least one three-dimensional element; filtering a first set of data using a filter to obtain a second set of data which affects the at least one three-dimensional element, the filter filters the first set of data based upon one of content of the first set of data and context of the first set of data; mapping electronic programming guide information onto the at least one three-dimensional element; and displaying the at least one three-dimensional element using a three-dimensional graphics pipeline wherein the at least one three-dimensional element is configured to perform one of moving in a virtual three-dimensional space of the electronic programming guide and changing a feature of the at least one three-dimensional element.
10. The machine readable storage media of claim 9, wherein a feature is one of a color, a texture, transparency, and audio.
11. The machine readable storage media of claim 9, wherein the at least one three-dimensional element is defined by the at least one three-dimensional element's geometric structure, its position in the three-dimensional space, and behavioral rules.
12. The machine readable storage media of claim 9, wherein the method further comprises: representing data by a plurality of three-dimensional elements, the plurality of three-dimensional elements are classified in at least one of an advertisement class, a network class, a movie preview class, a time class, a description class, a control class, a content class, and a switch class.
PCT/US2001/005056 2000-02-16 2001-02-16 Method and apparatus for controlling the movement or changing the appearance of a three-dimensional element WO2001061996A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001238406A AU2001238406A1 (en) 2000-02-16 2001-02-16 Method and apparatus for controlling the movement or changing the appearance of a three-dimensional element

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US18283800P 2000-02-16 2000-02-16
US60/182,838 2000-02-16
US78496101A 2001-02-15 2001-02-15
US09/784,961 2001-02-15

Publications (1)

Publication Number Publication Date
WO2001061996A1 true WO2001061996A1 (en) 2001-08-23

Family

ID=26878475

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/005056 WO2001061996A1 (en) 2000-02-16 2001-02-16 Method and apparatus for controlling the movement or changing the appearance of a three-dimensional element

Country Status (2)

Country Link
AU (1) AU2001238406A1 (en)
WO (1) WO2001061996A1 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1093880A (en) * 1996-09-12 1998-04-10 Hitachi Ltd Three-dimensional display program guide generation device


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102088638A (en) * 2009-11-24 2011-06-08 Lg电子株式会社 Image display device and method for operating the same
EP2337366A1 (en) * 2009-11-24 2011-06-22 LG Electronics Inc. Image display device and method for operating the same
CN102088638B (en) * 2009-11-24 2013-07-24 Lg电子株式会社 Image display device and method for operating the same
US8896672B2 (en) 2009-11-24 2014-11-25 Lg Electronics Inc. Image display device capable of three-dimensionally displaying an item or user interface and a method for operating the same
EP2609739A1 (en) * 2010-08-27 2013-07-03 Telefonaktiebolaget L M Ericsson (PUBL) Methods and apparatus for providing electronic program guides
EP2609739A4 (en) * 2010-08-27 2014-04-16 Ericsson Telefon Ab L M Methods and apparatus for providing electronic program guides
EP2962458A4 (en) * 2013-05-10 2016-10-26 Samsung Electronics Co Ltd Display apparatus and method of providing a user interface thereof

Also Published As

Publication number Publication date
AU2001238406A1 (en) 2001-08-27


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: PCT application non-entry into the European phase
NENP Non-entry into the national phase

Ref country code: JP