US20150249846A1 - Synchronization of user interactive events with on-screen events during playback of multimedia stream - Google Patents
- Publication number
- US20150249846A1
- Authority
- US
- United States
- Prior art keywords
- multimedia content
- multimedia
- data stream
- interactive event
- events
- Prior art date
- Legal status (the status listed is an assumption and is not a legal conclusion)
- Abandoned
Classifications
- A—HUMAN NECESSITIES
  - A63—SPORTS; GAMES; AMUSEMENTS
    - A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
      - A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
        - A63F13/85—Providing additional services to players
          - A63F13/88—Mini-games executed independently while main games are being loaded
        - A63F13/12—
        - A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
          - A63F13/33—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
            - A63F13/338—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections using television networks
      - A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
        - A63F2300/40—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
          - A63F2300/409—Data transfer via television network
        - A63F2300/60—Methods for processing data by generating or executing the game program
          - A63F2300/6036—Methods for processing data by generating or executing the game program for offering a minigame in combination with a main game
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
        - H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
          - H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
            - H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
              - H04N21/2353—Processing of additional data, e.g. scrambling of additional data or processing content descriptors specifically adapted to content descriptors, e.g. coding, compressing or processing of metadata
            - H04N21/242—Synchronization processes, e.g. processing of PCR [Program Clock References]
        - H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
          - H04N21/47—End-user applications
            - H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
              - H04N21/47205—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
        - H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
          - H04N21/65—Transmission of management data between client and server
            - H04N21/658—Transmission by the client directed to the server
              - H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
        - H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
          - H04N21/81—Monomedia components thereof
            - H04N21/8166—Monomedia components thereof involving executable data, e.g. software
              - H04N21/8173—End-user applications, e.g. Web browser, game
          - H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
            - H04N21/845—Structuring of content, e.g. decomposing content into time segments
              - H04N21/8455—Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream
          - H04N21/85—Assembly of content; Generation of multimedia applications
            - H04N21/854—Content authoring
              - H04N21/8547—Content authoring involving timestamps for synchronizing content
Definitions
- the present disclosure generally relates to multimedia content distribution, and relates more particularly to providing user interactive events during playback of multimedia content.
- Service providers can offer users a variety of viewing options for different multimedia programs. For example, the service providers can supply users with real-time television programs that are typically available for the users to watch only at a specific date and time. The service providers can also offer the users on-demand multimedia content that is available for an extended amount of time and that is provided to the users upon request.
- FIG. 1 is a diagram illustrating a multimedia system providing synchronization between on-screen events and corresponding user interaction events in accordance with at least one embodiment of the present disclosure.
- FIG. 2 is a flow diagram illustrating a method for generating metadata in association with multimedia content in accordance with at least one embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating an example implementation of the method of FIG. 2 in accordance with at least one embodiment of the present disclosure.
- FIG. 4 is a flow diagram illustrating a method for synchronizing user interaction events with corresponding on-screen events during playback of multimedia content by an end-user multimedia device in accordance with at least one embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating an example operation of the multimedia system of FIG. 1 in an on-screen display overlay context in accordance with at least one embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example operation of the multimedia system of FIG. 1 in a video game context in accordance with at least one embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example computer system for implementing one or more of the components or techniques described herein in accordance with at least one embodiment of the present disclosure.
- the techniques of the present disclosure are illustrated in the example context of a packet-based network architecture (such as the Internet or a private Internet Protocol (IP)-based network) utilized to convey information between end-user devices, content sources, media servers, and other components of the network.
- these techniques are not limited to this example, but instead can be implemented in any of a variety of networks configured to support the transmission of multimedia content and other information using the guidelines provided herein without departing from the scope of the present disclosure.
- the techniques of the present disclosure are described with reference to a single end-user multimedia device.
- a telecommunications network may support a number of end-user multimedia devices and thus the described techniques can be employed in parallel for some or all of the end-user multimedia devices within the telecommunications network.
- FIG. 1 illustrates an example multimedia system 100 configured to utilize a data stream representing multimedia content and metadata associated with the multimedia content to provide users with an interactive experience that is more fully synchronized to on-screen events of interest in accordance with at least one embodiment of the present disclosure.
- the multimedia system 100 includes an end-user multimedia device 102 and a display device 104 at a user's premises.
- the multimedia device 102 can include, for example, a set top box (STB) device, a digital video recorder (DVR) device, a video game console, a portable multimedia device (such as a multimedia-enabled cellular phone, a digital radio receiver, or a handheld video game console), and the like.
- the multimedia system 100 further includes a multimedia server 106 and a metadata server 108 connected to the multimedia device 102 via networks 110 and 112 , respectively.
- the networks 110 and 112 can include packet-based networks such as a local area network (LAN), a wireless network, an Internet Protocol (IP)-based provider network, or the Internet. Further, the networks 110 and 112 can comprise the same network or different networks.
- the multimedia server 106 and the metadata server 108 are provided in association with the same service provider 114 such as a cable or satellite television provider.
- the networks 110 and 112 may be implemented as the same provider network connecting the service provider 114 to the multimedia device.
- the multimedia server 106 and the metadata server 108 may be maintained and operated by separate service providers via the same or different networks.
- the multimedia device 102 includes a network interface 122 to the network 110 , a network interface 124 to the network 112 , a data storage component 126 , a display controller 128 , a user interaction controller 130 , a user control interface 132 , and a user input device 134 .
- the network interfaces 122 and 124 can include, for example, an Ethernet interface, a wireless interface, a fiber-optic interface, and the like. In instances where the network 110 and the network 112 are the same network, the network interfaces 122 and 124 can together comprise a single network interface.
- the data storage component 126 can include any of a variety of storage components, such as a hard drive, an optical drive, a random access memory (RAM), flash memory, a cache, and the like.
- the display controller 128 is configured to access a data stream stored or buffered in the data storage component 126 to generate corresponding audio/video signaling 136 representative of a playback of the multimedia content of the data stream. The display controller 128 therefore can implement a decoder (such as a digital television (DTV) decoder, an H.264 decoder, or an MPEG decoder), a video display driver, and other such components utilized in processing data streams to generate corresponding audio/video signaling 136 .
- the user interaction controller 130 is configured to access metadata stored or buffered in the data storage component 126 (or an alternate storage component) to identify interactive event information associated with corresponding time points in multimedia content being presented by the display controller 128 , and to perform one or more user interactive events associated with the identified interactive event information. In performing these user interactive events, the user interaction controller 130 may seek input from the user via the user input device 134 .
- the user input device 134 can include, for example, a remote control device or a keyboard device.
- the user input device 134 can include, for example, a video game controller or handset (that is, a “joystick”).
- the user interaction controller 130 is implemented at least in part as one or more processors 138 that execute a software program 140 , whereby the software program 140 includes a set of executable instructions that manipulate the one or more processors 138 to perform at least a subset of the functionality of the user interaction controller 130 described herein.
- the software program 140 can be stored locally at the multimedia device 102 , such as in the data storage component 126 or other data storage component.
- the software program 140 is stored in a portable storage medium 142 , and the one or more processors 138 access the software program 140 from the portable medium 142 via a portable storage interface 144 .
- the software program 140 can implement a video game application and the portable medium 142 can include an optical disc, a Universal Serial Bus (USB)-based flash memory, or a video game cartridge.
- the multimedia system 100 provides a data stream 150 and metadata 152 to the multimedia device 102 .
- the data stream 150 preferably includes data representative of multimedia content to be played back by the multimedia device 102 via the display device 104 .
- the metadata 152 preferably includes interactive event information corresponding to certain on-screen events of interest in the multimedia content, whereby the multimedia device 102 uses this interactive event information to perform user interactive events in conjunction with the on-screen events.
- the data stream 150 and the metadata 152 each include timing information that is used by the multimedia device 102 to identify the approach of an on-screen event as the playback of the multimedia content progresses and then to identify a user interactive event to be performed concurrent with the on-screen event based on corresponding interactive event information in the metadata 152 .
- the multimedia system 100 stores multimedia data representative of the multimedia content in a storage component 156 .
- the multimedia server 106 accesses the multimedia data from the storage component 156 and encodes it for transmission or provision to the multimedia device 102 as the data stream 150 .
- the multimedia server 106 inserts timing information into the data stream 150 based on timing indicators from a timing source 158 , whereby the inserted timing information permits the multimedia device 102 to identify a relative time of playback for each corresponding portion of multimedia content.
- this timing information is inserted into the data stream 150 in the form of time stamps interspersed substantially equally throughout the data stream 150 , whereby the time stamps may be inserted into the header of packets of the data stream 150 , as separate time stamp packets, and the like.
- the time indicators in the time stamps can be based on a well-known timing source, such as Coordinated Universal Time (also commonly referred to as Temps Universel Coordonné).
- the frequency at which the time stamps are inserted typically is sufficient to ensure that synchronization can be maintained within, for example, a few seconds of the beginning of a playback of the multimedia content, a tuning event, a trickplay event such as pause/resume/fast-forward/reverse, or the like.
- a time stamp may be inserted into the data stream 150 such that a new time stamp is encountered every few seconds in the processing of the data stream 150 at the multimedia device 102 for playback of the corresponding multimedia content.
- the time indicators represented by the time stamps can include either absolute time indicators or relative time indicators that reference an elapsed time from a start of the playback. The implementation of the timing information in the data stream 150 is described in greater detail herein with reference to FIGS. 2-6 .
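The time-stamping scheme described above can be sketched as follows. This is a minimal illustration only: the packet structure, the per-packet duration, and the five-unit stamping interval are assumptions chosen to mirror the +0/+5/+10 example of FIG. 3, not details taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    payload: bytes
    timestamp: Optional[float] = None  # relative time indicator, in seconds

def insert_time_stamps(packets, seconds_per_packet=0.5, interval=5.0):
    """Stamp roughly every `interval` seconds of playback with an
    elapsed-time indicator, mirroring the +0, +5, +10 marks of FIG. 3."""
    elapsed = 0.0
    last_stamp = None
    for pkt in packets:
        if last_stamp is None or elapsed - last_stamp >= interval:
            pkt.timestamp = elapsed
            last_stamp = elapsed
        elapsed += seconds_per_packet
    return packets

stream = insert_time_stamps([Packet(b"...") for _ in range(30)])
stamped = [p.timestamp for p in stream if p.timestamp is not None]
# stamped == [0.0, 5.0, 10.0]
```

A receiver that encounters any one of these stamps can resynchronize within one interval of a tuning or trickplay event, which is why the insertion frequency matters.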
- the form and content of the metadata 152 is based on the relevant aspects, situations, events and other circumstances of the multimedia content represented in the data stream 150 .
- the interactive event information of the metadata 152 can include, for example, scoring updates and the relative times that each corresponding change in the score occurred.
- the interactive event information further can include other dynamic information for the sports game, such as the number of strikes and balls at any given time for a televised baseball game, which down and the number of yards to go for a televised football game, and the like.
- the multimedia device 102 can implement an on-screen display (OSD) that overlies the video content of the sports game as it is being displayed on the display device 104 , whereby the on-screen display provides the current score and other statistics regarding the sports game as it progresses.
- the multimedia device 102 can utilize the interactive event information of the metadata 152 to modify the on-screen display as the playback of the sports game progresses to reflect the changes in the score and other statistics as they occur in the playback of the sports game.
- the interactive event information of the metadata 152 can include, for example, the notes played on the instrument of a particular band member, and the timing of each note.
- the multimedia device 102 can implement a video game application executed concurrent with the playback of the multimedia content.
- the video game application can include a musical band video game, such as one similar to the Guitar Hero™ video game or the Rock Band™ video game, and the multimedia device 102 can use the note information to permit a user to attempt to “play” the same song with the same notes through the video game application.
- the metadata server 108 identifies an on-screen event of interest in the multimedia content and identifies the timing information associated with the on-screen event.
- the timing information associated with the on-screen event is obtained by the metadata server 108 from encoding information 161 provided by the multimedia server 106 during the final encoding process of the data stream 150 in which the multimedia server 106 inserts the timing information for the data stream 150 .
- the metadata server 108 accesses a storage component 158 to obtain interactive event information relevant to the on-screen event and combines or otherwise associates the obtained interactive event information with a corresponding timing indicator synchronized to the timing of the data stream 150 to form a corresponding data packet or other data structure of the metadata 152 .
- the implementation of the timing information in the metadata 152 is described in greater detail herein with reference to FIGS. 2-6 .
- the interactive event information associated with multimedia content can be obtained in any of a variety of ways.
- one or more instruments or other input devices can be used to obtain information regarding the circumstances of the activity as it is being recorded in real-time.
- instruments and other devices such as wind gauges, fuel monitors, lap count monitors, and the like, may be taking periodic measurements as the race progresses.
- the measurements from these instruments and devices, along with timing information associated with each measurement then may be stored in the storage component 158 in association with the corresponding multimedia data representing the recorded automotive race.
- a technician or an automated process may review the recorded content after recording has completed and then generate the corresponding interactive event information based on the review.
- a technician can review a playback of the golf game and use this playback, along with information accessed from other sources, to generate a data file containing, for example: the sequence in which the golf players played each hole; the number of strokes each golf player took for each hole; the distance, direction, and landing point of each stroke and the time at which each stroke occurred; the humidity and other weather conditions at the golf course at various points throughout the tournament; the length of the grass at each fairway and green; and the like.
- the data representative of this information then may be stored in the storage component 158 in association with the corresponding multimedia data representing the recorded golf game.
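As a concrete illustration of such a reviewer-generated data file, the interactive event information might be stored as timed records like the following; every field name and value here is invented for illustration and is not specified by the disclosure.

```python
# Hypothetical records a reviewer of the golf tournament might produce,
# each carrying an elapsed-time reference alongside the event details.
golf_events = [
    {"elapsed": 412.0, "hole": 1, "player": "Player A", "stroke": 1,
     "distance_yd": 280, "landing": "fairway"},
    {"elapsed": 530.5, "hole": 1, "player": "Player A", "stroke": 2,
     "distance_yd": 165, "landing": "green"},
]

def strokes_for_hole(events, hole):
    """Count the recorded strokes for one hole, the kind of query a
    metadata server could answer from such a data file."""
    return sum(1 for e in events if e["hole"] == hole)
```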
- the raw timing information associated with the interactive event information may not be synchronized to the timing source 158 .
- measurements, scores, or other quantifiers may have been stored with timing references indicative of an elapsed time since the measuring device was powered up or otherwise initiated, or the timing references may be tied to a local time source, whereas the timing information implemented in the data stream 150 may be directly tied to an absolute time reference, such as the Coordinated Universal Time.
- the metadata server 108 may reformat or adjust the timing information of the interactive event information to relate to the same timing reference as the data stream 150 , rather than in relation to another timing source.
- for a timing reference indicating an elapsed time since a known starting point, for example, the metadata server 108 may determine the particular point in the Coordinated Universal Time for that starting point and then reformat the successive time indicators relative to this particular point in the Coordinated Universal Time.
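The re-referencing step can be sketched as below, assuming the raw timing references are elapsed seconds since the measuring device started and that the device's UTC start time is known; both assumptions, along with all names and sample values, are illustrative.

```python
from datetime import datetime, timedelta, timezone

def rebase_to_utc(measurements, utc_start):
    """Convert elapsed-time indicators (seconds since the measuring device
    started) into absolute Coordinated Universal Time indicators."""
    return [
        {"event": m["event"], "utc": utc_start + timedelta(seconds=m["elapsed"])}
        for m in measurements
    ]

# Invented example: a race-telemetry log rebased to a known UTC start.
start = datetime(2015, 4, 27, 18, 0, 0, tzinfo=timezone.utc)
log = [{"event": "lap 1 complete", "elapsed": 92.4},
       {"event": "pit stop", "elapsed": 310.0}]
rebased = rebase_to_utc(log, start)
# rebased[1]["utc"] is 2015-04-27 18:05:10 UTC
```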
- the multimedia device 102 can initiate a playback of the multimedia content represented by the data stream 150 in response to, for example, user input requesting the playback.
- the display controller 128 accesses the data stream 150 and processes the data stream 150 for playback of the multimedia content.
- the processing by the display controller 128 can include, for example, decoding and decrypting encoded video and audio information.
- the display controller 128 can provide progress information 160 indicating the current position or progress of the playback of the multimedia content to the user interaction controller 130 .
- This progress information 160 can include, for example, the timing indicators included in the time stamps embedded in the data stream 150 .
- the display controller 128 can output the timing indicator of the time stamp, or a representation thereof, to the user interaction controller 130 .
- the user interaction controller 130 uses the progress information 160 to determine whether the playback of the multimedia content by the display controller 128 is approaching an identified on-screen event of interest. When an identified on-screen event of interest is approaching or reached in the playback, the user interaction controller 130 accesses the subset of the stored interactive event information that is associated with the identified on-screen event and performs one or more user interactive events based on the accessed interactive event information.
- the user interactive events can include or result in video content 162 to be displayed at the display device 104 in association with the identified on-screen event.
- the user interactive event can include, for example, an update to a displayed score of an OSD provided by the user interaction controller 130 at or following the particular point in time of the playback.
- the user interactive events also can include or result in the playback of a video game experience that emulates a situation or other conditions present in the playback of the multimedia content at that particular point in time.
- the multimedia content may represent a band playing a particular song and the multimedia device 102 may implement a musical play-along game application whereby the user is presented the opportunity in the game application to manipulate the user input device 134 to emulate a note played by a guitarist in the band at the same time that the guitarist plays the note in the playback of the concert recording at the multimedia device 102 .
- This process of progressing through the playback of the multimedia content, identifying on-screen events of interest, and then performing corresponding user interactive events concurrent with the occurrence of the on-screen events in the playback can continue until the playback is completed.
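A minimal sketch of this lookup, using the three on-screen events of FIG. 3 as sample time points; the data layout, event strings, and function names are assumptions for illustration.

```python
import bisect

metadata = [  # (time point, interactive event information), sorted by time
    (2.4, "score update: 7-0"),
    (15.2, "score update: 7-3"),
    (27.5, "score update: 14-3"),
]
time_points = [t for t, _ in metadata]

def events_reached(prev_progress, progress):
    """Return the interactive events whose time points fall within the
    playback interval (prev_progress, progress]."""
    lo = bisect.bisect_right(time_points, prev_progress)
    hi = bisect.bisect_right(time_points, progress)
    return [info for _, info in metadata[lo:hi]]

# As playback passes the +15.2 time point, the second event fires:
assert events_reached(10.0, 16.0) == ["score update: 7-3"]
```

Calling this once per received time stamp lets the controller fire each user interactive event at most once, even across pause/resume boundaries.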
- the multimedia device 102 may need access to at least a current portion of the data stream 150 and the relevant portion of the metadata 152 .
- the data stream 150 may be provided to the multimedia device 102 in a variety of manners.
- the data stream 150 may be transmitted from the multimedia server 106 to the multimedia device 102 for a “live” or real-time playback as the recorded event occurs (with some degree of time-shifting to allow for the above-described encoding, processing, and buffering of the multimedia data).
- the data stream 150 may be provided as a pre-recorded broadcast from the multimedia server 106 to the multimedia device 102 , whereby the multimedia device 102 initiates playback of the multimedia content of the data stream 150 as soon as the data stream 150 is received.
- the data stream 150 also may be provided from the multimedia server 106 to the multimedia device 102 as an on-demand transmission such as a video-on-demand (VoD) transmission.
- the data stream 150 may be transmitted to the multimedia device 102 , whereupon the multimedia device 102 operates as a digital video recorder (DVR) and stores the data stream 150 for subsequent playback at a later time.
- the metadata 152 is provided to the multimedia device 102 in parallel with the data stream 150 .
- the metadata server 108 may provide the metadata 152 to the multimedia server 106 , which then embeds the metadata 152 into the data stream 150 .
- the metadata 152 need not be provided to the multimedia device 102 in parallel with the provision of the data stream 150 .
- the multimedia device 102 may receive and store the data stream 150 .
- the multimedia device 102 can then fetch the metadata 152 from the metadata server 108 using, for example, a File Transfer Protocol (FTP) or a Hypertext Transport Protocol (HTTP) and store the fetched metadata 152 at the storage component 126 .
- a pointer or other reference to the particular location of the metadata 152 can be implemented in the data stream 150 .
- the data stream 150 can include an IP address of an FTP server or an HTTP server and an identifier of the file at the FTP server or HTTP server that stores the metadata 152 .
- the user interaction controller 130 or other component of the multimedia device 102 can access the IP address and identifier of the file and initiate a transfer of the file from the server at the identified IP address.
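One possible encoding of such an in-stream pointer is sketched below; the JSON layout, field names, server address, and file path are all invented for illustration, not a format defined by the disclosure.

```python
import json

# A hypothetical pointer packet carried in the data stream.
stream_packet = json.dumps({
    "type": "metadata_pointer",
    "server": "203.0.113.10",            # IP address of the FTP/HTTP server
    "file": "metadata/event_152.json",   # identifier of the metadata file
})

def metadata_url(packet_bytes):
    """Extract the pointer and build the URL the device would fetch,
    e.g. via HTTP, before storing the metadata locally."""
    ptr = json.loads(packet_bytes)
    return f"http://{ptr['server']}/{ptr['file']}"
```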
- the metadata 152 is provided to the multimedia device 102 via a portable storage medium, such as an optical disc, a removable flash drive, or a video game cartridge.
- the multimedia device 102 can include a video game console that implements a video game application represented by a software program 140 stored on a portable storage medium 142 .
- the portable storage medium 142 also may store the metadata 152 , which is then accessed by the multimedia device 102 via the portable storage interface 144 .
- instead of providing the data stream 150 to the multimedia device 102 via a network, the data stream 150 can be provided via a portable storage medium, either together with the metadata 152 (and the software program 140 ) or as a separate portable storage medium.
- an optical disc may be provided for use by the multimedia device 102 , whereby the optical disc includes the data stream 150 representing a sports game for playback by the display controller 128 , and the software program 140 and metadata 152 for execution by the user interaction controller 130 to provide a video game application that emulates the dynamic conditions of the sports game as the playback of the sports game progresses.
- FIG. 2 illustrates an example method 200 for generating the data stream 150 and the metadata 152 of the multimedia system 100 in accordance with at least one embodiment of the present disclosure.
- the multimedia server 106 accesses multimedia data from the storage device 156 and encodes the multimedia data with timing information based on the timing source 158 .
- this encoding process can include inserting time stamps into the multimedia data at periodic intervals, where the time stamps may include time indicators related to Coordinated Universal Time.
- on-screen events of interest in the multimedia content and their corresponding time points are identified. This identification of the on-screen events may be performed by a technician reviewing a playback of the multimedia content, by an automated process or any combination thereof.
- interactive event information for each on-screen event of interest is determined and the interactive event information is then included in the metadata 152 along with an associated time point indicator.
- the process of blocks 204 and 206 can be performed concurrently with, or subsequent to, the process of block 202 .
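A minimal sketch of the metadata-generation step of blocks 204 and 206, assuming a simple record layout (the disclosure requires only that each subset of interactive event information be paired with a time point indicator):

```python
def build_metadata(events):
    """Pair each identified on-screen event's time point with its interactive
    event information, ordered by time point.

    `events` is an iterable of (time_point, event_info) pairs produced by the
    identification step; the dict layout used here is an assumption.
    """
    return [
        {"time_point": time_point, "event_info": info}
        for time_point, info in sorted(events, key=lambda e: e[0])
    ]
```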
- FIG. 3 illustrates an example of the metadata 152 generated in accordance with the method 200 of FIG. 2 .
- the bar 302 of FIG. 3 represents the playback sequence of multimedia content of the data stream 150 .
- the data stream 150 includes time stamps periodically dispersed within the playback sequence, beginning with a starting time indicator of +0 and continuing in five-unit intervals (+5, +10, +15, and so forth).
- Identified in the illustrated data stream 150 are three on-screen events: on-screen event 311 at time point +2.4; on-screen event 312 at time point +15.2; and on-screen event 313 at time point +27.5.
- the data stream 150 is provided to the metadata server 108 for processing.
- the metadata server 108 accesses the storage device 158 to determine the corresponding interactive event information and then the metadata server 108 generates a metadata packet 320 for the metadata 152 .
- the metadata packet 320 includes a time stamp field 322 and a data field 324 .
- the time stamp field 322 stores a time indicator representative of the time point +2.4 and the data field 324 stores the interactive event information accessed in relation to the on-screen event 311 . This process is repeated for the on-screen events 312 and 313 to result in metadata packets 326 and 328 , respectively, for the metadata 152 .
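One plausible wire format for a metadata packet such as packet 320 is sketched below: a time stamp field followed by a length-prefixed data field, mirroring fields 322 and 324. The exact encoding (big-endian double, JSON payload) is an assumption; the disclosure specifies only the two fields.

```python
import json
import struct

HEADER = ">dI"  # 8-byte time stamp field + 4-byte payload length (assumed layout)


def pack_metadata_packet(time_point: float, event_info: dict) -> bytes:
    # Time stamp field 322 followed by data field 324.
    payload = json.dumps(event_info).encode("utf-8")
    return struct.pack(HEADER, time_point, len(payload)) + payload


def unpack_metadata_packet(packet: bytes):
    time_point, length = struct.unpack_from(HEADER, packet, 0)
    start = struct.calcsize(HEADER)
    payload = packet[start:start + length]
    return time_point, json.loads(payload.decode("utf-8"))
```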
- FIG. 4 illustrates an example method 400 for synchronizing user interactive events with corresponding on-screen events during a playback of multimedia content at the multimedia device 102 in accordance with at least one embodiment of the present disclosure.
- the multimedia device 102 receives the data stream 150 .
- the data stream 150 can be transmitted from the multimedia server 106 to the multimedia device 102 as a live or real-time broadcast, as a prerecorded broadcast, as a video-on-demand (VoD) transmission, as a prerecorded playback, and the like.
- the data stream 150 can be made accessible to the multimedia device 102 via a portable storage medium, such as an optical disc or a flash memory device.
- the data stream 150 includes timing information associated with the multimedia content.
- the multimedia device 102 initiates one or more user interactive applications associated with the multimedia content of the data stream 150 .
- the user interactive applications can include, for example, a video game application for providing a video game experience that emulates the circumstances present in the multimedia content.
- the user interactive applications can include an on-screen display application to provide an on-screen display that overlies the display for the multimedia content, whereby the on-screen display can include dynamically updated scores, statistics, quantifiers, and other information pertaining to the displayed multimedia content.
- the user interactive applications can be implemented solely as hardware, such as an application-specific integrated circuit (ASIC) or state machine, as one or more processors 138 executing software programs 140 , or as combinations thereof.
- the initiation of the one or more user interactive applications can occur prior to, concurrent with, or subsequent to the receipt of the data stream 150 .
- the multimedia device 102 receives the metadata 152 associated with the data stream 150 .
- the metadata 152 may be received at the multimedia device 102 prior to, concurrent with, or subsequent to receipt of the data stream 150 .
- the metadata 152 can be embedded in the data stream 150 or provided to the multimedia device 102 via a separate channel.
- the metadata 152 can be made available to the multimedia device 102 via a portable storage medium, which may include the same portable storage medium used to provide at least one of a software program representative of the user interactive application or the data stream 150 .
- the metadata 152 can be obtained by the multimedia device 102 as downloadable content downloaded via a network from an FTP server, a web server, or other content server.
- the multimedia device 102 receives user input via the user input device 134 to instruct the multimedia device 102 to initiate playback of the multimedia content at the display device 104 .
- the display controller 128 accesses the data stream 150 from the storage device 126 and begins processing the accessed data stream 150 to generate corresponding video and audio signaling for the display device 104 .
- the display controller 128 periodically encounters the time stamps or other time indicators interspersed within the data stream 150 .
- the display controller 128 provides a value representative of the time stamp to the user interaction controller 130 as part of the progress information 160 so as to inform the user interaction controller 130 of the current position of the playback of the multimedia content.
- the user interaction controller 130 uses the progress information 160 to determine whether the playback of the multimedia content has progressed to the next on-screen event of interest.
- the user interaction controller 130 accesses the storage device 126 to obtain the subset of interactive event information associated with the on-screen event.
- the interactive event information stored in the storage device 126 includes timing information that correlates certain time points to corresponding subsets of interactive event information, and the user interaction controller 130 therefore can index the corresponding subset of interactive event information based on this timing information and the timing information indicated by the progress information 160 received from the display controller 128 during playback.
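The lookup described above can be sketched with a sorted index over the metadata's time points. This bisect-based structure is one possible implementation, not the form required by the disclosure:

```python
import bisect


class EventIndex:
    """Index the subsets of interactive event information by time point so the
    user interaction controller can find the entry matching the playback
    position reported in the progress information."""

    def __init__(self, entries):
        # entries: iterable of (time_point, event_info) pairs, in any order.
        self._entries = sorted(entries, key=lambda e: e[0])
        self._times = [t for t, _ in self._entries]

    def lookup(self, playback_time):
        """Return the event info for the latest on-screen event at or before
        playback_time, or None if no event has been reached yet."""
        i = bisect.bisect_right(self._times, playback_time) - 1
        return self._entries[i][1] if i >= 0 else None
```

Because the index keys on time points rather than stream positions, it also tolerates trick-play jumps: any reported playback time resolves to the correct most-recent event.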
- the user interaction controller 130 uses the accessed subset of interactive event information to identify one or more user interactive events and then perform the one or more user interactive events concurrent with the playback of the corresponding on-screen event.
- the user interactive event to be performed at any given on-screen event may be predetermined and thus the interactive event information would only include information to be used in performing the user interactive event.
- the user interaction application initiated at block 404 may include an on-screen display application that only provides box scores for the playback of a corresponding sports game.
- the only on-screen events of interest may be changes in the score of the game, and thus the OSD application may be configured to dynamically update the displayed scores using score update information included in the metadata 152 .
- the interactive event information associated with an on-screen event may include an indicator of one or more user interactive events to perform from a multitude of potential user interactive events, as well as data to be used in performing the one or more user interactive events.
- the process of blocks 408 , 410 , 412 , and 414 may be repeated for each identified on-screen event until the playback is terminated.
- FIG. 5 illustrates an example operation of the multimedia system 100 in a context whereby the multimedia device 102 implements an on-screen display (OSD) application 502 that provides a dynamically updated OSD overlay 504 that is synchronized to on-screen events in the playback of the multimedia content.
- the OSD overlay 504 is displayed at the top border of the display device 104 concurrent with the display of video content 506 of a playback 501 of a soccer game.
- the metadata 152 is organized by the user interaction controller 130 into a table 552 or other data structure that facilitates efficient indexing of the interactive event information of the metadata 152 based on the timing information of the metadata 152 .
- playback 501 of the soccer game is initiated. Using the time point +0 at the initiation of the playback, the user interaction controller 130 indexes the first entry of the table 552 to obtain interactive event information indicating that the current score is 0-0. In response, the OSD application 502 formats the OSD overlay 504 to display a box score of 0-0. As illustrated in the bottom half of FIG. 5 , playback 501 of the soccer game progresses to time point +18.0, at which point the on-screen event 511 of Team A scoring a goal occurs. Using this time point, the user interaction controller 130 indexes the second entry of the table 552 to obtain interactive event information indicating that Team A has scored. In response, the OSD application 502 formats the OSD overlay 504 to display a box score of 1-0. This process may be repeated for the score update at time point +33.7 corresponding to the on-screen event 512 of Team B scoring a goal.
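The walkthrough above can be condensed into a short sketch. The table rows mirror the three score states described for FIG. 5 (the 1-1 score at +33.7 is inferred from the text), and the linear scan stands in for whatever indexing table 552 actually uses:

```python
# Rows of a table like table 552: (time_point, box_score), per the FIG. 5
# walkthrough (0-0 at kickoff, 1-0 at +18.0, 1-1 assumed at +33.7).
TABLE_552 = [(0.0, "0-0"), (18.0, "1-0"), (33.7, "1-1")]


def osd_box_score(table, playback_time):
    """Return the box score the OSD overlay 504 should show at the given
    point in the playback 501 (assumes an entry at time point +0)."""
    score = table[0][1]
    for time_point, box_score in table:
        if time_point <= playback_time:
            score = box_score
    return score
```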
- the OSD application 502 can be used to provide other information in addition to, or instead of, box scores for corresponding multimedia content.
- the OSD overlay can be used to provide fantasy sports scoring and player statistics updated in synchronization with on-screen events during playback of a sports game.
- FIG. 6 illustrates an example operation of the multimedia system 100 in a context whereby the multimedia device 102 implements a video game application 602 that provides a video game experience with gaming situations and other circumstances synchronized to on-screen events in the playback of the multimedia content.
- the display area of the display device 104 is split so as to simultaneously display video content 604 from the video game application 602 and video content 606 from a playback 601 of an automotive race.
- the user is pitted against a well-known professional racer that participated in the actual automotive race being played back by the multimedia device 102 .
- the metadata 152 includes game context information representative of dynamic changes in the circumstances of the automotive race in general such as air temperature, wind speed and direction, and timings of yellow flags, as well as game context information directly related to the professional driver.
- game context information can include the fuel status of the professional driver's race car at given points in time, the time at which the professional driver completes each lap, the times during which the professional driver is in a pit stop, and the like.
- the metadata 152 is organized by the user interaction controller 130 into a table 652 or other data structure that facilitates efficient indexing of the interactive event information of the metadata 152 based on the timing information of the metadata 152 .
- playback 601 of the automotive race has already initiated and progressed to the time point +35.0, at which point an on-screen event 611 of the professional driver (Driver A) starting the twenty-second lap occurs in the video content 606 of the playback 601 of the actual race.
- the video game application 602 indexes the second entry of the table 652 to obtain interactive event information indicating that the professional driver has started the twenty-second lap and that the wind is coming from the northwest (NW) at 8 miles-per-hour (mph).
- the video game application 602 presents video content 604 emulating the user's race car (Driver B) negotiating the twentieth lap of the race based on user driving control input and whereby the professional driver's lap information is presented in an OSD overlay in the video content 606 . Further, the video game application 602 implements the game context information to affect or otherwise control gaming actions provided by the video game application 602 , such as using the wind data so as to affect the maximum speed of the user's race car in the video game experience at the current time point.
- the playback 601 progresses, as does the emulation of the conditions of the race by the video game application 602 for the user. As illustrated in the bottom half of FIG. 6 , playback 601 of the race reaches time point +53.2, at which point an on-screen event 612 of a yellow flag on the course occurs in the video content 606 of the playback 601 of the actual race. Using this time point, the user interaction controller 130 indexes the third entry of the table 652 to obtain interactive event information indicating that the professional driver has started lap 46 , that the wind conditions have changed to 10 mph out of the N, and that there is a yellow flag on the course.
- the video game application 602 configures the video content 604 emulating the user's race car at this point, whereby gaming actions taken by the user with respect to the user's race car are reconfigured so as to prevent the user's race car from exceeding a certain speed in view of the yellow flag and the user's race car is affected by the new wind direction and speed, and further whereby the video game application 602 provides a yellow flag icon 608 for display in the video content 604 so as to indicate that a yellow flag is on the course in accordance with the actual car race.
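The way game context information might feed into gaming actions can be sketched as below. The numeric model (headwind penalty, an 80 mph yellow-flag cap) is invented purely for illustration; the disclosure states only that wind data affects the user's maximum speed and that a yellow flag prevents the user's car from exceeding a certain speed.

```python
def apply_game_context(base_max_speed_mph: float, context: dict) -> float:
    """Fold interactive event information from a table like table 652 into
    the user's maximum speed for the current time point."""
    max_speed = base_max_speed_mph
    # Toy wind model: each mph of headwind trims 1 mph of top speed.
    if context.get("wind_dir") in ("N", "NW", "NE"):  # assumed headwind sectors
        max_speed -= context.get("wind_mph", 0.0)
    # Under a yellow flag, cap the car at an assumed pace-car speed.
    if context.get("yellow_flag"):
        max_speed = min(max_speed, 80.0)
    return max_speed
```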
- the video game application is not limited to this example.
- the video game application could be used to provide the user with an interactive game experience whereby the user is permitted to take game actions so as to play along with, or compete directly with, a professional golfer in a televised golf tournament.
- the video game application could emulate the same conditions encountered by the professional golfer during the tournament.
- the video game application could include a musical play-along video game whereby the user's emulated playback of the notes of a song are synchronized to the notes of the same song played by a band during a playback of the band's video recorded concert.
- Other such synchronized video game experiences can be implemented using the guidelines provided herein without departing from the scope of the present disclosure.
- FIG. 7 shows an illustrative embodiment of a general computer system 700 in accordance with at least one embodiment of the present disclosure.
- the computer system 700 can include a set of instructions that can be executed to cause the computer system 700 to perform any one or more of the methods or computer based functions disclosed herein.
- the computer system 700 may operate as a standalone device or may be connected via a network to other computer systems or peripheral devices.
- the computer system may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment.
- the computer system 700 can also be implemented as, or incorporated into, for example, a set-top box (STB) device.
- the computer system 700 can be implemented using electronic devices that provide voice, video or data communication.
- the term “system” shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.
- the computer system 700 may include a processor 702 , such as a central processing unit (CPU), a graphics processing unit (GPU), or both. Moreover, the computer system 700 can include a main memory 704 and a static memory 706 that can communicate with each other via a bus 708 . As shown, the computer system 700 may further include a video display unit 710 , such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, or a cathode ray tube (CRT). Additionally, the computer system 700 may include an input device 712 , such as a keyboard, and a cursor control device 714 , such as a mouse. The computer system 700 can also include a disk drive unit 716 , a signal generation device 718 , such as a speaker or remote control, and a network interface device 720 .
- the disk drive unit 716 may include a computer-readable medium 722 in which one or more sets of instructions 724 , e.g. software, can be embedded. Further, the instructions 724 may embody one or more of the methods or logic as described herein. In a particular embodiment, the instructions 724 may reside completely, or at least partially, within the main memory 704 , the static memory 706 , and/or within the processor 702 during execution by the computer system 700 . The main memory 704 and the processor 702 also may include computer-readable media.
- the network interface device 720 can provide connectivity to a network 726 , such as a wide area network (WAN), a local area network (LAN), or other network.
- dedicated hardware implementations such as application specific integrated circuits, programmable logic arrays and other hardware devices can be constructed to implement one or more of the methods described herein.
- Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems.
- One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
- the methods described herein may be implemented by software programs executable by a computer system.
- implementations can include distributed processing, component/object distributed processing, and parallel processing.
- virtual computer system processing can be constructed to implement one or more of the methods or functionality as described herein.
- the present disclosure contemplates a computer-readable medium that includes instructions or receives and executes instructions responsive to a propagated signal, so that a device connected to a network can communicate voice, video or data over the network 726 . Further, the instructions 724 may be transmitted or received over the network 726 via the network interface device 720 .
- While the computer-readable medium is shown to be a single medium, the term “computer-readable medium” includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions.
- the term “computer-readable medium” shall also include any medium that is capable of storing a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or operations disclosed herein.
- the computer-readable medium can include a solid-state memory such as a memory card or other package that houses one or more non-volatile read-only memories. Further, the computer-readable medium can be a random access memory or other volatile re-writeable memory. Additionally, the computer-readable medium can include a magneto-optical or optical medium, such as a disk or tape, or other storage device to capture carrier wave signals such as a signal communicated over a transmission medium. A digital file attachment to an e-mail or other self-contained information archive or set of archives may be considered a distribution medium that is equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a computer-readable medium or a distribution medium and other equivalents and successor media, in which data or instructions may be stored.
Description
- This application is a continuation of and claims priority to U.S. application Ser. No. 12/575,017, filed Oct. 7, 2009, which is incorporated herein by reference in its entirety.
- The present disclosure generally relates to multimedia content distribution, and relates more particularly to providing user interactive events during playback with multimedia content.
- Service providers can offer users a variety of viewing options for different multimedia programs. For example, the service providers can supply users with real-time television programs that are typically available for the users to watch only at a specific date and time. The service providers can also offer the users on-demand multimedia content that is available for an extended amount of time and that is provided to the users upon request.
- It will be appreciated that for simplicity and clarity of illustration, elements illustrated in the Figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements are exaggerated relative to other elements. Embodiments incorporating teachings of the present disclosure are shown and described with respect to the drawings presented herein, in which:
- FIG. 1 is a diagram illustrating a multimedia system providing synchronization between on-screen events and corresponding user interaction events in accordance with at least one embodiment of the present disclosure;
- FIG. 2 is a flow diagram illustrating a method for generating metadata in association with multimedia content in accordance with at least one embodiment of the present disclosure;
- FIG. 3 is a diagram illustrating an example implementation of the method of FIG. 2 in accordance with at least one embodiment of the present disclosure;
- FIG. 4 is a flow diagram illustrating a method for synchronizing user interaction events with corresponding on-screen events during playback of multimedia content by an end-user multimedia device in accordance with at least one embodiment of the present disclosure;
- FIG. 5 is a diagram illustrating an example operation of the multimedia system of FIG. 1 in an on-screen display overlay context in accordance with at least one embodiment of the present disclosure;
- FIG. 6 is a diagram illustrating an example operation of the multimedia system of FIG. 1 in a video game context in accordance with at least one embodiment of the present disclosure; and
- FIG. 7 is a diagram illustrating an example computer system for implementing one or more of the components or techniques described herein in accordance with at least one embodiment of the present disclosure.
- The use of the same reference symbols in different drawings indicates similar or identical items.
- The numerous innovative teachings of the present application will be described with particular reference to the presently preferred example embodiments. However, it should be understood that this class of embodiments provides only a few examples of the many advantageous uses of the innovative teachings herein. In general, statements made in the specification of the present application do not necessarily delimit any of the various claimed inventions. Moreover, some statements may apply to some inventive features but not to others.
- For ease of discussion, the techniques of the present disclosure are illustrated in the example context of a packet-based network architecture (such as the Internet or a private Internet Protocol (IP)-based network) utilized to convey information between end-user devices, content sources, media servers, and other components of the network. However, these techniques are not limited to this example, but instead can be implemented in any of a variety of networks configured to support the transmission of multimedia content and other information using the guidelines provided herein without departing from the scope of the present disclosure. Likewise, the techniques of the present disclosure are described with reference to a single end-user multimedia device. However, it will be appreciated that a telecommunications network may support a number of end-user multimedia devices and thus the described techniques can be employed in parallel for some or all of the end-user multimedia devices within the telecommunications network.
- FIG. 1 illustrates an example multimedia system 100 configured to utilize a data stream representing multimedia content and metadata associated with the multimedia content to provide users with an interactive experience that is more fully synchronized to on-screen events of interest in accordance with at least one embodiment of the present disclosure. The multimedia system 100 includes an end-user multimedia device 102 and a display device 104 at a user's premises. The multimedia device 102 can include, for example, a set top box (STB) device, a digital video recorder (DVR) device, a video game console, a portable multimedia device (such as a multimedia-enabled cellular phone, a digital radio receiver, or a handheld video game console), and the like. The multimedia system 100 further includes a multimedia server 106 and a metadata server 108 connected to the multimedia device 102 via networks 110 and 112 , respectively. In the example of FIG. 1 , the multimedia server 106 and the metadata server 108 are provided in association with the same service provider 114 , such as a cable or satellite television provider. As such, the networks 110 and 112 serve as the delivery path from the service provider 114 to the multimedia device. Alternately, the multimedia server 106 and the metadata server 108 may be maintained and operated by separate service providers via the same or different networks.
multimedia device 102 includes anetwork interface 122 to thenetwork 110, anetwork interface 124 to thenetwork 112, adata storage component 126, adisplay controller 128, auser interaction controller 130, auser control interface 132, and auser input device 134. Thenetwork interfaces network 110 and thenetwork 112 are the same network, thenetwork interfaces data storage component 126 can include any of a variety of storage components, such as a hard drive, an optical drive, a random access memory (RAM), flash memory, a cache, and the like. - The
display controller 128 is configured to access a data stream stored or buffered in thedata storage component 126 to generate a corresponding audio/video signaling 136 representative of a playback of the multimedia content of the data stream. Accordingly, thedisplay controller 128 therefore can implement a decoder (such as a digital television (DTV) decoder, an H.264 decoder, or an MPEG decoder), a video display driver, and other such components utilized in processing data streams to generate corresponding audio/video signaling 136. - The
user interaction controller 130 is configured to access metadata stored or buffered in the data storage component 126 (or an alternate storage component) to identify interactive event information associated with corresponding time points in multimedia content being presented by thedisplay controller 128, and to perform one or more user interactive events associated with the identified interactive event information. In performing these user interactive events, theuser interaction controller 130 may seek input from the user via theuser input device 134. In a context where themultimedia device 102 includes a set-top box or DVR, theuser input device 134 can include, for example, a remote control device or a keyboard device. In a context where themultimedia device 102 includes a video gaming console, theuser input device 134 can include, for example, a video game controller or handset (that is, a “joystick”). In one embodiment, theuser interaction controller 130 is implemented at least in part as one ormore processors 138 that execute asoftware program 140, whereby thesoftware program 140 includes a set of executable instructions that manipulate the one ormore processors 138 to perform at least a subset of the functionality of theuser interaction controller 130 described herein. Thesoftware program 140 can be stored locally at themultimedia device 102, such as in thedata storage component 126 or other data storage component. In another embodiment, thesoftware program 140 is stored inportable storage medium 142, and the one ormore processors 138 access thesoftware program 140 from theportable medium 142 via aportable storage interface 142. To illustrate, in a context where themultimedia device 102 is implemented as a video gaming console, thesoftware program 140 can implement a video game application and theportable medium 142 can include an optical disc, a Universal Serial Bus (USB)-based flash memory, or a video game cartridge. - In operation, the
multimedia system 100 provides adata stream 150 andmetadata 152 to themultimedia device 102. Thedata stream 150 preferably includes data representative of multimedia content to be played back by themultimedia device 102 via thedisplay device 104. Themetadata 152 preferably includes interactive event information corresponding to certain on-screen events of interest in the multimedia content, whereby themultimedia device 102 uses this interactive event information to perform user interactive events in conjunction with the on-screen events. In order to permit the user interactive events to be synchronized to the occurrences of the corresponding on-screen events during the playback of the multimedia content, thedata stream 150 and the metadata each includes timing information that is used by themultimedia device 102 to identify the approach of an on-screen event as the playback of the multimedia content progresses and then identify a user interactive event to be performed concurrent with the on-screen event based on corresponding interactive event information in themetadata 152. - In order to provide the
data stream 150 to themultimedia device 102, themultimedia system 100 stores multimedia data representative of the multimedia content in astorage component 156. Upon some triggering event, themultimedia server 106 accesses the multimedia data from thestorage component 156 and encodes it for transmission or provision to themultimedia device 102 as thedata stream 150. As part of this encoding process, themultimedia server 106 inserts timing information into thedata stream 150 based on timing indicators from atiming source 158, whereby the inserted timing information permits themultimedia device 102 to identify a relative time of playback for each corresponding portion of multimedia content. In at least one embodiment, this timing information is inserted into thedata stream 150 in the form of time stamps interspersed substantially equally throughout thedata stream 150, whereby the time stamps may be inserted into the header of packets of thedata stream 150, as separate time stamp packets, and the like. Further, in one embodiment, the time indicators in the time stamps can be based on a well-known timing source, such as Coordinated Universal Time (also commonly referred to as Temps Universel Coordonne). The frequency at which the time stamps are inserted typically is sufficient to ensure that synchronization can be maintained within, for example, a few seconds of the beginning of a playback of the multimedia content, a tuning event, a trickplay event such as pause/resume/fast-forward/reverse, or the like. To illustrate, a time stamp may be inserted into thedata stream 150 such that a new time stamp is encountered every few seconds in the processing of thedata stream 150 at themultimedia device 102 for playback of the corresponding multimedia content. Further, the time indicators represented by the time stamps can include either absolute time indicators or relative time indicators that reference an elapsed time from a start of the playback. 
The implementation of the timing information in the data stream 150 is described in greater detail herein with reference to FIGS. 2-6. - The form and content of the
metadata 152 are based on the relevant aspects, situations, events, and other circumstances of the multimedia content represented in the data stream 150. To illustrate, when the multimedia content represents a televised sports game, the interactive event information of the metadata 152 can include, for example, scoring updates and the relative times at which each corresponding change in the score occurred. The interactive event information further can include other dynamic information for the sports game, such as the number of strikes and balls at any given time for a televised baseball game, the current down and the number of yards to go for a televised football game, and the like. As described in greater detail below, the multimedia device 102 can implement an on-screen display (OSD) that overlies the video content of the sports game as it is being displayed on the display device 104, whereby the on-screen display provides the current score and other statistics regarding the sports game as it progresses. In this implementation, the multimedia device 102 can utilize the interactive event information of the metadata 152 to modify the on-screen display as the playback of the sports game progresses to reflect the changes in the score and other statistics as they occur in the playback of the sports game. As another example, when the multimedia content represents a televised concert, the interactive event information of the metadata 152 can include, for example, the notes played on the instrument of a particular band member and the timing of each note. As described in greater detail below, the multimedia device 102 can implement a video game application executed concurrent with the playback of the multimedia content.
In this implementation, the video game application can include a musical band video game, such as one similar to the Guitar Hero™ video game or the Rock Band™ video game, and the multimedia device 102 can use the note information to permit a user to attempt to “play” the same song with the same notes through the video game application. - This concurrent implementation of the user interactive events with the corresponding on-screen events in the playback of the multimedia content benefits from synchronization between the timing of the on-screen events and the interactive event information used to implement the corresponding user interactive events. Accordingly, in providing the
metadata 152, the metadata server 108 identifies an on-screen event of interest in the multimedia content and identifies the timing information associated with the on-screen event. In one embodiment, the timing information associated with the on-screen event is obtained by the metadata server 108 from encoding information 161 provided by the multimedia server 106 during the final encoding process of the data stream 150, in which the multimedia server 106 inserts the timing information for the data stream 150. For an identified on-screen event of interest, the metadata server 108 then accesses a storage component 158 to obtain interactive event information relevant to the on-screen event and combines or otherwise associates the obtained interactive event information with a corresponding timing indicator synchronized to the timing of the data stream 150 to form a corresponding data packet or other data structure of the metadata 152. The implementation of the timing information in the metadata 152 is described in greater detail herein with reference to FIGS. 2-6. - The interactive event information associated with multimedia content can be obtained in any of a variety of ways. To illustrate, one or more instruments or other input devices can be used to obtain information regarding the circumstances of the activity as it is being recorded in real time. For example, assume an automotive race is being filmed as the multimedia content. During the filming, instruments and other devices, such as wind gauges, fuel monitors, lap count monitors, and the like, may take periodic measurements as the race progresses. The measurements from these instruments and devices, along with timing information associated with each measurement, then may be stored in the
storage component 158 in association with the corresponding multimedia data representing the recorded automotive race. In other instances, a technician or an automated process may review the recorded content after recording has completed and then generate the corresponding interactive event information based on the review. To illustrate, assume a golf game has been filmed and recorded. In this case, a technician can review a playback of the golf game and use this playback, along with information accessed from other sources, to generate a data file containing, for example: the sequence in which the golf players played each hole; the number of strokes each golf player took for each hole; the distance, direction, and landing point of each stroke and the time at which each stroke occurred; the humidity and other weather conditions at the golf course at various points throughout the tournament; the length of the grass at each fairway and green; and the like. The data representative of this information then may be stored in the storage component 158 in association with the corresponding multimedia data representing the recorded golf game. - It will be appreciated that the raw timing information associated with the interactive event information may not be synchronized to the
timing source 158. To illustrate, measurements, scores, or other quantifiers may have been stored with timing references indicative of an elapsed time since the measuring device was powered up or otherwise initiated, or the timing references may be tied to a local time source, whereas the timing information implemented in the data stream 150 may be directly tied to an absolute time reference, such as Coordinated Universal Time. Accordingly, in generating the metadata 152 from the stored interactive event information, the metadata server 108 may reformat or adjust the timing information of the interactive event information so that it relates to the same timing reference as the data stream 150, rather than to another timing source. To illustrate, if the timing information of the interactive event information is configured relative to elapsed times from some particular starting point, the metadata server 108 may determine the point in Coordinated Universal Time that corresponds to that starting point and then reformat the successive time indicators relative to this point in Coordinated Universal Time. - After receiving the
data stream 150, the multimedia device 102 can initiate a playback of the multimedia content represented by the data stream 150 in response to, for example, user input requesting the playback. In response to such user input, the display controller 128 accesses the data stream 150 and processes the data stream 150 for playback of the multimedia content. The processing by the display controller 128 can include, for example, decoding and decrypting encoded video and audio information. As the playback progresses, the display controller 128 can provide progress information 160 indicating the current position or progress of the playback of the multimedia content to the user interaction controller 130. This progress information 160 can include, for example, the timing indicators included in the time stamps embedded in the data stream 150. Thus, as each time stamp is encountered by the display controller 128 during playback of the multimedia content of the data stream 150, the display controller 128 can output the timing indicator of the time stamp, or a representation thereof, to the user interaction controller 130. - The
user interaction controller 130 uses the progress information 160 to determine whether the playback of the multimedia content by the display controller 128 is approaching an identified on-screen event of interest. When an identified on-screen event of interest is approaching or reached in the playback, the user interaction controller 130 accesses the subset of the stored interactive event information that is associated with the identified on-screen event and performs one or more user interactive events based on the accessed interactive event information. The user interactive events can include or result in video content 162 to be displayed at the display device 104 in association with the identified on-screen event. To illustrate, if the multimedia content represents a sports game and the on-screen event is a goal scored by one team at a particular point in time of the playback, the user interactive event can include, for example, an update to a displayed score of an OSD provided by the user interaction controller 130 at or following the particular point in time of the playback. The user interactive events also can include or result in the playback of a video game experience that emulates a situation or other conditions present in the playback of the multimedia content at that particular point in time. To illustrate, the multimedia content may represent a band playing a particular song, and the multimedia device 102 may implement a musical play-along game application whereby the user is presented the opportunity in the game application to manipulate the user input device 134 to emulate a note played by a guitarist in the band at the same time that the guitarist plays the note in the playback of the concert recording at the multimedia device 102.
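The progress-driven behavior described above can be sketched as a small dispatch loop; the names and data shapes below are illustrative assumptions, not the actual controller interfaces.

```python
def dispatch_events(progress_stream, event_table, perform):
    """progress_stream: timing indicators reported as time stamps are encountered.
    event_table: (time_point, interactive_event_info) pairs, sorted by time.
    perform: callback that carries out a user interactive event."""
    pending = list(event_table)
    for now in progress_stream:
        # Fire every interactive event whose time point has been reached.
        while pending and pending[0][0] <= now:
            _, info = pending.pop(0)
            perform(info)

fired = []
dispatch_events([0, 5, 10, 15, 20],
                [(2.4, "score 1-0"), (15.2, "score 1-1")],
                fired.append)
```

Because the stamps arrive only every few units, an event fires at the first stamp at or after its time point; a finer stamp interval tightens the synchronization correspondingly.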
This process of progressing through the playback of the multimedia content, identifying on-screen events of interest, and then performing corresponding user interactive events concurrent with the occurrence of the on-screen events in the playback can continue until the playback is completed. - In order to implement the above-described process, the
multimedia device 102 may need access to at least a current portion of the data stream 150 and the relevant portion of the metadata 152. The data stream 150 may be provided to the multimedia device 102 in a variety of manners. The data stream 150 may be transmitted from the multimedia server 106 to the multimedia device 102 for a “live” or real-time playback as the recorded event occurs (with some degree of time-shifting to allow for the above-described encoding, processing, and buffering of the multimedia data). Alternately, the data stream 150 may be provided as a pre-recorded broadcast from the multimedia server 106 to the multimedia device 102, whereby the multimedia device 102 initiates playback of the multimedia content of the data stream 150 as soon as the data stream 150 is received. The data stream 150 also may be provided from the multimedia server 106 to the multimedia device 102 as an on-demand transmission, such as a video-on-demand (VoD) transmission. Alternately, the data stream 150 may be transmitted to the multimedia device 102, whereupon the multimedia device 102 operates as a digital video recorder (DVR) and stores the data stream 150 for subsequent playback at a later time. - In one embodiment, the
metadata 152 is provided to the multimedia device 102 in parallel with the data stream 150. For example, the metadata server 108 may provide the metadata 152 to the multimedia server 106, which then embeds the metadata 152 into the data stream 150. However, while at least a portion of the metadata 152 is needed by the multimedia device 102 to implement user interactive events synchronized to corresponding on-screen events, the metadata 152 need not be provided to the multimedia device 102 in parallel with the provision of the data stream 150. To illustrate, the multimedia device 102 may receive and store the data stream 150. In response to storing the data stream 150, or in response to initiating playback of the data stream 150, the multimedia device 102 can then fetch the metadata 152 from the metadata server 108 using, for example, the File Transfer Protocol (FTP) or the Hypertext Transfer Protocol (HTTP) and store the fetched metadata 152 at the storage component 126. A pointer or other reference to the particular location of the metadata 152 can be implemented in the data stream 150. To illustrate, the data stream 150 can include an IP address of an FTP server or an HTTP server and an identifier of the file at the FTP server or HTTP server that stores the metadata 152. In response to accessing the data stream 150 to begin processing the data stream 150 for playback, the user interaction controller 130 or other component of the multimedia device 102 can access the IP address and the identifier of the file and initiate a transfer of the file from the server at the identified IP address. - In another embodiment, rather than receiving the
metadata 152 at the multimedia device 102 via a network (such as by receiving the metadata 152 as part of a broadcast or multicast or by obtaining the metadata 152 as downloadable content from an FTP server or other type of content server), the metadata 152 is provided to the multimedia device 102 via a portable storage medium, such as an optical disc, a removable flash drive, or a video game cartridge. As discussed above, the multimedia device 102 can include a video game console that implements a video game application represented by a software program 140 stored on a portable storage medium 142. In this instance, the portable storage medium 142 also may store the metadata 152, which is then accessed by the multimedia device 102 via the portable storage interface 144. Likewise, instead of providing the data stream 150 to the multimedia device 102 via a network, the data stream 150 instead can be provided via a portable storage medium, either together with the metadata 152 (and the software program 140) or as a separate portable storage medium. To illustrate, an optical disc may be provided for use by the multimedia device 102, whereby the optical disc includes the data stream 150 representing a sports game for playback by the display controller 128, and the software program 140 and metadata 152 for execution by the user interaction controller 130 to provide a video game application that emulates the dynamic conditions of the sports game as the playback of the sports game progresses. -
FIG. 2 illustrates an example method 200 for generating the data stream 150 and the metadata 152 of the multimedia system 100 in accordance with at least one embodiment of the present disclosure. At block 202, the multimedia server 106 accesses multimedia data from the storage device 156 and encodes the multimedia data with timing information based on the timing source 158. As discussed above, this encoding process can include inserting time stamps into the multimedia data at periodic intervals, where the time stamps may include time indicators related to Coordinated Universal Time. At block 204, on-screen events of interest in the multimedia content and their corresponding time points are identified. This identification of the on-screen events may be performed by a technician reviewing a playback of the multimedia content, by an automated process, or any combination thereof. At block 206, interactive event information for each on-screen event of interest is determined, and the interactive event information is then included in the metadata 152 along with an associated time point indicator. The process of blocks 204 and 206 may be performed concurrent with, or subsequent to, the encoding process of block 202. -
FIG. 3 illustrates an example of the metadata 152 generated in accordance with the method 200 of FIG. 2. The bar 302 of FIG. 3 represents the playback sequence of the multimedia content of the data stream 150. In the depicted example, the data stream 150 includes time stamps periodically dispersed within the playback sequence, beginning with a starting time indicator of +0 and continuing in 5-unit intervals (+5, +10, +15, and so forth). Identified in the illustrated data stream 150 are three on-screen events: on-screen event 311 at time point +2.4; on-screen event 312 at time point +15.2; and on-screen event 313 at time point +27.5. The data stream 150 is provided to the metadata server 108 for processing. In response to the identified on-screen event 311, the metadata server 108 accesses the storage device 158 to determine the corresponding interactive event information, and then the metadata server 108 generates a metadata packet 320 for the metadata 152. The metadata packet 320 includes a time stamp field 322 and a data field 324. The time stamp field 322 stores a time indicator representative of the time point +2.4, and the data field 324 stores the interactive event information accessed in relation to the on-screen event 311. This process is repeated for the on-screen events 312 and 313 to generate corresponding metadata packets of the metadata 152. -
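The packet generation of FIG. 3 can be sketched as follows; the field types are assumptions chosen for illustration.

```python
from dataclasses import dataclass

@dataclass
class MetadataPacket:
    time_stamp: float  # time stamp field, e.g. +2.4
    data: str          # data field holding the interactive event information

def build_metadata(on_screen_events):
    """on_screen_events: (time_point, interactive_event_info) pairs."""
    return [MetadataPacket(t, info) for t, info in on_screen_events]

# The three on-screen events of the depicted example:
packets = build_metadata([(2.4, "event 311 info"),
                          (15.2, "event 312 info"),
                          (27.5, "event 313 info")])
```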
FIG. 4 illustrates an example method 400 for synchronizing user interactive events with corresponding on-screen events during a playback of multimedia content at the multimedia device 102 in accordance with at least one embodiment of the present disclosure. At block 402, the multimedia device 102 receives the data stream 150. As noted above, the data stream 150 can be transmitted from the multimedia server 106 to the multimedia device 102 as a live or real-time broadcast, as a pre-recorded broadcast, as a VoD transmission, as a prerecorded playback, and the like. Alternately, the data stream 150 can be made accessible to the multimedia device 102 via a portable storage medium, such as an optical disc or a flash memory device. As also noted above, the data stream 150 includes timing information associated with the multimedia content. - At
block 404, the multimedia device 102 initiates one or more user interactive applications associated with the multimedia content of the data stream 150. The user interactive applications can include, for example, a video game application for providing a video game experience that emulates the circumstances present in the multimedia content. As another example, the user interactive applications can include an on-screen display application to provide an on-screen display that overlies the display of the multimedia content, whereby the on-screen display can include dynamically updated scores, statistics, quantifiers, and other information pertaining to the displayed multimedia content. The user interactive applications can be implemented solely in hardware, such as an application-specific integrated circuit (ASIC) or a state machine, as one or more software programs 140 executed by one or more processors 138, or as combinations thereof. The initiation of the one or more user interactive applications can occur prior to, concurrent with, or subsequent to the receipt of the data stream 150. - At
block 406, the multimedia device 102 receives the metadata 152 associated with the data stream 150. The metadata 152 may be received at the multimedia device 102 prior to, concurrent with, or subsequent to receipt of the data stream 150. The metadata 152 can be embedded in the data stream 150 or provided to the multimedia device 102 via a separate channel. Alternately, the metadata 152 can be made available to the multimedia device 102 via a portable storage medium, which may be the same portable storage medium used to provide at least one of a software program representative of the user interactive application or the data stream 150. As another example, the metadata 152 can be obtained by the multimedia device 102 as downloadable content downloaded via a network from an FTP server, a web server, or other content server. - At
block 408, the multimedia device 102 receives user input via the user input device 134 instructing the multimedia device 102 to initiate playback of the multimedia content at the display device 104. In response, the display controller 128 accesses the data stream 150 from the storage device 126 and begins processing the accessed data stream 150 to generate corresponding video and audio signaling for the display device 104. As the processing of the data stream 150 progresses, the display controller 128 periodically encounters the time stamps or other time indicators interspersed within the data stream 150. As each time stamp is encountered, the display controller 128 provides a value representative of the time stamp to the user interaction controller 130 as part of the progress information 160 so as to inform the user interaction controller 130 of the current position of the playback of the multimedia content. - At
block 410, the user interaction controller 130 uses the progress information 160 to determine whether the playback of the multimedia content has progressed to the next on-screen event of interest. When an on-screen event is reached, at block 412 the user interaction controller 130 accesses the storage device 126 to obtain the subset of interactive event information associated with the on-screen event. As discussed above, the interactive event information stored in the storage device 126 includes timing information that correlates certain time points to corresponding subsets of interactive event information, and the user interaction controller 130 therefore can index the corresponding subset of interactive event information based on this timing information and the timing information indicated by the progress information 160 received from the display controller 128 during playback. - At
block 414, the user interaction controller 130 uses the accessed subset of interactive event information to identify one or more user interactive events and then performs the one or more user interactive events concurrent with the playback of the corresponding on-screen event. In some instances, the user interactive event to be performed at any given on-screen event may be predetermined, and thus the interactive event information would include only the information to be used in performing the user interactive event. To illustrate, the user interaction application initiated at block 404 may include an on-screen display application that provides only box scores for the playback of a corresponding sports game. In this context, the only on-screen events of interest may be changes in the score of the game, and thus the OSD application may be configured to dynamically update the displayed scores using score update information included in the metadata 152. In other instances, the interactive event information associated with an on-screen event may include an indicator of one or more user interactive events to perform from a multitude of potential user interactive events, as well as data to be used in performing the one or more user interactive events. The process of blocks 410, 412, and 414 can be repeated for each successive on-screen event of interest until the playback is completed. -
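The two cases above — a predetermined user interactive event versus one named by an indicator in the metadata — can be sketched as a small dispatch table; the handler names and field layout are assumptions for illustration.

```python
def update_score(data):
    """Predetermined event: update a displayed box score."""
    return "box score: " + data

def show_icon(data):
    """Alternative event selected by an indicator in the metadata."""
    return "icon: " + data

HANDLERS = {"score": update_score, "icon": show_icon}

def perform_interactive_event(info, default="score"):
    """info: interactive event information; an optional 'event' indicator selects
    the user interactive event, otherwise the predetermined default is used."""
    kind = info.get("event", default)
    return HANDLERS[kind](info["data"])
```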
FIG. 5 illustrates an example operation of the multimedia system 100 in a context whereby the multimedia device 102 implements an on-screen display (OSD) application 502 that provides a dynamically updated OSD overlay 504 that is synchronized to on-screen events in the playback of the multimedia content. In the particular example of FIG. 5, the OSD overlay 504 is displayed at the top border of the display device 104 concurrent with the display of video content 506 of a playback 501 of a soccer game. Further, in the illustrated example, the metadata 152 is organized by the user interaction controller 130 into a table 552 or other data structure that facilitates efficient indexing of the interactive event information of the metadata 152 based on the timing information of the metadata 152. - As illustrated in the top half of
FIG. 5, playback 501 of the soccer game is initiated. Using the time point +0 at the initiation of the playback, the user interaction controller 130 indexes the first entry of the table 552 to obtain interactive event information indicating that the current score is 0-0. In response, the OSD application 502 formats the OSD overlay 504 to display a box score of 0-0. As illustrated in the bottom half of FIG. 5, playback 501 of the soccer game progresses to time point +18.0, at which point the on-screen event 511 of Team A scoring a goal occurs. Using this time point, the user interaction controller 130 indexes the second entry of the table 552 to obtain interactive event information indicating that Team A has scored. In response, the OSD application 502 formats the OSD overlay 504 to display a box score of 1-0. This process may be repeated for the score update at time point +33.7 corresponding to the on-screen event 512 of Team B scoring a goal. - The
OSD application 502 can be used to provide other information in addition to, or instead of, box scores for corresponding multimedia content. To illustrate, the OSD overlay can be used to provide fantasy sports scoring and player statistics updated in synchronization with on-screen events during playback of a sports game. -
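The indexing of a time-ordered table such as table 552 can be sketched with a standard binary search; the table contents mirror the soccer game example, and the layout is an assumed illustration.

```python
import bisect

def lookup(table, position):
    """table: (time_point, interactive_event_info) pairs sorted by time point.
    Returns the information in effect at the given playback position."""
    times = [t for t, _ in table]
    i = bisect.bisect_right(times, position) - 1
    return table[i][1] if i >= 0 else None

table_552 = [(0.0, "score 0-0"), (18.0, "score 1-0"), (33.7, "score 1-1")]
```

A lookup at any playback position returns the most recent score update, which is what the OSD overlay needs after a trickplay jump to an arbitrary time point.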
FIG. 6 illustrates an example operation of the multimedia system 100 in a context whereby the multimedia device 102 implements a video game application 602 that provides a video game experience with gaming situations and other circumstances synchronized to on-screen events in the playback of the multimedia content. In the particular example of FIG. 6, the display area of the display device 104 is split so as to simultaneously display video content 604 from the video game application 602 and video content 606 from a playback 601 of an automotive race. Further, the user is pitted against a well-known professional racer who participated in the actual automotive race being played back by the multimedia device 102. As such, the metadata 152 includes game context information representative of dynamic changes in the circumstances of the automotive race in general, such as air temperature, wind speed and direction, and timings of yellow flags, as well as game context information directly related to the professional driver. Examples of such game context information include the fuel status of the professional driver's race car at given points in time, the time at which the professional driver completes each lap, the times during which the professional driver is in a pit stop, and the like. As with the previous example of FIG. 5, the metadata 152 is organized by the user interaction controller 130 into a table 652 or other data structure that facilitates efficient indexing of the interactive event information of the metadata 152 based on the timing information of the metadata 152. - As illustrated in the top half of
FIG. 6, playback 601 of the automotive race has already initiated and progressed to the time point +35.0, at which point an on-screen event 611 of the professional driver (Driver A) starting the twenty-second lap occurs in the video content 606 of the playback 601 of the actual race. Using this time point, the video game application 602 indexes the second entry of the table 652 to obtain interactive event information indicating that the professional driver has started the twenty-second lap and that the wind is coming from the northwest (NW) at 8 miles per hour (mph). The video game application 602 presents video content 604 emulating the user's race car (Driver B) negotiating the twentieth lap of the race based on user driving control input, whereby the professional driver's lap information is presented in an OSD overlay in the video content 606. Further, the video game application 602 implements the game context information to affect or otherwise control gaming actions provided by the video game application 602, such as using the wind data to affect the maximum speed of the user's race car in the video game experience at the current time point. - The
playback 601 progresses, as does the emulation of the conditions of the race by the video game application 602 for the user. As illustrated in the bottom half of FIG. 6, playback 601 of the race reaches time point +53.2, at which point an on-screen event 612 of a yellow flag on the course occurs in the video content 606 of the playback 601 of the actual race. Using this time point, the user interaction controller 130 indexes the third entry of the table 652 to obtain interactive event information indicating that the professional driver has started lap 46, that the wind conditions have changed to 10 mph out of the north, and that there is a yellow flag on the course. In response, the video game application 602 configures the video content 604 emulating the user's race car at this point, whereby gaming actions taken by the user with respect to the user's race car are reconfigured so as to prevent the user's race car from exceeding a certain speed in view of the yellow flag, the user's race car is affected by the new wind direction and speed, and the video game application 602 provides a yellow flag icon 608 for display in the video content 604 so as to indicate that a yellow flag is on the course in accordance with the actual car race. - Although one example of a video game application synchronized to the on-screen events of a simultaneously displayed playback of a sports game is illustrated, the video game application is not limited to this example. To illustrate, the video game application could be used to provide the user with an interactive game experience whereby the user is permitted to take game actions so as to play along with, or compete directly with, a professional golfer in a televised golf tournament. In this instance, the video game application could emulate the same conditions encountered by the professional golfer during the tournament.
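One way the game context information could gate gaming actions, as in the race example above, is sketched below; the base speed, wind penalty, and caution-speed cap are invented values for illustration, not figures from the disclosure.

```python
def max_speed_mph(base_mph, wind_mph, headwind=True, yellow_flag=False,
                  caution_cap_mph=80.0):
    """Speed limit applied to the user's race car at the current time point.
    Wind adjusts the attainable top speed; a yellow flag caps it outright."""
    speed = base_mph - wind_mph if headwind else base_mph + wind_mph
    return min(speed, caution_cap_mph) if yellow_flag else speed

lap_22 = max_speed_mph(200.0, 8.0)                     # NW wind at 8 mph
lap_46 = max_speed_mph(200.0, 10.0, yellow_flag=True)  # yellow flag on course
```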
As another example, the video game application could include a musical play-along video game whereby the user's emulated playback of the notes of a song is synchronized to the notes of the same song played by a band during a playback of the band's video-recorded concert. Other such synchronized video game experiences can be implemented using the guidelines provided herein without departing from the scope of the present disclosure.
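A play-along game of this kind could judge the user's input against the note timings carried in the metadata 152; the sketch below assumes a simple hit window, which is an invented tolerance rather than a value from the disclosure.

```python
def count_hits(metadata_notes, user_inputs, window=0.25):
    """metadata_notes: (time_point, note) pairs from the metadata.
    user_inputs: (time, note) pairs from the user input device.
    A hit is the correct note within `window` seconds of its time point."""
    hits = 0
    for t, note in metadata_notes:
        if any(note == un and abs(t - ut) <= window for ut, un in user_inputs):
            hits += 1
    return hits
```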
-
FIG. 7 shows an illustrative embodiment of a general computer system 700 in accordance with at least one embodiment of the present disclosure. The computer system 700 can include a set of instructions that can be executed to cause the computer system 700 to perform any one or more of the methods or computer-based functions disclosed herein. The computer system 700 may operate as a standalone device or may be connected via a network to other computer systems or peripheral devices. - In a networked deployment, the computer system may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The
computer system 700 can also be implemented as, or incorporated into, for example, a set-top box (STB) device. In a particular embodiment, the computer system 700 can be implemented using electronic devices that provide voice, video, or data communication. Further, while a single computer system 700 is illustrated, the term “system” shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions. - The
computer system 700 may include a processor 702, such as a central processing unit (CPU), a graphics processing unit (GPU), or both. Moreover, the computer system 700 can include a main memory 704 and a static memory 706 that can communicate with each other via a bus 708. As shown, the computer system 700 may further include a video display unit 710, such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, a flat panel display, a solid state display, or a cathode ray tube (CRT). Additionally, the computer system 700 may include an input device 712, such as a keyboard, and a cursor control device 714, such as a mouse. The computer system 700 can also include a disk drive unit 716, a signal generation device 718, such as a speaker or remote control, and a network interface device 720. - In a particular embodiment, as depicted in
FIG. 7, the disk drive unit 716 may include a computer-readable medium 722 in which one or more sets of instructions 724, e.g., software, can be embedded. Further, the instructions 724 may embody one or more of the methods or logic as described herein. In a particular embodiment, the instructions 724 may reside completely, or at least partially, within the main memory 704, the static memory 706, and/or the processor 702 during execution by the computer system 700. The main memory 704 and the processor 702 also may include computer-readable media. The network interface device 720 can provide connectivity to a network 726, such as a wide area network (WAN), a local area network (LAN), or other network. - In an alternative embodiment, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays, and other hardware devices, can be constructed to implement one or more of the methods described herein. Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.
- In accordance with various embodiments of the present disclosure, the methods described herein may be implemented by software programs executable by a computer system. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computer system processing can be constructed to implement one or more of the methods or functionality as described herein.
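As context for such a software-program implementation, the methods at issue concern synchronizing user interactive events with on-screen events during playback of a multimedia stream. The following is only an illustrative sketch, not the claimed method: the event names, timestamps, and the nearest-timestamp matching rule with a tolerance are all assumptions made for demonstration.

```python
def sync_events(interactive_events, onscreen_events, tolerance_ms=500):
    """Pair each user interactive event with the nearest on-screen event.

    interactive_events: list of (name, timestamp_ms) tuples from the user.
    onscreen_events:    list of (name, timestamp_ms) tuples from the stream.
    A pair is kept only if the timestamps differ by at most tolerance_ms.
    """
    pairs = []
    for name, t in interactive_events:
        # Find the on-screen event whose timestamp is closest to this
        # interactive event's timestamp.
        nearest = min(onscreen_events, key=lambda e: abs(e[1] - t))
        if abs(nearest[1] - t) <= tolerance_ms:
            pairs.append((name, nearest[0]))
    return pairs

# Hypothetical events: a viewer "buzzes" shortly after a question appears
# and "votes" shortly after a poll opens.
interactive = [("buzz", 10050), ("vote", 32400)]
onscreen = [("question_shown", 10000), ("poll_open", 32000), ("replay", 60000)]
print(sync_events(interactive, onscreen))
```

Matching by nearest timestamp within a tolerance is one simple way a playback client could align time-shifted user input with stream markers; an actual implementation would derive the timestamps from the multimedia stream's timing data.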
- The present disclosure contemplates a computer-readable medium that includes instructions or receives and executes instructions responsive to a propagated signal, so that a device connected to a network can communicate voice, video or data over the
network 726. Further, the instructions 724 may be transmitted or received over the network 726 via the network interface device 720. - While the computer-readable medium is shown to be a single medium, the term “computer-readable medium” includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The term “computer-readable medium” shall also include any medium that is capable of storing a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or operations disclosed herein.
- In a particular non-limiting, exemplary embodiment, the computer-readable medium can include a solid-state memory such as a memory card or other package that houses one or more non-volatile read-only memories. Further, the computer-readable medium can be a random access memory or other volatile re-writeable memory. Additionally, the computer-readable medium can include a magneto-optical or optical medium, such as a disk or tape or other storage device to capture carrier wave signals such as a signal communicated over a transmission medium. A digital file attachment to an e-mail or other self-contained information archive or set of archives may be considered a distribution medium that is equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a computer-readable medium or a distribution medium and other equivalents and successor media, in which data or instructions may be stored.
- Although the present specification describes components and functions that may be implemented in particular embodiments with reference to particular standards and protocols, the invention is not limited to such standards and protocols. For example, standards for Internet and other packet switched network transmission such as TCP/IP, UDP/IP, HTML, and HTTP represent examples of the state of the art. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same or similar functions as those disclosed herein are considered equivalents thereof.
- The illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and features of apparatus and systems that utilize the structures or methods described herein. Many other embodiments may be apparent to those of skill in the art upon reviewing the disclosure. Other embodiments may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. Additionally, the illustrations are merely representational and may not be drawn to scale. Certain proportions within the illustrations may be exaggerated, while other proportions may be minimized. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.
- The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b) and is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description of the Drawings, various features may be grouped together or described in a single embodiment for the purpose of streamlining the disclosure. This disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter may be directed to less than all of the features of any of the disclosed embodiments. Thus, the following claims are incorporated into the Detailed Description of the Drawings, with each claim standing on its own as defining separately claimed subject matter.
- The above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosed subject matter. Thus, to the maximum extent allowed by law, the scope of the present disclosed subject matter is to be determined by the broadest permissible interpretation of the following claims and their equivalents, and shall not be restricted or limited by the foregoing detailed description.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/697,149 US20150249846A1 (en) | 2009-10-07 | 2015-04-27 | Synchronization of user interactive events with on-screen events during playback of multimedia stream |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/575,017 US9043829B2 (en) | 2009-10-07 | 2009-10-07 | Synchronization of user interactive events with on-screen events during playback of multimedia stream |
US14/697,149 US20150249846A1 (en) | 2009-10-07 | 2015-04-27 | Synchronization of user interactive events with on-screen events during playback of multimedia stream |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/575,017 Continuation US9043829B2 (en) | 2009-10-07 | 2009-10-07 | Synchronization of user interactive events with on-screen events during playback of multimedia stream |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150249846A1 true US20150249846A1 (en) | 2015-09-03 |
Family
ID=43823604
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/575,017 Expired - Fee Related US9043829B2 (en) | 2009-10-07 | 2009-10-07 | Synchronization of user interactive events with on-screen events during playback of multimedia stream |
US14/697,149 Abandoned US20150249846A1 (en) | 2009-10-07 | 2015-04-27 | Synchronization of user interactive events with on-screen events during playback of multimedia stream |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/575,017 Expired - Fee Related US9043829B2 (en) | 2009-10-07 | 2009-10-07 | Synchronization of user interactive events with on-screen events during playback of multimedia stream |
Country Status (1)
Country | Link |
---|---|
US (2) | US9043829B2 (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2460347A4 (en) * | 2009-10-25 | 2014-03-12 | Lg Electronics Inc | Method for processing broadcast program information and broadcast receiver |
JP5383564B2 (en) * | 2010-03-10 | 2014-01-08 | ルネサスエレクトロニクス株式会社 | Data transfer circuit and method |
US10412440B2 (en) * | 2010-03-24 | 2019-09-10 | Mlb Advanced Media, L.P. | Media and data synchronization system |
US20110289117A1 (en) * | 2010-05-19 | 2011-11-24 | International Business Machines Corporation | Systems and methods for user controllable, automated recording and searching of computer activity |
US20140059182A1 (en) * | 2011-03-04 | 2014-02-27 | Fumio Miura | Synchronized content broadcast distribution system |
US9210208B2 (en) * | 2011-06-21 | 2015-12-08 | The Nielsen Company (Us), Llc | Monitoring streaming media content |
US9535450B2 (en) * | 2011-07-17 | 2017-01-03 | International Business Machines Corporation | Synchronization of data streams with associated metadata streams using smallest sum of absolute differences between time indices of data events and metadata events |
US9166892B1 (en) * | 2012-01-20 | 2015-10-20 | Google Inc. | Systems and methods for event stream management |
US9015744B1 (en) * | 2012-06-25 | 2015-04-21 | IMDb.com, Inc. | Ascertaining events in media |
US9288542B2 (en) * | 2013-03-15 | 2016-03-15 | Time Warner Cable Enterprises Llc | Multi-option sourcing of content |
US9165203B2 (en) * | 2013-03-15 | 2015-10-20 | Arris Technology, Inc. | Legibility enhancement for a logo, text or other region of interest in video |
US20150135071A1 (en) * | 2013-11-12 | 2015-05-14 | Fox Digital Entertainment, Inc. | Method and apparatus for distribution and presentation of audio visual data enhancements |
US10412470B2 (en) * | 2014-04-08 | 2019-09-10 | Matthew A. F. Engman | Event entertainment system |
EP3484163A1 (en) * | 2014-08-11 | 2019-05-15 | OpenTV, Inc. | Method and system to create interactivity between a main reception device and at least one secondary device |
US10456666B2 (en) * | 2017-04-17 | 2019-10-29 | Intel Corporation | Block based camera updates and asynchronous displays |
CN107998655B (en) * | 2017-11-09 | 2020-11-27 | 腾讯科技(成都)有限公司 | Data display method, device, storage medium and electronic device |
US11305195B2 (en) * | 2020-05-08 | 2022-04-19 | T-Mobile Usa, Inc. | Extended environmental using real-world environment data |
CN113457130B (en) * | 2021-07-07 | 2024-02-02 | 网易(杭州)网络有限公司 | Game content playback method and device, readable storage medium and electronic equipment |
CN114025185B (en) * | 2021-10-28 | 2024-06-25 | 杭州网易智企科技有限公司 | Video playback method and device, electronic equipment and storage medium |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5613909A (en) * | 1994-07-21 | 1997-03-25 | Stelovsky; Jan | Time-segmented multimedia game playing and authoring system |
US20020089602A1 (en) * | 2000-10-18 | 2002-07-11 | Sullivan Gary J. | Compressed timing indicators for media samples |
US20030022989A1 (en) * | 2001-04-20 | 2003-01-30 | Thomas Braig | Impact-modified molding compositions of polyethylene terephthalate and dihydroxydiarylcyclohexane-based polycarbonate |
US20030229899A1 (en) * | 2002-05-03 | 2003-12-11 | Matthew Thompson | System and method for providing synchronized events to a television application |
US20040003406A1 (en) * | 2002-06-27 | 2004-01-01 | Digeo, Inc. | Method and apparatus to invoke a shopping ticker |
US20040007882A1 (en) * | 2002-07-10 | 2004-01-15 | Arabia Frank J. | Quiet vehicle door latch |
US20040078822A1 (en) * | 2002-10-18 | 2004-04-22 | Breen George Edward | Delivering interactive content to a remote subscriber |
US20040250272A1 (en) * | 2000-06-21 | 2004-12-09 | Durden George A. | Systems and methods for controlling and managing programming content and portions thereof |
US20050028217A1 (en) * | 1999-10-29 | 2005-02-03 | Marler Jerilyn L. | Identifying ancillary information associated with an audio/video program |
US6912726B1 (en) * | 1997-04-02 | 2005-06-28 | International Business Machines Corporation | Method and apparatus for integrating hyperlinks in video |
US7028327B1 (en) * | 2000-02-02 | 2006-04-11 | Wink Communication | Using the electronic program guide to synchronize interactivity with broadcast programs |
US7536706B1 (en) * | 1998-08-24 | 2009-05-19 | Sharp Laboratories Of America, Inc. | Information enhanced audio video encoding system |
US20100070991A1 (en) * | 2007-02-21 | 2010-03-18 | Koninklijke Philips Electronics N.V. | conditional access system |
US20110055885A1 (en) * | 1996-12-23 | 2011-03-03 | Corporate Media Partners | Method and system for providing interactive look-and-feel in a digital broadcast via an x y protocol |
US8099752B2 (en) * | 2008-12-03 | 2012-01-17 | Sony Corporation | Non-real time services |
US8424035B2 (en) * | 2000-08-31 | 2013-04-16 | Intel Corporation | Time shifting enhanced television triggers |
US8745661B2 (en) * | 2006-07-31 | 2014-06-03 | Rovi Guides, Inc. | Systems and methods for providing enhanced sports watching media guidance |
US20140192140A1 (en) * | 2013-01-07 | 2014-07-10 | Microsoft Corporation | Visual Content Modification for Distributed Story Reading |
US20140304233A1 (en) * | 2007-12-06 | 2014-10-09 | Adobe Systems Incorporated | System and method for maintaining cue point data structure independent of recorded time-varying content |
US8879581B2 (en) * | 2008-10-21 | 2014-11-04 | Fujitsu Limited | Data transmitting device and data receiving device |
US9060201B2 (en) * | 2008-10-28 | 2015-06-16 | Cisco Technology, Inc. | Stream synchronization for live video encoding |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5860862A (en) | 1996-01-05 | 1999-01-19 | William W. Junkin Trust | Interactive system allowing real time participation |
US6080063A (en) | 1997-01-06 | 2000-06-27 | Khosla; Vinod | Simulated real time game play with live event |
US6616529B1 (en) | 2000-06-19 | 2003-09-09 | Intel Corporation | Simulation and synthesis of sports matches |
US6699127B1 (en) | 2000-06-20 | 2004-03-02 | Nintendo Of America Inc. | Real-time replay system for video game |
US6434398B1 (en) | 2000-09-06 | 2002-08-13 | Eric Inselberg | Method and apparatus for interactive audience participation at a live spectator event |
US20050130725A1 (en) | 2003-12-15 | 2005-06-16 | International Business Machines Corporation | Combined virtual and video game |
- 2009-10-07: US US12/575,017 patent/US9043829B2/en, not active (Expired - Fee Related)
- 2015-04-27: US US14/697,149 patent/US20150249846A1/en, not active (Abandoned)
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180359514A1 (en) * | 2014-04-30 | 2018-12-13 | Piksel, Inc. | Device Synchronization |
US10511880B2 (en) * | 2014-04-30 | 2019-12-17 | Piksel, Inc. | Device synchronization |
CN108521584A (en) * | 2018-04-20 | 2018-09-11 | 广州虎牙信息科技有限公司 | Interactive information processing method, device, main broadcaster's side apparatus and medium |
US10897637B1 (en) * | 2018-09-20 | 2021-01-19 | Amazon Technologies, Inc. | Synchronize and present multiple live content streams |
US10863230B1 (en) | 2018-09-21 | 2020-12-08 | Amazon Technologies, Inc. | Content stream overlay positioning |
Also Published As
Publication number | Publication date |
---|---|
US9043829B2 (en) | 2015-05-26 |
US20110081965A1 (en) | 2011-04-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9043829B2 (en) | Synchronization of user interactive events with on-screen events during playback of multimedia stream | |
US10974130B1 (en) | Systems and methods for indicating events in game video | |
US11956514B2 (en) | Systems and methods for enhanced trick-play functions | |
US10805685B2 (en) | Streamlined viewing of recorded programs based on markers | |
US9860613B2 (en) | Apparatus, systems and methods for presenting highlights of a media content event | |
US8535131B2 (en) | Method and system for an online performance service with recommendation module | |
US8463108B2 (en) | Client-side ad insertion during trick mode playback | |
US20140189517A1 (en) | Fantasy sports contest highlight segments systems and methods | |
US20100172626A1 (en) | Trick Mode Based Advertisement Portion Selection | |
TW201416888A (en) | Scene clip playback system, method and recording medium thereof | |
US20150358690A1 (en) | Techniques for Backfilling Content | |
JP2013192062A (en) | Video distribution system, video distribution apparatus, video distribution method and program | |
US8527880B2 (en) | Method and apparatus for virtual editing of multimedia presentations | |
US9602879B2 (en) | Indexing, advertising, and compiling sports recordings | |
US11869242B2 (en) | Systems and methods for recording portion of sports game | |
KR101336161B1 (en) | Method for providing enhanced broadcasting service | |
KR20050087751A (en) | The relay method of utilize resource of client pc | |
KR20130005669A (en) | System and method for duplex game service based on information of broadcasting contents | |
JP2014003414A (en) | Video recorder |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: AT&T INTELLECTUAL PROPERTY I, LP, GEORGIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KLEIN, CRAIG A.;HARP, GREGORY O.;RUSHING, JAMES D.;SIGNING DATES FROM 20090918 TO 20090930;REEL/FRAME:035681/0719 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., GEORGIA Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE CORRECT THE ASSIGNEE NAME PREVIOUSLY RECORDED AT REEL: 035681 FRAME: 0719. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:KLEIN, CRAIG A.;HARP, GREGORY O.;RUSHING, JAMES D.;SIGNING DATES FROM 20090918 TO 20090930;REEL/FRAME:053222/0448 |