US20070113263A1 - Dynamic interactive content system - Google Patents
- Publication number
- US20070113263A1 (U.S. application Ser. No. 11/619,180)
- Authority
- US
- United States
- Prior art keywords
- content
- main content
- interactive
- supplemental content
- supplemental
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/232—Content retrieval operation locally within server, e.g. reading video streams from disk arrays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/20—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
- A63F2300/206—Game information storage, e.g. cartridges, CD ROM's, DVD's, smart cards
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/40—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
- A63F2300/409—Data transfer via television network
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/55—Details of game data or player data management
- A63F2300/552—Details of game data or player data management for downloading to client devices, e.g. using OS version, hardware or software profile of the client device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S707/00—Data processing: database and file management or data structures
- Y10S707/99931—Database or file accessing
- Y10S707/99933—Query processing, i.e. searching
Definitions
- This invention relates generally to interactive content systems, and more particularly to interactive content systems having integrated personal video recording functionality.
- Interactive content systems, such as game consoles, allow users to view digital videodiscs (DVDs), play interactive entertainment software, and browse the Internet.
- interactive content systems provide exciting learning environments through educational interactive software.
- FIG. 1 is a block diagram showing a typical prior art interactive content system 100 .
- the prior art interactive content system 100 generally includes a central processing unit (CPU) 102 coupled to a system bus 104 , which connects a plurality of system components.
- the system bus 104 often is connected to a graphics processing unit (GPU) 106 , an operational request source 108 , a memory 110 , a removable media drive 112 , and video/audio output circuitry 114 .
- Removable media, such as a compact disc (CD) or digital videodisc (DVD), is placed into the removable media drive 112, which reads data from the disc and transfers program information to the memory 110.
- The CPU 102, in conjunction with the GPU 106, executes the program instructions from memory.
- the operational request source 108 typically is in communication with a user input device, such as a game controller, remote controller, keyboard, or other device capable of receiving and transferring user input data to the interactive content system 100 .
- Output from the program executing on the CPU 102 generally is provided to the video/audio output circuitry 114 for presentation, typically on television or other monitor and speaker system.
- users are able to interact with the information presented to them via the operational request source 108 .
- the user is limited to interacting with the static information provided from the removable media via the removable media drive 112 .
- For example, when utilizing sport-based entertainment software, current sporting results cannot be included in the entertainment environment provided by the interactive content system 100.
- Educational information is restricted to information available at the time the removable media is manufactured. As a result, events and discoveries occurring after the manufacture of the removable media are not available to the user of the interactive content system 100.
- In view of the foregoing, there is a need for an interactive content system having integrated personal video recording (PVR) functionality. The system should allow current content from sources such as network broadcasts and other content providers to be utilized in conjunction with interactive applications.
- the present invention fills these needs by providing a dynamic interactive content system that combines dynamic media with interactive content using prerecorded visual and audio data.
- the system allows content from sources such as network broadcasts and other content providers, to be utilized in conjunction with interactive applications.
- A dynamic interactive content system includes a receiver capable of receiving main content, such as audio/visual video data, and supplemental content associated with the main content.
- a persistent storage is included that is in communication with the receiver.
- the persistent storage is capable of storing a plurality of main content entries and associated supplemental content entries.
- the system also includes a processor that is capable of executing an interactive application.
- the interactive application searches the stored plurality of supplemental content entries for a particular supplemental content entry having particular data.
- the interactive application then presents at least a portion of the main content entry that is associated with the particular supplemental content entry to a user.
- each supplemental content entry can include data that describes properties of the associated main content entry.
- the data can include a channel definition defining a channel utilized to receive the associated main content, and a time definition defining a time when the associated main content was received.
- the data can include a time code for a specific event included in the associated main content.
- the interactive application can present main content data beginning at a position in the main content entry specified by the time code.
- the dynamic interactive content system includes a receiver capable of receiving main content and supplemental content associated with the main content.
- a persistent storage is included that is in communication with the receiver.
- the persistent storage is capable of storing a plurality of main content entries and associated supplemental content entries.
- an operational request source is included that is in communication with a user input device.
- the operational request source is capable of receiving user requests via the user input device.
- The system also includes a processor that is in communication with the persistent storage device and the operational request source. In operation, the processor searches the stored plurality of supplemental content entries for a particular supplemental content entry having particular data related to a received user request. The processor can then present at least a portion of the main content entry associated with the particular supplemental content entry to a user.
- the processor can be further capable of executing an interactive application, as described above.
- a method for providing dynamic interactive content includes receiving a plurality of main content and supplemental content associated with the main content.
- the plurality of main content and associated supplemental content are stored.
- the plurality of supplemental content is searched for particular content having specific data.
- the main content associated with the particular supplemental content is then presented to a user using an interactive application capable of responding to user input data.
- the supplemental content can be received via a web site on the Internet.
- the web site can include an index associated with a supplemental content database.
- the supplemental content database can be searched using the index to obtain a supplemental content entry.
- the data included in the supplemental content can include both a channel definition defining a channel utilized to receive the associated main content and a time definition defining a time when the associated main content was received.
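The supplemental-data fields summarized above (a channel definition, a time definition, and per-event time codes) can be sketched as a small data model. This is a minimal illustration; all field and function names are assumptions, since the patent prescribes no concrete schema.

```python
from dataclasses import dataclass, field

@dataclass
class SupplementalEntry:
    """One supplemental content entry describing an associated main content
    entry. All field names are illustrative assumptions."""
    content_id: str      # identifies the associated main content entry
    channel: str         # channel definition: channel the content was received on
    received_at: str     # time definition: when the content was received
    content_type: str    # e.g. "movie" or "sporting event"
    event_timecodes: dict = field(default_factory=dict)  # event name -> seconds

def find_playback_position(entries, content_type, event):
    """Search the stored entries for one of the given type containing the
    named event; return (main content id, start position in seconds) or None."""
    for entry in entries:
        if entry.content_type == content_type and event in entry.event_timecodes:
            return entry.content_id, entry.event_timecodes[event]
    return None

entries = [
    SupplementalEntry("mc1", "ch7", "2006-01-05T20:00", "movie",
                      {"chase scene": 30 * 60 + 10}),  # 30 min 10 s into the movie
]
print(find_playback_position(entries, "movie", "chase scene"))  # ('mc1', 1810)
```

The returned offset is exactly the time code at which the interactive application would begin presenting the associated main content entry.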
- FIG. 1 is a block diagram showing a typical prior art interactive content system
- FIG. 2 is a diagram showing a dynamic interactive content environment, in accordance with an embodiment of the present invention.
- FIG. 3 is a block diagram showing an exemplary dynamic interactive content system, in accordance with an embodiment of the present invention.
- FIG. 4 is an illustration showing a conventional TV frame, which includes a viewable area and a non-viewable VBI;
- FIG. 5 is an illustration showing a network based content delivery configuration, in accordance with an embodiment of the present invention.
- FIG. 6 is an illustration showing an exemplary persistent storage device of a dynamic interactive content system, in accordance with an embodiment of the present invention.
- FIG. 7 is a diagram illustrating dynamic interactive application presentation, in accordance with an embodiment of the present invention.
- An invention is disclosed for a dynamic interactive content system that combines dynamic media with interactive content using prerecorded visual and audio data.
- the system allows content from sources such as network broadcasts and other content providers, to be utilized in conjunction with interactive applications.
- Embodiments of the present invention record main content, such as audio/visual video data, from content providers, such as network and satellite broadcasts, on persistent storage such as a hard drive.
- supplemental data is recorded that describes the main content.
- the supplemental data can include information such as when the main content was recorded, the channel it was received from, and the type of material that was recorded, such as whether the main content is a movie or sporting event.
- By searching the supplemental data, interactive applications can find specific main content for use in the application. In this manner, interactive applications can utilize current broadcasts and other material in their user presentations.
- The term "main content" will be utilized in the following description to indicate any data usable in an interactive application.
- main content can be audio/visual data, such as a television or cable television program.
- Main content can also indicate audio data recorded from an audio source, such as a commercial radio broadcast network.
- main content can include, for example, MPEG data or other image data recorded from a wide area network such as the Internet, and other data types usable in interactive applications that will be apparent to those skilled in the art after a careful reading of the present disclosure.
- presentation data can include video, image, and/or audio data that is presented to the user, for example, using a television, monitor, and/or speakers.
- FIG. 2 is a diagram showing a dynamic interactive content environment 200 , in accordance with an embodiment of the present invention.
- the dynamic interactive content environment 200 includes a dynamic interactive content system 202 in communication with a main content source 204 , such as a broadcast network.
- the main content source 204 can be any network capable of providing main content, such as a broadcast network, cable station, premium satellite provider, radio station, or any other source capable of providing main content.
- The dynamic interactive content system 202 can be in communication with a supplemental content provider, such as the Internet or another wide area network 206.
- the supplemental content provider 206 can be the same source as the main content provider 204 , for example, by embedding the supplemental content within the main content signal.
- embodiments of the present invention allow main content, such as audio/visual video data from the main content source 204 , to be utilized in conjunction with interactive applications.
- embodiments of the present invention allow interactive applications to utilize the recorded main content during their user presentations.
- embodiments of the present invention can record a sporting event from a main content source 204 , such as a broadcast network. Then, an interactive application executing on the dynamic interactive content system 202 can utilize particular elements of the sporting event during its execution. For example, an interactive football game application can utilize segments of a prerecorded football game to display to the user, thus increasing the realism of the interactive experience.
- embodiments of the present invention receive main content from main content sources 204 , such as a broadcast network, and supplemental content from a supplemental content sources 206 , such as the Internet.
- the received main content then is stored on a hard drive or other persistent storage device within the dynamic interactive content system 202 .
- the dynamic interactive content system 202 receives supplemental content that is related to the main content from either the supplemental content source 206 and/or the main content source 204 .
- the supplemental content includes information describing the main content.
- the supplemental data can include the time the main content was recorded and the channel or URL providing the main content.
- the supplemental data can further include the type of the main content and time codes for particular scenes in the main content.
- the supplemental content can indicate a particular program is a movie and a particular scene, such as a chase scene, is located at a particular time code, such as thirty minutes and ten seconds into the movie.
- the supplemental content can be embedded in the main content signal and/or located at a separate supplemental content source 206 , such as at a web site.
- the dynamic interactive content system 202 associates the supplemental content with the main content that the supplemental data describes.
- applications can search the supplemental content stored on the persistent storage device to find main content having particular characteristics.
- an interactive application may need a car chase scene to enhance the visual experience of a racing game.
- the application can search the supplemental data of the persistent storage for car chase scenes. Once found, the application can read the selected supplemental data to determine the time code for the car chase, and then display the associated main content to the user starting at the particular time code of the chase scene.
- the user can experience the video of the prerecorded car chase while utilizing the interactive application.
- FIG. 3 is a block diagram showing an exemplary dynamic interactive content system 202 , in accordance with an embodiment of the present invention.
- the dynamic interactive content system 202 includes a CPU 300 coupled to a system bus 302 , which connects a plurality of system components.
- the system bus 302 is coupled to a GPU 304 , memory 306 , and a receiver 322 having a network interface card 308 and a tuner 310 .
- the system bus 302 can be coupled to an encode/decode signal processor 312 , an operational request source 314 , a persistent storage device 316 , video/audio output circuitry 318 , and a removable media drive 320 .
- the dynamic interactive content system 202 records main content for use in interactive applications.
- the receiver 322 is utilized to receive and route main content and supplemental content signals to the appropriate modules of the dynamic interactive content system 202 .
- the exemplary receiver 322 can include both a network adapter 308 , such as a network interface card (NIC), and a tuner 310 .
- the tuner 310 can be utilized to lock on to a selected carrier frequency, for example from a television station or radio channel, and filter out the audio and video signals for amplification and display.
- the tuner 310 can be utilized in conjunction with an antenna, satellite dish, cable outlet, or other transmission receiving source capable of carrying signals from a main content source to the tuner 310 .
- main content is stored utilizing a persistent storage device, such as the hard drive 316 .
- Received data often is encoded into a particular file format, for example MPEG format, prior to recording the data on the hard drive 316 .
- main content received utilizing the tuner 310 can be provided to the encoder/decoder unit 312 , which can encode the main content and then provide the encoded main content to the hard drive 316 .
- the tuner 310 can provide the main content to the video/audio output circuitry 318 , which can display the main content to the user.
- the CPU 300 can transmit a command to the encoder/decoder unit 312 which requests the encoder/decoder unit 312 to read the encoded main content from the hard drive 316 .
- The encoder/decoder unit 312 then can decode the previously encoded data and provide the decoded data to the video/audio output circuitry 318 for presentation to the user.
- Main content also can be received utilizing the NIC 308 .
- an MPEG video or WAV audio file can be downloaded from a main content source such as a web page on the Internet.
- In some cases, the received main content may not require encoding prior to being recorded on the hard drive 316. For example, when the main content is an MPEG video, the MPEG data generally can be saved directly to the hard drive 316.
- the dynamic interactive content system 202 can again utilize the encoder/decoder unit 312 to decode the stored data prior to providing the main content to the video/audio output circuitry 318 . In this manner, main content can be stored and retrieved from the hard drive 316 as needed.
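The store-and-retrieve path just described (encode on the way to the hard drive 316, decode on the way to the output circuitry 318) can be sketched as follows. A trivial reversible byte transform stands in for the MPEG processing of the encoder/decoder unit 312; everything here is illustrative, not the patent's implementation.

```python
# Toy stand-ins: a reversible byte transform plays the role of the MPEG
# encode/decode performed by the encoder/decoder unit 312 (illustration only).
def encode(raw: bytes) -> bytes:
    return raw[::-1]

def decode(encoded: bytes) -> bytes:
    return encoded[::-1]

hard_drive = {}  # stands in for the persistent storage / hard drive 316

def record(content_id: str, signal: bytes, already_encoded: bool = False) -> None:
    """Tuner path: encode before storing. NIC path: MPEG data may be stored directly."""
    hard_drive[content_id] = signal if already_encoded else encode(signal)

def play(content_id: str) -> bytes:
    """Read from storage and decode before handing to the A/V output circuitry 318."""
    return decode(hard_drive[content_id])

record("race", b"raw tuner signal")
print(play("race"))  # b'raw tuner signal'
```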
- Supplemental content provides information describing aspects, properties, and/or characteristics of a particular main content file.
- Supplemental content can include information such as the channel on which the main content was received, the time it was received, and a description of the scenes or frames included in the main content.
- more detailed information can be included in the supplemental content, such as the time code or frame ranges of particular scenes or events within the main content.
- The supplemental content may include, for example, the channel on which the auto race was broadcast, the time of the broadcast, information on the drivers involved, and a description of the events that occurred during the race, such as when a car loses control and spins out, or the end of the race.
- the supplemental content can include the time codes of each of these events, for example, the supplemental content can indicate that a spinout occurred fifteen minutes and twenty seconds into the race.
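As a concrete illustration, such an auto-race entry might look like the following; the field names and values are assumptions, since the patent fixes no format.

```python
# Hypothetical supplemental content entry for a recorded auto race; field
# names and values are assumptions for illustration.
race_supplemental = {
    "channel": "ch2",                      # channel definition
    "broadcast_time": "2006-03-12T13:00",  # time definition
    "content_type": "auto race",
    "drivers": ["Driver A", "Driver B"],
    "events": {
        "spinout": 15 * 60 + 20,  # fifteen minutes, twenty seconds -> 920 s
        "finish": 2 * 3600,       # end of the race
    },
}
print(race_supplemental["events"]["spinout"])  # 920
```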
- the supplemental content can be received either from the main content source or from a separate supplemental content source.
- When received from the main content source, the supplemental content typically is embedded into the main content signal.
- supplemental content can be carried in the vertical blanking interval (VBI) of a television (TV) frame, as discussed next with reference to FIG. 4 .
- FIG. 4 is an illustration showing a conventional TV frame 400 , which includes a viewable area 402 and a non-viewable VBI 404 .
- the viewable area 402 and VBI 404 of the TV frame 400 each comprises a plurality of frame lines 406 .
- the frame lines 406 of the viewable area 402 are used to display the video portion of a TV signal, while the frame lines sent between each TV video frame comprise the VBI 404 .
- the VBI 404 comprises the last 45 frame lines of each 525-line TV frame 400 in the National Television Standards Committee (NTSC) standard.
- The VBI 404 gives the TV time to reposition its electron beam from the bottom of the current TV frame to the top of the next TV frame.
- Since the VBI 404 is not used to transmit viewable data, it can be used to transmit additional data along with the viewable TV signal. Thus, the VBI 404 can be utilized to transmit supplemental content describing the main content transmitted in the viewable area 402.
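To make the NTSC numbers above concrete, a short sketch of the frame geometry follows; the per-line payload size is an assumed figure for illustration only, as real VBI data services vary.

```python
NTSC_FRAME_LINES = 525
VBI_LINES = 45  # last 45 lines of each frame, per the NTSC description above
VIEWABLE_LINES = NTSC_FRAME_LINES - VBI_LINES

def vbi_capacity_bytes(frames: int, bytes_per_line: int = 2) -> int:
    """Rough supplemental-data capacity carried in the VBI across `frames`
    frames; bytes_per_line is an assumed payload (real VBI services vary)."""
    return frames * VBI_LINES * bytes_per_line

print(VIEWABLE_LINES)          # 480
print(vbi_capacity_bytes(30))  # about one second of 30 fps video -> 2700 bytes
```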
- FIG. 5 is an illustration showing a network based content delivery configuration 500 , in accordance with an embodiment of the present invention.
- Utilizing the network based content delivery configuration 500, embodiments of the present invention can obtain supplemental content and/or main content via the network.
- the dynamic interactive content system 202 can be placed in communication with a wide area network such as the Internet 502 .
- Supplemental content sources 206a-206b can be in communication with the Internet 502, thus allowing them to be accessed remotely by the dynamic interactive content system 202.
- the dynamic interactive content system 202 can receive supplemental content from the network based supplemental content sources 206 a - b .
- The NIC can be a printed circuit board that controls the exchange of data between network nodes at the data link level, and is utilized for communication over a computer network.
- A particular supplemental content source 206a can be a website, which can be found using a particular uniform resource locator (URL).
- a URL is an address that defines the route to a file on the World Wide Web or other Internet facility.
- The dynamic interactive content system 202 can store the URL of the supplemental content source 206a on the persistent storage.
- The CPU within the dynamic interactive content system 202 can read the URL into system memory, and provide the URL to the NIC. The NIC then facilitates communication with the supplemental content source 206a via the Internet 502.
- the supplemental content source 206 a can include, for example, a channel and time index 506 , which can be utilized to locate supplemental content for a particular related main content using a supplemental content database 508 .
- the dynamic interactive content system 202 can provide a channel and time to the channel and time index 506 of the supplemental content source 206 a .
- Generally, the channel and time at which a particular program was broadcast are sufficient to identify the main content.
- other index types can be utilized, for example, program titles, identification numbers, participant names for sporting events, and other forms of main content identification that will be apparent to those skilled in the art after a careful reading of the present disclosure.
- the channel and time index 506 can search the supplemental content database 508 for supplemental content having a matching channel and time identification.
- The supplemental content database 508 can comprise a database of supplemental content entries, wherein each supplemental content entry includes identification fields.
- each supplemental content entry in the supplemental content database 508 can include a channel and time identification field that is searched by the channel and time index 506 .
- the supplemental content source 206 a can return the matching supplemental content to the dynamic interactive content system 202 via the Internet 502 .
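The index-and-lookup exchange described above can be sketched with a dictionary keyed on (channel, time); the key format and entry layout are assumptions for illustration.

```python
# Hypothetical supplemental content database 508, keyed by the channel and
# time index 506: (channel, broadcast time) -> supplemental content entry.
supplemental_db = {
    ("ch2", "2006-03-12T13:00"): {"content_type": "auto race",
                                  "events": {"spinout": 920}},
    ("ch7", "2006-01-05T20:00"): {"content_type": "movie",
                                  "events": {"chase scene": 1810}},
}

def lookup(channel: str, time: str):
    """Return the supplemental content entry matching the channel and time
    identification, or None when no entry matches."""
    return supplemental_db.get((channel, time))

print(lookup("ch2", "2006-03-12T13:00")["content_type"])  # auto race
print(lookup("ch2", "1999-01-01T00:00"))                  # None
```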
- Digital main content sources 204a-204b can be in communication with the Internet 502, thus allowing access by remote dynamic interactive content systems 202.
- the dynamic interactive content system 202 can receive digital main content from the network based digital main content sources 204 a - b using the NIC or other network interface within the receiver.
- a particular main content source 204 a can be a website, which can be connected to using a particular URL.
- The dynamic interactive content system 202 can include the URL of the digital main content source 204a on the persistent storage.
- The CPU within the dynamic interactive content system 202 can read the URL into system memory, and provide the URL to the NIC.
- The NIC then facilitates communication with the digital main content source 204a via the Internet 502.
- FIG. 6 is an illustration showing an exemplary persistent storage device 316 of a dynamic interactive content system, in accordance with an embodiment of the present invention.
- a plurality of supplemental content entries 600 a - 600 n and a plurality of main content entries 602 a - 602 n are stored on the persistent storage device 316 .
- each supplemental content entry 600 a - 600 n is associated with a related main content entry 602 a - 602 n .
- supplemental content entry 600 a is associated with main content entry 602 a
- supplemental content entry 600 b is associated with main content entry 602 b.
- each supplemental content entry 600 a - 600 n provides information describing the properties, aspects, and/or characteristics of the related main content entry 602 a - 602 n .
- The associated supplemental content may include, for example, the channel on which the auto race was broadcast, the time of the broadcast, information on the drivers involved, and a description of the events that occurred during the race, such as when a car loses control and spins out or the end of the race.
- the supplemental content can include the time codes of each of these events, for example, the supplemental content can indicate that a spinout occurred fifteen minutes and twenty seconds into the race.
- embodiments of the present invention can search the supplemental content present on the system to find particular main content. For example, a user can request a list of boxing events. In this example, embodiments of the present invention can search the supplemental content entries 600 a - 600 n to obtain a list of main content entries 602 a - 602 n that are boxing events. Moreover, embodiments of the present invention can utilize the supplemental content entries 600 a - 600 n to access specific scenes and events included in a particular main content file. For example, a user can request a list of boxing events wherein at least one knockdown occurred.
- embodiments of the present invention can search the supplemental content entries 600 a - 600 n to obtain a list of main content entries 602 a - 602 n that are, for example, both boxing events and include a knockdown event/scene within the entry.
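The two-level search described above, first by content type and then by event within an entry, can be sketched as follows, with an assumed entry layout.

```python
# Hypothetical supplemental content entries 600a-600n, each naming the
# associated main content entry 602a-602n it describes.
entries = [
    {"main": "602a", "content_type": "boxing", "events": ["knockdown", "decision"]},
    {"main": "602b", "content_type": "boxing", "events": ["decision"]},
    {"main": "602c", "content_type": "movie",  "events": ["chase scene"]},
]

def find_main_content(entries, content_type, required_event=None):
    """List main content entries of the given type, optionally keeping only
    those whose supplemental data records the required event."""
    return [e["main"] for e in entries
            if e["content_type"] == content_type
            and (required_event is None or required_event in e["events"])]

print(find_main_content(entries, "boxing"))               # ['602a', '602b']
print(find_main_content(entries, "boxing", "knockdown"))  # ['602a']
```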
- the operational request source 314 typically is in communication with a user input device, such as a game controller, remote controller, keyboard, or other device capable of receiving and transmitting user input data to the dynamic interactive content system 202 .
- Output from the program executing on the CPU 300 generally is provided to the video/audio output circuitry 318 for presentation, typically on a television or other monitor and speaker system. In this manner, users are able to interact with the information presented to them via the operational request source 314.
- FIG. 7 is a diagram illustrating dynamic interactive application presentation, in accordance with an embodiment of the present invention.
- embodiments of the present invention can execute an interactive application 700 , as discussed above, which receives control information from the operational request source 314 .
- the operational request source 314 typically is in communication with a user input device, such as a game controller, remote controller, keyboard, or other device capable of receiving and transmitting user input data to the dynamic interactive content system.
- the interactive application 700 is in communication with the persistent storage device 316, and thus can read data from, and write data to, the persistent storage device 316.
- the interactive application 700 can utilize specified main content entries stored on the persistent storage device 316 during operation. More specifically, the interactive application 700 can search the supplemental content entries 600 on the persistent storage device 316 for main content having specific aspects, properties, and/or characteristics. Once found, the specific main content entry 602 can be read into memory from the persistent storage device 316. Once data from the main content entry 602 is in memory, the interactive application 700 can utilize the main content or a portion of the main content during its presentation to the user. That is, the presentation data 702 generated using the interactive application 700 can include both application data 704, which is generated via the interactive application 700, and prerecorded data 706, which is obtained from the main content 602.
- the interactive application 700 can be an interactive football game.
- the interactive application 700 could, for example, after the user scores a touchdown, display video of an actual football touchdown.
- the interactive football application can include a default video to display. However, prior to displaying the default video, the interactive football application can search the persistent storage device 316 for video of a touchdown involving the same football teams that the user is playing. Once a matching supplemental content entry 600 is found, the interactive football application can examine the supplemental content entry 600 for the time code of the touchdown.
- the interactive football application can then read the associated main content 602 into memory, and display the main content starting at the time code of the football touchdown. In this manner, the user can experience a visual display of an actual touchdown between the teams the user is playing in the game. Moreover, the video can be of a recent football game, even though the interactive football application may have been created months or years prior to the displayed football game. Further, if the interactive football application does not find supplemental data matching its request, the application can display the default video. It should be noted that the selected data from the main content can be presented to the user in any form. For example, a video can be displayed in a small window or projected onto a three-dimensional computer-generated object instead of using the full screen.
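The touchdown lookup with its default-video fallback might be sketched as follows. The dictionary fields and the function name are illustrative assumptions rather than part of the disclosure:

```python
def select_touchdown_clip(entries, team_a, team_b, default_clip):
    """Search recorded supplemental entries for a touchdown between the two
    teams the user is playing; fall back to the default video when no
    matching recording exists on the persistent storage."""
    for entry in entries:
        if entry.get("type") != "football":
            continue
        if set(entry.get("teams", ())) != {team_a, team_b}:
            continue
        timecode = entry.get("events", {}).get("touchdown")
        if timecode is not None:
            # play actual recorded footage starting at the touchdown time code
            return entry["content_id"], timecode
    # no matching supplemental data: show the application's default video
    return default_clip, 0
```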
- embodiments of the present invention allow interactive applications to present still images and audio to the user.
- an interactive application can search for a particular sound to play for the user, or for a particular image to display to the user.
- embodiments of the present invention enhance the user's ability to search for specific programs to watch by allowing the user to search for specific data included in the supplemental content entries. For example, the user can search for particular scenes in movies, such as car chase scenes, utilizing the embodiments of the present invention.
- Embodiments of the present invention can also analyze main content for particular semantic information. Once found, a matching object from the main content can be utilized in the interactive application. For example, an application can examine a particular frame or scene of the main content for a face, using predefined semantic definitions. Once the face is found, the face data from the main content can be utilized in the interactive application; for example, the face can be displayed to the user.
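A minimal sketch of such a semantic scan is given below. A real implementation would back the predicate with an image-analysis model; representing frames as lists of labeled regions is purely an illustrative assumption:

```python
def find_semantic_match(frames, predicate):
    """Scan decoded frames for the first region satisfying a semantic
    definition (e.g. "is a face"); return (frame_index, region) or None.
    `predicate` stands in for a real image-analysis routine."""
    for index, frame in enumerate(frames):
        for region in frame:
            if predicate(region):
                return index, region
    return None
```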
Abstract
Description
- This application is a Continuation Application under 35 USC § 120 and claims priority from U.S. application Ser. No. 10/264,087, entitled “DYNAMIC INTERACTIVE CONTENT SYSTEM” and filed on Oct. 2, 2002, which is herein incorporated by reference. - 1. Field of the Invention
- This invention relates generally to interactive content systems, and more particularly to interactive content systems having integrated personal video recording functionality.
- 2. Description of the Related Art
- Currently, interactive content systems, such as game consoles, provide utility and entertainment mechanisms to individuals worldwide. For example, interactive content systems allow users to view digital videodiscs (DVDs), play interactive entertainment software, and browse the Internet. In addition, interactive content systems provide exciting learning environments through educational interactive software.
-
FIG. 1 is a block diagram showing a typical prior art interactive content system 100. The prior art interactive content system 100 generally includes a central processing unit (CPU) 102 coupled to a system bus 104, which connects a plurality of system components. For example, the system bus 104 often is connected to a graphics processing unit (GPU) 106, an operational request source 108, a memory 110, a removable media drive 112, and video/audio output circuitry 114. - In operation, removable media such as a compact disc (CD) or a digital videodisc (DVD) is placed into the
removable media drive 112, which reads data from the CD and transfers program information to the memory 110. The CPU 102, in conjunction with the GPU 106, executes the program instructions from memory to run the program. In addition, the operational request source 108 typically is in communication with a user input device, such as a game controller, remote controller, keyboard, or other device capable of receiving and transferring user input data to the interactive content system 100. Output from the program executing on the CPU 102 generally is provided to the video/audio output circuitry 114 for presentation, typically on a television or other monitor and speaker system. - In this manner, users are able to interact with the information presented to them via the
operational request source 108. However, as can be appreciated, the user is limited to interacting with the static information provided from the removable media via the removable media drive 112. For example, when utilizing sport-based entertainment software, current sporting results cannot be included in the entertainment environment provided by the interactive content system 100. Further, educational information is restricted to information available at the time the removable media is manufactured. As a result, events and discoveries occurring after the manufacture of the removable media are not available to the user using the interactive content system 100. - Although systems are available that provide a user access to current events, such as personal video recording (PVR) units, these systems generally do not provide a user with an engrossing interactive environment. For example, a PVR system, which records current broadcasts as they occur, can generally only be utilized to rewind, pause, and play back the recorded events. The user is only allowed to passively watch the events as they are presented. That is, the user is not allowed to interact with content, as in an interactive content system.
- In view of the foregoing, there is a need for an interactive content system that allows dynamic media to be utilized in conjunction with the interactive content. The system should allow current content from sources such as network broadcasts and other content providers to be utilized in conjunction with interactive applications.
- Broadly speaking, the present invention fills these needs by providing a dynamic interactive content system that combines dynamic media with interactive content using prerecorded visual and audio data. The system allows content from sources such as network broadcasts and other content providers to be utilized in conjunction with interactive applications.
- In one embodiment, a dynamic interactive content system is disclosed. The dynamic interactive content system includes a receiver capable of receiving main content, such as audio/visual video data, and supplemental content associated with the main content. In addition, a persistent storage is included that is in communication with the receiver. The persistent storage is capable of storing a plurality of main content entries and associated supplemental content entries. The system also includes a processor that is capable of executing an interactive application. In operation, the interactive application searches the stored plurality of supplemental content entries for a particular supplemental content entry having particular data. The interactive application then presents at least a portion of the main content entry that is associated with the particular supplemental content entry to a user. Generally, each supplemental content entry can include data that describes properties of the associated main content entry. For example, the data can include a channel definition defining a channel utilized to receive the associated main content, and a time definition defining a time when the associated main content was received. In addition, the data can include a time code for a specific event included in the associated main content. In this case, the interactive application can present main content data beginning at a position in the main content entry specified by the time code.
- A further dynamic interactive content system is disclosed in an additional embodiment of the present invention. As above, the dynamic interactive content system includes a receiver capable of receiving main content and supplemental content associated with the main content. In addition, a persistent storage is included that is in communication with the receiver. As above, the persistent storage is capable of storing a plurality of main content entries and associated supplemental content entries. Further, an operational request source is included that is in communication with a user input device. The operational request source is capable of receiving user requests via the user input device. The system also includes a processor that is in communication with the persistent storage device and the operational request source. In operation, the processor searches the stored plurality of supplemental content entries for a particular supplemental content entry having particular data related to a received user request. The processor can then present at least a portion of the main content entry associated with the particular supplemental content entry to a user. In addition, the processor can be further capable of executing an interactive application, as described above.
- A method for providing dynamic interactive content is disclosed in a further embodiment of the present invention. The method includes receiving a plurality of main content and supplemental content associated with the main content. The plurality of main content and associated supplemental content are stored. Further, the plurality of supplemental content is searched for particular content having specific data. The main content associated with the particular supplemental content is then presented to a user using an interactive application capable of responding to user input data. In one aspect, the supplemental content can be received via a web site on the Internet. For example, the web site can include an index associated with a supplemental content database. In this case, the supplemental content database can be searched using the index to obtain a supplemental content entry. As above, the data included in the supplemental content can include both a channel definition defining a channel utilized to receive the associated main content and a time definition defining a time when the associated main content was received. Other aspects and advantages of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
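The index-based lookup described in this aspect (a supplemental content database searched by the channel and time under which the main content aired) can be sketched as follows; the class and method names are illustrative assumptions, not part of the disclosure:

```python
class SupplementalContentSource:
    """Toy model of the web-side supplemental content source: entries are
    indexed by the (channel, time) under which the main content aired."""

    def __init__(self):
        self._database = {}

    def add(self, channel, time, supplemental):
        self._database[(channel, time)] = supplemental

    def lookup(self, channel, time):
        """The channel-and-time index search; None when no entry matches."""
        return self._database.get((channel, time))
```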
- The invention, together with further advantages thereof, may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
-
FIG. 1 is a block diagram showing a typical prior art interactive content system; -
FIG. 2 is a diagram showing a dynamic interactive content environment, in accordance with an embodiment of the present invention; -
FIG. 3 is a block diagram showing an exemplary dynamic interactive content system, in accordance with an embodiment of the present invention; -
FIG. 4 is an illustration showing a conventional TV frame, which includes a viewable area and a non-viewable VBI; -
FIG. 5 is an illustration showing a network based content delivery configuration, in accordance with an embodiment of the present invention; -
FIG. 6 is an illustration showing an exemplary persistent storage device of a dynamic interactive content system, in accordance with an embodiment of the present invention; and -
FIG. 7 is a diagram illustrating dynamic interactive application presentation, in accordance with an embodiment of the present invention. - An invention is disclosed for a dynamic interactive content system that combines dynamic media with interactive content using prerecorded visual and audio data. The system allows content from sources such as network broadcasts and other content providers to be utilized in conjunction with interactive applications. Broadly speaking, embodiments of the present invention record main content, such as audio/visual video data, from content providers, such as network and satellite broadcasts, on a persistent storage such as a hard drive. Along with the main content, supplemental data is recorded that describes the main content. For example, the supplemental data can include information such as when the main content was recorded, the channel it was received from, and the type of material that was recorded, such as whether the main content is a movie or sporting event. By searching the supplemental data, interactive applications can locate specific main content for use in the application. In this manner, interactive applications can utilize current broadcasts and other material in their user presentations.
- In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order not to unnecessarily obscure the present invention. Further, the term “main content” will be utilized in the following description to indicate any data usable in an interactive application. For example, main content can be audio/visual data, such as a television or cable television program. Main content can also indicate audio data recorded from an audio source, such as a commercial radio broadcast network. Other examples of main content can include, for example, MPEG data or other image data recorded from a wide area network such as the Internet, and other data types usable in interactive applications that will be apparent to those skilled in the art after a careful reading of the present disclosure. In addition, the terms “presentation” and “presentation data” will refer to any data presented to the user. As such, presentation data can include video, image, and/or audio data that is presented to the user, for example, using a television, monitor, and/or speakers.
-
FIG. 2 is a diagram showing a dynamic interactive content environment 200, in accordance with an embodiment of the present invention. The dynamic interactive content environment 200 includes a dynamic interactive content system 202 in communication with a main content source 204, such as a broadcast network. The main content source 204 can be any network capable of providing main content, such as a broadcast network, cable station, premium satellite provider, radio station, or any other source capable of providing main content. In addition, the dynamic interactive content system 202 can be in communication with a supplemental content provider 206, such as the Internet or another wide area network 206. It should be borne in mind, as explained in greater detail below, that the supplemental content provider 206 can be the same source as the main content provider 204, for example, by embedding the supplemental content within the main content signal. - As mentioned above, embodiments of the present invention allow main content, such as audio/visual video data from the
main content source 204, to be utilized in conjunction with interactive applications. By recording main content on a persistent storage device and providing a description of the main content using the supplemental content, embodiments of the present invention allow interactive applications to utilize the recorded main content during their user presentations. - For example, embodiments of the present invention can record a sporting event from a
main content source 204, such as a broadcast network. Then, an interactive application executing on the dynamic interactive content system 202 can utilize particular elements of the sporting event during its execution. For example, an interactive football game application can utilize segments of a prerecorded football game to display to the user, thus increasing the realism of the interactive experience. - In particular, embodiments of the present invention receive main content from
main content sources 204, such as a broadcast network, and supplemental content from a supplemental content source 206, such as the Internet. The received main content then is stored on a hard drive or other persistent storage device within the dynamic interactive content system 202. In addition, the dynamic interactive content system 202 receives supplemental content that is related to the main content from the supplemental content source 206 and/or the main content source 204. - The supplemental content includes information describing the main content. For example, the supplemental data can include the time the main content was recorded and the channel or URL providing the main content. The supplemental data can further include the type of the main content and time codes for particular scenes in the main content. For example, the supplemental content can indicate a particular program is a movie and a particular scene, such as a chase scene, is located at a particular time code, such as thirty minutes and ten seconds into the movie. As mentioned above, and described in greater detail subsequently, the supplemental content can be embedded in the main content signal and/or located at a separate
supplemental content source 206, such as at a web site. - Broadly speaking, the dynamic
interactive content system 202 associates the supplemental content with the main content that the supplemental data describes. In this manner, applications can search the supplemental content stored on the persistent storage device to find main content having particular characteristics. For example, an interactive application may need a car chase scene to enhance the visual experience of a racing game. In this case, the application can search the supplemental data of the persistent storage for car chase scenes. Once found, the application can read the selected supplemental data to determine the time code for the car chase, and then display the associated main content to the user starting at the particular time code of the chase scene. Thus, the user can experience the video of the prerecorded car chase while utilizing the interactive application. -
FIG. 3 is a block diagram showing an exemplary dynamic interactive content system 202, in accordance with an embodiment of the present invention. As shown in FIG. 3, the dynamic interactive content system 202 includes a CPU 300 coupled to a system bus 302, which connects a plurality of system components. For example, the system bus 302 is coupled to a GPU 304, memory 306, and a receiver 322 having a network interface card 308 and a tuner 310. In addition, the system bus 302 can be coupled to an encode/decode signal processor 312, an operational request source 314, a persistent storage device 316, video/audio output circuitry 318, and a removable media drive 320. - As mentioned above, the dynamic
interactive content system 202 records main content for use in interactive applications. Specifically, the receiver 322 is utilized to receive and route main content and supplemental content signals to the appropriate modules of the dynamic interactive content system 202. As shown in FIG. 3, the exemplary receiver 322 can include both a network adapter 308, such as a network interface card (NIC), and a tuner 310. - The
tuner 310 can be utilized to lock on to a selected carrier frequency, for example from a television station or radio channel, and filter out the audio and video signals for amplification and display. As such, the tuner 310 can be utilized in conjunction with an antenna, satellite dish, cable outlet, or other transmission receiving source capable of carrying signals from a main content source to the tuner 310. Once received, main content is stored utilizing a persistent storage device, such as the hard drive 316. - Received data often is encoded into a particular file format, for example MPEG format, prior to recording the data on the
hard drive 316. This typically is accomplished using an encoder/decoder unit 312, which is capable of encoding data into a particular file format and decoding data files for presentation. For example, in one embodiment, main content received utilizing the tuner 310 can be provided to the encoder/decoder unit 312, which can encode the main content and then provide the encoded main content to the hard drive 316. In addition, the tuner 310 can provide the main content to the video/audio output circuitry 318, which can display the main content to the user. During subsequent usage, the CPU 300 can transmit a command to the encoder/decoder unit 312 which requests the encoder/decoder unit 312 to read the encoded main content from the hard drive 316. The encoder/decoder unit 312 then can decode the previously encoded data and provide the decoded data to the video/audio output circuitry for presentation to the user. - Main content also can be received utilizing the
NIC 308. For example, an MPEG video or WAV audio file can be downloaded from a main content source such as a web page on the Internet. In this case, the received main content may not require encoding prior to being recorded on the hard drive 316. For example, when the main content is an MPEG video, the MPEG data generally can be saved directly to the hard drive 316. Upon retrieval, the dynamic interactive content system 202 can again utilize the encoder/decoder unit 312 to decode the stored data prior to providing the main content to the video/audio output circuitry 318. In this manner, main content can be stored and retrieved from the hard drive 316 as needed. - As mentioned previously, embodiments of the present invention receive supplemental content in addition to main content. Supplemental content provides information describing aspects, properties, and/or characteristics of a particular main content file. For example, supplemental content can include information such as the channel on which the main content was received, the time it was received, and a description of the scenes or frames included in the main content. In addition, more detailed information can be included in the supplemental content, such as the time code or frame ranges of particular scenes or events within the main content.
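The two storage paths described above (tuner data passed through the encoder before recording, and pre-encoded downloads such as MPEG files saved directly) can be sketched as follows; the function names and the string stand-ins for encoded data are illustrative assumptions:

```python
def store_main_content(content, disk, encode, already_encoded=False):
    """Record main content on the persistent store, passing raw tuner data
    through the encoder first; pre-encoded downloads are saved directly."""
    disk.append(content if already_encoded else encode(content))

def retrieve_main_content(disk, index, decode):
    """Read a stored entry back and decode it for presentation."""
    return decode(disk[index])
```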
- For example, when the main content is a video of an auto race, the supplemental content may include, for example, the channel on which the auto race was broadcast, the time of the broadcast, information on the drivers involved, and descriptions of the events that occurred during the race, such as when a car loses control and spins out or the end of the race. In addition, the supplemental content can include the time codes of each of these events. For example, the supplemental content can indicate that a spinout occurred fifteen minutes and twenty seconds into the race.
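Resolving such an event time code into a playback offset can be sketched as follows; the "HH:MM:SS" time-code encoding and the field names are illustrative assumptions:

```python
def timecode_to_seconds(timecode):
    """Convert an "HH:MM:SS" time code into a playback offset in seconds."""
    hours, minutes, seconds = (int(part) for part in timecode.split(":"))
    return hours * 3600 + minutes * 60 + seconds

def playback_request(supplemental, event_name):
    """Build a (content_id, start_offset) pair so presentation of the
    associated main content begins at the named event."""
    timecode = supplemental["events"][event_name]
    return supplemental["content_id"], timecode_to_seconds(timecode)
```

For the spinout fifteen minutes and twenty seconds into the race, the time code "00:15:20" resolves to an offset of 920 seconds.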
- The supplemental content can be received either from the main content source or from a separate supplemental content source. When received from the main content source, the supplemental content typically is embedded into the main content signal. For example, supplemental content can be carried in the vertical blanking interval (VBI) of a television (TV) frame, as discussed next with reference to
FIG. 4 . -
FIG. 4 is an illustration showing a conventional TV frame 400, which includes a viewable area 402 and a non-viewable VBI 404. The viewable area 402 and VBI 404 of the TV frame 400 each comprises a plurality of frame lines 406. The frame lines 406 of the viewable area 402 are used to display the video portion of a TV signal, while the frame lines sent between each TV video frame comprise the VBI 404. For example, the VBI 404 comprises the last 45 frame lines of each 525-line TV frame 400 in the National Television Standards Committee (NTSC) standard. The VBI 404 allows the TV time to reposition its electron beam from the bottom of the current TV frame to the top of the next TV frame. Further, since the VBI 404 is not used to transmit viewable data, the VBI 404 can be used to transmit additional data along with the viewable TV signal. Thus, the VBI 404 can be utilized to transmit supplemental content describing the main content transmitted in the viewable area 402. - In addition to embedding the supplemental content in the main content signal, embodiments of the present invention can also obtain supplemental content via a network, such as the Internet.
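The VBI carriage described for FIG. 4 can be sketched as follows, using the 525-line/45-line split given for the NTSC frame. Representing each frame line as a byte string is an illustrative simplification; real VBI decoding operates on the analog line signal:

```python
FRAME_LINES = 525  # lines per NTSC TV frame
VBI_LINES = 45     # frame lines that carry no viewable picture

def split_frame(lines):
    """Split one captured frame into its viewable lines and its VBI lines,
    following the 525/45 division described above."""
    viewable = lines[:FRAME_LINES - VBI_LINES]
    vbi = lines[FRAME_LINES - VBI_LINES:]
    return viewable, vbi

def extract_supplemental(vbi):
    """Concatenate the data payloads carried in the VBI lines into the
    supplemental content byte stream sent alongside the picture."""
    return b"".join(vbi)
```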
FIG. 5 is an illustration showing a network-based content delivery configuration 500, in accordance with an embodiment of the present invention. Using the network-based content delivery configuration 500, embodiments of the present invention can obtain supplemental content and/or main content via the network. For example, as shown in FIG. 5, the dynamic interactive content system 202 can be placed in communication with a wide area network such as the Internet 502. In addition, supplemental content sources 206 a-b can be in communication with the Internet 502, thus allowing them to be accessed remotely by the dynamic interactive content system 202. Using the NIC, or other network interface within the receiver, the dynamic interactive content system 202 can receive supplemental content from the network-based supplemental content sources 206 a-b. Generally, the NIC can be a printed circuit board that controls the exchange of data between network nodes at the data link level, and is utilized for communication over a computer network. - For example, a particular
supplemental content source 206 a can be a website, which can be found using a particular universal resource locator (URL). As is well known to those skilled in the art, a URL is an address that defines the route to a file on the World Wide Web or other Internet facility. In this example, the dynamic interactive content system 202 can include the URL to the supplemental content source 206 a on the persistent storage. When needed, the CPU within the dynamic interactive content system 202 can read the URL into system memory, and provide the URL to the NIC. The NIC then facilitates communication with the supplemental content source 206 a via the Internet 502. - The
supplemental content source 206 a can include, for example, a channel and time index 506, which can be utilized to locate supplemental content for a particular related main content using a supplemental content database 508. For example, in one embodiment, the dynamic interactive content system 202 can provide a channel and time to the channel and time index 506 of the supplemental content source 206 a. Generally, the channel and time at which a particular program was broadcast are sufficient to identify the main content. However, other index types can be utilized, for example, program titles, identification numbers, participant names for sporting events, and other forms of main content identification that will be apparent to those skilled in the art after a careful reading of the present disclosure. - Once the main content identification is received, in this case the channel and time for the program, the channel and
time index 506 can search the supplemental content database 508 for supplemental content having a matching channel and time identification. In one embodiment, the supplemental content database 508 can comprise a database of supplemental content entries wherein each supplemental content entry includes identification fields. For example, each supplemental content entry in the supplemental content database 508 can include a channel and time identification field that is searched by the channel and time index 506. Once the matching supplemental content entry is found, the supplemental content source 206 a can return the matching supplemental content to the dynamic interactive content system 202 via the Internet 502. - In addition, as illustrated in
FIG. 5, digital main content sources 204 a-b can be in communication with the Internet 502, thus allowing access by remote dynamic interactive content systems 202. As above, the dynamic interactive content system 202 can receive digital main content from the network-based digital main content sources 204 a-b using the NIC or other network interface within the receiver. - For example, a particular
main content source 204 a can be a website, which can be connected to using a particular URL. For example, the dynamic interactive content system 202 can include the URL to the digital main content source 204 a on the persistent storage. When needed, the CPU within the dynamic interactive content system 202 can read the URL into system memory, and provide the URL to the NIC. The NIC then facilitates communication with the digital main content source 204 a via the Internet 502. - Once the dynamic
interactive content system 202 obtains both a particular main content and the related supplemental content, the supplemental content is associated with the related main content. FIG. 6 is an illustration showing an exemplary persistent storage device 316 of a dynamic interactive content system, in accordance with an embodiment of the present invention. As shown in FIG. 6, a plurality of supplemental content entries 600 a-600 n and a plurality of main content entries 602 a-602 n are stored on the persistent storage device 316. In addition, each supplemental content entry 600 a-600 n is associated with a related main content entry 602 a-602 n. For example, supplemental content entry 600 a is associated with main content entry 602 a, and supplemental content entry 600 b is associated with main content entry 602 b. - As mentioned previously, each
supplemental content entry 600 a-600 n provides information describing the properties, aspects, and/or characteristics of the related main content entry 602 a-602 n. For example, when the main content entry is a video of an auto race, the associated supplemental content may include, for example, the channel on which the auto race was broadcast, the time of the broadcast, information on the drivers involved, and descriptions of the events that occurred during the race, such as when a car loses control and spins out, or the end of the race. In addition, the supplemental content can include the time codes of each of these events. For example, the supplemental content can indicate that a spinout occurred fifteen minutes and twenty seconds into the race. - In this manner, embodiments of the present invention can search the supplemental content present on the system to find particular main content. For example, a user can request a list of boxing events. In this example, embodiments of the present invention can search the
supplemental content entries 600 a-600 n to obtain a list of main content entries 602 a-602 n that are boxing events. Moreover, embodiments of the present invention can utilize the supplemental content entries 600 a-600 n to access specific scenes and events included in a particular main content file. For example, a user can request a list of boxing events wherein at least one knockdown occurred. In this example, embodiments of the present invention can search the supplemental content entries 600 a-600 n to obtain a list of main content entries 602 a-602 n that both are boxing events and include a knockdown event/scene within the entry. - This ability to search supplemental content to find specific main content stored on the persistent storage advantageously allows embodiments of the present invention to utilize the main content in conjunction with interactive applications. Referring back to
FIG. 3, when running an interactive application, removable media such as a CD or DVD is placed into the removable media drive 220. The removable media drive 220 then reads data from the CD and transfers program information to the memory 306. The CPU 300, in conjunction with the GPU 304, then executes the program instructions from memory to run the program. - In addition, the
operational request source 314 typically is in communication with a user input device, such as a game controller, remote controller, keyboard, or other device capable of receiving and transmitting user input data to the dynamic interactive content system 202. Output from the program executing on the CPU 300 generally is provided to the video/audio output circuitry 318 for presentation, typically on a television or other monitor and speaker system. In this manner, users are able to interact with the information presented to them via the operational request source 314. - During operation, embodiments of the present invention can utilize prerecorded main content from the
persistent storage device 316 in conjunction with the interactive application. FIG. 7 is a diagram illustrating dynamic interactive application presentation, in accordance with an embodiment of the present invention. As shown in FIG. 7, embodiments of the present invention can execute an interactive application 700, as discussed above, which receives control information from the operational request source 314. As mentioned above, the operational request source 314 typically is in communication with a user input device, such as a game controller, remote controller, keyboard, or other device capable of receiving and transmitting user input data to the dynamic interactive content system. - The
interactive application 700 is in communication with the persistent storage device 316, and thus can read and write data to the persistent storage device 316. As a result, the interactive application 700 can utilize specified main content entries stored on the persistent storage device 316 during operation. More specifically, the interactive application 700 can search the supplemental content entries 600 on the persistent storage device 316 for main content having specific aspects, properties, and/or characteristics. Once found, the specific main content entry 602 can be read into memory from the persistent storage device 316. Once data from the main content entry 602 is in memory, the interactive application 700 can utilize the main content or a portion of the main content during its presentation to the user. That is, the presentation data 702 generated using the interactive application 700 can include both application data 704, which is generated via the interactive application 700, and prerecorded data 706, which is obtained from the main content 602. - For example, the
interactive application 700 can be an interactive football game. In this example, the interactive application 700 could, for example, after the user scores a touchdown, display video of an actual football touchdown. The interactive football application can include a default video to display. However, prior to displaying the default video, the interactive football application can search the persistent storage device 316 for video of a touchdown between the same football teams the user is playing. Once a matching supplemental content entry 600 is found, the interactive football application can examine the supplemental content entry 600 for the time code of the touchdown. - The interactive football application can then read the associated
main content 602 into memory, and display the main content starting at the time code of the football touchdown. In this manner, the user can experience a visual display of an actual touchdown between the teams the user is playing in the game. Moreover, the video can be of a recent football game, even though the interactive football application may have been created months or years prior to the displayed football game. Further, if the interactive football application does not find supplemental data matching its request, the application can display the default video. It should be noted that the selected data from the main content can be presented to the user in any form. For example, a video can be displayed in a small window or projected onto a three-dimensional computer-generated object instead of using the full screen. - In addition to video, embodiments of the present invention allow interactive applications to present still images and audio to the user. For example, an interactive application can search for a particular sound to play for the user, or for a particular image to display to the user. Moreover, as mentioned above, embodiments of the present invention enhance the user's ability to search for specific programs to watch by allowing the user to search for specific data included in the supplemental content entries. For example, the user can search for particular scenes in movies, such as car chase scenes, utilizing the embodiments of the present invention.
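The selection logic in the football example above can be sketched as a search over stored supplemental content entries, followed by a fallback to the default video. The entry schema and all names here (`select_touchdown_clip`, `teams`, `events`, `time_code`) are illustrative assumptions; the patent describes the behavior, not a particular data layout.

```python
# Minimal sketch, assuming supplemental entries carry the teams involved and a
# list of tagged events with time codes (seconds from the start of the game).
DEFAULT_CLIP = ("default_touchdown.vid", 0)

def select_touchdown_clip(supplemental_entries, team_a, team_b):
    """Return (main_content_id, start_time_code) for a stored touchdown between
    the two teams, or the default clip when no stored game matches."""
    for supp in supplemental_entries:
        # Match the pair of teams regardless of home/away order.
        if {team_a, team_b} == set(supp["teams"]):
            for ev in supp["events"]:
                if ev["type"] == "touchdown":
                    return (supp["main_content_id"], ev["time_code"])
    return DEFAULT_CLIP

games = [
    {"main_content_id": "602d", "teams": ["Sharks", "Bears"],
     "events": [{"type": "touchdown", "time_code": 742}]},
]

clip = select_touchdown_clip(games, "Bears", "Sharks")
```

With the hypothetical data above, a request for Bears vs. Sharks resolves to the stored clip starting at its touchdown time code, while any other pairing falls back to `DEFAULT_CLIP`.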
- Embodiments of the present invention can also analyze main content for particular semantic information. Once found, a matching object from the main content can be utilized in the interactive application. For example, an application can examine a particular frame or scene of the main content for a face, using predefined semantic definitions. Once the face is found, the face data from the main content can be utilized in the interactive application; for example, the face can be displayed to the user.
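The semantic search just described can be sketched as scanning units of main content (frames, here) against a predefined semantic predicate and returning the first match for use by the interactive application. The `detect_face` stand-in and the dictionary frame representation are assumptions for illustration only; a real system would apply an actual image-analysis routine to decoded video frames.

```python
def detect_face(frame):
    """Stand-in semantic predicate: in this sketch, frames are dicts tagged
    during authoring rather than pixel data run through a detector."""
    return frame.get("contains_face", False)

def find_semantic_object(frames, predicate):
    """Scan frames in order and return (index, frame) for the first frame
    satisfying the semantic predicate, or None if nothing matches."""
    for index, frame in enumerate(frames):
        if predicate(frame):
            return index, frame
    return None

frames = [
    {"contains_face": False},
    {"contains_face": True, "data": "face pixels"},
]

hit = find_semantic_object(frames, detect_face)
```

The interactive application would then extract the matched object's data (here the `"data"` field) for display or further use.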
- Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
Claims (14)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/619,180 US20070113263A1 (en) | 2002-10-02 | 2007-01-02 | Dynamic interactive content system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/264,087 US7171402B1 (en) | 2002-10-02 | 2002-10-02 | Dynamic interactive content system |
US11/619,180 US20070113263A1 (en) | 2002-10-02 | 2007-01-02 | Dynamic interactive content system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/264,087 Continuation US7171402B1 (en) | 2002-10-02 | 2002-10-02 | Dynamic interactive content system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20070113263A1 true US20070113263A1 (en) | 2007-05-17 |
Family
ID=32068292
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/264,087 Expired - Lifetime US7171402B1 (en) | 2002-10-02 | 2002-10-02 | Dynamic interactive content system |
US11/619,180 Abandoned US20070113263A1 (en) | 2002-10-02 | 2007-01-02 | Dynamic interactive content system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/264,087 Expired - Lifetime US7171402B1 (en) | 2002-10-02 | 2002-10-02 | Dynamic interactive content system |
Country Status (8)
Country | Link |
---|---|
US (2) | US7171402B1 (en) |
EP (1) | EP1547376B1 (en) |
JP (1) | JP4317131B2 (en) |
KR (1) | KR100720785B1 (en) |
AT (1) | ATE488955T1 (en) |
AU (1) | AU2003275153A1 (en) |
DE (1) | DE60335017D1 (en) |
WO (1) | WO2004032493A1 (en) |
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080159715A1 (en) * | 2007-01-03 | 2008-07-03 | Microsoft Corporation | Contextual linking and out-of-band delivery of related online content |
US20100161764A1 (en) * | 2008-12-18 | 2010-06-24 | Seiko Epson Corporation | Content Information Deliver System |
US20110055888A1 (en) * | 2009-08-31 | 2011-03-03 | Dell Products L.P. | Configurable television broadcast receiving system |
US20120017236A1 (en) * | 2010-07-13 | 2012-01-19 | Sony Computer Entertainment Inc. | Supplemental video content on a mobile device |
US20120122434A1 (en) * | 2007-07-30 | 2012-05-17 | Bindu Rama Rao | Interactive media management server |
US20140201769A1 (en) * | 2009-05-29 | 2014-07-17 | Zeev Neumeier | Systems and methods for identifying video segments for displaying contextually relevant content |
US9154942B2 (en) | 2008-11-26 | 2015-10-06 | Free Stream Media Corp. | Zero configuration communication between a browser and a networked media device |
US9258383B2 (en) | 2008-11-26 | 2016-02-09 | Free Stream Media Corp. | Monetization of television audience data across muliple screens of a user watching television |
US9386356B2 (en) | 2008-11-26 | 2016-07-05 | Free Stream Media Corp. | Targeting with television audience data across multiple screens |
US9392429B2 (en) | 2006-11-22 | 2016-07-12 | Qualtrics, Llc | Mobile device and system for multi-step activities |
US9519772B2 (en) | 2008-11-26 | 2016-12-13 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US9560425B2 (en) | 2008-11-26 | 2017-01-31 | Free Stream Media Corp. | Remotely control devices over a network without authentication or registration |
US9838753B2 (en) | 2013-12-23 | 2017-12-05 | Inscape Data, Inc. | Monitoring individual viewing of television events using tracking pixels and cookies |
US9906834B2 (en) | 2009-05-29 | 2018-02-27 | Inscape Data, Inc. | Methods for identifying video segments and displaying contextually targeted content on a connected television |
US9955192B2 (en) | 2013-12-23 | 2018-04-24 | Inscape Data, Inc. | Monitoring individual viewing of television events using tracking pixels and cookies |
US9961388B2 (en) | 2008-11-26 | 2018-05-01 | David Harrison | Exposure of public internet protocol addresses in an advertising exchange server to improve relevancy of advertisements |
US9986279B2 (en) | 2008-11-26 | 2018-05-29 | Free Stream Media Corp. | Discovery, access control, and communication with networked services |
US10080062B2 (en) | 2015-07-16 | 2018-09-18 | Inscape Data, Inc. | Optimizing media fingerprint retention to improve system resource utilization |
US10116972B2 (en) | 2009-05-29 | 2018-10-30 | Inscape Data, Inc. | Methods for identifying video segments and displaying option to view from an alternative source and/or on an alternative device |
US10169455B2 (en) | 2009-05-29 | 2019-01-01 | Inscape Data, Inc. | Systems and methods for addressing a media database using distance associative hashing |
US10171754B2 (en) | 2010-07-13 | 2019-01-01 | Sony Interactive Entertainment Inc. | Overlay non-video content on a mobile device |
US10192138B2 (en) | 2010-05-27 | 2019-01-29 | Inscape Data, Inc. | Systems and methods for reducing data density in large datasets |
US10279255B2 (en) | 2010-07-13 | 2019-05-07 | Sony Interactive Entertainment Inc. | Position-dependent gaming, 3-D controller, and handheld as a remote |
US10334324B2 (en) | 2008-11-26 | 2019-06-25 | Free Stream Media Corp. | Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device |
US10375451B2 (en) | 2009-05-29 | 2019-08-06 | Inscape Data, Inc. | Detection of common media segments |
US10405014B2 (en) | 2015-01-30 | 2019-09-03 | Inscape Data, Inc. | Methods for identifying video segments and displaying option to view from an alternative source and/or on an alternative device |
US10419541B2 (en) | 2008-11-26 | 2019-09-17 | Free Stream Media Corp. | Remotely control devices over a network without authentication or registration |
US10482349B2 (en) | 2015-04-17 | 2019-11-19 | Inscape Data, Inc. | Systems and methods for reducing data density in large datasets |
US10567823B2 (en) | 2008-11-26 | 2020-02-18 | Free Stream Media Corp. | Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device |
US10631068B2 (en) | 2008-11-26 | 2020-04-21 | Free Stream Media Corp. | Content exposure attribution based on renderings of related content across multiple devices |
US10803474B2 (en) | 2006-11-22 | 2020-10-13 | Qualtrics, Llc | System for creating and distributing interactive advertisements to mobile devices |
US10873788B2 (en) | 2015-07-16 | 2020-12-22 | Inscape Data, Inc. | Detection of common media segments |
US10880340B2 (en) | 2008-11-26 | 2020-12-29 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US10902048B2 (en) | 2015-07-16 | 2021-01-26 | Inscape Data, Inc. | Prediction of future views of video segments to optimize system resource utilization |
US10949458B2 (en) | 2009-05-29 | 2021-03-16 | Inscape Data, Inc. | System and method for improving work load management in ACR television monitoring system |
US10977693B2 (en) | 2008-11-26 | 2021-04-13 | Free Stream Media Corp. | Association of content identifier of audio-visual data with additional data through capture infrastructure |
US10983984B2 (en) | 2017-04-06 | 2021-04-20 | Inscape Data, Inc. | Systems and methods for improving accuracy of device maps using media viewing data |
US11256386B2 (en) | 2006-11-22 | 2022-02-22 | Qualtrics, Llc | Media management system supporting a plurality of mobile devices |
US11308144B2 (en) | 2015-07-16 | 2022-04-19 | Inscape Data, Inc. | Systems and methods for partitioning search indexes for improved efficiency in identifying media segments |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4538328B2 (en) * | 2003-01-31 | 2010-09-08 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Mutual application control to improve playback performance of stored interactive TV applications |
US9418704B2 (en) | 2003-09-17 | 2016-08-16 | Hitachi Maxell, Ltd. | Program, recording medium, and reproducing apparatus |
US9047624B1 (en) * | 2004-08-16 | 2015-06-02 | Advertising.Com Llc | Auditing of content related events |
US8103546B1 (en) | 2004-08-16 | 2012-01-24 | Lightningcast Llc | Advertising content delivery |
KR20090000647A (en) * | 2007-03-15 | 2009-01-08 | 삼성전자주식회사 | Method and apparatus for displaying interactive data for real time |
US9831966B2 (en) | 2007-06-18 | 2017-11-28 | Nokia Technologies Oy | Method and device for continuation of multimedia playback |
CN101809962B (en) * | 2007-09-25 | 2015-03-25 | 爱立信电话股份有限公司 | Method and arrangement relating to a media structure |
US8094168B2 (en) * | 2007-10-03 | 2012-01-10 | Microsoft Corporation | Adding secondary content to underutilized space on a display device |
AU2009256066B2 (en) * | 2008-06-06 | 2012-05-17 | Deluxe Media Inc. | Methods and systems for use in providing playback of variable length content in a fixed length framework |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6263505B1 (en) * | 1997-03-21 | 2001-07-17 | United States Of America | System and method for supplying supplemental information for video programs |
US6351599B1 (en) * | 1996-03-04 | 2002-02-26 | Matsushita Electric Industrial, Co., Ltd. | Picture image selecting and display device |
US20020042293A1 (en) * | 2000-10-09 | 2002-04-11 | Ubale Ajay Ganesh | Net related interactive quiz game |
US20020099800A1 (en) * | 2000-11-27 | 2002-07-25 | Robert Brainard | Data mark and recall system and method for a data stream |
US6601103B1 (en) * | 1996-08-22 | 2003-07-29 | Intel Corporation | Method and apparatus for providing personalized supplemental programming |
US6993284B2 (en) * | 2001-03-05 | 2006-01-31 | Lee Weinblatt | Interactive access to supplementary material related to a program being broadcast |
US20060174289A1 (en) * | 2004-10-29 | 2006-08-03 | Theberge James P | System for enabling video-based interactive applications |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5778367A (en) * | 1995-12-14 | 1998-07-07 | Network Engineering Software, Inc. | Automated on-line information service and directory, particularly for the world wide web |
KR100655248B1 (en) * | 1998-07-23 | 2006-12-08 | 세드나 페이턴트 서비시즈, 엘엘씨 | Interactive user interface |
BR9912386A (en) * | 1998-07-23 | 2001-10-02 | Diva Systems Corp | System and process for generating and using an interactive user interface |
US6598229B2 (en) * | 1998-11-20 | 2003-07-22 | Diva Systems Corp. | System and method for detecting and correcting a defective transmission channel in an interactive information distribution system |
JP3619427B2 (en) * | 1999-11-05 | 2005-02-09 | 株式会社ビューポイントコミュニケーションズ | Information display device |
KR20020062961A (en) * | 1999-12-10 | 2002-07-31 | 유나이티드 비디오 프로퍼티즈, 인크. | Features for use with advanced set-top applications on interactive television systems |
US20010042107A1 (en) * | 2000-01-06 | 2001-11-15 | Palm Stephen R. | Networked audio player transport protocol and architecture |
JP2002056340A (en) | 2000-08-09 | 2002-02-20 | Konami Co Ltd | Game item providing system, its method, and recording medium |
EP1947858B1 (en) * | 2000-10-11 | 2014-07-02 | United Video Properties, Inc. | Systems and methods for supplementing on-demand media |
JP2002200359A (en) | 2000-12-27 | 2002-07-16 | Pioneer Electronic Corp | Network game system and method for providing network game |
WO2002054327A1 (en) | 2001-01-04 | 2002-07-11 | Noks-Com Ltd. | Method and system for a set of interrelated activities converging on a series of collectible virtual objects |
US8458754B2 (en) * | 2001-01-22 | 2013-06-04 | Sony Computer Entertainment Inc. | Method and system for providing instant start multimedia content |
-
2002
- 2002-10-02 US US10/264,087 patent/US7171402B1/en not_active Expired - Lifetime
-
2003
- 2003-09-22 AU AU2003275153A patent/AU2003275153A1/en not_active Abandoned
- 2003-09-22 WO PCT/US2003/029919 patent/WO2004032493A1/en active IP Right Grant
- 2003-09-22 EP EP03759421A patent/EP1547376B1/en not_active Expired - Lifetime
- 2003-09-22 KR KR1020057005727A patent/KR100720785B1/en active IP Right Grant
- 2003-09-22 JP JP2004541609A patent/JP4317131B2/en not_active Expired - Fee Related
- 2003-09-22 AT AT03759421T patent/ATE488955T1/en not_active IP Right Cessation
- 2003-09-22 DE DE60335017T patent/DE60335017D1/en not_active Expired - Lifetime
-
2007
- 2007-01-02 US US11/619,180 patent/US20070113263A1/en not_active Abandoned
Cited By (86)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10846717B2 (en) | 2006-11-22 | 2020-11-24 | Qualtrics, Llc | System for creating and distributing interactive advertisements to mobile devices |
US10659515B2 (en) | 2006-11-22 | 2020-05-19 | Qualtrics, Inc. | System for providing audio questionnaires |
US11128689B2 (en) | 2006-11-22 | 2021-09-21 | Qualtrics, Llc | Mobile device and system for multi-step activities |
US11064007B2 (en) | 2006-11-22 | 2021-07-13 | Qualtrics, Llc | System for providing audio questionnaires |
US10649624B2 (en) | 2006-11-22 | 2020-05-12 | Qualtrics, Llc | Media management system supporting a plurality of mobile devices |
US9392429B2 (en) | 2006-11-22 | 2016-07-12 | Qualtrics, Llc | Mobile device and system for multi-step activities |
US11256386B2 (en) | 2006-11-22 | 2022-02-22 | Qualtrics, Llc | Media management system supporting a plurality of mobile devices |
US10838580B2 (en) | 2006-11-22 | 2020-11-17 | Qualtrics, Llc | Media management system supporting a plurality of mobile devices |
US10686863B2 (en) | 2006-11-22 | 2020-06-16 | Qualtrics, Llc | System for providing audio questionnaires |
US10747396B2 (en) | 2006-11-22 | 2020-08-18 | Qualtrics, Llc | Media management system supporting a plurality of mobile devices |
US10803474B2 (en) | 2006-11-22 | 2020-10-13 | Qualtrics, Llc | System for creating and distributing interactive advertisements to mobile devices |
US20080159715A1 (en) * | 2007-01-03 | 2008-07-03 | Microsoft Corporation | Contextual linking and out-of-band delivery of related online content |
US8478250B2 (en) * | 2007-07-30 | 2013-07-02 | Bindu Rama Rao | Interactive media management server |
US20120122434A1 (en) * | 2007-07-30 | 2012-05-17 | Bindu Rama Rao | Interactive media management server |
US9716736B2 (en) | 2008-11-26 | 2017-07-25 | Free Stream Media Corp. | System and method of discovery and launch associated with a networked media device |
US10425675B2 (en) | 2008-11-26 | 2019-09-24 | Free Stream Media Corp. | Discovery, access control, and communication with networked services |
US9591381B2 (en) | 2008-11-26 | 2017-03-07 | Free Stream Media Corp. | Automated discovery and launch of an application on a network enabled device |
US9589456B2 (en) | 2008-11-26 | 2017-03-07 | Free Stream Media Corp. | Exposure of public internet protocol addresses in an advertising exchange server to improve relevancy of advertisements |
US9686596B2 (en) | 2008-11-26 | 2017-06-20 | Free Stream Media Corp. | Advertisement targeting through embedded scripts in supply-side and demand-side platforms |
US9703947B2 (en) | 2008-11-26 | 2017-07-11 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US9706265B2 (en) | 2008-11-26 | 2017-07-11 | Free Stream Media Corp. | Automatic communications between networked devices such as televisions and mobile devices |
US9560425B2 (en) | 2008-11-26 | 2017-01-31 | Free Stream Media Corp. | Remotely control devices over a network without authentication or registration |
US10986141B2 (en) | 2008-11-26 | 2021-04-20 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US9838758B2 (en) | 2008-11-26 | 2017-12-05 | David Harrison | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US10977693B2 (en) | 2008-11-26 | 2021-04-13 | Free Stream Media Corp. | Association of content identifier of audio-visual data with additional data through capture infrastructure |
US9848250B2 (en) | 2008-11-26 | 2017-12-19 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US9854330B2 (en) | 2008-11-26 | 2017-12-26 | David Harrison | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US9866925B2 (en) | 2008-11-26 | 2018-01-09 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US10880340B2 (en) | 2008-11-26 | 2020-12-29 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US9154942B2 (en) | 2008-11-26 | 2015-10-06 | Free Stream Media Corp. | Zero configuration communication between a browser and a networked media device |
US9961388B2 (en) | 2008-11-26 | 2018-05-01 | David Harrison | Exposure of public internet protocol addresses in an advertising exchange server to improve relevancy of advertisements |
US9967295B2 (en) | 2008-11-26 | 2018-05-08 | David Harrison | Automated discovery and launch of an application on a network enabled device |
US9986279B2 (en) | 2008-11-26 | 2018-05-29 | Free Stream Media Corp. | Discovery, access control, and communication with networked services |
US10032191B2 (en) | 2008-11-26 | 2018-07-24 | Free Stream Media Corp. | Advertisement targeting through embedded scripts in supply-side and demand-side platforms |
US10074108B2 (en) | 2008-11-26 | 2018-09-11 | Free Stream Media Corp. | Annotation of metadata through capture infrastructure |
US10791152B2 (en) | 2008-11-26 | 2020-09-29 | Free Stream Media Corp. | Automatic communications between networked devices such as televisions and mobile devices |
US10771525B2 (en) | 2008-11-26 | 2020-09-08 | Free Stream Media Corp. | System and method of discovery and launch associated with a networked media device |
US10142377B2 (en) | 2008-11-26 | 2018-11-27 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US9167419B2 (en) | 2008-11-26 | 2015-10-20 | Free Stream Media Corp. | Discovery and launch system and method |
US9258383B2 (en) | 2008-11-26 | 2016-02-09 | Free Stream Media Corp. | Monetization of television audience data across muliple screens of a user watching television |
US9386356B2 (en) | 2008-11-26 | 2016-07-05 | Free Stream Media Corp. | Targeting with television audience data across multiple screens |
US9519772B2 (en) | 2008-11-26 | 2016-12-13 | Free Stream Media Corp. | Relevancy improvement through targeting of information based on data gathered from a networked device associated with a security sandbox of a client device |
US10631068B2 (en) | 2008-11-26 | 2020-04-21 | Free Stream Media Corp. | Content exposure attribution based on renderings of related content across multiple devices |
US10567823B2 (en) | 2008-11-26 | 2020-02-18 | Free Stream Media Corp. | Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device |
US9576473B2 (en) | 2008-11-26 | 2017-02-21 | Free Stream Media Corp. | Annotation of metadata through capture infrastructure |
US10419541B2 (en) | 2008-11-26 | 2019-09-17 | Free Stream Media Corp. | Remotely control devices over a network without authentication or registration |
US10334324B2 (en) | 2008-11-26 | 2019-06-25 | Free Stream Media Corp. | Relevant advertisement generation based on a user operating a client device communicatively coupled with a networked media device |
US20100161764A1 (en) * | 2008-12-18 | 2010-06-24 | Seiko Epson Corporation | Content Information Deliver System |
US20140201769A1 (en) * | 2009-05-29 | 2014-07-17 | Zeev Neumeier | Systems and methods for identifying video segments for displaying contextually relevant content |
US10116972B2 (en) | 2009-05-29 | 2018-10-30 | Inscape Data, Inc. | Methods for identifying video segments and displaying option to view from an alternative source and/or on an alternative device |
US9906834B2 (en) | 2009-05-29 | 2018-02-27 | Inscape Data, Inc. | Methods for identifying video segments and displaying contextually targeted content on a connected television |
US11272248B2 (en) | 2009-05-29 | 2022-03-08 | Inscape Data, Inc. | Methods for identifying video segments and displaying contextually targeted content on a connected television |
US10820048B2 (en) | 2009-05-29 | 2020-10-27 | Inscape Data, Inc. | Methods for identifying video segments and displaying contextually targeted content on a connected television |
US11080331B2 (en) | 2009-05-29 | 2021-08-03 | Inscape Data, Inc. | Systems and methods for addressing a media database using distance associative hashing |
US10271098B2 (en) | 2009-05-29 | 2019-04-23 | Inscape Data, Inc. | Methods for identifying video segments and displaying contextually targeted content on a connected television |
US9055309B2 (en) * | 2009-05-29 | 2015-06-09 | Cognitive Networks, Inc. | Systems and methods for identifying video segments for displaying contextually relevant content |
US10949458B2 (en) | 2009-05-29 | 2021-03-16 | Inscape Data, Inc. | System and method for improving work load management in ACR television monitoring system |
US10185768B2 (en) | 2009-05-29 | 2019-01-22 | Inscape Data, Inc. | Systems and methods for addressing a media database using distance associative hashing |
US10375451B2 (en) | 2009-05-29 | 2019-08-06 | Inscape Data, Inc. | Detection of common media segments |
US10169455B2 (en) | 2009-05-29 | 2019-01-01 | Inscape Data, Inc. | Systems and methods for addressing a media database using distance associative hashing |
US20110055888A1 (en) * | 2009-08-31 | 2011-03-03 | Dell Products L.P. | Configurable television broadcast receiving system |
US10192138B2 (en) | 2010-05-27 | 2019-01-29 | Inscape Data, Inc. | Systems and methods for reducing data density in large datasets |
US20120017236A1 (en) * | 2010-07-13 | 2012-01-19 | Sony Computer Entertainment Inc. | Supplemental video content on a mobile device |
US10981055B2 (en) | 2010-07-13 | 2021-04-20 | Sony Interactive Entertainment Inc. | Position-dependent gaming, 3-D controller, and handheld as a remote |
US10279255B2 (en) | 2010-07-13 | 2019-05-07 | Sony Interactive Entertainment Inc. | Position-dependent gaming, 3-D controller, and handheld as a remote |
US10171754B2 (en) | 2010-07-13 | 2019-01-01 | Sony Interactive Entertainment Inc. | Overlay non-video content on a mobile device |
US10609308B2 (en) | 2010-07-13 | 2020-03-31 | Sony Interactive Entertainment Inc. | Overly non-video content on a mobile device |
US9814977B2 (en) * | 2010-07-13 | 2017-11-14 | Sony Interactive Entertainment Inc. | Supplemental video content on a mobile device |
US10284884B2 (en) | 2013-12-23 | 2019-05-07 | Inscape Data, Inc. | Monitoring individual viewing of television events using tracking pixels and cookies |
US9838753B2 (en) | 2013-12-23 | 2017-12-05 | Inscape Data, Inc. | Monitoring individual viewing of television events using tracking pixels and cookies |
US9955192B2 (en) | 2013-12-23 | 2018-04-24 | Inscape Data, Inc. | Monitoring individual viewing of television events using tracking pixels and cookies |
US11039178B2 (en) | 2013-12-23 | 2021-06-15 | Inscape Data, Inc. | Monitoring individual viewing of television events using tracking pixels and cookies |
US10306274B2 (en) | 2013-12-23 | 2019-05-28 | Inscape Data, Inc. | Monitoring individual viewing of television events using tracking pixels and cookies |
US10945006B2 (en) | 2015-01-30 | 2021-03-09 | Inscape Data, Inc. | Methods for identifying video segments and displaying option to view from an alternative source and/or on an alternative device |
US11711554B2 (en) | 2015-01-30 | 2023-07-25 | Inscape Data, Inc. | Methods for identifying video segments and displaying option to view from an alternative source and/or on an alternative device |
US10405014B2 (en) | 2015-01-30 | 2019-09-03 | Inscape Data, Inc. | Methods for identifying video segments and displaying option to view from an alternative source and/or on an alternative device |
US10482349B2 (en) | 2015-04-17 | 2019-11-19 | Inscape Data, Inc. | Systems and methods for reducing data density in large datasets |
US10674223B2 (en) | 2015-07-16 | 2020-06-02 | Inscape Data, Inc. | Optimizing media fingerprint retention to improve system resource utilization |
US10902048B2 (en) | 2015-07-16 | 2021-01-26 | Inscape Data, Inc. | Prediction of future views of video segments to optimize system resource utilization |
US10873788B2 (en) | 2015-07-16 | 2020-12-22 | Inscape Data, Inc. | Detection of common media segments |
US11308144B2 (en) | 2015-07-16 | 2022-04-19 | Inscape Data, Inc. | Systems and methods for partitioning search indexes for improved efficiency in identifying media segments |
US11451877B2 (en) | 2015-07-16 | 2022-09-20 | Inscape Data, Inc. | Optimizing media fingerprint retention to improve system resource utilization |
US11659255B2 (en) | 2015-07-16 | 2023-05-23 | Inscape Data, Inc. | Detection of common media segments |
US10080062B2 (en) | 2015-07-16 | 2018-09-18 | Inscape Data, Inc. | Optimizing media fingerprint retention to improve system resource utilization |
US11971919B2 (en) | 2015-07-16 | 2024-04-30 | Inscape Data, Inc. | Systems and methods for partitioning search indexes for improved efficiency in identifying media segments |
US10983984B2 (en) | 2017-04-06 | 2021-04-20 | Inscape Data, Inc. | Systems and methods for improving accuracy of device maps using media viewing data |
Also Published As
Publication number | Publication date |
---|---|
AU2003275153A1 (en) | 2004-04-23 |
EP1547376A1 (en) | 2005-06-29 |
EP1547376B1 (en) | 2010-11-17 |
JP4317131B2 (en) | 2009-08-19 |
JP2006501767A (en) | 2006-01-12 |
WO2004032493A1 (en) | 2004-04-15 |
KR100720785B1 (en) | 2007-05-23 |
KR20050049516A (en) | 2005-05-25 |
ATE488955T1 (en) | 2010-12-15 |
US7171402B1 (en) | 2007-01-30 |
DE60335017D1 (en) | 2010-12-30 |
Similar Documents
Publication | Title |
---|---|
US7171402B1 (en) | Dynamic interactive content system |
US7681221B2 (en) | Content processing apparatus and content processing method for digest information based on input of content user |
JP4076067B2 (en) | Recording / playback system |
US20050060741A1 (en) | Media data audio-visual device and metadata sharing system |
KR100865042B1 (en) | System and method for creating multimedia description data of a video program, a video display system, and a computer readable recording medium |
US20070101369A1 (en) | Method and apparatus for providing summaries of missed portions of television programs |
US8725757B2 (en) | Information processing apparatus, tuner, and information processing method |
JP2004357184A (en) | Apparatus and method for processing information, and computer program |
US20020174445A1 (en) | Video playback device with real-time on-line viewer feedback capability and method of operation |
US8000578B2 (en) | Method, system, and medium for providing broadcasting service using home server and mobile phone |
US8467657B2 (en) | Incorporating a current event data stream onto a pre-recorded video stream for playback |
US8374487B2 (en) | Information processing for generating an information list of video contents |
US20110081128A1 (en) | Storage medium storing moving-image data that includes mode information, and reproducing apparatus and method |
US7035531B2 (en) | Device and method for supplying commentary information |
JP4645102B2 (en) | Advertisement receiver and advertisement receiving system |
US6806913B2 (en) | Apparatus and method for processing additional information in data broadcast system |
US20090047004A1 (en) | Participant digital disc video interface |
US20040226038A1 (en) | Advertisement method in digital broadcasting |
JP2003125305A (en) | Method and apparatus of watching broadcast program, and watching program for broadcast program |
JP2001251565A (en) | Reception device, information reproduction method for the same, electronic unit and information reproduction method for the same |
JP2007288391A (en) | Hard disk device |
JPH09149328A (en) | Broadcasting video output device |
GB2464685A (en) | Media output with additional information |
JP2006054631A (en) | Program edit reproduction method, program edit reproducing system, and program edit reproduction program |
JP2005086542A (en) | Apparatus and program for editing contents |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY COMPUTER ENTERTAINMENT AMERICA LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT AMERICA INC.;REEL/FRAME:025351/0655 Effective date: 20100401 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: SONY INTERACTIVE ENTERTAINMENT AMERICA LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT AMERICA LLC;REEL/FRAME:038626/0637 Effective date: 20160331 |