US20170094371A1 - Systems and methods for providing a multi-perspective video display - Google Patents

Systems and methods for providing a multi-perspective video display

Info

Publication number
US20170094371A1
US20170094371A1 (application US15/379,173)
Authority
US
United States
Prior art keywords
content
spoken language
audio content
computerized method
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/379,173
Inventor
Debra Hensgen
Ludovic Pierre
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OpenTV Inc
Original Assignee
OpenTV Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/630,646 external-priority patent/US6678463B1/en
Application filed by OpenTV Inc filed Critical OpenTV Inc
Priority to US15/379,173 priority Critical patent/US20170094371A1/en
Assigned to OPENTV, INC. Assignors: PIERRE, LUDOVIC; HENSGEN, DEBRA
Publication of US20170094371A1 publication Critical patent/US20170094371A1/en
Priority to US15/624,020 priority patent/US20170295405A1/en
Priority to US15/623,930 priority patent/US10462530B2/en
Priority to US16/570,985 priority patent/US10869102B2/en


Classifications

    • H04N 21/8106: Monomedia components involving special audio data, e.g. different tracks for different languages
    • G10L 15/005: Speech recognition; language recognition
    • H04N 21/21805: Source of audio or video content enabling multiple viewpoints, e.g. using a plurality of cameras
    • H04N 21/235: Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N 21/4305: Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • H04N 21/4316: Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N 21/4331: Caching operations, e.g. of an advertisement for later insertion during playback
    • H04N 21/4334: Recording operations
    • H04N 21/4335: Housekeeping operations, e.g. prioritizing content for deletion because of storage space restrictions
    • H04N 21/4345: Extraction or processing of SI, e.g. extracting service information from an MPEG stream
    • H04N 21/4349: Demultiplexing of additional data and video streams by extracting from data carousels, e.g. extraction of software modules from a DVB carousel
    • H04N 21/435: Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H04N 21/4351: Processing of additional data involving reassembling additional data, e.g. rebuilding an executable program from recovered modules
    • H04N 21/439: Processing of audio elementary streams
    • H04N 21/4394: Processing of audio elementary streams involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams
    • H04N 21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/6125: Network physical structure or signal processing specially adapted to the downstream path, involving transmission via Internet
    • H04N 21/8166: Monomedia components involving executable data, e.g. software
    • H04N 21/8545: Content authoring for generating interactive applications
    • H04N 5/45: Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
    • H04N 5/76: Television signal recording
    • H04N 7/163: Authorising the user terminal, e.g. by paying; registering the use of a subscription channel, by receiver means only
    • H04N 9/8227: Recording with multiplexing of an additional signal and the colour video signal, the additional signal being at least another television signal

Definitions

  • the present invention relates generally to interactive video delivery mediums such as interactive television, and more particularly, to a system and method for providing multi-perspective instant replay of broadcast material.
  • a broadcast service provider transmits audio-video streams to a viewer's television.
  • Interactive television systems are capable of displaying text and graphic images in addition to typical audio-video programs. They can also provide a number of services, such as commerce via the television, and other interactive applications to viewers.
  • the interactive television signal can include an interactive portion consisting of application code, data, and signaling information, in addition to audio-video portions.
  • the broadcast service provider can combine any or all of this information into a single signal or several signals for transmission to a receiver connected to the viewer's television or the provider can include only a subset of the information, possibly with resource locators.
  • resource locators can be used to indicate alternative sources of interactive and/or audio-video information.
  • the resource locator could take the form of a world wide web universal resource locator (URL).
  • the television signal is generally compressed prior to transmission and transmitted through typical broadcast media such as cable television (CATV) lines or direct satellite transmission systems.
  • Information referenced by resource locators may be obtained over different media, for example, through an always-on return channel, such as a DOCSIS modem.
  • a set top box connected to the television controls the interactive functionality of the television.
  • the set top box receives the signal transmitted by the broadcast service provider, separates the interactive portion from the audio-video portion, and decompresses the respective portions of the signal.
  • the set top box uses interactive information to execute an application while the audio-video information is transmitted to the television.
  • Set top boxes typically include only a limited amount of memory. While this memory is sufficient to execute interactive applications, it is typically not adequate to store the applications for an indefinite period of time. Further, the memory of the set top box is typically too small to accommodate a program which includes large amounts of audio or video data, application code, or other information.
  • Storage devices may be coupled to the set top box to provide additional memory for the storage of video and audio broadcast content.
  • Interactive content such as application code or information relating to television programs is typically broadcast in a repeating format.
  • the pieces of information broadcast in this manner form what is referred to as a “carousel”.
  • Repeating transmission of objects in a carousel allows the reception of those objects by a receiver without requiring a return path from the receivers to the server. If a receiver needs a particular piece of information, it can simply wait until the next time that piece of information is broadcast, and then extract the information from the broadcast stream. If the information were not cyclically broadcast, the receiver would have to transmit a request for the information to the server, thus requiring a return path.
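The carousel behaviour described above can be sketched in a few lines of Python. This is a toy model, not any real DVB carousel API: the server cycles through its objects indefinitely, and a receiver that misses an object simply waits for it to come around again, so no return path is needed.

```python
import itertools

def carousel(objects):
    """Yield (name, payload) pairs cyclically, like a broadcast carousel."""
    yield from itertools.cycle(objects.items())

def receive(stream, wanted):
    """Consume the broadcast stream until the wanted object appears."""
    for name, payload in stream:
        if name == wanted:
            return payload

# Hypothetical carousel contents, for illustration only.
modules = {"app_code": b"\x01\x02", "schedule": b"\x03", "menu": b"\x04"}
stream = carousel(modules)
next(stream)  # the receiver tunes in just after "app_code" was sent ...
payload = receive(stream, "app_code")  # ... so it waits one cycle for it
```

The trade-off this models is the one stated above: cyclic repetition costs broadcast bandwidth but removes the need for a request/response channel.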
  • the content provider may generate multiple video feeds from various angles of the game, for example.
  • the network may select one or more feeds from the multiple video feeds and broadcast the selected video feed(s) to the viewing audience at any given point in time. That is, the network may simultaneously broadcast video tracks that present the same scene, except from a different perspective or send different audio tracks or subtitles if a movie is broadcast in different languages, for example.
  • the viewer may use an interactive application that executes on their set top box to choose between different perspectives. When a viewer requests a change in perspective, the interactive application uses meta-data to determine which packets contain the chosen perspective. It starts delivering packets that contain the newly chosen perspective.
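A minimal sketch of that switch, with invented perspective names and stream identifiers: the meta-data reduces here to a map from perspective to elementary stream identifier, and changing perspective changes which packets are passed on to the decoder.

```python
# Hypothetical meta-data: perspective name -> elementary stream identifier.
PERSPECTIVES = {"main camera": 0x101, "goal camera": 0x102, "blimp": 0x103}

def packets_for(perspective, packets):
    """Select only the packets that carry the chosen perspective."""
    wanted = PERSPECTIVES[perspective]
    return [p for p in packets if p["stream_id"] == wanted]

packets = [
    {"stream_id": 0x101, "data": b"m0"},
    {"stream_id": 0x102, "data": b"g0"},
    {"stream_id": 0x101, "data": b"m1"},
]
# The viewer requests a change of perspective; delivery switches streams.
replay = packets_for("goal camera", packets)
```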
  • FIG. 1 is a diagram illustrating the distribution of television programs and signaling information from a broadcast station to a receiving station.
  • FIG. 2 is a block diagram of a system of the present invention for recording programs received from the broadcast station of FIG. 1 .
  • FIG. 3 is a block diagram illustrating the transfer of data to a storage device coupled to the set top box of FIG. 2 .
  • FIG. 4 is a diagram illustrating three video streams and two audio streams simultaneously sent to a receiving station with one of the audio and one of the video streams sent to a television. Those same streams are also sent to a storage device along with one of the other video streams.
  • FIG. 5 is similar to the diagram of FIG. 4 except that the second video stream is now also displayed in a PIP window along with the first audio and video streams which are displayed in the main picture of the television.
  • FIG. 6 is a diagram similar to the diagram of FIG. 5 except that the second video stream is now shown in the center of the television screen with the first video stream shown in the PIP window.
  • FIG. 6 a is a diagram similar to the diagram of FIG. 6 except that the configuration shown does not require or use a PIP.
  • FIG. 7 is a diagram similar to the diagram of FIG. 6 except that the live broadcast of the second video stream is replaced with a previously broadcast version of the same perspective.
  • FIG. 7 a is a diagram similar to the diagram of FIG. 7 except that the configuration shown does not require or use a PIP, and a recorded audio stream is played instead of a live audio stream as in FIG. 7 .
  • FIG. 8 is a diagram illustrating a first video stream and audio stream displayed on a television and recorded along with a second audio stream.
  • FIG. 9 is a diagram similar to the diagram shown in FIG. 8 except that the first audio stream is replaced with the second audio stream.
  • FIG. 10 is a diagram similar to the diagram of FIG. 9 except that the first video stream and second audio stream are replaced with earlier broadcast versions.
  • FIG. 11 illustrates an example of files and data structures on a storage device.
  • the text accompanying FIG. 11 describes how these data structures could be used to facilitate the viewing of an instant replay from a different perspective.
  • FIG. 12 is a flowchart of a method in accordance with the invention.
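The kind of on-storage data structure FIG. 11 alludes to can be sketched as a per-perspective index of (timestamp, byte offset) pairs. All names and numbers below are hypothetical; they only show how a replay request such as "15 seconds in, goal camera" could be turned into a seek within the recorded file for that perspective.

```python
import bisect

class PerspectiveIndex:
    """Index into one perspective's recorded file, for illustration."""

    def __init__(self):
        self.timestamps = []   # sorted presentation timestamps (seconds)
        self.offsets = []      # byte offset of each timestamp in the file

    def record(self, ts, offset):
        self.timestamps.append(ts)
        self.offsets.append(offset)

    def offset_at(self, ts):
        """Byte offset of the last indexed point at or before `ts`."""
        i = bisect.bisect_right(self.timestamps, ts) - 1
        return self.offsets[max(i, 0)]

idx = PerspectiveIndex()
for ts, off in [(0.0, 0), (10.0, 40_000), (20.0, 81_000)]:
    idx.record(ts, off)

seek_to = idx.offset_at(15.0)  # start the replay from ~15 s in
```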
  • the system 10 includes a broadcast station 12 where audio-video and control information is assembled in the form of digital data and mapped into digital signals for satellite transmission to a receiving station.
  • Control information, such as conditional access information and signaling information (e.g., a list of services available to users, event names, a schedule of events (start time/date and duration), and program specific information), may be added to video, audio, and interactive applications for use by the interactive television system.
  • Control information can describe relationships between streams, such as which streams can be considered as carrying different perspectives of which other streams.
  • the control information is converted by the broadcast station to a format suitable for transmission over broadcast medium.
  • the data may be formatted into packets, for example, which can be transmitted over a digital satellite network.
  • the packets may be multiplexed with other packets for transmission.
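The packetization and multiplexing steps can be sketched as follows. This is a deliberately simplified model: real MPEG-2 transport packets are 188 bytes with richer headers, whereas the 8-byte payload and bare (PID, payload) tuples here are for illustration only, and the round-robin interleave assumes equal-length streams.

```python
PAYLOAD_SIZE = 8  # toy payload size; real TS packets are 188 bytes

def packetize(pid, data):
    """Cut an elementary stream into fixed-size packets tagged with a PID."""
    chunks = [data[i:i + PAYLOAD_SIZE] for i in range(0, len(data), PAYLOAD_SIZE)]
    return [(pid, c) for c in chunks]

def multiplex(*streams):
    """Round-robin interleave packets from several packetized streams."""
    out = []
    for group in zip(*streams):  # streams assumed equal length for brevity
        out.extend(group)
    return out

video = packetize(0x20, b"VIDEO-FRAME-DATA")  # 16 bytes -> 2 packets
audio = packetize(0x21, b"AUDIO-SAMPLES!")    # 14 bytes -> 2 packets
mux = multiplex(video, audio)
```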
  • the signal is typically compressed prior to transmission and may be transmitted through broadcast channels such as cable television lines or direct satellite transmission systems 22 (as shown in FIG. 1 ).
  • the Internet, telephone lines, cellular networks, fiber optics, or other terrestrial transmission media may also be used in place of the cable or satellite system for transmitting broadcasts.
  • the broadcaster may embed service information in the broadcast transport stream, and the service information may list each of the elementary stream identifiers and associate with each identifier an encoding that describes the type of the associated stream (e.g., whether it contains video or audio) and a textual description of the stream that can be understood and used by the user to choose between different perspectives, as described below.
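A hedged sketch of the service information just described: each elementary stream identifier is associated with a type code and a human-readable description the viewer can use to choose a perspective. The field names and identifier values below are invented for illustration, not taken from any real SI table format.

```python
from dataclasses import dataclass

@dataclass
class StreamInfo:
    stream_id: int     # elementary stream identifier
    kind: str          # encoding of the stream type, e.g. "video" / "audio"
    description: str   # textual description shown to the viewer

service_info = [
    StreamInfo(0x101, "video", "Main camera"),
    StreamInfo(0x102, "video", "Behind the goal"),
    StreamInfo(0x201, "audio", "English commentary"),
    StreamInfo(0x202, "audio", "French commentary"),
]

def menu(info, kind):
    """Descriptions the viewer can choose between for one media type."""
    return [s.description for s in info if s.kind == kind]

audio_choices = menu(service_info, "audio")
```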
  • the receiving station includes a set top box 16 connected to a storage device 18 , and a television 20 which is used to present programs to a viewer.
  • the set top box 16 is operable to decompress the digital data and display programs to a viewer.
  • the decompressed video signals may be converted into analog signals such as NTSC (National Television System Committee) format signals for television display.
  • Signals sent to the set top box 16 are filtered; of those that meet the filtering requirements, some are used by the processor 30 immediately and others can be placed in local storage such as RAM. Examples of filtering criteria include a particular value in the field reserved for an elementary stream identifier or an originating network identifier.
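That filtering step can be sketched as matching each arriving section against a set of criteria and then either handling it immediately or parking it in RAM. All identifiers and field names here are made up for illustration.

```python
# Hypothetical filter criteria: an elementary stream identifier and an
# originating network identifier that sections must carry to be accepted.
CRITERIA = {"stream_id": 0x44, "network_id": 0x1FFF}

def matches(section):
    return all(section.get(k) == v for k, v in CRITERIA.items())

ram = []  # local storage for sections that are not needed immediately

def handle(section, urgent=False):
    if not matches(section):
        return "dropped"
    if urgent:
        return "processed"   # used by the processor right away
    ram.append(section)      # buffered in RAM for later use
    return "stored"

result = handle({"stream_id": 0x44, "network_id": 0x1FFF, "data": b"x"})
```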
  • the set top box 16 may be used to overlay or combine different signals to form the desired display on the viewer's television 20 .
  • the set top box 16 is configured to record one or more video and/or audio streams simultaneously to allow a viewer to replay a scene which has recently been viewed or heard by a viewer, except from a different perspective.
  • Broadcast station 12 simultaneously broadcasts multiple perspectives for use by viewers that have set top boxes 16 which execute interactive television applications. For example, multiple cameras may be used to record a sporting event and the station may broadcast from the multiple cameras at the same time to allow the viewer to choose between different camera views using an interactive application that executes on their set top box 16 .
  • a broadcaster may also send multiple perspectives of audio tracks in different languages, for example.
  • the multiple video and audio perspectives described here are only examples; a broadcast may contain a plurality of perspectives of many types.
  • the present invention allows a viewer to replay the same scene from a different perspective, while ensuring that the viewer will still be able to view, either simultaneously or at a later time, the portion of the program being broadcast simultaneously with their viewing of the replay.
  • the viewer may request a replay of any combination of audio, video, executables, and data, from either the same or different perspectives as the perspectives previously played.
  • program refers to any broadcast material including television shows, sporting events, news programs, movies, or any other type of broadcast material, or a segment of the material.
  • the material may include only audio, video, data, or any combination thereof.
  • the program may be only a portion of a television show or broadcast (e.g., without commercials or missing a portion of the beginning or end) or may be more than one show, or include commercials for example.
  • viewing as used herein is defined such that viewing of a program begins as soon as a tuner begins filtering data corresponding to a program.
  • the beginning of the viewing preferably corresponds to the beginning of the program.
  • the viewing preferably ends when the program is complete or when the tuner is no longer filtering the frequency corresponding to the program.
  • the recording of a program coincides with the “viewing” of a program and the program is only recorded when a tuner is tuned to the station broadcasting the program.
  • even if the television display is turned off after a viewer has started recording the program, as long as the tuner remains tuned to the station broadcasting the program and a recording of the information broadcast on the same frequencies as those used at the start of the viewing is being made, the viewing is said to continue.
  • the audio-video signals and program control signals received by the set top box 16 correspond to television programs and menu selections that the viewer may access through a user interface.
  • the viewer may control the set top box 16 through an infrared remote control unit, a control panel on the set top box, or a menu displayed on the television screen, for example.
  • system 10 described above and shown in FIG. 1 is only one example of a system used to convey signals to the television 20 .
  • the broadcast network system may be different than described herein without departing from the scope of the invention.
  • the set top box 16 may be used with a receiver or integrated decoder receiver that is capable of decoding video, audio, and data, such as a digital set top box for use with a satellite receiver or satellite integrated decoder receiver that is capable of decoding MPEG video, audio, and data.
  • the set top box 16 may be configured, for example, to receive digital video channels which support broadband communications using Quadrature Amplitude Modulation (QAM) and control channels for two-way signaling and messaging.
  • the digital QAM channels carry compressed and encoded multiprogram MPEG (Motion Picture Expert Group) transport streams.
  • a transport system extracts the desired program from the transport stream and separates the audio, video, and data components, which are routed to devices that process the streams, such as one or more audio decoders, one or more video decoders, and optionally to RAM (or other form of memory) or a hard drive.
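The routing performed by the transport system above can be sketched as follows. This is an illustrative Python model only, not part of the patent; the PID values and the handler structure are hypothetical.

```python
# Illustrative sketch: routing a program's demultiplexed transport-stream
# packets to audio, video, and data components by packet identifier (PID).
# The PID assignments below are hypothetical examples.

VIDEO_PIDS = {0x100}   # hypothetical PID carrying the video elementary stream
AUDIO_PIDS = {0x101}   # hypothetical PID carrying the audio elementary stream
DATA_PIDS = {0x102}    # hypothetical PID carrying application data

def route_packets(packets):
    """Separate a program's (pid, payload) packets into audio, video, and data."""
    routed = {"audio": [], "video": [], "data": []}
    for pid, payload in packets:
        if pid in AUDIO_PIDS:
            routed["audio"].append(payload)
        elif pid in VIDEO_PIDS:
            routed["video"].append(payload)
        elif pid in DATA_PIDS:
            routed["data"].append(payload)
        # packets with any other PID are filtered out
    return routed
```

In a real set top box this separation is performed in hardware by the transport stage; the sketch only shows the PID-based selection it relies on.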
  • the storage device 18 is coupled to the set top box 16 .
  • the storage device 18 is used to provide sufficient storage to record programs that will not fit in the limited amount of main memory (e.g., RAM) typically available in set top boxes.
  • the storage device 18 may comprise any suitable storage device, such as a hard disk drive, a recordable DVD drive, magnetic tape, optical disk, magneto-optical disk, flash memory, or solid state memory, for example.
  • the storage device 18 may be internal to the set top box 16 or connected externally (e.g., through an IEEE 1394-1995 connection) with either a permanent connection or a removable connection. More than one storage device 18 may be attached to the set top box 16 .
  • the set top box 16 and/or storage device 18 may also be included in one package with the television set 20 .
  • FIG. 2 illustrates one embodiment of a system of the present invention used to record programs received from the broadcast station 12 .
  • the set top box 16 generally includes a control unit (e.g., microprocessor), main memory (e.g., RAM), and other components which are necessary to select and decode the received interactive television signal.
  • the set top box 16 includes a front end 26 operable to receive audio, video, and other data from the broadcast station 12 .
  • the broadcast source is fed into the set top box 16 at the front end 26 , which comprises an analog to digital (A/D) converter and tuner/demodulators (not shown).
  • the front end 26 filters out a particular band of frequencies, demodulates it and converts it to a digital format.
  • the digitized output is then sent to a transport stage 28 .
  • the transport stage 28 further processes the data, sending a portion of the data to an audio-visual (AV) stage 34 for display and another portion to the control processor 30 , and filtering out the rest of the data.
  • Control information may also be recorded as broadcast along with the audio-video data or may be first manipulated by software within the set top box 16 .
  • the original broadcast streams, or modifications of these streams may be optionally re-encrypted using a set top box key or algorithm prior to recording.
  • the encrypted video may also be stored as received along with the broadcast CA information.
  • clock information may be translated to a virtual time system prior to recording.
  • An MPEG-2 elementary stream may be de-multiplexed from an MPEG-2 transport stream, then encapsulated as a program stream and recorded.
  • FIG. 3 illustrates the transfer of data from the transport stage 28 to the storage device 18 .
  • the storage device 18 typically contains a plurality of programs which have been recorded by a viewer.
  • the recordings of each perspective are associated with identifying information that may have been copied or modified from the original signaling information.
  • This identifying information may contain bookkeeping information similar to that typically stored in audio/video file systems or hierarchical computer file systems.
  • the identifying information may have various formats and content, as long as it provides sufficient information to allow the viewer, possibly interacting with the system, to uniquely retrieve a particular recorded perspective.
  • the programs may be identified with an ID number and a start time and end time. As described below, the storage may be defragmented periodically so that the programs are stored in a contiguous manner.
  • Direct memory access is preferably used to send data from the transport stage 28 to the storage device 18 .
  • the data that is sent to the control processor 30 may include meta-data which describes the content of the audio-video data streams and may also include application programs and corresponding data that can be executed on the control processor in order to provide interactive television.
  • a copy of data sent from the transport stage 28 to the AV stage 34 is sent to the storage device 18 at the beginning of the viewing.
  • the CPU in the control processor 30 configures a DMA controller to ensure that the data is written to a buffer that is allocated in the storage device 18 .
  • the number of minutes of viewing data to be recorded in the buffer is preferably selected by the viewer; however, the set top box 16 may be preset with a default value such as fifteen minutes.
  • the control processor's CPU calculates the size of the buffer to allocate based upon the number of minutes and the maximum speed at which bits in the transport stream that the viewer is watching will be sent. This maximum speed may be obtained from meta-data sent with the audio-video stream.
  • when the end of the buffer is reached, the CPU in the control processor is interrupted, at which time it will re-configure the DMA controller to start writing at the beginning of the buffer. This design is known as a circular buffer.
  • the buffer is preferably circular to allow contiguous recording and writing over of previously recorded content.
  • when a new event begins, the control processor's CPU will be interrupted. At this time, the CPU may allocate a new buffer or mark the beginning of the new event in the original buffer.
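The buffer sizing and wraparound described in the preceding paragraphs can be sketched as follows. This is an illustrative Python model only; in the set top box the writes are performed by a DMA controller in hardware, and the class and parameter names here are hypothetical.

```python
# Illustrative sketch: a circular viewing buffer sized from the number of
# minutes to retain and the stream's maximum bit rate (taken from meta-data,
# per the description above). Writes wrap over the oldest recorded data.

class CircularBuffer:
    def __init__(self, minutes, max_bits_per_second):
        # bytes needed to hold `minutes` of data at the maximum stream rate
        self.size = minutes * 60 * max_bits_per_second // 8
        self.data = bytearray(self.size)
        self.write_pos = 0

    def write(self, chunk):
        """Write chunk, wrapping to the start of the buffer at the end."""
        for byte in chunk:
            self.data[self.write_pos] = byte
            self.write_pos = (self.write_pos + 1) % self.size

# e.g., fifteen minutes at a hypothetical 64 kbit/s maximum rate
buf = CircularBuffer(minutes=15, max_bits_per_second=64_000)
# buf.size is 15 * 60 * 64_000 // 8 = 7,200,000 bytes
```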
  • the control processor 30 records the multi-perspective streams at a start of the program to store the perspectives in storage device 18 .
  • the perspectives will continue to be recorded and stored within the storage device 18 for a pre-determined period of time (e.g., 15 minutes). If a viewer decides to record the entire viewing after the start of the program, he will select a record option and the processor 30 will allocate space within the storage device 18 . All perspectives will be recorded along with the program that is being viewed. See e.g., U.S. patent application Ser. No. 09/630,646, entitled “System and Method for Incorporating Previously Broadcast Content” and filed Aug. 2, 2000 (Attorney Docket No. OPTVP013), which is incorporated herein by reference in its entirety.
  • the joining of the first and second recorded portions of any given perspective in a common storage area may be implemented either physically or virtually.
  • a physical implementation may include copying the first recorded portion to a location where the second portion has been recorded.
  • a virtual implementation may include the modification of a data structure stored in a storage device. In either case, a viewer watching a replay of any perspective should not be able to detect that the two parts of the perspective were originally stored separately.
  • the portions of the perspective may be physically contiguous or the portions of the perspective may be stored separately in a non-contiguous format as long as the entire recorded program can be played back in a continuous manner (i.e., viewer does not notice a transition between the playback of the first and second portions of the perspective).
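One way to picture the "virtual" joining described above is a segment table that maps the logical byte range of a perspective onto the physical locations of its separately stored portions, so playback reads across the seam continuously. The following Python sketch is illustrative only; the data structure and names are hypothetical, not taken from the patent.

```python
# Illustrative sketch: virtually joining two separately recorded portions of a
# perspective. Segments are listed in playback order; read() spans segment
# boundaries so a viewer cannot detect that the parts were stored separately.

class VirtualRecording:
    def __init__(self):
        self.segments = []   # list of (store, start, length) in playback order
        self.total = 0

    def append_portion(self, store, start, length):
        self.segments.append((store, start, length))
        self.total += length

    def read(self, offset, n):
        """Read n logical bytes starting at offset, crossing segment seams."""
        out = bytearray()
        for store, start, length in self.segments:
            if offset >= length:
                offset -= length   # skip whole segments before the offset
                continue
            take = min(n - len(out), length - offset)
            out += store[start + offset : start + offset + take]
            offset = 0
            if len(out) == n:
                break
        return bytes(out)

rec = VirtualRecording()
disk = b"....FIRST....SECOND"     # hypothetical non-contiguous storage layout
rec.append_portion(disk, 4, 5)    # first recorded portion: b"FIRST"
rec.append_portion(disk, 13, 6)   # second recorded portion: b"SECOND"
```

A read that straddles the two portions, such as `rec.read(3, 6)`, returns bytes from both extents in one continuous result.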
  • the recording of the entire program, including the plurality of perspectives, in the storage device 18 may occur without any action by the viewer. For example, if the viewer rewinds (or performs a similar action on different types of storage media) a portion of one of the recorded perspectives to replay a scene, the entire program along with all of its multiple perspectives may be recorded in the storage device, since the viewer has shown interest in the program.
  • the control information that is broadcast with the program preferably indicates which streams are related to the viewed streams.
  • the set top box 16 , by filtering on the appropriate identifiers in the broadcast MPEG-2 (or DSS or other encoding) packets, can locate all related elementary streams. It sends the streams that the viewer is watching to the television set 20 and records in the storage device 18 the content of these streams, along with the other related streams, including related video, audio, executables, and data. Meta-data that indicates the maximum bit rate for the streams may accompany the elementary or transport streams.
  • the format of the recorded streams may depend upon the hardware support. For example, special purpose hardware inside the set top box 16 may support re-multiplexing of streams or concurrent reads and writes to the storage device 18 , as is well known by those skilled in the art.
  • Broadcast data such as audio and video data, application code, control signals and other types of information may be sent as data objects.
  • when the program is to be consumed (i.e., presented to the viewer), the broadcast data must be parsed to extract data objects from the stream.
  • the program is played. For example, any applications that need to be executed are launched and any audio or video data that needs to be presented to the viewer is played.
  • if the program is stored, the data objects are extracted in the same manner, but they are stored instead of being immediately used to present the program.
  • the recorded program is played back using the stored data objects.
  • the data objects may include “live” data which becomes obsolete if not consumed immediately. If this data is stored and used when the program is played back, the program will, at least in part, be obsolete.
  • live data objects may be stored as references in the program.
  • new live data corresponding to the reference may be obtained and used in place of the data which was live at the time the program was recorded.
  • temporally correct data is used by the interactive application when it executes at a later time.
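The reference mechanism described in the bullets above might be sketched as follows. This is an illustrative Python model only; the record format, the `live` flag, and the lookup service are hypothetical, not defined by the patent.

```python
# Illustrative sketch: recording a *reference* to live data instead of its
# value, so that a replayed interactive application resolves the reference
# against the current live source and sees temporally correct data.

def record_object(obj):
    """At record time, replace a live value with just its reference."""
    if obj.get("live"):
        return {"ref": obj["ref"], "live": True}   # drop the soon-stale value
    return obj

def play_object(stored, live_lookup):
    """At playback, resolve stored references against the live source."""
    if stored.get("live"):
        return live_lookup(stored["ref"])          # fetch the current value
    return stored

# usage: a sports score that was "live" at recording time
recorded = record_object({"ref": "score/game42", "live": True, "value": "3-1"})
current = play_object(recorded, lambda ref: {"ref": ref, "value": "final 5-2"})
# `current` carries the up-to-date value rather than the one broadcast originally
```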
  • FIGS. 4-10 show the set top box 16 receiving three video and two audio streams from the broadcast station 12 .
  • the signals are received from the broadcast station 12 at the tuner in the front end 26 and related streams are sent to demultiplexer and processor 100 .
  • Video streams V 1 , V 2 and audio stream A 1 are all related (e.g., video streams are different camera views of a sporting event and A 1 is the sound track for the announcer) and can be provided in a single transport stream. If all the related streams are provided in one transport stream only one tuner 50 is required.
  • the set top box 16 may include multiple tuners 50 for recording and displaying related streams broadcast in separate transport streams.
  • Related streams are preferably broadcast on a small number of frequencies so that a large number of tuners will not be required within or attached to the set top box 16 .
  • video streams along with multiple audio streams, executable programs, data, and control information may be multiplexed together on a single frequency.
  • FIGS. 4-7 illustrate a case where a viewer requests a replay from a different perspective using a picture-within-picture (PIP) mode. If a viewer wants to see the replay from a different perspective, it can be viewed in a PIP mode without requiring multiple tuners in the set top box 16 or the television 20 . The additional tuner is not required since one of the video or audio streams that had been previously recorded is coming from the storage device 18 . All streams shown are preferably multiplexed on the same frequency. The video or audio can be delivered directly to the AV stage 34 , which is contained in 100 , which itself is inside the set top box 16 , and may be multiplexed with a transport stream that is being delivered via the tuner 50 .
  • 100 represents three components: (i) a demultiplexer; (ii) a processor that directs portions of the broadcast information to other components; and (iii) an AV stage that modulates when necessary (i.e. when the television is analog).
  • the viewer can choose to view only the replay while the set top box 16 buffers, on the storage device 18 , the live broadcast for later delivery, as described below with respect to FIGS. 8-10 .
  • the broadcast station 12 is sending video streams V 1 and V 2 containing two different perspectives and one audio stream A 1 .
  • the two video streams may be two different camera positions at a baseball game, for example.
  • the viewer is currently watching video stream V 1 and listening to audio stream A 1 .
  • the first and second video streams V 1 and V 2 and the audio stream A 1 are automatically recorded.
  • the previously broadcast information is available if a viewer wants to replay, for example, the last play of the game. In particular, with this invention, the viewer can replay this information from any of the previously broadcast perspectives.
  • the viewer may place the set top box into a PIP mode so that the viewer can see a first perspective (video stream V 1 ) displayed in a large central area in the television screen and a second perspective (video stream V 2 ) displayed in a small picture window in the top right hand corner (or some other area) of the television screen ( FIG. 5 ).
  • the viewer may want to see a replay, this time from a perspective different from the one shown in V 1 .
  • the viewer may optionally switch the windows into which the video streams V 1 and V 2 are displayed, as shown in FIG. 6 .
  • Video stream V 1 is now sent to the PIP window and video stream V 2 is sent to the central viewing window.
  • the viewer would give a command (e.g., press a button on the remote control) to rewind the view in the main window while permitting the PIP window to continue displaying the “live” V 1 .
  • the recorded video stream V 2 ′ which is from the same perspective as V 2 , but which was broadcast and recorded earlier, is sent from the storage device 18 to the demultiplexer in 100 which sends the previously recorded stream V 2 ′ along with the current video stream V 1 to the television for display.
  • the viewer may rewind or search through the recording until the beginning of the recording is reached.
  • the viewer may also rewind and display the first video stream V 1 .
  • the broadcast of the remainder of the program may be sent to the storage device 18 since the viewer has shown an interest in the recording. This may be automatic (i.e., program streams are sent to storage device 18 upon a viewer's request for a replay) or may only occur upon receiving a request from the viewer to record the entire program.
  • a viewer may prefer not to be distracted by the live broadcast which is shown as being displayed in the PIP in FIG. 7 . Therefore, the viewer may simply first switch perspectives from V 1 to V 2 as shown in FIG. 6 a. After that, the viewer may “rewind” to an earlier event to see a previous scene from the perspective carried in video stream V 2 .
  • in FIG. 7 a, a copy of the live video stream V 1 is sent only to the storage device, along with the live video stream V 2 and live audio stream A 1 .
  • the recorded streams V 2 ′ and A 1 ′ are the only ones sent, possibly after modulation, to the television.
  • the scenario presented in FIGS. 6 a and 7 a could also be a scenario used by the viewer to switch between a live video perspective and a different, recorded, video perspective, when there is no PIP functionality associated with the viewer's television.
  • FIGS. 8-10 illustrate a case where a program is broadcast with different perspective audio streams.
  • a viewer may be watching an Italian movie that is broadcast with an Italian audio stream A 1 and an English audio stream A 2 .
  • video stream V 1 and audio stream A 1 are presented to the viewer and recorded in the storage device 18 while audio stream A 2 is also recorded in the storage device 18 but not presented to the viewer.
  • the viewer is initially listening to the Italian broadcast (audio stream A 1 ); however, during part of the movie, the viewer does not understand the Italian, so he selects a “switch to English” option from a menu and the viewer now hears the English broadcast (audio stream A 2 ) ( FIG. 9 ).
  • if the viewer wants to hear the soundtrack that accompanied the previous scene in English, he may rewind the tape of the video stream V 1 and audio stream A 2 and watch the scene over again in English ( FIG. 10 ).
  • the video and audio streams V 1 , A 1 , and A 2 will continue to be recorded so that the viewer can see the rest of the movie in a deferred mode, without missing the portion of the movie that was broadcast while the viewer was rewinding and replaying the previous scene.
  • FIG. 11 shows an example of a meta-data file that can be stored along with each recorded perspective.
  • This invention does not require the format shown in this figure; the format is used only as an example of how meta-data can facilitate the playing of an instant replay from a different perspective.
  • Each record of the meta-data file shown contains, among other possible fields, a time and an offset.
  • a program clock reference is frequently, though not periodically, broadcast along with the video.
  • these clock values, along with the offset into the recording of the most recent I-frame (one of three types of MPEG-2 frame encodings that can be used for video), can be recorded as meta-data.
  • P- and B-frames are the other types of MPEG-2 encodings, both of which are typically more compressed than an I-frame.
  • the offset is in terms of bytes measured from the beginning of the file containing the recording of the perspective.
  • the viewer has been watching a live broadcast that contains video perspective V 1 .
  • V 1 is being recorded to a file.
  • other video perspectives including video perspective V 2 , are being recorded to a different file because they represent a different view of the same information.
  • V 2 could be recorded in the same file as long as other information distinguishing V 1 from V 2 is recorded somewhere.
  • the viewer has just seen something interesting on the screen and enters the appropriate commands to cause V 1 to be re-wound to the beginning of the interesting scene.
  • P- and B-frames could also be recorded in the file containing the I-frames from V 1 , and could be used in locating a scene, but they are not used in this example.
  • MPEG-2 is only used as an example; other formats of media and/or data can equally well be used.
  • the location of the I-frame in the recording of perspective V 2 that is nearest to that time needs to be found.
  • the first step here is to locate t 2,k and t 2,k+1 such that t 2,k ≤ t ≤ t 2,k+1 .
  • the search that performs the best in any given case is dependent upon the format of the file and is a well-studied problem. Having these values allows for an approximation of d 2,t .
  • d 2,t = ((d 2,k+1 − d 2,k )(t − t 2,k )/(t 2,k+1 − t 2,k )) + d 2,k
  • the I-frame that is nearest to being d 2,t bytes from the beginning of the file containing the recording of V 2 is used as the starting frame for playing back the recording for the viewer.
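The bracketing search and linear interpolation above can be sketched in Python. The meta-data record values below are hypothetical examples, not taken from the patent; only the interpolation formula follows the description.

```python
# Illustrative sketch: estimate the byte offset d2,t into the recording of
# perspective V2 at time t, from meta-data records of (time, byte offset)
# pairs, by locating the bracketing records and interpolating linearly.
import bisect

def estimate_offset(records, t):
    """records: (time, offset) pairs sorted by time; returns offset estimate at t."""
    times = [time for time, _ in records]
    k = bisect.bisect_right(times, t) - 1        # largest k with times[k] <= t
    k = max(0, min(k, len(records) - 2))         # keep a valid bracketing pair
    (t_k, d_k), (t_k1, d_k1) = records[k], records[k + 1]
    return (d_k1 - d_k) * (t - t_k) / (t_k1 - t_k) + d_k

# hypothetical meta-data for V2: one record every few seconds of broadcast time
v2_records = [(0.0, 0), (4.0, 80_000), (8.0, 200_000)]
estimate_offset(v2_records, 6.0)   # -> 140000.0, midway between the two records
```

The I-frame nearest the estimated offset would then be used as the starting frame, as the following paragraph describes.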
  • FIG. 12 shows a process flow in accordance with the embodiment described herein.
  • the system receives a broadcast including multiple perspectives of a program.
  • the system presents one of the perspectives to the viewer, step 210 , and stores all of the perspectives in a storage device, step 220 .
  • the system stores all of the perspectives, but may be configured to selectively store perspectives based on criteria provided by the viewer (such as an indication of which perspectives the viewer is interested in).
  • the perspectives are stored in a circular buffer, step 260 .
  • Another perspective is presented to the viewer, step 230 , and the presentation of this perspective and the first perspective includes preparation of an audio/video signal for the television, step 250 .
  • the presentation of the other perspective in step 230 may involve searching the stored perspectives, step 240 , and the perspective presented may be one of the stored perspectives.
  • a method and system for processing broadcasts have been disclosed.
  • Software written according to the present invention may be stored in some form of computer-readable medium, such as memory or CD-ROM, or transmitted over a network, and executed by a processor. Additionally, where methods have been disclosed, various sequences of steps may be possible, and it may be possible to perform such steps simultaneously, without departing from the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Marketing (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Television Signal Processing For Recording (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Methods and systems for providing a multi-perspective video display are presented. In an example embodiment, a media content item including visual content and at least a portion of audio content associated with the visual content is received. The associated audio content includes audio content in a first spoken language and audio content in a second spoken language. The visual content is provided with the audio content in the first spoken language for presentation on a display device. Responsive to a first command from a media application, the visual content is provided along with the audio content in the second spoken language for presentation. Responsive to a second command, after the first command, from the media application to play at least a portion of the visual content, the portion of the visual content is provided along with a corresponding portion of the audio content in the second spoken language for presentation.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 14/479,077, filed on Sep. 5, 2014, which is a continuation of U.S. patent application Ser. No. 13/589,589, filed on Aug. 20, 2012, which is a continuation of U.S. patent application Ser. No. 09/765,965, filed on Jan. 19, 2001, which claims the benefit of priority to U.S. Provisional Application Ser. No. 60/235,529, filed on Sep. 26, 2000, each of which is hereby incorporated herein by reference in its entirety.
  • Each of U.S. patent application Ser. No. 09/630,646, filed on Aug. 2, 2000, and U.S. Provisional Application Ser. No. 60/162,490, filed on Oct. 29, 1999, is hereby incorporated herein by reference in its entirety.
  • FIELD OF THE INVENTION
  • The present invention relates generally to interactive video delivery mediums such as interactive television, and more particularly, to a system and method for providing multi-perspective instant replay of broadcast material.
  • BACKGROUND
  • A broadcast service provider transmits audio-video streams to a viewer's television. Interactive television systems are capable of displaying text and graphic images in addition to typical audio-video programs. They can also provide a number of services, such as commerce via the television, and other interactive applications to viewers. The interactive television signal can include an interactive portion consisting of application code, data, and signaling information, in addition to audio-video portions. The broadcast service provider can combine any or all of this information into a single signal or several signals for transmission to a receiver connected to the viewer's television or the provider can include only a subset of the information, possibly with resource locators. Such resource locators can be used to indicate alternative sources of interactive and/or audio-video information. For example, the resource locator could take the form of a world wide web universal resource locator (URL).
  • The television signal is generally compressed prior to transmission and transmitted through typical broadcast media such as cable television (CATV) lines or direct satellite transmission systems. Information referenced by resource locators may be obtained over different media, for example, through an always-on return channel, such as a DOCSIS modem.
  • A set top box connected to the television controls the interactive functionality of the television. The set top box receives the signal transmitted by the broadcast service provider, separates the interactive portion from the audio-video portion, and decompresses the respective portions of the signal. The set top box uses interactive information to execute an application while the audio-video information is transmitted to the television. Set top boxes typically include only a limited amount of memory. While this memory is sufficient to execute interactive applications, it is typically not adequate to store the applications for an indefinite period of time. Further, the memory of the set top box is typically too small to accommodate a program which includes large amounts of audio or video data, application code, or other information. Storage devices may be coupled to the set top box to provide additional memory for the storage of video and audio broadcast content.
  • Interactive content such as application code or information relating to television programs is typically broadcast in a repeating format. The pieces of information broadcast in this manner form what is referred to as a “carousel”. Repeating transmission of objects in a carousel allows the reception of those objects by a receiver without requiring a return path from the receivers to the server. If a receiver needs a particular piece of information, it can simply wait until the next time that piece of information is broadcast, and then extract the information from the broadcast stream. If the information were not cyclically broadcast, the receiver would have to transmit a request for the information to the server, thus requiring a return path. If a user is initially not interested in the carousel content, but later expresses an interest, the information can be obtained the next time the carousel is broadcast. Since broadcast networks have access only to a limited bandwidth, audio-video content is not broadcast in carousels. There is also insufficient bandwidth and server resources to handle the pulling of the large amounts of data required for video and audio in real-time in response to near-simultaneous requests for broadcast of previously broadcast material from a vast number of television viewers.
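The receiver-side behavior described above, waiting for an object's next repetition in the cycle rather than requesting it over a return path, can be sketched as follows. This Python model is illustrative only; the object identifiers and cycle contents are hypothetical.

```python
# Illustrative sketch: a receiver obtaining an object from a broadcast
# carousel without a return path, by scanning the repeating cycle from the
# slot at which it tuned in until the wanted object comes around again.

def await_object(carousel_cycle, wanted_id, start_slot):
    """Return (slots_waited, payload) for wanted_id in the repeating cycle."""
    n = len(carousel_cycle)
    for waited in range(n):
        obj_id, payload = carousel_cycle[(start_slot + waited) % n]
        if obj_id == wanted_id:
            return waited, payload
    raise KeyError(f"{wanted_id} not carried in this carousel")

# hypothetical carousel carrying application code, schedule data, and a menu
cycle = [("app", "code"), ("sched", "listings"), ("menu", "ui")]
# tuning in at slot 1, the receiver waits two slots for the "app" object
await_object(cycle, "app", 1)   # -> (2, "code")
```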
  • In a broadcast by a television network, such as a broadcast of a sporting event, the content provider may generate multiple video feeds from various angles of the game, for example. The network may select one or more feeds from the multiple video feeds and broadcast the selected video feed(s) to the viewing audience at any given point in time. That is, the network may simultaneously broadcast video tracks that present the same scene, except from a different perspective or send different audio tracks or subtitles if a movie is broadcast in different languages, for example. The viewer may use an interactive application that executes on their set top box to choose between different perspectives. When a viewer requests a change in perspective, the interactive application uses meta-data to determine which packets contain the chosen perspective. It starts delivering packets that contain the newly chosen perspective.
  • As previously described, a viewer cannot request previously broadcast audio or video material due to the limited bandwidth available on broadcast networks. Also, data that accompanies interactive applications sometimes corresponds to audio and video that is currently being broadcast, so it changes frequently. In these cases, the values broadcast as part of the carousel often change and old values are no longer carried in the carousel. Thus, a viewer cannot replay a scene or a sporting event play from a different perspective unless the viewer has already recorded the video stream for the alternate perspective.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram illustrating the distribution of television programs and signaling information from a broadcast station to a receiving station.
  • FIG. 2 is a block diagram of a system of the present invention for recording programs received from the broadcast station of FIG. 1.
  • FIG. 3 is a block diagram illustrating the transfer of data to a storage device coupled to the set top box of FIG. 2.
  • FIG. 4 is a diagram illustrating three video streams and two audio streams simultaneously sent to a receiving station with one of the audio and one of the video streams sent to a television. Those same streams are also sent to a storage device along with one of the other video streams.
  • FIG. 5 is similar to the diagram of FIG. 4 except that the second video stream is now also displayed in a PIP window along with the first audio and video streams which are displayed in the main picture of the television.
  • FIG. 6 is a diagram similar to the diagram of FIG. 5 except that the second video stream is now shown in the center of the television screen with the first video stream shown in the PIP window.
  • FIG. 6a is a diagram similar to the diagram of FIG. 6 except that the configuration shown does not require or use a PIP.
  • FIG. 7 is a diagram similar to the diagram of FIG. 6 except that the live broadcast of the second video stream is replaced with a previously broadcast version of the same perspective.
  • FIG. 7a is a diagram similar to the diagram of FIG. 7 except that the configuration shown does not require or use a PIP, and a recorded audio stream is played instead of a live audio stream as in FIG. 7.
  • FIG. 8 is a diagram illustrating a first video stream and audio stream displayed on a television and recorded along with a second audio stream.
  • FIG. 9 is a diagram similar to the diagram shown in FIG. 8 except that the first audio stream is replaced with the second audio stream.
  • FIG. 10 is a diagram similar to the diagram of FIG. 9 except that the first video stream and second audio stream are replaced with earlier broadcast versions.
  • FIG. 11 illustrates an example of files and data structures on a storage device. The text accompanying FIG. 11 describes how these data structures could be used to facilitate the viewing of an instant replay from a different perspective.
  • FIG. 12 is a flowchart of a method in accordance with the invention.
  • Corresponding reference characters indicate corresponding parts throughout the several views of the drawings.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The following description is presented to enable one of ordinary skill in the art to make and use the invention. Descriptions of specific embodiments and applications are provided only as examples and various modifications will be readily apparent to those skilled in the art. The general principles described herein may be applied to other embodiments and applications without departing from the scope of the invention. Thus, the present invention is not to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein. It will be understood by one skilled in the art that many embodiments are possible, such as the use of a computer system and display to perform the functions and features described herein. For purpose of clarity, the invention will be described in its application to a set top box used with a television, and details relating to technical material that are known in the technical fields related to the invention have not been included.
  • Referring now to the drawings, and first to FIG. 1, a diagram of a television broadcast and receiving system is shown and generally indicated at 10. The system 10 includes a broadcast station 12 where audio-video and control information is assembled in the form of digital data and mapped into digital signals for satellite transmission to a receiving station. Control information such as conditional access information and signaling information (such as a list of services available to a user, event names, a schedule of events (start time/date and duration), and program specific information) may be added to video, audio, and interactive applications for use by the interactive television system. Control information can describe relationships between streams, such as which streams can be considered as carrying different perspectives of which other streams. The control information is converted by the broadcast station to a format suitable for transmission over the broadcast medium. The data may be formatted into packets, for example, which can be transmitted over a digital satellite network. The packets may be multiplexed with other packets for transmission. The signal is typically compressed prior to transmission and may be transmitted through broadcast channels such as cable television lines or direct satellite transmission systems 22 (as shown in FIG. 1). The Internet, telephone lines, cellular networks, fiber optics, or other terrestrial transmission media may also be used in place of the cable or satellite system for transmitting broadcasts.
The broadcaster may embed service information in the broadcast transport stream, and the service information may list each of the elementary stream identifiers and associate with each identifier an encoding that describes the type of the associated stream (e.g., whether it contains video or audio) and a textual description of the stream that can be understood and used by the user to choose between different perspectives, as described below.
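By way of example, the service information described above may be modeled as a table keyed by elementary stream identifier. The following Python sketch is illustrative only; the field names and identifier values are assumptions, not taken from any particular broadcast standard:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StreamInfo:
    stream_id: int                    # elementary stream identifier
    stream_type: str                  # "video", "audio", "teletext", "data", or "executable"
    description: str                  # user-readable text for the perspective menu
    related_to: Optional[int] = None  # id of the stream this one is a perspective of

# Hypothetical service information for a sporting-event broadcast.
service_info = [
    StreamInfo(0x101, "video", "Main camera"),
    StreamInfo(0x102, "video", "Behind home plate", related_to=0x101),
    StreamInfo(0x201, "audio", "English commentary", related_to=0x101),
    StreamInfo(0x202, "audio", "Spanish commentary", related_to=0x101),
]

def perspectives_of(stream_id, info=service_info):
    """Return the streams carrying alternate perspectives of stream_id."""
    return [s for s in info if s.related_to == stream_id]
```

A user interface could present the `description` fields of the returned streams so that the viewer can choose between perspectives.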
  • The receiving station includes a set top box 16 connected to a storage device 18, and a television 20 which is used to present programs to a viewer. The set top box 16 is operable to decompress the digital data and display programs to a viewer. The decompressed video signals may be converted into analog signals such as NTSC (National Television Standards Committee) format signals for television display. Signals sent to the set top box 16 are filtered; of those that meet the filtering requirements, some are used by the processor 30 immediately and others can be placed in local storage such as RAM. Examples of filtering requirements include a particular value in the location reserved for an elementary stream identifier or an originating network identifier. The set top box 16 may be used to overlay or combine different signals to form the desired display on the viewer's television 20.
  • As further described below, the set top box 16 is configured to record one or more video and/or audio streams simultaneously to allow a viewer to replay a scene which has recently been viewed or heard by a viewer, except from a different perspective. Broadcast station 12 simultaneously broadcasts multiple perspectives for use by viewers that have set top boxes 16 which execute interactive television applications. For example, multiple cameras may be used to record a sporting event and the station may broadcast from the multiple cameras at the same time to allow the viewer to choose between different camera views using an interactive application that executes on their set top box 16. A broadcaster may also send multiple perspectives of audio tracks in different languages, for example. The multiple video and audio perspectives are only examples of types of perspectives of which a plurality may be contained in a broadcast. Other examples include multiple teletext streams, perhaps in different languages; multiple executables, perhaps each meant for a different skill level; or multiple data streams. The present invention allows a viewer to replay the same scene from a different perspective, while ensuring that the viewer will still be able to view, either simultaneously or at a later time, the portion of the program being broadcast simultaneously with their viewing of the replay. The viewer may request a replay of any combination of audio, video, executables, and data, from either the same or different perspectives as the perspectives previously played.
  • It is to be understood that the term “program” as used herein refers to any broadcast material including television shows, sporting events, news programs, movies, or any other type of broadcast material, or a segment of the material. The material may include only audio, video, data, or any combination thereof. The program may be only a portion of a television show or broadcast (e.g., without commercials or missing a portion of the beginning or end) or may be more than one show, or include commercials for example. Furthermore, it is to be understood that the term “viewing” as used herein is defined such that viewing of a program begins as soon as a tuner begins filtering data corresponding to a program. If a viewer has tuned to a particular frequency prior to the broadcast of a program, the beginning of the viewing preferably corresponds to the beginning of the program. The viewing preferably ends when the program is complete or when the tuner is no longer filtering the frequency corresponding to the program. Thus, the recording of a program coincides with the “viewing” of a program and the program is only recorded when a tuner is tuned to the station broadcasting the program. In the event that the television display is turned off after a viewer has started recording the program, as long as the tuner is tuned into the station broadcasting the program and a recording of the information broadcast on the same frequencies as those used at the start of the viewing is being made, the viewing is said to continue. The audio-video signals and program control signals received by the set top box 16 correspond to television programs and menu selections that the viewer may access through a user interface. The viewer may control the set top box 16 through an infrared remote control unit, a control panel on the set top box, or a menu displayed on the television screen, for example.
  • It is to be understood that the system 10 described above and shown in FIG. 1 is only one example of a system used to convey signals to the television 20. The broadcast network system may be different than described herein without departing from the scope of the invention.
  • The set top box 16 may be used with a receiver or integrated decoder receiver that is capable of decoding video, audio, and data, such as a digital set top box for use with a satellite receiver or satellite integrated decoder receiver that is capable of decoding MPEG video, audio, and data. The set top box 16 may be configured, for example, to receive digital video channels which support broadband communications using Quadrature Amplitude Modulation (QAM) and control channels for two-way signaling and messaging. The digital QAM channels carry compressed and encoded multiprogram MPEG (Moving Picture Experts Group) transport streams. A transport system extracts the desired program from the transport stream and separates the audio, video, and data components, which are routed to devices that process the streams, such as one or more audio decoders, one or more video decoders, and optionally to RAM (or other form of memory) or a hard drive. It is to be understood that the set top box 16 and storage device 18 may be analog, digital, or both analog and digital.
  • As shown in FIGS. 1 and 2, the storage device 18 is coupled to the set top box 16. The storage device 18 is used to provide sufficient storage to record programs that will not fit in the limited amount of main memory (e.g., RAM) typically available in set top boxes. The storage device 18 may comprise any suitable storage device, such as a hard disk drive, a recordable DVD drive, magnetic tape, optical disk, magneto-optical disk, flash memory, or solid state memory, for example. The storage device 18 may be internal to the set top box 16 or connected externally (e.g., through an IEEE 1394-1995 connection) with either a permanent connection or a removable connection. More than one storage device 18 may be attached to the set top box 16. The set top box 16 and/or storage device 18 may also be included in one package with the television set 20.
  • FIG. 2 illustrates one embodiment of a system of the present invention used to record programs received from the broadcast station 12. The set top box 16 generally includes a control unit (e.g., microprocessor), main memory (e.g., RAM), and other components which are necessary to select and decode the received interactive television signal. As shown in FIG. 2, the set top box 16 includes a front end 26 operable to receive audio, video, and other data from the broadcast station 12. The broadcast source is fed into the set top box 16 at the front end 26, which comprises an analog to digital (A/D) converter and tuner/demodulators (not shown). The front end 26 filters out a particular band of frequencies, demodulates it and converts it to a digital format. The digitized output is then sent to a transport stage 28. The transport stage 28 further processes the data, sending a portion of the data to an audio-visual (AV) stage 34 for display and another portion to the control processor 30, and filtering out the rest of the data.
  • Control information may also be recorded as broadcast along with the audio-video data or may be first manipulated by software within the set top box 16. For example, broadcast CA (conditional access) information may be used to decrypt broadcast video. The original broadcast streams, or modifications of these streams may be optionally re-encrypted using a set top box key or algorithm prior to recording. The encrypted video may also be stored as received along with the broadcast CA information. Also, clock information may be translated to a virtual time system prior to recording. An MPEG-2 elementary stream may be de-multiplexed from an MPEG-2 transport stream, then encapsulated as a program stream and recorded.
  • FIG. 3 illustrates the transfer of data from the transport stage 28 to the storage device 18. The storage device 18 typically contains a plurality of programs which have been recorded by a viewer. The recordings of each perspective are associated with identifying information that may have been copied or modified from the original signaling information. This identifying information may contain bookkeeping information similar to that typically stored in audio/video file systems or hierarchical computer file systems. The identifying information may have various formats and content, as long as it provides sufficient information to allow the viewer, possibly interacting with the system, to uniquely retrieve a particular recorded perspective. The programs may be identified with an ID number and a start time and end time. As described below, the storage may be defragmented periodically so that the programs are stored in a contiguous manner. Direct memory access (DMA) is preferably used to send data from the transport stage 28 to the storage device 18. The data that is sent to the control processor 30 may include meta-data which describes the content of the audio-video data streams and may also include application programs and corresponding data that can be executed on the control processor in order to provide interactive television.
  • A copy of data sent from the transport stage 28 to the AV stage 34 is sent to the storage device 18 at the beginning of the viewing. The CPU in the control processor 30 configures a DMA controller to ensure that the data is written to a buffer that is allocated in the storage device 18. The number of minutes of viewing data to be recorded in the buffer is preferably selected by the viewer; however, the set top box 16 may be preset with a default value such as fifteen minutes. The control processor's CPU calculates the size of the buffer to allocate based upon the number of minutes and the maximum speed at which bits in the transport stream that the viewer is watching will be sent. This maximum speed may be obtained from meta-data sent with the audio-video stream. When the end of the buffer is reached, the CPU in the control processor is interrupted, at which time it will re-configure the DMA controller to start writing at the beginning of the buffer. This design is known as a circular buffer.
  • The buffer is preferably circular to allow contiguous recording and writing over of previously recorded content. When the viewer changes the channel or a TV event (e.g., television program ends) occurs, the control processor's CPU will be interrupted. At this time, the CPU may allocate a new buffer or mark the beginning of the new event in the original buffer. The automatic recording of a program and all related video, audio, and data streams in a storage device at the start of the program without any action by the viewer, allows the viewer to replay a portion of the program from a different perspective.
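The circular buffer described above can be sketched in software as follows. This is a simplified model, assuming byte-at-a-time writes in place of DMA transfers; the sizing rule (minutes of replay multiplied by the maximum stream rate) follows the description above, but the class and parameter names are illustrative:

```python
class CircularBuffer:
    """Fixed-size replay buffer that overwrites its oldest content,
    modeling the DMA-driven buffer allocated in the storage device."""

    def __init__(self, minutes, max_bytes_per_sec):
        # Size is derived from the replay window the viewer chose and
        # the maximum stream rate taken from the broadcast meta-data.
        self.capacity = minutes * 60 * max_bytes_per_sec
        self.data = bytearray(self.capacity)
        self.write_pos = 0
        self.filled = 0

    def write(self, chunk):
        for b in chunk:
            self.data[self.write_pos] = b
            # Wrapping to the start corresponds to the CPU re-arming the
            # DMA controller when the end of the buffer is reached.
            self.write_pos = (self.write_pos + 1) % self.capacity
        self.filled = min(self.filled + len(chunk), self.capacity)
```

Once the buffer has filled, each new write overwrites the oldest recorded content, so the buffer always holds the most recent span of the broadcast.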
  • As previously described, the control processor 30 records the multi-perspective streams at a start of the program to store the perspectives in storage device 18. The perspectives will continue to be recorded and stored within the storage device 18 for a pre-determined period of time (e.g., 15 minutes). If a viewer decides to record the entire viewing after the start of the program, he will select a record option and the processor 30 will allocate space within the storage device 18. All perspectives will be recorded along with the program that is being viewed. See e.g., U.S. patent application Ser. No. 09/630,646, entitled “System and Method for Incorporating Previously Broadcast Content” and filed Aug. 2, 2000 (Attorney Docket No. OPTVP013), which is incorporated herein by reference in its entirety.
  • The joining of the first and second recorded portions of any given perspective in a common storage area may be implemented either physically or virtually. A physical implementation may include copying the first recorded portion to a location where the second portion has been recorded. A virtual implementation may include the modification of a data structure stored in a storage device. In either case, a viewer watching a replay of any perspective should not be able to detect that the two parts of the perspective were originally stored separately. Thus, the portions of the perspective may be physically contiguous or the portions of the perspective may be stored separately in a non-contiguous format as long as the entire recorded program can be played back in a continuous manner (i.e., viewer does not notice a transition between the playback of the first and second portions of the perspective).
  • It is to be understood that the recording of the entire program, including the plurality of perspectives, in the storage device 18 may occur without any action by the viewer. For example, if the viewer rewinds (or performs a similar action on different types of storage media) a portion of one of the recorded perspectives to replay a scene, the entire program along with all of its multiple perspectives may be recorded in the storage device, since the viewer has shown interest in the program.
  • The control information that is broadcast with the program preferably indicates which streams are related to the viewed streams. The set top box 16, by filtering on the appropriate identifiers in the broadcast MPEG-2 (or DSS or other encoding) packets, can locate all related elementary streams. It sends the streams that the viewer is watching to the television set 20 and records in the storage device 18 the content of these streams, along with the other related streams, including related video, audio, executables, and data. Meta-data that indicates the maximum bit rate for the streams may accompany the elementary or transport streams. The format of the recorded streams may depend upon the hardware support. For example, special purpose hardware inside the set top box 16 may support re-multiplexing of streams or concurrent reads and writes to the storage device 18, as is well known by those skilled in the art.
  • Broadcast data such as audio and video data, application code, control signals and other types of information may be sent as data objects. If the program is to be consumed (i.e., presented to the viewer), the broadcast data must be parsed to extract data objects from the stream. When the necessary data objects have been extracted, the program is played. For example, any applications that need to be executed are launched and any audio or video data that needs to be presented to the viewer is played. If the program is stored, the data objects are extracted in the same manner, but they are stored instead of being immediately used to present the program. The recorded program is played back using the stored data objects. The data objects may include “live” data which becomes obsolete if not consumed immediately. If this data is stored and used when the program is played back, the program will, at least in part, be obsolete. Thus, while most of the data objects may be stored as files, live data objects may be stored as references in the program. When the program is played back, new live data corresponding to the reference may be obtained and used in place of the data which was live at the time the program was recorded. Thus, only temporally correct data is used by the interactive application when it executes at a later time. (See e.g., U.S. Provisional Patent Application No. 60/162,490 entitled “RECORDING OF PUSH CONTENT” filed Oct. 29, 1999 (Client Docket No. OTV0033+), which is incorporated herein by reference for all purposes).
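The treatment of live data objects described above can be sketched as follows. The record layout and the fetch mechanism are illustrative assumptions; a real implementation would resolve references against whatever carousel or network source the set top box uses:

```python
class LiveRef:
    """Reference stored in place of 'live' data that would go stale."""

    def __init__(self, fetch):
        self.fetch = fetch  # callable that obtains fresh data at playback

def record(objects):
    """Store static objects directly; store live objects as references."""
    return [LiveRef(o["fetch"]) if o.get("live") else o["data"]
            for o in objects]

def play_back(recorded):
    """Resolve references at playback so only current data is used."""
    return [o.fetch() if isinstance(o, LiveRef) else o for o in recorded]
```

When the recording is replayed, each reference is resolved afresh, so the interactive application sees temporally correct data rather than the values that were live at recording time.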
  • FIGS. 4-10 show the set top box 16 receiving three video and two audio streams from the broadcast station 12. The signals are received from the broadcast station 12 at the tuner in front end 26 and related streams are sent to demultiplexer and processor 100. Video streams V1, V2 and audio stream A1 are all related (e.g., video streams are different camera views of a sporting event and A1 is the sound track for the announcer) and can be provided in a single transport stream. If all the related streams are provided in one transport stream, only one tuner 50 is required. The set top box 16 may include multiple tuners 50 for recording and displaying related streams broadcast in separate transport streams. Related streams are preferably broadcast on a small number of frequencies so that a large number of tuners will not be required within or attached to the set top box 16. For example, a large number (e.g., five) of video streams along with multiple audio streams, executable programs, data, and control information may be multiplexed together on a single frequency.
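The routing of related streams by elementary stream identifier can be sketched as follows. The packet model is a deliberate simplification (real MPEG-2 transport packets are 188 bytes with the identifier encoded in the header), and the identifier values are hypothetical:

```python
def demultiplex(packets, wanted_ids):
    """Group packets by stream id, keeping only the related streams.

    Each packet is modeled as a (stream_id, payload) pair; a real
    demultiplexer would parse the id out of the packet header.
    """
    streams = {sid: [] for sid in wanted_ids}
    for stream_id, payload in packets:
        if stream_id in streams:  # filtering requirement: id matches
            streams[stream_id].append(payload)
    return streams
```

The demultiplexer discards packets whose identifiers are not among the related streams, so only the streams the viewer is watching or recording are processed further.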
  • FIGS. 4-7 illustrate a case where a viewer requests a replay from a different perspective using a picture-within-picture (PIP) mode. If a viewer wants to see the replay from a different perspective, it can be viewed in a PIP mode without requiring multiple tuners in the set top box 16 or the television 20. The additional tuner is not required since one of the video or audio streams that had been previously recorded is coming from the storage device 18. All streams shown are preferably multiplexed on the same frequency. The video or audio can be delivered directly to the AV stage 34, which is contained in component 100 inside the set top box 16, and may be multiplexed with a transport stream that is being delivered via the tuner 50. Note that component 100 represents three components: (i) a demultiplexer; (ii) a processor that directs portions of the broadcast information to other components; and (iii) an AV stage that modulates when necessary (i.e., when the television is analog). Alternatively, the viewer can choose to view only the replay while the set top box 16 buffers, on the storage device 18, the live broadcast for later delivery, as described below with respect to FIGS. 8-10.
  • In FIG. 4, the broadcast station 12 is sending video streams V1 and V2 containing two different perspectives and one audio stream A1. The two video streams may be two different camera positions at a baseball game, for example. The viewer is currently watching video stream V1 and listening to audio stream A1. The first and second video streams V1 and V2 and the audio stream A1 are automatically recorded. Thus, the previously broadcast information is available if a viewer wants to replay, for example, the last play of the game. In particular, with this invention, the viewer can replay this information from any of the previously broadcast perspectives. The viewer may place the set top box into a PIP mode so that the viewer can see a first perspective (video stream V1) displayed in a large central area in the television screen and a second perspective (video stream V2) displayed in a small picture window in the top right hand corner (or some other area) of the television screen (FIG. 5). After an important play in the game (e.g., double play in a baseball game), the viewer may want to see a replay, this time from a perspective different from the one shown in V1. At this time, the viewer may optionally switch the windows into which the video streams V1 and V2 are displayed, as shown in FIG. 6. Video stream V1 is now sent to the PIP window and video stream V2 is sent to the central viewing window. Then the viewer would give a command (e.g., by pressing a button on the remote control) to re-wind the view in the main window while permitting the PIP window to continue displaying the “live” V1 in the PIP window.
  • As shown in FIG. 7, the recorded video stream V2′, which is from the same perspective as V2, but which was broadcast and recorded earlier, is sent from the storage device 18 to the demultiplexer in 100 which sends the previously recorded stream V2′ along with the current video stream V1 to the television for display. The viewer may rewind or search through the recording until the beginning of the recording is reached. The viewer may also rewind and display the first video stream V1. Meanwhile the broadcast of the remainder of the program may be sent to the storage device 18 since the viewer has shown an interest in the recording. This may be automatic (i.e., program streams are sent to storage device 18 upon a viewer's request for a replay) or may only occur upon receiving a request from the viewer to record the entire program.
  • Alternatively, a viewer may prefer not to be distracted by the live broadcast which is shown as being displayed in the PIP in FIG. 7. Therefore, the viewer may simply first switch perspectives from V1 to V2 as shown in FIG. 6a. After that, the viewer may “rewind” to an earlier event to see a previous scene from the perspective carried in video stream V2. This case is shown in FIG. 7a where a copy of the live video stream V1 is sent only to the storage device, along with the live video stream V2 and live audio stream A1. The recorded streams V2′ and A1′ are the only ones sent, possibly after modulation, to the television. The scenario presented in FIGS. 6a and 7a could also be a scenario used by the viewer to switch between a live video perspective and a different, recorded, video perspective, when there is no PIP functionality associated with the viewer's television.
  • FIGS. 8-10 illustrate a case where a program is broadcast with different perspective audio streams. For example, a viewer may be watching an Italian movie that is broadcast with an Italian audio stream A1 and an English audio stream A2. As shown in FIG. 8, video stream V1 and audio stream A1 are presented to the viewer and recorded in the storage device 18 while audio stream A2 is also recorded in the storage device 18 but not presented to the viewer. The viewer is initially listening to the Italian broadcast (audio stream A1); however, during part of the movie, the viewer does not understand the Italian, so he selects a “switch to English” option from a menu and the viewer now hears the English broadcast (audio stream A2) (FIG. 9). If the viewer wants to hear the soundtrack that accompanied the previous scene in English, he may rewind the recording of the video stream V1 and audio stream A2 and watch the scene over again in English (FIG. 10). The video and audio streams V1, A1, and A2 will continue to be recorded so that the viewer can see the rest of the movie in a deferred mode, without missing the portion of the movie that was broadcast while the viewer was rewinding and replaying the previous scene.
  • FIG. 11 shows an example of a meta-data file that can be stored along with each recorded perspective. This invention does not require the format shown in this figure, but the format is only used as an example of how meta-data can facilitate the playing of an instant replay from a different perspective. Each record of the meta-data file shown contains, among other possible fields, a time and an offset. In this example, a program clock reference is frequently, though not periodically, broadcast along with the video. When some of these clock reference values are received by the set top box, their value, along with the offset into the recording of the most recent I-frame (one of 3 types of MPEG-2 frame encodings that can be used for video), can be recorded as meta-data. Again, this is only an example; an actual implementation may make use of P- and B-frames (the other types of MPEG-2 encodings, both of which are typically more compressed than an I-frame). The offset is in terms of bytes measured from the beginning of the file containing the recording of the perspective.
  • In this example, the viewer has been watching a live broadcast that contains video perspective V1. As the viewer watches, that video perspective, V1, is being recorded to a file. Also, other video perspectives, including video perspective V2, are being recorded to a different file because they represent a different view of the same information. Of course, V2 could be recorded in the same file as long as other information distinguishing V1 from V2 is recorded somewhere. The viewer has just seen something interesting on the screen and enters the appropriate commands to cause V1 to be re-wound to the beginning of the interesting scene. The viewer stops V1 when the MPEG-2 I-frame1,t is being used to display the contents of the screen. (Again, this is only an example. P- and B-frames could also be recorded in the file containing the I-frames from V1, and could be used in locating a scene, but they are not used in this example. Also, MPEG-2 is only used as an example; other formats of media and/or data can equally well be used.) The viewer then issues a command that tells the set top box to start playing forward, but from V2 rather than from V1. The set top box must determine which I-frame of V2 it should first cause to be displayed. A simple solution, choosing the I-frame nearest to the same offset as I-frame1,t in the file that contains V2, would only work correctly if both perspectives were sent at the same constant rate, although such an approximation may be useful if the perspectives were sent at approximately the same non-constant rate. A better solution for either variable-rate streams or streams with different constant rates is now presented. This solution uses a linear interpolation, although other well-known classical interpolation methods that are readily available in the open literature may provide a better approximation under some circumstances.
  • First the actual time corresponding to the originally intended playing time of I-frame1,t is approximated. The offset into the file containing V1 where I-frame1,t is located, d1,t, is used for this approximation. In order to approximate this time, t, two consecutive offset values, d1,i and d1,i+1, are searched for in the meta-data file, such that d1,i≦d1,t<d1,i+1. (As a practitioner of the art would know, a binary search would likely find these two consecutive elements the most quickly if the records are fixed length and the elements are stored in consecutive order as shown. A different search would be optimal if a different storage format is used. Again, these are well-known techniques that are extensively documented in the computer science literature.) Once they are located, both t1,i and t1,i+1 will also be known. These values are then used to approximate t. This example uses the linear interpolation formula:

  • t = ((t1,i+1 − t1,i)(d1,t − d1,i)/(d1,i+1 − d1,i)) + t1,i
  • After an approximation for t has been found, the location of the I-frame in the recording of perspective V2 that is nearest to that time needs to be found. The first step here is to locate t2,k and t2,k+1 such that t2,k≦t<t2,k+1. Again, the search that performs the best in any given case is dependent upon the format of the file and is a well-studied problem. Having these values allows for an approximation of d2,t. Once again, this example uses linear interpolation:

  • d2,t = ((d2,k+1 − d2,k)(t − t2,k)/(t2,k+1 − t2,k)) + d2,k
  • Now that an approximation for d2,t is known, the I-frame that is nearest to being d2,t bytes from the beginning of the file containing the recording of V2 is used as the starting frame for playing back the recording for the viewer.
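The two interpolation steps above can be combined into a single routine. The following sketch models the meta-data of each perspective as a list of (time, offset) records sorted by time, and uses a binary search as suggested above; the record layout and function names are assumptions for illustration:

```python
from bisect import bisect_right

def interpolate(x, x0, x1, y0, y1):
    """Linear interpolation: map x in [x0, x1) onto [y0, y1)."""
    return (y1 - y0) * (x - x0) / (x1 - x0) + y0

def bracket(values, x):
    """Find i such that values[i] <= x < values[i+1], by binary search."""
    i = bisect_right(values, x) - 1
    return max(0, min(i, len(values) - 2))

def map_offset(meta1, meta2, d1_t):
    """Map a byte offset in perspective V1 to an offset in V2.

    meta1 and meta2 are lists of (time, offset) records for V1 and V2,
    sorted by time (and hence, within one stream, also by offset).
    """
    times1, offsets1 = zip(*meta1)
    times2, offsets2 = zip(*meta2)
    # Step 1: approximate the play time t of the frame at offset d1_t.
    i = bracket(offsets1, d1_t)
    t = interpolate(d1_t, offsets1[i], offsets1[i + 1],
                    times1[i], times1[i + 1])
    # Step 2: approximate the offset in V2 corresponding to time t.
    k = bracket(times2, t)
    return interpolate(t, times2[k], times2[k + 1],
                       offsets2[k], offsets2[k + 1])
```

For example, if V1 were recorded at roughly 100 bytes per time unit and V2 at roughly 200, an offset midway through V1 would map to an offset midway through V2, at about twice the byte count; the I-frame nearest the returned offset would then be used as the starting frame.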
  • FIG. 12 shows a process flow in accordance with the embodiment described herein. For the sake of clarity, the process has been illustrated with a specific flow, but it should be understood that other sequences are possible and that some may be performed in parallel, without departing from the spirit of the invention. In step 200, the system receives a broadcast including multiple perspectives of a program. The system presents one of the perspectives to the viewer, step 210, and stores all of the perspectives in a storage device, step 220. In the embodiment disclosed, the system stores all of the perspectives, but may be configured to selectively store perspectives based on criteria provided by the viewer (such as an indication of which perspectives the viewer is interested in). The perspectives are stored in a circular buffer, step 260. Another perspective is presented to the viewer, step 230, and the presentation of this perspective and the first perspective includes preparation of an audio/video signal for the television, step 250. The presentation of the other perspective in step 230 may involve searching the stored perspectives, step 240, and the perspective presented may be one of the stored perspectives.
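The process flow of FIG. 12 can be sketched as follows; the data structures are illustrative stand-ins for the broadcast and storage components described above:

```python
def process_broadcast(broadcast, viewer_choice, store):
    """Sketch of the FIG. 12 flow: receive, present, store, switch.

    broadcast maps perspective names to their content; store stands in
    for the circular storage area. Names are illustrative only.
    """
    # Step 200: receive a broadcast including multiple perspectives.
    perspectives = broadcast
    # Step 210: present one of the perspectives to the viewer.
    presented = [perspectives[viewer_choice]]
    # Steps 220/260: store all perspectives (a circular buffer in practice).
    for name, content in perspectives.items():
        store[name] = content
    # Steps 230/240: present another perspective, searched from storage.
    other = next(n for n in store if n != viewer_choice)
    presented.append(store[other])
    return presented
```

Because every perspective is stored as it arrives, the second presentation step can draw on previously broadcast content rather than only on the live streams.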
  • A method and system for processing broadcasts have been disclosed. Software written according to the present invention may be stored in some form of computer-readable medium, such as memory or CD-ROM, or transmitted over a network, and executed by a processor. Additionally, where methods have been disclosed, various sequences of steps may be possible, and it may be possible to perform such steps simultaneously, without departing from the scope of the invention.
  • Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations made to the embodiments without departing from the scope of the present invention. Accordingly, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.

Claims (30)

1. A computerized method comprising:
receiving a media content item including visual content and at least a portion of audio content associated with the visual content, the associated audio content including audio content in a first spoken language and audio content in a second spoken language different from the first spoken language;
providing the visual content with the audio content in the first spoken language for presentation on a display device;
responsive to a first command from a media application configured to allow a user to select the audio content in the second spoken language, providing the visual content along with the audio content in the second spoken language for presentation on the display device, the media application executing on at least one hardware processor of a computer system; and
responsive to a second command, after the first command, from the media application to play at least a portion of the visual content, providing the portion of the visual content along with a corresponding portion of the audio content in the second spoken language for presentation on the display device during the play.
2. The computerized method of claim 1, the play of the at least a portion of the visual content comprising playing at least a portion of the visual content previously presented.
3. The computerized method of claim 2, the receiving of the media content item comprising receiving the visual content, the audio content in the first spoken language, and the audio content in the second spoken language.
4. The computerized method of claim 3, further comprising:
storing the received visual content, the received audio content in the first spoken language, and the received audio content in the second spoken language in a data store of the computer system.
5. The computerized method of claim 4, the storing of the received visual content, the received audio content in the first spoken language, and the received audio content in the second spoken language in the data store of the computer system occurring concurrently as the visual content, the audio content in the first spoken language, and the audio content in the second spoken language are being received.
6. The computerized method of claim 4, further comprising:
further responsive to the second command, accessing the data store to retrieve the portion of the visual content previously presented and the corresponding portion of the audio content in the second spoken language.
7. The computerized method of claim 6, the accessing of the data store to retrieve the portion of the visual content previously presented and the corresponding portion of the audio content in the second spoken language occurring during the play.
8. The computerized method of claim 1, the receiving of the media content item comprising receiving the media content item from a content provider system via a communication network.
9. The computerized method of claim 8, the receiving of the media content item comprising receiving from the content provider system via the communication network, the visual content, the audio content in the first spoken language, and the audio content in the second spoken language.
10. The computerized method of claim 9, the visual content, the audio content in the first spoken language, and the audio content in the second spoken language being received concurrently.
11. The computerized method of claim 10, the received audio content in the first spoken language comprising a first plurality of data packets, and the received audio content in the second spoken language comprising a second plurality of data packets, the first plurality of data packets and the second plurality of data packets being multiplexed in a transport stream from the content provider system.
12. The computerized method of claim 11, the providing of the visual content with the audio content in the first spoken language comprising identifying the first plurality of data packets based on metadata associated with each of the first plurality of data packets, and the providing of the visual content with the audio content in the second spoken language comprising identifying the second plurality of data packets based on metadata associated with each of the second plurality of data packets.
13. The computerized method of claim 11, the first plurality of data packets comprising a first elementary stream of the transport stream for the audio content in the first spoken language, and the second plurality of data packets comprising a second elementary stream of the transport stream for the audio content in the second spoken language.
14. The computerized method of claim 8, the content provider system comprising a satellite broadcast system.
15. The computerized method of claim 8, the content provider system comprising a cable broadcast system.
16. The computerized method of claim 8, the content provider system comprising a terrestrial broadcast system.
17. The computerized method of claim 8, the content provider system comprising an Internet communication system.
18. The computerized method of claim 8, the content provider system comprising a cellular communication system.
19. The computerized method of claim 8, further comprising:
receiving, from the content provider system via the communication network, the media application; and
executing, in response to receiving the media application, the media application.
20. The computerized method of claim 1, the second command from the media application to play at least a portion of the visual content comprising a third command to rewind the visual content while providing at least some of the visual content for presentation on the display device.
21. The computerized method of claim 3, the received media content item further including control information identifying the audio content in the first spoken language and the audio content in the second spoken language as alternative audio content perspectives corresponding to the visual content.
22. The computerized method of claim 21, the control information including a first textual identifier for the first spoken language and a second textual identifier for the second spoken language, the first textual identifier and the second textual identifier to be employed by the media application for presentation on the display device.
23. The computerized method of claim 21, the control information associating the media application with the audio content in the first spoken language and the audio content in the second spoken language.
24. The computerized method of claim 1, the computer system being operated by the user.
25. The computerized method of claim 1, the computer system being communicatively coupled with the display device.
26. The computerized method of claim 1, the computer system comprising a television set-top box, and the display device comprising a television.
27. The computerized method of claim 1, the computer system comprising the display device.
28. The computerized method of claim 27, the display device comprising a television.
29. A media content system comprising:
one or more hardware processors; and
a memory storing instructions that, when executed by at least one of the one or more hardware processors, cause the media content system to perform operations comprising:
receiving a media content item including visual content and at least a portion of audio content associated with the visual content, the associated audio content including audio content in a first spoken language and audio content in a second spoken language different from the first spoken language;
providing the visual content with the audio content in the first spoken language for presentation on a display device;
responsive to a first command from a media application configured to allow a user to select the audio content in the second spoken language, providing the visual content along with the audio content in the second spoken language for presentation on the display device, the media application being executed by at least one of the one or more hardware processors; and
responsive to a second command, after the first command, from the media application to play at least a portion of the visual content, providing the portion of the visual content along with a corresponding portion of the audio content in the second spoken language for presentation on the display device during the play.
30. A non-transitory computer-readable storage medium comprising instructions that, when executed by one or more hardware processors of a machine, cause the machine to perform operations comprising:
receiving a media content item including visual content and at least a portion of audio content associated with the visual content, the associated audio content including audio content in a first spoken language and audio content in a second spoken language different from the first spoken language;
providing the visual content with the audio content in the first spoken language for presentation on a display device;
responsive to a first command from a media application configured to allow a user to select the audio content in the second spoken language, providing the visual content along with the audio content in the second spoken language for presentation on the display device, the media application executing on at least one of the one or more hardware processors; and
responsive to a second command, after the first command, from the media application to play at least a portion of the visual content, providing the portion of the visual content along with a corresponding portion of the audio content in the second spoken language for presentation on the display device during the play.
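Claims 11–13 recite audio for each spoken language carried as its own elementary stream, with packets selected by their associated metadata. As a rough illustration only: the sketch below filters a multiplexed packet list by a hypothetical `stream_id` tag. The packet layout, stream identifiers, and `demux` helper are invented for illustration and are not the MPEG transport-stream format or any format recited in the claims.

```python
# Each packet carries metadata identifying the elementary stream it
# belongs to. The stream ids and language tags below are hypothetical.
transport_stream = [
    {"stream_id": "video", "payload": "v0"},
    {"stream_id": "audio-en", "payload": "a0-en"},
    {"stream_id": "audio-fr", "payload": "a0-fr"},
    {"stream_id": "video", "payload": "v1"},
    {"stream_id": "audio-en", "payload": "a1-en"},
    {"stream_id": "audio-fr", "payload": "a1-fr"},
]


def demux(stream, audio_stream_id):
    """Select the video packets plus the audio packets of one language."""
    return [p["payload"] for p in stream
            if p["stream_id"] in ("video", audio_stream_id)]


# The program first plays with the first spoken language...
print(demux(transport_stream, "audio-en"))
# ...then, responsive to the user's command, with the second.
print(demux(transport_stream, "audio-fr"))
```

Because both audio streams are multiplexed in the same transport stream, switching languages is purely a matter of which packets the receiver selects; no second delivery of the visual content is needed, which is what makes the replay of claim 1 with the newly selected language straightforward.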
US15/379,173 1999-10-29 2016-12-14 Systems and methods for providing a multi-perspective video display Abandoned US20170094371A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/379,173 US20170094371A1 (en) 1999-10-29 2016-12-14 Systems and methods for providing a multi-perspective video display
US15/624,020 US20170295405A1 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display
US15/623,930 US10462530B2 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display
US16/570,985 US10869102B2 (en) 1999-10-29 2019-09-13 Systems and methods for providing a multi-perspective video display

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US16249099P 1999-10-29 1999-10-29
US09/630,646 US6678463B1 (en) 2000-08-02 2000-08-02 System and method for incorporating previously broadcast content into program recording
US23552900P 2000-09-26 2000-09-26
US09/765,965 US8250617B2 (en) 1999-10-29 2001-01-19 System and method for providing multi-perspective instant replay
US13/589,589 US8832756B2 (en) 1999-10-29 2012-08-20 Systems and methods for providing a multi-perspective video display
US14/479,077 US9525839B2 (en) 1999-10-29 2014-09-05 Systems and methods for providing a multi-perspective video display
US15/379,173 US20170094371A1 (en) 1999-10-29 2016-12-14 Systems and methods for providing a multi-perspective video display

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/479,077 Continuation US9525839B2 (en) 1999-10-29 2014-09-05 Systems and methods for providing a multi-perspective video display

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/623,930 Continuation US10462530B2 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display
US15/624,020 Continuation US20170295405A1 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display

Publications (1)

Publication Number Publication Date
US20170094371A1 true US20170094371A1 (en) 2017-03-30

Family

ID=29273564

Family Applications (7)

Application Number Title Priority Date Filing Date
US09/765,965 Active 2027-06-20 US8250617B2 (en) 1999-10-29 2001-01-19 System and method for providing multi-perspective instant replay
US13/589,589 Expired - Fee Related US8832756B2 (en) 1999-10-29 2012-08-20 Systems and methods for providing a multi-perspective video display
US14/479,077 Expired - Lifetime US9525839B2 (en) 1999-10-29 2014-09-05 Systems and methods for providing a multi-perspective video display
US15/379,173 Abandoned US20170094371A1 (en) 1999-10-29 2016-12-14 Systems and methods for providing a multi-perspective video display
US15/624,020 Abandoned US20170295405A1 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display
US15/623,930 Expired - Fee Related US10462530B2 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display
US16/570,985 Expired - Lifetime US10869102B2 (en) 1999-10-29 2019-09-13 Systems and methods for providing a multi-perspective video display

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US09/765,965 Active 2027-06-20 US8250617B2 (en) 1999-10-29 2001-01-19 System and method for providing multi-perspective instant replay
US13/589,589 Expired - Fee Related US8832756B2 (en) 1999-10-29 2012-08-20 Systems and methods for providing a multi-perspective video display
US14/479,077 Expired - Lifetime US9525839B2 (en) 1999-10-29 2014-09-05 Systems and methods for providing a multi-perspective video display

Family Applications After (3)

Application Number Title Priority Date Filing Date
US15/624,020 Abandoned US20170295405A1 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display
US15/623,930 Expired - Fee Related US10462530B2 (en) 1999-10-29 2017-06-15 Systems and methods for providing a multi-perspective video display
US16/570,985 Expired - Lifetime US10869102B2 (en) 1999-10-29 2019-09-13 Systems and methods for providing a multi-perspective video display

Country Status (1)

Country Link
US (7) US8250617B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10462530B2 (en) 1999-10-29 2019-10-29 Opentv, Inc. Systems and methods for providing a multi-perspective video display

Families Citing this family (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10839321B2 (en) 1997-01-06 2020-11-17 Jeffrey Eder Automated data storage system
US7457414B1 (en) 2000-07-21 2008-11-25 The Directv Group, Inc. Super encrypted storage and retrieval of media programs with smartcard generated keys
ATE272924T1 (en) * 2001-01-16 2004-08-15 Nagracard Sa METHOD FOR STORING ENCRYPTED DATA
EP1231782A1 (en) * 2001-02-13 2002-08-14 Sony International (Europe) GmbH Tuning device for a data distribution network
US8909128B2 (en) 2008-04-09 2014-12-09 3D Radio Llc Radio device with virtually infinite simultaneous inputs
US8706023B2 (en) * 2008-01-04 2014-04-22 3D Radio Llc Multi-tuner radio systems and methods
US8699995B2 (en) 2008-04-09 2014-04-15 3D Radio Llc Alternate user interfaces for multi tuner radio device
AU2002247173A1 (en) 2001-02-20 2002-09-04 Caron S. Ellis Enhanced radio systems and methods
US8868023B2 (en) 2008-01-04 2014-10-21 3D Radio Llc Digital radio systems and methods
US7415005B1 (en) * 2001-10-29 2008-08-19 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Ad hoc selection of voice over internet streams
GB0127234D0 (en) * 2001-11-13 2002-01-02 British Sky Broadcasting Ltd Improvements in receivers for television signals
JP2003230127A (en) * 2002-01-31 2003-08-15 Matsushita Electric Ind Co Ltd Digest image specifying system, digest image providing system, digest image specifying method, medium and program
JP3935412B2 (en) * 2002-09-09 2007-06-20 キヤノン株式会社 Receiving apparatus, receiving apparatus control method, and stream data distribution system
US7225458B2 (en) * 2002-11-21 2007-05-29 The Directv Group, Inc. Method and apparatus for ensuring reception of conditional access information in multi-tuner receivers
FR2855705A1 (en) * 2003-05-28 2004-12-03 Thomson Licensing Sa NAVIGATION METHOD FOR SELECTING DOCUMENTS ASSOCIATED WITH IDENTIFIERS, AND RECEIVER IMPLEMENTING THE METHOD.
JP4042631B2 (en) * 2003-06-02 2008-02-06 株式会社日立製作所 Receiving apparatus and receiving method
EP2204805B1 (en) * 2003-10-10 2012-06-06 Sharp Kabushiki Kaisha A content reproducing apparatus, a content recording medium, a control program and a computer-readable recording medium
KR100925295B1 (en) * 2004-06-29 2009-11-04 교세라 가부시키가이샤 Digital broadcast receiving apparatus
WO2006041060A1 (en) * 2004-10-13 2006-04-20 Matsushita Electric Industrial Co., Ltd. Content reception/recording device, method, program, and recording medium
US20060218620A1 (en) * 2005-03-03 2006-09-28 Dinesh Nadarajah Network digital video recorder and method
KR20080031032A (en) * 2005-07-28 2008-04-07 톰슨 라이센싱 Method and apparatus for user adjustable memory for content recording devices
JP2007048348A (en) * 2005-08-08 2007-02-22 Toshiba Corp Information storage medium, information reproducing apparatus, information reproducing method, and information reproducing program
US9325944B2 (en) 2005-08-11 2016-04-26 The Directv Group, Inc. Secure delivery of program content via a removable storage medium
KR100750143B1 (en) * 2005-12-05 2007-08-21 삼성전자주식회사 Method and apparatus for storing digital broadcasting Signal
JP2007281537A (en) * 2006-04-03 2007-10-25 Hitachi Ltd Video recording/reproducing device, and television receiver including the same
JP4797767B2 (en) * 2006-04-17 2011-10-19 船井電機株式会社 Information recording apparatus and information recording control apparatus
US7992175B2 (en) 2006-05-15 2011-08-02 The Directv Group, Inc. Methods and apparatus to provide content on demand in content broadcast systems
US8775319B2 (en) 2006-05-15 2014-07-08 The Directv Group, Inc. Secure content transfer systems and methods to operate the same
US8095466B2 (en) 2006-05-15 2012-01-10 The Directv Group, Inc. Methods and apparatus to conditionally authorize content delivery at content servers in pay delivery systems
US8996421B2 (en) 2006-05-15 2015-03-31 The Directv Group, Inc. Methods and apparatus to conditionally authorize content delivery at broadcast headends in pay delivery systems
US8001565B2 (en) 2006-05-15 2011-08-16 The Directv Group, Inc. Methods and apparatus to conditionally authorize content delivery at receivers in pay delivery systems
US9178693B2 (en) 2006-08-04 2015-11-03 The Directv Group, Inc. Distributed media-protection systems and methods to operate the same
US9225761B2 (en) 2006-08-04 2015-12-29 The Directv Group, Inc. Distributed media-aggregation systems and methods to operate the same
KR101128807B1 (en) * 2006-10-30 2012-03-23 엘지전자 주식회사 Method for displaying broadcast and broadcast receiver capable of implementing the same
TWI420504B (en) * 2007-03-19 2013-12-21 Cyberlink Corp Method and related system capable of multiple displays
US8942536B2 (en) * 2007-09-19 2015-01-27 Nvidia Corporation Video navigation system and method
US8683067B2 (en) * 2007-12-19 2014-03-25 Nvidia Corporation Video perspective navigation system and method
US9838750B2 (en) 2008-08-20 2017-12-05 At&T Intellectual Property I, L.P. System and method for retrieving a previously transmitted portion of television program content
KR20100030392A (en) * 2008-09-10 2010-03-18 삼성전자주식회사 Method and apparatus for transmitting content and method and apparatus for recording content
CN101459836B (en) * 2008-12-29 2011-04-20 中兴通讯股份有限公司 Service processing method and system for content distributing network of interactive network television
US20100287217A1 (en) * 2009-04-08 2010-11-11 Google Inc. Host control of background garbage collection in a data storage device
US8595572B2 (en) 2009-04-08 2013-11-26 Google Inc. Data storage device with metadata command
US8566507B2 (en) * 2009-04-08 2013-10-22 Google Inc. Data storage device capable of recognizing and controlling multiple types of memory chips
US20100262979A1 (en) * 2009-04-08 2010-10-14 Google Inc. Circular command queues for communication between a host and a data storage device
US8656432B2 (en) * 2009-05-12 2014-02-18 At&T Intellectual Property I, L.P. Providing audio signals using a network back-channel
US8655156B2 (en) * 2010-03-02 2014-02-18 Cisco Technology, Inc. Auxiliary audio transmission for preserving synchronized playout with paced-down video
US8300667B2 (en) * 2010-03-02 2012-10-30 Cisco Technology, Inc. Buffer expansion and contraction over successive intervals for network devices
US9762639B2 (en) 2010-06-30 2017-09-12 Brightcove Inc. Dynamic manifest generation based on client identity
US9838450B2 (en) 2010-06-30 2017-12-05 Brightcove, Inc. Dynamic chunking for delivery instances
US8640181B1 (en) 2010-09-15 2014-01-28 Mlb Advanced Media, L.P. Synchronous and multi-sourced audio and video broadcast
JP5586420B2 (en) * 2010-10-26 2014-09-10 株式会社東芝 Video transmission system, transmission processing device, reception processing device, and video transmission method
AU2011201404B1 (en) * 2011-03-28 2012-01-12 Brightcove Inc. Transcodeless on-the-fly ad insertion
US20120284745A1 (en) * 2011-05-06 2012-11-08 Echostar Technologies L.L.C. Apparatus, systems and methods for improving commercial presentation
US9041860B2 (en) 2011-05-31 2015-05-26 Brian K. Buchheit Simultaneously presenting an enhanced and primary stream associated with a digital television broadcast
US9172982B1 (en) * 2011-06-06 2015-10-27 Vuemix, Inc. Audio selection from a multi-video environment
US9740377B1 (en) 2011-06-06 2017-08-22 Vuemix, Inc. Auxiliary information data exchange within a video environment
WO2012173402A2 (en) * 2011-06-14 2012-12-20 삼성전자 주식회사 Method and apparatus for comprising content in broadcasting system
JP2013074458A (en) * 2011-09-28 2013-04-22 Sony Computer Entertainment Inc Information processing device, information processing system, information processing method, television program broadcasting method, program and information storage medium
US8646023B2 (en) 2012-01-05 2014-02-04 Dijit Media, Inc. Authentication and synchronous interaction between a secondary device and a multi-perspective audiovisual data stream broadcast on a primary device geospatially proximate to the secondary device
WO2013122385A1 (en) * 2012-02-15 2013-08-22 Samsung Electronics Co., Ltd. Data transmitting apparatus, data receiving apparatus, data transreceiving system, data transmitting method, data receiving method and data transreceiving method
WO2013122387A1 (en) 2012-02-15 2013-08-22 Samsung Electronics Co., Ltd. Data transmitting apparatus, data receiving apparatus, data transceiving system, data transmitting method, and data receiving method
WO2013122386A1 (en) 2012-02-15 2013-08-22 Samsung Electronics Co., Ltd. Data transmitting apparatus, data receiving apparatus, data transreceiving system, data transmitting method, data receiving method and data transreceiving method
US9256961B2 (en) 2012-06-28 2016-02-09 Here Global B.V. Alternate viewpoint image enhancement
US9256983B2 (en) * 2012-06-28 2016-02-09 Here Global B.V. On demand image overlay
US9112939B2 (en) 2013-02-12 2015-08-18 Brightcove, Inc. Cloud-based video delivery
US20140259079A1 (en) * 2013-03-06 2014-09-11 Eldon Technology Limited Trick play techniques for a picture-in-picture window rendered by a video services receiver
US20140267747A1 (en) * 2013-03-17 2014-09-18 International Business Machines Corporation Real-time sharing of information captured from different vantage points in a venue
US9402051B2 (en) * 2013-06-15 2016-07-26 The SuperGroup Creative Omnimedia, Inc. Apparatus and method for simultaneous live recording through and projecting live video images onto an interactive touch screen
WO2014201466A1 (en) 2013-06-15 2014-12-18 The SuperGroup Creative Omnimedia, Inc. Method and apparatus for interactive two-way visualization using simultaneously recorded and projected video streams
US9973722B2 (en) * 2013-08-27 2018-05-15 Qualcomm Incorporated Systems, devices and methods for displaying pictures in a picture
US10101801B2 (en) * 2013-11-13 2018-10-16 Cisco Technology, Inc. Method and apparatus for prefetching content in a data stream
US9271048B2 (en) * 2013-12-13 2016-02-23 The Directv Group, Inc. Systems and methods for immersive viewing experience
KR101700626B1 (en) * 2013-12-18 2017-01-31 한국전자통신연구원 Multi angle view processing apparatus
US9851299B2 (en) 2014-10-25 2017-12-26 Isle Management Co. Method of analyzing air quality
WO2016197092A1 (en) 2015-06-05 2016-12-08 The SuperGroup Creative Omnimedia, Inc. Imaging and display system and method
JP6701776B2 (en) * 2016-02-15 2020-05-27 船井電機株式会社 Recording device, recording method
US20170257679A1 (en) * 2016-03-01 2017-09-07 Tivo Solutions Inc. Multi-audio annotation
US10848818B2 (en) * 2018-09-03 2020-11-24 Vanco International, Llc Sensing based audio signal injection
PL3953008T3 (en) 2020-06-22 2024-10-07 Audiomob Ltd Adding audio content to digital works
BR112022026072A2 (en) * 2020-06-22 2023-01-17 Audiomob Ltd SUBMISSION OF AUDIO CONTENT FOR DIGITAL WORKS

Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794465A (en) * 1986-05-12 1988-12-27 U.S. Philips Corp. Method of and apparatus for recording and/or reproducing a picture signal and an associated audio signal in/from a record carrier
US6104997A (en) * 1998-04-22 2000-08-15 Grass Valley Group Digital audio receiver with multi-channel swapping
US6233393B1 (en) * 1996-09-27 2001-05-15 Sony Corporation Apparatus for transmitting data in isochronous and asynchronous manners, an apparatus for receiving the same, and a system and method for such transmitting and receiving of such data
US6292226B1 (en) * 1996-08-28 2001-09-18 Matsushita Electric Industrial Co., Ltd. Broadcast receiver selectively using navigation information multiplexed on transport stream and recording medium recording the method of the same
US6295093B1 (en) * 1996-05-03 2001-09-25 Samsung Electronics Co., Ltd. Closed-caption broadcasting and receiving method and apparatus thereof suitable for syllable characters
US20010044726A1 (en) * 2000-05-18 2001-11-22 Hui Li Method and receiver for providing audio translation data on demand
US20010048756A1 (en) * 1998-11-09 2001-12-06 Identigene, Inc. Method of newborn identification and tracking
US20010048736A1 (en) * 2000-06-05 2001-12-06 Walker David L. Communication system for delivering and managing content on a voice portal platform
US20020057380A1 (en) * 2000-06-28 2002-05-16 Pace Micro Technology Plc Broadcast data receiver with dual tuning capability
US20020073171A1 (en) * 1999-06-16 2002-06-13 Mcdowall Ian E. Internet radio receiver with linear tuning interface
US20020129374A1 (en) * 1991-11-25 2002-09-12 Michael J. Freeman Compressed digital-data seamless video switching system
US6496205B1 (en) * 1996-06-03 2002-12-17 Webtv Networks, Inc. User interface for controlling audio functions in a web browser
US20020194364A1 (en) * 1996-10-09 2002-12-19 Timothy Chase Aggregate information production and display system
US20030063218A1 (en) * 1995-11-13 2003-04-03 Gemstar Development Corporation Method and apparatus for displaying textual or graphic data on the screen of television receivers
US6611958B1 (en) * 1999-08-06 2003-08-26 Sony Corporation Electronic program guide feature for AV system
US20030208771A1 (en) * 1999-10-29 2003-11-06 Debra Hensgen System and method for providing multi-perspective instant replay
US6734759B2 (en) * 2001-12-05 2004-05-11 Agilent Technologies, Inc. Droop compensation filter having high pass characteristic
US20050165942A1 (en) * 2000-05-12 2005-07-28 Sonicbox, Inc. System and method for limiting dead air time in internet streaming media delivery
US6934759B2 (en) * 1999-05-26 2005-08-23 Enounce, Inc. Method and apparatus for user-time-alignment for broadcast works
US20050283819A1 (en) * 1998-10-15 2005-12-22 Matsushita Electric Industrial Co., Ltd. Digital broadcast system
US7000245B1 (en) * 1999-10-29 2006-02-14 Opentv, Inc. System and method for recording pushed data
US7051360B1 (en) * 1998-11-30 2006-05-23 United Video Properties, Inc. Interactive television program guide with selectable languages
US20070061845A1 (en) * 2000-06-29 2007-03-15 Barnes Melvin L Jr Portable Communication Device and Method of Use
US7231268B1 (en) * 1999-08-17 2007-06-12 Samsung Electronics Co., Ltd. Method of assigning audio channel identification, method for selecting audio channel using the same, and optical recording and reproducing apparatus suitable therefor
US20080010658A1 (en) * 1997-03-31 2008-01-10 Abbott Michael J System and method for media stream indexing and synchronization
US7412533B1 (en) * 1997-03-31 2008-08-12 West Corporation Providing a presentation on a network having a plurality of synchronized media types
US20080287059A1 (en) * 1999-03-08 2008-11-20 Anderson Jr Tazwell L Video/audio system and method enabling a user to select different views and sounds associated with an event
US7486926B2 (en) * 2000-03-28 2009-02-03 Affinity Labs Of Texas, Llc Content delivery system and method
US7565680B1 (en) * 2000-06-30 2009-07-21 Comcast Ip Holdings I, Llc Advanced set top terminal having a video call feature
US7613315B2 (en) * 2005-03-04 2009-11-03 Sennheiser Communications A/S Configurable headset
US7757252B1 (en) * 1998-07-20 2010-07-13 Thomson Licensing S.A. Navigation system for a multichannel digital television system
US8185929B2 (en) * 1994-10-12 2012-05-22 Cooper J Carl Program viewing apparatus and method
US8528019B1 (en) * 1999-11-18 2013-09-03 Koninklijke Philips N.V. Method and apparatus for audio/data/visual information

Family Cites Families (104)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR1224806A (en) 1957-07-25 1960-06-27 Chemical process of castration of animals
FR1266521A (en) 1960-05-30 1961-07-17 Fonderie Soc Gen De Advanced sprayer
BE729880A (en) 1968-03-15 1969-08-18
GB1224806A (en) 1968-10-23 1971-03-10 Stephanois Rech Mec A method for formation of a layer of titanium carbide at the surfaces of a component of titanium or titanium alloy and parts treated by said method
US4965825A (en) * 1981-11-03 1990-10-23 The Personalized Mass Media Corporation Signal processing apparatus and methods
JPS60113301A (en) * 1983-11-25 1985-06-19 Victor Co Of Japan Ltd Recording and reproducing device
US4847700A (en) 1987-07-16 1989-07-11 Actv, Inc. Interactive television system for providing full motion synched compatible audio/visual displays from transmitted television signals
US4847699A (en) 1987-07-16 1989-07-11 Actv, Inc. Method for providing an interactive full motion synched compatible audio/visual television display
US4847698A (en) 1987-07-16 1989-07-11 Actv, Inc. Interactive television system for providing full motion synched compatible audio/visual displays
US4918516A (en) 1987-10-26 1990-04-17 501 Actv, Inc. Closed circuit television system having seamless interactive television programming and expandable user participation
USRE34340E (en) 1987-10-26 1993-08-10 Actv, Inc. Closed circuit television system having seamless interactive television programming and expandable user participation
JPH01156519A (en) 1987-12-14 1989-06-20 Murata Mach Ltd Spinning method and apparatus therefor
US5164839A (en) * 1988-12-27 1992-11-17 Explore Technology, Inc. Method for handling audio/video source information
US5213156A (en) * 1989-12-27 1993-05-25 Elge Ab Heat exchanger and a method for its fabrication
DE4002646A1 (en) 1990-01-30 1991-08-01 Wilhelms Rolf E SOUND AND / OR IMAGE SIGNAL RECORDING DEVICE
US5068733A (en) 1990-03-20 1991-11-26 Bennett Richard H Multiple access television
WO1992022983A2 (en) * 1991-06-11 1992-12-23 Browne H Lee Large capacity, random access, multi-source recorder player
US5861881A (en) 1991-11-25 1999-01-19 Actv, Inc. Interactive computer system for providing an interactive presentation with personalized video, audio and graphics responses for multiple viewers
US5724091A (en) * 1991-11-25 1998-03-03 Actv, Inc. Compressed digital data interactive program system
US7079176B1 (en) * 1991-11-25 2006-07-18 Actv, Inc. Digital interactive system for providing full interactivity with live programming events
US5561723A (en) * 1992-03-09 1996-10-01 Tektronix, Inc. Localized image compression calculation method and apparatus to control anti-aliasing filtering in 3-D manipulation of 2-D video images
JP3465272B2 (en) 1992-08-28 2003-11-10 ソニー株式会社 Digital data recording device and recording method
US5371551A (en) * 1992-10-29 1994-12-06 Logan; James Time delayed digital video system using concurrent recording and playback
FR2700908B1 (en) 1993-01-26 1995-02-24 Thomson Consumer Electronics Buffer television receiver.
US5333135A (en) 1993-02-01 1994-07-26 North American Philips Corporation Identification of a data stream transmitted as a sequence of packets
IT1266521B1 (en) 1993-05-18 1996-12-30 Marchiori D Srl IMPROVEMENTS TO THE SYSTEMS FOR CREATING AIR DATA TEST BENCHES FOR AIRCRAFT AND RESULTING TEST BENCHES.
KR960007947B1 (en) * 1993-09-17 1996-06-17 엘지전자 주식회사 Karaoke-cd and audio control apparatus by using that
EP0836191B1 (en) 1993-10-29 1999-03-03 Kabushiki Kaisha Toshiba Method and apparatus for reproducing data from a recording medium
US5539920A (en) 1994-04-28 1996-07-23 Thomson Consumer Electronics, Inc. Method and apparatus for processing an audio video interactive signal
US5563648A (en) 1994-04-28 1996-10-08 Thomson Consumer Electronics, Inc. Method for controlling execution of an audio video interactive program
US5701383A (en) 1994-05-20 1997-12-23 Gemstar Development Corporation Video time-shifting apparatus
US5640453A (en) * 1994-08-11 1997-06-17 Stanford Telecommunications, Inc. Universal interactive set-top controller for downloading and playback of information and entertainment services
US5613032A (en) * 1994-09-02 1997-03-18 Bell Communications Research, Inc. System and method for recording, playing back and searching multimedia events wherein video, audio and text can be searched and retrieved
US5926205A (en) 1994-10-19 1999-07-20 Imedia Corporation Method and apparatus for encoding and formatting data representing a video program to provide multiple overlapping presentations of the video program
US5600368A (en) * 1994-11-09 1997-02-04 Microsoft Corporation Interactive television system and method for viewer control of multiple camera viewpoints in broadcast programming
US5574845A (en) * 1994-11-29 1996-11-12 Siemens Corporate Research, Inc. Method and apparatus video data management
US5617565A (en) 1994-11-29 1997-04-01 Hitachi America, Ltd. Broadcast interactive multimedia system
US5543853A (en) * 1995-01-19 1996-08-06 At&T Corp. Encoder/decoder buffer control for variable bit-rate channel
US5729471A (en) * 1995-03-31 1998-03-17 The Regents Of The University Of California Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene
JP3572595B2 (en) * 1995-07-21 2004-10-06 ソニー株式会社 Electronic program guide display control apparatus and method
US6307868B1 (en) * 1995-08-25 2001-10-23 Terayon Communication Systems, Inc. Apparatus and method for SCDMA digital data transmission using orthogonal codes and a head end modem with no tracking loops
US6356555B1 (en) * 1995-08-25 2002-03-12 Terayon Communications Systems, Inc. Apparatus and method for digital data transmission using orthogonal codes
GB9603332D0 (en) * 1996-02-14 1996-04-17 Thomson Consumer Electronics Interface for digital recorder and display
TW436777B (en) 1995-09-29 2001-05-28 Matsushita Electric Ind Co Ltd A method and an apparatus for reproducing bitstream having non-sequential system clock data seamlessly therebetween
US5862312A (en) 1995-10-24 1999-01-19 Seachange Technology, Inc. Loosely coupled mass storage computer cluster
US5956088A (en) 1995-11-21 1999-09-21 Imedia Corporation Method and apparatus for modifying encoded digital video for improved channel utilization
US6836295B1 (en) * 1995-12-07 2004-12-28 J. Carl Cooper Audio to video timing measurement for MPEG type television systems
US5930526A (en) * 1996-01-24 1999-07-27 Intel Corporation System for progressive transmission of compressed video including video data of first type of video frame played independently of video data of second type of video frame
US6788314B1 (en) * 1996-03-22 2004-09-07 Interval Research Corporation Attention manager for occupying the peripheral attention of a person in the vicinity of a display device
US6529680B1 (en) * 1996-04-26 2003-03-04 Mitsubishi Digital Electronics America, Inc. Device for selecting and controlling a plurality of signal sources in a television system
KR100212134B1 (en) * 1996-05-03 1999-08-02 윤종용 Soft scroll method of viewer selection type caption display
CN1178478C (en) * 1996-05-03 2004-12-01 三星电子株式会社 Viewer selecting type captions broadcasting and receiving method and equipment adapted for syllabic language
US6065050A (en) * 1996-06-05 2000-05-16 Sun Microsystems, Inc. System and method for indexing between trick play and normal play video streams in a video delivery system
US5903816A (en) 1996-07-01 1999-05-11 Thomson Consumer Electronics, Inc. Interactive television system and method for displaying web-like stills with hyperlinks
JPH1063555A (en) 1996-08-19 1998-03-06 Hitachi Ltd File managing method
US6008867A (en) * 1996-08-26 1999-12-28 Ultrak, Inc. Apparatus for control of multiplexed video system
US5999698A (en) * 1996-09-30 1999-12-07 Kabushiki Kaisha Toshiba Multiangle block reproduction system
US5917830A (en) * 1996-10-18 1999-06-29 General Instrument Corporation Splicing compressed packetized digital video streams
EP0880983B1 (en) * 1996-11-22 2005-10-05 Sega Enterprises, Ltd. Game device, game displaying method, game result evaluating method, and recording medium for recording game program
KR100238668B1 (en) * 1996-11-28 2000-01-15 윤종용 Digital video player
JPH10234007A (en) 1996-12-18 1998-09-02 Sony Corp Recording and reproducing device
US6177931B1 (en) 1996-12-19 2001-01-23 Index Systems, Inc. Systems and methods for displaying and recording control interface with television programs, video, advertising information and program scheduling information
JP3177470B2 (en) 1996-12-20 2001-06-18 三洋電機株式会社 Playback device
JP3845119B2 (en) 1997-01-06 2006-11-15 ベルサウス インテレクチュアル プロパティー コーポレーション Method and system for tracking network usage
US6637032B1 (en) 1997-01-06 2003-10-21 Microsoft Corporation System and method for synchronizing enhancing content with a video program using closed captioning
ATE331390T1 (en) 1997-02-14 2006-07-15 Univ Columbia OBJECT-BASED AUDIOVISUAL TERMINAL AND CORRESPONDING BITSTREAM STRUCTURE
JP3402993B2 (en) 1997-03-03 2003-05-06 松下電器産業株式会社 Multi-channel recording device and multi-channel reproducing device
JP4363671B2 (en) 1997-03-20 2009-11-11 ソニー株式会社 Data reproducing apparatus and data reproducing method
US6177930B1 (en) 1997-03-28 2001-01-23 International Business Machines Corp. System and method for enabling a user to move between cyclically transmitted image streams
US6118480A (en) * 1997-05-05 2000-09-12 Flashpoint Technology, Inc. Method and apparatus for integrating a digital camera user interface across multiple operating modes
KR100547928B1 (en) 1997-05-21 2006-02-02 코닌클리케 필립스 일렉트로닉스 엔.브이. Transmission and reception of television programs
US6046818A (en) * 1997-06-03 2000-04-04 Adobe Systems Incorporated Imposition in a raster image processor
US6353461B1 (en) * 1997-06-13 2002-03-05 Panavision, Inc. Multiple camera video assist control system
US6317885B1 (en) 1997-06-26 2001-11-13 Microsoft Corporation Interactive entertainment and information system using television set-top box
US6360234B2 (en) * 1997-08-14 2002-03-19 Virage, Inc. Video cataloger system with synchronized encoders
US6112007A (en) * 1997-10-22 2000-08-29 Kram; Christine Continuous delay television picture display apparatus
ES2168595T3 (en) 1997-10-31 2002-06-16 Sohard Ag METHOD FOR CREATING MULTIMEDIA DATA SUBMISSION PLANS.
US6029045A (en) 1997-12-09 2000-02-22 Cogent Technology, Inc. System and method for inserting local content into programming content
US6480667B1 (en) * 1997-12-23 2002-11-12 Intel Corporation Method of time shifting to simultaneously record and play a data stream
US6687455B1 (en) * 1998-03-06 2004-02-03 Samsung Electronics Co., Ltd Storage medium storing catalog information and apparatus and method for recording and/or playing back catalog information
JP3615657B2 (en) 1998-05-27 2005-02-02 株式会社日立製作所 Video search method and apparatus, and recording medium
US6427238B1 (en) 1998-05-29 2002-07-30 Opentv, Inc. Module manager for interactive television system
US6536041B1 (en) 1998-06-16 2003-03-18 United Video Properties, Inc. Program guide system with real-time data sources
CN1867068A (en) 1998-07-14 2006-11-22 联合视频制品公司 Client-server based interactive television program guide system with remote server recording
US6144375A (en) * 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity
US6357042B2 (en) * 1998-09-16 2002-03-12 Anand Srinivasan Method and apparatus for multiplexing separately-authored metadata for insertion into a video data stream
US6704790B1 (en) * 1998-09-16 2004-03-09 Microsoft Corporation Server-side stream switching
US6408128B1 (en) * 1998-11-12 2002-06-18 Max Abecassis Replaying with supplementary information a segment of a video
WO2000060857A1 (en) * 1999-04-08 2000-10-12 Internet Pictures Corporation Virtual theater
US6484212B1 (en) * 1999-04-20 2002-11-19 At&T Corp. Proxy apparatus and method for streaming media information
US6574795B1 (en) 1999-05-28 2003-06-03 Intel Corporation Reliable communication of data by supplementing a unidirectional communications protocol
US6502139B1 (en) 1999-06-01 2002-12-31 Technion Research And Development Foundation Ltd. System for optimizing video on demand transmission by partitioning video program into multiple segments, decreasing transmission rate for successive segments and repeatedly, simultaneously transmission
US6415438B1 (en) 1999-10-05 2002-07-02 Webtv Networks, Inc. Trigger having a time attribute
US6480539B1 (en) * 1999-09-10 2002-11-12 Thomson Licensing S.A. Video encoding method and apparatus
EP1224806B1 (en) 1999-10-29 2003-11-19 OpenTV, Corp. System and method for recording pushed data
US6678463B1 (en) 2000-08-02 2004-01-13 Opentv System and method for incorporating previously broadcast content into program recording
US7206344B1 (en) * 2000-01-05 2007-04-17 Genesis Microchip Inc. Method and apparatus for displaying video
US20060036756A1 (en) * 2000-04-28 2006-02-16 Thomas Driemeyer Scalable, multi-user server and method for rendering images from interactively customizable scene information
IL136080A0 (en) * 2000-05-11 2001-05-20 Yeda Res & Dev Sequence-to-sequence alignment
US7782363B2 (en) * 2000-06-27 2010-08-24 Front Row Technologies, Llc Providing multiple video perspectives of activities through a data network to a remote multimedia server for selective display by remote viewing audiences
WO2002058383A1 (en) 2000-08-02 2002-07-25 Open Tv, Inc. System and method for providing multi-perspective instant replay
CN1309250C (en) 2001-01-19 2007-04-04 开放电视公司 System and method for providing multi-perspective instant replay
US6748335B2 (en) * 2002-05-06 2004-06-08 Tektronix, Inc. Acquisition system for a multi-channel relatively long record length digital storage oscilloscope
CN201532635U (en) 2009-09-03 2010-07-21 百富计算机技术(深圳)有限公司 Safety protection device

Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794465A (en) * 1986-05-12 1988-12-27 U.S. Philips Corp. Method of and apparatus for recording and/or reproducing a picture signal and an associated audio signal in/from a record carrier
US20020129374A1 (en) * 1991-11-25 2002-09-12 Michael J. Freeman Compressed digital-data seamless video switching system
US8185929B2 (en) * 1994-10-12 2012-05-22 Cooper J Carl Program viewing apparatus and method
US20030063218A1 (en) * 1995-11-13 2003-04-03 Gemstar Development Corporation Method and apparatus for displaying textual or graphic data on the screen of television receivers
US6295093B1 (en) * 1996-05-03 2001-09-25 Samsung Electronics Co., Ltd. Closed-caption broadcasting and receiving method and apparatus thereof suitable for syllable characters
US6496205B1 (en) * 1996-06-03 2002-12-17 Webtv Networks, Inc. User interface for controlling audio functions in a web browser
US6292226B1 (en) * 1996-08-28 2001-09-18 Matsushita Electric Industrial Co., Ltd. Broadcast receiver selectively using navigation information multiplexed on transport stream and recording medium recording the method of the same
US6233393B1 (en) * 1996-09-27 2001-05-15 Sony Corporation Apparatus for transmitting data in isochronous and asynchronous manners, an apparatus for receiving the same, and a system and method for such transmitting and receiving of such data
US20020194364A1 (en) * 1996-10-09 2002-12-19 Timothy Chase Aggregate information production and display system
US7412533B1 (en) * 1997-03-31 2008-08-12 West Corporation Providing a presentation on a network having a plurality of synchronized media types
US20080010658A1 (en) * 1997-03-31 2008-01-10 Abbott Michael J System and method for media stream indexing and synchronization
US6104997A (en) * 1998-04-22 2000-08-15 Grass Valley Group Digital audio receiver with multi-channel swapping
US7757252B1 (en) * 1998-07-20 2010-07-13 Thomson Licensing S.A. Navigation system for a multichannel digital television system
US20050283819A1 (en) * 1998-10-15 2005-12-22 Matsushita Electric Industrial Co., Ltd. Digital broadcast system
US20010048756A1 (en) * 1998-11-09 2001-12-06 Identigene, Inc. Method of newborn identification and tracking
US7051360B1 (en) * 1998-11-30 2006-05-23 United Video Properties, Inc. Interactive television program guide with selectable languages
US20080287059A1 (en) * 1999-03-08 2008-11-20 Anderson Jr Tazwell L Video/audio system and method enabling a user to select different views and sounds associated with an event
US6934759B2 (en) * 1999-05-26 2005-08-23 Enounce, Inc. Method and apparatus for user-time-alignment for broadcast works
US20020073171A1 (en) * 1999-06-16 2002-06-13 Mcdowall Ian E. Internet radio receiver with linear tuning interface
US6611958B1 (en) * 1999-08-06 2003-08-26 Sony Corporation Electronic program guide feature for AV system
US7231268B1 (en) * 1999-08-17 2007-06-12 Samsung Electronics Co., Ltd. Method of assigning audio channel identification, method for selecting audio channel using the same, and optical recording and reproducing apparatus suitable therefor
US7000245B1 (en) * 1999-10-29 2006-02-14 Opentv, Inc. System and method for recording pushed data
US20030208771A1 (en) * 1999-10-29 2003-11-06 Debra Hensgen System and method for providing multi-perspective instant replay
US8528019B1 (en) * 1999-11-18 2013-09-03 Koninklijke Philips N.V. Method and apparatus for audio/data/visual information
US7486926B2 (en) * 2000-03-28 2009-02-03 Affinity Labs Of Texas, Llc Content delivery system and method
US20050165942A1 (en) * 2000-05-12 2005-07-28 Sonicbox, Inc. System and method for limiting dead air time in internet streaming media delivery
US20010044726A1 (en) * 2000-05-18 2001-11-22 Hui Li Method and receiver for providing audio translation data on demand
US20010048736A1 (en) * 2000-06-05 2001-12-06 Walker David L. Communication system for delivering and managing content on a voice portal platform
US20020057380A1 (en) * 2000-06-28 2002-05-16 Pace Micro Technology Plc Broadcast data receiver with dual tuning capability
US20070061845A1 (en) * 2000-06-29 2007-03-15 Barnes Melvin L Jr Portable Communication Device and Method of Use
US7565680B1 (en) * 2000-06-30 2009-07-21 Comcast Ip Holdings I, Llc Advanced set top terminal having a video call feature
US6734759B2 (en) * 2001-12-05 2004-05-11 Agilent Technologies, Inc. Droop compensation filter having high pass characteristic
US7613315B2 (en) * 2005-03-04 2009-11-03 Sennheiser Communications A/S Configurable headset

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jain, US 6,144,375 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10462530B2 (en) 1999-10-29 2019-10-29 Opentv, Inc. Systems and methods for providing a multi-perspective video display
US10869102B2 (en) 1999-10-29 2020-12-15 Opentv, Inc. Systems and methods for providing a multi-perspective video display

Also Published As

Publication number Publication date
US20140375885A1 (en) 2014-12-25
US20170289644A1 (en) 2017-10-05
US20120314134A1 (en) 2012-12-13
US10869102B2 (en) 2020-12-15
US20200186892A1 (en) 2020-06-11
US8832756B2 (en) 2014-09-09
US10462530B2 (en) 2019-10-29
US20030208771A1 (en) 2003-11-06
US9525839B2 (en) 2016-12-20
US8250617B2 (en) 2012-08-21
US20170295405A1 (en) 2017-10-12

Similar Documents

Publication Publication Date Title
US10869102B2 (en) Systems and methods for providing a multi-perspective video display
EP1415473B1 (en) On-demand interactive magazine
US6678463B1 (en) System and method for incorporating previously broadcast content into program recording
US7032177B2 (en) Method and system for distributing personalized editions of media programs using bookmarks
US7849487B1 (en) Review speed adjustment marker
US7340762B2 (en) Method and apparatus for broadcasting, viewing, reserving and/or delayed viewing of digital television programs
CN102415095B (en) Record and present the digital video recorder of the program formed by the section of splicing
KR100575995B1 (en) Receiving apparatus
US20030095790A1 (en) Methods and apparatus for generating navigation information on the fly
US20040268384A1 (en) Method and apparatus for processing a video signal, method for playback of a recorded video signal and method of providing an advertising service
US7907833B2 (en) Apparatus and method for communicating stop and pause commands in a video recording and playback system
CA2615008A1 (en) Method and apparatus for providing commercials suitable for viewing when fast-forwarding through a digitally recorded program
CA2398200C (en) System and method for providing multi-perspective instant replay
AU2001266732A1 (en) System and method for providing multi-perspective instant replay
JP2001258005A (en) Distributor, distribution system and its method
CN1428045A (en) System and method for providing multi-perspective instant replay

Legal Events

Date Code Title Description
AS Assignment

Owner name: OPENTV, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HENSGEN, DEBRA;PIERRE, LUDOVIC;SIGNING DATES FROM 20010626 TO 20010628;REEL/FRAME:040941/0575

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION