WO1996013124A1 - Video indexing protocol - Google Patents

Video indexing protocol

Info

Publication number
WO1996013124A1
WO1996013124A1 PCT/US1995/014594 US9514594W WO9613124A1 WO 1996013124 A1 WO1996013124 A1 WO 1996013124A1 US 9514594 W US9514594 W US 9514594W WO 9613124 A1 WO9613124 A1 WO 9613124A1
Authority
WO
WIPO (PCT)
Prior art keywords
program
information
message
encoder
application
Prior art date
Application number
PCT/US1995/014594
Other languages
French (fr)
Inventor
John D. Miller
Original Assignee
Intel Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corporation filed Critical Intel Corporation
Priority to DE69534896T priority Critical patent/DE69534896T2/en
Priority to EP95942412A priority patent/EP0788714B1/en
Priority to AU43642/96A priority patent/AU4364296A/en
Publication of WO1996013124A1 publication Critical patent/WO1996013124A1/en
Priority to HK98101123A priority patent/HK1002381A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/025Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Internal components of the client ; Characteristics thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2665Gathering content from different sources, e.g. Internet and satellite
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4143Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a Personal Computer [PC]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4431OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB characterized by the use of Application Program Interface [API] libraries
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/633Control signals issued by server directed to the network components or client
    • H04N21/6332Control signals issued by server directed to the network components or client directed to client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/637Control signals issued by the client directed to the server or network components
    • H04N21/6377Control signals issued by the client directed to the server or network components directed to server
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8126Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/025Systems for the transmission of digital non-picture data, e.g. of text during the active part of a television frame
    • H04N7/0255Display systems therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/087Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
    • H04N7/088Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/087Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
    • H04N7/088Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
    • H04N7/0882Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital for the transmission of character code signals, e.g. for teletext
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/087Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
    • H04N7/088Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
    • H04N7/0884Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital for the transmission of additional display-information, e.g. menu for programme or channel selection
    • H04N7/0885Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital for the transmission of additional display-information, e.g. menu for programme or channel selection for the transmission of subtitles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/087Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
    • H04N7/088Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
    • H04N7/0887Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital for the transmission of programme or channel identifying signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/08Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division
    • H04N7/087Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only
    • H04N7/088Systems for the simultaneous or sequential transmission of more than one television signal, e.g. additional information signals, the signals occupying wholly or partially the same frequency band, e.g. by time division with signal insertion during the vertical blanking interval only the inserted signal being digital
    • H04N7/0888Subscription systems therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/162Authorising the user terminal, e.g. by paying; Registering the use of a subscription channel, e.g. billing
    • H04N7/165Centralised control of user terminal ; Registering at central
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4884Data services, e.g. news ticker for displaying subtitles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/445Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information

Definitions

  • the present invention relates to video transmission/reception systems. More specifically, the present invention relates to a protocol and apparatus for transmitting information in conjunction with a video signal.
  • Another shortcoming of closed-captioning is that although it uses a portion of the VBI (vertical blanking interval) for transmission (line 21), it does not make efficient use of the bandwidth of that portion of the non-displayed video signal. It is estimated that a single line of the VBI can be used for uncompressed data transmission at approximately 14.4 kilobytes/second. Thus, real-time closed captioning of the audio program of a televised broadcast does not take full advantage of the bandwidth of the signal. It is also a unichannel system, wherein only the closed captioning information is transmitted, rather than taking advantage of the full bandwidth of the signal.
  • Prior art systems which transmit information in conjunction with television programming sometimes transmit only limited information about the programming. For example, in consumer satellite television reception systems, usually only text information describing the title of the program and, at most, the time elapsed or time remaining in the program has been transmitted with the programming. More detailed information, such as references to outside sources related to the programming, or other information which is synchronized with the programming, has not been transmitted in conjunction with the signal.
  • A computer-implemented method and apparatus for transmitting information with a video signal are described. At least one client application creates a message to be transmitted to a receiver.
  • the client application transmits the message to a data encoder and the encoder receives the message and other messages from other client applications.
  • the encoder transforms the message and the other messages into packets and multiplexes them into a bitstream to be encoded with a video programming signal.
  • the multiplexing is performed according to priorities assigned to the at least one client application and the other client applications.
  • the encoder transmits the bitstream to a video encoder to transmit the bitstream with the video programming signal in order to be received by a decoder.
  • the decoder can then decode the information from the video signal and transmit the information to at least one decoder client application.
  • The client applications may include: a status application which transmits status information (e.g. time references) at regular intervals; a program application which transmits descriptive information of the video programming synchronized with the video signal (e.g. program markers and/or program text, such as closed-captions and/or subtitles); and a non-program application.
  • The status application may have the highest priority, the program application the next highest priority, and the non-program application the lowest priority. In this manner, useful, descriptive and other program or non-program-related information may be transmitted along with the video signal, and displayed and processed according to user requirements.
  • Figure 1 shows a system in which an encoder can be implemented.
  • Figure 2 shows a system in which a decoder can be implemented.
  • Figure 3 shows a block diagram of devices in a networking system in which embodiments of present invention may be implemented.
  • Figure 4 shows the software architecture of an encoder.
  • Figure 5 shows the software architecture of a decoder.
  • Figure 6 shows a specific implementation of the software in a decoder.
  • Figure 7a shows the processes performed by the encoder process.
  • Figure 7b shows the relationship of the messages, packets and frames in the encoder in relation to the ISO/OSI model in one embodiment of the present invention.
  • Figure 8 shows the format of a message used for the application program interface (API) between the encoder/decoder and client applications.
  • Figure 9 shows the format of a packet.
  • Figure 10 shows the format of a frame.
  • Figure 11 shows the format of a status composite packet.
  • Figure 12 shows the format of a program marker packet.
  • Figure 13 shows the format of a non-program stock quote packet.
  • Figure 14 shows an example packet bitstream.
  • Figures 15a-15c show the operation of the encoder's main processes.
  • Figures 16a-16c show the operation of the decoder's main processes.
  • Figure 17 shows an example user interface window which may be used by a client application program of a decoder, or an authoring client application program at the transmitter.
  • Figure 18 shows an example of a user display displaying the audio/video program, the program information and non-program information which may be displayed by separate client applications.
  • the present invention is a method and apparatus for transmitting information in conjunction with a video signal.
  • the system to be described here includes the Video Indexing Protocol (VIP).
  • the techniques to be described here can be used to transmit any information in conjunction with a video signal, although, specific information has been described for illustration purposes.
  • the present invention will be described with reference to certain specific embodiments, including specific data packets, types of communication media, networking systems, transmission apparatus, etc., it can be appreciated by one skilled in the art that these are for illustrative purposes only and are not to be construed as limiting the present invention. Other departures, modifications, and other changes may be made, by one skilled in the art, without departing from the teaching of the present invention.
  • the methods and apparatus used in implemented embodiments of the present invention comprise an encoder portion and a decoder portion, examples of which are shown in Figures 1 and 2.
  • the encoder or "head end" of the video transmission system may have a structure as illustrated in Figure 1.
  • the system includes a master encoder 100 which may receive encoded messages from a plurality of computer systems 110-114 which communicate with encoder 100 via networking medium 150.
  • Network 150 may be any number of prior art networks, including local area networks (LAN's), such as ethernet, token-ring, FDDI, or other networking media as are commercially available.
  • the encoders 110-114 will convert their respective information into messages to be processed by encoder 100, and software, operative within encoder 100, will then packetize and prioritize these messages as packets which are then transmitted to VBI inserter 130.
  • VBI inserter 130 may be any number of VBI inserters as are commercially available, such as the model TDS-3 brand VBI inserter available from Norpak Corporation of Ottawa, Ontario, Canada. Note that for the remainder of this application, VBI insertion into an audio/video program signal (NTSC) will be discussed; however, this is for illustration purposes only, and other audio/video encodings for other formats (e.g. PAL, SECAM) and other broadcasting methods (e.g. digital video) may be used.
  • This information may then be transmitted via satellite uplink 140, along with the audio/video program content.
  • the signal may also be broadcast, cablecast or transmitted in other ways, according to implementation, and this invention is not limited to satellite uplinks or downlinks.
  • Each of the encoders 110-114 may have a different encoding function, such as closed-captioned or foreign-language subtitles, stock quotes, news text, sports scores, or weather, with serialized bitstreams feeding those encoders.
  • Status information for the transmission, such as timecodes (e.g. SMPTE timecodes, or time reference markers such as GMT), station ID, and a channel map, may also be encoded.
  • Program content information may also be encoded. This may include generic information, such as scene or story markers, and, for an application such as a news transmission, the program content information may include the text of the news stories. Any other information, real-time or not, may be included within the information which is encoded.
  • The apparatus of Figure 2 essentially performs the reverse of the apparatus shown in Figure 1.
  • a satellite downlink 240 may receive the encoded audio/video program which is then decoded by a VBI decoder 230 into the separate program and encoded data portions.
  • a master decoder or gateway computer system receives the encoded data, and the different channels of information can be made available to a plurality of client systems 210- 214 over a networking medium, for example.
  • client systems 210-214 may each be interested in a separate portion of the bitstream.
  • those clients may only need to examine a portion, or channel (e.g. program information, stock, sports scores, weather), of the incoming bitstream, according to user requirements. The details of this will be discussed more below.
  • A system 310 upon which one embodiment of a computer system (e.g. encoder 100 or decoder 200) of the present invention may be implemented is shown.
  • System 310 comprises a bus or other communication means 301 for communicating information, and a processing means 302 coupled with bus 301 for processing information.
  • System 310 further comprises a random access memory (RAM) or other volatile storage device 304 (referred to as main memory), coupled to bus 301 for storing information and instructions to be executed by processor 302.
  • Main memory 304 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 302.
  • System 310 also comprises a read only memory (ROM) and/or other static storage device 306 coupled to bus 301 for storing static information and instructions for processor 302, and a data storage device 307 such as a magnetic disk or optical disk and its corresponding disk drive.
  • Data storage device 307 is coupled to bus 301 for storing information and instructions.
  • System 310 may further be coupled to a display device 321 , such as a cathode ray tube (CRT) or liquid crystal display (LCD) coupled to bus 301 for displaying information to a computer user.
  • An alphanumeric input device 322, including alphanumeric and other keys, may also be coupled to bus 301 for communicating information and command selections to processor 302.
  • other devices which may be coupled to bus 301 include a serial interface 324 and/or a communication device 325 either of which comprise means for communicating with other devices.
  • This communication device may also include a means for communicating with other nodes in a network.
  • this may include an Ethernet standard interface coupled to a CSMA/CD backplane for communicating information with other computers (e.g. encoders 110-114, or decoders 210-214).
  • any or all of the components of system 310 and associated hardware may be used in various embodiments, however, it can be appreciated that any configuration of the system that includes a processor 302 may be used for various purposes according to the particular implementation.
  • system 310 is one of the IBM AT-compatible type personal computers such as the Gateway 2000 brand personal computer manufactured by Gateway Computer Systems.
  • Processor 302 may be one of the Pentium® brand microprocessors available from Intel Corporation of Santa Clara, California (Pentium and Intel are trademarks of Intel Corporation).
  • Figure 4 shows an example software architecture of the processes which comprise the encoder.
  • A plurality of processes, either resident within a single device (e.g. 100 of Figure 1) or on each of a plurality of client encoders (e.g. 110-114), may include a plurality of client application programs 401-403 which communicate via messages to the main video indexing protocol (VIP) encoder 410.
  • VIP encoder 410 implements the transport, network and data link layers of the ISO/OSI networking model via separate portions 410a, 410b and 410c of the encoder.
  • Client applications operate at the application layer.
  • The VIP encoder 410 provides the necessary communication between the application layer (client application programming interface, or API) and the VBI inserter 130 which resides at the physical layer. Specific client applications, such as ones providing stock quotes, may be provided.
  • VIP encoder 410 may further accept as inputs timecode, GMT time references, or other time references via an internal or house clock at the encoder, via a special-purpose client application program which is used for transmitting status information to decoders.
  • FIG 5 illustrates a more detailed view of the software processes operative within a decoder (e.g. 200 of Figure 2).
  • the decoder will include a VIP decoder process 500 which communicates with the VBI decoder apparatus 230 after decoding of the input data from the audio/video programming received from downlink 240.
  • The VIP decoder 500, like the VIP encoder 400, communicates with a plurality of registered client applications 510-513 via client bitstreams, each of which may comprise a separate portion(s) of the multiplexed data stream, according to the client applications' requirements.
  • 256 separate prioritized channels are capable of being transmitted between a VIP encoder/decoder and client processes.
  • the VIP decoder may extract a status channel for status information which is received from the encoder at periodic intervals, which is used for controlling/timing the decoder and the other client application programs. This may be handled by a status client 610 as illustrated in Figure 6, which communicates with a control module 601 , which is responsible for controlling the decoding process 500 itself.
  • status information may be provided to the program application 620 which is responsible, for example, for receiving real-time descriptive information about the program. Such descriptive information may include program/story/segment markers, full-text of the program and/or other descriptive information about the program which is synchronized with transmission.
  • Status information may also be input to any and/or all other client application programs 631-633, according to implementation.
  • The remaining non-program channels of information, which have a lower priority than status and program information, may include stock quotes, sports scores, weather, or other information which is transmitted (and received) as bandwidth permits.
  • the specifics of the function of the encoder will be described in more detail with reference to Figure 7a.
  • A plurality of VIP application programs 701-703 communicate with the VIP encoder process 400 via Vt_Messages, the details of which are illustrated in Figure 8.
  • Each of the applications is prioritized. For example, a status application has priority 0 (the highest), and program information has priority 1 (second highest) and other, non-program channels, have priority 3.
  • When Vt_Messages 711-713 are received by the encoder, transmission of packets such as 722 to the data link layer is performed on a prioritized round-robin basis. In other words, highest priority Vt_Message channels are serviced until exhausted, and channels having the same priority are serviced in round-robin fashion. These are then converted by the datalink layer 724 into frames 725 for transmission to VBI inserter 130. The details of the reception of messages and transformation of messages into packets, and packets into frames, is shown in Figure 7b.
  • As Vt_Messages 750 are serviced from the input buffer, they are transformed by the network layer 755 in the encoder into Vt_Packets 760-762, illustrated in more detail in Figure 9. Packets are identified by message number and their sequence in the message. Once packetized by the network layer 755, these messages are then sent as packets 760-762 to the datalink layer 724 in the encoder. The datalink layer 724 then creates a Vt_Frame 770-772 for each of the packets 760-762, respectively. These frames are then transmitted to the physical layer for encoding into the vertical blanking interval by the VBI inserter 130. Vt_Frames are discussed and described in more detail with reference to Figure 10, below. Vt_Frames are serialized and then encoded by VBI inserter 130 for transmission in any number of prior art manners.
  • The format of a Vt_Message is illustrated as 800 in Figure 8.
  • Each message contains a first field, nMessageProtocol 801, a byte, which is used for identifying the information transmitted (e.g. status, program, stocks, sports, weather, etc.). In this way, the messages can be prioritized.
  • the next field nVersion 802, is a byte identifying the version of this protocol which is being used.
  • The field fIsHint 803 indicates that the message is being provided before an event. This allows equipment in the receiver to "pre-roll" as necessary.
  • The fIsUpdate field 804 indicates updated information (e.g. outline updates or revised news stories).
  • the fReserved 805 field is currently not used.
  • the field nDataSize 806 is a double-word used for specifying the number of bytes of the data contained in the bData field 809.
  • The field nOpcode 807 is used for specifying the specific operation to be performed upon the data contained in the message. Applications communicate with the encoder by opening a communication channel and sending messages to the encoder.
  • opcodes may be integer values specifying any of the following: GetStationID, SetStationID, GetTime, SetTime, GetChannelMap, ChannelOpen, ChannelClose, ChannelGetPriority, ChannelSetPriority, and SendMessage.
  • the next field fReserved2 808 is currently not used, and bData field 809 is variable length, according to the length of the data specified in field 806.
  • The encoder may send result codes to the client applications, such as error or completion codes.
  • Result codes may include: ChannelAlreadyOpen; ChannelNotOpen; ChannelBusy; NotChannelOwner; ProtocolMismatch; CommError; NoMoreChannels; NoMessage; or BadData.
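  • A minimal C sketch of how the Vt_Message header and the opcode and result-code values named above might be declared is shown below. Only the field order and the stated byte/double-word sizes come from the description; the flag widths, the width of nOpcode, the struct packing, and the numeric enum values are illustrative assumptions.

```c
#include <stdint.h>

/* Sketch of the Vt_Message header (800).  Byte fields and the double-word
 * nDataSize follow the text; other widths and the packing are assumptions. */
#pragma pack(push, 1)
typedef struct {
    uint8_t  nMessageProtocol;  /* 801: identifies status/program/stocks/... */
    uint8_t  nVersion;          /* 802: protocol version */
    uint8_t  fIsHint;           /* 803: message precedes the event ("pre-roll") */
    uint8_t  fIsUpdate;         /* 804: updated information */
    uint8_t  fReserved;         /* 805: unused */
    uint32_t nDataSize;         /* 806: number of bytes in bData */
    uint16_t nOpcode;           /* 807: operation to perform (width assumed) */
    uint8_t  fReserved2;        /* 808: unused */
    uint8_t  bData[1];          /* 809: variable-length payload */
} Vt_Message;
#pragma pack(pop)

/* Opcodes named in the text; the integer assignments are assumptions. */
enum {
    VT_OP_GET_STATION_ID, VT_OP_SET_STATION_ID,
    VT_OP_GET_TIME, VT_OP_SET_TIME, VT_OP_GET_CHANNEL_MAP,
    VT_OP_CHANNEL_OPEN, VT_OP_CHANNEL_CLOSE,
    VT_OP_CHANNEL_GET_PRIORITY, VT_OP_CHANNEL_SET_PRIORITY,
    VT_OP_SEND_MESSAGE
};

/* Result codes returned by the encoder; numeric values again illustrative. */
enum {
    VT_RC_OK, VT_RC_CHANNEL_ALREADY_OPEN, VT_RC_CHANNEL_NOT_OPEN,
    VT_RC_CHANNEL_BUSY, VT_RC_NOT_CHANNEL_OWNER, VT_RC_PROTOCOL_MISMATCH,
    VT_RC_COMM_ERROR, VT_RC_NO_MORE_CHANNELS, VT_RC_NO_MESSAGE, VT_RC_BAD_DATA
};
```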
  • The scheduler in the encoder services applications according to the priority of the channel, and then in round-robin fashion, for creating and transmitting packets.
  • the structure of a packet is illustrated in Figure 9.
  • 900 of Figure 9 shows the format of the packets which are created and then transmitted in the serialized multiplexed bitstream in Vt_Frames to the VBI inserter 130.
  • the nPacketProtocol 901 field is a byte-length field which identifies the packet as one supported by this protocol or other values for other protocols which may be transmitted in future embodiments.
  • the nVersion 902 is used for specifying the version of the encoder which is creating and transmitting the packets.
  • the nChanID field 903 is a byte specifying, as an integer value, the channel number of the packet in the serialized bitstream.
  • The nMessageID field 904 specifies the message number on the particular channel from which the packet was obtained. This is used for reconstructing the message in the proper order at the decoder.
  • The nPacketID field 905 is the number of the packet in the particular message. Again, this is used for constructing the message.
  • the fMorePackets field 906 is a boolean used for specifying whether there are any more packets for the message. If not, then the decoder can detect that it has all the packets for the message, and can transmit the message to the proper client(s).
  • the fReserved 907 field is currently unused and the nDataSize field 908 specifies the size of the bData field 909 in bytes.
  • the bData field 909 is a variable-length field which is used for containing the data. It has the length specified in nDataSize field 908.
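  • The Vt_Packet header might be declared along similar lines, as sketched below; the byte-sized fields 901-903 and the presence of the remaining fields come from the text, while the widths chosen for nMessageID, nPacketID and nDataSize are assumptions.

```c
#include <stdint.h>

/* Sketch of the Vt_Packet header (900); non-byte field widths are assumed. */
#pragma pack(push, 1)
typedef struct {
    uint8_t  nPacketProtocol;  /* 901: identifies a VIP packet */
    uint8_t  nVersion;         /* 902: version of the transmitting encoder */
    uint8_t  nChanID;          /* 903: channel number in the serialized bitstream */
    uint16_t nMessageID;       /* 904: message number on this channel */
    uint16_t nPacketID;        /* 905: packet number within the message */
    uint8_t  fMorePackets;     /* 906: non-zero if more packets follow */
    uint8_t  fReserved;        /* 907: unused */
    uint16_t nDataSize;        /* 908: length of bData in bytes */
    uint8_t  bData[1];         /* 909: variable-length payload */
} Vt_Packet;
#pragma pack(pop)
```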
  • FIG 10 illustrates the format of a Vt_Frame 1000, which is transmitted to the VBI inserter 130.
  • VIP packets are serialized into byte stream frames to be sent to the VBI inserter 130.
  • Frames consist of fields for indicating the start and end of the frame, and data is pre-stuffed so that it doesn't have any occurrences of the start or end characters.
  • the start frame field STX 1001 thus precedes the Vt_Packet information 1002, a single Vt_Packet, such as 900.
  • the bitstream 1002 is followed by a CRC check field 1003, and an end of frame character ETX 1004 to indicate the end of the frame.
  • Figure 11 illustrates the Vt_Status_Composite packet, which is generated by the encoder and is sent to all decoders at regular (e.g. 5 second) intervals.
  • the Vt_Status_Composite packet allows synchronization of decoder applications to the encoder.
  • The packet 1100 includes a xStationID field 1101, which identifies the station performing the transmission (e.g. "CNN Headline News").
  • the packet includes the xTime field 1102, which is a time reference synchronized to the house clock of the transmitter. In implemented embodiments which transmit status packets at 5 second intervals, the time is GMT, however, in other embodiments, SMPTE time codes may be used wherein status packets are sent every video frame (30 fps). Other time references may be used, according to implementation.
  • A channel map 1103 is included in the packet, which includes the highest channel transmitted, and a variable size portion containing identifying information for each channel (e.g. VIP protocols).
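  • A possible declaration of the Vt_Status_Composite body is sketched below; the text names only the three fields, so the fixed-length station ID string, the 32-bit time value, and the highest-channel byte plus per-channel descriptor layout are assumptions.

```c
#include <stdint.h>

/* Sketch of the Vt_Status_Composite packet body (1100); sizes are assumed. */
#pragma pack(push, 1)
typedef struct {
    char     xStationID[32];   /* 1101: e.g. "CNN Headline News" */
    uint32_t xTime;            /* 1102: time reference (GMT seconds or SMPTE) */
    uint8_t  nHighestChannel;  /* 1103: highest channel currently transmitted */
    uint8_t  xChannelMap[1];   /* 1103: variable-size per-channel identifiers */
} Vt_Status_Composite;
#pragma pack(pop)
```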
  • FIG. 12 illustrates a program packet, Vt_Program_Marker 1200, which is used for real-time description of audio/video program data.
  • Other program packets may also be transmitted. These include:
  • events - which encode a momentary type of action, such as a camera switch, zoom, transition or other editing command.
  • This type may include: a. a type (camera switch, zoom, transition, etc.); b. a length; and c. a body (data for the event);
  • VIFs (video image format) - indicates that a plurality of frames should be captured by a digitizing board in the receiver. This may be stored and used for reference (e.g. a weather map);
  • sidebars - referential data. These may include, but are not limited to: a. URL (Uniform Resource Locator) - for locating information on the WWW (World-Wide Web) which may be related to the program; b. OLE Pointer - Microsoft's Object Linking and Embedding protocol for locating data on the client machine which is program-related.
  • Vt_Program_Marker 1200 is used for real-time identification of the program which is being transmitted. It is transmitted every 5 seconds, or when the status of the program (e.g. nLevel or nItem) changes. Other types of program packets are sent on an as-needed basis according to the application's requirements. Program markers can be sent synchronized with the video program, or some time period in relation to the program, according to implementation. For example, program markers of a certain type may precede the event by some relationship in order to allow activation and synchronization of device(s) in the receiver.
  • The nType field 1201 precedes program packets to identify the type of the packet (e.g. marker, event, caption, sidebar, etc.).
  • Markers which may be encoded into field 1201 specify: program; story; segment; section; or commercial. These are used to indicate the beginning/ending of a program chunk.
  • Vt_Program_Marker 1200 uses nLevel field 1202 to identify the outline level which this marker describes. This will be used to display, access, or reassemble the program.
  • The nLevel field 1202 may be used to display identifying information in outline form on a client computer.
  • The nItem field 1203 identifies the particular item at the identified outline level.
  • the nBytes field 1204 identifies the length of the strings which have been packed into field szText 1205. For example, if the nLevel field 1202 contained a 2, then there would be three strings (for three levels) packed into szText field 1205 (the level is 0 based, wherein 0 is the first level).
  • The information may include, for example: Program - "CNN Headline News"; Segment - "Dollars & Sense"; and Story - "Intel Stock Soars."
  • This may include, for example: Play - "Henry IV"; Act - "Act Three"; and Scene - "Scene Five - Cade Conspires with Dirk."
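  • The sketch below declares a hypothetical Vt_Program_Marker layout and demonstrates the szText packing described above (an nLevel of 2 implies three packed strings); the field widths and the packing/unpacking code are illustrative assumptions rather than the patent's exact format.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Sketch of the Vt_Program_Marker body (1200); field widths are assumptions. */
#pragma pack(push, 1)
typedef struct {
    uint8_t  nType;       /* 1201: marker, event, caption, sidebar, ... */
    uint8_t  nLevel;      /* 1202: outline level (0-based) */
    uint8_t  nItem;       /* 1203: item number at that level */
    uint16_t nBytes;      /* 1204: total length of the packed strings */
    char     szText[1];   /* 1205: nLevel+1 NUL-terminated strings, packed */
} Vt_Program_Marker;
#pragma pack(pop)

/* Illustration of the szText packing: nLevel == 2 means three strings
 * (levels 0, 1 and 2) are packed back to back. */
int main(void) {
    const char *levels[] = { "CNN Headline News", "Dollars & Sense",
                             "Intel Stock Soars" };
    char packed[128];
    uint16_t nBytes = 0;
    for (int i = 0; i < 3; i++) {
        size_t n = strlen(levels[i]) + 1;        /* include the NUL */
        memcpy(packed + nBytes, levels[i], n);
        nBytes += (uint16_t)n;
    }
    printf("nLevel=2, nBytes=%u\n", nBytes);
    /* Unpack for display in outline form. */
    for (const char *p = packed; p < packed + nBytes; p += strlen(p) + 1)
        printf("  %s\n", p);
    return 0;
}
```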
  • Other types of program packets may also be transmitted, and are within the spirit and scope of the present invention.
  • Non-program packets may also be transmitted. These include: stock quotations; sports scores; and weather information. These are transmitted in priority after both status and program information on a bandwidth-available, real-time basis.
  • An example of a stock quotation packet is shown in Figure 13 as Vt_Quote_Tick packet 1300, which is sent every time a stock quote is read off a feed to the encoder.
  • This packet includes a signature word field wSig 1301 for identifying the quotation; a stock exchange identifier xExchange field 1302; a ticker symbol field szSymbol 1303; a number of transactions field 1304; and a transactions array 1305, containing transaction information for the number of transactions specified in field 1304.
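  • A hypothetical declaration of the Vt_Quote_Tick packet follows; the text names the fields but not their sizes, so the widths, the ticker-symbol length, the member names for fields 1304-1305, and the layout of a transaction entry are assumptions.

```c
#include <stdint.h>

/* Sketch of the Vt_Quote_Tick packet body (1300); layout is assumed. */
#pragma pack(push, 1)
typedef struct {
    uint32_t nTime;    /* transaction time (assumed) */
    uint32_t nPrice;   /* price, e.g. in 1/100 units (assumed) */
    uint32_t nVolume;  /* share volume (assumed) */
} Vt_Quote_Transaction;

typedef struct {
    uint16_t wSig;                        /* 1301: signature word */
    uint8_t  xExchange;                   /* 1302: stock exchange identifier */
    char     szSymbol[8];                 /* 1303: ticker symbol */
    uint16_t nTransactions;               /* 1304: number of transactions */
    Vt_Quote_Transaction transactions[1]; /* 1305: transaction array */
} Vt_Quote_Tick;
#pragma pack(pop)
```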
  • a typical VIP bitstream is illustrated as 1400 in Figure 14.
  • Packets are illustrated as 1401-1412, wherein status packets, such as 1401 and 1411, are sent on a regular basis (e.g. every 5 seconds) in order to keep the receiver(s) synchronized with the transmitter.
  • Program marker packets such as 1402 and 1412 are similarly sent, as well as when program content changes, in order to indicate the change.
  • Non-status, non-program packets 1403-1410 are sent in the remaining time, in round-robin fashion, as bandwidth permits.
  • Figure 15a illustrates the sequence of steps performed at the Transport/API layer of the ISO/OSI model. Note that processes are implemented using object-oriented techniques, and thus, source code references object(s) and accompanying processes which service those object(s). Thus, the precise sequence of execution of the three subprocesses shown in Figures 15a-15c may not necessarily be precisely sequential.
  • the Transport/API layer receives a message from a client application, and determines whether it is a timer event (e.g. thus requiring transmission of a status message) at step 1504. If it is, then at step 1506, a Vt_Status_Composite message is sent.
  • Otherwise, step 1508 determines whether the message is of the ChannelOpen/Close type. If so, then the corresponding action is performed at step 1510, and a status message is sent indicating the change in channel. If the message was not of the open/close type, as detected at step 1508, then the process proceeds to step 1512, which determines whether a message is pending on the channel. If so, then a responsive message (e.g. ChannelBusy) is returned to the application at step 1514, and the application can take some corrective action. If not, then the message is queued on the current channel at step 1516, and sent to the network layer for packetizing. The process is thus complete at step 1518.
  • Network layer processing in the encoder is illustrated in Figure 15b. If no more messages are pending, as detected at step 1520, then the process remains idle at step 1522. If, however, message(s) are awaiting processing, then the highest priority channel with message(s) waiting is determined at step 1524. The next channel with the selected priority is then selected for message processing at step 1526, in round-robin fashion. This is to allow equivalent processing of messages for applications having the same channel priority.
  • a Vt_Packet is then constructed from a portion of the Vt_Message from the selected client at step 1528. The maximum packet length for each packet is used until the message is exhausted, wherein the last packet for a message is variable length, according to the portion of the message which remains. Once this has been performed, the Vt_Packet is then sent to the Datalink layer, at step 1530, and the process continues processing messages at step 1520.
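  • The prioritized round-robin service and chunking just described might look like the following sketch; the channel bookkeeping structure, the 200-byte maximum payload, and the helper names (vt_pick_channel, vt_next_chunk) are assumptions introduced for illustration.

```c
#include <stddef.h>

/* Sketch: pick the highest priority among channels with queued messages,
 * then service channels of that priority round-robin, emitting one
 * max-length Vt_Packet chunk per turn. */
#define VT_NUM_CHANNELS    256
#define VT_MAX_PACKET_DATA 200   /* assumed maximum bData size per packet */

typedef struct {
    int                  priority;  /* 0 = highest */
    const unsigned char *msg;       /* message being packetized, or NULL */
    size_t               msg_len;
    size_t               sent;      /* bytes already packetized */
} VtChannel;

/* Returns the channel to service next, or -1 if nothing is pending.
 * 'cursor' remembers where the previous round-robin pass stopped. */
int vt_pick_channel(VtChannel ch[VT_NUM_CHANNELS], int *cursor) {
    int best = -1;
    for (int i = 0; i < VT_NUM_CHANNELS; i++)     /* highest priority with work */
        if (ch[i].msg && (best < 0 || ch[i].priority < ch[best].priority))
            best = i;
    if (best < 0)
        return -1;                                /* idle */
    /* Round-robin among channels that share the best priority. */
    for (int k = 1; k <= VT_NUM_CHANNELS; k++) {
        int i = (*cursor + k) % VT_NUM_CHANNELS;
        if (ch[i].msg && ch[i].priority == ch[best].priority) {
            *cursor = i;
            return i;
        }
    }
    return best;
}

/* Size of the next packet payload: max-length chunks until the message is
 * exhausted, then a shorter final chunk. */
size_t vt_next_chunk(const VtChannel *c) {
    size_t left = c->msg_len - c->sent;
    return left > VT_MAX_PACKET_DATA ? VT_MAX_PACKET_DATA : left;
}
```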
  • Figure 15c illustrates the datalink layer processing which is performed in the encoder.
  • the steps 1532-1548 are performed on each Vt_Packet received from the network layer in order to create a Vt_Frame.
  • A CRC (cyclic redundancy check) is computed over the packet, and an STX (start of frame) character is placed at the beginning of the frame.
  • step 1536 determines whether there are any more characters to be processed in the packet. If so, then the process proceeds to step 1538.
  • Loop 1536-1542 packs the packet, preceding any occurrences of the STX, ETX or DLE (data-link escape) characters by DLE. If the character is none of these three, as detected at step 1538, then it is placed into the frame at step 1542. If it is one of the three characters, then it is preceded by DLE, and placed into the frame at step 1542. Step 1536 then repeats until no more characters are in the packet.
  • Once no more characters remain in the packet, as detected at step 1536, the process proceeds to step 1544, wherein the computed CRC is placed into the frame to allow error checking.
  • An ETX character is then added to signify the end of the frame at step 1546, and the Vt_Frame can then be transmitted to an output serial port of the encoder in order to be merged with the audio/video program (e.g. by VBI inserter 130). Encoding is thus complete.
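  • One way the framing of Figure 15c could be realized is sketched below. The STX/ETX/DLE byte values, the CRC-16/CCITT choice, and the decision to byte-stuff the CRC bytes as well are assumptions; the text specifies only that STX opens the frame, occurrences of the control characters are preceded by DLE, and the CRC and ETX close the frame.

```c
#include <stddef.h>
#include <stdint.h>

/* Assumed ASCII control values for the frame delimiters and escape. */
#define VT_STX 0x02
#define VT_ETX 0x03
#define VT_DLE 0x10

/* Simple CRC-16 (CCITT polynomial) over the raw packet bytes - assumed. */
uint16_t vt_crc16(const uint8_t *p, size_t n) {
    uint16_t crc = 0xFFFF;
    while (n--) {
        crc ^= (uint16_t)(*p++) << 8;
        for (int b = 0; b < 8; b++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}

/* Copy one byte into the frame, escaping control values with DLE. */
static size_t vt_put_stuffed(uint8_t *out, size_t o, uint8_t b) {
    if (b == VT_STX || b == VT_ETX || b == VT_DLE)
        out[o++] = VT_DLE;
    out[o++] = b;
    return o;
}

/* Build a Vt_Frame from a serialized Vt_Packet.  Returns the frame length;
 * 'out' must allow for worst-case stuffing (2*n + 8 bytes). */
size_t vt_frame_packet(const uint8_t *pkt, size_t n, uint8_t *out) {
    uint16_t crc = vt_crc16(pkt, n);
    size_t o = 0;
    out[o++] = VT_STX;                                /* start of frame */
    for (size_t i = 0; i < n; i++)                    /* byte-stuffed packet */
        o = vt_put_stuffed(out, o, pkt[i]);
    o = vt_put_stuffed(out, o, (uint8_t)(crc >> 8));  /* CRC check field */
    o = vt_put_stuffed(out, o, (uint8_t)(crc & 0xFF));
    out[o++] = VT_ETX;                                /* end of frame */
    return o;
}
```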
  • Datalink processing in the decoder is illustrated in Figure 16a.
  • Datalink processing is performed at step 1632 wherein a block is received from a serial port (e.g. from VBI decoder 230), and copied to the ring buffer.
  • Step 1636 determines whether any more characters are in the ring buffer to process. If not, the block reception continues at step 1632.
  • If there are more characters in the ring buffer, as detected at step 1636, then the process proceeds to step 1638, wherein it is determined whether the character is an ETX character. If the character is not an ETX, then the character is copied to the Vt_Packet buffer at step 1640. This loop continues until no more characters are to be processed in the ring buffer, or the ETX character is located. Once the ETX character is detected at step 1638, then the end of the frame has been reached, and the processing of the packet can take place. At step 1642, a CRC check of the packet is performed. If it fails, the data has been corrupted and the Vt_Packet buffer is discarded at step 1644. If the CRC check passes, then the Vt_Packet may then be sent to the network layer at step 1646. Then, the process continues to wait, in loop 1632-1636, to locate the STX character and determine if any more characters are received.
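  • A matching de-framing state machine for the decoder datalink processing might look like this; it reuses the assumed control values and vt_crc16 from the encoder sketch above, and vt_network_deliver stands in for the hand-off to the network layer (both names are hypothetical).

```c
#include <stddef.h>
#include <stdint.h>

#define VT_STX 0x02
#define VT_ETX 0x03
#define VT_DLE 0x10
#define VT_MAX_FRAME 512

typedef struct {
    uint8_t buf[VT_MAX_FRAME];
    size_t  len;
    int     in_frame;   /* saw STX, collecting bytes */
    int     escaped;    /* previous byte was DLE */
} VtDeframer;

extern uint16_t vt_crc16(const uint8_t *p, size_t n);        /* as above */
extern void vt_network_deliver(const uint8_t *pkt, size_t n); /* hypothetical */

/* Feed one byte pulled from the ring buffer. */
void vt_deframe_byte(VtDeframer *d, uint8_t b) {
    if (!d->in_frame) {
        if (b == VT_STX) { d->in_frame = 1; d->len = 0; d->escaped = 0; }
        return;                                   /* hunt for start of frame */
    }
    if (d->escaped) {                             /* stuffed control byte */
        if (d->len < VT_MAX_FRAME) d->buf[d->len++] = b;
        d->escaped = 0;
        return;
    }
    if (b == VT_DLE) { d->escaped = 1; return; }
    if (b == VT_ETX) {                            /* end of frame: check CRC */
        if (d->len >= 2) {
            size_t n = d->len - 2;                /* payload without CRC */
            uint16_t rx = (uint16_t)((d->buf[n] << 8) | d->buf[n + 1]);
            if (rx == vt_crc16(d->buf, n))
                vt_network_deliver(d->buf, n);    /* good packet */
            /* bad CRC: discard the Vt_Packet buffer */
        }
        d->in_frame = 0;
        return;
    }
    if (d->len < VT_MAX_FRAME) d->buf[d->len++] = b;
}
```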
  • a Vt_Packet is received at step 1614.
  • Steps 1616-1620 check the Vt_Packet_Size, to determine if any bits were lost, Vt_Packet_MessageID, to determine if this refers to a valid message, and Vt_Packet_PacketID, to determine if this refers to a valid packet in the message. If any of these checks fail, then the Vt_Message under construction is discarded at step 1622, and the process proceeds to an idle state 1624, until a subsequent Vt_Packet is received. Once the packet is determined as valid, as detected at steps 1616-1620, the Vt_Packet is then added to the message under construction at step 1626.
  • Once complete, the Vt_Message is sent to the transport layer in order to communicate with any client(s). If there are more Vt_Packets in the message, as detected at step 1628, then the process continues at step 1614.
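  • The network-layer reassembly could be sketched as follows; the per-channel reassembly structure, the validity checks, and vt_transport_deliver are illustrative assumptions standing in for the checks on Vt_Packet_Size, Vt_Packet_MessageID and Vt_Packet_PacketID described above.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define VT_MAX_MESSAGE 4096   /* assumed limit on a reassembled Vt_Message */

typedef struct {
    int      active;
    uint16_t messageID;       /* message currently being rebuilt */
    uint16_t expectedPacket;  /* next packet ID we expect */
    size_t   len;
    uint8_t  data[VT_MAX_MESSAGE];
} VtReassembly;

extern void vt_transport_deliver(const uint8_t *msg, size_t len); /* hypothetical */

/* Append one packet payload for this channel; discard the message under
 * construction on any mismatch, hand it to the transport layer when the
 * last packet (fMorePackets == 0) arrives. */
void vt_reassemble(VtReassembly *r, uint16_t messageID, uint16_t packetID,
                   int fMorePackets, const uint8_t *bData, size_t nDataSize) {
    if (!r->active || r->messageID != messageID) {   /* a new message starts */
        r->active = 1; r->messageID = messageID;
        r->expectedPacket = 0; r->len = 0;
    }
    if (packetID != r->expectedPacket ||             /* lost or invalid packet */
        r->len + nDataSize > VT_MAX_MESSAGE) {
        r->active = 0;                               /* discard the message */
        return;
    }
    memcpy(r->data + r->len, bData, nDataSize);
    r->len += nDataSize;
    r->expectedPacket++;
    if (!fMorePackets) {                             /* message complete */
        vt_transport_deliver(r->data, r->len);
        r->active = 0;
    }
}
```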
  • Message(s) are received at the transport layer from the network layer at step 1602.
  • the process then proceeds to step 1612, which is also performed if the message was not a status message.
  • the Vt_Message is then transmitted to the application(s) listening to the message's channel.
  • An example of a program information window, as may be displayed on the display of a suitably programmed microcomputer (e.g. 310 of Figure 3) or other apparatus having similar function executing a program information client application, is shown as 1700 of Figure 17.
  • a program information window may be displayed using any number of commercially-available graphical user interfaces (GUI) such as the Windows brand GUI available from Microsoft Corporation of Redmond, Washington.
  • the program information window may display program information in outline form, as derived from a packet such as the program marker packet 1200, as described with reference to Figure 12, above.
  • The program title is displayed as the window title, and segment titles are shown as 1710, 1720, 1730 and 1740.
  • Story headings such as 1711-1718, as discussed above, are referenced in order of appearance under the headline.
  • other options may be accessed, such as real-time stock quotes, sports scores, or other non-program information.
  • Selection of any of the headlines 1711-1718 may allow display of text of the story, closed captioned information, and/or other program-related information, such as events, captions, sidebars, or other useful information.
  • Window 1810 presents the televised audio/video program, which may be executed under control of a first task in a multitasking environment, and may be fed by any number of sources (e.g. satellite, broadcast, cable, or digital transmission).
  • A second task, a client of the VIP decoder (e.g. 500 of Figure 5), may display program-related information in a window such as 1820, such as headlines for news stories. Any other transmitted program information may be accessed using this window under control of the program client.
  • Non-program information, such as real-time stock quotations, may be displayed in window 1830 under control of a third task, also a client of the decoder. In this way, program and non-program information may be displayed to the user, depending upon which information he considers of interest, and which client application(s) are activated.
  • client application programs can communicate with a main decoder process (e.g. 500 of Figure 5) and obtain useful, multiplexed serialized data which is received along with transmitted audio/video information, and extract certain useful information according to requirements.
  • The post-processing, including the display of user interfaces, control of additional device(s) (e.g. video digitizers or video recorders), or other responsive actions in the clients as a result of the reception of such information, may be performed according to implementation.
  • the transmission or reception of such information does not interfere with the audio/video programming transmitted concurrently therewith, and also, takes advantage of the unused bandwidth provided by such transmissions.
  • implemented embodiments of the present invention present advantages neither recognized nor realized by the prior art.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Library & Information Science (AREA)
  • Software Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Television Systems (AREA)

Abstract

At least one client application (510) creates a message to be transmitted to a receiver (240). The client application transmits the message to an encoder which receives the message and other messages from other client applications. The encoder transforms the composite messages into packets and multiplexes them into a bitstream to be encoded with a video programming signal. The multiplexing is performed according to priorities assigned to the at least one client application and the other applications. The encoder transmits the bitstream to a video encoder to transmit the bitstream with the programming signal to be received by a decoder. The decoder can then decode the information from the video signal and transmit the information to at least one decoder client application. The client applications may include a status application which transmits status information (e.g. time references) at regular intervals; a program application which transmits descriptive information of the video programming synchronized with the video signal (e.g. program markers and/or program text, such as closed-captions and/or subtitles); and a non-program application.

Description

VIDEO INDEXING PROTOCOL
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to video transmission/reception systems. More specifically, the present invention relates to a protocol and apparatus for transmitting information in conjunction with a video signal.
2. Background Information
With the proliferation of personal computer systems and the decline in costs of high-capacity computer technology in general, the technologies of television transmission and computer applications are starting to merge. Solutions have been slow in coming, however.
For certain applications in television program transmission, it is desirable to transmit information in addition to the audio/video portion of the signal. For example, closed-captioned programming is now being used extensively to serve the needs of the hearing-impaired. In fact, recent requirements for television receivers include the requirement that all receivers provide the capability to display, in text, the audio portion of the transmitted program.
For news programming, other needs have arisen. For example, for 24-hour a day news programming, real-time stock quotes and sports scores are now being displayed as part of the video portion of the signal on services such as Headline News (a trademark of Turner Broadcasting, Inc.). Although this serves the needs of the viewer by providing such real-time information, this solution not only distracts the viewer from the video screen, but also unnecessarily consumes screen space, and does not allow the viewer to view only the information which is of interest to him. For example, to view the price of a particular stock, the viewer has to wait until the ticker cycles back to display that information. The same deficiency is true for sports scores. The needs of other viewers, such as those requiring real-time weather information, are also not met. Thus, an improved means for obtaining such information is required.
Prior art solutions such as closed-captioning use the vertical blanking interval (VBI) for encoding text for the audio portion of the programming. Closed-captioning typically uses line 21 of the vertical synchronization portion of the video signal. Thus, although it does not interfere with the transmission of the video signal, it has lacked the capability to be used in any way other than real-time display to the user, such as indexing, full-text capture, and/or other text processing operations commonly performed in modern personal computer word processing application programs.
Another shortcoming of closed-captioning is that although it uses a portion of the VBI for transmission (line 21), it does not make efficient use of the bandwidth of that portion of the non-displayed video signal. It is estimated that a single line of the VBI can be used for uncompressed data transmission at approximately 14.4 kilobytes/second. Thus, real-time closed captioning of the audio program of a televised broadcast does not take full advantage of the bandwidth of the signal. It is also a unichannel system, wherein only the closed captioning information is transmitted, rather than taking advantage of the full bandwidth of the signal.
Information which has been transmitted in conjunction with television programming in the prior art sometimes includes only limited information about the programming. For example, in consumer satellite television reception systems, usually only text information describing the title of the program, and at most, the time elapsed or time remaining in the program, has been transmitted with the programming. More detailed information, such as references to outside sources related to the programming, or other information which is synchronized with the programming, has not been transmitted in conjunction with the signal.
Thus, the prior art for transmitting information with television programming suffers from several shortcomings.
SUMMARY OF THE INVENTION
A computer-implemented method and apparatus for transmitting information with a video signal. At least one client application creates a message to be transmitted to a receiver. The client application transmits the message to a data encoder, and the encoder receives the message and other messages from other client applications. The encoder transforms the message and the other messages into packets and multiplexes them into a bitstream to be encoded with a video programming signal. The multiplexing is performed according to priorities assigned to the at least one client application and the other client applications. Then, the encoder transmits the bitstream to a video encoder to transmit the bitstream with the video programming signal in order to be received by a decoder. The decoder can then decode the information from the video signal and transmit the information to at least one decoder client application. The client applications may include: a status application which transmits status information (e.g. time references) at regular intervals; a program application which transmits descriptive information of the video programming synchronized with the video signal (e.g. program markers and/or program text, such as closed-captions and/or subtitles); and a non-program application. The status application may have the highest priority, the program application the next highest priority, and the non-program application the lowest priority. In this manner, useful, descriptive and other program or non-program-related information may be transmitted along with the video signal, and displayed and processed according to user requirements.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate like elements and in which:
Figure 1 shows a system in which an encoder can be implemented.
Figure 2 shows a system in which a decoder can be implemented.
Figure 3 shows a block diagram of devices in a networking system in which embodiments of present invention may be implemented.
Figure 4 shows the software architecture of an encoder.
Figure 5 shows the software architecture of a decoder.
Figure 6 shows a specific implementation of the software in a decoder.
Figure 7a shows the processes performed by the encoder process.
Figure 7b shows the relationship of the messages, packets and frames in the encoder in relation to the ISO/OSI model in one embodiment of the present invention.
Figure 8 shows the format of a message used for the application program interface (API) between the encoder/decoder and client applications.
Figure 9 shows the format of a packet.
Figure 10 shows the format of a frame.
Figure 11 shows the format of a status composite packet.
Figure 12 shows the format of a program marker packet.
Figure 13 shows the format of a non-program stock quote packet.
Figure 14 shows an example packet bitstream.
Figures 15a-15c show the operation of the encoder's main processes.
Figures 16a-16c show the operation of the decoder's main processes.
Figure 17 shows an example user interface window which may be used by a client application program of a decoder, or an authoring client application program at the transmitter.
Figure 18 shows an example of a user display displaying the audio/video program, the program information and non-program information which may be displayed by separate client applications.
DETAILED DESCRIPTION
The present invention is a method and apparatus for transmitting information in conjunction with a video signal. The system to be described here includes the Video Indexing Protocol (VIP). The techniques to be described here can be used to transmit any information in conjunction with a video signal, although specific information has been described for illustration purposes. Although the present invention will be described with reference to certain specific embodiments, including specific data packets, types of communication media, networking systems, transmission apparatus, etc., it can be appreciated by one skilled in the art that these are for illustrative purposes only and are not to be construed as limiting the present invention. Other departures, modifications, and changes may be made by one skilled in the art without departing from the teaching of the present invention.
The methods and apparatus used in implemented embodiments of the present invention comprise an encoder portion and a decoder portion, examples of which are shown in Figures 1 and 2. The encoder or "head end" of the video transmission system may have a structure as illustrated in Figure 1. The system includes a master encoder 100 which may receive encoded messages from a plurality of computer systems 110-114 which communicate with encoder 100 via networking medium 150. Network 150 may be any number of prior art networks, including local area networks (LAN's), such as ethernet, token-ring, FDDI, or other networking media as are commercially available. The encoders 110-114 will convert their respective information into messages to be processed by encoder 100, and software, operative within encoder 100, will then packetize and prioritize these messages as packets which are then transmitted to VBI inserter 130.
VBI inserter 130 may be any number of VBI inserters as are commercially available, such as the model number TDS-3 brand VBI inserter available from Norpak Corporation of Ottawa, Ontario, Canada. Note that for the remainder of this application, VBI insertion into an audio/video program signal (NTSC) will be discussed; however, this is for illustration purposes only, and other audio/video encodings for other formats (e.g. PAL, SECAM), and other broadcasting methods (e.g. digital video), may be used. This information may then be transmitted via satellite uplink 140, along with the audio/video program content. The signal may also be broadcast, cablecast or transmitted in other ways, according to implementation, and this invention is not limited to satellite uplinks or downlinks. Each of the encoders 110-114 may have a different encoding function, such as closed-captioned or foreign-language subtitles, stock quotes, news text, sports scores, or weather, with serialized bitstreams feeding those encoders. In addition to this information, status information for the transmission, such as timecodes (e.g. SMPTE timecodes) or time reference markers (e.g. GMT), a station ID, and a channel map, may also be transmitted. Program content information may also be included. This may include generic information, such as scene or story markers, and for an application such as a news transmission, the program content information may include the text of the news stories. Any other information, real-time or not, may be included within the information which is encoded.
The structure of the decoder is shown in Figure 2. The apparatus of Figure 2 essentially performs the reverse of the apparatus shown in Figure 1. A satellite downlink 240 may receive the encoded audio/video program, which is then decoded by a VBI decoder 230 into the separate program and encoded data portions. Then, a master decoder or gateway computer system receives the encoded data, and the different channels of information can be made available to a plurality of client systems 210-214 over a networking medium, for example. Each of the computer systems 210-214 may be interested in a separate portion of the bitstream. Thus, those clients may only need to examine a portion, or channel (e.g. program information, stocks, sports scores, weather), of the incoming bitstream, according to user requirements. This will be discussed in more detail below.
Although separate systems are shown in Figures 1 and 2, it can be appreciated that such is for illustration purposes only, and that in a multitasking environment, a single one of the systems (e.g. 100, 200) may be used for encoding or decoding, wherein any or all of the separate encoders or decoders (e.g. 110-114, 210-214) can be implemented as separate processes resident in a single computer. The techniques have equal application to multitasking, multicomputing, or networking environments.
Referring to Figure 3, a system 310 upon which one embodiment of a computer system of the present invention (e.g. encoder 100 or decoder 200) is implemented is shown. System 310 comprises a bus or other communication means 301 for communicating information, and a processing means 302 coupled with bus 301 for processing information. System 310 further comprises a random access memory (RAM) or other volatile storage device 304 (referred to as main memory), coupled to bus 301 for storing information and instructions to be executed by processor 302. Main memory 304 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 302. System 310 also comprises a read only memory (ROM) and/or other static storage device 306 coupled to bus 301 for storing static information and instructions for processor 302, and a data storage device 307 such as a magnetic disk or optical disk and its corresponding disk drive. Data storage device 307 is coupled to bus 301 for storing information and instructions.
System 310 may further be coupled to a display device 321, such as a cathode ray tube (CRT) or liquid crystal display (LCD) coupled to bus 301 for displaying information to a computer user. An alphanumeric input device 322, including alphanumeric and other keys, may also be coupled to bus 301 for communicating information and command selections to processor 302. An additional user input device is cursor control 323, such as a mouse, a trackball, stylus, or cursor direction keys, coupled to bus 301 for communicating direction information and command selections to processor 302, and for controlling cursor movement on display 321.
In implemented embodiments, other devices which may be coupled to bus 301 include a serial interface 324 and/or a communication device 325 either of which comprise means for communicating with other devices. This communication device may also include a means for communicating with other nodes in a network. In implemented embodiments, this may include an Ethernet standard interface coupled to a CSMA/CD backplane for communicating information with other computers (e.g. encoders 110-114, or decoders 210-214). Note, also, that any or all of the components of system 310 and associated hardware may be used in various embodiments, however, it can be appreciated that any configuration of the system that includes a processor 302 may be used for various purposes according to the particular implementation.
In one embodiment, system 310 is one of the IBM AT-compatible type personal computers such as the Gateway 2000 brand personal computer manufactured by Gateway Computer Systems. Processor 302 may be one of the Pentium® brand microprocessors available from Intel Corporation of Santa Clara, California (Pentium and Intel are trademarks of Intel Corporation).
Note that the following discussion of the various embodiments described herein will refer specifically to a series of routines which are generated in a high-level programming language (e.g., the C or C++ language) and compiled, linked, and then run as object code in system 310 during run-time. It can be appreciated by one skilled in the art, however, that the following methods and apparatus may be implemented in special-purpose hardware devices, such as discrete logic devices, large scale integrated circuits (LSI's), application-specific integrated circuits (ASIC's), or other specialized hardware. The description here has equal application to apparatus having similar function.
Figure 4 shows an example software architecture of the processes which comprise the encoder. A plurality of processes, either resident within a single device (e.g. 100 of Figure 1) or distributed across a plurality of client encoders (e.g. 110-114), may include a plurality of client application programs 401-403 which communicate via messages with the main video indexing protocol (VIP) encoder 410. VIP encoder 410 implements the transport, network and data link layers of the ISO/OSI networking model via separate portions 410a, 410b and 410c of the encoder. Client applications operate at the application layer. The VIP encoder 410 provides the necessary communication between the application layer (client application programming interface or API) and the VBI inserter 130, which resides at the physical layer. Specific client applications, such as an application providing stock quotes, may be provided. VIP encoder 410 may further accept as inputs timecode, GMT time references, or other time references via an internal or house clock at the encoder, via a special-purpose client application program which is used for transmitting status information to decoders, such as 200 shown in Figure 2.
Figure 5 illustrates a more detailed view of the software processes operative within a decoder (e.g. 200 of Figure 2). The decoder will include a VIP decoder process 500 which communicates with the VBI decoder apparatus 230 after decoding of the input data from the audio/video programming received from downlink 240. The VIP decoder 500, like the VIP encoder 400, communicates with a plurality of registered client applications 510-513 via client bitstreams, each of which may comprise a separate portion(s) of the multiplexed data stream, according to the client applications' requirements.
In implemented embodiments of the present invention, 256 separate prioritized channels are capable of being transmitted between a VIP encoder/decoder and client processes. The VIP decoder may extract a status channel for status information which is received from the encoder at periodic intervals, and which is used for controlling/timing the decoder and the other client application programs. This may be handled by a status client 610 as illustrated in Figure 6, which communicates with a control module 601, which is responsible for controlling the decoding process 500 itself. In addition, status information may be provided to the program application 620 which is responsible, for example, for receiving real-time descriptive information about the program. Such descriptive information may include program/story/segment markers, full-text of the program and/or other descriptive information about the program which is synchronized with transmission. The details of program descriptive information will be discussed in more detail below. Status information may also be input to any and/or all other client application programs 631-633, according to implementation. The remaining non-program channels of information, which have a lower priority than status and program information, may include stock quotes, sports scores, weather, or other information which is transmitted (and received) as bandwidth permits.
The specifics of the function of the encoder will be described in more detail with reference to Figure 7a. A plurality of VIP application programs 701-703 communicate with the VIP encoder process 400 via Vt_Messages, the details of which are illustrated in Figure 8. Each of the applications is prioritized. For example, a status application has priority 0 (the highest), program information has priority 1 (the second highest), and other, non-program channels have priority 3. When Vt_Messages 711-713 are received by the encoder, transmission of packets such as 722 to the data link layer is performed on a prioritized round-robin basis. In other words, the highest priority Vt_Message channels are serviced until exhausted, and channels having the same priority are serviced in round-robin fashion. These are then converted by the datalink layer 724 into frames 725 for transmission to VBI inserter 130. The details of the reception of messages, the transformation of messages into packets, and the transformation of packets into frames are shown in Figure 7b.
As Vt_Messages 750 are serviced in the input buffer they are transformed by the network layer 755 in the encoder into Vt_Packets 760-762, illustrated in more detail in Figure 9. Packets are identified by message number and their sequence in the message. Once packetized by the network layer 755, these messages are then sent as packets 760-762 to the datalink layer 724 in the encoder. The datalink layer 724 then creates a Vt_Frame 770-772 for each of the packets 760-762, respectively. These frames are then transmitted to the physical layer for encoding into the vertical blanking interval by the VBI inserter 130. Vt_Frames are discussed and described in more detail with reference to Figure 10, below. Vt_Frames are serialized and then encoded by VBI inserter 130 for transmission in any number of prior art manners.
The format of a Vt_Message is illustrated as 800 in Figure 8. Each message contains a first field, nMessageProtocol 801, a byte, which is used for identifying the information transmitted (e.g. status, program, stocks, sports, weather, etc.). In this way, the messages can be prioritized. The next field, nVersion 802, is a byte identifying the version of this protocol which is being used. The field fIsHint 803 indicates that the message is being provided before an event. This allows equipment in the receiver to "pre-roll" as necessary. The fIsUpdate field 804 indicates updated information (e.g. outline updates or revised news stories). The fReserved field 805 is currently not used. The field nDataSize 806 is a double-word used for specifying the number of bytes of the data contained in the bData field 809. The field nOpcode 807 is used for specifying the specific operation to be performed upon the data contained in the message. Applications communicate with the encoder by opening a communication channel and sending messages to the encoder. In this example, opcodes may be integer values specifying any of the following: GetStationID, SetStationID, GetTime, SetTime, GetChannelMap, ChannelOpen, ChannelClose, ChannelGetPriority, ChannelSetPriority, and SendMessage. The next field, fReserved2 808, is currently not used, and the bData field 809 is variable length, according to the length of the data specified in field 806. Responsive to any of the operations specified by the opcodes, the encoder may send result codes to the client applications, such as error or completion codes. These result codes may include: ChannelAlreadyOpen; ChannelNotOpen; ChannelBusy; NotChannelOwner; ProtocolMismatch; CommError; NoMoreChannels; NoMessage; or BadData.
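For illustration only, a minimal C sketch of a structure matching the Vt_Message layout of Figure 8 follows; the field names track the description above, but the exact field widths (beyond the stated byte and double-word sizes), the packing pragma, and the flexible-array payload are assumptions rather than the patent's actual declarations.

    #include <stdint.h>

    /* Illustrative sketch of the Vt_Message layout of Figure 8 (field widths
       beyond the stated byte/double-word sizes are assumed). */
    #pragma pack(push, 1)
    typedef struct {
        uint8_t  nMessageProtocol;  /* 801: identifies the content (status, program, stocks, ...) */
        uint8_t  nVersion;          /* 802: protocol version */
        uint8_t  fIsHint;           /* 803: message precedes its event (allows "pre-roll") */
        uint8_t  fIsUpdate;         /* 804: carries updated information (e.g. revised story) */
        uint8_t  fReserved;         /* 805: unused */
        uint32_t nDataSize;         /* 806: number of bytes in bData */
        uint16_t nOpcode;           /* 807: e.g. ChannelOpen, SendMessage, SetTime (width assumed) */
        uint8_t  fReserved2;        /* 808: unused */
        uint8_t  bData[];           /* 809: variable-length payload of nDataSize bytes */
    } Vt_Message;
    #pragma pack(pop)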
As discussed above, the scheduler in the encoder services applications according to the priority of the channel, and then in round-robin fashion, for creating and transmitting packets. The structure of a packet is illustrated in Figure 9. 900 of Figure 9 shows the format of the packets which are created and then transmitted in the serialized multiplexed bitstream in Vt_Frames to the VBI inserter 130. The nPacketProtocol field 901 is a byte-length field which identifies the packet as one supported by this protocol, or may take other values for other protocols which may be transmitted in future embodiments. The nVersion field 902 is used for specifying the version of the encoder which is creating and transmitting the packets. The nChanID field 903 is a byte specifying, as an integer value, the channel number of the packet in the serialized bitstream. The nMessageID field 904 specifies the message number on the particular channel from which the packet was obtained. This is used for reassembling the message in the proper order at the decoder. The nPacketID field 905 is the number of the packet in the particular message. Again, this is used for constructing the message. The fMorePackets field 906 is a boolean used for specifying whether there are any more packets for the message. If not, then the decoder can detect that it has all the packets for the message, and can transmit the message to the proper client(s). The fReserved field 907 is currently unused, and the nDataSize field 908 specifies the size of the bData field 909 in bytes. The bData field 909 is a variable-length field which is used for containing the data. It has the length specified in the nDataSize field 908.
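A corresponding C sketch of the Vt_Packet layout of Figure 9 is given below; only nChanID and nPacketProtocol are stated to be byte-length fields, so the remaining widths and the packing are assumptions.

    #include <stdint.h>

    /* Illustrative sketch of the Vt_Packet layout of Figure 9 (widths of the
       ID, flag and size fields are assumed). */
    #pragma pack(push, 1)
    typedef struct {
        uint8_t  nPacketProtocol;  /* 901: marks the packet as a VIP packet */
        uint8_t  nVersion;         /* 902: version of the encoder that produced it */
        uint8_t  nChanID;          /* 903: channel number in the serialized bitstream */
        uint8_t  nMessageID;       /* 904: message number on that channel */
        uint8_t  nPacketID;        /* 905: packet number within the message */
        uint8_t  fMorePackets;     /* 906: non-zero if further packets follow */
        uint8_t  fReserved;        /* 907: unused */
        uint16_t nDataSize;        /* 908: size of bData in bytes */
        uint8_t  bData[];          /* 909: payload */
    } Vt_Packet;
    #pragma pack(pop)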
Figure 10 illustrates the format of a Vt_Frame 1000, which is transmitted to the VBI inserter 130. VIP packets are serialized into byte stream frames to be sent to the VBI inserter 130. Frames consist of fields for indicating the start and end of the frame, and the data is stuffed before transmission so that it does not contain any unescaped occurrences of the start or end characters. As illustrated in Figure 10, the start of frame field STX 1001 thus precedes the Vt_Packet information 1002, a single Vt_Packet, such as 900. The bitstream 1002 is followed by a CRC check field 1003, and an end of frame character ETX 1004 to indicate the end of the frame.
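The delimiter names (STX, ETX, and the DLE escape discussed with Figure 15c) suggest the conventional ASCII control codes; the short C sketch below assumes those values, which the patent does not state, and simply documents the resulting on-the-wire layout.

    /* Assumed ASCII control-code values for the frame delimiters; the patent
       names STX, ETX and DLE but does not give their numeric values. */
    enum {
        VT_STX = 0x02,   /* start of frame */
        VT_ETX = 0x03,   /* end of frame */
        VT_DLE = 0x10    /* data-link escape used to stuff payload bytes */
    };

    /*
     * Conceptual on-the-wire layout of one Vt_Frame (Figure 10):
     *
     *   [STX 1001] [DLE-stuffed Vt_Packet bytes 1002] [CRC 1003] [ETX 1004]
     *
     * The payload is stuffed so that no unescaped STX/ETX/DLE appears inside it.
     */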
Figure 11 illustrates the Vt_Status_Composite packet, which is generated by the encoder and is sent to all decoders at regular (e.g. 5 second) intervals. The Vt_Status_Composite packet allows synchronization of decoder applications to the encoder. The packet 1100 includes an xStationID field 1101, which identifies the station performing the transmission (e.g. "CNN Headline News"). The packet includes the xTime field 1102, which is a time reference synchronized to the house clock of the transmitter. In implemented embodiments which transmit status packets at 5 second intervals, the time is GMT; however, in other embodiments, SMPTE time codes may be used wherein status packets are sent every video frame (30 fps). Other time references may be used, according to implementation. Finally, a channel map 1103 is included in the packet, which includes the highest channel transmitted, and a variable size portion containing identifying information for each channel (e.g. VIP protocols).
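A hedged C sketch of how the Vt_Status_Composite payload might be laid out follows; the station-identifier width, the time representation, and the channel-map encoding are assumptions, since the text only names the fields.

    #include <stdint.h>

    /* Speculative layout of the Vt_Status_Composite payload of Figure 11. */
    #pragma pack(push, 1)
    typedef struct {
        char     xStationID[32];   /* 1101: e.g. "CNN Headline News" (size assumed) */
        uint32_t xTime;            /* 1102: GMT seconds or an SMPTE timecode (encoding assumed) */
        uint8_t  nHighestChannel;  /* 1103: highest channel currently transmitted */
        uint8_t  bChannelMap[];    /* 1103: per-channel identifying information (VIP protocols) */
    } Vt_Status_Composite;
    #pragma pack(pop)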
Figure 12 illustrates a program packet, Vt_Program_Marker 1200, which is used for real-time description of audio/video program data. Other program packets may also be transmitted. These include:
  1. events - which encode a momentary type of action, such as a camera switch, zoom, transition or other editing command. This type may include:
     a. a type (camera switch, zoom, transition, etc.);
     b. a length;
     c. a body (data for the event);
2. VIF's (video image format) - indicates that a plurality of frames should be captured by a digitizing board in the receiver. This may be stored and used for reference (e.g. a weather map);
3. captions - full-text of all spoken material;
  4. sidebars - referential data. These may include, but not be limited to:
     a. URL (Uniform Resource Locators) - for locating information on the WWW (World-Wide Web) which may be related to the program;
     b. OLE Pointer - Microsoft's Object Linking and Embedding protocol for locating data on the client machine which is program-related.
Vt_Program_Marker 1200 is used for real-time identification of the program which is being transmitted. It is transmitted every 5 seconds, or when the status of the program (e.g. nLevel or nItem) changes. Other types of program packets are sent on an as-needed basis according to the application's requirements. Program markers can be sent synchronized with the video program, or at some time period in relation to the program, according to implementation. For example, program markers of a certain type may precede the event by some relationship in order to allow activation and synchronization of device(s) in the receiver. The nType field 1201 precedes program packets to identify the type of the packet (e.g. marker, event, caption, sidebar, etc.). There are also different types of markers which may be encoded into field 1201, which specify: program; story; segment; section; or commercial. These are used to indicate the beginning/ending of a program chunk.
Vt_Program_Marker 1200 uses the nLevel field 1202 to identify the outline level which this marker describes. This will be used to display, access, or reassemble the program. The nLevel field 1202 may be used to display identifying information in outline form on a client computer. The nItem field 1203 identifies the particular item at the identified outline level. The nBytes field 1204 identifies the length of the strings which have been packed into the field szText 1205. For example, if the nLevel field 1202 contained a 2, then there would be three strings (for three levels) packed into the szText field 1205 (the level is 0-based, wherein 0 is the first level). For a news program, the information may include, for example: Program - "CNN Headline News"; Segment - "Dollars & Sense"; and Story - "Intel Stock Soars." For a televised play, this may include, for example: Play - "Henry IV"; Act - "Act Three"; and Scene - "Scene Five - Cade Conspires with Dirk." Other types of program packets may also be transmitted, and are within the spirit and scope of the present invention.
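To make the string packing concrete, the following C sketch shows a hypothetical Vt_Program_Marker layout and a helper that walks the nLevel+1 packed strings; the field widths, the NUL separation of the strings, and the helper function itself are assumptions beyond what the text states.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Speculative Vt_Program_Marker layout (Figure 12). */
    #pragma pack(push, 1)
    typedef struct {
        uint8_t  nType;     /* 1201: marker, event, caption, sidebar, ... */
        uint8_t  nLevel;    /* 1202: outline level described (0-based) */
        uint8_t  nItem;     /* 1203: item number at that outline level */
        uint16_t nBytes;    /* 1204: total length of the packed strings */
        char     szText[];  /* 1205: nLevel+1 NUL-terminated strings, one per level */
    } Vt_Program_Marker;
    #pragma pack(pop)

    /* Walk the packed strings, e.g. "CNN Headline News" / "Dollars & Sense" /
       "Intel Stock Soars" for a marker with nLevel == 2. */
    static void print_outline(const Vt_Program_Marker *m)
    {
        const char *p = m->szText;
        for (int level = 0; level <= m->nLevel; ++level) {
            printf("level %d: %s\n", level, p);
            p += strlen(p) + 1;   /* skip past the terminating NUL */
        }
    }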
Other types of non-program packets may also be transmitted. These include: stock quotations; sports scores; and weather information. These are transmitted in priority after both status and program information on a bandwidth-available, real-time basis. An example of a stock quotation packet is shown in Figure 13 as the Vt_Quote_Tick packet 1300, which is sent every time a stock quote is read off a feed to the encoder. This packet includes a signature word field wSig 1301 for identifying the quotation; a stock exchange identifier xExchange field 1302; a ticker symbol field szSymbol 1303; a number of transactions field 1304; and a transactions array 1305, containing transaction information for the number of transactions specified in field 1304.
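A speculative C sketch of the Vt_Quote_Tick layout follows; the text names the fields but not their sizes or the contents of a transaction record, so everything beyond the field names is an assumption.

    #include <stdint.h>

    /* Speculative Vt_Quote_Tick layout (Figure 13). */
    #pragma pack(push, 1)
    typedef struct {
        uint32_t nPrice;    /* assumed: price in a fixed-point unit */
        uint32_t nVolume;   /* assumed: shares traded */
    } Vt_Quote_Transaction;

    typedef struct {
        uint16_t wSig;                  /* 1301: signature word identifying a quotation */
        uint8_t  xExchange;             /* 1302: stock exchange identifier */
        char     szSymbol[8];           /* 1303: ticker symbol, e.g. "INTC" */
        uint16_t nTransactions;         /* 1304: number of entries in xact[] */
        Vt_Quote_Transaction xact[];    /* 1305: one record per transaction */
    } Vt_Quote_Tick;
    #pragma pack(pop)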
A typical VIP bitstream is illustrated as 1400 in Figure 14. Packets are illustrated as 1401-1412, wherein status packets, such as 1401 and 1411, are sent on a regular basis (e.g. every 5 seconds) in order to keep the receiver(s) synchronized with the transmitter. Program marker packets such as 1402 and 1412 are similarly sent, as well as when program content changes, in order to indicate the change. Non-status, non-program packets 1403-1410 are sent in the remaining time, in round-robin fashion, as bandwidth permits.
The encoder's three main processes for the different layers are shown in Figures 15a-15c. Figure 15a illustrates the sequence of steps performed at the Transport/API layer of the ISO/OSI model. Note that the processes are implemented using object-oriented techniques, and thus the source code references object(s) and accompanying processes which service those object(s). Thus, the precise sequence of execution of the three subprocesses shown in Figures 15a-15c may not necessarily be precisely sequential. At any rate, the Transport/API layer receives a message from a client application, and determines whether it is a timer event (e.g. thus requiring transmission of a status message) at step 1504. If it is, then at step 1506, a Vt_Status_Composite message is sent. If not, it is determined whether the message is of the ChannelOpen/Close type. If so, then the corresponding action is performed at step 1510, and a status message is sent indicating the change in channel. If the message was not of the open/close type, as detected at step 1508, then the process proceeds to step 1512, which determines whether a message is pending on the channel. If so, then a responsive message (e.g. ChannelBusy) is returned to the application at step 1514, and the application can take some corrective action. If not, then the message is queued on the current channel at step 1516, and sent to the network layer for packetizing. The process is thus complete at step 1518.
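The branching just described can be summarized in a small, self-contained C sketch; the opcode values and the function and enumeration names are hypothetical, and only the decision order follows the text.

    /* Self-contained sketch of the decision flow of Figure 15a. */
    enum { OP_CHANNEL_OPEN = 6, OP_CHANNEL_CLOSE = 7 };   /* assumed values */

    typedef enum {
        ACT_SEND_STATUS,              /* timer tick: emit Vt_Status_Composite (step 1506) */
        ACT_OPEN_CLOSE_THEN_STATUS,   /* update channel state, then announce it (step 1510) */
        ACT_REPLY_CHANNEL_BUSY,       /* a message is already pending (step 1514) */
        ACT_QUEUE_FOR_NETWORK_LAYER   /* queue on the channel for packetizing (step 1516) */
    } VtTransportAction;

    VtTransportAction vip_transport_decide(int is_timer_event, int opcode,
                                           int channel_has_pending_message)
    {
        if (is_timer_event)
            return ACT_SEND_STATUS;
        if (opcode == OP_CHANNEL_OPEN || opcode == OP_CHANNEL_CLOSE)
            return ACT_OPEN_CLOSE_THEN_STATUS;
        if (channel_has_pending_message)
            return ACT_REPLY_CHANNEL_BUSY;
        return ACT_QUEUE_FOR_NETWORK_LAYER;
    }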
Network layer processing in the encoder is illustrated in Figure 15b. If no messages are pending in the message queue, as detected at step 1520, then the process remains idle at step 1522. If, however, message(s) are awaiting processing, then the highest priority channel with message(s) waiting is determined at step 1524. The next channel with the selected priority is then selected for message processing at step 1526, in round-robin fashion. This is to allow equivalent processing of messages for applications having the same channel priority. A Vt_Packet is then constructed from a portion of the Vt_Message from the selected client at step 1528. The maximum packet length is used for each packet until the message is exhausted, wherein the last packet for a message is variable length, according to the portion of the message which remains. Once this has been performed, the Vt_Packet is then sent to the Datalink layer, at step 1530, and the process continues processing messages at step 1520.
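The splitting of a message into maximum-length packets with a shorter final packet might look like the following C sketch; the payload limit and the callback interface are assumptions, and the priority/round-robin channel selection described above is omitted for brevity.

    #include <stddef.h>

    /* Sketch of splitting one Vt_Message payload into Vt_Packet payloads
       (steps 1528-1530). */
    #define VT_MAX_PACKET_DATA 256   /* hypothetical maximum packet payload */

    typedef void (*emit_packet_fn)(int chan, int msg_id, int pkt_id, int more_packets,
                                   const unsigned char *data, size_t len);

    void vip_packetize(int chan, int msg_id,
                       const unsigned char *msg, size_t msg_len, emit_packet_fn emit)
    {
        int pkt_id = 0;
        size_t off = 0;
        while (off < msg_len) {
            size_t chunk = msg_len - off;
            if (chunk > VT_MAX_PACKET_DATA)
                chunk = VT_MAX_PACKET_DATA;            /* full-size packet */
            int more = (off + chunk < msg_len);        /* fMorePackets clears on the last one */
            emit(chan, msg_id, pkt_id++, more, msg + off, chunk);
            off += chunk;
        }
    }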
Figure 15c illustrates the datalink layer processing which is performed in the encoder. The steps 1532-1548 are performed on each Vt_Packet received from the network layer in order to create a Vt_Frame. First, at step 1532, a CRC (cyclic redundancy check) computation is performed upon the packet. Then, a start of frame character STX is added to the frame at step 1534.
The processing of the packet then commences, wherein a check at step 1536 determines whether there are any more characters to be processed in the packet. If so, then the process proceeds to step 1538. Loop 1536-1542 packs the packet, preceding any occurrences of the STX, ETX or DLE (data-link escape) characters by DLE. If the character is none of these three, as detected at step 1538, then it is placed directly into the frame at step 1542. If it is one of the three characters, then it is preceded by DLE, and placed into the frame at step 1542. Step 1536 then repeats until no more characters remain in the packet.
Once processing of the packet is complete, as detected at step 1536, then the process proceeds to step 1544, wherein the computed CRC is placed into the frame to allow parity checking. An ETX character is then added to signify the end of the frame at step 1546, and the Vt_Frame can then be transmitted to an output serial port of the encoder in order to be merged with the audio/video program (e.g. by VBI inserter 130). Encoding is thus complete.
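A possible rendering of these framing steps in C is sketched below; the 16-bit CRC polynomial, the placement and stuffing of the CRC bytes, and the numeric STX/ETX/DLE values are assumptions not specified in the text.

    #include <stddef.h>
    #include <stdint.h>

    enum { VT_STX = 0x02, VT_ETX = 0x03, VT_DLE = 0x10 };   /* assumed values */

    /* CRC-16 (reflected 0xA001 polynomial) chosen arbitrarily; the patent does
       not identify the CRC used. */
    static uint16_t vt_crc16(const uint8_t *p, size_t n)
    {
        uint16_t crc = 0;
        while (n--) {
            crc ^= *p++;
            for (int i = 0; i < 8; i++)
                crc = (crc & 1) ? (uint16_t)((crc >> 1) ^ 0xA001) : (uint16_t)(crc >> 1);
        }
        return crc;
    }

    /* Frame one Vt_Packet: STX, DLE-stuffed payload, CRC, ETX (Figures 10, 15c).
       out[] must hold the worst case of 2*len + 6 bytes; whether the CRC bytes
       are themselves stuffed is an assumption. */
    size_t vip_frame(const uint8_t *pkt, size_t len, uint8_t *out)
    {
        uint16_t crc = vt_crc16(pkt, len);
        size_t o = 0;
        out[o++] = VT_STX;
        for (size_t i = 0; i < len; i++) {
            if (pkt[i] == VT_STX || pkt[i] == VT_ETX || pkt[i] == VT_DLE)
                out[o++] = VT_DLE;          /* escape framing characters in the payload */
            out[o++] = pkt[i];
        }
        out[o++] = (uint8_t)(crc >> 8);     /* CRC placed ahead of the end-of-frame flag */
        out[o++] = (uint8_t)(crc & 0xFF);
        out[o++] = VT_ETX;
        return o;
    }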
The decoder's operation at the three network layers is shown in Figures 16a-16c. Datalink processing in the decoder is illustrated in Figure 16a. Datalink processing is performed at step 1632, wherein a block is received from a serial port (e.g. from VBI decoder 230) and copied to the ring buffer. At step 1633, it is determined whether a Vt_Packet is currently being examined. If not, then the STX character is located at step 1634 to locate the beginning of the message. Upon completion of step 1634, or upon determination that a Vt_Packet is already being examined at step 1633, it is determined at step 1636 whether any more characters are in the ring buffer to process. If not, block reception continues at step 1632. If there are more characters in the ring buffer, as detected at step 1636, then the process proceeds to step 1638, wherein it is determined whether the character is an ETX character. If the character is not an ETX, then the character is copied to the Vt_Packet buffer at step 1640. This loop continues until no more characters are to be processed in the ring buffer, or the ETX character is located. Once the ETX character is detected at step 1638, then the end of the frame has been reached, and the processing of the packet can take place. At step 1642, a CRC check of the packet is performed. If it fails, the data has been corrupted, and the Vt_Packet buffer is discarded at step 1644. If the CRC check passes, then the Vt_Packet may be sent to the network layer at step 1646. Then, the process continues to wait, in loop 1632-1636, to locate the STX character and determine if any more characters are received.
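The byte-oriented scanning described above can be modeled as a small state machine, sketched below in C; the buffer size, the DLE un-stuffing (implied by the encoder side but not detailed in the decoder description), and the function names are assumptions.

    #include <stddef.h>
    #include <stdint.h>

    enum { VT_STX = 0x02, VT_ETX = 0x03, VT_DLE = 0x10 };   /* assumed values */

    typedef struct {
        uint8_t buf[1024];   /* accumulated frame payload (size assumed) */
        size_t  len;
        int     in_frame;    /* saw STX, currently collecting payload */
        int     escaped;     /* previous byte was DLE */
    } VtDeframer;

    /* Feed one byte from the ring buffer; returns 1 when a complete (not yet
       CRC-checked) packet is available in d->buf (steps 1634-1646). */
    int vip_deframe_byte(VtDeframer *d, uint8_t c)
    {
        if (!d->in_frame) {
            if (c == VT_STX) { d->in_frame = 1; d->len = 0; d->escaped = 0; }
            return 0;
        }
        if (d->escaped) {                               /* literal STX/ETX/DLE in payload */
            if (d->len < sizeof d->buf) d->buf[d->len++] = c;
            d->escaped = 0;
            return 0;
        }
        if (c == VT_DLE) { d->escaped = 1; return 0; }
        if (c == VT_ETX) { d->in_frame = 0; return 1; } /* frame complete: verify CRC next */
        if (d->len < sizeof d->buf) d->buf[d->len++] = c;
        return 0;
    }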
Network layer processing is shown in Figure 16b. A Vt_Packet is received at step 1614. Steps 1616-1620 check the Vt_Packet_Size, to determine if any bits were lost, the Vt_Packet_MessageID, to determine if it refers to a valid message, and the Vt_Packet_PacketID, to determine if it refers to a valid packet in the message. If any of these checks fail, then the Vt_Message under construction is discarded at step 1622, and the process proceeds to an idle state 1624, until a subsequent Vt_Packet is received. Once the packet is determined to be valid, as detected at steps 1616-1620, the Vt_Packet is then added to the message under construction at step 1626. Then, if there are no more packets in the message, as detected at step 1628, the built Vt_Message is sent to the transport layer in order to communicate with any client(s). If there are more Vt_Packets in the message, as detected at step 1628, then the process continues at step 1614.
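The packet-validation and reassembly logic of Figure 16b might be sketched as follows in C; the buffer size, the sequential-packet-ID rule, and the return-code convention are assumptions beyond what the text specifies.

    #include <stddef.h>
    #include <string.h>

    /* Per-channel reassembly of Vt_Packets into a Vt_Message (Figure 16b). */
    typedef struct {
        unsigned char data[4096];
        size_t len;
        int msg_id;
        int next_pkt_id;
        int active;
    } VtReassembly;

    /* Returns 1 when the message is complete in r->data, 0 while in progress,
       -1 if validation failed and the partial message was discarded. */
    int vip_reassemble(VtReassembly *r, int msg_id, int pkt_id, int more_packets,
                       const unsigned char *payload, size_t n)
    {
        if (!r->active) {                                /* first packet of a new message */
            r->active = 1;
            r->len = 0;
            r->msg_id = msg_id;
            r->next_pkt_id = 0;
        }
        if (msg_id != r->msg_id || pkt_id != r->next_pkt_id || r->len + n > sizeof r->data) {
            r->active = 0;                               /* bad sequence: discard (step 1622) */
            return -1;
        }
        memcpy(r->data + r->len, payload, n);
        r->len += n;
        r->next_pkt_id++;
        if (!more_packets) {                             /* fMorePackets clear: deliver upward */
            r->active = 0;
            return 1;
        }
        return 0;
    }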
The operation of the transport layer is illustrated in Figure 16c. Message(s) are received at the transport layer from the network layer at step 1602. First, it is determined whether it is a Vt_Status message at step 1604. If so, then it is determined whether any of the channels are being closed by the status message. If so, then the local status cache of the channel is rebuilt, rebuilding the channel map, at step 1608, and a ChannelClosed message is sent to all local client applications at step 1610. The process then proceeds to step 1612, which is also performed if the message was not a status message. At step 1612, the Vt_Message is then transmitted to the application(s) listening to the message's channel.
An example of a program information window, as may be displayed on the display of a suitably programmed microcomputer (e.g. 310 of Figure 3) or other apparatus having similar function executing a program information client application, is shown as 1700 of Figure 17. Such a program information window may be displayed using any number of commercially-available graphical user interfaces (GUI), such as the Windows brand GUI available from Microsoft Corporation of Redmond, Washington. As illustrated, the program information window may display program information in outline form, as derived from a packet such as the program marker packet 1200, as described with reference to Figure 12, above. In this news application window 1700, the program title is displayed as the window title, and segment titles are shown as 1710, 1720, 1730 and 1740. Story headings, such as 1711-1718, as discussed above, are referenced in order of appearance under each segment heading. From this type of display, using any of the pull-down menu options 1750, other options may be accessed, such as real-time stock quotes, sports scores, or other non-program information. Selection of any of the headlines 1711-1718 may allow display of the text of the story, closed captioned information, and/or other program-related information, such as events, captions, sidebars, or other useful information.
Another example of a display during a user-session at the decoder is shown in Figure 18. Window 1810 presents the televised audio/video program, which may be executed under control of a first task in a multitasking environment, and may be fed by any number of sources (e.g. satellite, broadcast, cable, or digital transmission). A second task, a client of the VIP decoder (e.g. 500 of Figure 5), may display program-related information in a window such as 1820, such as headlines for news stories. Any other transmitted program information may be accessed using this window under control of the program client. Finally, non-program information, such as real-time stock quotations, may be displayed in window 1830 under control of a third task, also a client of the decoder. In this way, program and non-program information may be displayed to the user, depending upon which information he considers of interest, and which client application(s) are activated.
Using the above-described methods, apparatus, packets and protocols, client application programs can communicate with a main decoder process (e.g. 500 of Figure 5) and obtain useful, multiplexed serialized data which is received along with transmitted audio/video information, and extract certain useful information according to requirements. The post-processing, including the display of user-interfaces, control of additional device(s) (e.g. video digitizers or video recorders), or other responsive actions in the clients as a result of the reception of such information, may be performed according to implementation. The transmission or reception of such information does not interfere with the audio/video programming transmitted concurrently therewith, and also takes advantage of the unused bandwidth provided by such transmissions. Thus, implemented embodiments of the present invention present advantages neither recognized nor realized by the prior art.
Thus, a method and apparatus for the transmission of information along with audio/video programming have been described. Note that although the foregoing has particular utility in these systems, and although it has been described with reference to certain specific embodiments in the figures and the text, one may practice the present invention without utilizing all of these specific details. Thus, the figures and the text are to be viewed in an illustrative sense only, and do not limit the present invention. The present invention is only to be limited by the appended claims which follow.

Claims

What is claimed is:
1. A computer-implemented method for transmitting information with a video signal, comprising the following steps:
a. at least one client application creating a message to be transmitted to a receiver;
b. said client application transmitting said message to a data encoder;
c. said encoder receiving said message and other messages from other client applications;
d. said data encoder transforming said message and said other messages into packets and multiplexing said packets into a bitstream to be encoded with a video programming signal, said multiplexing being performed according to priorities assigned to said at least one client application and said other client applications; and
e. said encoder transmitting said bitstream to a video encoder to transmit said bitstream with said video programming signal in order to be received by a decoder.
2. The method of claim 1 wherein said at least one client application includes a status application which transmits said messages containing status information at regular intervals.
3. The method of claim 1 wherein said at least one client application includes a program application which transmits descriptive information of said video programming synchronized with said video signal.
4. The method of claim 1 wherein said at least one client application includes a non-program application.
5. The method of claim 1 wherein said at least one client application includes a status application which transmits said messages containing status information at regular intervals, a program application which transmits descriptive information of said video programming synchronized with said video signal, and a non- program application, wherein said status application has a highest of said priority, said program application has a next highest of said priority, and said non-programming signal has a lowest of said priority.
PCT/US1995/014594 1994-10-24 1995-10-24 Video indexing protocol WO1996013124A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
DE69534896T DE69534896T2 (en) 1994-10-24 1995-10-24 VIDEO INDEXING PROTOCOL
EP95942412A EP0788714B1 (en) 1994-10-24 1995-10-24 Video indexing protocol
AU43642/96A AU4364296A (en) 1994-10-24 1995-10-24 Video indexing protocol
HK98101123A HK1002381A1 (en) 1994-10-24 1998-02-12 Video indexing protocol

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US32887194A 1994-10-24 1994-10-24
US08/328,871 1994-10-24

Publications (1)

Publication Number Publication Date
WO1996013124A1 true WO1996013124A1 (en) 1996-05-02

Family

ID=23282819

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1995/014594 WO1996013124A1 (en) 1994-10-24 1995-10-24 Video indexing protocol

Country Status (6)

Country Link
US (1) US6064438A (en)
EP (1) EP0788714B1 (en)
AU (1) AU4364296A (en)
DE (1) DE69534896T2 (en)
HK (1) HK1002381A1 (en)
WO (1) WO1996013124A1 (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0828390A2 (en) * 1996-09-05 1998-03-11 HE HOLDINGS, INC. dba HUGHES ELECTRONICS Dynamic mapping of broadcast resources
WO1998020677A1 (en) * 1996-11-04 1998-05-14 Institut für Rundfunktechnik GmbH Radio broadcast signal and method for processing the same
EP0864213A1 (en) * 1996-09-13 1998-09-16 Motorola, Inc. System and method for capturing internet access information
EP0879534A1 (en) * 1996-02-08 1998-11-25 Thomas R. Wolzien Media online services access system and method
EP0885525A4 (en) * 1996-03-08 1998-12-23
EP0901284A2 (en) * 1997-09-05 1999-03-10 AT&T Corp. Internet linkage with broadcast TV
EP0915621A2 (en) * 1997-11-06 1999-05-12 Lucent Technologies Inc. Synchronized presentation of television programming and internet content
EP0947094A1 (en) * 1996-12-23 1999-10-06 Corporate Media Partners doing business as Americast Method and system for providing interactive look-and-feel in a digital broadcast via an x-y protocol
EP0849946A3 (en) * 1996-12-13 1999-12-15 Kabushiki Kaisha Toshiba Interactive TV broadcasting system and file access method applied thereto
US6018768A (en) * 1996-03-08 2000-01-25 Actv, Inc. Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
WO2001031908A1 (en) * 1999-10-22 2001-05-03 Innovation Venture Limited Method and apparatus for the dissemination of information
EP1098521A1 (en) * 1999-11-05 2001-05-09 THOMSON multimedia Process for allocating bandwidth to data streams in a broadcasting network
US6330595B1 (en) 1996-03-08 2001-12-11 Actv, Inc. Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
US6513069B1 (en) 1996-03-08 2003-01-28 Actv, Inc. Enhanced video programming system and method for providing a distributed community network
WO2003023981A2 (en) * 2001-09-12 2003-03-20 Grischa Corporation A method and system for video enhancement transport alteration
EP1393559A1 (en) * 2001-04-17 2004-03-03 LG Electronics, Inc. Digital broadcasting system and method of controlling the same
EP1411725A2 (en) * 1996-05-16 2004-04-21 Kabushiki Kaisha Infocity Information transmission and display method and information display apparatus
US6728269B1 (en) 1996-09-05 2004-04-27 Hughes Electronics Corporation Device and method for efficient delivery of redundant national television signals
EP1626513A2 (en) * 1996-06-25 2006-02-15 Matsushita Electric Industrial Co., Ltd. Data sending/receiving system, data broadcasting method and data receiving apparatus for television broadcasting
US7292604B2 (en) 1996-09-05 2007-11-06 The Directv Group, Inc. Device and method for efficient delivery of redundant national television signals
EP1850594A3 (en) * 1996-10-16 2008-07-09 Gemstar Development Corporation Access to internet data through a television system
USRE42103E1 (en) 1995-10-30 2011-02-01 Disney Enterprises, Inc. Apparatus and method of automatically accessing on-line services in response to broadcast of on-line addresses
US7930716B2 (en) 2002-12-31 2011-04-19 Actv Inc. Techniques for reinsertion of local market advertising in digital video from a bypass source
US8402500B2 (en) 1997-03-21 2013-03-19 Walker Digital, Llc System and method for supplying supplemental audio information for broadcast television programs
US9021538B2 (en) 1998-07-14 2015-04-28 Rovi Guides, Inc. Client-server based interactive guide with server recording
US9071872B2 (en) 2003-01-30 2015-06-30 Rovi Guides, Inc. Interactive television systems with digital video recording and adjustable reminders
US9084006B2 (en) 1998-07-17 2015-07-14 Rovi Guides, Inc. Interactive television program guide system having multiple devices within a household
US9125169B2 (en) 2011-12-23 2015-09-01 Rovi Guides, Inc. Methods and systems for performing actions based on location-based rules
US9148684B2 (en) 1999-09-29 2015-09-29 Opentv, Inc. Enhanced video programming system and method utilizing user-profile information
US9204184B2 (en) 1998-07-17 2015-12-01 Rovi Guides, Inc. Interactive television program guide with remote access
US9294799B2 (en) 2000-10-11 2016-03-22 Rovi Guides, Inc. Systems and methods for providing storage of data on servers in an on-demand media delivery system
US9307281B2 (en) 2007-03-22 2016-04-05 Rovi Guides, Inc. User defined rules for assigning destinations of content
US9532086B2 (en) 2013-11-20 2016-12-27 At&T Intellectual Property I, L.P. System and method for product placement amplification
US10063934B2 (en) 2008-11-25 2018-08-28 Rovi Technologies Corporation Reducing unicast session duration with restart TV

Families Citing this family (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7448063B2 (en) * 1991-11-25 2008-11-04 Actv, Inc. Digital interactive system for providing full interactivity with live programming events
US20040261127A1 (en) * 1991-11-25 2004-12-23 Actv, Inc. Digital interactive system for providing full interactivity with programming events
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US5903454A (en) 1991-12-23 1999-05-11 Hoffberg; Linda Irene Human-factored interface corporating adaptive pattern recognition based controller apparatus
US6400996B1 (en) 1999-02-01 2002-06-04 Steven M. Hoffberg Adaptive pattern recognition based control system and method
US10361802B1 (en) 1999-02-01 2019-07-23 Blanding Hovenweep, Llc Adaptive pattern recognition based control system and method
US6850252B1 (en) 1999-10-05 2005-02-01 Steven M. Hoffberg Intelligent electronic appliance system and method
US6516132B1 (en) * 1995-05-09 2003-02-04 Macrovision Corp Method and apparatus for improving the effects of color burst modifications to a video signal
PL180475B1 (en) * 1995-05-09 2001-02-28 Macrovision Corp Method of and apparatus for eliminating effects of colour sync pulse modification on video signal
US20030212996A1 (en) * 1996-02-08 2003-11-13 Wolzien Thomas R. System for interconnection of audio program data transmitted by radio to remote vehicle or individual with GPS location
US20020049832A1 (en) * 1996-03-08 2002-04-25 Craig Ullman Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
EP0863669B1 (en) * 1996-08-28 2001-11-07 Matsushita Electric Industrial Co., Ltd. Broadcast receiver selectively using navigation information multiplexed on transport stream and recording medium recording the method of the same
US6816201B1 (en) * 1998-01-13 2004-11-09 Mitsubishi Digital Electronics America, Inc. XDS enhancement system
US7707600B1 (en) 1998-08-21 2010-04-27 Intel Corporation Confirming video transmissions
US6473804B1 (en) 1999-01-15 2002-10-29 Grischa Corporation System for indexical triggers in enhanced video productions by redirecting request to newly generated URI based on extracted parameter of first URI
US6615408B1 (en) 1999-01-15 2003-09-02 Grischa Corporation Method, system, and apparatus for providing action selections to an image referencing a product in a video production
US7904187B2 (en) 1999-02-01 2011-03-08 Hoffberg Steven M Internet appliance system and method
US9171545B2 (en) * 1999-04-19 2015-10-27 At&T Intellectual Property Ii, L.P. Browsing and retrieval of full broadcast-quality video
US7877774B1 (en) * 1999-04-19 2011-01-25 At&T Intellectual Property Ii, L.P. Browsing and retrieval of full broadcast-quality video
US6266094B1 (en) * 1999-06-14 2001-07-24 Medialink Worldwide Incorporated Method and apparatus for the aggregation and selective retrieval of television closed caption word content originating from multiple geographic locations
US7634787B1 (en) 1999-06-15 2009-12-15 Wink Communications, Inc. Automatic control of broadcast and execution of interactive applications to maintain synchronous operation with broadcast programs
US7069571B1 (en) 1999-06-15 2006-06-27 Wink Communications, Inc. Automated retirement of interactive applications using retirement instructions for events and program states
US7222155B1 (en) 1999-06-15 2007-05-22 Wink Communications, Inc. Synchronous updating of dynamic interactive applications
US6530084B1 (en) 1999-11-01 2003-03-04 Wink Communications, Inc. Automated control of interactive application execution using defined time periods
US7028071B1 (en) * 2000-01-28 2006-04-11 Bycast Inc. Content distribution system for generating content streams to suit different users and facilitating e-commerce transactions using broadcast content metadata
US7631338B2 (en) * 2000-02-02 2009-12-08 Wink Communications, Inc. Interactive content delivery methods and apparatus
US7028327B1 (en) 2000-02-02 2006-04-11 Wink Communication Using the electronic program guide to synchronize interactivity with broadcast programs
ATE546013T1 (en) * 2000-03-31 2012-03-15 Opentv Inc SYSTEM AND METHOD FOR INSERTING LOCAL METADATA
US7386512B1 (en) 2000-05-11 2008-06-10 Thomson Licensing Method and system for controlling and auditing content/service systems
US6870570B1 (en) * 2000-10-31 2005-03-22 Matsushita Electric Industrial Co., Ltd. Television receiver with shared data port and control software
US7409700B1 (en) * 2000-11-03 2008-08-05 The Walt Disney Company System and method for enhanced broadcasting and interactive
JP2004534978A (en) * 2000-11-16 2004-11-18 マイ ディーティービー System and method for determining the desirability of a video programming event
US7456902B2 (en) * 2000-12-04 2008-11-25 Jlb Ventures, Llc Method and system for identifying addressing data within a television presentation
US7676822B2 (en) * 2001-01-11 2010-03-09 Thomson Licensing Automatic on-screen display of auxiliary information
US20020156909A1 (en) * 2001-02-15 2002-10-24 Harrington Jeffrey M. System and method for server side control of a flash presentation
US20020112002A1 (en) * 2001-02-15 2002-08-15 Abato Michael R. System and process for creating a virtual stage and presenting enhanced content via the virtual stage
US6700640B2 (en) * 2001-03-02 2004-03-02 Qualcomm Incorporated Apparatus and method for cueing a theatre automation system
US6903782B2 (en) * 2001-03-28 2005-06-07 Koninklijke Philips Electronics N.V. System and method for performing segmentation-based enhancements of a video image
US20020152117A1 (en) * 2001-04-12 2002-10-17 Mike Cristofalo System and method for targeting object oriented audio and video content to users
JP4724420B2 (en) * 2001-04-25 2011-07-13 ウィンク・コミュニケイションズ,インコーポレイテッド Synchronous update of dynamic interactive applications
US20020178060A1 (en) * 2001-05-25 2002-11-28 Sheehan Patrick M. System and method for providing and redeeming electronic paperless coupons
US8296400B2 (en) * 2001-08-29 2012-10-23 International Business Machines Corporation System and method for generating a configuration schema
US20030145338A1 (en) * 2002-01-31 2003-07-31 Actv, Inc. System and process for incorporating, retrieving and displaying an enhanced flash movie
US8046792B2 (en) 2002-03-20 2011-10-25 Tvworks, Llc Multi-channel audio enhancement for television
KR100574733B1 (en) * 2002-03-27 2006-04-28 미쓰비시덴키 가부시키가이샤 Communication apparatus and communication method
US20040031061A1 (en) * 2002-07-31 2004-02-12 Bluestreak Technology Inc. System and method for providing real-time ticker information
AU2003268273B2 (en) * 2002-08-30 2007-07-26 Opentv, Inc Carousel proxy
US7574233B2 (en) * 2002-12-30 2009-08-11 Intel Corporation Sharing a radio frequency interface resource
US7840905B1 (en) 2003-01-06 2010-11-23 Apple Inc. Creating a theme used by an authoring application to produce a multimedia presentation
US7694225B1 (en) * 2003-01-06 2010-04-06 Apple Inc. Method and apparatus for producing a packaged presentation
US7546544B1 (en) 2003-01-06 2009-06-09 Apple Inc. Method and apparatus for creating multimedia presentations
US20050241727A1 (en) * 2004-04-29 2005-11-03 Kosmyna Michael J Vented Funnel
US20060020690A1 (en) * 2004-06-17 2006-01-26 Richards Martin J Network topology and method of operation for a playback system in a digital cinema network
KR100630897B1 (en) * 2004-07-05 2006-10-04 에스케이 텔레콤주식회사 An interactive multimedia service system and method using a mobile station
US20060284895A1 (en) * 2005-06-15 2006-12-21 Marcu Gabriel G Dynamic gamma correction
US8085318B2 (en) * 2005-10-11 2011-12-27 Apple Inc. Real-time image capture and manipulation based on streaming data
US7663691B2 (en) 2005-10-11 2010-02-16 Apple Inc. Image capture using display device as light source
US9020326B2 (en) * 2005-08-23 2015-04-28 At&T Intellectual Property Ii, L.P. System and method for content-based navigation of live and recorded TV and video programs
US9042703B2 (en) * 2005-10-31 2015-05-26 At&T Intellectual Property Ii, L.P. System and method for content-based navigation of live and recorded TV and video programs
JP2008160337A (en) * 2006-12-22 2008-07-10 Hitachi Ltd Content-linked information indicator and indicating method
US20080303949A1 (en) * 2007-06-08 2008-12-11 Apple Inc. Manipulating video streams
US8122378B2 (en) * 2007-06-08 2012-02-21 Apple Inc. Image capture and manipulation
JP5308127B2 (en) * 2008-11-17 2013-10-09 株式会社豊田中央研究所 Power supply system
CN103634610B (en) * 2012-08-24 2018-02-16 中兴通讯股份有限公司 live content distribution system and method
US10812558B1 (en) 2016-06-27 2020-10-20 Amazon Technologies, Inc. Controller to synchronize encoding of streaming content
US10652625B1 (en) * 2016-06-27 2020-05-12 Amazon Technologies, Inc. Synchronization of multiple encoders for streaming content
JP6971624B2 (en) * 2017-05-11 2021-11-24 キヤノン株式会社 Information processing equipment, control methods, and programs

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5010499A (en) * 1988-02-22 1991-04-23 Yee Keen Y Digital data capture for use with TV set or monitor
US5099319A (en) * 1989-10-23 1992-03-24 Esch Arthur G Video information delivery method and apparatus
US5351129A (en) * 1992-03-24 1994-09-27 Rgb Technology D/B/A Rgb Spectrum Video multiplexor-encoder and decoder-converter
US5262860A (en) * 1992-04-23 1993-11-16 International Business Machines Corporation Method and system communication establishment utilizing captured and processed visually perceptible data within a broadcast video signal
GB9209147D0 (en) * 1992-04-28 1992-06-10 Thomson Consumer Electronics Auxiliary video information system including extended data services
US5289276A (en) * 1992-06-19 1994-02-22 General Electric Company Method and apparatus for conveying compressed video data over a noisy communication channel
US5400401A (en) * 1992-10-30 1995-03-21 Scientific Atlanta, Inc. System and method for transmitting a plurality of digital services
US5592551A (en) * 1992-12-01 1997-01-07 Scientific-Atlanta, Inc. Method and apparatus for providing interactive electronic programming guide
CA2151456C (en) * 1992-12-09 2004-03-02 John S. Hendricks Reprogrammable terminal for suggesting programs offered on a television program delivery system
US5442389A (en) * 1992-12-28 1995-08-15 At&T Corp. Program server for interactive television system
US5557724A (en) * 1993-10-12 1996-09-17 Intel Corporation User interface, method, and apparatus selecting and playing channels having video, audio, and/or text streams
US5481542A (en) * 1993-11-10 1996-01-02 Scientific-Atlanta, Inc. Interactive information services control system
US5519780A (en) * 1993-12-03 1996-05-21 Scientific-Atlanta, Inc. System and method for providing compressed digital teletext services and teletext support services
US5583562A (en) * 1993-12-03 1996-12-10 Scientific-Atlanta, Inc. System and method for transmitting a plurality of digital services including imaging services
US5524001A (en) * 1994-02-07 1996-06-04 Le Groupe Videotron Ltee Dynamic cable signal assembly
US5537151A (en) * 1994-02-16 1996-07-16 Ati Technologies Inc. Close caption support with timewarp
US5488412A (en) * 1994-03-31 1996-01-30 At&T Corp. Customer premises equipment receives high-speed downstream data over a cable television system and transmits lower speed upstream signaling on a separate channel
US5819034A (en) * 1994-04-28 1998-10-06 Thomson Consumer Electronics, Inc. Apparatus for transmitting and receiving executable applications as for a multimedia system
US5481312A (en) * 1994-09-12 1996-01-02 At&T Corp. Method of and apparatus for the transmission of high and low priority segments of a video bitstream over packet networks
US5614940A (en) * 1994-10-21 1997-03-25 Intel Corporation Method and apparatus for providing broadcast information with indexing

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5251209A (en) * 1991-03-28 1993-10-05 Sprint International Communications Corp. Prioritizing attributes in integrated services networks
US5396494A (en) * 1991-07-01 1995-03-07 At&T Corp. Method for operating an asynchronous packet bus for transmission of asynchronous and isochronous information
US5347304A (en) * 1991-09-10 1994-09-13 Hybrid Networks, Inc. Remote link adapter for use in TV broadcast data transmission system
US5347315A (en) * 1992-01-16 1994-09-13 Matra Communication Modular encoder for generating a digital multiplex signal and decoder for processing such a signal
US5381413A (en) * 1992-12-28 1995-01-10 Starlight Networks Data throttling system for a communications network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP0788714A4 *

Cited By (71)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE42103E1 (en) 1995-10-30 2011-02-01 Disney Enterprises, Inc. Apparatus and method of automatically accessing on-line services in response to broadcast of on-line addresses
US6233736B1 (en) 1996-02-08 2001-05-15 Media Online Services, Inc. Media online service access system and method
EP0879534A4 (en) * 1996-02-08 1999-11-17 Thomas R Wolzien Media online services access system and method
EP0879534A1 (en) * 1996-02-08 1998-11-25 Thomas R. Wolzien Media online services access system and method
US6330595B1 (en) 1996-03-08 2001-12-11 Actv, Inc. Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
EP0982943A3 (en) * 1996-03-08 2000-05-10 Actv, Inc. An integrated interactive video and internet system
EP0885525A4 (en) * 1996-03-08 1998-12-23
US6513069B1 (en) 1996-03-08 2003-01-28 Actv, Inc. Enhanced video programming system and method for providing a distributed community network
US6018768A (en) * 1996-03-08 2000-01-25 Actv, Inc. Enhanced video programming system and method for incorporating and displaying retrieved integrated internet information segments
EP0982943A2 (en) * 1996-03-08 2000-03-01 Actv, Inc. An integrated interactive video and internet system
EP1411725A3 (en) * 1996-05-16 2004-05-19 Kabushiki Kaisha Infocity Information transmission and display method and information display apparatus
EP1411725A2 (en) * 1996-05-16 2004-04-21 Kabushiki Kaisha Infocity Information transmission and display method and information display apparatus
EP1626513A2 (en) * 1996-06-25 2006-02-15 Matsushita Electric Industrial Co., Ltd. Data sending/receiving system, data broadcasting method and data receiving apparatus for television broadcasting
EP1626513A3 (en) * 1996-06-25 2006-03-15 Matsushita Electric Industrial Co., Ltd. Data sending/receiving system, data broadcasting method and data receiving apparatus for television broadcasting
US6278717B1 (en) 1996-09-05 2001-08-21 Hughes Electronics Corporation Dynamic mapping of broadcast resources
US6501770B2 (en) 1996-09-05 2002-12-31 Hughes Electronics Corporation Dynamic mapping of broadcast resources
EP1626581A3 (en) * 1996-09-05 2006-12-06 The DIRECTV Group, Inc. Dynamic mapping of broadcast resources
US7075945B2 (en) 1996-09-05 2006-07-11 The Directv Group, Inc. Dynamic mapping of broadcast resources
US6728269B1 (en) 1996-09-05 2004-04-27 Hughes Electronics Corporation Device and method for efficient delivery of redundant national television signals
US7646792B2 (en) 1996-09-05 2010-01-12 The Directv Group, Inc. Dynamic mapping of broadcast resources
EP0828390A3 (en) * 1996-09-05 1999-09-08 Hughes Electronics Corporation Dynamic mapping of broadcast resources
US7292604B2 (en) 1996-09-05 2007-11-06 The Directv Group, Inc. Device and method for efficient delivery of redundant national television signals
EP0828390A2 (en) * 1996-09-05 1998-03-11 HE HOLDINGS, INC. dba HUGHES ELECTRONICS Dynamic mapping of broadcast resources
EP0864213A1 (en) * 1996-09-13 1998-09-16 Motorola, Inc. System and method for capturing internet access information
EP0864213A4 (en) * 1996-09-13 2003-03-19 Motorola Inc System and method for capturing internet access information
EP2259593A3 (en) * 1996-10-16 2011-10-05 Gemstar Development Corporation Program recording apparatus and recording schedule managing method
EP1850594A3 (en) * 1996-10-16 2008-07-09 Gemstar Development Corporation Access to internet data through a television system
WO1998020677A1 (en) * 1996-11-04 1998-05-14 Institut für Rundfunktechnik GmbH Radio broadcast signal and method for processing the same
EP0849946A3 (en) * 1996-12-13 1999-12-15 Kabushiki Kaisha Toshiba Interactive TV broadcasting system and file access method applied thereto
EP0947094A1 (en) * 1996-12-23 1999-10-06 Corporate Media Partners doing business as Americast Method and system for providing interactive look-and-feel in a digital broadcast via an x-y protocol
US8650607B2 (en) 1996-12-23 2014-02-11 Banbury Technologies Llc Method and system for providing interactive look-and-feel in a digital broadcast via an X-Y protocol
EP0947094A4 (en) * 1996-12-23 2002-05-15 Corporate Media Partners Method and system for providing interactive look-and-feel in a digital broadcast via an x-y protocol
US8756644B2 (en) 1997-03-21 2014-06-17 Inventor Holdings, Llc System and method for supplying supplemental audio information for broadcast television programs
US8402500B2 (en) 1997-03-21 2013-03-19 Walker Digital, Llc System and method for supplying supplemental audio information for broadcast television programs
EP0901284A2 (en) * 1997-09-05 1999-03-10 AT&T Corp. Internet linkage with broadcast TV
EP0901284A3 (en) * 1997-09-05 1999-05-19 AT&T Corp. Internet linkage with broadcast TV
EP0915621A2 (en) * 1997-11-06 1999-05-12 Lucent Technologies Inc. Synchronized presentation of television programming and internet content
EP0915621A3 (en) * 1997-11-06 2000-12-20 Lucent Technologies Inc. Synchronized presentation of television programming and internet content
US9055319B2 (en) 1998-07-14 2015-06-09 Rovi Guides, Inc. Interactive guide with recording
US9118948B2 (en) 1998-07-14 2015-08-25 Rovi Guides, Inc. Client-server based interactive guide with server recording
US10075746B2 (en) 1998-07-14 2018-09-11 Rovi Guides, Inc. Client-server based interactive television guide with server recording
US9055318B2 (en) 1998-07-14 2015-06-09 Rovi Guides, Inc. Client-server based interactive guide with server storage
US10027998B2 (en) 1998-07-14 2018-07-17 Rovi Guides, Inc. Systems and methods for multi-tuner recording
US9232254B2 (en) 1998-07-14 2016-01-05 Rovi Guides, Inc. Client-server based interactive television guide with server recording
US9226006B2 (en) 1998-07-14 2015-12-29 Rovi Guides, Inc. Client-server based interactive guide with server recording
US9154843B2 (en) 1998-07-14 2015-10-06 Rovi Guides, Inc. Client-server based interactive guide with server recording
US9021538B2 (en) 1998-07-14 2015-04-28 Rovi Guides, Inc. Client-server based interactive guide with server recording
US9084006B2 (en) 1998-07-17 2015-07-14 Rovi Guides, Inc. Interactive television program guide system having multiple devices within a household
US9204184B2 (en) 1998-07-17 2015-12-01 Rovi Guides, Inc. Interactive television program guide with remote access
US10271088B2 (en) 1998-07-17 2019-04-23 Rovi Guides, Inc. Interactive television program guide with remote access
US9706245B2 (en) 1998-07-17 2017-07-11 Rovi Guides, Inc. Interactive television program guide system having multiple devices within a household
US9237369B2 (en) 1998-07-17 2016-01-12 Rovi Guides, Inc. Interactive television program guide system having multiple devices within a household
US9185449B2 (en) 1998-07-17 2015-11-10 Rovi Guides, Inc. Interactive television program guide system having multiple devices within a household
US10205998B2 (en) 1999-09-29 2019-02-12 Opentv, Inc. Enhanced video programming system and method utilizing user-profile information
US9148684B2 (en) 1999-09-29 2015-09-29 Opentv, Inc. Enhanced video programming system and method utilizing user-profile information
WO2001031908A1 (en) * 1999-10-22 2001-05-03 Innovation Venture Limited Method and apparatus for the dissemination of information
FR2800960A1 (en) * 1999-11-05 2001-05-11 Thomson Multimedia Sa Method and device for allocating bandwidth to data streams in a broadcasting network
EP1098521A1 (en) * 1999-11-05 2001-05-09 THOMSON multimedia Process for allocating bandwidth to data streams in a broadcasting network
US9294799B2 (en) 2000-10-11 2016-03-22 Rovi Guides, Inc. Systems and methods for providing storage of data on servers in an on-demand media delivery system
EP1393559A1 (en) * 2001-04-17 2004-03-03 LG Electronics, Inc. Digital broadcasting system and method of controlling the same
EP1393559A4 (en) * 2001-04-17 2009-10-28 Lg Electronics Inc Digital broadcasting system and method of controlling the same
WO2003023981A3 (en) * 2001-09-12 2003-10-30 Grischa Corp A method and system for video enhancement transport alteration
WO2003023981A2 (en) * 2001-09-12 2003-03-20 Grischa Corporation A method and system for video enhancement transport alteration
US7930716B2 (en) 2002-12-31 2011-04-19 Actv Inc. Techniques for reinsertion of local market advertising in digital video from a bypass source
US9369741B2 (en) 2003-01-30 2016-06-14 Rovi Guides, Inc. Interactive television systems with digital video recording and adjustable reminders
US9071872B2 (en) 2003-01-30 2015-06-30 Rovi Guides, Inc. Interactive television systems with digital video recording and adjustable reminders
US9307281B2 (en) 2007-03-22 2016-04-05 Rovi Guides, Inc. User defined rules for assigning destinations of content
US10063934B2 (en) 2008-11-25 2018-08-28 Rovi Technologies Corporation Reducing unicast session duration with restart TV
US9125169B2 (en) 2011-12-23 2015-09-01 Rovi Guides, Inc. Methods and systems for performing actions based on location-based rules
US9532086B2 (en) 2013-11-20 2016-12-27 At&T Intellectual Property I, L.P. System and method for product placement amplification
US10412421B2 (en) 2013-11-20 2019-09-10 At&T Intellectual Property I, L.P. System and method for product placement amplification

Also Published As

Publication number Publication date
US6064438A (en) 2000-05-16
AU4364296A (en) 1996-05-15
HK1002381A1 (en) 1998-08-21
DE69534896D1 (en) 2006-05-11
EP0788714B1 (en) 2006-03-22
DE69534896T2 (en) 2006-10-12
EP0788714A1 (en) 1997-08-13
EP0788714A4 (en) 1999-10-20

Similar Documents

Publication Publication Date Title
EP0788714B1 (en) Video indexing protocol
US5694163A (en) Method and apparatus for viewing of on-line information service chat data incorporated in a broadcast television program
EP1053642B1 (en) A host apparatus for simulating two way connectivity for one way data streams
US6571392B1 (en) Receiving an information resource from the internet if it is not received from a broadcast channel
EP1110394B1 (en) Simulating two way connectivity for one way data streams for multiple parties
KR100684654B1 (en) A system for forming and processing program specific information suitable for terrestrial, cable or satellite broadcast
JP4327233B2 (en) A system that forms and processes programs, maps, and information suitable for terrestrial, cable, and satellite broadcasting
CN1166179C (en) Decoding of digital data containing program specific information
KR100573787B1 (en) Apparatus and method for decoding packetized program information, and method for processing packetized program information
JP4578040B2 (en) System for error management of program specific information in video decoder
EP0969668A2 (en) Copyright protection for moving image data
US20050048916A1 (en) Service system of thumbnail image and transmitting/receiving method thereof
WO1999035839A1 (en) A hand-held apparatus for simulating two way connectivity for one way data streams
US7734997B2 (en) Transport hint table for synchronizing delivery time between multimedia content and multimedia content descriptions
US20010050920A1 (en) Rate controlled insertion of asynchronous data into a synchronous stream
US20120066734A1 (en) System and method for transmitting data contents
EP0671106A1 (en) Auxiliary video information system including extended data services
EP0854649A2 (en) Television broadcasting system and receiver
KR20020074818A (en) Method of data transmission and reception for digital data broadcasting based on Internet contents
Barbero et al. Multilanguage opera subtitling exchange between production and broadcaster companies
CN1997151A (en) Controlling data-on-demand client access
JP2002077079A (en) Data broadcast transmission system
MXPA99010439A (en) A system for processing programs and parameter information derived from multiple sources of transmis
MXPA99010438A (en) A system for processing programs and system timing information derived from multiple sources of transmis

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TT UA UG UZ VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
121 EP: The EPO has been informed by WIPO that EP was designated in this application
WWE WIPO information: entry into national phase

Ref document number: 1995942412

Country of ref document: EP

WWP WIPO information: published in national office

Ref document number: 1995942412

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWG WIPO information: grant in national office

Ref document number: 1995942412

Country of ref document: EP