US8761162B2 - Systems and methods for applications using channel switch frames - Google Patents


Info

Publication number
US8761162B2
Authority
US
United States
Prior art keywords
csfs
csf
channel
resolution
mlc
Prior art date
Legal status
Active, expires
Application number
US11/941,014
Other languages
English (en)
Other versions
US20080127258A1 (en)
Inventor
Gordon Kent Walker
Vijayalakshmi R. Raveendran
Serafim S. Loukas, Jr.
Seyfullah Halit Oguz
Fang Shi
Sitaraman Ganapathy Subramania
Phanikumar Bhamidipati
James T. Determan
Current Assignee
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Priority to US11/941,014
Assigned to QUALCOMM INCORPORATED. Assignors: SHI, FANG; LOUKAS JR., SERAFIM S.; SUBRAMANIAM, SITARAMAN GANAPATHY; OGUZ, SEYFULLAH HALIT; BHAMIDIPATI, PHANIKUMAR; DETERMAN, JAMES T.; RAVEENDRAN, VIJAYALAKSHMI R.; WALKER, GORDON KENT (assignment of assignors' interest; see document for details).
Publication of US20080127258A1
Assigned to QUALCOMM INCORPORATED. Assignors: LOUKAS, SERAFIM S., JR.; DETERMAN, JAMES T.; WALKER, GORDON KENT; SHI, FANG; OGUZ, SEYFULLAH HALIT; BHAMIDIPATI, PHANIKUMAR; RAVEENDRAN, VIJAYALAKSHMI R.; SUBRAMANIA, SITARAMAN GANAPATHY (assignment of assignors' interest; see document for details).
Application granted
Publication of US8761162B2
Legal status: Active; expiration adjusted


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/162Authorising the user terminal, e.g. by paying; Registering the use of a subscription channel, e.g. billing
    • H04N7/163Authorising the user terminal, e.g. by paying; Registering the use of a subscription channel, e.g. billing by receiver means only
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234327Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into layers, e.g. base layer and one or more enhancement layers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27Server based end-user applications
    • H04N21/278Content descriptor database or directory service for end-user access
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/438Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving MPEG packets from an IP network
    • H04N21/4383Accessing a communication channel
    • H04N21/4384Accessing a communication channel involving operations to reduce the access time, e.g. fast-tuning for reducing channel switching latency
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44004Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving video buffer management, e.g. video decoder buffer or video display buffer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H04N21/4858End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/64Addressing
    • H04N21/6405Multicasting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface

Definitions

  • the disclosure is directed to multimedia signal processing and, more particularly, to techniques for encoding and decoding, such as using a common guide media logical channel (MLC), to enable fast acquisition and re/synchronization of the video stream while preserving compression efficiency.
  • Multimedia processing systems may encode multimedia data using encoding methods based on international standards such as Moving Picture Experts Group (MPEG)-1, -2 and -4 standards, the International Telecommunication Union (ITU)-T H.263 standard, and the ITU-T H.264 standard and its counterpart, ISO/IEC MPEG-4, Part 10, i.e., Advanced Video Coding (AVC), each of which is fully incorporated herein by reference for all purposes.
  • Such encoding methods generally are directed to compressing the multimedia data for transmission and/or storage. Compression can be broadly thought of as the process of removing redundancy from the multimedia data.
  • a video signal may be described in terms of a sequence of pictures, which include frames (an entire picture), or fields (e.g., an interlaced video stream comprises fields of alternating odd or even lines of a picture).
  • the term “frame” refers to a picture, a frame or a field.
  • Video encoding methods compress video signals by using lossless or lossy compression algorithms to compress each frame.
  • Intra-frame coding (also referred to herein as intra-coding) compresses a frame using only data within that frame, whereas inter-frame coding (also referred to herein as inter-coding) compresses a frame with reference to other frames.
  • video signals often exhibit temporal redundancy in which frames near each other in the temporal sequence of frames have at least portions that match or at least partially match each other.
  • Multimedia processors such as video encoders, may encode a frame by partitioning it into blocks or “macroblocks” of, for example, 16 ⁇ 16 pixels.
  • the encoder may further partition each macroblock into subblocks.
  • Each subblock may further comprise additional subblocks.
  • subblocks of a macroblock may include 16 ⁇ 8 and 8 ⁇ 16 subblocks.
  • Subblocks of the 8 ⁇ 16 subblocks may include 8 ⁇ 8 subblocks, which may include 4 ⁇ 4 subblocks, and so forth.
  • the term “block” refers to either a macroblock or a subblock.
  • Encoders take advantage of temporal redundancy between sequential frames using inter-coding motion compensation based algorithms.
  • Motion compensation algorithms identify portions of one or more reference frames that at least partially match a block.
  • the block may be shifted in the frame relative to the matching portion of the reference frame(s). This shift is characterized by one or more motion vector(s). Any differences between the block and partially matching portion of the reference frame(s) may be characterized in terms of one or more residual(s).
  • the encoder may encode a frame as data that comprises one or more of the motion vectors and residuals for a particular partitioning of the frame.
  • a particular partition of blocks for encoding a frame may be selected by approximately minimizing a cost function that, for example, balances encoding size with distortion, or perceived distortion, to the content of the frame resulting from an encoding.
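  • As a rough, non-authoritative illustration of the cost-function idea above, the following sketch scores a few candidate macroblock partitions with a Lagrangian cost J = D + λ·R (distortion plus weighted bit cost) and picks the cheapest; the partition labels, λ value, and distortion/bit numbers are invented for illustration and are not taken from the patent.

```python
# Illustrative rate-distortion partition selection (all numbers hypothetical).
LAMBDA = 0.85  # Lagrange multiplier trading bits against distortion (assumed)

# Candidate partitionings of one 16x16 macroblock with estimated
# (distortion, bits) pairs; the values are made up for illustration.
candidates = {
    "one 16x16": (1200.0, 96),
    "two 16x8":  (950.0, 140),
    "two 8x16":  (910.0, 150),
    "four 8x8":  (700.0, 260),
}

def rd_cost(distortion, bits, lam=LAMBDA):
    """Lagrangian cost J = D + lambda * R."""
    return distortion + lam * bits

best = min(candidates, key=lambda name: rd_cost(*candidates[name]))
for name, (d, r) in candidates.items():
    print(f"{name:10s} D={d:7.1f} R={r:4d} J={rd_cost(d, r):8.1f}")
print("selected partition:", best)
```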
  • Inter-coding enables more compression efficiency than intra-coding.
  • inter-coding can create problems when reference data (e.g., reference frames or reference fields) are lost due to channel errors, and the like.
  • reference data may also be unavailable due to initial acquisition or reacquisition of the video signal at an inter-coded frame.
  • decoding of inter-coded data may not be possible or may result in undesired errors and error propagation. These scenarios can result in a loss of synchronization of the video stream.
  • An independently decodable intra-coded frame is the most common form of frame that enables re/synchronization of the video signal.
  • the MPEG-x and H.26x standards use what is known as a group of pictures (GOP) which comprises an intra-coded frame (also called an I-frame) and temporally predicted P-frames or bi-directionally predicted B frames that reference the I-frame and/or other P and/or B frames within the GOP.
  • Longer GOPs are desirable for the increased compression rates they afford, but shorter GOPs allow for quicker acquisition and re/synchronization.
  • Increasing the number of I-frames will permit quicker acquisition and re/synchronization, but at the expense of lower compression.
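  • To make the GOP-length trade-off concrete, a viewer who tunes in at a random instant waits on average about half a GOP for the next I-frame; the frame rate and GOP lengths below are assumed purely for illustration.

```python
# Average acquisition delay vs. GOP length (illustrative numbers only).
frame_rate = 30.0  # frames per second (assumed)

for gop_length in (15, 30, 60, 120):  # frames per GOP (assumed)
    # A viewer joining mid-GOP waits, on average, half the GOP duration
    # for the next independently decodable I-frame.
    avg_delay_s = (gop_length / 2) / frame_rate
    i_frame_share = 1.0 / gop_length  # fraction of frames that are intra-coded
    print(f"GOP={gop_length:4d} frames  avg wait={avg_delay_s:5.2f} s  "
          f"I-frame share={i_frame_share:.1%}")
```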
  • a system comprising an encoder operative to generate a common guide media logical channel (MLC) of a plurality of channel switch frames (CSFs), each respective active channel being associated with a respective one or more CSFs, is provided.
  • the system also includes a decoder operative to decode a set of the plurality of CSFs, simultaneously display programming content of the decoded set of the plurality of CSFs on a display, and automatically switch to a primary bitstream of an active channel associated with a selected one displayed CSF.
  • a device comprising a decoder operative to decode programming content of a set of CSFs from a plurality of CSFs in a common guide media logical channel (MLC) is provided.
  • the decoder is further operative to simultaneously display on a display screen programming content of the decoded set of CSFs, and automatically switch to a primary bitstream of an active channel associated with a selected one displayed CSF.
  • the device further includes a memory coupled to the decoder.
  • an integrated circuit comprising a processor operative to implement a set of instructions to decode programming content of a set of a plurality of CSFs from a common guide media logical channel (MLC) is provided.
  • the processor is further operative to display simultaneously on a display screen programming content of the decoded set of the plurality of CSFs, and automatically switch to a primary bitstream of an active channel associated with a selected one displayed CSF.
  • the integrated circuit further includes a memory coupled to the processor.
  • a computer program product including a computer readable medium having instructions for causing a computer to decode programming content of a set of a plurality of CSFs from a common media logical channel (MLC) guide is provided.
  • the instructions further cause the computer to display simultaneously on a display screen content of the decoded set of the plurality of CSFs, and automatically switch to a primary bitstream of an active channel associated with a selected one displayed CSF.
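  • A minimal sketch of the decoder-side behavior summarized in the preceding paragraphs (decode the CSFs carried in the common guide MLC, present them together, and switch to the selected channel's primary bitstream); the class and method names are hypothetical and are not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class ChannelSwitchFrame:
    channel_id: int
    payload: bytes  # decoded low-resolution image data (placeholder)

class GuideDecoder:
    """Hypothetical decoder behavior for a common guide MLC of CSFs."""

    def decode_guide(self, common_guide_mlc):
        # Decode every CSF carried in the common guide MLC.
        return [ChannelSwitchFrame(ch, data) for ch, data in common_guide_mlc]

    def display_guide(self, csfs):
        # Simultaneously present the decoded CSFs as thumbnail tiles.
        for csf in csfs:
            print(f"tile for channel {csf.channel_id}")

    def on_select(self, csf):
        # Switch to the primary bitstream of the channel behind the selected tile.
        print(f"switching to primary bitstream of channel {csf.channel_id}")

guide = [(2, b"cnn"), (3, b"espn"), (4, b"fox"), (5, b"cbs")]  # (channel, CSF), assumed
decoder = GuideDecoder()
tiles = decoder.decode_guide(guide)
decoder.display_guide(tiles)
decoder.on_select(tiles[1])  # user picks the second tile
```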
  • FIG. 1 illustrates a block diagram of an exemplary multimedia communications system according to certain configurations.
  • FIG. 2A illustrates a block diagram of an exemplary encoder device.
  • FIG. 2B illustrates a block diagram of an exemplary decoder device.
  • FIG. 3 illustrates a network that comprises an aspect of a service acquisition system.
  • FIG. 4 illustrates a flowchart of a process for generation of a common guide MLC.
  • FIG. 5 illustrates a device receiving a common guide MLC.
  • FIG. 6A illustrates a transition (direct entry) from a guide thumbnail tile to a channel's primary bitstream.
  • FIG. 6B illustrates a device transition (direct entry) from a guide thumbnail to a channel primary bitstream using a channel identification.
  • FIG. 7 illustrates reception and display of a primary bitstream.
  • FIG. 8 illustrates a flowchart of the process for reduced resolution decoding of channel switch frame and display thereof.
  • FIG. 9 illustrates a flowchart of the process to access a channel's primary bitstream and display thereof.
  • FIG. 10A illustrates a channel switch frame (CSF) guide look ahead buffer and an active channel look ahead buffer.
  • FIG. 10B illustrates a timing flow diagram for CSF receiving, buffering and decoding.
  • FIG. 11 illustrates a device switching from a guide thumbnail to a stored program.
  • FIG. 12 illustrates a stored program primary bitstream with very fast forward processing.
  • FIG. 13 illustrates a block diagram of a video summary.
  • FIG. 14 illustrates a flowchart of a process for corrupted buffer replacement using CSFs.
  • FIG. 15 illustrates a CSF as a splice point to introduce commercials.
  • FIG. 16 illustrates another aspect of a common guide MLC.
  • FIG. 17A illustrates direct entry to pseudo-streaming content.
  • FIG. 17B illustrates the pseudo-streaming content being displayed.
  • FIG. 18 illustrates a portion of an encoder device with base layer to enhancement layer balancing.
  • FIG. 19 illustrates a flowchart of a process for base layer-to-enhancement layer balancing.
  • FIG. 20 illustrates a common preview MLC.
  • exemplary is used herein to mean “serving as an example, instance, or illustration.” Any configuration or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other configurations or designs, and the terms “core”, “engine”, “machine”, “processor” and “processing unit” are used interchangeably.
  • the techniques described herein may be used for wireless communications, computing, personal electronics, etc. An exemplary use of the techniques for wireless communication is described below.
  • Video signals may be characterized in terms of a series of pictures, frames, and/or fields, any of which may further include one or more slices or blocks.
  • frame is a broad term that may encompass one or more of frames, fields, pictures, slices and/or blocks.
  • Multimedia data may include one or more of motion video, audio, still images, text or any other suitable type of audio-visual data.
  • FIG. 1 illustrates a block diagram of an exemplary multimedia communications system 100 according to certain configurations.
  • the system 100 includes an encoder device 110 in communication with a decoder device 150 via a network 140 .
  • the encoder device receives a multimedia signal from an external source 102 and encodes that signal for transmission on the network 140 .
  • the encoder device 110 comprises a processor 112 coupled to a memory 114 and a transceiver 116 .
  • the processor 112 encodes data from the external (multimedia data) source and provides it to the transceiver 116 for communication over the network 140 .
  • the decoder device 150 comprises a processor 152 coupled to a memory 154 and a transceiver 156 .
  • the transceiver 156 may be substituted with a receiver.
  • the processor 152 may include one or more of a general purpose processor and/or a digital signal processor.
  • the memory 154 may include one or more of solid state or disk based storage.
  • the transceiver 156 is configured to receive multimedia data over the network 140 and provide it to the processor 152 for decoding.
  • the transceiver 156 includes a wireless transceiver.
  • the processor 152 may be implemented with one or more DSPs, micro-processors, RISCs, etc.
  • the processor 152 may also be fabricated on one or more application specific integrated circuits (ASICs) or some other type of integrated circuits (ICs).
  • the techniques described herein may be implemented in various hardware units.
  • the techniques may be implemented in ASICs, DSPs, RISCs, ARMs, digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units.
  • the network 140 may comprise one or more of a wired or wireless communication system, including one or more of an Ethernet, telephone (e.g., POTS), cable, power-line, and fiber optic systems, and/or a wireless system comprising one or more of a code division multiple access (CDMA or CDMA2000) communication system, a frequency division multiple access (FDMA) system, an orthogonal frequency division multiple access (OFDMA) system, a time division multiple access (TDMA) system such as GSM/GPRS (General Packet Radio Service)/EDGE (enhanced data GSM environment), a TETRA (Terrestrial Trunked Radio) mobile telephone system, a wideband code division multiple access (WCDMA) system, a high data rate (1xEV-DO or 1xEV-DO Gold Multicast) system, an IEEE 802.11 system, a MediaFLO system, a DMB system, a DVB-H system, and the like.
  • FIG. 2A illustrates a block diagram of an exemplary encoder device 110 that may be used in system 100 of FIG. 1 according to certain configurations.
  • the encoder device 110 comprises an inter-coding encoder element 118 , an intra-coding encoder element 120 , a reference data generator element 122 and a transmitter element 124 .
  • the inter-coding encoder 118 encodes inter-coded portions of video that are predicted temporally (e.g., using motion compensated prediction) in reference to other portions of video data located in other temporal frames.
  • the intra-coding encoder element 120 encodes intra-coded portions of video that can be decoded independently without reference to other temporally located video data. In some configurations, the intra-coding encoder element 120 may use spatial prediction to take advantage of redundancy in the other video data located in the same temporal frame.
  • the reference data generator 122 in one aspect, generates data that indicates where the intra-coded and inter-coded video data generated by the encoder elements 120 and 118 respectively are located.
  • the reference data may include identifiers of subblocks and/or macroblocks that are used by a decoder device 150 to locate a position within a frame.
  • the reference data may also include a frame sequence number used to locate a frame within a video frame sequence.
  • the transmitter 124 transmits the inter-coded data, the intra-coded data, and, in some configurations, the reference data, over a network such as the network 140 of FIG. 1 .
  • the data may be transmitted over one or more communication links.
  • the terms communication links are used in a general sense and can include any channels of communication including, but not limited to, wired or wireless networks, virtual channels, optical links, and the like.
  • the intra-coded data is transmitted on a base layer communication link and the inter-coded data is transmitted over an enhancement layer communication link.
  • the intra-coded data and the inter-coded data are transmitted over the same communication link.
  • one or more of the inter-coded data, the intra-coded data and the reference data may be transmitted over a sideband communication link.
  • a sideband communication link such as the Supplemental Enhancement Information (SEI) messages of H.264 or user_data messages of MPEG-2 may be used.
  • one or more of the intra-coded data, the inter-coded data and the reference data are transmitted over a virtual channel.
  • a virtual channel may comprise data packets containing an identifiable packet header that identifies the data packet as belonging to the virtual channel. Other forms of identifying a virtual channel are known in the art such as frequency division, time division, code spreading, etc.
  • FIG. 2B illustrates a block diagram of an exemplary decoder device 150 that may be used in system 100 of FIG. 1 according to certain configurations.
  • the decoder device 150 comprises a receiver element 158 , a selective decoder element 160 , a reference data determiner element 162 , and one or more reference data availability detectors such as a channel switch detector element 164 and an error detector element 166 .
  • the receiver 158 receives encoded video data (e.g., data encoded by the encoder 110 of FIGS. 1 and 2A ).
  • the receiver 158 may receive the encoded data over a wired or wireless network such as the network 140 of FIG. 1 .
  • the data may be received over one or more communication links.
  • the intra-coded data is received on a base layer communication link and the inter-coded data is received over an enhancement layer communication link.
  • the intra-coded data and the inter-coded data are received over the same communication link.
  • one or more of the inter-coded data, the intra-coded data and the reference data may be received over a sideband communication link.
  • a sideband communication link such as the Supplemental Enhancement Information (SEI) messages of H.264 or user_data messages of MPEG-2 may be used.
  • one or more of the intra-coded data, the inter-coded data and the reference data are received over a virtual channel.
  • a virtual channel may comprise data packets containing an identifiable packet header that identifies the data packet as belonging to the virtual channel. Other forms of identifying a virtual channel are known in the art.
  • the selective decoder 160 decodes the received inter-coded and intra-coded video data.
  • the received data comprises an inter-coded version of a portion of video data and an intra-coded version of the portion of video data.
  • Inter-coded data can be decoded after the reference data upon which it was predicted is decoded.
  • data encoded using motion compensated prediction comprises a motion vector and a frame identifier identifying the location of the reference data. If the portion of the frame identified by the motion vector and the frame identifier of the inter-coded version is available (e.g., already decoded), then the selective decoder 160 can decode the inter-coded version. If however, the reference data is not available, then the selective decoder 160 can decode the intra-coded version.
  • the reference data determiner 162 identifies received reference data that indicates where the intra-coded and inter-coded video data in the received encoded video data are located.
  • the reference data may include identifiers of subblocks and/or macroblocks that are used by the selective decoder 160 to locate a position within a frame.
  • the reference data may also include a frame sequence number used to locate a frame within a video frame sequence. Using this received reference data enables a decoder 160 to determine if the reference data upon which inter-coded data depends is available.
  • Reference data availability can be affected by a user switching a channel of a multi-channel communication system. For example, multiple video broadcasts may be available to the receiver 158 , using one or more communication links. If a user commands the receiver 158 to change to a different broadcast channel, then reference data for the inter-coded data on the new channel may not be immediately available.
  • the channel switch detector 164 detects that a channel switch command has been issued and signals the selective decoder 160 . Selective decoder 160 can then use information obtained from the reference data determiner to identify if reference data of the inter-coded version is unavailable, and then identify the location of the nearest intra-coded version and selectively decode the identified intra-coded version.
  • Reference data availability can also be affected by errors in the received video data.
  • the error detector 166 can utilize error detection techniques (e.g., forward error correction) to identify uncorrectable errors in the bitstream. If there are uncorrectable errors in the reference data upon which the inter-coded version depends, then the error detector 166 can signal the selective decoder 160 identifying which video data are affected by the errors. The selective decoder 160 can then determine whether to decode the inter-coded version (e.g., if the reference data is available) or to decode the intra-coded version (e.g., if the reference data is not available).
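  • The selective-decoding rule described above can be summarized as a small decision function; the data structures and field names below are assumptions chosen for illustration, not the patent's implementation.

```python
def choose_version(inter_version, intra_version, decoded_frames, corrupted_frames):
    """Pick which version of a video portion to decode (illustrative logic).

    inter_version: dict with 'motion_vector' and 'reference_frame' (a frame id)
    intra_version: independently decodable data (e.g., a CSF or I-frame)
    decoded_frames: set of frame ids already decoded and available
    corrupted_frames: set of frame ids with uncorrectable errors
    """
    ref = inter_version["reference_frame"]
    reference_ok = ref in decoded_frames and ref not in corrupted_frames
    if reference_ok:
        return "inter", inter_version   # prediction chain is intact
    return "intra", intra_version       # fall back to the intra-coded version

# Example: the reference frame was never decoded (e.g., after a channel switch).
kind, _ = choose_version(
    inter_version={"motion_vector": (3, -1), "reference_frame": 41},
    intra_version=b"csf-payload",
    decoded_frames={38, 39, 40},
    corrupted_frames=set(),
)
print("decoding the", kind, "version")
```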
  • one or more of the elements of the encoder device 110 of FIG. 2A may be rearranged and/or combined.
  • the elements may be implemented by hardware, software, firmware, middleware, microcode or any combination thereof.
  • one or more of the elements of the decoder 160 of FIG. 2B may be rearranged and/or combined.
  • the elements may be implemented by hardware, software, firmware, middleware, microcode or any combination thereof.
  • Certain configurations of this disclosure can be implemented using MediaFLO™ video coding for delivering realtime video services in TM3 systems using the FLO Air Interface Specification, “Forward Link Only [FLO] Air Interface Specification for Terrestrial Mobile Multimedia Multicast”, published as Technical Standard TIA-1099, August 2006, which is fully incorporated herein by reference for all purposes.
  • the channel switch frame (CSF) as used by MediaFLO™ assists channel change, as the name implies.
  • a Channel Switch Frame is a low quality, small data size structure that allows a streaming codec to acquire the stream quickly, which can be prior to the arrival of a high quality instantaneous decoding refresh (IDR) frame.
  • a CSF can alternatively be an I-frame or a fraction of the I-frame size.
  • FIG. 3 shows a network 300 that comprises an aspect of a service acquisition system.
  • the network 300 comprises a broadcast server 302 that operates to broadcast a multimedia multiplex to a device 304 using a network 306 .
  • the server 302 communicates with the network 306 through communication link 308 that comprises any suitable type of wired and/or wireless communication link.
  • the network 306 communicates with the device 304 through communication link 310 that in this aspect comprises any suitable type of wireless communication link.
  • the communication link 310 may comprise an orthogonal frequency division multiplex (OFDM) communication link known in the telecommunication industry.
  • the device 304 is a mobile telephone but may comprise any suitable device, such as a PDA, email device, pager, notebook computer, tablet computer, desktop computer or any other suitable device that operates to receive a multimedia multiplex signal.
  • the server 302 comprises source encoders 316 that operate to receive input video signals 314 .
  • 256 input video signals are input to 256 source encoders 316 .
  • aspects of the system are suitable for use with any number of input video signals and corresponding source encoders.
  • Each of the source encoders 316 produces an encoded signal that is input to a forward error correction (FEC) encoder 320 .
  • Each of the source encoders 316 also produces a channel switch video signal (also referred to as channel switch frame (CSF)) that is input to a CSF packer 318 .
  • the CSF signal is a low resolution, independently decodable version of a corresponding input signal. A more detailed description of the CSF signal is provided below.
  • the CSF packers 318 operate to pack (or encapsulate) the CSF signals and output the encapsulated CSF signals to the FEC encoder 320.
  • the CSF signal in the primary bitstream may be omitted. A saving in aggregate bitrate per channel can be achieved (which translates to lower power consumption, attributable for example to receiving, demodulating and decoding less media data) if the CSF signal is not transmitted in a channel's primary bitstream.
  • the FEC encoder 320 operates to error control encode the signals received from the source encoders 316 and the CSF packers 318 to produce error encoded blocks that are input to a pre-interleaver 322 .
  • the FEC encoder 320 provides RS coding.
  • the pre-interleaver 322 arranges the error encoded blocks so that selected blocks appear at predetermined locations in a transmission frame after the operation of a packer 324 .
  • the pre-interleaver 322 operates to perform the functions described above to maintain the continuous nature of the application data in the generated transmission frames.
  • the pre-interleaver 322 operates to arrange the error coded blocks so that they are optimized to provide fast service acquisition.
  • the packer 324 operates to encapsulate the output of the pre-interleaver 322 into a transmission frame.
  • the operation of the pre-interleaver 322 enables fast service acquisition because it positions the CSF and other important frame information at strategic locations in the transmission frame so that fast service acquisition can occur.
  • the output of the packer 324 is a transmission frame that is input to a modulator/transmitter 326 that operates to transmit a modulated transmission frame 328 over the network 306 .
  • the modulated transmission frame 328 is transmitted from the server 302 to the device 304 using the network 306 .
  • the transmission frame 328 comprises a sequence of superframes where each superframe comprises four frames.
  • the network 300 further includes a common guide MLC assembler 330 .
  • the common guide MLC assembler 330 is operatively coupled to receive the packed CSF from each independent CSF packer 318 .
  • the common guide MLC assembler 330 generates a single multicast guide media logical channel (hereinafter referred to as a “common guide MLC”).
  • the guide media logical channel is a physical layer logical channel.
  • FIG. 4 illustrates a flowchart of a process 400 for generation of a common guide MLC 550 ( FIG. 5 ).
  • the flowchart blocks may be performed in the depicted order, or these blocks or portions thereof may be performed contemporaneously, in parallel, or in a different order.
  • the process 400 for the generation of a common guide MLC 550 begins with block 402 where a CSF for an active channel is generated.
  • each source encoder 316 represents one active channel.
  • Block 402 is followed by block 404 where a determination is made whether the CSF is for the common guide MLC 550. If the determination at block 404 is “YES,” then the resolution may be (optionally) reduced at block 406.
  • Block 406 is followed by block 408 where the CSF is packed by CSF packer 318 .
  • Block 408 is followed by block 410 where the CSF is sent to the common guide MLC assembler 330 so that it may be inserted into the common guide MLC 550 , such as, through multiplexing.
  • Block 410 returns to block 402 where a CSF for an active channel is generated.
  • block 406 is represented in a dashed block to denote that this block is optional and may be a function of the capabilities of the network 300 and device 304 , as will be described in more detail later.
  • at block 404, if the determination is “NO,” then block 404 is followed by block 412 where the CSF is inserted into the primary bitstream for the channel ( FIG. 7 ) via the FEC encoder 320.
  • the block 412 loops back to block 402 .
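  • A hedged sketch of process 400 as described above: for each active channel, a CSF is generated and routed either into the common guide MLC (optionally at reduced resolution) or into that channel's primary bitstream. The function and field names are invented for illustration and are not the patent's implementation.

```python
def generate_common_guide_mlc(active_channels, reduce_resolution=True):
    """Sketch of process 400: route each active channel's CSF either to the
    common guide MLC (blocks 404-410) or into its primary bitstream (block 412)."""
    common_guide_mlc = []
    for channel in active_channels:
        csf = channel["encoder_csf"]                  # block 402: CSF for the channel
        if channel.get("for_common_guide", True):     # block 404 decision
            if reduce_resolution:                     # block 406 (optional)
                csf = csf + b"-lowres"                # placeholder for down-scaling
            packed = b"PACK:" + csf                   # block 408: CSF packer
            common_guide_mlc.append((channel["id"], packed))  # block 410: assembler
        else:
            channel["primary_bitstream"].append(csf)  # block 412: keep CSF in-band
    return common_guide_mlc

channels = [
    {"id": 2, "encoder_csf": b"csf-cnn", "primary_bitstream": []},
    {"id": 3, "encoder_csf": b"csf-espn", "primary_bitstream": []},
    {"id": 4, "encoder_csf": b"csf-fox", "primary_bitstream": [], "for_common_guide": False},
]
print(generate_common_guide_mlc(channels))
```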
  • FIG. 5 illustrates a device 304 receiving a common guide MLC 550 .
  • An exemplary configuration of the common guide MLC 550 includes a plurality of channel CSFs where each CSF is associated with a respective one channel. For example, there may be 256 active channels.
  • the common guide MLC 550 is located on channel 1 .
  • the active channels are located on channels CH 2 , CH 3 , CH 4 , CH 5 , CH 6 , CH 8 , . . . , etc.
  • the active channel CH 2 may be related to CNN and has a corresponding CSF denoted as CSF-CNN.
  • the active channel CH 3 may be related to ESPN and has a corresponding CSF denoted as CSF-ESPN.
  • the active channel CH 4 may be related to FOX and has a corresponding CSF denoted as CSF-FOX.
  • the active channel CH 5 may be related to CBS and has a corresponding CSF denoted as CSF-CBS.
  • the active channels CH 2 -CH 5 are associated with real-time continuous streaming program content.
  • the active channel CH 6 may be related to stored files and has a corresponding CSF denoted as CSF-STORED.
  • the active channel CH 7 may be related to pseudo-streaming program content and has a corresponding CSF denoted as CSF-PSEUDO.
  • the active channel CH 8 may be related to a preview channel and has a corresponding CSF denoted as CSF-PREVIEW.
  • the common guide MLC 550 may have a plurality of individually separate CSFs for direct entry to a plurality of pseudo-streaming content server(s) via a link. Likewise, for each stored channel, the common guide MLC 550 would have a separate CSF for direct entry to the stored program.
  • the device 304 is a mobile phone with video capability.
  • the device 304 may include a display 510 , a keypad 520 and a microphone/speaker combination 530 incorporated into the device housing 540 .
  • the device 304 has subscribed to receive mobile television (TV) channels or other video services in accordance with a particular subscription package.
  • the subscription packages group together one or more channels for a preset fee structure. In many cases, subscription packages are tiered, with each tier adding additional channels to the lower tier of channels. Nevertheless, the subscription packages may offer separate and distinct services available on one or more active channels. Accordingly, depending on the subscription, the device 304 may receive one or more of real-time streaming TV channels, pseudo-streaming TV channels, stored files channels, a preview channel and the common guide MLC 550 .
  • the common guide MLC 550 provides a single access point for potentially all media in a waveform (or even multiple waveforms) for quick access to a universal set of media services by a universal set of devices 304 .
  • the CSFs for all active channels are collected into the common guide MLC 550 . Therefore, regardless of the subscription package, the common guide MLC 550 serves as a single source of access and acquisition to available active channels.
  • the common guide MLC 550 is used as a single multicast media logical channel (MLC).
  • the common guide MLC 550 allows the device 304 (i.e., the device receiving the media) to tile single CSFs from multiple channel sources in the form of thumbnail tiles, directly from the common guide MLC 550 .
  • the common guide MLC 550 may be used for the acquisition of any active channel.
  • an active channel is any channel being broadcast by the network 300 .
  • the device 304 may only have access to a subset of the total available active channels.
  • there may be a common guide MLC 550 for each tiered subscription package, video service or broadcast service.
  • One common guide MLC 550 would be for a premium subscription package.
  • Another common guide MLC could be for a basic subscription package. In this example, if the basic subscription package did not permit stored programs or pseudo-streaming services, the CSFs for those services could be omitted from a basic subscription common guide MLC.
  • any one particular device 304 may be able to view all media in the common guide MLC 550 . However, access to a particular channel's primary bitstream would be blocked for those channels which are not part of the subscription service. In another aspect, if the user selects a non-subscription channel displayed on the thumbnail guide display 512 , the user may be denied viewing of those non-subscription channels from the common guide MLC 550 .
  • the decoder 160 will decode the N channel CSFs and display the thumbnail guide display 512 .
  • the thumbnail guide display 512 displays a corresponding independent channel thumbnail (THND) tile 515 A, 515 B, 515 C and 515 D for each decoded active channel CSF.
  • the display 510 displays N number of thumbnail tiles 515 A, 515 B, 515 C and 515 D.
  • N is equal to 4.
  • N may be any integer number and may be a function of the display size.
  • the currently displayed thumbnails (THND) tiles 515 A, 515 B, 515 C and 515 D are for CSF-CNN, CSF-ESPN, CSF-FOX and CSF-CBS.
  • the user of the device 304 is also able to scroll through a plurality of thumbnail (THND) tiles 515 A, 515 B, 515 C and 515 D.
  • thumbnail tiles are updated with the next channel's CSF thumbnail tile.
  • the term “next” may refer to the next channel in consecutive order, the next available channel order, or some other order.
  • This feature has an advantage (among many) that the common guide MLC 550 need not be accessed at all unless the device 304 is changing channels or displaying the common guide MLC 550 .
  • bitrate savings in aggregate bitrate per channel can be achieved (which translates to lower power consumption) if CSFs are not transmitted in the primary bitstream.
  • the guide thumbnail tile 515 B is highlighted to designate that it is a currently selected guide thumbnail tile.
  • the currently selected guide thumbnail tile 515 B is for channel CH 3 .
  • Channel CH 3 corresponds to ESPN.
  • the decoder 160 will directly enter the primary bitstream 600 -CH 3 for channel CH 3 .
  • the primary bitstream 600 -CH 3 for channel CH 3 (ESPN) has a duration denoted as PB-ESPN.
  • the primary bitstream 600 -CH 3 includes a multicast logical channel (MLC) for channel CH 3 .
  • the decoder may listen to the MLC-CH 3 to find a random access point (RAP) to enter the corresponding primary bitstream 600 -CH 3 .
  • the primary bitstream 600 -CH 3 includes at least one RAP-ESPN followed by additional program content or GOP frames.
  • the primary bitstream 600 -CH 2 includes at least one RAP-CNN followed by additional program content or coded frames.
  • the duration of the primary bitstream 600 -CH 2 is denoted by PB-CNN.
  • the primary bitstream 600 -CH 2 has associated therewith a MLC denoted as MLC-CH 2 .
  • the primary bitstream 600 -CH 4 includes at least one RAP-FOX followed by additional program content or GOP frames.
  • the primary bitstream 600 -CH 4 has associated therewith a MLC denoted as MLC-CH 4 and has a duration denoted by PB-FOX.
  • the primary bitstream 600 -CH 5 includes at least one RAP-CBS followed by additional program content or GOP frames.
  • the primary bitstream 600 -CH 5 has associated therewith a MLC denoted as MLC-CH 5 and has a duration denoted by PB-CBS.
  • the primary bitstream 600 -CH 6 includes at least one RAP-STORED followed by additional program content or GOP frames.
  • the primary bitstream 600 -CH 6 has associated therewith a MLC denoted as MLC-CH 6 with a duration denoted by PB-STORED.
  • FIG. 6A illustrates a transition (direct entry) from a guide thumbnail tile 515 B to a channel's primary bitstream using a highlighted selection.
  • FIG. 6B illustrates a transition between a guide thumbnail to a channel primary bitstream using a channel identification.
  • the common guide MLC 550 also provides for direct entry to a channel that is not next to, or sequential to, the current channel as shown in the thumbnail guide display 512 .
  • the user can enter a channel number or ID via keypad 520 while viewing the thumbnail guide display 512 to switch to his/her channel of choice (without having to browse, or scroll, through the channels in between the current channel and the channel of choice).
  • This mechanism is a cost-effective (in terms of bitrate, power consumption, etc.) alternative to other existing schemes in which a multiplex of substreams of each channel needs to be transmitted to enable direct entry.
  • FIGS. 6A, 6B and 7 are for illustrative purposes.
  • the arrangement of the channel numbers is an example and the channels do not have to occur in the same order as their corresponding CSFs in the common guide MLC 550 .
  • the entered channel number or ID is shown as an overlaid channel number 620 placed over the thumbnail guide display 512 .
  • the entered channel number is CH 4 .
  • Channel CH 4 corresponds to FOX.
  • the decoder device 150 will transition to the primary bitstream 600 -CH 4 .
  • FIG. 7 illustrates reception and display of a primary bitstream.
  • the channel selected is channel CH 3 for ESPN ( FIG. 6A ).
  • the entered channel number or ID may be placed over the program content on display 700 for channel selection. Now that the program content is being displayed, the remaining GOP or other content is decoded and displayed on display 700 .
  • the common guide MLC 550 may be transmitted at any arbitrary frequency, e.g., from once a second for fast channel switching to once every few seconds for moderate latency in channel change times with some power savings.
  • the common guide MLC 550 may be located at an arbitrary point in the transmitted multiplex, at the beginning or end, or aligned with a suitable acquisition point at the physical layer or application layer (such as to enable trick play).
  • the presence of the common guide MLC 550 can be indicated by means of acquisition metadata on every channel represented by the common guide MLC 550 (e.g., stream 0 or an alpha channel) or that pertaining to the overall multiplex.
  • the common guide MLC 550 may contain random access information not just for real-time streaming program channels but for other video services as well (e.g. stored files, pseudo-streaming, commercials, teasers, etc).
  • FIG. 16 illustrates another aspect of a common guide MLC 1600 .
  • the common guide MLC 1600 contains an ad insertion location directory CSF 1610 for all channels or some of the channels. This allows signaling and the required access frame to be found in a common location.
  • the ad-directory would include a list of ads for channels and related times or splice points such as location time X, time Y and time Z. These location times or splice points may be the same or different.
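  • One plausible way to picture the ad-insertion location directory described above is as a per-channel list of splice times; the layout below is an invented illustration, not the patent's signaling format.

```python
# Hypothetical ad-insertion directory carried as a CSF in the common guide MLC.
# Each entry maps a channel to splice points (seconds into the stream) where a
# commercial may be inserted; the channels and times are placeholder values.
ad_directory = {
    "CH2": [120.0, 480.0, 900.0],
    "CH3": [150.0, 510.0],
}

def next_splice_point(channel, playhead_s, directory=ad_directory):
    """Return the next splice point at or after the current playhead, if any."""
    return next((t for t in directory.get(channel, []) if t >= playhead_s), None)

print(next_splice_point("CH2", 400.0))  # -> 480.0
```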
  • FIG. 8 illustrates a flowchart of the process 800 for reduced resolution decoding of channel switch frame and display thereof.
  • the CSFs can be of lower resolution for the thumbnail guide display 512 with multiple tiles or for previews (e.g., when browsing through the program guide).
  • the process 800 begins with block 802 where the decoder 160 will decode the CSF of N channels at the CSF resolution. Block 802 is followed by block 804 where the resolution of the CSFs is determined.
  • the display 510 may be capable of displaying a video graphics array (VGA) type resolution.
  • the CSF may be sent at a quarter of the VGA resolution (hereinafter referred to as QVGA).
  • Block 804 is followed by block 806 where the decoder 160 will reduce (down sample) the decoded CSF resolution to the thumbnail tile size for the thumbnail guide display 512 . Since N is equal to 4 in this example, the QVGA resolution for a single CSF is further reduced by 1/N or 1/4. Thus, the displayed thumbnail (THND) tiles are each down-sampled by a quarter (Q) again to a QQVGA resolution.
  • Block 806 is followed by block 808 where the N thumbnail (THND) tiles are displayed.
  • Block 808 is followed by block 810 where a determination is made whether a channel thumbnail (THND) tile has been selected. If the determination is “NO,” the process 800 loops back to block 802 for the next interval's set of CSF frames. Alternately, a channel number may have been entered or some other action taken. Nonetheless, for this example, the channel thumbnail (THND) tile 515 B has been selected. Thus, at block 810 , the process 800 continues to FIG. 9 at block 902 .
  • Tiled displays may be, for example, in 2 ⁇ 2 landscape mode through dyadic scaling, or in 3 ⁇ 5 portrait mode through appropriate resampling of resolution and, as might potentially be necessary, frame rate.
  • Other tiled display options are possible, all of which are intended to be within the scope of the configurations described herein.
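  • The resolution arithmetic of process 800 can be checked with a few lines; assuming a VGA display and N = 4 tiles as in the example above, the transmitted QVGA CSF is down-sampled once more to a QQVGA tile.

```python
# Illustrative resolution arithmetic for the thumbnail guide (process 800),
# assuming a VGA display and N = 4 tiles as in the example above.
VGA = (640, 480)

def quarter(res):
    """Halve each dimension, i.e. keep one quarter of the pixel count."""
    return (res[0] // 2, res[1] // 2)

qvga = quarter(VGA)    # CSFs are transmitted at QVGA (blocks 802/804)
qqvga = quarter(qvga)  # each decoded CSF is down-sampled again to a tile (block 806)

print(f"display: {VGA[0]}x{VGA[1]} (VGA)")
print(f"CSF:     {qvga[0]}x{qvga[1]} (QVGA)")
print(f"tile:    {qqvga[0]}x{qqvga[1]} (QQVGA)")
# Four QQVGA tiles arranged 2x2 occupy a QVGA-sized area of the screen.
```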
  • FIG. 9 illustrates a flowchart of the process 900 to access a channel's primary bitstream and display thereof from the common guide MLC 550 .
  • the entry to an active channel's primary bitstream may be enabled by selecting the CSF tile for the chosen channel.
  • the CSF is sent at a lower resolution (resulting in bitrate savings for the CSF).
  • the device 304 does not need to scale (decimate) the CSF to get a smaller image (thumbnail tile).
  • the device 304 may still utilize the reduced resolution version to acquire an active channel by simply scaling up the thumbnail tile to the nominal resolution of the primary bitstream of the selected active channel. This process reduces the computation load for the handset and hence saves power for any thumbnail guide view.
  • the process 900 begins with block 902 based on the condition of block 810 of FIG. 8 .
  • the decoder 160 will decode the CSF of the selected thumbnail tile in progress and upscale the resultant image to the resolution of the primary bitstream.
  • Block 902 is followed by block 904 where access to the primary bitstream at the next available access point takes place. In this case of FIG. 6A , RAP-ESPN is accessed.
  • Block 904 is followed by block 906 where the primary bitstream is decoded at the primary bitstream resolution.
  • the device 304 will display the decoded program content of the primary bitstream 600 -CH 3 ( FIG. 7 ).
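  • A hedged sketch of process 900: the selected CSF is upscaled to the primary resolution as a bridge image, the decoder waits for the next random access point (RAP), and then decodes the primary bitstream at full resolution. The data structures and names are assumptions for illustration.

```python
def enter_primary_bitstream(selected_csf, primary_bitstream, target_resolution):
    """Sketch of process 900: bridge from a selected CSF tile into the chosen
    channel's primary bitstream (names and structures are assumptions)."""
    # Block 902: upscale the low-resolution CSF image to the primary resolution
    # so something is on screen while waiting for the next access point.
    yield {"image": selected_csf["image"], "scaled_to": target_resolution}

    # Block 904: wait for the next random access point (RAP) in the primary stream.
    rap_index = next(i for i, unit in enumerate(primary_bitstream) if unit["is_rap"])

    # Block 906: decode from the RAP onward at the primary bitstream resolution.
    for unit in primary_bitstream[rap_index:]:
        yield {"image": unit["payload"], "scaled_to": target_resolution}

stream = [
    {"is_rap": False, "payload": "P-frame"},
    {"is_rap": True,  "payload": "RAP-ESPN"},
    {"is_rap": False, "payload": "GOP frames"},
]
for frame in enter_primary_bitstream({"image": "csf-espn"}, stream, (640, 480)):
    print(frame)
```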
  • FIG. 10A illustrates a channel switch frame (CSF) guide look ahead buffer 1010 and an active channel look ahead buffer 1020 .
  • the active channels are generally accessed by the device 304 in guide order. This means that there is a high probability that the next channel to acquire is one of two choices.
  • By placing the CSF in both the guide order in the CSF guide look ahead buffer 1010 and adjacent channel order in the active channel look ahead buffer 1020 , it is possible to have the next second's start point this second.
  • a specific gain to this approach is that the device 304 can be assured of some video for the next second independent of the time at which the channel change key is pressed.
  • the device 304 has to buffer the adjacent channels' CSFs in order to achieve this effect. Another specific gain is to assure video for a next channel.
  • the (CSF) guide look ahead buffer 1010 includes a plurality of buffer sections 1012 , 1014 , and 1016 .
  • the active channel look ahead buffer 1020 may be similar to the CSF guide look ahead buffer 1010 . Hence, no further discussion of the active channel look ahead buffer 1020 is provided.
  • the arrow from the thumbnail guide display 1000 to the thumbnail tiles 1002 and 1004 serves to indicate that the user is scrolling to an adjacent channel in guide order off of the thumbnail guide display 1000 .
  • the buffer section 1014 is for the current CSF denoted by CSF(i) for a current active channel in the guide order.
  • the highlighted thumbnail tile for channel CH 5 is denoted as THND CSF-CBS in column (i) and represents the current channel in guide order.
  • the buffer section 1014 stores the data associated with the next start point CSF(i) for channel (i).
  • the buffer section 1016 stores the data associated with the next start point CSF(i+1) for channel (i+1) where CSF(i+1) is the CSF for the next adjacent channel to the right in guide order.
  • the next adjacent channel may be associated with channel CH- 6 in the column denoted as (i+1).
  • the next adjacent channel may be channel CH 7 if the orientation on the display is followed for the adjacent channel identification.
  • the buffer section 1012 stores the data associated with the next start point CSF(i ⁇ 1) for channel (i ⁇ 1) where CSF(i ⁇ 1) is the CSF for the next adjacent channel to the left in guide order.
  • the next adjacent channel is channel CH- 4 in the column denoted as (i ⁇ 1).
  • the description above is for four (4) tiles and would depend on the number of tiles displayed at one instance and the arrangement.
  • the buffer sections 1012 , 1014 and 1016 may store more than one CSF.
  • the stored CSF(i) may be multiple CSFs, one for each consecutively aligned next time interval T 1 , T 2 , T 3 , etc.
  • FIG. 10B illustrates a timing flow diagram for CSF receiving, buffering and decoding.
  • the flow diagram for a current time window includes decoding a current CSF(i) at block 1070 .
  • the current CSF(i) is for a current time interval for a current channel.
  • Block 1070 is followed by block 1072 where the currently decoded CSF(i) is displayed.
  • the device 304 is also receiving the next in time one or more CSFs at block 1050 . While receiving and decoding may take place essentially simultaneously, the received CSF is for the next time window.
  • Block 1050 is followed by blocks 1060 , 1062 and 1064 .
  • the received CSFs are buffered.
  • the received CSFs are buffered in guide order.
  • the decoding block 1070 finishes decoding the CSF data, the video data is displayed and is spent or consumed during the current time window. As time advances to the next in time instance, the decoding operation needs to be fed the next in time buffered CSF(i). The next in time buffered CSF(i) becomes the current CSF(i) for decoding in the current time window.
  • the decoding operation needs to start decoding the CSF for the current time window. Hence, the decoding operation immediately needs the next in time buffered CSF data for the selected channel (i+1).
  • block 1074 is followed by block 1076 where the CSF(i) in the current time window is set to the CSF(i+1).
  • the decoding operation at block 1070 is essentially immediately fed the CSF(i+1) data from block 1064.
  • the buffer sections ( FIG. 10A ) would be adjusted accordingly for the guide order arrangement in accordance with the new channel selection. In other words, CSF(i+1) becomes CSF(i) at block 1076 .
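A rough sketch of the timing flow of FIG. 10B is shown below, reusing the illustrative buffer sketch above; the receiver, decoder, and channel-change callback are hypothetical stand-ins, not disclosed interfaces.

```python
def playback_loop(receiver, buffer, decoder, get_channel_change):
    """Sketch of FIG. 10B: decode the current CSF(i) while receiving and
    buffering the CSFs for the next time window."""
    current_csf = receiver.receive_initial_csf()
    while current_csf is not None:
        # Blocks 1070/1072: decode and display CSF(i) for the current window.
        decoder.decode_and_display(current_csf)

        # Blocks 1050/1060-1064: meanwhile the CSFs for the next window
        # arrive and are buffered in guide order.
        for offset, csf in receiver.receive_next_window():
            buffer.push(offset, csf)

        # Blocks 1074/1076: a channel change re-centres the buffer so that
        # CSF(i+1) becomes CSF(i); otherwise continue on the same channel.
        change = get_channel_change()          # -1, 0, or +1
        if change:
            buffer.shift(change)
        current_csf = buffer.next_start_point()
```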
  • FIG. 11 illustrates a device 304 switching from a guide thumbnail tile THND CSF-STORED to a stored program 1112 in device memory 1110 .
  • the device 304 may subscribe to a subscription package that allows for broadcast programs to be stored for future playback by the user at any time or during a predetermined timed window.
  • the device 304 may automatically download sports highlights from the primary bitstream 600 -CH 6 when accessed, either manually or automatically.
  • the decoder 160 will automatically switch to the stored program 1112 in the device memory 1110 .
  • the primary bitstream 600 -CH 6 need not be accessed unless, the stored program is being updated or another one is being automatically or selectively stored in the device memory 1110 .
  • FIG. 12 illustrates a stored program 1112 primary bitstream with very fast forward processing.
  • When the CSF is applied, for example, in a stored video file, the CSF can be used as a simple fast forward mechanism.
  • a fast forward operation via playback of only I-frames and P-frames has rate limitations.
  • the possible adaptive GOP structure makes the use of I-frames alone unrealistic due to the highly nonlinear timing.
  • the CSF can be, by its nature, periodic and possibly at 1 frame per second so a very fast forward operation (e.g. 30×) is possible. This provides a very fast forward operation (and all other speeds in between), and potentially inherently in linear time.
  • An exemplary primary bitstream for a stored program 1112 may include a CSF 1200A followed by program data PD(i) 1200B where i represents a current time interval.
  • the CSFs 1200A, 1202A, 1204A and 1206A are repeated every 1 second.
  • Between the CSFs 1200A, 1202A, 1204A and 1206A are PD(i) 1200B, PD(i+1) 1202B, PD(i+2) 1204B and PD(i+3) 1206B. While one-second intervals are shown, other increments of time may be substituted.
  • the CSFs may provide points of access. The more CSFs, the more rapid the pace for fast forwarding. In general, the CSF may be 20% the size of an I-frame used for access. Thus, a plurality of CSFs may be substituted for a single I-frame.
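A minimal sketch of a CSF-only scan over the stored bitstream of FIG. 12 follows; the stored_program iterable and the is_csf attribute are illustrative assumptions. With one CSF per second shown back-to-back at a nominal 30 fps display rate, roughly 30× playback is obtained.

```python
def fast_forward(stored_program, decoder, display, speed=30):
    """Sketch: scan a stored program by decoding only the periodic CSFs of
    FIG. 12 and skipping the program data PD(i) in between."""
    next_time_s = 0.0
    for timestamp_s, unit in stored_program:
        if not unit.is_csf:
            continue                  # only CSFs are used as access points
        if timestamp_s >= next_time_s:
            display(decoder.decode(unit))
            # Decoding every one-second CSF at 30 fps approximates a 30x scan;
            # a larger 'speed' value skips CSFs for an even faster scan.
            next_time_s = timestamp_s + speed / 30.0
```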
  • FIG. 13 illustrates a block diagram of a video summary 1300 .
  • the video summary 1300 is a packet and has several applications (e.g. video indexing) as will become evident from the description provided herein.
  • the video summary 1300 can be generated using just the CSFs 1302, 1306 (and potentially, as desired, M additional frames 1304-1, 1304-2, . . . , 1304-M that follow in decode order). Additionally, CSFs 1302, 1306 can serve as a periodic index (or glance) into the content on any given channel and also enable searching. This is typically applied to (but not restricted to) stored program 1112 or to pseudo-streaming video where video data is buffered before playback.
  • the video summary 1300 can also be generated using transitional effects such as cross fades; for example, M frames may be generated between two CSFs 1302 and 1306 as their linear combination, for example using alpha blending techniques. Additionally, this mechanism can also be used when switching between two (2) channels in mobile TV applications.
  • the video summary 1300 may be stored for a plurality of active channels to supply a video index.
  • the video summary 1300 may be used for channel CH 8 .
  • the common guide MLC 550 would provide the video summary 1300 without the need to access any primary bitstream; alternately, a common preview MLC 2000 ( FIG. 20 ) may be accessed.
  • the video summary 1300 would provide just enough video content for a snippet (brief video clip) to allow the user to preview program content.
  • the video summary may be used in other instances such as a video clip teaser.
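A small sketch of the cross-fade transition mentioned above, assuming decoded CSF images are available as flat sequences of pixel values of identical size (names are illustrative):

```python
def cross_fade(csf_frame_a, csf_frame_b, num_frames):
    """Generate M intermediate frames between two decoded CSF images as their
    linear combination (alpha blending), plus the two CSF frames themselves."""
    summary = [csf_frame_a]
    for m in range(1, num_frames + 1):
        alpha = m / (num_frames + 1)
        blended = [(1.0 - alpha) * a + alpha * b
                   for a, b in zip(csf_frame_a, csf_frame_b)]
        summary.append(blended)
    summary.append(csf_frame_b)
    return summary
```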
  • FIG. 20 illustrates a common preview MLC 2000 .
  • the common preview MLC 2000 includes at least one video summary 2010 , 2012 , 2014 , 2016 , 2018 , and 2020 for a plurality of active channels CH 2 , CH 3 , CH 4 , CH 5 , CH 6 , and CH 7 , respectively.
  • An exemplary video summary for each active channel CH 2 , CH 3 , CH 4 , CH 5 , CH 6 , and CH 7 is shown in FIG. 13 .
  • the video summaries in the common preview MLC 2000 may be displayed in a similar manner as described in relation to the common guide MLC 550 . Furthermore, selection of one of the displayed video summaries or entry of a channel number may provide direct entry to the primary bitstream.
  • an ad insertion location directory CSF 2022 may be included.
  • the ad insertion location directory CSF 2022 is associated with channel CH 9 .
  • FIG. 14 illustrates a flowchart of a process 1400 for corrupted buffer replacement with CSFs.
  • the CSF is nominal data that when provided to a video buffer (such as buffer 1010 ) allows start up of the decoder 160 .
  • the CSF in buffer 1010 may be used as a replacement for the corrupted buffer content.
  • the decoder 160 may scale the motion vectors for the appropriate temporal location.
  • the encoder device 110 may facilitate appropriate prediction from the CSF to avoid or minimize drift artifacts.
  • the process 1400 begins with block 1402 where the video frames from the primary bitstream are received.
  • Block 1402 is followed by block 1404 where the video frames from the primary bitstream are buffered.
  • Block 1404 is followed by block 1406 where a determination is made whether the buffer's data is corrupted. If the determination at block 1406 is “NO,” the video frames are decoded at block 1412 . However, if the determination at block 1406 is “YES,” then block 1406 is followed by block 1408 where the buffered CSFs are retrieved.
  • Block 1408 is followed by block 1410 where the buffered video frames are replaced with at least one CSF associated with the channel.
  • Block 1410 is followed by block 1412 where the at least one CSF is decoded.
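A minimal sketch of process 1400 is given below; the receive callback, buffer object, CSF store, and integrity check are illustrative assumptions, not the patent's implementation.

```python
def process_frames(receive_frames, buffer, csf_store, decoder, channel):
    """Sketch of process 1400: decode buffered primary frames normally, but
    replace corrupted buffer contents with the buffered CSF(s) for the channel."""
    # Blocks 1402/1404: receive and buffer video frames from the primary bitstream.
    for frame in receive_frames():
        buffer.append(frame)

    # Block 1406: check buffer integrity (stand-in for CRC / syntax checks).
    if buffer_is_corrupted(buffer):
        # Blocks 1408/1410: retrieve the buffered CSFs for this channel and
        # replace the corrupted contents with them.
        buffer.clear()
        buffer.extend(csf_store.get(channel, []))

    # Block 1412: decode whatever the buffer now holds (frames or CSFs).
    return [decoder.decode(unit) for unit in buffer]

def buffer_is_corrupted(buffer):
    # Placeholder integrity check; a real receiver would use CRCs or
    # bitstream syntax checks here.
    return any(getattr(unit, "corrupted", False) for unit in buffer)
```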
  • a CSF for an active channel in the common guide MLC 550 is not provided during commercials and the presence of a commercial can be detected by this or other means.
  • the CSFs in one aspect, include programming content other than commercials.
  • switching can occur within the same channel from one program segment to the next (this is possible when there is sufficient buffering of data before decode and playback).
  • the user can choose not to join a channel if a commercial is observed to be playing on the desired channel. Alternatively, no acquisition of CSF is triggered during commercials.
  • FIG. 15 illustrates a CSF 1500 as a splice point or splicing mechanism to introduce commercials in the midst of regular programming broadcast on the primary bitstream. Additionally, a CSF can be used for forced viewing of commercials or teasers or video for other marketing applications. Thus, the CSF is a tool to enable flexible viewing for commercials.
  • the CSF 1500 would include information related to commercial data 1502 .
  • FIG. 17A illustrates direct entry to pseudo-streaming content 1730 .
  • Pseudo-streaming may be a combination of continuous buffering and playback. If the user selects the THND for CSF-PSEUDO (shown highlighted) in the thumbnail guide display 1700, a link may be embedded therein for direct entry to a remote pseudo-streaming server 1720 via network 1710.
  • the remote pseudo-streaming server 1720 provides access to a respective one or more files associated with pseudo-streaming content 1730 in accordance with the link.
  • the device 304, after selecting the THND CSF-PSEUDO, begins buffering the pseudo-streaming content 1730 from the server 1720 through the network, followed by playback.
  • FIG. 17B illustrates the pseudo-streaming content 1730 being displayed on display 1750 via a playback operation.
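A minimal sketch of pseudo-streaming as a combination of continuous buffering and playback; fetch_chunk and play_chunk are assumed callables and do not correspond to any actual server interface described here.

```python
def pseudo_stream(fetch_chunk, play_chunk, prebuffer_chunks=3):
    """Sketch: fill a small playout buffer, then play while buffering continues."""
    buffered = []
    # Continuous buffering: fill an initial playout buffer before starting.
    while len(buffered) < prebuffer_chunks:
        chunk = fetch_chunk()
        if chunk is None:
            break
        buffered.append(chunk)
    # Playback interleaved with further buffering (serialised here for clarity).
    while buffered:
        play_chunk(buffered.pop(0))
        chunk = fetch_chunk()
        if chunk is not None:
            buffered.append(chunk)
```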
  • the CSF is a flexible tool to provide layered rate balancing.
  • the location of the channel switch frame (CSF) may be adjusted between the base and enhancement layers of a layered codec system. This provides a simple mechanism to change the data rate balance between the two layers.
  • One of many specific gains of this technique is that it is very simple to implement, and the enhancement rate balance reduces the overall network capacity required to carry a specific channel, which in turn reduces power consumption.
  • the CSF size may be adapted to the available space (e.g., by means of quantization).
  • the base layer size and enhancement layer size are application layer constrained.
  • FIG. 18 illustrates a portion of encoder device 1800 with base layer to enhancement layer balancing.
  • the encoder device 1800 includes a source encoder 1805 similar to the source encoder 316 in FIG. 3 .
  • the source encoder 1805 includes a primary bitstream encoding engine 1810 to encode and generate a primary bitstream.
  • the source encoder 1805 also includes a CSF generator 1815 to generate a CSF.
  • the output of the primary bitstream encoding engine 1810 is sent to an enhancement layer generator 1820 and to a base layer generator 1830 shown in parallel.
  • the output of the enhancement layer generator 1820 produces an enhancement layer signal with an enhancement layer size (ELS).
  • the base layer generator 1830 outputs a base layer signal with a base layer size (BLS).
  • the outputs from the enhancement layer generator 1820 and the base layer generator 1830 are sent to a base-to-enhancement layer equalizer 1840 to equalize the enhancement layer-to-base layer ratio.
  • the base-to-enhancement layer equalizer 1840 includes an equalizing CSF insertor 1845 which generates a variable CSF via CSF generator 1815 to equalize the enhancement layer-to-base layer ratio.
  • the CSF may be varied such as by quantization to equalize the BLS to the ELS within some margin.
  • FIG. 19 illustrates a flowchart of a process 1900 for base layer-to-enhancement layer balancing.
  • the process 1900 begins with block 1902 where the primary bitstream is encoded. Block 1902 is followed by blocks 1904 and 1906.
  • the base layer is generated.
  • the enhancement layer is generated.
  • Block 1904 is followed by block 1908 where the BLS is determined.
  • Block 1906 is followed by block 1910 where the ELS is determined.
  • Blocks 1908 and 1910 are followed by block 1912 , where a determination is made whether the ratio of BLS-to-ELS is equal to some preset ratio (X).
  • the preset ratio (X) may be 1 or some other number.
  • the term equal represents the ratio being within some marginal difference.
  • If the determination at block 1912 is “NO,” block 1914 determines whether the BLS is less than the ELS as a function of the preset ratio. If the determination at block 1914 is “NO,” block 1914 is followed by block 1916. At block 1916, a CSF is generated and inserted into the enhancement layer so that the ELS is equalized to the BLS as a function of the preset ratio.
  • If the determination at block 1914 is “YES,” the CSF is generated and inserted into the base layer so that the BLS is equalized to the ELS as a function of the preset ratio.
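A rough sketch of the balancing decision of process 1900, assuming "equal" means within a small margin and assuming a hypothetical make_csf(target_size) helper that generates a CSF quantized to roughly the requested size:

```python
def balance_layers(base_layer_size, enh_layer_size, make_csf,
                   preset_ratio=1.0, margin=0.05):
    """Sketch of process 1900: decide which layer receives an equalizing CSF."""
    # Blocks 1908/1910/1912: compare BLS-to-ELS against the preset ratio (X),
    # treating ratios within 'margin' of X as equal.
    ratio = base_layer_size / enh_layer_size
    if abs(ratio - preset_ratio) <= margin * preset_ratio:
        return None, None                      # already balanced; no CSF needed

    # Block 1914: determine which layer is light.
    if base_layer_size < enh_layer_size * preset_ratio:
        # Insert a CSF into the base layer sized to close the gap.
        deficit = enh_layer_size * preset_ratio - base_layer_size
        return "base", make_csf(deficit)
    else:
        # Block 1916: insert a CSF into the enhancement layer instead.
        deficit = base_layer_size / preset_ratio - enh_layer_size
        return "enhancement", make_csf(deficit)
```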
  • the CSF may be partitioned into base and enhancement components (e.g. signal-to-noise ratio (SNR) scalable CSF) to achieve balance (equalization) on a finer level.
  • the size of base and enhancement components can be varied to adapt to space available for a target bandwidth ratio between base and enhancement layers. This ratio may be dictated by, for example, the physical layer energy ratios.
  • the CSF can be coded such that it enhances the quality of the corresponding base layer frame. This is of particular benefit when the enhancement layer is lost or not transmitted or received based on the system bandwidth or transmission channel error conditions. This is different from straight SNR scalability in that the CSF is independently decodable with the combination of the corresponding P- and/or B-frame(s).
  • the CSF may be placed arbitrarily to provide regular access points, i.e., temporal locations where access to the video stream is desirable, such as shown in FIG. 12 . It is possible to further optimize this process by evaluating the potential locations for access.
  • a P frame within an H.264 stream may contain both I and P data. If it is observed that a specific P frame contains a large amount of I data as in, for example, a partial scene change (optionally a bias toward intra can be applied during mode decision), the associated CSF required to join the stream at this point may be quite small.
  • By evaluating the residual data size required to join the stream at all or some of the possible locations, it is possible to reduce the required CSF size.
  • Some of the specific gains of this application include a reduced data rate and, as a result, lower power consumption.
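A minimal sketch of the placement evaluation described above; the frame attributes (time_s, intra_fraction, size_bits) and thresholds are illustrative assumptions.

```python
def choose_csf_locations(frames, max_csf_bits, min_spacing_s=1.0):
    """Sketch: prefer temporal locations where the P frame already carries a
    large share of intra data, so the CSF needed to join the stream is small."""
    locations = []
    last_time = float("-inf")
    for frame in frames:
        if frame.time_s - last_time < min_spacing_s:
            continue                           # keep access points roughly periodic
        # Residual the CSF must carry is roughly the non-intra share of the frame.
        estimated_csf_bits = (1.0 - frame.intra_fraction) * frame.size_bits
        if estimated_csf_bits <= max_csf_bits:
            locations.append(frame.time_s)
            last_time = frame.time_s
    return locations
```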
  • the CSFs provide a flexible tool for opportunistic injection of switch frame locations in a myriad of instances.
  • the CSF itself could be coded as an Intra, P or B frame.
  • the CSF could be coded such that the transformed (and quantized) coefficients of the CSF data may be hidden in the transform domain coefficients of the corresponding base and/or enhancement layer(s) data (and/or the corresponding single layer data), effectively before entropy coding, to minimize the compression overhead of entropy coding two different streams of coefficients separately.
  • CSF augments the primary data for a channel
  • only the difference information need be coded in the primary data.
  • the remaining information can be extracted from the CSF. For example, when the CSF is to be located at the temporal location of a P-picture, this P-frame may be coded as one where mode decision is biased towards intra (thus increasing the probability of intra macroblocks).
  • the remaining inter macroblocks are coded in the P-frame and the intra macroblocks are sent in the CSF.
  • the CSF can also be coded as a P-frame.
  • the CSF provides a flexible tool that has error robustness.
  • the decoder 160 can force a channel change to the same channel thus invoking the CSF.
  • the CSF by virtue of its location (in the same or separate guide MLC 550 ) and temporal distance may provide the diversity (temporal and/or frequency and/or spatial and/or code, as in code block) required to protect it from the same errors that contaminated the primary data. Hence recovery from errors can be facilitated using CSFs. Partial recovery in the case of random errors is also possible wherein the intra data in the CSF can be used to recover lost macroblocks in the corresponding predicted frame (P or B) through spatial or temporal concealment methods.
  • the (CSF) guide look ahead buffer 1010 stores CSFs in guide order while the active channel look ahead buffer 1020 stores CSFs in active channel order.
  • the active channel order may not be the same order as the guide order and/or may be displaced in time.
  • a forced channel change, when invoked, could replace the user-initiated channel change at block 1074.
  • the decoder 160 can force a channel change to the same channel (i) thus invoking the CSF(i) from the buffer 1060 .
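A small sketch of this error-recovery behaviour, reusing the illustrative look ahead buffer sketch earlier; the decoder object and method names are assumptions.

```python
def handle_decode_error(decoder, buffer, current_channel_offset=0):
    """Sketch: on errors that contaminate the primary data, force a 'channel
    change' to the same channel so the buffered CSF(i) is decoded as a clean
    re-entry point."""
    recovery_csf = buffer.next_start_point(current_channel_offset)
    if recovery_csf is None:
        return None                            # no buffered CSF; wait for the next one
    # Decoding the CSF refreshes the reference state; intra data in the CSF can
    # also be used to conceal lost macroblocks in the corresponding P/B frame.
    return decoder.decode(recovery_csf)
```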
  • the CSF is more effective than existing methods, for example, redundant coded picture or SI and SP pictures in H.264, since CSF can be encapsulated in a separate transport packet (or exposed to one or more OSI layers or encapsulation layers in the protocol stack).
  • Such isolation provides the flexibility needed for acquisition applications (some of which are described in this document) and for error recovery purposes in terms of the diversity or separation (some of which are described in this document).
  • A redundant coded picture is associated with a picture, is tagged with the encoded frame, and coexists with the coded data for the picture.
  • CSF can be coded based on which blocks are not intra-refreshed in the CSF (i.e., choose to refresh more important blocks, e.g. those that are referenced by most future macroblocks).
  • the CSF is a flexible tool to accommodate decoders with different capabilities.
  • devices 304 with varied capabilities (in terms of computation, processor, display, power limitations, etc.) exist in the system.
  • the network or server transmits a signal that is typically of the latest version and is backward compatible with older versions of the decoder devices 150.
  • the CSF can be used to provide such backward compatibility (to accommodate a variety of decoders in general), where decoders that are starved of computational power can decode the CSF instead of the corresponding full blown (in terms of quality, size or resolution) coded reference picture.
  • the CSF sent in the common guide MLC 550 of FIG. 6A may be a versionless CSF.
  • the versionless CSF would be able to be decoded by any current device 304 and any predecessors.
  • the predecessors may only be able to decode the video summary of a channel using the versionless CSFs.
  • DSP digital signal processor
  • ASIC application specific integrated circuit
  • FPGA field programmable gate array
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form or combination of storage medium known in the art.
  • An example storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC).
  • the ASIC may reside in a wireless modem.
  • the processor and the storage medium may reside as discrete components in the wireless modem.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Television Systems (AREA)
US11/941,014 2006-11-15 2007-11-15 Systems and methods for applications using channel switch frames Active 2029-05-22 US8761162B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/941,014 US8761162B2 (en) 2006-11-15 2007-11-15 Systems and methods for applications using channel switch frames

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US86601806P 2006-11-15 2006-11-15
US11/941,014 US8761162B2 (en) 2006-11-15 2007-11-15 Systems and methods for applications using channel switch frames

Publications (2)

Publication Number Publication Date
US20080127258A1 US20080127258A1 (en) 2008-05-29
US8761162B2 true US8761162B2 (en) 2014-06-24

Family

ID=39402490

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/941,014 Active 2029-05-22 US8761162B2 (en) 2006-11-15 2007-11-15 Systems and methods for applications using channel switch frames

Country Status (9)

Country Link
US (1) US8761162B2 (ja)
EP (1) EP2098077A2 (ja)
JP (1) JP2010510725A (ja)
KR (1) KR101010881B1 (ja)
CN (2) CN101536524B (ja)
BR (1) BRPI0718810A2 (ja)
CA (1) CA2669153A1 (ja)
RU (1) RU2009122503A (ja)
WO (1) WO2008061211A2 (ja)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100329328A1 (en) * 2007-06-26 2010-12-30 Nokia, Inc. Using scalable codecs for providing channel zapping information to broadcast receivers

Families Citing this family (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9077991B2 (en) * 2002-12-10 2015-07-07 Sony Computer Entertainment America Llc System and method for utilizing forward error correction with video compression
US8670437B2 (en) * 2005-09-27 2014-03-11 Qualcomm Incorporated Methods and apparatus for service acquisition
US8229983B2 (en) 2005-09-27 2012-07-24 Qualcomm Incorporated Channel switch frame
WO2007102147A2 (en) * 2006-03-07 2007-09-13 Bitband Technologies Ltd. Personalized insertion of advertisements in streaming media
CA2783599C (en) * 2006-11-14 2013-06-25 Qualcomm Incorporated Systems and methods for channel switching
US20080205529A1 (en) * 2007-01-12 2008-08-28 Nokia Corporation Use of fine granular scalability with hierarchical modulation
US20080212673A1 (en) * 2007-03-01 2008-09-04 Peisong Chen Systems and Methods for Adaptively Determining I Frames for Acquisition and Base and Enhancement Layer Balancing
US8700792B2 (en) 2008-01-31 2014-04-15 General Instrument Corporation Method and apparatus for expediting delivery of programming content over a broadband network
US8752092B2 (en) * 2008-06-27 2014-06-10 General Instrument Corporation Method and apparatus for providing low resolution images in a broadcast system
FR2933814B1 (fr) 2008-07-11 2011-03-25 Commissariat Energie Atomique Electrolytes liquides ioniques comprenant un surfactant et dispositifs electrochimiques tels que des accumulateurs les comprenant
JP2011529674A (ja) * 2008-07-28 2011-12-08 トムソン ライセンシング スケーラブル・ビデオ符号化(svc)ストリームを用いて高速チャネル変更を行う方法および装置
JP5748940B2 (ja) * 2009-01-28 2015-07-15 京セラ株式会社 放送装置および放送受信装置
JP5797371B2 (ja) * 2009-01-28 2015-10-21 京セラ株式会社 放送装置および放送受信装置
KR101570696B1 (ko) * 2009-05-29 2015-11-20 엘지전자 주식회사 영상표시장치 및 그 동작방법
US9237296B2 (en) 2009-06-01 2016-01-12 Lg Electronics Inc. Image display apparatus and operating method thereof
KR101551212B1 (ko) * 2009-06-02 2015-09-18 엘지전자 주식회사 영상표시장치 및 그 동작방법
CA2766148A1 (en) * 2009-06-24 2011-01-13 Delta Vidyo, Inc. System and method for an active video electronic programming guide
US20110135411A1 (en) * 2009-12-03 2011-06-09 Inman John E Drill template with integral vacuum attach having plugs
US9357244B2 (en) 2010-03-11 2016-05-31 Arris Enterprises, Inc. Method and system for inhibiting audio-video synchronization delay
US8689269B2 (en) * 2011-01-27 2014-04-01 Netflix, Inc. Insertion points for streaming video autoplay
JP5802953B2 (ja) * 2012-03-31 2015-11-04 株式会社アクセル 動画再生処理方法及びその方法を採用した携帯情報端末
WO2013077236A1 (en) * 2011-11-21 2013-05-30 Canon Kabushiki Kaisha Image coding apparatus, image coding method, image decoding apparatus, image decoding method, and storage medium
US20130282917A1 (en) * 2012-04-24 2013-10-24 Vid Scale, Inc. Method and apparatus for smooth stream switching in mpeg/3gpp-dash
US9571827B2 (en) * 2012-06-08 2017-02-14 Apple Inc. Techniques for adaptive video streaming
WO2014042663A1 (en) * 2012-09-12 2014-03-20 Thomson Licensing Mosaic program guide
KR102257542B1 (ko) 2012-10-01 2021-05-31 지이 비디오 컴프레션, 엘엘씨 향상 레이어에서 변환 계수 블록들의 서브블록-기반 코딩을 이용한 스케일러블 비디오 코딩
EP2912893B1 (en) * 2012-10-25 2019-09-04 Telefonaktiebolaget LM Ericsson (publ) Method and power adaptation device arranged to adjust power consumption in a network node
US9807388B2 (en) * 2012-10-29 2017-10-31 Avago Technologies General Ip (Singapore) Pte. Ltd. Adaptive intra-refreshing for video coding units
CN105075273B (zh) 2013-02-27 2019-03-26 苹果公司 自适应流式传输技术
US10003858B2 (en) * 2014-05-09 2018-06-19 DISH Technologies L.L.C. Provisioning commercial-free media content
US11140461B2 (en) * 2016-06-29 2021-10-05 Sony Interactive Entertainment LLC Video thumbnail in electronic program guide

Citations (78)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1992016071A1 (en) 1991-02-27 1992-09-17 General Electric Company An hdtv compression system
US5241563A (en) 1992-08-10 1993-08-31 General Instrument Corporation Method and apparatus for communicating interleaved data
JPH08307786A (ja) 1995-05-02 1996-11-22 Sony Corp 表示制御装置および表示制御方法
US5875199A (en) 1996-08-22 1999-02-23 Lsi Logic Corporation Video device with reed-solomon erasure decoder and method thereof
EP0966162A1 (en) 1998-01-08 1999-12-22 Matsushita Electric Industrial Co., Ltd. Video signal reproducing device and hierarchical video signal decoder
US6057884A (en) * 1997-06-05 2000-05-02 General Instrument Corporation Temporal and spatial scaleable coding for video object planes
EP1061737A1 (en) * 1999-06-18 2000-12-20 THOMSON multimedia Process and device for switching digital television programmes
WO2001067777A1 (en) 2000-03-07 2001-09-13 Koninklijke Philips Electronics N.V. Resynchronization method for decoding video
WO2002015589A1 (en) 2000-08-14 2002-02-21 Nokia Corporation Video coding
US6370666B1 (en) 1998-12-02 2002-04-09 Agere Systems Guardian Corp. Tuning scheme for error-corrected broadcast programs
US6480541B1 (en) 1996-11-27 2002-11-12 Realnetworks, Inc. Method and apparatus for providing scalable pre-compressed digital video with reduced quantization based artifacts
US20030014752A1 (en) 2001-06-01 2003-01-16 Eduard Zaslavsky Method and apparatus for generating a mosaic style electronic program guide
US6535240B2 (en) 2001-07-16 2003-03-18 Chih-Lung Yang Method and apparatus for continuously receiving frames from a plurality of video channels and for alternately continuously transmitting to each of a plurality of participants in a video conference individual frames containing information concerning each of said video channels
RU2201654C2 (ru) 1997-12-23 2003-03-27 Томсон Лайсенсинг С.А. Способ низкошумового кодирования и декодирования
US6611561B1 (en) 1999-02-18 2003-08-26 Nokia Mobile Phones Limited Video coding
WO2003073753A1 (en) 2002-02-27 2003-09-04 Matsushita Electric Industrial Co., Ltd. Information browsing method, transmitting apparatus and receiving apparatus
WO2003098475A1 (en) 2002-04-29 2003-11-27 Sony Electronics, Inc. Supporting advanced coding formats in media files
CN1478355A (zh) 2000-08-21 2004-02-25 ��˹��ŵ�� 视频编码
US20040066854A1 (en) 2002-07-16 2004-04-08 Hannuksela Miska M. Method for random access and gradual picture refresh in video coding
KR20040074635A (ko) 2003-02-19 2004-08-25 마쯔시다덴기산교 가부시키가이샤 동화상 부호화 방법 및 동화상 복호화 방법
KR20040074365A (ko) 2003-02-18 2004-08-25 백철 동파 방지 배관 시스템
US20040179139A1 (en) 2001-01-19 2004-09-16 Lg Electronics Inc. VSB reception system with enhanced signal detection for processing supplemental data
US20040181811A1 (en) 2003-03-13 2004-09-16 Rakib Selim Shlomo Thin DOCSIS in-band management for interactive HFC service delivery
JP2004289808A (ja) 2003-03-03 2004-10-14 Matsushita Electric Ind Co Ltd 画像符号化方法及び画像復号化方法
US20040213473A1 (en) 2003-04-28 2004-10-28 Canon Kabushiki Kaisha Image processing apparatus and method
US20040218816A1 (en) 2003-04-30 2004-11-04 Nokia Corporation Picture coding method
US20040228535A1 (en) * 2003-05-15 2004-11-18 Matsushita Electric Industrial Co., Ltd Moving image decoding apparatus and moving image decoding method
US20040243913A1 (en) 2003-04-29 2004-12-02 Utah State University Forward error correction with codeword cross-interleaving and key-based packet compression
WO2004114668A1 (en) 2003-06-16 2004-12-29 Thomson Licensing S.A. Decoding method and apparatus enabling fast channel change of compressed video
WO2005043783A1 (ja) 2003-10-30 2005-05-12 Matsushita Electric Industrial Co., Ltd. 携帯端末向け伝送方法及び装置
WO2005067191A1 (ja) 2004-01-08 2005-07-21 Matsushita Electric Industrial Co., Ltd. ザッピングストリームtsパケットのための追加誤り訂正方法
US20050163211A1 (en) * 2002-03-05 2005-07-28 Tamer Shanableh Scalable video transmission
US20050175091A1 (en) 2004-02-06 2005-08-11 Atul Puri Rate and quality controller for H.264/AVC video coder and scene analyzer therefor
WO2005076503A1 (en) 2004-02-06 2005-08-18 Nokia Corporation Mobile telecommunications apparatus for receiving and displaying more than one service
US20050185541A1 (en) 2004-02-23 2005-08-25 Darren Neuman Method and system for memory usage in real-time audio systems
US20050185795A1 (en) * 2004-01-19 2005-08-25 Samsung Electronics Co., Ltd. Apparatus and/or method for adaptively encoding and/or decoding scalable-encoded bitstream, and recording medium including computer readable code implementing the same
US20050200757A1 (en) 2004-01-23 2005-09-15 Alberta Pica Method and apparatus for digital video reconstruction
CN1674674A (zh) 2004-03-24 2005-09-28 株式会社日立制作所 动画数据的传输方法和系统、信息发出和接收装置
WO2005106875A1 (en) 2004-04-28 2005-11-10 Matsushita Electric Industrial Co., Ltd. Moving picture stream generation apparatus, moving picture coding apparatus, moving picture multiplexing apparatus and moving picture decoding apparatus
WO2005112465A1 (en) 2004-05-03 2005-11-24 Thomson Research Funding Corporation Method and apparatus enabling fast channel change for dsl system
US20060018379A1 (en) * 2002-11-15 2006-01-26 Thomson Licensing S.A. Method and system for staggered statistical multiplexing
US20060018377A1 (en) 2003-03-03 2006-01-26 Shinya Kadono Video encoding method and video decoding method
US7020823B2 (en) 2002-03-19 2006-03-28 Matsushita Electric Industrial Co., Ltd. Error resilient coding, storage, and transmission of digital multimedia data
US7031348B1 (en) 1998-04-04 2006-04-18 Optibase, Ltd. Apparatus and method of splicing digital video streams
US20060120448A1 (en) * 2004-12-03 2006-06-08 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding multi-layer video using DCT upsampling
US20060120378A1 (en) 2003-10-30 2006-06-08 Izumi Usuki Mobile-terminal-oriental transmission method and apparatus
US7072366B2 (en) 2000-07-14 2006-07-04 Nokia Mobile Phones, Ltd. Method for scalable encoding of media streams, a scalable encoder and a terminal
US20060146143A1 (en) 2004-12-17 2006-07-06 Jun Xin Method and system for managing reference pictures in multiview videos
US7085324B2 (en) 2001-04-25 2006-08-01 Lg Electronics Inc. Communication system in digital television
KR20060087966A (ko) 2005-01-31 2006-08-03 엘지전자 주식회사 비디오 디코딩 장치 및 방법
WO2006104519A1 (en) 2005-03-29 2006-10-05 Thomson Licensing Method and apparatus for providing robust reception in a wireless communications system
EP1715680A1 (fr) 2005-04-19 2006-10-25 Bouygues Telecom Affichage d'une page numérique "Mosaïque" pour la télévision sur terminal mobile
US20070073779A1 (en) 2005-09-27 2007-03-29 Walker Gordon K Channel switch frame
US20070071105A1 (en) 2005-09-27 2007-03-29 Tao Tian Mode selection techniques for multimedia coding
US20070071100A1 (en) 2005-09-27 2007-03-29 Fang Shi Encoder assisted frame rate up conversion using various motion models
WO2007038726A2 (en) 2005-09-27 2007-04-05 Qualcomm Incorporated Methods and apparatus for service acquisition
US20070076796A1 (en) 2005-09-27 2007-04-05 Fang Shi Frame interpolation using more accurate motion information
US20070083578A1 (en) 2005-07-15 2007-04-12 Peisong Chen Video encoding method enabling highly efficient partial decoding of H.264 and other transform coded information
WO2007042916A1 (en) 2005-10-11 2007-04-19 Nokia Corporation System and method for efficient scalable stream adaptation
US20070101378A1 (en) * 2003-05-02 2007-05-03 Koninklijke Philips Electronics N.V. Redundant transmission of programmes
US20070110105A1 (en) 2003-10-30 2007-05-17 Matsushita Electric Industrial Co., Ltd. Apparatus and a method for receiving a multiplexed broadcast signal carrying a plurality of services
US20070157248A1 (en) * 2005-12-29 2007-07-05 United Video Properties, Inc. Systems and methods for providing channel groups in an interactive media guidance application
US20070153914A1 (en) 2005-12-29 2007-07-05 Nokia Corporation Tune in time reduction
US20070288959A1 (en) * 2000-03-29 2007-12-13 Digeo, Inc. Single-button remote access to a synthetic channel page of specialized content
US20080022335A1 (en) * 2006-07-24 2008-01-24 Nabil Yousef A receiver with a visual program guide for mobile television applications and method for creation
US7369610B2 (en) 2003-12-01 2008-05-06 Microsoft Corporation Enhancement layer switching for scalable video coding
RU2328086C2 (ru) 2003-06-19 2008-06-27 Нокиа Корпорейшн Переключение потока на основе постепенного восстановления при декодировании
US20080170564A1 (en) 2006-11-14 2008-07-17 Qualcomm Incorporated Systems and methods for channel switching
US20080196061A1 (en) * 2004-11-22 2008-08-14 Boyce Jill Macdonald Method and Apparatus for Channel Change in Dsl System
US7428639B2 (en) * 1996-01-30 2008-09-23 Dolby Laboratories Licensing Corporation Encrypted and watermarked temporal and resolution layering in advanced television
US20090222856A1 (en) * 1999-10-08 2009-09-03 Jin Pil Kim Virtual channel table for a broadcast protocol and method of broadcasting and receiving broadcast signals using the same
US20090245393A1 (en) 2006-07-28 2009-10-01 Alan Jay Stein Method and Apparatus For Fast Channel Change For Digital Video
US7606314B2 (en) 2002-08-29 2009-10-20 Raritan America, Inc. Method and apparatus for caching, compressing and transmitting video signals
US20100021143A1 (en) 2004-06-02 2010-01-28 Tadamasa Toma Picture coding apparatus and picture decoding apparatus
US20100153999A1 (en) * 2006-03-24 2010-06-17 Rovi Technologies Corporation Interactive media guidance application with intelligent navigation and display features
US20110194842A1 (en) * 2006-03-23 2011-08-11 Krakirian Haig H System and method for selectively recording program content from a mosaic display
US8135852B2 (en) 2002-03-27 2012-03-13 British Telecommunications Public Limited Company Data streaming system and method
US8477840B2 (en) 2005-09-29 2013-07-02 Thomson Research Funding Corporation Method and apparatus for constrained variable bit rate (VBR) video encoding

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6480841B1 (en) * 1997-09-22 2002-11-12 Minolta Co., Ltd. Information processing apparatus capable of automatically setting degree of relevance between keywords, keyword attaching method and keyword auto-attaching apparatus
US7103045B2 (en) * 2002-03-05 2006-09-05 Hewlett-Packard Development Company, L.P. System and method for forwarding packets
US7684400B2 (en) * 2002-08-08 2010-03-23 Intel Corporation Logarithmic time range-based multifield-correlation packet classification
US7535899B2 (en) * 2003-12-18 2009-05-19 Intel Corporation Packet classification
US20080232359A1 (en) * 2007-03-23 2008-09-25 Taeho Kim Fast packet filtering algorithm
US20090204677A1 (en) * 2008-02-11 2009-08-13 Avaya Technology Llc Context based filter method and apparatus

Patent Citations (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1992016071A1 (en) 1991-02-27 1992-09-17 General Electric Company An hdtv compression system
US5241563A (en) 1992-08-10 1993-08-31 General Instrument Corporation Method and apparatus for communicating interleaved data
JPH08307786A (ja) 1995-05-02 1996-11-22 Sony Corp 表示制御装置および表示制御方法
US7428639B2 (en) * 1996-01-30 2008-09-23 Dolby Laboratories Licensing Corporation Encrypted and watermarked temporal and resolution layering in advanced television
US5875199A (en) 1996-08-22 1999-02-23 Lsi Logic Corporation Video device with reed-solomon erasure decoder and method thereof
US7075986B2 (en) 1996-11-27 2006-07-11 Realnetworks, Inc. Method and apparatus for providing scalable pre-compressed digital video with reduced quantization based artifacts
US6480541B1 (en) 1996-11-27 2002-11-12 Realnetworks, Inc. Method and apparatus for providing scalable pre-compressed digital video with reduced quantization based artifacts
US6057884A (en) * 1997-06-05 2000-05-02 General Instrument Corporation Temporal and spatial scaleable coding for video object planes
RU2201654C2 (ru) 1997-12-23 2003-03-27 Томсон Лайсенсинг С.А. Способ низкошумового кодирования и декодирования
EP0966162A1 (en) 1998-01-08 1999-12-22 Matsushita Electric Industrial Co., Ltd. Video signal reproducing device and hierarchical video signal decoder
US7031348B1 (en) 1998-04-04 2006-04-18 Optibase, Ltd. Apparatus and method of splicing digital video streams
US6370666B1 (en) 1998-12-02 2002-04-09 Agere Systems Guardian Corp. Tuning scheme for error-corrected broadcast programs
US6611561B1 (en) 1999-02-18 2003-08-26 Nokia Mobile Phones Limited Video coding
CN1278138A (zh) 1999-06-18 2000-12-27 汤姆森多媒体公司 切换数字电视节目的方法和装置
EP1061737A1 (en) * 1999-06-18 2000-12-20 THOMSON multimedia Process and device for switching digital television programmes
US20090222856A1 (en) * 1999-10-08 2009-09-03 Jin Pil Kim Virtual channel table for a broadcast protocol and method of broadcasting and receiving broadcast signals using the same
WO2001067777A1 (en) 2000-03-07 2001-09-13 Koninklijke Philips Electronics N.V. Resynchronization method for decoding video
US20070288959A1 (en) * 2000-03-29 2007-12-13 Digeo, Inc. Single-button remote access to a synthetic channel page of specialized content
US7072366B2 (en) 2000-07-14 2006-07-04 Nokia Mobile Phones, Ltd. Method for scalable encoding of media streams, a scalable encoder and a terminal
JP2004507178A (ja) 2000-08-14 2004-03-04 ノキア コーポレイション ビデオ信号符号化方法
US7116714B2 (en) 2000-08-14 2006-10-03 Nokia Corporation Video coding
WO2002015589A1 (en) 2000-08-14 2002-02-21 Nokia Corporation Video coding
US20060146934A1 (en) * 2000-08-21 2006-07-06 Kerem Caglar Video coding
CN1478355A (zh) 2000-08-21 2004-02-25 ��˹��ŵ�� 视频编码
US20040179139A1 (en) 2001-01-19 2004-09-16 Lg Electronics Inc. VSB reception system with enhanced signal detection for processing supplemental data
US7085324B2 (en) 2001-04-25 2006-08-01 Lg Electronics Inc. Communication system in digital television
US20030014752A1 (en) 2001-06-01 2003-01-16 Eduard Zaslavsky Method and apparatus for generating a mosaic style electronic program guide
US6535240B2 (en) 2001-07-16 2003-03-18 Chih-Lung Yang Method and apparatus for continuously receiving frames from a plurality of video channels and for alternately continuously transmitting to each of a plurality of participants in a video conference individual frames containing information concerning each of said video channels
US20040244037A1 (en) 2002-02-27 2004-12-02 Takao Yamaguchi Information browsing method, transmitting apparatus and receiving apparatus
WO2003073753A1 (en) 2002-02-27 2003-09-04 Matsushita Electric Industrial Co., Ltd. Information browsing method, transmitting apparatus and receiving apparatus
US20050163211A1 (en) * 2002-03-05 2005-07-28 Tamer Shanableh Scalable video transmission
US7020823B2 (en) 2002-03-19 2006-03-28 Matsushita Electric Industrial Co., Ltd. Error resilient coding, storage, and transmission of digital multimedia data
US8135852B2 (en) 2002-03-27 2012-03-13 British Telecommunications Public Limited Company Data streaming system and method
WO2003098475A1 (en) 2002-04-29 2003-11-27 Sony Electronics, Inc. Supporting advanced coding formats in media files
JP2006505024A (ja) 2002-04-29 2006-02-09 ソニー エレクトロニクス インク データ処理方法及び装置
US20040066854A1 (en) 2002-07-16 2004-04-08 Hannuksela Miska M. Method for random access and gradual picture refresh in video coding
US7606314B2 (en) 2002-08-29 2009-10-20 Raritan America, Inc. Method and apparatus for caching, compressing and transmitting video signals
US20060018379A1 (en) * 2002-11-15 2006-01-26 Thomson Licensing S.A. Method and system for staggered statistical multiplexing
KR20040074365A (ko) 2003-02-18 2004-08-25 백철 동파 방지 배관 시스템
KR20040074635A (ko) 2003-02-19 2004-08-25 마쯔시다덴기산교 가부시키가이샤 동화상 부호화 방법 및 동화상 복호화 방법
US20060018377A1 (en) 2003-03-03 2006-01-26 Shinya Kadono Video encoding method and video decoding method
JP2004289808A (ja) 2003-03-03 2004-10-14 Matsushita Electric Ind Co Ltd 画像符号化方法及び画像復号化方法
US20040181811A1 (en) 2003-03-13 2004-09-16 Rakib Selim Shlomo Thin DOCSIS in-band management for interactive HFC service delivery
US20040213473A1 (en) 2003-04-28 2004-10-28 Canon Kabushiki Kaisha Image processing apparatus and method
JP2004350263A (ja) 2003-04-28 2004-12-09 Canon Inc 画像処理装置及び画像処理方法
US20040243913A1 (en) 2003-04-29 2004-12-02 Utah State University Forward error correction with codeword cross-interleaving and key-based packet compression
US20040218816A1 (en) 2003-04-30 2004-11-04 Nokia Corporation Picture coding method
US20070101378A1 (en) * 2003-05-02 2007-05-03 Koninklijke Philips Electronics N.V. Redundant transmission of programmes
US20040228535A1 (en) * 2003-05-15 2004-11-18 Matsushita Electric Industrial Co., Ltd Moving image decoding apparatus and moving image decoding method
JP2006527975A (ja) 2003-06-16 2006-12-07 トムソン ライセンシング 圧縮ビデオの高速チャンネル変更を可能にする復号化方法および装置
KR20060024416A (ko) 2003-06-16 2006-03-16 톰슨 라이센싱 압축된 비디오의 고속 채널 변경을 가능하게 하기 위한인코딩 방법 및 장치
KR20060015757A (ko) 2003-06-16 2006-02-20 톰슨 라이센싱 압축된 비디오의 고속 채널 변경을 가능하게 하기 위한디코딩 방법 및 장치
WO2004114667A1 (en) 2003-06-16 2004-12-29 Thomson Licensing S.A. Encoding method and apparatus enabling fast channel change of compressed video
WO2004114668A1 (en) 2003-06-16 2004-12-29 Thomson Licensing S.A. Decoding method and apparatus enabling fast channel change of compressed video
RU2328086C2 (ru) 2003-06-19 2008-06-27 Нокиа Корпорейшн Переключение потока на основе постепенного восстановления при декодировании
US7552227B2 (en) 2003-06-19 2009-06-23 Nokia Corporation Stream switching based on gradual decoder refresh
US20070110105A1 (en) 2003-10-30 2007-05-17 Matsushita Electric Industrial Co., Ltd. Apparatus and a method for receiving a multiplexed broadcast signal carrying a plurality of services
CN1830164A (zh) 2003-10-30 2006-09-06 松下电器产业株式会社 面向携带终端的传输方法以及装置
US20060120378A1 (en) 2003-10-30 2006-06-08 Izumi Usuki Mobile-terminal-oriental transmission method and apparatus
WO2005043783A1 (ja) 2003-10-30 2005-05-12 Matsushita Electric Industrial Co., Ltd. 携帯端末向け伝送方法及び装置
US7369610B2 (en) 2003-12-01 2008-05-06 Microsoft Corporation Enhancement layer switching for scalable video coding
US20060239299A1 (en) 2004-01-08 2006-10-26 Albrecht Scheid Extra error correcting method for zapping stream ts packet
WO2005067191A1 (ja) 2004-01-08 2005-07-21 Matsushita Electric Industrial Co., Ltd. ザッピングストリームtsパケットのための追加誤り訂正方法
US20050185795A1 (en) * 2004-01-19 2005-08-25 Samsung Electronics Co., Ltd. Apparatus and/or method for adaptively encoding and/or decoding scalable-encoded bitstream, and recording medium including computer readable code implementing the same
US20050200757A1 (en) 2004-01-23 2005-09-15 Alberta Pica Method and apparatus for digital video reconstruction
KR20060113765A (ko) 2004-02-06 2006-11-02 노키아 코포레이션 하나를 넘는 서비스를 수신하고 디스플레이하기 위한이동통신장치
US20050175091A1 (en) 2004-02-06 2005-08-11 Atul Puri Rate and quality controller for H.264/AVC video coder and scene analyzer therefor
WO2005076503A1 (en) 2004-02-06 2005-08-18 Nokia Corporation Mobile telecommunications apparatus for receiving and displaying more than one service
US20050185541A1 (en) 2004-02-23 2005-08-25 Darren Neuman Method and system for memory usage in real-time audio systems
CN1674674A (zh) 2004-03-24 2005-09-28 株式会社日立制作所 动画数据的传输方法和系统、信息发出和接收装置
CN100337480C (zh) 2004-03-24 2007-09-12 株式会社日立制作所 动画数据的传输方法和系统、信息发出和接收装置
US20050213668A1 (en) 2004-03-24 2005-09-29 Kazunori Iwabuchi Method and system for transmitting data of moving picture, and apparatus for delivering and receiving data of moving picture
WO2005106875A1 (en) 2004-04-28 2005-11-10 Matsushita Electric Industrial Co., Ltd. Moving picture stream generation apparatus, moving picture coding apparatus, moving picture multiplexing apparatus and moving picture decoding apparatus
WO2005112465A1 (en) 2004-05-03 2005-11-24 Thomson Research Funding Corporation Method and apparatus enabling fast channel change for dsl system
US20100021143A1 (en) 2004-06-02 2010-01-28 Tadamasa Toma Picture coding apparatus and picture decoding apparatus
US20080196061A1 (en) * 2004-11-22 2008-08-14 Boyce Jill Macdonald Method and Apparatus for Channel Change in Dsl System
US20060120448A1 (en) * 2004-12-03 2006-06-08 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding multi-layer video using DCT upsampling
US20060146143A1 (en) 2004-12-17 2006-07-06 Jun Xin Method and system for managing reference pictures in multiview videos
KR20060087966A (ko) 2005-01-31 2006-08-03 엘지전자 주식회사 비디오 디코딩 장치 및 방법
WO2006104519A1 (en) 2005-03-29 2006-10-05 Thomson Licensing Method and apparatus for providing robust reception in a wireless communications system
EP1715680A1 (fr) 2005-04-19 2006-10-25 Bouygues Telecom Affichage d'une page numérique "Mosaïque" pour la télévision sur terminal mobile
US20070083578A1 (en) 2005-07-15 2007-04-12 Peisong Chen Video encoding method enabling highly efficient partial decoding of H.264 and other transform coded information
US20070071100A1 (en) 2005-09-27 2007-03-29 Fang Shi Encoder assisted frame rate up conversion using various motion models
US20070073779A1 (en) 2005-09-27 2007-03-29 Walker Gordon K Channel switch frame
EP1941738A2 (en) 2005-09-27 2008-07-09 QUALCOMM Incorporated Methods and apparatus for service acquisition
US20070088971A1 (en) 2005-09-27 2007-04-19 Walker Gordon K Methods and apparatus for service acquisition
US20070076796A1 (en) 2005-09-27 2007-04-05 Fang Shi Frame interpolation using more accurate motion information
US20120294360A1 (en) 2005-09-27 2012-11-22 Qualcomm Incorporated Channel switch frame
WO2007038726A2 (en) 2005-09-27 2007-04-05 Qualcomm Incorporated Methods and apparatus for service acquisition
US20070071105A1 (en) 2005-09-27 2007-03-29 Tao Tian Mode selection techniques for multimedia coding
US8477840B2 (en) 2005-09-29 2013-07-02 Thomson Research Funding Corporation Method and apparatus for constrained variable bit rate (VBR) video encoding
WO2007042916A1 (en) 2005-10-11 2007-04-19 Nokia Corporation System and method for efficient scalable stream adaptation
US20070153914A1 (en) 2005-12-29 2007-07-05 Nokia Corporation Tune in time reduction
US20070157248A1 (en) * 2005-12-29 2007-07-05 United Video Properties, Inc. Systems and methods for providing channel groups in an interactive media guidance application
US20110194842A1 (en) * 2006-03-23 2011-08-11 Krakirian Haig H System and method for selectively recording program content from a mosaic display
US20100153999A1 (en) * 2006-03-24 2010-06-17 Rovi Technologies Corporation Interactive media guidance application with intelligent navigation and display features
US20080022335A1 (en) * 2006-07-24 2008-01-24 Nabil Yousef A receiver with a visual program guide for mobile television applications and method for creation
US20090245393A1 (en) 2006-07-28 2009-10-01 Alan Jay Stein Method and Apparatus For Fast Channel Change For Digital Video
US20080170564A1 (en) 2006-11-14 2008-07-17 Qualcomm Incorporated Systems and methods for channel switching

Non-Patent Citations (15)

* Cited by examiner, † Cited by third party
Title
Bernd Girod, "The information theoretical significance of spatial and temporal masking in video signals," SPIE vol. 1077, Human vision, visual processing, and digital display, pp. 178-187 (1989).
Bormans J et al., "Video Coding with H.264/AVC: tools, performance and complexity," IEEE Circuits and Systems Magazine, IEEE Service Center, New York, NY, US, vol. 4, No. 1, Jan. 2004, pp. 7-28.
Casoulat, R, et al., "On the Usage of Video in Laser," Video Standards and Drafts, Apr. 29, 2005, pp. 1-7.
European Search Report-EP10181358, Search Authority-Munich Patent Office, Jan. 25, 2011.
Faerber N et al: "Robust H.263 compatible video transmission for mobile access to video servers" Proceeding of the International Conference on Image Processing. ICIP 1997. Oct. 26-29, 1997, vol. 2, Oct. 26, 1997, pp. 73-76, XP002171169.
Huifang Sun et al., "Error Resilience Video Transcoding for Wireless Communications," IEEE Wireless Communications, IEEE Service Center, Piscataway, NJ, US, vol. 12, No. 4, Aug. 2005, pp. 14-21.
International Search Report and Written Opinion-PCT/US/2007/084885, International Search Authority-European Patent Office-Dec. 10, 2008.
ITU-T H.264, Series H: Audiovisual and Multimedia System Infrastructure of audiovisual services, Coding of moving video, "Advanced video coding for generic audivisual services," Nov. 2007: 7.3.5.3 Residual Data Syntax; and 9.2 CALVLC parsing process.
Jan Richardson, H.264 and MPEG-4 video coding-next-generation standards, Moscow, Tehnosfera, 2005, pp. 186-197, 220-224.
Jennehag U et al., "Increasing bandwidth utilization in next generation iptv networks," Image Processing, 2004. ICIP '04. 2004 International Conference on Singapore Oct. 24-27, 2004. Piscataway, NJ, USA, IEEE, Oct. 24, 2004, pp. 2075-2078.
Karczewicz M et al.: "The SP- and Si-frame design for H.264/AVC," IEEE Transactions on Circuits and Systems for Video Technology, Jul. 2003, pp. 637-644, vol. 13, No. 7, XP011099256, ISSN: 1051-8215.
Taiwanese Search report-095135825-TIPO-Aug. 19, 2010.
TIA-1099 Standard "Forward Link Only Air Interface Specificaiton for Terrestrial Mobile Multimedia Multicast" pp. 1-341, Mar. 2007.
Translation of Office Action in Japan application 2008-533617 corresponding to U.S. Appl. No. 11/527,306, citing JP2004350263 and JP2004507178 dated Feb. 1, 2011.
Wiegand T: "H.264/AVC Video Coding Standard", Berlin, Germany, May 2003.

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100329328A1 (en) * 2007-06-26 2010-12-30 Nokia, Inc. Using scalable codecs for providing channel zapping information to broadcast receivers
US8989260B2 (en) * 2007-06-26 2015-03-24 Nokia Corporation Using scalable codecs for providing channel zapping information to broadcast receivers
US9661378B2 (en) 2007-06-26 2017-05-23 Nokia Corporation Using scalable codecs for providing channel zapping information to broadcast receivers

Also Published As

Publication number Publication date
CA2669153A1 (en) 2008-05-22
CN101902630B (zh) 2014-04-02
BRPI0718810A2 (pt) 2013-12-03
CN101536524A (zh) 2009-09-16
KR20090084910A (ko) 2009-08-05
RU2009122503A (ru) 2010-12-20
KR101010881B1 (ko) 2011-01-25
WO2008061211A3 (en) 2009-01-29
CN101536524B (zh) 2012-06-13
WO2008061211A2 (en) 2008-05-22
CN101902630A (zh) 2010-12-01
US20080127258A1 (en) 2008-05-29
JP2010510725A (ja) 2010-04-02
EP2098077A2 (en) 2009-09-09

Similar Documents

Publication Publication Date Title
US8761162B2 (en) Systems and methods for applications using channel switch frames
JP4764816B2 (ja) 移動体受信機に向けた、低減解像度ビデオのロバスト・モードでのスタガキャスト
US8229983B2 (en) Channel switch frame
US8396082B2 (en) Time-interleaved simulcast for tune-in reduction
US10075726B2 (en) Video decoding method/device of detecting a missing video frame
EP1929785B1 (en) Channel switch frame

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WALKER, GORDON KENT;RAVEENDRAN, VIJAYALAKSHMI R.;LOUKAS JR., SERAFIM S.;AND OTHERS;REEL/FRAME:020502/0683;SIGNING DATES FROM 20071119 TO 20080114

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WALKER, GORDON KENT;RAVEENDRAN, VIJAYALAKSHMI R.;LOUKAS JR., SERAFIM S.;AND OTHERS;SIGNING DATES FROM 20071119 TO 20080114;REEL/FRAME:020502/0683

AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WALKER, GORDON KENT;RAVEENDRAN, VIJAYALAKSHMI R.;LOUKAS, SERAFIM S., JR.;AND OTHERS;SIGNING DATES FROM 20100923 TO 20101011;REEL/FRAME:025158/0655

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8