WO2006108917A1 - Coding, storage and signalling of scalability information - Google Patents

Coding, storage and signalling of scalability information

Info

Publication number
WO2006108917A1
Authority
WO
WIPO (PCT)
Prior art keywords
layer
data stream
scalable
information
decoding
Prior art date
Legal status
Ceased
Application number
PCT/FI2006/050136
Other languages
English (en)
French (fr)
Inventor
Ye-Kui Wang
Miska Hannuksela
Current Assignee
Nokia Inc
Original Assignee
Nokia Inc
Priority date
Filing date
Publication date
Application filed by Nokia Inc filed Critical Nokia Inc
Priority to JP2008505913A priority Critical patent/JP2008536420A/ja
Priority to CA002604203A priority patent/CA2604203A1/en
Priority to EP06725911.9A priority patent/EP1869891A4/en
Priority to MX2007012564A priority patent/MX2007012564A/es
Publication of WO2006108917A1 publication Critical patent/WO2006108917A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H04N19/29 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding involving scalability at the object level, e.g. video object layer [VOL]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • H04N19/34 Scalability techniques involving progressive bit-plane based encoding of the enhancement layer, e.g. fine granular scalability [FGS]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/40 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46 Embedding additional information in the video signal during the compression process
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/61 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234327 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by decomposing into layers, e.g. base layer and one or more enhancement layers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/647 Control signaling between network components and server or clients; Network processes for video distribution between server and clients, e.g. controlling the quality of the video stream, by dropping packets, protecting content from unauthorised alteration within the network, monitoring of network load, bridging between two different networks, e.g. between IP and wireless
    • H04N21/64784 Data processing by the network
    • H04N21/64792 Controlling the complexity of the content stream, e.g. by dropping packets
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305 Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets

Definitions

  • the present invention is directed to an encoder, a decoder, a device, method, data record, module, computer program product, and system for data encoding, decoding, storage and transmission of a scalable data stream comprising at least two scalability layers.
  • Multimedia applications include local playback, streaming or on-demand, conversational and broadcast/multicast services. Interoperability is important for fast deployment and large-scale market formation of each multimedia application. To achieve high interoperability, different standards are specified.
  • Video coding standards include ITU-T H.261, ISO/IEC MPEG-1 Visual, ITU-T H.262 or ISO/IEC MPEG-2 Visual, ISO/IEC MPEG-4 Visual, ITU-T H.264 or ISO/IEC MPEG-4 AVC (abbreviated as AVC, AVC/H.264 or H.264/AVC in this document), and the possible future ones such as ISO/IEC MPEG-21 SVC, China AVS, ITU-T H.265, and ISO/IEC MPEG 3DAV.
  • Available media file format standards include ISO file format (ISO/IEC 14496-12), MPEG-4 file format (ISO/IEC 14496-14), AVC file format (ISO/IEC 14496-15) and 3GPP file format (3GPP TS 26.244).
  • 3GPP TS 26.140 specifies the media types, formats and codecs for the multimedia messaging services (MMS) within the 3GPP system.
  • 3GPP TS 26.234 specifies the protocols and codecs for the packet-switched streaming services (PSS) within the 3GPP system.
  • the ongoing 3GPP TS 26.346 specifies the protocols and codecs for multimedia broadcast/multicast services (MBMS) within the 3GPP system.
  • Typical audio and video coding standards specify “profiles” and “levels.”
  • a “profile” is a subset of algorithmic features of the standard and a “level” is a set of limits to the coding parameters that impose a set of constraints on decoder resource consumption. The indicated profile and level can be used to signal properties of a media stream and to signal the capability of a media decoder.
  • a decoder can thus determine whether it is capable of decoding a stream without actually attempting to decode it; attempting to decode a stream beyond the decoder's capability may cause the decoder to crash, to operate slower than real-time, and/or to discard data due to buffer overflows.
  • Each pair of profile and level forms an "interoperability point.”
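  • As an illustration only (not part of the original disclosure), the following minimal C++ sketch shows how a decoder might compare the interoperability point signaled for a stream against its own declared capability before attempting to decode; the type and function names and the comparison policy are hypothetical.

```cpp
#include <cstdint>

// Illustrative interoperability point: a (profile, level) pair as signaled
// for a media stream or declared by a decoder.
struct InteropPoint {
    uint8_t profile_idc;  // e.g. an H.264/AVC profile_idc value
    uint8_t level_idc;    // e.g. 30 for level 3.0 (level number * 10)
};

// Check the signaled values before decoding, so that a stream beyond the
// decoder's capability is never attempted (avoiding crashes, slower-than-
// real-time operation, or buffer overflows). The policy below (same profile,
// level not higher than the decoder's maximum) is an assumption.
bool canDecode(const InteropPoint& stream, const InteropPoint& decoderCap) {
    return stream.profile_idc == decoderCap.profile_idc &&
           stream.level_idc <= decoderCap.level_idc;
}
```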
  • Some coding standards allow creation of scalable bit streams.
  • a meaningful decoded representation can be produced by decoding only certain parts of a scalable bit stream.
  • Scalable bit streams can be used for rate adaptation of pre-encoded unicast streams in a streaming server and for transmission of a single bit stream to terminals having different capabilities and/or with different network conditions.
  • a list of other use cases for scalable video coding can be found in the ISO/IEC JTC1 SC29 WG11 (MPEG) output document N6880, "Applications and Requirements for Scalable Video Coding", the 71st MPEG meeting, January 2005, Hong Kong, China.
  • Scalable coding technologies include conventional layered scalable coding techniques and fine granularity scalable coding. A review of these techniques can be found in an article by Weiping Li entitled "Overview of fine granularity scalability in MPEG-4 video standard," IEEE Transactions on Circuits and Systems for Video Technology, vol. 11, no. 3, pp. 301-317, March 2001.
  • Scalable video coding is a desirable feature for many multimedia applications and services used in systems employing decoders with a wide range of processing power.
  • Several types of video scalability schemes have been proposed, such as temporal, spatial and quality scalability. These proposed types consist of a base layer and an enhancement layer.
  • the base layer is the minimum amount of data required to decode the video stream, while the enhancement layer is the additional data required to provide an enhanced video signal.
  • the working draft of the scalable extension to H.264/AVC currently enables coding of multiple scalable layers.
  • the working draft is described in JVT-N020, "Scalable video coding - working draft 1," 14th meeting, Hong Kong, Jan 2005, and is also known as MPEG document w6901, "Working Draft 1.0 of 14496-10:200x/AMD1 Scalable Video Coding," Hong Kong meeting, January 2005.
  • the variable DependencyId signaled in the bitstream is used to indicate the coding dependencies of different scalable layers.
  • a scalable bit stream contains at least two scalability layers, the base layer and one or more enhancement layers. A scalable bit stream that contains more than one scalability layer offers as many alternatives for decoding and playback as it has layers.
  • Each layer is a decoding alternative.
  • Layer 0, the base layer, is the first decoding alternative.
  • Layer 1, the first enhancement layer, is the second decoding alternative. This pattern continues with subsequent layers.
  • a lower layer is contained in the higher layers. For example, layer 0 is contained in layer 1, and layer 1 is contained in layer 2.
  • Each layer is characterized by a set of at least one property, such as Fine granularity scalability (FGS) information, Region-of-interest (ROI) scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream.
  • the present invention allows for encoding, decoding, storage, and transmission of a scalable bit stream, wherein at least two scalability layers are present and each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream, and wherein said set of at least one property is signaled for at least one layer that is different from the entire stream, wherein signaling of said set of at least one property may be in said scalable bit stream, in a file format container containing said scalable bit stream, or in a transmission or control protocol for transmission or control of at least one layer of said scalable bit stream.
  • a server or client does not need to analyze the FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets of each layer of a stream by parsing the bit stream itself, thus reducing the computational complexity.
  • inclusion of a device sub-unit to enable this capability is thereby avoided.
  • One exemplary embodiment of the present invention discloses a method and device for encoding a scalable data stream to include layers having different coding properties.
  • the method includes: producing one or more layers of the scalable data stream, wherein the layers are characterized by a coding property that is different than a coding property of the scalable data stream, and signaling the layers with the characterized coding property such that they are readable by a decoder without the need to decode the entire layers.
  • Another exemplary embodiment of the present invention discloses a method and device for encoding a scalable bit stream, wherein at least two scalability layers are present and each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream, and wherein said set of at least one property is signaled for at least one layer that is different from the entire stream, wherein signaling of said set of at least one property is in said scalable bit stream.
  • the method includes: producing a scalable bit stream including at least two layers, wherein each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream, and signaling, in said scalable bit stream, said set of at least one property for at least one layer that is different from the entire stream, such that said set of at least one property is readable by a device without the need of analyzing said set of at least one property or trying to decode a layer of said scalable bit stream without knowledge of whether the device has the capability to decode the layer, which may lead to a device crash.
  • Another exemplary embodiment of the invention discloses a method and device for analyzing a scalable bit stream, wherein at least two scalability layers are present and each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream.
  • the method includes: analyzing said set of at least one property such that said set of at least one property can be signaled in said scalable bit stream, in a file format container containing said scalable bit stream, or in a transmission or control protocol for transmission or control of at least one layer of said scalable bit stream.
  • Another exemplary embodiment of the invention discloses a method and device for converting a scalable bit stream, wherein at least two scalability layers are present and each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream, and wherein said set of at least one property is signaled for at least one layer that is different from the entire stream, wherein signaling of said set of at least one property may be in said scalable bit stream, in a file format container containing said scalable bit stream, or in a transmission or control protocol for transmission or control of at least one layer of said scalable bit stream.
  • the method includes: creation of a non-scalable bit stream containing the base layer of said scalable bit stream, and creation of a second scalable bit stream containing an enhancement layer of said scalable bit stream.
  • Another exemplary embodiment of the invention discloses a method and device for decoding a scalable bit stream, wherein at least two scalability layers are present and each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream, and wherein said set of at least one property is signaled for at least one layer that is different from the entire stream, wherein signaling of said set of at least one property may be in said scalable bit stream, in a file format container containing said scalable bit stream, or in a transmission or control protocol for transmission or control of at least one layer of said scalable bit stream.
  • the device comprises: a first component for receiving said scalable bit stream, a second component for identifying at least one layer in said scalable bit stream and reading said set of at least one property of said at least one layer, a third component for determining whether the decoder is capable of decoding said at least one layer based on said set of at least one property, and a fourth component for decoding said at least one layer if the third component determines that the decoder is capable of decoding said at least one layer.
  • Another exemplary embodiment of the invention discloses a method and device for storage of a scalable bit stream, wherein at least two scalability layers are present and each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream.
  • the method includes: analyzing said at least one property if not signaled in said scalable bit stream, storing said at least one property and said scalable bit stream with or without said at least one property in the bit stream to a file format container according to a file format specification.
  • Another exemplary embodiment of the invention discloses a method and a device for transmission of at least one layer of a scalable bit stream, wherein at least two scalability layers are present and each layer is characterized by a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream, and wherein said set of at least one property is signaled for at least one layer that is different from the entire stream, wherein signaling of said set of at least one property may be in said scalable bit stream or in a file format container containing said scalable bit stream.
  • the device comprises: a first component for presenting information on the available layers or alternatives for decoding and playback to a set of at least one receiver, wherein the receivers may be divided into at least one receiver group, each receiver group consisting of at least one receiver; a second component for deciding at least one layer from said available layers to serve a receiver or a receiver group according to a request and/or information on said receiver or receiver group, and a third component for transmission and control of said at least one layer to said receiver or receiver group.
  • Figure 1 is a diagram illustrating a system in which the present invention can be applied.
  • Figure 2 is a diagram illustrating signaling of a set of at least one property information for a scalable bit stream in an exemplary embodiment of the present invention.
  • Figure 3 is a diagram illustrating an encoding device in an exemplary embodiment of the present invention.
  • Figure 4 is a diagram illustrating a converter device in an exemplary embodiment of the present invention.
  • Figure 5 is a diagram illustrating a decoder device in an exemplary embodiment of the present invention.
  • the present invention solves the problems described above by signaling a set of at least one property, such as FGS information, ROI scalability information, sub-sample scalable layer information, decoding dependency information, and initial parameter sets, that may be different from that of the entire stream, for a layer of a scalable bit stream.
  • Signaling of said set of at least one property may be in said scalable bit stream, in a file format container containing said scalable bit stream, or in a transmission or control protocol for transmission or control of at least one layer of said scalable bit stream.
  • Multimedia applications include, among others, media coding, storage and transmission.
  • Media types include speech, audio, image, video, graphics and timed text.
  • Although video coding is described herein as an exemplary application of the present invention, the invention is not limited thereto. Those skilled in the art will recognize that the present invention can be used with all media types, not only video.
  • Figure 2 illustrates signaling of a set of at least one property information for each layer of a scalable bit stream 200 in an exemplary embodiment of the present invention.
  • Each layer of the scalable bit stream is characterized by the set of at least one property information signaled for the layer, thus allowing selection of a layer for decoding or transmission according to the set of at least one property information. These characterizations can be stored in header 204.
  • the multiple layers 202 represent the plurality of layers in the scalable bit stream.
  • a scalable bit stream is coded and stored in a streaming server.
  • a set of at least one property information such as fine granularity scalability information, region-of-interest scalability information, sub-sample or sub-picture scalable layer information, decoding dependency information, and initial parameter sets, of each layer is signaled in the stored file.
  • the server can create an SDP (Session Description Protocol) description for each layer or alternative of the scalable bit stream in the same file such that a streaming client can conclude whether there is an ideal layer and choose an ideal layer for streaming playback according to the SDP descriptions. If the server has no prior knowledge of receiver capabilities, it is advantageous to create multiple SDP descriptions from the same content, and these descriptions are then called alternates. The client can then pick the description that suits its capabilities the best.
  • a stream such as that described in the first example is multicast or broadcast to multiple terminals.
  • the multicast/broadcast server can announce all the available layers or decoding and playback alternatives, each of which is characterized by a combination of fine granularity scalability information, region-of-interest scalability information, sub- sample or sub-picture scalable layer information, decoding dependency information, and initial parameter sets.
  • the client can then know from the broadcast/multicast session announcement whether there is an ideal layer for it and choose an ideal layer for playback.
  • FIG. 3 is a diagram illustrating an encoding device in an exemplary embodiment of the present invention.
  • the encoding device 304 receives a raw data stream 302.
  • the data stream is encoded and one or more layers are produced by the scalable data encoder 306 of the encoder 304. These layers are then signaled by the signaling component 308. Some of the layers may have already been signaled by the scalable data encoder 306 and the signaling component will check for such occurrences.
  • the coding property indicated data stream 310 is output from the encoder 304, thus allowing a receiving device (MMSC or decoder) to read the signals in order to determine the coding properties of the layers of the data stream.
  • FIG 4 is a diagram illustrating a converter device in an exemplary embodiment of the present invention.
  • the converter device 404 receives a scalable data stream 402 at receiver 406.
  • Receiver 406 also reads the coding property indicators associated with layers of the received data stream.
  • the coding property comparator 410 compares the coding property indicators with the already known capabilities of the decoding device or network to which the data stream is destined. Through this comparison, it determines what layers the destination device will be able to decode.
  • the data stream is then modified in data stream modifier 412 in order to make the data stream decodable by the destination device. This may involve removing layers from the data stream that were determined in element 410 to be undecodable by the destination device.
  • the modified data stream is then transmitted by transmitter 414.
  • the modified data stream 416 is output from the converter 404 destined for a receiving device (MMSC or decoder).
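  • As an illustration only, the following C++ sketch shows one way the data stream modifier 412 of Figure 4 could drop layers that the destination device cannot decode: NAL units are assumed to be already associated with a layer ID through the signaled scalability information, and only the NAL units of the chosen scalable layer representation are forwarded. The types and names are hypothetical.

```cpp
#include <cstdint>
#include <set>
#include <vector>

// A NAL unit together with the layer ID it maps to; the mapping is assumed to
// be known from the signaled scalability information, not from parsing the
// slice data itself.
struct NalUnit {
    uint32_t layerId;
    std::vector<uint8_t> payload;
};

// Keep only the NAL units that belong to the target layer and the lower
// layers it directly or indirectly depends on; everything else is discarded
// and therefore not transmitted to the destination device.
std::vector<NalUnit> extractLayerRepresentation(
        const std::vector<NalUnit>& stream,
        const std::set<uint32_t>& requiredLayers) {
    std::vector<NalUnit> out;
    for (const NalUnit& nalu : stream) {
        if (requiredLayers.count(nalu.layerId) != 0) {
            out.push_back(nalu);
        }
    }
    return out;
}
```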
  • Figure 5 is a diagram illustrating a decoder in an exemplary embodiment of the present invention.
  • the decoding device 504 receives a coding property indicated data stream 502 at receiver 504.
  • a coding property identifier 510 identifies one or more layers in the received data stream and their corresponding coding properties. Based on the coding property of at least one of the layers, the decoder capability verifier 512 determines whether the decoder is capable of decoding that layer. If it is, it allows the decoding component 514 to proceed with decoding that layer of the data stream. If not, it prevents the decoding component 514 from attempting to decode the layer, thus avoiding a potential crash of the decoder.
  • the decoded data stream 516 is shown in the figure as output from the decoder 504.
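  • As an illustration only, the following C++ sketch mirrors the role of the decoder capability verifier 512: it reads a subset of the signaled per-layer properties and selects the highest layer that fits the decoder, so that the decoding component 514 never attempts a layer beyond its capability. The property subset, field names and selection policy are assumptions.

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// A subset of the per-layer properties read by the coding property
// identifier 510 (illustrative field names only).
struct LayerProperties {
    uint32_t layerId;
    uint8_t  profileIdc;
    uint8_t  levelIdc;
    uint32_t maxBitrate;  // bits per second
};

struct DecoderCapability {
    uint8_t  profileIdc;
    uint8_t  maxLevelIdc;
    uint32_t maxBitrate;
};

// Return the highest layer whose signaled properties fit the decoder, or
// nothing if no layer fits; only the returned layer representation would be
// passed on for decoding.
std::optional<uint32_t> selectDecodableLayer(
        const std::vector<LayerProperties>& layers,
        const DecoderCapability& cap) {
    std::optional<uint32_t> best;
    for (const LayerProperties& l : layers) {
        const bool fits = l.profileIdc == cap.profileIdc &&
                          l.levelIdc <= cap.maxLevelIdc &&
                          l.maxBitrate <= cap.maxBitrate;
        if (fits && (!best || l.layerId > *best)) {
            best = l.layerId;
        }
    }
    return best;
}
```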
  • the layer characteristics are signaled in the sequence or group of pictures (GOP) level, such as through sequence or GOP headers, sequence parameter sets, Supplemental Enhancement Information (SEI) messages, user data and other sequence level syntax.
  • a scalability information SEI is specified to assist a bitstream extractor in analyzing the scalability features of the bitstream.
  • the SEI as it is in JSVM 1.0 may be too specific to the current SVC implementation in the JSVM reference software.
  • a new syntax for the scalability information SEI is proposed that enables the following system level operations without parsing and analyzing any coded slice NAL unit (with the only exception being region-of-interest scalability based on sub-picture scalable layers):
  • a streaming server to offer all the possible scalable presentation points to receivers
  • a media-aware network element, e.g. a gateway, to decide which NAL units are to be discarded (hence not transmitted) for a desired scalable presentation point.
  • the new syntax is as shown in Table 1 and described below. Of course, it would be apparent to those skilled in the art that other ways of signaling are possible and that the invention is not limited to the specific method of signaling. Categories (labeled in Table 1 as C) specify the partitioning of slice data into at most three slice data partitions and the descriptors specify the parsing process of each syntax element. The categories and descriptors are disclosed in the above mentioned document "Working Draft 1.0 of 14496-10:200x/AMD1 Scalable Video Coding".
  • Semantics are specified as follows. When present, this SEI message shall appear in an IDR access unit. The semantics of the message are valid until the next SEI message of the same type.
  • num_layers_minus1 plus 1 indicates the number of scalable layers or presentation points supported by the bitstream.
  • the value of num_layers_minus1 is in the range of 0 to 255, inclusive.
  • Each scalable layer is associated with a layer ID.
  • the layer ID is assigned as follows. A larger value of layer ID indicates a higher layer. A value 0 indicates the lowest layer. Decoding and presentation of a layer is independent of any higher layer but may be dependent on a lower layer. Therefore, the lowest layer can be decoded and presented independently, decoding and presentation of layer 1 may be dependent on layer 0, decoding and presentation of layer 2 may be dependent on layers 0 and 1, and so on.
  • the representation of a scalable layer requires the presence of the scalable layer itself and all the lower layers on which the scalable layer is directly or indirectly dependent. In the following, a scalable layer and all the lower layers on which it is directly or indirectly dependent are collectively called the scalable layer representation.
  • mapping of each coded picture to a scalable layer may be signaled by the sub-sequence information SEI message.
  • fgs_layer_flag[ i ] equal to 1 indicates that the scalable layer with layer ID equal to i is a fine granularity scalable (FGS) layer.
  • a value 0 indicates that the scalable layer is not an FGS layer.
  • the coded slice NAL units of an FGS layer can be truncated at any byte-aligned position.
  • it may be necessary to signal the size of the NAL unit header and slice header of each FGS slice and the minimum meaningful bitrate for each FGS layer, in the bitstream and/or in the file format, so that media-unaware network elements can perform FGS truncation (see the sketch below).
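  • As an illustration only, the following C++ sketch shows the kind of byte-aligned truncation a media-unaware element could apply to an FGS coded slice NAL unit, assuming the combined NAL unit header and slice header size is signaled as suggested above; the function and parameter names are hypothetical.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Truncate an FGS coded slice NAL unit to a byte budget while always keeping
// at least the (signaled) NAL unit header and slice header bytes; FGS data
// may be cut at any byte-aligned position.
std::vector<uint8_t> truncateFgsNalUnit(const std::vector<uint8_t>& nalUnit,
                                        std::size_t headerBytes,
                                        std::size_t byteBudget) {
    std::size_t keep = byteBudget;
    if (keep < headerBytes) keep = headerBytes;   // never cut into the headers
    if (keep > nalUnit.size()) keep = nalUnit.size();
    return std::vector<uint8_t>(nalUnit.begin(),
                                nalUnit.begin() + static_cast<std::ptrdiff_t>(keep));
}
```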
  • sub_pic_layer_flag[ i ] equal to 1 indicates that the scalable layer with layer ID equal to i consists of sub-pictures, each sub-picture consisting of a subset of the coded slices of an access unit.
  • a value 0 indicates that the scalable layer consists of entire access units.
  • mapping of each sub-picture of a coded picture to a scalable layer may be signaled by the sub-picture layer information SEI message.
  • sub_region_layer_flag[ i ] equal to 1 indicates that the scalable layer with layer ID equal to i represents a sub-region of the entire region represented by the entire bitstream.
  • a value 0 indicates that the scalable layer represents the entire region represented by the entire bitstream.
  • profile_level_info_present_flag[ i ] equal to 1 indicates the presence of the profile and level information for the scalable layer with layer ID equal to i in the SEI message.
  • a value 0 indicates that the profile and level information for the scalable layer with layer ID equal to i is not present in the SEI message.
  • decoding_dependency_info_present_flag[ i ] equal to 1 indicates the presence of the decoding dependency information for the scalable layer with layer ID equal to i in the SEI message.
  • a value 0 indicates that the decoding dependency information for the scalable layer with layer ID equal to i is not present in the SEI message.
  • bitrate_info_present_flag[ i ] equal to 1 indicates the presence of the bitrate information for the scalable layer with layer ID equal to i in the SEI message.
  • a value 0 indicates that the bitrate information for the scalable layer with layer ID equal to i is not present in the SEI message.
  • frm_rate_info_present_flag[ i ] equal to 1 indicates the presence of the frame rate information for the scalable layer with layer ID equal to i in the SEI message.
  • a value 0 indicates that the frame rate information for the scalable layer with layer ID equal to i is not present in the SEI message.
  • frm_size_info_present_flag[ i ] equal to 1 indicates the presence of the frame size information for the scalable layer with layer ID equal to i in the SEI message.
  • a value 0 indicates that the frame size information for the scalable layer with layer ID equal to i is not present in the SEI message.
  • layer_dependency_info_present_flag[ i ] equal to 1 indicates the presence of the layer dependency information for the scalable layer with layer ID equal to i in the SEI message.
  • a value 0 indicates that the layer dependency information for the scalable layer with layer ID equal to i is not present in the SEI message.
  • init_parameter_sets_info_present_flag[ i ] equal to 1 indicates the presence of the initial parameter sets information for the scalable layer with layer ID equal to i in the SEI message.
  • a value 0 indicates that the initial parameter sets information for the scalable layer with layer ID equal to i is not present in the SEI message.
  • the initial parameter sets refer to those parameter sets that can be transmitted at the beginning of the session.
  • the message components layer_profile_idc[ i ], layer_constraint_set0_flag[ i ], layer_constraint_set1_flag[ i ], layer_constraint_set2_flag[ i ], layer_constraint_set3_flag[ i ], and layer_level_idc[ i ] indicate the profile and level compliancy of the bitstream of the representation of the scalable layer with layer ID equal to i.
  • the semantics of layer_profile_idc[ i ], layer_constraint_set0_flag[ i ], layer_constraint_set1_flag[ i ], layer_constraint_set2_flag[ i ], layer_constraint_set3_flag[ i ], and layer_level_idc[ i ] are identical to the semantics of profile_idc, constraint_set0_flag, constraint_set1_flag, constraint_set2_flag, constraint_set3_flag and level_idc, respectively, except that the target bitstream is the bitstream of the scalable layer representation.
  • dependency_id[ i ] and temporal_level[ i ] are equal to DependencyId and TemporalLevel, respectively, of the NAL units in the scalable layer with layer ID equal to i.
  • avg_bitrate[ i ] indicates the average bit rate, in units of 1000 bits per second, of the bitstream of the representation of scalable layer with layer ID equal to i.
  • the semantics of avg_bitrate[ i ] is identical to the semantics of average_bit_rate in the sub-sequence layer characteristics SEI message when accurate_statistics_flag is equal to 1, except that the target bitstream is the bitstream of the scalable layer representation.
  • max_bitrate[ i ] indicates the maximum bit rate, in units of 1000 bits per second, of the bitstream of the representation of scalable layer with layer ID equal to i, in any one-second time window of access unit removal time.
  • constant_frm_rate_idc[ i ] indicates whether the frame rate of the representation of the scalable layer with layer ID equal to i is constant. If the value of avg_frm_rate as specified below is constant whichever temporal section of the scalable layer representation is used for the calculation, then the frame rate is constant; otherwise, the frame rate is non-constant.
  • Value 0 denotes a non- constant frame rate
  • value 1 denotes a constant frame rate
  • value 2 denotes that it is not clear whether the frame rate is constant or not.
  • the value of constant_frm_rate_idc is in the range of 0 to 2, inclusive.
  • avg_frm_rate[ i ] indicates the average frame rate, in units of frames per second, of the bitstream of the representation of scalable layer with layer ID equal to i.
  • the semantics of avg_frm_rate[ i ] is identical to the semantics of average_frame_rate in the sub-sequence layer characteristics SEI message when accurate_statistics_flag is equal to 1, except that the target bitstream is the bitstream of the scalable layer representation.
  • frm_width_in_mbs_minus1[ i ] plus 1 indicates the maximum width, in macroblocks, of a coded frame in the representation of the scalable layer with layer ID equal to i.
  • frm_height_in_mbs_minus1[ i ] plus 1 indicates the maximum height, in macroblocks, of a coded frame in the representation of the scalable layer with layer ID equal to i.
  • horizontal_offset[ i ] and vertical_offset[ i ] give the horizontal and vertical offsets, respectively, of the top-left pixel of the rectangular region represented by the representation of the scalable layer with layer ID equal to i, relative to the top-left pixel of the overall region represented by the entire bitstream.
  • the unit is of luma samples in the scale of the highest spatial resolution.
  • region_width[ i ] and region_height[ i ] give the width and height, respectively, of the rectangular region represented by the representation of the scalable layer with layer ID equal to i, in luma samples in the scale of the highest spatial resolution.
  • num_directly_dependent_layers[ i ] indicates the number of scalable layers that the scalable layer with layer ID equal to i is directly dependent on.
  • the value of num_directly_dependent_layers is in the range of 0 to 255, inclusive.
  • directly_dependent_layer_id_delta[ i ][ j ] indicates the difference between the layer ID of the j-th scalable layer that the scalable layer with layer ID equal to i is directly dependent on and i.
  • the layer ID of the directly dependent-on scalable layer is equal to ( directly_dependent_layer_id_delta[ i ][ j ] + i ).
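  • As an illustration only, the following C++ sketch collects the set of layers required for decoding a target layer (its scalable layer representation) once the directly dependent-on layer IDs have been derived per the semantics above; the container choices and names are assumptions.

```cpp
#include <cstdint>
#include <map>
#include <set>
#include <vector>

// directlyDependent[i] holds the layer IDs that layer i directly depends on,
// each derived from the signaled syntax as
// ( directly_dependent_layer_id_delta[ i ][ j ] + i ).
std::set<uint32_t> collectLayerRepresentation(
        uint32_t targetLayerId,
        const std::map<uint32_t, std::vector<uint32_t>>& directlyDependent) {
    std::set<uint32_t> required;
    std::vector<uint32_t> pending{targetLayerId};
    while (!pending.empty()) {
        const uint32_t id = pending.back();
        pending.pop_back();
        if (!required.insert(id).second) continue;  // already visited
        auto it = directlyDependent.find(id);
        if (it == directlyDependent.end()) continue;
        for (uint32_t dep : it->second) pending.push_back(dep);
    }
    return required;  // target layer plus all directly/indirectly required layers
}
```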
  • num_init_seq_parameter_set_minus1[ i ] plus 1 indicates the number of initial sequence parameter sets for decoding the representation of the scalable layer with layer ID equal to i.
  • init_seq_parameter_set_id_delta[ i ][ j ] indicates the value of the seq_parameter_set_id of the j-th initial sequence parameter set for decoding the representation of the scalable layer with layer ID equal to i if j is equal to 0. If j is larger than 0, init_seq_parameter_set_id_delta[ i ][ j ] indicates the difference between the value of the seq_parameter_set_id of the j-th initial sequence parameter set and the value of the seq_parameter_set_id of the (j-1)-th initial sequence parameter set.
  • the initial sequence parameter sets are logically ordered in ascending order of the value of seq_parameter_set_id.
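  • As an illustration only, the following C++ sketch resolves the delta-coded initial sequence parameter set IDs described above into absolute seq_parameter_set_id values; it assumes the deltas have already been parsed from the SEI message.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// The first delta is the seq_parameter_set_id itself; each subsequent delta
// is the difference from the previous ID, and the resulting IDs are in
// ascending order.
std::vector<uint32_t> resolveInitSpsIds(const std::vector<uint32_t>& deltas) {
    std::vector<uint32_t> ids;
    uint32_t prev = 0;
    for (std::size_t j = 0; j < deltas.size(); ++j) {
        const uint32_t id = (j == 0) ? deltas[0] : prev + deltas[j];
        ids.push_back(id);
        prev = id;
    }
    return ids;
}
// Example: deltas {3, 1, 2} resolve to seq_parameter_set_id values {3, 4, 6}.
```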
  • Mapping of access units to scalable layers is signaled using sub-sequence information SEI messages.
  • the sub_seq_layer_num in the sub-sequence information SEI message indicates the layer ID of the scalable layer to which the current access unit belongs.
  • a new SEI message is defined as shown in Table 2. This design is simple, but parsing into the picture parameter set and slices is needed to identify whether a slice belongs to a motion-constrained slice group set. Alternatively, we can design a sub-picture-level SEI to signal the layer ID.
  • this SEI message shall appear in the same SEI payload containing a motion-constrained slice group set SEI message and shall immediately succeed the motion-constrained slice group set SEI message in decoding order.
  • the slice group set identified by the motion-constrained slice group set SEI message is called the associated slice group set of the sub-picture layer information SEI message.
  • layer_id indicates the layer ID of the scalable layer to which the coded slice NAL units in the associated slice group set belong.
  • support of the signaling can be achieved by the following method using sequence parameter set and NAL unit header or slice header.
  • the signaling information may not be present in the bit stream for any of the following reasons: 1) the signaling is not supported by the coding technique or standard, 2) the signaling is supported but not present, 3) the file format specification disallows inclusion of some information in the bit stream contained in the file format container; for example, the AVC file format specification disallows inclusion of the three kinds of sub-sequence SEI messages in the bit stream stored in media tracks. Therefore, it is important to support signaling of the information in the file format.
  • the scalability structures below are designed to be usable for all types of scalable video streams, and hence can be considered an extension to the ISO base media file format.
  • the brand 'svd' can be used to indicate that this extension is used in a file.
  • an ISO file should contain zero or one instance of a SampleToGroupBox (per track) with a grouping_type equal to 'scif'.
  • This SampleToGroupBox instance maps each sample to one or more scalable layers.
  • the scalability information for each scalable layer is stored in the corresponding sample group description entry (ScalabilityInfoEntry) that is included in the SampleGroupDescriptionBox of grouping type 'scif'.
  • Scalability information includes layer ID, profile and level, bitrate, frame rate, buffer parameters and dependency information.
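  • As an illustration only, the following C++ sketch shows how a reader could resolve the layer ID of each sample from a flattened view of the 'scif' sample grouping: the SampleToGroupBox runs map samples to 1-based indices into the SampleGroupDescriptionBox, whose entries (the ScalabilityInfoEntry structures described below) carry the layer IDs. The flattened structures are hypothetical simplifications of the actual box parsing.

```cpp
#include <cstdint>
#include <vector>

// One run from a SampleToGroupBox: sampleCount consecutive samples mapped to
// the given 1-based group description index (0 means "no group").
struct SampleToGroupRun {
    uint32_t sampleCount;
    uint32_t groupDescriptionIndex;
};

// layerIdOfEntry[k] is the layer ID stored in the (k+1)-th ScalabilityInfoEntry
// of the 'scif' SampleGroupDescriptionBox.
std::vector<uint32_t> layerIdPerSample(
        const std::vector<SampleToGroupRun>& runs,
        const std::vector<uint32_t>& layerIdOfEntry) {
    std::vector<uint32_t> out;
    for (const SampleToGroupRun& run : runs) {
        for (uint32_t i = 0; i < run.sampleCount; ++i) {
            uint32_t layerId = 0xFFFFFFFFu;  // sentinel: not mapped to a layer
            if (run.groupDescriptionIndex > 0 &&
                run.groupDescriptionIndex <= layerIdOfEntry.size()) {
                layerId = layerIdOfEntry[run.groupDescriptionIndex - 1];
            }
            out.push_back(layerId);
        }
    }
    return out;
}
```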
  • each scalable layer is associated with a layer ID.
  • the layer ID is assigned as follows. A larger value of layer ID indicates a higher layer. A value 0 indicates the lowest layer. Decoding and presentation of a layer is independent of any higher layer but may be dependent on a lower layer. Therefore, the lowest layer can be decoded and presented independently, decoding and presentation of layer 1 may be dependent on layer 0, decoding and presentation of layer 2 may be dependent on layers 0 and 1, and so on.
  • the representation of a scalable layer requires the presence of the scalable layer itself and all the lower layers on which the scalable layer is directly or indirectly dependent. In the following, a scalable layer and all the lower layers on which it is directly or indirectly dependent are collectively called the scalable layer representation.
  • the syntax of the extension to ISO base media file format can be as follows.
  • the ScalabilityInfoEntry includes ProfileLevelBox, BitRateBox, FrameRateBox, FrameSizeBox, RectRegionBox, BufferingBox and LayerDependencyBox. Definitions of these boxes are as shown in Tables 3 and 4:
  • ProfileLevelBox contains the profile and level that the scalable layer representation is compliant with
  • BitRateBox contains the bit rate information
  • FrameRateBox contains the frame rate information
  • FrameSizeBox contains the spatial resolution information
  • BufferingBox contains the buffer information
  • LayerDependencyBox contains the layers on which the scalable layer is dependent.
  • the BufferingBox is an abstract box; a file format derived from the ISO base media file format shall define a buffer information data structure according to the buffering model specified by the video coding standard. For a certain scalable layer, if any of the optional boxes is not present, then the described information is the same as that of the highest scalable layer.
  • the semantics are as follows.
  • layerId gives the identifier of the scalable layer which the following information describes.
  • IsFgsLayer equal to 1 indicates that the scalable layer is a fine granularity scalable (FGS) layer, the bitstream data unit of which can be truncated at any byte-aligned position.
  • a value 0 indicates that the scalable layer is not an FGS layer.
  • IsSubsampleLayer equal to 1 indicates that the scalable layer is formed only by sub-samples of the samples being mapped to the layer.
  • the information on which sub-samples are included in this layer is signaled in the Sub-Sample Information Box.
  • a value 0 indicates that the scalable layer is formed by the samples being mapped to the layer.
  • profileIdc and levelIdc specify the profile and level, respectively, with which the bitstream of the scalable layer representation is compliant.
  • avgBitrate gives the average bit rate, in bit/s, of the bitstream of the scalable layer representation.
  • maxBitrate gives the maximum bit rate, in bit/s, of the bitstream of the scalable layer representation in any time window of one second.
  • constantFrameRate indicates whether the frame rate of the scalable layer representation is constant. If the value of frameRate as specified below is constant whichever temporal section of the scalable layer representation is used for the calculation, then the frame rate is constant; otherwise, the frame rate is non-constant. Value 0 denotes a non-constant frame rate, value 1 denotes a constant frame rate, and value 2 denotes that it is not clear whether the frame rate is constant.
  • the value of constantFrameRate is in the range of 0 to 2, inclusive.
  • frameRate gives the average frame rate in units of frames/(256 seconds). All NAL units in the scalable layer presentation are taken into account in the calculation.
  • C is the number of frames in the scalable layer representation
  • t1 is the presentation timestamp of the first picture in the scalable layer representation in presentation order
  • t2 is the presentation timestamp of the latest picture in the scalable layer representation in presentation order.
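  • As an illustration only (the exact rounding and whether the last frame's duration is counted are not specified in the text above), a plausible computation of frameRate in units of frames per 256 seconds from C, t1 and t2 is sketched below in C++.

```cpp
#include <cmath>
#include <cstdint>

// Average frame rate in frames/(256 seconds), computed from the frame count C
// and the first and last presentation timestamps t1 and t2 given in seconds.
// The rounding convention here is an assumption.
uint32_t frameRateUnits(uint32_t C, double t1, double t2) {
    const double fps = static_cast<double>(C) / (t2 - t1);
    return static_cast<uint32_t>(std::lround(fps * 256.0));
}
// Example: C = 250 frames over t2 - t1 = 10 s is 25 frames/s, i.e. 6400 units.
```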
  • frm_width and frm_height give, respectively, the maximum width and height, in luma samples, of a video frame of the scalable layer representation.
  • the term "frame” is interpreted in the same way as in the SVC coding specification.
  • horizontal_offset and vertical_offset give, respectively, the horizontal and vertical offsets, in luma samples, of the top-left pixel of the rectangular region represented by the scalable layer representation, relative to the top-left pixel of the overall region represented by the highest scalable layer representation.
  • region_width and region_height give respectively the width and height of the rectangular region represented by the scalable layer representation, in luma samples of the same scale of the overall region represented by the highest scalable layer representation.
  • entry_count gives the number of entries in the following table.
  • dependencyLayerId gives the layerId of a scalable layer on which the current scalable layer is directly or indirectly dependent.
  • the value of dependencyLayerId shall be smaller than the layerId of the current scalable layer.
  • the representation of the current scalable layer requires the presence of the scalable layer indicated by dependencyLayerId.
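  • As an illustration only, the following C++ sketch gives a hypothetical in-memory view of a ScalabilityInfoEntry and the optional boxes listed above, together with the fall-back rule that information absent from an entry is taken from the highest scalable layer; the on-disk syntax of Tables 3 and 4 is not reproduced, so field widths and groupings are assumptions.

```cpp
#include <cstdint>
#include <optional>
#include <vector>

struct ProfileLevelInfo { uint8_t profileIdc; uint8_t levelIdc; };
struct BitRateInfo      { uint32_t avgBitrate; uint32_t maxBitrate; };   // bit/s
struct FrameRateInfo    { uint8_t constantFrameRate; uint32_t frameRate; };  // frames/(256 s)
struct FrameSizeInfo    { uint32_t frmWidth; uint32_t frmHeight; };      // luma samples
struct RectRegionInfo   { uint32_t horizontalOffset; uint32_t verticalOffset;
                          uint32_t regionWidth; uint32_t regionHeight; };

// Hypothetical flattened view of one ScalabilityInfoEntry; each optional
// member corresponds to one of the optional boxes described above.
struct ScalabilityInfoEntry {
    uint32_t layerId;
    bool     isFgsLayer;
    bool     isSubsampleLayer;
    std::optional<ProfileLevelInfo> profileLevel;        // ProfileLevelBox
    std::optional<BitRateInfo>      bitRate;             // BitRateBox
    std::optional<FrameRateInfo>    frameRate;           // FrameRateBox
    std::optional<FrameSizeInfo>    frameSize;           // FrameSizeBox
    std::optional<RectRegionInfo>   rectRegion;          // RectRegionBox
    std::vector<uint32_t>           dependencyLayerIds;  // LayerDependencyBox
};

// Fall-back rule: if an optional box is absent for a layer, the described
// information is the same as that of the highest scalable layer.
template <typename T>
const std::optional<T>& resolve(const std::optional<T>& own,
                                const std::optional<T>& highestLayer) {
    return own ? own : highestLayer;
}
```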
  • the first 8 bits of the 32-bit reserved field are used to signal the scalable layer identifier of which
  • Similar methods can also be applied to support the signaling in other file formats. If it is supported in the ISO file format, then it is naturally supported in derived file formats such as the MPEG-4 file format, AVC file format and 3GPP file format.
  • the inventors have developed the following SVC file format (AVC FF Amd.2) derived from ISO base media file format:
  • a sample is defined as follows in ISO base media file format:
  • a sample is an individual frame of video, a time-contiguous series of video frames, or a time-contiguous compressed section of audio.
  • a sample defines the formation of one or more streaming packets. No two samples within a track may share the same time-stamp.
  • scalable video particularly for spatial and quality scalability
  • the above constraint that no two samples within a track may share the same timestamp is not applicable, because more than one picture (e.g. the base layer picture and the spatial enhancement layer picture) may share the same timestamp. If these pictures are made in the same sample, it is not handy for a server to do scalable truncation because parsing into samples is always needed.
  • a picture is decoded from a set of NAL units with an identical value of picture order count and Dependencyld.
  • the corresponding NAL units shall include slice NAL units for all macroblocks of a picture and possibly additional progressive refinement slice NAL units.
  • Progressive refinement slices (i.e. FGS slices) are in the same picture as the corresponding base layer. If the FGS slices and the corresponding base layer are made in the same sample, it is not handy for a server to do scalable truncation because parsing into samples is even needed for non-FGS operations. Therefore, in an example embodiment each FGS enhancement plane or the corresponding base layer is separated into its own picture.
  • a sub-sample is defined as one or more contiguous NAL units within one sample.
  • the first 8 bits of the reserved field in the ProfileLevelBox are used to contain the profile compatibility information, such that the syntax is as follows:
  • profile_compatibility is a byte defined exactly the same as the byte which occurs between the profile_idc and level_idc in a sequence parameter set, as defined in the SVC video specification.
  • the following scalability information types are specific to the SVC coding format:
  • dependency_id and temporal_level give, respectively, the scalable layer's values of DependencyId and TemporalLevel as defined in the SVC video specification.
  • numOfSequenceParameterSets indicates the number of sequence parameter sets that are used as the initial set of sequence parameter sets for decoding the scalable layer representation.
  • sequenceParameterSetLength indicates the length in bytes of the sequence parameter set NAL unit as defined in the SVC video specification.
  • sequenceParameterSetNALUnit contains a sequence parameter set NAL Unit, as specified in the SVC video specification. Sequence parameter sets shall occur in ascending order of parameter set identifier with gaps being allowed.
  • numOfPictureParameterSets indicates the number of picture parameter sets that are used as the initial set of picture parameter sets for decoding the scalable layer representation.
  • pictureParameterSetLength indicates the length in bytes of the picture parameter set NAL unit as defined in the SVC video specification.
  • pictureParameterSetNALUnit contains a picture parameter set NAL Unit, as specified in the SVC video specification. Picture parameter sets shall occur in ascending order of parameter set identifier with gaps being allowed.
  • operation_point_count specifies the number of operation points. Values of SVC HRD parameters are specified separately for each operation point. The value of operation_point_count shall be greater than 0.
  • tx_byte_rate indicates the input byte rate (in bytes per second) to the coded picture buffer (CPB) of SVC HRD.
  • the bitstream of the scalable layer representation is constrained by the value of BitRate equal to 8 * the value of tx_byte_rate for NAL HRD parameters as specified in the SVC video specification.
  • the value of BitRate is equal to tx_byte_rate * 40 / 6.
  • the value of tx_byte_rate shall be greater than 0.
  • cpb_size gives the required size of the coded picture buffer in bytes.
  • the bitstream of the scalable layer representation is constrained by the value of CpbSize equal to cpb_size * 8 for NAL HRD parameters as specified in the SVC video specification.
  • CpbSize is equal to cpb_size * 40 / 6.
  • dpb_size gives the required size of the decoded picture buffer, in unit of bytes.
  • the bitstream of the scalable layer representation is constrained by the value of max_dec_frame_buffering equal to Min( 16, Floor( post_dec_buf_size / ( PicWidthInMbs * FrameHeightInMbs * 256 * ChromaFormatFactor ) ) ) as specified in the SVC video specification.
  • At least one set of values of tx_byte_rate, cpb_size and dpb_size of the same operation point shall conform to the constraints set by the profile and level of the bitstream of the scalable layer representation.
  • init_cpb_delay gives the required delay between the time of arrival in the pre-decoder buffer of the first bit of the first access unit and the time of removal from the pre-decoder buffer of the first access unit. It is in units of a 90 kHz clock.
  • the bitstream of the scalable layer representation is constrained by the value of the nominal removal time of the first access unit from the coded picture buffer (CPB), tr,n( 0 ), equal to init_cpb_delay as specified in the SVC video specification.
  • init_dpb_delay gives the required delay between the time of arrival in the post-decoder buffer of the first decoded picture and the time of output from the post-decoder buffer of the first decoded picture. It is in units of a 90 kHz clock.
  • the bitstream of the scalable layer representation is constrained by the value of dpb_output_delay for the first decoded picture in output order equal to init_dpb_delay as specified in the SVC video specification assuming that the clock tick variable, tc, is equal to 1 / 90 000.
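  • As an illustration only, the unit conversions stated above between the stored buffering fields and the HRD parameters are collected in the C++ sketch below; the factors are reproduced as given, and the HRD to which the 40 / 6 factors apply is not named in the text, so that remains an open assumption.

```cpp
#include <cstdint>

struct HrdValues {
    uint64_t nalBitRate;  // bits per second, NAL HRD (8 * tx_byte_rate)
    uint64_t nalCpbSize;  // bits, NAL HRD (8 * cpb_size)
    uint64_t altBitRate;  // tx_byte_rate * 40 / 6, per the text above
    uint64_t altCpbSize;  // cpb_size * 40 / 6, per the text above
};

// tx_byte_rate is in bytes per second and cpb_size in bytes, as defined above.
HrdValues deriveHrdValues(uint64_t tx_byte_rate, uint64_t cpb_size) {
    HrdValues v;
    v.nalBitRate = 8 * tx_byte_rate;       // bytes/s -> bits/s
    v.nalCpbSize = 8 * cpb_size;           // bytes   -> bits
    v.altBitRate = tx_byte_rate * 40 / 6;
    v.altCpbSize = cpb_size * 40 / 6;
    return v;
}
```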
  • the mapping of samples/pictures to scalable layers is a grouping concept.
  • the sample group design provides an elegant way to signal the mapping information and also the scalability information of the scalable layers.
  • some parts of the scalability information of the scalable layers are exactly the same as the entire elementary stream or the highest scalable layer.
  • By categorizing and signaling the scalability information in different optional boxes those information parts do not need to be redundantly stored.
  • using boxes is flexible in the way that if more scalability information is needed it can be easily included by having new boxes in the sample group description entry.
  • a streaming server has stored a scalable stream of profile P and level L
  • a scalable layer of the stream could be of profile P1 and level L1
  • the implementation requirement of a decoder compliant with L1@P1 is simpler than that of a decoder compliant with L@P.
  • If the server is going to feed the video content to a client with a decoder compliant with L1@P1, the server has to check whether the stream contains a scalable layer that is compliant with L1@P1, e.g. by running a hypothetical reference decoder, which imposes additional implementation and computational complexity on the streaming server.
  • Having the profile and level information signaled for each scalable layer enables the above applications with a much simpler server implementation. A sketch of such a layer-selection step appears after this list.
  • bit rate, frame rate and frame size information are naturally needed for bit rate, temporal and spatial scalabilities.
  • region information is useful to support region-of-interest (ROI) scalability as required in N6880.
  • Decoding of different scalable layer representations requires different buffer sizes and buffering delays.
  • the presence of buffer information enables a receiver/decoder of a scalable layer representation to allocate less memory for decoding or to have a shorter initial delay, both of which help to improve the end-user experience.
  • the layer dependency information enables a streaming server to avoid sending unnecessary lower layers without having to analyze the stream, which would require a complex implementation.
  • This information indicates whether the scalable layer is a fine granularity scalable (FGS) layer, the bitstream data unit of which can be truncated at any byte aligned position.
  • the decoding dependency information is included in the NAL unit headers of scalable extension layer NAL units.
  • the mapping between the decoding dependency information and the scalable layer identifier is needed.
  • a scalable layer representation may not use all of the initial parameter sets of the entire stream; transmitting all of those parameter sets may waste transmission bandwidth and lengthen the initial setup delay, particularly because initial parameter sets are typically transmitted out-of-band and reliably, which implies that reception acknowledgement is used and retransmission may be used. A simple illustration of restricting the transmitted parameter sets to those of one layer representation is sketched after this list.
  • Signaling of initial parameter sets for each scalable layer representation solves the problem. It is also possible for a server to get the information by analyzing the bitstream. However, that requires the server to be media-aware and to do on-the-fly bitstream analysis.
  • the server can create multiple alternatives for the same stream based on the scalability information, either through multiple SDP descriptions, where each SDP description contains one or more alternatives, or through one SDP description containing multiple alternatives.
  • the receiving terminal chooses one or none of the alternatives. If one alternative is chosen, the server then knows which layers should be transmitted.
  • in multicast/broadcast applications there may be receiving terminals of different capabilities and/or in different network conditions, with the result that different alternatives of a stream are ideal for different receiving terminals.
  • the relevant layers are transmitted from the server side.
  • the server should present through the service announcement what alternatives are available, such that each receiver can choose one alternative service to subscribe to. This can be achieved, for example, by using SDP in a similar manner as in the unicast case. It is also possible that in one particular multicast/broadcast group one alternative with multiple layers is transmitted, while the receiver chooses to decode one of the layers and to discard the rest of the data. Using this method, the available bandwidth of the sending device may be efficiently utilized, because fewer streams are transmitted. In this case, within the same multicast/broadcast group, the server does not need to know the different preferences of different receiving terminals. However, it is still necessary for the server to present the information on the alternatives through the service announcement, such that the receiver can conclude whether it is able to decode any of the alternatives.
  • the above two methods can also be applied together. That is, there may be multiple multicast/broadcast groups. In some of the groups, all the receivers can decode the same alternative, while in the other groups some of the receivers may discard some of the received bit stream layers.
  • the combined method may be used to globally optimize both the efficiency of the bandwidth available in the server and the efficiencies of the bandwidths available in the receivers.
  • embodiments within the scope of the present invention include program products comprising computer-readable media for carrying or having computer-executable instructions or data structures stored thereon.
  • Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer.
  • Such computer-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • the system comprises a transmitting server 101 which has, e.g., a storage medium 102 containing a file 103 which contains a media stream encoded according to the present invention.
  • the file 103 is transmitted as one or more signals to a network 104 such as a mobile communication network.
  • a gateway 105 receives the file 103 and forwards it to, e.g., a base station 106 of the network, e.g. under the control of the MMSC 107.
  • a receiver 108 can receive the signal(s) and decode the scalability information and some other information included in the signal(s).
  • the invention is described in the general context of method steps, which may be implemented in one embodiment by a program product including computer-executable instructions, such as program code, executed by computers in networked environments.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein.
  • the particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
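
The buffering bullets above lend themselves to a small worked example. The following Python sketch is only an illustration of the arithmetic described above; the function names, argument names and example values are our own assumptions and are not taken from the SVC video specification or from this application.

```python
# Illustrative helpers for the per-operation-point buffering parameters
# described above (names and values are assumptions, not normative syntax).
import math

def nal_hrd_bitrate(tx_byte_rate: int) -> int:
    """NAL HRD BitRate constraint: 8 bits per byte of CPB input rate."""
    return 8 * tx_byte_rate

def nal_hrd_cpb_size_bits(cpb_size_bytes: int) -> int:
    """NAL HRD CpbSize constraint: the CPB size expressed in bits."""
    return 8 * cpb_size_bytes

def max_dec_frame_buffering(post_dec_buf_size: int,
                            pic_width_in_mbs: int,
                            frame_height_in_mbs: int,
                            chroma_format_factor: float) -> int:
    """Bound on max_dec_frame_buffering: decoded picture buffer size in
    bytes divided by the size of one decoded frame (256 luma samples per
    macroblock, scaled by the chroma format factor), capped at 16."""
    frame_bytes = pic_width_in_mbs * frame_height_in_mbs * 256 * chroma_format_factor
    return min(16, math.floor(post_dec_buf_size / frame_bytes))

def ticks_90khz_to_seconds(ticks: int) -> float:
    """init_cpb_delay and init_dpb_delay are given in 90 kHz clock ticks."""
    return ticks / 90000.0

if __name__ == "__main__":
    print(nal_hrd_bitrate(tx_byte_rate=64000))            # 512000 bits per second
    print(nal_hrd_cpb_size_bits(cpb_size_bytes=80000))    # 640000 bits
    # Room for three CIF 4:2:0 frames (22 x 18 macroblocks) in the DPB.
    print(max_dec_frame_buffering(post_dec_buf_size=3 * 152064,
                                  pic_width_in_mbs=22,
                                  frame_height_in_mbs=18,
                                  chroma_format_factor=1.5))   # 3
    print(ticks_90khz_to_seconds(45000))                  # 0.5 seconds
```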
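
To illustrate how optional boxes let a sample group description entry omit scalability information that equals that of the whole stream, the following sketch serializes a layer entry from whichever boxes are actually needed. The four-character codes, field layouts and helper names are hypothetical and only mimic the generic size/type/payload box structure; they are not the syntax defined for the file format.

```python
# Hypothetical serializer for a scalable-layer description entry made of
# optional boxes (four-character codes and payload layouts are invented
# for illustration only).
import struct

def make_box(box_type: bytes, payload: bytes) -> bytes:
    """Generic box: 32-bit size (including the 8-byte header) + type + payload."""
    return struct.pack(">I4s", 8 + len(payload), box_type) + payload

def profile_level_box(profile_idc: int, level_idc: int) -> bytes:
    return make_box(b"xprl", struct.pack(">BB", profile_idc, level_idc))

def bitrate_box(avg_bitrate: int, max_bitrate: int) -> bytes:
    return make_box(b"xbtr", struct.pack(">II", avg_bitrate, max_bitrate))

def frame_size_box(width: int, height: int) -> bytes:
    return make_box(b"xfsz", struct.pack(">HH", width, height))

def layer_entry(layer_id: int, optional_boxes: list) -> bytes:
    """Layer identifier followed by only the boxes whose information differs
    from that of the entire stream; the other boxes are simply omitted."""
    return struct.pack(">B", layer_id) + b"".join(optional_boxes)

if __name__ == "__main__":
    # Base layer: only its profile/level differs from the whole stream.
    base = layer_entry(0, [profile_level_box(66, 30)])
    # Enhancement layer: bit rate and frame size are signalled as well.
    enh = layer_entry(1, [profile_level_box(83, 40),
                          bitrate_box(384000, 512000),
                          frame_size_box(352, 288)])
    print(len(base), len(enh))   # 11 and 39 bytes in this example
```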
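
The server-side use of the per-layer profile, level and dependency information can be pictured as a simple selection step, sketched below in Python. The data model and function names are our own assumptions; the sketch merely shows that, with this information stored alongside the stream, the server can choose a decodable layer for a client, plus the layers it depends on, without analyzing the bitstream itself.

```python
# Illustrative layer selection using stored per-layer scalability
# information (data model and names are assumptions, not the patent's syntax).
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class LayerInfo:
    layer_id: int
    profile: str                 # e.g. "P1"
    level: int                   # numerically comparable level indicator
    depends_on: List[int] = field(default_factory=list)

def choose_layers(layers: List[LayerInfo],
                  client_profiles: Set[str],
                  client_max_level: int) -> Set[int]:
    """Return the layer ids to transmit, or an empty set if the client
    cannot decode any of the stored scalable layers."""
    by_id = {layer.layer_id: layer for layer in layers}
    candidates = [layer for layer in layers
                  if layer.profile in client_profiles
                  and layer.level <= client_max_level]
    if not candidates:
        return set()
    # Assume a higher layer_id means a higher (better) scalable layer.
    chosen = max(candidates, key=lambda layer: layer.layer_id)
    # Collect the chosen layer and everything it transitively depends on,
    # so no unnecessary lower layers are sent.
    needed, stack = set(), [chosen.layer_id]
    while stack:
        layer_id = stack.pop()
        if layer_id not in needed:
            needed.add(layer_id)
            stack.extend(by_id[layer_id].depends_on)
    return needed

if __name__ == "__main__":
    layers = [LayerInfo(0, "P1", 10),
              LayerInfo(1, "P1", 20, depends_on=[0]),
              LayerInfo(2, "P", 30, depends_on=[1])]
    # A decoder compliant with L1@P1 (here: profile "P1", level 20) gets {0, 1}.
    print(choose_layers(layers, client_profiles={"P1"}, client_max_level=20))
```

In a unicast session this selection would follow the alternative chosen by the receiving terminal; in multicast/broadcast it would instead shape the alternatives announced to the receivers.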
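
The point about initial parameter sets can likewise be made concrete: if the file records, per scalable layer representation, which parameter set identifiers the representation actually uses, the sender can restrict the out-of-band setup to that subset. The sketch below is a trivial illustration with assumed data structures and placeholder payloads.

```python
# Assumed structures: parameter sets stored by identifier, and a per-layer
# record of the identifiers that the layer representation actually uses.
from typing import Dict, List, Set

def initial_parameter_sets_for_layer(all_sps: Dict[int, bytes],
                                     all_pps: Dict[int, bytes],
                                     used_sps_ids: Set[int],
                                     used_pps_ids: Set[int]) -> List[bytes]:
    """Return only the parameter set NAL units needed by one scalable layer
    representation, so the out-of-band setup does not carry unused sets."""
    sps = [all_sps[i] for i in sorted(used_sps_ids)]
    pps = [all_pps[i] for i in sorted(used_pps_ids)]
    return sps + pps

if __name__ == "__main__":
    all_sps = {0: b"\x67...", 1: b"\x67..."}                     # placeholder payloads
    all_pps = {0: b"\x68...", 1: b"\x68...", 2: b"\x68..."}
    # A base-layer representation that only references SPS 0 and PPS 0.
    subset = initial_parameter_sets_for_layer(all_sps, all_pps, {0}, {0})
    print(len(subset))   # 2 parameter set NAL units instead of 5
```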

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
PCT/FI2006/050136 2005-04-13 2006-04-10 Coding, storage and signalling of scalability information Ceased WO2006108917A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2008505913A JP2008536420A (ja) 2005-04-13 2006-04-10 スケーラビリティ情報の符号化、格納およびシグナリング
CA002604203A CA2604203A1 (en) 2005-04-13 2006-04-10 Coding, storage and signalling of scalability information
EP06725911.9A EP1869891A4 (en) 2005-04-13 2006-04-10 CODING, STORAGE AND SIGNALING OF SCALABILITY INFORMATION
MX2007012564A MX2007012564A (es) 2005-04-13 2006-04-10 Codificacion, almacenamiento y senalizacion de informacion de escalabilidad.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US67121505P 2005-04-13 2005-04-13
US60/671,215 2005-04-13

Publications (1)

Publication Number Publication Date
WO2006108917A1 true WO2006108917A1 (en) 2006-10-19

Family

ID=37086626

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2006/050136 Ceased WO2006108917A1 (en) 2005-04-13 2006-04-10 Coding, storage and signalling of scalability information

Country Status (10)

Country Link
US (2) US8774266B2
EP (1) EP1869891A4
JP (1) JP2008536420A
KR (1) KR20080006609A
CN (1) CN101120593A
CA (1) CA2604203A1
MX (1) MX2007012564A
RU (1) RU2377736C2
TW (1) TW200704191A
WO (1) WO2006108917A1

Families Citing this family (167)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7068729B2 (en) 2001-12-21 2006-06-27 Digital Fountain, Inc. Multi-stage code generator and decoder for communication systems
US6307487B1 (en) 1998-09-23 2001-10-23 Digital Fountain, Inc. Information additive code generator and decoder for communication systems
US9240810B2 (en) 2002-06-11 2016-01-19 Digital Fountain, Inc. Systems and processes for decoding chain reaction codes through inactivation
EP2348640B1 (en) 2002-10-05 2020-07-15 QUALCOMM Incorporated Systematic encoding of chain reaction codes
US7139960B2 (en) * 2003-10-06 2006-11-21 Digital Fountain, Inc. Error-correcting multi-stage code generator and decoder for communication systems having single transmitters or multiple transmitters
EP2202888A1 (en) 2004-05-07 2010-06-30 Digital Fountain, Inc. File download and streaming system
US7801383B2 (en) 2004-05-15 2010-09-21 Microsoft Corporation Embedded scalar quantizers with arbitrary dead-zone ratios
CN101180831A (zh) * 2005-05-24 2008-05-14 诺基亚公司 用于数字广播中分级传输/接收的方法和装置
US8422546B2 (en) 2005-05-25 2013-04-16 Microsoft Corporation Adaptive video encoding using a perceptual model
FR2889004B1 (fr) * 2005-07-22 2007-08-24 Canon Kk Procede et dispositif de traitement d'une sequence d'images numeriques a scalabilite spatiale ou en qualite
CN102271249B (zh) * 2005-09-26 2014-04-09 韩国电子通信研究院 用于可伸缩视频的感兴趣区域信息设置方法和解析方法
KR101255226B1 (ko) 2005-09-26 2013-04-16 한국과학기술원 스케일러블 비디오 코딩에서 다중 roi 설정, 복원을위한 장치 및 방법
US9136983B2 (en) * 2006-02-13 2015-09-15 Digital Fountain, Inc. Streaming and buffering using variable FEC overhead and protection periods
US9270414B2 (en) 2006-02-21 2016-02-23 Digital Fountain, Inc. Multiple-field based code generator and decoder for communications systems
AU2007230602B2 (en) * 2006-03-27 2012-01-12 Vidyo, Inc. System and method for management of scalability information in scalable video and audio coding systems using control messages
US8059721B2 (en) 2006-04-07 2011-11-15 Microsoft Corporation Estimating sample-domain distortion in the transform domain with rounding compensation
US7995649B2 (en) 2006-04-07 2011-08-09 Microsoft Corporation Quantization adjustment based on texture level
US8503536B2 (en) 2006-04-07 2013-08-06 Microsoft Corporation Quantization adjustments for DC shift artifacts
US8130828B2 (en) 2006-04-07 2012-03-06 Microsoft Corporation Adjusting quantization to preserve non-zero AC coefficients
US7974340B2 (en) 2006-04-07 2011-07-05 Microsoft Corporation Adaptive B-picture quantization control
US8711925B2 (en) 2006-05-05 2014-04-29 Microsoft Corporation Flexible quantization
WO2007134196A2 (en) 2006-05-10 2007-11-22 Digital Fountain, Inc. Code generator and decoder using hybrid codes
US9198084B2 (en) 2006-05-26 2015-11-24 Qualcomm Incorporated Wireless architecture for a traditional wire-based protocol
US9419749B2 (en) 2009-08-19 2016-08-16 Qualcomm Incorporated Methods and apparatus employing FEC codes with permanent inactivation of symbols for encoding and decoding processes
US9432433B2 (en) 2006-06-09 2016-08-30 Qualcomm Incorporated Enhanced block-request streaming system using signaling or block creation
US9386064B2 (en) 2006-06-09 2016-07-05 Qualcomm Incorporated Enhanced block-request streaming using URL templates and construction rules
US9178535B2 (en) 2006-06-09 2015-11-03 Digital Fountain, Inc. Dynamic stream interleaving and sub-stream based delivery
US9380096B2 (en) 2006-06-09 2016-06-28 Qualcomm Incorporated Enhanced block-request streaming system for handling low-latency streaming
US9209934B2 (en) 2006-06-09 2015-12-08 Qualcomm Incorporated Enhanced block-request streaming using cooperative parallel HTTP and forward error correction
US9344362B2 (en) 2007-01-12 2016-05-17 University-Industry Cooperation Group Of Kyung Hee University Packet format of network abstraction layer unit, and algorithm and apparatus for video encoding and decoding using the format, QOS control algorithm and apparatus for IPV6 label switching using the format
US8238424B2 (en) 2007-02-09 2012-08-07 Microsoft Corporation Complexity-based adaptive preprocessing for multiple-pass video compression
US8498335B2 (en) 2007-03-26 2013-07-30 Microsoft Corporation Adaptive deadzone size adjustment in quantization
US8243797B2 (en) 2007-03-30 2012-08-14 Microsoft Corporation Regions of interest for quality adjustments
EP3968642A1 (en) 2007-04-12 2022-03-16 InterDigital VC Holdings, Inc. Methods and apparatus for video usability information (vui) for scalable video coding (svc)
US8442337B2 (en) 2007-04-18 2013-05-14 Microsoft Corporation Encoding adjustments for animation content
JP2010527216A (ja) * 2007-05-16 2010-08-05 トムソン ライセンシング マルチビュー・ビデオ符号化(mvc)情報の符号化においてスライス群を使用する方法及び装置
US8331438B2 (en) 2007-06-05 2012-12-11 Microsoft Corporation Adaptive selection of picture-level quantization parameters for predicted video pictures
US8078568B2 (en) * 2007-06-25 2011-12-13 Sap Ag Properties of data elements
ATE495631T1 (de) 2007-07-02 2011-01-15 Fraunhofer Ges Forschung Vorrichtung und verfahren zum verarbeiten und lesen einer datei mit mediendatenbehälter und metadatenbehälter
US8667144B2 (en) 2007-07-25 2014-03-04 Qualcomm Incorporated Wireless architecture for traditional wire based protocol
RU2010114256A (ru) 2007-09-12 2011-10-20 Диджитал Фаунтин, Инк. (Us) Формирование и передача исходной идентификационной информации для обеспечения надежного обмена данными
EP2037683A1 (en) * 2007-09-17 2009-03-18 Alcatel Lucent Process for delivering to a media terminal an adapted video stream by means of an access node
KR101345287B1 (ko) * 2007-10-12 2013-12-27 삼성전자주식회사 스케일러블 영상 부호화 방법 및 장치와 그 영상 복호화방법 및 장치
FR2923124A1 (fr) * 2007-10-26 2009-05-01 Canon Kk Procede et dispositif de determination de la valeur d'un delai a appliquer entre l'envoi d'un premier ensemble de donnees et l'envoi d'un second ensemble de donnees
US8189933B2 (en) 2008-03-31 2012-05-29 Microsoft Corporation Classifying and controlling encoding quality for textured, dark smooth and smooth video content
US8811294B2 (en) 2008-04-04 2014-08-19 Qualcomm Incorporated Apparatus and methods for establishing client-host associations within a wireless network
US8897359B2 (en) 2008-06-03 2014-11-25 Microsoft Corporation Adaptive quantization for enhancement layer video coding
FR2932634B1 (fr) * 2008-06-11 2010-08-20 Alcatel Lucent Procede de transmission de contenus en couches par des ensembles choisis de stations de base d'une infrastructure radio
US8488680B2 (en) * 2008-07-30 2013-07-16 Stmicroelectronics S.R.L. Encoding and decoding methods and apparatus, signal and computer program product therefor
US9398089B2 (en) 2008-12-11 2016-07-19 Qualcomm Incorporated Dynamic resource sharing among multiple wireless devices
US8102849B2 (en) * 2009-02-12 2012-01-24 Qualcomm, Incorporated Association procedure to enable multiple multicast streams
US9281847B2 (en) 2009-02-27 2016-03-08 Qualcomm Incorporated Mobile reception of digital video broadcasting—terrestrial services
US8514931B2 (en) * 2009-03-20 2013-08-20 Ecole Polytechnique Federale De Lausanne (Epfl) Method of providing scalable video coding (SVC) video content with added media content
WO2010110770A1 (en) * 2009-03-25 2010-09-30 Thomson Licensing Method and apparatus for scalable content multicast over a hybrid network
JP5072893B2 (ja) * 2009-03-25 2012-11-14 株式会社東芝 画像符号化方法および画像復号化方法
US20100250763A1 (en) * 2009-03-31 2010-09-30 Nokia Corporation Method and Apparatus for Transmitting Information on Operation Points
US20100250764A1 (en) * 2009-03-31 2010-09-30 Nokia Corporation Method and Apparatus for Signaling Layer Information of Scalable Media Data
CN101552913B (zh) 2009-05-12 2011-07-06 腾讯科技(深圳)有限公司 多路视频通讯系统及处理方法
US9264248B2 (en) 2009-07-02 2016-02-16 Qualcomm Incorporated System and method for avoiding and resolving conflicts in a wireless mobile display digital interface multicast environment
WO2011003231A1 (zh) * 2009-07-06 2011-01-13 华为技术有限公司 一种可伸缩视频编码文件的传输方法、接收方法及装置
FR2948249B1 (fr) * 2009-07-20 2011-09-23 Canon Kk Procedes et dispositifs d'estimation d'un niveau d'utilisation d'un reseau de communication et d'adaptation d'un niveau d'abonnements a des groupes multipoints
KR101452859B1 (ko) 2009-08-13 2014-10-23 삼성전자주식회사 움직임 벡터를 부호화 및 복호화하는 방법 및 장치
KR20110017719A (ko) 2009-08-14 2011-02-22 삼성전자주식회사 비디오 부호화 방법 및 장치, 비디오 복호화 방법 및 장치
US9288010B2 (en) 2009-08-19 2016-03-15 Qualcomm Incorporated Universal file delivery methods for providing unequal error protection and bundled file delivery services
US9917874B2 (en) 2009-09-22 2018-03-13 Qualcomm Incorporated Enhanced block-request streaming using block partitioning or request controls for improved client-side handling
KR101282190B1 (ko) * 2009-12-11 2013-07-04 한국전자통신연구원 적응형 보안 정책 기반의 스케일러블 영상 서비스 방법 및 장치
US9582238B2 (en) 2009-12-14 2017-02-28 Qualcomm Incorporated Decomposed multi-stream (DMS) techniques for video display systems
TWI403951B (zh) * 2010-01-12 2013-08-01 Pegatron Corp 觸控式電子裝置
PL2526674T3 (pl) 2010-01-18 2017-09-29 Telefonaktiebolaget Lm Ericsson (Publ) Sposób i urządzenie dla wsparcia odtwarzania treści
US8908774B2 (en) * 2010-02-11 2014-12-09 Mediatek Inc. Method and video receiving system for adaptively decoding embedded video bitstream
US9485546B2 (en) 2010-06-29 2016-11-01 Qualcomm Incorporated Signaling video samples for trick mode video representations
US8918533B2 (en) 2010-07-13 2014-12-23 Qualcomm Incorporated Video switching for streaming video data
US9185439B2 (en) 2010-07-15 2015-11-10 Qualcomm Incorporated Signaling data for multiplexing video components
US9596447B2 (en) 2010-07-21 2017-03-14 Qualcomm Incorporated Providing frame packing type information for video coding
TWI399083B (zh) * 2010-07-28 2013-06-11 Compal Communication Inc 具遙控功能之無線通訊系統及其無線通訊模組
US9319448B2 (en) 2010-08-10 2016-04-19 Qualcomm Incorporated Trick modes for network streaming of coded multimedia data
FR2966679A1 (fr) * 2010-10-25 2012-04-27 France Telecom Procedes et dispositifs de codage et de decodage d'au moins une image a partir d'un epitome, signal et programme d'ordinateur correspondants
WO2012096981A1 (en) * 2011-01-14 2012-07-19 Vidyo, Inc. Improved nal unit header
US9413803B2 (en) 2011-01-21 2016-08-09 Qualcomm Incorporated User input back channel for wireless displays
US8964783B2 (en) 2011-01-21 2015-02-24 Qualcomm Incorporated User input back channel for wireless displays
US9787725B2 (en) 2011-01-21 2017-10-10 Qualcomm Incorporated User input back channel for wireless displays
US9065876B2 (en) 2011-01-21 2015-06-23 Qualcomm Incorporated User input back channel from a wireless sink device to a wireless source device for multi-touch gesture wireless displays
US10135900B2 (en) 2011-01-21 2018-11-20 Qualcomm Incorporated User input back channel for wireless displays
US20130013318A1 (en) 2011-01-21 2013-01-10 Qualcomm Incorporated User input back channel for wireless displays
US9503771B2 (en) 2011-02-04 2016-11-22 Qualcomm Incorporated Low latency wireless display for graphics
US10108386B2 (en) 2011-02-04 2018-10-23 Qualcomm Incorporated Content provisioning for wireless back channel
US8674957B2 (en) 2011-02-04 2014-03-18 Qualcomm Incorporated User input device for wireless back channel
US8958375B2 (en) 2011-02-11 2015-02-17 Qualcomm Incorporated Framing for an improved radio link protocol including FEC
US9270299B2 (en) 2011-02-11 2016-02-23 Qualcomm Incorporated Encoding and decoding using elastic codes with flexible source block mapping
US8848804B2 (en) * 2011-03-04 2014-09-30 Vixs Systems, Inc Video decoder with slice dependency decoding and methods for use therewith
EP2684371A4 (en) * 2011-03-10 2015-02-25 Vidyo Inc SIGNALING A NUMBER OF ACTIVE LAYERS IN A VIDEO CODING OPERATION
TWI425442B (zh) * 2011-04-14 2014-02-01 Univ Nat Central Method of Reconstructing Three - dimensional Housing Model on Aeronautical Mapping System
TWI482502B (zh) * 2011-04-19 2015-04-21 Etron Technology Inc 影像互動裝置、互動式影像操作系統、及相關互動式影像操作方法
US9253233B2 (en) 2011-08-31 2016-02-02 Qualcomm Incorporated Switch signaling methods providing improved switching between representations for adaptive HTTP streaming
US9843844B2 (en) 2011-10-05 2017-12-12 Qualcomm Incorporated Network streaming of media data
KR20130058584A (ko) * 2011-11-25 2013-06-04 삼성전자주식회사 복호화기의 버퍼 관리를 위한 영상 부호화 방법 및 장치, 그 영상 복호화 방법 및 장치
US9525998B2 (en) 2012-01-06 2016-12-20 Qualcomm Incorporated Wireless display with multiscreen service
US9294226B2 (en) 2012-03-26 2016-03-22 Qualcomm Incorporated Universal object delivery and template-based file delivery
US9578326B2 (en) * 2012-04-04 2017-02-21 Qualcomm Incorporated Low-delay video buffering in video coding
ES2789024T3 (es) * 2012-04-12 2020-10-23 Velos Media Int Ltd Gestión de datos de extensión
PL2842318T3 (pl) 2012-04-13 2017-06-30 Ge Video Compression, Llc Kodowanie obrazu z małym opóźnieniem
KR20130116782A (ko) 2012-04-16 2013-10-24 한국전자통신연구원 계층적 비디오 부호화에서의 계층정보 표현방식
PL2843945T3 (pl) * 2012-04-23 2020-07-27 Sun Patent Trust Sposób kodowania obrazów, sposób dekodowania obrazów, urządzenie do kodowania obrazów, urządzenie do dekodowania obrazów oraz urządzenie do kodowania/dekodowania obrazów
US9762903B2 (en) * 2012-06-01 2017-09-12 Qualcomm Incorporated External pictures in video coding
EP3151566B1 (en) * 2012-06-29 2021-03-31 GE Video Compression, LLC Video data stream concept
AU2013285333A1 (en) * 2012-07-02 2015-02-05 Nokia Technologies Oy Method and apparatus for video coding
US20140003534A1 (en) * 2012-07-02 2014-01-02 Sony Corporation Video coding system with temporal scalability and method of operation thereof
US9635369B2 (en) * 2012-07-02 2017-04-25 Qualcomm Incorporated Video parameter set including HRD parameters
TWI482494B (zh) * 2012-07-09 2015-04-21 Wistron Corp 頻道資訊提示方法及系統以及電腦可讀取儲存媒體
CN103546826B (zh) * 2012-07-16 2017-07-21 上海贝尔股份有限公司 视频业务的传输方法和装置
US9357272B2 (en) 2012-08-03 2016-05-31 Intel Corporation Device orientation capability exchange signaling and server adaptation of multimedia content in response to device orientation
US9503753B2 (en) * 2012-09-24 2016-11-22 Qualcomm Incorporated Coded picture buffer arrival and nominal removal times in video coding
US8989508B2 (en) * 2012-09-28 2015-03-24 Sharp Kabushiki Kaisha Electronic device for signaling a sub-picture buffer parameter
US9781413B2 (en) * 2012-10-02 2017-10-03 Qualcomm Incorporated Signaling of layer identifiers for operation points
CN104854872B (zh) * 2012-12-13 2018-07-20 索尼公司 发送装置、传输方法、接收装置以及接收方法
EP2936809B1 (en) * 2012-12-21 2016-10-19 Telefonaktiebolaget LM Ericsson (publ) Multi-layer video stream decoding
US9774927B2 (en) 2012-12-21 2017-09-26 Telefonaktiebolaget L M Ericsson (Publ) Multi-layer video stream decoding
ES2648970T3 (es) * 2012-12-21 2018-01-09 Telefonaktiebolaget Lm Ericsson (Publ) Codificación y decodificación de flujo de video multicapa
US10805605B2 (en) * 2012-12-21 2020-10-13 Telefonaktiebolaget Lm Ericsson (Publ) Multi-layer video stream encoding and decoding
US9294777B2 (en) * 2012-12-30 2016-03-22 Qualcomm Incorporated Progressive refinement with temporal scalability support in video coding
CN104904211B (zh) * 2013-01-04 2018-06-12 索尼公司 Jctvc-l0227:带有profile-tier-level语法结构的更新的vps_extension
US10419778B2 (en) 2013-01-04 2019-09-17 Sony Corporation JCTVC-L0227: VPS_extension with updates of profile-tier-level syntax structure
US9661341B2 (en) 2013-01-07 2017-05-23 Microsoft Technology Licensing, Llc Syntax and semantics for buffering information to simplify video splicing
GB2509954B (en) * 2013-01-18 2016-03-23 Canon Kk Method of displaying a region of interest in a video stream
GB2509953B (en) * 2013-01-18 2015-05-20 Canon Kk Method of displaying a region of interest in a video stream
US9516306B2 (en) * 2013-03-27 2016-12-06 Qualcomm Incorporated Depth coding modes signaling of depth data for 3D-HEVC
US20140301463A1 (en) * 2013-04-05 2014-10-09 Nokia Corporation Method and apparatus for video coding and decoding
KR20140122191A (ko) * 2013-04-05 2014-10-17 삼성전자주식회사 멀티 레이어 비디오 부호화 방법 및 장치, 멀티 레이어 비디오 복호화 방법 및 장치
US9467700B2 (en) 2013-04-08 2016-10-11 Qualcomm Incorporated Non-entropy encoded representation format
WO2014175919A1 (en) 2013-04-26 2014-10-30 Intel IP Corporation Shared spectrum reassignment in a spectrum sharing context
JP6271888B2 (ja) * 2013-07-12 2018-01-31 キヤノン株式会社 画像符号化装置、画像符号化方法及びプログラム、画像復号装置、画像復号方法及びプログラム
HK1222067A1 (zh) * 2013-07-14 2017-06-16 夏普株式会社 瓦片对齐信令和一致性约束
JP6330667B2 (ja) * 2013-08-09 2018-05-30 ソニー株式会社 送信装置、送信方法、受信装置、受信方法、符号化装置および符号化方法
WO2015047162A1 (en) * 2013-09-26 2015-04-02 Telefonaktiebolaget L M Ericsson (Publ) Hybrid codec scalable video
KR102246546B1 (ko) 2013-10-12 2021-04-30 삼성전자주식회사 멀티 레이어 비디오 부호화 방법 및 장치, 멀티 레이어 비디오 복호화 방법 및 장치
GB2542282B (en) 2013-10-22 2018-08-01 Canon Kk Method, device, and computer program for encapsulating partitioned timed media data in a server
US9712843B2 (en) * 2013-10-23 2017-07-18 Qualcomm Incorporated Multi-layer video file format designs
CN112887737B (zh) * 2014-01-03 2024-04-02 康普英国有限公司 用于hevc扩展处理的条件解析扩展语法
US9386275B2 (en) * 2014-01-06 2016-07-05 Intel IP Corporation Interactive video conferencing
EP3713234A1 (en) * 2014-01-07 2020-09-23 Canon Kabushiki Kaisha Method, device, and computer program for encoding inter-layer dependencies in encapsulating multi-layer partitioned timed media data
WO2015140401A1 (en) 2014-03-17 2015-09-24 Nokia Technologies Oy An apparatus, a method and a computer program for video coding and decoding
CN105163120B (zh) * 2014-06-09 2018-09-25 浙江大学 一种假设解码器中输入码流缓冲区的输入和输出/从缓冲区获取数据的方法及装置、传输视频码流的方法
US9516220B2 (en) 2014-10-02 2016-12-06 Intel Corporation Interactive video conferencing
US10021346B2 (en) 2014-12-05 2018-07-10 Intel IP Corporation Interactive video conferencing
JP6729548B2 (ja) * 2015-02-27 2020-07-22 ソニー株式会社 送信装置、送信方法、受信装置および受信方法
GB2538997A (en) * 2015-06-03 2016-12-07 Nokia Technologies Oy A method, an apparatus, a computer program for video coding
US10810701B2 (en) 2016-02-09 2020-10-20 Sony Interactive Entertainment Inc. Video display system
US9924131B1 (en) 2016-09-21 2018-03-20 Samsung Display Co., Ltd. System and method for automatic video scaling
US11979340B2 (en) 2017-02-12 2024-05-07 Mellanox Technologies, Ltd. Direct data placement
WO2018186550A1 (ko) * 2017-04-05 2018-10-11 엘지전자 주식회사 방송 신호 송수신 방법 및 장치
US12058309B2 (en) 2018-07-08 2024-08-06 Mellanox Technologies, Ltd. Application accelerator
US11252464B2 (en) * 2017-06-14 2022-02-15 Mellanox Technologies, Ltd. Regrouping of video data in host memory
US20180367589A1 (en) * 2017-06-14 2018-12-20 Mellanox Technologies, Ltd. Regrouping of video data by a network interface controller
US20200014945A1 (en) 2018-07-08 2020-01-09 Mellanox Technologies, Ltd. Application acceleration
ES2981568T3 (es) * 2019-01-09 2024-10-09 Huawei Tech Co Ltd Señalización indicadora de nivel de subimagen en codificación de vídeo
KR102855238B1 (ko) * 2019-02-01 2025-09-05 소니그룹주식회사 복호 장치, 복호 방법 및 프로그램
US10846551B2 (en) * 2019-02-06 2020-11-24 Apical Limited Video data processing
AU2020234972B2 (en) * 2019-03-11 2025-10-23 Interdigital Vc Holdings, Inc. Sub-picture bitstream extraction and reposition
US12238273B2 (en) 2019-12-03 2025-02-25 Mellanox Technologies, Ltd Video coding system
EP4114005A4 (en) * 2020-03-18 2023-08-09 LG Electronics, Inc. POINT CLOUD DATA TRANSMISSION DEVICE, POINT CLOUD DATA TRANSMISSION METHOD, POINT CLOUD DATA RECEIVING DEVICE AND POINT CLOUD DATA RECEIVING METHOD
CN112565815B (zh) * 2020-10-16 2022-05-24 腾讯科技(深圳)有限公司 文件封装方法、文件传输方法、文件解码方法及相关设备
US20240196015A1 (en) * 2021-04-23 2024-06-13 Lg Electronics Inc. Image encoding/decoding method and device based on sei message including layer identifier information, and method for transmitting bitstream
US12339902B2 (en) 2021-10-05 2025-06-24 Mellanox Technologies, Ltd Hardware accelerated video encoding
US12135662B2 (en) 2022-07-06 2024-11-05 Mellanox Technologies, Ltd. Patterned direct memory access (DMA)
US12137141B2 (en) 2022-07-06 2024-11-05 Mellanox Technologies, Ltd. Patterned remote direct memory access (RDMA)
US12216575B2 (en) 2022-07-06 2025-02-04 Mellanox Technologies, Ltd Patterned memory-network data transfer
US20250119567A1 (en) * 2023-10-05 2025-04-10 Tencent America LLC Sei message supporting decoder picture-based parallelization

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1031540C (zh) 1990-09-19 1996-04-10 菲利浦光灯制造公司 记录载体、主数据和控制文件的记录方法和装置及读出装置
US6292512B1 (en) * 1998-07-06 2001-09-18 U.S. Philips Corporation Scalable video coding system
US6263022B1 (en) 1999-07-06 2001-07-17 Philips Electronics North America Corp. System and method for fine granular scalable video with selective quality enhancement
US6639943B1 (en) * 1999-11-23 2003-10-28 Koninklijke Philips Electronics N.V. Hybrid temporal-SNR fine granular scalability video coding
KR20030005166A (ko) * 2000-11-23 2003-01-17 코닌클리케 필립스 일렉트로닉스 엔.브이. 비디오 코딩 방법 및 대응하는 인코더
US6904035B2 (en) * 2000-11-29 2005-06-07 Nokia Corporation Mobile system, terminal and interface, as well as methods for providing backward compatibility to first and second generation mobile systems
KR100491445B1 (ko) 2002-04-12 2005-05-25 한국과학기술원 Mpeg-4 fgs 비디오를 위한 사각영역 기반형의선택적 향상기법에 의한 부호화/복호화 방법 및 장치
AU2003237120B2 (en) 2002-04-29 2008-10-09 Sony Electronics, Inc. Supporting advanced coding formats in media files
ATE435567T1 (de) * 2003-08-29 2009-07-15 Koninkl Philips Electronics Nv System und verfahren zur codierung und decodierung von daten der verbesserungsebene durch verwendung deskriptiver modellparameter
KR20050042399A (ko) * 2003-11-03 2005-05-09 삼성전자주식회사 게이즈 디텍션을 이용한 비디오 데이터 처리 장치 및 방법
WO2005055605A1 (en) * 2003-12-03 2005-06-16 Koninklijke Philips Electronics N.V. System and method for improved scalability support in mpeg-2 systems
US7586924B2 (en) * 2004-02-27 2009-09-08 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Apparatus and method for coding an information signal into a data stream, converting the data stream and decoding the data stream
US20050254575A1 (en) * 2004-05-12 2005-11-17 Nokia Corporation Multiple interoperability points for scalable media coding and transmission
US7801220B2 (en) * 2005-01-07 2010-09-21 Microsoft Corporation In-band wavelet video coding with spatial scalability

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6501797B1 (en) * 1999-07-06 2002-12-31 Koninklijke Phillips Electronics N.V. System and method for improved fine granular scalable video using base layer coding information
WO2003063505A1 (en) * 2002-01-23 2003-07-31 Nokia Corporation Grouping of image frames in video coding
US20040006575A1 (en) 2002-04-29 2004-01-08 Visharam Mohammed Zubair Method and apparatus for supporting advanced coding formats in media files
US20040139462A1 (en) * 2002-07-15 2004-07-15 Nokia Corporation Method for error concealment in video sequences

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
"Applications and Requirements for Scalable Video Coding", 71TH MPEG MEETING, January 2005 (2005-01-01)
"Description of Core Experiments in SVC", 14. JVT MEETING; 71. MPEG MEETING, 18 January 2005 (2005-01-18)
"Scalable video coding - working draft 1", JVT-N020, January 2005 (2005-01-01)
"Working Draft 1.0 of 14496-10:200x/AMD1 Scalable Video Coding", MPEG DOCUMENT W6901, January 2005 (2005-01-01)
See also references of EP1869891A4 *
WEIPING LI: "Overview of fine granularity scalability in MPEG-4 video standard", IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, vol. 11, no. 3, March 2001 (2001-03-01), pages 301 - 317

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1878254A4 (en) * 2005-04-13 2011-05-18 Nokia Corp FGS IDENTIFICATION IN SCALABLE VIDEO CODING
US9426499B2 (en) 2005-07-20 2016-08-23 Vidyo, Inc. System and method for scalable and low-delay videoconferencing using scalable video coding
US8442109B2 (en) 2006-07-12 2013-05-14 Nokia Corporation Signaling of region-of-interest scalability information in media files
KR101088772B1 (ko) * 2006-10-20 2011-12-01 노키아 코포레이션 스케일러블 멀티미디어의 적응 경로들에 대한 포괄적 표시
WO2008047319A1 (en) 2006-10-20 2008-04-24 Nokia Corporation Generic indication of adaptation paths for scalable multimedia
EP2080383A4 (en) * 2006-10-20 2009-12-09 Nokia Corp GENERIC INDICATION OF ADJUSTMENT GUIDE FOR SCALABLE MULTIMEDIA
JP2010507310A (ja) * 2006-10-20 2010-03-04 ノキア コーポレイション ビデオの符号化においてピクチャ出力インジケータを提供するためのシステムおよび方法
US9807431B2 (en) 2006-10-20 2017-10-31 Nokia Technologies Oy Generic indication of adaptation paths for scalable multimedia
TWI471015B (zh) * 2006-10-20 2015-01-21 Nokia Corp 用於可縮放多媒體之適應路徑通用指示技術
US9319717B2 (en) 2007-01-08 2016-04-19 Nokia Technologies Oy System and method for providing and using predetermined signaling of interoperability points for transcoded media streams
US8355448B2 (en) 2007-01-18 2013-01-15 Nokia Corporation Carriage of SEI messages in RTP payload format
RU2430483C2 (ru) * 2007-01-18 2011-09-27 Нокиа Корпорейшн Передача сообщений дополнительной расширенной информации в формате полезной нагрузки транспортного протокола реального времени
US9451289B2 (en) 2007-01-18 2016-09-20 Nokia Technologies Oy Carriage of SEI messages in RTP payload format
US10110924B2 (en) 2007-01-18 2018-10-23 Nokia Technologies Oy Carriage of SEI messages in RTP payload format
US8908770B2 (en) 2007-01-18 2014-12-09 Nokia Corporation Carriage of SEI messages in RTP payload format
US8938012B2 (en) 2007-04-13 2015-01-20 Nokia Corporation Video coder
US11412265B2 (en) 2007-04-18 2022-08-09 Dolby Laboratories Licensing Corporaton Decoding multi-layer images
US10863203B2 (en) 2007-04-18 2020-12-08 Dolby Laboratories Licensing Corporation Decoding multi-layer images
US8619871B2 (en) 2007-04-18 2013-12-31 Thomson Licensing Coding systems
US8854427B2 (en) 2007-09-24 2014-10-07 Koninklijke Philips N.V. Method and system for encoding a video data signal, encoded video data signal, method and system for decoding a video data signal
JP2011515874A (ja) * 2007-09-24 2011-05-19 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ ビデオデータ信号を符号化するための方法及びシステム、符号化されたビデオデータ信号、ビデオデータ信号を復号するための方法及びシステム
WO2009040701A3 (en) * 2007-09-24 2009-05-22 Koninkl Philips Electronics Nv Method and system for encoding a video data signal, encoded video data signal, method and system for decoding a video data signal
US10904509B2 (en) 2007-09-24 2021-01-26 Koninklijke Philips N.V. Method and system for encoding a video data signal, encoded video data signal, method and system for decoding a video data signal
US11677924B2 (en) 2007-09-24 2023-06-13 Koninklijke Philips N.V. Method and system for encoding a video data signal, encoded video data signal, method and system for decoding a video data signal
JP2010541471A (ja) * 2007-10-05 2010-12-24 トムソン ライセンシング マルチビュー・ビデオ(mvc)コーディング・システムにビデオ・ユーザビリティ情報(vui)を組み込むための方法と装置
JP2010541470A (ja) * 2007-10-05 2010-12-24 トムソン ライセンシング マルチビュービデオ(mvc)符号化システム内にビデオユーザビリティ情報(vui)を取り込む方法及び装置
RU2844691C2 (ru) * 2008-03-28 2025-08-05 Долби Интернэшнл Аб Декодирующее устройство
EP2329610A4 (en) * 2008-09-29 2012-08-08 Samsung Electronics Co Ltd METHOD AND DEVICE FOR PROVIDING RICH MEDIA SERVICES
WO2010036085A2 (en) 2008-09-29 2010-04-01 Samsung Electronics Co., Ltd. Method and apparatus for providing rich media service
EP2364021A3 (en) * 2010-03-05 2013-05-15 Canon Kabushiki Kaisha Image processing apparatus capable of extracting frame image data from video data and method for controlling the same
US9277270B2 (en) 2010-03-05 2016-03-01 Canon Kabushiki Kaisha Image processing apparatus capable of extracting frame image data from video data and method for controlling the same
US9253240B2 (en) 2010-07-20 2016-02-02 Qualcomm Incorporated Providing sequence data sets for streaming video data
US9131033B2 (en) 2010-07-20 2015-09-08 Qualcomm Incoporated Providing sequence data sets for streaming video data
US9648317B2 (en) 2012-01-30 2017-05-09 Qualcomm Incorporated Method of coding video and storing video content
US10958915B2 (en) 2012-01-30 2021-03-23 Qualcomm Incorporated Method of coding video and storing video content
US10582183B2 (en) 2012-04-03 2020-03-03 Sun Patent Trust Image encoding method, image decoding method, image encoding device, and image decoding device
US9693032B2 (en) 2012-04-03 2017-06-27 Sun Patent Trust Image encoding method, image decoding method, image encoding device, and image decoding device
US10027943B2 (en) 2012-04-03 2018-07-17 Sun Patent Trust Image encoding method, image decoding method, image encoding device, and image decoding device
US9313486B2 (en) 2012-06-20 2016-04-12 Vidyo, Inc. Hybrid video coding techniques
CN104412598A (zh) * 2012-07-06 2015-03-11 夏普株式会社 发信号通知基于子图片的假想参考解码器参数的电子设备
RU2635892C2 (ru) * 2012-09-28 2017-11-16 Квэлкомм Инкорпорейтед Сигнализация идентификаторов уровней для рабочих точек при кодировании видео
US9973782B2 (en) 2012-09-28 2018-05-15 Qualcomm Incorporated Signaling layer identifiers for operation points in video coding
US9936196B2 (en) 2012-10-30 2018-04-03 Qualcomm Incorporated Target output layers in video coding
JP2024105403A (ja) * 2013-04-08 2024-08-06 ジーイー ビデオ コンプレッション エルエルシー 効率的なマルチビュー/レイヤ符号化を可能とする符号化コンセプト
JP7791241B2 (ja) 2013-04-08 2025-12-23 ドルビー・ビデオ・コンプレッション・リミテッド・ライアビリティ・カンパニー 効率的なマルチビュー/レイヤ符号化を可能とする符号化コンセプト
GB2587365A (en) * 2019-09-24 2021-03-31 Canon Kk Method, device, and computer program for coding and decoding a picture
GB2587365B (en) * 2019-09-24 2023-02-22 Canon Kk Method, device, and computer program for coding and decoding a picture
WO2021234125A1 (en) * 2020-05-22 2021-11-25 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Subpicture-related video coding concepts
US12395651B2 (en) 2020-05-22 2025-08-19 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Subpicture-related video coding concepts

Also Published As

Publication number Publication date
RU2377736C2 (ru) 2009-12-27
TW200704191A (en) 2007-01-16
JP2008536420A (ja) 2008-09-04
KR20080006609A (ko) 2008-01-16
EP1869891A1 (en) 2007-12-26
CA2604203A1 (en) 2006-10-19
US9332254B2 (en) 2016-05-03
US20060256851A1 (en) 2006-11-16
EP1869891A4 (en) 2014-06-11
RU2007141755A (ru) 2009-05-20
US8774266B2 (en) 2014-07-08
MX2007012564A (es) 2007-11-15
CN101120593A (zh) 2008-02-06
US20140307802A1 (en) 2014-10-16

Similar Documents

Publication Publication Date Title
US9332254B2 (en) Coding, storage and signalling of scalability information
EP1747673B1 (en) Multiple interoperability points for scalable media coding and transmission
EP1977604B1 (en) Method for a backward -compatible encapsulation of a scalable coded video signal into a sequence of aggregate data units
US12355991B2 (en) Low complexity enhancement video coding
WO2008084424A1 (en) System and method for providing and using predetermined signaling of interoperability points for transcoded media streams
WO2015029754A1 (ja) 送信装置、送信方法、受信装置および受信方法
US20160212435A1 (en) Transmission device, transmission method, reception device, and reception method
JP7230981B2 (ja) 受信装置および受信方法
JP2018011341A (ja) 送信装置、送信方法、受信装置および受信方法
JP5905148B2 (ja) 送信装置、送信方法、受信装置および受信方法
HK1127205B (en) Backward-compatible aggregation of pictures in scalable video coding
JP2015092746A (ja) 送信装置、送信方法、受信装置および受信方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 5456/DELNP/2007

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 2006725911

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 200680004747.8

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2008505913

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: MX/a/2007/012564

Country of ref document: MX

WWE Wipo information: entry into national phase

Ref document number: 12007502236

Country of ref document: PH

Ref document number: 2604203

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

WWE Wipo information: entry into national phase

Ref document number: 1020077026308

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 1200702393

Country of ref document: VN

Ref document number: 2007141755

Country of ref document: RU

WWP Wipo information: published in national office

Ref document number: 2006725911

Country of ref document: EP