WO2006047448A2 - Supporting fidelity range extensions in advanced video codec file format - Google Patents

Supporting fidelity range extensions in advanced video codec file format

Info

Publication number
WO2006047448A2
Authority
WO
Grant status
Application
Patent type
Prior art keywords
data
metadata
parameter
file
set
Prior art date
Application number
PCT/US2005/038255
Other languages
French (fr)
Other versions
WO2006047448A3 (en)
Inventor
Mohammed Zubair Visharam
Ali Tabatabai
Original Assignee
Sony Electronics, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process

Abstract

A parameter set is created to specify chroma format, luma bit depth, and chroma bit depth for a portion of multimedia data (See Fig. 5, item 500). The parameter set is encoded into a metadata file that is associated with the multimedia data (See Fig. 5, item 502). The parameter set is extracted from the metadata file if a decoder configuration record contains fields corresponding to the parameter set (See Fig. 5, item 504). In another aspect, the decoder configuration record is created with fields corresponding to the parameter set (See Fig. 5, item 506).

Description

SUPPORTING FIDELITY RANGE EXTENSIONS IN ADVANCED VIDEO CODEC FILE FORMAT

RELATED APPLICATIONS

[0001] This application is related to U.S. Patent Application numbers 10/371,434, 10/371,438, 10/371,464, and 10/371,927, all filed on February 21, 2003, and 10/425,291 and 10/425,685, both filed on April 28, 2003, all of which are assigned to the same assignees as the present application.

FIELD OF THE INVENTION

[0002] The invention relates generally to the storage and retrieval of audiovisual content in a multimedia file format and particularly to file formats compatible with the ISO media file format.

COPYRIGHT NOTICE/PERMISSION

[0003] A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings hereto: Copyright © 2003, Sony Electronics, Inc., All Rights Reserved.

BACKGROUND OF THE INVENTION

[0004] In the wake of rapidly increasing demand for network, multimedia, database and other digital capacity, many multimedia coding and storage schemes have evolved. One of the well known file formats for encoding and storing audiovisual data is the QuickTime® file format developed by Apple Computer Inc. The QuickTime file format was used as the starting point for creating the International Organization for Standardization (ISO) Multimedia file format, ISO/IEC 14496-12, Information Technology - Coding of audio-visual objects - Part 12: ISO Media File Format (also known as the ISO file format). The ISO file format was, in turn, used as a template for two standard file formats: (1) the MPEG-4 file format developed by the Moving Picture Experts Group, known as MP4 (ISO/IEC 14496-14, Information Technology - Coding of audio-visual objects - Part 14: MP4 File Format); and (2) a file format for JPEG 2000 (ISO/IEC 15444-1), developed by the Joint Photographic Experts Group (JPEG).

[0005] The ISO media file format is a hierarchical data structure. The data structures contain metadata providing declarative, structural and temporal information about the actual media data. The media data itself may be located within the same file as the metadata or externally in a different file. Each metadata stream is called a track. The metadata within this track contains the structural information providing references to the externally framed media data.

[0006] The media data referred to by a metadata track can be of various types (e.g., video data, audio data, binary format screen representations (BIFS), etc.). The externally framed media data is divided into samples (also known as access units or pictures). A sample represents a unit of media data at a particular time point and is the smallest data entity which can be represented by timing, location, and other metadata information. Each metadata track thereby contains various sample entries and descriptions which provide information about the type of media data being referred to, followed by its timing, location, and size information.

[0007] Subsequently, MPEG's video group and the Video Coding Experts Group (VCEG) of the International Telecommunication Union (ITU) began working together as a Joint Video Team (JVT) to develop a new video coding/decoding (codec) standard. The new standard is referred to as both ITU-T Recommendation H.264 and MPEG-4 Part 10, Advanced Video Codec (AVC). The encapsulation methods defined in the AVC file format can be used to store the coded video data created by these specifications.

[0008] The JVT codec design distinguishes between two conceptual layers, the Video Coding Layer (VCL) and the Network Abstraction Layer (NAL). The VCL contains the coding-related parts of the codec, such as motion compensation, transform coding of coefficients, and entropy coding. The output of the VCL is slices, each of which contains a series of video macroblocks and associated header information. The NAL abstracts the VCL from the details of the transport layer used to carry the VCL data. The NAL defines a generic and transport-independent representation for information, and defines the interface between the video codec itself and the outside world. The JVT codec design specifies a set of NAL units, each of which contains a different type of data.

[0009] In many existing video coding formats, the coded stream data includes various kinds of headers containing parameters that control the decoding process. For example, the MPEG-2 video standard includes sequence headers, group of pictures (GOP) headers, and picture headers before the video data corresponding to those items. In JVT, the information needed to decode VCL data is grouped into parameter sets, and JVT defines a NAL unit that transports the parameter sets to the decoder. The parameter set NAL units may be sent in the same stream as the video NAL units (in-band) or in a separate stream (out-of-band).
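The NAL unit structure described above can be illustrated with a short sketch. In H.264, each NAL unit begins with a one-byte header whose low five bits give the unit type; types 7 and 8 identify the sequence and picture parameter set units that carry decoding parameters. The helper name below is illustrative, not taken from any specification:

```python
def parse_nal_header(first_byte: int) -> dict:
    """Split the one-byte H.264 NAL unit header into its three fields."""
    return {
        "forbidden_zero_bit": (first_byte >> 7) & 0x1,  # must be 0 in a conforming stream
        "nal_ref_idc": (first_byte >> 5) & 0x3,         # importance for reference prediction
        "nal_unit_type": first_byte & 0x1F,             # 7 = sequence parameter set, 8 = picture parameter set
    }

# A sequence parameter set NAL unit commonly starts with the byte 0x67:
header = parse_nal_header(0x67)  # nal_unit_type == 7
```

A decoder routing NAL units to the parameter-set handler or the slice decoder would branch on `nal_unit_type` in exactly this way.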

[0010] The originally adopted H.264 Recommendation/AVC specification defined three basic feature sets called profiles: baseline, main, and extended. These profiles supported only video samples having 8 bits per sample and the chroma format YUV 4:2:0 used in consumer video such as television, DVD, streaming video, etc. Several new profiles, collectively called the fidelity range extensions (FRExt), were subsequently created to allow storage and management of professional video formats. FRExt specifies higher bit depth encoding, including 10 bit and 12 bit video samples, and additional chroma sampling formats, such as YUV 4:2:2 and 4:4:4. FRExt also specifies additional color spaces, such as the International Commission on Illumination (CIE) XYZ and RGB (red, green, blue) color spaces, in addition to the previously supported YCbCr (luma, chroma-blue, chroma-red) color space.

[0011] Although the JVT team adopted the fidelity range extensions into their specifications, the H.264 Recommendation/AVC specification itself does not define how the existing AVC file format is to be modified to incorporate the new parameters associated with the extensions.

SUMMARY OF THE INVENTION

[0012] A parameter set is created to specify chroma format, luma bit depth, and chroma bit depth for a portion of multimedia data. The parameter set is encoded into a metadata file that is associated with the multimedia data. The parameter set is extracted from the metadata file if a decoder configuration record contains fields corresponding to the parameter set. In another aspect, the decoder configuration record is created with fields corresponding to the parameter set.

BRIEF DESCRIPTION OF THE DRAWINGS

[0013] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:

Figure 1 is a block diagram of one embodiment of an encoding system;

Figure 2 is a block diagram of one embodiment of a decoding system;

Figure 3 is a block diagram of a computer environment suitable for practicing the invention;

Figure 4 is a flow diagram of a method for storing parameter set metadata at an encoding system; and

Figure 5 is a flow diagram of a method for utilizing parameter set metadata at a decoding system.

DETAILED DESCRIPTION OF THE INVENTION

[0014] In the following detailed description of embodiments of the invention, reference is made to the accompanying drawings in which like references indicate similar elements, and in which is shown, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical, functional and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.

[0015] To support the fidelity range extensions set forth in the AVC specification, the decoder configuration record in the AVC file format is extended to specify the chroma format, luma bit depth, and chroma bit depth for a portion of multimedia data. The parameter set associated with a FRExt profile is encoded into a metadata file that is associated with the multimedia data. The parameter set is extracted from the metadata file if the decoder configuration record contains fields corresponding to the presence of FRExt data.

[0016] Beginning with an overview of the operation of the invention, Figure 1 illustrates one embodiment of an encoding system 100 that generates parameter set metadata. The encoding system 100 includes a media encoder 104, a metadata generator 106 and a file creator 108. The media encoder 104 receives media data that may include video data (e.g., video objects created from a natural source video scene and other external video objects), audio data (e.g., audio objects created from a natural source audio scene and other external audio objects), synthetic objects, or any combination of the above. The media encoder 104 may consist of a number of individual encoders or include sub-encoders to process various types of media data. The media encoder 104 codes the media data and passes it to the metadata generator 106. The metadata generator 106 generates metadata that provides information about the media data. For AVC, the metadata is formatted as parameter set NAL units.

[0017] The file creator 108 stores the metadata in a file whose structure is defined by the media file format. The media file format may specify that the metadata is stored in-band, or entirely or partially out-of-band. Coded media data is linked to the out-of-band metadata by references contained in the metadata file (e.g., via URLs). The file created by the file creator 108 is available on a channel 110 for storage or transmission.

[0018] Figure 2 illustrates one embodiment of a decoding system 200 that extracts parameter set metadata. The decoding system 200 includes a metadata extractor 204, a media data stream processor 206, a media decoder 210, a compositor 212 and a renderer 214. The decoding system 200 may reside on a client device and be used for local playback. Alternatively, the decoding system 200 may be used for streaming data, with a server portion and a client portion communicating with each other over a network (e.g., the Internet) 208. The server portion may include the metadata extractor 204 and the media data stream processor 206. The client portion may include the media decoder 210, the compositor 212 and the renderer 214.

[0019] The metadata extractor 204 is responsible for extracting metadata from a file stored in a database 216 or received over a network (e.g., from the encoding system 100). A decoder configuration record specifies the metadata that the metadata extractor 204 is capable of handling. Any additional metadata that is not recognized is ignored.

[0020] The extracted metadata is passed to the media data stream processor 206, which also receives the associated coded media data. The media data stream processor 206 uses the metadata to form a media data stream to be sent to the media decoder 210.

[0021] Once the media data stream is formed, it is sent to the media decoder 210 either directly (e.g., for local playback) or over a network 208 (e.g., for streaming data) for decoding. The compositor 212 receives the output of the media decoder 210 and composes a scene which is then rendered on a user display device by the renderer 214.

[0022] The metadata may change between the time it is created and the time it is used to decode a corresponding portion of media data. If such a change occurs, the decoding system 200 receives a metadata update packet specifying the change. The state of the metadata before and after the update is applied is maintained in the metadata.

[0023] The following description of Figure 3 is intended to provide an overview of computer hardware and other operating components suitable for implementing the invention, but is not intended to limit the applicable environments. Figure 3 illustrates one embodiment of a computer system suitable for use as a metadata generator 106 and/or a file creator 108 of Figure 1, or a metadata extractor 204 and/or a media data stream processor 206 of Figure 2.

[0024] The computer system 340 includes a processor 350, memory 355 and input/output capability 360 coupled to a system bus 365. The memory 355 is configured to store instructions which, when executed by the processor 350, perform the methods described herein. Input/output 360 also encompasses various types of machine-readable media, including any type of storage device that is accessible by the processor 350. One of skill in the art will immediately recognize that the term "machine-readable medium/media" further encompasses a carrier wave that encodes a data signal. It will also be appreciated that the system 340 is controlled by operating system software executing in memory 355. Input/output and related media 360 store the computer-executable instructions for the operating system and methods of the present invention. Each of the metadata generator 106, the file creator 108, the metadata extractor 204 and the media data stream processor 206 that are shown in Figures 1 and 2 may be a separate component coupled to the processor 350, or may be embodied in computer-executable instructions executed by the processor 350. In one embodiment, the computer system 340 may be part of, or coupled to, an ISP (Internet Service Provider) through input/output 360 to transmit or receive media data over the Internet. It is readily apparent that the present invention is not limited to Internet access and Internet web-based sites; directly coupled and private networks are also contemplated.

[0025] It will be appreciated that the computer system 340 is one example of many possible computer systems that have different architectures. A typical computer system will usually include at least a processor, memory, and a bus coupling the memory to the processor. One of skill in the art will immediately appreciate that the invention can be practiced with other computer system configurations, including multiprocessor systems, minicomputers, mainframe computers, and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.

[0026] Figures 4 and 5 illustrate processes for storing and retrieving parameter set metadata that are performed by the encoding system 100 and the decoding system 200, respectively. The processes may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, etc.), software (such as that run on a general purpose computer system or a dedicated machine), or a combination of both. For software-implemented processes, the description of a flow diagram enables one skilled in the art to develop such programs, including instructions to carry out the processes on suitably configured computers (the processor of the computer executing the instructions from computer-readable media, including memory). The computer-executable instructions may be written in a computer programming language or may be embodied in firmware logic. If written in a programming language conforming to a recognized standard, such instructions can be executed on a variety of hardware platforms and for interface to a variety of operating systems. In addition, the embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings described herein. 
Furthermore, it is common in the art to speak of software, in one form or another (e.g., program, procedure, process, application, module, logic...), as taking an action or causing a result. Such expressions are merely a shorthand way of saying that execution of the software by a computer causes the processor of the computer to perform an action or produce a result. It will be appreciated that more or fewer operations may be incorporated into the processes illustrated in Figures 4 and 5 without departing from the scope of the invention and that no particular order is implied by the arrangement of blocks shown and described herein.

[0027] Figure 4 is a flow diagram of one embodiment of a method 400 for creating parameter set metadata at the encoding system 100. The processing logic of block 402 receives a file with encoded media data, which includes sets of encoding parameters that specify how to decode portions of the media data. The processing logic examines the relationships between the sets of encoding parameters and the corresponding portions of the media data (block 404), and creates metadata defining the parameter sets and their associations with the media data portions (block 406).

[0028] In one embodiment, the parameter set metadata is organized into a set of predefined data structures. The set of predefined data structures may include a data structure containing descriptive information about the parameter sets, and a data structure containing information that defines associations between media data portions and corresponding parameter sets.

[0029] In one embodiment, the processing logic determines whether any parameter set data structure contains a repeated sequence of data (block 408). If this determination is positive, the processing logic converts each repeated sequence of data into a reference to a sequence occurrence and the number of times the sequence occurs (block 410). This type of parameter set is referred to as a sequence parameter set.
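The repeated-sequence conversion of blocks 408-410 can be sketched as a simple run-length collapse. The function and entry names below are hypothetical, chosen only to illustrate turning consecutive duplicates into a single occurrence plus a count:

```python
def collapse_repeats(entries):
    """Replace each run of identical consecutive entries with one
    (entry, occurrence_count) pair, as described for block 410."""
    runs = []
    for entry in entries:
        if runs and runs[-1][0] == entry:
            # Extend the current run instead of storing another copy.
            runs[-1] = (entry, runs[-1][1] + 1)
        else:
            runs.append((entry, 1))
    return runs

# Four consecutive media portions sharing one parameter set, then a new set:
collapse_repeats(["ps1", "ps1", "ps1", "ps1", "ps2"])
# → [("ps1", 4), ("ps2", 1)]
```

The stored reference-plus-count pair is smaller than the repeated data whenever a run is longer than one, which is the space saving the patent attributes to this conversion.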

[0030] At block 412, the processing logic incorporates the parameter set metadata in a file associated with media data using a specific media file format (e.g., the AVC file format). Depending on the media file format, the parameter set metadata may be in-band or out-of- band.

[0031] Figure 5 is a flow diagram of one embodiment of a method 500 for utilizing parameter set metadata at the decoding system 200. The processing logic at block 502 receives a file associated with encoded media data. The file may be received from a database (local or external), the encoding system 100, or from any other device on a network. The file includes the parameter set metadata that defines parameter sets for the corresponding media data. The processing logic of block 504 extracts the parameter set metadata from the file.

[0032] The processing logic at block 506 uses the extracted metadata to determine which parameter set is associated with a specific media data portion. The information in the parameter set controls decoding and transmission time of media data portions and corresponding parameter sets.

[0033] In response to the adoption of the JVT fidelity range extension (FRExt) profiles, chroma format and bit depth parameters have been created by the JVT team to incorporate the FRExt into the existing AVC sequence parameter sets. If a video sample is in one of the extended chroma formats, such as YUV 4:2:2 or 4:4:4, a chroma format indicator, "chroma_format_idc," is included in the corresponding sequence parameter set by the metadata generator 106 of Figure 1 when executing blocks 406 through 410 of method 400. The chroma_format_idc parameter specifies the chroma (hue and saturation) sampling relative to the luma (luminosity) sampling and has a value ranging from 0 to 3. The presence of 10 and 12 bit video samples is indicated by two additional parameters: bit_depth_luma_minus8 specifies the bit depth of the luma samples, and bit_depth_chroma_minus8 specifies the bit depth of the chroma samples. The values of the bit_depth_luma_minus8 and bit_depth_chroma_minus8 parameters range from 0 to 4 according to the following formulas:

BitDepth_Y = 8 + bit_depth_luma_minus8    (1)

BitDepth_C = 8 + bit_depth_chroma_minus8    (2)

Thus, a value of zero corresponds to a bit depth of 8 bits, while a value of 4 corresponds to a bit depth of 12 bits.
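As a minimal illustration of formulas (1) and (2) and the chroma_format_idc range, the following sketch maps the coded parameter values to their decoded meanings. The dictionary and function names are our own, not from the specification; the chroma_format_idc value meanings (0 = monochrome through 3 = 4:4:4) are as defined by H.264:

```python
# chroma_format_idc values defined by H.264 (range 0..3)
CHROMA_FORMATS = {0: "monochrome", 1: "4:2:0", 2: "4:2:2", 3: "4:4:4"}

def bit_depth(minus8_value: int) -> int:
    """Apply formula (1)/(2): BitDepth = 8 + bit_depth_*_minus8."""
    if not 0 <= minus8_value <= 4:
        raise ValueError("bit_depth_*_minus8 must be in the range 0..4")
    return 8 + minus8_value

bit_depth(0)       # → 8  (baseline 8-bit samples)
bit_depth(2)       # → 10 (10-bit FRExt samples)
bit_depth(4)       # → 12 (12-bit FRExt samples)
CHROMA_FORMATS[2]  # → "4:2:2"
```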

[0034] Corresponding changes are required to the AVC decoder configuration records in the AVC file format for decoders that are capable of processing media formats specified by the fidelity range extensions. In one embodiment, the class AVCDecoderConfigurationRecord is modified by adding the following fields:

    bit(6) reserved = '111111'b;
    unsigned int(2) chroma_format;
    bit(5) reserved = '11111'b;
    unsigned int(3) bit_depth_luma_minus8;
    bit(5) reserved = '11111'b;
    unsigned int(3) bit_depth_chroma_minus8;

where the chroma_format field contains the chroma format indicator defined by the parameter chroma_format_idc. The other two fields contain the corresponding luma and chroma parameter values.
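A hypothetical reader for the three added bytes might mask out the reserved '1' bits as follows. This is a sketch of the field layout above, not code from the AVC file format specification, and the function name is our own:

```python
def parse_frext_extension(data: bytes) -> dict:
    """Decode the three FRExt bytes appended to AVCDecoderConfigurationRecord.
    Each byte holds reserved '1' bits in its high bits and the parameter
    value in its low 2 or 3 bits, per the field list above."""
    if len(data) != 3:
        raise ValueError("expected exactly three extension bytes")
    return {
        "chroma_format": data[0] & 0x03,            # 2-bit chroma_format_idc value
        "bit_depth_luma_minus8": data[1] & 0x07,    # 3-bit luma bit depth code
        "bit_depth_chroma_minus8": data[2] & 0x07,  # 3-bit chroma bit depth code
    }

# 4:2:2 chroma with 10-bit luma and chroma samples (reserved bits set to '1'):
fields = parse_frext_extension(bytes([0xFC | 2, 0xF8 | 2, 0xF8 | 2]))
# fields == {"chroma_format": 2,
#            "bit_depth_luma_minus8": 2,
#            "bit_depth_chroma_minus8": 2}
```

Masking rather than validating the reserved bits matches the common file-format convention that readers ignore reserved fields.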

[0035] Assuming the decoder 210 of Figure 2 is capable of decoding video in the extended formats, the modified decoder configuration record controls the extraction of the new FRExt parameters by the metadata extractor 204 as it executes block 504 of method 500.

[0036] Storage and retrieval of audiovisual metadata has been described. Although specific embodiments have been illustrated and described herein in terms of the AVC file formats, it will be appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present invention.

Claims

What is claimed is:
1. A computerized method comprising: creating a parameter set for a portion of multimedia data, wherein the parameter set comprises parameters specifying chroma format, luma bit depth and chroma bit depth for the portion of the multimedia data; and encoding the parameter set into a metadata file that is associated with the multimedia data.
2. The method of claim 1, wherein the portion of the multimedia data comprises a video sample encoded with the chroma format and bit depths.
3. The method of claim 1, wherein creating the parameter set comprises: creating a first data structure containing descriptive information about the parameter set and a second data structure containing information that defines an association between the parameter set and the portion of the multimedia data.
4. The method of claim 1 further comprising: receiving the metadata file; and extracting the parameter set from the metadata file, wherein the chroma format and bit depth parameters are ignored if a decoder configuration record does not include corresponding fields.
5. A computerized method comprising: receiving a metadata file associated with a portion of multimedia data, the metadata file comprising a parameter set specifying chroma format, luma bit depth and chroma bit depth for the portion of the multimedia data; and extracting the parameter set from the metadata file, wherein the chroma format and bit depth parameters are ignored if a decoder configuration record does not include corresponding fields.
6. The method of claim 5, wherein the portion of the multimedia data comprises a video sample encoded with the chroma format and bit depths.
7. A computerized method comprising: creating a decoder configuration record comprising metadata entries corresponding to parameters for chroma format, a luma bit depth and a chroma bit depth for multimedia data.
8. The method of claim 7 further comprising: inserting the decoder configuration record into a decoder that processes multimedia data encoded with chroma format and bit depths specified by the parameters.
9. A machine-readable medium having executable instructions to cause a processor to perform a method comprising: creating a parameter set for a portion of multimedia data, wherein the parameter set comprises parameters specifying chroma format, luma bit depth and chroma bit depth for the portion of the multimedia data; and encoding the parameter set into a metadata file that is associated with the multimedia data.
10. The machine-readable medium of claim 9, wherein the portion of the multimedia data comprises a video sample encoded with the chroma format and bit depths.
11. The machine-readable medium of claim 9, wherein creating the parameter set comprises: creating a first data structure containing descriptive information about the parameter set and a second data structure containing information that defines an association between the parameter set and the portion of the multimedia data.
12. The machine-readable medium of claim 9, wherein the method further comprises: receiving the metadata file; and extracting the parameter set from the metadata file, wherein the chroma format and bit depth parameters are ignored if a decoder configuration record does not include corresponding fields.
13. A machine-readable medium having executable instructions to cause a processor to perform a method comprising: receiving a metadata file associated with a portion of multimedia data, the metadata file comprising a parameter set specifying chroma format, luma bit depth and chroma bit depth for the portion of the multimedia data; and extracting the parameter set from the metadata file, wherein the chroma format and bit depth parameters are ignored if a decoder configuration record does not include corresponding fields.
14. The machine-readable medium of claim 13, wherein the portion of the multimedia data comprises a video sample encoded with the chroma format and bit depths.
15. A machine-readable medium having executable instructions to cause a processor to perform a method comprising: creating a decoder configuration record comprising metadata entries corresponding to parameters for chroma format, a luma bit depth and a chroma bit depth for multimedia data.
16. A system comprising: a processor coupled to a memory through a bus; and a process executed from the memory by the processor to cause the processor to create a parameter set for a portion of multimedia data, wherein the parameter set comprises parameters specifying chroma format, luma bit depth and chroma bit depth for the portion of the multimedia data, and encode the parameter set into a metadata file that is associated with the multimedia data.
17. The system of claim 16, wherein the portion of the multimedia data comprises a video sample encoded with the chroma format and bit depths.
18. The system of claim 16, wherein creating the parameter set comprises: creating a first data structure containing descriptive information about the parameter set and a second data structure containing information that defines an association between the parameter set and the portion of the multimedia data.
19. The system of claim 16, wherein the process further causes the processor to receive the metadata file, and extract the parameter set from the metadata file, wherein the chroma format and bit depth parameters are ignored if a decoder configuration record does not include corresponding fields.
20. A system comprising: a processor coupled to a memory through a bus; and a process executed from the memory by the processor to cause the processor to receive a metadata file associated with a portion of multimedia data, the metadata file comprising a parameter set specifying chroma format, luma bit depth and chroma bit depth for the portion of the multimedia data, and extract the parameter set from the metadata file, wherein the chroma format and bit depth parameters are ignored if a decoder configuration record does not include corresponding fields.
21. The system of claim 20, wherein the portion of the multimedia data comprises a video sample encoded with the chroma format and bit depths.
22. A system comprising: a processor coupled to a memory through a bus; and a process executed from the memory by the processor to cause the processor to create a decoder configuration record comprising metadata entries corresponding to parameters for chroma format, a luma bit depth and a chroma bit depth for multimedia data.
PCT/US2005/038255 2004-10-21 2005-10-21 Supporting fidelity range extensions in advanced video codec file format WO2006047448A3 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US62075304 true 2004-10-21 2004-10-21
US60/620,753 2004-10-21
US11/255,853 2005-10-20
US11255853 US20070098083A1 (en) 2005-10-20 2005-10-20 Supporting fidelity range extensions in advanced video codec file format

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CA 2584765 CA2584765A1 (en) 2004-10-21 2005-10-21 Supporting fidelity range extensions in advanced video codec file format
JP2007538146A JP2008518516A (en) 2004-10-21 2005-10-21 Supporting FRExt (Fidelity Range Extensions) in the advanced video codec file format
EP20050811841 EP1820090A2 (en) 2004-10-21 2005-10-21 Supporting fidelity range extensions in advanced video codec file format

Publications (2)

Publication Number Publication Date
WO2006047448A2 2006-05-04
WO2006047448A3 (en) 2009-04-16

Family

ID=36228345

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/038255 WO2006047448A3 (en) 2004-10-21 2005-10-21 Supporting fidelity range extensions in advanced video codec file format

Country Status (6)

Country Link
EP (1) EP1820090A2 (en)
JP (1) JP2008518516A (en)
KR (1) KR20070084442A (en)
CA (1) CA2584765A1 (en)
RU (1) RU2007118660A (en)
WO (1) WO2006047448A3 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010516099A (en) * 2007-01-04 2010-05-13 Thomson Licensing Methods and apparatus for multi-view information conveyed in high level syntax
JP2011501553A (en) * 2007-10-16 2011-01-06 Samsung Electronics Co., Ltd. Method and apparatus for encoding media content and metadata
US9215456B2 (en) 2007-01-11 2015-12-15 Thomson Licensing Methods and apparatus for using syntax for the coded_block_flag syntax element and the coded_block_pattern syntax element for the CAVLC 4:4:4 intra, high 4:4:4 intra, and high 4:4:4 predictive profiles in MPEG-4 AVC high level coding
US9648317B2 (en) 2012-01-30 2017-05-09 Qualcomm Incorporated Method of coding video and storing video content

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2294827A4 (en) * 2008-06-12 2017-04-19 Thomson Licensing Methods and apparatus for video coding and decoding with reduced bit-depth update mode and reduced chroma sampling update mode
CN106162187A (en) * 2011-06-24 2016-11-23 NTT DoCoMo, Inc. Method and apparatus for motion-compensated prediction
CA2860750A1 (en) * 2012-01-31 2013-08-08 Sony Corporation Encoding device and encoding method, and decoding device and decoding method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5325423A (en) * 1992-11-13 1994-06-28 Multimedia Systems Corporation Interactive multimedia communication system
US6639945B2 (en) * 1997-03-14 2003-10-28 Microsoft Corporation Method and apparatus for implementing motion detection in video compression
US20040143786A1 (en) * 2001-05-14 2004-07-22 Stauder Juergen Device, server, system and method to generate mutual photometric effects
US20040179605A1 (en) * 2003-03-12 2004-09-16 Lane Richard Doil Multimedia transcoding proxy server for wireless telecommunication system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010516099A (en) * 2007-01-04 2010-05-13 Thomson Licensing Methods and apparatus for multi-view information conveyed in high level syntax
US9215456B2 (en) 2007-01-11 2015-12-15 Thomson Licensing Methods and apparatus for using syntax for the coded_block_flag syntax element and the coded_block_pattern syntax element for the CAVLC 4:4:4 intra, high 4:4:4 intra, and high 4:4:4 predictive profiles in MPEG-4 AVC high level coding
US9602824B2 (en) 2007-01-11 2017-03-21 Thomson Licensing Methods and apparatus for using syntax for the coded_block_flag syntax element and the coded_block_pattern syntax element for the CAVLC 4:4:4 Intra, High 4:4:4 Intra, and High 4:4:4 Predictive profiles in MPEG-4 AVC high level coding
JP2011501553A (en) * 2007-10-16 2011-01-06 Samsung Electronics Co., Ltd. Method and apparatus for encoding media content and metadata
US8660999B2 (en) 2007-10-16 2014-02-25 Samsung Electronics Co., Ltd. Method and apparatus for encoding media content and metadata thereof
US9648317B2 (en) 2012-01-30 2017-05-09 Qualcomm Incorporated Method of coding video and storing video content

Also Published As

Publication number Publication date Type
CA2584765A1 (en) 2006-05-04 application
JP2008518516A (en) 2008-05-29 application
EP1820090A2 (en) 2007-08-22 application
RU2007118660A (en) 2008-11-27 application
WO2006047448A3 (en) 2009-04-16 application
KR20070084442A (en) 2007-08-24 application

Similar Documents

Publication Publication Date Title
US7136417B2 (en) Chroma conversion optimization
US20110064146A1 (en) Media extractor tracks for file format track selection
US20030185301A1 (en) Video appliance
US20120016965A1 (en) Video switching for streaming video data
US20050210145A1 (en) Delivering and processing multimedia bookmark
US20050123055A1 (en) Method for activation and deactivation of infrequently changing sequence and picture parameter sets
US20080260045A1 (en) Signalling and Extraction in Compressed Video of Pictures Belonging to Interdependency Tiers
US20100153395A1 (en) Method and Apparatus For Track and Track Subset Grouping
US20120023250A1 (en) Arranging sub-track fragments for streaming video data
Amon et al. File format for scalable video coding
US20120023249A1 (en) Providing sequence data sets for streaming video data
US20060233247A1 (en) Storing SVC streams in the AVC file format
US20040010802A1 (en) Generic adaptation layer for JVT video
US20060251289A1 (en) Data processing apparatus and method
US20040006575A1 (en) Method and apparatus for supporting advanced coding formats in media files
US20110317760A1 (en) Signaling video samples for trick mode video representations
US20040167925A1 (en) Method and apparatus for supporting advanced coding formats in media files
US20030163781A1 (en) Method and apparatus for supporting advanced coding formats in media files
US20140341305A1 (en) Specifying visual dynamic range coding operations and parameters
US20080291999A1 (en) Method and apparatus for video frame marking
WO2003098475A1 (en) Supporting advanced coding formats in media files
WO2003073770A1 (en) Method and apparatus for supporting avc in mp4
US20030163477A1 (en) Method and apparatus for supporting advanced coding formats in media files
US7725593B2 (en) Scalable video coding (SVC) file format
WO2003073768A1 (en) Method and apparatus for supporting avc in mp4

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV LY MD MG MK MN MW MX MZ NA NG NO NZ OM PG PH PL PT RO RU SC SD SG SK SL SM SY TJ TM TN TR TT TZ UG US UZ VC VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IS IT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2005299534

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 2007538146

Country of ref document: JP

Ref document number: 2584765

Country of ref document: CA

NENP Non-entry into the national phase in:

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2005811841

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2007118660

Country of ref document: RU

WWP Wipo information: published in national office

Ref document number: 2005811841

Country of ref document: EP