EP2165542A2 - Adaptive coefficient scanning in video coding - Google Patents

Adaptive coefficient scanning in video coding

Info

Publication number
EP2165542A2
Authority
EP
European Patent Office
Prior art keywords
prediction modes
given
scan
statistics
coefficient values
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP08770909A
Other languages
German (de)
French (fr)
Other versions
EP2165542B1 (en)
Inventor
Yan Ye
Marta Karczewicz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Publication of EP2165542A2 publication Critical patent/EP2165542A2/en
Application granted granted Critical
Publication of EP2165542B1 publication Critical patent/EP2165542B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/174Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a slice, e.g. a line of blocks or a group of blocks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/129Scanning of coding units, e.g. zig-zag scan of transform coefficients or flexible macroblock ordering [FMO]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/157Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/593Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/61Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding

Definitions

  • This disclosure relates to digital video coding and, more particularly, entropy coding of coefficients of video blocks, such as transform coefficients of transformed video blocks.
  • Digital video capabilities can be incorporated into a wide range of devices, including digital televisions, digital direct broadcast systems, wireless communication devices such as radio telephone handsets, wireless broadcast systems, personal digital assistants (PDAs), laptop or desktop computers, digital cameras, digital recording devices, video gaming devices, video game consoles, and the like.
  • Digital video devices implement video compression techniques, such as MPEG-2, MPEG-4, or H.264/MPEG-4 Part 10, Advanced Video Coding (AVC), to transmit and receive digital video more efficiently.
  • Video compression techniques perform spatial and temporal prediction to reduce or remove redundancy inherent in video sequences.
  • Video compression generally includes spatial prediction and/or temporal prediction.
  • Intra-coding relies on spatial prediction to reduce or remove spatial redundancy between video blocks within a given coded unit, which may comprise a video frame, a slice of a video frame, or the like.
  • Inter-coding relies on temporal prediction to reduce or remove temporal redundancy between video blocks of successive coded units of a video sequence.
  • For intra-coding, a video encoder performs spatial prediction to compress data based on other data within the same coded unit.
  • For inter-coding, the video encoder performs motion estimation and motion compensation to track the movement of corresponding video blocks of two or more adjacent coded units.
  • A coded video block may be represented by prediction information that comprises a prediction mode and a predictive block size, and a residual block of data indicative of differences between the block being coded and a predictive block.
  • In the case of inter-coding, one or more motion vectors are used to identify the predictive block of data, while in the case of intra-coding, the prediction mode can be used to generate the predictive block.
  • Both intra-coding and inter-coding may define several different prediction modes, which may define different block sizes and/or prediction techniques used in the coding.
  • The video encoder may apply transform, quantization and entropy coding processes to further reduce the bit rate associated with communication of a residual block.
  • Transform techniques may comprise discrete cosine transforms or conceptually similar processes, such as wavelet transforms, integer transforms, or other types of transforms.
  • The transform process converts a set of pixel values into transform coefficients, which represent the energy of the pixel values in the frequency domain.
  • Quantization is applied to the transform coefficients, and generally involves a process that limits the number of bits associated with any given transform coefficient.
  • Entropy coding comprises one or more processes that collectively compress a sequence of quantized transform coefficients.
  • A transformed video block of transform coefficients may be serialized by scanning the transform coefficients from a two-dimensional block into a one-dimensional vector.
  • The scanning is performed in a zig-zag manner such that the transform coefficients in the upper-left part of a video block occur earlier in the one-dimensional vector and the transform coefficients in the lower-right part of a video block occur later.
  • High energy transform coefficients typically reside near the upper-left corner following the transform, so zig-zag scanning is effective to group non-zero transform coefficients near the beginning of the one-dimensional vector.
  • The scanning order can significantly affect the level of compression that can be achieved in entropy coding.
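To make the zig-zag pattern concrete, the following sketch derives the scan order by walking the anti-diagonals of an n-by-n block. This is an illustrative sketch only; real codecs such as H.264 use fixed scan-order lookup tables rather than computing the order on the fly:

```python
def zigzag_order(n):
    """Generate (row, col) positions of an n-by-n block in zig-zag order."""
    order = []
    for s in range(2 * n - 1):  # walk the anti-diagonals, upper-left to lower-right
        diag = [(r, s - r) for r in range(n) if 0 <= s - r < n]
        # even diagonals run top-right-ward, odd diagonals bottom-left-ward
        order.extend(diag if s % 2 else reversed(diag))
    return order

def zigzag_scan(block):
    """Serialize a 2-D coefficient block into a 1-D vector."""
    return [block[r][c] for r, c in zigzag_order(len(block))]

# A typical quantized block: non-zero coefficients cluster in the upper-left.
block = [[9, 5, 0, 0],
         [4, 0, 1, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
print(zigzag_scan(block))  # → [9, 5, 4, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
```

Note how the non-zero values land near the front of the vector, which is exactly the grouping that benefits run-length entropy coding.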
  • Examples of entropy coding processes include content adaptive variable length coding (CAVLC) and context adaptive binary arithmetic coding (CABAC).
  • CAVLC is one type of entropy coding technique supported by the ITU H.264/MPEG4 Part 10 AVC standard.
  • CAVLC uses variable length coding (VLC) tables in a manner that effectively compresses serialized "runs" of quantized transform coefficients.
  • CABAC is another type of entropy coding technique supported by the ITU H.264/MPEG4 Part 10 AVC standard.
  • CABAC may involve several stages, including binarization, context model selection, and binary arithmetic coding.
  • A video decoder may perform inverse entropy coding operations that correspond to the type of entropy coding used in the encoding process to reconstruct the one-dimensional vectors of transform coefficients. Inverse scanning may also be performed at the decoder to form two-dimensional blocks from received one-dimensional vectors of transform coefficients. The video decoder then inverse quantizes and inverse transforms the transform coefficients in a block to reconstruct residual pixel data. The video decoder may use decoded prediction information comprising a prediction mode, prediction size, and, in the case of inter coding, motion information to obtain the predictive video block. The video decoder may then combine the predictive block with the corresponding reconstructed residual block in order to generate a decoded sequence of video.
  • This disclosure describes techniques for scanning coefficients of video blocks, e.g., quantized transform coefficients.
  • At the encoder, the scanning creates one-dimensional vectors of coefficients from a two-dimensional block of coefficients; at the decoder, inverse scanning creates two-dimensional blocks of coefficients from one-dimensional vectors.
  • The scanning techniques described in this disclosure adapt the scanning order of coefficients in a block based on statistics associated with previously coded blocks of coefficients that were coded in the same prediction mode. For each prediction mode, statistics of the coefficients are stored, e.g., indicating probabilities that given coefficients have zero or non-zero values.
  • Periodically, adjustments to the scanning order can be made in order to better ensure that non-zero coefficients are grouped together toward the beginning of the one-dimensional vector and zero value coefficients are grouped together toward the end of the one-dimensional vector, which can improve the effectiveness of entropy coding.
  • Adjustment of the scanning order can be computationally intensive. Therefore, the techniques of this disclosure may impose thresholds and threshold adjustments that can reduce the frequency at which the scanning order adjustments occur, yet still achieve desired improvements in compression due to scanning order adjustments.
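A hypothetical sketch of this per-mode bookkeeping follows. The data structures, the threshold value, the mode name, and the sort-by-frequency update rule are illustrative assumptions, not the specific procedure claimed in the disclosure:

```python
from collections import defaultdict

N = 16               # coefficients per block, e.g. a 4-by-4 block (assumed)
EVAL_THRESHOLD = 16  # blocks per mode between scan-order evaluations (assumed)

nonzero_counts = defaultdict(lambda: [0] * N)      # per mode: non-zero count per position
block_counts = defaultdict(int)                    # per mode: blocks seen since last evaluation
scan_orders = defaultdict(lambda: list(range(N)))  # per mode: start from a fixed order

def scan_block(mode, block_flat):
    """Scan a block (given in raster order) with the mode's current scan order,
    update the mode's statistics, and re-evaluate the order only when the
    block count satisfies the threshold."""
    vector = [block_flat[pos] for pos in scan_orders[mode]]
    for pos, coeff in enumerate(block_flat):
        if coeff != 0:
            nonzero_counts[mode][pos] += 1
    block_counts[mode] += 1
    if block_counts[mode] >= EVAL_THRESHOLD:
        # Positions that are most often non-zero move toward the front; the
        # stable sort keeps the prior order among equally likely positions.
        scan_orders[mode].sort(key=lambda p: -nonzero_counts[mode][p])
        block_counts[mode] = 0
    return vector

# Example: position 7 is consistently non-zero for this (hypothetical) mode,
# so after enough blocks it moves toward the front of the scan order.
blk = [9, 0, 0, 0, 0, 0, 0, 1] + [0] * 8
for _ in range(EVAL_THRESHOLD):
    scan_block("intra_vertical", blk)
print(scan_orders["intra_vertical"][:2])  # positions 0 and 7 now lead
```

Gating the sort behind `EVAL_THRESHOLD` captures the idea that re-evaluating the scan order on every block would be computationally wasteful.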
  • The techniques can be performed in a reciprocal manner by the encoder and the decoder. That is, the encoder can use the adaptive scanning techniques prior to entropy encoding to scan coefficients of video blocks from a two-dimensional format to a one-dimensional vector format.
  • The decoder can scan received one-dimensional vectors of coefficients of video blocks to form the two-dimensional blocks of coefficients.
  • Coefficients of video blocks can be represented in a two-dimensional block format or a one-dimensional vector format.
  • The scanning techniques of this disclosure generally define how coefficients of video blocks are converted from the two-dimensional block format to the one-dimensional vector format, and vice versa. Although this disclosure primarily focuses on the scanning of quantized transform coefficients, similar techniques could be used to scan other types of coefficients, such as non-quantized coefficients or pixel values of non-transformed video blocks, e.g., if scanning of the pixel values were implemented.
  • This disclosure provides a method of coding coefficients of video blocks, the method comprising storing statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counting the video blocks associated with each of the prediction modes, scanning the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, evaluating a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and entropy coding the coefficient values.
  • This disclosure provides an apparatus that codes coefficients of video blocks, the apparatus comprising a scan unit and an entropy coding unit.
  • The scan unit stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes.
  • The entropy coding unit entropy codes the coefficient values.
  • This disclosure provides a device that codes coefficients of video blocks, the device comprising means for storing statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, means for counting the video blocks associated with each of the prediction modes, means for scanning the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, means for evaluating a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and means for entropy coding the coefficient values.
  • This disclosure provides a device comprising a scan unit that stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks from two-dimensional blocks to one-dimensional vectors based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes.
  • The device also includes an entropy coding unit that entropy encodes the coefficient values of the one-dimensional vectors, and a wireless transmitter that sends a bitstream comprising the entropy encoded coefficient values.
  • This disclosure provides a device comprising a wireless receiver that receives a bitstream comprising entropy coded coefficient values of video blocks in one-dimensional vectors, an entropy coding unit that entropy decodes the coefficient values of the video blocks, and a scan unit.
  • The scan unit stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks from the one-dimensional vectors to two-dimensional blocks based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes.
  • The techniques described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. If implemented in hardware, an apparatus may be realized as an integrated circuit, a processor, discrete logic, or any combination thereof. If implemented in software, the software may be executed in one or more processors, such as a microprocessor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), or digital signal processor (DSP). The software that executes the techniques may be initially stored in a computer-readable medium and loaded and executed in the processor.
  • This disclosure also contemplates a computer-readable medium comprising instructions that upon execution in a video coding device cause the device to code coefficients of video blocks, wherein the instructions cause the device to store statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, count the video blocks associated with each of the prediction modes, scan the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, evaluate a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and entropy code the coefficient values.
  • FIG. 1 is an exemplary block diagram illustrating a video encoding and decoding system.
  • FIG. 2 is a block diagram illustrating an example of a video encoder consistent with this disclosure.
  • FIG. 3 is a block diagram illustrating an example of a video decoder consistent with this disclosure.
  • FIG. 4 is a conceptual diagram illustrating zig-zag scanning of a 4-by-4 video block.
  • FIG. 5 is a conceptual diagram illustrating zig-zag scanning of an 8-by-8 video block.
  • FIG. 6 is a conceptual diagram illustrating statistics associated with blocks of a particular mode, and an algorithm consistent with the techniques of this disclosure.
  • FIG. 7 is a conceptual diagram illustrating a hypothetical example consistent with this disclosure.
  • FIGS. 8 and 9 are flow diagrams illustrating techniques consistent with this disclosure.
  • A coefficient block generally refers to a set of transform coefficients associated with a video block.
  • Coefficient blocks can be represented in a two-dimensional block format or one-dimensional vector format.
  • The scanning techniques of this disclosure define how the coefficient blocks are converted from the two-dimensional block format to the one-dimensional vector format by an encoder, and how the coefficient blocks are converted from the one-dimensional vector format to the two-dimensional block format by a decoder.
  • The scanning techniques described herein may also be applied to convert other types of video data (e.g., a video block in the pixel domain) from two-dimensional block format into one-dimensional vector format.
  • The scanning of coefficient blocks from the two-dimensional block format to the one-dimensional vector format follows a zig-zag scanning order.
  • Coefficients in the upper-left of a coefficient block occur earlier in the one-dimensional vector, and the coefficients in the lower-right of a coefficient block occur later.
  • High energy transform coefficients typically reside near the upper-left corner following the transform. For this reason, zig-zag scanning is an effective way to group non-zero coefficients near the beginning of the one-dimensional vector.
  • The entropy coding unit then typically entropy codes the one-dimensional vector in the form of runs and levels, where runs are the number of zero value transform coefficients in between two non-zero transform coefficients, and the levels represent the values of the non-zero transform coefficients. Moreover, after the last non-zero transform coefficient is sent for a given coefficient block (e.g., in one-dimensional vector format), the entropy coder typically sends an End-Of-Block (EOB) symbol or a last coefficient flag to indicate that this is the last non-zero transform coefficient in the block.
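A simplified sketch of this run-level serialization with an EOB marker (the tuple representation and the string marker are illustrative assumptions; a real entropy coder would map these symbols to codewords):

```python
def run_level_encode(vec):
    """Encode a scanned 1-D coefficient vector as (run, level) pairs:
    'run' counts zero coefficients preceding each non-zero 'level',
    and an 'EOB' marker follows the last non-zero coefficient."""
    pairs, run = [], 0
    for coeff in vec:
        if coeff == 0:
            run += 1          # accumulate the run of zeros
        else:
            pairs.append((run, coeff))
            run = 0
    pairs.append("EOB")       # trailing zeros need not be coded at all
    return pairs

vec = [9, 5, 4, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
print(run_level_encode(vec))  # → [(0, 9), (0, 5), (0, 4), (4, 1), 'EOB']
```

The trailing run of zeros is never transmitted, which is why a scan order that pushes zeros to the end of the vector directly reduces the coded bit count.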
  • The techniques of this disclosure adapt the scanning order based on statistics associated with previously coded blocks that were coded in the same prediction mode. For each prediction mode, statistics of the transform coefficients are stored, e.g., indicating probabilities that transform coefficients at given positions are zero or non-zero. Periodically, adjustments to the scanning order can be made in order to better ensure that non-zero transform coefficients are grouped together toward the beginning of the one-dimensional vector and zero value coefficients are grouped together toward the end of the one-dimensional vector, which can improve the effectiveness of entropy coding.
  • The adaptive scanning techniques may occur for each separate coded unit, e.g., each frame, slice, or other type of coded unit.
  • Coefficient blocks of a coded unit may initially be scanned in a fixed way (e.g., in a zig-zag scanning order or another fixed scanning order), but may quickly adapt to a different scanning order if statistics of coefficient blocks for a given prediction mode indicate that a different scanning order would be more effective to group non-zero and zero value coefficients.
  • Adjustment of the scanning order can be computationally intensive. Therefore, the techniques of this disclosure impose thresholds and threshold adjustments that can reduce the frequency at which the scanning order adjustments occur, yet still achieve desired improvements in compression due to such scanning order adjustments.
  • The techniques can be performed in a reciprocal manner by the encoder and the decoder. That is, the encoder can use the adaptive scanning techniques prior to entropy encoding to scan coefficients of video blocks from two-dimensional format to one-dimensional vectors.
  • The decoder can inverse scan received one-dimensional vectors of coefficients of video blocks following an entropy decoding process to recreate the coefficient blocks in the two-dimensional format.
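The inverse scan itself is a straightforward permutation. A minimal sketch, assuming encoder and decoder share the same scan order (here expressed as raster-index positions for a 4-by-4 block):

```python
def inverse_scan(vector, scan_order):
    """Place a 1-D vector of coefficients back into raster positions,
    undoing a scan performed with the same scan_order permutation."""
    block_flat = [0] * len(vector)
    for coeff, pos in zip(vector, scan_order):
        block_flat[pos] = coeff
    return block_flat

# Round trip: scanning and then inverse scanning recovers the block.
scan_order = [0, 1, 4, 8, 5, 2, 3, 6, 9, 12, 13, 10, 7, 11, 14, 15]  # 4x4 zig-zag
block_flat = [9, 5, 0, 0, 4, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
vector = [block_flat[p] for p in scan_order]
assert inverse_scan(vector, scan_order) == block_flat
```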
  • A coefficient block generally refers to a set of transform coefficients represented in either a two-dimensional block format or a one-dimensional vector format.
  • FIG. 1 is a block diagram illustrating an exemplary video encoding and decoding system 10 that may implement techniques of this disclosure.
  • system 10 includes a source device 12 that transmits encoded video to a destination device 16 via a communication channel 15.
  • Source device 12 and destination device 16 may comprise any of a wide range of devices.
  • Source device 12 and destination device 16 may comprise wireless communication device handsets, such as so-called cellular or satellite radiotelephones.
  • The techniques of this disclosure, however, are not necessarily limited to wireless applications or settings.
  • Source device 12 may include a video source 20, a video encoder 22, a modulator/demodulator (modem) 23 and a transmitter 24.
  • Destination device 16 may include a receiver 26, a modem 27, a video decoder 28, and a display device 30.
  • Video encoder 22 of source device 12 may be configured to perform adaptive scanning of coefficients prior to entropy encoding to form a one-dimensional set of data.
  • Video decoder 28 of destination device 16 may be configured to perform adaptive scanning of coefficients following entropy decoding to produce a two-dimensional set of data.
  • Video decoder 28 need not receive any indication of the scanning order applied by video encoder 22; rather, the scanning order can be derived in essentially the same way at both video encoder 22 and video decoder 28.
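The key to avoiding scan-order signaling is that both sides apply the same deterministic update rule to the same decoded coefficient data, so their scan orders never diverge. A toy illustration (the sort-by-count rule is an assumed stand-in for the actual evaluation procedure):

```python
def update_scan_order(scan_order, nonzero_counts):
    """Deterministic re-sort shared by encoder and decoder: positions with
    higher non-zero counts move toward the front; the stable sort breaks
    ties by preserving the existing order."""
    return sorted(scan_order, key=lambda pos: -nonzero_counts[pos])

# Both sides observe identical statistics (gathered from decoded blocks),
# so identical updates yield identical orders with nothing transmitted.
counts = [5, 0, 3, 1]
encoder_order = update_scan_order([0, 1, 2, 3], counts)
decoder_order = update_scan_order([0, 1, 2, 3], counts)
assert encoder_order == decoder_order == [0, 2, 3, 1]
```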
  • The illustrated system 10 of FIG. 1 is merely exemplary.
  • The scanning techniques of this disclosure may be performed by any encoding or decoding device that supports any of a wide variety of entropy coding methodologies, such as content adaptive variable length coding (CAVLC), context adaptive binary arithmetic coding (CABAC), or other entropy coding methodologies.
  • Source device 12 and destination device 16 are merely examples of such coding devices.
  • Video encoder 22 and video decoder 28 may store statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, and may count the video blocks associated with each of the prediction modes.
  • Video encoder 22 and video decoder 28 scan the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, evaluate a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and entropy code the coefficient values.
  • On the encoding side, the scanning precedes entropy encoding; on the decoding side, the scanning follows the entropy decoding.
  • Source device 12 generates coded video data for transmission to destination device 16.
  • Devices 12, 16 may operate in a substantially symmetrical manner.
  • Each of devices 12, 16 may include video encoding and decoding components.
  • system 10 may support one-way or two-way video transmission between video devices 12, 16, e.g., for video streaming, video playback, video broadcasting, or video telephony.
  • Video source 20 of source device 12 may include a video capture device, such as a video camera, a video archive containing previously captured video, or a video feed from a video content provider.
  • Video source 20 may generate computer graphics-based data as the source video, or a combination of live video, archived video, and computer-generated video.
  • Source device 12 and destination device 16 may form so-called camera phones or video phones.
  • The captured, pre-captured or computer-generated video may be encoded by video encoder 22.
  • The encoded video information may then be modulated by modem 23 according to a communication standard, e.g., code division multiple access (CDMA) or another communication standard or technique, and transmitted to destination device 16 via transmitter 24.
  • Receiver 26 of destination device 16 receives information over channel 15, and modem 27 demodulates the information.
  • The video decoding process performed by video decoder 28 may include entropy decoding and adaptive scanning as part of the reconstruction of the video sequence.
  • The decoding process, like the encoding process, uses the techniques of this disclosure in order to support improved levels of data compression.
  • Display device 30 displays the decoded video data to a user, and may comprise any of a variety of display devices such as a cathode ray tube (CRT), a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device.
  • Communication channel 15 may comprise any wireless or wired communication medium, such as a radio frequency (RF) spectrum or one or more physical transmission lines, or any combination of wireless and wired media.
  • Communication channel 15 may form part of a packet-based network, such as a local area network, a wide-area network, or a global network such as the Internet.
  • Communication channel 15 generally represents any suitable communication medium, or collection of different communication media, for transmitting video data from source device 12 to destination device 16.
  • Video encoder 22 and video decoder 28 may operate according to a video compression standard that supports CAVLC, CABAC or another entropy coding methodology, such as the ITU-T H.264 standard, alternatively referred to as MPEG-4, Part 10, Advanced Video Coding (AVC).
  • Such techniques may be readily applied to any of a variety of other video coding standards, such as those defined by the Moving Picture Experts Group (MPEG) in MPEG-1, MPEG-2 and MPEG-4, the ITU-T H.263 standard, the Society of Motion Picture and Television Engineers (SMPTE) 421M video CODEC standard (commonly referred to as "VC-1"), the standard defined by the Audio Video Coding Standard Workgroup of China (commonly referred to as "AVS"), as well as any other video coding standard defined by a standards body or developed by an organization as a proprietary standard.
  • video encoder 22 and video decoder 28 may each be integrated with an audio encoder and decoder, and may include appropriate MUX-DEMUX units, or other hardware and software, to handle encoding of both audio and video in a common data stream or separate data streams. If applicable, MUX-DEMUX units may conform to the ITU H.223 multiplexer protocol, or other protocols such as the user datagram protocol (UDP).
  • UDP user datagram protocol
  • the ITU H.264/MPEG-4 Part 10 AVC standard was formulated by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG) as the product of a collective partnership known as the Joint Video Team (JVT).
  • JVT Joint Video Team
  • the techniques described in this disclosure may be applied to devices that generally conform to the H.264 standard.
  • the H.264 standard is described in ITU-T Recommendation H.264, Advanced Video Coding for generic audiovisual services, by the ITU-T Study Group, and dated March, 2005, which may be referred to herein as the H.264 standard or H.264 specification, or the H.264/AVC standard or specification.
  • the Joint Video Team (JVT) continues to work on extensions to H.264/AVC.
  • Video encoder 22 and video decoder 28 each may be implemented as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware or any combinations thereof.
  • DSPs digital signal processors
  • ASICs application specific integrated circuits
  • FPGAs field programmable gate arrays
  • Each of video encoder 22 and video decoder 28 may be included in one or more encoders or decoders, either of which may be integrated as part of a combined encoder/decoder (CODEC) in a respective mobile device, subscriber device, broadcast device, server, or the like.
  • CODEC combined encoder/decoder
  • a video sequence includes a series of video frames.
  • a video sequence can be arranged as a group of pictures (GOP).
  • Video encoder 22 operates on video blocks within individual video frames in order to encode the video data.
  • the video blocks may have fixed or varying sizes, and may differ in size according to a specified coding standard.
  • Each video frame may include a series of slices.
  • Each slice may include a series of macroblocks, which may be arranged into even smaller blocks. Macroblocks typically refer to 16 by 16 blocks of data.
  • the ITU-T H.264 standard supports intra prediction in various block sizes, such as 16 by 16, 8 by 8, or 4 by 4 for luma components, and 8x8 for chroma components, as well as inter prediction in various block sizes, such as 16 by 16, 16 by 8, 8 by 16, 8 by 8, 8 by 4, 4 by 8 and 4 by 4 for luma components and corresponding scaled sizes for chroma components.
  • video blocks may refer to blocks of coefficients, e.g., transform coefficients, following a transform process such as discrete cosine transform or a conceptually similar transformation process in which a set of pixel values are transformed into the frequency domain.
  • the transform coefficients may be quantized.
  • the scanning techniques of this disclosure typically apply with respect to quantized transform coefficients, but may be applicable to non-quantized transform coefficients in some implementations. Moreover, the scanning techniques of this disclosure may also be applicable to blocks of pixel values (i.e., without the transform process), which may or may not be quantized blocks of pixel values.
  • the term "coefficient" is used broadly herein to represent values of video blocks, including not only transform coefficients of coefficient blocks, but also pixel values of non-transformed video blocks. Larger video blocks, such as macroblocks, may be divided into smaller sized video blocks. Smaller video blocks can provide better resolution, and may be used for locations of a video frame that include high levels of detail.
  • MBs macroblocks
  • Video frames may comprise decodable units, or may be divided into smaller decodable units, such as "slices." That is, a slice may be considered to be a series of video blocks, such as MBs and/or smaller sized blocks, and each slice may be an independently decodable unit of a video frame.
  • a transform may be performed on the 8x8 residual block of pixels or 4x4 residual block of pixels, and an additional transform may be applied to the DC coefficients of the 4x4 blocks of pixels for chroma components or, if an intra_16x16 prediction mode is used, for luma components.
  • the data may be referred to as coefficient blocks, or transformed video blocks.
  • the coefficient blocks contain transform coefficients, rather than pixel values.
  • coefficients generally refers to transform coefficients, but may alternatively refer to other types of coefficients or values (e.g., pixel values without the transform process).
  • quantization may be performed.
  • Other transformation techniques such as wavelet-based compression may be used.
  • Quantization generally refers to a process in which coefficients are quantized to possibly reduce the amount of data used to represent the coefficients. The quantization process may reduce the bit depth associated with some or all of the coefficients. For example, an 8-bit value may be rounded down to a 7-bit value during quantization.
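As a toy numeric illustration of the bit-depth reduction described above, scalar quantization can be sketched as follows. This is a simplification for illustration only: the function names are hypothetical, and H.264 actually uses per-coefficient scaling matrices and rounding offsets rather than plain division.

```python
def quantize(coefficients, step):
    """Scalar quantization: divide each coefficient by a step size and
    truncate toward zero, reducing the number of bits needed per value."""
    return [int(c / step) for c in coefficients]

def dequantize(levels, step):
    """Inverse quantization: scale the quantized levels back up. The
    rounding error introduced by quantize() is not recoverable."""
    return [level * step for level in levels]
```

Doubling the step size removes roughly one bit of precision, which is the sense in which an 8-bit value may become a 7-bit value during quantization.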
  • scanning and entropy coding may be performed according to the techniques described herein.
  • video blocks of transform coefficients such as 4 by 4 video blocks, 8 by 8 video blocks, or possibly other sized blocks such as 16 by 16 video blocks, can be scanned from a two-dimensional format to a one-dimensional format.
  • the scanning order may be initialized for each coded unit and may begin in a conventional manner (e.g., the zig-zag scanning order).
  • the scanning order may be adaptive.
  • the scanning order may adapt for video blocks of one or more prediction modes based on statistics associated with such video blocks.
  • the statistics may comprise a count of the number of video blocks encoded in each respective prediction mode, and a set of probabilities associated with coefficients of video blocks encoded in each prediction mode.
  • the probabilities may comprise an indication of the likelihood that a given coefficient value in each location of the video block has a value of zero, or has a non-zero value.
  • the probabilities may comprise more detailed probabilities indicative of the actual values at each location, or another type of statistical probability measure associated with coefficient values.
  • One or more thresholds may be defined relative to the count values. At periodic intervals (such as when macroblock boundaries are encountered), the scan order associated with the different modes of video blocks can be evaluated. When the scan order is evaluated, if the count value associated with a given prediction mode satisfies the threshold of the given prediction mode, then the scan order for that mode may be examined and possibly changed to reflect the statistics of video blocks coded in the given prediction mode. In particular, the scan order can be defined so that coefficients are scanned in the order of their probability of having non-zero values. That is, coefficient locations that have a higher probability of being non-zero are scanned prior to coefficient locations that have a lower probability of being non-zero.
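The evaluation rule just described (scan positions in decreasing order of their probability of being non-zero) can be sketched as below. The function name is hypothetical, and using a stable sort over the previous order so that tied positions keep their existing relative order is an illustrative design choice, not a requirement stated in this disclosure.

```python
def update_scan_order(prev_order, nonzero_prob):
    """Derive a new scan order for one prediction mode: coefficient
    positions with a higher probability of being non-zero are scanned
    first. Python's sort is stable, so positions with equal probability
    keep their previous relative order."""
    return sorted(prev_order, key=lambda pos: -nonzero_prob[pos])
```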
  • a conventional scanning order (such as a zig-zag scanning order) may adapt to a scanning order that groups non-zero coefficients more toward the beginning of the one-dimensional vector representations of the coefficient blocks.
  • the decoder can calculate the same statistics and thereby determine the scanning order that was used in the encoding process. Accordingly, reciprocal adaptive scanning orders can be applied by the decoder in order to convert the one-dimensional vector representation of the coefficient blocks back to the two-dimensional block format.
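The encoder/decoder reciprocity described above can be sketched with a forward and an inverse scan that share the same order. The function names and the flat row-major block representation are illustrative assumptions.

```python
def scan(block, order):
    """Encoder side: flatten a 2-D coefficient block (stored row-major)
    into a 1-D vector by visiting positions in the given scan order."""
    return [block[pos] for pos in order]

def inverse_scan(vector, order):
    """Decoder side: place each decoded value back into its 2-D
    position, exactly undoing scan() when the same order is used."""
    block = [0] * len(order)
    for value, pos in zip(vector, order):
        block[pos] = value
    return block
```

Because the decoder derives the same statistics, and hence the same order, as the encoder, no scan order needs to be signaled in the bitstream.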
  • the scanning order may differ for each different predictive mode. That is, statistics are maintained for each different prediction mode.
  • This disclosure is not limited to any particular number of modes, or types of modes.
  • the different modes may define the size of the video block and the type of prediction used in the coding process.
  • a plurality of prediction modes may comprise a plurality of intra prediction modes and a plurality of inter prediction modes.
  • inter coding may support two or more modes, such as an inter prediction mode that corresponds to 4 by 4 transform block size and an inter prediction mode that corresponds to 8 by 8 transform block size. In some cases, several 4 by 4 modes such as predictive (P) and bi-directional predictive (B) modes may be supported.
  • P predictive
  • B bi-directional predictive
  • Inter coding may also support an 8 by 8 P mode and an 8 by 8 B mode.
  • different modes may also be defined for inter coded blocks of luma and chroma information.
  • a variety of different inter coding prediction modes may be defined, and this disclosure is not limited to any particular set of modes.
  • Intra coding may also support a wide range of predictive modes.
  • the intra prediction modes may comprise a plurality of 4 by 4 luma intra prediction modes, a plurality of 8 by 8 luma intra prediction modes, a plurality of 16 by 16 luma intra prediction modes, and a plurality of 8 by 8 chroma intra prediction modes.
  • the intra prediction modes may comprise twenty-six different modes in which predictive blocks are generated based on different types of propagation, adaptation, and/or interpolation of neighboring data within the same coded unit.
  • Intra coding modes may comprise modes such as vertical, horizontal, DC, diagonal downleft, diagonal downright, vertical right, horizontal down, vertical left and horizontal up.
  • Intra coding modes may also define combinations of the modes mentioned above, such as vertical plus horizontal, DC plus vertical, DC plus horizontal, diagonal downleft plus horizontal, diagonal downright plus vertical, vertical right plus horizontal, horizontal down plus vertical, vertical left plus horizontal and horizontal up plus vertical. Details of these particular modes are set forth in the following document, which is incorporated herein by reference: Y. Ye and M. Karczewicz, "Improved Intra Coding," ITU-T Q.6/SG16 VCEG, C257, Geneva, Switzerland, June 2007. In any case, this disclosure is not limited to any particular number of modes, or types of modes.
  • a predictive mode may define the size of the encoded block, the size of the predictive block, the size of the transform used, and the way in which the data of the predictive block is located or generated.
  • FIG. 2 is a block diagram illustrating an example of a video encoder 50 that includes an adaptive scan unit 45 that performs techniques of this disclosure to scan video blocks from a two-dimensional block format to a one dimensional vector format.
  • video encoder 50 receives a current video block within a video frame to be encoded.
  • video encoder 50 includes prediction unit 32, reference frame store 34, block transform unit 38, quantization unit 40, inverse quantization unit 42, inverse transform unit 44, adaptive scan unit 45 and entropy encoding unit 46.
  • a deblocking filter (not shown) may also be included to filter block boundaries to remove blockiness artifacts.
  • Video encoder 50 also includes summer 48 and summer 51.
  • prediction unit 32 compares the video block to be encoded to various blocks in one or more video reference frames. For inter coding, prediction unit 32 predicts the video block to be encoded from already coded neighboring video blocks of the same coded unit. The predicted data may be retrieved from reference frame store 34, which may comprise any type of memory or data storage device to store video blocks reconstructed from previously encoded blocks. Prediction unit 32 may generate prediction modes and prediction vectors, which comprise syntax elements that may be used to identify the prediction blocks used to code the current video block. For intra coding, prediction unit 32 may comprise a spatial prediction unit, while for inter coding, prediction unit 32 may include motion estimation and motion compensation units.
  • Video encoder 50 forms a residual video block by subtracting the prediction block produced by prediction unit 32 from the original video block being encoded.
  • Summer 48 represents a unit or module that performs this subtraction operation.
  • Block transform unit 38 applies a transform, such as a discrete cosine transform (DCT) or a conceptually similar transform, to the residual block, producing a video block comprising residual transform block coefficients.
  • Block transform unit 38 may perform other transforms defined by the H.264 standard, which are conceptually similar to DCT.
  • Quantization unit 40 quantizes the residual transform coefficients to further reduce bit rate. Quantization unit 40, for example, may limit the number of bits used to code each of the coefficients.
  • adaptive scan unit 45 scans the quantized coefficient block from a two-dimensional representation to a one-dimensional vector. Then, following this scanning process, entropy encoding unit 46 encodes the quantized transform coefficients according to an entropy coding methodology, such as CAVLC or CABAC, to further compress the data.
  • adaptive scan unit 45 stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes. Then, following this scanning process, entropy encoding unit 46 encodes the quantized transform coefficients according to an entropy coding methodology.
  • Adaptive scan unit 45 may determine a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes. In addition, adaptive scan unit 45 may adjust the threshold upon adjusting the given scan order.
  • the statistics stored by adaptive scan unit 45 may comprise statistics indicative of the probability of the coefficient values being zero or non-zero.
  • adaptive scan unit 45 determines a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes, and increases or decreases the threshold based on whether the new scan order is the same as a previous scan order.
  • adaptive scan unit 45 may increase the threshold of the given one of the prediction modes, e.g., by a factor of two subject to an upper limit. Similarly, if the new scan order is different than the previous scan order, adaptive scan unit 45 may decrease the threshold of the given one of the prediction modes, e.g., by a factor of two subject to a lower limit. Upon determining the scan order of the given one of the prediction modes, adaptive scan unit 45 may re-set the count value associated with the given one of the prediction modes. Once the coefficient blocks are scanned into a one-dimensional format, entropy encoding unit 46 entropy encodes the quantized transform coefficients.
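Putting the threshold mechanics above together, one possible sketch is shown below. The dict-based state, the default bounds (2 and 32), and the function name are illustrative assumptions; the disclosure itself only specifies factor-of-two changes subject to lower and upper limits.

```python
def evaluate_mode(state, lower=2, upper=32):
    """Evaluate one prediction mode's scan order once its block count
    reaches its threshold. Halve the threshold if the order changed
    (re-evaluate sooner) or double it if it did not (re-evaluate less
    often); in both cases reset the count. Returns True if an
    evaluation took place."""
    if state["count"] < state["threshold"]:
        return False
    new_order = sorted(state["order"], key=lambda pos: -state["prob"][pos])
    if new_order != state["order"]:
        state["order"] = new_order
        state["threshold"] = max(lower, state["threshold"] // 2)
    else:
        state["threshold"] = min(upper, state["threshold"] * 2)
    state["count"] = 0
    return True
```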
  • the encoded video may be transmitted to another device or archived for later transmission or retrieval.
  • Inverse quantization unit 42 and inverse transform unit 44 apply inverse quantization and inverse transformation, respectively, to reconstruct the residual block in the pixel domain.
  • Summer 51 adds the reconstructed residual block to the prediction block produced by prediction unit 32 to produce a reconstructed video block for storage in reference frame store 34.
  • the reconstructed video block may also go through a deblocking filter unit (not shown) before being stored in reference frame store 34.
  • the reconstructed video block may be used by prediction unit 32 as a reference block to inter-code a block in a subsequent video frame or to intra-code a future neighboring block within the same coded unit.
  • FIG. 3 is a block diagram illustrating an example of a video decoder 60, which decodes a video sequence that is encoded in the manner described herein.
  • Video decoder 60 includes an entropy decoding unit 52 that performs the reciprocal decoding function of the encoding performed by entropy encoding unit 46 of FIG. 2.
  • Video decoder 60 also includes an adaptive scan unit 55 that performs inverse scanning that is reciprocal to the scanning performed by adaptive scan unit 45 of FIG. 2.
  • Video decoder 60 may perform intra- and inter- decoding of blocks within video frames.
  • video decoder 60 also includes a prediction unit 54, an inverse quantization unit 56, an inverse transform unit 58, and reference frame store 62.
  • Video decoder 60 also includes summer 64.
  • video decoder 60 also may include a deblocking filter (not shown) that filters the output of summer 64.
  • prediction unit 54 may comprise a spatial prediction unit, while for inter coding, prediction unit 54 may comprise a motion compensation unit.
  • Inverse quantization unit 56 performs inverse quantization, and inverse transform unit 58 performs inverse transforms to change the coefficients of the video blocks back to the pixel domain.
  • Summer 64 combines a prediction block from unit 54 with the reconstructed residual block from inverse transform unit 58 to generate a reconstructed block, which is stored in reference frame store 62. If desired, the reconstructed video block may also go through a deblocking filter unit (not shown) before being stored in reference frame store 62. Decoded video is output from reference frame store 62, and may also be fed back to prediction unit 54 for use in subsequent predictions.
  • entropy decoding unit 52 performs the reciprocal decoding function of the encoding performed by entropy encoding unit 46 of FIG. 2, and adaptive scan unit 55 then performs reciprocal scanning of that performed by adaptive scan unit 45 of FIG. 2.
  • adaptive scan unit 55 performs similar functions to adaptive scan unit 45, but does so in the reverse manner.
  • adaptive scan unit 45 scans coefficient blocks from a two-dimensional format to a one-dimensional format prior to entropy encoding
  • adaptive scan unit 55 scans coefficient blocks from the one-dimensional format to the two-dimensional format following entropy decoding.
  • FIG. 4 is a conceptual diagram illustrating zig-zag scanning of a 4 by 4 coefficient block.
  • FIG. 5 is a conceptual diagram illustrating zig-zag scanning of an 8 by 8 coefficient block.
  • the zig-zag scanning shown in FIGS. 4 and 5 may be performed by adaptive scanning unit 45 at the beginning of the coding process for a coded unit. As discussed in greater detail below, however, the scanning order may adapt based on the actual statistics associated with the already coded coefficient blocks.
  • the scanning order for such zig-zag scanning shown in FIGS. 4 and 5 follows the arrow through video blocks 80 and 90, and the coefficients are labeled in the scanning order. In particular, the numerical values shown in FIGS. 4 and 5 indicate the positions of the coefficients within the respective scanning orders.
  • the techniques of this disclosure are not limited to any particular scanning order or technique.
  • the initial scanning orders used in this disclosure may be the zig-zag scanning orders shown in FIGS. 4 and 5.
  • the initial scanning orders used in this disclosure may be a set of fixed scanning orders that may be specially trained for each one of a plurality of prediction modes. As zig-zag scanning is quite typical, it provides a good starting point for discussion of the adaptive scanning of this disclosure.
  • the scanning order adapts over time based on the actual statistics associated with the already coded coefficient blocks.
  • the scanning order may begin with a conventional scanning order, such as zig-zag scanning, but adapts as statistics accumulate for coefficient blocks coded in the different prediction modes within that coded unit.
  • zig-zag scanning is not the only possible starting point for adaptive scanning.
  • Horizontal scanning, vertical scanning, or any initial scanning technique may be used as a starting point for the adaptive scanning techniques described herein.
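For reference, the zig-zag orders of FIGS. 4 and 5 can be generated programmatically by walking the anti-diagonals of the block. The helper below is a sketch with a hypothetical name, using 0-based row-major positions rather than the 1-based labels of the figures.

```python
def zigzag_order(n):
    """Zig-zag scan order for an n-by-n block as 0-based row-major
    positions: walk each anti-diagonal, alternating direction so that
    odd diagonals run top-right to bottom-left and even diagonals run
    bottom-left to top-right."""
    order = []
    for d in range(2 * n - 1):
        diag = [(i, d - i) for i in range(n) if 0 <= d - i < n]
        if d % 2 == 0:
            diag.reverse()  # even diagonals: bottom-left to top-right
        order.extend(row * n + col for row, col in diag)
    return order
```

For n = 4 this yields the same ordering as the zig-zag sequence (S1, S2, S5, S9, ...) discussed for FIG. 6, shifted to 0-based labels.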
  • FIG. 6 is a conceptual diagram illustrating an exemplary set of statistics (S1-S16) associated with blocks of a particular prediction mode, and an algorithm consistent with the techniques of this disclosure.
  • the initial scanning order of a video block in Mode X may be defined by a zig-zag scanning process as follows: (S1, S2, S5, S9, S6, S3, S4, S7, S10, S13, S14, S11, S8, S12, S15, S16).
  • the numbered coefficients correspond to the statistics that are numbered in statistics block 69 of FIG. 6.
  • Count(mode X) defines a count of a number of blocks coded in Mode X for a given coded unit.
  • Algorithm 60 of FIG. 6 may be invoked at a predefined update interval in the coding of a coded unit (e.g., a frame or slice), such as when macroblock boundaries are encountered. According to this disclosure, once algorithm 60 is invoked, if Count(mode X) is greater than or equal to a pre-defined threshold, scan unit 45 or 55 (FIG. 2 or 3) selects a scan order based on the statistics S1-S16, and then re-sets Count(mode X).
  • the threshold is basically a mechanism that can limit the occurrence of scan order changes, which usually require a computationally intensive sorting process, and can help ensure that sufficient statistics are accumulated for a given mode of video block prior to evaluating the scan order.
  • a new scan order can only be selected for a given mode of video block when the count of the given mode satisfies the threshold of the given mode.
  • the threshold may adjust over time in order to accelerate the occurrence of scan order evaluations when new scan orders are different than previous scan orders, or to reduce the occurrence of scan order evaluations when new scan orders remain the same as previous scan orders.
  • the techniques described herein may perform scan order evaluations more frequently at the beginning of the coded unit until the scan order reaches a steady and desirable state, and may then perform scan order selections less frequently as changes in scan orders become less likely.
  • FIG. 7 is a conceptual diagram illustrating a hypothetical example consistent with this disclosure.
  • coefficients are labeled in items 71A and 71B as c1-c16. Actual coefficient values are shown in block 1 (72), block 2 (73), block 3 (74) and block 4 (75).
  • Blocks 1-4 may comprise blocks associated with the same prediction mode. Blocks 1-4 may be coded in sequence.
  • zig-zag scanning may be used.
  • the blocks are scanned in the following order, which is consistent with the illustration of FIG. 4:
  • statistics 1 represents the statistics of block 1, e.g., with values of one for any coefficient that is non-zero and values of zero for any coefficient that has a value of zero.
  • Statistics 2 represents the combined statistics of blocks 1 and 2, e.g., with normalized probability values indicating whether each coefficient location was non-zero or zero in blocks 1 and 2. In this case, the normalized probability of the location c6 is 0.5, since block 1 had a non-zero coefficient at that location but block 2 had a zero-value coefficient at that location.
  • Statistics 3 represents the combined statistics of blocks 1, 2 and 3 as normalized probabilities
  • statistics 4 represent the combined statistics of blocks 1, 2, 3 and 4 as normalized probabilities.
  • the normalized probabilities may comprise an average of the values of one or zero for every given location, wherein the value of one is given for a particular location of the block if that location of the block defines a non-zero coefficient.
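The running average just described can be sketched as an incremental update; the function name and the flat-list representation are illustrative assumptions.

```python
def update_statistics(prob, blocks_seen, coefficients):
    """Fold one new coefficient block into the per-position probability
    of being non-zero: each position contributes 1 if its coefficient is
    non-zero and 0 otherwise, averaged over all blocks seen so far."""
    n = blocks_seen
    return [(p * n + (1 if c != 0 else 0)) / (n + 1)
            for p, c in zip(prob, coefficients)]
```

This reproduces the c6 behavior above: a location that is non-zero in block 1 but zero in block 2 ends up with a probability of 0.5 after two blocks.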
  • zig-zag scan is used as the initial scanning order and the statistics of the coefficient blocks are initialized to be all zero. Such initializations are given only as an example, and alternative initialization of the scanning order and the coefficient statistics may be used.
  • the threshold is set at a value of 4.
  • At the preset updating interval (e.g., once a macroblock boundary is encountered), the count of 4 blocks is determined to satisfy the threshold of 4.
  • the sorting algorithm is invoked, and scan unit 45 (FIG. 2) may define a new scan order based on statistics 4 (79). Accordingly, the new scan order is as follows:
  • the scanning order changes from an initial scan order (e.g., zig-zag scanning) to a new scan order that promotes non-zero coefficients at the beginning of the one-dimensional vector, and zero coefficients at the end.
  • the probabilities at locations c5 and c9 are higher than that at c2 in statistics 4 (79)
  • c5 and c9 are both scanned before c2 in the new scanning order.
  • the new scan order exhibits stronger directionality in the vertical dimension.
  • the new scan order goes through coefficients in the vertical dimension faster than the coefficients in the horizontal dimension, which is consistent with the statistical distribution of the coefficients of the video blocks 1-4 (72, 73, 74, 75) coded in a given prediction mode.
  • the techniques of this disclosure may promote grouping of non-zero coefficients near the beginning of a scanned one-dimensional vector and zero value coefficients near the end of the scanned one-dimensional vector. This, in turn, can improve the level of compression that can be achieved during entropy coding.
  • thresholds are defined to limit the occurrence of scan order changes, since such changes require a computationally intensive sorting process, and to help ensure that sufficient statistics are accumulated for a given mode of video block prior to evaluating the scan order.
  • a new scan order can only be selected for a given mode of video block when the count of the given mode satisfies the threshold of the given mode.
  • the threshold may adjust upward or downward over time (subject to upper and lower bounds). For example, if a scan order evaluation results in scan order changes, the threshold may be reduced so that a subsequent scan order evaluation occurs more quickly. In this case, since the scan orders are changing, it may be desirable to speed the occurrence of future changes to bring the scan order into a steady state.
  • the threshold may be increased so that a subsequent scan order evaluation takes longer to occur.
  • it may be desirable to reduce the frequency of evaluation of possible scan order changes, since these evaluations require the use of processing resources.
  • These types of threshold adjustments may evaluate scan order changes more frequently until the scan order reaches a steady and desirable state, and may then limit the frequency of scan order evaluations as changes become less likely.
  • FIG. 8 is a flow diagram illustrating a coding (i.e., encoding or decoding) technique consistent with this disclosure.
  • FIG. 8 is illustrated from the perspective of video encoder 50 insofar as the step of entropy coding (step 85) is after the step of scanning (step 83). From the perspective of video decoder 60, the step of entropy coding (step 85) would precede the step of scanning (step 83). For example, from the perspective of video decoder 60, the steps shown in FIG. 8 may be performed in the following order (step 85, step 83, step 81, step 82, step 84). For purposes of simplicity, FIG. 8 is described below from the perspective of video encoder 50.
  • As shown in FIG. 8, adaptive scan unit 45 updates statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes (81), and counts the video blocks associated with each of the prediction modes (82). Adaptive scan unit 45 then scans the coefficient values of the video blocks into one-dimensional coefficient vectors according to scan orders defined for each of the prediction modes (83), and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes (84). Then, following this scanning process, entropy encoding unit 46 encodes the one-dimensional coefficient vectors according to an entropy coding methodology (85).
  • Adaptive scan unit 45 may determine a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes. In addition, adaptive scan unit 45 may adjust the threshold upon determining the given scan order. As discussed in this disclosure, the statistics stored by adaptive scan unit 45 may comprise statistics indicative of the probabilities of the coefficient values being zero or non-zero, or possibly other types of statistics indicative of the probabilities of coefficients values. In one example, adaptive scan unit 45 determines a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes, and increases or decreases the threshold based on whether the new scan order is the same as a previous scan order.
  • adaptive scan unit 45 may increase the threshold, e.g., by a factor of two subject to an upper limit. Similarly, if the new scan order is different than the previous scan order, adaptive scan unit 45 may decrease the threshold, e.g., by a factor of two subject to a lower limit. Upon determining the new scan order, adaptive scan unit 45 may re-set the count value associated with the given one of the prediction modes. Once scanned into a one-dimensional format, entropy encoding unit 46 entropy encodes the coefficient vectors.
  • FIG. 9 is an exemplary flow diagram illustrating an adaptive scanning process that may be performed by scan unit 45 of video encoder 50 (FIG. 2) and scan unit 55 of video decoder 60 (FIG. 3).
  • The process of FIG. 9 may repeat for each coded unit.
  • Coded units may comprise individual frames of a video sequence, portions of frames (such as slices), or another independently decodable unit of a video sequence.
  • Scan unit 45 initializes its scanning order for a new coded unit (91). In other words, at the beginning of a frame or slice, the scanning order is initialized.
  • The count values for every mode are set to zero, and the thresholds are set to an initial value, such as a value of 4 for modes that correspond to 4 by 4 blocks and a value of 2 for modes that correspond to 8 by 8 blocks.
  • Statistics of coefficient blocks for every mode are also initialized, either to all zero or to other statistics based on empirical training.
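The per-mode initialization described above can be sketched as follows. This Python sketch is illustrative only and is not taken from the patent; the container layout and the assumption of nine prediction modes per block size are hypothetical.

```python
# Illustrative sketch of per-coded-unit initialization (step 91 of FIG. 9).
# The assumption of nine prediction modes per block size is hypothetical.
NUM_4X4_MODES = 9
NUM_8X8_MODES = 9

def init_scan_state():
    """Reset counts, thresholds, statistics, and scan orders for a new
    coded unit (e.g., a frame or slice)."""
    state = {}
    for mode in range(NUM_4X4_MODES):
        state[("4x4", mode)] = {
            "count": 0,                # blocks seen in this mode so far
            "thresh": 4,               # initial threshold for 4x4 modes
            "stats": [0] * 16,         # per-position non-zero counts
            "order": list(range(16)),  # start from a fixed (e.g., zig-zag) order
        }
    for mode in range(NUM_8X8_MODES):
        state[("8x8", mode)] = {
            "count": 0,
            "thresh": 2,               # initial threshold for 8x8 modes
            "stats": [0] * 64,
            "order": list(range(64)),
        }
    return state
```

The statistics could equally be initialized from empirical training rather than all zeros, as the bullet above notes.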
  • Scan unit 45 applies its initial scanning order (e.g., zig-zag scanning). In doing so, scan unit 45 collects block coefficient statistics and increments count(mode) for each mode identified for the scanned blocks (92). This process continues until a preset update interval is reached (93). For example, the preset update interval may correspond to a macroblock boundary, or another predetermined interval.
  • At the preset update interval, scan unit 45 evaluates the scan order. In particular, scan unit 45 determines whether count(mode) satisfies the threshold thresh(mode) (94). If not ("no" 94), scan unit 45 considers the other modes, e.g., until all the modes are examined (100). For any given mode, if the count(mode) satisfies the threshold ("yes" 94), scan unit 45 invokes a sorting function, which updates the scan order (95) based on the accumulated statistics for that mode. If the scan order changes as a result of this update ("yes" 96), scan unit 45 reduces thresh(mode) for that mode (97).
  • If the scan order does not change ("no" 96), scan unit 45 increases thresh(mode) for that mode (98).
  • These increases (98) or reductions (97) in the thresholds may be by a factor of two (i.e., multiply by 2 or divide by 2) subject to lower and upper bounds.
  • The lower bounds may be set to a value of 4 for modes that correspond to 4 by 4 blocks and a value of 2 for modes that correspond to 8 by 8 blocks.
  • The initial thresholds may be set at the lower bounds in order to invoke sorting as quickly as possible following initialization.
  • Once the scan order for a given mode is updated (95), the count(mode) for that mode is reset to zero (99).
  • The process then determines whether additional modes need to be examined (100).
  • The process continues as a given coded unit (e.g., a frame or a slice) is coded. That is, a new initialization (91) may occur when the next coded unit is encountered.
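The evaluation loop of FIG. 9 (steps 94 through 100) can be sketched in Python as follows. This is a hedged illustration rather than the patent's implementation; the function name, the state layout, and the stable tie-breaking rule for equal statistics are assumptions.

```python
def update_scan_orders(state, lower, upper):
    """One evaluation pass over all modes at an update interval.

    Each entry of `state` holds "count", "thresh", "stats" (per-position
    non-zero counts), and "order" (the current scan order). `lower` and
    `upper` bound the threshold for the mode's block size.
    """
    for mode_state in state.values():
        if mode_state["count"] < mode_state["thresh"]:
            continue                           # threshold not met ("no" 94)
        old_order = mode_state["order"]
        rank = {pos: i for i, pos in enumerate(old_order)}
        # Sorting function (95): order positions by descending non-zero
        # frequency; ties keep their previous relative order.
        new_order = sorted(old_order,
                           key=lambda p: (-mode_state["stats"][p], rank[p]))
        if new_order != old_order:
            # Order changed ("yes" 96): halve the threshold (97).
            mode_state["thresh"] = max(lower, mode_state["thresh"] // 2)
        else:
            # Order stable ("no" 96): double the threshold (98).
            mode_state["thresh"] = min(upper, mode_state["thresh"] * 2)
        mode_state["order"] = new_order
        mode_state["count"] = 0                # reset after the update (99)
```

Because the threshold doubles while the order is stable and halves when it changes, the sorting work concentrates in the early, rapidly adapting part of each coded unit.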
  • The techniques of this disclosure may be realized in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (i.e., a chip set). Any components, modules or units described have been provided to emphasize functional aspects and do not necessarily require realization by different hardware units.
  • The techniques described herein may be implemented in hardware, software, firmware, or any combination thereof. Any features described as modules or components may be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a computer-readable medium comprising instructions that, when executed, perform one or more of the methods described above.
  • The computer-readable data storage medium may form part of a computer program product, which may include packaging materials.
  • The computer-readable medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic or optical data storage media, and the like.
  • The techniques additionally, or alternatively, may be realized at least in part by a computer-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer.
  • The code may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • The term "processor," as used herein, may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described herein.
  • The functionality described herein may be provided within dedicated software modules or hardware modules configured for encoding and decoding, or incorporated in a combined video encoder-decoder (CODEC). Also, the techniques could be fully implemented in one or more circuits or logic elements.


Abstract

This disclosure describes techniques for scanning coefficients of video blocks, e.g., quantized and transformed coefficients. Rather than using conventional zig-zag scanning, the techniques of this disclosure adapt the scanning order based on statistics associated with previously coded blocks that were coded in the same prediction mode. For each prediction mode, statistics of the coefficients are stored, e.g., indicating probabilities that given coefficients are zero or non-zero. Periodically, adjustments to the scanning order can be made in order to better ensure that non-zero coefficients are grouped together and zero value coefficients are grouped together, which can improve the effectiveness of entropy coding. The techniques of this disclosure provide thresholds and threshold adjustments that can reduce the frequency at which the scanning order adjustments occur, yet still achieve desired improvements in compression due to such scanning order adjustments.

Description

ADAPTIVE COEFFICIENT SCANNING IN VIDEO CODING
[0001] This application claims the benefit of the following U.S. Provisional Applications:
U.S. Provisional Application No. 61/030,443, filed on February 21, 2008, U.S. Provisional Application No. 60/944,470, filed on June 15, 2007, and U.S. Provisional Application No. 60/979,762, filed on October 12, 2007.
The entire content of each of these applications is incorporated herein by reference.
TECHNICAL FIELD
[0002] This disclosure relates to digital video coding and, more particularly, entropy coding of coefficients of video blocks, such as transform coefficients of transformed video blocks.
BACKGROUND
[0003] Digital video capabilities can be incorporated into a wide range of devices, including digital televisions, digital direct broadcast systems, wireless communication devices such as radio telephone handsets, wireless broadcast systems, personal digital assistants (PDAs), laptop or desktop computers, digital cameras, digital recording devices, video gaming devices, video game consoles, and the like. Digital video devices implement video compression techniques, such as MPEG-2, MPEG-4, or H.264/MPEG- 4, Part 10, Advanced Video Coding (AVC), to transmit and receive digital video more efficiently. Video compression techniques perform spatial and temporal prediction to reduce or remove redundancy inherent in video sequences.
[0004] Video compression generally includes spatial prediction and/or temporal prediction. In particular, intra-coding relies on spatial prediction to reduce or remove spatial redundancy between video blocks within a given coded unit, which may comprise a video frame, a slice of a video frame, or the like. In contrast, inter-coding relies on temporal prediction to reduce or remove temporal redundancy between video blocks of successive coded units of a video sequence. For intra-coding, a video encoder performs spatial prediction to compress data based on other data within the same coded unit. For inter-coding, the video encoder performs motion estimation and motion compensation to track the movement of corresponding video blocks of two or more adjacent coded units.
[0005] A coded video block may be represented by prediction information that comprises a prediction mode and a predictive block size, and a residual block of data indicative of differences between the block being coded and a predictive block. In the case of inter-coding, one or more motion vectors are used to identify the predictive block of data, while in the case of intra-coding, the prediction mode can be used to generate the predictive block. Both intra-coding and inter-coding may define several different prediction modes, which may define different block sizes and/or prediction techniques used in the coding.
[0006] The video encoder may apply transform, quantization and entropy coding processes to further reduce the bit rate associated with communication of a residual block. Transform techniques may comprise discrete cosine transforms or conceptually similar processes, such as wavelet transforms, integer transforms, or other types of transforms. In a discrete cosine transform (DCT) process, as an example, the transform process converts a set of pixel values into transform coefficients, which represent the energy of the pixel values in the frequency domain. Quantization is applied to the transform coefficients, and generally involves a process that limits the number of bits associated with any given transform coefficient. Entropy coding comprises one or more processes that collectively compress a sequence of quantized transform coefficients. [0007] Prior to the entropy coding process, a transformed video block of transform coefficients may be serialized by scanning the transform coefficients from a two-dimensional block into a one-dimensional vector. Typically, the scanning is performed in a zig-zag manner such that the transform coefficients in the upper-left part of a video block occur earlier in the one-dimensional vector and the transform coefficients in the lower-right part of a video block occur later. High energy transform coefficients typically reside near the upper left corner following the transform, so zig-zag scanning is effective to group non-zero transform coefficients near the beginning of the one-dimensional vector. The scanning order can significantly affect the level of compression that can be achieved in entropy coding.
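As a rough illustration of the zig-zag order just described (depicted for 4-by-4 and 8-by-8 blocks in FIGS. 4 and 5), the order can be generated by walking the anti-diagonals of the block and alternating the traversal direction. This Python sketch is illustrative only and is not taken from the patent:

```python
def zigzag_order(n):
    """Return the zig-zag scan order for an n-by-n block as a list of
    (row, col) positions, upper-left (low frequency) first."""
    order = []
    for d in range(2 * n - 1):           # d indexes the anti-diagonals
        diag = [(r, d - r) for r in range(n) if 0 <= d - r < n]
        if d % 2 == 0:
            diag.reverse()               # alternate traversal direction
        order.extend(diag)
    return order

# zigzag_order(4) begins (0,0), (0,1), (1,0), (2,0), (1,1), (0,2), ...
```

Applying this order to a transformed block places the low-frequency (typically non-zero) coefficients near the front of the one-dimensional vector.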
[0008] Examples of entropy coding processes include content adaptive variable length coding (CAVLC) and context adaptive binary arithmetic coding (CABAC). CAVLC is one type of entropy coding technique supported by the ITU H.264/MPEG4 Part 10 AVC standard. CAVLC uses variable length coding (VLC) tables in a manner that effectively compresses serialized "runs" of quantized transform coefficients. CABAC is another type of entropy coding technique supported by the ITU H.264/MPEG4 Part 10 AVC standard. CABAC may involve several stages, including binarization, context model selection, and binary arithmetic coding. Many other types of entropy coding techniques also exist, and new entropy coding techniques will likely emerge in the future. [0009] A video decoder may perform inverse entropy coding operations that correspond to the type of entropy coding used in the encoding process to reconstruct the one-dimensional vectors of transform coefficients. Inverse scanning may also be performed at the decoder to form two-dimensional blocks from received one-dimensional vectors of transform coefficients. The video decoder then inverse quantizes and inverse transforms the transform coefficients in a block to reconstruct residual pixel data. The video decoder may use decoded prediction information comprising a prediction mode, prediction size, and, in the case of inter coding, motion information to obtain the predictive video block. The video decoder may then combine the predictive block with the corresponding reconstructed residual block in order to generate a decoded sequence of video.
SUMMARY
[0010] In general, this disclosure describes techniques for scanning coefficients of video blocks, e.g., quantized transform coefficients. On the encoding side, the scanning creates one-dimensional vectors of coefficients from a two-dimensional block of coefficients, and on the decoding side, inverse scanning creates two-dimensional blocks of coefficients from one-dimensional vectors. Rather than using conventional zig-zag scanning, the scanning techniques described in this disclosure adapt the scanning order of coefficients in a block based on statistics associated with previously coded blocks of coefficients that were coded in the same prediction mode. For each prediction mode, statistics of the coefficients are stored, e.g., indicating probabilities that given coefficients have zero or non-zero values. Periodically, adjustments to the scanning order can be made in order to better ensure that non-zero coefficients are grouped together toward the beginning of the one-dimensional vector and zero value coefficients are grouped together toward the end of the one-dimensional vector, which can improve the effectiveness of entropy coding.
[0011] Adjustment of the scanning order can be computationally intensive. Therefore, the techniques of this disclosure may impose thresholds and threshold adjustments that can reduce the frequency at which the scanning order adjustments occur, yet still achieve desired improvements in compression due to scanning order adjustments. The techniques can be performed in a reciprocal manner by the encoder and the decoder. That is, the encoder can use the adaptive scanning techniques prior to entropy encoding to scan coefficients of video blocks from a two-dimensional format to a one-dimensional vector format. The decoder can scan received one-dimensional vectors of coefficients of video blocks to form the two-dimensional blocks of coefficients. Thus, coefficients of video blocks can be represented in a two-dimensional block format or a one-dimensional vector format. The scanning techniques of this disclosure generally define how coefficients of video blocks are converted from the two-dimensional block format to the one-dimensional vector format, and vice versa. Although this disclosure primarily focuses on the scanning of quantized transform coefficients, similar techniques could be used to scan other types of coefficients, such as non-quantized coefficients or pixel values of non-transformed video blocks, e.g., if scanning of the pixel values were implemented.
[0012] In one example, this disclosure provides a method of coding coefficients of video blocks, the method comprising storing statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counting the video blocks associated with each of the prediction modes, scanning the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, evaluating a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and entropy coding the coefficient values.
[0013] In another example, this disclosure provides an apparatus that codes coefficients of video blocks, the apparatus comprising a scan unit and an entropy coding unit. The scan unit stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes. The entropy coding unit entropy codes the coefficient values.
[0014] In another example, this disclosure provides a device that codes coefficients of video blocks, the device comprising means for storing statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, means for counting the video blocks associated with each of the prediction modes, means for scanning the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, means for evaluating a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and means for entropy coding the coefficient values.
[0015] In another example, this disclosure provides a device comprising a scan unit that stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks from two-dimensional blocks to one-dimensional vectors based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes. The device also includes an entropy coding unit that entropy encodes the coefficient values of the one-dimensional vectors, and a wireless transmitter that sends a bitstream comprising the entropy encoded coefficient values.
[0016] In another example, this disclosure provides a device comprising a wireless receiver that receives a bitstream comprising entropy coded coefficient values of video blocks in one-dimensional vectors, an entropy coding unit that entropy decodes the coefficient values of the video blocks, and a scan unit. In this case, the scan unit stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks from the one-dimensional vectors to two-dimensional blocks based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes.
[0017] The techniques described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. If implemented in hardware, an apparatus may be realized as an integrated circuit, a processor, discrete logic, or any combination thereof. If implemented in software, the software may be executed in one or more processors, such as a microprocessor, application specific integrated circuit (ASIC), field programmable gate array (FPGA), or digital signal processor (DSP). The software that executes the techniques may be initially stored in a computer-readable medium and loaded and executed in the processor.
[0018] Accordingly, this disclosure also contemplates a computer-readable medium comprising instructions that upon execution in a video coding device cause the device to code coefficients of video blocks, wherein the instructions cause the device to store statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, count the video blocks associated with each of the prediction modes, scan the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, evaluate a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and entropy code the coefficient values. [0019] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
[0020] FIG. 1 is an exemplary block diagram illustrating a video encoding and decoding system. [0021] FIG. 2 is a block diagram illustrating an example of a video encoder consistent with this disclosure.
[0022] FIG. 3 is a block diagram illustrating an example of a video decoder consistent with this disclosure.
[0023] FIG. 4 is a conceptual diagram illustrating zig-zag scanning of a 4-by-4 video block.
[0024] FIG. 5 is a conceptual diagram illustrating zig-zag scanning of an 8-by-8 video block.
[0025] FIG. 6 is a conceptual diagram illustrating statistics associated with blocks of a particular mode, and an algorithm consistent with the techniques of this disclosure.
[0026] FIG. 7 is a conceptual diagram illustrating a hypothetical example consistent with this disclosure.
[0027] FIGS. 8 and 9 are flow diagrams illustrating techniques consistent with this disclosure.
DETAILED DESCRIPTION
[0028] This disclosure describes techniques for scanning coefficients of video blocks, e.g., quantized transform coefficients. In this disclosure, the term "coefficient block" generally refers to a set of transform coefficients associated with a video block. Coefficient blocks can be represented in a two-dimensional block format or one-dimensional vector format. The scanning techniques of this disclosure define how the coefficient blocks are converted from the two-dimensional block format to the one-dimensional vector format by an encoder, and how the coefficient blocks are converted from the one-dimensional vector format to the two-dimensional block format by a decoder. Although this disclosure primarily describes the scanning techniques as being applied to transformed and quantized video blocks, the scanning techniques described herein may also be applied to convert other types of video data (e.g., a video block in the pixel domain) from two-dimensional block format into one-dimensional vector format. [0029] Conventionally, the scanning of coefficient blocks from the two-dimensional block format to the one-dimensional vector format follows a zig-zag scanning order. In this case, coefficients in the upper-left of a coefficient block occur earlier in the one-dimensional vector and the coefficients in the lower-right of a coefficient block occur later. High energy transform coefficients typically reside near the upper left hand corner following transform. For this reason, zig-zag scanning is an effective way to group non-zero coefficients near the beginning of the one-dimensional vector. The entropy coding unit then typically entropy codes the one-dimensional vector in the form of runs and levels, where runs are the number of zero value transform coefficients in between two non-zero transform coefficients, and the levels represent the values of the non-zero transform coefficients.
Moreover, after the last non-zero transform coefficient is sent for a given coefficient block (e.g., in one-dimensional vector format), the entropy coder typically sends an End-Of-Block (EOB) symbol or a last coefficient flag to indicate this is the last non-zero transform coefficient in the block. By grouping nonzero transform coefficients towards the beginning of the one-dimensional vectors, higher compression can be achieved because smaller values of runs can be coded and also because the EOB symbol or the last coefficient flag can be sent more quickly. Unfortunately, zig-zag scanning does not always achieve the most effective grouping of coefficients.
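The run-level representation and the EOB behavior described above can be sketched as follows. This Python fragment is a simplified illustration of the general idea, not the CAVLC or CABAC syntax:

```python
def run_level_encode(vec):
    """Represent a scanned 1-D coefficient vector as (run, level) pairs,
    where `run` counts the zeros preceding each non-zero `level`.
    Trailing zeros after the last non-zero coefficient are dropped; an
    entropy coder signals that point with an EOB symbol or last
    coefficient flag instead of coding the remaining runs."""
    pairs, run = [], 0
    for coeff in vec:
        if coeff == 0:
            run += 1
        else:
            pairs.append((run, coeff))
            run = 0
    return pairs

# run_level_encode([7, 0, 0, -2, 1, 0, 0, 0]) -> [(0, 7), (2, -2), (0, 1)]
```

Grouping the non-zero coefficients toward the front of the vector shrinks the run values and moves the EOB point earlier, which is exactly what the adaptive scan order aims to achieve.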
[0030] Rather than use conventional zig-zag scanning, the techniques of this disclosure adapt the scanning order based on statistics associated with previously coded blocks that were coded in the same prediction mode. For each prediction mode, statistics of the transform coefficients are stored, e.g., indicating probabilities that transform coefficients at given positions are zero or non-zero. Periodically, adjustments to the scanning order can be made in order to better ensure that non-zero transform coefficients are grouped together toward the beginning of the one-dimensional vector and zero value coefficients are grouped together toward the end of the one-dimensional vector, which can improve the effectiveness of entropy coding. The adaptive scanning techniques may occur for each separate coded unit, e.g., each frame, slice, or other type of coded unit. Coefficient blocks of a coded unit may initially be scanned in a fixed way (e.g., in a zig-zag scanning order or another fixed scanning order), but may quickly adapt to a different scanning order if statistics of coefficient blocks for a given prediction mode indicate that a different scanning order would be more effective to group non-zero and zero value coefficients.
[0031] Adjustment of the scanning order, however, can be computationally intensive. Therefore, the techniques of this disclosure impose thresholds and threshold adjustments that can reduce the frequency at which the scanning order adjustments occur, yet still achieve desired improvements in compression due to such scanning order adjustments. The techniques can be performed in a reciprocal manner by the encoder and the decoder. That is, the encoder can use the adaptive scanning techniques prior to entropy encoding to scan coefficients of video blocks from two-dimensional format to one-dimensional vectors. The decoder can inverse scan received one-dimensional vectors of coefficients of video blocks following an entropy decoding process to recreate the coefficient blocks in the two-dimensional format. Again, the phrase "coefficient block" generally refers to a set of transformed coefficients represented in either a two-dimensional block format or a one-dimensional vector format.
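The reciprocal relationship between the encoder's scan and the decoder's inverse scan can be sketched as follows. This is illustrative Python, not the patent's implementation; the scan order is assumed to be a list of (row, col) positions:

```python
def scan(block2d, order):
    """Encoder side: serialize a 2-D coefficient block into a 1-D vector
    by visiting positions in scan `order`."""
    return [block2d[r][c] for r, c in order]

def inverse_scan(vec, order, n):
    """Decoder side: rebuild the n-by-n block from the 1-D vector using
    the same scan order. Because both sides derive the order from the
    same statistics, the order itself never needs to be transmitted."""
    block = [[0] * n for _ in range(n)]
    for value, (r, c) in zip(vec, order):
        block[r][c] = value
    return block
```

As long as the encoder and decoder update `order` identically, `inverse_scan(scan(b, order), order, n)` reproduces the original block.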
[0032] FIG. 1 is a block diagram illustrating an exemplary video encoding and decoding system 10 that may implement techniques of this disclosure. As shown in FIG. 1, system 10 includes a source device 12 that transmits encoded video to a destination device 16 via a communication channel 15. Source device 12 and destination device 16 may comprise any of a wide range of devices. In some cases, source device 12 and destination device 16 comprise wireless communication device handsets, such as so-called cellular or satellite radiotelephones. The techniques of this disclosure, however, which apply more generally to the adaptive scanning of coefficients, are not necessarily limited to wireless applications or settings.
[0033] In the example of FIG. 1, source device 12 may include a video source 20, a video encoder 22, a modulator/demodulator (modem) 23 and a transmitter 24. Destination device 16 may include a receiver 26, a modem 27, a video decoder 28, and a display device 30. In accordance with this disclosure, video encoder 22 of source device 12 may be configured to perform adaptive scanning of coefficients prior to entropy encoding to form a one-dimensional set of data. Similarly, video decoder 28 of destination device 16 may be configured to perform adaptive scanning of coefficients following entropy decoding to produce a two-dimensional set of data. Video decoder 28 need not receive any indication of the scanning order applied by video encoder 22; rather, the scanning order can be derived in essentially the same way at both video encoder 22 and video decoder 28.
[0034] The illustrated system 10 of FIG. 1 is merely exemplary. The scanning techniques of this disclosure may be performed by any encoding or decoding device that supports any of a wide variety of entropy coding methodologies, such as content adaptive variable length coding (CAVLC), context adaptive binary arithmetic coding (CABAC), or other entropy coding methodologies. Source device 12 and destination device 16 are merely examples of such coding devices.
[0035] In accordance with this disclosure, video encoder 22 and video decoder 28 may store statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, and may count the video blocks associated with each of the prediction modes. Video encoder 22 and video decoder 28 scan the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, evaluate a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes, and entropy code the coefficient values. Again, on the encoding side, the scanning precedes entropy encoding, while on the decoding side, the scanning follows the entropy decoding.
[0036] In general, source device 12 generates coded video data for transmission to destination device 16. In some cases, however, devices 12, 16 may operate in a substantially symmetrical manner. For example, each of devices 12, 16 may include video encoding and decoding components. Hence, system 10 may support one-way or two-way video transmission between video devices 12, 16, e.g., for video streaming, video playback, video broadcasting, or video telephony.
[0037] Video source 20 of source device 12 may include a video capture device, such as a video camera, a video archive containing previously captured video, or a video feed from a video content provider. As a further alternative, video source 20 may generate computer graphics-based data as the source video, or a combination of live video, archived video, and computer-generated video. In some cases, if video source 20 is a video camera, source device 12 and destination device 16 may form so-called camera phones or video phones. In each case, the captured, pre-captured or computer-generated video may be encoded by video encoder 22. The encoded video information may then be modulated by modem 23 according to a communication standard, e.g., such as code division multiple access (CDMA) or another communication standard or technique, and transmitted to destination device 16 via transmitter 24.
[0038] Receiver 26 of destination device 16 receives information over channel 15, and modem 27 demodulates the information. Video decoder 28 may perform entropy decoding and adaptive scanning as part of the reconstruction of the video sequence. The decoding process, like the encoding process, uses the techniques of this disclosure in order to support improved levels of data compression. Display device 30 displays the decoded video data to a user, and may comprise any of a variety of display devices such as a cathode ray tube (CRT), a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device.
[0039] In the example of FIG. 1, communication channel 15 may comprise any wireless or wired communication medium, such as a radio frequency (RF) spectrum or one or more physical transmission lines, or any combination of wireless and wired media. Communication channel 15 may form part of a packet-based network, such as a local area network, a wide-area network, or a global network such as the Internet. Communication channel 15 generally represents any suitable communication medium, or collection of different communication media, for transmitting video data from source device 12 to destination device 16.
[0040] Video encoder 22 and video decoder 28 may operate according to a video compression standard that supports CAVLC, CABAC or another entropy coding methodology, such as the ITU-T H.264 standard, alternatively referred to as MPEG-4, Part 10, Advanced Video Coding (AVC). However, the techniques are described with reference to this standard merely for purposes of illustration. Such techniques may be readily applied to any of a variety of other video coding standards, such as those defined by the Moving Picture Experts Group (MPEG) in MPEG-1, MPEG-2 and MPEG-4, the ITU-T H.263 standard, the Society of Motion Picture and Television Engineers (SMPTE) 421M video CODEC standard (commonly referred to as "VC-1"), the standard defined by the Audio Video Coding Standard Workgroup of China (commonly referred to as "AVS"), as well as any other video coding standard defined by a standards body or developed by an organization as a proprietary standard.
[0041] Although not shown in FIG. 1, in some aspects, video encoder 22 and video decoder 28 may each be integrated with an audio encoder and decoder, and may include appropriate MUX-DEMUX units, or other hardware and software, to handle encoding of both audio and video in a common data stream or separate data streams. If applicable, MUX-DEMUX units may conform to the ITU H.223 multiplexer protocol, or other protocols such as the user datagram protocol (UDP). [0042] The ITU H.264/MPEG-4 Part 10 AVC standard was formulated by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG) as the product of a collective partnership known as the Joint Video Team (JVT). In some aspects, the techniques described in this disclosure may be applied to devices that generally conform to the H.264 standard. The H.264 standard is described in ITU-T Recommendation H.264, Advanced Video Coding for generic audiovisual services, by the ITU-T Study Group, and dated March, 2005, which may be referred to herein as the H.264 standard or H.264 specification, or the H.264/AVC standard or specification. The Joint Video Team (JVT) continues to work on extensions to H.264/AVC.
[0043] Video encoder 22 and video decoder 28 each may be implemented as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware or any combinations thereof. Each of video encoder 22 and video decoder 28 may be included in one or more encoders or decoders, either of which may be integrated as part of a combined encoder/decoder (CODEC) in a respective mobile device, subscriber device, broadcast device, server, or the like.
[0044] A video sequence includes a series of video frames. In some cases, a video sequence can be arranged as a group of pictures (GOP). Video encoder 22 operates on video blocks within individual video frames in order to encode the video data. The video blocks may have fixed or varying sizes, and may differ in size according to a specified coding standard. Each video frame may include a series of slices. Each slice may include a series of macroblocks, which may be arranged into even smaller blocks. Macroblocks typically refer to 16 by 16 blocks of data. The ITU-T H.264 standard supports intra prediction in various block sizes, such as 16 by 16, 8 by 8, or 4 by 4 for luma components, and 8x8 for chroma components, as well as inter prediction in various block sizes, such as 16 by 16, 16 by 8, 8 by 16, 8 by 8, 8 by 4, 4 by 8 and 4 by 4 for luma components and corresponding scaled sizes for chroma components. In this disclosure, the term video blocks may refer to blocks of coefficients, e.g., transform coefficients, following a transform process such as discrete cosine transform or a conceptually similar transformation process in which a set of pixel values are transformed into the frequency domain. The transform coefficients may be quantized. The scanning techniques of this disclosure typically apply with respect to quantized transform coefficients, but may be applicable to non-quantized transform coefficients in some implementations. Moreover, the scanning techniques of this disclosure may also be applicable to blocks of pixel values (i.e., without the transform process), which may or may not be quantized blocks of pixel values. The term "coefficient" is used broadly herein to represent values of video blocks, including not only transform coefficients of coefficient blocks, but also pixel values of non-transformed video blocks. [0045] Larger video blocks, such as macroblocks, may be divided into smaller sized video blocks.
Smaller video blocks can provide better resolution, and may be used for locations of a video frame that include high levels of detail. In general, macroblocks (MBs) and the various smaller blocks may all be considered to be video blocks. Video frames may comprise decodable units, or may be divided into smaller decodable units, such as "slices." That is, a slice may be considered to be a series of video blocks, such as MBs and/or smaller sized blocks, and each slice may be an independently decodable unit of a video frame.
[0046] After prediction, a transform may be performed on the 8x8 residual block of pixels or 4x4 residual block of pixels, and an additional transform may be applied to the DC coefficients of the 4x4 blocks of pixels for chroma components or, if an intra_16x16 prediction mode is used, for luma components. Following the transform, the data may be referred to as coefficient blocks, or transformed video blocks. Following the transform, the coefficient blocks contain transform coefficients, rather than pixel values. Again, the term "coefficients" generally refers to transform coefficients, but may alternatively refer to other types of coefficients or values (e.g., pixel values without the transform process).
[0047] Following intra- or inter-based predictive coding and transformation techniques (such as the 4x4 or 8x8 integer transform used in H.264/AVC or a discrete cosine transform (DCT)), quantization may be performed. Other transformation techniques, such as wavelet-based compression, may be used. Quantization generally refers to a process in which coefficients are quantized to possibly reduce the amount of data used to represent the coefficients. The quantization process may reduce the bit depth associated with some or all of the coefficients. For example, an 8-bit value may be rounded down to a 7-bit value during quantization.
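The bit-depth reduction mentioned above can be sketched as a simple right shift. This is an illustrative fragment only, not the H.264/AVC quantization formula (which involves a quantization parameter, scaling matrices and rounding offsets); the function name and shift amount are hypothetical:

```python
def reduce_bit_depth(coeff: int, shift: int = 1) -> int:
    """Discard the least-significant bits of a coefficient.

    Illustrative only: real quantizers in video codecs scale by a
    step size derived from a quantization parameter (QP) and apply
    a rounding offset, rather than shifting directly.
    """
    return coeff >> shift

# An 8-bit magnitude (0-255) becomes a 7-bit magnitude (0-127).
print(reduce_bit_depth(255))  # 127
```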
[0048] Following quantization, scanning and entropy coding may be performed according to the techniques described herein. In particular, video blocks of transform coefficients, such as 4 by 4 video blocks, 8 by 8 video blocks, or possibly other sized blocks such as 16 by 16 video blocks, can be scanned from a two-dimensional format to a one-dimensional format. The scanning order may be initialized for each coded unit and may begin in a conventional manner (e.g., the zig-zag scanning order). According to this disclosure, the scanning order may be adaptive. In particular, the scanning order may adapt for video blocks of one or more prediction modes based on statistics associated with such video blocks. The statistics may comprise a count of the number of video blocks encoded in each respective prediction mode, and a set of probabilities associated with coefficients of video blocks encoded in each prediction mode. The probabilities may comprise an indication of the likelihood that a given coefficient value in each location of the video block has a value of zero, or has a non-zero value. Alternatively, the probabilities may comprise more detailed probabilities indicative of the actual values at each location, or another type of statistical probability measure associated with coefficient values.
[0049] One or more thresholds may be defined relative to the count values. At periodic intervals (such as when macroblock boundaries are encountered), the scan order associated with the different modes of video blocks can be evaluated. When the scan order is evaluated, if the count value associated with a given prediction mode satisfies the threshold of the given prediction mode, then the scan order for that mode may be examined and possibly changed to reflect the statistics of video blocks coded in the given prediction mode. In particular, the scan order can be defined so that coefficients are scanned in the order of their probability of having non-zero values. That is, coefficient locations that have a higher probability of being non-zero are scanned prior to coefficient locations that have a lower probability of being non-zero. In this way, a conventional scanning order (such as a zig-zag scanning order) may adapt to a scanning order that groups non-zero coefficients more toward the beginning of the one- dimensional vector representations of the coefficient blocks. The decoder can calculate the same statistics and thereby determine the scanning order that was used in the encoding process. Accordingly, reciprocal adaptive scanning orders can be applied by the decoder in order to convert the one-dimensional vector representation of the coefficient blocks back to the two-dimensional block format.
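The per-mode statistics, counts and thresholded evaluation described in the two paragraphs above can be sketched as follows. All names (ScanState, update, evaluate) are hypothetical, and the statistics here are the simple zero/non-zero probabilities the disclosure mentions as one option:

```python
# Hypothetical sketch of per-prediction-mode adaptive scan state.
class ScanState:
    def __init__(self, size: int, init_order: list[int], threshold: int):
        self.prob = [0.0] * size   # running non-zero probability per position
        self.count = 0             # blocks coded in this mode since last reset
        self.order = init_order    # current scan order (list of positions)
        self.threshold = threshold

    def update(self, block: list[int]) -> None:
        """Fold one coded block's zero/non-zero pattern into the statistics."""
        self.count += 1
        for i, c in enumerate(block):
            ind = 1.0 if c != 0 else 0.0
            self.prob[i] += (ind - self.prob[i]) / self.count  # running mean

    def evaluate(self) -> None:
        """At an update interval: if the count satisfies the threshold,
        re-sort the scan order by non-zero probability, then reset."""
        if self.count >= self.threshold:
            # Positions more likely to be non-zero are scanned first.
            self.order = sorted(range(len(self.prob)),
                                key=lambda i: -self.prob[i])
            self.count = 0

# Tiny 4-coefficient example: positions 0 and 2 are usually non-zero.
state = ScanState(4, [0, 1, 2, 3], threshold=2)
state.update([7, 0, 3, 0])
state.update([5, 0, 4, 1])
state.evaluate()
print(state.order)  # [0, 2, 3, 1]
```

Because the same update and evaluation rules run in the decoder, no scan order needs to be signaled in the bitstream.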
[0050] As noted, the scanning order (and adaptive changes thereto) may differ for each different predictive mode. That is, statistics are maintained for each different prediction mode. This disclosure is not limited to any particular number of modes, or types of modes. The different modes may define the size of the video block and the type of prediction used in the coding process. A plurality of prediction modes may comprise a plurality of intra prediction modes and a plurality of inter prediction modes. [0051] As an example, inter coding may support two or more modes, such as an inter prediction mode that corresponds to 4 by 4 transform block size and an inter prediction mode that corresponds to 8 by 8 transform block size. In some cases, several 4 by 4 modes such as predictive (P) and bi-directional predictive (B) modes may be supported. Inter coding may also support an 8 by 8 P mode and an 8 by 8 B mode. Furthermore, different modes may also be defined for inter coded blocks of luma and chroma information. A variety of different inter coding prediction modes may be defined, and this disclosure is not limited to any particular set of modes.
[0052] Intra coding may also support a wide range of predictive modes. For example, the intra prediction modes may comprise a plurality of 4 by 4 luma intra prediction modes, a plurality of 8 by 8 luma intra prediction modes, a plurality of 16 by 16 luma intra prediction modes, and a plurality of 8 by 8 chroma intra prediction modes. As an example, the intra prediction modes may comprise twenty-six different modes in which predictive blocks are generated based on different types of propagation, adaptation, and/or interpolation of neighboring data within the same coded unit. [0053] Intra coding modes may comprise modes such as vertical, horizontal, DC, diagonal downleft, diagonal downright, vertical right, horizontal down, vertical left and horizontal up. Each of these different modes defines the way in which predictive blocks are generated based on neighboring data within the same coded unit. Intra coding modes may also define combinations of the modes mentioned above, such as vertical plus horizontal, DC plus vertical, DC plus horizontal, diagonal downleft plus horizontal, diagonal downright plus vertical, vertical right plus horizontal, horizontal down plus vertical, vertical left plus horizontal and horizontal up plus vertical. Details of these particular modes are set forth in the following document, which is incorporated herein by reference: Y. Ye and M. Karczewicz, "Improved Intra Coding," ITU-T Q.6/SG16 VCEG, C257, Geneva, Switzerland, June 2007. In any case, this disclosure is not limited to any particular number of modes, or types of modes. Basically, a predictive mode may define the size of the encoded block, the size of the predictive block, the size of the transform used, and the way in which the data of the predictive block is located or generated.
[0054] FIG. 2 is a block diagram illustrating an example of a video encoder 50 that includes an adaptive scan unit 45 that performs techniques of this disclosure to scan video blocks from a two-dimensional block format to a one-dimensional vector format. As shown in FIG. 2, video encoder 50 receives a current video block within a video frame to be encoded. In the example of FIG. 2, video encoder 50 includes prediction unit 32, reference frame store 34, block transform unit 38, quantization unit 40, inverse quantization unit 42, inverse transform unit 44, adaptive scan unit 45 and entropy encoding unit 46. A deblocking filter (not shown) may also be included to filter block boundaries to remove blockiness artifacts. Video encoder 50 also includes summer 48 and summer 51.
[0055] For inter coding, prediction unit 32 compares the video block to be encoded to various blocks in one or more video reference frames. For intra coding, prediction unit 32 predicts the video block to be encoded from already coded neighboring video blocks of the same coded unit. The predicted data may be retrieved from reference frame store 34, which may comprise any type of memory or data storage device to store video blocks reconstructed from previously encoded blocks. Prediction unit 32 may generate prediction modes and prediction vectors, which comprise syntax elements that may be used to identify the prediction blocks used to code the current video block. For intra coding, prediction unit 32 may comprise a spatial prediction unit, while for inter coding, prediction unit 32 may include motion estimation and motion compensation units. [0056] Video encoder 50 forms a residual video block by subtracting the prediction block produced by prediction unit 32 from the original video block being encoded. Summer 48 represents a unit or module that performs this subtraction operation. Block transform unit 38 applies a transform, such as a discrete cosine transform (DCT) or a conceptually similar transform, to the residual block, producing a video block comprising residual transform block coefficients. Block transform unit 38, for example, may perform other transforms defined by the H.264 standard, which are conceptually similar to DCT.
[0057] Quantization unit 40 quantizes the residual transform coefficients to further reduce bit rate. Quantization unit 40, for example, may limit the number of bits used to code each of the coefficients. After quantization, adaptive scan unit 45 scans the quantized coefficient block from a two-dimensional representation to a one-dimensional vector. Then, following this scanning process, entropy encoding unit 46 encodes the quantized transform coefficients according to an entropy coding methodology, such as CAVLC or CABAC, to further compress the data. The adaptive scanning performed by adaptive scan unit 45, consistent with this disclosure, is outlined in greater detail below. [0058] Briefly, adaptive scan unit 45 stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes. Then, following this scanning process, entropy encoding unit 46 encodes the quantized transform coefficients according to an entropy coding methodology.
[0059] Adaptive scan unit 45 may determine a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes. In addition, adaptive scan unit 45 may adjust the threshold upon adjusting the given scan order. The statistics stored by adaptive scan unit 45 may comprise statistics indicative of the probability of the coefficient values being zero or non-zero. In one example, adaptive scan unit 45 determines a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes, and increases or decreases the threshold based on whether the new scan order is the same as a previous scan order. For example, if the new scan order is the same as a previous scan order, adaptive scan unit 45 may increase the threshold of the given one of the prediction modes, e.g., by a factor of two subject to an upper limit. Similarly, if the new scan order is different than the previous scan order, adaptive scan unit 45 may decrease the threshold of the given one of the prediction modes, e.g., by a factor of two subject to a lower limit. Upon determining the scan order of the given one of the prediction modes, adaptive scan unit 45 may re-set the count value associated with the given one of the prediction modes. Once the coefficient blocks are scanned into a one-dimensional format, entropy encoding unit 46 entropy encodes the quantized transform coefficients. [0060] Following the entropy coding by entropy encoding unit 46, the encoded video may be transmitted to another device or archived for later transmission or retrieval. Inverse quantization unit 42 and inverse transform unit 44 apply inverse quantization and inverse transformation, respectively, to reconstruct the residual block in the pixel domain. 
Summer 51 adds the reconstructed residual block to the prediction block produced by prediction unit 32 to produce a reconstructed video block for storage in reference frame store 34. If desired, the reconstructed video block may also go through a deblocking filter unit (not shown) before being stored in reference frame store 34. The reconstructed video block may be used by prediction unit 32 as a reference block to inter-code a block in a subsequent video frame or to intra-code a future neighboring block within the same coded unit.
[0061] FIG. 3 is a block diagram illustrating an example of a video decoder 60, which decodes a video sequence that is encoded in the manner described herein. Video decoder 60 includes an entropy decoding unit 52 that performs the reciprocal decoding function of the encoding performed by entropy encoding unit 46 of FIG. 2. Video decoder 60 also includes an adaptive scan unit 55 that performs inverse scanning that is reciprocal to the scanning performed by adaptive scan unit 45 of FIG. 2. [0062] Video decoder 60 may perform intra- and inter- decoding of blocks within video frames. In the example of FIG. 3, video decoder 60 also includes a prediction unit 54, an inverse quantization unit 56, an inverse transform unit 58, and reference frame store 62. Video decoder 60 also includes summer 64. Optionally, video decoder 60 also may include a deblocking filter (not shown) that filters the output of summer 64. [0063] For intra coding, prediction unit 54 may comprise a spatial prediction unit, while for inter coding, prediction unit 54 may comprise a motion compensation unit. Inverse quantization unit 56 performs inverse quantization, and inverse transform unit 58 performs inverse transforms to change the coefficients of the video blocks back to the pixel domain. Summer 64 combines a prediction block from unit 54 with the reconstructed residual block from inverse transform unit 58 to generate a reconstructed block, which is stored in reference frame store 62. If desired, the reconstructed video block may also go through a deblocking filter unit (not shown) before being stored in reference frame store 62. Decoded video is output from reference frame store 62, and may also be fed back to prediction unit 54 for use in subsequent predictions.
[0064] As noted, entropy decoding unit 52 performs the reciprocal decoding function of the encoding performed by entropy encoding unit 46 of FIG. 2, and adaptive scan unit 55 then performs scanning that is reciprocal to the scanning performed by adaptive scan unit 45 of FIG. 2. Like adaptive scan unit 45 of FIG. 2, adaptive scan unit 55 of FIG. 3 stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes, counts the video blocks associated with each of the prediction modes, scans the coefficient values of the video blocks based on scan orders defined for each of the prediction modes, and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes. Basically, adaptive scan unit 55 performs similar functions to adaptive scan unit 45, but does so in the reverse manner. Thus, whereas adaptive scan unit 45 scans coefficient blocks from a two-dimensional format to a one-dimensional format prior to entropy encoding, adaptive scan unit 55 scans coefficient blocks from the one-dimensional format to the two-dimensional format following entropy decoding.
[0065] FIG. 4 is a conceptual diagram illustrating zig-zag scanning of a 4 by 4 coefficient block. FIG. 5 is a conceptual diagram illustrating zig-zag scanning of an 8 by 8 coefficient block. The zig-zag scanning shown in FIGS. 4 and 5 may be performed by adaptive scan unit 45 at the beginning of the coding process for a coded unit. As discussed in greater detail below, however, the scanning order may adapt based on the actual statistics associated with the already coded coefficient blocks. [0066] The scanning order for such zig-zag scanning shown in FIGS. 4 and 5 follows the arrow through video blocks 80 and 90, and the coefficients are labeled in the scanning order. In particular, the numerical values shown in FIGS. 4 and 5 indicate positions of the coefficients within a sequential one-dimensional vector, and do not represent values of the coefficients. The techniques of this disclosure are not limited to any particular scanning order or technique at initialization. For example, the initial scanning orders used in this disclosure may be the zig-zag scanning orders shown in FIGS. 4 and 5. Or, alternatively, the initial scanning orders used in this disclosure may be a set of fixed scanning orders that may be specially trained for each one of a plurality of prediction modes. As zig-zag scanning is quite typical, it provides a good starting point for discussion of the adaptive scanning of this disclosure. Again, according to this disclosure, the scanning order adapts over time based on the actual statistics associated with the already coded coefficient blocks. For each coded unit, the scanning order may begin with a conventional scanning order, such as zig-zag scanning, but adapts as statistics accumulate for coefficient blocks coded in the different prediction modes within that coded unit. As noted above, however, zig-zag scanning is not the only possible starting point for adaptive scanning.
Horizontal scanning, vertical scanning, or any initial scanning technique may be used as a starting point for the adaptive scanning techniques described herein.
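The zig-zag order of FIGS. 4 and 5 can be generated by traversing the anti-diagonals of the block and alternating direction. This construction is a common one and is offered here only as an illustration, not as a quotation of any scan table from the disclosure:

```python
def zigzag_order(n: int) -> list[tuple[int, int]]:
    """Return the (row, col) positions of an n x n block in zig-zag
    scan order: walk each anti-diagonal (row + col = d), reversing
    direction on alternate diagonals."""
    order = []
    for d in range(2 * n - 1):  # anti-diagonal index d = row + col
        cells = [(r, d - r) for r in range(n) if 0 <= d - r < n]
        if d % 2 == 0:
            cells.reverse()  # even diagonals run bottom-left to top-right
        order.extend(cells)
    return order

# The first few positions of a 4x4 zig-zag scan:
print(zigzag_order(4)[:6])  # [(0, 0), (0, 1), (1, 0), (2, 0), (1, 1), (0, 2)]
```

The same function with n = 8 yields the 64-entry order corresponding to FIG. 5.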
[0067] FIG. 6 is a conceptual diagram illustrating an exemplary set of statistics (S1-S16) associated with blocks of a particular prediction mode, and an algorithm consistent with the techniques of this disclosure. As shown, the initial scanning order of a video block in Mode X may be defined by a zig-zag scanning process as follows: (S1, S2, S5, S9, S6, S3, S4, S7, S10, S13, S14, S11, S8, S12, S15, S16). In this case, the numbered coefficients correspond to the statistics that are numbered in statistics block 69 of FIG. 6. Count(mode X) defines a count of a number of blocks coded in Mode X for a given coded unit. With each increment of Count(mode X), the statistics (S1-S16) may change to reflect the statistics of the coefficients, as affected by the new block in mode X. [0068] Algorithm 60 of FIG. 6 may be invoked at a predefined update interval in the coding of a coded unit (e.g., a frame or slice), such as when macroblock boundaries are encountered. According to this disclosure, once algorithm 60 is invoked, if Count(mode X) is greater than or equal to a pre-defined threshold, scan unit 45 or 55 (FIG. 2 or 3) selects a scan order based on the statistics S1-S16, and then re-sets Count(mode X). If the scan order changes, scan unit 45 or 55 may adjust the threshold downward, and if the scan order does not change, scan unit 45 or 55 may adjust the threshold upward. [0069] The threshold is basically a mechanism that can limit the occurrence of scan order changes, which usually require a computationally intensive sorting process, and can ensure that sufficient statistics are accumulated for a given mode of video block prior to evaluating the scan order. In particular, a new scan order can only be selected for a given mode of video block when the count of the given mode satisfies the threshold of the given mode.
Furthermore, the threshold may adjust over time in order to accelerate the occurrence of scan order evaluations when new scan orders are different than previous scan orders, or to reduce the occurrence of scan order evaluations when new scan orders remain the same as previous scan orders. In this way, for each of a plurality of prediction modes, the techniques described herein may perform scan order evaluations more frequently at the beginning of the coded unit until the scan order reaches a steady and desirable state, and may then perform scan order selections less frequently as changes in scan orders become less likely.
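The threshold adjustment described above can be sketched as follows. The factor of two comes from the disclosure (paragraph [0059]); the particular upper and lower limits and the function name are hypothetical:

```python
# Hypothetical bounds on the evaluation threshold; the disclosure only
# says the adjustments are "subject to" an upper and a lower limit.
MIN_T, MAX_T = 2, 64

def adjust_threshold(threshold: int, order_changed: bool) -> int:
    """Halve the threshold when the scan order changed (so the next
    evaluation happens sooner), double it when the order is steady
    (so evaluations, and their sorting cost, happen less often)."""
    if order_changed:
        return max(MIN_T, threshold // 2)
    return min(MAX_T, threshold * 2)

t = 8
t = adjust_threshold(t, order_changed=True)   # order changed  -> 4
t = adjust_threshold(t, order_changed=False)  # order steady   -> 8
t = adjust_threshold(t, order_changed=False)  # order steady   -> 16
print(t)  # 16
```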
[0070] FIG. 7 is a conceptual diagram illustrating a hypothetical example consistent with this disclosure. In this example, coefficients are labeled in items 71A and 71B as c1-c16. Actual coefficient values are shown in block 1 (72), block 2 (73), block 3 (74) and block 4 (75). Blocks 1-4 may comprise blocks associated with the same prediction mode. Blocks 1-4 may be coded in sequence.
[0071] Initially, zig-zag scanning may be used. In this case, the blocks are scanned in the following order, which is consistent with the illustration of FIG. 4:
(c1, c2, c5, c9, c6, c3, c4, c7, c10, c13, c14, c11, c8, c12, c15, c16).
Assuming the statistics of coefficient blocks are initialized to be all zero, statistics 1 (76) represents the statistics of block 1, e.g., with values of one for any coefficient that is non-zero and values of zero for any coefficient that has a value of zero. Statistics 2 (77) represents the combined statistics of blocks 1 and 2, e.g., with normalized probability values indicative of whether that coefficient location was one or zero in blocks 1 and 2. In this case, the normalized probability of the location c6 is 0.5 since block 1 had a non-zero coefficient at that location, but block 2 had a zero-value coefficient at that location. Statistics 3 (78) represents the combined statistics of blocks 1, 2 and 3 as normalized probabilities, and statistics 4 (79) represents the combined statistics of blocks 1, 2, 3 and 4 as normalized probabilities. The normalized probabilities may comprise an average of the values of one or zero for every given location, wherein the value of one is given for a particular location of the block if that location of the block defines a non-zero coefficient. In the descriptions above, zig-zag scan is used as the initial scanning order and the statistics of the coefficient blocks are initialized to be all zero. Such initializations are given only as an example, and alternative initialization of the scanning order and the coefficient statistics may be used. [0072] In the example of FIG. 7, one may assume that the threshold is set at a value of 4. In this case, upon coding the fourth block 75, once the preset updating interval is encountered (e.g., once a macroblock boundary is encountered), the count of 4 blocks is determined to satisfy the threshold of 4. In this case, the sorting algorithm is invoked, and scan unit 45 (FIG. 2) may define a new scan order based on statistics 4 (79). Accordingly, the new scan order is as follows:
(c1, c5, c9, c2, c13, c6, c3, c4, c7, c10, c14, c11, c8, c12, c15, c16)
In particular, the scanning order changes from an initial scan order (e.g., zig-zag scanning) to a new scan order that promotes non-zero coefficients at the beginning of the one-dimensional vector, and zero coefficients at the end. For example, since the probabilities at locations c5 and c9 are higher than that at c2 in statistics 4 (79), c5 and c9 are both scanned before c2 in the new scanning order. Unlike the zig-zag scanning, which alternates between the horizontal dimension and the vertical dimension equally, the new scan order exhibits stronger directionality in the vertical dimension. That is, the new scan order goes through coefficients in the vertical dimension faster than the coefficients in the horizontal dimension, which is consistent with the statistical distribution of the coefficients of the video blocks 1-4 (72, 73, 74, 75) coded in a given prediction mode. Thus, by using past statistics to define the scan order, the techniques of this disclosure may promote grouping of non-zero coefficients near the beginning of a scanned one-dimensional vector and zero value coefficients near the end of the scanned one-dimensional vector. This, in turn, can improve the level of compression that can be achieved during entropy coding.
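The accumulation and re-sorting in the example above can be sketched as follows. The four blocks here are hypothetical stand-ins (the actual coefficient values of blocks 1-4 appear only in FIG. 7), chosen so that, like the figure's example, non-zero coefficients cluster in the first column:

```python
# Hypothetical 4x4 blocks, flattened row by row so that positions
# c1..c16 correspond to indices 0..15. Not the values from FIG. 7.
blocks = [
    [9, 0, 0, 0,  4, 0, 0, 0,  2, 0, 0, 0,  0, 0, 0, 0],
    [8, 1, 0, 0,  5, 0, 0, 0,  3, 0, 0, 0,  1, 0, 0, 0],
    [7, 0, 0, 0,  6, 0, 0, 0,  0, 0, 0, 0,  1, 0, 0, 0],
    [6, 0, 0, 0,  3, 0, 0, 0,  2, 0, 0, 0,  0, 0, 0, 0],
]

# Normalized probability of each position being non-zero over the
# blocks coded so far (the average of the one/zero indicators).
n = len(blocks)
prob = [sum(1 for b in blocks if b[i] != 0) / n for i in range(16)]

# New scan order: positions sorted by descending non-zero probability
# (Python's sort is stable, so ties keep their previous relative order).
new_order = sorted(range(16), key=lambda i: -prob[i])

# The four most-likely-non-zero positions, reported as c-indices:
print([i + 1 for i in new_order[:4]])  # [1, 5, 9, 13]
```

As in the example, the adapted order runs down the first column (c1, c5, c9, c13) before moving right, reflecting the vertical directionality of the accumulated statistics.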
[0073] Furthermore, thresholds are defined to limit the occurrence of scan order changes since such changes require a computationally intensive sorting process, and to help ensure that sufficient statistics are accumulated for a given mode of a video block prior to evaluating the scan order. In this case, a new scan order can only be selected for a given mode of video block when the count of the given mode satisfies the threshold of the given mode. The threshold may adjust upward or downward over time (subject to upper and lower bounds). For example, if a scan order evaluation results in scan order changes, the threshold may be reduced so that a subsequent scan order evaluation occurs more quickly. In this case, since the scan orders are changing, it may be desirable to speed the occurrence of future changes to bring the scan order into a steady state. On the other hand, if a scan order evaluation does not result in a scan order change, the threshold may be increased so that a subsequent scan order evaluation takes longer to occur. In this case, since scan order has not changed, it may be desirable to reduce the frequency of evaluation of possible scan order changes, since these evaluations require the use of processing resources. These types of threshold adjustments may evaluate scan order changes more frequently until the scan order reaches a steady and desirable state, and may then limit the frequency of scan order evaluations as changes become less likely.
[0074] FIG. 8 is a flow diagram illustrating a coding (i.e., encoding or decoding) technique consistent with this disclosure. FIG. 8 is illustrated from the perspective of video encoder 50 insofar as the step of entropy coding (step 85) is after the step of scanning (step 83). From the perspective of video decoder 60, the step of entropy coding (step 85) would precede the step of scanning (step 83). For example, from the perspective of video decoder 60, the steps shown in FIG. 8 may be performed in the following order (step 85, step 83, step 81, step 82, step 84). For purposes of simplicity, FIG. 8 is described below from the perspective of video encoder 50. [0075] As shown in FIG. 8, adaptive scan unit 45 updates statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes (81), and counts the video blocks associated with each of the prediction modes (82). Adaptive scan unit 45 then scans the coefficient values of the video blocks into one-dimensional coefficient vectors according to scan orders defined for each of the prediction modes (83), and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes (84). Then, following this scanning process, entropy encoding unit 46 encodes the one-dimensional coefficient vectors according to an entropy coding methodology (85).
[0076] Adaptive scan unit 45 may determine a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes. In addition, adaptive scan unit 45 may adjust the threshold upon determining the given scan order. As discussed in this disclosure, the statistics stored by adaptive scan unit 45 may comprise statistics indicative of the probabilities of the coefficient values being zero or non-zero, or possibly other types of statistics indicative of the probabilities of coefficient values. In one example, adaptive scan unit 45 determines a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes, and increases or decreases the threshold based on whether the new scan order is the same as a previous scan order.
[0077] For example, if the new scan order is the same as a previous scan order, adaptive scan unit 45 may increase the threshold, e.g., by a factor of two subject to an upper limit. Similarly, if the new scan order is different than the previous scan order, adaptive scan unit 45 may decrease the threshold, e.g., by a factor of two subject to a lower limit. Upon determining the new scan order, adaptive scan unit 45 may re-set the count value associated with the given one of the prediction modes. Once scanned into a one-dimensional format, entropy encoding unit 46 entropy encodes the coefficient vectors. [0078] FIG. 9 is an exemplary flow diagram illustrating an adaptive scanning process that may be performed by scan unit 45 of video encoder 50 (FIG. 2) and scan unit 55 of video decoder 60 (FIG. 3). The process of FIG. 9 may repeat for each coded unit. Again, coded units may comprise individual frames of a video sequence, portions of frames (such as slices), or another independently decodable unit of a video sequence. [0079] As shown in FIG. 9, scan unit 45 initializes its scanning order for a new coded unit (91). In other words, at the beginning of a frame or slice, the scanning order is initialized. The count values for every mode are set to zero, and the thresholds are set to an initial value, such as a value of 4 for modes that correspond to 4 by 4 blocks and a value of 2 for modes that correspond to 8 by 8 blocks. At the beginning of a new coded unit, statistics of coefficient blocks for every mode are also initialized, either to all zero or to other statistics based on empirical training. Scan unit 45 applies its initial scanning order (e.g., zig-zag scanning). In doing so, scan unit 45 collects block coefficient statistics and increments count(mode) for each mode identified for the scanned blocks (92). This process continues until a pre-set update interval is reached (93).
For example, the pre-set update interval may correspond to a macroblock boundary, or another predetermined interval.
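The sorting function invoked at the update interval can be illustrated as sorting the coefficient positions by their accumulated non-zero statistics, so that the positions most likely to hold non-zero coefficients are scanned first. A sketch under assumed names (the tie-breaking rule here is an assumption added for determinism):

```python
def update_scan_order(nonzero_counts):
    """Derive a new scan order by sorting the coefficient positions in
    descending order of their accumulated non-zero counts.
    Ties are broken by position index to keep the order deterministic."""
    return sorted(range(len(nonzero_counts)),
                  key=lambda p: (-nonzero_counts[p], p))
```

For example, with counts `[9, 5, 7, 1]` for a flattened block, the derived order visits position 0 first, then 2, 1, and 3.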
[0080] When the pre-set update interval is identified ("yes" 93), scan unit 45 evaluates the scan order. In particular, scan unit 45 determines whether count(mode) satisfies the threshold thresh(mode) (94). If not ("no" 94), scan unit 45 considers the other modes, e.g., until all the modes are examined (100). For any given mode, if count(mode) satisfies the threshold ("yes" 94), scan unit 45 invokes a sorting function, which updates the scan order (95) based on the accumulated statistics for that mode. If the scan order changes as a result of this update ("yes" 96), scan unit 45 reduces thresh(mode) for that mode (97). If the scan order does not change as a result of this update ("no" 96), scan unit 45 increases thresh(mode) for that mode (98). As an example, these increases (98) or reductions (97) may change the thresholds by a factor of two (i.e., multiply by 2 or divide by 2) subject to lower and upper bounds. The lower bounds may be set to 4 for modes that correspond to 4 by 4 blocks and to 2 for modes that correspond to 8 by 8 blocks. In this example, the initial thresholds may be set at the lower bounds in order to invoke sorting as quickly as possible following initialization.
[0081] Once the scan order for a given mode is updated (95), the count(mode) for that mode is reset to zero (99). The process then determines whether additional modes need to be examined (100). The process continues as a given coded unit (e.g., a frame or a slice) is coded. That is, a new initialization (91) may occur when the next coded unit is encountered.
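Putting steps (94) through (100) together, the per-mode evaluation at an update interval might be sketched as follows. All names and bound values are illustrative assumptions, and the statistics are taken to be per-position non-zero counts; this is a sketch of the flow, not the patented implementation.

```python
def evaluate_modes(stats, counts, thresholds, scan_orders,
                   lower=2, upper=16):
    """At a pre-set update interval, examine every mode (steps 94-100):
    if a mode's block count meets its threshold, re-derive its scan order
    from the accumulated statistics (step 95), adapt the threshold
    (steps 96-98), and reset the count (step 99)."""
    for mode in scan_orders:
        if counts[mode] < thresholds[mode]:
            continue  # "no" branch at step 94: not enough blocks seen yet
        # Step 95: sort positions by descending non-zero statistics.
        s = stats[mode]
        new_order = sorted(range(len(s)), key=lambda p: (-s[p], p))
        if new_order != scan_orders[mode]:
            thresholds[mode] = max(lower, thresholds[mode] // 2)   # step 97
        else:
            thresholds[mode] = min(upper, thresholds[mode] * 2)    # step 98
        scan_orders[mode] = new_order
        counts[mode] = 0  # step 99: count(mode) is reset after each update
```

A mode whose statistics keep producing the same order will see its threshold double toward the upper bound, so sorting is invoked less often once the scan order settles into a steady state.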
[0082] The techniques of this disclosure may be realized in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (i.e., a chip set). Any components, modules or units described have been provided to emphasize functional aspects and do not necessarily require realization by different hardware units.
[0083] Accordingly, the techniques described herein may be implemented in hardware, software, firmware, or any combination thereof. Any features described as modules or components may be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a computer-readable medium comprising instructions that, when executed, perform one or more of the methods described above. The computer-readable data storage medium may form part of a computer program product, which may include packaging materials. The computer-readable medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic or optical data storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a computer-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer.
[0084] The code may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term "processor," as used herein, may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described herein. In addition, in some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured for encoding and decoding, or incorporated in a combined video encoder-decoder (CODEC). Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0085] Various aspects of the disclosure have been described. The techniques have been described in the context of scanning of transformed coefficients of transformed video blocks, but might also apply to scanning of other types of coefficients of video blocks. For example, if scanning of pixel values or other types of non-transformed coefficients or values associated with video blocks were implemented, the techniques of this disclosure could apply to such scanning. These and other aspects are within the scope of the following claims.

Claims

1. A method of coding coefficients of video blocks, the method comprising: storing statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes; counting the video blocks associated with each of the prediction modes; scanning the coefficient values of the video blocks based on scan orders defined for each of the prediction modes; evaluating a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes; and entropy coding the coefficient values.
2. The method of claim 1, further comprising: determining a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes.
3. The method of claim 2, further comprising: adjusting the threshold of the given one of the prediction modes upon adjusting the given scan order of the given one of the prediction modes.
4. The method of claim 1, wherein storing the statistics comprises for each of the plurality of prediction modes: storing statistics indicative of probabilities of the coefficient values being zero or non-zero.
5. The method of claim 1, further comprising: determining a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes; if the new scan order is the same as a previous scan order, increasing the threshold of the given one of the prediction modes; if the new scan order is different than the previous scan order, decreasing the threshold of the given one of the prediction modes; and re-setting the count value associated with the given one of the prediction modes.
6. The method of claim 1, wherein the plurality of prediction modes comprise a plurality of intra prediction modes and a plurality of inter prediction modes.
7. The method of claim 6, wherein the intra prediction modes comprise a plurality of 4 by 4 luma intra prediction modes, a plurality of 8 by 8 luma intra prediction modes, a plurality of 16 by 16 luma intra prediction modes, and a plurality of 8 by 8 chroma intra prediction modes; and wherein the inter prediction modes comprise inter prediction modes corresponding to a 4 by 4 block size and an 8 by 8 block size.
8. The method of claim 1, wherein entropy coding comprises context adaptive variable length coding (CAVLC) or context adaptive binary arithmetic coding (CABAC).
9. The method of claim 1, further comprising generating the coefficient values via a transform of the video blocks from a pixel domain to a transformed domain.
10. The method of claim 1, wherein coding comprises encoding, wherein scanning the coefficient values of the video blocks comprises generating one-dimensional vectors of coefficients from two-dimensional blocks of coefficients based on scan orders defined for each of the prediction modes, wherein entropy coding comprises entropy encoding the one-dimensional vectors after scanning the coefficient values.
11. The method of claim 1, wherein coding comprises decoding, wherein scanning the coefficient values of the video blocks comprises generating two-dimensional blocks of coefficients from one-dimensional vectors of coefficients based on scan orders defined for each of the prediction modes, wherein entropy coding comprises entropy decoding the one-dimensional vectors prior to scanning the coefficient values.
12. The method of claim 1, further comprising examining the scan orders defined for each of the prediction modes at a pre-set update interval.
13. The method of claim 1, wherein the method is repeated for each of a plurality of coded units that form a video sequence, the method further comprising: initializing the scan orders, the statistics and thresholds for each of the prediction modes prior to the method being repeated for each of the plurality of coded units.
14. An apparatus that codes coefficients of video blocks, the apparatus comprising: a scan unit that: stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes; counts the video blocks associated with each of the prediction modes; scans the coefficient values of the video blocks based on scan orders defined for each of the prediction modes; and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes; and an entropy coding unit that entropy codes the coefficient values.
15. The apparatus of claim 14, wherein the scan unit: determines a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes.
16. The apparatus of claim 15, wherein the scan unit: adjusts the threshold of the given one of the prediction modes upon adjusting the given scan order of the given one of the prediction modes.
17. The apparatus of claim 14, wherein the scan unit for each of the plurality of prediction modes: stores statistics indicative of probabilities of the coefficient values being zero or non-zero.
18. The apparatus of claim 14, wherein the scan unit: determines a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes; if the new scan order is the same as a previous scan order, increases the threshold of the given one of the prediction modes; if the new scan order is different than the previous scan order, decreases the threshold of the given one of the prediction modes; and re-sets the count value associated with the given one of the prediction modes.
19. The apparatus of claim 14, wherein the plurality of prediction modes comprise a plurality of intra prediction modes and a plurality of inter prediction modes.
20. The apparatus of claim 19, wherein the intra prediction modes comprise a plurality of 4 by 4 luma intra prediction modes, a plurality of 8 by 8 luma intra prediction modes, a plurality of 16 by 16 luma intra prediction modes, and a plurality of 8 by 8 chroma intra prediction modes; and wherein the inter prediction modes comprise inter prediction modes corresponding to a 4 by 4 block size and an 8 by 8 block size.
21. The apparatus of claim 14, wherein the entropy coding unit performs context adaptive variable length coding (CAVLC) or context adaptive binary arithmetic coding (CABAC).
22. The apparatus of claim 14, further comprising a transform unit that generates the coefficient values via a transform of the video blocks from a pixel domain to a transformed domain.
23. The apparatus of claim 14, wherein the apparatus encodes the video blocks, wherein the scanning unit generates one-dimensional vectors of coefficients from two-dimensional blocks of coefficients based on scan orders defined for each of the prediction modes, and wherein the entropy coding unit entropy encodes the one-dimensional vectors after the scanning unit scans the coefficient values.
24. The apparatus of claim 14, wherein the apparatus decodes the video blocks, wherein the scanning unit generates two-dimensional blocks of coefficients from one-dimensional vectors of coefficients based on scan orders defined for each of the prediction modes, and wherein the entropy coding unit entropy decodes the one-dimensional vectors prior to the scanning unit scanning the coefficient values.
25. The apparatus of claim 14, wherein the scanning unit examines the scan orders defined for each of the prediction modes at a pre-set update interval.
26. The apparatus of claim 14, wherein the scanning unit repeats its store, count, scan and evaluate operations with respect to each of a plurality of coded units that form a video sequence, and wherein the scan unit initializes the scan orders, the statistics and thresholds for each of the prediction modes before the scanning unit repeats its store, count, scan and evaluate operations for each of the plurality of coded units.
27. The apparatus of claim 14, wherein the apparatus comprises an integrated circuit.
28. The apparatus of claim 14, wherein the apparatus comprises a microprocessor.
29. A computer-readable medium comprising instructions that upon execution in a video coding device cause the device to code coefficients of video blocks, wherein the instructions cause the device to: store statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes; count the video blocks associated with each of the prediction modes; scan the coefficient values of the video blocks based on scan orders defined for each of the prediction modes; evaluate a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes; and entropy code the coefficient values.
30. The computer-readable medium of claim 29, wherein the instructions cause the device to: determine a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes.
31. The computer-readable medium of claim 30, wherein the instructions cause the device to: adjust the threshold of the given one of the prediction modes upon adjusting the given scan order of the given one of the prediction modes.
32. The computer-readable medium of claim 29, wherein for each of the plurality of prediction modes the instructions cause the device to: store statistics indicative of probabilities of the coefficient values being zero or non-zero.
33. The computer-readable medium of claim 29, wherein the instructions cause the device to: determine a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes; if the new scan order is the same as a previous scan order, increase the threshold of the given one of the prediction modes; if the new scan order is different than the previous scan order, decrease the threshold of the given one of the prediction modes; and re-set the count value associated with the given one of the prediction modes.
34. The computer-readable medium of claim 29, wherein the plurality of prediction modes comprise a plurality of intra prediction modes and a plurality of inter prediction modes.
35. The computer-readable medium of claim 34, wherein the intra prediction modes comprise a plurality of 4 by 4 luma intra prediction modes, a plurality of 8 by 8 luma intra prediction modes, a plurality of 16 by 16 luma intra prediction modes, and a plurality of 8 by 8 chroma intra prediction modes; and wherein the inter prediction modes comprise inter prediction modes corresponding to a 4 by 4 block size and an 8 by 8 block size.
36. The computer-readable medium of claim 29, wherein the instructions cause the device to perform context adaptive variable length coding (CAVLC) or context adaptive binary arithmetic coding (CABAC).
37. The computer-readable medium of claim 29, wherein the instructions cause the device to: generate the coefficient values via a transform of the video blocks from a pixel domain to a transformed domain.
38. The computer-readable medium of claim 29, wherein the instructions cause the device to encode the video blocks, wherein the instructions cause the device to: generate one-dimensional vectors of coefficients from two-dimensional blocks of coefficients based on scan orders defined for each of the prediction modes, and entropy encode the one-dimensional vectors after scanning the coefficient values.
39. The computer-readable medium of claim 29, wherein the instructions cause the device to decode the video blocks, wherein the instructions cause the device to: generate two-dimensional blocks of coefficients from one-dimensional vectors of coefficients based on scan orders defined for each of the prediction modes, and entropy decode the one-dimensional vectors prior to scanning the coefficient values.
40. The computer-readable medium of claim 29, wherein the instructions cause the device to: examine the scan orders defined for each of the prediction modes at a pre-set update interval.
41. The computer-readable medium of claim 29, wherein the instructions cause the device to repeat its store, count, scan and evaluate operations with respect to each of a plurality of coded units that form a video sequence, and wherein the instructions cause the device to initialize the scan orders, the statistics and thresholds for each of the prediction modes before the instructions cause the device to repeat its store, count, scan and evaluate operations for each of the plurality of coded units.
42. A device that codes coefficients of video blocks, the device comprising: means for storing statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes; means for counting the video blocks associated with each of the prediction modes; means for scanning the coefficient values of the video blocks based on scan orders defined for each of the prediction modes; means for evaluating a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes; and means for entropy coding the coefficient values.
43. The device of claim 42, further comprising: means for determining a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes when the count value associated with the given one of the prediction modes satisfies the threshold of the given one of the prediction modes.
44. The device of claim 43, further comprising: means for adjusting the threshold of the given one of the prediction modes upon adjusting the given scan order of the given one of the prediction modes.
45. The device of claim 42, wherein means for storing the statistics comprises for each of the plurality of prediction modes: means for storing statistics indicative of probabilities of the coefficient values being zero or non-zero.
46. The device of claim 42, further comprising: means for determining a new scan order associated with the given one of the prediction modes based on the statistics of the given one of the prediction modes; means for increasing the threshold of the given one of the prediction modes if the new scan order is the same as a previous scan order; means for decreasing the threshold of the given one of the prediction modes if the new scan order is different than the previous scan order; and means for re-setting the count value associated with the given one of the prediction modes.
47. The device of claim 42, wherein the plurality of prediction modes comprise a plurality of intra prediction modes and a plurality of inter prediction modes.
48. The device of claim 47, wherein the intra prediction modes comprise a plurality of 4 by 4 luma intra prediction modes, a plurality of 8 by 8 luma intra prediction modes, a plurality of 16 by 16 luma intra prediction modes, and a plurality of 8 by 8 chroma intra prediction modes; and wherein the inter prediction modes comprise inter prediction modes corresponding to a 4 by 4 block size and an 8 by 8 block size.
49. The device of claim 42, wherein means for entropy coding comprises means for context adaptive variable length coding (CAVLC) or means for context adaptive binary arithmetic coding (CABAC).
50. The device of claim 42, further comprising means for generating the coefficient values via a transform of the video blocks from a pixel domain to a transformed domain.
51. The device of claim 42, wherein the device encodes video blocks, wherein means for scanning the coefficient values of the video blocks comprises means for generating one-dimensional vectors of coefficients from two-dimensional blocks of coefficients based on scan orders defined for each of the prediction modes, wherein means for entropy coding comprises means for entropy encoding the one-dimensional vectors after scanning the coefficient values.
52. The device of claim 42, wherein the device decodes video blocks, wherein means for scanning the coefficient values of the video blocks comprises means for generating two-dimensional blocks of coefficients from one-dimensional vectors of coefficients based on scan orders defined for each of the prediction modes, wherein means for entropy coding comprises means for entropy decoding the one-dimensional vectors prior to scanning the coefficient values.
53. The device of claim 42, further comprising means for examining the scan orders defined for each of the prediction modes at a pre-set update interval.
54. The device of claim 42, wherein storing, counting, scanning and evaluating operations are repeated for each of a plurality of coded units that form a video sequence, the device further comprising means for initializing the scan orders, the statistics and thresholds for each of the prediction modes prior to the storing, counting, scanning and evaluating operations being repeated for each of the plurality of coded units.
55. A device comprising: a scan unit that: stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes; counts the video blocks associated with each of the prediction modes; scans the coefficient values of the video blocks from two-dimensional blocks to one-dimensional vectors based on scan orders defined for each of the prediction modes; and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes; an entropy coding unit that entropy encodes the coefficient values of the one-dimensional vectors; and a wireless transmitter that sends a bitstream comprising the entropy encoded coefficient values.
56. The device of claim 55, wherein the device comprises a wireless communication handset.
57. A device comprising: a wireless receiver that receives a bitstream comprising entropy coded coefficient values of video blocks in one-dimensional vectors; an entropy coding unit that entropy decodes the coefficient values of the video blocks; and a scan unit that: stores statistics associated with coefficient values of the video blocks for each of a plurality of prediction modes; counts the video blocks associated with each of the prediction modes; scans the coefficient values of the video blocks from the one-dimensional vectors to two-dimensional blocks based on scan orders defined for each of the prediction modes; and evaluates a given scan order associated with a given one of the prediction modes based on the statistics of the given one of the prediction modes when a count value associated with the given one of the prediction modes satisfies a threshold of the given one of the prediction modes.
58. The device of claim 57, wherein the device comprises a wireless communication handset.
EP08770909A 2007-06-15 2008-06-12 Adaptive coefficient scanning in video coding Active EP2165542B1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US94447007P 2007-06-15 2007-06-15
US97976207P 2007-10-12 2007-10-12
US3044308P 2008-02-21 2008-02-21
US12/133,232 US8571104B2 (en) 2007-06-15 2008-06-04 Adaptive coefficient scanning in video coding
PCT/US2008/066796 WO2008157268A2 (en) 2007-06-15 2008-06-12 Adaptive coefficient scanning in video coding

Publications (2)

Publication Number Publication Date
EP2165542A2 true EP2165542A2 (en) 2010-03-24
EP2165542B1 EP2165542B1 (en) 2012-07-25

Family

ID=40132401

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08770909A Active EP2165542B1 (en) 2007-06-15 2008-06-12 Adaptive coefficient scanning in video coding

Country Status (9)

Country Link
US (1) US8571104B2 (en)
EP (1) EP2165542B1 (en)
JP (1) JP5032657B2 (en)
KR (2) KR20110129493A (en)
CN (1) CN101682771B (en)
BR (1) BRPI0813275A2 (en)
CA (1) CA2687253A1 (en)
TW (1) TW200915885A (en)
WO (1) WO2008157268A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018048516A1 (en) * 2016-09-08 2018-03-15 Google Llc Context adaptive scan order for entropy coding

Families Citing this family (106)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8428133B2 (en) 2007-06-15 2013-04-23 Qualcomm Incorporated Adaptive coding of video block prediction mode
US8542748B2 (en) 2008-03-28 2013-09-24 Sharp Laboratories Of America, Inc. Methods and systems for parallel video encoding and decoding
US8000546B2 (en) * 2008-08-01 2011-08-16 National Cheng Kung University Adaptive scan method for image/video coding
KR101379185B1 (en) * 2009-04-14 2014-03-31 에스케이 텔레콤주식회사 Prediction Mode Selection Method and Apparatus and Video Enoding/Decoding Method and Apparatus Using Same
KR101379186B1 (en) * 2009-08-21 2014-04-10 에스케이 텔레콤주식회사 Inrtra Prediction Enoding/Decoding Method and Apparatus
US9287894B2 (en) * 2009-10-05 2016-03-15 Orange Methods for encoding and decoding images, corresponding encoding and decoding devices and computer programs
CN102577393B (en) 2009-10-20 2015-03-25 夏普株式会社 Moving image coding device, moving image decoding device, moving image coding/decoding system, moving image coding method and moving image decoding method
JP5432412B1 (en) * 2010-01-07 2014-03-05 株式会社東芝 Moving picture coding apparatus and moving picture decoding apparatus
JP5525650B2 (en) * 2010-01-07 2014-06-18 株式会社東芝 Moving picture decoding apparatus, method and program
JP5597782B2 (en) * 2010-01-07 2014-10-01 株式会社東芝 Moving picture coding apparatus and moving picture decoding apparatus
JP5696248B2 (en) * 2010-01-07 2015-04-08 株式会社東芝 Moving picture coding apparatus and moving picture decoding apparatus
JP5526277B2 (en) * 2010-01-07 2014-06-18 株式会社東芝 Moving picture decoding apparatus, method and program
JP5432359B2 (en) * 2010-01-07 2014-03-05 株式会社東芝 Moving picture coding apparatus, method and program
JP5908619B2 (en) * 2010-01-07 2016-04-26 株式会社東芝 Moving picture coding apparatus and moving picture decoding apparatus
JP5323209B2 (en) * 2010-01-07 2013-10-23 株式会社東芝 Moving picture coding apparatus and moving picture decoding apparatus
WO2011083573A1 (en) 2010-01-07 2011-07-14 株式会社 東芝 Video encoder and video decoder
WO2011126283A2 (en) 2010-04-05 2011-10-13 Samsung Electronics Co., Ltd. Method and apparatus for encoding video based on internal bit depth increment, and method and apparatus for decoding video based on internal bit depth increment
US9369736B2 (en) 2010-04-05 2016-06-14 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
US8982961B2 (en) 2010-04-05 2015-03-17 Samsung Electronics Co., Ltd. Method and apparatus for encoding video by using transformation index, and method and apparatus for decoding video by using transformation index
US8929440B2 (en) * 2010-04-09 2015-01-06 Sony Corporation QP adaptive coefficients scanning and application
DK3435674T3 (en) 2010-04-13 2023-08-21 Ge Video Compression Llc Encoding of significance maps and transformation coefficient blocks
JP2011259205A (en) * 2010-06-09 2011-12-22 Sony Corp Image decoding device, image encoding device, and method and program thereof
US9215470B2 (en) * 2010-07-09 2015-12-15 Qualcomm Incorporated Signaling selected directional transform for video coding
JP2012019448A (en) * 2010-07-09 2012-01-26 Sony Corp Image processor and processing method
US9300970B2 (en) * 2010-07-09 2016-03-29 Samsung Electronics Co., Ltd. Methods and apparatuses for encoding and decoding motion vector
CN102447895B (en) * 2010-09-30 2013-10-02 Huawei Technologies Co., Ltd. Scanning method and device, and inverse scanning method and device
US8344917B2 (en) 2010-09-30 2013-01-01 Sharp Laboratories Of America, Inc. Methods and systems for context initialization in video coding and decoding
US9313514B2 (en) * 2010-10-01 2016-04-12 Sharp Kabushiki Kaisha Methods and systems for entropy coder initialization
US8923395B2 (en) * 2010-10-01 2014-12-30 Qualcomm Incorporated Video coding using intra-prediction
US9532059B2 (en) 2010-10-05 2016-12-27 Google Technology Holdings LLC Method and apparatus for spatial scalability for video coding
KR20130054408A (en) 2010-10-05 2013-05-24 General Instrument Corporation Coding and decoding utilizing adaptive context model selection with zigzag scan
US9641846B2 (en) 2010-10-22 2017-05-02 Qualcomm Incorporated Adaptive scanning of transform coefficients for video coding
US9497472B2 (en) * 2010-11-16 2016-11-15 Qualcomm Incorporated Parallel context calculation in video coding
US9288496B2 (en) * 2010-12-03 2016-03-15 Qualcomm Incorporated Video coding using function-based scan order for transform coefficients
US9042440B2 (en) 2010-12-03 2015-05-26 Qualcomm Incorporated Coding the position of a last significant coefficient within a video block based on a scanning order for the block in video coding
US8976861B2 (en) 2010-12-03 2015-03-10 Qualcomm Incorporated Separately coding the position of a last significant coefficient of a video block in video coding
JP2012129888A (en) * 2010-12-16 2012-07-05 Samsung Electronics Co Ltd Image encoding apparatus and image encoding method
US20120163472A1 (en) * 2010-12-22 2012-06-28 Qualcomm Incorporated Efficiently coding scanning order information for a video block in video coding
US20120163456A1 (en) 2010-12-22 2012-06-28 Qualcomm Incorporated Using a most probable scanning order to efficiently code scanning order information for a video block in video coding
US9049444B2 (en) * 2010-12-22 2015-06-02 Qualcomm Incorporated Mode dependent scanning of coefficients of a block of video data
US20120236931A1 (en) * 2010-12-23 2012-09-20 Qualcomm Incorporated Transform coefficient scan
US10992958B2 (en) 2010-12-29 2021-04-27 Qualcomm Incorporated Video coding using mapped transforms and scanning modes
KR101739580B1 (en) * 2010-12-30 2017-05-25 SK Telecom Co., Ltd. Adaptive scan apparatus and method therefor
US9490839B2 (en) 2011-01-03 2016-11-08 Qualcomm Incorporated Variable length coding of video block coefficients
JP5781313B2 (en) 2011-01-12 2015-09-16 NTT Docomo, Inc. Image prediction coding method, image prediction coding device, image prediction coding program, image prediction decoding method, image prediction decoding device, and image prediction decoding program
FR2972588A1 (en) 2011-03-07 2012-09-14 France Telecom Method for encoding and decoding images, corresponding encoding and decoding device, and computer programs
US9106913B2 (en) 2011-03-08 2015-08-11 Qualcomm Incorporated Coding of transform coefficients for video coding
US10397577B2 (en) 2011-03-08 2019-08-27 Velos Media, Llc Inverse scan order for significance map coding of transform coefficients in video coding
US8861593B2 (en) 2011-03-15 2014-10-14 Sony Corporation Context adaptation within video coding modules
US9648334B2 (en) * 2011-03-21 2017-05-09 Qualcomm Incorporated Bi-predictive merge mode based on uni-predictive neighbors in video coding
US8938001B1 (en) 2011-04-05 2015-01-20 Google Inc. Apparatus and method for coding using combinations
US8989256B2 (en) 2011-05-25 2015-03-24 Google Inc. Method and apparatus for using segmentation-based coding of prediction information
FR2977111A1 (en) * 2011-06-24 2012-12-28 France Telecom Method for encoding and decoding images, corresponding encoding and decoding device, and computer programs
US9167253B2 (en) 2011-06-28 2015-10-20 Qualcomm Incorporated Derivation of the position in scan order of the last significant transform coefficient in video coding
US9516316B2 (en) 2011-06-29 2016-12-06 Qualcomm Incorporated VLC coefficient coding for large chroma block
AU2012278484B2 (en) * 2011-07-01 2016-05-05 Samsung Electronics Co., Ltd. Method and apparatus for entropy encoding using hierarchical data unit, and method and apparatus for decoding
US9338456B2 (en) 2011-07-11 2016-05-10 Qualcomm Incorporated Coding syntax elements using VLC codewords
US8891616B1 (en) 2011-07-27 2014-11-18 Google Inc. Method and apparatus for entropy encoding based on encoding cost
US9787982B2 (en) * 2011-09-12 2017-10-10 Qualcomm Incorporated Non-square transform units and prediction units in video coding
HUE060005T2 (en) * 2011-10-24 2023-01-28 Gensquare Llc Image decoding apparatus
KR20130049524A (en) * 2011-11-04 2013-05-14 Oh Soo-mi Method for generating intra prediction block
KR20130049523A (en) * 2011-11-04 2013-05-14 Oh Soo-mi Apparatus for generating intra prediction block
KR20130050405A (en) * 2011-11-07 2013-05-16 Oh Soo-mi Method for determining temporal candidate in inter prediction mode
US10390046B2 (en) 2011-11-07 2019-08-20 Qualcomm Incorporated Coding significant coefficient information in transform skip mode
US9247257B1 (en) * 2011-11-30 2016-01-26 Google Inc. Segmentation based entropy encoding and decoding
AR092786A1 (en) * 2012-01-09 2015-05-06 Jang Min Methods to eliminate block artifacts
US9094681B1 (en) 2012-02-28 2015-07-28 Google Inc. Adaptive segmentation
US11039138B1 (en) 2012-03-08 2021-06-15 Google Llc Adaptive coding of prediction modes using probability distributions
GB2501534A (en) * 2012-04-26 2013-10-30 Sony Corp Control of transform processing order in high efficiency video codecs
US9185429B1 (en) 2012-04-30 2015-11-10 Google Inc. Video encoding and decoding using un-equal error protection
WO2013181821A1 (en) * 2012-06-07 2013-12-12 Mediatek Singapore Pte. Ltd. Improved intra transform skip mode
US9781447B1 (en) 2012-06-21 2017-10-03 Google Inc. Correlation based inter-plane prediction encoding and decoding
SI2869563T1 (en) * 2012-07-02 2018-08-31 Samsung Electronics Co., Ltd. Method for entropy decoding of a video
US9774856B1 (en) 2012-07-02 2017-09-26 Google Inc. Adaptive stochastic entropy coding
KR102341826B1 (en) 2012-07-02 2021-12-21 LG Electronics Inc. Method for decoding image and apparatus using same
US9264713B2 (en) * 2012-07-11 2016-02-16 Qualcomm Incorporated Rotation of prediction residual blocks in video coding with transform skipping
US9332276B1 (en) 2012-08-09 2016-05-03 Google Inc. Variable-sized super block based direct prediction mode
US9167268B1 (en) 2012-08-09 2015-10-20 Google Inc. Second-order orthogonal spatial intra prediction
US9344742B2 (en) 2012-08-10 2016-05-17 Google Inc. Transform-domain intra prediction
US9380298B1 (en) 2012-08-10 2016-06-28 Google Inc. Object-based intra-prediction
US9826229B2 (en) 2012-09-29 2017-11-21 Google Technology Holdings LLC Scan pattern determination from base layer pixel information for scalable extension
WO2014078068A1 (en) 2012-11-13 2014-05-22 Intel Corporation Content adaptive transform coding for next generation video
US9350988B1 (en) 2012-11-20 2016-05-24 Google Inc. Prediction mode-based block ordering in video coding
WO2014120367A1 (en) 2013-01-30 2014-08-07 Intel Corporation Content adaptive parametric transforms for coding for next generation video
US9681128B1 (en) 2013-01-31 2017-06-13 Google Inc. Adaptive pre-transform scanning patterns for video and image compression
US9509998B1 (en) 2013-04-04 2016-11-29 Google Inc. Conditional predictive multi-symbol run-length coding
CN103428492B (en) * 2013-07-16 2016-11-16 Shandong University Method for fast zig-zag scanning in high-definition AVS coding
AU2013395426B2 (en) 2013-07-24 2017-11-30 Microsoft Technology Licensing, Llc Scanning orders for non-transform coding
US9247251B1 (en) 2013-07-26 2016-01-26 Google Inc. Right-edge extension for quad-tree intra-prediction
JP5646713B2 (en) * 2013-09-13 2014-12-24 株式会社Nttドコモ Image coding apparatus, method and program, and image decoding apparatus, method and program
US9392288B2 (en) 2013-10-17 2016-07-12 Google Inc. Video coding using scatter-based scan tables
US9179151B2 (en) 2013-10-18 2015-11-03 Google Inc. Spatial proximity context entropy coding
JP5893711B2 (en) * 2014-11-04 2016-03-23 株式会社Nttドコモ Image coding apparatus, method and program, and image decoding apparatus, method and program
US10306229B2 (en) 2015-01-26 2019-05-28 Qualcomm Incorporated Enhanced multiple transforms for prediction residual
JP6109354B2 (en) * 2016-01-20 2017-04-05 株式会社Nttドコモ Image decoding apparatus, method and program
RU2706228C1 (en) 2016-02-12 2019-11-15 Хуавей Текнолоджиз Ко., Лтд. Scanning order selection method and device
KR102210230B1 (en) * 2016-02-12 2021-01-29 Huawei Technologies Co., Ltd. Method and apparatus for selecting the scan order
US10623774B2 (en) 2016-03-22 2020-04-14 Qualcomm Incorporated Constrained block-level optimization and signaling for video coding tools
WO2018074291A1 (en) * 2016-10-18 2018-04-26 Panasonic IP Management Co., Ltd. Image coding method, transmission method and image coding device
GB2557335A (en) * 2016-12-07 2018-06-20 Sony Corp Image data encoding and decoding
KR102424411B1 (en) 2017-04-13 2022-07-25 LG Electronics Inc. Image encoding/decoding method and device therefor
CN115834876A (en) * 2017-12-21 2023-03-21 LG Electronics Inc. Decoding and encoding device, medium storing a bitstream, and data transmitting device
CN118748715A (en) 2018-06-21 2024-10-08 KT Corporation Method for decoding or encoding image and apparatus for transmitting compressed video data
BR112021005238A2 (en) * 2018-09-20 2021-06-15 Nokia Technologies Oy a method and apparatus for encoding and decoding digital image/video material
US11323748B2 (en) 2018-12-19 2022-05-03 Qualcomm Incorporated Tree-based transform unit (TU) partition for video coding
KR20210031296A (en) 2019-09-11 2021-03-19 Samsung Electronics Co., Ltd. Electronic apparatus and control method thereof

Family Cites Families (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BE1000643A5 (en) 1987-06-05 1989-02-28 Belgian State Method for coding image signals
JPH01155678A (en) 1987-12-11 1989-06-19 Matsushita Electric Ind Co Ltd Semiconductor light emitting device
US5136371A (en) 1990-03-15 1992-08-04 Thomson Consumer Electronics, Inc. Digital image coding using random scanning
EP0586225B1 (en) 1992-08-31 1998-12-23 Victor Company Of Japan, Ltd. Orthogonal transform coding apparatus and decoding apparatus
TW224553B (en) 1993-03-01 1994-06-01 Sony Co Ltd Method and apparatus for inverse discrete cosine transform and coding/decoding of moving picture
TW297202B (en) 1993-10-13 1997-02-01 Rca Thomson Licensing Corp
KR0183688B1 (en) 1994-01-12 1999-05-01 Kim Kwang-ho Image encoding method and device
KR0178198B1 (en) 1995-03-28 1999-05-01 Bae Soon-hoon Apparatus for encoding an image signal
US5721822A (en) 1995-07-21 1998-02-24 Intel Corporation Run-length encoding/decoding video signals using scan patterns explicitly encoded into bitstreams
US5790706A (en) 1996-07-03 1998-08-04 Motorola, Inc. Method and apparatus for scanning of transform coefficients
KR100425615B1 (en) 1996-11-07 2004-04-01 Matsushita Electric Industrial Co., Ltd. Encoding method and encoding apparatus and decoding method and decoding apparatus
JP2002232887A (en) 1996-11-07 2002-08-16 Matsushita Electric Ind Co Ltd Image encoding method and image encoder, and image decoding method and image decoder
JPH10271505A (en) 1997-03-25 1998-10-09 Oki Electric Ind Co Ltd Signal processor, coding circuit and decoding circuit
US5995055A (en) 1997-06-30 1999-11-30 Raytheon Company Planar antenna radiating structure having quasi-scan, frequency-independent driving-point impedance
EP0895424B1 (en) 1997-07-31 2007-10-31 Victor Company of Japan, Ltd. Digital video signal inter-block predictive encoding/decoding apparatus and method providing high encoding efficiency
JP4226172B2 (en) 1998-11-24 2009-02-18 Hynix Semiconductor Inc. Video compression encoding and decoding apparatus using adaptive transform, and method thereof
CA2388095A1 (en) 1999-10-22 2001-05-03 Activesky, Inc. An object oriented video system
US6724818B1 (en) * 2000-07-17 2004-04-20 Telefonaktiebolaget Lm Ericsson (Publ) Alternative block orders for better prediction
CN1142683C (en) 2000-10-13 2004-03-17 Tsinghua University Two-dimensional discrete cosine transform/inverse-transform VLSI architecture and method with no transposition part and separate row/column processing
JP2002135126A (en) 2000-10-26 2002-05-10 Seiko Epson Corp Semiconductor device and electronic equipment using the same
CN101448162B (en) 2001-12-17 2013-01-02 微软公司 Method for processing video image
KR100468844B1 (en) 2002-01-07 2005-01-29 Samsung Electronics Co., Ltd. Optimal scanning method for transform coefficients in image and video coding/decoding
BR0304545A (en) * 2002-01-14 2004-11-03 Nokia Corp Method of encoding images in a digital video sequence to provide encoded video data, video encoder, decoding method of data indicative of a digital video sequence, video decoder, and video decoding system
JP4510465B2 (en) 2002-01-22 2010-07-21 ノキア コーポレイション Coding of transform coefficients in an image / video encoder and / or decoder
US6757576B2 (en) 2002-02-05 2004-06-29 Gcc, Inc. System and method for drawing and manufacturing bent pipes
KR100508798B1 (en) 2002-04-09 2005-08-19 LG Electronics Inc. Method for predicting bi-predictive block
US7170937B2 (en) 2002-05-01 2007-01-30 Texas Instruments Incorporated Complexity-scalable intra-frame prediction technique
EP2860979A1 (en) 2002-05-28 2015-04-15 Sharp Kabushiki Kaisha Method and systems for image intra-prediction mode estimation, communication, and organization
RU2314656C2 (en) 2002-06-11 2008-01-10 Nokia Corporation Intra coding based on spatial prediction
AU2003281133A1 (en) 2002-07-15 2004-02-02 Hitachi, Ltd. Moving picture encoding method and decoding method
US6795584B2 (en) 2002-10-03 2004-09-21 Nokia Corporation Context-based adaptive variable length coding for adaptive block transforms
US7463782B2 (en) 2002-11-05 2008-12-09 Canon Kabushiki Kaisha Data encoding with an amplitude model and path between the data and corresponding decoding
FI116710B (en) 2002-12-20 2006-01-31 Oplayo Oy Coding procedure and arrangements for images
KR100750110B1 (en) 2003-04-22 2007-08-17 Samsung Electronics Co., Ltd. 4x4 intra luma prediction mode determining method and apparatus
JP3756897B2 (en) 2003-07-30 2006-03-15 Toshiba Corporation Moving picture coding apparatus and moving picture coding method
US7289562B2 (en) 2003-08-01 2007-10-30 Polycom, Inc. Adaptive filter to improve H-264 video quality
US20050036549A1 (en) 2003-08-12 2005-02-17 Yong He Method and apparatus for selection of scanning mode in dual pass encoding
US7688894B2 (en) 2003-09-07 2010-03-30 Microsoft Corporation Scan patterns for interlaced video content
JP4127818B2 (en) * 2003-12-24 2008-07-30 Toshiba Corporation Video coding method and apparatus
EP1558039A1 (en) 2004-01-21 2005-07-27 Deutsche Thomson-Brandt Gmbh Method and apparatus for generating/evaluating prediction information in picture signal encoding/decoding
JP4542447B2 (en) 2005-02-18 2010-09-15 Hitachi, Ltd. Image encoding/decoding device, encoding/decoding program, and encoding/decoding method
US8731054B2 (en) 2004-05-04 2014-05-20 Qualcomm Incorporated Method and apparatus for weighted prediction in predictive frames
US8369402B2 (en) 2004-06-17 2013-02-05 Canon Kabushiki Kaisha Apparatus and method for prediction modes selection based on image formation
NO322043B1 (en) * 2004-12-30 2006-08-07 Tandberg Telecom As Method for simplified entropy coding
US8311119B2 (en) 2004-12-31 2012-11-13 Microsoft Corporation Adaptive coefficient scan order
US7706443B2 (en) * 2005-03-11 2010-04-27 General Instrument Corporation Method, article of manufacture, and apparatus for high quality, fast intra coding usable for creating digital video content
CN100345449C (en) * 2005-03-18 2007-10-24 Tsinghua University Method of entropy coding of transform coefficients in image/video coding
TW200704202A (en) * 2005-04-12 2007-01-16 Nokia Corp Method and system for motion compensated fine granularity scalable video coding with drift control
EP1768415A1 (en) 2005-09-27 2007-03-28 Matsushita Electric Industrial Co., Ltd. Adaptive scan order of DCT coefficients and its signaling
CN102176754B (en) 2005-07-22 2013-02-06 Mitsubishi Electric Corporation Image encoding device and method and image decoding device and method
US7933337B2 (en) 2005-08-12 2011-04-26 Microsoft Corporation Prediction of transform coefficients for image compression
JP2007053561A (en) 2005-08-17 2007-03-01 Matsushita Electric Ind Co Ltd Device and method for encoding image
WO2007046644A1 (en) 2005-10-21 2007-04-26 Electronics And Telecommunications Research Institute Apparatus and method for encoding and decoding moving picture using adaptive scanning
EP1958453B1 (en) 2005-11-30 2017-08-09 Koninklijke Philips N.V. Encoding method and apparatus applying coefficient reordering
US7529484B2 (en) 2005-12-14 2009-05-05 Nec Laboratories America, Inc. Triplexer transceiver using parallel signal detection
CN1801940A (en) 2005-12-15 2006-07-12 Tsinghua University Integer transform circuit and integer transform method
US8000539B2 (en) * 2005-12-21 2011-08-16 Ntt Docomo, Inc. Geometrical image representation and compression
US20080008246A1 (en) 2006-07-05 2008-01-10 Debargha Mukherjee Optimizing video coding
US8428133B2 (en) 2007-06-15 2013-04-23 Qualcomm Incorporated Adaptive coding of video block prediction mode
EP2422520A1 (en) * 2009-04-20 2012-02-29 Dolby Laboratories Licensing Corporation Adaptive interpolation filters for multi-layered video delivery
US9641846B2 (en) * 2010-10-22 2017-05-02 Qualcomm Incorporated Adaptive scanning of transform coefficients for video coding

Non-Patent Citations (1)

Title
See references of WO2008157268A2 *

Cited By (3)

Publication number Priority date Publication date Assignee Title
WO2018048516A1 (en) * 2016-09-08 2018-03-15 Google Llc Context adaptive scan order for entropy coding
US10440394B2 (en) 2016-09-08 2019-10-08 Google Llc Context adaptive scan order for entropy coding
US10701398B2 (en) 2016-09-08 2020-06-30 Google Llc Context adaptive scan order for entropy coding

Also Published As

Publication number Publication date
EP2165542B1 (en) 2012-07-25
KR20110129493A (en) 2011-12-01
US8571104B2 (en) 2013-10-29
CA2687253A1 (en) 2008-12-24
JP5032657B2 (en) 2012-09-26
KR101244229B1 (en) 2013-03-18
WO2008157268A3 (en) 2009-02-05
TW200915885A (en) 2009-04-01
JP2010530183A (en) 2010-09-02
WO2008157268A2 (en) 2008-12-24
CN101682771B (en) 2012-02-29
US20080310745A1 (en) 2008-12-18
KR20100021658A (en) 2010-02-25
CN101682771A (en) 2010-03-24
BRPI0813275A2 (en) 2014-12-30

Similar Documents

Publication Publication Date Title
US8571104B2 (en) Adaptive coefficient scanning in video coding
RU2447612C2 (en) Adaptive coefficient scanning in video coding
KR101214148B1 (en) Entropy coding of interleaved sub-blocks of a video block
JP5290325B2 (en) Quantization based on rate distortion modeling for CABAC coder
JP5139542B2 (en) Two-pass quantization for CABAC coders
KR101003320B1 (en) Video compression using adaptive variable length codes
JP5405596B2 (en) Filter prediction based on activity metrics when coding video
KR101168843B1 (en) Video coding of filter coefficients based on horizontal and vertical symmetry
JP5415546B2 (en) Weighted prediction based on vectorized entropy coding
KR101376008B1 (en) Temporal and spatial video block reordering in a decoder to improve cache hits

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100112

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

17Q First examination report despatched

Effective date: 20101115

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 568110

Country of ref document: AT

Kind code of ref document: T

Effective date: 20120815

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602008017434

Country of ref document: DE

Effective date: 20120920

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20120725

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 568110

Country of ref document: AT

Kind code of ref document: T

Effective date: 20120725

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

Effective date: 20120725

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121025

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121125

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121126

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121026

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121105

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20130426

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20121025

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602008017434

Country of ref document: DE

Effective date: 20130426

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20140228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130630

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130630

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130612

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130701

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20120725

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130612

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20080612

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602008017434

Country of ref document: DE

Owner name: QUALCOMM INCORPORATED, SAN DIEGO, US

Free format text: FORMER OWNER: QUALCOMM INCORPORATED, SAN DIEGO, CA, US

Ref country code: DE

Ref legal event code: R081

Ref document number: 602008017434

Country of ref document: DE

Owner name: VELOS MEDIA INTERNATIONAL LTD., IE

Free format text: FORMER OWNER: QUALCOMM INCORPORATED, SAN DIEGO, CA, US

Ref country code: DE

Ref legal event code: R082

Ref document number: 602008017434

Country of ref document: DE

Representative's name: GRUENECKER PATENT- UND RECHTSANWAELTE PARTG MB, DE

Ref country code: DE

Ref legal event code: R081

Ref document number: 602008017434

Country of ref document: DE

Owner name: VELOS MEDIA INTERNATIONAL LTD., IE

Free format text: FORMER OWNER: QUALCOMM INCORPORATED, SAN DIEGO, CALIF., US

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20171102 AND 20171108

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602008017434

Country of ref document: DE

Owner name: QUALCOMM INCORPORATED, SAN DIEGO, US

Free format text: FORMER OWNER: VELOS MEDIA INTERNATIONAL LTD., DUBLIN, IE

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20220106 AND 20220112

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20240509

Year of fee payment: 17

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20240509

Year of fee payment: 17