CN113141505A - Video data coding method and device - Google Patents

Video data coding method and device

Info

Publication number
CN113141505A
CN113141505A (application No. CN202010057732.XA)
Authority
CN
China
Prior art keywords
residual data
coded
data
coding
syntax element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010057732.XA
Other languages
Chinese (zh)
Other versions
CN113141505B (en)
Inventor
安基程
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202010057732.XA
Publication of CN113141505A
Application granted
Publication of CN113141505B
Legal status: Active
Anticipated expiration

Classifications

    • H: ELECTRICITY; H04: ELECTRIC COMMUNICATION TECHNIQUE; H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/124: Quantisation (adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding)
    • H04N 19/13: Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
    • H04N 19/176: Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N 19/625: Transform coding using discrete cosine transform [DCT]
    • H04N 19/70: Characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H04N 19/91: Entropy coding, e.g. variable length coding [VLC] or arithmetic coding

Abstract

The application discloses a video data coding method and apparatus. The method includes: obtaining quantized residual data to be coded, where the quantized residual data is obtained by quantizing difference data between a predicted coding block and an original coding block; judging whether the quantized residual data is residual data in a transform skip mode; and, if so, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data. In the transform skip mode of video data coding, the scheme codes the residual data to be coded with an arithmetic coding method based on the statistical correlation of the residual data, and thereby avoids the loss of video compression performance caused in the prior art by coding transform-skip residual data with a transform-coefficient coding scheme.

Description

Video data coding method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a video encoding method. The application also relates to a video encoding apparatus, an electronic device and a computer readable storage medium.
Background
In the transform skip mode of video coding, the transform step applied to a coding block is bypassed. The mode is mainly used to improve the compression efficiency of certain types of video content, such as computer-generated video or graphics mixed with camera-captured content (e.g. scrolling text), and video sequences whose scenes change slowly.
For coding blocks that use the transform skip mode, the conventional approach is still transform-coefficient coding. That coding method exploits properties of the transform coefficients obtained after the residual data has been transformed to the frequency domain, for example the random distribution of coefficient signs and the correlation between coefficient magnitude and coefficient position. However, the residual data is obtained by subtracting predicted pixels from original pixels, so statistical correlation remains among the residual samples, whereas transform coefficients have been decorrelated by the frequency-domain transform. Because of this key difference, the characteristics of transform coefficients do not match the coding requirements of residual data that has not been transformed, and coding such untransformed residual data with a transform-coefficient coding scheme degrades its compression performance.
Disclosure of Invention
The embodiments of the present application provide a video coding method that aims to solve the problem in the prior art that coding untransformed residual data with a transform-coefficient coding scheme degrades the compression performance of the residual data. Further embodiments of the present application also provide a video encoding apparatus, an electronic device, and a computer-readable storage medium.
An embodiment of the present application provides a video data encoding method, including:
obtaining quantized residual data to be coded, wherein the quantized residual data are obtained by quantizing difference data between a predicted coding block and an original coding block;
judging whether the quantized residual data is residual data in a transform skip mode;
and if the quantized residual data is residual data in a transform skip mode, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and coding the syntax element symbols of the quantized residual data by adopting a symbol information probability model based on the correlation of the syntax element symbols of the residual data.
Optionally, the encoding the syntax element symbol of the quantized residual data by using a symbol information probability model based on the correlation of the syntax element symbol of the residual data includes:
obtaining a symbol information probability model corresponding to a syntax element symbol to be coded based on coded syntax element symbol information according to the correlation of symbol information of adjacent syntax elements in a residual coding block corresponding to quantized residual data; and carrying out arithmetic coding on the syntactic element to be coded based on the symbolic information probability model.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and performing significance mapping coding on the quantized residual data based on the significance correlation of the residual data on the spatial neighborhood.
Optionally, the performing significance mapping coding on the quantized residual data based on the significant correlation of the residual data in the spatial neighborhood includes:
obtaining scanning position information of a syntax element to be coded in the quantized residual data and obtaining context model data of the syntax element to be coded based on the significant correlation of the residual data on a spatial neighborhood;
and performing significance mapping coding on the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded.
Optionally, the obtaining a scanning position of a syntax element to be coded in the quantized residual data based on a significant correlation of the residual data in a spatial neighborhood includes:
scanning the amplitude of the syntax element in the residual coding block according to the significant similarity characteristic between the syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data to obtain the scanning path information corresponding to the syntax element in the residual coding block;
and obtaining the position information of the syntactic element to be coded in the residual coding block based on the scanning path information.
Optionally, the scanning, according to similar features between syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data, the amplitude of the syntax element in the residual coding block is scanned, so as to obtain a scanning path corresponding to the syntax element in the residual coding block, including:
and determining the position information of the next syntax element which is positioned in the same spatial neighborhood with the adjacent syntax element based on any two adjacent syntax elements with known position information in the residual coding block, so as to obtain the scanning sequence of the syntax elements in the residual coding block, and scanning the amplitude of the syntax elements in the residual coding block based on the scanning sequence to obtain the scanning path corresponding to the syntax elements in the residual coding block.
Optionally, the obtaining context model data of the syntax element to be encoded includes: and obtaining context model data of the syntax element to be coded according to the information of the coded syntax element which is positioned in the same spatial neighborhood with the syntax element to be coded in the residual coding block corresponding to the quantized residual data.
Optionally, the obtaining context model data of the syntax element to be coded according to information of coded syntax elements in the residual coding block and in the same spatial neighborhood as the syntax element to be coded includes: obtaining probability model data of two adjacent coded syntax elements of the syntax element to be coded according to scanning path information corresponding to the syntax element in the residual coding block; and obtaining the context variable of the syntax element to be coded according to the probability model data of the two adjacent coded syntax elements.
Optionally, performing significance mapping coding on the syntax element to be coded according to the scanning position information and the context model data of the syntax element to be coded, including: and performing significance mapping coding on the amplitude of the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded, and obtaining a coding result that the amplitude of the syntactic element to be coded is a significance amplitude value of '1' or a non-significance amplitude value of '0'.
Optionally, if the magnitude of the syntax element to be encoded is a significance magnitude of "1", the method further includes: and binary coding the absolute value of the amplitude of the syntax element to be coded minus1 by adopting a joint binarization scheme of TU and EG 2.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and based on the probability distribution characteristics of the residual error data, binary coding the syntax element amplitude of the quantized residual error data by adopting a binarization scheme matched with the probability distribution characteristics of the quantized residual error data.
Optionally, the binary coding of the syntax element amplitude of the quantized residual data by using a binarization scheme matched with the probability distribution characteristics of the quantized residual data includes: and binary coding the amplitude absolute value of the syntax element to be coded by adopting a joint binarization scheme of TU and EG 2.
Optionally, the binary coding of the absolute value of the amplitude of the syntax element to be coded by using a joint binarization scheme of TU and EG2 includes: and binary coding is carried out on the prefix part of the absolute value of the amplitude of the syntactic element to be coded by adopting a TU binarization scheme, and binary coding is carried out on the suffix part of the absolute value of the amplitude of the syntactic element to be coded by adopting an EG2 binarization scheme.
An embodiment of the present application further provides a video data encoding apparatus, including:
a quantized residual data obtaining unit, configured to obtain quantized residual data to be encoded, where the quantized residual data is obtained by quantizing difference data between a predicted encoding block and an original encoding block;
a transform skip judging unit, configured to judge whether the quantized residual data is residual data in a transform skip mode;
and an arithmetic coding unit, configured to code the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data after determining that the quantized residual data is residual data in the transform skip mode.
The embodiment of the application also provides an electronic device, which comprises a processor and a memory; wherein the memory is to store one or more computer instructions, wherein the one or more computer instructions are to be executed by the processor to:
obtaining quantized residual data to be coded, wherein the quantized residual data are obtained by quantizing difference data between a predicted coding block and an original coding block;
judging whether the quantized residual data is residual data in a transform skip mode;
and if the quantized residual data is residual data in a transform skip mode, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data.
Embodiments of the present application also provide a computer-readable storage medium storing computer-readable instructions executable by one or more processors, the computer-readable instructions, when executed by the one or more processors, causing the one or more processors to perform operations comprising:
obtaining quantized residual data to be coded, wherein the quantized residual data are obtained by quantizing difference data between a predicted coding block and an original coding block;
judging whether the quantized residual data is residual data in a transform skip mode;
and if the quantized residual data is residual data in a transform skip mode, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data.
Compared with the prior art, the embodiment of the application has the following advantages:
The video data encoding method provided by the embodiments of the application includes the following steps: obtaining quantized residual data to be coded, where the quantized residual data is obtained by quantizing difference data between a predicted coding block and an original coding block; judging whether the quantized residual data is residual data in a transform skip mode; and, if so, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data. In the transform skip mode of video data coding, the scheme codes the residual data to be coded with an arithmetic coding method based on the statistical correlation of the residual data, and thereby avoids the loss of video compression performance caused in the prior art by coding transform-skip residual data with a transform-coefficient coding scheme.
Drawings
Fig. 1 is a flowchart of a video data encoding method according to a first embodiment of the present application;
FIG. 1-A is a diagram illustrating the scanning order in the transform skip mode according to the first embodiment of the present application;
FIG. 1-B is a diagram illustrating the zigzag scanning order used in transform coefficient coding according to the first embodiment of the present application;
fig. 2 is a block diagram of a video data encoding apparatus according to a second embodiment of the present application;
fig. 3 is a schematic logical structure diagram of an electronic device according to a third embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The application can, however, be implemented in many ways other than those described herein, and those skilled in the art can make similar generalizations without departing from the spirit of the application; the application is therefore not limited to the specific implementations disclosed below.
For video data coding scenarios, and in order to save coding bit rate in the transform skip mode, the present application provides a video data coding method, together with a corresponding video data coding apparatus and an electronic device. Embodiments are provided below to explain the method, the apparatus, and the electronic device in detail.
A first embodiment of the present application provides a video data encoding method; the method may be executed by a computing device that encodes video data. FIG. 1 is a flowchart of the video data encoding method provided in the first embodiment of the present application, and the method provided in this embodiment is described in detail below with reference to FIG. 1. The following description refers to embodiments for the purpose of illustrating the principles of the method and is not limiting for actual use.
As shown in fig. 1, the video data encoding method provided in this embodiment includes the following steps:
s101, obtaining quantized residual data to be coded.
The quantized residual data is obtained by quantizing the residual data between the predicted coding block and the original coding block, where the residual data represents the pixel differences between the original coding block and the predicted coding block. In the existing residual coding approach, the residual data has to be transformed from the pixel domain to the transform domain, usually by a DCT. The DCT maps image data represented by spatial-domain pixels to transform coefficients in the DCT frequency domain; it decorrelates the data, so that the transform coefficients are mutually independent, and it is entropy-preserving, losing no information during the transform.
After the residual data is quantized, the result is the quantized residual data, presented in the form of a two-dimensional matrix.
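As a minimal illustration of this step, the sketch below (in Python, with hypothetical names; the embodiment does not prescribe a particular quantizer or quantization step) builds the quantized residual block from the original and predicted coding blocks:

    import numpy as np

    def quantized_residual(original_block: np.ndarray,
                           predicted_block: np.ndarray,
                           q_step: float) -> np.ndarray:
        """Residual = original pixels minus predicted pixels, then uniform quantization.
        'q_step' is a hypothetical quantization step, not specified by the text."""
        residual = original_block.astype(np.int32) - predicted_block.astype(np.int32)
        # Sign-aware rounding to the nearest quantization level keeps the 2-D layout.
        return (np.sign(residual) *
                np.floor(np.abs(residual) / q_step + 0.5)).astype(np.int32)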
S102, judging whether the quantized residual data is residual data in the transform skip mode.
The transform skip mode (Transform Skip) is particularly effective for video sequences whose scenes change slowly, and mainly applies to intra 4×4 coding blocks or inter 4×4 coding blocks. In the transform skip mode, the video encoder quantizes and entropy-codes the pixel-domain residual data without transforming it to the transform domain, and the video decoder correspondingly entropy-decodes and inverse-quantizes the pixel-domain residual data. The video encoder may generate a syntax element "transform_skip_flag" that identifies whether the residual data to be coded is residual data in the transform skip mode, and include this syntax element in the residual coding syntax. The value of "transform_skip_flag" may enable or disable transform skip for all residual blocks of a sequence; in this description, when "transform_skip_flag" is 0 the transform skip mode is used for coding, and when it is 1 the transform skip mode is not used. In this embodiment, whether the quantized residual data is residual data in the transform skip mode can therefore be determined from whether the value of the "transform_skip_flag" syntax element corresponding to the quantized residual data is 0 or 1.
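The judgment of step S102 can be sketched as follows (Python, a minimal illustration only; the syntax-element lookup is a hypothetical helper, and the 0/1 convention follows the description above):

    def is_transform_skip_residual(syntax_elements: dict) -> bool:
        """Return True if the quantized residual data belongs to the transform skip
        mode, i.e. transform_skip_flag == 0 under the convention described above."""
        return syntax_elements.get("transform_skip_flag", 1) == 0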
S103, if the quantized residual data is residual data in the transform skip mode, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data.
This step codes the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data, after the judgment of step S102 has determined that the quantized residual data is residual data in the transform skip mode.
Context-based Adaptive Binary Arithmetic Coding (CABAC) is the entropy coding algorithm used in the H.264/MPEG-4 AVC video coding standard. As an important compression module of a video coding system, CABAC sits at the end of the coding pipeline and is responsible for processing the series of syntax elements produced by the prediction and transform/quantization stages. Its core algorithm is arithmetic coding, in which the coding of each syntax element depends on the previous coding results and on the overall probability characteristics of the source symbol sequence rather than on the probability characteristics of a single syntax element. In the H.264 coding standard, syntax elements are the basic data units of the input bitstream; the syntax elements related to residual data in existing CABAC are mainly the coded block flag (coded_block_flag), the significant coefficient flag (significant_coeff_flag), the last significant coefficient flag (last_significant_coeff_flag), and the coefficient sign flag (coeff_sign_flag).
In this embodiment, the statistical correlation of the residual data may refer to the correlation of the syntax element symbols (signs) of the residual data, the significance correlation of the residual data over a spatial neighborhood, and the probability distribution characteristics of the residual data. Correspondingly, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data may use one of, or a combination of at least two of, the following modes:
Mode 1: coding the syntax element signs (coeff_sign_flag) of the quantized residual data with a sign information probability model, based on the correlation of the syntax element signs of the residual data;
Mode 2: performing significance map coding (Significance Map) on the quantized residual data, based on the significance correlation of the residual data over the spatial neighborhood;
Mode 3: binarizing and coding the syntax element magnitudes of the quantized residual data with a binarization scheme matched to the probability distribution characteristics of the quantized residual data, based on the probability distribution characteristics of the residual data.
The correlation of the syntax element signs of the residual data means that the signs of the syntax element magnitudes in an untransformed residual coding block tend to be consistent: if the sign of a previously coded syntax element magnitude is positive ("+"), the sign of the syntax element magnitude to be coded also tends to be positive ("+"). In Mode 1, coding the signs of the syntax elements of the quantized residual data with a sign information probability model, based on the correlation of the syntax element signs of the residual data, may specifically be: according to the correlation of the sign information of adjacent syntax elements in the residual coding block corresponding to the quantized residual data, obtain the sign information probability model corresponding to the sign of the syntax element to be coded from the already coded syntax element sign information, and arithmetically code the sign of the syntax element to be coded with that probability model. In this embodiment the sign information probability model is dedicated to the transform skip mode; its selection and update rules are the same as those of the probability models in existing arithmetic coding and are not repeated here.
In existing transform coefficient coding, "coeff_sign_flag" is a 1-bit symbol coded in the bypass coding mode. The bypass coding mode codes with a fixed probability, is mainly used for symbols whose probability is close to 0.5, and omits the assignment and update of a probability model, so it enables fast coding of the symbol. After the residual data has undergone a DCT, the sign information of the transform coefficients is essentially random, so the bypass coding mode is appropriate in that case. In this embodiment, because the signs of the residual data within one prediction block usually have strong correlation (for example, the probability that the signs at adjacent positions are all positive or all negative is relatively high), the sign information probability model replaces the bypass coding mode when coding the sign information of the residual data of the block to be coded in the transform skip mode, which improves the accuracy of the probability estimation and hence the coding efficiency.
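A minimal sketch of Mode 1 is given below (Python); the two-context split on the previously coded neighbouring sign and the encode_bin call on a generic adaptive binary arithmetic coder are hypothetical illustrations, not an interface defined by this embodiment:

    def encode_residual_sign(coder, sign_contexts, value: int, prev_sign: int) -> None:
        """Code the sign of a non-zero residual sample with an adaptive context
        model instead of bypass coding. 'prev_sign' is the sign (+1 or -1) of the
        most recently coded neighbouring syntax element; choosing the context from
        it exploits the tendency of neighbouring signs to agree."""
        sign_bin = 1 if value < 0 else 0
        ctx = sign_contexts[0] if prev_sign >= 0 else sign_contexts[1]
        coder.encode_bin(sign_bin, ctx)  # context-coded (adaptive), not bypass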
The significance correlation of the residual data over a spatial neighborhood means that the significance of the syntax element magnitudes in an untransformed residual coding block tends to be consistent within a spatial neighborhood: if a previously coded syntax element magnitude is a non-zero (significant) magnitude, the syntax element magnitude to be coded also tends to be a non-zero (significant) magnitude. In Mode 2, performing significance map coding on the quantized residual data based on the significance correlation of the residual data over the spatial neighborhood may refer to: obtaining the scanning position information of the syntax element to be coded in the quantized residual data and obtaining the context model data of the syntax element to be coded, based on the significance correlation of the residual data over the spatial neighborhood; and performing significance map coding on the syntax element to be coded according to its scanning position information and context model data.
Obtaining the scanning position of the syntax element to be coded in the quantized residual data based on the significance correlation of the residual data over the spatial neighborhood may specifically refer to the following (a sketch of the 2-D-to-1-D mapping follows this passage):
first, scanning the syntax element magnitudes in the residual coding block corresponding to the quantized residual data according to the significance similarity between syntax elements of the same spatial neighborhood, to obtain the scanning path information of the syntax elements in the residual coding block; for example, for any two adjacent syntax elements whose position information is known, determine the position information of the next syntax element lying in the same spatial neighborhood as them, thereby obtaining the scanning order of the syntax elements in the residual coding block (as shown in FIG. 1-A), and scan the syntax element magnitudes in the residual coding block in that order to obtain the scanning path of the syntax elements in the residual coding block;
second, determining the position information of the syntax element to be coded in the residual coding block based on the scanning path information. The purpose of the scan is to map the two-dimensional array corresponding to the residual data onto a one-dimensional list, from which the position information of the syntax element to be coded in the residual coding block can be determined.
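To make the mapping concrete, the sketch below (Python) maps a 4×4 quantized residual block onto a one-dimensional list; SCAN_ORDER is only a placeholder for the neighbourhood-preserving scan order of FIG. 1-A, whose exact sequence is not spelled out numerically in the text:

    # Placeholder scan order: (row, col) pairs for scan positions 0..15 of a 4x4 block.
    # A raster order is used here purely as a stand-in for the order of FIG. 1-A.
    SCAN_ORDER = [(r, c) for r in range(4) for c in range(4)]

    def scan_block(block) -> list:
        """Map the 2-D residual coding block onto a 1-D list following SCAN_ORDER,
        so that entry i is the syntax element at scan position i."""
        return [block[r][c] for (r, c) in SCAN_ORDER]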
Obtaining the context model data of the syntax element to be coded may specifically refer to: obtaining the context model data of the syntax element to be coded from the information of the already coded syntax elements that lie in the same spatial neighborhood as the syntax element to be coded within the residual coding block corresponding to the quantized residual data; the context model data provides the probability estimate for the current syntax element to be coded. For example, first obtain the probability model data of two adjacent already coded syntax elements of the syntax element to be coded according to the scanning path information of the syntax elements in the residual coding block; in this embodiment the two adjacent coded syntax elements may be the syntax elements in the left neighboring block and the upper neighboring block of the block in which the syntax element to be coded is located. Then obtain the context variable of the syntax element to be coded from the probability model data of those two adjacent coded syntax elements. In this way the correlation between syntax elements at adjacent positions is exploited, and an appropriate context model is selected for the syntax element to be coded using the already coded syntax elements.
In conventional transform coefficient coding, significance map coding scans the non-zero quantized transform coefficients in zigzag order (as shown in FIG. 1-B). For the significance map coding of transform coefficients, up to 15 different probability models are used for the significant coefficient flag "significant_coeff_flag" and the last significant coefficient flag "last_significant_coeff_flag". The choice of model and the corresponding context index increment are determined by the zigzag scan position of the currently coded syntax element within the coded block; that is, for the coefficient coeff[i] scanned at the i-th position, the context index increment is determined as follows:
CtxInc_sig(coeff[i]) = CtxInc_last(coeff[i]) = i
In the above process, because a transform is used, coefficients at the same position typically have similar properties; for example, the low-frequency coefficients in the upper-left corner usually have larger magnitudes, or a higher probability of being significant, than the high-frequency coefficients in the lower-right corner.
In this embodiment, because no transform is used, residual samples at the same position no longer share such position-dependent features; instead, the residual data generally shares similar significance features with its spatial neighborhood. Therefore, when significance map coding is applied to the residual block to be coded in the transform skip mode, only 3 different probability models are used, and the choice of model, i.e. the corresponding context index increment CtxInc_sig, depends on the spatial neighborhood; that is, for the residual sample res[i] scanned at the i-th position, the context index increment is determined as follows (see also the sketch after Table 1):
CtxInc_sig(res[i]) = (res[a] != 0 ? 1 : 0) + (res[b] != 0 ? 1 : 0),   i >= 2
CtxInc_sig(res[i]) = (res[i-1] != 0 ? 1 : 0),                         i = 1
CtxInc_sig(res[i]) = 1,                                               i = 0
where a and b are already coded syntax elements in the same spatial neighborhood as the syntax element i to be coded. The correspondence between the syntax element i to be coded and the coded syntax elements a and b in the same spatial neighborhood, for the scanning order of FIG. 1-A, is shown in Table 1:
i a b
2 0 1
3 1 2
4 2 3
5 1 3
6 3 5
7 5 6
8 3 4
9 4 8
10 8 9
11 6 8
12 6 7
13 11 12
14 10 11
15 13 14
TABLE 1
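The context index increments above, together with the (a, b) pairs of Table 1, can be computed as in the following sketch (Python); res is the residual block already mapped to scan order, and this is an illustration of the stated formulas rather than an implementation mandated by the embodiment:

    # (a, b) neighbour scan positions from Table 1, for scan positions i = 2..15.
    NEIGHBOURS = {
        2: (0, 1),    3: (1, 2),    4: (2, 3),    5: (1, 3),
        6: (3, 5),    7: (5, 6),    8: (3, 4),    9: (4, 8),
        10: (8, 9),   11: (6, 8),   12: (6, 7),   13: (11, 12),
        14: (10, 11), 15: (13, 14),
    }

    def ctx_inc_sig(res, i: int) -> int:
        """Context index increment for the significance flag of res[i]."""
        if i == 0:
            return 1
        if i == 1:
            return 1 if res[0] != 0 else 0
        a, b = NEIGHBOURS[i]
        return (1 if res[a] != 0 else 0) + (1 if res[b] != 0 else 0)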
Performing significance map coding on the syntax element to be coded according to its scanning position information and context model data may specifically refer to: performing significance map coding on the magnitude of the syntax element to be coded according to its scanning position information and context model data, and obtaining as coding result whether the magnitude of the syntax element to be coded is a significant magnitude ("1") or a non-significant magnitude ("0"). In this embodiment, if the magnitude of the syntax element to be coded is a significant magnitude ("1"), the absolute value of the magnitude minus 1 is additionally binarized and coded with the joint binarization scheme of TU and EG2 (UEGk, Unary/k-th order Exp-Golomb).
CABAC codes the syntax elements of the slice data; before arithmetic coding, a syntax element has to be converted into a binary string suitable for binary arithmetic coding, and this conversion is called binarization. The probability distribution characteristic of the residual data means that the probability density function of the syntax element magnitudes in an untransformed residual coding block is relatively flat. In Mode 3, binarizing the syntax element magnitudes of the quantized residual data with a binarization scheme matched to their probability distribution essentially means selecting, for the absolute values of the syntax element magnitudes in the untransformed residual coding block, codewords matched to the shape of the corresponding probability density function. In this embodiment, the process may specifically refer to: binarizing the absolute value of the magnitude of the syntax element to be coded with the joint binarization scheme of the Truncated Unary code (TU) and the k-th order Exp-Golomb code with k = 2 (EG2); in this joint scheme, the prefix part of the absolute magnitude (within a small range) is binarized with the TU scheme, and the suffix part is binarized with the EG2 scheme.
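A sketch of this joint TU + EG2 binarization is given below (Python). The truncation value of the TU prefix is left as a parameter, since this embodiment does not state it (the transform-coefficient case quoted below uses a truncation value of 14 with order 0):

    def tu_bins(value: int, c_max: int) -> str:
        """Truncated Unary prefix: 'value' ones, followed by a terminating zero
        unless the truncation value c_max is reached."""
        return "1" * value + ("0" if value < c_max else "")

    def egk_bins(value: int, k: int) -> str:
        """k-th order Exp-Golomb suffix, iterative form used in CABAC binarization."""
        bits = ""
        while value >= (1 << k):
            bits += "1"
            value -= 1 << k
            k += 1
        bits += "0"
        return bits + (format(value, "b").zfill(k) if k > 0 else "")

    def ueg2_bins(abs_level_minus1: int, cutoff: int) -> str:
        """Joint binarization: TU prefix up to 'cutoff', EG2 suffix for the remainder."""
        bits = tu_bins(min(abs_level_minus1, cutoff), cutoff)
        if abs_level_minus1 >= cutoff:
            bits += egk_bins(abs_level_minus1 - cutoff, k=2)
        return bits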
In the prior art, different syntax elements use different truncation values and orders; for example, the absolute value of a motion vector difference is binarized with UEG3 and a truncation value of 9, and the chroma intra prediction mode intra_chroma_pred_mode is binarized with TU and a truncation value of 3. In transform coefficient coding, the syntax element "absolute value of the transform coefficient minus 1" (coeff_abs_level_minus1) uses the UEG0 binarization scheme with a truncation value of 14. For example, for an input syntax element with absolute magnitude abs_level = 20, i.e. coeff_abs_level_minus1 = 19, the value of coeff_abs_level_minus1 is coded with the UEG0 scheme with truncation value S = 14 and order k = 0, as shown in Table 2.
[Table 2: UEG0 binarization (S = 14, k = 0) of coeff_abs_level_minus1; reproduced only as an image in the original publication.]
Compared with EG0, the 2nd-order Exp-Golomb code EG2 better matches the shape of the probability density function of the absolute syntax element magnitudes in an untransformed residual coding block. Therefore, in this embodiment, the joint binarization scheme UEG2 of TU and the 2nd-order Exp-Golomb code EG2 is used to binarize the absolute magnitude of the syntax element to be coded (coeff_abs_level_minus1). A comparison between the binarization scheme of transform coefficient coding and the binarization scheme of this embodiment is shown in Table 3:
[Table 3: comparison of the transform-coefficient binarization scheme (UEG0) with the binarization scheme of this embodiment (UEG2); reproduced only as an image in the original publication.]
In the video data coding method provided by this embodiment, in the transform skip mode of video data coding: the syntax element signs of the quantized residual data are coded with a sign information probability model, based on the correlation of the syntax element signs of the residual data; significance map coding is applied to the quantized residual data, based on the significance correlation of the residual data over the spatial neighborhood; and the syntax element magnitudes of the quantized residual data are binarized with a binarization scheme matched to the probability distribution characteristics of the quantized residual data. The scheme codes the residual data to be coded with an arithmetic coding method based on the statistical correlation of the residual data; in essence, after the transform is skipped the residual data is coded as residual data rather than as transform coefficients, which improves compression performance and avoids the loss of video compression performance caused by coding transform-skip residual data with the existing transform-coefficient coding scheme.
The video data coding method provided by this embodiment is a compression scheme for residual data and can be applied to all scenarios involving video encoding and decoding, such as encoders, video on demand, live video, audio/video communication, video transmission, and video compression. Compared with the existing transform-coefficient coding method (including sign information coding, significance map coding, and magnitude coding), it can save more than 2% of the bit rate at the same video quality (PSNR).
The second embodiment of the present application also provides a video data encoding apparatus, which is substantially similar to the method embodiment and therefore is described more simply, and the details of the related technical features can be found in the corresponding description of the method embodiment provided above, and the following description of the apparatus embodiment is only illustrative.
Referring to FIG. 2, which is a block diagram of the units of the apparatus provided in this embodiment, the apparatus provided in this embodiment includes:
a quantized residual data obtaining unit 201, configured to obtain quantized residual data to be encoded, where the quantized residual data is obtained by quantizing difference data between a predicted encoding block and an original encoding block;
a transform skip judging unit 202, configured to judge whether the quantized residual data is residual data in a transform skip mode;
an arithmetic coding unit 203, configured to code the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data after determining that the quantized residual data is residual data in the transform skip mode.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and coding the syntax element symbols of the quantized residual data by adopting a symbol information probability model based on the correlation of the syntax element symbols of the residual data.
Optionally, the encoding the syntax element symbol of the quantized residual data by using a symbol information probability model based on the correlation of the syntax element symbol of the residual data includes:
obtaining a symbol information probability model corresponding to a syntax element symbol to be coded based on coded syntax element symbol information according to the correlation of symbol information of adjacent syntax elements in a residual coding block corresponding to quantized residual data; and carrying out arithmetic coding on the syntactic element to be coded based on the symbolic information probability model.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and performing significance mapping coding on the quantized residual data based on the significance correlation of the residual data on the spatial neighborhood.
Optionally, the performing significance mapping coding on the quantized residual data based on the significant correlation of the residual data in the spatial neighborhood includes:
obtaining scanning position information of a syntax element to be coded in the quantized residual data and obtaining context model data of the syntax element to be coded based on the significant correlation of the residual data on a spatial neighborhood;
and performing significance mapping coding on the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded.
Optionally, the obtaining a scanning position of a syntax element to be coded in the quantized residual data based on a significant correlation of the residual data in a spatial neighborhood includes:
scanning the amplitude of the syntax element in the residual coding block according to the significant similarity characteristic between the syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data to obtain the scanning path information corresponding to the syntax element in the residual coding block;
and obtaining the position information of the syntactic element to be coded in the residual coding block based on the scanning path information.
Optionally, the scanning, according to similar features between syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data, the amplitude of the syntax element in the residual coding block is scanned, so as to obtain a scanning path corresponding to the syntax element in the residual coding block, including:
and determining the position information of the next syntax element which is positioned in the same spatial neighborhood with the adjacent syntax element based on any two adjacent syntax elements with known position information in the residual coding block, so as to obtain the scanning sequence of the syntax elements in the residual coding block, and scanning the amplitude of the syntax elements in the residual coding block based on the scanning sequence to obtain the scanning path corresponding to the syntax elements in the residual coding block.
Optionally, the obtaining context model data of the syntax element to be encoded includes: and obtaining context model data of the syntax element to be coded according to the information of the coded syntax element which is positioned in the same spatial neighborhood with the syntax element to be coded in the residual coding block corresponding to the quantized residual data.
Optionally, the obtaining context model data of the syntax element to be coded according to information of coded syntax elements in the residual coding block and in the same spatial neighborhood as the syntax element to be coded includes: obtaining probability model data of two adjacent coded syntax elements of the syntax element to be coded according to scanning path information corresponding to the syntax element in the residual coding block; and obtaining the context variable of the syntax element to be coded according to the probability model data of the two adjacent coded syntax elements.
Optionally, performing significance mapping coding on the syntax element to be coded according to the scanning position information and the context model data of the syntax element to be coded, including: and performing significance mapping coding on the amplitude of the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded, and obtaining a coding result that the amplitude of the syntactic element to be coded is a significance amplitude value of '1' or a non-significance amplitude value of '0'.
Optionally, if the magnitude of the syntax element to be encoded is a significance magnitude of "1", the method further includes: and binary coding the absolute value of the amplitude of the syntax element to be coded minus1 by adopting a joint binarization scheme of TU and EG 2.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and based on the probability distribution characteristics of the residual error data, binary coding the syntax element amplitude of the quantized residual error data by adopting a binarization scheme matched with the probability distribution characteristics of the quantized residual error data.
Optionally, the binary coding of the syntax element amplitude of the quantized residual data by using a binarization scheme matched with the probability distribution characteristics of the quantized residual data includes: and binary coding the amplitude absolute value of the syntax element to be coded by adopting a joint binarization scheme of TU and EG 2.
Optionally, the binary coding of the absolute value of the amplitude of the syntax element to be coded by using a joint binarization scheme of TU and EG2 includes: and binary coding is carried out on the prefix part of the absolute value of the amplitude of the syntactic element to be coded by adopting a TU binarization scheme, and binary coding is carried out on the suffix part of the absolute value of the amplitude of the syntactic element to be coded by adopting an EG2 binarization scheme.
In the above embodiments, a video data encoding method and a video data encoding apparatus are provided, and in addition, a third embodiment of the present application also provides an electronic device, which is basically similar to the method embodiment and therefore is relatively simple to describe, and the details of the related technical features may be obtained by referring to the corresponding description of the method embodiment provided above, and the following description of the electronic device embodiment is only illustrative. The embodiment of the electronic equipment is as follows:
Please refer to FIG. 3, which is a schematic diagram of the electronic device provided in this embodiment.
As shown in fig. 3, the electronic device includes: a processor 301; a memory 302;
the memory 302 is used for storing a video coding program, and when the program is read and executed by the processor, the program performs the following operations:
obtaining quantized residual data to be coded, wherein the quantized residual data are obtained by quantizing difference data between a predicted coding block and an original coding block;
judging whether the quantized residual data is residual data in a transform skip mode;
and if the quantized residual data is residual data in a transform skip mode, coding the quantized residual data with an arithmetic coding method based on the statistical correlation of the residual data.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and coding the syntax element symbols of the quantized residual data by adopting a symbol information probability model based on the correlation of the syntax element symbols of the residual data.
Optionally, the encoding the syntax element symbol of the quantized residual data by using a symbol information probability model based on the correlation of the syntax element symbol of the residual data includes:
obtaining a symbol information probability model corresponding to a syntax element symbol to be coded based on coded syntax element symbol information according to the correlation of symbol information of adjacent syntax elements in a residual coding block corresponding to quantized residual data; and carrying out arithmetic coding on the syntactic element to be coded based on the symbolic information probability model.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and performing significance mapping coding on the quantized residual data based on the significance correlation of the residual data on the spatial neighborhood.
Optionally, the performing significance mapping coding on the quantized residual data based on the significant correlation of the residual data in the spatial neighborhood includes:
obtaining scanning position information of a syntax element to be coded in the quantized residual data and obtaining context model data of the syntax element to be coded based on the significant correlation of the residual data on a spatial neighborhood;
and performing significance mapping coding on the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded.
Optionally, the obtaining a scanning position of a syntax element to be coded in the quantized residual data based on a significant correlation of the residual data in a spatial neighborhood includes:
scanning the amplitude of the syntax element in the residual coding block according to the significant similarity characteristic between the syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data to obtain the scanning path information corresponding to the syntax element in the residual coding block;
and obtaining the position information of the syntactic element to be coded in the residual coding block based on the scanning path information.
Optionally, the scanning, according to similar features between syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data, the amplitude of the syntax element in the residual coding block is scanned, so as to obtain a scanning path corresponding to the syntax element in the residual coding block, including:
and determining the position information of the next syntax element which is positioned in the same spatial neighborhood with the adjacent syntax element based on any two adjacent syntax elements with known position information in the residual coding block, so as to obtain the scanning sequence of the syntax elements in the residual coding block, and scanning the amplitude of the syntax elements in the residual coding block based on the scanning sequence to obtain the scanning path corresponding to the syntax elements in the residual coding block.
Optionally, the obtaining context model data of the syntax element to be encoded includes: and obtaining context model data of the syntax element to be coded according to the information of the coded syntax element which is positioned in the same spatial neighborhood with the syntax element to be coded in the residual coding block corresponding to the quantized residual data.
Optionally, the obtaining context model data of the syntax element to be coded according to information of coded syntax elements in the residual coding block and in the same spatial neighborhood as the syntax element to be coded includes: obtaining probability model data of two adjacent coded syntax elements of the syntax element to be coded according to scanning path information corresponding to the syntax element in the residual coding block; and obtaining the context variable of the syntax element to be coded according to the probability model data of the two adjacent coded syntax elements.
Optionally, performing significance mapping coding on the syntax element to be coded according to the scanning position information and the context model data of the syntax element to be coded, including: and performing significance mapping coding on the amplitude of the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded, and obtaining a coding result that the amplitude of the syntactic element to be coded is a significance amplitude value of '1' or a non-significance amplitude value of '0'.
Optionally, if the magnitude of the syntax element to be encoded is a significance magnitude of "1", the method further includes: and binary coding the absolute value of the amplitude of the syntax element to be coded minus 1 by adopting a joint binarization scheme of TU and EG 2.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and based on the probability distribution characteristics of the residual data, binary coding the syntax element amplitude of the quantized residual data by adopting a binarization scheme matched with the probability distribution characteristics of the quantized residual data.
Optionally, the binary coding of the syntax element amplitude of the quantized residual data by using a binarization scheme matched with the probability distribution characteristics of the quantized residual data includes: and binary coding the amplitude absolute value of the syntax element to be coded by adopting a joint binarization scheme of TU and EG 2.
Optionally, the binary coding of the absolute value of the amplitude of the syntax element to be coded by using a joint binarization scheme of TU and EG2 includes: and binary coding is carried out on the prefix part of the absolute value of the amplitude of the syntactic element to be coded by adopting a TU binarization scheme, and binary coding is carried out on the suffix part of the absolute value of the amplitude of the syntactic element to be coded by adopting an EG2 binarization scheme.
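A minimal sketch of such a joint TU and EG2 binarization is given below in Python. The truncated-unary cutoff of 5 and the helper names are assumptions chosen for illustration; the embodiment above only states that the prefix part uses a TU binarization scheme and the suffix part uses an EG2 binarization scheme.

# Illustrative sketch, assuming a truncated-unary cutoff chosen purely for
# demonstration. Values below the cutoff get a TU prefix only; larger values
# additionally get a 2nd-order Exp-Golomb (EG2) suffix.
CMAX = 5  # assumed cutoff for the TU prefix

def truncated_unary(value, c_max):
    bins = [1] * min(value, c_max)
    if value < c_max:
        bins.append(0)          # terminating zero only when the prefix is not saturated
    return bins

def exp_golomb(value, k=2):
    bins = []
    while value >= (1 << k):    # unary part: one "1" per covered 2^k interval
        bins.append(1)
        value -= (1 << k)
        k += 1
    bins.append(0)              # separator bit
    bins.extend((value >> i) & 1 for i in range(k - 1, -1, -1))  # fixed-length remainder
    return bins

def tu_eg2_binarize(abs_level):
    bins = truncated_unary(abs_level, CMAX)              # prefix part (TU)
    if abs_level >= CMAX:
        bins += exp_golomb(abs_level - CMAX, k=2)        # suffix part (EG2)
    return bins

Under the assumed cutoff, an amplitude absolute value of 7, for example, would be binarized as the TU prefix 11111 followed by the EG2 suffix 010.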
In the above embodiments, a video data encoding method, a video data encoding apparatus, and an electronic device are provided. Furthermore, a fourth embodiment of the present application also provides a computer-readable storage medium for implementing the above video data encoding method. The embodiments of the computer-readable storage medium provided in the present application are described relatively simply; for relevant portions, reference may be made to the corresponding descriptions of the above method embodiments. The embodiments described below are merely illustrative.
The present embodiments provide a computer readable storage medium having stored thereon computer instructions that, when executed by a processor, perform the steps of:
obtaining quantized residual data to be coded, wherein the quantized residual data are obtained by quantizing difference data between a predicted coding block and an original coding block;
judging whether the quantized residual data is residual data in a transform skip mode;
and if the quantized residual data is residual data in a transform skip mode, encoding the quantized residual data by adopting an arithmetic encoding method based on the statistical correlation of the residual data.
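Purely as an illustration of this control flow, the Python sketch below dispatches a quantized residual block to a dedicated transform-skip coding path; the two coder callables are hypothetical placeholders, since the storage-medium embodiment does not name concrete routines.

from typing import Callable, List, Sequence

def encode_quantized_residual(
    quantized_residual: Sequence[Sequence[int]],
    transform_skip: bool,
    ts_residual_coder: Callable[[Sequence[Sequence[int]]], List[int]],
    regular_residual_coder: Callable[[Sequence[Sequence[int]]], List[int]],
) -> List[int]:
    # Transform-skip blocks keep their spatial-domain statistics, so they are
    # routed to the arithmetic-coding tools built on the statistical
    # correlation of the residual data; other blocks use the regular path.
    if transform_skip:
        return ts_residual_coder(quantized_residual)
    return regular_residual_coder(quantized_residual)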
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and coding the syntax element symbols of the quantized residual data by adopting a symbol information probability model based on the correlation of the syntax element symbols of the residual data.
Optionally, the encoding the syntax element symbol of the quantized residual data by using a symbol information probability model based on the correlation of the syntax element symbol of the residual data includes:
obtaining a symbol information probability model corresponding to a syntax element symbol to be coded based on coded syntax element symbol information according to the correlation of symbol information of adjacent syntax elements in a residual coding block corresponding to the quantized residual data; and carrying out arithmetic coding on the symbol of the syntax element to be coded based on the symbol information probability model.
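The sketch below illustrates, under stated assumptions, how a sign probability model could be selected from already coded neighboring signs and updated adaptively. The counter-based model, the left/above neighbor rule, and the class and function names are assumptions for this example and are not the normative context derivation.

# Illustrative sketch only: the sign bin of the current residual sample is coded
# with one of three adaptive probability models, selected from the signs of the
# left and above neighbors (both negative / mixed or unknown / both positive).
class AdaptiveBinModel:
    def __init__(self):
        self.count = [1, 1]          # Laplace-smoothed counts of 0-bins and 1-bins

    def prob_one(self):
        return self.count[1] / (self.count[0] + self.count[1])

    def update(self, bin_value):
        self.count[bin_value] += 1

def sign_context(block, r, c):
    def sgn(v):
        return (v > 0) - (v < 0)
    left = sgn(block[r][c - 1]) if c > 0 else 0
    above = sgn(block[r - 1][c]) if r > 0 else 0
    s = left + above
    return 1 + (s > 0) - (s < 0)     # context index 0, 1 or 2

def code_sign(block, r, c, models):
    # assumes block[r][c] != 0, since sign bins are only coded for significant samples
    sign_bin = 1 if block[r][c] < 0 else 0
    model = models[sign_context(block, r, c)]
    p_one = model.prob_one()         # probability handed to the binary arithmetic coder
    model.update(sign_bin)
    return sign_bin, p_one

A caller would typically hold three such models, for example models = [AdaptiveBinModel() for _ in range(3)], and feed the returned probability to the binary arithmetic coder.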
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and performing significance mapping coding on the quantized residual data based on the significance correlation of the residual data on the spatial neighborhood.
Optionally, the performing significance mapping coding on the quantized residual data based on the significant correlation of the residual data in the spatial neighborhood includes:
obtaining scanning position information of a syntax element to be coded in the quantized residual data and obtaining context model data of the syntax element to be coded based on the significant correlation of the residual data on a spatial neighborhood;
and performing significance mapping coding on the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded.
Optionally, the obtaining scanning position information of a syntax element to be coded in the quantized residual data based on the significant correlation of the residual data on a spatial neighborhood includes:
scanning the amplitude of the syntax element in the residual coding block according to the significant similarity characteristic between the syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data to obtain the scanning path information corresponding to the syntax element in the residual coding block;
and obtaining the position information of the syntactic element to be coded in the residual coding block based on the scanning path information.
Optionally, the scanning the amplitude of the syntax element in the residual coding block according to the significant similarity characteristic between the syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data to obtain the scanning path corresponding to the syntax element in the residual coding block includes:
and determining the position information of the next syntax element which is positioned in the same spatial neighborhood with the adjacent syntax element based on any two adjacent syntax elements with known position information in the residual coding block, so as to obtain the scanning sequence of the syntax elements in the residual coding block, and scanning the amplitude of the syntax elements in the residual coding block based on the scanning sequence to obtain the scanning path corresponding to the syntax elements in the residual coding block.
Optionally, the obtaining context model data of the syntax element to be encoded includes: and obtaining context model data of the syntax element to be coded according to the information of the coded syntax element which is positioned in the same spatial neighborhood with the syntax element to be coded in the residual coding block corresponding to the quantized residual data.
Optionally, the obtaining context model data of the syntax element to be coded according to information of coded syntax elements in the residual coding block and in the same spatial neighborhood as the syntax element to be coded includes: obtaining probability model data of two adjacent coded syntax elements of the syntax element to be coded according to scanning path information corresponding to the syntax element in the residual coding block; and obtaining the context variable of the syntax element to be coded according to the probability model data of the two adjacent coded syntax elements.
Optionally, performing significance mapping coding on the syntax element to be coded according to the scanning position information and the context model data of the syntax element to be coded, including: and performing significance mapping coding on the amplitude of the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded, and obtaining a coding result that the amplitude of the syntactic element to be coded is a significance amplitude value of '1' or a non-significance amplitude value of '0'.
Optionally, if the magnitude of the syntax element to be encoded is a significance magnitude of "1", the method further includes: and binary coding the absolute value of the amplitude of the syntax element to be coded minus 1 by adopting a joint binarization scheme of TU and EG 2.
Optionally, the encoding the quantized residual data by using an arithmetic coding method based on the statistical correlation of the residual data includes: and based on the probability distribution characteristics of the residual data, binary coding the syntax element amplitude of the quantized residual data by adopting a binarization scheme matched with the probability distribution characteristics of the quantized residual data.
Optionally, the binary coding of the syntax element amplitude of the quantized residual data by using a binarization scheme matched with the probability distribution characteristics of the quantized residual data includes: and binary coding the amplitude absolute value of the syntax element to be coded by adopting a joint binarization scheme of TU and EG 2.
Optionally, the binary coding of the absolute value of the amplitude of the syntax element to be coded by using a joint binarization scheme of TU and EG2 includes: and binary coding is carried out on the prefix part of the absolute value of the amplitude of the syntactic element to be coded by adopting a TU binarization scheme, and binary coding is carried out on the suffix part of the absolute value of the amplitude of the syntactic element to be coded by adopting an EG2 binarization scheme.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory, random access memory (RAM) and/or non-volatile memory in a computer readable medium, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
1. Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
2. As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Although the present application has been described with reference to the preferred embodiments, these embodiments are not intended to limit the present application. Those skilled in the art can make variations and modifications without departing from the spirit and scope of the present application; therefore, the scope of protection of the present application should be determined by the appended claims.

Claims (17)

1. A method for encoding video data, comprising:
obtaining quantized residual data to be coded, wherein the quantized residual data are obtained by quantizing residual data between a predicted coding block and an original coding block;
judging whether the quantized residual data is residual data in a transform skip mode;
and if the quantized residual data is residual data in a transform skip mode, encoding the quantized residual data by adopting an arithmetic encoding method based on the statistical correlation of the residual data.
2. The method of claim 1, wherein the encoding the quantized residual data using an arithmetic coding method based on statistical correlation of residual data comprises:
and coding the syntax element symbols of the quantized residual data by adopting a symbol information probability model based on the correlation of the syntax element symbols of the residual data.
3. The method of claim 2, wherein encoding the syntax element symbols of the quantized residual data using a symbol information probability model based on the correlation of the syntax element symbols of the residual data comprises:
obtaining a symbol information probability model corresponding to a syntax element symbol to be coded based on coded syntax element symbol information according to the correlation of symbol information of adjacent syntax elements in a residual coding block corresponding to the quantized residual data;
and carrying out arithmetic coding on the symbols of the syntactic elements to be coded based on the symbol information probability model.
4. The method of claim 1, wherein the encoding the quantized residual data using an arithmetic coding method based on statistical correlation of residual data comprises:
and performing significance mapping coding on the quantized residual data based on the significance correlation of the residual data on the spatial neighborhood.
5. The method of claim 4, wherein the significance map coding the quantized residual data based on the significant correlation of the residual data on the spatial neighborhood comprises:
obtaining scanning position information of a syntax element to be coded in the quantized residual data and obtaining context model data of the syntax element to be coded based on the significant correlation of the residual data on a spatial neighborhood;
and performing significance mapping coding on the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded.
6. The method of claim 5, wherein obtaining the scanning position of the syntax element to be coded in the quantized residual data based on a significant correlation of the residual data on a spatial neighborhood comprises:
scanning the amplitude of the syntax element in the residual coding block according to the significant similarity characteristic between the syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data to obtain the scanning path information corresponding to the syntax element in the residual coding block;
and obtaining the position information of the syntactic element to be coded in the residual coding block based on the scanning path information.
7. The method of claim 6, wherein scanning the magnitudes of the syntax elements in the residual coding block according to the similarity between the syntax elements of the same spatial neighborhood in the residual coding block corresponding to the quantized residual data to obtain the scanning path corresponding to the syntax elements in the residual coding block comprises:
and determining the position information of the next syntax element which is positioned in the same spatial neighborhood with the adjacent syntax element based on any two adjacent syntax elements with known position information in the residual coding block, so as to obtain the scanning sequence of the syntax elements in the residual coding block, and scanning the amplitude of the syntax elements in the residual coding block based on the scanning sequence to obtain the scanning path corresponding to the syntax elements in the residual coding block.
8. The method according to claim 5, wherein said obtaining context model data of said syntax element to be encoded comprises:
and obtaining context model data of the syntax element to be coded according to the information of the coded syntax element which is positioned in the same spatial neighborhood with the syntax element to be coded in the residual coding block corresponding to the quantized residual data.
9. The method according to claim 8, wherein said obtaining context model data of said syntax element to be encoded from information of coded syntax elements in said residual coding block that are in the same spatial neighborhood as said syntax element to be encoded comprises:
obtaining probability model data of two adjacent coded syntax elements of the syntax element to be coded according to scanning path information corresponding to the syntax element in the residual coding block;
and obtaining the context variable of the syntax element to be coded according to the probability model data of the two adjacent coded syntax elements.
10. The method of claim 5, wherein the significance map coding of the syntax element to be coded according to the scanning position information and the context model data of the syntax element to be coded comprises:
and performing significance mapping coding on the amplitude of the syntactic element to be coded according to the scanning position information and the context model data of the syntactic element to be coded, and obtaining a coding result that the amplitude of the syntactic element to be coded is a significance amplitude value of '1' or a non-significance amplitude value of '0'.
11. The method according to claim 10, wherein if the magnitude of the syntax element to be encoded is significance magnitude "1", the method further comprises:
and binary coding the absolute value of the amplitude of the syntax element to be coded minus 1 by adopting a joint binarization scheme of TU and EG 2.
12. The method of claim 1, wherein the encoding the quantized residual data using an arithmetic coding method based on statistical correlation of residual data comprises:
and based on the probability distribution characteristics of the residual data, binary coding the syntax element amplitude of the quantized residual data by adopting a binarization scheme matched with the probability distribution characteristics of the quantized residual data.
13. The method of claim 12, wherein binary coding the syntax element magnitudes of the quantized residual data using a binarization scheme matching the probability distribution characteristics of the quantized residual data comprises:
and binary coding the amplitude absolute value of the syntax element to be coded by adopting a joint binarization scheme of TU and EG 2.
14. The method of claim 13, wherein said binary encoding the absolute value of the magnitude of the syntax element to be encoded using a joint binarization scheme of TU and EG2 comprises:
and binary coding is carried out on the prefix part of the absolute value of the amplitude of the syntactic element to be coded by adopting a TU binarization scheme, and binary coding is carried out on the suffix part of the absolute value of the amplitude of the syntactic element to be coded by adopting an EG2 binarization scheme.
15. An apparatus for encoding video data, comprising:
a quantized residual data obtaining unit, configured to obtain quantized residual data to be encoded, where the quantized residual data is obtained by quantizing residual data between a predicted encoding block and an original encoding block;
a transform skip judging unit for judging whether the quantized residual data is residual data in a transform skip mode;
and an arithmetic coding unit for coding the quantized residual data by an arithmetic coding method based on statistical correlation of residual data after determining that the quantized residual data is residual data in a transform skip mode.
16. An electronic device comprising a processor and a memory; wherein:
the memory is to store one or more computer instructions, wherein the one or more computer instructions are executed by the processor to implement the method of claims 1-14.
17. A computer-readable storage medium having stored thereon computer-readable instructions executable by one or more processors, wherein the computer-readable instructions, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 14.
CN202010057732.XA 2020-01-19 2020-01-19 Video data coding method and device Active CN113141505B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010057732.XA CN113141505B (en) 2020-01-19 2020-01-19 Video data coding method and device

Publications (2)

Publication Number Publication Date
CN113141505A true CN113141505A (en) 2021-07-20
CN113141505B CN113141505B (en) 2023-09-15

Family

ID=76808665

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010057732.XA Active CN113141505B (en) 2020-01-19 2020-01-19 Video data coding method and device

Country Status (1)

Country Link
CN (1) CN113141505B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130294524A1 (en) * 2012-05-04 2013-11-07 Qualcomm Incorporated Transform skipping and lossless coding unification
CN104380734A (en) * 2012-06-07 2015-02-25 联发科技(新加坡)私人有限公司 Method and apparatus for intra transform skip mode
US20150229969A1 (en) * 2014-02-13 2015-08-13 Young Beom Jung Method and apparatus for encoding and decoding image

Also Published As

Publication number Publication date
CN113141505B (en) 2023-09-15

Similar Documents

Publication Publication Date Title
US11363284B2 (en) Upsampling in affine linear weighted intra prediction
US10349085B2 (en) Efficient parameter storage for compact multi-pass transforms
TWI657692B (en) Rice parameter initialization for coefficient level coding in video coding process
TWI666920B (en) Palette predictor signaling with run length code for video coding
US8526750B2 (en) Method and apparatus for encoding/decoding image by using adaptive binarization
TWI520619B (en) Bypass bins for reference index coding in video coding
RU2679784C2 (en) Data encoding and decoding
JP7195251B2 (en) Method and device for context-adaptive binary arithmetic coding of sequences of binary symbols representing syntax elements associated with picture data
CN112514386B (en) Grid coding and decoding quantization coefficient coding and decoding
TW201701664A (en) Advanced arithmetic coder
US11202100B2 (en) Coefficient coding for transform skip mode
US11700372B2 (en) Residual coding method and device for same
CN112673639B (en) System and method for encoding transform coefficient level values
US11477486B2 (en) Escape coding for coefficient levels
CN112534815A (en) Conventional coding bin reduction using thresholds for coefficient coding
US9490839B2 (en) Variable length coding of video block coefficients
KR20220038121A (en) Method and apparatus for deriving rice parameters in video/video coding system
CN113940067A (en) Cropping index coding for adaptive loop filter in video coding
CN113141505B (en) Video data coding method and device
JP2023531163A (en) Data encoding and decoding
EP4354861A1 (en) Video decoding and coding method, device and storage medium
WO2022188186A1 (en) Coefficient encoding method, coefficient decoding method, encoding device, decoding device, and storage medium
CN116982314A (en) Coefficient encoding and decoding method, encoding and decoding device, terminal and storage medium
JP2022178335A (en) Image decoding device and image encoding device
JP2022188825A (en) Image decoding device and image encoding device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40055846

Country of ref document: HK

GR01 Patent grant