CN113573055A - Deblocking filtering method, apparatus, electronic device, and medium for picture sequence - Google Patents


Info

Publication number
CN113573055A
Authority
CN
China
Prior art keywords
boundary
picture
threshold
degree
current picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110844516.4A
Other languages
Chinese (zh)
Other versions
CN113573055B (en
Inventor
邹箭
丁文鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110844516.4A priority Critical patent/CN113573055B/en
Publication of CN113573055A publication Critical patent/CN113573055A/en
Application granted granted Critical
Publication of CN113573055B publication Critical patent/CN113573055B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY · H04 ELECTRIC COMMUNICATION TECHNIQUE · H04N PICTORIAL COMMUNICATION, e.g. TELEVISION · H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/117 Adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding: filters, e.g. for pre-processing or post-processing
    • H04N19/176 Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N19/42 Implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N19/80 Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation


Abstract

The present disclosure provides a deblocking filtering method, apparatus, electronic device, and medium for a picture sequence, relating to the technical field of artificial intelligence, and in particular to media cloud technology in cloud computing. The specific implementation scheme is as follows: determining a category of a current picture in the sequence based on a predetermined threshold; adjusting, based on the determined category, a boundary threshold for deblocking filtering a block boundary between two adjacent blocks within the current picture; and selecting, based on the boundary threshold, a deblocking filter to be applied to the block boundary.

Description

Deblocking filtering method, apparatus, electronic device, and medium for picture sequence
Technical Field
The present disclosure relates to the field of artificial intelligence technology, and in particular to media cloud technology in cloud computing, and provides a deblocking filtering method and apparatus for a picture sequence, an electronic device, and a medium.
Background
High Efficiency Video Coding (HEVC) is a new-generation video coding compression standard that, compared with the previous-generation H.264/AVC standard, can save nearly 50% of the bit rate at the same quality. It is widely applicable to video-compression-related fields such as live streaming and video on demand. The HEVC coding framework mainly includes prediction, transform, quantization, loop filtering, and entropy coding. Loop filtering is an important module of the encoder and consists of deblocking filtering and sample adaptive offset (SAO). Deblocking filtering is introduced because coding is performed in units of blocks: after transform, quantization, motion compensation, and similar processes, discontinuous data arise at the boundary of each block, so pseudo boundaries (blocking artifacts) appear subjectively and degrade perceived quality. Block boundaries are deblock filtered to remove these artifacts while preserving true boundaries. However, the boundary threshold offsets used by current deblocking filtering require traversing all of their combinations, regardless of the application scene and picture complexity, resulting in a huge amount of computation.
Disclosure of Invention
The present disclosure provides a deblocking filtering method, apparatus, electronic device, and medium for a picture sequence.
According to an aspect of the present disclosure, there is provided a deblocking filtering method for a picture sequence, including:
determining a category of a current picture in the sequence based on a predetermined threshold;
adjusting a boundary threshold for deblocking filtering a block boundary between two adjacent blocks within the current picture based on the determined category; and
based on the boundary threshold, a deblocking filter to be applied to the block boundary is selected.
According to another aspect of the present disclosure, there is provided a deblocking filtering apparatus for a picture sequence, including:
a picture category determination module that determines a category of a current picture in the sequence based on a predetermined threshold;
an adjusting module that adjusts a boundary threshold for deblocking filtering a block boundary between two adjacent blocks within a current picture based on the determined category; and
a selection module that selects a deblocking filter to be applied to the block boundary based on the boundary threshold.
According to another aspect of the present disclosure, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method according to an aspect of the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform a method according to an aspect of the present disclosure.
According to another aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, implements a method according to an aspect of the present disclosure.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 is a flowchart of a deblocking filtering method for a picture sequence according to an embodiment of the present disclosure;
fig. 2 illustrates a schematic diagram of a block boundary between two adjacent blocks to which deblocking filtering is applied according to an embodiment of the present disclosure;
FIG. 3 is a flow chart of deriving boundary filtering strength according to an embodiment of the present disclosure;
FIGS. 4A and 4B illustrate a schematic diagram of a discontinuity at a block boundary according to an embodiment of the present disclosure;
fig. 5 is a flowchart of a deblocking filtering method for a picture sequence according to another embodiment of the present disclosure;
fig. 6A, 6B, and 6C are schematic diagrams illustrating application of different deblocking filters according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of applying deblocking filtering to a current picture according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of a deblocking filtering apparatus for a picture sequence according to an embodiment of the present disclosure; and
FIG. 9 illustrates a schematic block diagram of an example electronic device that can be used to implement embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a flow diagram of a deblocking filtering method 100 for a picture sequence according to an embodiment of the present disclosure.
In step S110, a category of a current picture in the sequence is determined based on a predetermined threshold.
In some embodiments, the categories of pictures may include: high-complexity pictures (e.g., containing high-speed moving objects), medium-complexity pictures (e.g., containing low-speed moving objects), and low-complexity pictures with a substantially unchanged background area.
In step S120, a boundary threshold for deblocking filtering a block boundary between two adjacent blocks within the current picture is adjusted based on the determined category.
In step S130, a deblocking filter to be applied to the block boundary is selected based on a boundary threshold.
In some embodiments, a degree of variation and a degree of flatness between samples located on the two sides of the block boundary in the two adjacent blocks are further calculated, and the deblocking filter to be applied to the block boundary is selected based on the degree of variation, the degree of flatness, and the boundary threshold. Depending on the degree of filtering needed, either a strong filter or a weak filter may be applied. The detailed selection process is described below.
The design of HEVC continues the block-based hybrid video coding framework. First, each picture in the input picture sequence is divided into coding tree units (CTUs). For a picture with three sample arrays, each CTU consists of one luma coding tree block, two corresponding chroma coding tree blocks, and the syntax structures used to code the samples; for a monochrome picture, each CTU consists of a single coding tree block and the syntax structures used to code its samples. Each coding tree block is then recursively divided by quadtree into coding blocks, which are the basic units of intra/inter coding. With the coding block as the root of the prediction tree and the transform tree, one or more prediction blocks are obtained as indicated by the partition mode flag, and one or more transform blocks are obtained by quadtree division. A prediction block is the unit over which the same prediction is performed, and a transform block is the unit over which the same transform is performed.
Fig. 2 illustrates a schematic diagram of a block boundary between two adjacent blocks to which deblocking filtering is applied according to an embodiment of the present disclosure. Referring to fig. 2, HEVC deblocking filtering processes the boundaries of all prediction blocks and transform blocks on an 8 × 8 sample grid, where the block boundaries are shown in bold solid lines.
Deblocking filtering decisions in HEVC can be divided into 3 stages:
(1) deriving a boundary filtering strength;
(2) filter switch decision;
(3) a deblocking filter is selected.
First, the boundary filtering strength is derived. Whether filtering is needed, and with which parameters, is preliminarily judged from the coding parameters of the blocks at the boundary. The use of different coding parameters by neighboring blocks (e.g., different prediction modes, different reference pictures, or different motion vectors) may cause discontinuities in sample values at block boundaries.
Fig. 3 shows a flow chart for deriving the boundary filtering strength bS, where P and Q are 4 × 4 blocks on either side of the block boundary.
As shown in fig. 3, in S310, if P or Q is located in an encoding block with intra prediction as a prediction mode, i.e., P or Q employs intra prediction, S360 is entered, and bS is set to 2. Otherwise, when at least one of the following conditions is satisfied, S370 is entered, and bS is set to 1:
(1) at S320, the block boundary is a transform block boundary, and the transform block at which P or Q is located contains one or more non-zero transform coefficient levels;
(2) at S330, P and Q refer to different reference pictures;
(3) in S340, different numbers of motion vectors MV are used in the prediction of P and Q; or
(4) In S350, the absolute value of the difference (of the horizontal or vertical component) of the motion vectors of P and Q is 4 or more.
When none of S310 to S350 is satisfied, the process proceeds to S380, and bS is set to 0.
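The decision flow of fig. 3 can be summarized in a short sketch (the boolean arguments are hypothetical stand-ins for the coding-parameter checks in S310 to S350, not HEVC syntax elements):

```python
def boundary_strength(p_intra, q_intra,
                      nonzero_coeffs_at_tu_boundary,
                      different_ref_pictures,
                      different_mv_count,
                      mv_diff_ge_4):
    """Sketch of the bS decision of fig. 3 (S310-S380).

    Each argument is a boolean summarizing the coding parameters of the
    4x4 blocks P and Q on either side of the boundary.
    """
    # S310/S360: intra prediction on either side -> strongest filtering
    if p_intra or q_intra:
        return 2
    # S320-S350/S370: any inter-prediction discontinuity -> bS = 1
    if (nonzero_coeffs_at_tu_boundary or different_ref_pictures
            or different_mv_count or mv_diff_ge_4):
        return 1
    # S380: no discontinuity expected -> no filtering
    return 0
```

A bS of 0 skips filtering for the boundary; bS of 1 or 2 feeds into the threshold derivation of equation (18) below.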
Next, a filter switching decision is made. Due to the spatial masking effect of the human eye, discontinuous boundaries of flat areas of the picture are more easily observed.
Fig. 4A and 4B illustrate schematic diagrams of discontinuities at block boundaries according to embodiments of the present disclosure. As shown in fig. 4A, when the sample values on both sides of the boundary are relatively smooth but a large difference occurs at the boundary, the human visual system clearly perceives the discontinuity. When the sample values on both sides of the boundary themselves vary strongly, as shown in fig. 4B, such a discontinuity is hardly noticeable. Therefore, the boundary threshold needs to be adjusted so that the degree of change between samples on the two sides of the block boundary is compared against an appropriate threshold to decide whether to perform deblocking filtering.
Specifically, taking a vertical block boundary as an example (see the block boundary between two adjacent blocks to which deblocking filtering is applied in fig. 2), let p(x, y) and q(x, y) be the sample values on the two sides of the block boundary. The degree of change between samples on both sides of the boundary is calculated by the following equations (1) to (4), where dp0 denotes the degree of change between the samples of the first row in the P block, dq0 the degree of change between the samples of the first row in the Q block, dp3 the degree of change between the samples of the fourth row in the P block, and dq3 the degree of change between the samples of the fourth row in the Q block.
dp0=|p(2,0)-2p(1,0)+p(0,0)| (1)
dq0=|q(2,0)-2q(1,0)+q(0,0)| (2)
dp3=|p(2,3)-2p(1,3)+p(0,3)| (3)
dq3=|q(2,3)-2q(1,3)+q(0,3)| (4)
Further, the texture degree of the vertical block boundary is calculated by the following equation (5):
CB=dp0+dq0+dp3+dq3 (5)
The larger the texture degree of the block-boundary area, the less smooth the area; when the texture degree is sufficiently large, the deblocking filtering switch can be turned off. Therefore, HEVC sets a boundary threshold β: when the following equation (6) is satisfied, the filter switch for the block boundary is turned on; otherwise it is turned off:
CB<β (6)
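Equations (1) to (6) amount to the following switch decision, sketched here as a hypothetical helper with p and q indexed as p(x, y) in the text (x is the distance from the boundary, y the row):

```python
def filter_switch_on(p, q, beta):
    """Sketch of equations (1)-(6): decide whether the deblocking
    switch for a vertical block boundary is open.

    p[x][y] and q[x][y] are luma samples on either side of the boundary.
    """
    dp0 = abs(p[2][0] - 2 * p[1][0] + p[0][0])   # eq. (1)
    dq0 = abs(q[2][0] - 2 * q[1][0] + q[0][0])   # eq. (2)
    dp3 = abs(p[2][3] - 2 * p[1][3] + p[0][3])   # eq. (3)
    dq3 = abs(q[2][3] - 2 * q[1][3] + q[0][3])   # eq. (4)
    cb = dp0 + dq0 + dp3 + dq3                   # eq. (5): texture degree
    return cb < beta                             # eq. (6)
```

A flat region yields cb = 0 and the switch opens for any positive β; a highly textured region yields a large cb and the switch stays closed.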
By determining the category of a picture before deblocking filtering it, different boundary thresholds can be set for different categories of pictures, improving the accuracy of deciding whether deblocking filtering is applied to a block boundary and which deblocking filter is used. In addition, setting different boundary thresholds per predetermined picture category greatly reduces the amount of computation, since each picture no longer has to traverse the combinations of boundary thresholds.
Fig. 5 is a flow diagram of a method 500 for deblock filtering of a picture sequence according to another embodiment of the present disclosure.
In fig. 5, steps S530 and S540 correspond to steps S120 and S130, respectively, in the method 100. Further, step S110 in the method 100 is subdivided into steps S510 and S520 before step S530.
In step S510, the texture degrees of a plurality of sample pictures are calculated, and a predetermined threshold value is set based on the calculated texture degrees.
In some embodiments, the texture degree μ2(z) of each sample picture may be calculated as the variance of the gray-level histogram of the picture:

μ2(z) = Σ_{i=0}^{L−1} (z_i − m)^2 p(z_i) (7)

where z is the gray level of the picture, p(z_i) is the corresponding histogram, L is the number of distinct gray levels, and m is the mean of z:

m = Σ_{i=0}^{L−1} z_i p(z_i) (8)
the texture degree calculation is performed on a plurality of sample pictures, the sample pictures can be a test picture sequence, and a predetermined threshold value is set through training and pre-analysis based on the calculated texture degrees.
In some embodiments, to reduce computational complexity, the sample picture may first be downsampled (e.g., to 1/2 or 1/4 resolution) and the texture degree computed on the downsampled picture.
In step S520, a texture degree of the current picture is calculated, and the calculated texture degree is compared with a predetermined threshold value to determine a category of the current picture.
In some embodiments, the texture of the current picture may be calculated directly or a downsampled texture of the current picture may be calculated.
By training and pre-analyzing the texture of the sample picture sequence, the predetermined threshold for determining the picture class can be set more accurately. In addition, the sample picture is downsampled before the texture degree is calculated, so that the calculation amount in texture degree calculation can be saved.
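Steps S510 and S520 can be sketched as follows. The two thresholds and the rule that a larger histogram variance means a higher-complexity picture are assumptions for illustration; in the embodiment, the actual thresholds come from training and pre-analysis:

```python
def texture_degree(gray_hist):
    """Variance of the gray-level histogram, equations (7)-(8).

    gray_hist[i] is the probability p(z_i) of gray level z_i = i.
    """
    L = len(gray_hist)
    m = sum(i * gray_hist[i] for i in range(L))                 # eq. (8)
    return sum((i - m) ** 2 * gray_hist[i] for i in range(L))   # eq. (7)

def classify_picture(gray_hist, low_thr, high_thr):
    """Map the texture degree to a picture category via two assumed
    predetermined thresholds (hypothetical classification rule)."""
    mu2 = texture_degree(gray_hist)
    if mu2 >= high_thr:
        return "high"
    if mu2 >= low_thr:
        return "medium"
    return "low"
```

A single-gray-level picture has zero variance and lands in the low-complexity class regardless of the thresholds.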
Fig. 6A, 6B, and 6C are schematic diagrams illustrating application of different deblocking filters according to an embodiment of the present disclosure.
Figs. 6A to 6C show three boundary cases. Compared with fig. 6B, the sample values on both sides of the boundary in fig. 6A are flat, so the discontinuity forms a visually stronger blocking effect; the samples on both sides of the boundary in fig. 6A therefore need strong filtering to obtain a good visual effect. For fig. 6C, where the difference between sample values at the boundary is very large, such a block boundary stems from the picture content itself rather than being a pseudo boundary, since sample distortion always stays within a certain range. Therefore, only weak filtering needs to be applied to the block boundary in fig. 6C.
In some embodiments, in addition to the boundary threshold β, another boundary threshold tC is set for comparison with the degree of variation and the degree of flatness between samples on the two sides of the block boundary. Hereinafter, β and tC are referred to as the first and second boundary thresholds, respectively.
With reference to the block boundary between two adjacent blocks shown in fig. 2 and equations (1) to (4), the degree of variation and the degree of flatness between samples on the two sides of the block boundary are calculated, and the strong filter is selected when the following equations (9) to (14) are all satisfied (e.g., the block boundary in fig. 6A):
2(dp0+dq0)<(β>>2) (9)
2(dp3+dq3)<(β>>2) (10)
|p(3,0)-p(0,0)|+|q(0,0)-q(3,0)|<(β>>3) (11)
|p(3,3)-p(0,3)|+|q(0,3)-q(3,3)|<(β>>3) (12)
|p(0,0)-q(0,0)|<(5tC+1)>>1 (13)
|p(0,3)-q(0,3)|<(5tC+1)>>1 (14)
where the operator "a >> b" denotes an arithmetic right shift, i.e., a is shifted right by b bits. Equations (9) and (10) measure the degree of variation of the sample values on the two sides of the block boundary. Equations (11) and (12) determine whether the samples on the two sides of the block boundary are flat. Equations (13) and (14) determine whether the span of the samples at the block boundary is confined within a certain range.
Conversely, when at least one of equations (9) to (14) is not satisfied, the weak filter is selected. For example, at the block boundary shown in fig. 6C, the difference between samples p(0, 0) and q(0, 0) on the two sides of the boundary is large, so equation (13) is not satisfied.
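A compact sketch of the strong/weak decision of equations (9) to (14), as a hypothetical helper over integer luma samples indexed p[x][y] as in the text:

```python
def use_strong_filter(p, q, beta, tc):
    """Sketch of equations (9)-(14): choose the strong filter only when
    all six conditions hold; otherwise the weak filter is used."""
    dp0 = abs(p[2][0] - 2 * p[1][0] + p[0][0])
    dq0 = abs(q[2][0] - 2 * q[1][0] + q[0][0])
    dp3 = abs(p[2][3] - 2 * p[1][3] + p[0][3])
    dq3 = abs(q[2][3] - 2 * q[1][3] + q[0][3])
    return (2 * (dp0 + dq0) < (beta >> 2)                                 # (9)
        and 2 * (dp3 + dq3) < (beta >> 2)                                 # (10)
        and abs(p[3][0] - p[0][0]) + abs(q[0][0] - q[3][0]) < (beta >> 3) # (11)
        and abs(p[3][3] - p[0][3]) + abs(q[0][3] - q[3][3]) < (beta >> 3) # (12)
        and abs(p[0][0] - q[0][0]) < ((5 * tc + 1) >> 1)                  # (13)
        and abs(p[0][3] - q[0][3]) < ((5 * tc + 1) >> 1))                 # (14)
```

With flat samples on both sides all six conditions hold; a large step across the boundary fails condition (13) and falls back to the weak filter, matching the fig. 6C case.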
In HEVC, the first boundary threshold β and the second boundary threshold tC are derived with reference to table 1 below, based on the quantization parameters QP of the coding blocks to which the blocks on the two sides of the block boundary belong, together with a first offset value for the first boundary threshold and a second offset value for the second boundary threshold. The first and second offset values are represented in the bitstream by the syntax elements slice_beta_offset_div2 and slice_tc_offset_div2.
TABLE 1
Q 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18
β′ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 6 7 8
tC 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1
Q 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37
β′ 9 10 11 12 13 14 15 16 17 18 20 22 24 26 28 30 32 34 36
tC 1 1 1 1 1 1 1 1 2 2 2 2 3 3 3 3 4 4 4
Q 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53
β′ 38 40 42 44 46 48 50 52 54 56 58 60 62 64 - -
tC 5 5 6 6 7 8 9 10 11 13 14 16 18 20 22 24
Specifically, let QpP and QpQ be the quantization parameters QP of the coding blocks to which blocks P and Q belong, respectively. The variable qPL is obtained by the following equation (15):
qPL=(QpP+QpQ+1)>>1 (15)
To derive the first boundary threshold β, Q is calculated by the following equation (16):
Q=Clip3(0,51,qPL+(slice_beta_offset_div2<<1)) (16)
Clip3(x, y, z) is a clamp function, limiting z to the range of x to y (inclusive). "a << b" denotes an arithmetic left shift, i.e., a is shifted left by b bits.
With reference to table 1, β′ is derived; then, based on the bit depth of the luma component, the first boundary threshold β is derived by the following equation (17):
β=β′*(1<<(BitDepthY-8)) (17)
where BitDepthY specifies the bit depth of the samples of the luma array. For the Main Profile commonly used in HEVC, BitDepthY = 8, in which case β = β′.
Similarly, to derive the second boundary threshold tC, Q is calculated by the following equation (18):
Q=Clip3(0,53,qPL+2*(bS-1)+(slice_tc_offset_div2<<1)) (18)
With reference to table 1, t′C is derived; then, based on the bit depth of the luma component, the second boundary threshold tC is derived by the following equation (19):
tC=t′C*(1<<(BitDepthY-8)) (19)
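Equations (15) to (19), together with Table 1, can be sketched as follows. The lookup arrays are transcribed from Table 1, and Clip3 follows the definition above:

```python
# beta' for Q = 0..51 and t'C for Q = 0..53, transcribed from Table 1
BETA_PRIME = ([0] * 16 + list(range(6, 19))    # Q 16..28 -> 6..18
              + list(range(20, 38, 2))         # Q 29..37 -> 20, 22, .., 36
              + list(range(38, 66, 2)))        # Q 38..51 -> 38, 40, .., 64
TC_PRIME = ([0] * 18 + [1] * 9 + [2] * 4 + [3] * 4 + [4] * 3
            + [5, 5, 6, 6, 7, 8, 9, 10, 11, 13, 14, 16, 18, 20, 22, 24])

def clip3(x, y, z):
    """Clamp z to the range [x, y]."""
    return max(x, min(y, z))

def derive_thresholds(qp_p, qp_q, bs, beta_offset_div2, tc_offset_div2,
                      bit_depth_y=8):
    """Sketch of equations (15)-(19): derive beta and tC from the QPs of
    the two coding blocks, the boundary strength bS, and the slice-level
    offsets."""
    qp_l = (qp_p + qp_q + 1) >> 1                                     # (15)
    q_beta = clip3(0, 51, qp_l + (beta_offset_div2 << 1))             # (16)
    beta = BETA_PRIME[q_beta] * (1 << (bit_depth_y - 8))              # (17)
    q_tc = clip3(0, 53, qp_l + 2 * (bs - 1) + (tc_offset_div2 << 1))  # (18)
    tc = TC_PRIME[q_tc] * (1 << (bit_depth_y - 8))                    # (19)
    return beta, tc
```

For example, with QpP = QpQ = 32, bS = 1, and zero offsets, qPL is 32 and Table 1 gives β′ = 26 and t′C = 3; negative offsets shrink both thresholds, so fewer boundaries are filtered.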
In some embodiments, the encoder side adjusts the values of slice _ beta _ offset _ div2 and slice _ tc _ offset _ div2 after determining the class of the current picture in the sequence based on a predetermined threshold and encodes slice _ beta _ offset _ div2 and slice _ tc _ offset _ div2 in the bitstream.
Specifically, slice _ beta _ offset _ div2 and slice _ tc _ offset _ div2 are written into the slice segment header syntax:
(Slice segment header syntax table containing slice_beta_offset_div2 and slice_tc_offset_div2.)
slice_beta_offset_div2 and slice_tc_offset_div2 specify the deblocking parameter offsets (divided by 2) for β and tC for the current slice. Their values shall be in the range of −6 to 6, inclusive.
By predetermining the picture category and setting a different combination of boundary threshold offset values for each category, the encoder avoids traversing the 13 × 13 = 169 possible combinations of boundary threshold offset values for every picture to be deblock filtered.
In some embodiments, the boundary threshold offset value combination may be set, for example, as shown in table 2 below:
TABLE 2
Category            slice_beta_offset_div2    slice_tc_offset_div2
High complexity     -2                        -2
Medium complexity    0                         0
Low complexity       2                         2
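The per-category offset selection of Table 2 can be sketched as a simple lookup (the category labels are hypothetical identifiers for the three classes described above):

```python
# Offset pairs from the Table 2 embodiment, keyed by picture category.
OFFSETS = {
    "high":   {"slice_beta_offset_div2": -2, "slice_tc_offset_div2": -2},
    "medium": {"slice_beta_offset_div2":  0, "slice_tc_offset_div2":  0},
    "low":    {"slice_beta_offset_div2":  2, "slice_tc_offset_div2":  2},
}

def offsets_for_category(category):
    """Look up the offset pair for a picture category instead of
    traversing all 13 x 13 = 169 offset combinations per picture."""
    return OFFSETS[category]
```

Negative offsets for high-complexity pictures lower β and tC, filtering fewer boundaries; positive offsets for low-complexity pictures raise them.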
Fig. 7 is a schematic diagram of applying deblocking filtering to a current picture according to an embodiment of the present disclosure.
An encoder compliant with the HEVC standard embeds an HEVC decoder to reconstruct the pictures in the picture sequence and stores the reconstructed pictures (possibly also loop filtered, depending on whether loop filtering is enabled) in the decoded picture buffer as reference pictures for inter prediction. In fig. 7, prediction 710 is performed on a current picture 700 of the input picture sequence, yielding a predicted picture 700′ of the current picture 700. The prediction may be intra prediction or inter prediction. For inter prediction, as described above, the prediction unit is a prediction block: for a current prediction block in the current picture, a reference picture is fetched from the decoded picture buffer according to the reference picture index, and its prediction block is obtained based on the motion vector of the current prediction block.
Subsequently, the residual Δ 700 between the current picture 700 and its predicted picture 700' is transformed/quantized 720 and inverse quantized/inverse transformed 730 in sequence, resulting in a residual picture 700 "of the current picture 700.
The predicted picture 700 'and the residual picture 700 "are added by an adder 740 to obtain a reconstructed picture 700'" of the current picture 700.
With deblocking filtering enabled, the reconstructed picture 700′′′ is processed with the selected deblocking filter to yield a filtered reconstructed picture 700′′′′. Depending on the value of the corresponding enable flag, the filtered reconstructed picture 700′′′′ may further undergo sample adaptive offset processing.
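A minimal sketch of the reconstruction loop of fig. 7, treating pictures as flat sample lists and the transform/quantization, inverse stages, and deblocking filter as opaque callables (all hypothetical stand-ins for the stages 710 to 740):

```python
def reconstruct_and_filter(original, predicted, quantize, dequantize,
                           deblock):
    """Sketch of the loop in fig. 7: the residual is quantized and then
    dequantized so that the encoder filters the same reconstruction the
    decoder will later produce."""
    residual = [o - p for o, p in zip(original, predicted)]        # delta
    recon_residual = [dequantize(quantize(r)) for r in residual]   # 700''
    reconstructed = [p + r for p, r in zip(predicted, recon_residual)]  # 700'''
    return deblock(reconstructed)                                  # 700''''
```

With lossless quantization and an identity filter the output equals the input picture; a lossy quantizer introduces exactly the distortion the deblocking filter is meant to smooth at block boundaries.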
Fig. 8 is a schematic diagram of a deblocking filtering apparatus 800 for a picture sequence according to an embodiment of the present disclosure.
As shown in fig. 8, deblocking filtering apparatus 800 includes a picture category determination module 810, an adjustment module 820, and a selection module 830.
The picture category determination module 810 determines a category of a current picture in the sequence based on a predetermined threshold. In some embodiments, the categories of pictures may include: high-complexity pictures (e.g., containing high-speed moving objects), medium-complexity pictures (e.g., containing low-speed moving objects), and low-complexity pictures with a substantially unchanged background area.
The adjusting module 820 adjusts a boundary threshold for deblocking filtering a block boundary between two adjacent blocks within the current picture based on the determined category.
The selection module 830 selects a deblocking filter to be applied to the block boundary based on a boundary threshold.
In some embodiments, a degree of variation and a degree of flatness between samples located on the two sides of the block boundary in the two adjacent blocks are further calculated, and the deblocking filter to be applied to the block boundary is selected based on the degree of variation, the degree of flatness, and the boundary threshold. Depending on the degree of filtering needed, either a strong filter or a weak filter may be applied. The detailed selection process is described below.
The design of HEVC continues the block-based hybrid video coding framework. First, each picture in the input picture sequence is divided into coding tree units (CTUs). For a picture with three sample arrays, each CTU consists of one luma coding tree block, two corresponding chroma coding tree blocks, and the syntax structures used to code the samples; for a monochrome picture, each CTU consists of a single coding tree block and the syntax structures used to code its samples. Each coding tree block is then recursively divided by quadtree into coding blocks, which are the basic units of intra/inter coding. With the coding block as the root of the prediction tree and the transform tree, one or more prediction blocks are obtained as indicated by the partition mode flag, and one or more transform blocks are obtained by quadtree division. A prediction block is the unit over which the same prediction is performed, and a transform block is the unit over which the same transform is performed.
HEVC deblocking filtering processes the boundaries of all prediction blocks and transform blocks on an 8 × 8 sample grid.
Deblocking filtering decisions in HEVC can be divided into three stages:
(1) deriving the boundary filtering strength;
(2) the filter switch decision;
(3) deblocking filter selection.
First, the boundary filtering strength is derived: whether filtering is needed and the filtering parameters are preliminarily determined from the coding parameters of the blocks at the boundary. Discontinuities in sample values at block boundaries may be caused by neighboring blocks using different coding parameters (e.g., different prediction modes, different reference pictures, or different motion vectors).
Next, the filter switch decision is made. Due to the spatial masking effect of the human eye, discontinuous boundaries in flat areas of a picture are more easily observed. When the sample values on both sides of a boundary are relatively smooth but a large difference occurs at the boundary, the human visual system clearly perceives the discontinuity. When the sample values on both sides of the boundary vary strongly, such discontinuities are difficult to detect. Therefore, the boundary threshold must be adjusted so that the degree of change between samples on both sides of the block boundary can be compared with an appropriate boundary threshold to decide whether to perform deblocking filtering.
Specifically, taking a vertical block boundary as an example, a block boundary between two adjacent blocks to which deblocking filtering is applied according to an embodiment of the present disclosure is illustrated, where p(x, y) and q(x, y) are the sample values on the two sides of the block boundary. The degree of change between samples on both sides of the block boundary is calculated by the following equations (1) to (4), where dp0 denotes the degree of change between the samples of the first row in the P block, dq0 the degree of change between the samples of the first row in the Q block, dp3 the degree of change between the samples of the fourth row in the P block, and dq3 the degree of change between the samples of the fourth row in the Q block.
dp0=|p(2,0)-2p(1,0)+p(0,0)| (1)
dq0=|q(2,0)-2q(1,0)+q(0,0)| (2)
dp3=|p(2,3)-2p(1,3)+p(0,3)| (3)
dq3=|q(2,3)-2q(1,3)+q(0,3)| (4)
Further, the texture degree of the vertical block boundary is calculated by the following equation (5):
CB=dp0+dq0+dp3+dq3 (5)
The larger the texture value of the block boundary area, the less flat the area; when the texture value is sufficiently large, the deblocking filtering switch may be turned off. Therefore, in HEVC a boundary threshold β is set: when the following equation (6) is satisfied, the filter switch for the block boundary is turned on, and otherwise it is turned off:
CB<β (6)
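The filter on/off decision of equations (1) to (6) can be sketched as follows. The array names and the 4x4 indexing convention (p[x][y] with x the distance from the boundary and y the row, mirroring the p(x, y) notation above) are assumptions for illustration:

```python
# Sketch of the HEVC filter switch decision for one vertical block boundary
# (equations (1)-(6)). p and q are 4x4 arrays of reconstructed samples on the
# two sides of the boundary, indexed p[x][y]: x is the distance from the
# boundary, y the row. Names are illustrative, not from any codec library.

def boundary_activity(p, q):
    """Return CB, the local texture measure across the boundary (eq. (5))."""
    dp0 = abs(p[2][0] - 2 * p[1][0] + p[0][0])  # eq. (1)
    dq0 = abs(q[2][0] - 2 * q[1][0] + q[0][0])  # eq. (2)
    dp3 = abs(p[2][3] - 2 * p[1][3] + p[0][3])  # eq. (3)
    dq3 = abs(q[2][3] - 2 * q[1][3] + q[0][3])  # eq. (4)
    return dp0 + dq0 + dp3 + dq3

def filter_switch_on(p, q, beta):
    """Deblocking is applied only when CB < beta (eq. (6))."""
    return boundary_activity(p, q) < beta
```

On a flat region the second differences are all zero, so CB = 0 and the switch is on; on a strongly textured region CB exceeds β and the switch stays off.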
By predetermining the category of a picture before deblocking filtering is performed on it, and setting different boundary thresholds for different categories of pictures, the accuracy of deciding whether deblocking filtering is applied to a block boundary and which deblocking filter is applied can be improved. In addition, setting different boundary thresholds for different picture classes based on the predetermined classes greatly reduces the computation that would otherwise be needed for each picture to traverse the combinations of possible boundary thresholds.
Further, deblocking filtering apparatus 800 may further include a first calculation module, a setting module, a second calculation module, and a comparison module.
The first calculation module calculates texture degrees of the plurality of sample pictures, and the setting module sets a predetermined threshold value based on the calculated texture degrees.
In some embodiments, the texture degree of each sample picture may be calculated as the variance μ2(z) of the gray-level histogram of the picture:
μ2(z) = Σ_{i=0}^{L-1} (z_i − m)^2 · p(z_i) (7)
where z is the gray level of the picture, p(z_i) is the corresponding normalized histogram, L is the number of distinct gray levels, and m is the mean of z:
m = Σ_{i=0}^{L-1} z_i · p(z_i) (8)
The texture degree calculation is performed on a plurality of sample pictures, which may be a test picture sequence, and the predetermined threshold is set through training and pre-analysis based on the calculated texture degrees.
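Equations (7) and (8) can be sketched as follows. The function name and the use of a plain list of samples are illustrative assumptions:

```python
# Minimal sketch of equations (7)-(8): picture texture measured as the
# variance of the normalized gray-level histogram. `gray` is a flat list of
# integer sample values; `levels` is L, the number of gray levels (e.g. 256
# for an 8-bit picture).

from collections import Counter

def histogram_texture(gray, levels=256):
    counts = Counter(gray)
    n = len(gray)
    # p(z_i): normalized histogram over all gray levels.
    p = [counts.get(z, 0) / n for z in range(levels)]
    m = sum(z * p[z] for z in range(levels))                 # eq. (8): mean
    return sum((z - m) ** 2 * p[z] for z in range(levels))   # eq. (7): variance
```

A uniform picture has zero texture; a two-level picture split evenly between gray levels 0 and 10 has mean 5 and variance 25.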
In some embodiments, to reduce computational complexity, a sample picture may first be downsampled (e.g., to 1/2 or 1/4 of its size) and the texture degree computed on the downsampled picture.
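The downsampling step can be as simple as keeping every other row and column; this is a sketch of one assumed approach, not the only option:

```python
# Hypothetical 1/2 downsampling of a 2-D picture (list of rows): keep every
# other row and every other column before measuring texture.

def downsample2(picture):
    return [row[::2] for row in picture[::2]]
```

Applying it twice gives the 1/4-size picture mentioned above.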
The second calculation module calculates a texture degree of the current picture, and the comparison module compares the calculated texture degree with a predetermined threshold value to determine the category of the current picture by the picture category determination module 810.
In some embodiments, the texture of the current picture may be calculated directly or a downsampled texture of the current picture may be calculated.
By training on and pre-analyzing the texture degrees of a sample picture sequence, the predetermined threshold for determining the picture class can be set more accurately. In addition, downsampling the sample pictures before calculating the texture degree reduces the amount of computation required.
As illustrated for the application of different deblocking filters according to an embodiment of the present disclosure, when the sample values on both sides of a block boundary are flat, the discontinuity at the boundary produces a visually strong blocking artifact, so the samples on both sides of the boundary need strong filtering to obtain a good visual effect. When the difference between the sample values at a block boundary is very large, the boundary is caused by the picture content itself rather than being a false boundary, since coding distortion always remains within a certain range; in that case only weak filtering needs to be applied to the block boundary.
In some embodiments, in addition to the boundary threshold β, another boundary threshold tC is set for comparison with the degree of variation and the degree of flatness between samples on the two sides of the block boundary. Hereinafter, the boundary threshold β and the boundary threshold tC are referred to as the first and second boundary thresholds, respectively.
The degree of variation and the degree of flatness between samples on the two sides of a block boundary are calculated with reference to the block boundary schematic between two adjacent blocks to which deblocking filtering is applied according to an embodiment of the present disclosure and equations (1) to (4), and the strong filter is selected when the following equations (9) to (14) are all satisfied (for example, when the sample values on both sides of the block boundary are flat):
2(dp0+dq0)<(β>>2) (9)
2(dp3+dq3)<(β>>2) (10)
|p(3,0)-p(0,0)|+|q(0,0)-q(3,0)|<(β>>3) (11)
|p(3,3)-p(0,3)|+|q(0,3)-q(3,3)|<(β>>3) (12)
|p(0,0)-q(0,0)|<(5tC+1)>>1 (13)
|p(0,3)-q(0,3)|<(5tC+1)>>1 (14)
where the operator "a >> b" denotes an arithmetic right shift of a by b bits. Equations (9) and (10) bound the degree of variation of the sample values on the two sides of the block boundary. Equations (11) and (12) determine whether the samples on both sides of the block boundary are flat. Equations (13) and (14) determine whether the span of the samples across the block boundary is within a certain range.
Further, when at least one of equations (9) to (14) is not satisfied (for example, equation (13) fails when the difference between the samples p(0, 0) and q(0, 0) on the two sides of the block boundary is large), the weak filter is selected.
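The strong/weak filter choice of equations (9) to (14) can be sketched as follows, reusing the assumed 4x4 p[x][y]/q[x][y] indexing (x is the distance from the boundary, y the row); all names are illustrative:

```python
# Sketch of the strong-filter decision (equations (9)-(14)). p and q are 4x4
# sample arrays on the two sides of a vertical boundary, indexed p[x][y];
# beta and tc are the first and second boundary thresholds.

def use_strong_filter(p, q, beta, tc):
    dp0 = abs(p[2][0] - 2 * p[1][0] + p[0][0])
    dq0 = abs(q[2][0] - 2 * q[1][0] + q[0][0])
    dp3 = abs(p[2][3] - 2 * p[1][3] + p[0][3])
    dq3 = abs(q[2][3] - 2 * q[1][3] + q[0][3])
    return (2 * (dp0 + dq0) < (beta >> 2)                                  # (9)
            and 2 * (dp3 + dq3) < (beta >> 2)                              # (10)
            and abs(p[3][0] - p[0][0]) + abs(q[0][0] - q[3][0]) < (beta >> 3)  # (11)
            and abs(p[3][3] - p[0][3]) + abs(q[0][3] - q[3][3]) < (beta >> 3)  # (12)
            and abs(p[0][0] - q[0][0]) < ((5 * tc + 1) >> 1)               # (13)
            and abs(p[0][3] - q[0][3]) < ((5 * tc + 1) >> 1))              # (14)
```

When the sides are flat with a small step across the boundary the strong filter is chosen; a large step violates (13)/(14) and falls back to the weak filter.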
In HEVC, the first boundary threshold β and the second boundary threshold tC are derived with reference to Table 1 below, based on the quantization parameters QP of the coding blocks to which the blocks on the two sides of the block boundary belong, together with a first offset value for the first boundary threshold and a second offset value for the second boundary threshold. The first and second offset values are represented in the bitstream by the syntax elements slice_beta_offset_div2 and slice_tc_offset_div2.
TABLE 1
Q    0  1  2  3  4  5  6  7  8  9  10 11 12 13 14 15 16 17 18
β′   0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  6  7  8
tC′  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  0  1

Q    19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37
β′   9  10 11 12 13 14 15 16 17 18 20 22 24 26 28 30 32 34 36
tC′  1  1  1  1  1  1  1  1  2  2  2  2  3  3  3  3  4  4  4

Q    38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53
β′   38 40 42 44 46 48 50 52 54 56 58 60 62 64 -  -
tC′  5  5  6  6  7  8  9  10 11 13 14 16 18 20 22 24
Specifically, let QpP and QpQ be the quantization parameters QP of the coding blocks to which the blocks P and Q belong, respectively. The variable qPL is obtained by the following equation (15):
qPL=(QpP+QpQ+1)>>1 (15)
To derive the first boundary threshold β, Q is calculated by the following equation (16):
Q=Clip3(0,51,qPL+(slice_beta_offset_div2<<1)) (16)
Clip3(x, y, z) is a clamping function that limits z to the range x to y (inclusive), and "a << b" denotes an arithmetic left shift of a by b bits.
Referring to table 1, β' is derived, and based on the bit depth value of the luminance component, the first boundary threshold β is derived by the following equation (17):
β=β′*(1<<(BitDepthY-8)) (17)
where BitDepthY specifies the bit depth of the samples of the luma array. For the Main Profile commonly used in HEVC, BitDepthY = 8, in which case β = β′.
Similarly, to derive the second boundary threshold tC, Q is calculated by the following equation (18):
Q=Clip3(0,53,qPL+2*(bS-1)+(slice_tc_offset_div2<<1)) (18)
where bS is the boundary filtering strength derived in the first stage.
t′C is then derived with reference to Table 1, and the second boundary threshold tC is derived from the bit depth of the luma component by the following equation (19):
tC=t′C*(1<<(BitDepthY-8)) (19)
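The derivation in equations (15) to (19), together with the Table 1 lookup, can be sketched as follows. The two lists transcribe the β′ and t′C rows of Table 1; all function and parameter names are illustrative, and the normative text is the HEVC specification:

```python
# Sketch of the threshold derivation (equations (15)-(19)). qp_p/qp_q are the
# QPs of the two blocks, bs the boundary filtering strength, and the offsets
# are the slice-level syntax element values. Lists transcribe Table 1.

BETA_PRIME = [0] * 16 + [6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18,
              20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, 48,
              50, 52, 54, 56, 58, 60, 62, 64]              # Q = 0..51
TC_PRIME = [0] * 18 + [1] * 9 + [2] * 4 + [3] * 4 + [4] * 3 + \
           [5, 5, 6, 6, 7, 8, 9, 10, 11, 13, 14, 16, 18, 20, 22, 24]  # Q = 0..53

def clip3(lo, hi, v):
    """Clamp v to the inclusive range [lo, hi]."""
    return max(lo, min(hi, v))

def derive_thresholds(qp_p, qp_q, bs, beta_off_div2, tc_off_div2, bit_depth=8):
    qp_l = (qp_p + qp_q + 1) >> 1                                  # eq. (15)
    q_b = clip3(0, 51, qp_l + (beta_off_div2 << 1))                # eq. (16)
    beta = BETA_PRIME[q_b] * (1 << (bit_depth - 8))                # eq. (17)
    q_t = clip3(0, 53, qp_l + 2 * (bs - 1) + (tc_off_div2 << 1))   # eq. (18)
    tc = TC_PRIME[q_t] * (1 << (bit_depth - 8))                    # eq. (19)
    return beta, tc
```

With zero offsets, QP 32 on both sides, and bS = 1, Table 1 gives β′ = 26 and t′C = 3, so at 8-bit depth β = 26 and tC = 3.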
In some embodiments, after determining the class of the current picture in the sequence based on the predetermined threshold, the encoder adjusts the values of slice_beta_offset_div2 and slice_tc_offset_div2 and encodes them in the bitstream.
Specifically, slice_beta_offset_div2 and slice_tc_offset_div2 are written into the slice segment header syntax:
[slice segment header syntax, in which the syntax elements slice_beta_offset_div2 and slice_tc_offset_div2 are signaled]
slice_beta_offset_div2 and slice_tc_offset_div2 specify the deblocking parameter offsets for β and tC (divided by 2) for the current slice. The values of slice_beta_offset_div2 and slice_tc_offset_div2 shall be in the range of -6 to 6, inclusive.
By predetermining picture classes and setting a different combination of boundary threshold offset values for each picture class, the encoder can avoid traversing the 13 × 13 = 169 possible combinations of boundary threshold offset values for each picture to be deblock-filtered.
In some embodiments, the boundary threshold offset value combination may be set, for example, as shown in table 2 below:
TABLE 2
Picture class          slice_beta_offset_div2   slice_tc_offset_div2
High complexity        -2                       -2
Moderate complexity    0                        0
Low complexity         2                        2
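The per-class offset selection of Table 2 can be sketched as follows. The texture thresholds `t_low`/`t_high` and the class names are assumptions for illustration; the offset pairs come from Table 2:

```python
# Sketch of Table 2: map a picture's texture degree to a complexity class and
# the corresponding (slice_beta_offset_div2, slice_tc_offset_div2) pair.
# t_low/t_high are hypothetical thresholds set by the training step above.

OFFSETS = {
    "high_complexity": (-2, -2),
    "moderate_complexity": (0, 0),
    "low_complexity": (2, 2),
}

def pick_offsets(texture, t_low, t_high):
    if texture >= t_high:
        cls = "high_complexity"
    elif texture >= t_low:
        cls = "moderate_complexity"
    else:
        cls = "low_complexity"
    beta_off, tc_off = OFFSETS[cls]
    # Syntax constraint from the slice header semantics: both in [-6, 6].
    assert -6 <= beta_off <= 6 and -6 <= tc_off <= 6
    return beta_off, tc_off
```

High-texture pictures get negative offsets (lower thresholds, less filtering), flat pictures get positive offsets (more filtering), matching the masking argument above.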
FIG. 9 illustrates a schematic block diagram of an example electronic device 900 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 9, the device 900 includes a computing unit 901, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 902 or loaded from a storage unit 908 into a random access memory (RAM) 903. The RAM 903 can also store the various programs and data required for the operation of the device 900. The computing unit 901, the ROM 902, and the RAM 903 are connected to one another via a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
A number of components in the device 900 are connected to the I/O interface 905, including: an input unit 906 such as a keyboard, a mouse, and the like; an output unit 907 such as various types of displays, speakers, and the like; a storage unit 908 such as a magnetic disk, optical disk, or the like; and a communication unit 909 such as a network card, a modem, a wireless communication transceiver, and the like. The communication unit 909 allows the device 900 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 901 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 901 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 901 performs the respective methods and processes described above, such as the methods 100 or 500. For example, in some embodiments, the above-described methods may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 908. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 900 via the ROM 902 and/or the communication unit 909. When the computer program is loaded into the RAM 903 and executed by the computing unit 901, one or more steps of the above-described methods may be performed. Alternatively, in other embodiments, the computing unit 901 may be configured to perform the methods by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, a host product in a cloud computing service system that addresses the drawbacks of high management difficulty and weak service scalability in traditional physical hosts and VPS ("Virtual Private Server") services. The server may also be a server of a distributed system, or a server combined with a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (16)

1. A deblocking filtering method for a sequence of pictures, comprising:
determining a category of a current picture in the sequence based on a predetermined threshold;
adjusting a boundary threshold for deblocking filtering a block boundary between two adjacent blocks within the current picture based on the determined category; and
based on the boundary threshold, a deblocking filtering filter to be applied to the block boundary is selected.
2. The method of claim 1, further comprising:
calculating the texture degree of a plurality of sample pictures; and
setting the predetermined threshold based on the calculated texture degree.
3. The method of claim 2, further comprising:
calculating the texture degree of the current picture; and
comparing the calculated texture degree with the predetermined threshold value to determine the category of the current picture.
4. The method of claim 1, further comprising:
adjusting the boundary threshold by changing an offset value of the boundary threshold, the offset value being for a slice in the current picture.
5. The method of claim 4, wherein the boundary threshold comprises a first boundary threshold and a second boundary threshold, and the offset value comprises a first offset value for the first boundary threshold and a second offset value for the second boundary threshold, wherein the first offset value and the second offset value are represented in a bitstream by syntax elements slice _ beta _ offset _ div2 and slice _ tc _ offset _ div 2.
6. The method of claim 5, further comprising:
calculating the degree of change and the degree of flatness between samples positioned at two sides of the block boundary in the two adjacent blocks; and
selecting a deblocking filtering filter to be applied to the block boundary based on the degree of variation and the degree of flatness and the first and second boundary thresholds.
7. The method of claim 1, further comprising:
predicting the current picture to obtain a predicted picture aiming at the current picture;
sequentially transforming/quantizing and inversely quantizing/inversely transforming the residual between the current picture and the predicted picture to obtain a residual picture of the current picture;
obtaining a reconstructed picture of the current picture based on the predicted picture and the residual picture; and
processing the reconstructed picture with the deblocking filter to obtain a filtered reconstructed picture.
8. A deblocking filtering apparatus for a picture sequence, comprising:
a picture category determination module that determines a category of a current picture in the sequence based on a predetermined threshold;
an adjusting module that adjusts a boundary threshold for deblocking filtering a block boundary between two adjacent blocks within a current picture based on the determined category; and
a selection module that selects a deblocking filtering filter to be applied to the block boundary based on the boundary threshold.
9. The apparatus of claim 8, further comprising:
the first calculation module is used for calculating the texture degrees of the sample pictures; and
and the setting module is used for setting the preset threshold value based on the calculated texture degree.
10. The apparatus of claim 9, further comprising:
the second calculation module is used for calculating the texture degree of the current picture; and
and the comparison module compares the calculated texture degree with the predetermined threshold value so as to determine the category of the current picture by the picture category determination module.
11. The apparatus of claim 8, further comprising:
a change module to adjust a boundary threshold by changing an offset value of the boundary threshold, the offset value being for a slice in the current picture.
12. The apparatus of claim 11, wherein the boundary threshold comprises a first boundary threshold and a second boundary threshold, and the offset value comprises a first offset value for the first boundary threshold and a second offset value for the second boundary threshold, wherein the first offset value and the second offset value are represented in a bitstream by syntax elements slice _ beta _ offset _ div2 and slice _ tc _ offset _ div 2.
13. The apparatus of claim 12, further comprising:
a third calculating module, for calculating the degree of variation and the degree of flatness between samples located at two sides of the block boundary in the two adjacent blocks; and
the selection module selects a deblocking filter to be applied to the block boundary based on the degree of variation and the degree of flatness and the first and second boundary thresholds.
14. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
15. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any of claims 1-7.
16. A computer program product comprising a computer program which, when executed by a processor, implements a method according to any one of claims 1 to 7.
CN202110844516.4A 2021-07-26 2021-07-26 Deblocking filtering method and device for picture sequence, electronic equipment and medium Active CN113573055B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110844516.4A CN113573055B (en) 2021-07-26 2021-07-26 Deblocking filtering method and device for picture sequence, electronic equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110844516.4A CN113573055B (en) 2021-07-26 2021-07-26 Deblocking filtering method and device for picture sequence, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN113573055A true CN113573055A (en) 2021-10-29
CN113573055B CN113573055B (en) 2024-03-01

Family

ID=78167387

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110844516.4A Active CN113573055B (en) 2021-07-26 2021-07-26 Deblocking filtering method and device for picture sequence, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN113573055B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019137749A1 (en) * 2018-01-10 2019-07-18 Telefonaktiebolaget Lm Ericsson (Publ) Determining filter length for deblocking during encoding and/or decoding of video
TW201943270A (en) * 2018-04-02 2019-11-01 美商高通公司 Unification of deblocking filter and adaptive loop filter
CN111652818A (en) * 2020-05-29 2020-09-11 浙江大华技术股份有限公司 Image filtering method and device based on pyramid and storage medium
CN111711825A (en) * 2020-06-23 2020-09-25 腾讯科技(深圳)有限公司 Deblocking filtering method, apparatus, device and medium in video encoding and decoding
US20210067773A1 (en) * 2019-09-03 2021-03-04 Novatek Microelectronics Corp. Video encoder, video decoder, and video system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
QINGBO WU: "Mode dependent loop filter for intra prediction coding in H.264/AVC", JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION *
CHEN ZHIXIAN; WANG GUOZHONG; ZHAO HAIWU; LI GUOPING; TENG GUOWEI: "Improved deblocking filtering algorithm based on AVS2", VIDEO ENGINEERING (电视技术) *

Also Published As

Publication number Publication date
CN113573055B (en) 2024-03-01

Similar Documents

Publication Publication Date Title
US10462459B2 (en) Non-local adaptive loop filter
CN110036637B (en) Method and device for denoising and vocalizing reconstructed image
US8582666B2 (en) Image compression and decompression
EP2670142B1 (en) Method and apparatus for noise filtering in video coding
EP2664141B1 (en) Deblocking filtering
CN109905711B (en) Image processing method and system and terminal equipment
CN111630857B (en) Video encoding and decoding method/device and corresponding non-volatile computer readable medium
KR20180083389A (en) Video coding method and apparatus
KR20050091270A (en) Filter for removing blocking effect and filtering method thereof
CN110024387A (en) Code device, decoding apparatus, coding method and coding/decoding method
US8005307B2 (en) Decoding apparatus, decoding method, computer readable medium storing program thereof, and computer data signal
Gandam et al. An efficient post-processing adaptive filtering technique to rectifying the flickering effects
Chen et al. Rough mode cost–based fast intra coding for high-efficiency video coding
US20180103274A1 (en) Deblocking filtering method and deblocking filter
WO2020252745A1 (en) Loop filter design for adaptive resolution video coding
Shin et al. Variable block-based deblocking filter for H. 264/AVC on low-end and low-bit rates terminals
Correa et al. Complexity control of HEVC through quadtree depth estimation
CN113573055B (en) Deblocking filtering method and device for picture sequence, electronic equipment and medium
Han et al. Quadtree-based non-local Kuan’s filtering in video compression
US20120044991A1 (en) Moving image coding apparatus and moving image coding method
WO2021146933A1 (en) Next-generation loop filter implementations for adaptive resolution video coding
Karimzadeh et al. An efficient deblocking filter algorithm for reduction of blocking artifacts in HEVC standard
EP3343919B1 (en) Video encoding apparatus, video encoding method, video decoding apparatus, and video decoding method
CN110249630B (en) Deblocking filter apparatus, method and storage medium
Li et al. Complexity Reduction of an Adaptive Loop Filter Based on Local Homogeneity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant