WO2013145174A1 - Video encoding method, video decoding method, video encoding device, and video decoding device

Info

Publication number
WO2013145174A1
Authority
WO
WIPO (PCT)
Prior art keywords
filter
information
filter coefficient
coefficient information
pixel block
Prior art date
Application number
PCT/JP2012/058209
Other languages
French (fr)
Japanese (ja)
Inventor
隆志 渡辺
孝幸 伊東
山影 朋夫
昭行 谷沢
Original Assignee
株式会社 東芝
Priority date
Filing date
Publication date
Application filed by 株式会社 東芝
Priority to PCT/JP2012/058209
Publication of WO2013145174A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/80 Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N19/82 Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation involving filtering within a prediction loop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46 Embedding additional information in the video signal during the compression process

Definitions

  • Embodiments relate to a moving image encoding technique and a decoding technique.
  • LCU stands for Large Coding Unit; an LCU is a pixel block treated as a coding unit.
  • A moving image encoding apparatus that encodes filter information in units of LCUs is also called an LCU-based encoder.
  • Filter information corresponding to each LCU is encoded and signaled to the video decoding device.
  • For the filter information, encoding modes such as a mode that newly encodes the filter information itself and a mode that encodes a reference to already encoded filter information are prepared.
  • a picture-based encoder that sets a filter in units of pictures is also known.
  • Since the LCU-based encoder can encode the filter information in units of LCUs, the delay associated with the encoding process can be reduced.
  • On the other hand, because the filters are optimized in units of LCUs, a large number of filters must be set within a picture, so the overhead of the encoded data tends to increase.
  • the picture-based encoder optimizes the filter for each picture. Therefore, according to the picture-based encoder, the total number of filters to be set is easily reduced as compared with the LCU-based encoder, and the overhead of encoded data is likely to be smaller than that of the LCU-based encoder. On the other hand, the picture-based encoder cannot set a filter until the encoding process for all the LCUs in the picture is completed. Therefore, the delay accompanying the encoding process is larger than that of the LCU-based encoder.
  • the embodiment is intended to reduce the overhead of encoded data.
  • one of the purposes of the embodiment is to reduce the delay associated with the encoding process.
  • the moving image encoding method includes encoding first filter coefficient information indicating each filter coefficient of one or more filters included in a first filter set that is set for a decoded image.
  • the moving image encoding method also includes encoding filter switching information indicating which of the first filter set and a second filter set, set for the pixel block to be processed, is applied when loop filter processing is applied to the pixel block to be processed in the decoded image; this filter switching information is encoded after the first filter coefficient information is encoded.
  • the moving image encoding method includes encoding second filter coefficient information indicating each filter coefficient of one or more filters included in the second filter set, after the first filter coefficient information is encoded.
  • the moving image encoding method further includes generating a pixel block of a reference image by applying one of the first filter set and the second filter set to the pixel block to be processed, based on the filter switching information.
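  • The following toy sketch (element names are placeholders, not actual syntax) only illustrates this ordering: the first filter coefficient information precedes all per-block data, and second filter coefficient information appears only for pixel blocks whose filter switching information selects the second filter set.

    def emit_order(num_blocks, blocks_using_second_set):
        """Toy sketch of the encoding order described above."""
        stream = ["first_filter_coefficient_information"]
        for b in range(num_blocks):
            stream.append(f"filter_switching_information[{b}]")
            if b in blocks_using_second_set:
                stream.append(f"second_filter_coefficient_information[{b}]")
        return stream

    print(emit_order(3, blocks_using_second_set={1}))
    # ['first_filter_coefficient_information',
    #  'filter_switching_information[0]',
    #  'filter_switching_information[1]', 'second_filter_coefficient_information[1]',
    #  'filter_switching_information[2]']
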
  • FIG. 1 is a block diagram illustrating a moving image encoding apparatus according to a first embodiment.
  • FIG. 2 is a block diagram illustrating a loop filter processing unit in FIG. 1.
  • FIG. 4 is an explanatory drawing of the syntax structure of a picture-based encoder, and FIG. 5 is an explanatory drawing of the syntax structure of an LCU-based encoder.
  • FIG. 6 is an explanatory drawing of the syntax structure used by the moving image encoding device and the moving image decoding device according to the first embodiment, and FIG. 7 illustrates the syntax used by the moving image encoder according to the first embodiment to encode the first filter coefficient information.
  • FIG. 8 and FIG. 9 each illustrate a syntax used by the moving image encoder according to the first embodiment to encode the filter switching information and the second filter coefficient information.
  • FIG. 12 is a flowchart illustrating a part of the encoding process performed by the moving image encoding apparatus according to the first embodiment.
  • A block diagram illustrating a moving image decoding apparatus according to the first embodiment.
  • A block diagram illustrating a moving image encoding apparatus according to a second embodiment.
  • A block diagram illustrating a moving image decoding apparatus according to a second embodiment.
  • A block diagram illustrating a moving image encoding apparatus according to a third embodiment.
  • A block diagram illustrating a moving image decoding apparatus according to a third embodiment.
  • A block diagram illustrating a moving image encoding apparatus according to a fifth embodiment.
  • the filter process may be a process of multiply-adding the pixel values of the pixel to be processed and one or more peripheral pixels using a plurality of filter coefficients, but is not limited thereto.
  • the filtering process may be only a process of adding one filter coefficient (also called an offset value) to the pixel value of the pixel to be processed.
  • the filtering process may be a process of adding a single filter coefficient (also referred to as an offset value) after performing a product-sum operation on the pixel values of the processing target pixel and one or more neighboring pixels using a plurality of filter coefficients.
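  • A minimal sketch of such a filter process, assuming a square coefficient window and edge replication at block borders (both choices are illustrative assumptions, not details of the embodiment):

    import numpy as np

    def loop_filter_pixel_block(block, coeffs, offset=0.0):
        """Each output pixel is a product-sum of the pixel and its neighbours with
        the filter coefficients, plus a single offset coefficient."""
        k = coeffs.shape[0] // 2                     # half-width of the filter window
        padded = np.pad(block, k, mode='edge')       # replicate edges for border pixels
        out = np.empty_like(block, dtype=np.float64)
        for y in range(block.shape[0]):
            for x in range(block.shape[1]):
                window = padded[y:y + coeffs.shape[0], x:x + coeffs.shape[1]]
                out[y, x] = np.sum(window * coeffs) + offset
        return out

    # Offset-only filtering (the second variant above) corresponds to a kernel that
    # passes the centre pixel through unchanged, combined with a non-zero offset:
    block = np.arange(16, dtype=np.float64).reshape(4, 4)
    identity = np.zeros((3, 3)); identity[1, 1] = 1.0
    print(loop_filter_pixel_block(block, identity, offset=2.0))   # adds 2 to every pixel
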
  • the moving picture coding apparatus includes a moving picture coding unit 1000 and a coding control unit 111.
  • the moving image encoding unit 1000 includes a predicted image generation unit 101, a subtraction unit 102, a transform / quantization unit 103, an entropy encoding unit 104, an inverse quantization / inverse transform unit 105, an addition unit 106, a loop filter information generation unit 107, a loop filter processing unit 108, a first filter setting unit 109, and a first filter buffer 110.
  • the encoding control unit 111 controls the operation of each unit of the moving image encoding unit 1000.
  • the predicted image generation unit 101 generates the predicted image 11 by performing the prediction process of the input image 10 in units of pixel blocks, for example.
  • the input image 10 includes a plurality of pixel signals and is acquired from the outside of the moving image encoding device.
  • the prediction process may be a general process such as a temporal inter-screen prediction process using motion compensation and a spatial intra-screen prediction process using encoded pixels in the screen.
  • the predicted image generation unit 101 may perform a prediction process on the input image 10 based on a loop filter processed image 19 described later.
  • the predicted image generation unit 101 outputs the predicted image 11 to the subtraction unit 102 and the addition unit 106.
  • the subtraction unit 102 acquires the input image 10 from the outside of the moving image encoding device, and inputs the predicted image 11 from the predicted image generation unit 101.
  • the subtraction unit 102 subtracts the prediction image 11 from the input image 10 to generate a prediction error image 12.
  • the subtraction unit 102 outputs the prediction error image 12 to the transform / quantization unit 103.
  • the transform / quantization unit 103 receives the prediction error image 12 from the subtraction unit 102.
  • the transform / quantization unit 103 performs transform processing on the prediction error image 12 to generate transform coefficients. Further, the transform / quantization unit 103 quantizes the transform coefficient to generate a quantized transform coefficient.
  • the transform / quantization unit 103 outputs the quantized transform coefficient to the entropy coding unit 104 and the inverse quantization / inverse transform unit 105.
  • the transform process is typically an orthogonal transform such as the Discrete Cosine Transform (DCT). Note that the transform process is not limited to DCT, and may be, for example, a wavelet transform, independent component analysis, or the like.
  • the quantization process is performed based on the quantization parameter set by the encoding control unit 111.
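  • For illustration only, the two steps can be sketched with a floating-point DCT followed by uniform quantization; the actual transform and the mapping from the quantization parameter to a step size are codec-specific and are not taken from the embodiment.

    import numpy as np

    def dct2_matrix(n):
        """Orthonormal DCT-II basis matrix (standard construction)."""
        k = np.arange(n).reshape(-1, 1)
        m = np.arange(n).reshape(1, -1)
        c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
        c[0, :] /= np.sqrt(2.0)
        return c

    def transform_and_quantize(prediction_error, qstep):
        """Transform the prediction error block, then quantize the transform
        coefficients with a step size derived from the quantization parameter
        (here the step size is simply given)."""
        c = dct2_matrix(prediction_error.shape[0])
        coeffs = c @ prediction_error @ c.T          # separable 2-D DCT
        return np.round(coeffs / qstep).astype(int)  # quantized transform coefficients

    block = np.outer(np.arange(4.0), np.ones(4)) * 10.0   # simple vertical gradient
    print(transform_and_quantize(block, qstep=8.0))
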
  • the entropy encoding unit 104 receives the quantized transform coefficients from the transform / quantization unit 103, the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 107, the first filter coefficient information 13 from the first filter buffer 110, and the encoding parameters from the encoding control unit 111.
  • the encoding parameters may include, for example, prediction mode information, motion information, encoded block division information, quantization parameters, and the like.
  • the entropy coding unit 104 performs entropy coding (for example, Huffman coding or arithmetic coding) on the quantized transform coefficients, the first filter coefficient information 13, the filter switching information 14, the second filter coefficient information 15, and the encoding parameters according to the syntax, thereby generating encoded data 17. Details of the syntax used by the entropy encoding unit 104 will be described later.
  • the entropy encoding unit 104 outputs the encoded data 17 to the outside of the moving image encoding apparatus (for example, a communication system, a storage system, etc.). The encoded data 17 is decoded by a moving picture decoding apparatus described later.
  • the inverse quantization / inverse transform unit 105 inputs the quantized transform coefficient from the transform / quantization unit 103.
  • the inverse quantization / inverse transform unit 105 generates transform coefficients by inverse quantization of the quantized transform coefficients. Further, the inverse quantization / inverse transform unit 105 generates a prediction error image by performing an inverse transform process on the transform coefficient.
  • the inverse quantization / inverse transform unit 105 outputs the prediction error image to the addition unit 106.
  • the inverse quantization / inverse transform unit 105 performs the inverse process of the transform / quantization unit 103. That is, the inverse quantization is performed based on the quantization parameter set by the encoding control unit 111. Further, the inverse transform process is determined by the transform process performed by the transform / quantization unit 103.
  • the inverse transform process is, for example, inverse DCT (Inverse DCT; IDCT), inverse wavelet transform, or the like.
  • the addition unit 106 receives the prediction image 11 from the prediction image generation unit 101 and inputs the prediction error image from the inverse quantization / inverse conversion unit 105.
  • the adding unit 106 adds the prediction error image to the prediction image to generate a (local) decoded image 16.
  • Adder 106 outputs decoded image 16 to loop filter information generator 107 and loop filter processor 108.
  • the loop filter information generation unit 107 and the loop filter processing unit 108 perform processing in units of pixel blocks.
  • the pixel block is typically handled as a coding unit.
  • the pixel block corresponds to, for example, a macroblock in H.264/AVC or an LCU in HEVC (High Efficiency Video Coding). In the following description, for convenience, it is assumed that the pixel block corresponds to an LCU; however, the present embodiment is applicable even when the pixel block does not correspond to an LCU.
  • the loop filter information generation unit 107 acquires the input image 10 in units of pixel blocks from the outside of the video encoding device, receives the decoded image 16 in units of pixel blocks from the addition unit 106, and receives the first filter coefficient information 13 from the first filter buffer 110.
  • the loop filter information generation unit 107 generates filter setting information 18 corresponding to the pixel block to be processed based on the input image 10 and the decoded image 16. In addition, the loop filter information generation unit 107 generates second filter coefficient information 15 corresponding to the pixel block to be processed based on the filter setting information 18. Furthermore, the loop filter information generation unit 107 generates filter switching information 14 corresponding to the pixel block to be processed based on the input image 10, the first filter coefficient information 13, the second filter coefficient information 15, and the decoded image 16.
  • the first filter coefficient information 13 is information indicating each filter coefficient of one or more filters included in the first filter set set in units of pictures.
  • the filter switching information 14 is information indicating which of the first filter set and the second filter set is applied when the loop filter process is applied to the corresponding pixel block.
  • the second filter coefficient information 15 is information indicating each filter coefficient of one or more filters included in the second filter set set in pixel block units.
  • the filter setting information 18 means information necessary for setting a filter coefficient for a pixel block to be processed.
  • the loop filter information generation unit 107 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104 and the loop filter processing unit 108.
  • the loop filter information generation unit 107 outputs the filter setting information 18 to the first filter setting unit 109.
  • the loop filter information generation unit 107 includes a filter setting information generation unit 112, a second filter setting unit 113, and a filter switching information generation unit 114.
  • the filter setting information generation unit 112 acquires the input image 10 from the outside of the video encoding device in units of pixel blocks, and inputs the decoded image 16 from the addition unit 106 in units of pixel blocks.
  • the filter setting information generation unit 112 generates filter setting information 18 corresponding to the pixel block to be processed based on the input image 10 and the decoded image 16.
  • the filter setting information generation unit 112 outputs the filter setting information 18 to the first filter setting unit 109 and the second filter setting unit 113.
  • for example, the filter setting information generation unit 112 generates, as the filter setting information 18, information indicating correlation matrices used to set a Wiener filter, based on the input image 10 in units of pixel blocks and the decoded image 16 in units of pixel blocks.
  • the Wiener filter is a so-called pixel restoration filter and can minimize the residual sum of squares between the input image 10 and the loop filter processed image 19.
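  • A hedged sketch of how such a least-squares (Wiener) filter can be derived from correlation statistics; the 1-D formulation, tap count, and accumulation scheme below are illustrative assumptions, not the embodiment's exact procedure.

    import numpy as np

    def wiener_filter_coeffs(decoded, original, taps=5):
        """Solve the normal equations R w = p, where R is the autocorrelation of
        decoded-sample neighbourhoods and p their cross-correlation with the
        original samples; these accumulators play the role of the per-block
        'filter setting information' (correlation matrices)."""
        k = taps // 2
        pad = np.pad(decoded, k, mode='edge')
        A = np.stack([pad[i:i + decoded.size] for i in range(taps)], axis=1)
        R = A.T @ A            # autocorrelation matrix (can also be summed over blocks)
        p = A.T @ original     # cross-correlation vector
        return np.linalg.solve(R, p)   # coefficients minimising the residual sum of squares

    rng = np.random.default_rng(0)
    orig = rng.normal(size=256)
    deco = orig + rng.normal(scale=0.3, size=256)   # "decoded" = original + coding noise
    print(wiener_filter_coeffs(deco, orig))
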
  • the second filter setting unit 113 receives the filter setting information 18 from the filter setting information generation unit 112.
  • the second filter setting unit 113 generates the second filter coefficient information 15 by setting the filter coefficients of each of the one or more filters included in the second filter set applicable to the pixel block to be processed, based on the filter setting information 18.
  • the second filter setting unit 113 outputs the second filter coefficient information 15 to the entropy encoding unit 104, the loop filter processing unit 108, and the filter switching information generation unit 114.
  • for example, by regarding the correlation matrix indicated by the filter setting information 18 as simultaneous equations and solving them, the second filter coefficient information 15 can be generated.
  • the filter switching information generation unit 114 acquires the input image 10 from the outside of the video encoding device in units of pixel blocks, receives the decoded image 16 from the addition unit 106 in units of pixel blocks, receives the first filter coefficient information 13 from the first filter buffer 110, and receives the second filter coefficient information 15 from the second filter setting unit 113.
  • the filter switching information generation unit 114 determines which of the first filter set and the second filter set is applied to the pixel block to be processed based on the input image 10 and the decoded image 16.
  • the filter switching information generation unit 114 can use, for example, the encoding cost represented by the following mathematical formula (1) in order to determine the filter set to be applied, but is not limited to this, and any technique can be used.
  • the filter switching information generation unit 114 generates filter switching information 14 based on the determination result.
  • the filter switching information generation unit 114 outputs the filter switching information 14 to the entropy encoding unit 104 and the loop filter processing unit 108.
  • In formula (1), Cost = D + λ × R, where Cost represents the coding cost, D represents the residual sum of squares, λ represents a coefficient, and R represents the code amount.
  • since the first filter coefficient information 13 is encoded in units of pictures, when the first filter set is applied to a pixel block, information indicating the filter coefficients themselves does not have to be newly encoded for that pixel block. Therefore, in general, the code amount when the first filter set is applied is not larger than the code amount when the second filter set is applied. That is, according to formula (1), the trade-off between the residual sum of squares and the code amount can be taken into account when determining the filter set applied to the pixel block.
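  • A small numerical illustration of formula (1); the bit counts below are invented solely to show how the rate term R can tip the decision toward the already-encoded first filter set.

    def coding_cost(ssd, bits, lam):
        """Formula (1): Cost = D + lambda * R."""
        return ssd + lam * bits

    lam = 10.0
    cost_first = coding_cost(ssd=5200.0, bits=1, lam=lam)         # switching flag only
    cost_second = coding_cost(ssd=4100.0, bits=1 + 120, lam=lam)  # flag plus new coefficients
    print(cost_first, cost_second, cost_second < cost_first)
    # 5210.0 5310.0 False -> here the rate penalty outweighs the SSD gain
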
  • the filter switching information 14 may further include, in addition to information indicating which of the first filter set and the second filter set is applied to the corresponding pixel block, information indicating whether or not loop filter processing is applied to the corresponding pixel block at all. That is, the filter switching information 14 can include information indicating whether or not the loop filter processing is applied to the corresponding pixel block and, when the loop filter processing is applied, information indicating which of the first filter set and the second filter set is applied.
  • the filter switching information generation unit 114 may generate filter switching information 14 indicating that the loop filter process is not applied to the pixel block.
  • the loop filter processing unit 108 receives the decoded image 16 from the addition unit 106 in units of pixel blocks, receives the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 107, and receives the first filter coefficient information 13 from the first filter buffer 110.
  • the loop filter processing unit 108 applies either the first filter set or the second filter set based on the filter switching information 14 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 108 performs loop filter processing on the pixel block to be processed using the first filter coefficient information 13 or the second filter coefficient information 15, thereby generating a pixel block of the loop filter processed image 19. As described above, if the filter switching information 14 indicates that the loop filter process is not applied to the pixel block to be processed, the loop filter processing unit 108 omits the loop filter process for that pixel block.
  • the loop filter processed image 19 may be stored in a storage unit (not shown) (for example, a buffer) accessible by the predicted image generation unit 101.
  • the loop filter processed image 19 is read as a reference image by the predicted image generation unit 101 as necessary, and is used for the prediction process.
  • the loop filter processing unit 108 includes a switch 115 and a filter application unit 116.
  • the switch 115 receives the first filter coefficient information 13 from the first filter buffer 110, and receives the filter switching information 14 and the second filter coefficient information 15 corresponding to the pixel block to be processed from the loop filter information generation unit 107.
  • the switch 115 selects one of the first filter coefficient information 13 and the second filter coefficient information 15 based on the filter switching information 14, and outputs the selected filter coefficient information (hereinafter referred to as selected filter coefficient information) to the filter application unit 116. That is, if the filter switching information 14 indicates that the first filter set is applied to the pixel block to be processed, the switch 115 selects the first filter coefficient information 13.
  • if the filter switching information 14 indicates that the second filter set is applied to the pixel block to be processed, the switch 115 selects the second filter coefficient information 15. If the filter switching information 14 indicates that the loop filter processing is not applied to the pixel block to be processed, the switch 115 selects neither the first filter coefficient information 13 nor the second filter coefficient information 15.
  • the filter application unit 116 inputs the decoded image 16 from the addition unit 106 in units of pixel blocks, and inputs selection filter coefficient information from the switch 115.
  • the filter application unit 116 generates a pixel block in the loop filter processed image 19 by performing filter processing on the pixel block to be processed based on the selected filter coefficient information.
  • the filter application unit 116 outputs the loop filter processed image 19 to the predicted image generation unit 101. If the filter switching information 14 indicates that the loop filter process is not applied to the pixel block to be processed as described above, the filter application unit 116 uses the pixel block of the decoded image 16 as it is as the pixel block in the loop filter processed image 19.
  • the first filter setting unit 109 receives the filter setting information 18 from the loop filter information generation unit 107.
  • the first filter setting unit 109 collects the filter setting information 18 corresponding to, for example, all pixel blocks in a picture, and sets the filter coefficients of the first filter set for a subsequent picture based on the collected filter setting information 18.
  • the first filter setting unit 109 stores filter coefficient information indicating the set filter coefficient in the first filter buffer 110.
  • the subsequent picture may be a picture immediately after the picture to be processed in the encoding order, or may be a picture that is two or more pictures after.
  • the first filter setting unit 109 collects filter setting information 18 corresponding to all or some of the pixel blocks in the picture. Then, the first filter setting unit 109 integrates the correlation matrix by calculating the element sum of the correlation matrix indicated by the collected filter setting information 18.
  • the first filter setting unit 109 can set the filter coefficient information for the subsequent picture by regarding the integrated correlation matrix as simultaneous equations and solving them.
  • the first filter setting unit 109 does not necessarily need to set the filter coefficient information for the subsequent picture based on the filter setting information 18 of the pixel block in the current picture.
  • the first filter setting unit 109 may be able to refer to default filter coefficient information.
  • the first filter setting unit 109 may store default filter coefficient information in the first filter buffer 110 without depending on the filter setting information 18.
  • the filter coefficient information stored in the first filter buffer 110 is read out as the first filter coefficient information 13 by the entropy encoding unit 104, the loop filter information generation unit 107, and the loop filter processing unit 108 in the encoding process for subsequent pictures.
  • the first filter buffer 110 may store filter coefficient information for a plurality of filter sets at the same time.
  • in this case, the first filter coefficient information 13 for the current picture may be selected from the filter coefficient information of the plurality of filter sets stored in the first filter buffer 110, for example, in accordance with the level of the quantization parameter set for the picture.
  • the first filter coefficient information 13 is typically set based on the filter setting information 18 of the LCU in the preceding picture.
  • the first filter set does not necessarily minimize the residual sum of squares for the entire current picture.
  • however, with the first filter coefficient information 13 set based on the filter setting information 18 of the LCUs in the preceding picture, suboptimal filtering that reduces the residual sum of squares can generally be realized even for the current picture.
  • the encoding control unit 111 performs encoding block division control, generated code amount feedback control, quantization control, mode control, and the like for the moving image encoding unit 1000.
  • the encoding control unit 111 outputs the encoding parameter to the entropy encoding unit 104.
  • the syntax used by the entropy encoding unit 104 will be described.
  • a syntax structure including a sequence level syntax, a picture level syntax, a slice level syntax, and an LCU level syntax is assumed.
  • the upper layer syntax information relating to the entire moving image such as the vertical and horizontal sizes of the image is encoded in the sequence level syntax SPS (Sequence Parameter Set).
  • SPS: Sequence Parameter Set
  • PPS: Picture Parameter Set
  • APS: Adaptation Parameter Set
  • syntax information necessary for decoding at the slice level is encoded in the slice header of the slice level syntax.
  • in the LCU level syntax, syntax information such as the quantized transform coefficients and prediction mode information of each LCU is encoded.
  • FIG. 4 shows a syntax structure according to the first comparative example.
  • the syntax structure of FIG. 4 is disclosed in WD5 (Working Draft 5) of HEVC.
  • the filter coefficient information is set in units of pictures and is encoded in APS with picture level syntax.
  • filter application information indicating whether or not a filter is applied in each LCU is collectively encoded in a slice header of slice level syntax. That is, the syntax structure of FIG. 4 is used by a so-called picture-based encoder.
  • filter coefficient information that minimizes the residual sum of squares can be set for each picture.
  • the delay until the encoded data is output is large.
  • FIG. 5 shows a syntax structure according to the second comparative example.
  • the syntax structure of FIG. 5 is disclosed in Non-Patent Document 2.
  • the filter coefficient information is set for each LCU, and is encoded for each LCU in the LCU level syntax.
  • in each LCU, filter application information is encoded that indicates whether or not a filter is applied and, when a filter is applied, whether it is an already encoded filter in the same slice or a new filter.
  • when an already encoded filter is referred to, the encoding of the filter coefficient information can be omitted.
  • that is, the syntax structure of FIG. 5 is used by a so-called LCU-based encoder. According to the syntax structure of FIG. 5, it is possible to set filter coefficient information that minimizes the residual sum of squares in units of LCUs. Furthermore, since encoded data can be sequentially output every time the encoding process of an LCU is completed, the delay can be reduced. On the other hand, since the filters are optimized on an LCU basis, the set filters are diversified, and the overhead of the filter coefficient information is likely to increase.
  • the entropy encoding unit 104 uses, for example, the syntax structure shown in FIG. According to the syntax structure of FIG. 6, the first filter coefficient information 13 is set for each picture and is encoded in the APS of the picture level syntax.
  • the filter switching information 14 is set for each LCU, and is encoded for each LCU in the LCU level syntax.
  • the second filter coefficient information 15 is also set for each LCU, and is encoded for each LCU in the LCU level syntax. However, when at least the second filter set is not applied to the LCU, the encoding of the second filter coefficient information 15 can be omitted.
  • the first filter coefficient information 13 is set for the following picture by the first filter setting unit 109.
  • the first filter coefficient information 13 is stored in the first filter buffer 110 before the encoding process for the first LCU of each picture starts. Therefore, according to the syntax structure of FIG. 6, the encoded data 17 can be sequentially output every time the encoding process in units of LCU is completed as in the LCU base encoder, so that the delay can be reduced.
  • the first filter set may be applied instead of the second filter set based on, for example, the above-described tradeoff between the residual sum of squares and the code amount. Therefore, since the total number of second filter sets to be set is reduced as compared with the LCU-based encoder, the overhead of the encoded data 17 can be reduced. Furthermore, as described above, if the sub-optimal first filter set is set based on the preceding picture, the possibility that the first filter set is applied to the LCU in the current picture increases. The overhead of the encoded data 17 is likely to be reduced.
  • the entropy encoding unit 104 can use, for example, the syntax shown in FIG. 7 to encode the first filter coefficient information 13.
  • FIG. 7 illustrates information related to loop filter processing in the APS of the syntax structure illustrated in FIG. 6.
  • aps_id is identification information for specifying an APS that is referenced in a layer below the slice level.
  • the slice can refer to any one of the already encoded APSs.
  • the APS in which the first filter coefficient information 13 is encoded and the encoded data of the LCUs that can refer to that first filter coefficient information 13 are depicted as being contiguous.
  • however, the APS in which the first filter coefficient information 13 is encoded and the encoded data of the LCUs that can refer to it do not need to be encoded contiguously.
  • the first filter coefficient information 13 that can be referred to can be adaptively switched in units of slices.
  • picture_based_filter_flag is information indicating whether or not the first filter coefficient information 13 is encoded in the current APS. If picture_based_filter_flag is 1, information on the first filter set including the first filter coefficient information 13 is encoded in the current APS.
  • Picture_based_filter_num_information is information indicating the total number of filters included in the first filter set.
  • the total number of filters included in the first filter set is represented as PictureBasedFilterNum.
  • the first filter set can include one or more filters.
  • picture_based_filter_num_information may be obtained by encoding the total number of filters included in the first filter set, or by encoding a value obtained by subtracting 1 from that total number. Alternatively, if the total number of filters included in the first filter set is determined in advance, the encoding of picture_based_filter_num_information can be omitted. In any case, a design is required so that the video decoding device can derive the total number of filters included in the first filter set.
  • when the first filter set includes a plurality of filters, a rule for switching among them is needed; this filter switching rule is encoded as picture_based_filter_group_information.
  • the HEVC WD5 discloses rules for switching up to 16 filters. Specifically, the 4 ⁇ 4 pixel area is classified into a maximum of 16 classes based on the gradient direction and activity of the pixel values of the 4 ⁇ 4 pixel area. Each of these classes specifies the filter to be applied. A filter corresponding to the class is applied to each 4 ⁇ 4 pixel region. Further, by integrating two or more of these 16 classes, the total number of classes (ie, the total number of filters) can be arbitrarily reduced. Then, in the HEVC WD5, the class integration information is encoded as picture_based_filter_group_information.
  • various information may be encoded as picture_based_filter_group_information.
  • if the rule for switching the filters is determined in advance, encoding of picture_based_filter_group_information can be omitted.
  • a design is required that allows the video decoding device to switch between a plurality of filters included in the first filter set.
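  • The following is only a rough, self-contained sketch of this kind of gradient/activity classification with class integration; the direction quantization, the activity threshold, and the class count are assumptions and differ from the actual HEVC WD5 rule.

    import numpy as np

    def classify_4x4(block4x4, num_direction_classes=4, num_activity_classes=4):
        """Assign a class index (0..15 here) from the dominant gradient direction
        and the activity (total gradient magnitude) of a 4x4 pixel area."""
        gy, gx = np.gradient(block4x4.astype(np.float64))
        angle = np.arctan2(gy.sum(), gx.sum())                      # dominant direction
        direction = int(((angle + np.pi) / (2 * np.pi)) * num_direction_classes) % num_direction_classes
        activity = np.abs(gx).sum() + np.abs(gy).sum()
        act_class = min(int(activity // 8), num_activity_classes - 1)   # threshold 8 is arbitrary
        return direction * num_activity_classes + act_class

    # Class integration: a mapping table assigns several classes to the same
    # filter, reducing the total number of filters that must be signalled.
    class_to_filter = [0] * 8 + [1] * 8          # 16 classes merged into 2 filters
    blk = np.arange(16, dtype=np.float64).reshape(4, 4)
    c = classify_4x4(blk)
    print(c, class_to_filter[c])
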
  • picture_based_filter_pred_information is information indicating a prediction method for encoding the first filter coefficient information 13.
  • the filter coefficient indicated by the first filter coefficient information 13 may be encoded as it is, or a prediction residual obtained by performing a prediction process on the filter coefficient may be encoded instead. In general, if an appropriate prediction process is performed, the absolute value of the prediction residual is smaller than the absolute value of the filter coefficient, so that the generated code amount decreases.
  • various prediction methods are disclosed in HEVC WD5. Specifically, a method can be prepared that predicts each filter coefficient of one filter in the filter set from the filter coefficient at the same position of another filter in the same set. Further, on the assumption that the gain of a filter (that is, the sum of its filter coefficients) is constant, a method can be prepared that predicts the remaining filter coefficients of a filter from a subset of its filter coefficients.
  • various information may be encoded as picture_based_filter_pred_information.
  • if the prediction method for the filter coefficients is determined in advance, encoding of picture_based_filter_pred_information can be omitted. In any case, a design that allows the video decoding device to derive the filter coefficient prediction method is required.
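  • Two toy examples of these prediction ideas (coefficient values are made up, and the actual signalling and fixed-point precision handling are not shown):

    import numpy as np

    def residual_from_other_filter(coeffs, reference_coeffs):
        """Predict each coefficient from the coefficient at the same position of
        another filter in the set; only the (small) residuals are encoded."""
        return coeffs - reference_coeffs

    def predict_last_from_gain(partial_coeffs, gain=1.0):
        """With the filter gain (sum of coefficients) assumed constant, the
        remaining coefficient follows from the others."""
        return gain - np.sum(partial_coeffs)

    f_ref  = np.array([0.05, 0.20, 0.50, 0.20, 0.05])
    f_curr = np.array([0.04, 0.22, 0.48, 0.22, 0.04])
    print(residual_from_other_filter(f_curr, f_ref))   # small residuals
    print(predict_last_from_gain(f_curr[:-1]))         # reconstructs ~0.04
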
  • picture_based_filter_coeff [i] [j] corresponds to the encoded first filter coefficient information 13. Specifically, information indicating the j-th filter coefficient of the i-th filter in the first filter set is encoded as picture_based_filter_coeff [i] [j].
  • the filter coefficient means a prediction residual when the above-described filter coefficient prediction is performed, and otherwise means a filter coefficient value itself.
  • PictureBasedFilterCoeffNum indicates the total number of filter coefficients included in each filter included in the first filter set.
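  • A loose sketch of the parsing order implied by this description of FIG. 7; the read function stands in for whatever entropy decoding each element actually uses, and the "total minus 1" coding of the filter count as well as the fixed coefficient count per filter are assumptions made only for illustration.

    PICTURE_BASED_FILTER_COEFF_NUM = 9          # e.g. a 3x3 filter; purely illustrative

    def parse_aps_filter_info(read):
        """Reads the APS elements in the order described for FIG. 7."""
        aps = {'aps_id': read('aps_id'),
               'picture_based_filter_flag': read('picture_based_filter_flag')}
        if aps['picture_based_filter_flag'] == 1:
            num = read('picture_based_filter_num_information') + 1   # assumed "minus 1" coding
            aps['group'] = read('picture_based_filter_group_information')
            aps['pred'] = read('picture_based_filter_pred_information')
            aps['coeff'] = [[read('picture_based_filter_coeff')
                             for _ in range(PICTURE_BASED_FILTER_COEFF_NUM)]
                            for _ in range(num)]
        return aps

    # Toy "bitstream": canned values just to exercise the parsing order.
    values = iter([0, 1, 1, 0, 0] + [3] * 18)
    aps = parse_aps_filter_info(lambda name: next(values))
    print(len(aps['coeff']), len(aps['coeff'][0]))   # 2 filters x 9 coefficients each
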
  • the entropy encoding unit 104 can use, for example, the syntax shown in FIG. 8 to encode the filter switching information 14 and the second filter coefficient information 15.
  • FIG. 8 illustrates information related to loop filter processing in the LCU level syntax of the syntax structure shown in FIG. 6.
  • lcu_based_filter_flag indicates whether or not loop filter processing (that is, the first filter set or the second filter set) is applied to the corresponding LCU. That is, lcu_based_filter_flag corresponds to a part of the filter switching information 14 encoded. If lcu_based_filter_flag is 1, information on loop filter processing applied to the corresponding LCU is further encoded.
  • new_filter_flag indicates which of the first filter set and the second filter set is applied to the corresponding LCU. That is, new_filter_flag corresponds to a part of the filter switching information 14 encoded. If new_filter_flag is 1, information on the second filter set applied to the corresponding LCU (including the second filter coefficient information 15) is further encoded. On the other hand, if new_filter_flag is 0, no further information needs to be encoded since the first filter set is applied to the corresponding LCU.
  • lcu_based_filter_num_information is information indicating the total number of filters included in the second filter set applied to the corresponding LCU.
  • the total number of filters included in the second filter set is expressed as LCUBasedFilterNum.
  • when the second filter set includes a plurality of filters, the rule for switching among them is encoded as lcu_based_filter_group_information.
  • lcu_based_filter_pred_information is information indicating a prediction method for encoding the second filter coefficient information 15.
  • a method of predicting the second filter coefficient information 15 based on the first filter coefficient information 13 can also be used.
  • lcu_based_filter_coeff [i] [j] corresponds to the encoded second filter coefficient information 15 corresponding to the pixel block to be processed. Specifically, information indicating the j-th filter coefficient of the i-th filter in the second filter set is encoded as lcu_based_filter_coeff [i] [j].
  • the filter coefficient means a prediction residual when the above-described filter coefficient prediction is performed, and otherwise means a filter coefficient value itself.
  • LCUBasedFilterCoeffNum indicates the total number of filter coefficients included in each filter included in the second filter set.
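  • A similarly loose sketch of the per-LCU parsing order implied by this description of FIG. 8; as before, the descriptors and the counts are assumptions for illustration only.

    LCU_BASED_FILTER_COEFF_NUM = 5              # illustrative; may differ from the picture-level filters

    def parse_lcu_filter_info(read):
        """Reads the LCU-level elements in the order described for FIG. 8."""
        info = {'lcu_based_filter_flag': read('lcu_based_filter_flag')}
        if info['lcu_based_filter_flag'] == 0:
            info['mode'] = 'no_loop_filter'               # loop filtering skipped
            return info
        info['new_filter_flag'] = read('new_filter_flag')
        if info['new_filter_flag'] == 0:
            info['mode'] = 'first_filter_set'             # picture-level set, nothing more coded
            return info
        info['mode'] = 'second_filter_set'
        num = read('lcu_based_filter_num_information') + 1        # assumed "minus 1" coding
        info['group'] = read('lcu_based_filter_group_information')
        info['pred'] = read('lcu_based_filter_pred_information')
        info['coeff'] = [[read('lcu_based_filter_coeff')
                          for _ in range(LCU_BASED_FILTER_COEFF_NUM)]
                         for _ in range(num)]
        return info

    values = iter([1, 0])
    print(parse_lcu_filter_info(lambda name: next(values))['mode'])   # 'first_filter_set'
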
  • the second filter set is set in units of LCUs. Therefore, the syntax information related to the second filter set has a larger influence on the overhead of the encoded data 17 than the syntax information related to the first filter set. Therefore, the overhead of the encoded data 17 can be effectively reduced by appropriately reducing the size of the syntax information related to the second filter set.
  • if the total number of filters included in the second filter set is fixed to 1, it is not necessary to encode lcu_based_filter_num_information and lcu_based_filter_group_information. Furthermore, since the total number of filters is minimized, the total number of filter coefficients is also minimized.
  • the shape of each filter included in the first filter set may not match the shape of each filter included in the second filter set.
  • since the first filter set is set in units of pictures, a larger overhead is allowed for it than for the second filter set. That is, the total number of filter coefficients included in each filter of the first filter set may be larger than the total number of filter coefficients included in each filter of the second filter set.
  • the shape of each filter may also be changed according to a predetermined rule, and information indicating the shape of each filter may be signaled. At this time, for example, if the shape of each filter included in the second filter set is a subset of the shape of each filter included in the first filter set, the circuit scale for realizing the filter processing can also be reduced.
  • the entropy encoding unit 104 may use, for example, the syntax shown in FIG. 9 instead of FIG. 8 in order to encode the filter switching information 14 and the second filter coefficient information 15.
  • FIG. 9 illustrates information related to loop filter processing in the LCU level syntax of the syntax structure shown in FIG. 6.
  • in the syntax of FIG. 8, the syntax information regarding the second filter set is newly encoded whenever the second filter set is applied to an LCU.
  • however, the second filter set applied to an LCU may be the same as the second filter set applied to an already encoded LCU. The syntax of FIG. 9 therefore allows, in such a case, referring to the syntax information regarding the second filter set corresponding to the already encoded LCU. Thus, according to the syntax of FIG. 9, the encoding of the syntax information regarding the second filter set applied to the current LCU can be simplified.
  • the roles of lcu_based_filter_flag, lcu_based_filter_num_information, LCUBasedFilterNum, lcu_based_filter_group_information, lcu_based_filter_pred_information, LCUBasedFilterCoeffNum, and lcu_based_filter_coeff[i][j] are identical or similar to those in FIG. 8.
  • new_filter_flag is information indicating whether or not the second filter set is applied to the corresponding LCU and syntax information related to the second filter set needs to be newly encoded. That is, new_filter_flag corresponds to a part of the filter switching information 14 encoded. If new_filter_flag is 1, information on the second filter set applied to the corresponding LCU (including the second filter coefficient information 15) is further encoded.
  • if new_filter_flag is 0, either the first filter set or the same second filter set as that of an already encoded LCU is applied to the corresponding LCU.
  • in that case, which filter set is applied to the LCU is determined by picture_based_filter_flag. That is, in the syntax of FIG. 9, picture_based_filter_flag corresponds to a part of the filter switching information 14 encoded. If picture_based_filter_flag is 1, no further information needs to be encoded since the first filter set is applied to the corresponding LCU.
  • otherwise, stored_filter_idx is encoded as identification information for referring to the syntax information regarding the second filter set corresponding to an already encoded LCU, whereby the encoding of the syntax information regarding the second filter set applied to the current LCU can be simplified. Note that the syntax information referred to by stored_filter_idx must be common between the video encoding device and the video decoding device.
  • FIG. 10 shows a modification of the syntax of FIG.
  • the roles of lcu_based_filter_flag, lcu_based_filter_num_information, LCUBasedFilterNum, lcu_based_filter_group_information, lcu_based_filter_pred_information, LCUBasedFilterCoeffNum, and lcu_based_filter_coeff[i][j] are identical or similar to those in FIG. 9.
  • the role of stored_filter_idx is also the same as or similar to that in FIG. 9.
  • picture_based_filter_flag is information indicating whether the first filter set or the second filter set is applied to the corresponding LCU. That is, in the syntax of FIG. 10, picture_based_filter_flag corresponds to a part of the filter switching information 14 encoded. If picture_based_filter_flag is 1, no further information needs to be encoded since the first filter set is applied to the corresponding LCU. On the other hand, if picture_based_filter_flag is 0, new_filter_flag is further encoded.
  • new_filter_flag is information indicating whether or not syntax information related to the second filter set applied to the corresponding LCU needs to be newly encoded. If new_filter_flag is 1, information on the second filter set applied to the corresponding LCU (including the second filter coefficient information 15) is further encoded. If new_filter_flag is 0, the same second filter set as that of an already encoded LCU is applied to the corresponding LCU; therefore, by encoding stored_filter_idx, the encoding of the syntax information regarding the second filter set applied to the corresponding LCU can be simplified.
  • FIG. 11 also shows a modification of the syntax of FIG.
  • the roles of lcu_based_filter_flag, lcu_based_filter_num_information, LCUBasedFilterNum, lcu_based_filter_group_information, lcu_based_filter_pred_information, LCUBasedFilterCoeffNum, and lcu_based_filter_coeff[i][j] are identical or similar to those in FIG. 9.
  • the role of new_filter_flag is the same as or similar to that in FIG. 10.
  • in the syntax of FIG. 11, the picture_based_filter_flag that is encoded in the syntaxes of FIG. 9 and FIG. 10 is not encoded.
  • instead, one particular value of stored_filter_idx (for example, 0) is used to refer to the syntax information regarding the first filter set. That is, in the syntax of FIG. 11, stored_filter_idx corresponds to a part of the filter switching information 14 encoded. For example, if new_filter_flag is 0 and stored_filter_idx is 0, the first filter set is applied to the corresponding LCU.
  • the moving image encoding device and the moving image decoding device require a storage unit (for example, a buffer) that stores the already encoded second filter coefficient information 15.
  • Constraints can be imposed on the second filter coefficient information 15 to be saved in order to save the capacity of the storage unit.
  • the second filter coefficient information 15 may be stored in the storage unit only for a maximum of N second filter sets within the same picture or the same LCU line, for example. In this case, any one of a maximum of N second filter sets can be referred to in each LCU.
  • the (N + 1)th and subsequent second filter sets may simply not be made referable, or may be made referable by overwriting part of the second filter coefficient information 15 stored in the storage unit. That is, when second filter coefficient information 15 is newly stored in a state where the second filter coefficient information 15 of N second filter sets is already stored in the storage unit, the second filter coefficient information 15 of any one of the stored second filter sets may be overwritten.
  • the second filter coefficient information 15 to be overwritten may be, for example, the most recently stored information, the least frequently referenced information, or the most recently referenced information. In any case, a design is required so that the moving picture decoding apparatus can identify the second filter coefficient information 15 of the second filter set applied to each LCU.
  • in the syntax of FIG. 11, the first filter coefficient information 13 is also referred to using stored_filter_idx. Therefore, the same restrictions as for the second filter coefficient information 15 can be imposed on the first filter coefficient information 13. That is, the storage unit may store filter coefficient information (that is, the first filter coefficient information 13 or the second filter coefficient information 15) for at most N filter sets (first filter sets or second filter sets) within a picture or an LCU line. However, as described above, if the first filter set is set semi-optimally for the entire picture, many LCUs refer to the first filter coefficient information 13, so the overhead of the encoded data 17 can be effectively reduced. Therefore, the first filter coefficient information 13 may be excluded from overwriting.
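  • A sketch, under the stated constraint of at most N stored second filter sets, of one possible bookkeeping scheme; the replacement policy used here (drop the earliest-stored entry) is just one option, and the identical rule would have to run in the decoder for stored_filter_idx to remain meaningful.

    from collections import OrderedDict

    class SecondFilterStore:
        """Bounded store of already-encoded second filter coefficient information."""
        def __init__(self, capacity):
            self.capacity = capacity
            self.entries = OrderedDict()          # index -> filter coefficients
            self.next_idx = 0

        def store(self, coeffs):
            if len(self.entries) >= self.capacity:
                self.entries.popitem(last=False)  # overwrite the earliest-stored entry
            idx = self.next_idx
            self.entries[idx] = coeffs
            self.next_idx += 1
            return idx                            # what stored_filter_idx would refer to

        def lookup(self, stored_filter_idx):
            return self.entries[stored_filter_idx]

    store = SecondFilterStore(capacity=2)
    store.store([1, 2, 3]); store.store([4, 5, 6]); store.store([7, 8, 9])
    print(list(store.entries))                    # [1, 2]: the earliest entry (index 0) was dropped
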
  • the signaled information may be encoded in SPS, PPS, APS, slice header or LCU data.
  • picture_based_filter_flag, lcu_based_filter_flag, and new_filter_flag may be flags or other information.
  • a plurality of pieces of syntax information can be specified together by an index.
  • syntax information may be generated for each component, or common syntax information may be generated between two or more components.
  • the syntax structure may be different for each component.
  • the syntax structure of a certain component may be obtained by changing some syntax elements of the syntax structure of another component, or by deleting some syntax elements from the syntax structure of another component.
  • a part of syntax elements may be added to the syntax structure of other components.
  • the first filter coefficient information 13 is encoded in the APS, and the filter switching information 14 and the second filter coefficient information 15 are encoded in the LCU level syntax.
  • the first filter coefficient information 13 may be encoded in a PPS or slice header instead of APS.
  • the first filter coefficient information 13 may be encoded in an LCU syntax corresponding to a predetermined LCU (for example, a head LCU in a picture or a slice).
  • the moving picture encoding apparatus in FIG. 1 can perform encoding processing as follows, for example. Specifically, the prediction unit 101, the subtraction unit 102, the transform / quantization unit 103, the inverse quantization / inverse transform unit 105, and the addition unit 106 can operate as follows.
  • the prediction unit 101 generates the predicted image 11 based on the loop filter processed image 19, for example.
  • the subtraction unit 102 generates a prediction error image 12 by subtracting the prediction image 11 from the input image 10.
  • the transform / quantization unit 103 performs transform and quantization on the prediction error image 12 to generate a quantized transform coefficient.
  • the quantized transform coefficient is encoded by the entropy encoding unit 104.
  • the inverse quantization / inverse transform unit 105 generates a prediction error image by performing inverse quantization and inverse transform on the quantized transform coefficient.
  • the addition unit 106 generates a decoded image 16 by adding the prediction error image to the prediction image 11.
  • This operation corresponds to so-called hybrid coding including prediction processing and conversion processing.
  • the moving picture encoding apparatus does not necessarily have to perform hybrid encoding.
  • for example, the hybrid coding may be replaced with DPCM (Differential Pulse Code Modulation); in that case, prediction processing based on neighboring pixels may be performed, and processing that becomes unnecessary may be omitted.
  • the entropy encoding unit 104, the loop filter information generation unit 107, the loop filter processing unit 108, and the first filter setting unit 109 operate as shown in FIG. 12, for example.
  • the process of FIG. 12 is performed in units of pictures.
  • the entropy encoding unit 104, the loop filter information generation unit 107, and the loop filter processing unit 108 obtain the first filter coefficient information 13 set for the picture from the first filter buffer 110 (step S101).
  • the entropy encoding unit 104 encodes the first filter coefficient information 13 acquired in step S101 according to the syntax of FIG. 7, for example (step S102).
  • next, an encoding process is performed on an unencoded LCU in the picture (step S103).
  • the encoding process performed in step S103 is, for example, the above-described hybrid encoding. That is, the entropy encoding unit 104 encodes the quantized transform coefficients and the encoding parameters of the processing target LCU.
  • the filter setting information generation unit 112 generates filter setting information 18 of the processing target LCU based on the input image 10 and the decoded image 16 (step S104).
  • the second filter setting unit 113 sets the second filter coefficient information 15 for the LCU to be processed based on the filter setting information 18 generated in step S104 (step S105).
  • next, the filter switching information generation unit 114 generates the filter switching information 14 of the LCU to be processed based on the input image 10, the decoded image 16, the first filter coefficient information 13 acquired in step S101, and the second filter coefficient information 15 generated in step S105 (step S106).
  • the filter switching information 14 is information indicating which of the first filter set and the second filter set is applied to the processing target LCU.
  • if the filter switching information 14 generated in step S106 indicates that the first filter set is applied to the LCU to be processed, the process proceeds to step S108 (step S107). On the other hand, if it indicates that the second filter set is applied to the LCU to be processed, the process proceeds to step S110 (step S107).
  • in step S108, the loop filter processing unit 108 generates the loop filter processed image 19 by applying the first filter set to the LCU to be processed using the first filter coefficient information 13 acquired in step S101. Further, the entropy encoding unit 104 encodes the filter switching information 14 generated in step S106 according to the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11, for example (step S109).
  • if the encoding process for all the LCUs in the picture has been completed when steps S108 and S109 are completed, the process proceeds to step S113 (step S112). On the other hand, if the encoding process for all the LCUs in the picture has not been completed, the process returns to step S103 (step S112).
  • in step S110, the loop filter processing unit 108 generates the loop filter processed image 19 by applying the second filter set to the LCU to be processed using the second filter coefficient information 15 set in step S105. Further, the entropy encoding unit 104 encodes the filter switching information 14 generated in step S106 and the second filter coefficient information 15 set in step S105 according to the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11, for example (step S111).
  • if the encoding process for all the LCUs in the picture has been completed when steps S110 and S111 are completed, the process proceeds to step S113 (step S112). On the other hand, if the encoding process for all the LCUs in the picture has not been completed, the process returns to step S103 (step S112).
  • in step S113, the first filter setting unit 109 sets the first filter coefficient information 13 to be used for a subsequent picture based on the filter setting information 18 of each LCU generated by repeating step S104.
  • the first filter setting unit 109 stores the first filter coefficient information 13 set in step S113 in the first filter buffer 110, and the process ends (step S114).
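  • A heavily simplified, runnable sketch of the flow of FIG. 12; the offset-only "filters", the bit estimates, and the averaging used to derive first filter coefficients for a subsequent picture are stand-ins chosen for brevity, not the procedures of the embodiment.

    import numpy as np

    def fit_offset_filter(lcu, orig):
        """Offset-only 'filter': a single coefficient added to every pixel."""
        return np.array([np.mean(orig - lcu)])

    def apply_offset(lcu, coeffs):
        return lcu + coeffs[0]

    def ssd(a, b):
        return float(np.sum((a - b) ** 2))

    def encode_picture(lcus, originals, first_coeffs, lam=1.0):
        bitstream = [('aps_first_filter', list(first_coeffs))]       # S101-S102
        setting_info = []
        for lcu, orig in zip(lcus, originals):                       # S103 (hybrid coding omitted)
            second_coeffs = fit_offset_filter(lcu, orig)             # S104-S105
            d1 = ssd(apply_offset(lcu, first_coeffs), orig)          # S106-S107: cost-based choice
            d2 = ssd(apply_offset(lcu, second_coeffs), orig)
            use_second = d2 + lam * 8 < d1 + lam * 1                 # 8 vs 1 bit: rough estimates
            if use_second:                                           # S110-S111
                bitstream += [('lcu_switch', 'second'), ('lcu_second_filter', list(second_coeffs))]
            else:                                                    # S108-S109
                bitstream.append(('lcu_switch', 'first'))
            setting_info.append(second_coeffs)
        next_first = np.mean(setting_info, axis=0)                   # S113-S114: crude derivation
        return bitstream, next_first

    rng = np.random.default_rng(1)
    originals = [rng.normal(loc=10, size=(8, 8)) for _ in range(4)]
    decoded = [o - 0.5 + rng.normal(scale=0.1, size=(8, 8)) for o in originals]
    bits, nxt = encode_picture(decoded, originals, first_coeffs=np.array([0.0]))
    print([b[0] for b in bits], nxt)
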
  • as described above, the moving picture encoding apparatus according to the first embodiment selectively applies, to each pixel block, either the first filter set set in units of pictures or the second filter set set in units of pixel blocks. Therefore, according to this moving image encoding apparatus, the overhead of the encoded data can be effectively reduced. Further, the first filter coefficient information can be encoded before the encoding of each pixel block starts. Therefore, according to this moving image encoding device, the encoded data of each pixel block can be sequentially output, so that the delay associated with the moving image encoding process can be reduced.
  • the moving picture decoding apparatus includes a moving picture decoding unit 2000 and a decoding control unit 207.
  • the moving image decoding unit 2000 includes an entropy decoding unit 201, an inverse quantization / inverse transform unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, and a loop filter processing unit 206.
  • the decoding control unit 207 controls the operation of each unit of the moving image decoding unit 2000.
  • the entropy decoding unit 201 inputs the encoded data 20 from the outside of the video decoding device (for example, a communication system or a storage system).
  • the encoded data 20 is the same as or similar to the encoded data 17 described above.
  • the entropy decoding unit 201 performs entropy decoding on the encoded data 20, thereby generating quantized transform coefficients, encoding parameters, first filter coefficient information 22, filter switching information 23, and second filter coefficient information 24.
  • the entropy decoding unit 201 may use, for example, the syntax of FIG. 7 in order to decode the first filter coefficient information 22. Further, the entropy decoding unit 201 may use, for example, the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11 to decode the filter switching information 23 and the second filter coefficient information 24.
  • the first filter coefficient information 22 may be the same as or similar to the first filter coefficient information 13.
  • the filter switching information 23 may be the same as or similar to the filter switching information 14.
  • the second filter coefficient information 24 may be the same as or similar to the second filter coefficient information 15.
  • The entropy decoding unit 201 outputs the quantized transform coefficients to the inverse quantization / inverse transform unit 202, outputs the encoding parameters to the decoding control unit 207, outputs the first filter coefficient information 22 to the first filter buffer 205, and outputs the filter switching information 23 and the second filter coefficient information 24 to the loop filter processing unit 206.
  • the inverse quantization / inverse transform unit 202 inputs the quantized transform coefficient from the entropy decoding unit 201.
  • the inverse quantization / inverse transform unit 202 inversely quantizes the quantized transform coefficient to obtain a transform coefficient.
  • the inverse quantization / inverse transform unit 202 performs an inverse transform process on the transform coefficient to obtain a prediction error image.
  • the inverse quantization / inverse transform unit 202 outputs the prediction error image to the addition unit 204.
  • The inverse quantization / inverse transformation unit 202 performs the same or similar processing as the above-described inverse quantization / inverse transformation unit 105. That is, the inverse quantization is performed based on the quantization parameter set by the decoding control unit 207. Further, the inverse transform process is determined by the transform process performed on the encoding side. For example, the inverse transform process is IDCT, inverse wavelet transform, or the like.
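  • The following is a minimal numpy sketch of the kind of processing such an inverse quantization / inverse transform unit performs, assuming a flat quantization step and an orthonormal 2D DCT; the actual unit follows the quantization parameter set by the decoding control unit 207 and the transform chosen on the encoding side.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix C, so that X = C @ x @ C.T and x = C.T @ X @ C."""
    k = np.arange(n).reshape(-1, 1)
    i = np.arange(n).reshape(1, -1)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def inverse_quantize_and_transform(qcoeff, qstep):
    """Dequantize the quantized transform coefficients and apply a 2D inverse DCT."""
    coeff = qcoeff * qstep            # inverse quantization with a flat step (an assumption)
    c = dct_matrix(qcoeff.shape[0])
    return c.T @ coeff @ c            # 2D IDCT -> reconstructed prediction error block

# Round trip on a random 8x8 prediction error block: the reconstruction error stays
# within the bound implied by the quantization step.
rng = np.random.default_rng(1)
err = rng.normal(0, 10, (8, 8))
c = dct_matrix(8)
qstep = 4.0
qcoeff = np.round((c @ err @ c.T) / qstep)   # forward transform + quantization (encoder side)
print(np.max(np.abs(inverse_quantize_and_transform(qcoeff, qstep) - err)))
```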
  • the predicted image generation unit 203 performs output image prediction processing in units of pixel blocks, for example, and generates a predicted image.
  • the predicted image generation unit 203 may perform output image prediction processing based on a loop filter processed image 25 described later.
  • the predicted image generation unit 203 performs the same or similar processing as the predicted image generation unit 101 described above.
  • the predicted image generation unit 203 outputs the predicted image to the adding unit 204.
  • the addition unit 204 inputs a prediction image from the prediction image generation unit 203 and inputs a prediction error image from the inverse quantization / inverse conversion unit 202.
  • the adding unit 204 adds the prediction error image to the prediction image to generate the decoded image 21.
  • the adding unit 204 outputs the decoded image 21 to the loop filter processing unit 206.
  • the first filter coefficient information 22 stored in the first filter buffer 205 is read by the loop filter processing unit 206 as necessary. Note that if the loop filter processing unit 206 has a function of storing the first filter coefficient information 22, the first filter buffer 205 may be omitted. In such a case, the entropy decoding unit 201 outputs the first filter coefficient information 22 to the loop filter processing unit 206.
  • The loop filter processing unit 206 receives the decoded image 21 from the adding unit 204 in units of pixel blocks, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
  • The loop filter processing unit 206 applies one of the first filter set and the second filter set based on the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 206 performs a loop filter process on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block in the loop filter processed image 25. If the filter switching information 23 indicates that the loop filter process is not applied to the pixel block to be processed, the loop filter processing unit 206 omits the loop filter process for that pixel block. That is, the loop filter processing unit 206 performs the same or similar loop filter processing as the loop filter processing unit 108.
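  • As an illustration of the switching behaviour just described, the following sketch assumes each filter set consists of a single 3x3 filter and that the switching information has already been decoded into a simple three-valued flag; the real filter shapes and signalling are defined by the encoded data (FIG. 8 to FIG. 11), not by this code.

```python
import numpy as np

def filter_block(block, coeffs):
    """Product-sum of a 3x3 filter over every pixel of the block (edge padding)."""
    padded = np.pad(block, 1, mode="edge")
    out = np.zeros_like(block, dtype=np.float64)
    for dy in range(3):
        for dx in range(3):
            out += coeffs[dy, dx] * padded[dy:dy + block.shape[0], dx:dx + block.shape[1]]
    return out

def loop_filter_block(decoded_block, switching, first_coeffs, second_coeffs=None):
    """Apply the first or the second filter set, or skip filtering, per the switching information."""
    if switching == "off":        # loop filtering is not applied to this pixel block
        return decoded_block.copy()
    if switching == "first":      # picture-level set (first filter coefficient information 22)
        return filter_block(decoded_block, first_coeffs)
    return filter_block(decoded_block, second_coeffs)  # block-level set (second information 24)

block = np.arange(64, dtype=np.float64).reshape(8, 8)
first_set = np.full((3, 3), 1.0 / 9.0)
second_set = np.zeros((3, 3))
second_set[1, 1] = 1.0
print(loop_filter_block(block, "first", first_set)[0, 0])
print(loop_filter_block(block, "second", first_set, second_set)[0, 0])
```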
  • the loop filter processing unit 206 gives the loop filter processed image 25 as an output image to the outside (for example, a display system) of the moving image decoding apparatus. Further, the loop filter processed image 25 may be stored in a storage unit (not shown) (for example, a buffer) accessible by the predicted image generation unit 203. The loop filter processed image 25 is read as a reference image by the predicted image generation unit 203 as necessary, and is used for the prediction process.
  • the decoding control unit 207 receives the encoding parameter from the entropy decoding unit 201.
  • the decoding control unit 207 performs decoding timing control, encoding block division control, quantization control, mode control, and the like based on the encoding parameters.
  • For example, the moving picture decoding apparatus in FIG. 13 can perform the decoding process as follows.
  • The entropy decoding unit 201 decodes the first filter coefficient information 22 set for the picture, for example in accordance with the syntax of FIG. 7. Then, the entropy decoding unit 201 starts decoding the pixel blocks in the picture. Specifically, the entropy decoding unit 201 decodes the quantized transform coefficients and the encoding parameters of the pixel block. Furthermore, the entropy decoding unit 201 decodes the filter switching information 23 of the pixel block according to the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11. The entropy decoding unit 201 further decodes the second filter coefficient information 24 if the filter switching information 23 indicates that the second filter set is applied to the pixel block.
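  • The decoding order described above can be illustrated with the following toy parser, in which the "bitstream" is modelled as a list of already entropy-decoded symbols; the symbol names and values are illustrative assumptions rather than the actual syntax elements of FIG. 6 to FIG. 11.

```python
def parse_picture(symbols):
    """Parse order: picture-level first filter coefficients once, then, per pixel block,
    switching information followed (only when the second set is chosen) by second
    filter coefficient information."""
    it = iter(symbols)
    first_coeffs = next(it)                   # decoded once per picture (cf. FIG. 7)
    blocks = []
    for switching in it:                      # one switching symbol per pixel block
        if switching == "second":
            blocks.append(("second", next(it)))   # block-level coefficients follow
        else:
            blocks.append((switching, None))      # "first" or "off": nothing extra
    return first_coeffs, blocks

stream = [[0.1, 0.8, 0.1],                    # first filter coefficient information
          "first",                            # LCU 0: use the picture-level set
          "second", [0.0, 1.0, 0.0],          # LCU 1: use a block-level set
          "off"]                              # LCU 2: no loop filtering
print(parse_picture(stream))
```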
  • The quantized transform coefficients and encoding parameters of the pixel block are processed based on the hybrid coding described above. That is, the inverse quantization / inverse transform unit 202 generates a prediction error image by performing inverse quantization and inverse transform on the quantized transform coefficients.
  • The predicted image generation unit 203 generates a predicted image based on, for example, the loop filter processed image 25.
  • the adding unit 204 generates the decoded image 21 by adding the prediction error image to the prediction image.
  • the loop filter processing unit 206 applies one of the first filter set and the second filter set based on the filter switching information 23 corresponding to the pixel block in the decoded image 21.
  • the loop filter processing unit 206 generates a pixel block in the loop filter processed image 25 by performing a loop filter process on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24.
  • the video decoding device generates an output image by decoding the encoded data from the video encoding device according to the first embodiment. Therefore, according to this moving picture decoding apparatus, the overhead of encoded data can be effectively reduced.
  • the video encoding apparatus includes a video encoding unit 3000 and an encoding control unit 111.
  • The moving image coding unit 3000 includes a predicted image generation unit 101, a subtraction unit 102, a transform / quantization unit 103, an entropy coding unit 104, an inverse quantization / inverse transform unit 105, an addition unit 106, a loop filter information generation unit 307, a loop filter processing unit 308, a first filter setting unit 109, a first filter buffer 110, a deblocking filter processing unit 315, and an SAO (Sample Adaptive Offset) processing unit 316.
  • the encoding control unit 111 controls the operation of each unit of the moving image encoding unit 3000.
  • The entropy encoding unit 104 in FIG. 14 differs from the entropy encoding unit 104 in FIG. 1 in that the filter switching information 14 and the second filter coefficient information 15 are input from the loop filter information generation unit 307 instead of the loop filter information generation unit 107.
  • The addition unit 106 in FIG. 14 differs from the addition unit 106 in FIG. 1 in that the decoded image 16 is output to the deblocking filter processing unit 315 instead of the loop filter information generation unit 107 and the loop filter processing unit 108.
  • The first filter setting unit 109 in FIG. 14 differs from the first filter setting unit 109 in FIG. 1 in that the filter setting information 18 is input from the loop filter information generation unit 307 instead of the loop filter information generation unit 107.
  • The loop filter information generation unit 307 acquires the input image 10 in units of pixel blocks from the outside of the video encoding device, inputs the first filter coefficient information 13 from the first filter buffer 110, and inputs an SAO processed image, described later, from the SAO processing unit 316 in units of pixel blocks.
  • The loop filter information generation unit 307 generates filter setting information 18 corresponding to the pixel block to be processed based on the SAO processed image and the input image 10. Further, the loop filter information generation unit 307 generates the second filter coefficient information 15 corresponding to the pixel block to be processed based on the filter setting information 18. Further, the loop filter information generation unit 307 generates filter switching information 14 corresponding to the pixel block to be processed based on the SAO processed image, the input image 10, the first filter coefficient information 13, and the second filter coefficient information 15.
  • the loop filter information generation unit 307 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104 and the loop filter processing unit 308.
  • the loop filter information generation unit 307 outputs the filter setting information 18 to the first filter setting unit 109.
  • The loop filter processing unit 308 inputs the SAO processed image from the SAO processing unit 316 in units of pixel blocks, receives the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 307, and receives the first filter coefficient information 13 from the first filter buffer 110.
  • The loop filter processing unit 308 applies one of the first filter set and the second filter set based on the filter switching information 14 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 308 performs a loop filter process on the pixel block to be processed using the first filter coefficient information 13 or the second filter coefficient information 15, thereby generating a pixel block in the loop filter processed image 19. If the filter switching information 14 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 308 omits the loop filter processing for that pixel block.
  • the loop filter processed image 19 may be stored in a storage unit (not shown) (for example, a buffer) accessible by the predicted image generation unit 101.
  • the loop filter processed image 19 is read as a reference image by the predicted image generation unit 101 as necessary, and is used for the prediction process.
  • the deblocking filter processing unit 315 inputs the decoded image 16 from the addition unit 106.
  • the deblocking filter processing unit 315 performs a deblocking filter process on the decoded image 16 to generate a deblocking filter process image.
  • the deblocking filter processing unit 315 outputs the deblocking filter processing image to the SAO processing unit 316.
  • the deblocking filter process performed by the deblocking filter processing unit 315 may be a process of applying a smoothing filter to a block boundary in the decoded image 16, for example.
  • This deblocking filter processing generally brings about an image quality improvement effect such as suppression of block distortion included in the decoded image 16.
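  • The deblocking operation is characterized here only as a smoothing filter applied to block boundaries. As an illustration, the following sketch smooths the two pixels on either side of each vertical block boundary with fixed weights; this is a deliberate simplification and not the actual deblocking filter of any particular standard.

```python
import numpy as np

def deblock_vertical_edges(img, block=8):
    """Smooth the pixels adjacent to each vertical block boundary with fixed weights."""
    out = img.astype(np.float64).copy()
    h, w = img.shape
    for x in range(block, w, block):          # boundary between columns x-1 and x
        left, right = img[:, x - 1], img[:, x]
        out[:, x - 1] = 0.75 * left + 0.25 * right
        out[:, x] = 0.25 * left + 0.75 * right
    return out

img = np.zeros((8, 16))
img[:, 8:] = 100.0                            # artificial blocking step at x = 8
print(deblock_vertical_edges(img)[0, 6:10])   # the step is softened across the boundary
```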
  • the SAO processing unit 316 inputs a deblocking filter processing image from the deblocking filter processing unit 315.
  • the SAO processing unit 316 generates a SAO processed image by performing SAO processing on the deblocking filter processed image.
  • the SAO processing unit 316 outputs the SAO processed image to the loop filter information generation unit 307 and the loop filter processing unit 308.
  • The SAO processing performed by the SAO processing unit 316 may, for example, set an offset value for each pixel in the deblocking filter processed image based on a comparison of the pixel value of that pixel with the pixel values of its surrounding pixels, and add the offset value to the pixel value of each pixel.
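  • One common form of such processing is an edge-offset classification; the following sketch compares each pixel with its left and right neighbours and adds a category-dependent offset. The offset values are fixed constants here purely for illustration and are not taken from the document.

```python
import numpy as np

def sao_edge_offset(img, offsets):
    """Compare each pixel with its left/right neighbours and add a category-dependent
    offset (offsets[0] for local minima, offsets[1] for local maxima)."""
    out = img.astype(np.float64).copy()
    left, cur, right = img[:, :-2], img[:, 1:-1], img[:, 2:]
    local_min = (cur < left) & (cur < right)
    local_max = (cur > left) & (cur > right)
    out[:, 1:-1] += np.where(local_min, offsets[0], 0.0)
    out[:, 1:-1] += np.where(local_max, offsets[1], 0.0)
    return out

row = np.array([[10.0, 30.0, 10.0, 20.0, 5.0, 20.0]])
print(sao_edge_offset(row, offsets=(+2.0, -2.0)))   # valleys pulled up, peaks pulled down
```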
  • As described above, this moving picture encoding apparatus performs loop filter processing that is the same as or similar to that of the moving picture encoding apparatus according to the first embodiment, on an image obtained by applying deblocking filter processing and SAO processing to the decoded image, instead of on the decoded image itself. Therefore, according to this moving picture encoding apparatus, the image quality of the predicted image can be improved and the encoding efficiency can be improved.
  • the moving picture decoding apparatus includes a moving picture decoding unit 4000 and a decoding control unit 207.
  • The video decoding unit 4000 includes an entropy decoding unit 201, an inverse quantization / inverse transformation unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, a loop filter processing unit 406, a deblocking filter processing unit 408, and an SAO processing unit 409.
  • the decoding control unit 207 controls the operation of each unit of the moving image decoding unit 4000.
  • The entropy decoding unit 201 in FIG. 15 differs from the entropy decoding unit 201 in FIG. 13 in that the filter switching information 23 and the second filter coefficient information 24 are output to the loop filter processing unit 406 instead of the loop filter processing unit 206.
  • the addition unit 204 in FIG. 15 is different from the addition unit 204 in FIG. 13 in that the decoded image 21 is output to the deblocking filter processing unit 408 instead of the loop filter processing unit 206.
  • The loop filter processing unit 406 receives the SAO processed image from the SAO processing unit 409 in units of pixel blocks, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
  • The loop filter processing unit 406 applies one of the first filter set and the second filter set based on the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 406 performs a loop filter process on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block in the loop filter processed image 25. If the filter switching information 23 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 406 omits the loop filter processing for that pixel block.
  • the loop filter processing unit 406 gives the loop filter processed image 25 as an output image to the outside of the video decoding device (for example, a display system). Further, the loop filter processed image 25 may be stored in a storage unit (not shown) (for example, a buffer) accessible by the predicted image generation unit 203. The loop filter processed image 25 is read as a reference image by the predicted image generation unit 203 as necessary, and is used for the prediction process.
  • the deblocking filter processing unit 408 inputs the decoded image 21 from the adding unit 204.
  • the deblocking filter processing unit 408 generates a deblocking filter processed image by performing a deblocking filter process on the decoded image 21.
  • the deblocking filter processing unit 408 outputs the deblocking filter processing image to the SAO processing unit 409. Note that the deblocking filter processing unit 408 performs the same or similar deblocking filter processing as the deblocking filter processing unit 315.
  • the SAO processing unit 409 inputs a deblocking filter processing image from the deblocking filter processing unit 408.
  • the SAO processing unit 409 generates a SAO processed image by performing SAO processing on the deblocking filter processed image.
  • the SAO processing unit 409 outputs the SAO processed image to the loop filter processing unit 406. Note that the SAO processing unit 409 performs the same or similar SAO processing as the SAO processing unit 316.
  • As described above, this moving picture decoding apparatus performs loop filter processing that is the same as or similar to that of the moving picture decoding apparatus according to the first embodiment, on an image obtained by applying deblocking filter processing and SAO processing to the decoded image, instead of on the decoded image itself. Therefore, according to this moving picture decoding apparatus, the image quality of the predicted image can be improved and the encoding efficiency can be improved.
  • the moving picture encoding apparatus includes a moving picture encoding unit 5000 and an encoding control unit 111.
  • The moving image encoding unit 5000 includes a predicted image generation unit 101, a subtraction unit 102, a transform / quantization unit 103, an entropy encoding unit 104, an inverse quantization / inverse transform unit 105, an addition unit 106, a loop filter information generation unit 507, a loop filter processing unit 508, a first filter setting unit 109, a first filter buffer 110, a deblocking filter processing unit 315, and an SAO processing unit 316.
  • the encoding control unit 111 controls the operation of each unit of the moving image encoding unit 5000.
  • The entropy encoding unit 104 in FIG. 16 differs from the entropy encoding unit 104 in FIG. 1 in that the filter switching information 14 and the second filter coefficient information 15 are input from the loop filter information generation unit 507 instead of the loop filter information generation unit 107.
  • The first filter setting unit 109 in FIG. 16 differs from the first filter setting unit 109 in FIG. 1 in that the filter setting information 18 is input from the loop filter information generation unit 507 instead of the loop filter information generation unit 107.
  • The loop filter information generation unit 507 acquires the input image 10 from the outside of the video encoding device in units of pixel blocks, inputs the decoded image 16 from the addition unit 106 in units of pixel blocks, inputs the first filter coefficient information 13 from the first filter buffer 110, and inputs the SAO processed image from the SAO processing unit 316 in units of pixel blocks.
  • The loop filter information generation unit 507 generates the filter setting information 18 corresponding to the pixel block to be processed based on the SAO processed image, the input image 10, and the decoded image 16. Further, the loop filter information generation unit 507 generates the second filter coefficient information 15 corresponding to the pixel block to be processed based on the filter setting information 18. Further, the loop filter information generation unit 507 generates the filter switching information 14 corresponding to the pixel block to be processed based on the SAO processed image, the input image 10, the first filter coefficient information 13, the second filter coefficient information 15, and the decoded image 16.
  • the loop filter information generation unit 507 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104 and the loop filter processing unit 508.
  • the loop filter information generation unit 507 outputs the filter setting information 18 to the first filter setting unit 109.
  • The loop filter processing unit 508 inputs the decoded image 16 from the addition unit 106 in units of pixel blocks, inputs the SAO processed image from the SAO processing unit 316 in units of pixel blocks, receives the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 507, and receives the first filter coefficient information 13 from the first filter buffer 110.
  • The loop filter processing unit 508 applies one of the first filter set and the second filter set based on the filter switching information 14 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 508 performs a loop filter process on the pixel block to be processed using the first filter coefficient information 13 or the second filter coefficient information 15, thereby generating a pixel block in the loop filter processed image 19. If the filter switching information 14 indicates that the loop filter process is not applied to the pixel block to be processed, the loop filter processing unit 508 omits the loop filter process for that pixel block.
  • the loop filter processed image 19 may be stored in a storage unit (not shown) (for example, a buffer) accessible by the predicted image generation unit 101.
  • the loop filter processed image 19 is read as a reference image by the predicted image generation unit 101 as necessary, and is used for the prediction process.
  • the loop filter processing unit 508 performs loop filter processing on the decoded image 16 and the SAO processed image.
  • For example, the loop filter processing unit 508 can perform loop filter processing on an image obtained by weighted averaging of each pixel value of the decoded image 16 and each pixel value of the SAO processed image using appropriate weights.
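  • A minimal sketch of this weighted-averaging variant could look as follows; the blending weight, the single 3x3 filter, and the random test data are illustrative assumptions rather than values taken from the document.

```python
import numpy as np

def blend_then_loop_filter(decoded, sao_processed, weight, coeffs):
    """Weighted-average the decoded image and the SAO processed image, then apply a
    single 3x3 loop filter to the blended image (edge padding at the borders)."""
    blended = weight * decoded + (1.0 - weight) * sao_processed
    padded = np.pad(blended, 1, mode="edge")
    out = np.zeros_like(blended)
    for dy in range(3):
        for dx in range(3):
            out += coeffs[dy, dx] * padded[dy:dy + blended.shape[0], dx:dx + blended.shape[1]]
    return out

rng = np.random.default_rng(2)
decoded = rng.uniform(0, 255, (8, 8))
sao_processed = decoded + rng.normal(0, 2, (8, 8))
print(blend_then_loop_filter(decoded, sao_processed, weight=0.5,
                             coeffs=np.full((3, 3), 1.0 / 9.0)).shape)
```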
  • Block distortion generated by encoding can be generally suppressed by deblocking filter processing.
  • On the other hand, in the deblocking filter processing, a strong low-pass filter may be applied, and high-frequency components (for example, edge components and texture components) included in the input image 10 may be deteriorated.
  • By performing the loop filter process on the images before and after the deblocking filter process (that is, the decoded image 16 and the SAO processed image), it is possible to compensate for partial image quality degradation that has occurred through the deblocking filter process. That is, an image quality improvement effect can be obtained.
  • the image to be subjected to the loop filter process may be switched in units such as a sequence, a picture, and a slice.
  • the loop filter process may be performed on only the SAO processed image in a certain slice, and the loop filter process may be performed on the decoded image 16 and the SAO processed image in another slice.
  • In that case, information indicating the switching of the image to be the target of the loop filter process may be encoded in units such as a sequence, a picture, or a slice.
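  • A minimal sketch of such per-slice switching, assuming a two-valued flag and a fixed blending weight (both illustrative assumptions, not elements of the actual syntax), could look as follows.

```python
def loop_filter_input(slice_flag, decoded_slice, sao_slice, weight=0.5):
    """Select the loop filter input per slice: the SAO processed image alone, or a blend
    of the decoded image and the SAO processed image."""
    if slice_flag == "sao_only":
        return list(sao_slice)
    return [weight * d + (1.0 - weight) * s for d, s in zip(decoded_slice, sao_slice)]

print(loop_filter_input("sao_only", [10.0, 20.0], [12.0, 18.0]))
print(loop_filter_input("blend", [10.0, 20.0], [12.0, 18.0]))
```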
  • the loop filter process may be performed on the images before and after the arbitrary filter process. Further, when a plurality of filter processes are performed on the decoded image 16 before the loop filter process, the loop filter process may be performed on images before and after any one or more filter processes.
  • As described above, this moving picture encoding apparatus performs loop filter processing that is the same as or similar to that of the moving picture encoding apparatus according to the first embodiment, on the images before and after the filter processing (for example, the deblocking filter processing and the SAO processing). Therefore, according to this moving picture encoding apparatus, the image quality of the predicted image can be improved and the encoding efficiency can be improved.
  • the video decoding device includes a video decoding unit 6000 and a decoding control unit 207.
  • The video decoding unit 6000 includes an entropy decoding unit 201, an inverse quantization / inverse transformation unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, a loop filter processing unit 606, a deblocking filter processing unit 408, and an SAO processing unit 409.
  • the decoding control unit 207 controls the operation of each unit of the moving image decoding unit 6000.
  • The entropy decoding unit 201 in FIG. 17 differs from the entropy decoding unit 201 in FIG. 13 in that the filter switching information 23 and the second filter coefficient information 24 are output to the loop filter processing unit 606 instead of the loop filter processing unit 206.
  • The addition unit 204 in FIG. 17 differs from the addition unit 204 in FIG. 13 in that the decoded image 21 is output to the deblocking filter processing unit 408 instead of the loop filter processing unit 206.
  • The loop filter processing unit 606 receives the decoded image 21 from the adding unit 204 in units of pixel blocks, inputs the SAO processed image from the SAO processing unit 409 in units of pixel blocks, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
  • The loop filter processing unit 606 applies one of the first filter set and the second filter set based on the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 606 performs a loop filter process on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block in the loop filter processed image 25. That is, the loop filter processing unit 606 performs the same or similar loop filter processing as the loop filter processing unit 508. If the filter switching information 23 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 606 omits the loop filter processing for that pixel block.
  • the loop filter processing unit 606 gives the loop filter processed image 25 as an output image to the outside of the video decoding device (for example, a display system). Further, the loop filter processed image 25 may be stored in a storage unit (not shown) (for example, a buffer) accessible by the predicted image generation unit 203. The loop filter processed image 25 is read as a reference image by the predicted image generation unit 203 as necessary, and is used for the prediction process.
  • As described above, this moving picture decoding apparatus performs loop filter processing that is the same as or similar to that of the moving picture decoding apparatus according to the first embodiment, on the images before and after the filter processing (for example, the deblocking filter processing and the SAO processing). Therefore, according to this moving picture decoding apparatus, it is possible to improve the encoding efficiency by improving the image quality of the predicted image.
  • In the moving picture decoding apparatuses described above, the loop filter processed image 25 is provided as an output image to the outside of the video decoding device (for example, a display system).
  • However, the decoded image 21 may be given to the outside of the moving image decoding apparatus as an output image instead of the loop filter processed image 25.
  • the video decoding device includes a video decoding unit 7000 and a decoding control unit 207.
  • the video decoding unit 7000 includes an entropy decoding unit 201, an inverse quantization / inverse transformation unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, and a loop filter processing unit 206.
  • the decoding control unit 207 controls the operation of each unit of the moving image decoding unit 7000.
  • The addition unit 204 in FIG. 18 outputs the decoded image 21 to the loop filter processing unit 206 in the same manner as the addition unit 204 in FIG. 13. Further, the addition unit 204 in FIG. 18 provides the decoded image 21 as an output image to the outside of the moving image decoding apparatus.
  • the loop filter processing unit 206 in FIG. 18 is different from the loop filter processing unit 206 in FIG. 13 in that the loop filter processed image 25 is not given as an output image to the outside of the video decoding device.
  • the moving picture decoding apparatus gives the decoded picture before the loop filter processing to the outside of the moving picture decoding apparatus as an output picture. Therefore, according to this moving image decoding apparatus, an output image can be given to the outside with a low delay.
  • the loop filter processing described in the first, second, or third embodiment can be replaced with post filter processing. That is, the moving picture encoding apparatus according to the fifth embodiment generates filter information (for example, first filter coefficient information 13, filter switching information 14, and second filter coefficient information 15) for post filter processing. Then, the video decoding device according to the fifth embodiment performs post-filter processing based on the filter information.
  • the moving picture coding apparatus includes a moving picture coding unit 8000 and a coding control unit 111.
  • The moving image encoding unit 8000 includes a predicted image generation unit 101, a subtraction unit 102, a transform / quantization unit 103, an entropy encoding unit 104, an inverse quantization / inverse transform unit 105, an addition unit 106, a post filter information generation unit 807, a first filter setting unit 109, and a first filter buffer 110.
  • the encoding control unit 111 controls the operation of each unit of the moving image encoding unit 8000.
  • The entropy encoding unit 104 in FIG. 19 differs from the entropy encoding unit 104 in FIG. 1 in that the filter switching information 14 and the second filter coefficient information 15 are input from the post filter information generation unit 807 instead of the loop filter information generation unit 107.
  • The addition unit 106 in FIG. 19 outputs the decoded image 16 to the post filter information generation unit 807 instead of the loop filter information generation unit 107 and the loop filter processing unit 108.
  • the decoded image 16 may be stored in a storage unit (not illustrated) (for example, a buffer) that can be accessed by the predicted image generation unit 101.
  • the decoded image 16 is read as a reference image by the predicted image generation unit 101 as necessary, and is used for the prediction process.
  • The first filter setting unit 109 in FIG. 19 differs from the first filter setting unit 109 in FIG. 1 in that the filter setting information 18 is input from the post filter information generation unit 807 instead of the loop filter information generation unit 107.
  • The post filter information generation unit 807 acquires the input image 10 in units of pixel blocks from the outside of the video encoding device, inputs the decoded image 16 in units of pixel blocks from the addition unit 106, and receives the first filter coefficient information 13 from the first filter buffer 110.
  • The post filter information generation unit 807 performs, for example, the same or similar processing as the loop filter information generation unit 107.
  • the post filter information generation unit 807 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104.
  • the post filter information generation unit 807 outputs the filter setting information 18 to the first filter setting unit 109.
  • the moving picture encoding apparatus uses the filter information for the loop filter processing in the first, second, or third embodiment described above as the filter for the post filter processing. Generate as information. Therefore, according to this moving image encoding device, in the moving image encoding device and the moving image decoding device in which post filter processing is used instead of loop filter processing, the same as in the first, second, or third embodiment or Similar effects can be obtained.
  • the video decoding device includes a video decoding unit 9000 and a decoding control unit 207.
  • The moving picture decoding unit 9000 includes an entropy decoding unit 201, an inverse quantization / inverse transformation unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, and a post filter processing unit 906.
  • the decoding control unit 207 controls the operation of each unit of the moving image decoding unit 9000.
  • The entropy decoding unit 201 in FIG. 20 differs from the entropy decoding unit 201 in FIG. 13 in that the filter switching information 23 and the second filter coefficient information 24 are output to the post filter processing unit 906 instead of the loop filter processing unit 206.
  • The predicted image generation unit 203 in FIG. 20 differs from the predicted image generation unit 203 in FIG. 13 in that the prediction process is performed based on the decoded image 21 instead of the loop filter processed image 25.
  • The addition unit 204 in FIG. 20 outputs the decoded image 21 to the post filter processing unit 906 instead of the loop filter processing unit 206.
  • the decoded image 21 may be stored in a storage unit (not shown) (for example, a buffer) that can be accessed by the predicted image generation unit 203.
  • the decoded image 21 is read as a reference image by the predicted image generation unit 203 as necessary, and is used for the prediction process.
  • The post filter processing unit 906 inputs the decoded image 21 from the adder unit 204 in units of pixel blocks, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
  • The post filter processing unit 906 applies one of the first filter set and the second filter set based on the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the post filter processing unit 906 performs post filter processing on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block in the post filter processed image 35. If the filter switching information 23 indicates that post filter processing is not applied to the pixel block to be processed, the post filter processing unit 906 omits the post filter processing for that pixel block. The post filter processing unit 906 gives the post filter processed image 35 as an output image to the outside of the video decoding device (for example, a display system).
  • As described above, this moving picture decoding apparatus performs post filter processing that is the same as or similar to the loop filter processing in the first, second, or third embodiment described above. Therefore, in a moving picture encoding apparatus and a moving picture decoding apparatus in which post filter processing is used instead of loop filter processing, the same or similar effects as those of the first, second, or third embodiment can be obtained.
  • Instructions corresponding to the various processes described in the above embodiments can be described as a program (that is, software) that can be executed by a computer or an embedded system.
  • The program may be recorded on a magnetic disk (for example, a flexible disk (registered trademark) or a hard disk), an optical disk (for example, a CD-ROM, CD-R, CD-RW, DVD-ROM, DVD±R, or DVD±RW), a semiconductor memory, or a similar recording medium.
  • the program recording medium is not limited to the above example, and may be any computer-readable medium.
  • the program is not necessarily stored in one recording medium, and may be stored across a plurality of recording media.
  • the computer may download the program through a network (for example, a LAN (Local Area Network) or the Internet).
  • the recording medium in which the program is stored may be independent from the computer, or may be for storing the program downloaded through the network (including temporary storage).
  • An OS (operating system) running on the computer, database management software, MW (middleware) such as network MW, or other MW that runs on a computer may execute a part of the processing of the video encoding device and the video decoding device.
  • a computer or an embedded system is for executing each process in the present embodiment based on a program stored in a recording medium. Therefore, the computer or the embedded system may typically be one personal computer or a microcomputer, or may be a system formed by connecting a plurality of personal computers or microcomputers over a network.
  • the computer is not limited to a personal computer, and may be an arithmetic processing unit or a microcomputer included in an information processing device.
  • the computer is a generic term for devices or apparatuses that can realize the functions of the moving picture coding apparatus and the moving picture decoding apparatus according to each embodiment by a program.
  • DESCRIPTION OF SYMBOLS: ... loop filter processing unit; 109 ... first filter setting unit; 110, 205 ... first filter buffer; 111 ... encoding control unit; 112 ... filter setting information generation unit; 113 ... second filter setting unit; 114 ... filter switching information generation unit; 115 ... switch; 116 ... filter application unit; 201 ... entropy decoding unit; 207 ... decoding control unit; 315, 408 ... deblocking filter processing unit; 316, 409 ... SAO processing unit; 807 ... post filter information generation unit; 906 ... post filter processing unit; 1000, 3000, 5000, 8000 ... moving image encoding unit; 2000, 4000, 6000, 7000, 9000 ... moving image decoding unit

Abstract

According to an embodiment, a video encoding method includes encoding first filter coefficient information indicating a filter coefficient for each of one or more filters included in a first filter set specified for a decoded image. The video encoding method includes encoding filter switching information indicating whether to apply the first filter set or a second filter set specified for a pixel block to be processed in the decoded image when a loop filter process is applied to the pixel block to be processed, after the first filter coefficient information is encoded. The video encoding method includes encoding second filter coefficient information indicating a filter coefficient for each of one or more filters included in the second filter set when the second filter set is applied to the pixel block to be processed, after the first filter coefficient information is encoded.

Description

According to an embodiment, a moving picture encoding method includes encoding first filter coefficient information indicating a filter coefficient of each of one or more filters included in a first filter set set for a decoded image. The method includes encoding, after the first filter coefficient information is encoded, filter switching information indicating which of the first filter set and a second filter set set for a pixel block to be processed in the decoded image is applied when loop filter processing is applied to the pixel block to be processed. The method includes encoding, after the first filter coefficient information is encoded, second filter coefficient information indicating a filter coefficient of each of one or more filters included in the second filter set when the second filter set is applied to the pixel block to be processed. The method includes generating a pixel block in a reference image by applying one of the first filter set and the second filter set to the pixel block to be processed based on the filter switching information.
FIG. 1 is a block diagram illustrating a moving picture encoding apparatus according to the first embodiment.
FIG. 2 is a block diagram illustrating the loop filter information generation unit in FIG. 1.
FIG. 3 is a block diagram illustrating the loop filter processing unit in FIG. 1.
FIG. 4 is a diagram explaining the syntax structure of a picture-based encoder.
FIG. 5 is a diagram explaining the syntax structure of an LCU-based encoder.
FIG. 6 is a diagram explaining the syntax structure used by the moving picture encoding apparatus and the moving picture decoding apparatus according to the first embodiment.
FIG. 7 is a diagram illustrating the syntax used by the moving picture encoding apparatus according to the first embodiment to encode the first filter coefficient information.
FIG. 8, FIG. 9, FIG. 10, and FIG. 11 are diagrams illustrating the syntax used by the moving picture encoding apparatus according to the first embodiment to encode the filter switching information and the second filter coefficient information.
FIG. 12 is a flowchart illustrating a part of the encoding process performed by the moving picture encoding apparatus according to the first embodiment.
FIG. 13 is a block diagram illustrating a moving picture decoding apparatus according to the first embodiment.
FIG. 14 is a block diagram illustrating a moving picture encoding apparatus according to the second embodiment.
FIG. 15 is a block diagram illustrating a moving picture decoding apparatus according to the second embodiment.
FIG. 16 is a block diagram illustrating a moving picture encoding apparatus according to the third embodiment.
FIG. 17 is a block diagram illustrating a moving picture decoding apparatus according to the third embodiment.
FIG. 18 is a block diagram illustrating a moving picture decoding apparatus according to the fourth embodiment.
FIG. 19 is a block diagram illustrating a moving picture encoding apparatus according to the fifth embodiment.
FIG. 20 is a block diagram illustrating a moving picture decoding apparatus according to the fifth embodiment.
Hereinafter, embodiments will be described with reference to the drawings. In the following, elements that are the same as or similar to elements already described are denoted by the same or similar reference numerals, and redundant description is basically omitted.
In the following description, filter processing may be processing that performs a product-sum operation on the pixel values of a pixel to be processed and one or more surrounding pixels using a plurality of filter coefficients, but is not limited to this. For example, the filter processing may be only processing that adds one filter coefficient (also called an offset value) to the pixel value of the pixel to be processed. Alternatively, the filter processing may be processing that performs a product-sum operation on the pixel values of the pixel to be processed and one or more surrounding pixels using a plurality of filter coefficients and then adds one filter coefficient (also called an offset value).
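As a concrete reading of this definition, the following is a minimal numpy sketch of the product-sum form and the offset-only form of filter processing; the kernel sizes, the edge padding, and the example values are assumptions made purely for illustration.

```python
import numpy as np

def filter_pixel(img, y, x, coeffs, offset=0.0):
    """Product-sum of the pixel to be processed and its neighbours with the filter
    coefficients, optionally followed by adding a single offset coefficient."""
    r = coeffs.shape[0] // 2
    padded = np.pad(img.astype(np.float64), r, mode="edge")
    window = padded[y:y + coeffs.shape[0], x:x + coeffs.shape[1]]
    return float(np.sum(window * coeffs) + offset)

img = np.arange(25, dtype=np.float64).reshape(5, 5)
avg3 = np.full((3, 3), 1.0 / 9.0)
print(filter_pixel(img, 2, 2, avg3))                    # product-sum form
print(filter_pixel(img, 2, 2, np.array([[1.0]]), 3.0))  # identity tap plus a single offset
```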
(First embodiment)
(Moving picture encoding device)
As illustrated in FIG. 1, the moving picture encoding apparatus according to the first embodiment includes a moving picture encoding unit 1000 and an encoding control unit 111. The moving picture encoding unit 1000 includes a predicted image generation unit 101, a subtraction unit 102, a transform / quantization unit 103, an entropy encoding unit 104, an inverse quantization / inverse transform unit 105, an addition unit 106, a loop filter information generation unit 107, a loop filter processing unit 108, a first filter setting unit 109, and a first filter buffer 110. The encoding control unit 111 controls the operation of each unit of the moving picture encoding unit 1000.
The predicted image generation unit 101 generates the predicted image 11 by performing prediction processing of the input image 10, for example in units of pixel blocks. The input image 10 includes a plurality of pixel signals and is acquired from the outside of the moving picture encoding apparatus. The prediction processing may be a general process such as temporal inter-picture prediction using motion compensation or spatial intra-picture prediction using encoded pixels in the picture. For example, the predicted image generation unit 101 may perform the prediction processing of the input image 10 based on a loop filter processed image 19 described later. The predicted image generation unit 101 outputs the predicted image 11 to the subtraction unit 102 and the addition unit 106.
The subtraction unit 102 acquires the input image 10 from the outside of the moving picture encoding apparatus and receives the predicted image 11 from the predicted image generation unit 101. The subtraction unit 102 subtracts the predicted image 11 from the input image 10 to generate a prediction error image 12. The subtraction unit 102 outputs the prediction error image 12 to the transform / quantization unit 103.
The transform / quantization unit 103 receives the prediction error image 12 from the subtraction unit 102. The transform / quantization unit 103 performs transform processing on the prediction error image 12 to generate transform coefficients. Further, the transform / quantization unit 103 quantizes the transform coefficients to generate quantized transform coefficients. The transform / quantization unit 103 outputs the quantized transform coefficients to the entropy encoding unit 104 and the inverse quantization / inverse transform unit 105. Here, the transform processing is typically an orthogonal transform such as a Discrete Cosine Transform (DCT). The transform processing is not limited to DCT, and may be, for example, a wavelet transform, independent component analysis, or the like. The quantization processing is performed based on the quantization parameter set by the encoding control unit 111.
The entropy encoding unit 104 receives the quantized transform coefficients from the transform / quantization unit 103, receives the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 107, receives the first filter coefficient information 13 from the first filter buffer 110, and receives encoding parameters from the encoding control unit 111. The encoding parameters may include, for example, prediction mode information, motion information, coding block division information, quantization parameters, and the like.
The entropy encoding unit 104 generates encoded data 17 by entropy encoding (for example, Huffman coding or arithmetic coding) the quantized transform coefficients, the first filter coefficient information 13, the filter switching information 14, the second filter coefficient information 15, and the encoding parameters according to the syntax. Details of the syntax used by the entropy encoding unit 104 will be described later. The entropy encoding unit 104 outputs the encoded data 17 to the outside of the moving picture encoding apparatus (for example, a communication system or a storage system). The encoded data 17 is decoded by a moving picture decoding apparatus described later.
The inverse quantization / inverse transform unit 105 receives the quantized transform coefficients from the transform / quantization unit 103. The inverse quantization / inverse transform unit 105 generates transform coefficients by inversely quantizing the quantized transform coefficients. Further, the inverse quantization / inverse transform unit 105 generates a prediction error image by performing inverse transform processing on the transform coefficients. The inverse quantization / inverse transform unit 105 outputs the prediction error image to the addition unit 106. Basically, the inverse quantization / inverse transform unit 105 performs the inverse processing of the transform / quantization unit 103. That is, the inverse quantization is performed based on the quantization parameter set by the encoding control unit 111. Further, the inverse transform processing is determined by the transform processing performed by the transform / quantization unit 103. The inverse transform processing is, for example, an inverse DCT (IDCT), an inverse wavelet transform, or the like.
The addition unit 106 receives the predicted image 11 from the predicted image generation unit 101 and receives the prediction error image from the inverse quantization / inverse transform unit 105. The addition unit 106 adds the prediction error image to the predicted image to generate a (local) decoded image 16. The addition unit 106 outputs the decoded image 16 to the loop filter information generation unit 107 and the loop filter processing unit 108.
 ここで、ループフィルタ情報生成部107及びループフィルタ処理部108は、画素ブロック単位で処理を行う。上記画素ブロックは、典型的には符号化単位として扱われる。上記画素ブロックは、例えば、H.264/AVCにおけるマクロブロック、HEVC(High Efficiency Video Coding)におけるLCUなどに相当する。以降の説明において、便宜的に上記画素ブロックはLCUに相当すると仮定されるが、上記画素ブロックがLCUに相当しない場合であっても本実施形態は適用可能である。 Here, the loop filter information generation unit 107 and the loop filter processing unit 108 perform processing in units of pixel blocks. The pixel block is typically handled as a coding unit. The pixel block is, for example, H.264. It corresponds to a macroblock in H.264 / AVC, an LCU in HEVC (High Efficiency Video Coding), and the like. In the following description, for convenience, it is assumed that the pixel block corresponds to an LCU. However, the present embodiment is applicable even when the pixel block does not correspond to an LCU.
 ループフィルタ情報生成部107は、動画像符号化装置の外部から入力画像10を画素ブロック単位で取得し、加算部106から復号画像16を画素ブロック単位で入力し、第1フィルタバッファ110から第1フィルタ係数情報13を入力する。 The loop filter information generation unit 107 acquires the input image 10 in units of pixel blocks from the outside of the video encoding device, receives the decoded image 16 in units of pixel blocks from the addition unit 106, and receives the first filter coefficient information 13 from the first filter buffer 110.
 ループフィルタ情報生成部107は、入力画像10及び復号画像16に基づいて、処理対象の画素ブロックに対応するフィルタ設定情報18を生成する。また、ループフィルタ情報生成部107は、フィルタ設定情報18に基づいて、処理対象の画素ブロックに対応する第2フィルタ係数情報15を生成する。更に、ループフィルタ情報生成部107は、入力画像10、第1フィルタ係数情報13、第2フィルタ係数情報15及び復号画像16に基づいて、処理対象の画素ブロックに対応するフィルタ切り替え情報14を生成する。 The loop filter information generation unit 107 generates filter setting information 18 corresponding to the pixel block to be processed based on the input image 10 and the decoded image 16. In addition, the loop filter information generation unit 107 generates second filter coefficient information 15 corresponding to the pixel block to be processed based on the filter setting information 18. Furthermore, the loop filter information generation unit 107 generates filter switching information 14 corresponding to the pixel block to be processed based on the input image 10, the first filter coefficient information 13, the second filter coefficient information 15, and the decoded image 16. .
 ここで、第1フィルタ係数情報13は、ピクチャ単位で設定される第1のフィルタセットに包含される1以上のフィルタの各々のフィルタ係数を示す情報である。フィルタ切り替え情報14は、対応する画素ブロックにループフィルタ処理が適用される場合に第1のフィルタセット及び第2のフィルタセットのいずれが適用されるかを示す情報である。第2フィルタ係数情報15は、画素ブロック単位で設定される第2のフィルタセットに包含される1以上のフィルタの各々のフィルタ係数を示す情報である。フィルタ設定情報18は、処理対象の画素ブロックにフィルタ係数を設定するために必要な情報を意味する。 Here, the first filter coefficient information 13 is information indicating each filter coefficient of one or more filters included in the first filter set set in units of pictures. The filter switching information 14 is information indicating which of the first filter set and the second filter set is applied when the loop filter process is applied to the corresponding pixel block. The second filter coefficient information 15 is information indicating each filter coefficient of one or more filters included in the second filter set set in pixel block units. The filter setting information 18 means information necessary for setting a filter coefficient for a pixel block to be processed.
 ループフィルタ情報生成部107は、フィルタ切り替え情報14及び第2フィルタ係数情報15をエントロピー符号化部104及びループフィルタ処理部108へと出力する。ループフィルタ情報生成部107は、フィルタ設定情報18を第1フィルタ設定部109へと出力する。 The loop filter information generation unit 107 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104 and the loop filter processing unit 108. The loop filter information generation unit 107 outputs the filter setting information 18 to the first filter setting unit 109.
 具体的には図2に例示されるように、ループフィルタ情報生成部107は、フィルタ設定情報生成部112と、第2フィルタ設定部113と、フィルタ切り替え情報生成部114とを備える。 Specifically, as illustrated in FIG. 2, the loop filter information generation unit 107 includes a filter setting information generation unit 112, a second filter setting unit 113, and a filter switching information generation unit 114.
 フィルタ設定情報生成部112は、動画像符号化装置の外部から入力画像10を画素ブロック単位で取得し、加算部106から復号画像16を画素ブロック単位で入力する。フィルタ設定情報生成部112は、入力画像10及び復号画像16に基づいて、処理対象の画素ブロックに対応するフィルタ設定情報18を生成する。フィルタ設定情報生成部112は、フィルタ設定情報18を第1フィルタ設定部109及び第2フィルタ設定部113へと出力する。 The filter setting information generation unit 112 acquires the input image 10 from the outside of the video encoding device in units of pixel blocks, and inputs the decoded image 16 from the addition unit 106 in units of pixel blocks. The filter setting information generation unit 112 generates filter setting information 18 corresponding to the pixel block to be processed based on the input image 10 and the decoded image 16. The filter setting information generation unit 112 outputs the filter setting information 18 to the first filter setting unit 109 and the second filter setting unit 113.
 例えば、ループフィルタ処理部108が例えば2次元のWiener filter処理を行う場合には、フィルタ設定情報生成部112は画素ブロック単位の入力画像10及び画素ブロック単位の復号画像16に基づく相関行列を示すフィルタ設定情報18を生成する。尚、Wiener filterはいわゆる画素復元フィルタであり、入力画像10とループフィルタ処理画像19との間の残差二乗和を最小化することができる。 For example, when the loop filter processing unit 108 performs a two-dimensional Wiener filter process, the filter setting information generation unit 112 generates filter setting information 18 indicating a correlation matrix based on the input image 10 in units of pixel blocks and the decoded image 16 in units of pixel blocks. Note that the Wiener filter is a so-called pixel restoration filter and can minimize the residual sum of squares between the input image 10 and the loop filter processed image 19.
 第2フィルタ設定部113は、フィルタ設定情報生成部112からフィルタ設定情報18を入力する。第2フィルタ設定部113は、フィルタ設定情報18に基づいて、処理対象の画素ブロックに適用可能な第2のフィルタセットに包含される1以上のフィルタの各々のフィルタ係数を設定することにより、第2フィルタ係数情報15を生成する。第2フィルタ設定部113は、第2フィルタ係数情報15をエントロピー符号化部104、ループフィルタ処理部108及びフィルタ切り替え情報生成部114へと出力する。 The second filter setting unit 113 receives the filter setting information 18 from the filter setting information generation unit 112. The second filter setting unit 113 generates the second filter coefficient information 15 by setting, based on the filter setting information 18, the filter coefficients of each of the one or more filters included in the second filter set applicable to the pixel block to be processed. The second filter setting unit 113 outputs the second filter coefficient information 15 to the entropy encoding unit 104, the loop filter processing unit 108, and the filter switching information generation unit 114.
 例えば、ループフィルタ処理部108が例えば2次元のWiener filter処理を行う場合には、第2フィルタ設定部113はフィルタ設定情報18が示す相関行列を連立方程式とみなして当該連立方程式を解くことによって、第2フィルタ係数情報15を生成できる。 For example, when the loop filter processing unit 108 performs a two-dimensional Wiener filter process, the second filter setting unit 113 can generate the second filter coefficient information 15 by regarding the correlation matrix indicated by the filter setting information 18 as a system of simultaneous equations and solving that system.
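 As a concrete illustration of the Wiener filter setup described in the two paragraphs above, the following sketch accumulates per-block correlation statistics and solves the resulting normal equations. It is a minimal example rather than the patented implementation: the 5x5 tap layout, the small regularisation term, and the use of numpy are assumptions made only for this illustration.

```python
import numpy as np

def block_correlation(decoded, target, taps=5):
    """Accumulate the autocorrelation matrix R and cross-correlation
    vector p for one pixel block (illustrative filter setting info)."""
    half = taps // 2
    n = taps * taps
    R = np.zeros((n, n))
    p = np.zeros(n)
    h, w = decoded.shape
    for y in range(half, h - half):
        for x in range(half, w - half):
            # Neighbourhood of the decoded pixel, flattened to a vector.
            v = decoded[y - half:y + half + 1, x - half:x + half + 1].ravel()
            R += np.outer(v, v)
            p += v * target[y, x]
    return R, p

def wiener_coefficients(R, p):
    """Solve the normal equations R h = p (least-squares / Wiener filter)."""
    return np.linalg.solve(R + 1e-6 * np.eye(R.shape[0]), p)

# Example: one 64x64 pixel block of an original and a noisy decoded picture.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, (64, 64)).astype(float)
decoded = original + rng.normal(0, 2, (64, 64))   # stand-in for coding noise
R, p = block_correlation(decoded, original)
print(wiener_coefficients(R, p).reshape(5, 5))
```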
 フィルタ切り替え情報生成部114は、動画像符号化装置の外部から入力画像10を画素ブロック単位で取得し、加算部106から復号画像16を画素ブロック単位で入力し、第1フィルタバッファ110から第1フィルタ係数情報13を入力し、第2フィルタ設定部113から第2フィルタ係数情報15を入力する。 The filter switching information generation unit 114 acquires the input image 10 in units of pixel blocks from the outside of the video encoding device, receives the decoded image 16 in units of pixel blocks from the addition unit 106, receives the first filter coefficient information 13 from the first filter buffer 110, and receives the second filter coefficient information 15 from the second filter setting unit 113.
 フィルタ切り替え情報生成部114は、入力画像10及び復号画像16に基づいて、処理対象の画素ブロックに第1のフィルタセット及び第2のフィルタセットのいずれが適用されるかを判定する。尚、フィルタ切り替え情報生成部114は、適用されるフィルタセットを判定するために、例えば下記数式(1)に示される符号化コストを利用できるが、これに限らず任意の技法を利用できる。フィルタ切り替え情報生成部114は、判定結果に基づいてフィルタ切り替え情報14を生成する。フィルタ切り替え情報生成部114は、フィルタ切り替え情報14をエントロピー符号化部104及びループフィルタ処理部108へと出力する。 The filter switching information generation unit 114 determines, based on the input image 10 and the decoded image 16, which of the first filter set and the second filter set is applied to the pixel block to be processed. In order to determine the filter set to be applied, the filter switching information generation unit 114 can use, for example, the encoding cost expressed by the following equation (1), but is not limited to this and can use any technique. The filter switching information generation unit 114 generates the filter switching information 14 based on the determination result. The filter switching information generation unit 114 outputs the filter switching information 14 to the entropy encoding unit 104 and the loop filter processing unit 108.

  Cost = D + λ × R   …(1)

 数式(1)において、Costは符号化コストを表し、Dは残差二乗和を表し、λは係数を表し、Rは符号量を表す。ループフィルタ処理部108が例えば2次元のWiener filter処理を行う場合には、処理対象の画素ブロックに対して設定したフィルタを適用することで、前述のように当該画素ブロックにおいて入力画像10とループフィルタ処理画像19との間の残差二乗和を最小化することができる。従って、第2のフィルタセットが適用される場合の残差二乗和は、フィルタ係数の演算精度の影響が十分小さければ第1のフィルタセットが適用される場合の残差二乗和よりも大きくならない。他方、第1フィルタ係数情報13はピクチャ単位で符号化されるので、画素ブロックに第1のフィルタセットが適用される場合には当該画素ブロックのためにフィルタ係数そのものを示す情報を新たに符号化する必要はない。従って、一般には第1のフィルタセットが適用される場合の符号量は、第2のフィルタセットが適用される場合の符号量よりも大きくならない。即ち、上記数式(1)によれば、画素ブロックに適用されるフィルタセットの判定において、残差二乗和及び符号量のトレードオフを考慮することができる。 In equation (1), Cost represents the coding cost, D represents the residual sum of squares, λ represents a coefficient, and R represents the code amount. When the loop filter processing unit 108 performs, for example, a two-dimensional Wiener filter process, applying the filter set for the pixel block to be processed can, as described above, minimize the residual sum of squares between the input image 10 and the loop filter processed image 19 in that pixel block. Therefore, the residual sum of squares when the second filter set is applied does not become larger than the residual sum of squares when the first filter set is applied, provided that the influence of the computational precision of the filter coefficients is sufficiently small. On the other hand, since the first filter coefficient information 13 is encoded in units of pictures, when the first filter set is applied to a pixel block, it is not necessary to newly encode information indicating the filter coefficients themselves for that pixel block. Therefore, in general, the code amount when the first filter set is applied does not become larger than the code amount when the second filter set is applied. That is, according to equation (1), the trade-off between the residual sum of squares and the code amount can be taken into account in determining the filter set applied to the pixel block.
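 The following sketch illustrates the per-block decision that equation (1) describes, including the option of turning the loop filter off. The λ value, the rough per-block rate figures, and the helper names are assumptions made for illustration; an actual encoder would take D and R from its reconstruction and entropy-coding stages.

```python
def ssd(a, b):
    """Residual sum of squares D between two flattened pixel blocks."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def rd_cost(distortion, rate_bits, lam):
    """Coding cost of equation (1): Cost = D + lambda * R."""
    return distortion + lam * rate_bits

def choose_filter_set(original, no_filter, first_set, second_set,
                      second_set_coeff_bits, lam=10.0):
    """Return 'off', 'first' or 'second' for one pixel block.
    The rate terms are assumed per-block signalling costs: the first
    set needs only switching flags, the second set additionally needs
    its own coefficient bits."""
    candidates = {
        'off':    rd_cost(ssd(original, no_filter),  1, lam),
        'first':  rd_cost(ssd(original, first_set),  2, lam),
        'second': rd_cost(ssd(original, second_set),
                          2 + second_set_coeff_bits, lam),
    }
    return min(candidates, key=candidates.get)

orig   = [100, 102, 98, 101]
unfilt = [103, 99, 97, 104]
first  = [101, 101, 98, 102]
second = [100, 102, 98, 101]
print(choose_filter_set(orig, unfilt, first, second,
                        second_set_coeff_bits=200))   # -> 'first'
```

 In this toy example the second set gives the smallest distortion, but its coefficient cost makes the picture-level set the cheaper choice, which is exactly the trade-off described above.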
 尚、フィルタ切り替え情報14は、対応する画素ブロックに第1のフィルタセット及び第2のフィルタセットのいずれが適用されるかを示す情報に加えて、対応する画素ブロックにループフィルタ処理が適用されるか否かを示す情報を更に含んでもよい。即ち、フィルタ切り替え情報14は、対応する画素ブロックにループフィルタ処理が適用されるか否かを示す情報と、ループフィルタ処理が適用される場合には第1のフィルタセット及び第2のフィルタセットのいずれが適用されるかを示す情報とを含んでよい。 Note that the filter switching information 14 may further include, in addition to the information indicating which of the first filter set and the second filter set is applied to the corresponding pixel block, information indicating whether or not the loop filter process is applied to the corresponding pixel block. That is, the filter switching information 14 may include information indicating whether or not the loop filter process is applied to the corresponding pixel block and, when the loop filter process is applied, information indicating which of the first filter set and the second filter set is applied.
 一般に、第1のフィルタセットはピクチャ単位で設定されるので、ピクチャ内の一部の画素ブロックに対して残差二乗和を却って増大させるおそれがある。係る場合には、ループフィルタ処理が適用されない場合の残差二乗和が第1のフィルタセットが適用される場合の残差二乗和より小さくなる。また、ループフィルタ処理が適用されない場合には、フィルタ係数情報を符号化する必要がないので、第1のフィルタセットまたは第2のフィルタセットが適用される場合に比べて符号量も大きくならない。故に、ループフィルタ処理が適用されない場合の符号化コストが、第1のフィルタセットが適用される場合の符号化コスト及び第2のフィルタセットが適用される場合の符号化コストよりも小さくなることがある。係る場合に、フィルタ切り替え情報生成部114は、画素ブロックにループフィルタ処理が適用されないことを示すフィルタ切り替え情報14を生成してもよい。 Generally, since the first filter set is set in units of pictures, it may actually increase the residual sum of squares for some pixel blocks in the picture. In such a case, the residual sum of squares when the loop filter process is not applied is smaller than the residual sum of squares when the first filter set is applied. Further, when the loop filter process is not applied, it is not necessary to encode filter coefficient information, so the code amount does not become larger than when the first filter set or the second filter set is applied. Therefore, the encoding cost when the loop filter process is not applied may be smaller than both the encoding cost when the first filter set is applied and the encoding cost when the second filter set is applied. In such a case, the filter switching information generation unit 114 may generate filter switching information 14 indicating that the loop filter process is not applied to the pixel block.
 ループフィルタ処理部108は、加算部106から復号画像16を画素ブロック単位で入力し、ループフィルタ情報生成部107からフィルタ切り替え情報14及び第2フィルタ係数情報15を入力し、第1フィルタバッファ110から第1フィルタ係数情報13を入力する。 The loop filter processing unit 108 receives the decoded image 16 in units of pixel blocks from the addition unit 106, receives the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 107, and receives the first filter coefficient information 13 from the first filter buffer 110.
 ループフィルタ処理部108は、処理対象の画素ブロックに対応するフィルタ切り替え情報14に基づいて第1のフィルタセット及び第2のフィルタセットのいずれか一方を適用する。具体的には、ループフィルタ処理部108は、第1フィルタ係数情報13または第2フィルタ係数情報15を用いて処理対象の画素ブロックにループフィルタ処理を行うことによって、ループフィルタ処理画像19内の画素ブロックを生成する。尚、前述のようにフィルタ切り替え情報14が処理対象の画素ブロックにループフィルタ処理が適用されないことを示すならば、ループフィルタ処理部108は処理対象の画素ブロックに対するループフィルタ処理を省略する。 The loop filter processing unit 108 applies either the first filter set or the second filter set based on the filter switching information 14 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 108 generates a pixel block in the loop filter processed image 19 by performing loop filter processing on the pixel block to be processed using the first filter coefficient information 13 or the second filter coefficient information 15. Note that, as described above, if the filter switching information 14 indicates that the loop filter process is not applied to the pixel block to be processed, the loop filter processing unit 108 omits the loop filter process for that pixel block.
 ループフィルタ処理画像19は、予測画像生成部101がアクセス可能な図示されない記憶部(例えばバッファなど)に保存されてもよい。ループフィルタ処理画像19は、必要に応じて予測画像生成部101によって参照画像として読み出され、予測処理に利用される。 The loop filter processed image 19 may be stored in a storage unit (not shown) (for example, a buffer) accessible by the predicted image generation unit 101. The loop filter processed image 19 is read as a reference image by the predicted image generation unit 101 as necessary, and is used for the prediction process.
 具体的には図3に例示されるように、ループフィルタ処理部108は、スイッチ115と、フィルタ適用部116とを備える。 Specifically, as illustrated in FIG. 3, the loop filter processing unit 108 includes a switch 115 and a filter application unit 116.
 スイッチ115は、第1フィルタバッファ110から第1フィルタ係数情報13を入力し、ループフィルタ情報生成部107から処理対象の画素ブロックに対応するフィルタ切り替え情報14及び第2フィルタ係数情報15を入力する。スイッチ115は、フィルタ切り替え情報14に基づいて、第1フィルタ係数情報13及び第2フィルタ係数情報15のいずれか一方を選択し、選択されたフィルタ係数情報(以下、選択フィルタ係数情報と称される)をフィルタ適用部116へと出力する。即ち、フィルタ切り替え情報14が処理対象の画素ブロックに第1のフィルタセットが適用されることを示すならば、スイッチ115は第1フィルタ係数情報13を選択する。他方、フィルタ切り替え情報14が処理対象の画素ブロックに第2のフィルタセットが適用されることを示すならば、スイッチ115は第2フィルタ係数情報15を選択する。尚、フィルタ切り替え情報14が処理対象の画素ブロックにループフィルタ処理が適用されないことを示すならば、スイッチ115は第1フィルタ係数情報13及び第2フィルタ係数情報15のいずれも選択しない。 The switch 115 receives the first filter coefficient information 13 from the first filter buffer 110, and receives the filter switching information 14 and the second filter coefficient information 15 corresponding to the pixel block to be processed from the loop filter information generation unit 107. The switch 115 selects either the first filter coefficient information 13 or the second filter coefficient information 15 based on the filter switching information 14, and outputs the selected filter coefficient information (hereinafter referred to as selected filter coefficient information) to the filter application unit 116. That is, if the filter switching information 14 indicates that the first filter set is applied to the pixel block to be processed, the switch 115 selects the first filter coefficient information 13. On the other hand, if the filter switching information 14 indicates that the second filter set is applied to the pixel block to be processed, the switch 115 selects the second filter coefficient information 15. Note that if the filter switching information 14 indicates that the loop filter process is not applied to the pixel block to be processed, the switch 115 selects neither the first filter coefficient information 13 nor the second filter coefficient information 15.
 フィルタ適用部116は、加算部106から復号画像16を画素ブロック単位で入力し、スイッチ115から選択フィルタ係数情報を入力する。フィルタ適用部116は、選択フィルタ係数情報に基づいて処理対象の画素ブロックにフィルタ処理を施すことによって、ループフィルタ処理画像19内の画素ブロックを生成する。フィルタ適用部116は、ループフィルタ処理画像19を予測画像生成部101へと出力する。尚、前述のようにフィルタ切り替え情報14が処理対象の画素ブロックにループフィルタ処理が適用されないことを示すならば、フィルタ適用部116は処理対象の画素ブロックをそのままループフィルタ処理画像19内の画素ブロックとして利用する。 The filter application unit 116 receives the decoded image 16 in units of pixel blocks from the addition unit 106, and receives the selected filter coefficient information from the switch 115. The filter application unit 116 generates a pixel block in the loop filter processed image 19 by applying the filter process to the pixel block to be processed based on the selected filter coefficient information. The filter application unit 116 outputs the loop filter processed image 19 to the predicted image generation unit 101. Note that, as described above, if the filter switching information 14 indicates that the loop filter process is not applied to the pixel block to be processed, the filter application unit 116 uses the pixel block to be processed as it is as a pixel block in the loop filter processed image 19.
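 The behaviour of the switch 115 and the filter application unit 116 described above can be pictured with the following sketch, which assumes square filters, 8-bit samples and edge padding (none of which is mandated by the embodiment).

```python
import numpy as np

def apply_loop_filter(block, coeff, switching):
    """Pass the block through unchanged when the switching information says
    'off'; otherwise apply the selected (first or second set) coefficients."""
    if switching == 'off':
        return block.copy()
    taps = int(len(coeff) ** 0.5)
    half = taps // 2
    padded = np.pad(block.astype(float), half, mode='edge')
    kernel = np.asarray(coeff, dtype=float).reshape(taps, taps)
    out = np.zeros(block.shape, dtype=float)
    h, w = block.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + taps, x:x + taps] * kernel)
    return np.clip(np.rint(out), 0, 255).astype(block.dtype)

# Identity filter (centre tap only) leaves the block unchanged.
block = np.full((8, 8), 128, dtype=np.uint8)
identity = [0.0] * 25
identity[12] = 1.0
assert np.array_equal(apply_loop_filter(block, identity, 'first'), block)
```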
 第1フィルタ設定部109は、ループフィルタ情報生成部107からフィルタ設定情報18を入力する。第1フィルタ設定部109は、ピクチャ内の例えば全ての画素ブロックに対応するフィルタ設定情報18を収集し、収集されたフィルタ設定情報18に基づいて後続するピクチャに対する第1のフィルタセットのフィルタ係数を設定する。第1フィルタ設定部109は、設定したフィルタ係数を示すフィルタ係数情報を第1フィルタバッファ110に保存する。後続するピクチャとは、符号化される順序において処理対象のピクチャの直後のピクチャであってもよいし、2枚以上後のピクチャであってもよい。 The first filter setting unit 109 receives the filter setting information 18 from the loop filter information generation unit 107. The first filter setting unit 109 collects the filter setting information 18 corresponding to, for example, all pixel blocks in a picture, and sets the filter coefficients of the first filter set for a subsequent picture based on the collected filter setting information 18. The first filter setting unit 109 stores filter coefficient information indicating the set filter coefficients in the first filter buffer 110. The subsequent picture may be the picture immediately after the picture to be processed in encoding order, or may be a picture two or more pictures later.
 例えば、ループフィルタ処理部108が2次元のWiener filter処理を行う場合には、第1フィルタ設定部109はピクチャ内の全てまたは一部の画素ブロックに対応するフィルタ設定情報18を収集する。そして、第1フィルタ設定部109は、収集されたフィルタ設定情報18が示す相関行列の要素和を算出することによって相関行列を統合する。第1フィルタ設定部109は、統合された相関行列を連立方程式とみなして当該連立方程式を解くことによって、後続するピクチャに対するフィルタ係数情報を設定できる。 For example, when the loop filter processing unit 108 performs a two-dimensional Wiener filter process, the first filter setting unit 109 collects the filter setting information 18 corresponding to all or some of the pixel blocks in the picture. Then, the first filter setting unit 109 integrates the correlation matrices by calculating the element-wise sum of the correlation matrices indicated by the collected filter setting information 18. The first filter setting unit 109 can set the filter coefficient information for the subsequent picture by regarding the integrated correlation matrix as a system of simultaneous equations and solving that system.
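 One possible realisation of the picture-level integration described above is sketched below; the (R, p) data layout and the regularisation term are assumptions, with the per-block statistics produced as in the earlier per-block sketch.

```python
import numpy as np

def first_filter_set_for_next_picture(per_block_info, eps=1e-6):
    """Sum the per-block correlation matrices and cross-correlation vectors
    collected over the current picture (the filter setting information) and
    solve the resulting normal equations; the solution is the picture-level
    filter coefficient information used for a subsequent picture."""
    R_total = sum(R for R, p in per_block_info)
    p_total = sum(p for R, p in per_block_info)
    n = R_total.shape[0]
    return np.linalg.solve(R_total + eps * np.eye(n), p_total)
```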
 但し、第1フィルタ設定部109は、必ずしも現行のピクチャ内の画素ブロックのフィルタ設定情報18に基づいて後続するピクチャに対してフィルタ係数情報を設定しなくてもよい。例えば、第1フィルタ設定部109は、デフォルトのフィルタ係数情報を参照可能であってもよい。そして、第1フィルタ設定部109は、フィルタ設定情報18に依存せずに、デフォルトのフィルタ係数情報を第1フィルタバッファ110に保存してもよい。 However, the first filter setting unit 109 does not necessarily need to set the filter coefficient information for the subsequent picture based on the filter setting information 18 of the pixel block in the current picture. For example, the first filter setting unit 109 may be able to refer to default filter coefficient information. Then, the first filter setting unit 109 may store default filter coefficient information in the first filter buffer 110 without depending on the filter setting information 18.
 第1フィルタバッファ110に保存されたフィルタ係数情報は、後続するピクチャに対する符号化処理において、エントロピー符号化部104、ループフィルタ情報生成部107及びループフィルタ処理部108によって、第1フィルタ係数情報13として読み出される。また、第1フィルタバッファ110には、同時に複数のフィルタセットのフィルタ係数情報が保存されていてもよい。係る場合に、現行のピクチャの第1フィルタ係数情報13は、第1フィルタバッファに保存された複数のフィルタセットのフィルタ係数情報の中から例えば当該ピクチャに設定された量子化パラメータの高低に応じて選択されてよい。 The filter coefficient information stored in the first filter buffer 110 is read out as the first filter coefficient information 13 by the entropy encoding unit 104, the loop filter information generation unit 107, and the loop filter processing unit 108 in the encoding process for a subsequent picture. The first filter buffer 110 may also store the filter coefficient information of a plurality of filter sets at the same time. In such a case, the first filter coefficient information 13 for the current picture may be selected from the filter coefficient information of the plurality of filter sets stored in the first filter buffer in accordance with, for example, the magnitude of the quantization parameter set for that picture.
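 If several picture-level filter sets are held in the first filter buffer, the quantization-parameter-dependent selection mentioned above could, for example, look like the following sketch. The nearest-QP rule and the buffer layout are assumptions; the embodiment only requires that the encoder and the decoder make the same choice.

```python
def select_first_filter_set(first_filter_buffer, picture_qp):
    """Pick the stored picture-level filter set whose associated QP is
    closest to the QP of the current picture (illustrative rule only)."""
    return min(first_filter_buffer, key=lambda e: abs(e['qp'] - picture_qp))

first_filter_buffer = [
    {'qp': 22, 'coeff': [0.02] * 24 + [0.52]},   # set tuned at a low QP
    {'qp': 37, 'coeff': [0.03] * 24 + [0.28]},   # set tuned at a high QP
]
print(select_first_filter_set(first_filter_buffer, 25)['qp'])   # -> 22
```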
 要するに、第1フィルタ係数情報13は、典型的には先行するピクチャ内のLCUのフィルタ設定情報18に基づいて設定されている。故に、第1のフィルタセットは、現行のピクチャ全体の残差二乗和を必ずしも最小化しない。しかしながら、一般に、ピクチャ間の画像特性の相関は高いので、先行するピクチャ内のLCUのフィルタ設定情報18に基づいて設定された第1フィルタ係数情報13は、現行のピクチャに対しても概して残差二乗和を低減する準最適なフィルタ処理を実現することができる。 In short, the first filter coefficient information 13 is typically set based on the filter setting information 18 of the LCUs in a preceding picture. Therefore, the first filter set does not necessarily minimize the residual sum of squares over the entire current picture. However, since the correlation of image characteristics between pictures is generally high, the first filter coefficient information 13 set based on the filter setting information 18 of the LCUs in the preceding picture can generally realize sub-optimal filter processing that reduces the residual sum of squares even for the current picture.
 符号化制御部111は、動画像符号化部1000に対して符号化ブロックの分割制御、発生符号量のフィードバック制御、量子化制御、モード制御などを行う。符号化制御部111は、符号化パラメータをエントロピー符号化部104へと出力する。 The encoding control unit 111 performs encoding block division control, generated code amount feedback control, quantization control, mode control, and the like for the moving image encoding unit 1000. The encoding control unit 111 outputs the encoding parameter to the entropy encoding unit 104.
 以下、エントロピー符号化部104によって使用されるシンタクスが説明される。以降の説明において、シーケンスレベルシンタクス、ピクチャレベルシンタクス、スライスレベルシンタクス及びLCUレベルシンタクスを備えるシンタクス構造が想定される。 Hereinafter, the syntax used by the entropy encoding unit 104 will be described. In the following description, a syntax structure including a sequence level syntax, a picture level syntax, a slice level syntax, and an LCU level syntax is assumed.
 例えば画像の縦横サイズなどの動画像全体に関する上位レイヤのシンタクス情報は、シーケンスレベルシンタクスのSPS(Sequence Parameter Set)において符号化される。ピクチャレベルで復号に必要なシンタクス情報は、ピクチャレベルシンタクスのPPS(Picture Parameter Set)またはAPS(Adaptation Parameter Set)において符号化される。スライスレベルで復号に必要なシンタクス情報は、スライスレベルシンタクスのスライスヘッダにおいて符号化される。LCUレベルシンタクスにおいて、例えば各LCUの量子化変換係数、予測モード情報などのシンタクス情報が符号化される。 For example, the upper layer syntax information relating to the entire moving image such as the vertical and horizontal sizes of the image is encoded in the sequence level syntax SPS (Sequence Parameter Set). Syntax information necessary for decoding at the picture level is encoded in PPS (Picture Parameter Set) or APS (Adaptation Parameter Set) of picture level syntax. Syntax information necessary for decoding at the slice level is encoded in the slice header of the slice level syntax. In the LCU level syntax, for example, syntax information such as quantization transform coefficients and prediction mode information of each LCU is encoded.
 第1の比較例に係るシンタクス構造が図4に示されている。図4のシンタクス構造は、HEVCのWD5(Working Draft 5)に開示されている。図4のシンタクス構造によると、フィルタ係数情報はピクチャ単位で設定され、ピクチャレベルシンタクスのAPSにおいて符号化される。また、各LCUにおいてフィルタが適用されるか否かを示すフィルタ適用情報は、スライスレベルシンタクスのスライスヘッダにおいてまとめて符号化される。即ち、図4のシンタクス構造は、いわゆるピクチャベースエンコーダによって使用される。図4のシンタクス構造によれば、ピクチャ単位で残差二乗和を最小化するフィルタ係数情報を設定できる。他方、全てのLCUの符号化処理が完了するまでフィルタ係数情報を設定することができないので、符号化データが出力されるまでの遅延が大きい。 FIG. 4 shows a syntax structure according to the first comparative example. The syntax structure of FIG. 4 is disclosed in WD5 (Working Draft 5) of HEVC. According to the syntax structure of FIG. 4, the filter coefficient information is set in units of pictures and is encoded in APS with picture level syntax. Also, filter application information indicating whether or not a filter is applied in each LCU is collectively encoded in a slice header of slice level syntax. That is, the syntax structure of FIG. 4 is used by a so-called picture-based encoder. According to the syntax structure of FIG. 4, filter coefficient information that minimizes the residual sum of squares can be set for each picture. On the other hand, since the filter coefficient information cannot be set until the encoding process of all the LCUs is completed, the delay until the encoded data is output is large.
 第2の比較例に係るシンタクス構造が図5に示されている。図5のシンタクス構造は、非特許文献2に開示されている。図5のシンタクス構造によると、フィルタ係数情報はLCU単位で設定され、LCUレベルシンタクスにおいてLCU毎に符号化されている。但し、LCUにフィルタが適用されない場合及び同一スライス内で既に符号化済みのフィルタ係数情報を参照する場合には、フィルタ係数情報の符号化を省略することができる。また、各LCUにおいてフィルタが適用されるか否かを示し、更に、フィルタが適用される場合には同一スライス内で符号化済みのフィルタであるか新たなフィルタであるかを示すフィルタ適用情報が、LCUレベルシンタクスにおいてLCU毎に符号化されている。即ち、図5のシンタクス構造は、いわゆるLCUベースエンコーダによって使用される。図5のシンタクス構造によれば、LCU単位で残差二乗和を最小化するフィルタ係数情報を設定することができる。更に、LCU単位の符号化処理が完了する度に符号化データを順次出力できるので、遅延を低減することができる。他方、LCUベースでフィルタが最適化されるので、設定されるフィルタは多様化し、フィルタ係数情報のオーバーヘッドが増大し易い。 FIG. 5 shows a syntax structure according to the second comparative example. The syntax structure of FIG. 5 is disclosed in Non-Patent Document 2. According to the syntax structure of FIG. 5, the filter coefficient information is set in units of LCUs and is encoded for each LCU in the LCU level syntax. However, when no filter is applied to an LCU or when already encoded filter coefficient information in the same slice is referred to, the encoding of the filter coefficient information can be omitted. In addition, filter application information, which indicates whether or not a filter is applied in each LCU and, when a filter is applied, whether it is an already encoded filter in the same slice or a new filter, is encoded for each LCU in the LCU level syntax. That is, the syntax structure of FIG. 5 is used by a so-called LCU-based encoder. According to the syntax structure of FIG. 5, filter coefficient information that minimizes the residual sum of squares can be set in units of LCUs. Furthermore, since the encoded data can be output sequentially every time the encoding process of an LCU is completed, the delay can be reduced. On the other hand, since the filters are optimized on an LCU basis, the set filters are diversified and the overhead of the filter coefficient information tends to increase.
 エントロピー符号化部104は、例えば図6に示されるシンタクス構造を使用する。図6のシンタクス構造によれば、第1フィルタ係数情報13はピクチャ単位で設定され、ピクチャレベルシンタクスのAPSにおいて符号化される。フィルタ切り替え情報14はLCU単位で設定され、LCUレベルシンタクスにおいてLCU毎に符号化されている。また、第2フィルタ係数情報15もまたLCU単位で設定され、LCUレベルシンタクスにおいてLCU毎に符号化される。但し、LCUに少なくとも第2のフィルタセットが適用されない場合には、第2フィルタ係数情報15の符号化を省略することができる。 The entropy encoding unit 104 uses, for example, the syntax structure shown in FIG. According to the syntax structure of FIG. 6, the first filter coefficient information 13 is set for each picture and is encoded in the APS of the picture level syntax. The filter switching information 14 is set for each LCU, and is encoded for each LCU in the LCU level syntax. The second filter coefficient information 15 is also set for each LCU, and is encoded for each LCU in the LCU level syntax. However, when at least the second filter set is not applied to the LCU, the encoding of the second filter coefficient information 15 can be omitted.
 前述のように、第1フィルタ係数情報13は、第1フィルタ設定部109によって後続するピクチャに対して設定される。換言すれば、各ピクチャの先頭LCUに対する符号化処理が開始するよりも前に、第1フィルタ係数情報13は第1フィルタバッファ110に保存されている。従って、図6のシンタクス構造によれば、LCUベースエンコーダのように、LCU単位の符号化処理が完了する度に符号化データ17を順次出力できるので、遅延を低減することができる。 As described above, the first filter coefficient information 13 is set for the following picture by the first filter setting unit 109. In other words, the first filter coefficient information 13 is stored in the first filter buffer 110 before the encoding process for the first LCU of each picture starts. Therefore, according to the syntax structure of FIG. 6, the encoded data 17 can be sequentially output every time the encoding process in units of LCU is completed as in the LCU base encoder, so that the delay can be reduced.
 また、図6のシンタクス構造によれば、例えば前述の残差二乗和及び符号量のトレードオフに基づいて、第2のフィルタセットの代わりに第1のフィルタセットが適用されることがある。故に、設定される第2のフィルタセットの総数がLCUベースエンコーダに比べて低減するので、符号化データ17のオーバーヘッドを低減することができる。更に、前述のように、先行するピクチャに基づいて準最適な第1のフィルタセットが設定されていれば、現行のピクチャ内のLCUに第1のフィルタセットが適用される可能性が高くなるので、符号化データ17のオーバーヘッドが低減され易い。 Further, according to the syntax structure of FIG. 6, the first filter set may be applied instead of the second filter set based on, for example, the above-described tradeoff between the residual sum of squares and the code amount. Therefore, since the total number of second filter sets to be set is reduced as compared with the LCU-based encoder, the overhead of the encoded data 17 can be reduced. Furthermore, as described above, if the sub-optimal first filter set is set based on the preceding picture, the possibility that the first filter set is applied to the LCU in the current picture increases. The overhead of the encoded data 17 is likely to be reduced.
 エントロピー符号化部104は、第1フィルタ係数情報13を符号化するために例えば図7に示されるシンタクスを使用できる。図7は、図6に示されるAPSのうちループフィルタ処理に関する情報を例示する。 The entropy encoding unit 104 can use, for example, the syntax shown in FIG. 7 to encode the first filter coefficient information 13. FIG. 7 illustrates information related to loop filter processing in the APS illustrated in FIG.
 図7において、aps_idは、スライスレベル以下のレイヤにおいて参照されるAPSを特定するための識別情報である。例えば、スライスヘッダにおいてaps_idを符号化することによって、スライスは既に符号化済みのAPSのいずれか1つを参照することができる。 In FIG. 7, aps_id is identification information for specifying an APS that is referenced in a layer below the slice level. For example, by encoding aps_id in the slice header, the slice can refer to any one of the already encoded APSs.
 尚、図6において、第1フィルタ係数情報13が符号化されるAPSと当該第1フィルタ係数情報13を参照可能なLCUの符号化データとが連続しているかのように描かれている。しかしながら、aps_idを用いれば所与のLCUが参照可能な第1フィルタ係数情報13は一意に特定されるので、第1フィルタ係数情報13が符号化されるAPSと当該第1フィルタ係数情報13を参照可能なLCUの符号化データとを連続して符号化する必要はない。また、ピクチャ単位で複数のAPSを符号化すれば、参照可能な第1フィルタ係数情報13をスライス単位で適応的に切り替えることができる。 In FIG. 6, the APS in which the first filter coefficient information 13 is encoded and the encoded data of the LCU that can refer to the first filter coefficient information 13 are depicted as being continuous. However, since the first filter coefficient information 13 that can be referred to by a given LCU is uniquely specified by using aps_id, the APS in which the first filter coefficient information 13 is encoded and the first filter coefficient information 13 are referred to. There is no need to continuously encode the encoded data of possible LCUs. In addition, if a plurality of APSs are encoded in units of pictures, the first filter coefficient information 13 that can be referred to can be adaptively switched in units of slices.
 図7において、picture_based_filter_flagは、現行のAPSにおいて第1フィルタ係数情報13が符号化されるか否かを示す情報である。picture_based_filter_flagが1であれば、第1フィルタ係数情報13を含む第1のフィルタセットに関する情報が現行のAPSにおいて符号化される。 In FIG. 7, picture_based_filter_flag is information indicating whether or not the first filter coefficient information 13 is encoded in the current APS. If picture_based_filter_flag is 1, information on the first filter set including the first filter coefficient information 13 is encoded in the current APS.
 picture_based_filter_num_informationは、第1のフィルタセットに包含されるフィルタの総数を示す情報である。尚、図7において、第1のフィルタセットに包含されるフィルタの総数は、PictureBasedFilterNumとして表現されている。前述のように、第1のフィルタセットは1以上のフィルタを包含できる。尚、picture_based_filter_num_informationは、第1のフィルタセットに包含されるフィルタの総数そのものを符号化したものであってもよいし、第1のフィルタセットに包含されるフィルタの総数から1を減算した値を符号化したものであってもよい。或いは、第1のフィルタセットに包含されるフィルタの総数が予め決定されているならば、picture_based_filter_num_informationの符号化を省略することができる。いずれにせよ、動画像復号装置が第1のフィルタセットに包含されるフィルタの総数を導出できるような設計が求められる。 picture_based_filter_num_information is information indicating the total number of filters included in the first filter set. In FIG. 7, the total number of filters included in the first filter set is expressed as PictureBasedFilterNum. As described above, the first filter set can include one or more filters. Note that picture_based_filter_num_information may be obtained by encoding the total number of filters included in the first filter set itself, or by encoding a value obtained by subtracting 1 from that total number. Alternatively, if the total number of filters included in the first filter set is determined in advance, the encoding of picture_based_filter_num_information can be omitted. In any case, a design is required that allows the video decoding device to derive the total number of filters included in the first filter set.
 第1のフィルタセットが複数のフィルタを包含する場合には、画素ブロック内の局所領域毎に適用されるフィルタを切り替えることができる。このフィルタの切り替え規則は、picture_based_filter_group_informationとして符号化される。 When the first filter set includes a plurality of filters, it is possible to switch the filter applied to each local region in the pixel block. This filter switching rule is encoded as picture_based_filter_group_information.
 HEVC WD5には、最大16個のフィルタを切り替えるための規則が開示されている。具体的には、4×4画素領域の画素値の勾配方向及びアクティビティに基づいて、当該4×4画素領域が最大16個のクラスに分類される。これらクラスの各々は、適用されるフィルタを特定する。そして、4×4画素領域毎にそのクラスに応じたフィルタが適用される。更に、これら16個のクラスのうち2以上のクラスを統合することによって、クラスの総数(即ち、フィルタの総数)を任意に減少させることもできる。そして、HEVC WD5において、クラスの統合情報がpicture_based_filter_group_informationとして符号化される。 HEVC WD5 discloses rules for switching up to 16 filters. Specifically, the 4 × 4 pixel area is classified into a maximum of 16 classes based on the gradient direction and activity of the pixel values of the 4 × 4 pixel area. Each of these classes specifies the filter to be applied. A filter corresponding to the class is applied to each 4 × 4 pixel region. Further, by integrating two or more of these 16 classes, the total number of classes (ie, the total number of filters) can be arbitrarily reduced. Then, in the HEVC WD5, the class integration information is encoded as picture_based_filter_group_information.
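 A heavily simplified sketch of this kind of per-4x4 classification and class merging is given below; the gradient measure, the thresholds and the merge map are illustrative assumptions and not the WD5 rule itself.

```python
import numpy as np

def classify_4x4(region, num_classes=16):
    """Map a 4x4 region to a class index from local variation.
    Illustrative only: WD5 uses a specific Laplacian-based rule."""
    b = region.astype(float)
    gh = np.abs(np.diff(b, axis=1)).sum()        # variation along rows
    gv = np.abs(np.diff(b, axis=0)).sum()        # variation along columns
    if gh > 2 * gv:
        direction = 1                            # horizontal variation dominates
    elif gv > 2 * gh:
        direction = 2                            # vertical variation dominates
    else:
        direction = 0                            # no dominant direction
    activity = min(int((gh + gv) // 32), 3)      # quantised activity, 0..3
    return direction * 4 + activity              # class index below num_classes

def filter_index(class_id, merge_map):
    """Class merging: several classes may share one filter, so only the
    merged filters need to be signalled."""
    return merge_map[class_id]

merge_map = [0] * 4 + [1] * 4 + [2] * 4          # 12 classes merged into 3 filters
region = np.arange(16).reshape(4, 4)             # values increase down the rows
print(filter_index(classify_4x4(region), merge_map))
```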
 フィルタの切り替えは種々の規則に基づいて実現され得るので、種々の情報がpicture_based_filter_group_informationとして符号化されてよい。或いは、フィルタの切り替え規則が予め決定されているならば、picture_based_filter_group_informationの符号化を省略することができる。いずれにせよ、動画像復号装置が第1のフィルタセットに包含される複数のフィルタを切り替えられるような設計が求められる。 Since switching of the filter can be realized based on various rules, various information may be encoded as picture_based_filter_group_information. Alternatively, if a filter switching rule is determined in advance, encoding of picture_based_filter_group_information can be omitted. In any case, a design is required that allows the video decoding device to switch between a plurality of filters included in the first filter set.
 図7において、picture_based_filter_pred_informationは、第1フィルタ係数情報13を符号化するための予測方式を示す情報である。第1フィルタ係数情報13が示すフィルタ係数はそのまま符号化されてもよいし、当該フィルタ係数に予測処理を施して得られる予測残差が代わりに符号化されてもよい。一般に、適切な予測処理が行われるならば、予測残差の絶対値はフィルタ係数の絶対値よりも小さくなるので発生符号量は減少する。 In FIG. 7, picture_based_filter_pred_information is information indicating a prediction method for encoding the first filter coefficient information 13. The filter coefficient indicated by the first filter coefficient information 13 may be encoded as it is, or a prediction residual obtained by performing a prediction process on the filter coefficient may be encoded instead. In general, if an appropriate prediction process is performed, the absolute value of the prediction residual is smaller than the absolute value of the filter coefficient, so that the generated code amount decreases.
 HEVC WD5には様々な予測方式が開示されている。具体的には、同一のフィルタセットに包含される第1のフィルタの各フィルタ係数を第2のフィルタの同一位置のフィルタ係数に基づいて予測する方式を用意することができる。また、フィルタのゲイン(即ち、フィルタ係数の総和)が一定であるとみなして当該フィルタ内の一部のフィルタ係数に基づいて当該フィルタ内のその他のフィルタ係数を予測する方式を用意することができる。 Various prediction methods are disclosed in HEVC WD5. Specifically, a method for predicting each filter coefficient of the first filter included in the same filter set based on the filter coefficient at the same position of the second filter can be prepared. Further, it is possible to prepare a method for predicting other filter coefficients in the filter based on a part of the filter coefficients in the filter on the assumption that the gain of the filter (that is, the sum of the filter coefficients) is constant. .
 フィルタ係数の予測は種々の方式に基づいて実現され得るので、種々の情報がpicture_based_filter_pred_informationとして符号化されてよい。或いは、フィルタ係数の予測方式が予め決定されているならば、picture_based_filter_pred_informationの符号化を省略することができる。いずれにせよ、動画像復号装置がフィルタ係数の予測方式を導出できるような設計が求められる。 Since the prediction of filter coefficients can be realized based on various methods, various information may be encoded as picture_based_filter_pred_information. Alternatively, if the prediction method of the filter coefficient is determined in advance, encoding of picture_based_filter_pred_information can be omitted. In any case, a design that allows the video decoding device to derive a filter coefficient prediction method is required.
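 The prediction modes mentioned above can be sketched as follows: the per-position difference to another filter of the same set, and the gain constraint under which the centre tap never has to be transmitted. The fixed-point gain of 256, the convention that the centre tap is the last entry, and the helper names are assumptions for this illustration.

```python
def diff_to_reference(coeff, ref):
    """Mode 1: code only per-position differences to a reference filter
    in the same filter set."""
    return [c - r for c, r in zip(coeff, ref)]

def drop_centre_tap(coeff):
    """Mode 2, encoder side: transmit all taps except the centre one."""
    return coeff[:-1]

def restore_centre_tap(sent, gain=256):
    """Mode 2, decoder side: centre tap = gain - sum(transmitted taps)."""
    return list(sent) + [gain - sum(sent)]

coeff = [3, -5, 12, -5, 3, 248]                  # integer taps summing to 256
assert restore_centre_tap(drop_centre_tap(coeff)) == coeff
```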
 図7において、picture_based_filter_coeff[i][j]は、第1フィルタ係数情報13を符号化したものに相当する。具体的には、第1のフィルタセットのうちi番目のフィルタのj番目のフィルタ係数を示す情報がpicture_based_filter_coeff[i][j]として符号化される。ここで、フィルタ係数とは、前述のフィルタ係数の予測が行われる場合には予測残差を意味し、そうでない場合にはフィルタ係数値そのものを意味する。図7において、PictureBasedFilterCoeffNumは、第1のフィルタセットに包含される各フィルタが持つフィルタ係数の総数を示す。 In FIG. 7, picture_based_filter_coeff [i] [j] corresponds to the encoded first filter coefficient information 13. Specifically, information indicating the j-th filter coefficient of the i-th filter in the first filter set is encoded as picture_based_filter_coeff [i] [j]. Here, the filter coefficient means a prediction residual when the above-described filter coefficient prediction is performed, and otherwise means a filter coefficient value itself. In FIG. 7, PictureBasedFilterCoeffNum indicates the total number of filter coefficients included in each filter included in the first filter set.
 エントロピー符号化部104は、フィルタ切り替え情報14及び第2フィルタ係数情報15を符号化するために例えば図8に示されるシンタクスを使用できる。図8は、図6に示されるLCUレベルシンタクスのうちループフィルタ処理に関する情報を例示する。 The entropy encoding unit 104 can use, for example, the syntax shown in FIG. 8 to encode the filter switching information 14 and the second filter coefficient information 15. FIG. 8 illustrates information related to loop filter processing in the LCU level syntax shown in FIG.
 図8において、lcu_based_filter_flagは、対応するLCUにループフィルタ処理(即ち、第1のフィルタセットまたは第2のフィルタセット)が適用されるか否かを示す。即ち、lcu_based_filter_flagは、フィルタ切り替え情報14の一部を符号化したものに相当する。lcu_based_filter_flagが1であるならば、対応するLCUに適用されるループフィルタ処理の情報が更に符号化される。 In FIG. 8, lcu_based_filter_flag indicates whether or not loop filter processing (that is, the first filter set or the second filter set) is applied to the corresponding LCU. That is, lcu_based_filter_flag corresponds to a part of the filter switching information 14 encoded. If lcu_based_filter_flag is 1, information on loop filter processing applied to the corresponding LCU is further encoded.
 図8において、new_filter_flagは、対応するLCUに第1のフィルタセット及び第2のフィルタセットのいずれが適用されるかを示す。即ち、new_filter_flagは、フィルタ切り替え情報14の一部を符号化したものに相当する。new_filter_flagが1であるならば、対応するLCUに適用される第2のフィルタセットに関する情報(第2フィルタ係数情報15を含む)が更に符号化される。他方、new_filter_flagが0であるならば、対応するLCUには第1のフィルタセットが適用されるので更なる情報を符号化する必要はない。 In FIG. 8, new_filter_flag indicates which of the first filter set and the second filter set is applied to the corresponding LCU. That is, new_filter_flag corresponds to an encoded part of the filter switching information 14. If new_filter_flag is 1, information on the second filter set applied to the corresponding LCU (including the second filter coefficient information 15) is further encoded. On the other hand, if new_filter_flag is 0, the first filter set is applied to the corresponding LCU, so no further information needs to be encoded.
 図8において、lcu_based_filter_num_informationは、対応するLCUに適用される第2のフィルタセットに包含されるフィルタの総数を示す情報である。尚、図8において、第2のフィルタセットに包含されるフィルタの総数は、LCUBasedFilterNumとして表現されている。 In FIG. 8, lcu_based_filter_num_information is information indicating the total number of filters included in the second filter set applied to the corresponding LCU. In FIG. 8, the total number of filters included in the second filter set is expressed as LCUBasedFilterNum.
 第2のフィルタセットに包含されるフィルタの総数が複数である場合には、画素ブロック内の局所領域毎に適用されるフィルタを切り替えることができる。このフィルタの切り替え規則は、lcu_based_filter_group_informationとして符号化される。 When the total number of filters included in the second filter set is plural, it is possible to switch the filter applied to each local region in the pixel block. This filter switching rule is encoded as lcu_based_filter_group_information.
 図8において、lcu_based_filter_pred_informationは、第2フィルタ係数情報15を符号化するための予測方式を示す情報である。尚、前述の予測方式に加えて、第1フィルタ係数情報13に基づいて第2フィルタ係数情報15を予測する方式も利用可能である。 8, lcu_based_filter_pred_information is information indicating a prediction method for encoding the second filter coefficient information 15. In addition to the above prediction method, a method of predicting the second filter coefficient information 15 based on the first filter coefficient information 13 can also be used.
 図8において、lcu_based_filter_coeff[i][j]は、処理対象の画素ブロックに対応する第2フィルタ係数情報15を符号化したものに相当する。具体的には、第2のフィルタセットのうちi番目のフィルタのj番目のフィルタ係数を示す情報がlcu_based_filter_coeff[i][j]として符号化される。ここで、フィルタ係数とは、前述のフィルタ係数の予測が行われる場合には予測残差を意味し、そうでない場合にはフィルタ係数値そのものを意味する。図8において、LCUBasedFilterCoeffNumは、第2のフィルタセットに包含される各フィルタが持つフィルタ係数の総数を示す。 In FIG. 8, lcu_based_filter_coeff [i] [j] corresponds to the encoded second filter coefficient information 15 corresponding to the pixel block to be processed. Specifically, information indicating the j-th filter coefficient of the i-th filter in the second filter set is encoded as lcu_based_filter_coeff [i] [j]. Here, the filter coefficient means a prediction residual when the above-described filter coefficient prediction is performed, and otherwise means a filter coefficient value itself. In FIG. 8, LCUBasedFilterCoeffNum indicates the total number of filter coefficients included in each filter included in the second filter set.
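 For concreteness, the per-LCU signalling of FIG. 8 might be organised as in the following sketch. The BitRecorder is a toy stand-in for the entropy coder, and the filter-count, grouping and prediction fields are omitted; this is an illustration, not the normative syntax.

```python
class BitRecorder:
    """Toy stand-in for the entropy coder: it just records symbols."""
    def __init__(self):
        self.symbols = []

    def put_flag(self, bit):
        self.symbols.append(int(bool(bit)))

    def put_coeff(self, value):
        self.symbols.append(int(value))


def write_lcu_filter_syntax(bs, switching, second_set=None):
    """Per-LCU signalling in the spirit of FIG. 8.
    'switching' is 'off', 'first' or 'second'."""
    bs.put_flag(switching != 'off')            # lcu_based_filter_flag
    if switching == 'off':
        return
    bs.put_flag(switching == 'second')         # new_filter_flag
    if switching == 'second':
        for filt in second_set:                # lcu_based_filter_coeff[i][j]
            for coeff in filt:
                bs.put_coeff(coeff)


bs = BitRecorder()
write_lcu_filter_syntax(bs, 'first')                    # two flags only
write_lcu_filter_syntax(bs, 'second', [[1, -2, 5]])     # flags plus coefficients
print(bs.symbols)                                       # [1, 0, 1, 1, 1, -2, 5]
```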
 ところで、前述のように、第2のフィルタセットはLCU単位で設定される。故に、第2のフィルタセットに関わるシンタクス情報は、第1のフィルタセットに関わるシンタクス情報に比べて、符号化データ17のオーバーヘッドに与える影響が大きい。従って、第2のフィルタセットに関わるシンタクス情報のサイズを適切に低減することによって、符号化データ17のオーバーヘッドを効果的に低減することができる。 Incidentally, as described above, the second filter set is set in units of LCUs. Therefore, the syntax information related to the second filter set has a larger influence on the overhead of the encoded data 17 than the syntax information related to the first filter set. Therefore, the overhead of the encoded data 17 can be effectively reduced by appropriately reducing the size of the syntax information related to the second filter set.
 例えば、第2のフィルタセットに包含されるフィルタの総数が1に固定されれば、lcu_based_filter_num_information及びlcu_based_filter_group_informationを符号化する必要がない。更に、フィルタの総数が最小化されるので、フィルタ係数の総数もまた最小化される。 For example, if the total number of filters included in the second filter set is fixed to 1, it is not necessary to encode lcu_based_filter_num_information and lcu_based_filter_group_information. Furthermore, since the total number of filters is minimized, the total number of filter coefficients is also minimized.
 また、フィルタの形状(例えば、タップ数)に依存して、当該フィルタが持つフィルタ係数の総数は増減する。故に、フィルタの形状を適切に設定することにより、符号化データ17のオーバーヘッドを効果的に低減できる。例えば、第1のフィルタセットに包含される各フィルタの形状と第2のフィルタセットに包含される各フィルタの形状とは、一致しなくてもよい。具体的には、第1のフィルタセットはピクチャ単位で設定されるので、第2のフィルタセットに比べて大きなオーバーヘッドが許容される。即ち、第1のフィルタセットに包含される各フィルタが持つフィルタ係数の総数が、第2のフィルタセットに包含される各フィルタが持つフィルタ係数の総数に比べて大きくてもよい。各フィルタの形状は、予め定められた規則に従って変更されてもよいし、各フィルタの形状を示す情報がシグナリングされてもよい。このとき、例えば第2のフィルタセットに包含されるフィルタの形状が第1のフィルタセットに包含されるフィルタの形状のサブセットとなっていれば、フィルタ処理を実現するための回路規模を小さくすることもできる。 Also, the total number of filter coefficients that a filter has increases or decreases depending on the shape of the filter (for example, the number of taps). Therefore, the overhead of the encoded data 17 can be effectively reduced by appropriately setting the filter shape. For example, the shape of each filter included in the first filter set need not match the shape of each filter included in the second filter set. Specifically, since the first filter set is set in units of pictures, a larger overhead is tolerated for it than for the second filter set. That is, the total number of filter coefficients of each filter included in the first filter set may be larger than the total number of filter coefficients of each filter included in the second filter set. The shape of each filter may be changed according to a predetermined rule, or information indicating the shape of each filter may be signaled. In this case, if, for example, the shape of the filters included in the second filter set is a subset of the shape of the filters included in the first filter set, the circuit scale for realizing the filter processing can also be reduced.
 エントロピー符号化部104は、フィルタ切り替え情報14及び第2フィルタ係数情報15を符号化するために図8の代わりに例えば図9に示されるシンタクスを使用してもよい。図9は、図6に示されるLCUレベルシンタクスのうちループフィルタ処理に関する情報を例示する。 The entropy encoding unit 104 may use, for example, the syntax shown in FIG. 9 instead of FIG. 8 in order to encode the filter switching information 14 and the second filter coefficient information 15. FIG. 9 illustrates information related to loop filter processing in the LCU level syntax shown in FIG.
 図8のシンタクスによれば、LCUに第2のフィルタセットが適用される場合には当該第2のフィルタセットに関するシンタクス情報が符号化される。しかしながら、LCUに適用される第2のフィルタセットは、既に符号化済みのLCUに適用された第2のフィルタセットと同じであるかもしれない。そこで、図9のシンタクスは、係る場合に既に符号化済みのLCUに対応する第2のフィルタセットに関するシンタクス情報を参照することを許容する。故に、図9のシンタクスによれば、現行のLCUに適用される第2のフィルタセットに関するシンタクス情報の符号化を簡略化することができる。 According to the syntax of FIG. 8, when the second filter set is applied to the LCU, the syntax information regarding the second filter set is encoded. However, the second filter set applied to the LCU may be the same as the second filter set applied to the already encoded LCU. Therefore, the syntax of FIG. 9 allows to refer to syntax information regarding the second filter set corresponding to the already encoded LCU in such a case. Therefore, according to the syntax of FIG. 9, it is possible to simplify the encoding of syntax information regarding the second filter set applied to the current LCU.
 図9のシンタクスにおいて、lcu_based_filter_flag、lcu_based_filter_num_information、LCUBasedFilterNum、lcu_based_filter_group_information、lcu_based_filter_pred_information、LCUBasedFilterCoeffNum及びlcu_based_filter_coeff[i][j]の役割は図8のものと同一または類似である。 In the syntax of FIG. 9, the roles of lcu_based_filter_flag, lcu_based_filter_num_information, LCUBasedFilterNum, lcu_based_filter_group_information, lcu_based_filter_pred_information, LCUBasedFilterCoeffNum, and lcu_based_filter_coeff[i][j] are the same as or similar to those in FIG. 8.
 図9のシンタクスにおいて、new_filter_flagは、対応するLCUに第2のフィルタセットが適用され、かつ、当該第2のフィルタセットに関するシンタクス情報を新たに符号化する必要があるかどうかを示す情報である。即ち、new_filter_flagは、フィルタ切り替え情報14の一部を符号化したものに相当する。new_filter_flagが1であれば、対応するLCUに適用される第2のフィルタセットに関する情報(第2フィルタ係数情報15を含む)が更に符号化される。 In the syntax of FIG. 9, new_filter_flag is information indicating whether or not the second filter set is applied to the corresponding LCU and syntax information related to the second filter set needs to be newly encoded. That is, new_filter_flag corresponds to a part of the filter switching information 14 encoded. If new_filter_flag is 1, information on the second filter set applied to the corresponding LCU (including the second filter coefficient information 15) is further encoded.
 new_filter_flagが0であれば、対応するLCUには第1のフィルタセットまたは既に符号化済みのLCUと同一の第2のフィルタセットが適用される。LCUに適用されるフィルタセットは、picture_based_filter_flagによって決まる。即ち、図9のシンタクスにおいて、picture_based_filter_flagは、フィルタ切り替え情報14の一部を符号化したものに相当する。picture_based_filter_flagが1であるならば、対応するLCUには第1のフィルタセットが適用されるので更なる情報を符号化する必要はない。 If new_filter_flag is 0, the first filter set or the same second filter set as the already encoded LCU is applied to the corresponding LCU. The filter set applied to the LCU is determined by the picture_based_filter_flag. That is, in the syntax of FIG. 9, picture_based_filter_flag corresponds to an encoded part of the filter switching information 14. If picture_based_filter_flag is 1, no further information needs to be encoded since the first filter set is applied to the corresponding LCU.
 new_filter_flagが0であり、かつ、picture_based_filter_flagが0であるならば、対応するLCUには既に符号化済みのLCUと同一の第2のフィルタセットが適用される。この場合に、既に符号化済みのLCUに対応する第2のフィルタセットに関するシンタクス情報を参照するための識別情報としてstored_filter_idxが符号化されることにより、現行のLCUに適用される第2のフィルタセットに関するシンタクス情報の符号化を簡略化することができる。尚、stored_filter_idxによって参照されるシンタクス情報は、動画像符号化装置と動画像復号装置との間で共通であることが求められる。 If new_filter_flag is 0 and picture_based_filter_flag is 0, the same second filter set as that of an already encoded LCU is applied to the corresponding LCU. In this case, stored_filter_idx is encoded as identification information for referring to the syntax information regarding the second filter set corresponding to the already encoded LCU, whereby the encoding of the syntax information regarding the second filter set applied to the current LCU can be simplified. Note that the syntax information referred to by stored_filter_idx is required to be common between the video encoding device and the video decoding device.
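 The FIG. 9 variant adds the re-use path via stored_filter_idx. A sketch, reusing the toy BitRecorder from the earlier sketch and with the same simplifications, is shown below.

```python
def write_lcu_filter_syntax_fig9(bs, switching, stored_idx=None, second_set=None):
    """'switching' is 'off', 'first', 'new' (newly coded second set) or
    'stored' (re-use an already coded second set via stored_filter_idx)."""
    bs.put_flag(switching != 'off')               # lcu_based_filter_flag
    if switching == 'off':
        return
    bs.put_flag(switching == 'new')               # new_filter_flag
    if switching == 'new':
        for filt in second_set:                   # lcu_based_filter_coeff[i][j]
            for coeff in filt:
                bs.put_coeff(coeff)
    else:
        bs.put_flag(switching == 'first')         # picture_based_filter_flag
        if switching == 'stored':
            bs.put_coeff(stored_idx)              # stored_filter_idx
```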
 図9のシンタクスの変形例が図10に示されている。図10のシンタクスにおいて、lcu_based_filter_flag、lcu_based_filter_num_information、LCUBasedFilterNum、lcu_based_filter_group_information、lcu_based_filter_pred_information、LCUBasedFilterCoeffNum及びlcu_based_filter_coeff[i][j]の役割は図8のものと同一または類似である。また、図10のシンタクスにおいて、stored_filter_idxの役割は図9のものと同一または類似である。 A modification of the syntax of FIG. 9 is shown in FIG. 10. In the syntax of FIG. 10, the roles of lcu_based_filter_flag, lcu_based_filter_num_information, LCUBasedFilterNum, lcu_based_filter_group_information, lcu_based_filter_pred_information, LCUBasedFilterCoeffNum, and lcu_based_filter_coeff[i][j] are the same as or similar to those in FIG. 8. Also, in the syntax of FIG. 10, the role of stored_filter_idx is the same as or similar to that in FIG. 9.
 図10のシンタクスにおいて、picture_based_filter_flagは、対応するLCUに第1のフィルタセットが適用されるか第2のフィルタセットが適用されるかを示す情報である。即ち、図10のシンタクスにおいて、picture_based_filter_flagは、フィルタ切り替え情報14の一部を符号化したものに相当する。picture_based_filter_flagが1であるならば、対応するLCUには第1のフィルタセットが適用されるので更なる情報を符号化する必要はない。他方、picture_based_filter_flagが0であるならば、new_filter_flagが更に符号化される。 In the syntax of FIG. 10, picture_based_filter_flag is information indicating whether the first filter set or the second filter set is applied to the corresponding LCU. That is, in the syntax of FIG. 10, picture_based_filter_flag corresponds to a part of the filter switching information 14 encoded. If picture_based_filter_flag is 1, no further information needs to be encoded since the first filter set is applied to the corresponding LCU. On the other hand, if picture_based_filter_flag is 0, new_filter_flag is further encoded.
 new_filter_flagは、対応するLCUに適用される第2のフィルタセットに関するシンタクス情報を新たに符号化する必要があるかどうかを示す情報である。new_filter_flagが1であるならば、対応するLCUに適用される第2のフィルタセットに関する情報(第2フィルタ係数情報15を含む)が更に符号化される。new_filter_flagが0であるならば、対応するLCUには既に符号化済みのLCUと同一の第2のフィルタセットが適用される。故に、stored_filter_idxが符号化されることにより、対応するLCUに適用される第2のフィルタセットに関するシンタクス情報の符号化を簡略化することができる。 new_filter_flag is information indicating whether or not the syntax information regarding the second filter set applied to the corresponding LCU needs to be newly encoded. If new_filter_flag is 1, information on the second filter set applied to the corresponding LCU (including the second filter coefficient information 15) is further encoded. If new_filter_flag is 0, the same second filter set as that of an already encoded LCU is applied to the corresponding LCU. Therefore, by encoding stored_filter_idx, the encoding of the syntax information regarding the second filter set applied to the corresponding LCU can be simplified.
 図10のシンタクスによれば、LCUに第1のフィルタセットが適用される場合にnew_filter_flagを符号化する必要がない。図9のシンタクスによれば、LCUに第1のフィルタセットが適用される場合にもnew_filter_flagを符号化する必要がある。故に、図10のシンタクスによれば、図9のシンタクスに比較して、LCUに第1のフィルタセットが適用される可能性が高い場合(例えば、第1のフィルタセットを適用することによる残差二乗和の削減効果がピクチャ全体で高い場合)に符号化データ17のオーバーヘッドを削減できる。 According to the syntax of FIG. 10, new_filter_flag does not need to be encoded when the first filter set is applied to an LCU. According to the syntax of FIG. 9, new_filter_flag needs to be encoded even when the first filter set is applied to an LCU. Therefore, compared with the syntax of FIG. 9, the syntax of FIG. 10 can reduce the overhead of the encoded data 17 when there is a high possibility that the first filter set is applied to the LCUs (for example, when the reduction in the residual sum of squares obtained by applying the first filter set is high over the entire picture).
 他方、図10のシンタクスによれば、LCUに第2のフィルタセットが適用される場合にもpicture_based_filter_flagを符号化する必要がある。図9のシンタクスによれば、LCUに第2のフィルタセットが適用され、かつ、当該第2のフィルタセットに関するシンタクス情報を新たに符号化する必要がある場合にpicture_based_filter_flagを符号化する必要がない。故に、図9のシンタクスによれば、図10のシンタクスに比較して、LCUに第2のフィルタセットが適用される可能性が高い場合(例えば、第1のフィルタセットを適用することによる残差二乗和の削減効果がピクチャ全体で低い場合)に符号化データ17のオーバーヘッドを削減できる。 On the other hand, according to the syntax of FIG. 10, picture_based_filter_flag needs to be encoded even when the second filter set is applied to an LCU. According to the syntax of FIG. 9, picture_based_filter_flag does not need to be encoded when the second filter set is applied to an LCU and the syntax information regarding that second filter set needs to be newly encoded. Therefore, compared with the syntax of FIG. 10, the syntax of FIG. 9 can reduce the overhead of the encoded data 17 when there is a high possibility that the second filter set is applied to the LCUs (for example, when the reduction in the residual sum of squares obtained by applying the first filter set is low over the entire picture).
 図9のシンタクスの変形例が図11にも示されている。図11のシンタクスにおいて、lcu_based_filter_flag、lcu_based_filter_num_information、LCUBasedFilterNum、lcu_based_filter_group_information、lcu_based_filter_pred_information、LCUBasedFilterCoeffNum及びlcu_based_filter_coeff[i][j]の役割は図8のものと同一または類似である。図11のシンタクスにおいて、new_filter_flagは図9のものと同一または類似である。 A modification of the syntax of FIG. 9 is also shown in FIG. 11. In the syntax of FIG. 11, the roles of lcu_based_filter_flag, lcu_based_filter_num_information, LCUBasedFilterNum, lcu_based_filter_group_information, lcu_based_filter_pred_information, LCUBasedFilterCoeffNum, and lcu_based_filter_coeff[i][j] are the same as or similar to those in FIG. 8. In the syntax of FIG. 11, new_filter_flag is the same as or similar to that in FIG. 9.
 図11のシンタクスにおいて、図9のシンタクス及び図10のシンタクスにおいて符号化されたpicture_based_filter_flagは符号化されない。しかしながら、図11のシンタクスにおいて、stored_filter_idxのうちいずれか1つの値(例えば、0)が第1のフィルタセットに関するシンタクス情報を参照するために利用される。即ち、図11のシンタクスにおいて、stored_filter_idxはフィルタ切り替え情報14の一部を符号化したものに相当する。例えば、new_filter_flagが0であり、かつ、stored_filter_idxが0であれば、対応するLCUには第1のフィルタセットが適用される。他方、new_filter_flagが0であり、かつ、stored_filter_idxが0でなければ、対応するLCUには既に符号化済みのLCUと同一の第2のフィルタセットが適用される。故に、stored_filter_idxが符号化されることによって、対応するLCUに適用される第2のフィルタセットに関するシンタクス情報の符号化を簡略化することができる。 In the syntax of FIG. 11, the picture_based_filter_flag encoded in the syntax of FIG. 9 and the syntax of FIG. 10 is not encoded. However, in the syntax of FIG. 11, any one value (for example, 0) of stored_filter_idx is used to refer to the syntax information regarding the first filter set. That is, in the syntax of FIG. 11, stored_filter_idx corresponds to a part of the filter switching information 14 encoded. For example, if new_filter_flag is 0 and stored_filter_idx is 0, the first filter set is applied to the corresponding LCU. On the other hand, if new_filter_flag is 0 and stored_filter_idx is not 0, the same second filter set as the already encoded LCU is applied to the corresponding LCU. Therefore, by encoding stored_filter_idx, encoding of syntax information regarding the second filter set applied to the corresponding LCU can be simplified.
 図9、図10及び図11のシンタクスによれば、同一の第2フィルタ係数情報15を重複して符号化する必要がない一方、既に符号化済みの第2フィルタ係数情報15を参照のために保存する必要がある。即ち、動画像符号化装置及び動画像復号装置において、既に符号化済みの第2フィルタ係数情報15を保存する記憶部(例えばバッファ)が必要とされる。 According to the syntaxes of FIGS. 9, 10, and 11, the same second filter coefficient information 15 does not need to be encoded redundantly; on the other hand, already encoded second filter coefficient information 15 needs to be stored for reference. That is, the video encoding device and the video decoding device require a storage unit (for example, a buffer) that stores the already encoded second filter coefficient information 15.
 上記記憶部の容量を節約するために、保存される第2フィルタ係数情報15について制約を課すことができる。例えば、記憶部には、例えば同一ピクチャ内或いは同一LCUライン内で最大N個の第2のフィルタセットに限って第2フィルタ係数情報15が保存されてよい。この場合には、各LCUにおいて最大N個の第2のフィルタセットの中からいずれか1つを参照することができる。 Constraints can be imposed on the second filter coefficient information 15 to be saved in order to save the capacity of the storage unit. For example, the second filter coefficient information 15 may be stored in the storage unit only for a maximum of N second filter sets within the same picture or the same LCU line, for example. In this case, any one of a maximum of N second filter sets can be referred to in each LCU.
 尚、N+1個目以降の第2のフィルタセットは、参照不可としてもよいし、記憶部に保存された第2フィルタ係数情報15の一部を上書きすることによって参照可能としてもよい。即ち、記憶部にN個の第2のフィルタセットの第2フィルタ係数情報15が保存された状態で新たに第2フィルタ係数情報15が保存される場合には、記憶部に保存されたN個のうちいずれか1つの第2のフィルタセットの第2フィルタ係数情報15が上書きされてよい。上書きされる第2フィルタ係数情報15は、最も過去に保存されたものであってもよいし、最も参照頻度の低いものであってもよいし、最も過去に参照されたものであってもよい。いずれにせよ、動画像復号装置が各LCUに適用される第2のフィルタセットの第2フィルタ係数情報15を識別できるような設計が求められる。 Note that the (N+1)th and subsequent second filter sets may be made unreferenceable, or may be made referenceable by overwriting part of the second filter coefficient information 15 stored in the storage unit. That is, when second filter coefficient information 15 is newly stored in a state where the second filter coefficient information 15 of N second filter sets is already stored in the storage unit, the second filter coefficient information 15 of any one of the N second filter sets stored in the storage unit may be overwritten. The second filter coefficient information 15 to be overwritten may be the one stored longest ago, the one referred to least frequently, or the one referred to longest ago. In any case, a design is required that allows the video decoding device to identify the second filter coefficient information 15 of the second filter set applied to each LCU.
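 A bounded store with an overwrite rule of the kind described above might look like the following sketch. The capacity and the "overwrite the least recently referenced entry" policy are only one of the allowed choices, and the encoder and decoder would have to apply the identical rule.

```python
from collections import OrderedDict

class StoredFilterBuffer:
    """Keeps at most `capacity` second filter sets for later reference."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()              # stored index -> coefficients

    def add(self, idx, coeff):
        if idx not in self.entries and len(self.entries) >= self.capacity:
            # Overwrite (drop) the least recently referenced entry.
            self.entries.popitem(last=False)
        self.entries[idx] = coeff
        self.entries.move_to_end(idx)

    def refer(self, idx):
        coeff = self.entries[idx]
        self.entries.move_to_end(idx)             # mark as recently referenced
        return coeff

buf = StoredFilterBuffer(capacity=2)
buf.add(0, [1, 2, 3])
buf.add(1, [4, 5, 6])
buf.refer(0)                                      # 0 becomes most recent
buf.add(2, [7, 8, 9])                             # evicts 1, not 0
print(list(buf.entries))                          # -> [0, 2]
```

 A picture-level (first) filter set that many LCUs are expected to reference could simply be kept outside this store, in line with the exclusion from overwriting discussed next.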
Further, according to the syntax of FIG. 11, the first filter coefficient information 13 is also referred to using stored_filter_idx. Therefore, the same constraints as those on the second filter coefficient information 15 can also be imposed on the first filter coefficient information 13. That is, the storage unit may hold filter coefficient information (i.e., the first filter coefficient information 13 or the second filter coefficient information 15) for at most N filter sets (the first filter set or second filter sets) within a picture or within an LCU line. However, as described above, if the first filter set is set near-optimally for the entire picture, many LCUs refer to the first filter coefficient information 13, so that the overhead of the encoded data 17 can be reduced effectively. Therefore, the first filter coefficient information 13 may be excluded from the targets of overwriting.
Furthermore, the above-described limit number (= N) and the method of selecting the filter coefficient information to be overwritten may each be predetermined between the video encoding apparatus and the video decoding apparatus, or may be specified by signaling the necessary information from the video encoding apparatus to the video decoding apparatus. The signaled information may be encoded in the SPS, PPS, APS, slice header, or LCU data.
Note that each of the syntaxes described above is merely an example, and various changes, deletions, or additions of syntax information are conceivable without departing from the gist of the present embodiment. For example, picture_based_filter_flag, lcu_based_filter_flag, and new_filter_flag may be flags or may be other information. It is also possible to specify a plurality of pieces of syntax information collectively by an index.
It is also possible to prepare a plurality of types of LCU syntax and to select the LCU syntax to be applied, for example, in units of slices. However, a design is required such that the video decoding apparatus can identify the selected LCU syntax.
When an image is composed of a plurality of components, syntax information may be generated for each component, or common syntax information may be generated for two or more components. The syntax structure may also differ from component to component. For example, the syntax structure of one component may be obtained by changing some syntax elements of the syntax structure of another component, by deleting some syntax elements from the syntax structure of another component, or by adding some syntax elements to the syntax structure of another component.
In the above description, the first filter coefficient information 13 is encoded in the APS, and the filter switching information 14 and the second filter coefficient information 15 are encoded in the LCU-level syntax. However, a different syntax structure may be adopted as long as the same or a similar effect as that of the above syntax structure is obtained. For example, the first filter coefficient information 13 may be encoded in the PPS or the slice header instead of the APS. Alternatively, the first filter coefficient information 13 may be encoded in the LCU syntax corresponding to a predetermined LCU (for example, the first LCU in a picture or in a slice).
The video encoding apparatus in FIG. 1 can perform the encoding process as follows, for example.
Specifically, the predicted image generation unit 101, the subtraction unit 102, the transform/quantization unit 103, the inverse quantization/inverse transform unit 105, and the addition unit 106 can operate as follows.
The predicted image generation unit 101 generates the predicted image 11 based on, for example, the loop filter processed image 19. The subtraction unit 102 generates the prediction error image 12 by subtracting the predicted image 11 from the input image 10. The transform/quantization unit 103 performs transform and quantization on the prediction error image 12 to generate quantized transform coefficients. The quantized transform coefficients are encoded by the entropy encoding unit 104. The inverse quantization/inverse transform unit 105 generates a prediction error image by performing inverse quantization and inverse transform on the quantized transform coefficients. The addition unit 106 generates the decoded image 16 by adding this prediction error image to the predicted image 11.
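The data path just described can be sketched as below. This is only an illustration of the sequence of units: the prediction and the transform/quantization of the embodiment are not reproduced, a flat quantizer with step size qstep stands in for the transform/quantization stage, and all names (Block, EncodeBlock, qstep) are assumptions for this sketch.

```cpp
#include <cstddef>
#include <vector>

using Block = std::vector<int>;  // one pixel block, row-major (hypothetical)

// Placeholder stages of the hybrid coding loop described above. The transform /
// inverse transform stage is omitted for brevity; a flat quantizer stands in.
Block Predict(const Block& ref)                 { return ref; }
Block Subtract(const Block& in, const Block& p) { Block e(in.size());
    for (std::size_t i = 0; i < in.size(); ++i) e[i] = in[i] - p[i]; return e; }
Block Quantize(const Block& e, int qstep)       { Block q(e.size());
    for (std::size_t i = 0; i < e.size(); ++i) q[i] = e[i] / qstep; return q; }
Block Dequantize(const Block& q, int qstep)     { Block e(q.size());
    for (std::size_t i = 0; i < q.size(); ++i) e[i] = q[i] * qstep; return e; }
Block Add(const Block& e, const Block& p)       { Block r(e.size());
    for (std::size_t i = 0; i < e.size(); ++i) r[i] = e[i] + p[i]; return r; }

// One block of the encoding loop; the block returned here corresponds to the
// decoded image 16 that is the input to the loop filter stages of this document.
Block EncodeBlock(const Block& input, const Block& loop_filtered_ref, int qstep) {
    Block pred   = Predict(loop_filtered_ref);
    Block error  = Subtract(input, pred);
    Block quant  = Quantize(error, qstep);   // these coefficients are entropy coded
    Block recErr = Dequantize(quant, qstep);
    return Add(recErr, pred);                // reconstructed (decoded) block
}
```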
Such an operation corresponds to so-called hybrid coding, which includes a prediction process and a transform process. However, the video encoding apparatus according to the present embodiment does not necessarily have to perform hybrid coding. For example, when hybrid coding is replaced with DPCM (Differential Pulse Code Modulation), prediction based on neighboring pixels may be performed while the processing that becomes unnecessary is omitted.
The entropy encoding unit 104, the loop filter information generation unit 107, the loop filter processing unit 108, and the first filter setting unit 109 operate, for example, as shown in FIG. 12. The process of FIG. 12 is performed in units of pictures.
The entropy encoding unit 104, the loop filter information generation unit 107, and the loop filter processing unit 108 acquire, from the first filter buffer 110, the first filter coefficient information 13 set for the picture (step S101). The entropy encoding unit 104 encodes the first filter coefficient information 13 acquired in step S101, for example, in accordance with the syntax of FIG. 7 (step S102).
After step S102, encoding processing is performed on an LCU in the picture that has not yet been encoded (step S103). The encoding processing performed in step S103 is, for example, the above-described hybrid coding. That is, the entropy encoding unit 104 encodes the quantized transform coefficients and the encoding parameters of the LCU to be processed.
The filter setting information generation unit 112 of the loop filter information generation unit 107 generates the filter setting information 18 of the LCU to be processed based on the input image 10 and the decoded image 16 (step S104).
The second filter setting unit 113 of the loop filter information generation unit 107 sets the second filter coefficient information 15 for the LCU to be processed based on the filter setting information 18 generated in step S104 (step S105).
The filter switching information generation unit 114 of the loop filter information generation unit 107 generates the filter switching information 14 of the LCU to be processed based on the input image 10, the decoded image 16, the first filter coefficient information 13 acquired in step S101, and the second filter coefficient information 15 generated in step S105 (step S106). For simplicity, in the operation example of FIG. 12, the filter switching information 14 is assumed to be information indicating which of the first filter set and the second filter set is applied to the LCU to be processed.
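The embodiment does not prescribe a particular decision criterion for this switching here. Purely as an illustration, the sketch below assumes that the set yielding the smaller sum of squared differences against the input-image block is chosen; the function names and the criterion itself are assumptions, not the method of the embodiment.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Sum of squared differences between two equally sized pixel blocks.
uint64_t SumSquaredError(const std::vector<int>& a, const std::vector<int>& b) {
    uint64_t sse = 0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        int64_t d = a[i] - b[i];
        sse += static_cast<uint64_t>(d * d);
    }
    return sse;
}

// Returns true when the first (picture-level) filter set should be applied to
// this LCU, false when the LCU-level second filter set should be applied. The
// two candidate blocks are assumed to have been filtered beforehand with the
// first and second filter sets respectively.
bool ChooseFirstFilterSet(const std::vector<int>& input_block,
                          const std::vector<int>& filtered_with_first,
                          const std::vector<int>& filtered_with_second) {
    return SumSquaredError(input_block, filtered_with_first)
        <= SumSquaredError(input_block, filtered_with_second);  // ties favor the cheaper-to-signal first set
}
```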
If the filter switching information 14 generated in step S106 indicates that the first filter set is applied to the LCU to be processed, the process proceeds to step S108 (step S107). On the other hand, if the filter switching information 14 generated in step S106 indicates that the second filter set is applied to the LCU to be processed, the process proceeds to step S110 (step S107).
In step S108, the loop filter processing unit 108 generates the loop filter processed image 19 by applying the first filter set to the LCU to be processed using the first filter coefficient information 13 acquired in step S101. The entropy encoding unit 104 then encodes the filter switching information 14 generated in step S106, for example, in accordance with the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11 (step S109).
If the encoding processing for all the LCUs in the picture has been completed when step S108 and step S109 are completed, the process proceeds to step S113 (step S112). On the other hand, if the encoding processing for all the LCUs in the picture has not been completed when step S108 and step S109 are completed, the process returns to step S103 (step S112).
In step S110, the loop filter processing unit 108 generates the loop filter processed image 19 by applying the second filter set to the LCU to be processed using the second filter coefficient information 15 set in step S105. The entropy encoding unit 104 then encodes the filter switching information 14 generated in step S106 and the second filter coefficient information 15 set in step S105, for example, in accordance with the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11 (step S111).
If the encoding processing for all the LCUs in the picture has been completed when step S110 and step S111 are completed, the process proceeds to step S113 (step S112). On the other hand, if the encoding processing for all the LCUs in the picture has not been completed when step S110 and step S111 are completed, the process returns to step S103 (step S112).
In step S113, the first filter setting unit 109 sets the first filter coefficient information 13 to be used for a subsequent picture based on the filter setting information 18 of each LCU generated through the repetition of step S104. The first filter setting unit 109 stores the first filter coefficient information 13 set in step S113 in the first filter buffer 110, and the process ends (step S114).
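The per-picture control flow of FIG. 12 (steps S101 to S114) can be summarized as the template sketch below. Codec is a hypothetical interface bundling the units described above; its member functions and parameters are assumptions that merely mirror the individual steps.

```cpp
#include <cstddef>

// Control-flow sketch of FIG. 12. None of the member functions are defined
// here; they stand for the operations of the units described in the text.
template <typename Codec>
void EncodePicture(Codec& c, std::size_t num_lcus) {
    auto first_set = c.LoadFirstFilterCoefficients();      // S101 (first filter buffer 110)
    c.EncodeFirstFilterCoefficients(first_set);            // S102 (e.g. APS, FIG. 7 syntax)

    for (std::size_t lcu = 0; lcu < num_lcus; ++lcu) {      // S112: loop until all LCUs are coded
        c.EncodeLcu(lcu);                                   // S103 (hybrid coding of the LCU)
        auto setting   = c.GenerateFilterSettingInfo(lcu);  // S104
        auto second    = c.SetSecondFilterCoefficients(setting);                // S105
        bool use_first = c.GenerateFilterSwitchingInfo(lcu, first_set, second); // S106/S107
        if (use_first) {
            c.ApplyLoopFilter(lcu, first_set);              // S108
            c.EncodeSwitchingInfo(lcu, /*use_first=*/true); // S109
        } else {
            c.ApplyLoopFilter(lcu, second);                 // S110
            c.EncodeSwitchingInfoAndSecondSet(lcu, second); // S111
        }
    }
    c.SetAndStoreFirstFilterCoefficientsForNextPicture();   // S113, S114
}
```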
As described above, the video encoding apparatus according to the first embodiment selectively applies, for each pixel block, the first filter set that is set in units of pictures and the second filter set that is set in units of pixel blocks. Therefore, according to this video encoding apparatus, the overhead of the encoded data can be effectively reduced. Moreover, the first filter coefficient information can be encoded before the encoding of each pixel block starts. Therefore, according to this video encoding apparatus, the encoded data of each pixel block can be output sequentially, and the delay associated with the video encoding process can be reduced.
(Video decoding device)
As illustrated in FIG. 13, the video decoding apparatus according to the first embodiment includes a video decoding unit 2000 and a decoding control unit 207. The video decoding unit 2000 includes an entropy decoding unit 201, an inverse quantization/inverse transform unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, and a loop filter processing unit 206. The decoding control unit 207 controls the operation of each unit of the video decoding unit 2000.
The entropy decoding unit 201 receives the encoded data 20 from outside the video decoding apparatus (for example, from a communication system or a storage system). The encoded data 20 is the same as or similar to the encoded data 17 described above. The entropy decoding unit 201 performs entropy decoding on the encoded data 20 to generate quantized transform coefficients, encoding parameters, first filter coefficient information 22, filter switching information 23, and second filter coefficient information 24. The entropy decoding unit 201 may use, for example, the syntax of FIG. 7 to decode the first filter coefficient information 22. The entropy decoding unit 201 may also use, for example, the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11 to decode the filter switching information 23 and the second filter coefficient information 24.
The first filter coefficient information 22 may be the same as or similar to the first filter coefficient information 13. The filter switching information 23 may be the same as or similar to the filter switching information 14. The second filter coefficient information 24 may be the same as or similar to the second filter coefficient information 15.
The entropy decoding unit 201 outputs the quantized transform coefficients to the inverse quantization/inverse transform unit 202, outputs the encoding parameters to the decoding control unit 207, stores the first filter coefficient information 22 in the first filter buffer 205, and outputs the filter switching information 23 and the second filter coefficient information 24 to the loop filter processing unit 206.
The inverse quantization/inverse transform unit 202 receives the quantized transform coefficients from the entropy decoding unit 201. The inverse quantization/inverse transform unit 202 inversely quantizes the quantized transform coefficients to obtain transform coefficients. The inverse quantization/inverse transform unit 202 further performs inverse transform processing on the transform coefficients to obtain a prediction error image. The inverse quantization/inverse transform unit 202 outputs the prediction error image to the addition unit 204. The inverse quantization/inverse transform unit 202 performs the same or similar processing as the inverse quantization/inverse transform unit 105 described above. That is, the inverse quantization is performed based on the quantization parameter set by the decoding control unit 207. Furthermore, the inverse transform processing is determined by the transform processing performed on the encoding side; for example, the inverse transform processing is an IDCT, an inverse wavelet transform, or the like.
The predicted image generation unit 203 performs prediction processing of the output image, for example, in units of pixel blocks, and generates a predicted image. The predicted image generation unit 203 may perform the prediction processing of the output image based on the loop filter processed image 25 described later. The predicted image generation unit 203 performs the same or similar processing as the predicted image generation unit 101 described above. The predicted image generation unit 203 outputs the predicted image to the addition unit 204.
The addition unit 204 receives the predicted image from the predicted image generation unit 203 and receives the prediction error image from the inverse quantization/inverse transform unit 202. The addition unit 204 adds the prediction error image to the predicted image to generate the decoded image 21. The addition unit 204 outputs the decoded image 21 to the loop filter processing unit 206.
The first filter coefficient information 22 stored in the first filter buffer 205 is read out by the loop filter processing unit 206 as necessary. Note that if the loop filter processing unit 206 has a function of holding the first filter coefficient information 22, the first filter buffer 205 may be omitted. In that case, the entropy decoding unit 201 outputs the first filter coefficient information 22 to the loop filter processing unit 206.
The loop filter processing unit 206 receives the decoded image 21 from the addition unit 204 in units of pixel blocks, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
The loop filter processing unit 206 applies either the first filter set or the second filter set based on the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 206 performs loop filter processing on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block of the loop filter processed image 25. Note that if the filter switching information 23 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 206 omits the loop filter processing for that pixel block. That is, the loop filter processing unit 206 performs the same or similar loop filter processing as the loop filter processing unit 108.
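A minimal sketch of applying a selected coefficient set to one block is shown below. The filter shape (a square (2*radius+1)-tap window), the coefficient precision, and the clamping at the block edge are assumptions made only for this illustration; they are not the filter definition of the embodiment.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// A small block of samples (row-major); out-of-range accesses are clamped to
// the nearest sample purely to keep this sketch self-contained.
struct Plane {
    int width = 0, height = 0;
    std::vector<int16_t> samples;
    int at(int x, int y) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return samples[static_cast<std::size_t>(y) * static_cast<std::size_t>(width) + static_cast<std::size_t>(x)];
    }
};

// Applies a (2*radius+1) x (2*radius+1) window of integer coefficients with a
// normalizing right shift. coeffs.size() is assumed to equal (2*radius+1)^2.
Plane FilterBlock(const Plane& in, const std::vector<int>& coeffs, int radius, int shift) {
    Plane out = in;
    for (int y = 0; y < in.height; ++y) {
        for (int x = 0; x < in.width; ++x) {
            int64_t acc = 0;
            std::size_t k = 0;
            for (int dy = -radius; dy <= radius; ++dy)
                for (int dx = -radius; dx <= radius; ++dx)
                    acc += static_cast<int64_t>(coeffs[k++]) * in.at(x + dx, y + dy);
            int64_t v = (shift > 0) ? ((acc + (1LL << (shift - 1))) >> shift) : acc;
            // Real implementations clip to the bit-depth range; int16 range is used here.
            out.samples[static_cast<std::size_t>(y) * static_cast<std::size_t>(in.width) + static_cast<std::size_t>(x)] =
                static_cast<int16_t>(std::clamp<int64_t>(v, INT16_MIN, INT16_MAX));
        }
    }
    return out;
}
```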
The loop filter processing unit 206 provides the loop filter processed image 25 as an output image to the outside of the video decoding apparatus (for example, to a display system). The loop filter processed image 25 may also be stored in a storage unit (not shown; for example, a buffer) accessible by the predicted image generation unit 203. The loop filter processed image 25 is read out as a reference image by the predicted image generation unit 203 as necessary and used for the prediction processing.
The decoding control unit 207 receives the encoding parameters from the entropy decoding unit 201. Based on the encoding parameters, the decoding control unit 207 performs decoding timing control, coding block partition control, quantization control, mode control, and the like.
The video decoding apparatus in FIG. 13 can perform the decoding process as follows, for example.
When starting to decode a picture, the entropy decoding unit 201 decodes the first filter coefficient information 22 set for that picture, for example, in accordance with the syntax of FIG. 7. The entropy decoding unit 201 then starts decoding the pixel blocks in the picture. Specifically, the entropy decoding unit 201 decodes the quantized transform coefficients and the encoding parameters of a pixel block. Furthermore, the entropy decoding unit 201 decodes the filter switching information 23 of the pixel block in accordance with, for example, the syntax of FIG. 8, FIG. 9, FIG. 10, or FIG. 11. If the filter switching information 23 indicates that the second filter set is applied to the pixel block, the entropy decoding unit 201 further decodes the second filter coefficient information 24.
The quantized transform coefficients and the encoding parameters of the pixel block are processed based on the above-described hybrid coding. That is, the inverse quantization/inverse transform unit 202 generates a prediction error image by performing inverse quantization and inverse transform on the quantized transform coefficients. The predicted image generation unit 203 generates a predicted image based on, for example, the loop filter processed image 25. The addition unit 204 generates the decoded image 21 by adding the prediction error image to the predicted image.
The loop filter processing unit 206 applies either the first filter set or the second filter set based on the filter switching information 23 corresponding to a pixel block in the decoded image 21. The loop filter processing unit 206 performs loop filter processing on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block of the loop filter processed image 25.
As described above, the video decoding apparatus according to the first embodiment generates an output image by decoding the encoded data from the video encoding apparatus according to the first embodiment. Therefore, according to this video decoding apparatus, the overhead of the encoded data can be effectively reduced.
(Second Embodiment)
(Video encoding device)
As illustrated in FIG. 14, the video encoding apparatus according to the second embodiment includes a video encoding unit 3000 and an encoding control unit 111. The video encoding unit 3000 includes a predicted image generation unit 101, a subtraction unit 102, a transform/quantization unit 103, an entropy encoding unit 104, an inverse quantization/inverse transform unit 105, an addition unit 106, a loop filter information generation unit 307, a loop filter processing unit 308, a first filter setting unit 109, a first filter buffer 110, a deblocking filter processing unit 315, and an SAO (Sample Adaptive Offset) processing unit 316. The encoding control unit 111 controls the operation of each unit of the video encoding unit 3000.
The entropy encoding unit 104 of FIG. 14 differs from the entropy encoding unit 104 of FIG. 1 in that the filter switching information 14 and the second filter coefficient information 15 are input from the loop filter information generation unit 307 instead of the loop filter information generation unit 107.
The addition unit 106 of FIG. 14 differs from the addition unit 106 of FIG. 1 in that it outputs the decoded image 16 to the deblocking filter processing unit 315 instead of to the loop filter information generation unit 107 and the loop filter processing unit 108.
The first filter setting unit 109 of FIG. 14 differs from the first filter setting unit 109 of FIG. 1 in that the filter setting information 18 is input from the loop filter information generation unit 307 instead of the loop filter information generation unit 107.
The loop filter information generation unit 307 acquires the input image 10 in units of pixel blocks from outside the video encoding apparatus, receives the first filter coefficient information 13 from the first filter buffer 110, and receives an SAO processed image, described later, from the SAO processing unit 316 in units of pixel blocks.
The loop filter information generation unit 307 generates the filter setting information 18 corresponding to the pixel block to be processed based on the SAO processed image and the input image 10. The loop filter information generation unit 307 also generates the second filter coefficient information 15 corresponding to the pixel block to be processed based on the filter setting information 18. Furthermore, the loop filter information generation unit 307 generates the filter switching information 14 corresponding to the pixel block to be processed based on the SAO processed image, the input image 10, the first filter coefficient information 13, and the second filter coefficient information 15.
The loop filter information generation unit 307 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104 and the loop filter processing unit 308. The loop filter information generation unit 307 outputs the filter setting information 18 to the first filter setting unit 109.
The loop filter processing unit 308 receives the SAO processed image from the SAO processing unit 316 in units of pixel blocks, receives the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 307, and receives the first filter coefficient information 13 from the first filter buffer 110.
The loop filter processing unit 308 applies either the first filter set or the second filter set based on the filter switching information 14 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 308 performs loop filter processing on the pixel block to be processed using the first filter coefficient information 13 or the second filter coefficient information 15, thereby generating a pixel block of the loop filter processed image 19. Note that if the filter switching information 14 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 308 omits the loop filter processing for that pixel block.
The loop filter processed image 19 may be stored in a storage unit (not shown; for example, a buffer) accessible by the predicted image generation unit 101. The loop filter processed image 19 is read out as a reference image by the predicted image generation unit 101 as necessary and used for the prediction processing.
The deblocking filter processing unit 315 receives the decoded image 16 from the addition unit 106. The deblocking filter processing unit 315 performs deblocking filter processing on the decoded image 16 to generate a deblocking filtered image. The deblocking filter processing unit 315 outputs the deblocking filtered image to the SAO processing unit 316.
Here, the deblocking filter processing performed by the deblocking filter processing unit 315 may be, for example, processing that applies a smoothing filter to block boundaries in the decoded image 16. This deblocking filter processing generally provides an image quality improvement effect, such as suppressing block distortion contained in the decoded image 16.
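As a minimal illustration of smoothing across a block boundary, the sketch below low-pass filters the two samples on each side of a vertical boundary of one pixel row. It is only an illustration of the idea stated above and is not the deblocking filter of the embodiment or of any particular standard; the kernel and the function name are assumptions.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative smoothing across a vertical block boundary located between
// row[boundary - 1] and row[boundary]: each of the two boundary samples is
// replaced by a short low-pass average of its neighbours.
void SmoothAcrossVerticalBoundary(std::vector<int16_t>& row, std::size_t boundary) {
    if (boundary < 2 || boundary + 1 >= row.size()) return;  // need two samples per side
    int p1 = row[boundary - 2], p0 = row[boundary - 1];
    int q0 = row[boundary],     q1 = row[boundary + 1];
    row[boundary - 1] = static_cast<int16_t>((p1 + 2 * p0 + q0 + 2) >> 2);
    row[boundary]     = static_cast<int16_t>((p0 + 2 * q0 + q1 + 2) >> 2);
}
```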
The SAO processing unit 316 receives the deblocking filtered image from the deblocking filter processing unit 315. The SAO processing unit 316 performs SAO processing on the deblocking filtered image to generate an SAO processed image. The SAO processing unit 316 outputs the SAO processed image to the loop filter information generation unit 307 and the loop filter processing unit 308.
Here, the SAO processing performed by the SAO processing unit 316 may be processing that, for each pixel in the deblocking filtered image, sets an offset value based on a comparison of the pixel value of that pixel with the pixel values of its surrounding pixels, and adds the offset value to the pixel value of that pixel.
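A sketch of one such comparison-based offset is given below for a single pixel row, assuming a horizontal neighbour pair and a five-entry offset table; the category definitions and the offset table are assumptions for illustration only, not the classification used by the embodiment.

```cpp
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

// Classify a pixel by comparing it with its left and right neighbours:
// 0 = local minimum, 1 = lower edge, 2 = flat, 3 = upper edge, 4 = local maximum.
int EdgeCategory(int left, int cur, int right) {
    int sign_l = (cur < left)  ? -1 : (cur > left)  ? 1 : 0;
    int sign_r = (cur < right) ? -1 : (cur > right) ? 1 : 0;
    return sign_l + sign_r + 2;  // maps -2..+2 to 0..4
}

// Add the per-category offset to each interior pixel of the row; the
// classification is done on the unmodified samples.
void ApplyEdgeOffsets(std::vector<int16_t>& row, const std::array<int, 5>& offsets) {
    const std::vector<int16_t> src = row;
    for (std::size_t x = 1; x + 1 < src.size(); ++x) {
        int cat = EdgeCategory(src[x - 1], src[x], src[x + 1]);
        row[x] = static_cast<int16_t>(src[x] + offsets[static_cast<std::size_t>(cat)]);
    }
}
```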
As described above, the video encoding apparatus according to the second embodiment performs the same or similar loop filter processing as the video encoding apparatus according to the first embodiment described above, but on an image obtained by applying deblocking filter processing and SAO processing to the decoded image, instead of on the decoded image itself. Therefore, according to this video encoding apparatus, the image quality of the predicted image can be improved and the coding efficiency can be improved.
(Video decoding device)
As illustrated in FIG. 15, the video decoding apparatus according to the second embodiment includes a video decoding unit 4000 and a decoding control unit 207. The video decoding unit 4000 includes an entropy decoding unit 201, an inverse quantization/inverse transform unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, a loop filter processing unit 406, a deblocking filter processing unit 408, and an SAO processing unit 409. The decoding control unit 207 controls the operation of each unit of the video decoding unit 4000.
The entropy decoding unit 201 of FIG. 15 differs from the entropy decoding unit 201 of FIG. 13 in that it outputs the filter switching information 23 and the second filter coefficient information 24 to the loop filter processing unit 406 instead of to the loop filter processing unit 206. The addition unit 204 of FIG. 15 differs from the addition unit 204 of FIG. 13 in that it outputs the decoded image 21 to the deblocking filter processing unit 408 instead of to the loop filter processing unit 206.
The loop filter processing unit 406 receives the SAO processed image from the SAO processing unit 409 in units of pixel blocks, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
The loop filter processing unit 406 applies either the first filter set or the second filter set based on the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 406 performs loop filter processing on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block of the loop filter processed image 25. Note that if the filter switching information 23 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 406 omits the loop filter processing for that pixel block.
The loop filter processing unit 406 provides the loop filter processed image 25 as an output image to the outside of the video decoding apparatus (for example, to a display system). The loop filter processed image 25 may also be stored in a storage unit (not shown; for example, a buffer) accessible by the predicted image generation unit 203. The loop filter processed image 25 is read out as a reference image by the predicted image generation unit 203 as necessary and used for the prediction processing.
The deblocking filter processing unit 408 receives the decoded image 21 from the addition unit 204. The deblocking filter processing unit 408 performs deblocking filter processing on the decoded image 21 to generate a deblocking filtered image. The deblocking filter processing unit 408 outputs the deblocking filtered image to the SAO processing unit 409. Note that the deblocking filter processing unit 408 performs the same or similar deblocking filter processing as the deblocking filter processing unit 315.
The SAO processing unit 409 receives the deblocking filtered image from the deblocking filter processing unit 408. The SAO processing unit 409 performs SAO processing on the deblocking filtered image to generate an SAO processed image. The SAO processing unit 409 outputs the SAO processed image to the loop filter processing unit 406. Note that the SAO processing unit 409 performs the same or similar SAO processing as the SAO processing unit 316.
As described above, the video decoding apparatus according to the second embodiment performs the same or similar loop filter processing as the video decoding apparatus according to the first embodiment described above, but on an image obtained by applying deblocking filter processing and SAO processing to the decoded image, instead of on the decoded image itself. Therefore, according to this video decoding apparatus, the image quality of the predicted image can be improved and the coding efficiency can be improved.
(Third Embodiment)
(Video encoding device)
As illustrated in FIG. 16, the video encoding apparatus according to the third embodiment includes a video encoding unit 5000 and an encoding control unit 111. The video encoding unit 5000 includes a predicted image generation unit 101, a subtraction unit 102, a transform/quantization unit 103, an entropy encoding unit 104, an inverse quantization/inverse transform unit 105, an addition unit 106, a loop filter information generation unit 507, a loop filter processing unit 508, a first filter setting unit 109, a first filter buffer 110, a deblocking filter processing unit 315, and an SAO processing unit 316. The encoding control unit 111 controls the operation of each unit of the video encoding unit 5000.
The entropy encoding unit 104 of FIG. 16 differs from the entropy encoding unit 104 of FIG. 1 in that the filter switching information 14 and the second filter coefficient information 15 are input from the loop filter information generation unit 507 instead of the loop filter information generation unit 107.
The addition unit 106 of FIG. 16 differs from the addition unit 106 of FIG. 1 in that it outputs the decoded image 16 to the deblocking filter processing unit 315, the loop filter information generation unit 507, and the loop filter processing unit 508 instead of to the loop filter information generation unit 107 and the loop filter processing unit 108.
The first filter setting unit 109 of FIG. 16 differs from the first filter setting unit 109 of FIG. 1 in that the filter setting information 18 is input from the loop filter information generation unit 507 instead of the loop filter information generation unit 107.
The loop filter information generation unit 507 acquires the input image 10 in units of pixel blocks from outside the video encoding apparatus, receives the decoded image 16 from the addition unit 106 in units of pixel blocks, receives the first filter coefficient information 13 from the first filter buffer 110, and receives the SAO processed image from the SAO processing unit 316 in units of pixel blocks.
The loop filter information generation unit 507 generates the filter setting information 18 corresponding to the pixel block to be processed based on the SAO processed image, the input image 10, and the decoded image 16. The loop filter information generation unit 507 also generates the second filter coefficient information 15 corresponding to the pixel block to be processed based on the filter setting information 18. Furthermore, the loop filter information generation unit 507 generates the filter switching information 14 corresponding to the pixel block to be processed based on the SAO processed image, the input image 10, the first filter coefficient information 13, the second filter coefficient information 15, and the decoded image 16.
The loop filter information generation unit 507 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104 and the loop filter processing unit 508. The loop filter information generation unit 507 outputs the filter setting information 18 to the first filter setting unit 109.
The loop filter processing unit 508 receives the decoded image 16 from the addition unit 106 in units of pixel blocks, receives the SAO processed image from the SAO processing unit 316 in units of pixel blocks, receives the filter switching information 14 and the second filter coefficient information 15 from the loop filter information generation unit 507, and receives the first filter coefficient information 13 from the first filter buffer 110.
The loop filter processing unit 508 applies either the first filter set or the second filter set based on the filter switching information 14 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 508 performs loop filter processing on the pixel block to be processed using the first filter coefficient information 13 or the second filter coefficient information 15, thereby generating a pixel block of the loop filter processed image 19. Note that, as described above, if the filter switching information 14 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 508 omits the loop filter processing for that pixel block.
The loop filter processed image 19 may be stored in a storage unit (not shown; for example, a buffer) accessible by the predicted image generation unit 101. The loop filter processed image 19 is read out as a reference image by the predicted image generation unit 101 as necessary and used for the prediction processing.
Here, the loop filter processing unit 508 performs the loop filter processing on both the decoded image 16 and the SAO processed image. For example, the loop filter processing unit 508 can perform the loop filter processing on an image obtained by taking a weighted average of each pixel value of the decoded image 16 and the corresponding pixel value of the SAO processed image using appropriate weights.
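A minimal sketch of such a weighted combination is shown below. The integer weights w0 and w1 and the normalizing shift are assumptions made only for this illustration; how the "appropriate weights" are chosen is not specified here.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Combine the image before deblocking/SAO (decoded image 16) with the SAO
// processed image by a weighted average; the result would then be the input
// to the loop filter of this embodiment.
std::vector<int16_t> WeightedCombine(const std::vector<int16_t>& decoded,
                                     const std::vector<int16_t>& sao_processed,
                                     int w0, int w1, int shift) {
    std::vector<int16_t> out(decoded.size());
    const int round = (shift > 0) ? (1 << (shift - 1)) : 0;
    for (std::size_t i = 0; i < decoded.size(); ++i) {
        int v = (w0 * decoded[i] + w1 * sao_processed[i] + round) >> shift;
        out[i] = static_cast<int16_t>(v);
    }
    return out;
}
```

For equal weighting, w0 = w1 = 1 and shift = 1 simply average the two images; other weightings shift the balance between the pre-filter and post-filter samples.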
Block distortion caused by encoding can generally be suppressed by deblocking filter processing. However, in the deblocking filter processing, a strong low-pass filter may be applied, and high-frequency components contained in the input image 10 (for example, edge components and texture components) may be degraded. By performing the loop filter processing on the images before and after the deblocking filter processing (that is, the decoded image 16 and the SAO processed image), partial image quality degradation caused by the deblocking filter processing can be compensated for. That is, an image quality improvement effect is obtained.
The images to be subjected to the loop filter processing may also be switched in units of, for example, sequences, pictures, or slices. For example, the loop filter processing may be performed only on the SAO processed image in one slice, and on the decoded image 16 and the SAO processed image in another slice. In order to allow the video decoding apparatus to identify this switching of the images subjected to the loop filter processing, information indicating the switching may be encoded in units of, for example, sequences, pictures, or slices.
Note that the above-described partial image quality degradation is not limited to the deblocking filter processing and may be caused by various filter processes. Therefore, the loop filter processing may be performed on the images before and after any filter process. Furthermore, when a plurality of filter processes are performed on the decoded image 16 before the loop filter processing, the loop filter processing may be performed on the images before and after any one or more of those filter processes.
As described above, the video encoding apparatus according to the third embodiment performs the same or similar loop filter processing as the video encoding apparatus according to the first embodiment described above, but on the images before and after filter processing (for example, the deblocking filter processing and the SAO processing). Therefore, according to this video encoding apparatus, the image quality of the predicted image can be improved and the coding efficiency can be improved.
(Video decoding device)
As illustrated in FIG. 17, the video decoding apparatus according to the third embodiment includes a video decoding unit 6000 and a decoding control unit 207. The video decoding unit 6000 includes an entropy decoding unit 201, an inverse quantization/inverse transform unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, a loop filter processing unit 606, a deblocking filter processing unit 408, and an SAO processing unit 409. The decoding control unit 207 controls the operation of each unit of the video decoding unit 6000.
The entropy decoding unit 201 of FIG. 17 differs from the entropy decoding unit 201 of FIG. 13 in that it outputs the filter switching information 23 and the second filter coefficient information 24 to the loop filter processing unit 606 instead of to the loop filter processing unit 206. The addition unit 204 of FIG. 17 differs from the addition unit 204 of FIG. 13 in that it outputs the decoded image 21 to the deblocking filter processing unit 408 instead of to the loop filter processing unit 206.
The loop filter processing unit 606 receives the decoded image 21 from the addition unit 204 in units of pixel blocks, receives the SAO processed image from the SAO processing unit 409 in units of pixel blocks, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
The loop filter processing unit 606 applies either the first filter set or the second filter set based on the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the loop filter processing unit 606 performs loop filter processing on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24, thereby generating a pixel block of the loop filter processed image 25. That is, the loop filter processing unit 606 performs the same or similar loop filter processing as the loop filter processing unit 508. Note that if the filter switching information 23 indicates that the loop filter processing is not applied to the pixel block to be processed, the loop filter processing unit 606 omits the loop filter processing for that pixel block.
The loop filter processing unit 606 provides the loop filter processed image 25 as an output image to the outside of the video decoding apparatus (for example, to a display system). The loop filter processed image 25 may also be stored in a storage unit (not shown; for example, a buffer) accessible by the predicted image generation unit 203. The loop filter processed image 25 is read out as a reference image by the predicted image generation unit 203 as necessary and used for the prediction processing.
As described above, the video decoding apparatus according to the third embodiment performs the same or similar loop filter processing as the video decoding apparatus according to the first embodiment described above, but on the images before and after filter processing (for example, the deblocking filter processing and the SAO processing). Therefore, according to this video decoding apparatus, the image quality of the predicted image can be improved and the coding efficiency can be improved.
 (第4の実施形態) 
 前述の第1、第2または第3の実施形態に係る動画像復号装置において、ループフィルタ処理画像25が出力画像として動画像復号装置の外部(例えば表示系など)に与えられている。しかしながら、ループフィルタ処理画像25の代わりに復号画像21が出力画像として動画像復号装置の外部に与えられてもよい。
(Fourth embodiment)
In the video decoding device according to the first, second, or third embodiment described above, the loop filter processed image 25 is provided as an output image outside the video decoding device (for example, a display system). However, the decoded image 21 may be given to the outside of the moving image decoding apparatus as an output image instead of the loop filter processed image 25.
 (動画像復号装置) 
 図18に例示されるように、第4の実施形態に係る動画像復号装置は、動画像復号部7000と、復号制御部207とを備える。動画像復号部7000は、エントロピー復号部201と、逆量子化/逆変換部202と、予測画像生成部203と、加算部204と、第1フィルタバッファ205と、ループフィルタ処理部206とを備える。復号制御部207は、動画像復号部7000の各部の動作を制御する。
(Video decoding device)
As illustrated in FIG. 18, the video decoding device according to the fourth embodiment includes a video decoding unit 7000 and a decoding control unit 207. The video decoding unit 7000 includes an entropy decoding unit 201, an inverse quantization / inverse transformation unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, and a loop filter processing unit 206. . The decoding control unit 207 controls the operation of each unit of the moving image decoding unit 7000.
 The addition unit 204 in FIG. 18 outputs the decoded image 21 to the loop filter processing unit 206, in the same manner as the addition unit 204 in FIG. 13. In addition, the addition unit 204 in FIG. 18 supplies the decoded image 21 as an output image to the outside of the video decoding device. The loop filter processing unit 206 in FIG. 18 differs from the loop filter processing unit 206 in FIG. 13 in that it does not supply the loop-filter-processed image 25 as an output image to the outside of the video decoding device.
 As described above, the video decoding device according to the fourth embodiment supplies the decoded image before loop filter processing to the outside of the video decoding device as the output image. Therefore, this video decoding device can deliver the output image to the outside with low delay.
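 To make the data flow of the fourth embodiment concrete, the following C++ sketch shows a per-block decoding step in which each reconstructed block is emitted for display as soon as the adder produces it, while the loop filter still runs afterward to build the reference picture. The names (DecodedBlock types, runLoopFilter, displaySink, ReferenceStore) are hypothetical and only illustrate the ordering of the two outputs; they are not taken from the embodiment itself.

```cpp
#include <functional>
#include <vector>

// Hypothetical types standing in for the units in FIG. 18.
struct PixelBlock { std::vector<int> samples; };

struct ReferenceStore {            // buffer read by the predicted image generation unit 203
    std::vector<PixelBlock> blocks;
    void store(const PixelBlock& b) { blocks.push_back(b); }
};

// Per-block flow of the fourth embodiment: output first (low delay), then filter for reference.
void decodeBlockFourthEmbodiment(const PixelBlock& reconstructed,          // from addition unit 204
                                 const std::function<PixelBlock(const PixelBlock&)>& runLoopFilter,
                                 const std::function<void(const PixelBlock&)>& displaySink,
                                 ReferenceStore& refStore)
{
    // The decoded image 21 goes straight to the display system: no filtering delay.
    displaySink(reconstructed);

    // The loop filter processing unit 206 still produces the image 25,
    // but only as a reference for subsequent prediction.
    PixelBlock filtered = runLoopFilter(reconstructed);
    refStore.store(filtered);
}
```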
 (Fifth embodiment)
 The loop filter processing described in the first, second, and third embodiments above can also be replaced by post filter processing. That is, the video encoding device according to the fifth embodiment generates filter information (for example, the first filter coefficient information 13, the filter switching information 14, and the second filter coefficient information 15) for post filter processing. The video decoding device according to the fifth embodiment then performs post filter processing based on this filter information.
 (Video encoding device)
 As illustrated in FIG. 19, the video encoding device according to the fifth embodiment includes a video encoding unit 8000 and an encoding control unit 111. The video encoding unit 8000 includes a predicted image generation unit 101, a subtraction unit 102, a transform/quantization unit 103, an entropy encoding unit 104, an inverse quantization/inverse transform unit 105, an addition unit 106, a post filter information generation unit 807, a first filter setting unit 109, and a first filter buffer 110. The encoding control unit 111 controls the operation of each unit of the video encoding unit 8000.
 The predicted image generation unit 101 in FIG. 19 differs from the predicted image generation unit 101 in FIG. 1 in that it performs prediction processing based on the decoded image 16 instead of the loop-filter-processed image 19. The entropy encoding unit 104 in FIG. 19 differs from the entropy encoding unit 104 in FIG. 1 in that it receives the filter switching information 14 and the second filter coefficient information 15 from the post filter information generation unit 807 rather than from the loop filter information generation unit 107.
 The addition unit 106 in FIG. 19 differs from the addition unit 106 in FIG. 1 in that it outputs the decoded image 16 to the post filter information generation unit 807 instead of to the loop filter information generation unit 107 and the loop filter processing unit 108. The decoded image 16 may also be stored in a storage unit (not shown), such as a buffer, that is accessible to the predicted image generation unit 101. The decoded image 16 is read out as a reference image by the predicted image generation unit 101 as needed and used for prediction processing.
 The first filter setting unit 109 in FIG. 19 differs from the first filter setting unit 109 in FIG. 1 in that it receives the filter setting information 18 from the post filter information generation unit 807 rather than from the loop filter information generation unit 107.
 The post filter information generation unit 807 acquires the input image 10 in units of pixel blocks from outside the video encoding device, receives the decoded image 16 in units of pixel blocks from the addition unit 106, and receives the first filter coefficient information 13 from the first filter buffer 110. The post filter information generation unit 807 performs, for example, processing that is the same as or similar to that of the loop filter information generation unit 107. The post filter information generation unit 807 outputs the filter switching information 14 and the second filter coefficient information 15 to the entropy encoding unit 104, and outputs the filter setting information 18 to the first filter setting unit 109.
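 The following C++ sketch outlines, under assumed types and names, how a filter-information generation step of this kind could decide per block between reusing the first filter set and deriving a new second filter set, for instance by comparing a simple distortion cost against the input image. The cost measure (SSD), the offset-only candidate filter, and the structure names are illustrative assumptions, not the patented procedure; a real encoder would also weigh the rate cost of signaling new coefficients, which this sketch ignores.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical per-block data; names are not taken from the embodiment.
struct Block { std::vector<int16_t> samples; };
struct Coefficients { int16_t offset = 0; };          // offset-only filter, as one simple case

enum class Choice { kFirstSet, kSecondSet, kOff };

struct BlockFilterDecision {                           // feeds filter switching information 14
    Choice choice = Choice::kOff;                      // and second filter coefficient information 15
    Coefficients secondSet;
};

// Sum of squared differences between the input block and the decoded block after adding an offset.
static long long ssd(const Block& a, const Block& b, int16_t offset)
{
    long long sum = 0;
    for (std::size_t i = 0; i < a.samples.size(); ++i) {
        const long long d = static_cast<long long>(a.samples[i]) - (b.samples[i] + offset);
        sum += d * d;
    }
    return sum;
}

// One plausible decision rule: derive a block-specific offset (second set) and keep it
// only if it beats both "no filtering" and the already-available first filter set.
BlockFilterDecision decideBlockFilter(const Block& input,        // input image 10
                                      const Block& decoded,      // decoded image 16
                                      const Coefficients& firstSet)
{
    BlockFilterDecision d;
    if (input.samples.empty() || input.samples.size() != decoded.samples.size())
        return d;                                    // nothing to decide for an empty/mismatched block

    // Candidate second-set offset: mean difference between input and decoded samples.
    long long diff = 0;
    for (std::size_t i = 0; i < input.samples.size(); ++i)
        diff += input.samples[i] - decoded.samples[i];
    const int16_t newOffset =
        static_cast<int16_t>(diff / static_cast<long long>(input.samples.size()));

    const long long costOff    = ssd(input, decoded, 0);
    const long long costFirst  = ssd(input, decoded, firstSet.offset);
    const long long costSecond = ssd(input, decoded, newOffset);

    if (costSecond < costFirst && costSecond < costOff) {
        d.choice = Choice::kSecondSet;
        d.secondSet.offset = newOffset;
    } else if (costFirst < costOff) {
        d.choice = Choice::kFirstSet;
    }                                                // otherwise leave kOff
    return d;
}
```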
 As described above, the video encoding device according to the fifth embodiment generates the filter information used for loop filter processing in the first, second, and third embodiments as filter information for post filter processing. Therefore, with this video encoding device, effects that are the same as or similar to those of the first, second, and third embodiments can be obtained in a video encoding device and a video decoding device that use post filter processing instead of loop filter processing.
 (Video decoding device)
 As illustrated in FIG. 20, the video decoding device according to the fifth embodiment includes a video decoding unit 9000 and a decoding control unit 207. The video decoding unit 9000 includes an entropy decoding unit 201, an inverse quantization/inverse transform unit 202, a predicted image generation unit 203, an addition unit 204, a first filter buffer 205, and a post filter processing unit 906. The decoding control unit 207 controls the operation of each unit of the video decoding unit 9000.
 The entropy decoding unit 201 in FIG. 20 differs from the entropy decoding unit 201 in FIG. 13 in that it outputs the filter switching information 23 and the second filter coefficient information 24 to the post filter processing unit 906 rather than to the loop filter processing unit 206. The predicted image generation unit 203 in FIG. 20 differs from the predicted image generation unit 203 in FIG. 13 in that it performs prediction processing based on the decoded image 21 instead of the loop-filter-processed image 25.
 The addition unit 204 in FIG. 20 differs from the addition unit 204 in FIG. 13 in that it outputs the decoded image 21 to the post filter processing unit 906 instead of to the loop filter processing unit 206. The decoded image 21 may also be stored in a storage unit (not shown), such as a buffer, that is accessible to the predicted image generation unit 203. The decoded image 21 is read out as a reference image by the predicted image generation unit 203 as needed and used for prediction processing.
 The post filter processing unit 906 receives the decoded image 21 in units of pixel blocks from the addition unit 204, receives the filter switching information 23 and the second filter coefficient information 24 from the entropy decoding unit 201, and receives the first filter coefficient information 22 from the first filter buffer 205.
 The post filter processing unit 906 applies either the first filter set or the second filter set on the basis of the filter switching information 23 corresponding to the pixel block to be processed. Specifically, the post filter processing unit 906 generates a pixel block of the post-filter-processed image 35 by performing post filter processing on the pixel block to be processed using the first filter coefficient information 22 or the second filter coefficient information 24. If the filter switching information 23 indicates that post filter processing is not applied to the pixel block to be processed, the post filter processing unit 906 omits the post filter processing for that pixel block. The post filter processing unit 906 supplies the post-filter-processed image 35 as an output image to the outside of the video decoding device (for example, a display system).
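 A minimal C++ sketch of this decoder-side step is given below, assuming the same hypothetical block and coefficient types as the earlier examples. The key difference from the in-loop case is captured in the comments: the filtered block feeds only the output image, while prediction continues to use the unfiltered decoded image.

```cpp
#include <cstdint>
#include <vector>

struct PixelBlock { std::vector<int16_t> samples; };
struct FilterCoefficients { std::vector<int16_t> values; };

enum class FilterSwitch { kOff, kFirstFilterSet, kSecondFilterSet };

// Stand-in filter: add the first coefficient as an offset to every sample.
static PixelBlock applyFilter(const PixelBlock& in, const FilterCoefficients& coeff)
{
    PixelBlock out = in;
    const int16_t offset = coeff.values.empty() ? 0 : coeff.values[0];
    for (int16_t& s : out.samples) s = static_cast<int16_t>(s + offset);
    return out;
}

// Post filter processing for one block (post filter processing unit 906).
// The result goes to the display output only; the reference image used for
// prediction remains the unfiltered decoded block.
PixelBlock postFilterBlock(const PixelBlock& decodedBlock,          // decoded image 21
                           FilterSwitch sw,                         // filter switching information 23
                           const FilterCoefficients& firstSet,      // first filter coefficient information 22
                           const FilterCoefficients& secondSet)     // second filter coefficient information 24
{
    if (sw == FilterSwitch::kFirstFilterSet)  return applyFilter(decodedBlock, firstSet);
    if (sw == FilterSwitch::kSecondFilterSet) return applyFilter(decodedBlock, secondSet);
    return decodedBlock;                       // post filtering is omitted for this block
}
```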
 As described above, the video decoding device according to the fifth embodiment performs post filter processing that is the same as or similar to the loop filter processing in the first, second, and third embodiments. Therefore, with this video decoding device, effects that are the same as or similar to those of the first, second, and third embodiments can be obtained in a video encoding device and a video decoding device that use post filter processing instead of loop filter processing.
 The various processes described in each of the embodiments above can also be realized by a program (that is, software) in which the corresponding instructions are described. Specifically, a computer (or an embedded system) may function as the video encoding device and the video decoding device according to each embodiment by installing and executing this program. The instructions corresponding to the various processes described in each embodiment are written as a program that the computer can execute.
 The program is recorded on a magnetic disk (for example, a flexible disk (registered trademark) or a hard disk), an optical disc (CD-ROM, CD-R, CD-RW, DVD-ROM, DVD±R, DVD±RW, or the like), a semiconductor memory, or a similar recording medium. The recording medium for the program is not limited to these examples and may be any computer-readable medium. The program need not be stored on a single recording medium and may be stored across a plurality of recording media. The computer may also download the program through a network (for example, a LAN (Local Area Network) or the Internet). The recording medium on which the program is stored may be independent of the computer, or may be a medium for storing (including temporarily storing) a program downloaded through the network.
 An OS (operating system), database management software, network MW (middleware), or other MW running on the computer may execute part of the processing of the video encoding device and the video decoding device according to each embodiment on the basis of the instructions described in the program.
 In the above description, the computer or embedded system executes each process in the embodiments on the basis of the program stored on the recording medium. Accordingly, the computer or embedded system may typically be a single personal computer or microcomputer, or may be a system formed by connecting a plurality of personal computers or microcomputers over a network.
 The computer is not limited to a personal computer and may be an arithmetic processing unit or a microcomputer included in an information processing device. In short, "computer" is a generic term for any device or apparatus that can realize the functions of the video encoding device and the video decoding device according to each embodiment by means of a program.
 Although several embodiments of the present invention have been described, these embodiments are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and their modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and their equivalents.
 10 ... Input image
 11 ... Predicted image
 12 ... Prediction error image
 13, 22 ... First filter coefficient information
 14, 23 ... Filter switching information
 15, 24 ... Second filter coefficient information
 16, 21 ... Decoded image
 17, 20 ... Encoded data
 18 ... Filter setting information
 19, 25 ... Loop-filter-processed image
 35 ... Post-filter-processed image
 101, 203 ... Predicted image generation unit
 102 ... Subtraction unit
 103 ... Transform/quantization unit
 104 ... Entropy encoding unit
 105, 202 ... Inverse quantization/inverse transform unit
 106, 204 ... Addition unit
 107, 307, 507 ... Loop filter information generation unit
 108, 206, 308, 406, 508, 606 ... Loop filter processing unit
 109 ... First filter setting unit
 110, 205 ... First filter buffer
 111 ... Encoding control unit
 112 ... Filter setting information generation unit
 113 ... Second filter setting unit
 114 ... Filter switching information generation unit
 115 ... Switch
 116 ... Filter application unit
 201 ... Entropy decoding unit
 207 ... Decoding control unit
 315, 408 ... Deblocking filter processing unit
 316, 409 ... SAO processing unit
 807 ... Post filter information generation unit
 906 ... Post filter processing unit
 1000, 3000, 5000, 8000 ... Video encoding unit
 2000, 4000, 6000, 7000, 9000 ... Video decoding unit

Claims (14)

  1. A video encoding method comprising:
     encoding first filter coefficient information indicating a filter coefficient of each of one or more filters included in a first filter set that is set for a decoded image;
     encoding, after the first filter coefficient information is encoded, filter switching information indicating which of the first filter set and a second filter set that is set for a pixel block to be processed in the decoded image is applied when loop filter processing is applied to the pixel block to be processed;
     encoding, after the first filter coefficient information is encoded, second filter coefficient information indicating a filter coefficient of each of one or more filters included in the second filter set when the second filter set is applied to the pixel block to be processed; and
     generating a pixel block in a reference image by applying either the first filter set or the second filter set to the pixel block to be processed based on the filter switching information.
  2. The video encoding method according to claim 1, wherein the filter switching information indicates whether loop filter processing is applied to the pixel block to be processed, and, when loop filter processing is applied to the pixel block to be processed, further indicates which of the first filter set and the second filter set is applied.
  3. The video encoding method according to claim 1, wherein the first filter coefficient information is encoded in a higher-level syntax than the second filter coefficient information.
  4. The video encoding method according to claim 1, wherein the filter coefficient consists only of an offset value.
  5. The video encoding method according to claim 1, wherein,
     when the first filter set is applied to the pixel block to be processed, identification information having a first value that refers to the first filter coefficient information is encoded as at least part of the switching information, and
     when the second filter set is applied to the pixel block to be processed and the same second filter set as that of an already encoded pixel block is applied, the identification information having a second value that refers to the second filter coefficient information corresponding to the already encoded pixel block is encoded as the second filter coefficient information corresponding to the pixel block to be processed.
  6. The video encoding method according to claim 5, further comprising:
     storing the encoded first filter coefficient information in a storage unit; and
     storing the encoded second filter coefficient information in the storage unit,
     wherein, when the total number of first filter sets corresponding to the first filter coefficient information stored in the storage unit and second filter sets corresponding to the second filter coefficient information stored in the storage unit has reached a limit, the second filter coefficient information corresponding to one second filter set among the second filter coefficient information stored in the storage unit is overwritten in order to store the encoded second filter coefficient information in the storage unit.
  7. A video decoding method comprising:
     decoding first filter coefficient information indicating a filter coefficient of each of one or more filters included in a first filter set that is set for a decoded image;
     decoding, after the first filter coefficient information is decoded, filter switching information indicating which of the first filter set and a second filter set that is set for a pixel block to be processed in the decoded image is applied when loop filter processing is applied to the pixel block to be processed;
     decoding, after the first filter coefficient information is decoded, second filter coefficient information indicating a filter coefficient of each of one or more filters included in the second filter set when the second filter set is applied to the pixel block to be processed; and
     generating a pixel block in a reference image by applying either the first filter set or the second filter set to the pixel block to be processed based on the filter switching information.
  8. The video decoding method according to claim 7, wherein the filter switching information indicates whether loop filter processing is applied to the pixel block to be processed, and, when loop filter processing is applied to the pixel block to be processed, further indicates which of the first filter set and the second filter set is applied.
  9. The video decoding method according to claim 7, wherein the first filter coefficient information is decoded in a higher-level syntax than the second filter coefficient information.
  10. The video decoding method according to claim 7, wherein the filter coefficient consists only of an offset value.
  11. The video decoding method according to claim 7, wherein,
     when the first filter set is applied to the pixel block to be processed, identification information having a first value that refers to the first filter coefficient information is decoded as at least part of the switching information, and
     when the second filter set is applied to the pixel block to be processed and the same second filter set as that of an already decoded pixel block is applied, the identification information having a second value that refers to the second filter coefficient information corresponding to the already decoded pixel block is decoded as the second filter coefficient information corresponding to the pixel block to be processed.
  12. The video decoding method according to claim 11, further comprising:
     storing the decoded first filter coefficient information in a storage unit; and
     storing the decoded second filter coefficient information in the storage unit,
     wherein, when the total number of first filter sets corresponding to the first filter coefficient information stored in the storage unit and second filter sets corresponding to the second filter coefficient information stored in the storage unit has reached a limit, the second filter coefficient information corresponding to one second filter set among the second filter coefficient information stored in the storage unit is overwritten in order to store the decoded second filter coefficient information in the storage unit.
  13. A video encoding device comprising:
     an encoding unit configured to encode first filter coefficient information indicating a filter coefficient of each of one or more filters included in a first filter set that is set for a decoded image, to encode, after the first filter coefficient information is encoded, filter switching information indicating which of the first filter set and a second filter set that is set for a pixel block to be processed in the decoded image is applied when loop filter processing is applied to the pixel block to be processed, and to encode, after the first filter coefficient information is encoded, second filter coefficient information indicating a filter coefficient of each of one or more filters included in the second filter set when the second filter set is applied to the pixel block to be processed; and
     a loop filter processing unit configured to generate a pixel block in a reference image by applying either the first filter set or the second filter set to the pixel block to be processed based on the filter switching information.
  14. A video decoding device comprising:
     a decoding unit configured to decode first filter coefficient information indicating a filter coefficient of each of one or more filters included in a first filter set that is set for a decoded image, to decode, after the first filter coefficient information is decoded, filter switching information indicating which of the first filter set and a second filter set that is set for a pixel block to be processed in the decoded image is applied when loop filter processing is applied to the pixel block to be processed, and to decode, after the first filter coefficient information is decoded, second filter coefficient information indicating a filter coefficient of each of one or more filters included in the second filter set when the second filter set is applied to the pixel block to be processed; and
     a loop filter processing unit configured to generate a pixel block in a reference image by applying either the first filter set or the second filter set to the pixel block to be processed based on the filter switching information.
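 As a rough illustration of the filter-set buffer management recited in claims 6 and 12, the following C++ sketch keeps the first filter set plus a bounded number of second filter sets and overwrites one stored second set once the total reaches the limit. The class name, the oldest-first replacement policy, and the layout of a filter set are assumptions made for the example; the claims only require that some stored second filter coefficient information be overwritten when the limit is reached.

```cpp
#include <cstddef>
#include <cstdint>
#include <deque>
#include <vector>

// Hypothetical coefficient record; one entry per filter set.
struct FilterSet { std::vector<int16_t> coefficients; };

// Sketch of a filter buffer holding the first filter set plus stored second filter sets.
// When the total number of stored sets has reached the limit, the oldest second filter
// set is overwritten (an assumed policy; the limit is assumed to leave room for at least
// one second set) so that newly coded second filter coefficient information can be stored.
class FilterSetBuffer {
public:
    explicit FilterSetBuffer(std::size_t limit) : limit_(limit) {}

    void storeFirstSet(const FilterSet& s) { first_ = s; hasFirst_ = true; }

    void storeSecondSet(const FilterSet& s)
    {
        const std::size_t total = (hasFirst_ ? 1 : 0) + secondSets_.size();
        if (total >= limit_ && !secondSets_.empty()) {
            secondSets_.pop_front();       // overwrite: drop one stored second set
        }
        secondSets_.push_back(s);
    }

    const FilterSet& firstSet() const { return first_; }
    const std::deque<FilterSet>& secondSets() const { return secondSets_; }

private:
    std::size_t limit_;
    bool hasFirst_ = false;
    FilterSet first_;
    std::deque<FilterSet> secondSets_;
};
```

 The same structure could also back the identification information of claims 5 and 11: an index into such a buffer would reference either the first set or a previously stored second set instead of re-coding its coefficients.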
PCT/JP2012/058209 2012-03-28 2012-03-28 Video encoding method, video decoding method, video encoding device, and video decoding device WO2013145174A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/058209 WO2013145174A1 (en) 2012-03-28 2012-03-28 Video encoding method, video decoding method, video encoding device, and video decoding device

Publications (1)

Publication Number Publication Date
WO2013145174A1 true WO2013145174A1 (en) 2013-10-03

Family

ID=49258539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/058209 WO2013145174A1 (en) 2012-03-28 2012-03-28 Video encoding method, video decoding method, video encoding device, and video decoding device

Country Status (1)

Country Link
WO (1) WO2013145174A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010143427A1 (en) * 2009-06-10 2010-12-16 パナソニック株式会社 Image encoding method, image decoding method, and devices therefor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHING-YEH CHEN ET AL.: "CE8.a.1: One-stage SAO and ALF with LCU-based syntax", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG16 WP3 AND ISO/IEC JTC1/SC29/WG11, 8TH MEETING, 4 February 2012 (2012-02-04), SAN JOSE, CA, USA *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022527012A (en) * 2019-04-16 2022-05-27 北京字節跳動網絡技術有限公司 ON adaptive loop filtering for video coding
US11611747B2 (en) 2019-04-16 2023-03-21 Beijing Bytedance Network Technology Co., Ltd. Adaptive loop filtering for video coding
CN114222118A (en) * 2021-12-17 2022-03-22 北京达佳互联信息技术有限公司 Encoding method and device, and decoding method and device
CN114222118B (en) * 2021-12-17 2023-12-12 北京达佳互联信息技术有限公司 Encoding method and device, decoding method and device


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 12872938; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 12872938; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)