WO2016063440A1 - Video encoding device, video decoding device, video encoding method, video decoding method, and program - Google Patents

Video encoding device, video decoding device, video encoding method, video decoding method, and program

Info

Publication number
WO2016063440A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
color difference
color space
difference quantization
quantization offset
Prior art date
Application number
PCT/JP2015/004041
Other languages
English (en)
Japanese (ja)
Inventor
慶一 蝶野
Original Assignee
日本電気株式会社
Priority date
Filing date
Publication date
Application filed by 日本電気株式会社
Publication of WO2016063440A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/124Quantisation
    • H04N19/126Details of normalisation or weighting functions, e.g. normalisation matrices or variable uniform quantisers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/176Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards

Definitions

  • the present invention relates to a video encoding device and a video decoding device that use residual area adaptive color conversion and color difference quantization offset.
  • Each frame of the digitized video is divided into coding tree units (CTU: Coding Tree Unit), and each CTU is encoded in raster scan order.
  • Each CTU is divided into coding units (CU: Coding Unit) in a quadtree structure and encoded.
  • Each CU is divided into prediction units (PU: Prediction Unit) and predicted.
  • The prediction error of each CU is divided into transform units (TU: Transform Unit) in a quadtree structure and frequency-transformed.
  • The CU is the coding unit of intra prediction / inter-frame prediction.
  • Intra prediction is prediction in which a prediction signal is generated from a reconstructed image of the encoding target frame.
  • HEVC/H.265 defines 33 types of angular intra prediction.
  • An intra prediction signal is generated by extrapolating the reconstructed pixels around the encoding target block in any of the 33 directions.
  • In addition, DC prediction and Planar prediction are specified as intra prediction.
  • In DC prediction, the average value of the reference image is used as the prediction value for all pixels of the prediction target TU.
  • In Planar prediction, a predicted image is generated by linear interpolation from pixels in the reference image.
  • Inter-frame prediction is prediction based on an image of a reconstructed frame (reference picture) having a display time different from that of the encoding target frame. Inter-frame prediction is also called inter prediction. In inter prediction, an inter prediction signal is generated based on a reconstructed image block of a reference picture (using pixel interpolation if necessary).
  • A digital color image is composed of R, G, and B digital images.
  • To increase compression efficiency (that is, to reduce the amount of data), the image signal in the RGB space is generally converted into a signal in a color space (YCbCr space) that is a combination of a luminance signal (Y) and color difference signals (Cb, Cr).
  • The quantization parameter (QP) for the color difference signal is generated by converting the QP for the luminance signal using an offset value called chroma_qp_index_offset.
  • cb_qp_index_offset: first color difference quantization offset
  • cr_qp_index_offset: second color difference quantization offset
  • Non-Patent Document 2 proposes a technique called residual area adaptive color transformation (Adaptive color transform in residual domain).
  • the residual area adaptive color conversion is a technique for adaptively switching the prediction error signal of the image signal in the RGB color space to the signal in the YCoCr color space in units of blocks.
  • FIG. 17 shows an example in which data compression is performed in the YCoCr space for the hatched blocks and data compression is performed in the RGB space for the other blocks.
  • cu_residual_csc_flag equal to 0 indicates that the signal in the RGB space is compressed, and cu_residual_csc_flag equal to 1 indicates that the signal is compressed after being converted into the YCoCr space.
  • the receiver performs a decoding process after returning the YCoCr space signal to the RGB space signal using the Backward color space conversion matrix described below.
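  • As a concrete illustration of the forward and backward conversion referred to above, the sketch below uses the reversible YCgCo-style lifting transform employed by HEVC screen content coding for the residual color transform; since equation (1) is not reproduced in this text, these coefficients are an assumption for illustration rather than the patent's definition of the YCoCr conversion.

```python
# Minimal sketch of a per-block residual color transform, assuming the
# lifting-based (reversible) YCgCo-style matrices used in HEVC screen content
# coding. Equation (1) of the patent is not reproduced here, so these
# coefficients are illustrative only.

def forward_residual_transform(r, g, b):
    """RGB prediction-error sample -> transformed residual (lossless lifting)."""
    co = r - b
    t = b + (co >> 1)
    cg = g - t
    y = t + (cg >> 1)
    return y, cg, co

def backward_residual_transform(y, cg, co):
    """Inverse of forward_residual_transform, recovering the RGB residual."""
    t = y - (cg >> 1)
    g = cg + t
    b = t - (co >> 1)
    r = b + co
    return r, g, b

# Round-trip check on an arbitrary residual sample.
assert backward_residual_transform(*forward_residual_transform(10, -3, 7)) == (10, -3, 7)
```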
  • Patent Document 1 describes a video encoding device and a video decoding device that perform different signal processing depending on whether the input image signal is a signal in the RGB color space or in the YCoCr space. Specifically, when performing weighted prediction based on the H.264/AVC method, with regard to the offset added to the prediction signal, the same offset as that applied to the luminance signal (Y signal) and the color difference signals is applied to the R, G, and B signals. However, Patent Document 1 does not teach new knowledge regarding the color difference quantization offset.
  • The general video encoding device shown in FIG. 18 includes a switch 101, a color space converter 102, a switch 103, a frequency converter/quantizer 104, an inverse quantization/inverse frequency converter 105, a switch 106, an inverse color space converter 107, a switch 108, a buffer 109, a predictor 110, a prediction parameter determiner 111, an entropy encoder 112, a subtractor 115, and an adder 116.
  • the predictor 110 generates a prediction signal for the input image signal of the CU. Specifically, the predictor 110 generates a prediction signal (intra prediction signal) based on the intra prediction, and generates a prediction signal (inter prediction signal) based on the inter prediction.
  • The image input to the video encoding device is input to the switch 101 as a prediction error image after the prediction image supplied from the predictor 110 is subtracted by the subtractor 115.
  • the input image signal is a signal in RGB space.
  • the video encoding device has a residual area adaptive color conversion function.
  • That is, the video encoding device can adaptively switch the prediction error signal of the image signal in the RGB space to a signal in the YCoCr space in units of blocks.
  • When the prediction error signal is handled in the RGB space, the switch 101 is set so that the prediction error image is input to the switch 103. When the prediction error signal is handled in the YCoCr space, the switch 101 is set so that the prediction error image is input to the color space converter 102. Note that the switch 101 sets the output destination of the prediction error image, for example, according to the control of the prediction parameter determiner 111.
  • The color space converter 102 converts the RGB-space prediction error signal into a YCoCr-space signal using the above equation (1) (Forward color space conversion matrix), and then outputs the signal to the switch 103.
  • When the prediction error signal is handled in the RGB space, the switch 103 outputs the prediction error signal from the switch 101 to the frequency converter/quantizer 104. When the prediction error signal is handled in the YCoCr space, the switch 103 outputs the prediction error signal from the color space converter 102 to the frequency converter/quantizer 104. Note that the switch 103 selects the input source of the prediction error image, for example, according to the control of the prediction parameter determiner 111.
  • the frequency converter / quantizer 104 performs frequency conversion on the prediction error image and quantizes the frequency converted prediction error image (coefficient image).
  • the entropy encoder 112 performs entropy encoding on the prediction parameter and the quantized coefficient image, and outputs a bit stream.
  • the inverse quantization / inverse frequency converter 105 inversely quantizes the quantized coefficient image. Further, the inverse quantization / inverse frequency converter 105 performs inverse frequency conversion on the inversely quantized coefficient image.
  • The reconstructed prediction error image subjected to the inverse frequency conversion is input to the switch 106.
  • When the prediction error signal is handled in the RGB space, the switch 106 is set so that the reconstructed prediction error image is input to the switch 108. When the prediction error signal is handled in the YCoCr space, the switch 106 is set so that the reconstructed prediction error image is input to the inverse color space converter 107. Note that the switch 106 selects the output destination of the reconstructed prediction error image, for example, according to the control of the prediction parameter determiner 111.
  • The inverse color space converter 107 converts the reconstructed prediction error signal in the YCoCr space into a signal in the RGB space using the above equation (1) (Backward color space conversion matrix), and then outputs it to the switch 108.
  • The switch 108 selects the reconstructed prediction error signal from the switch 106 when the prediction error signal is handled in the RGB space.
  • When the prediction error signal is handled in the YCoCr space, the switch 108 selects the reconstructed prediction error signal from the inverse color space converter 107. Note that the switch 108 selects one of the reconstructed prediction error images, for example, according to the control of the prediction parameter determiner 111.
  • The reconstructed prediction error image from the switch 108 is supplied to the buffer 109 as a reconstructed image after the prediction signal is added by the adder 116.
  • the buffer 109 stores the reconstructed image.
  • The prediction parameter determiner 111 compares the input image signal with the prediction signal, determines, for example, the prediction parameter that minimizes the coding cost, and instructs the predictor 110 accordingly.
  • The prediction parameter determiner 111 supplies the determined prediction parameter to the entropy encoder 112.
  • the prediction parameter is information related to block prediction, such as a prediction mode (intra prediction, inter prediction), an intra prediction block size, an intra prediction direction, an inter prediction block size, and a motion vector.
  • the prediction parameter determiner 111 further determines, for each block, whether to handle the prediction error signal in RGB space or YCoCr space.
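  • For illustration, this per-block decision could be sketched as follows; the patent only states that the determination is made per block (for example, to minimize coding cost), so the rate-distortion comparison and the function names below are assumptions.

```python
# Illustrative sketch of the per-block color-space decision made by the
# prediction parameter determiner. The comparison criterion (coding cost in
# each candidate space) is an assumption for illustration.

def choose_residual_color_space(residual_block, encode_cost):
    """Return cu_residual_csc_flag: 0 = keep the RGB residual, 1 = convert it to the YCoCr space."""
    cost_rgb = encode_cost(residual_block, color_space="RGB")
    cost_ycocr = encode_cost(residual_block, color_space="YCoCr")
    return 1 if cost_ycocr < cost_rgb else 0
```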
  • FIG. 19 is a block diagram illustrating an example of the configuration of a general video decoding device that decodes the bitstream output from the general video encoding device and obtains a decoded image. The configuration and operation of the general video decoding device will be described with reference to FIG. 19.
  • The video decoding device shown in FIG. 19 includes an entropy decoder 212, an inverse quantization/inverse frequency converter 205, a switch 206, an inverse color space converter 207, a switch 208, a buffer 209, a predictor 210, and an adder 216.
  • the entropy decoder 212 performs entropy decoding on the input bit stream.
  • the entropy decoder 212 supplies the quantized coefficient image to the inverse quantization / inverse frequency converter 205, and supplies the prediction parameters to the predictor 210.
  • the inverse quantization / inverse frequency converter 205 inversely quantizes the input quantized coefficient image and outputs it as a coefficient image. Further, the inverse quantization / inverse frequency converter 205 converts the frequency domain coefficient image into a spatial domain image and outputs it as a prediction error image. The prediction error image is input to the switch 206.
  • When the prediction error signal is handled in the RGB space, the switch 206 is set so that the prediction error image is input to the switch 208. When the prediction error signal is handled in the YCoCr space, the switch 206 is set so that the prediction error image is input to the inverse color space converter 207.
  • Whether the switch 206 should handle the prediction error signal in the RGB space or in the YCoCr space is specified by signaling from the video encoding device.
  • The inverse color space converter 207 converts the YCoCr-space prediction error signal into an RGB-space signal using the above equation (1) (Backward color space conversion matrix), and then outputs the prediction error signal to the switch 208.
  • The switch 208 selects the prediction error signal from the switch 206 when the prediction error signal is handled in the RGB space.
  • When the prediction error signal is handled in the YCoCr space, the switch 208 selects the prediction error signal from the inverse color space converter 207.
  • Whether the switch 208 should handle the prediction error signal in the RGB space or in the YCoCr space is specified by signaling from the video encoding device.
  • The prediction error image from the switch 208 is supplied to the buffer 209 as a reconstructed image after the prediction signal from the predictor 210 is added by the adder 216.
  • the buffer 209 stores the reconstructed image.
  • the reconstructed image stored in the buffer 209 is output as a decoded image (decoded video).
  • the buffer 209 accumulates previously decoded images as reference images.
  • The predictor 210 predicts the decoding target image from adjacent reconstructed images decoded in the past within the image currently being decoded, and generates a predicted image.
  • When performing inter prediction, the predictor 210 generates a prediction image based on the reference image supplied from the buffer 209.
  • the color difference quantization offset technique is a technique for adjusting a quantization parameter for each color component by signaling a color difference quantization offset value for the second color component and the third color component. That is, the quantization intensity can be changed.
  • Picture unit: pps_cb_qp_offset / pps_cr_qp_offset
  • Slice unit: slice_qp_delta_cb / slice_qp_delta_cr
  • Block unit: cu_chroma_qp_offset_idx
  • Subjective image quality can be improved by adjusting the quantization intensity for each color component using the above syntax.
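  • The way the picture-, slice- and block-level offsets combine into a chroma quantization parameter can be sketched as follows; the clipping bounds follow the HEVC-style derivation quoted later in this document, and the concrete numbers are only an example.

```python
# Sketch of the layered color difference quantization offset: the offsets
# signaled at picture, slice and block level are added to the luma QP.

def clip3(lo, hi, x):
    return max(lo, min(hi, x))

def chroma_qp_index(qp_y, pps_offset, slice_offset, cu_offset, qp_bd_offset_c=0):
    """Chroma QP index = luma QP plus the picture-, slice- and block-level offsets."""
    return clip3(-qp_bd_offset_c, 57, qp_y + pps_offset + slice_offset + cu_offset)

# Example: luma QP 30, picture offset +2, slice offset -1, block offset +3.
print(chroma_qp_index(30, 2, -1, 3))  # -> 34
```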
  • the video encoding device shown in FIG. 18 and the video decoding device shown in FIG. 19 also apply a color difference quantization offset. As shown in FIG. 18, a predetermined color difference quantization offset is input to the video encoding device.
  • When the prediction error signal is handled in the RGB space, the frequency converter/quantizer 104 increases or decreases the quantization parameter of the B component according to the first color difference quantization offset, and increases or decreases the quantization parameter of the R component according to the second color difference quantization offset.
  • the inverse quantization / inverse frequency converter 105 increases / decreases the inverse quantization parameter of the B component according to the first color difference quantization offset, and increases / decreases the inverse quantization parameter of the R component according to the second color difference quantization offset.
  • When the prediction error signal is handled in the YCoCr space, the frequency converter/quantizer 104 increases or decreases the quantization parameter of the Co component according to the first color difference quantization offset, and increases or decreases the quantization parameter of the Cr component according to the second color difference quantization offset.
  • the inverse quantization / inverse frequency converter 105 increases or decreases the inverse quantization parameter of the Co component according to the first color difference quantization offset, and increases or decreases the inverse quantization parameter of the Cr component according to the second color difference quantization offset.
  • the inverse quantization / inverse frequency converter 205 operates in the same manner as the inverse quantization / inverse frequency converter 105 in the video encoding device.
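  • The shared use of the two signaled offsets described above can be summarized as follows; the mapping table is an illustrative reading of the preceding paragraphs, not normative syntax.

```python
# Sketch of the conventional scheme: one pair of color difference quantization
# offsets is shared by RGB-coded and YCoCr-coded blocks; only the components
# they act on differ.

COMPONENTS_FOR_OFFSETS = {
    # residual color space -> (component adjusted by the 1st offset,
    #                          component adjusted by the 2nd offset)
    "RGB":   ("B",  "R"),
    "YCoCr": ("Co", "Cr"),
}

def adjust_qp(base_qp, first_offset, second_offset, color_space):
    """Return {component: adjusted QP} for the two non-primary color components."""
    c1, c2 = COMPONENTS_FOR_OFFSETS[color_space]
    return {c1: base_qp + first_offset, c2: base_qp + second_offset}

print(adjust_qp(30, 2, -1, "RGB"))    # {'B': 32, 'R': 29}
print(adjust_qp(30, 2, -1, "YCoCr"))  # {'Co': 32, 'Cr': 29}
```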
  • As described above, the color difference quantization offset technique is a technique of signaling the color difference quantization offset values for the second color component and the third color component.
  • However, when residual area adaptive color conversion is performed, a block compressed in the RGB space and a block compressed in the YCoCr space share the same quantization intensity.
  • As a result, the quantization intensity cannot be set appropriately according to the color space, and the subjective image quality improvement effect of the color difference quantization offset technique cannot be obtained.
  • An object of the present invention is to provide a video encoding device, a video decoding device, a video encoding method, a video decoding method, and a program that do not impair the subjective image quality improvement effect when residual area adaptive color conversion and the color difference quantization offset are used in combination.
  • A video encoding device according to the present invention is capable of selecting the color space of a prediction error signal from a plurality of color spaces in units of coding blocks, and includes adaptive color difference quantization offset deriving means for deriving a color difference quantization offset for each color space, and inverse quantization means for inversely quantizing the quantized coefficient image using the color difference quantization offset for each color space.
  • A video decoding device according to the present invention is capable of selecting the color space of a prediction error signal from a plurality of color spaces in units of coding blocks, and includes adaptive color difference quantization offset deriving means for deriving a color difference quantization offset for each color space, and inverse quantization means for inversely quantizing the quantized coefficient image using the color difference quantization offset for each color space.
  • A video encoding method according to the present invention is capable of selecting the color space of a prediction error signal from a plurality of color spaces in units of coding blocks, derives a color difference quantization offset for each color space, and inversely quantizes the quantized coefficient image using the color difference quantization offset for each color space.
  • A video decoding method according to the present invention is capable of selecting the color space of a prediction error signal from a plurality of color spaces in units of coding blocks, derives a color difference quantization offset for each color space, and inversely quantizes the quantized coefficient image using the color difference quantization offset for each color space.
  • A video encoding program according to the present invention causes a computer that executes a video encoding method capable of selecting the color space of a prediction error signal from a plurality of color spaces in units of coding blocks to execute a process of deriving a color difference quantization offset for each color space and a process of inversely quantizing the quantized coefficient image using the color difference quantization offset for each color space.
  • A video decoding program according to the present invention causes a computer that executes a video decoding method capable of selecting the color space of a prediction error signal from a plurality of color spaces in units of coding blocks to execute a process of deriving a color difference quantization offset for each color space and a process of inversely quantizing the quantized coefficient image using the color difference quantization offset for each color space.
  • According to the present invention, the subjective image quality improvement effect can be prevented from being impaired even when residual area adaptive color conversion and the color difference quantization offset are used in combination.
  • FIG. 8 is an explanatory diagram illustrating an example of syntax for transmitting cb_qp_offset_list[i] and cr_qp_offset_list[i], and FIG. 9 is an explanatory diagram illustrating an example of syntax for transmitting alt_cb_qp_offset_list[i] and alt_cr_qp_offset_list[i].
  • FIG. 1 is a block diagram showing the first embodiment of the video encoding device. With reference to FIG. 1, the configuration of a video encoding device that outputs a bitstream using each frame of a digitized video as an input image will be described.
  • The video encoding device of the first embodiment, like the general video encoding device shown in FIG. 18, includes a switch 101, a color space converter 102, a switch 103, a frequency converter/quantizer 104, an inverse quantization/inverse frequency converter 105, a switch 106, an inverse color space converter 107, a switch 108, a buffer 109, a predictor 110, a prediction parameter determiner 111, an entropy encoder 112, a subtractor 115, and an adder 116.
  • the video encoding apparatus further includes an adaptive color difference quantization offset deriving unit 121 and a switch 122.
  • FIG. 2 (b) is a flowchart showing processing related to signaling of the color difference quantization offset.
  • the video encoding apparatus signals information indicating whether or not to perform residual area adaptive color conversion using adaptive_color_trans_flag. Further, when performing the residual area adaptive color conversion, the video encoding apparatus signals information indicating the color space of the block with cu_residual_csc_flag.
  • The color difference quantization offset for the RGB space input to the adaptive color difference quantization offset deriving unit 121 is transmitted with the following syntax (steps S101 and S102).
  • The entropy encoder 112 transmits the color difference quantization offset for the RGB color space derived by the adaptive color difference quantization offset deriving unit 121 with the following syntax (steps S103 and S104).
  • Picture unit: pps_cb_qp_offset / pps_cr_qp_offset
  • Slice unit: slice_qp_delta_cb / slice_qp_delta_cr
  • Further, the entropy encoder 112 transmits the color difference quantization offset for the YCoCr space derived by the adaptive color difference quantization offset deriving unit 121 with the following syntax (steps S103 and S105).
  • the adaptive color difference quantization offset deriving unit 121 outputs the derived color difference quantization offsets for the YCoCr space (first color difference quantization offset and second color difference quantization offset) to the switch 122.
  • the adaptive color difference quantization offset deriving unit 121 outputs the derived RGB color space color difference quantization offsets (first color difference quantization offset and second color difference quantization offset) to the switch 122.
  • the adaptive color difference quantization offset derivation unit 121 recognizes whether compression is performed in the RGB color space or the YCoCr color space according to the cu_residual_csc_flag.
  • The frequency converter/quantizer 104 adjusts the quantization parameter using the color difference quantization offset determined by the prediction parameter determiner 111.
  • The prediction parameter determiner 111 stores, for example, the color difference quantization offset value for the RGB space and the color difference quantization offset value for the YCoCr space in advance, and supplies the appropriate one of the RGB-space or YCoCr-space color difference quantization offset values to the frequency converter/quantizer 104.
  • the color difference quantization offset value for the RGB color space and the color difference quantization offset value for the YCoCr space are included in the prediction parameters supplied to the entropy encoder 112.
  • the entropy encoder 112 signals the value of the color difference quantization offset for the RGB color space and the value of the color difference quantization offset for the YCoCr space.
  • As described above, in the present embodiment the video encoding device explicitly signals the color difference quantization offset; that is, the value of the color difference quantization offset itself is signaled.
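  • A sketch of this explicit, per-color-space signaling on the encoder side is given below; the data structure, the example offset values, and the helper names are assumptions for illustration, while the syntax element names follow those used in this description.

```python
# Illustrative encoder-side sketch: one offset pair is held per color space,
# both pairs are signaled explicitly, and the pair matching the block's color
# space is applied during quantization.

from dataclasses import dataclass

@dataclass
class ChromaQpOffsets:
    first: int   # offset for the second color component (B or Co)
    second: int  # offset for the third color component (R or Cr)

OFFSETS = {"RGB": ChromaQpOffsets(2, 2), "YCoCr": ChromaQpOffsets(-2, 0)}

def signaled_syntax():
    """Both offset pairs are written to the bitstream (explicit signaling)."""
    return {
        "pps_cb_qp_offset": OFFSETS["RGB"].first,
        "pps_cr_qp_offset": OFFSETS["RGB"].second,
        "alt_pps_cb_qp_offset": OFFSETS["YCoCr"].first,
        "alt_pps_cr_qp_offset": OFFSETS["YCoCr"].second,
    }

def offsets_for_block(cu_residual_csc_flag):
    """Select the offset pair used to quantize a block's color difference residual."""
    return OFFSETS["YCoCr" if cu_residual_csc_flag else "RGB"]
```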
  • the operation of the video encoding device other than the above operation is the same as the operation of the video encoding device shown in FIG.
  • FIG. 3 is a block diagram illustrating the configuration of a video decoding device that decodes the bitstream output from the video encoding device that signals the color difference quantization offset, and obtains a decoded image.
  • the configuration of the video decoding apparatus according to the second embodiment will be described.
  • The video decoding device of the present embodiment, like the general video decoding device shown in FIG. 19, includes an entropy decoder 212, an inverse quantization/inverse frequency converter 205, a switch 206, an inverse color space converter 207, a switch 208, a buffer 209, a predictor 210, and an adder 216.
  • the video decoding apparatus further includes an adaptive color difference quantization offset deriving unit 221 and a switch 222.
  • the inverse quantization / inverse frequency converter 205, the switch 206, the inverse color space converter 207, the switch 208, the buffer 209, the predictor 210, and the adder 216 operate in the same manner as those shown in FIG.
  • The operations of the adaptive color difference quantization offset deriving unit 221 and the switch 222, and the operation of the entropy decoder 212 related to the derivation of the color difference quantization offset, will be mainly described.
  • Fig. 4 is a flowchart showing processing related to derivation of the color difference quantization offset.
  • When the block is compressed in the YCoCr space, the adaptive color difference quantization offset deriving unit 221 derives the color difference quantization offset for the YCoCr space (step S204).
  • Otherwise, the adaptive color difference quantization offset deriving unit 221 derives the color difference quantization offset for the RGB space (step S203).
  • The adaptive color difference quantization offset deriving unit 221 derives the color difference quantization offsets for the RGB space (first color difference quantization offset qPiCb and second color difference quantization offset qPiCr) as shown in the following equation (2).
  • qPiCb = Clip3( -QpBdOffsetC, 57, QpY + pps_cb_qp_offset + slice_cb_qp_offset + CuQpOffsetCb )
    qPiCr = Clip3( -QpBdOffsetC, 57, QpY + pps_cr_qp_offset + slice_cr_qp_offset + CuQpOffsetCr )   (2)
  • Clip3(x, y, z) is a function that clips the input z into the range [x, y].
  • QpY is the quantization parameter for the first color component.
  • CuQpOffsetCb is the color difference quantization offset for each block of the second color component, and CuQpOffsetCr is the color difference quantization offset for each block of the third color component.
  • Although the notation qPiCb and qPiCr is used here, in the case of the RGB space in which the first color component is the G component, the second color component is the B component, and the third color component is the R component, qPiCb corresponds to the color difference quantization offset for the B component and qPiCr corresponds to the color difference quantization offset for the R component.
  • The adaptive color difference quantization offset deriving unit 221 derives the color difference quantization offsets for the YCoCr space (first color difference quantization offset qPiCb and second color difference quantization offset qPiCr) as shown in the following equation (3).
  • qPiCb = Clip3( -QpBdOffsetC, 57, QpY + alt_pps_cb_qp_offset + alt_slice_cb_qp_offset + CuQpOffsetCb )
    qPiCr = Clip3( -QpBdOffsetC, 57, QpY + alt_pps_cr_qp_offset + alt_slice_cr_qp_offset + CuQpOffsetCr )   (3)
  • The color difference quantization parameters (Qp′Cb, Qp′Cr) are then calculated as in the following derivation (4).
  • – If cu_residual_csc_flag is equal to 0, the following applies:
      qPiCb = Clip3( -QpBdOffsetC, 57, QpY + pps_cb_qp_offset + slice_cb_qp_offset + CuQpOffsetCb )
      qPiCr = Clip3( -QpBdOffsetC, 57, QpY + pps_cr_qp_offset + slice_cr_qp_offset + CuQpOffsetCr )
    – Otherwise (cu_residual_csc_flag is equal to 1), the following applies:
      qPiCb = Clip3( -QpBdOffsetC, 57, QpY + alt_pps_cb_qp_offset + alt_slice_cb_qp_offset + CuQpOffsetCb )
      qPiCr = Clip3( -QpBdOffsetC, 57, QpY + alt_pps_cr_qp_offset + alt_slice_cr_qp_offset + CuQpOffsetCr )
  • The variables qPCb and qPCr are set equal to Min( qPi, 51 ), based on the index qPi equal to qPiCb and qPiCr, respectively.
  • The inverse quantization/inverse frequency converter 205 inversely quantizes the input quantized coefficient image and outputs it as a coefficient image, increasing or decreasing the quantization parameter according to the color difference quantization offset supplied from the adaptive color difference quantization offset deriving unit 221.
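  • The decoder-side derivation described above can be summarized in the following sketch, which selects the RGB-space or YCoCr-space offsets per block according to cu_residual_csc_flag; the clipping bound 57 follows the equations quoted above, while the function and parameter names are illustrative.

```python
# Sketch of the per-block chroma QP index derivation with one offset pair per
# color space, following equations (2) and (3) above.

def clip3(lo, hi, x):
    return max(lo, min(hi, x))

def derive_chroma_qp_indices(qp_y, cu_residual_csc_flag, pps, slice_hdr,
                             cu_qp_offset_cb, cu_qp_offset_cr, qp_bd_offset_c):
    """Return (qPiCb, qPiCr) using the RGB-space or the YCoCr-space offsets."""
    if cu_residual_csc_flag == 0:   # block compressed in the RGB space
        cb = pps["pps_cb_qp_offset"] + slice_hdr["slice_cb_qp_offset"]
        cr = pps["pps_cr_qp_offset"] + slice_hdr["slice_cr_qp_offset"]
    else:                           # block compressed in the YCoCr space
        cb = pps["alt_pps_cb_qp_offset"] + slice_hdr["alt_slice_cb_qp_offset"]
        cr = pps["alt_pps_cr_qp_offset"] + slice_hdr["alt_slice_cr_qp_offset"]
    qpi_cb = clip3(-qp_bd_offset_c, 57, qp_y + cb + cu_qp_offset_cb)
    qpi_cr = clip3(-qp_bd_offset_c, 57, qp_y + cr + cu_qp_offset_cr)
    return qpi_cb, qpi_cr
```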
  • FIG. 5 is an explanatory diagram showing an example of syntax for transmitting alt_pps_cb_qp_offset and alt_pps_cr_qp_offset (improvement of 7.3.2.3.2 Picture parameter set range extensions syntax described in Non-Patent Document 1).
  • the italicized display locations indicate the characteristic locations of this embodiment.
  • FIGS. 6 and 7 are explanatory diagrams showing an example of syntax for transmitting alt_slice_qp_delta_cb and alt_slice_qp_delta_cr (improvement of 7.3.1.1 General slice segment header syntax described in Non-Patent Document 1).
  • the italicized display locations indicate the characteristic locations of the present embodiment.
  • The configuration of the video encoding device of the present embodiment is the same as that shown in FIG. 1.
  • The entropy encoder 112 transmits, to the video decoding device, information that can specify the color difference quantization offset for the RGB space (for example, an index that designates a data table in which the color difference quantization offsets held in the video decoding device are set, or the color difference quantization offset value itself).
  • Using the syntax illustrated in FIGS. 5, 6, and 7, the entropy encoder 112 signals information that can specify the color difference quantization offset for the YCoCr space (for example, the color difference quantization offset value itself).
  • Embodiment 4. Next, a video decoding device according to the fourth embodiment will be described.
  • The video decoding device according to the present embodiment corresponds to the video encoding device according to the third embodiment. Note that the configuration of the video decoding device of the present embodiment is the same as the configuration shown in FIG. 3.
  • When the entropy decoder 212 decodes, from the syntax illustrated in FIGS. 5, 6, and 7, that data compression is performed in the YCoCr space, the adaptive color difference quantization offset deriving unit 221 derives the color difference quantization offset in the same manner as in the second embodiment.
  • the adaptive color difference quantization offset derivation unit 121 operates in the same manner as the adaptive color difference quantization offset derivation unit 221.
  • FIG. 8 shows an example of syntax for additionally transmitting cb_qp_offset_list[i] and cr_qp_offset_list[i] for the YCoCr space (an improvement of the 7.3.2.3.2 Picture parameter set range extensions syntax described in Non-Patent Document 1).
  • In FIG. 8, the italicized portions indicate the characteristic parts of this embodiment (that is, the size of cb_qp_offset_list / cr_qp_offset_list (the range of chroma_qp_offset_list_len_minus1) is expanded according to the value of adaptive_color_trans_flag).
  • The color difference quantization offset for the RGB space and that for the YCoCr space can be switched in units of blocks by adjusting the value of the cu_chroma_qp_offset_idx syntax transmitted in units of blocks according to the value of the cu_residual_csc_flag syntax.
  • The entropy encoder 112 transmits, to the video decoding device, information that can specify the color difference quantization offset for the RGB space (for example, the cu_chroma_qp_offset_idx syntax, which is an index that designates a data table in which the color difference quantization offsets held in the video decoding device are set).
  • Similarly, the entropy encoder 112 transmits, to the video decoding device, information that can specify the color difference quantization offset for the YCoCr space (for example, the cu_chroma_qp_offset_idx syntax, which is an index that designates a data table in which the color difference quantization offsets held in the video decoding device are set).
  • The video decoding device according to the present embodiment corresponds to the video encoding device according to the fifth embodiment. Note that the configuration of the video decoding device of the present embodiment is the same as the configuration shown in FIG. 3.
  • The color difference quantization offset is read from the data table specified by the index, and the adaptive color difference quantization offset deriving unit 221 calculates the color difference quantization parameter as in the second embodiment.
  • the adaptive color difference quantization offset derivation unit 121 operates in the same manner as the adaptive color difference quantization offset derivation unit 221.
  • FIG. 9 is an explanatory diagram showing an example of syntax for transmitting alt_cb_qp_offset_list[i] and alt_cr_qp_offset_list[i] for the YCoCr space (an improvement of the 7.3.2.3.2 Picture parameter set range extensions syntax described in Non-Patent Document 1). In FIG. 9, the italicized portions indicate the characteristic parts of this embodiment.
  • cu_chroma_qp_offset_idx equal to 0 derives cb_qp_offset_list[0] and cr_qp_offset_list[0] for the RGB space. Therefore, in the fifth embodiment, when the size of the list is 4 (when chroma_qp_offset_list_len_minus1 is 3), cu_chroma_qp_offset_idx equal to 4 must be transmitted in order to derive the offsets for the YCoCr space (cb_qp_offset_list[4] and cr_qp_offset_list[4]).
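  • The list-based signaling of the fifth and seventh embodiments can be pictured as follows; the concrete offset values and the half-and-half list layout are assumptions for illustration only.

```python
# Sketch of the extended offset list: the first half holds entries intended
# for RGB-coded blocks, the second half entries intended for YCoCr-coded
# blocks, and the encoder shifts the signaled index accordingly.

cb_qp_offset_list = [2, 0, -2, 4,    # indices 0..3: offsets for RGB-coded blocks
                     1, -1, -3, 3]   # indices 4..7: offsets for YCoCr-coded blocks
BASE_LIST_LEN = 4                    # chroma_qp_offset_list_len_minus1 + 1 before extension

def signaled_index(base_idx, cu_residual_csc_flag):
    """Encoder side: address the YCoCr half of the list when the block uses the YCoCr space."""
    return base_idx + (BASE_LIST_LEN if cu_residual_csc_flag else 0)

def cu_qp_offset_cb(cu_chroma_qp_offset_idx):
    """Decoder side: CuQpOffsetCb = cb_qp_offset_list[cu_chroma_qp_offset_idx]."""
    return cb_qp_offset_list[cu_chroma_qp_offset_idx]

print(cu_qp_offset_cb(signaled_index(0, 0)))  # RGB block   -> 2
print(cu_qp_offset_cb(signaled_index(0, 1)))  # YCoCr block -> 1
```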
  • The configuration of the video encoding device of the present embodiment is the same as that shown in FIG. 1.
  • The entropy encoder 112 transmits, to the video decoding device, information that can specify the color difference quantization offset for the RGB space (for example, an index that designates a data table in which the color difference quantization offsets held in the video decoding device are set).
  • Similarly, the entropy encoder 112 transmits, to the video decoding device, information that can specify the color difference quantization offset for the YCoCr space (for example, an index that designates a data table in which the color difference quantization offsets held in the video decoding device are set).
  • Embodiment 8. Next, a video decoding device according to the eighth embodiment will be described.
  • The video decoding device according to the present embodiment corresponds to the video encoding device according to the seventh embodiment. Note that the configuration of the video decoding device of the present embodiment is the same as the configuration shown in FIG. 3.
  • When the entropy decoder 212 decodes, from the syntax illustrated in FIG. 9, that the data is compressed in the YCoCr space, the color difference quantization offset is read from the data table specified by the index, and the adaptive color difference quantization offset deriving unit 221 calculates the color difference quantization parameter as in the second embodiment.
  • the adaptive color difference quantization offset derivation unit 121 operates in the same manner as the adaptive color difference quantization offset derivation unit 221.
  • cu_chroma_qp_offset_idx, when present, specifies the index into the cb_qp_offset_list[] and cr_qp_offset_list[] or the alt_cb_qp_offset_list[] and alt_cr_qp_offset_list[] that is used to determine the value of CuQpOffsetCb and CuQpOffsetCr. When present, the value of cu_chroma_qp_offset_idx shall be in the range of 0 to chroma_qp_offset_list_len_minus1, inclusive. When not present, the value of cu_chroma_qp_offset_idx is inferred to be equal to 0.
  • CuQpOffsetCb = cb_qp_offset_list[ cu_chroma_qp_offset_idx ]
  • Embodiment 9.
  • In each of the above embodiments, the video encoding device explicitly signals the color difference quantization offset; however, the video encoding device may instead signal only that the color space of the prediction error signal is selected in units of blocks, without signaling the color difference quantization offset. In this specification, this is referred to as implicit color difference quantization offset signaling.
  • In that case, the adaptive color difference quantization offset deriving unit 221 reads out the color difference quantization offset value for the RGB color space stored in advance in the video decoding device.
  • Further, the adaptive color difference quantization offset deriving unit 221 calculates the value of the color difference quantization offset for the YCoCr space from the color difference quantization offset value for the RGB color space stored in advance.
  • The color difference quantization offset for the RGB color space and the color difference quantization offset for the YCoCr color space are correlated to some extent; that is, a calculation equation for calculating the color difference quantization offset for the YCoCr color space from the color difference quantization offset for the RGB color space can be defined. Therefore, the adaptive color difference quantization offset deriving unit 221 can derive the color difference quantization offset for the YCoCr space by such a calculation equation.
  • the video decoding device implicitly derives the color difference quantization offset.
  • the adaptive color difference quantization offset derivation unit 121 operates in the same manner as the adaptive color difference quantization offset derivation unit 221.
  • Since the video encoding device implicitly signals the color difference quantization offset, the amount of data to be transmitted can be reduced.
  • the color difference quantization offset for one color space may be set to zero.
  • In that case, the adaptive color difference quantization offset deriving unit 221 reads the value 0 stored in advance in the video decoding device as the color difference quantization offset for the RGB space.
  • Further, the adaptive color difference quantization offset deriving unit 221 calculates the value of the color difference quantization offset for the YCoCr space from pps_cb_qp_offset / pps_cr_qp_offset transmitted in units of pictures and slice_qp_delta_cb / slice_qp_delta_cr transmitted in units of slices.
  • a specific example of the procedure for deriving the color difference quantization offset is shown below. In the following description, a part surrounded by quotation marks is a characteristic part in the present embodiment.
  • qPiCb = Clip3( -QpBdOffsetC, 57, QpY + pps_cb_qp_offset + slice_cb_qp_offset + CuQpOffsetCb )
  • The variables qPCb and qPCr are set equal to Min( qPi, 51 ), based on the index qPi equal to qPiCb and qPiCr, respectively.
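  • A sketch of the implicit derivation is given below; the patent states only that the YCoCr-space offset can be calculated from the RGB-space offset by some calculation equation, so the mapping used here is purely a hypothetical example.

```python
# Sketch of implicit color difference quantization offset derivation: no alt_*
# syntax is received; the decoder computes the YCoCr-space offsets from the
# RGB-space offsets it already has. The mapping rule below is an assumption.

def ycocr_offsets_from_rgb(pps_cb, pps_cr, slice_cb, slice_cr):
    """Hypothetical mapping from RGB-space offsets to YCoCr-space offsets."""
    rgb_cb = pps_cb + slice_cb
    rgb_cr = pps_cr + slice_cr
    # Example rule: reuse the RGB offsets at half strength, reflecting the
    # assumed correlation between the two color spaces (illustrative only).
    return rgb_cb // 2, rgb_cr // 2

print(ycocr_offsets_from_rgb(2, 2, -1, 0))  # -> (0, 1)
```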
  • In each of the above embodiments, the RGB color space and the YCoCr color space are exemplified as the two color spaces, but the method of each of the above embodiments can be applied even if one or both of the two color spaces are other color spaces.
  • In the above description, the first color component is G, the second color component is B, and the third color component is R (see FIG. 20); however, the present invention is not limited to this, and an arbitrary color signal can be assigned to each color component.
  • the video encoding device and the video decoding device handle two color spaces, but it is also possible to handle three or more color spaces.
  • Each of the above embodiments can be configured by hardware, but can also be realized by a computer program.
  • the information processing system shown in FIG. 10 includes a processor 1001, a program memory 1002, a storage medium 1003 for storing video data, and a storage medium 1004 for storing a bitstream.
  • the storage medium 1003 and the storage medium 1004 may be separate storage media, or may be storage areas composed of the same storage medium.
  • a magnetic storage medium such as a hard disk can be used as the storage medium.
  • The program memory 1002 stores a program for realizing the function of each block (excluding the buffer block) shown in FIG. 1 and FIG. 3. The processor 1001 then implements the functions of the video encoding device or the video decoding device shown in FIG. 1 and FIG. 3 by executing processing according to the program stored in the program memory 1002.
  • FIG. 11 is a block diagram showing a main part of the video encoding device.
  • As shown in FIG. 11, the video encoding device 301 includes an adaptive color difference quantization offset deriving unit 311 that derives a color difference quantization offset for each color space (for example, corresponding to the adaptive color difference quantization offset deriving unit 121 shown in FIG. 1), and an inverse quantization unit 312 that inversely quantizes the quantized coefficient image using the color difference quantization offset for each color space (for example, corresponding to the inverse quantization/inverse frequency converter 105 shown in FIG. 1).
  • FIG. 12 is a block diagram showing the main part of another example of the video encoding device. As shown in FIG. 12, the video encoding device 302 includes a color space selection notification unit 313 that signals that the color space of the prediction error signal is selected in units of blocks (for example, corresponding to the entropy encoder 112 shown in FIG. 1).
  • When the video encoding device 302 does not include means for signaling information that can specify the value of the color difference quantization offset for each color space, the video encoding device 302 derives the color difference quantization offset implicitly.
  • FIG. 13 is a block diagram showing a main part of still another example of the video encoding device.
  • As shown in FIG. 13, the video encoding device 303 includes a quantization offset information transmission unit 314 that signals information that can specify the value of the color difference quantization offset for each color space (for example, corresponding to the entropy encoder 112 shown in FIG. 1).
  • The information that can specify the color difference quantization offset value is, for example, the color difference quantization offset value itself, or an index that designates a data table in which the color difference quantization offsets held in the video decoding device are set.
  • FIG. 14 is a block diagram showing the main part of the video decoding apparatus.
  • As shown in FIG. 14, the video decoding device 401 includes an adaptive color difference quantization offset deriving unit 411 that derives a color difference quantization offset for each color space (for example, corresponding to the adaptive color difference quantization offset deriving unit 221 shown in FIG. 3), and an inverse quantization unit 412 that inversely quantizes the quantized coefficient image using the color difference quantization offset for each color space (for example, corresponding to the inverse quantization/inverse frequency converter 205 shown in FIG. 3).
  • FIG. 15 is a block diagram showing the main part of another example of the video decoding device. As shown in FIG. 15, the video decoding device 402 decodes, from the received bitstream, the fact that the color space of the prediction error signal is selected in units of blocks (for example, by the entropy decoder 212 shown in FIG. 3).
  • When the video decoding device 402 does not include means for decoding information that can identify the value of the color difference quantization offset for each color space, the video decoding device 402 derives the color difference quantization offset implicitly.
  • FIG. 16 is a block diagram showing a main part of still another example of the video decoding apparatus.
  • As shown in FIG. 16, the video decoding device 403 includes a color difference quantization offset decoding unit 414 that specifies the value of the color difference quantization offset for each color space based on information decoded from the received bitstream (for example, corresponding to the entropy decoder 212 shown in FIG. 3).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

 The invention relates to a video encoding device comprising an adaptive color difference quantization offset derivation unit capable of selecting the color space of a prediction error signal, in coding block units, from among a plurality of color spaces and of deriving the quantization offset for each color space, and a dequantization unit for dequantizing a quantization coefficient image using the color difference quantization offset for each color space.
PCT/JP2015/004041 2014-10-22 2015-08-12 Video encoding device, video decoding device, video encoding method, video decoding method, and program WO2016063440A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-214959 2014-10-22
JP2014214959 2014-10-22

Publications (1)

Publication Number Publication Date
WO2016063440A1 true WO2016063440A1 (fr) 2016-04-28

Family

ID=55760504

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/004041 WO2016063440A1 (fr) 2014-10-22 2015-08-12 Video encoding device, video decoding device, video encoding method, video decoding method, and program

Country Status (2)

Country Link
TW (1) TW201626797A (fr)
WO (1) WO2016063440A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050259730A1 (en) * 2004-05-18 2005-11-24 Sharp Laboratories Of America, Inc. Video coding with residual color conversion using reversible YCoCg
US20060018559A1 (en) * 2004-07-22 2006-01-26 Samsung Electronics Co., Ltd. Method and apparatus to transform/inverse transform and quantize/dequantize color image, and method and apparatus to encode/decode color image using it

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI ZHANG ET AL.: "SCCE5 Test 3.2.1: In-loop color-space transform", JOINT COLLABORATIVE TEAM ON VIDEO CODING (JCT-VC) OF ITU-T SG 16 WP 3 AND ISO/IEC JTC 1/SC 29/WG 11 18TH MEETING, 30 June 2014 (2014-06-30), Sapporo, JP *
WOO-SHIK KIM ET AL.: "Residue Color Transform", JOINT VIDEO TEAM (JVT) OF ISO/IEC MPEG & ITU-T VCEG (ISO/IEC JTC1/SC29/WG11 AND ITU-T SG 16 Q.6) 12TH MEETING, 17 July 2004 (2004-07-17), Redmond, WA, USA *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114208169A (zh) * 2019-06-11 2022-03-18 Lg电子株式会社 用于色度分量的图像解码方法及其装置
CN114258684A (zh) * 2019-08-22 2022-03-29 Lg 电子株式会社 图像解码方法及其设备
CN114270843A (zh) * 2019-12-26 2022-04-01 Kddi 株式会社 图像解码装置、图像解码方法和程序

Also Published As

Publication number Publication date
TW201626797A (zh) 2016-07-16

Similar Documents

Publication Publication Date Title
US11575900B2 (en) Video coding device, video decoding device, video coding method, video decoding method and program
US11539956B2 (en) Robust encoding/decoding of escape-coded pixels in palette mode
TWI700919B (zh) 視訊編解碼方法/裝置及相應地存儲介質
CN105409221B (zh) 用于样本自适应偏移滤波的编码器侧决策
US10038917B2 (en) Search strategies for intra-picture prediction modes
US20170006283A1 (en) Computationally efficient sample adaptive offset filtering during video encoding
KR102594690B1 (ko) 크로마 양자화 파라미터 데이터 기반 영상 디코딩 방법 및 그 장치
KR20160068288A (ko) 변환생략을 참조하는 디블록킹 필터링을 이용한 영상의 부호화/복호화 방법 및 이를 이용하는 장치
JP7142180B2 (ja) 符号化装置、復号装置、及びプログラム
WO2016063440A1 (fr) Dispositif de codage d'image vidéo, dispositif de décodage d'image vidéo, procédé de codage d'image vidéo, procédé de décodage d'image vidéo, et programme
WO2016185651A1 (fr) Dispositif de codage de vidéo, dispositif de décodage de vidéo, procédé de codage de vidéo, procédé de décodage de vidéo, et programme
KR102644971B1 (ko) 크로마 양자화 파라미터 테이블을 사용하는 영상 디코딩 방법 및 그 장치
JP2016076755A (ja) 映像符号化装置、映像復号装置、映像符号化方法、映像復号方法及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15853451

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15853451

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP