WO2020059051A1 - Video encoding device, video encoding method, video encoding program, video decoding device, video decoding method, and video decoding program - Google Patents
- Publication number
- WO2020059051A1 (PCT/JP2018/034681)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- prediction mode
- block
- intra prediction
- video
- encoding
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/11—Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/119—Adaptive subdivision aspects, e.g. subdivision of a picture into rectangular or non-rectangular coding blocks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/157—Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/176—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/593—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/70—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
Definitions
- the present invention relates to a video encoding device, a video encoding method, a video encoding program, a video decoding device, a video decoding method, and a video decoding program.
- Video coding standards include H.264 and H.265/HEVC (High Efficiency Video Coding). Hereinafter, HEVC may be referred to as H.265/HEVC.
- HEVC employs two prediction methods, intra prediction and inter prediction, and three types of intra prediction modes are defined: planar prediction, DC prediction, and angle prediction.
- FIG. 1 shows the angles used in HEVC angle prediction.
- In intra prediction, a locally decoded pixel value of a block previously encoded in raster scan order is used as a predicted pixel value, and therefore the reference direction is any direction from the lower left, clockwise, to the upper right. If the angle indicating the left horizontal direction is 0 degrees, the angle range of the reference direction is from -45 degrees to +135 degrees.
- Numbers from 2 to 34 are sequentially assigned to the angles from -45 degrees to +135 degrees, and these numbers represent the 33 intra prediction modes for angle prediction. Note that 0 and 1 are assigned to planar prediction and DC prediction, respectively. These two intra prediction modes correspond to non-directional spatial intra prediction.
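The mode numbering described above can be sketched as follows. For illustration the 33 directions are treated as evenly spaced in degrees; the actual HEVC specification spaces directions by sample displacement rather than by angle, so the exact angle values are an assumption.

```python
# Sketch of the HEVC intra prediction mode numbering described above.
# Mode 0 = planar, 1 = DC, modes 2..34 = angular modes from -45 to +135 deg.
def hevc_mode_to_angle(mode: int):
    """Return a label or an approximate angle (degrees) for an HEVC mode."""
    if mode == 0:
        return "planar"
    if mode == 1:
        return "DC"
    if 2 <= mode <= 34:
        step = (135.0 - (-45.0)) / 32      # 33 directions -> 32 intervals
        return -45.0 + (mode - 2) * step
    raise ValueError(f"invalid HEVC intra mode: {mode}")

assert hevc_mode_to_angle(2) == -45.0
assert hevc_mode_to_angle(34) == 135.0
```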
- In angle prediction, a predicted pixel value of the encoding target block is generated by performing extrapolation along a reference direction specified from among the 33 reference directions.
- FIG. 2 shows an example of the extrapolation method in the case of intra prediction mode "6" (-22.5 degrees).
- the upper adjacent block adjacent to the upper side of the current block and the left adjacent block adjacent to the left side of the current block are encoded blocks.
- the upper adjacent block is adjacent to the upper side of the two horizontal sides of the current block, and the left adjacent block is adjacent to the left side of the two vertical sides of the current block.
- the adjacent pixel 201 (hatched square) is a pixel in the upper adjacent block or the left adjacent block, and the pixel 202 (open square) is a pixel in the encoding target block.
- a line segment 203 with an arrow passing through each pixel 202 indicates a reference direction in the intra prediction mode “6”.
- the pixel value of the adjacent pixel 201 existing at the end of the line segment 203 with an arrow passing through each pixel 202 is used as the predicted pixel value of the pixel 202.
- When the end of a line segment 203 falls between two adjacent pixels 201, a weighted addition of the pixel values of those adjacent pixels 201 becomes the predicted pixel value.
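The extrapolation described above can be sketched for vertical-ish directions as follows. The 1/32-sample weighting mirrors HEVC angular prediction, but the reference array layout here is a simplification for illustration, not the specification's exact derivation.

```python
# Simplified illustration of the weighted extrapolation described above:
# each pixel in the target block projects along the reference direction onto
# the row of adjacent pixels, and when the projection falls between two
# reference pixels their values are blended by weighted addition.
def predict_row(ref_top, intra_pred_angle, y, width):
    """Predict row y (0-based) of the block from the top reference row.

    intra_pred_angle: horizontal displacement in 1/32 samples per row.
    """
    pred = []
    offset = (y + 1) * intra_pred_angle
    idx, frac = offset >> 5, offset & 31      # integer and fractional part
    for x in range(width):
        a = ref_top[x + idx]
        b = ref_top[x + idx + 1]
        pred.append((a * (32 - frac) + b * frac + 16) >> 5)
    return pred

# With displacement 0 (pure vertical) each pixel copies the reference above.
row = predict_row([10, 20, 30, 40, 50], 0, y=0, width=4)
```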
- VVC (Versatile Video Coding) is the successor standard to HEVC.
- FIG. 3 shows an example of block division in VVC.
- FIG. 3(a) shows division into four.
- a block whose horizontal size (width) is W pixels and whose vertical size (height) is H pixels is divided into four blocks of the same shape.
- Each divided block has a width of W / 2 pixels and a height of H / 2 pixels.
- Hereinafter, a width of W pixels may be described as "width W", and a height of H pixels may be described as "height H".
- FIG. 3(b) shows horizontal and vertical division into two.
- a block having a width W and a height H is divided into two blocks having the same shape by a horizontal dividing line.
- the width of each block after division is W pixels, and the height is H / 2 pixels.
- a block having a width W and a height H is divided into two blocks having the same shape by a vertical dividing line.
- the width of each block after division is W / 2 pixels, and the height is H pixels.
- FIG. 3 (c) shows horizontal and vertical divisions into three.
- a block having a width W and a height H is divided into three blocks by two horizontal dividing lines.
- the width of each block after division is W pixels
- the height of the upper and lower two blocks is H / 4 pixels
- the height of the center block is H / 2 pixels.
- a block having a width W and a height H is divided into three blocks by two dividing lines in the vertical direction.
- the height of each divided block is H pixels
- the width of two blocks on the left and right is W / 4 pixels
- the width of the center block is W / 2 pixels.
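The split geometries above can be summarized in a short sketch. The mode labels ("quad", "binary_h", and so on) are illustrative names, not VVC syntax elements.

```python
# Sketch of the VVC block split types described above (quad, binary, ternary),
# returning the (width, height) of each child block.
def split_block(w: int, h: int, mode: str):
    if mode == "quad":                      # four W/2 x H/2 blocks
        return [(w // 2, h // 2)] * 4
    if mode == "binary_h":                  # one horizontal dividing line
        return [(w, h // 2)] * 2
    if mode == "binary_v":                  # one vertical dividing line
        return [(w // 2, h)] * 2
    if mode == "ternary_h":                 # two horizontal dividing lines
        return [(w, h // 4), (w, h // 2), (w, h // 4)]
    if mode == "ternary_v":                 # two vertical dividing lines
        return [(w // 4, h), (w // 2, h), (w // 4, h)]
    raise ValueError(mode)

assert split_block(16, 16, "ternary_v") == [(4, 16), (8, 16), (4, 16)]
```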
- FIG. 4 shows an example of image block division. As shown in FIG. 4, not only square blocks but also rectangular blocks can be used in VVC. As the ratio of height to width (aspect ratio) of a rectangular block, not only 1:2 and 2:1 but also other aspect ratios can be used.
- In VVC, an MPM list including three most probable modes (Most Probable Modes, MPMs) as entries is used, as in HEVC. An MPM is used as a candidate value (predicted value) of the intra prediction mode of the current block.
- an object of the present invention is to set an appropriate candidate value in video encoding using a candidate value of an intra prediction mode for a rectangular block.
- the video encoding device includes a generating unit, a predicting unit, a first encoding unit, and a second encoding unit.
- The generation unit changes the first prediction mode information to second prediction mode information based on a combination of the shape of the encoding target block in an image included in the video and the shape of an encoded block adjacent to the encoding target block. The first prediction mode information is prediction mode information indicating the intra prediction mode used for encoding the encoded block. The generation unit then uses the second prediction mode information to generate candidate information including a candidate value of the prediction mode information.
- the prediction unit generates an intra prediction pixel value of the current block in a predetermined intra prediction mode.
- the first encoding unit encodes the current block using the intra prediction pixel value, and the second encoding unit encodes prediction mode information indicating a predetermined intra prediction mode using the candidate information.
- an appropriate candidate value can be set in video encoding using a candidate value of an intra prediction mode for a rectangular block.
- FIG. 2 is a diagram showing an extrapolation method.
- FIG. 3 is a diagram illustrating block division in VVC.
- FIG. 4 is a diagram illustrating block division of an image.
- FIG. 5 is a diagram illustrating the angles used in VVC angle prediction.
- FIG. 6 is a diagram showing the intra prediction modes assigned to angle prediction.
- FIG. 7 is a diagram showing the intra prediction modes added for rectangular blocks.
- FIG. 8 is a diagram showing angle prediction for a rectangular block.
- FIG. 9 is a functional configuration diagram of the video encoding device.
- FIG. 10 is a functional configuration diagram of a video decoding device.
- FIG. 11 is a functional configuration diagram illustrating a specific example of a video encoding device.
- FIG. 12 is a functional configuration diagram of an intra prediction unit in the video encoding device.
- It is a diagram showing a method of changing the first intra prediction mode.
- It is a flowchart of a video encoding process.
- It is a flowchart of an intra prediction process in the video encoding device.
- It is a diagram illustrating a first adjacent block determination method.
- It is a diagram illustrating a second adjacent block determination method.
- It is a diagram illustrating a third adjacent block determination method.
- It is a functional configuration diagram illustrating a specific example of a video decoding device.
- It is a functional configuration diagram of an intra prediction unit in the video decoding device.
- It is a flowchart of a video decoding process.
- It is a flowchart of an intra prediction process in the video decoding device.
- It is a configuration diagram of an information processing device.
- For rectangular blocks, the angle range of HEVC angle prediction (-45 degrees to +135 degrees) may not be sufficient. VVC addresses this problem by extending the angle range of angle prediction for rectangular blocks.
- FIG. 5 shows the angles used in the angle prediction of VVC.
- In VVC, the interval between the angles used is halved in order to double the accuracy of HEVC angle prediction. Further, angles in the ranges of -73 degrees to -45 degrees and +135 degrees to +163 degrees are added as reference directions for rectangular blocks.
- The angle range 501 represents angle prediction from -45 degrees to +45 degrees (33 ways) for square and rectangular blocks, and the angle range 502 represents angle prediction from +45 degrees to +135 degrees (32 ways) for square and rectangular blocks.
- The angle range 503 represents angle prediction from -73 degrees to -45 degrees (10 ways) added for rectangular blocks, and the angle range 504 represents angle prediction from +135 degrees to +163 degrees (10 ways) added for rectangular blocks. Adding planar prediction and DC prediction to the total of 85 angle predictions gives a total of 87 intra prediction modes.
- FIG. 6 shows the intra prediction modes assigned to the angle prediction of the angle range 501 and the angle range 502 in FIG. 5. Numbers from 2 to 66 are sequentially assigned to the angles from -45 degrees to +135 degrees, and these numbers represent 65 types of intra prediction modes. 0 and 1 are assigned to planar prediction and DC prediction, respectively, as in HEVC.
- FIG. 7 shows an intra prediction mode assigned to the angle prediction of the angle range 503 and the angle range 504 in FIG.
- Numbers from 67 to 76 are sequentially assigned to the angles from immediately after +135 degrees up to +163 degrees, and numbers from -10 to -1 are sequentially assigned to the angles from -73 degrees up to just before -45 degrees.
- Angle predictions 67 to 76 are used for horizontally long blocks whose width is greater than their height, and angle predictions -10 to -1 are used for vertically long blocks whose height is greater than their width.
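The mode ranges above can be checked with a short sketch; the function name and the width/height tests are illustrative, not VVC syntax.

```python
# Sketch of the numbering described above for VVC's second intra prediction
# modes: 0 planar, 1 DC, 2..66 angles from -45 to +135 degrees, plus the
# wide-angle extensions 67..76 (past +135, horizontally long blocks only)
# and -10..-1 (below -45, vertically long blocks only).
def mode_allowed(mode: int, width: int, height: int) -> bool:
    """Whether a second intra prediction mode may be used for a block shape."""
    if 0 <= mode <= 66:
        return True                      # available for every block shape
    if 67 <= mode <= 76:
        return width > height            # horizontally long blocks only
    if -10 <= mode <= -1:
        return height > width            # vertically long blocks only
    return False

assert mode_allowed(70, 16, 8)
assert not mode_allowed(70, 8, 16)
assert mode_allowed(-3, 8, 16)
```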
- By adding these reference directions, the prediction error in intra prediction can be reduced, but the bit amount of the parameter indicating the intra prediction mode increases. In VVC, the total number of intra prediction modes defined for rectangular blocks increases from 67 to 87, while the total number of intra prediction modes that can be selected for each block remains 67.
- FIG. 8 shows an example of angle prediction for a rectangular block.
- The encoding target block 801 is a horizontally long block having an aspect ratio of 1:2, and the pixel 802 is located at the lower right corner of the encoding target block 801.
- the upper adjacent block adjacent to the upper side of the current block and the left adjacent block adjacent to the left side of the current block are coded blocks.
- the adjacent pixel 803 (diagonal line) is a pixel in the upper adjacent block or the left adjacent block, and is referred to in intra prediction.
- Arrows 810, 811, 820, and 821 indicate the reference directions of -45 degrees, -30 degrees, +135 degrees, and +150 degrees, respectively.
- arrows 811 and 821 indicate reference directions parallel to a diagonal line 831 connecting the lower left vertex and the upper right vertex of the encoding target block 801.
- the prediction efficiency of the angle prediction is inversely proportional to the distance between the prediction target pixel and the reference pixel. That is, as the distance between the prediction target pixel and the reference pixel is shorter, the prediction error can be expected to be smaller, and the prediction efficiency is improved. As a result, the coding efficiency of the prediction target pixel is improved.
- The length of each arrow in FIG. 8 corresponds to the distance between the prediction target pixel and the reference pixel. For the pixel 802, the length of the arrow 810 is longer than the length of the arrow 820 on its extension line. When the reference direction is parallel to the diagonal line 831, the two lengths become the same: the length of the arrow 811 is the same as the length of the arrow 821 on its extension line.
- Therefore, for a horizontally long block such as the encoding target block 801, angle prediction in the range from -45 degrees to -30 degrees has low prediction efficiency, and the probability of being selected in intra prediction is low.
- the reference direction of the arrow 820 is selected instead of the reference direction of the arrow 810.
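The distance argument above can be checked with a small geometric sketch: the distance from a pixel to the reference samples (the row above the block and the column to its left) along a given reference direction, with 0 degrees pointing left as in the text. The coordinate convention (y axis pointing downward, references at row -1 and column -1) is an illustrative assumption, not codec code.

```python
import math

def ref_distance(x, y, angle_deg):
    """Distance from pixel (x, y) to the reference samples along a direction."""
    ux = -math.cos(math.radians(angle_deg))   # 0 degrees points left
    uy = -math.sin(math.radians(angle_deg))   # negative angles point downward
    dists = []
    if ux < 0:                       # ray can reach the left reference column
        dists.append((-1 - x) / ux)
    if uy < 0:                       # ray can reach the top reference row
        dists.append((-1 - y) / uy)
    return min(dists) if dists else math.inf

# Bottom-right pixel of an 8x4 (horizontally long) block: the -45 degree
# reference is much farther away than the opposite +135 degree reference.
far = ref_distance(7, 3, -45)
near = ref_distance(7, 3, 135)
assert far > near
```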
- For a horizontally long block, the ten angles obtained by inverting the ten angles of the angle range 504 by 180 degrees are not used for angle prediction. Specifically, ten angles from the bottom of the angle range 501 are not used, and the numbers 2 to 11 of those angles are reassigned to the angles 67 to 76 of the angle range 504, respectively.
- Similarly, for a vertically long block, the ten angles obtained by inverting the ten angles of the angle range 503 by 180 degrees are not used for angle prediction. Specifically, ten angles from the right of the angle range 502 are not used, and the numbers 57 to 66 of those angles are reassigned to the angles -10 to -1 of the angle range 503, respectively.
- For the remaining angles, the same number as the original number is assigned by the reassignment, irrespective of the shape of the encoding target block.
- Hereinafter, the numbers 0 to 66 after the reassignment may be described as the first intra prediction mode, and the numbers -10 to 76 before the reassignment may be described as the second intra prediction mode.
- the first intra prediction mode is an example of prediction mode information.
- The number of the second intra prediction mode indicates an angle shown in FIG. 6 or FIG. 7.
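The reassignment described above can be sketched as a pair of mappings between the first and second intra prediction modes. The function names are illustrative; the offsets follow directly from the number ranges stated in the text (2..11 stand for 67..76 on horizontally long blocks, 57..66 stand for -10..-1 on vertically long blocks).

```python
# Sketch of the reassignment described above. All other numbers map to
# themselves regardless of block shape.
def first_to_second(mode: int, width: int, height: int) -> int:
    if width > height and 2 <= mode <= 11:
        return mode + 65                 # 2..11 -> 67..76
    if height > width and 57 <= mode <= 66:
        return mode - 67                 # 57..66 -> -10..-1
    return mode

def second_to_first(mode: int) -> int:
    if 67 <= mode <= 76:
        return mode - 65                 # 67..76 -> 2..11
    if -10 <= mode <= -1:
        return mode + 67                 # -10..-1 -> 57..66
    return mode

assert first_to_second(2, 16, 8) == 67
assert second_to_first(-10) == 57
```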
- As entries in the MPM list, the first intra prediction modes of the left adjacent block and the upper adjacent block, which have a high correlation with the first intra prediction mode of the current block, are used.
- Planar prediction and DC prediction, which are highly likely to be selected in the current block, are also used as entries in the MPM list.
- Three different first intra prediction modes are set in the three entries of the MPM list, and one of the three entries is specified by the 2-bit syntax element IntraLumaMPMIdx.
- When the first intra prediction mode of the current block is not included in the MPM list, it is directly coded by the syntax element IntraLumaMPMRemainder.
- The procedure by which the video encoding device derives IntraLumaMPMRemainder from IntraDir, which indicates the first intra prediction mode, is as follows.
- (P1) The video encoding device sets the value of IntraDir to IntraLumaMPMRemainder.
- The finally obtained IntraLumaMPMRemainder is smaller than the value of IntraDir by at most 3.
- The procedure by which the video decoding device derives IntraDir from IntraLumaMPMRemainder is as follows.
- (P1) The video decoding device sets the value of IntraLumaMPMRemainder to IntraDir.
- (P2) The video decoding device generates mpm_sort[i] in the same manner as in the procedure (P2) above.
- The finally obtained IntraDir is larger than the value of IntraLumaMPMRemainder by at most 3.
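The text abridges the middle step of both procedures. The following sketch assumes the usual MPM-removal scheme, in which IntraLumaMPMRemainder is the index of IntraDir among the modes not contained in the MPM list; this matches the stated bound of "at most 3". mpm_sort is assumed to be the MPM list sorted in ascending order, as referenced in procedure (P2).

```python
def derive_remainder(intra_dir: int, cand_mode_list) -> int:
    """Encoder side: IntraDir -> IntraLumaMPMRemainder."""
    rem = intra_dir                              # step (P1)
    mpm_sort = sorted(cand_mode_list)            # step (P2)
    for mpm in reversed(mpm_sort):               # largest entry first
        if intra_dir > mpm:
            rem -= 1                             # shrinks by at most 3
    return rem

def restore_intra_dir(rem: int, cand_mode_list) -> int:
    """Decoder side: IntraLumaMPMRemainder -> IntraDir."""
    intra_dir = rem                              # step (P1)
    mpm_sort = sorted(cand_mode_list)            # step (P2)
    for mpm in mpm_sort:                         # smallest entry first
        if intra_dir >= mpm:
            intra_dir += 1                       # grows by at most 3
    return intra_dir

# Round trip over every non-MPM mode with an example MPM list.
mpms = [0, 1, 50]
for mode in range(67):
    if mode not in mpms:
        assert restore_intra_dir(derive_remainder(mode, mpms), mpms) == mode
```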
- candModeList[0] to candModeList[2] are determined as follows.
- When the shape of an adjacent block is different from the shape of the encoding target block, an angle prediction usable in the adjacent block may not be usable in the encoding target block.
- For example, assume that the adjacent block is a vertically long block and the encoding target block is a square block. When the second intra prediction mode of the adjacent block is "-8", the corresponding first intra prediction mode is "59".
- However, as a candidate value of the first intra prediction mode of the square encoding target block, the number "2", which is closest to the angle indicated by "-8" among the angle predictions shown in FIG. 6, is more suitable. Since the prediction error can be expected to be smaller as the difference in the angle of the reference direction is smaller, the probability that the reference direction is selected in the encoding target block increases.
- Thus, when the aspect ratio of an adjacent block is different from the aspect ratio of the encoding target block, it is desirable to generate the MPM list in consideration of the continuity of the reference direction between the blocks.
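The idea above can be sketched as follows: a neighbour's directional mode is snapped to the angle-wise nearest mode that the (differently shaped) target block can actually use before being placed in the MPM list. The helper functions and the uniform angle spacing are illustrative assumptions, not the patent's exact method.

```python
def second_mode_to_angle(mode: int) -> float:
    """Approximate angle in degrees for a second intra prediction mode."""
    step = 180.0 / 64                       # modes 2..66 span -45..+135
    return -45.0 + (mode - 2) * step        # wide modes extend the same scale

def nearest_valid_mode(neigh_mode, tgt_w, tgt_h):
    """Snap a directional neighbour mode to one usable by the target block."""
    if neigh_mode in (0, 1):                # planar/DC need no conversion
        return neigh_mode
    angle = second_mode_to_angle(neigh_mode)
    if tgt_w > tgt_h:
        valid = list(range(12, 77))                       # horizontally long
    elif tgt_h > tgt_w:
        valid = list(range(-10, 0)) + list(range(2, 57))  # vertically long
    else:
        valid = list(range(2, 67))                        # square
    return min(valid, key=lambda m: abs(second_mode_to_angle(m) - angle))

# Neighbour (vertically long) used second mode -8; for a square target the
# closest available direction is mode 2 (-45 degrees), as in the example.
assert nearest_valid_mode(-8, 16, 16) == 2
```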
- FIG. 9 shows a functional configuration example of the video encoding device of the embodiment.
- the video encoding device 901 in FIG. 9 includes a generation unit 911, a prediction unit 912, a first encoding unit 913, and a second encoding unit 914.
- The generation unit 911 changes the first prediction mode information to second prediction mode information based on a combination of the shape of the encoding target block in an image included in the video and the shape of an encoded block adjacent to the encoding target block. The first prediction mode information is prediction mode information indicating the intra prediction mode used for encoding the encoded block. The generation unit 911 then uses the second prediction mode information to generate candidate information including a candidate value of the prediction mode information.
- the prediction unit 912 generates an intra prediction pixel value of the current block in a predetermined intra prediction mode.
- The first encoding unit 913 encodes the current block using the intra prediction pixel value, and the second encoding unit 914 encodes prediction mode information indicating the predetermined intra prediction mode using the candidate information.
- FIG. 10 shows an example of a functional configuration of the video decoding device according to the embodiment.
- the video decoding device 1001 in FIG. 10 includes a decoding unit 1011, a generation unit 1012, a prediction unit 1013, and a restoration unit 1014.
- the decoding unit 1011 decodes the encoded video, and extracts prediction residual information of a decoding target block in the encoded image included in the encoded video. Furthermore, the decoding unit 1011 extracts prediction mode information indicating the intra prediction mode of the current block and first prediction mode information indicating the intra prediction mode of a decoded block adjacent to the current block.
- The generating unit 1012 changes the first prediction mode information to second prediction mode information based on a combination of the shape of the current block and the shape of the decoded block, and uses the second prediction mode information to generate candidate information including a candidate value of the prediction mode information.
- the prediction unit 1013 generates the intra prediction pixel value of the decoding target block in the intra prediction mode indicated by the prediction mode information of the decoding target block using the candidate information.
- the restoration unit 1014 generates a pixel value of the current block using the intra prediction pixel value and the prediction residual information.
- an appropriate candidate value can be set in video encoding using a candidate value in the intra prediction mode for a rectangular block.
- FIG. 11 shows a specific example of the video encoding device 901 in FIG. 9. The video encoding device 1101 in FIG. 11 includes a subtraction unit 1111, a transform/quantization unit 1112, an entropy coding unit 1113, a mode determination unit 1114, an intra prediction unit 1115, and an inter prediction unit 1116.
- the video encoding device 1101 further includes an inverse quantization / inverse transform unit 1117, an addition unit 1118, a post filter unit 1119, and a frame memory 1120.
- the subtraction unit 1111 and the transform / quantization unit 1112 correspond to the first encoding unit 913 in FIG.
- the video encoding device 1101 can be implemented, for example, as a hardware circuit.
- each component of the video encoding device 1101 may be implemented as an individual circuit, or may be implemented as one integrated circuit.
- the video encoding device 1101 encodes an input video and outputs the encoded video as an encoded stream.
- the video encoding device 1101 can transmit the encoded stream to the video decoding device 1001 in FIG. 10 via a communication network.
- the video encoding device 1101 may be incorporated in a video camera, a video transmission device, a video phone system, a computer, or a mobile terminal device.
- the input video includes a plurality of images corresponding to a plurality of times, respectively.
- the image at each time is sometimes called a picture or a frame.
- Each image may be a color image or a monochrome image.
- pixel values may be in RGB format or YUV format.
- By transmitting the parameter indicating the prediction mode and the prediction residual information, the video decoding device can generate the same predicted image as the video encoding device. In this case, since only the difference information needs to be transmitted as the encoded stream, video encoding with high compression efficiency is realized.
- The inverse quantization/inverse transform unit 1117, the addition unit 1118, the post filter unit 1119, and the frame memory 1120 are used for local decoding processing in the video encoding device 1101.
- Each image is divided into unit blocks of a predetermined size, and is encoded for each unit block in raster scan order.
- the unit block may be used as it is as a coding target block, or a block obtained by further dividing the unit block may be used as a coding target block. Then, intra prediction or inter prediction is performed on the current block.
- In the case of intra prediction, a predicted image of the encoding target block in each intra prediction mode is generated using adjacent pixels in the upper adjacent block or the left adjacent block, and the intra prediction mode with the highest prediction efficiency is selected.
- As the intra prediction modes, planar prediction, DC prediction, and the angle predictions shown in FIG. 5 are used, and a parameter indicating the intra prediction mode with the highest prediction efficiency and prediction residual information are transmitted as an encoded stream.
- In the case of inter prediction, an image that has been encoded in the past is set as a reference image, and block matching processing by motion vector search is performed between the current block and reference blocks in the reference image to detect the reference block with the highest prediction efficiency. Then, the information of the reference image and the information of the motion vector indicating the position of the detected reference block are transmitted as parameters indicating the inter prediction mode, and the difference between the reference block and the current block is transmitted as prediction residual information.
- the intra prediction unit 1115 calculates the intra prediction pixel value of the current block using the decoded pixel value before the post filter application output from the addition unit 1118, and outputs the pixel value to the mode determination unit 1114.
- the inter prediction unit 1116 calculates the inter prediction pixel value of the current block using the pixel value of the reference image output from the frame memory 1120, and outputs the calculated pixel value to the mode determination unit 1114.
- The mode determination unit 1114 determines which of intra prediction and inter prediction has the higher prediction efficiency, and selects the prediction result with the higher efficiency. Then, the mode determination unit 1114 outputs the predicted pixel value of the selected prediction result, from among the intra prediction pixel value and the inter prediction pixel value, to the subtraction unit 1111 and the addition unit 1118.
- the subtraction unit 1111 outputs the difference between the pixel value of the current block and the prediction pixel value output from the mode determination unit 1114 to the transformation / quantization unit 1112 as a prediction residual.
- the transform / quantization unit 1112 performs orthogonal transform and quantization of the prediction residual, and outputs the quantization coefficient to the entropy coding unit 1113 and the inverse quantization / inverse transformation unit 1117 as prediction residual information.
- The entropy coding unit 1113 converts the quantization coefficients and the parameter indicating the selected intra prediction mode or inter prediction mode into a binary sequence by entropy coding (variable-length coding), and outputs an encoded video.
- the inverse quantization / inverse transform unit 1117 performs inverse quantization and inverse orthogonal transform of the quantized coefficient to restore the prediction residual, and outputs the restored prediction residual to the addition unit 1118.
- The addition unit 1118 adds the predicted pixel value output from the mode determination unit 1114 and the prediction residual output from the inverse quantization/inverse transform unit 1117 to generate a decoded pixel value before post filter application. Then, the addition unit 1118 outputs the generated decoded pixel value to the post filter unit 1119 and the intra prediction unit 1115.
- the post-filter unit 1119 applies a post-filter to the decoded pixel value before the post-filter is applied to reduce the quantization error, and generates a decoded pixel value after the post-filter is applied. Then, the post-filter unit 1119 outputs the generated decoded pixel value to the frame memory 1120.
- the frame memory 1120 stores the decoded pixel value after the post filter application as a local decoded pixel value.
- the locally decoded pixel value stored in the frame memory 1120 is output to the inter prediction unit 1116 as a pixel value of a reference image.
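The block-level data flow above can be condensed into a minimal sketch: subtract the prediction, quantize the residual, then locally reconstruct exactly as the decoder will, so that later predictions use decoded rather than original pixel values. Scalar quantization stands in for the transform/quantization pair, and qstep is an illustrative parameter, not a codec setting.

```python
def encode_block(orig, pred, qstep=8):
    residual = [o - p for o, p in zip(orig, pred)]        # subtraction unit
    qcoef = [round(r / qstep) for r in residual]          # transform/quantization
    recon_res = [q * qstep for q in qcoef]                # inverse quant./transform
    decoded = [p + r for p, r in zip(pred, recon_res)]    # addition unit
    # qcoef goes to the entropy coder; decoded feeds intra prediction
    # and (after post filtering) the frame memory.
    return qcoef, decoded

qcoef, decoded = encode_block([100, 104, 97, 99], [96, 96, 96, 96])
```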
- FIG. 12 shows a functional configuration example of the intra prediction unit 1115 in FIG. 11. The intra prediction unit 1115 in FIG. 12 includes an MPM generation unit 1211, a prediction mode calculation unit 1212, an encoding unit 1213, a prediction mode calculation unit 1214, and a filter unit 1215.
- the MPM generation unit 1211, the encoding unit 1213, and the filter unit 1215 correspond to the generation unit 911, the second encoding unit 914, and the prediction unit 912 in FIG. 9, respectively.
- the shape parameters indicating the shapes of the encoding target block, the left adjacent block, and the upper adjacent block are input to the MPM generation unit 1211 and the prediction mode calculation unit 1214 from an encoding control unit (not shown).
- the width W and the height H of each block are used as the shape parameters.
- the first intra prediction mode of the current block, the left adjacent block, and the upper adjacent block is input from the prediction mode calculation unit 1212 to the MPM generation unit 1211.
- the MPM generation unit 1211 changes the first intra prediction mode of the adjacent block based on a combination of the shape of the current block and the shape of each adjacent block. Note that the first intra prediction mode of the adjacent block for which the inter prediction mode has been selected is regarded as DC prediction.
- The MPM generation unit 1211 generates an MPM list using the changed first intra prediction modes of the left adjacent block and the upper adjacent block, and outputs the MPM list and the first intra prediction mode of the current block to the encoding unit 1213.
- the MPM list is an example of candidate information including candidate values of prediction mode information.
- the prediction mode calculation unit 1214 determines the second intra prediction mode having the highest prediction efficiency for the current block by performing a search process that calculates the prediction efficiency of every second intra prediction mode. Then, the prediction mode calculation unit 1214 outputs the determined second intra prediction mode to the prediction mode calculation unit 1212 and the filter unit 1215.
- the prediction mode calculation unit 1212 converts the second intra prediction mode output from the prediction mode calculation unit 1214 into the first intra prediction mode, and outputs the first intra prediction mode to the MPM generation unit 1211.
- the number of the second intra prediction mode indicating each angle shown in FIGS. 6 and 7 is converted to the number of the first intra prediction mode.
- the filter unit 1215 applies a filter corresponding to the second intra prediction mode output from the prediction mode calculation unit 1214 to the decoded pixel values before post filter application, and generates an intra prediction pixel value of the current block. Then, the filter unit 1215 outputs the generated intra prediction pixel value to the mode determination unit 1114.
- the filter corresponding to the second intra prediction mode is defined by the VVC standard.
- the encoding unit 1213 encodes the first intra prediction mode of the current block using the MPM list, and generates an intra prediction parameter indicating the first intra prediction mode. Then, encoding section 1213 outputs the generated intra prediction parameter to mode determination section 1114. IntraLumaMPMFlag, IntraLumaMPMIdx, and IntraLumaMPMRemainder are used as the intra prediction parameters.
- IntraLumaMPMFlag is a flag indicating whether or not to use the MPM list. When IntraLumaMPMFlag is logical "1", the MPM list is used. When IntraLumaMPMFlag is logical "0", the MPM list is not used.
- IntraLumaMPMIdx is a parameter that specifies an entry in the MPM list
- IntraLumaMPMRemainder is a parameter that specifies the remaining first intra prediction mode that is not registered in the MPM list.
- When the first intra prediction mode of the current block is registered in the MPM list, IntraLumaMPMFlag is set to logical "1", and IntraLumaMPMIdx specifying the matching entry is generated.
- When the first intra prediction mode of the current block is not registered in the MPM list, IntraLumaMPMFlag is set to logical "0". Then, the first intra prediction mode is converted into IntraLumaMPMRemainder by the procedures (P1) to (P3).
- the MPM generation unit 1211 changes the first intra prediction mode of each adjacent block based on a combination of the ratio H/W of the height H to the width W of the encoding target block and the ratio Hn/Wn of the height Hn to the width Wn of that adjacent block. This change is performed independently for each of the left adjacent block and the upper adjacent block.
- When the angle A1 indicated by the first intra prediction mode of the adjacent block is an angle that is not used for intra prediction in the shape of the encoding target block, the first intra prediction mode corresponding to the angle A2 closest to A1, among the angles that are used for intra prediction in that shape, is used as the changed first intra prediction mode.
- This makes it possible to generate an MPM list whose entries are first intra prediction modes usable in the current block. Furthermore, since the angle prediction closest to the one adopted in the adjacent block, among the angle predictions usable in the current block, is included as an entry, the prediction efficiency of the first intra prediction mode based on the MPM list is improved.
- FIG. 13 shows an example of such a method of changing the first intra prediction mode.
- Each row of the table in FIG. 13 corresponds to a predetermined value of Hn / Wn, and each column corresponds to a predetermined value of H / W. Therefore, each cell of the table corresponds to a predetermined combination of Hn / Wn and H / W.
- the fifth column represents H / W ⁇ 1 /.
- a change instruction in the form of "Same" or "ModeBefore→ModeAfter" is described in each cell. "Same" indicates an instruction not to change the first intra prediction mode, and a change instruction of the form "ModeBefore→ModeAfter" indicates an instruction to change the first intra prediction mode indicated by the number ModeBefore to the first intra prediction mode indicated by the number ModeAfter.
- the MPM generation unit 1211 uses the first intra prediction modes changed by the change method in FIG. 13 as candIntraPredModeA and candIntraPredModeB, and determines candModeList[0] to candModeList[2] according to the generation method of the VVC standard described above.
- the first intra prediction mode of the adjacent block is changed based on the combination of the shape of the current block and the shape of the adjacent block. Accordingly, even when the aspect ratio of the adjacent block is different from the aspect ratio of the current block, an appropriate MPM list can be generated in consideration of the continuity of the reference direction between the blocks.
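The substitution described above, replacing an adjacent block's mode with the usable mode whose angle is closest, can be sketched as follows. This is a minimal illustration: the mode-to-angle mapping and the set of modes usable for the target block shape are hypothetical placeholder values, not the actual VVC tables.

```python
def closest_usable_mode(neighbor_mode, usable_modes, mode_angle):
    """Return neighbor_mode unchanged if it is usable for the target block
    shape ("Same"); otherwise return the usable mode whose prediction
    angle A2 is closest to the neighbor's angle A1."""
    if neighbor_mode in usable_modes:
        return neighbor_mode
    a1 = mode_angle[neighbor_mode]
    return min(usable_modes, key=lambda m: abs(mode_angle[m] - a1))

# Hypothetical angles (degrees) for a few angular modes.
mode_angle = {2: -45, 34: 45, 50: 90, 66: 135, 70: 145}
usable = {2, 34, 50, 66}  # modes assumed usable for the target block shape
print(closest_usable_mode(70, usable, mode_angle))  # 70 is unusable -> 66
```

The same idea applies on the decoding side, since encoder and decoder must perform an identical change to keep their MPM lists in sync.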
- FIG. 14 is a flowchart illustrating an example of a video encoding process performed by the video encoding device 1101 in FIG. 11. In this video encoding process, the encoding process is performed for each CU (Coding Unit), which is an example of a block.
- the intra prediction unit 1115 performs intra prediction on a block (CU) of each block size (step 1401). Then, the intra prediction unit 1115 performs an intra prediction mode determination and selects an intra prediction mode with the highest prediction efficiency (step 1402).
- the inter prediction unit 1116 performs inter prediction on blocks of each block size (step 1403).
- the inter prediction is performed for each PU (Prediction Unit) obtained by further dividing the CU.
- the inter prediction unit 1116 performs the inter prediction mode determination and selects the inter prediction mode with the highest prediction efficiency (Step 1404).
- the mode determination unit 1114 performs a mode determination to determine whether to apply the intra prediction mode or the inter prediction mode for each block (CU) (step 1405). Then, the subtraction unit 1111 and the transform / quantization unit 1112 encode the current block according to the prediction mode determined by the mode determination unit 1114, and generate quantization coefficients (step 1406).
- the video encoding device 1101 determines whether or not the encoding of the image has been completed (Step 1407). If an unprocessed block remains (step 1407, NO), the video encoding device 1101 repeats the processing from step 1401 onward for the next block.
- the entropy encoding unit 1113 performs variable length encoding on the quantization coefficient and the parameter indicating the determined prediction mode (Step 1408).
- the video coding apparatus 1101 determines whether or not video coding has been completed (Step 1409). If an unprocessed image remains (step 1409, NO), the video encoding device 1101 repeats the processing from step 1401 onward for the next image. Then, when the video encoding is completed (Step 1409, YES), the video encoding device 1101 ends the processing.
- FIG. 15 is a flowchart showing an example of the intra prediction process in step 1401 of FIG.
- the MPM generation unit 1211 changes the first intra prediction mode of the left adjacent block and the upper adjacent block, and generates an MPM list using the changed first intra prediction mode (step 1501).
- the prediction mode calculation unit 1214 determines the second intra prediction mode of the coding target block (step 1502), and the prediction mode calculation unit 1212 sets the determined second intra prediction mode to the first intra prediction mode. Conversion is performed (step 1503).
- the encoding unit 1213 generates an IntraLumaMPMFlag indicating whether to use the MPM list (step 1504), and checks the value of the generated IntraLumaMPMFlag (step 1505).
- When IntraLumaMPMFlag is logical "1" (step 1505, YES), the encoding unit 1213 generates IntraLumaMPMIdx indicating the entry of the MPM list corresponding to the first intra prediction mode of the current block (step 1506). On the other hand, when IntraLumaMPMFlag is logical "0" (step 1505, NO), the encoding unit 1213 generates IntraLumaMPMRemainder corresponding to the first intra prediction mode of the current block (step 1507).
- the filter unit 1215 generates an intra prediction pixel value of the coding target block in the determined second intra prediction mode (Step 1508).
- FIG. 16 shows an example of the first neighboring block determination method.
- the uppermost left adjacent block 1602 is selected as the left adjacent block used for generating the MPM list.
- the leftmost upper neighboring block 1603 is selected as the upper neighboring block used for generating the MPM list.
- FIG. 17 shows an example of the second neighboring block determination method.
- among the plurality of left neighboring blocks adjacent to the left of the encoding target block 1701, the left neighboring block 1702 located at the top is selected as the left neighboring block used for generating the MPM list.
- the rightmost upper adjacent block 1703 is selected as the upper adjacent block used for generating the MPM list.
- among the plurality of left adjacent blocks adjacent to the left of the encoding target block 1711, the lowermost left adjacent block 1712 is selected as the left adjacent block used for generating the MPM list.
- the leftmost upper adjacent block 1713 is selected as the upper adjacent block used for generating the MPM list.
- FIG. 18 shows an example of the third adjacent block determination method.
- the left adjacent block having the highest-frequency first intra prediction mode is selected as the left adjacent block used for generating the MPM list.
- the upper neighboring block having the highest frequency first intra prediction mode is selected as the upper neighboring block used for generating the MPM list.
- the prediction efficiency of the first intra prediction mode based on the MPM list is improved.
- For example, suppose the first intra prediction modes M1 of the upper adjacent blocks 1811 to 1814, which are adjacent above the horizontally long encoding target block 1801, are I1, I2, I2, and I3, respectively, where I1 to I3 are mutually different numbers.
- In this case, the frequency of I1 is one, the frequency of I2 is two, and the frequency of I3 is one. Therefore, the upper adjacent blocks 1812 and 1813 having I2, the highest-frequency mode, are selected, and the first intra prediction mode of these blocks is adopted as the first intra prediction mode of the upper adjacent block.
- When the upper neighboring blocks 1811 to 1814 all have different first intra prediction modes, the upper neighboring block is selected according to the first or second neighboring block determination method. If any upper neighboring block is coded in the inter prediction mode, that block is excluded from the frequency counting targets.
- the left adjacent block used for generating the MPM list is selected in the same manner as for the upper adjacent blocks 1811 to 1814.
- the same method of determining an adjacent block is applied to an upper adjacent block 1841 and an upper adjacent block 1842 which are adjacent above the vertically long encoding target block 1831.
- the same adjacent block determination method is applied to the left adjacent block 1851 to the left adjacent block 1853 adjacent to the left side of the current block 1831.
- For example, when the first intra prediction modes M1 of the left adjacent blocks 1851 to 1853 are all different, the frequency of each mode is one.
- In this case, the length of the side of the left adjacent block 1852 that is in contact with the encoding target block 1831 is twice the length of the corresponding side of the left adjacent block 1851 or the left adjacent block 1853. Therefore, the left adjacent block 1852, which has the longest side in contact with the encoding target block 1831, may be selected as the left adjacent block used for generating the MPM list.
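The tie-break just described, preferring the neighbour that shares the longest side with the target block, can be sketched as below; the (block_id, contact_length) pair representation is a hypothetical simplification introduced for this illustration.

```python
def pick_by_longest_contact_side(neighbors):
    """neighbors: list of (block_id, contact_length) pairs, where
    contact_length is the length of the side the neighbouring block
    shares with the target block. Returns the id of the block whose
    shared side is longest."""
    return max(neighbors, key=lambda n: n[1])[0]

# Left adjacent blocks 1851-1853: block 1852 shares a side twice as long.
print(pick_by_longest_contact_side([(1851, 4), (1852, 8), (1853, 4)]))  # 1852
```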
- FIG. 19 shows a specific example of the video decoding device 1001 in FIG.
- the video decoding device 1901 in FIG. 19 includes an entropy decoding unit 1911, an inverse quantization / inverse transformation unit 1912, an intra prediction unit 1913, an inter prediction unit 1914, an addition unit 1915, a post filter unit 1916, and a frame memory 1917.
- the entropy decoding unit 1911 corresponds to the decoding unit 1011 in FIG. 10
- the inverse quantization / inverse transformation unit 1912 and the addition unit 1915 correspond to the restoration unit 1014.
- the video decoding device 1901 can be implemented as, for example, a hardware circuit. In this case, each component of the video decoding device 1901 may be implemented as an individual circuit or as one integrated circuit.
- the video decoding device 1901 decodes an encoded stream of an input encoded video and outputs a decoded video.
- the video decoding device 1901 can receive an encoded stream from the video encoding device 1101 of FIG. 11 via a communication network.
- the video decoding device 1901 may be incorporated in a video camera, a video receiving device, a video phone system, a computer, or a mobile terminal device.
- the entropy decoding unit 1911 decodes the encoded video by entropy decoding (variable length decoding), extracts the quantization coefficient of each block in the decoding target image as prediction residual information, and extracts a parameter indicating the prediction mode of each block. Further, the entropy decoding unit 1911 also extracts shape parameters indicating the shape of each block.
- the parameter indicating the prediction mode includes an intra prediction parameter indicating the intra prediction mode or an inter prediction parameter indicating the inter prediction mode.
- entropy decoding section 1911 outputs the quantized coefficient to inverse quantization / inverse transform section 1912, outputs the shape parameter and the intra prediction parameter to intra prediction section 1913, and outputs the inter prediction parameter to inter prediction section 1914.
- the inverse quantization / inverse transform unit 1912 performs inverse quantization and inverse orthogonal transform of the quantized coefficient to restore the prediction residual, and outputs the restored prediction residual to the addition unit 1915.
- the intra prediction unit 1913 uses the shape parameter and the intra prediction parameter output from the entropy decoding unit 1911 to calculate the intra prediction pixel value of the current block from the decoded pixel values before post filter application output from the addition unit 1915. Then, the intra prediction unit 1913 outputs the calculated intra prediction pixel value to the addition unit 1915.
- the inter prediction unit 1914 performs a motion compensation process using the inter prediction parameter output from the entropy decoding unit 1911 and the pixel values of the reference image output from the frame memory 1917, and calculates the inter prediction pixel value of the decoding target block. Then, the inter prediction unit 1914 outputs the calculated inter prediction pixel value to the addition unit 1915.
- the addition unit 1915 adds the prediction pixel value output from the intra prediction unit 1913 or the inter prediction unit 1914 and the prediction residual output from the inverse quantization / inverse transformation unit 1912, and performs decoding before applying the post filter. Generate pixel values. Then, the addition unit 1915 outputs the generated decoded pixel value to the post-filter unit 1916 and the intra prediction unit 1913.
- to reduce the quantization error, the post-filter unit 1916 applies a post filter to the decoded pixel values before post filter application, and generates decoded pixel values after post filter application. Then, the post-filter unit 1916 outputs the generated decoded pixel values to the frame memory 1917.
- the frame memory 1917 stores the decoded pixel value after the post filter is applied, and outputs a decoded video including the decoded pixel value.
- the decoded pixel value stored in the frame memory 1917 is output to the inter prediction unit 1914 as a pixel value of a reference image.
- FIG. 20 shows a functional configuration example of the intra prediction unit 1913 in FIG. 19. The intra prediction unit 1913 in FIG. 20 includes an MPM generation unit 2011, a storage unit 2012, a prediction mode calculation unit 2013, a prediction mode calculation unit 2014, and a filter unit 2015.
- the MPM generation unit 2011 and the filter unit 2015 correspond to the generation unit 1012 and the prediction unit 1013 in FIG. 10, respectively.
- the shape parameter is input from the entropy decoding unit 1911 to the MPM generation unit 2011 and the prediction mode calculation unit 2014. Further, the prediction mode calculation unit 2013 receives the intra prediction parameters from the entropy decoding unit 1911.
- the input intra prediction parameters include IntraLumaMPMFlag and IntraLumaMPMIdx or IntraLumaMPMRemainder.
- the storage unit 2012 stores the width, height, and first intra prediction mode of each block.
- DC prediction is stored as the first intra prediction mode of the block for which the inter prediction mode has been selected. Then, the storage unit 2012 outputs the width Wn and the height Hn of each of the left adjacent block and the upper adjacent block and the first intra prediction mode of each of the left adjacent block and the upper adjacent block to the MPM generation unit 2011.
- the MPM generation unit 2011 changes the first intra prediction mode of each adjacent block based on the combination of the shape of the current block and the shape of that adjacent block, using the same change method as the video encoding device 1101 in FIG. 11. At this time, the MPM generation unit 2011 changes the first intra prediction mode of each adjacent block based on a combination of the ratio H/W of the height H to the width W of the decoding target block and the ratio Hn/Wn of the height Hn to the width Wn of that adjacent block. This change is performed independently for each of the left adjacent block and the upper adjacent block.
- When the angle A1 indicated by the first intra prediction mode of the adjacent block is an angle not used for intra prediction in the shape of the decoding target block, the first intra prediction mode corresponding to the angle A2 closest to A1, among the angles that are used for intra prediction in that shape, is used as the changed first intra prediction mode.
- the MPM generation unit 2011 can change the first intra prediction mode of the adjacent block according to the change method shown in FIG. 13.
- the MPM list used for encoding the intra prediction parameters can be restored from the encoded video.
- the MPM generation unit 2011 generates an MPM list by the VVC standard generation method described above, using the changed first intra prediction modes of the left adjacent block and the upper adjacent block, and outputs the generated MPM list to the prediction mode calculation unit 2013.
- the prediction mode calculation unit 2013 obtains the first intra prediction mode of the decoding target block from the input intra prediction parameters using the MPM list, and outputs the first intra prediction mode to the storage unit 2012 and the prediction mode calculation unit 2014.
- When IntraLumaMPMFlag is logical "1", the entry of the MPM list specified by IntraLumaMPMIdx is output as the first intra prediction mode of the decoding target block.
- When IntraLumaMPMFlag is logical "0", IntraDir is obtained from IntraLumaMPMRemainder by the procedures (P11) to (P13), and IntraDir is output as the first intra prediction mode of the decoding target block.
- the prediction mode calculation unit 2014 converts the first intra prediction mode of the current block to the second intra prediction mode based on the width W and the height H of the current block.
- the filter unit 2015 applies a filter corresponding to the second intra prediction mode output from the prediction mode calculation unit 2014 to the decoded pixel values before post filter application, and generates an intra prediction pixel value of the decoding target block. Then, the filter unit 2015 outputs the generated intra prediction pixel value to the addition unit 1915.
- the encoded video output from the video encoding device 1101 in FIG. 11 can be decoded to restore the original video.
- FIG. 21 is a flowchart illustrating an example of a video decoding process performed by the video decoding device 1901 in FIG. In this video decoding process, a decoding process is performed for each CU that is an example of a block.
- the entropy decoding unit 1911 performs variable-length decoding on the coded video to extract quantization coefficients, shape parameters, and parameters indicating prediction modes of the current block (CU to be decoded) (step 2101). Then, entropy decoding section 1911 checks whether the parameter indicating the prediction mode is an intra prediction parameter or an inter prediction parameter (step 2102).
- When the parameter indicating the prediction mode is an intra prediction parameter, the intra prediction unit 1913 performs intra prediction on the current block and calculates the intra prediction pixel value of the current block (step 2103).
- On the other hand, when the parameter is an inter prediction parameter, the inter prediction unit 1914 performs a motion compensation process on the current block to calculate the inter prediction pixel value of the current block (step 2104).
- the inverse quantization / inverse transform unit 1912 decodes the quantized coefficient of the decoding target block to restore the prediction residual (step 2105). Then, the addition unit 1915 and the post-filter unit 1916 generate the decoded pixel values of the decoding target block using the restored prediction residual and the predicted pixel value output from the intra prediction unit 1913 or the inter prediction unit 1914.
- the video decoding device 1901 determines whether or not decoding of the encoded video has been completed (Step 2106). If an unprocessed binary string remains (step 2106, NO), the video decoding device 1901 repeats the processing from step 2101 on for the next binary string. Then, when the decoding of the encoded video is completed (step 2106, YES), the video decoding device 1901 ends the processing.
- FIG. 22 is a flowchart showing an example of the intra prediction process in step 2103 of FIG.
- the MPM generation unit 2011 changes the first intra prediction mode of the left adjacent block and the upper adjacent block, and generates an MPM list using the changed first intra prediction mode (Step 2201).
- the prediction mode calculation unit 2013 checks the value of IntraLumaMPMFlag (step 2202). When IntraLumaMPMFlag is logic “1”, the prediction mode calculation unit 2013 acquires the value of IntraLumaMPMIdx (step 2203). Then, the prediction mode calculation unit 2013 acquires the entry of the MPM list specified by IntraLumaMPMIdx as the first intra prediction mode of the decoding target block (Step 2204).
- On the other hand, when IntraLumaMPMFlag is logical "0", the prediction mode calculation unit 2013 acquires the value of IntraLumaMPMRemainder (step 2205), and converts the acquired value into the first intra prediction mode (step 2206).
- the prediction mode calculation unit 2014 converts the first intra prediction mode of the decoding target block into the second intra prediction mode (Step 2207). Then, the filter unit 2015 generates an intra prediction pixel value of the decoding target block based on the second intra prediction mode output from the prediction mode calculation unit 2014 (Step 2208).
- the adjacent block used for generating the MPM list is determined in the same manner as in the adjacent block determination methods shown in FIGS. 16 to 18.
- the first to third adjacent block determination methods may be applied by replacing the encoding target block in FIGS. 16 to 18 with the decoding target block.
- the configuration of the video encoding device in FIGS. 9 and 11 is merely an example, and some components may be omitted or changed according to the use or conditions of the video encoding device.
- the configuration of the intra prediction unit 1115 in FIG. 12 is merely an example, and some components may be omitted or changed depending on the application or conditions of the video encoding device.
- the video encoding device may employ an encoding method other than VVC.
- the configuration of the video decoding device in FIGS. 10 and 19 is merely an example, and some components may be omitted or changed according to the application or conditions of the video decoding device.
- the configuration of the intra prediction unit 1913 in FIG. 20 is merely an example, and some components may be omitted or changed depending on the application or conditions of the video decoding device.
- the video decoding device may employ a decoding method other than VVC.
- the flowcharts in FIGS. 14, 15, 21, and 22 are merely examples, and some processes may be omitted or changed depending on the configuration or conditions of the video encoding device or the video decoding device.
- the encoding target block shown in FIGS. 2, 8, and 16 to 18 and the left adjacent blocks and upper adjacent blocks shown in FIGS. 16 to 18 are merely examples; they change according to the video to be encoded.
- the neighboring block determination method shown in FIGS. 16 to 18 is merely an example, and a neighboring block used for generating the MPM list may be determined by another neighboring block determination method.
- the method of changing the first intra prediction mode shown in FIG. 13 is only an example, and the first intra prediction mode of the adjacent block may be changed by another changing method.
- the video encoding device of FIGS. 9 and 11 and the video decoding device of FIGS. 10 and 19 can be implemented as a hardware circuit, or can be implemented using an information processing device (computer).
- FIG. 23 illustrates a configuration example of an information processing device used as the video encoding device 901, the video decoding device 1001, the video encoding device 1101, and the video decoding device 1901.
- the information processing device in FIG. 23 includes a CPU (Central Processing Unit) 2301, a memory 2302, an input device 2303, an output device 2304, an auxiliary storage device 2305, a medium drive device 2306, and a network connection device 2307. These components are connected to each other by a bus 2308.
- the memory 2302 is a semiconductor memory such as a ROM (Read Only Memory), a RAM (Random Access Memory), or a flash memory, and stores programs and data used for processing.
- the memory 2302 can be used as the frame memory 1120 in FIG. 11, the frame memory 1917 in FIG. 19, or the storage unit 2012 in FIG. 20.
- the CPU 2301 (processor) operates as the generation unit 911, the prediction unit 912, the first encoding unit 913, and the second encoding unit 914 in FIG. 9 by executing a program using the memory 2302, for example.
- the CPU 2301 operates as the decoding unit 1011, the generation unit 1012, the prediction unit 1013, and the restoration unit 1014 in FIG. 10 by executing a program using the memory 2302.
- the CPU 2301 operates as the subtraction unit 1111, the transformation / quantization unit 1112, the entropy encoding unit 1113, and the mode determination unit 1114 in FIG. 11 by executing the program using the memory 2302.
- the CPU 2301 operates as an intra prediction unit 1115, an inter prediction unit 1116, an inverse quantization / inverse conversion unit 1117, an addition unit 1118, and a post filter unit 1119 by executing a program using the memory 2302.
- the CPU 2301 operates as the MPM generation unit 1211, the prediction mode calculation unit 1212, the encoding unit 1213, the prediction mode calculation unit 1214, and the filter unit 1215 in FIG. 12 by executing a program using the memory 2302.
- the CPU 2301 also operates as the entropy decoding unit 1911, the inverse quantization / inverse transformation unit 1912, the intra prediction unit 1913, and the inter prediction unit 1914 in FIG. 19 by executing the program using the memory 2302.
- the CPU 2301 operates as an addition unit 1915 and a post-filter unit 1916 by executing a program using the memory 2302.
- the CPU 2301 also operates as the MPM generation unit 2011, the prediction mode calculation unit 2013, the prediction mode calculation unit 2014, and the filter unit 2015 of FIG. 20 by executing a program using the memory 2302.
- the input device 2303 is, for example, a keyboard, a pointing device, or the like, and is used for inputting an instruction or information from a user or an operator.
- the output device 2304 is, for example, a display device, a printer, a speaker, or the like, and is used for inquiring a user or an operator or outputting a processing result.
- the processing result may be a decoded video.
- the auxiliary storage device 2305 is, for example, a magnetic disk device, an optical disk device, a magneto-optical disk device, a tape device, or the like.
- the auxiliary storage device 2305 may be a hard disk drive.
- the information processing device can store programs and data in the auxiliary storage device 2305 and load them into the memory 2302 for use.
- the medium driving device 2306 drives the portable recording medium 2309 and accesses the recorded contents.
- the portable recording medium 2309 is a memory device, a flexible disk, an optical disk, a magneto-optical disk, or the like.
- the portable recording medium 2309 may be a CD-ROM (Compact Disk Read Only Memory), a DVD (Digital Versatile Disk), or a USB (Universal Serial Bus) memory.
- the user or the operator can store programs and data in the portable recording medium 2309 and load them into the memory 2302 for use.
- the computer-readable recording medium that stores the programs and data used for the processing is a physical (non-transitory) recording medium such as the memory 2302, the auxiliary storage device 2305, or the portable recording medium 2309.
- the network connection device 2307 is a communication interface circuit that is connected to a communication network such as a LAN (Local Area Network) and a WAN (Wide Area Network) and performs data conversion accompanying communication.
- the network connection device 2307 can transmit the encoded video to the video decoding device and can receive the encoded video from the video encoding device.
- the information processing device can receive the program and data from an external device via the network connection device 2307, and can use them by loading them into the memory 2302.
- the information processing device does not need to include all the components in FIG. 23, and some of the components may be omitted depending on the application or conditions. For example, when an interface with a user or an operator is unnecessary, the input device 2303 and the output device 2304 may be omitted. When the information processing device does not access the portable recording medium 2309, the medium driving device 2306 may be omitted.
Description
When intra prediction is performed on a non-square, rectangular block, the HEVC angle prediction range (-45 degrees to +135 degrees) may be insufficient. VVC addresses this problem by extending the angular range of angle prediction for rectangular blocks.
(P1) The video encoding device sets the value of IntraDir to IntraLumaMPMRemainder.
(P2) The video encoding device sorts the entries of the MPM list in ascending order to generate mpm_sort[i] (i=0..2, mpm_sort[0]<mpm_sort[1]<mpm_sort[2]).
(P3) The video encoding device compares mpm_sort[i] with IntraLumaMPMRemainder in order and, if mpm_sort[i] <= IntraLumaMPMRemainder, decrements IntraLumaMPMRemainder by 1.
(P11)映像復号装置は、IntraLumaMPMRemainderの値をIntraDirに設定する。
(P12)映像復号装置は、手順(P2)と同様にして、mpm_sort[i]を生成する。
(P13)映像復号装置は、mpm_sort[i]とIntraDirとを順番に比較し、mpm_sort[i]<= IntraDirであれば、IntraDirを1だけインクリメントする。
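Procedures (P1)-(P3) and (P11)-(P13) can be sketched as below. The function names are hypothetical, and the encoder-side comparison is written against the original mode number, an equivalent reading of (P3) under which (P11)-(P13) is its exact inverse:

```python
def encode_mpm_remainder(intra_dir, mpm_list):
    """(P1)-(P3): map a non-MPM mode number to IntraLumaMPMRemainder."""
    mpm_sort = sorted(mpm_list)      # (P2) ascending order
    remainder = intra_dir            # (P1)
    for m in mpm_sort:               # (P3) one decrement per MPM entry below the mode
        if m < intra_dir:
            remainder -= 1
    return remainder

def decode_mpm_remainder(remainder, mpm_list):
    """(P11)-(P13): recover IntraDir from IntraLumaMPMRemainder."""
    mpm_sort = sorted(mpm_list)      # (P12)
    intra_dir = remainder            # (P11)
    for m in mpm_sort:               # (P13) compare in ascending order
        if m <= intra_dir:
            intra_dir += 1
    return intra_dir
```

For example, with an MPM list {2, 3, 10}, mode 11 is signaled as remainder 8, and decoding remainder 8 recovers mode 11.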
- When (candIntraPredModeA == candIntraPredModeB) and (candIntraPredModeA < 2):
candModeList[0] = 0
candModeList[1] = 1
candModeList[2] = 50
- When (candIntraPredModeA == candIntraPredModeB) and (candIntraPredModeA >= 2):
candModeList[0] = candIntraPredModeA
candModeList[1] = 2 + ((candIntraPredModeA + 61) % 64)
candModeList[2] = 2 + ((candIntraPredModeA - 1) % 64)
- When (candIntraPredModeA != candIntraPredModeB):
candModeList[0] = candIntraPredModeA
candModeList[1] = candIntraPredModeB
When (candModeList[0] != 0) and (candModeList[1] != 0):
candModeList[2] = 0
Otherwise, when (candModeList[0] != 1) and (candModeList[1] != 1):
candModeList[2] = 1
Otherwise:
candModeList[2] = 50
Here, "% 64" denotes the remainder of division by 64. With this generation method, candIntraPredModeA or candIntraPredModeB may be used as an MPM.
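A minimal sketch of the candModeList construction above (hypothetical function name; the three sub-cases for candModeList[2] are treated as an ordered if/else chain, so the first matching condition wins):

```python
def build_cand_mode_list(cand_a, cand_b):
    """Build candModeList from candIntraPredModeA/B per the rules above."""
    if cand_a == cand_b:
        if cand_a < 2:                        # both non-angular (Planar 0 or DC 1)
            return [0, 1, 50]
        return [cand_a,
                2 + ((cand_a + 61) % 64),     # wrapping angular neighbour (mode - 1)
                2 + ((cand_a - 1) % 64)]      # wrapping angular neighbour (mode + 1)
    cand = [cand_a, cand_b]
    if 0 not in cand:                         # candModeList[0] != 0 and [1] != 0
        cand.append(0)
    elif 1 not in cand:                       # candModeList[0] != 1 and [1] != 1
        cand.append(1)
    else:                                     # otherwise
        cand.append(50)
    return cand
```

For instance, two identical angular modes 18 yield [18, 17, 19], while distinct modes 2 and 3 yield [2, 3, 0].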
Upper adjacent block 1811: M1 = I1
Upper adjacent block 1812: M1 = I2
Upper adjacent block 1813: M1 = I2
Upper adjacent block 1814: M1 = I3
I1 to I3 are mutually distinct mode numbers. In this case, I1 occurs once, I2 occurs twice, and I3 occurs once. Accordingly, upper adjacent blocks 1812 and 1813, which have the most frequent mode I2, are selected, and the first intra prediction mode of these blocks is adopted as the first intra prediction mode of the upper adjacent blocks.
Left adjacent block 1851: M1 = I4
Left adjacent block 1852: M1 = I5
Left adjacent block 1853: M1 = I6
I4 to I6 are mutually distinct mode numbers. In this case, I4, I5, and I6 each occur only once.
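The frequency-based selection above can be sketched as follows. The function name is hypothetical, and since this excerpt does not state a tie-breaking rule (as in the left-adjacent case where all frequencies are 1), the sketch assumes the earliest block in scan order is taken:

```python
from collections import Counter

def select_first_mode(adjacent_modes):
    """Pick the first intra prediction mode (M1) occurring most often
    among the blocks adjacent to one side of the target block."""
    counts = Counter(adjacent_modes)
    best = max(counts.values())
    # On a tie, take the earliest block in scan order (an assumption:
    # the excerpt does not specify the tie-breaking rule).
    for mode in adjacent_modes:
        if counts[mode] == best:
            return mode
```

With the upper-adjacent example [I1, I2, I2, I3] this returns I2, matching the selection described above.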
- When W = H:
M2 = M1
- When W > H:
- When 2 <= M1 < mL: M2 = M1 + 65
where mL = 8 when W = 2H,
and mL = 12 otherwise (W > 2H)
- Otherwise (M1 outside 2 <= M1 < mL): M2 = M1
- When W < H:
- When mH < M1 <= 66: M2 = M1 - 67
where mH = 60 when H = 2W,
and mH = 56 otherwise (H > 2W)
- Otherwise (M1 outside mH < M1 <= 66): M2 = M1
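The W/H-dependent remapping of M1 to M2 can be sketched as below. The function name is hypothetical; negative results would correspond to wide-angle modes below mode 2, an assumption consistent with VVC-style mode numbering:

```python
def remap_first_mode(m1, w, h):
    """Map the adjacent block's mode M1 to M2 for a W x H target block,
    following the W/H cases above (the +65 / -67 offsets select
    extended wide-angle directions)."""
    if w == h:
        return m1                             # square block: no remapping
    if w > h:                                 # wider than tall
        m_l = 8 if w == 2 * h else 12         # threshold mL (W = 2H vs W > 2H)
        return m1 + 65 if 2 <= m1 < m_l else m1
    m_h = 60 if h == 2 * w else 56            # taller than wide: threshold mH
    return m1 - 67 if m_h < m1 <= 66 else m1
```

For example, mode 5 on a 16x8 block (W = 2H, mL = 8) maps to 70, and mode 62 on an 8x16 block (H = 2W, mH = 60) maps to -5.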
Claims (10)
- A video encoding apparatus comprising:
a generation unit that changes, based on a combination of a shape of an encoding target block in an image included in a video and a shape of an encoded block adjacent to the encoding target block, first prediction mode information indicating an intra prediction mode used for encoding the encoded block into second prediction mode information, and that generates, by using the second prediction mode information, candidate information including candidate values of prediction mode information;
a prediction unit that generates intra prediction pixel values of the encoding target block in a prescribed intra prediction mode;
a first encoding unit that encodes the encoding target block by using the intra prediction pixel values; and
a second encoding unit that encodes, by using the candidate information, prediction mode information indicating the prescribed intra prediction mode. - The video encoding apparatus according to claim 1, wherein, when a first angle indicated by the intra prediction mode used for encoding the encoded block is an angle that is not used for intra prediction for the shape of the encoding target block, the generation unit uses, as the second prediction mode information, prediction mode information corresponding to an intra prediction mode indicating a second angle that is closest to the first angle among angles used for intra prediction for the shape of the encoding target block.
- The video encoding apparatus according to claim 1 or 2, wherein the generation unit generates the candidate information by using, among a plurality of encoded blocks adjacent to one side of the encoding target block, an encoded block having first prediction mode information with the highest frequency.
- A video encoding method executed by a video encoding apparatus,
the video encoding apparatus performing a process comprising:
changing, based on a combination of a shape of an encoding target block in an image included in a video and a shape of an encoded block adjacent to the encoding target block, first prediction mode information indicating an intra prediction mode used for encoding the encoded block into second prediction mode information;
generating, by using the second prediction mode information, candidate information including candidate values of prediction mode information;
generating intra prediction pixel values of the encoding target block in a prescribed intra prediction mode;
encoding the encoding target block by using the intra prediction pixel values; and
encoding, by using the candidate information, prediction mode information indicating the prescribed intra prediction mode. - A video encoding program for causing a computer to execute a process comprising: changing, based on a combination of a shape of an encoding target block in an image included in a video and a shape of an encoded block adjacent to the encoding target block, first prediction mode information indicating an intra prediction mode used for encoding the encoded block into second prediction mode information;
generating, by using the second prediction mode information, candidate information including candidate values of prediction mode information;
generating intra prediction pixel values of the encoding target block in a prescribed intra prediction mode;
encoding the encoding target block by using the intra prediction pixel values; and
encoding, by using the candidate information, prediction mode information indicating the prescribed intra prediction mode. - A video decoding apparatus comprising: a decoding unit that decodes an encoded video and extracts prediction residual information of a decoding target block in an encoded image included in the encoded video, prediction mode information indicating an intra prediction mode of the decoding target block, and first prediction mode information indicating an intra prediction mode of a decoded block adjacent to the decoding target block;
a generation unit that changes the first prediction mode information into second prediction mode information based on a combination of a shape of the decoding target block and a shape of the decoded block, and that generates, by using the second prediction mode information, candidate information including candidate values of prediction mode information;
a prediction unit that generates, by using the candidate information, intra prediction pixel values of the decoding target block in the intra prediction mode indicated by the prediction mode information of the decoding target block; and
a reconstruction unit that generates pixel values of the decoding target block by using the intra prediction pixel values and the prediction residual information. - The video decoding apparatus according to claim 6, wherein, when a first angle indicated by the intra prediction mode used for encoding the decoded block is an angle that is not used for intra prediction for the shape of the decoding target block, the generation unit uses, as the second prediction mode information, prediction mode information corresponding to an intra prediction mode indicating a second angle that is closest to the first angle among angles used for intra prediction for the shape of the decoding target block.
- The video decoding apparatus according to claim 6 or 7, wherein the generation unit generates the candidate information by using, among a plurality of decoded blocks adjacent to one side of the decoding target block, a decoded block having first prediction mode information with the highest frequency.
- A video decoding method executed by a video decoding apparatus,
the video decoding apparatus performing a process comprising:
decoding an encoded video and extracting prediction residual information of a decoding target block in an encoded image included in the encoded video, prediction mode information indicating an intra prediction mode of the decoding target block, and first prediction mode information indicating an intra prediction mode of a decoded block adjacent to the decoding target block;
changing the first prediction mode information into second prediction mode information based on a combination of a shape of the decoding target block and a shape of the decoded block;
generating, by using the second prediction mode information, candidate information including candidate values of prediction mode information;
generating, by using the candidate information, intra prediction pixel values of the decoding target block in the intra prediction mode indicated by the prediction mode information of the decoding target block; and
generating pixel values of the decoding target block by using the intra prediction pixel values and the prediction residual information. - A video decoding program for causing a computer to execute a process comprising: decoding an encoded video and extracting prediction residual information of a decoding target block in an encoded image included in the encoded video, prediction mode information indicating an intra prediction mode of the decoding target block, and first prediction mode information indicating an intra prediction mode of a decoded block adjacent to the decoding target block;
changing the first prediction mode information into second prediction mode information based on a combination of a shape of the decoding target block and a shape of the decoded block;
generating, by using the second prediction mode information, candidate information including candidate values of prediction mode information;
generating, by using the candidate information, intra prediction pixel values of the decoding target block in the intra prediction mode indicated by the prediction mode information of the decoding target block; and
generating pixel values of the decoding target block by using the intra prediction pixel values and the prediction residual information.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020547522A JP7180679B2 (ja) | 2018-09-19 | 2018-09-19 | 映像符号化装置、映像符号化方法、映像符号化プログラム、映像復号装置、映像復号方法、及び映像復号プログラム |
CA3112324A CA3112324A1 (en) | 2018-09-19 | 2018-09-19 | Video coding and decoding method and apparatus which change the prediction mode according to the combination of the shape of a coding target block and the shape of a block adjacent to the coding target block |
MX2021002968A MX2021002968A (es) | 2018-09-19 | 2018-09-19 | Aparato de codificacion de video, metodo de codificacion de video, programa de codificacion de video, aparato de decodificacion de video, metodo de decodificacion de video y programa de decodificacion de video. |
CN201880097442.9A CN112673628B (zh) | 2018-09-19 | 2018-09-19 | 影像编码装置及方法、影像解码装置及方法、记录介质 |
KR1020217007092A KR102570374B1 (ko) | 2018-09-19 | 2018-09-19 | 영상 부호화 장치, 영상 부호화 방법, 영상 부호화 프로그램, 영상 복호 장치, 영상 복호 방법, 및 영상 복호 프로그램 |
PCT/JP2018/034681 WO2020059051A1 (ja) | 2018-09-19 | 2018-09-19 | 映像符号化装置、映像符号化方法、映像符号化プログラム、映像復号装置、映像復号方法、及び映像復号プログラム |
EP18934192.8A EP3855739A4 (en) | 2018-09-19 | 2018-09-19 | VIDEO CODING DEVICE, VIDEO CODING METHOD, VIDEO CODING PROGRAM, VIDEO DECODING DEVICE, VIDEO DECODING METHOD AND VIDEO DECODING PROGRAM |
BR112021003510-9A BR112021003510A2 (pt) | 2018-09-19 | 2018-09-19 | aparelho de codificação de vídeo, método de codificação de vídeo, programa de codificação de vídeo, aparelho de decodificação de vídeo, método de decodificação de vídeo e programa de decodificação de vídeo |
US17/199,498 US20210203926A1 (en) | 2018-09-19 | 2021-03-12 | Video coding apparatus, video coding method, video decoding apparatus, and video decoding method |
JP2022098633A JP7323014B2 (ja) | 2018-09-19 | 2022-06-20 | 映像復号方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/034681 WO2020059051A1 (ja) | 2018-09-19 | 2018-09-19 | 映像符号化装置、映像符号化方法、映像符号化プログラム、映像復号装置、映像復号方法、及び映像復号プログラム |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/199,498 Continuation US20210203926A1 (en) | 2018-09-19 | 2021-03-12 | Video coding apparatus, video coding method, video decoding apparatus, and video decoding method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020059051A1 true WO2020059051A1 (ja) | 2020-03-26 |
Family
ID=69887002
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/034681 WO2020059051A1 (ja) | 2018-09-19 | 2018-09-19 | 映像符号化装置、映像符号化方法、映像符号化プログラム、映像復号装置、映像復号方法、及び映像復号プログラム |
Country Status (9)
Country | Link |
---|---|
US (1) | US20210203926A1 (ja) |
EP (1) | EP3855739A4 (ja) |
JP (1) | JP7180679B2 (ja) |
KR (1) | KR102570374B1 (ja) |
CN (1) | CN112673628B (ja) |
BR (1) | BR112021003510A2 (ja) |
CA (1) | CA3112324A1 (ja) |
MX (1) | MX2021002968A (ja) |
WO (1) | WO2020059051A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020182167A1 (en) * | 2019-03-12 | 2020-09-17 | Zhejiang Dahua Technology Co., Ltd. | Systems and methods for image coding |
WO2023051637A1 (en) * | 2021-09-29 | 2023-04-06 | Beijing Bytedance Network Technology Co., Ltd. | Method, device, and medium for video processing |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013150164A (ja) * | 2012-01-19 | 2013-08-01 | Sony Corp | 符号化装置および符号化方法、並びに、復号装置および復号方法 |
JP2014042309A (ja) * | 2010-09-30 | 2014-03-06 | Panasonic Corp | 画像符号化方法および画像符号化装置 |
JP2016027756A (ja) | 2012-09-28 | 2016-02-18 | 日本電信電話株式会社 | イントラ予測符号化方法、イントラ予測復号方法、イントラ予測符号化装置、イントラ予測復号装置、それらのプログラム並びにプログラムを記録した記録媒体 |
WO2018037896A1 (ja) * | 2016-08-26 | 2018-03-01 | シャープ株式会社 | 画像復号装置、画像符号化装置、画像復号方法、および画像符号化方法 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5321426B2 (ja) * | 2009-11-26 | 2013-10-23 | 株式会社Jvcケンウッド | 画像符号化装置、画像復号化装置、画像符号化方法、及び画像復号化方法 |
CN107277527B (zh) | 2010-07-15 | 2020-02-18 | 威勒斯媒体国际有限公司 | 解码装置、解码方法、编码装置以及编码方法 |
RU2619202C1 (ru) * | 2010-07-20 | 2017-05-12 | Нтт Докомо, Инк. | Устройство кодирования изображений с предсказанием, способ кодирования изображений с предсказанием, программа кодирования изображений с предсказанием, устройство декодирования изображений с предсказанием, способ декодирования изображений с предсказанием и программа декодирования изображений с предсказанием |
US10142627B2 (en) * | 2015-06-18 | 2018-11-27 | Qualcomm Incorporated | Intra prediction and intra mode coding |
KR20180040319A (ko) * | 2016-10-12 | 2018-04-20 | 가온미디어 주식회사 | 영상 처리 방법, 그를 이용한 영상 복호화 및 부호화 방법 |
CN114222137A (zh) * | 2016-05-28 | 2022-03-22 | 世宗大学校产学协力团 | 构成预测运动矢量列表的方法 |
JP2019525577A (ja) | 2016-07-18 | 2019-09-05 | エレクトロニクス アンド テレコミュニケーションズ リサーチ インスチチュートElectronics And Telecommunications Research Institute | 画像符号化/復号方法、装置、及び、ビットストリームを保存した記録媒体 |
JP6792996B2 (ja) * | 2016-10-12 | 2020-12-02 | 日本放送協会 | 符号化装置、復号装置及びプログラム |
FI20175006A1 (en) | 2017-01-03 | 2019-02-15 | Nokia Technologies Oy | Video and image coding using wide-angle intra-prediction |
EP3577898A4 (en) * | 2017-01-31 | 2020-06-24 | Sharp Kabushiki Kaisha | SYSTEMS AND METHODS FOR PARTITIONING A VIDEO BLOCK IMAGE FOR VIDEO CODING |
KR102424239B1 (ko) * | 2017-04-28 | 2022-07-25 | 한국전자통신연구원 | 영상 부호화/복호화 방법, 장치 및 비트스트림을 저장한 기록 매체 |
EP3422716A1 (en) * | 2017-06-26 | 2019-01-02 | Thomson Licensing | Method and apparatus for most probable mode (mpm) sorting and signaling in video encoding and decoding |
US10382772B1 (en) * | 2018-07-02 | 2019-08-13 | Tencent America LLC | Method and apparatus for video coding |
- 2018
- 2018-09-19 JP JP2020547522A patent/JP7180679B2/ja active Active
- 2018-09-19 CA CA3112324A patent/CA3112324A1/en active Pending
- 2018-09-19 WO PCT/JP2018/034681 patent/WO2020059051A1/ja unknown
- 2018-09-19 CN CN201880097442.9A patent/CN112673628B/zh active Active
- 2018-09-19 KR KR1020217007092A patent/KR102570374B1/ko active IP Right Grant
- 2018-09-19 EP EP18934192.8A patent/EP3855739A4/en active Pending
- 2018-09-19 BR BR112021003510-9A patent/BR112021003510A2/pt unknown
- 2018-09-19 MX MX2021002968A patent/MX2021002968A/es unknown
- 2021
- 2021-03-12 US US17/199,498 patent/US20210203926A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014042309A (ja) * | 2010-09-30 | 2014-03-06 | Panasonic Corp | 画像符号化方法および画像符号化装置 |
JP2013150164A (ja) * | 2012-01-19 | 2013-08-01 | Sony Corp | 符号化装置および符号化方法、並びに、復号装置および復号方法 |
JP2016027756A (ja) | 2012-09-28 | 2016-02-18 | 日本電信電話株式会社 | イントラ予測符号化方法、イントラ予測復号方法、イントラ予測符号化装置、イントラ予測復号装置、それらのプログラム並びにプログラムを記録した記録媒体 |
WO2018037896A1 (ja) * | 2016-08-26 | 2018-03-01 | シャープ株式会社 | 画像復号装置、画像符号化装置、画像復号方法、および画像符号化方法 |
Non-Patent Citations (3)
Title |
---|
"Versatile Video Coding (Draft 2)", JVET-K1001, JVET OF ITU-T SG 16 WP 3 AND ISO/IEC JTC 1/SC 29/WG 11, July 2018 (2018-07-01) |
BROSS, BENJAMIN ET AL., VERSATILE VIDEO CODING (DRAFT 2), JVET-K1001-V5, 18 September 2018 (2018-09-18), pages 45 - 68, XP055694594, Retrieved from the Internet <URL:http://phenix.it-sudparis.eu/jvet/doc_end_user/documents/11_Ljubljana/wg11/JVET-K1001-v5.zip> * |
XU, JUN ET AL.: "Non-CE6: Improvements for SDIP", JCTUC-G354R2, 18 November 2011 (2011-11-18), pages 1 - 4, XP030050479, Retrieved from the Internet <URL:http://phenix.it-sudparis.eu/jct/doc_end_user/documents/7_Geneva/wg11/JCTVC-G354-v4.zip> * |
Also Published As
Publication number | Publication date |
---|---|
KR102570374B1 (ko) | 2023-08-25 |
CN112673628A (zh) | 2021-04-16 |
JPWO2020059051A1 (ja) | 2021-09-09 |
JP7180679B2 (ja) | 2022-11-30 |
US20210203926A1 (en) | 2021-07-01 |
CA3112324A1 (en) | 2020-03-26 |
EP3855739A1 (en) | 2021-07-28 |
KR20210039470A (ko) | 2021-04-09 |
BR112021003510A2 (pt) | 2021-05-18 |
CN112673628B (zh) | 2024-03-26 |
EP3855739A4 (en) | 2021-07-28 |
MX2021002968A (es) | 2021-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
RU2603543C2 (ru) | Способ и устройство для кодирования видео и способ и устройство для декодирования видео | |
US11805262B2 (en) | Image encoding device, image decoding device, and image processing method | |
WO2014054267A1 (ja) | 画像符号化装置及び画像符号化方法 | |
US10075725B2 (en) | Device and method for image encoding and decoding | |
US20160373767A1 (en) | Encoding and Decoding Methods and Apparatuses | |
JP6164360B2 (ja) | 画像符号化装置、画像復号装置、画像符号化方法、及び画像復号方法 | |
JP6212890B2 (ja) | 動画像符号化装置、動画像符号化方法、及び動画像符号化プログラム | |
US20210203926A1 (en) | Video coding apparatus, video coding method, video decoding apparatus, and video decoding method | |
JP6662123B2 (ja) | 画像符号化装置、画像符号化方法、及び画像符号化プログラム | |
JP7323014B2 (ja) | 映像復号方法 | |
WO2019150411A1 (ja) | 映像符号化装置、映像符号化方法、映像復号装置、映像復号方法、及び映像符号化システム | |
US11218705B2 (en) | Information processing device and video encoding method | |
JP2019036772A (ja) | 動画像符号化装置、動画像符号化方法、及び動画像符号化プログラム | |
JP6886825B2 (ja) | 予測装置、符号化装置、復号装置、及びプログラム | |
JP6917718B2 (ja) | 予測装置、符号化装置、復号装置、及びプログラム | |
JP6331972B2 (ja) | 動画像符号化装置、動画像符号化方法、及び動画像符号化プログラム | |
JP2012120086A (ja) | 動画像符号化装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18934192 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2020547522 Country of ref document: JP Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 3112324 Country of ref document: CA Ref document number: 20217007092 Country of ref document: KR Kind code of ref document: A |
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112021003510 Country of ref document: BR |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2018934192 Country of ref document: EP Effective date: 20210419 |
ENP | Entry into the national phase |
Ref document number: 112021003510 Country of ref document: BR Kind code of ref document: A2 Effective date: 20210224 |