GB2591379A - Single-line cross component linear model prediction mode - Google Patents

Single-line cross component linear model prediction mode

Info

Publication number
GB2591379A
Authority
GB
United Kingdom
Prior art keywords
samples
luma
luma samples
video
downsampling
Prior art date
Legal status
Granted
Application number
GB2103284.2A
Other versions
GB2591379B (en)
GB202103284D0 (en)
Inventor
Zhang Kai
Zhang Li
Liu Hongbin
Wang Yue
Current Assignee
Beijing ByteDance Network Technology Co Ltd
ByteDance Inc
Original Assignee
Beijing ByteDance Network Technology Co Ltd
ByteDance Inc
Priority date
Filing date
Publication date
Priority claimed from CN201810005182.XA (CN109994601B)
Application filed by Beijing ByteDance Network Technology Co Ltd, ByteDance Inc filed Critical Beijing ByteDance Network Technology Co Ltd
Priority claimed from PCT/IB2019/057699 (WO2020053805A1)
Publication of GB202103284D0
Publication of GB2591379A
Application granted
Publication of GB2591379B
Legal status: Active

Classifications

    • H: ELECTRICITY; H04: ELECTRIC COMMUNICATION TECHNIQUE; H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals (common stem of the classifications below)
    • H04N19/132: Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking (adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding)
    • H04N19/105: Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H04N19/11: Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
    • H04N19/117: Filters, e.g. for pre-processing or post-processing
    • H04N19/176: Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N19/184: Adaptive coding characterised by the coding unit, the unit being bits, e.g. of the compressed video stream
    • H04N19/186: Adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
    • H04N19/30: Coding using hierarchical techniques, e.g. scalability
    • H04N19/59: Predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
    • H04N19/593: Predictive coding involving spatial prediction techniques
    • H04N19/80: Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N19/82: Details of filtering operations involving filtering within a prediction loop

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Television Systems (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Devices, systems and methods for digital video coding, which includes cross-component prediction, are described. In a representative aspect, a method for video coding includes receiving a bitstream representation of a current block of video data comprising a luma component and a chroma component, determining parameters of a linear model based on a first set of samples that are generated by down-sampling a second set of samples of the luma component, and processing, based on the parameters of the linear model, the bitstream representation to generate the current block.
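For orientation, the cross-component prediction summarised above follows the usual cross-component linear model (CCLM) pattern: each chroma sample is predicted as alpha * luma' + beta, where luma' is a downsampled reconstructed luma sample and (alpha, beta) are derived from neighboring samples. The sketch below is a minimal illustration only; the least-squares fit, the clipping, and every function and variable name are assumptions added for clarity, not the derivation claimed in this document. (A second sketch after the claims restates the downsampling filters of claims 4-9 and 27.)

```python
# Minimal sketch of cross-component linear model (CCLM) prediction as
# summarized in the abstract: chroma is predicted as alpha * luma' + beta
# from downsampled reconstructed luma. The least-squares fit, the
# clipping, and all names are illustrative assumptions.

def derive_linear_model(neigh_luma, neigh_chroma):
    """Fit pred_c = alpha * luma + beta over neighboring pairs of
    (downsampled luma, reconstructed chroma) samples."""
    n = len(neigh_luma)
    assert n == len(neigh_chroma) and n > 0
    sum_l = sum(neigh_luma)
    sum_c = sum(neigh_chroma)
    sum_ll = sum(x * x for x in neigh_luma)
    sum_lc = sum(x * y for x, y in zip(neigh_luma, neigh_chroma))
    denom = n * sum_ll - sum_l * sum_l
    if denom == 0:                       # flat luma neighborhood
        return 0.0, sum_c / n
    alpha = (n * sum_lc - sum_l * sum_c) / denom
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta


def predict_chroma_block(downsampled_luma, alpha, beta, bit_depth=10):
    """Apply the linear model to the downsampled luma samples co-located
    with the chroma block, clipping to the valid sample range."""
    max_val = (1 << bit_depth) - 1
    return [[min(max_val, max(0, round(alpha * x + beta))) for x in row]
            for row in downsampled_luma]
```

In the terminology of claim 1, neigh_luma would hold the downsampled outside luma samples, neigh_chroma the reconstructed chroma neighbors of the chroma block, and downsampled_luma the downsampled inside luma samples.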

Claims (31)

1. A method of video processing, comprising: generating downsampled outside luma samples corresponding to chroma samples outside a chroma block of a video using a first downsampling scheme; generating downsampled inside luma samples corresponding to chroma samples inside the chroma block using a second downsampling scheme; at least using the downsampled outside luma samples to derive a linear model for cross component prediction; determining predicted samples for the chroma block using the linear model and the downsampled inside luma samples; and performing a conversion between the video and a coded representation of the video using the predicted samples for the chroma block.
2. The method of claim 1, wherein the first downsampling scheme corresponds to downsampling outside above luma samples to lower left and/or lower right positions.
3. The method of claim 1 or 2, wherein the first downsampling scheme corresponds to downsampling outside luma samples to lower left positions.
4. The method of claim 3, wherein the first downsampling scheme calculates downsampled luma samples d[i] from above adjacent luma samples a[i] as d[i] = (a[2i-1]+2*a[2i]+a[2i+1]+2)>>2 in a case that all a samples are available, or d[i] = (3*a[2i]+a[2i+1]+2)>>2 in a case that a[2i-1] is unavailable, or d[i] = (3*a[2i]+a[2i-1]+2)>>2 in a case that a[2i+1] is unavailable.
5. The method of claim 3, wherein the first downsampling scheme calculates downsampled luma samples d[i] from above adjacent luma samples a[i] as d[i] = (a[2i-1]+2*a[2i]+a[2i+1]+offset0)>>2 for i > 0, and one of d[i] = (3*a[2i]+a[2i+1]+offset1)>>2, or d[i] = a[2i], or d[i] = (a[2i]+a[2i+1]+offset2)>>1, for i = 0, where offset0, offset1 and offset2 are fractions.
6. The method of claim 5, wherein offset0 = offset1 = 2 and offset2 = 1.
7. The method of claim 1, wherein the first downsampling scheme corresponds to downsampling outside left luma samples to lower right and/or upper right positions.
8. The method of claim 1, wherein the first downsampling scheme corresponds to downsampling outside left luma samples to halfway between lower right and upper right positions.
9. The method of claim 8, wherein a downsampled luma sample d[j] is calculated as: d[j] = (a[2j]+a[2j+1]+1)>>1.
10. The method of claim 1, wherein the chroma block has a width of W pixels and a height of H pixels, and wherein the first downsampling scheme generates N*W luma samples, where N is an integer.
11. The method of claim 10, wherein N = 2, and wherein the first downsampling scheme generates luma samples from above adjacent luma samples.
12. The method of claim 1, wherein the chroma block has a width of W pixels and a height of H pixels, and wherein the first downsampling scheme generates W + K downsampled luma samples from above adjacent luma samples, where K is a positive integer.
13. The method of claim 1, wherein the chroma block has a width of W pixels and a height of H pixels, and wherein the first downsampling scheme generates W/N downsampled luma samples from above adjacent luma samples, where N is a positive integer.
14. The method of claim 1, wherein the chroma block has a width of W pixels and a height of H pixels, and wherein the first downsampling scheme generates W*N downsampled luma samples from left adjacent luma samples, where N is a positive integer.
15. The method of claim 1, wherein the chroma block has a width of W pixels and a height of H pixels, and wherein the first downsampling scheme generates W + K downsampled luma samples from left adjacent luma samples, where K is a positive integer.
16. The method of claim 1, wherein the chroma block has a width of W pixels and a height of H pixels, and wherein the first downsampling scheme generates W/N downsampled luma samples from left adjacent luma samples, where N is a positive integer.
17. The method of claim 1, wherein the first downsampling scheme or the second downsampling scheme is determined based on a position of the chroma block meeting a position criterion.
18. The method of any of claims 1-16, wherein the position criterion specifies to use the method only for video blocks at a top boundary of a coding tree unit of the video.
19. The method of any of claims 1-16, wherein the position criterion specifies to use the method only for video blocks at a left boundary of a coding tree unit of the video.
20. The method of any of claims 1-16, wherein the position criterion specifies to use the method only for video blocks at a top boundary of a coding tree unit of the video or video blocks at a left boundary of a coding tree unit of the video.
21. The method of any of claims 1-20, wherein the first downsampling scheme uses only one above neighboring luma samples row to derive the downsampled outside luma samples.
22. The method of claim 21, wherein the one above neighboring luma samples row comprises above neighboring samples and above-right neighboring samples.
23. A method for video processing, comprising: determining, for a conversion between a chroma block of a video and a coded representation of the video, a linear model; generating prediction values of the chroma block from a luma block that corresponds to the chroma block based on the linear model; and performing the conversion using the linear model; wherein predicting the chroma block from the luma block includes downsampling luma samples above the luma block by a first filtering method and downsampling luma samples to the left of the luma block by a second filtering method, and wherein the linear model is determined at least based on the downsampled luma samples.
24. The method of claim 23, wherein the first filtering method uses only luma samples of a single above neighboring luma row during the conversion of the video.
25. The method of claim 23, wherein the first filtering method is different than the second filtering method.
26. The method of any of claims 23-25, wherein the first filtering method uses a horizontal three-tap filter.
27. The method of any of claims 23-26, wherein the second filtering method uses a 2-D 6-tap filter.
28. The method of any of claims 1-27, wherein the conversion comprises generating the video from the coded representation.
29. The method of any of claims 1-27, wherein the conversion comprises generating the coded representation from the video.
30. A video processing apparatus comprising a processor configured to implement a method recited in any one or more of claims 1-29.
31. A computer-readable program medium having code stored thereupon, the code, when executed by a processor, causing the processor to implement a method recited in any one or more of claims 1-29.
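The filter expressions recited in claims 4 to 9, and the horizontal three-tap and 2-D 6-tap filters of claims 26 and 27, can be read as short routines. The sketch below is a hedged restatement under stated assumptions: the array layout, the replication padding at block edges, the >>1 normalization assumed for the two-sample average of claim 9, and the particular [1 2 1; 1 2 1]/8 kernel standing in for the claimed "2-D 6-tap filter" are illustrative choices, not text from the specification.

```python
# Illustrative restatement of the downsampling filters described in the
# claims. Array layout, boundary handling, and the specific 6-tap kernel
# are assumptions made only to obtain runnable code.

def downsample_above_row(a):
    """Claims 4-6: horizontal 3-tap filter over the single above
    neighboring luma row a (length 2*W), one output per chroma column."""
    w = len(a) // 2
    d = []
    for i in range(w):
        mid = a[2 * i]
        left = a[2 * i - 1] if 2 * i - 1 >= 0 else None
        right = a[2 * i + 1] if 2 * i + 1 < len(a) else None
        if left is not None and right is not None:
            d.append((left + 2 * mid + right + 2) >> 2)
        elif left is None:                  # a[2i-1] unavailable
            d.append((3 * mid + right + 2) >> 2)
        else:                               # a[2i+1] unavailable
            d.append((3 * mid + left + 2) >> 2)
    return d


def downsample_left_column(a):
    """Claims 8-9: average each vertical pair of left neighboring luma
    samples, d[j] = (a[2j] + a[2j+1] + 1) >> 1."""
    return [(a[2 * j] + a[2 * j + 1] + 1) >> 1 for j in range(len(a) // 2)]


def downsample_inside_6tap(rec_l, i, j):
    """Claim 27 mentions a 2-D 6-tap filter for the second scheme; the
    [1 2 1; 1 2 1]/8 kernel below is the familiar CCLM filter, used here
    purely as an example of such a filter (edges padded by replication)."""
    r0, r1 = rec_l[2 * i], rec_l[2 * i + 1]
    c = 2 * j
    cl, cr = max(c - 1, 0), min(c + 1, len(r0) - 1)
    return (r0[cl] + 2 * r0[c] + r0[cr]
            + r1[cl] + 2 * r1[c] + r1[cr] + 4) >> 3
```

In claim 1's terms, downsample_above_row and downsample_left_column produce downsampled outside luma samples from which the linear model is derived, while downsample_inside_6tap produces downsampled inside luma samples to which the model is applied.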
GB2103284.2A 2018-01-03 2019-09-12 Single-line cross component linear model prediction mode Active GB2591379B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201810005182.XA CN109994601B (en) 2018-01-03 2018-01-03 Method for manufacturing magnetic random access memory circuit connection
CN201810008681 2018-09-29
CN2019088005 2019-05-22
PCT/IB2019/057699 WO2020053805A1 (en) 2018-09-12 2019-09-12 Single-line cross component linear model prediction mode

Publications (3)

Publication Number Publication Date
GB202103284D0 GB202103284D0 (en) 2021-04-21
GB2591379A true GB2591379A (en) 2021-07-28
GB2591379B GB2591379B (en) 2023-02-15

Family

ID=84975662

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2103284.2A Active GB2591379B (en) 2018-01-03 2019-09-12 Single-line cross component linear model prediction mode

Country Status (1)

Country Link
GB (1) GB2591379B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120287995A1 (en) * 2011-05-12 2012-11-15 Madhukar Budagavi Luma-Based Chroma Intra-Prediction for Video Coding
GB2495942A (en) * 2011-10-25 2013-05-01 Canon Kk Prediction of Image Components Using a Prediction Model
US20180077426A1 (en) * 2016-09-15 2018-03-15 Qualcomm Incorporated Linear model chroma intra prediction for video coding

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHEN H ET AL, "Description of SDR, HDR and 360° video coding technology proposal by Huawei, GoPro, HiSilicon, and Samsung - general application scenario", 10. JVET MEETING; 10-4-2018 - 20-4-2018; SAN DIEGO; (THE JOINT VIDEO EXPLORATION TEAM OF ISO/IEC JTC1/SC29/WG11 AND ITU-T SG.16); URL: HTTP://P *
VAN DER AUWERA (QUALCOMM) G ET AL, "Description of Core Experiment 3 (CE3): Intra Prediction and Mode Coding", no. JVET-J1023, (20180621), 10. JVET MEETING; 20180410 - 20180420; SAN DIEGO; (THE JOINT VIDEO EXPLORATION TEAM OF ISO/IEC JTC1/SC29/WG11 AND ITU-T SG.16), URL: http://phenix.int-evry.fr/ *

Also Published As

Publication number Publication date
GB2591379B (en) 2023-02-15
GB202103284D0 (en) 2021-04-21

Similar Documents

Publication Publication Date Title
US11178404B2 (en) Method and apparatus of video coding
GB2590844A (en) Simplified cross component prediction
CA3048426C (en) Adaptive unequal weight planar prediction
MX2021006254A (en) Context-based intra prediction.
CN105706450B (en) It is determined according to the encoder of the result of the Block- matching based on hash
KR101452921B1 (en) Method for performing localized multihypothesis prediction during video coding of a coding unit, and associated apparatus
EP4300953A3 (en) Size selective application of decoder side refining tools
CN107360419B (en) A kind of movement forward sight video interprediction encoding method based on perspective model
CN103957415B (en) CU dividing methods and device based on screen content video
EP2983365A1 (en) Image prediction coding method and image coder
KR101198320B1 (en) Method and apparatus for converting 2d image into 3d image
CN108781284A (en) The method and device of coding and decoding video with affine motion compensation
HUE034631T2 (en) Adaptive partition coding
WO2016109309A2 (en) Computationally efficient motion estimation
CN102045563A (en) Methods and apparatus for adaptively choosing a search range for motion estimation
CN102804776A (en) Method and apparatus for adaptive loop filtering
MY187403A (en) Apparatus and method for video motion compensation with selectable interpolation filter
CN108028927A (en) Picture coding device, picture decoding apparatus and its program
CN1694499A (en) Motion estimation employing adaptive spatial update vectors
CN104113754A (en) Method for high-performance video interframe coding based on time domain relevance and transcoder thereof
KR20190090728A (en) A method and an apparatus for processing a video signal using subblock-based motion compensation
CN104012086A (en) System and method for depth-guided image filtering in a video conference environment
CN101627626A (en) Motion vector selection
CN105103556A (en) Method and apparatus for bi-prediction of illumination compensation
MX2022004896A (en) Derivation of linear parameter in cross-component video coding.