CN112437298A - Encoding and decoding method and device - Google Patents

Encoding and decoding method and device

Info

Publication number
CN112437298A
Authority
CN
China
Prior art keywords
pixel value
current coding
coding block
prediction
reference pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910792560.8A
Other languages
Chinese (zh)
Inventor
欧阳晓
王凡
吕卓逸
朴银姬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Samsung Telecom R&D Center
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Samsung Telecommunications Technology Research Co Ltd, Samsung Electronics Co Ltd filed Critical Beijing Samsung Telecommunications Technology Research Co Ltd
Priority to CN201910792560.8A
Priority to PCT/KR2020/011171 (published as WO2021040330A1)
Publication of CN112437298A
Legal status: Pending

Classifications

    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals (H Electricity; H04 Electric communication technique; H04N Pictorial communication, e.g. television)
    • H04N19/117 Filters, e.g. for pre-processing or post-processing
    • H04N19/147 Data rate or code amount at the encoder output according to rate distortion criteria
    • H04N19/103 Selection of coding mode or of prediction mode
    • H04N19/105 Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H04N19/11 Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
    • H04N19/119 Adaptive subdivision aspects, e.g. subdivision of a picture into rectangular or non-rectangular coding blocks
    • H04N19/129 Scanning of coding units, e.g. zig-zag scan of transform coefficients or flexible macroblock ordering [FMO]
    • H04N19/134 Adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/159 Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
    • H04N19/176 Adaptive coding characterised by the coding unit, the unit being an image region, e.g. an object, the region being a block, e.g. a macroblock
    • H04N19/182 Adaptive coding characterised by the coding unit, the unit being a pixel
    • H04N19/186 Adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
    • H04N19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H04N19/46 Embedding additional information in the video signal during the compression process
    • H04N19/593 Predictive coding involving spatial prediction techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

A coding and decoding method and device are provided. The encoding method comprises the following steps: performing intra-frame prediction on the current coding block based on the availability of the reference pixels of the current coding block; and writing information for intra prediction, which includes at least information on an intra prediction mode, into a code stream. The decoding method comprises the following steps: parsing information for intra prediction from a code stream, the information for intra prediction including at least information about an intra prediction mode; calculating a prediction pixel value of the current coding block according to the parsed information for intra prediction based on the availability of the reference pixels of the current coding block; and reconstructing the image according to the predicted pixel value to obtain a decoded image. In this way, intra-frame prediction efficiency is improved, and the performance of video coding is further improved.

Description

Encoding and decoding method and device
Technical Field
The present disclosure relates to the field of video encoding and decoding technology. More particularly, the present disclosure relates to a coding and decoding method and apparatus.
Background
In conventional video or image encoding and decoding methods, an image is generally divided into a plurality of image blocks, and each image block is then encoded or decoded. Encoding an image block can be divided into prediction, transformation, quantization and entropy encoding. In prediction, the image block to be encoded is predicted from the reconstructed pixel values of previously encoded image blocks (these pixels are referred to as reference pixels) to derive predicted pixel values, and the difference between the actual pixel values of the current image block and the predicted pixel values is then encoded into the code stream. During decoding, the decoder likewise predicts the image block to be decoded from the reconstructed pixel values of previously decoded image blocks (again referred to as reference pixels) to derive predicted pixel values, and adds the difference values decoded from the code stream to the predicted pixel values to obtain the reconstructed values of the decoded image block. To keep encoding and decoding consistent, the encoder and the decoder must use the same reference pixels and the same prediction method when performing prediction. Many specific prediction methods exist; generally, the encoder selects a prediction method for the current image block and then writes information about the selected prediction method into the code stream, so that the decoder can predict the current coding block using the same prediction method.
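This prediction-plus-residual loop can be illustrated with a minimal sketch (not taken from the patent): a toy vertical predictor built from the reconstructed row above the block stands in for an arbitrary intra mode, and transform, quantization and entropy coding are omitted, so the round trip here is lossless.

```python
import numpy as np

def predict_from_reference(reference_row: np.ndarray, height: int) -> np.ndarray:
    """Toy predictor: repeat the reconstructed row above the block (vertical prediction)."""
    return np.tile(reference_row, (height, 1))

def encode_block(block: np.ndarray, reference_row: np.ndarray) -> np.ndarray:
    prediction = predict_from_reference(reference_row, block.shape[0])
    return block - prediction                      # residual written to the code stream

def decode_block(residual: np.ndarray, reference_row: np.ndarray) -> np.ndarray:
    prediction = predict_from_reference(reference_row, residual.shape[0])
    return prediction + residual                   # reconstruction = prediction + decoded residual

block = np.arange(16, dtype=np.int32).reshape(4, 4)
reference = np.array([10, 11, 12, 13], dtype=np.int32)   # reconstructed row above the block
assert np.array_equal(decode_block(encode_block(block, reference), reference), block)
```

Because the same reference row and the same predictor are used on both sides, the decoder reproduces exactly the prediction the encoder subtracted, which is the consistency requirement stated above.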
With the development of video coding and decoding technology, new intra prediction methods have appeared, including Intra Prediction Filtering (IPF), the Two-Step Cross-component Prediction Mode (TSCPM), Derived Tree (DT) partitioning, and the like. IPF weights a part of the intra-prediction pixel values in the current coding block with the reference pixel values on the left side or the upper side of the current coding block to obtain new prediction pixel values, which are used as the final prediction pixel values of intra prediction, as shown in fig. 1. TSCPM selects two reference samples from the reference pixels on the left side and the upper side of the current block, constructs a linear model between the luminance component and the chrominance component from these two reference samples, and finally predicts the chrominance component with the linear model, as shown in fig. 2. DT divides the current coding block into a plurality of prediction blocks (PUs), predicts each prediction block, and further divides each prediction block into transform blocks (TUs) for the transform and quantization operations, as shown in fig. 3.
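As a rough illustration of the IPF idea only, the sketch below blends each predicted sample with the reconstructed reference pixels above and to the left of the block, with weights that decay away from the block boundary. The weight values and the decay rule are illustrative assumptions, not the weights standardized for IPF or claimed by this patent.

```python
import numpy as np

def ipf_filter(pred: np.ndarray, top_ref: np.ndarray, left_ref: np.ndarray) -> np.ndarray:
    """Blend each predicted sample with the top/left reference pixels; weights decay with distance."""
    h, w = pred.shape
    out = pred.astype(np.float64)
    for y in range(h):
        for x in range(w):
            w_top = 0.5 * (0.5 ** y)     # top reference influence fades with the row index
            w_left = 0.5 * (0.5 ** x)    # left reference influence fades with the column index
            w_pred = 1.0 - w_top - w_left
            out[y, x] = w_pred * pred[y, x] + w_top * top_ref[x] + w_left * left_ref[y]
    return np.rint(out).astype(pred.dtype)
```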
With the development of video coding and decoding technology, a right-to-left coding order has appeared in addition to the common left-to-right coding order. This means that a current coding block or prediction block can obtain not only the reference information on the left side and on the upper side of the current block (including the intra prediction direction, reference pixel values, and the like), but also the reference information on the right side of the current block. The new intra prediction techniques (IPF, TSCPM, DT, etc.) do not consider how to use this additional reference information, when it is available, to further improve coding efficiency.
Regarding the coding order of image blocks, in addition to the conventional left-to-right coding order, a right-to-left coding order has been proposed in MPEG EVC, which makes it possible for the encoder to obtain the reference pixels on the right side of the current block when encoding the current block. However, conventional intra prediction technology only considers the case where the reference information on the left side and on the upper side of the current block is available, and does not consider the cases where the reference information on the right side of the current block is available or where the reference information on both the left and right sides is available.
Disclosure of Invention
An exemplary embodiment of the present disclosure provides a coding and decoding method and apparatus that make maximum use of the surrounding reference pixel information and improve the efficiency of intra-frame coding.
According to an exemplary embodiment of the present disclosure, there is provided an encoding method including: performing intra-frame prediction on the current coding block based on the availability of the reference pixels of the current coding block; and writing information for intra-frame prediction into the code stream, wherein the information for intra-frame prediction at least comprises information about an intra-frame prediction mode.
Optionally, the step of intra-predicting the current coding block may include: traversing all intra-frame prediction modes aiming at a current coding block to obtain a prediction pixel value corresponding to each intra-frame prediction mode, and carrying out intra-frame prediction filtering on the prediction pixel value to obtain a final prediction pixel value corresponding to each intra-frame prediction mode; calculating the rate distortion cost corresponding to each intra-frame prediction mode according to the final prediction pixel value; and according to the rate distortion cost of each intra-frame prediction mode, taking the intra-frame prediction mode with the lowest rate distortion cost as the intra-frame prediction mode of the current coding block.
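The mode decision described above can be summarized by the following sketch. predict_fn, filter_fn and estimate_bits_fn are hypothetical callbacks standing in for the codec's actual intra prediction, intra prediction filtering and rate estimation, and the Lagrangian form J = D + lambda * R is the usual rate-distortion cost; the patent does not prescribe a particular cost model.

```python
import numpy as np

def select_intra_mode(block, modes, predict_fn, filter_fn, estimate_bits_fn, lam=10.0):
    """Return the intra mode with the lowest rate-distortion cost J = D + lambda * R."""
    best_mode, best_cost = None, float("inf")
    for mode in modes:
        pred = filter_fn(predict_fn(block, mode), mode)    # final prediction after intra prediction filtering
        diff = block.astype(np.float64) - pred
        distortion = float(np.sum(diff * diff))            # SSE distortion
        rate = estimate_bits_fn(block, pred, mode)         # estimated bits for the mode and residual
        cost = distortion + lam * rate
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode, best_cost
```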
Optionally, the step of intra prediction filtering the predicted pixel values may comprise: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
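The four availability cases read naturally as a dispatch function. The sketch below is an assumed structure only: filter_left, filter_right, filter_both and pad_reference are hypothetical helpers, not names from the patent, and the "both available" branch simply delegates to a combined filter even though the text also permits left-only or right-only filtering there.

```python
def intra_prediction_filter(pred, mode, top_ref, left_ref, right_ref,
                            filter_left, filter_right, filter_both, pad_reference):
    """Dispatch intra prediction filtering on the availability of the left/right references."""
    left_ok, right_ok = left_ref is not None, right_ref is not None
    if left_ok and not right_ok:
        return filter_left(pred, mode, top_ref, left_ref)
    if right_ok and not left_ok:
        return filter_right(pred, mode, top_ref, right_ref)
    if left_ok and right_ok:
        # Both sides available: the text allows left-only, right-only or both-side filtering here.
        return filter_both(pred, mode, top_ref, left_ref, right_ref)
    # Neither side available: pad one side, then reuse the corresponding single-side case.
    return filter_left(pred, mode, top_ref, pad_reference(pred))
```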
Optionally, the intra prediction filtering according to the intra prediction mode of the current coding block and the reference pixel value on the right side of the current coding block may include: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting the prediction pixel value on the upper side and the prediction pixel value on the right side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction.
Optionally, the step of calculating a final predicted pixel value according to the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction may include: when the angle between the prediction direction and the horizontal direction is smaller than a first angle, weighting the prediction pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final prediction pixel value; and when the angle between the prediction direction and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain a final prediction pixel value.
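A hedged sketch of this angular branch, for the case where only the right-hand reference is available, follows. The 45-degree thresholds, the 0.25 blending weight, and the reading of "upper side" and "right side" as the top row and rightmost column of predicted samples are all assumptions made for illustration; the patent only specifies the first-angle and second-angle comparisons.

```python
import numpy as np

def filter_angular_right(pred, mode_angle_deg, top_ref, right_ref,
                         first_angle=45.0, second_angle=45.0, w_ref=0.25):
    """Right-side-only filtering for angular modes: blend the top row or the right column."""
    out = pred.astype(np.float64)
    h, w = pred.shape
    if abs(mode_angle_deg) < first_angle:             # prediction direction close to horizontal
        for x in range(w):
            out[0, x] = (1 - w_ref) * pred[0, x] + w_ref * top_ref[x]
    elif abs(90.0 - mode_angle_deg) < second_angle:   # prediction direction close to vertical
        for y in range(h):
            out[y, w - 1] = (1 - w_ref) * pred[y, w - 1] + w_ref * right_ref[y]
    return np.rint(out).astype(pred.dtype)
```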
Optionally, the intra prediction filtering according to the intra prediction mode of the current coding block, the reference pixel value on the left side of the current coding block, and/or the reference pixel value on the right side of the current coding block may include: performing intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; or performing intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; or performing intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block; or selecting, from the following intra-frame filtering modes, the intra-frame filtering mode with the lowest rate-distortion cost to perform intra-frame prediction filtering: an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, and an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block.
Optionally, the intra prediction filtering according to the intra prediction mode of the current coding block, the reference pixel value on the left side of the current coding block, and the reference pixel value on the right side of the current coding block may include: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the left upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side of the current coding block to obtain a final prediction pixel value on the left upper side of the current coding block, weighting a prediction pixel value on the left side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side to obtain a final prediction pixel value on the left side of the current coding block, weighting a prediction pixel value on the right upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode, determining whether the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle; when the angle between the prediction direction and the horizontal direction is smaller than a third angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final predicted pixel value on the upper side of the current coding block; and when the angle between the prediction direction and the vertical direction is smaller than a fourth angle, weighting the predicted pixel value on the left side of the current coding block and the reference pixel value on the left side to obtain the final predicted pixel value on the left side of the current coding block, and weighting the predicted pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain the final predicted pixel value on the right side of the current coding block.
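One plausible reading of the non-angular case above is sketched below: the block is treated as a left half, blended with the upper and left references, and a right half, blended with the upper and right references. The half-block split and the weight values are assumptions made for illustration, not values taken from the patent.

```python
import numpy as np

def filter_both_sides_non_angular(pred, top_ref, left_ref, right_ref, w_top=0.25, w_side=0.25):
    """Non-angular filtering when both side references exist: blend with the top reference and the nearer side."""
    h, w = pred.shape
    out = pred.astype(np.float64)
    for y in range(h):
        for x in range(w):
            side = left_ref[y] if x < w / 2 else right_ref[y]   # left half uses the left reference, right half the right one
            out[y, x] = (1 - w_top - w_side) * pred[y, x] + w_top * top_ref[x] + w_side * side
    return np.rint(out).astype(pred.dtype)
```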
Alternatively, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include information on an intra filtering mode.
According to an exemplary embodiment of the present disclosure, there is provided a decoding method including: parsing information for intra prediction from a code stream, the information for intra prediction including at least information about an intra prediction mode; calculating a prediction pixel value of the current coding block according to the parsed information for intra prediction based on the availability of the reference pixels of the current coding block; and reconstructing the image according to the predicted pixel value to obtain a decoded image.
Alternatively, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include information on an intra filtering mode.
Optionally, the step of calculating the predicted pixel value of the current coding block may comprise: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
Optionally, the intra prediction filtering according to the intra prediction mode of the current coding block and the reference pixel value on the right side of the current coding block may include: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction.
Optionally, the step of calculating a final predicted pixel value according to the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction may include: when the angle between the prediction direction and the horizontal direction is smaller than a first angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain the final predicted pixel value on the upper side of the current coding block; and when the angle between the prediction direction and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side of the current coding block to obtain the final prediction pixel value on the right side of the current coding block.
Optionally, the intra prediction filtering according to the intra prediction mode of the current coding block, the reference pixel value on the left side of the current coding block, and/or the reference pixel value on the right side of the current coding block may include: carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, or carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, or carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block, or carrying out intra-frame prediction filtering according to the parsed information on the intra-frame filtering mode.
Optionally, the intra prediction filtering according to the intra prediction mode of the current coding block, the reference pixel value on the left side of the current coding block, and the reference pixel value on the right side of the current coding block may include: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the left upper side of the current coding block, a reference pixel value on the left side and a reference pixel value on the right side to obtain a final prediction pixel value on the left upper side of the current coding block, weighting a prediction pixel value on the left side of the current coding block, a reference pixel value on the left side and a reference pixel value on the right side to obtain a final prediction pixel value on the left side of the current coding block, weighting a prediction pixel value on the right upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode, determining whether the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle; when the angle between the prediction direction and the horizontal direction is smaller than a third angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final predicted pixel value on the upper side of the current coding block; and when the angle between the prediction direction and the vertical direction is smaller than a fourth angle, weighting the predicted pixel value on the left side of the current coding block and the reference pixel value on the left side to obtain the final predicted pixel value on the left side of the current coding block, and weighting the predicted pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain the final predicted pixel value on the right side of the current coding block.
According to an exemplary embodiment of the present disclosure, there is provided an encoding method including: determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point; calculating a prediction pixel value of a chrominance component of the current coding block according to the constructed linear model, and determining an intra-frame prediction mode of the current coding block; writing information for intra prediction, which includes at least information on whether an intra prediction mode is a TSCPM mode, into a code stream.
Optionally, the step of determining reference sample points for constructing the linear model may comprise: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, selecting two reference sample points from the reference pixels on the upper side and the reference pixels on the left side, respectively; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, selecting two reference sample points from the reference pixels on the upper side and the reference pixels on the right side, respectively; and when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, obtaining the reference sample points as in the case where the reference pixel value on the left side is available and the reference pixel value on the right side is unavailable, or as in the case where the reference pixel value on the right side is available and the reference pixel value on the left side is unavailable, or according to whichever of these two cases has the lower rate-distortion cost.
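A stripped-down sketch of this reference-sample and linear-model construction follows. The sample selection (here simply the first pair from the top row and from whichever side column is available) and the unclipped floating-point model are simplifications for illustration, not the derivation used by the patent or by the standardized TSCPM.

```python
import numpy as np

def build_linear_model(luma_pair, chroma_pair):
    """Fit chroma = alpha * luma + beta from two reference sample pairs."""
    (l0, l1), (c0, c1) = luma_pair, chroma_pair
    alpha = 0.0 if l1 == l0 else (c1 - c0) / (l1 - l0)
    return alpha, c0 - alpha * l0

def predict_chroma(luma_block, top_samples, left_samples, right_samples):
    """Chroma prediction from a luma/chroma linear model; each *_samples argument is a list of (luma, chroma) pairs or None."""
    side = left_samples if left_samples is not None else right_samples   # prefer the left column, else the right one
    (lt, ct), (ls, cs) = top_samples[0], side[0]                          # simplistic choice of the two sample pairs
    alpha, beta = build_linear_model((lt, ls), (ct, cs))
    return alpha * luma_block.astype(np.float64) + beta
```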
Optionally, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include reference sample point construction information.
According to an exemplary embodiment of the present disclosure, there is provided a decoding method including: parsing information for intra prediction from a code stream, the information for intra prediction including at least information on whether an intra prediction mode is a TSCPM mode; determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point; calculating a prediction pixel value of a chrominance component of a current coding block according to the constructed linear model; and reconstructing the chroma component of the current coding block according to the predicted pixel value of the chroma component of the current coding block to obtain a decoded image.
Optionally, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include reference sample point construction information.
Optionally, the step of determining reference sample points for constructing the linear model may comprise: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, selecting two reference sample points from the reference pixels on the upper side and the reference pixels on the left side, respectively; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, selecting two reference sample points from the reference pixels on the upper side and the reference pixels on the right side, respectively; and when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, obtaining the reference sample points as in the case where the reference pixel value on the left side is available and the reference pixel value on the right side is unavailable, or as in the case where the reference pixel value on the right side is available and the reference pixel value on the left side is unavailable, or according to the parsed reference sample point construction information.
According to an exemplary embodiment of the present disclosure, there is provided an encoding method including: determining the coding sequence of a prediction block in a current coding block based on the partition mode of the block and the availability of reference pixels of the current coding block; determining a coding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block; and writing the DT mode information into the code stream based on the determined coding sequence of a prediction block in the current coding block and the coding sequence of a transformation block in the prediction block, wherein the DT mode information at least comprises information about the partitioning mode of the block.
Alternatively, when the partition mode of the block is vertical partition, the determining of the coding order of the prediction block in the current coding block may include: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, the coding order of the prediction block is from left to right; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, the coding order of the prediction block is from right to left; when neither the reference pixel value on the left side nor the reference pixel value on the right side of the current coding block is available, the coding order of the prediction block is from left to right; and when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the coding order of the prediction block is one of the following: a left-to-right coding order, a right-to-left coding order, or whichever of these two orders has the lower rate-distortion cost.
Alternatively, when the partition mode of the block is vertical partition, the determining of the coding order of the transform blocks in the prediction block may include: when the reference pixel value on the left side of the current prediction block is available and the reference pixel value on the right side is not available, the coding order of the transform blocks in the prediction block is from left to right; when the reference pixel value on the left side of the current prediction block is unavailable and the reference pixel value on the right side is available, the coding order of the transform blocks in the prediction block is from right to left; when neither the reference pixel value on the left side nor the reference pixel value on the right side of the current prediction block is available, the coding order of the transform blocks in the prediction block is from left to right; and when both the reference pixel value on the left side and the reference pixel value on the right side of the current prediction block are available, the coding order of the transform blocks in the prediction block is one of the following: a left-to-right coding order, a right-to-left coding order, or whichever of these two orders has the lower rate-distortion cost.
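The order-selection rules for the vertical-partition case, both for prediction blocks and for transform blocks within a prediction block, reduce to the same small decision. The sketch below uses a hypothetical rd_cost_fn callback for the both-sides-available case; the fallback to left-to-right when no cost function is supplied is an assumption, since the patent also allows signalling the chosen order.

```python
def coding_order(left_available: bool, right_available: bool, rd_cost_fn=None) -> str:
    """Coding order of prediction blocks (or of transform blocks) under vertical partitioning."""
    if left_available and not right_available:
        return "left_to_right"
    if right_available and not left_available:
        return "right_to_left"
    if not left_available and not right_available:
        return "left_to_right"
    # Both sides available: either order may be used; pick the cheaper one if a cost callback is given.
    if rd_cost_fn is None:
        return "left_to_right"
    return min(("left_to_right", "right_to_left"), key=rd_cost_fn)
```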
Alternatively, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the DT mode information may further include information on the coding order of the prediction block and/or information on the coding order of the transform blocks in the prediction block.
According to an exemplary embodiment of the present disclosure, there is provided a decoding method including: parsing DT mode information from a code stream, wherein the DT mode information at least comprises information about a partitioning mode of a block; determining a decoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block; determining a decoding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block; and decoding according to the determined decoding order of the prediction block in the current coding block and the determined decoding order of the transformation block in the prediction block.
Optionally, the DT mode information may further include information on a decoding order of the prediction block and/or information on a decoding order of the transform blocks in the prediction block.
Alternatively, when the partition mode of the block is a vertical partition, the step of determining the decoding order of the prediction block in the current coding block may include: when reference pixel values on the left side of the current coding block are available and reference pixel values on the right side are not available, the decoding order of the prediction block is from left to right; when reference pixel values on the left side of the current coding block are not available and reference pixel values on the right side are available, the decoding order of the prediction block is from right to left; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, the decoding order of the prediction block is from left to right; when both the left reference pixel value and the right reference pixel value of the current coding block are available, the decoding order of the prediction block is one of the following decoding orders: a decoding order from left to right, a decoding order from right to left, a decoding order of the prediction block determined according to the information on the decoding order of the prediction block.
Alternatively, when the partition mode of the block is vertical partition, the determining of the decoding order of the transform blocks in the prediction block may include: when reference pixel values on the left side of the current prediction block are available and reference pixel values on the right side are not available, the decoding order of the transform blocks in the prediction block is from left to right; when reference pixel values on the left side of the current prediction block are not available and reference pixel values on the right side are available, the decoding order of the transform blocks in the prediction block is from right to left; when neither the left reference pixel value nor the right reference pixel value of the current prediction block is available, the decoding order of the transform blocks in the prediction block is from left to right; when both the left reference pixel value and the right reference pixel value of the current prediction block are available, the decoding order of the transform blocks in the prediction block is one of the following decoding orders: a decoding order from left to right, a decoding order from right to left, a decoding order of the transform blocks in the prediction block determined according to information on a decoding order of the transform blocks in the prediction block.
According to an exemplary embodiment of the present disclosure, there is provided an encoding apparatus including: an intra prediction unit configured to intra predict a current encoding block based on availability of reference pixels of the current encoding block; and a first encoding unit configured to write information for intra prediction, which includes at least information on an intra prediction mode, into a code stream.
Alternatively, the intra prediction unit may be configured to: traversing all intra-frame prediction modes aiming at a current coding block to obtain a prediction pixel value corresponding to each intra-frame prediction mode, and carrying out intra-frame prediction filtering on the prediction pixel value to obtain a final prediction pixel value corresponding to each intra-frame prediction mode; calculating the rate distortion cost corresponding to each intra-frame prediction mode according to the final prediction pixel value; and according to the rate distortion cost of each intra-frame prediction mode, taking the intra-frame prediction mode with the lowest rate distortion cost as the intra-frame prediction mode of the current coding block.
Optionally, the intra prediction unit may be further configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
Optionally, the intra prediction unit may be further configured to: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting the prediction pixel value on the upper side and the prediction pixel value on the right side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction.
Optionally, the intra prediction unit may be further configured to: when the angle between the prediction direction of the current coding block and the horizontal direction is smaller than a first angle, weighting the prediction pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final prediction pixel value; and when the angle between the prediction direction and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain a final prediction pixel value.
Optionally, the intra prediction unit may be further configured to: perform intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; or perform intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; or perform intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block; or select, from the following intra-frame filtering modes, the intra-frame filtering mode with the lowest rate-distortion cost to perform intra-frame prediction filtering: an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, and an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block.
Optionally, the intra prediction unit may be further configured to: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the left upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side of the current coding block to obtain a final prediction pixel value on the left upper side of the current coding block, weighting a prediction pixel value on the left side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side to obtain a final prediction pixel value on the left side of the current coding block, weighting a prediction pixel value on the right upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode, determining whether the angle between the prediction direction of the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle; when the angle between the prediction direction and the horizontal direction is smaller than a third angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final predicted pixel value on the upper side of the current coding block; and when the angle between the prediction direction and the vertical direction is smaller than a fourth angle, weighting the predicted pixel value on the left side of the current coding block and the reference pixel value on the left side to obtain the final predicted pixel value on the left side of the current coding block, and weighting the predicted pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain the final predicted pixel value on the right side of the current coding block.
Alternatively, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include information on an intra filtering mode.
According to an exemplary embodiment of the present disclosure, there is provided a decoding apparatus including: a first parsing unit configured to parse information for intra prediction from a code stream, the information for intra prediction including at least information on an intra prediction mode; a first calculation unit configured to calculate a prediction pixel value of the current encoding block from the parsed information for intra prediction based on availability of reference pixels of the current encoding block; and the first decoding unit is configured to reconstruct the image according to the predicted pixel value to obtain a decoded image.
Alternatively, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include information on an intra filtering mode.
Optionally, the first computing unit may be configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
Optionally, the first computing unit may be further configured to: when the intra-frame prediction mode of the current coding block is not the angle prediction mode, weighting a prediction pixel value on the upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the included angle between the current coding block and the horizontal direction or the vertical direction.
Optionally, the first computing unit may be further configured to: when the included angle between the current coding block and the horizontal direction is smaller than a first angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain the final predicted pixel value on the upper side of the current coding block; and when the included angle between the current coding block and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side of the current coding block to obtain the final prediction pixel value on the right side of the current coding block.
Optionally, the first computing unit may be further configured to: perform intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; or perform intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; or perform intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block; or perform intra-frame prediction filtering according to the parsed intra-frame filtering mode information.
Optionally, the first computing unit may be further configured to: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weight the prediction pixel value on the upper left side of the current coding block with the reference pixel value on the upper side and the reference pixel value on the left side to obtain the final prediction pixel value on the upper left side of the current coding block, weight the prediction pixel value on the left side of the current coding block with the reference pixel value on the upper side and the reference pixel value on the left side to obtain the final prediction pixel value on the left side of the current coding block, weight the prediction pixel value on the upper right side of the current coding block with the reference pixel value on the upper side and the reference pixel value on the right side to obtain the final prediction pixel value on the upper right side of the current coding block, and weight the prediction pixel value on the right side of the current coding block with the reference pixel value on the upper side and the reference pixel value on the right side to obtain the final prediction pixel value on the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode, determine whether the angle between the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle; when the angle between the current coding block and the horizontal direction is smaller than a third angle, weight the prediction pixel value on the upper side of the current coding block with the reference pixel value on the upper side to obtain the final prediction pixel value on the upper side of the current coding block; and when the angle between the current coding block and the vertical direction is smaller than a fourth angle, weight the prediction pixel value on the left side of the current coding block with the reference pixel value on the left side to obtain the final prediction pixel value on the left side of the current coding block, and weight the prediction pixel value on the right side of the current coding block with the reference pixel value on the right side to obtain the final prediction pixel value on the right side of the current coding block.
According to an exemplary embodiment of the present disclosure, there is provided an encoding apparatus including: a model building unit configured to determine reference sample points for building a linear model based on the availability of the reference pixels of the current coding block, and to build the linear model according to the determined reference sample points; a mode determination unit configured to calculate a prediction pixel value of the chrominance component of the current coding block according to the constructed linear model and to determine the intra prediction mode of the current coding block; and a second encoding unit configured to write information for intra prediction, which includes at least information on whether the intra prediction mode is the TSCPM mode, into the code stream.
Optionally, the model building unit may be configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, select two reference sample points from the reference pixels on the upper side and two from the reference pixels on the left side; when the reference pixel value on the left side of the current coding block is not available and the reference pixel value on the right side is available, select two reference sample points from the reference pixels on the upper side and two from the reference pixels on the right side; and when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, obtain the reference sample points as in the case where the left reference pixel value is available and the right reference pixel value is not available, or as in the case where the right reference pixel value is available and the left reference pixel value is not available, or according to whichever of these two cases has the lower rate-distortion cost.
Optionally, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include reference sample point construction information.
According to an exemplary embodiment of the present disclosure, there is provided a decoding apparatus including: a second parsing unit configured to parse information for intra prediction from the code stream, the information for intra prediction including at least information on whether the intra prediction mode is the TSCPM mode; a model building unit configured to determine reference sample points for building a linear model based on the availability of the reference pixels of the current coding block, and to build the linear model according to the determined reference sample points; a second calculation unit configured to calculate a prediction pixel value of the chrominance component of the current coding block according to the constructed linear model; and a second decoding unit configured to reconstruct the chrominance component of the current coding block according to the prediction pixel value of the chrominance component of the current coding block to obtain a decoded image.
Optionally, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include reference sample point construction information.
Optionally, the model building unit may be configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, select two reference sample points from the reference pixels on the upper side and two from the reference pixels on the left side; when the reference pixel value on the left side of the current coding block is not available and the reference pixel value on the right side is available, select two reference sample points from the reference pixels on the upper side and two from the reference pixels on the right side; and when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, obtain the reference sample points as in the case where the left reference pixel value is available and the right reference pixel value is not available, or as in the case where the right reference pixel value is available and the left reference pixel value is not available, or according to the parsed reference sample point construction information.
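For illustration only, the following Python sketch shows one way the reference samples and the linear model described above could be derived from neighbour availability. The function names, the choice of two samples per side and the least-squares fit are assumptions made for this sketch; they do not reproduce the normative derivation of any particular codec.

```python
# Illustrative sketch (not normative): choose TSCPM reference samples from the
# neighbouring pixels according to their availability, then fit a linear model
# chroma = alpha * luma + beta from the chosen (luma, chroma) sample pairs.

def pick_reference_samples(top, left, right, left_avail, right_avail,
                           construction_info=None):
    """top/left/right are lists of (luma, chroma) neighbour samples;
    construction_info selects a side when both neighbours are available."""
    if left_avail and not right_avail:
        return top[:2] + left[:2]        # two samples from the top, two from the left
    if right_avail and not left_avail:
        return top[:2] + right[:2]       # two samples from the top, two from the right
    if left_avail and right_avail:
        side = left if construction_info == "left" else right
        return top[:2] + side[:2]        # side chosen by the signalled construction info
    return top[:4]                       # fallback: top neighbours only

def fit_linear_model(samples):
    """Least-squares fit of chroma = alpha * luma + beta over the reference samples."""
    n = len(samples)
    sum_l = sum(l for l, _ in samples)
    sum_c = sum(c for _, c in samples)
    sum_lc = sum(l * c for l, c in samples)
    sum_ll = sum(l * l for l, _ in samples)
    denom = n * sum_ll - sum_l * sum_l
    alpha = (n * sum_lc - sum_l * sum_c) / denom if denom else 0.0
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta

def predict_chroma(luma_block, alpha, beta):
    """Apply the linear model to the reconstructed luma block to predict chroma."""
    return [[alpha * l + beta for l in row] for row in luma_block]
```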
According to an exemplary embodiment of the present disclosure, there is provided an encoding apparatus including: a first order determination unit configured to determine a coding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block; a second order determination unit configured to determine a coding order of the transform blocks in the prediction block based on a partition mode of the block and availability of reference pixels of a current coding block; a third encoding unit configured to write extended tree (DT) mode information into the code stream based on the determined coding order of a prediction block in the current coding block and the coding order of a transform block in the prediction block, wherein the DT mode information includes at least information on a partition mode of the block.
Alternatively, the first order determination unit may be configured such that, when the partition mode of the block is vertical partition: when the reference pixel values on the left side of the current coding block are available and the reference pixel values on the right side are not available, the coding order of the prediction block is from left to right; when the reference pixel values on the left side of the current coding block are not available and the reference pixel values on the right side are available, the coding order of the prediction block is from right to left; when neither the left reference pixel values nor the right reference pixel values of the current coding block are available, the coding order of the prediction block is from left to right; and when both the left reference pixel values and the right reference pixel values of the current coding block are available, the coding order of the prediction block is one of the following: the coding order from left to right, the coding order from right to left, or whichever of the left-to-right and right-to-left coding orders has the lower rate-distortion cost.
Alternatively, the second order determination unit may be configured such that, when the partition mode of the block is vertical partition: when the reference pixel values on the left side of the current prediction block are available and the reference pixel values on the right side are not available, the coding order of the transform blocks in the prediction block is from left to right; when the reference pixel values on the left side of the current prediction block are not available and the reference pixel values on the right side are available, the coding order of the transform blocks in the prediction block is from right to left; when neither the left reference pixel values nor the right reference pixel values of the current prediction block are available, the coding order of the transform blocks in the prediction block is from left to right; and when both the left reference pixel values and the right reference pixel values of the current prediction block are available, the coding order of the transform blocks in the prediction block is one of the following: the coding order from left to right, the coding order from right to left, or whichever of the left-to-right and right-to-left coding orders has the lower rate-distortion cost.
Alternatively, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the DT mode information may further include information on the coding order of the prediction block and/or information on the coding order of the transform blocks in the prediction block.
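As a rough, non-normative illustration of the order selection described above, the sketch below maps the availability of the left and right neighbours to a coding order for vertically partitioned blocks. The function name and the rate-distortion callback are assumptions made for this sketch.

```python
# Illustrative sketch (not normative): choose the coding order of vertically
# partitioned blocks from the availability of the neighbouring reference pixels.

LEFT_TO_RIGHT = "left_to_right"
RIGHT_TO_LEFT = "right_to_left"

def prediction_block_order(left_avail, right_avail, rd_cost):
    """rd_cost(order) is an assumed callback returning the rate-distortion cost
    of encoding the blocks in the given order."""
    if left_avail and not right_avail:
        return LEFT_TO_RIGHT             # only left neighbours reconstructed: scan rightwards
    if right_avail and not left_avail:
        return RIGHT_TO_LEFT             # only right neighbours reconstructed: scan leftwards
    if not left_avail and not right_avail:
        return LEFT_TO_RIGHT             # default when neither side is available
    # Both sides available: keep whichever order costs less; the chosen order
    # would then be signalled as part of the DT mode information.
    return min((LEFT_TO_RIGHT, RIGHT_TO_LEFT), key=rd_cost)
```

The same selection could, under the same assumptions, be applied a second time inside each prediction block to order its transform blocks.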
According to an exemplary embodiment of the present disclosure, there is provided a decoding apparatus including: a third parsing unit configured to parse DT mode information from the codestream, wherein the DT mode information includes at least information on a partition mode of the block; a third order determination unit configured to determine a decoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block; a fourth order determination unit configured to determine a decoding order of the transform blocks in the prediction block based on a partition mode of the block and availability of reference pixels of a current encoding block; a third decoding unit configured to decode according to the determined decoding order of the prediction block in the current coding block and the determined decoding order of the transform block in the prediction block.
Optionally, the DT mode information may further include information on a decoding order of the prediction block and/or information on a decoding order of the transform blocks in the prediction block.
Alternatively, the third order determination unit may be configured to: when the partition mode of the block is vertical partition, when reference pixel values on the left side of the current encoding block are available and reference pixel values on the right side are not available, the decoding order of the prediction block is from left to right; when reference pixel values on the left side of the current coding block are not available and reference pixel values on the right side are available, the decoding order of the prediction block is from right to left; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, the decoding order of the prediction block is from left to right; when both the left reference pixel value and the right reference pixel value of the current coding block are available, the decoding order of the prediction block is one of the following decoding orders: a decoding order from left to right, a decoding order from right to left, a decoding order of the prediction block determined according to the information on the decoding order of the prediction block.
Alternatively, the fourth order determination unit may be configured to: when the partition mode of the block is vertical partition, when reference pixel values on the left side of the current prediction block are available and reference pixel values on the right side are not available, the decoding order of the transform blocks in the prediction block is from left to right; when reference pixel values on the left side of the current prediction block are not available and reference pixel values on the right side are available, the decoding order of the transform blocks in the prediction block is from right to left; when neither the left reference pixel value nor the right reference pixel value of the current prediction block is available, the decoding order of the transform blocks in the prediction block is from left to right; when both the left reference pixel value and the right reference pixel value of the current prediction block are available, the decoding order of the transform blocks in the prediction block is one of the following decoding orders: a decoding order from left to right, a decoding order from right to left, a decoding order of the transform blocks in the prediction block determined according to information on a decoding order of the transform blocks in the prediction block.
According to an exemplary embodiment of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements an encoding method and a decoding method according to an exemplary embodiment of the present disclosure.
According to an exemplary embodiment of the present disclosure, there is provided a computing apparatus including: a processor; a memory storing a computer program that, when executed by the processor, implements an encoding method and a decoding method according to exemplary embodiments of the present disclosure.
According to the encoding and decoding methods and apparatuses of the exemplary embodiments of the present disclosure, a flexible coding order is adopted and the spatial reference information is expanded: the intra prediction filtering technology (IPF), the cross-component prediction technology (TSCPM) and the extended tree technology (DT) are designed according to the availability of the neighbouring reference pixels on the left side and the right side of the current coded or decoded image block, so that the information of the surrounding reference pixels is utilized to the maximum extent and the efficiency of intra-frame coding is improved. When the left-side and right-side pixels of the current image block are available, these intra prediction technologies are consistent with the prior art, so that compatibility among the technologies is maintained to the maximum extent. When the right-side pixels of the image block are available, for the IPF technology, the right-side and upper-side pixels are used as the filtering pixels, that is, the right-side and upper-side pixels are weighted with the prediction values to obtain new prediction values. For the TSCPM technology, the linear model between the luminance component and the chrominance component is derived using the pixels on the right side and the upper side, and the chrominance component is predicted based on the linear model. For the DT technology, when the right-side pixels are available, the prediction blocks and the transform blocks are coded in the order from right to left, so that the coded or decoded pixel information around the image block can still be used to the maximum extent when the right-to-left coding order is adopted, thereby improving the intra prediction efficiency and further improving the video coding performance.
Additional aspects and/or advantages of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
Drawings
The above and other objects and features of the exemplary embodiments of the present disclosure will become more apparent from the following description taken in conjunction with the accompanying drawings which illustrate exemplary embodiments, wherein:
FIG. 1 is a schematic diagram showing a case in the IPF technology in which the reference pixels at the upper side and the left side of a current coding block are available;
FIG. 2 is a diagram illustrating two reference samples found from the reference pixels on the left and top of the current block in the TSCPM technique;
FIG. 3 illustrates a schematic diagram of a DT technique for dividing a coding block into prediction blocks;
fig. 4 shows a flow chart of an encoding method according to an example embodiment of the present disclosure;
fig. 5 shows a schematic diagram where reference pixels at the upper side and right side of a current coding block are available according to an exemplary embodiment of the present disclosure;
fig. 6 shows a schematic diagram in which the reference pixels at the upper side, the left side, and the right side of a current coding block are available according to an exemplary embodiment of the present disclosure;
fig. 7 shows a flowchart of a decoding method according to an example embodiment of the present disclosure;
fig. 8 shows a flowchart of an encoding method according to another exemplary embodiment of the present disclosure;
fig. 9 illustrates a schematic diagram of reference samples in TSCPM technology according to an exemplary embodiment of the present disclosure;
fig. 10 shows a flowchart of a decoding method according to another exemplary embodiment of the present disclosure;
fig. 11 shows a flowchart of an encoding method according to another exemplary embodiment of the present disclosure;
fig. 12 shows a schematic diagram of a coding order of a prediction block according to an exemplary embodiment of the present disclosure;
fig. 13 illustrates a schematic diagram of a coding order of a prediction block according to another exemplary embodiment of the present disclosure;
fig. 14 illustrates a schematic diagram of a coding order of a prediction block according to another exemplary embodiment of the present disclosure;
fig. 15 illustrates a schematic diagram of a coding order of transform blocks in a prediction block according to another exemplary embodiment of the present disclosure;
fig. 16 illustrates a schematic diagram of a coding order of transform blocks in a prediction block according to another exemplary embodiment of the present disclosure;
fig. 17 illustrates a schematic diagram of a coding order of transform blocks in a prediction block according to another exemplary embodiment of the present disclosure;
fig. 18 shows a flowchart of a decoding method according to another exemplary embodiment of the present disclosure;
fig. 19 shows a block diagram of an encoding apparatus according to an exemplary embodiment of the present disclosure;
fig. 20 shows a block diagram of a decoding apparatus according to an exemplary embodiment of the present disclosure;
fig. 21 shows a block diagram of an encoding apparatus according to another exemplary embodiment of the present disclosure;
fig. 22 illustrates a block diagram of a decoding apparatus according to another exemplary embodiment of the present disclosure;
fig. 23 illustrates a block diagram of an encoding apparatus according to another exemplary embodiment of the present disclosure;
fig. 24 illustrates a block diagram of a decoding apparatus according to another exemplary embodiment of the present disclosure; and
fig. 25 shows a schematic diagram of a computing device according to an example embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below in order to explain the present disclosure by referring to the figures.
Fig. 4 shows a flowchart of an encoding method according to an exemplary embodiment of the present disclosure. The encoding method of fig. 4 may be applied to an encoder using an Intra Prediction Filtering (IPF) technique.
Referring to fig. 4, in step S401, intra prediction is performed on a current coding block based on the availability of reference pixels of the current coding block.
In the exemplary embodiment of the present disclosure, when intra-frame prediction is performed on the current coding block, all intra-frame prediction modes may be traversed for the current coding block to obtain the prediction pixel values corresponding to each intra-frame prediction mode, and intra-frame prediction filtering is performed on the prediction pixel values to obtain the final prediction pixel values corresponding to each intra-frame prediction mode. The rate-distortion cost corresponding to each intra-frame prediction mode is then calculated from the final prediction pixel values, and the intra-frame prediction mode with the lowest rate-distortion cost is used as the intra-frame prediction mode of the current coding block.
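For illustration only, the mode decision loop just described can be sketched as follows. The callbacks intra_predict, ipf_filter and rd_cost are assumed placeholders for the encoder's actual prediction, intra prediction filtering and rate-distortion evaluation.

```python
# Illustrative sketch (not normative): traverse all intra prediction modes,
# filter each prediction, and keep the mode with the lowest rate-distortion cost.

def choose_intra_mode(block, modes, intra_predict, ipf_filter, rd_cost):
    best_mode, best_pred, best_cost = None, None, float("inf")
    for mode in modes:                              # traverse all intra prediction modes
        pred = intra_predict(block, mode)           # prediction pixel values for this mode
        final_pred = ipf_filter(block, mode, pred)  # intra prediction filtering
        cost = rd_cost(block, mode, final_pred)     # transform/quantize and measure the RD cost
        if cost < best_cost:
            best_mode, best_pred, best_cost = mode, final_pred, cost
    return best_mode, best_pred                     # mode with the lowest RD cost
```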
In an exemplary embodiment of the present disclosure, in intra prediction filtering of a prediction pixel value, intra prediction filtering may be performed according to an intra prediction mode of a current coding block and a reference pixel value on the left side of the current coding block when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
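The following sketch is a non-normative illustration of how the filtering behaviour could be selected from the availability of the left and right reference pixels, following the four cases listed above. The filter_* helpers and the padding rule are assumed placeholders rather than the normative filter definitions.

```python
# Illustrative sketch (not normative): dispatch the intra prediction filtering
# on the availability of the left and right reference pixels of the block.

def ipf_dispatch(mode, pred, top_ref, left_ref, right_ref,
                 left_avail, right_avail,
                 filter_with_left, filter_with_right, filter_with_both):
    if left_avail and not right_avail:
        return filter_with_left(mode, pred, top_ref, left_ref)
    if right_avail and not left_avail:
        return filter_with_right(mode, pred, top_ref, right_ref)
    if left_avail and right_avail:
        # Either side or both may be used; the encoder may also compare the
        # candidate filtering modes by rate-distortion cost (see later sketch).
        return filter_with_both(mode, pred, top_ref, left_ref, right_ref)
    # Neither side available: pad one side (here the left side, from the first
    # upper reference sample -- an assumed padding rule) and fall back to the
    # one-sided case.
    padded_left = [top_ref[0]] * len(pred)
    return filter_with_left(mode, pred, top_ref, padded_left)
```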
In an exemplary embodiment of the present disclosure, when performing intra prediction filtering according to an intra prediction mode of a current coding block and a reference pixel value on the right side of the current coding block, when the intra prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the upper side and the prediction pixel value on the right side of the current coding block with the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the included angle between the current coding block and the horizontal direction or the vertical direction.
In an exemplary embodiment of the present disclosure, when a final predicted pixel value is calculated according to an angle between a current coding block and a horizontal direction or a vertical direction, when the angle between the current coding block and the horizontal direction is smaller than a first angle, a predicted pixel value on an upper side of the current coding block and a reference pixel value on the upper side may be weighted to obtain the final predicted pixel value; and when the included angle between the current coding block and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain a final prediction pixel value.
In the exemplary embodiments of the present disclosure, when intra-frame prediction filtering is performed according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block, the filtering may be performed according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, or according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, or according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block; or the intra-frame filtering mode with the lowest rate-distortion cost may be selected from the following intra-frame filtering modes for intra-frame prediction: an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, and an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel values on both the left side and the right side of the current coding block.
In the exemplary embodiments of the present disclosure, when intra-frame prediction filtering is performed according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block, and the intra-frame prediction mode of the current coding block is not an angle prediction mode, the prediction pixel value on the upper left side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the left side to obtain the final prediction pixel value on the upper left side of the current coding block, the prediction pixel value on the left side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the left side to obtain the final prediction pixel value on the left side of the current coding block, the prediction pixel value on the upper right side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the right side to obtain the final prediction pixel value on the upper right side of the current coding block, and the prediction pixel value on the right side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the right side to obtain the final prediction pixel value on the right side of the current coding block. When the intra-frame prediction mode of the current coding block is an angle prediction mode, it is determined whether the angle between the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle: when the angle between the current coding block and the horizontal direction is smaller than a third angle, the prediction pixel value on the upper side of the current coding block is weighted with the reference pixel value on the upper side to obtain the final prediction pixel value on the upper side of the current coding block; and when the angle between the current coding block and the vertical direction is smaller than a fourth angle, the prediction pixel value on the left side of the current coding block is weighted with the reference pixel value on the left side to obtain the final prediction pixel value on the left side of the current coding block, and the prediction pixel value on the right side of the current coding block is weighted with the reference pixel value on the right side to obtain the final prediction pixel value on the right side of the current coding block. It is understood that the first to fourth angles may be the same or different.
In an exemplary embodiment of the present disclosure, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction further includes information on an intra filtering mode.
Specifically, when the encoder performs intra prediction on the current coding block, the encoder may first traverse all intra prediction modes for the current coding block to obtain corresponding prediction pixel values, perform intra prediction filtering on the prediction pixel values to obtain final prediction pixel values, perform operations such as transformation and quantization on the current coding block according to the final prediction pixel values to calculate rate distortion cost corresponding to each intra prediction mode, and finally select the intra prediction mode with the lowest rate distortion cost according to the rate distortion cost of each intra prediction mode.
For example, fig. 1 illustrates a schematic diagram in which the reference pixels on the upper side and the reference pixels on the left side of the current coding block are available in the IPF technology. As shown in fig. 1, when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, filtering is performed according to the intra-frame prediction mode (PredMode) of the current coding block. When the intra-frame prediction mode of the current coding block is not an angle prediction mode (for example, when the prediction mode is Direct Current (DC), Planar, bi-directional interpolation (Bilinear), or the like), the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the upper side of the current coding block, and the prediction pixel value near the left side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the left side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the horizontal direction is less than 45 degrees, the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side to obtain the final prediction pixel value near the upper side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the vertical direction is less than 45 degrees, the prediction pixel value near the left side of the current coding block is weighted with the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the left side of the current coding block.
For example, fig. 5 shows a schematic diagram in which the reference pixels on the upper side and the right side of the current coding block are available according to an exemplary embodiment of the present disclosure. As shown in fig. 5, when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, filtering is performed according to the intra-frame prediction mode of the current coding block. Specifically, when the intra-frame prediction mode of the current coding block is not an angle prediction mode (for example, when the prediction mode is DC, Planar, Bilinear, or the like), the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the upper side of the current coding block, and the prediction pixel value near the right side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the horizontal direction is less than 45 degrees, the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side to obtain the final prediction pixel value near the upper side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the vertical direction is less than 45 degrees, the prediction pixel value near the right side of the current coding block is weighted with the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the right side of the current coding block.
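For illustration only, the case just described (left reference pixels unavailable, right reference pixels available) can be sketched as below. The weight w_ref and the fact that only the boundary row and column are touched are assumptions of this sketch; an actual filter would typically apply position-dependent integer weights that decay with distance from the boundary.

```python
# Illustrative sketch (not normative): weight the prediction near the upper and
# right boundaries with the upper and right reference pixels, depending on
# whether the mode is angular and on its angle to the horizontal/vertical axis.

def ipf_right_available(mode_is_angular, angle_to_horizontal, angle_to_vertical,
                        pred, top_ref, right_ref, w_ref=0.25):
    h, w = len(pred), len(pred[0])
    out = [row[:] for row in pred]
    if not mode_is_angular:                         # e.g. DC, Planar, Bilinear
        for x in range(w):                          # row nearest the upper boundary
            out[0][x] = ((1 - 2 * w_ref) * pred[0][x]
                         + w_ref * top_ref[x] + w_ref * right_ref[0])
        for y in range(h):                          # column nearest the right boundary
            out[y][w - 1] = ((1 - 2 * w_ref) * pred[y][w - 1]
                             + w_ref * top_ref[w - 1] + w_ref * right_ref[y])
    elif angle_to_horizontal < 45:                  # nearly horizontal prediction direction
        for x in range(w):
            out[0][x] = (1 - w_ref) * pred[0][x] + w_ref * top_ref[x]
    elif angle_to_vertical < 45:                    # nearly vertical prediction direction
        for y in range(h):
            out[y][w - 1] = (1 - w_ref) * pred[y][w - 1] + w_ref * right_ref[y]
    return out
```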
In addition, when the reference pixel value on the left side of the current coding block is not available and the reference pixel value on the right side is not available, one possible implementation method is to fill the reference pixel value on the left side and then perform intra-frame filtering in the case that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available; one possible implementation method is: and filling the right reference pixel value, and performing intra-frame filtering under the condition that the reference pixel value at the left side of the current coding block is unavailable and the right reference pixel value is available.
Furthermore, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, one possible method is: performing intra-frame filtering as in the case where the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available. Another possible method is: performing intra-frame filtering as in the case where the reference pixel value on the left side of the current coding block is not available and the reference pixel value on the right side is available. Another possible method uses both sides. For example, fig. 6 illustrates a schematic diagram in which the reference pixels on the upper side, the left side and the right side of the current coding block are available according to an exemplary embodiment of the present disclosure. As shown in fig. 6, when the intra-frame prediction mode of the current coding block is not an angle prediction mode (for example, when the prediction mode is DC, Planar, Bilinear, or the like), the prediction pixel value near the upper left side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the upper left side of the current coding block, the prediction pixel value near the left side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the left side of the current coding block, the prediction pixel value near the upper right side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the upper right side of the current coding block, and the prediction pixel value near the right side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the horizontal direction is less than 45 degrees, the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side to obtain the final prediction pixel value near the upper side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the vertical direction is less than 45 degrees, the prediction pixel value near the left side of the current coding block is weighted with the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the left side of the current coding block, and the prediction pixel value near the right side of the current coding block is weighted with the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the right side of the current coding block. A further possible method is: traversing the intra-frame filtering modes of the three methods above and selecting the intra-frame filtering mode with the lowest rate-distortion cost for intra-frame filtering.
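A minimal sketch of the last option, assuming filter_left, filter_right, filter_both and rd_cost are callbacks supplied by the encoder, could look as follows; the returned index is only an example of the filtering mode information that would be signalled.

```python
# Illustrative sketch (not normative): when both the left and the right
# reference pixels are available, evaluate the three candidate intra filtering
# modes and keep the one with the lowest rate-distortion cost.

def choose_ipf_mode(pred, rd_cost, filter_left, filter_right, filter_both):
    candidates = [
        ("left",  filter_left(pred)),    # filter with the upper and left references
        ("right", filter_right(pred)),   # filter with the upper and right references
        ("both",  filter_both(pred)),    # filter with the upper, left and right references
    ]
    name, filtered = min(candidates, key=lambda c: rd_cost(c[1]))
    index = [n for n, _ in candidates].index(name)   # e.g. signalled as a filter mode index
    return index, filtered
```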
When intra-frame filtering is performed with the selected intra-frame filtering mode having the lowest rate-distortion cost, the selected intra-frame filtering mode information needs to be written into the code stream; the intra-frame filtering mode information may be, for example, a filtering mode index value, which is not limited herein.
In step S402, information for intra prediction, which includes at least information on the intra prediction mode, is written into the code stream.
Specifically, when writing information for intra prediction into a code stream, one possible method is: writing the intra-frame prediction mode into a code stream according to the prior art; another possible method is: firstly, writing an intra-frame prediction mode into a code stream according to the prior art, and writing filter mode information into the code stream when both a reference pixel value on the left side and a reference pixel value on the right side of a current coding block are available.
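For illustration only, the second signalling option above can be sketched as follows; write_intra_mode and write_filter_mode are assumed placeholders for the actual entropy-coding routines.

```python
# Illustrative sketch (not normative): the intra prediction mode is always
# written, and the intra filtering mode information is written only when both
# the left and the right reference pixels of the current coding block are available.

def write_intra_info(bitstream, intra_mode, filter_mode_index,
                     left_avail, right_avail,
                     write_intra_mode, write_filter_mode):
    write_intra_mode(bitstream, intra_mode)              # as in the prior art
    if left_avail and right_avail:
        write_filter_mode(bitstream, filter_mode_index)  # e.g. a filter mode index value
```

The decoder would mirror this rule, parsing the filtering mode information only when both neighbours of the current coding block are available.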
Fig. 7 illustrates a flowchart of a decoding method according to an exemplary embodiment of the present disclosure. The decoding method in fig. 7 can be applied to a decoder using an Intra Prediction Filtering (IPF) technique.
Referring to fig. 7, in step S701, information for intra prediction, which includes at least information on an intra prediction mode, is parsed from a code stream.
In an exemplary embodiment of the present disclosure, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction further includes information on an intra filtering mode.
Specifically, when the information for intra prediction is parsed from the code stream, one possible method is: parsing the intra-frame prediction mode from the code stream according to the prior art. Another possible method is: first parsing the intra-frame prediction mode from the code stream according to the prior art and obtaining the reference pixel information of the current coding block, and then, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, parsing the filtering mode information.
In step S702, a prediction pixel value of the current encoding block is calculated from the parsed information for intra prediction based on the availability of the reference pixels of the current encoding block.
In an exemplary embodiment of the present disclosure, in calculating a prediction pixel value of a current coding block, intra prediction filtering may be performed according to an intra prediction mode of the current coding block and a reference pixel value on the left side of the current coding block when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
In an exemplary embodiment of the present disclosure, when performing intra prediction filtering according to an intra prediction mode of a current coding block and a reference pixel value on the right side of the current coding block, when the intra prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the upper side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value on the upper side of the current coding block, and weighting the prediction pixel value on the right side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the included angle between the current coding block and the horizontal direction or the vertical direction.
In an exemplary embodiment of the present disclosure, when calculating a final predicted pixel value according to an angle between a current coding block and a horizontal direction or a vertical direction, when the angle between the current coding block and the horizontal direction is smaller than a first angle, weighting a predicted pixel value on an upper side of the current coding block and a reference pixel value on the upper side to obtain the final predicted pixel value on the upper side of the current coding block; and when the included angle between the current coding block and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side of the current coding block to obtain the final prediction pixel value on the right side of the current coding block.
In the exemplary embodiments of the present disclosure, when intra-frame prediction filtering is performed according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block, the filtering may be performed according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, or according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, or according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block, or according to the parsed intra-frame filtering mode information.
In the exemplary embodiments of the present disclosure, when intra-frame prediction filtering is performed according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block, and the intra-frame prediction mode of the current coding block is not an angle prediction mode, the prediction pixel value on the upper left side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the left side to obtain the final prediction pixel value on the upper left side of the current coding block, the prediction pixel value on the left side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the left side to obtain the final prediction pixel value on the left side of the current coding block, the prediction pixel value on the upper right side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the right side to obtain the final prediction pixel value on the upper right side of the current coding block, and the prediction pixel value on the right side of the current coding block is weighted with the reference pixel value on the upper side and the reference pixel value on the right side to obtain the final prediction pixel value on the right side of the current coding block. When the intra-frame prediction mode of the current coding block is an angle prediction mode, it is determined whether the angle between the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle: when the angle between the current coding block and the horizontal direction is smaller than a third angle, the prediction pixel value on the upper side of the current coding block is weighted with the reference pixel value on the upper side to obtain the final prediction pixel value on the upper side of the current coding block; and when the angle between the current coding block and the vertical direction is smaller than a fourth angle, the prediction pixel value on the left side of the current coding block is weighted with the reference pixel value on the left side to obtain the final prediction pixel value on the left side of the current coding block, and the prediction pixel value on the right side of the current coding block is weighted with the reference pixel value on the right side to obtain the final prediction pixel value on the right side of the current coding block.
In particular, when the reference pixel values on the left side of the current coding block are available and the reference pixel values on the right side are not available, the decoder filters according to the intra-frame prediction mode of the current coding block. That is, when the intra-frame prediction mode of the current coding block is not an angle prediction mode (for example, when the prediction mode is DC, Planar, Bilinear, or the like), the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the upper side of the current coding block, and the prediction pixel value near the left side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the left side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the horizontal direction is less than 45 degrees, the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side to obtain the final prediction pixel value near the upper side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the vertical direction is less than 45 degrees, the prediction pixel value near the left side of the current coding block is weighted with the corresponding reference pixel value on the left side to obtain the final prediction pixel value near the left side of the current coding block.
In addition, when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, filtering is performed according to the intra-frame prediction mode of the current coding block. Specifically, when the intra-frame prediction mode of the current coding block is not an angle prediction mode (for example, when the prediction mode is DC, Planar, Bilinear, or the like), the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the upper side of the current coding block, and the prediction pixel value near the right side of the current coding block is weighted with the corresponding reference pixel value on the upper side and the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the horizontal direction is less than 45 degrees, the prediction pixel value near the upper side of the current coding block is weighted with the corresponding reference pixel value on the upper side to obtain the final prediction pixel value near the upper side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode and the angle with the vertical direction is less than 45 degrees, the prediction pixel value near the right side of the current coding block is weighted with the corresponding reference pixel value on the right side to obtain the final prediction pixel value near the right side of the current coding block.
In addition, when neither the reference pixel value on the left side nor the reference pixel value on the right side of the current coding block is available, one possible implementation method is: filling the reference pixel values on the left side and then performing intra-frame filtering as in the case that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable; another possible implementation method is: filling the reference pixel values on the right side and then performing intra-frame filtering as in the case that the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available.
Furthermore, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, one possible method is: performing intra-frame filtering as in the case that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable; another possible method is: performing intra-frame filtering as in the case that the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available; another possible method, shown in fig. 6, is: when the intra prediction mode of the current coding block is not an angular prediction mode (for example, when the prediction mode is DC, Planar, Bilinear, or the like), the prediction pixel value near the upper left of the current coding block is weighted with the corresponding upper reference pixel value and the corresponding left reference pixel value to obtain the final prediction pixel value near the upper left of the current coding block, the prediction pixel value near the left side is weighted with the corresponding upper reference pixel value and the corresponding left reference pixel value to obtain the final prediction pixel value near the left side, the prediction pixel value near the upper right is weighted with the corresponding upper reference pixel value and the corresponding right reference pixel value to obtain the final prediction pixel value near the upper right, and the prediction pixel value near the right side is weighted with the corresponding upper reference pixel value and the corresponding right reference pixel value to obtain the final prediction pixel value near the right side of the current coding block; when the intra prediction mode of the current coding block is an angular prediction mode and the included angle between the current coding block and the horizontal direction is less than 45 degrees, the prediction pixel value close to the upper side of the current coding block is weighted with the corresponding upper reference pixel value to obtain the final prediction pixel value close to the upper side of the current coding block; when the intra prediction mode of the current coding block is an angular prediction mode and the included angle between the current coding block and the vertical direction is less than 45 degrees, the prediction pixel value close to the left side of the current coding block is weighted with the corresponding left reference pixel value to obtain the final prediction pixel value close to the left side of the current coding block, and the prediction pixel value close to the right side of the current coding block is weighted with the corresponding right reference pixel value to obtain the final prediction pixel value close to the right side of the current coding block; another possible method is: performing intra-frame filtering on the current coding block according to the filtering mode corresponding to the filtering mode information obtained in step S702 to obtain the final prediction pixel values.
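The availability-dependent filtering described above can be summarized in a short sketch (Python is used here purely for illustration). The weights (half the predictor, with the reference values sharing the other half), the one-row/one-column filtering range, and the representation of modes and angles are assumptions made for the example rather than values taken from this disclosure; the upper reference row is assumed to be available.

```python
def blend(p, refs):
    # Assumed weighting: half the predictor, the references sharing the other half.
    return (p * len(refs) + sum(refs) + len(refs)) // (2 * len(refs))

def intra_filter(pred, top_ref, left_ref, right_ref, mode, angle_h=None, angle_v=None):
    """pred: 2-D list of prediction pixel values; left_ref/right_ref are None when unavailable."""
    h, w = len(pred), len(pred[0])
    if left_ref is None and right_ref is None:
        # Neither side available: pad a left column and fall back to the left-available case.
        left_ref = [pred[y][0] for y in range(h)]
    use_left = left_ref is not None
    side_ref = left_ref if use_left else right_ref
    col = 0 if use_left else w - 1                      # column closest to the available side

    if mode in ("DC", "Planar", "Bilinear"):            # non-angular prediction modes
        for x in range(w):                               # pixels close to the upper side
            pred[0][x] = blend(pred[0][x], [top_ref[x], side_ref[0]])
        for y in range(h):                               # pixels close to the available side
            pred[y][col] = blend(pred[y][col], [top_ref[col], side_ref[y]])
    elif angle_h is not None and angle_h < 45:           # angular mode close to horizontal
        for x in range(w):
            pred[0][x] = blend(pred[0][x], [top_ref[x]])
    elif angle_v is not None and angle_v < 45:           # angular mode close to vertical
        for y in range(h):
            pred[y][col] = blend(pred[y][col], [side_ref[y]])
    return pred
```

When both sides are available, the encoder or decoder may call such a routine as if only one side were available, or extend it so that the near-left pixels are weighted with the left references and the near-right pixels with the right references, as in the method of fig. 6.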
In step S703, the image is reconstructed according to the predicted pixel value, and a decoded image is obtained.
Specifically, the decoder may reconstruct the image according to the final predicted pixel value to obtain a final decoded image.
Fig. 8 illustrates a flowchart of an encoding method according to another exemplary embodiment of the present disclosure. The encoding method in fig. 8 may be applied to an encoder using an intra cross-component prediction technique (TSCPM).
Referring to fig. 8, in step S801, a reference sampling point for constructing a linear model is determined based on availability of reference pixels of a current encoding block, and the linear model is constructed according to the determined reference sampling point.
In an exemplary embodiment of the present disclosure, in determining reference samples for constructing a linear model, two reference samples may be selected from an upper side reference pixel and a left side reference pixel, respectively, when a left side reference pixel value of a current coding block is available and a right side reference pixel value is not available; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, two reference sampling points are selected from the reference pixel on the upper side and the reference pixel on the right side respectively; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, the reference sample point is obtained according to the situation that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, or according to the situation that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is unavailable, or according to whichever of these two situations has the lower rate-distortion cost.
In an exemplary embodiment of the present disclosure, when both the reference pixel value on the left side and the reference pixel value on the right side of the current encoding block are available, the information for intra prediction may further include reference sample point construction information.
Specifically, for example, fig. 2 illustrates a schematic diagram of finding two reference samples from reference pixels on the left and top sides of the current block in the TSCPM technique, and as shown in fig. 2, when a reference pixel value on the left side of the current coding block is available and a reference pixel value on the right side is not available, two reference samples (including a chrominance component and a luminance component) are selected from the reference pixel on the top side and the reference pixel on the left side, respectively. The first position of the upper reference pixel is 2, namely the second reference sampling point of the upper reference pixel; the first position of the left reference pixel is 2, i.e. the second reference sample point of the left reference pixel; when the length (width) of the current coding block is larger than or equal to the height (height), the second position of the upper reference pixel is width-width/height +2, and the second position of the left reference pixel is height, otherwise, the second position of the upper reference pixel is width, and the second position of the left reference pixel is height-height/width + 2.
For example, fig. 9 shows a schematic diagram of reference samples in the TSCPM technique according to an exemplary embodiment of the present disclosure, and as shown in fig. 9, when the left-side pixel value of the current coding block is not available and the right-side pixel value is available, two reference samples (including a chrominance component and a luminance component) are selected from the upper-side reference pixels and the right-side reference pixels, respectively. The first position of the upper reference pixel is width-1; the first position of the right reference pixel is 2, namely the second reference sampling point of the right reference pixel; when the length of the coding block is larger than or equal to the height, the second position of the reference pixel on the upper side is width/height-2 and the second position of the reference pixel on the right side is height, otherwise, the second position of the reference pixel on the upper side is 1, namely the first reference sample point of the reference pixel on the upper side, and the second position of the reference pixel on the right side is height-height/width+2.
Furthermore, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, one possible method is: acquiring the reference sample points according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable; another possible method is: acquiring the reference sample points according to the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is unavailable; another possible method is: traversing the two conditions, namely the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable and the condition that the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, selecting the condition with the lower rate-distortion cost as the optimal condition, and writing the reference sample construction information of that condition into the code stream.
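The position formulas above translate directly into a small helper. The sketch below is illustrative only; it returns the four 1-based reference sample positions for the case where the left reference pixels are available and the right reference pixels are not, and the integer division and 1-based indexing are assumptions. The case of fig. 9, where only the right side is available, mirrors this with the positions listed above.

```python
def tscpm_sample_positions(width, height):
    # Case: left reference column available, right reference column unavailable.
    top_first, left_first = 2, 2                     # second sample of the top row / left column
    if width >= height:
        top_second = width - width // height + 2
        left_second = height
    else:
        top_second = width
        left_second = height - height // width + 2
    # Returned as (side, 1-based position) pairs.
    return [("top", top_first), ("top", top_second),
            ("left", left_first), ("left", left_second)]
```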
Furthermore, after the reference sampling points for constructing the linear model are determined, the 4 reference sampling points may be divided into two types according to the prior art, and a and b in the linear model predChroma = a × recLuma + b are calculated, where predChroma is the predicted pixel value of the chrominance component and recLuma is the reconstructed value of the luminance component.
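One common way of turning the four reference samples into the model parameters, assumed here for illustration (the disclosure itself only refers to the prior-art derivation), is to sort the samples by luma, average the two smaller and the two larger samples, and fit a line through the two averages. A floating-point sketch:

```python
def derive_linear_model(samples):
    """samples: four (recLuma, chroma) reference pairs; returns (a, b)."""
    samples = sorted(samples, key=lambda s: s[0])
    luma_lo = (samples[0][0] + samples[1][0] + 1) >> 1     # mean of the two smaller luma values
    chroma_lo = (samples[0][1] + samples[1][1] + 1) >> 1
    luma_hi = (samples[2][0] + samples[3][0] + 1) >> 1     # mean of the two larger luma values
    chroma_hi = (samples[2][1] + samples[3][1] + 1) >> 1
    if luma_hi == luma_lo:
        return 0, chroma_lo                                 # flat luma: constant chroma prediction
    a = (chroma_hi - chroma_lo) / (luma_hi - luma_lo)
    b = chroma_lo - a * luma_lo
    return a, b
```

The chroma prediction of step S802 then evaluates predChroma = a × recLuma + b at each position; a real codec would use a fixed-point version of the same computation.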
In step S802, a prediction pixel value of a chrominance component of the current coding block is calculated according to the constructed linear model, and an intra prediction mode of the current coding block is determined.
In step S803, information for intra prediction is written into the code stream. Here, the information for intra prediction includes at least information on whether an intra prediction mode is a TSCPM mode.
Specifically, when writing information for intra prediction into a code stream, one possible method is: writing the identifier of whether the TSCPM mode is the TSCPM mode into a code stream; one possible method is: and writing the TSCPM mode identification into the code stream, and writing the reference sampling point construction information into the code stream when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available.
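The second method above amounts to one unconditional flag plus one conditional flag. The sketch below is illustrative only; the write_flag() helper and the flag names are hypothetical and do not correspond to actual syntax elements.

```python
def write_tscpm_info(bitstream, is_tscpm, left_available, right_available, use_left_construction):
    bitstream.write_flag(is_tscpm)                      # identifier of whether TSCPM is used
    if is_tscpm and left_available and right_available:
        bitstream.write_flag(use_left_construction)     # reference sample construction information
```

The decoder of fig. 10 would read the same flags in the same order in step S1001.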
Fig. 10 illustrates a flowchart of a decoding method according to another exemplary embodiment of the present disclosure. The decoding method in fig. 10 may be applied to a decoder using an intra cross-component prediction technique (TSCPM).
Referring to fig. 10, in step S1001, information for intra prediction is analyzed from a code stream. Here, the information for intra prediction includes at least information on whether an intra prediction mode is a TSCPM mode.
In an exemplary embodiment of the present disclosure, when both the reference pixel value on the left side and the reference pixel value on the right side of the current encoding block are available, the information for intra prediction may further include reference sample point construction information.
Specifically, when the information for intra prediction is parsed from the code stream, one possible method is: parsing the prediction mode from the code stream; if the parsed identifier indicates the TSCPM mode, predicting the chrominance component with TSCPM, and otherwise performing intra prediction according to the prior art. Another possible method is: parsing the prediction mode from the code stream; if the parsed identifier indicates the TSCPM mode, predicting the chrominance component with TSCPM, and otherwise performing intra prediction according to the prior art; in addition, if TSCPM is used to predict the chrominance component and both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the reference sample construction information is also parsed from the code stream.
In step S1002, a reference sampling point for constructing a linear model is determined based on the availability of the reference pixels of the current coding block, and the linear model is constructed according to the determined reference sampling point.
In an exemplary embodiment of the present disclosure, in determining reference samples for constructing a linear model, two reference samples may be selected from an upper side reference pixel and a left side reference pixel, respectively, when a left side reference pixel value of a current coding block is available and a right side reference pixel value is not available; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, two reference sampling points are selected from the reference pixel on the upper side and the reference pixel on the right side respectively; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, the reference sample point is obtained according to the situation that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, or the reference sample point is obtained according to the situation that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is unavailable, or the reference sample point is obtained according to the analyzed reference sample point construction information.
Specifically, when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, two reference samples (including the chrominance component and the luminance component) are selected from the reference pixel on the upper side and the reference pixel on the left side, respectively. The first position of the upper reference pixel is 2, namely the second reference sampling point of the upper reference pixel; the first position of the left reference pixel is 2, i.e. the second reference sample point of the left reference pixel; when the length (width) of the coding block is larger than or equal to the height (height), the second position of the upper reference pixel is width-width/height +2, and the second position of the left reference pixel is height, otherwise, the second position of the upper reference pixel is width, and the second position of the left reference pixel is height-height/width + 2.
Furthermore, when the left pixel value of the current coding block is not available and the right pixel value is available, two reference samples (including a chrominance component and a luminance component) are selected from the upper reference pixels and the right reference pixels, respectively. The first position of the upper reference pixel is width-1; the first position of the right reference pixel is 2, namely the second reference sampling point of the right reference pixel; when the length of the coding block is larger than or equal to the height, the second position of the reference pixel on the upper side is width/height-2 and the second position of the reference pixel on the right side is height, otherwise, the second position of the reference pixel on the upper side is 1, namely the first reference sample point of the reference pixel on the upper side, and the second position of the reference pixel on the right side is height-height/width+2.
Furthermore, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, one possible method is: acquiring the reference sample points according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable; another possible method is: acquiring the reference sample points according to the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is unavailable; another possible method is: parsing the reference sample construction information from the code stream, which indicates which side's reference pixels are used, that is, whether the reference sample points are obtained as in the case that the left reference pixel value is available and the right reference pixel value is unavailable or as in the case that the right reference pixel value is available and the left reference pixel value is unavailable, and obtaining the reference sample points accordingly.
Furthermore, after the reference sampling points for constructing the linear model are determined, the 4 reference sampling points may be divided into two types according to the prior art, and a and b in the linear model predChroma = a × recLuma + b are calculated, where predChroma is the predicted pixel value of the chrominance component and recLuma is the reconstructed value of the luminance component.
In step S1003, a predicted pixel value of the chrominance component of the current coding block is calculated according to the constructed linear model.
In step S1004, the chroma component of the current coding block is reconstructed according to the predicted pixel value of the chroma component of the current coding block, so as to obtain a decoded image.
Fig. 11 illustrates a flowchart of an encoding method according to another exemplary embodiment of the present disclosure. The encoding method in fig. 11 can be applied to an encoder using an extended tree technique (DT).
Referring to fig. 11, in step S1101, the coding order of a prediction block in a current coding block is determined based on the partition mode of the block and the availability of reference pixels of the current coding block.
In an exemplary embodiment of the present disclosure, when the partition mode of the block is vertical partition, in determining the encoding order of the prediction block in the current coding block, the encoding order of the prediction block may be from left to right when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, the coding order of the prediction block is from right to left; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, the coding sequence of the prediction block is from left to right; when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the coding order of the prediction block is one of the following coding orders: coding order from left to right, coding order from right to left, coding order with low rate-distortion cost in left to right and right to left.
Specifically, fig. 12 illustrates a schematic diagram of a coding order of a prediction block according to an exemplary embodiment of the present disclosure, and as shown in fig. 12, when reference pixel values on the left side of a current coding block are available and reference pixel values on the right side are unavailable, the prediction block is in a left-to-right coding order when the current coding block is vertically divided. Fig. 13 illustrates a schematic diagram of a coding order of a prediction block according to another exemplary embodiment of the present disclosure, and as shown in fig. 13, when reference pixel values on the left side of a current coding block are not available and reference pixel values on the right side are available, when the current coding block is vertically divided, the prediction block is in a right-to-left coding order. Fig. 14 illustrates a schematic diagram of a coding order of a prediction block according to another exemplary embodiment of the present disclosure, and as shown in fig. 14, when a current coding block is vertically divided when neither a left reference pixel value nor a right reference pixel value of the current coding block is available, the prediction block is in a left-to-right coding order.
In addition, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, one possible implementation method is: when the current coding block is vertically divided, the prediction blocks are coded in a left-to-right order; another possible implementation method is: when the current coding block is vertically divided, the prediction blocks are coded in a right-to-left order; another possible implementation method is: comparing the rate-distortion costs of the left-to-right order and the right-to-left order, selecting the prediction block coding order with the smaller rate-distortion cost, and determining the corresponding prediction block coding order identifier.
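The order rule for vertical partitioning can be expressed compactly as below (illustrative only; rd_cost() stands in for the encoder's rate-distortion evaluation of a complete coding pass in the given order).

```python
def prediction_block_order(left_available, right_available, rd_cost=None):
    if left_available and not right_available:
        return "left_to_right"
    if right_available and not left_available:
        return "right_to_left"
    if not left_available and not right_available:
        return "left_to_right"
    # Both sides available: fix an order, or compare the two rate-distortion costs.
    if rd_cost is None:
        return "left_to_right"
    return min(("left_to_right", "right_to_left"), key=rd_cost)
```

Step S1102 applies the same rule to the transform blocks inside a prediction block, using the availability of the prediction block's own left and right reference pixels.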
In step S1102, the coding order of the transform blocks in the prediction block is determined based on the partition mode of the block and the availability of reference pixels of the current coding block.
In an exemplary embodiment of the present disclosure, when the partition mode of the block is vertical partition, in determining the coding order of the transform blocks in the prediction block, the coding order of the transform blocks in the prediction block may be from left to right when reference pixel values on the left side of the current prediction block are available and reference pixel values on the right side are not available; when reference pixel values on the left side of the current prediction block are not available and reference pixel values on the right side are available, the coding order of the transform blocks in the prediction block is from right to left; when neither the left reference pixel value nor the right reference pixel value of the current prediction block is available, the coding order of the transform blocks in the prediction block is from left to right; when both the left reference pixel value and the right reference pixel value of the current prediction block are available, the coding order of the transform block in the prediction block is one of the following coding orders: coding order from left to right, coding order from right to left, coding order with low rate-distortion cost in left to right and right to left.
Specifically, fig. 15 illustrates a schematic diagram of a coding order of transform blocks in a prediction block according to another exemplary embodiment of the present disclosure, and as shown in fig. 15, when a reference pixel value on the left side of a current prediction block is available and a reference pixel value on the right side is not available, the transform blocks are in a coding order from left to right when the current prediction block is vertically divided. Fig. 16 illustrates a schematic diagram of a coding order of transform blocks in a prediction block according to another exemplary embodiment of the present disclosure, and as shown in fig. 16, when a left reference pixel value of a current prediction block is not available and a right reference pixel value is available, the transform blocks are in a right-to-left coding order when the current prediction block is vertically divided. Fig. 17 illustrates a schematic diagram of a coding order of transform blocks in a prediction block according to another exemplary embodiment of the present disclosure, and as shown in fig. 17, when a reference pixel value on the left side and a reference pixel value on the right side of a current prediction block are not available, the transform blocks are subjected to a coding order from left to right when the current prediction block is vertically divided.
Furthermore, when both the left reference pixel value and the right reference pixel value of the current prediction block are available, one possible implementation method is: when the current prediction block is vertically divided, the transform blocks are coded in a left-to-right order; another possible implementation method is: when the current prediction block is vertically divided, the transform blocks are coded in a right-to-left order; another possible implementation method is: traversing the left-to-right and right-to-left coding orders of the transform blocks, selecting the coding order with the lower rate-distortion cost, and determining the corresponding transform block coding order identifier.
In step S1103, extended tree DT mode information is written into the code stream based on the determined coding order of the prediction block in the current coding block and the coding order of the transform block in the prediction block, wherein the DT mode information includes at least information about the partition mode of the block.
In an exemplary embodiment of the present disclosure, when both reference pixel values on the left side and reference pixel values on the right side of the current coding block are available, the DT mode information may further include information on the coding order of the prediction block and/or information on the coding order of the transform blocks in the prediction block.
Specifically, when the DT mode information of the extended tree is written into the code stream, one possible implementation method is: writing the division mode adopted by the DT into a code stream; another possible implementation method is: writing the division mode adopted by the DT into a code stream, writing the coding sequence identifier of the prediction block into the code stream under the condition that the reference pixel on the left side and the reference pixel on the right side of the prediction block are both available, and writing the coding sequence identifier of the transformation block in the prediction block into the code stream under the condition that the reference pixel on the left side and the reference pixel on the right side of the transformation block are both available.
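The second writing method above can be sketched as one partition-mode codeword plus two conditional order flags. The write_partition()/write_flag() helpers, the flag semantics and the pb/tb descriptors are hypothetical and are shown only to make the conditional signalling explicit.

```python
def write_dt_info(bitstream, partition_mode, pb, tb):
    bitstream.write_partition(partition_mode)               # partition mode adopted by DT
    if pb.left_available and pb.right_available:
        bitstream.write_flag(pb.order == "right_to_left")   # prediction block coding order identifier
    if tb.left_available and tb.right_available:
        bitstream.write_flag(tb.order == "right_to_left")   # transform block coding order identifier
```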
Fig. 18 illustrates a flowchart of a decoding method according to another exemplary embodiment of the present disclosure. The decoding method in fig. 18 can be applied to a decoder using the extended tree technique (DT).
Referring to fig. 18, DT mode information including at least information about a partition mode of a block is parsed from a codestream in step S1801.
In exemplary embodiments of the present disclosure, the DT mode information may further include information on a decoding order of the prediction block and/or information on a decoding order of the transform blocks in the prediction block.
Specifically, when the DT mode information is parsed from the code stream, one possible implementation method is: resolving the division mode adopted by the DT from the code stream; another possible implementation method is: and analyzing the division mode adopted by the DT from the code stream, analyzing the decoding sequence identifier of the prediction block from the code stream when the reference pixel on the left side and the reference pixel on the right side of the prediction block are both available, and analyzing the decoding sequence identifier of the transformation block in the prediction block from the code stream when the reference pixel on the left side and the reference pixel on the right side of the transformation block are both available.
In step S1802, a decoding order of a prediction block in a current coding block is determined based on a partition mode of the block and availability of reference pixels of the current coding block.
In an exemplary embodiment of the present disclosure, when the partition mode of the block is vertical partition, in determining the decoding order of the prediction block in the current coding block, the decoding order of the prediction block may be from left to right when reference pixel values on the left side of the current coding block are available and reference pixel values on the right side are not available; when reference pixel values on the left side of the current coding block are not available and reference pixel values on the right side are available, the decoding order of the prediction block is from right to left; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, the decoding order of the prediction block is from left to right; when both the left reference pixel value and the right reference pixel value of the current coding block are available, the decoding order of the prediction block is one of the following decoding orders: a decoding order from left to right, a decoding order from right to left, a decoding order of the prediction block determined according to the information on the decoding order of the prediction block.
Specifically, when reference pixel values on the left side of the current coding block are available and reference pixel values on the right side are not available, the prediction block is in a left-to-right decoding order when the current coding block is vertically divided. When reference pixel values on the left side of the current coding block are not available and reference pixel values on the right side are available, then the prediction block is in a right-to-left decoding order when the current coding block is vertically divided. When the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, the prediction block is subjected to a left-to-right decoding sequence when the current coding block is divided vertically. When the reference pixel value at the left side and the reference pixel value at the right side of the current coding block are both available, one possible implementation method is: when the current coding block is divided vertically, the prediction block is subjected to a decoding sequence from left to right; one possible implementation method is: when the current coding block is divided vertically, the prediction block is subjected to a decoding sequence from right to left; another possible implementation method is: according to the prediction block decoding order identification, a decoding order from left to right or from right to left is selected.
In step S1803, a decoding order of the transform blocks in the prediction block is determined based on the partition mode of the block and the availability of reference pixels of the current coding block.
In an exemplary embodiment of the present disclosure, when the partition mode of the block is vertical partition, in determining the decoding order of the transform blocks in the prediction block, the decoding order of the transform blocks in the prediction block may be from left to right when reference pixel values on the left side of the current prediction block are available and reference pixel values on the right side are not available; when reference pixel values on the left side of the current prediction block are not available and reference pixel values on the right side are available, the decoding order of the transform blocks in the prediction block is from right to left; when neither the left reference pixel value nor the right reference pixel value of the current prediction block is available, the decoding order of the transform blocks in the prediction block is from left to right; when both the left reference pixel value and the right reference pixel value of the current prediction block are available, the decoding order of the transform blocks in the prediction block is one of the following decoding orders: a decoding order from left to right, a decoding order from right to left, a decoding order of the transform blocks in the prediction block determined according to information on a decoding order of the transform blocks in the prediction block.
Specifically, when reference pixel values on the left side of the current prediction block are available and reference pixel values on the right side are not available, the transform block is in decoding order from left to right when the current prediction block is vertically divided. When reference pixel values on the left side of the current prediction block are not available and reference pixel values on the right side are available, the transform block is in a decoding order from right to left when the current prediction block is vertically divided. When both the left reference pixel value and the right reference pixel value of the current prediction block are not available, the transform block is subjected to a left-to-right decoding order when the current prediction block is vertically divided. When both the left reference pixel value and the right reference pixel value of the current prediction block are available, one possible implementation is: when the current prediction block is divided vertically, the transformation blocks are subjected to decoding sequence from left to right; one possible implementation method is: when the current prediction block is divided vertically, the transformation blocks are subjected to decoding sequence from right to left; another possible implementation method is: and selecting a decoding order from left to right or from right to left according to the transform block decoding order identification.
In step S1804, decoding is performed according to the determined decoding order of the prediction block in the current coding block and the determined decoding order of the transform block in the prediction block.
The encoding and decoding method according to the exemplary embodiment of the present disclosure has been described above with reference to fig. 1 to 18. Hereinafter, a codec device and units thereof according to an exemplary embodiment of the present disclosure will be described with reference to fig. 19 to 24.
Fig. 19 illustrates a block diagram of an encoding apparatus according to an exemplary embodiment of the present disclosure.
Referring to fig. 19, the encoding apparatus includes an intra prediction unit 191 and a first encoding unit 192.
The intra prediction unit 191 is configured to intra predict the current encoding block based on the availability of reference pixels for the current encoding block.
In an exemplary embodiment of the present disclosure, the intra prediction unit 191 may be configured to: traversing all intra-frame prediction modes for the current coding block to obtain a prediction pixel value corresponding to each intra-frame prediction mode, and carrying out intra-frame prediction filtering on the prediction pixel values to obtain a final prediction pixel value corresponding to each intra-frame prediction mode; calculating the rate-distortion cost corresponding to each intra-frame prediction mode according to the final prediction pixel values; and taking the intra-frame prediction mode with the lowest rate-distortion cost as the intra-frame prediction mode of the current coding block.
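The mode decision described above is a standard rate-distortion loop; the sketch below is illustrative, with the prediction, filtering, distortion and bit-cost functions supplied from outside (they are placeholders, not part of this disclosure).

```python
def choose_intra_mode(block, candidate_modes, predict, filter_pred, distortion, bit_cost, lambda_rd):
    best_mode, best_cost = None, float("inf")
    for mode in candidate_modes:
        pred = predict(block, mode)               # prediction pixel values for this mode
        pred = filter_pred(block, pred, mode)     # availability-aware intra prediction filtering
        cost = distortion(block, pred) + lambda_rd * bit_cost(block, mode)
        if cost < best_cost:
            best_mode, best_cost = mode, cost
    return best_mode
```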
In an exemplary embodiment of the present disclosure, the intra prediction unit 191 may be further configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
In an exemplary embodiment of the present disclosure, the intra prediction unit 191 may be further configured to: when the intra-frame prediction mode of the current coding block is not the angle prediction mode, weighting the prediction pixel value on the upper side of the current coding block with the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value on the upper side of the current coding block, and weighting the prediction pixel value on the right side of the current coding block with the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the included angle between the current coding block and the horizontal direction or the vertical direction.
In an exemplary embodiment of the present disclosure, the intra prediction unit 191 may be further configured to: when the included angle between the current coding block and the horizontal direction is smaller than a first angle, weighting the prediction pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final prediction pixel value; and when the included angle between the current coding block and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain a final prediction pixel value.
In an exemplary embodiment of the present disclosure, the intra prediction unit 191 may be further configured to: performing intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, or performing intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, or performing intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block, or selecting the intra-frame filtering mode with the lowest rate-distortion cost from the following intra-frame filtering modes to perform intra-frame prediction filtering: an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, and an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel values on the left side and the right side of the current coding block.
In an exemplary embodiment of the present disclosure, the intra prediction unit 191 may be further configured to: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the left upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side of the current coding block to obtain a final prediction pixel value on the left upper side of the current coding block, weighting a prediction pixel value on the left side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side to obtain a final prediction pixel value on the left side of the current coding block, weighting a prediction pixel value on the right upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode, determining whether the included angle between the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle; when the included angle between the current coding block and the horizontal direction is smaller than a third angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final predicted pixel value on the upper side of the current coding block; and when the included angle between the current coding block and the vertical direction is smaller than a fourth angle, weighting the predicted pixel value on the left side of the current coding block and the reference pixel value on the left side to obtain the final predicted pixel value on the left side of the current coding block, and weighting the predicted pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain the final predicted pixel value on the right side of the current coding block.
The first encoding unit 192 is configured to write information for intra prediction, which includes at least information on an intra prediction mode, into a code stream.
In an exemplary embodiment of the present disclosure, when both the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are available, the information for intra prediction may further include information on an intra filtering mode.
Fig. 20 illustrates a block diagram of a decoding apparatus according to an exemplary embodiment of the present disclosure.
Referring to fig. 20, the decoding apparatus includes a first parsing unit 201, a first calculation unit 202, and a first decoding unit 203.
The first parsing unit 201 is configured to parse information for intra prediction, which includes at least information on an intra prediction mode, from a code stream.
In an exemplary embodiment of the present disclosure, the information for intra prediction further includes information on an intra filtering mode when both reference pixel values on the left side and reference pixel values on the right side of the current coding block are available.
The first calculation unit 202 is configured to calculate a predicted pixel value of the current encoding block from the parsed information for intra prediction based on the availability of reference pixels of the current encoding block.
In an exemplary embodiment of the present disclosure, the first computing unit 202 may be configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block; and when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are not available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering according to the condition that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available or the condition that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is not available.
In an exemplary embodiment of the present disclosure, the first computing unit 202 may be configured to: when the intra-frame prediction mode of the current coding block is not the angle prediction mode, weighting a prediction pixel value on the upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating a final prediction pixel value according to the included angle between the current coding block and the horizontal direction or the vertical direction.
In an exemplary embodiment of the present disclosure, the first computing unit 202 may be configured to: when the included angle between the current coding block and the horizontal direction is smaller than a first angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain the final predicted pixel value on the upper side of the current coding block; and when the included angle between the current coding block and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side of the current coding block to obtain the final prediction pixel value on the right side of the current coding block.
In an exemplary embodiment of the present disclosure, the first computing unit 202 may be configured to: carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, or carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, or carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block, or carrying out intra-frame prediction filtering according to the parsed information on the intra-frame filtering mode.
In an exemplary embodiment of the present disclosure, the first computing unit 202 may be configured to: when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting a prediction pixel value on the left upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side to obtain a final prediction pixel value on the left upper side of the current coding block, weighting a prediction pixel value on the left side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the left side to obtain a final prediction pixel value on the left side of the current coding block, weighting a prediction pixel value on the right upper side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right upper side of the current coding block, and weighting a prediction pixel value on the right side of the current coding block, a reference pixel value on the upper side and a reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block; when the intra-frame prediction mode of the current coding block is an angle prediction mode, determining whether the included angle between the current coding block and the horizontal direction or the vertical direction is smaller than a preset angle; when the included angle between the current coding block and the horizontal direction is smaller than a third angle, weighting the predicted pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final predicted pixel value on the upper side of the current coding block; and when the included angle between the current coding block and the vertical direction is smaller than a fourth angle, weighting the predicted pixel value on the left side of the current coding block and the reference pixel value on the left side to obtain the final predicted pixel value on the left side of the current coding block, and weighting the predicted pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain the final predicted pixel value on the right side of the current coding block.
The first decoding unit 203 is configured to reconstruct the image according to the predicted pixel values, resulting in a decoded image.
Fig. 21 illustrates a block diagram of an encoding apparatus according to another exemplary embodiment of the present disclosure.
Referring to fig. 21, the encoding apparatus includes a model construction unit 211, a mode determination unit 212, and a second encoding unit 213.
The model construction unit 211 is configured to determine reference sample points for constructing the linear model based on the availability of reference pixels of the current encoding block and construct the linear model from the determined reference sample points.
In an exemplary embodiment of the present disclosure, the model building unit 211 may be configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, selecting two reference sampling points from the reference pixel on the upper side and the reference pixel on the left side respectively; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, selecting two reference sampling points from the reference pixel on the upper side and the reference pixel on the right side respectively; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, obtaining the reference sample point according to the situation that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, or according to the situation that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is unavailable, or according to whichever of these two situations has the lower rate-distortion cost.
The mode determination unit 212 is configured to calculate a prediction pixel value of a chroma component of the current coding block according to the constructed linear model and determine an intra prediction mode of the current coding block.
The second encoding unit 213 is configured to write information for intra prediction, which includes at least information on whether the intra prediction mode is a TSCPM mode, into the code stream.
In an exemplary embodiment of the present disclosure, when both the reference pixel value on the left side and the reference pixel value on the right side of the current encoding block are available, the information for intra prediction may further include reference sample point construction information.
Fig. 22 illustrates a block diagram of a decoding apparatus according to another exemplary embodiment of the present disclosure.
Referring to fig. 22, the decoding apparatus includes a second parsing unit 221, a model building unit 222, a second calculating unit 223, and a second decoding unit 224.
The second parsing unit 221 is configured to parse information for intra prediction, which includes at least information on whether an intra prediction mode is a TSCPM mode, from a code stream.
In an exemplary embodiment of the present disclosure, when both the reference pixel value on the left side and the reference pixel value on the right side of the current encoding block are available, the information for intra prediction may further include reference sample point construction information.
The model construction unit 222 is configured to determine reference sample points for constructing the linear model based on the availability of reference pixels of the currently encoded block, and construct the linear model from the determined reference sample points.
In an exemplary embodiment of the present disclosure, the model building unit 222 may be configured to: when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is not available, selecting two reference sampling points from the reference pixel on the upper side and the reference pixel on the left side respectively; when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side is available, two reference sampling points are selected from the reference pixel on the upper side and the reference pixel on the right side respectively; when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, the reference sample point is obtained according to the situation that the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, or the reference sample point is obtained according to the situation that the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is unavailable, or the reference sample point is obtained according to the analyzed reference sample point construction information.
The second calculation unit 223 is configured to calculate a predicted pixel value of a chrominance component of the current coding block according to the constructed linear model.
The second decoding unit 224 is configured to reconstruct the chroma component of the current coding block according to the predicted pixel value of the chroma component of the current coding block, resulting in a decoded image.
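As a rough, non-normative illustration of the flow through the model building unit 222, the second calculating unit 223 and the second decoding unit 224, the sketch below fits a linear model chroma ≈ alpha * luma + beta over co-located reference sample pairs and applies it to the reconstructed luma samples of the block. The least-squares fit, the clipping, and all names (derive_linear_model, predict_chroma) are assumptions standing in for the derivation actually specified for the TSCPM mode.

```python
# Illustrative sketch only: derive a luma-to-chroma linear model from
# reference sample pairs and use it to predict the chroma component.
def derive_linear_model(ref_luma, ref_chroma):
    """Fit chroma ~ alpha * luma + beta over the reference sample pairs."""
    n = len(ref_luma)
    sum_l = sum(ref_luma)
    sum_c = sum(ref_chroma)
    sum_ll = sum(l * l for l in ref_luma)
    sum_lc = sum(l * c for l, c in zip(ref_luma, ref_chroma))
    denom = n * sum_ll - sum_l * sum_l
    if denom == 0:                      # flat luma: fall back to a DC offset
        return 0.0, sum_c / n
    alpha = (n * sum_lc - sum_l * sum_c) / denom
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta

def predict_chroma(rec_luma_block, alpha, beta, bit_depth=8):
    """Apply the linear model to each reconstructed luma sample and clip."""
    max_val = (1 << bit_depth) - 1
    return [[min(max_val, max(0, round(alpha * l + beta))) for l in row]
            for row in rec_luma_block]

# usage example with made-up sample values
alpha, beta = derive_linear_model([60, 80, 120, 140], [100, 110, 130, 140])
chroma_pred = predict_chroma([[70, 90], [110, 130]], alpha, beta)
```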
Fig. 23 illustrates a block diagram of an encoding apparatus according to another exemplary embodiment of the present disclosure.
Referring to fig. 23, the encoding apparatus includes a first order determination unit 231, a second order determination unit 232, and a third encoding unit 233.
The first order determination unit 231 is configured to determine an encoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block.
In an exemplary embodiment of the present disclosure, the first order determination unit 231 may be configured such that, when the partition mode of the block is vertical partition: when the reference pixel values on the left side of the current coding block are available and the reference pixel values on the right side are unavailable, the coding order of the prediction blocks is from left to right; when the reference pixel values on the left side of the current coding block are unavailable and the reference pixel values on the right side are available, the coding order of the prediction blocks is from right to left; when neither the left-side nor the right-side reference pixel values of the current coding block are available, the coding order of the prediction blocks is from left to right; and when both the left-side and the right-side reference pixel values of the current coding block are available, the coding order of the prediction blocks is one of the following: left to right, right to left, or whichever of these two orders has the lower rate-distortion cost, as sketched below.
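Purely as an illustrative sketch of the rule stated above (a vertical partition is assumed), the snippet below maps neighbour availability to a prediction-block coding order; the names choose_coding_order and rd_cost_for_order are invented here, and the rate-distortion comparison is reduced to a caller-supplied callback. The same pattern would apply one level down to the ordering of transform blocks within a prediction block, described next.

```python
# Illustrative sketch only: pick the coding order of the prediction blocks in
# a vertically partitioned coding block from neighbour availability.
LEFT_TO_RIGHT = "left_to_right"
RIGHT_TO_LEFT = "right_to_left"

def choose_coding_order(left_available, right_available, rd_cost_for_order=None):
    if left_available and not right_available:
        return LEFT_TO_RIGHT
    if right_available and not left_available:
        return RIGHT_TO_LEFT
    if not left_available and not right_available:
        return LEFT_TO_RIGHT
    # both sides available: a fixed choice, or whichever order costs less
    if rd_cost_for_order is None:
        return LEFT_TO_RIGHT
    return min((LEFT_TO_RIGHT, RIGHT_TO_LEFT), key=rd_cost_for_order)
```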
The second order determination unit 232 is configured to determine the coding order of the transform blocks in the prediction block based on the partition mode of the block and the availability of reference pixels of the current coding block.
In an exemplary embodiment of the present disclosure, the second order determination unit 232 may be configured such that, when the partition mode of the block is vertical partition: when the reference pixel values on the left side of the current prediction block are available and the reference pixel values on the right side are unavailable, the coding order of the transform blocks in the prediction block is from left to right; when the reference pixel values on the left side of the current prediction block are unavailable and the reference pixel values on the right side are available, the coding order of the transform blocks in the prediction block is from right to left; when neither the left-side nor the right-side reference pixel values of the current prediction block are available, the coding order of the transform blocks in the prediction block is from left to right; and when both the left-side and the right-side reference pixel values of the current prediction block are available, the coding order of the transform blocks in the prediction block is one of the following: left to right, right to left, or whichever of these two orders has the lower rate-distortion cost.
The third encoding unit 233 is configured to write extended tree (DT) mode information, which includes at least information on the partition mode of the block, into the code stream based on the determined coding order of the prediction blocks in the current coding block and the coding order of the transform blocks in the prediction block.
In an exemplary embodiment of the present disclosure, when both reference pixel values on the left side and reference pixel values on the right side of the current coding block are available, the DT mode information may further include information on the coding order of the prediction block and/or information on the coding order of the transform blocks in the prediction block.
Fig. 24 illustrates a block diagram of a decoding apparatus according to another exemplary embodiment of the present disclosure.
Referring to fig. 24, the decoding apparatus includes a third parsing unit 241, a third order determination unit 242, a fourth order determination unit 243, and a third decoding unit 244.
The third parsing unit 241 is configured to parse DT mode information from the codestream, wherein the DT mode information includes at least information about a partition mode of a block.
In exemplary embodiments of the present disclosure, the DT mode information further includes information on a decoding order of the prediction block and/or information on a decoding order of the transform blocks in the prediction block.
The third order determination unit 242 is configured to determine a decoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block.
In an exemplary embodiment of the present disclosure, the third order determination unit 242 may be configured such that, when the partition mode of the block is vertical partition: when the reference pixel values on the left side of the current coding block are available and the reference pixel values on the right side are unavailable, the decoding order of the prediction blocks is from left to right; when the reference pixel values on the left side of the current coding block are unavailable and the reference pixel values on the right side are available, the decoding order of the prediction blocks is from right to left; when neither the left-side nor the right-side reference pixel values of the current coding block are available, the decoding order of the prediction blocks is from left to right; and when both the left-side and the right-side reference pixel values of the current coding block are available, the decoding order of the prediction blocks is one of the following: left to right, right to left, or the decoding order determined according to the parsed information on the decoding order of the prediction block.
The fourth order determination unit 243 is configured to determine a decoding order of the transform blocks in the prediction block based on a partition mode of the block and availability of reference pixels of the current encoding block.
In an exemplary embodiment of the present disclosure, the fourth order determination unit 243 may be configured such that, when the partition mode of the block is vertical partition: when the reference pixel values on the left side of the current prediction block are available and the reference pixel values on the right side are unavailable, the decoding order of the transform blocks in the prediction block is from left to right; when the reference pixel values on the left side of the current prediction block are unavailable and the reference pixel values on the right side are available, the decoding order of the transform blocks in the prediction block is from right to left; when neither the left-side nor the right-side reference pixel values of the current prediction block are available, the decoding order of the transform blocks in the prediction block is from left to right; and when both the left-side and the right-side reference pixel values of the current prediction block are available, the decoding order of the transform blocks in the prediction block is one of the following: left to right, right to left, or the decoding order determined according to the parsed information on the decoding order of the transform blocks in the prediction block.
The third decoding unit 244 is configured to decode according to the determined decoding order of the prediction block in the current coding block and the determined decoding order of the transform blocks in the prediction block.
Further, according to an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed, implements an encoding method or a decoding method according to an exemplary embodiment of the present disclosure.
In an exemplary embodiment of the disclosure, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: performing intra-frame prediction on the current coding block based on the availability of the reference pixels of the current coding block; and writing information for intra-frame prediction into the code stream, wherein the information for intra-frame prediction at least comprises information about an intra-frame prediction mode.
In an exemplary embodiment of the disclosure, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: parsing information for intra prediction from a code stream, wherein the information for intra prediction includes at least information about an intra prediction mode; calculating a prediction pixel value of the current coding block according to the analyzed information for intra-frame prediction based on the availability of the reference pixels of the current coding block; and reconstructing the image according to the predicted pixel value to obtain a decoded image.
In an exemplary embodiment of the disclosure, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point; calculating a prediction pixel value of a chrominance component of the current coding block according to the constructed linear model, and determining an intra-frame prediction mode of the current coding block; writing information for intra prediction into the code stream, wherein the information for intra prediction includes at least information on whether an intra prediction mode is a TSCPM mode.
In an exemplary embodiment of the disclosure, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: parsing information for intra prediction from a code stream, the information for intra prediction including at least information on whether an intra prediction mode is a TSCPM mode; determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point; calculating a prediction pixel value of a chrominance component of a current coding block according to the constructed linear model; and reconstructing the chroma component of the current coding block according to the predicted pixel value of the chroma component of the current coding block to obtain a decoded image.
In an exemplary embodiment of the disclosure, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: determining the coding sequence of a prediction block in a current coding block based on the partition mode of the block and the availability of reference pixels of the current coding block; determining a coding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block; and writing the DT mode information into the code stream based on the determined coding sequence of a prediction block in the current coding block and the coding sequence of a transformation block in the prediction block, wherein the DT mode information at least comprises information about the partitioning mode of the block.
In an exemplary embodiment of the disclosure, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: parsing DT mode information from a code stream, wherein the DT mode information at least comprises information about a partitioning mode of a block; determining a decoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block; determining a decoding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block; and decoding according to the determined decoding order of the prediction block in the current coding block and the determined decoding order of the transform blocks in the prediction block.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), and the like, or any suitable combination of the foregoing. The computer readable storage medium may be embodied in any device; it may also exist separately without being assembled into that device.
The encoding and decoding apparatus according to the exemplary embodiment of the present disclosure has been described above with reference to fig. 19 to 24. Next, a computing device according to an exemplary embodiment of the present disclosure is described with reference to fig. 25.
Fig. 25 shows a schematic diagram of a computing device according to an example embodiment of the present disclosure.
Referring to fig. 25, the computing device 25 according to an exemplary embodiment of the present disclosure includes a memory 251 and a processor 252, the memory 251 having stored thereon a computer program that, when executed by the processor 252, implements an encoding method or a decoding method according to an exemplary embodiment of the present disclosure.
In an exemplary embodiment of the disclosure, the computer program, when executed by the processor 252, may implement the steps of: performing intra-frame prediction on the current coding block based on the availability of the reference pixels of the current coding block; and writing information for intra-frame prediction into the code stream, wherein the information for intra-frame prediction at least comprises information about an intra-frame prediction mode.
In an exemplary embodiment of the disclosure, the computer program, when executed by the processor 252, may implement the steps of: parsing information for intra prediction from a code stream, wherein the information for intra prediction includes at least information about an intra prediction mode; calculating a prediction pixel value of the current coding block according to the analyzed information for intra-frame prediction based on the availability of the reference pixels of the current coding block; and reconstructing the image according to the predicted pixel value to obtain a decoded image.
In an exemplary embodiment of the disclosure, the computer program, when executed by the processor 252, may implement the steps of: determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point; calculating a prediction pixel value of a chrominance component of the current coding block according to the constructed linear model, and determining an intra-frame prediction mode of the current coding block; writing information for intra prediction into the code stream, wherein the information for intra prediction includes at least information on whether an intra prediction mode is a TSCPM mode.
In an exemplary embodiment of the disclosure, the computer program, when executed by the processor 252, may implement the steps of: parsing information for intra prediction from a code stream, the information for intra prediction including at least information on whether an intra prediction mode is a TSCPM mode; determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point; calculating a prediction pixel value of a chrominance component of a current coding block according to the constructed linear model; and reconstructing the chroma component of the current coding block according to the predicted pixel value of the chroma component of the current coding block to obtain a decoded image.
In an exemplary embodiment of the disclosure, the computer program, when executed by the processor 252, may implement the steps of: determining the coding sequence of a prediction block in a current coding block based on the partition mode of the block and the availability of reference pixels of the current coding block; determining a coding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block; and writing the DT mode information into the code stream based on the determined coding sequence of a prediction block in the current coding block and the coding sequence of a transformation block in the prediction block, wherein the DT mode information at least comprises information about the partitioning mode of the block.
In an exemplary embodiment of the disclosure, the computer program, when executed by the processor 252, may implement the steps of: parsing DT mode information from a code stream, wherein the DT mode information at least comprises information about a partitioning mode of a block; determining a decoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block; determining a decoding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block; and decoding according to the determined decoding order of the prediction block in the current coding block and the determined decoding order of the transformation block in the prediction block.
The computing device illustrated in fig. 25 is only one example and should not impose any limitations on the functionality or scope of use of embodiments of the disclosure.
The encoding and decoding methods and apparatuses according to exemplary embodiments of the present disclosure have been described above with reference to fig. 1 to 25. However, it should be understood that the encoding and decoding apparatuses and the units thereof shown in fig. 19 to 24 may each be configured as software, hardware, firmware, or any combination thereof to perform a specific function; that the computing device shown in fig. 25 is not limited to the components shown above, and components may be added or deleted as needed; and that the above components may also be combined.
According to the encoding and decoding methods and apparatuses of the exemplary embodiments of the present disclosure, a flexible coding order is adopted and the spatial reference information is extended: the intra prediction filtering technique (IPF), the cross-component prediction technique (TSCPM) and the extended tree technique (DT) are designed according to the availability of the neighbouring reference pixels on the left and right sides of the image block currently being encoded or decoded, so that the information of the surrounding reference pixels is used to the greatest extent and the efficiency of intra coding is improved. When the left-side pixels and the right-side pixels of the current image block are available, these intra prediction techniques are consistent with the prior art, so that compatibility among the techniques is maintained to the greatest extent. When the right-side pixels of the image block are available, the IPF technique uses the right-side and upper pixels as the filtering pixels, that is, the right-side and upper pixels are weighted with the prediction values to obtain new prediction values. The TSCPM technique derives the linear model between the luminance component and the chrominance component using the right-side and upper pixels, and predicts the chrominance component based on that linear model. For the DT technique, when the right-side pixels are available, the prediction blocks and the transform blocks are encoded in order from right to left. In this way, even when the right-to-left coding order is adopted, the already encoded or decoded pixel information around the image block can still be used to the greatest extent, thereby improving intra prediction efficiency and, in turn, the performance of video encoding.
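To make the IPF weighting described above concrete, the sketch below blends each predicted sample with the reconstructed reference pixels above and to the right of the block when the right-side neighbours are available. The particular weights and their decay toward the block interior are placeholders chosen for readability, not the filter coefficients defined in the disclosure.

```python
# Illustrative sketch only: intra prediction filtering (IPF) with upper and
# right-side reference pixels; weights fade towards the block interior.
def ipf_filter_right(pred, top_ref, right_ref):
    """pred: H x W predicted block; top_ref: W samples above the block;
    right_ref: H samples in the column to the right of the block."""
    height, width = len(pred), len(pred[0])
    out = [row[:] for row in pred]
    for y in range(height):
        for x in range(width):
            w_top = max(0, 8 - 2 * y) / 16.0                  # decays with distance from the top row
            w_right = max(0, 8 - 2 * (width - 1 - x)) / 16.0  # decays with distance from the right edge
            w_pred = 1.0 - w_top - w_right
            out[y][x] = round(w_pred * pred[y][x]
                              + w_top * top_ref[x]
                              + w_right * right_ref[y])
    return out
```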
While the present disclosure has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the following claims.

Claims (20)

1. An encoding method, comprising:
performing intra-frame prediction on the current coding block based on the availability of the reference pixels of the current coding block;
writing information for intra prediction, which includes at least information on an intra prediction mode, into a code stream.
2. The encoding method of claim 1, wherein intra-predicting a current coding block comprises:
traversing all intra-frame prediction modes for the current coding block to obtain a prediction pixel value corresponding to each intra-frame prediction mode, and carrying out intra-frame prediction filtering on the prediction pixel value to obtain a final prediction pixel value corresponding to each intra-frame prediction mode;
calculating the rate distortion cost corresponding to each intra-frame prediction mode according to the final prediction pixel value;
and according to the rate distortion cost of each intra-frame prediction mode, taking the intra-frame prediction mode with the lowest rate distortion cost as the intra-frame prediction mode of the current coding block.
3. The encoding method according to claim 2, wherein the step of intra prediction filtering the prediction pixel values comprises:
when the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block;
when the reference pixel value on the left side of the current coding block is unavailable and the reference pixel value on the right side of the current coding block is available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block;
when the reference pixel value on the left side and the reference pixel value on the right side of the current coding block are both available, carrying out intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and/or the reference pixel value on the right side of the current coding block;
and when neither the reference pixel value on the left side nor the reference pixel value on the right side of the current coding block is available, filling the reference pixel value on the left side or the reference pixel value on the right side, and performing intra-frame prediction filtering as in the case where the reference pixel value on the left side of the current coding block is available and the reference pixel value on the right side is unavailable, or as in the case where the reference pixel value on the right side of the current coding block is available and the reference pixel value on the left side is unavailable.
4. The encoding method of claim 3, wherein the intra prediction filtering according to the intra prediction mode of the current coding block and the reference pixel value on the right side of the current coding block comprises:
when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting the prediction pixel value on the upper side of the current coding block, the prediction pixel value on the right side, the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value;
and when the intra-frame prediction mode of the current coding block is an angle prediction mode, calculating the final prediction pixel value according to the angle between the prediction direction of the current coding block and the horizontal or vertical direction.
5. The encoding method as claimed in claim 4, wherein the step of calculating the final prediction pixel value according to the angle between the prediction direction of the current coding block and the horizontal or vertical direction comprises:
when the angle between the prediction direction and the horizontal direction is smaller than a first angle, weighting the prediction pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final prediction pixel value;
and when the angle between the prediction direction and the vertical direction is smaller than a second angle, weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain a final prediction pixel value.
6. The encoding method of claim 3, wherein the intra prediction filtering according to the intra prediction mode of the current coding block, the reference pixel value at the left side of the current coding block, and/or the reference pixel value at the right side of the current coding block comprises:
intra prediction filtering is performed according to the intra prediction mode of the current coding block and the reference pixel value at the left side of the current coding block,
or, the intra-frame prediction filtering is carried out according to the intra-frame prediction mode of the current coding block and the reference pixel value at the right side of the current coding block,
or, the intra-frame prediction filtering is carried out according to the intra-frame prediction mode of the current coding block, the reference pixel value at the left side of the current coding block and the reference pixel value at the right side of the current coding block,
or selecting, from the following intra-frame filtering modes, the intra-frame filtering mode with the lowest rate distortion cost for intra-frame prediction filtering: an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the left side of the current coding block, an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block and the reference pixel value on the right side of the current coding block, and an intra-frame filtering mode that performs intra-frame prediction filtering according to the intra-frame prediction mode of the current coding block, the reference pixel value on the left side of the current coding block and the reference pixel value on the right side of the current coding block.
7. The encoding method of claim 6, wherein the intra prediction filtering according to the intra prediction mode of the current coding block, the reference pixel value at the left side of the current coding block, and the reference pixel value at the right side of the current coding block comprises:
when the intra-frame prediction mode of the current coding block is not an angle prediction mode, weighting the prediction pixel value on the upper left side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the left side of the current coding block to obtain a final prediction pixel value on the upper left side of the current coding block, weighting the prediction pixel value on the left side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the left side to obtain a final prediction pixel value on the left side of the current coding block, weighting the prediction pixel value on the upper right side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value on the upper right side of the current coding block, and weighting the prediction pixel value on the right side of the current coding block, the reference pixel value on the upper side and the reference pixel value on the right side to obtain a final prediction pixel value on the right side of the current coding block;
when the intra-frame prediction mode of the current coding block is an angle prediction mode, determining whether the angle between the prediction direction of the current coding block and the horizontal or vertical direction is smaller than a preset angle;
when the angle between the prediction direction and the horizontal direction is smaller than a third angle, weighting the prediction pixel value on the upper side of the current coding block and the reference pixel value on the upper side to obtain a final prediction pixel value on the upper side of the current coding block;
and when the angle between the prediction direction and the vertical direction is smaller than a fourth angle, weighting the prediction pixel value on the left side of the current coding block and the reference pixel value on the left side to obtain the final prediction pixel value on the left side of the current coding block, and weighting the prediction pixel value on the right side of the current coding block and the reference pixel value on the right side to obtain the final prediction pixel value on the right side of the current coding block.
8. A decoding method, comprising:
parsing information for intra prediction from a code stream, the information for intra prediction including at least information about an intra prediction mode;
calculating a prediction pixel value of the current coding block according to the analyzed information for intra-frame prediction based on the availability of the reference pixels of the current coding block;
and reconstructing the image according to the predicted pixel value to obtain a decoded image.
9. An encoding method, comprising:
determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point;
calculating a prediction pixel value of a chrominance component of the current coding block according to the constructed linear model, and determining an intra-frame prediction mode of the current coding block;
writing information for intra prediction, which includes at least information on whether an intra prediction mode is a TSCPM mode, into a code stream.
10. A decoding method, comprising:
parsing information for intra prediction from a code stream, the information for intra prediction including at least information on whether an intra prediction mode is a TSCPM mode;
determining a reference sampling point for constructing a linear model based on the availability of the reference pixel of the current coding block, and constructing the linear model according to the determined reference sampling point;
calculating a prediction pixel value of a chrominance component of a current coding block according to the constructed linear model;
and reconstructing the chroma component of the current coding block according to the predicted pixel value of the chroma component of the current coding block to obtain a decoded image.
11. An encoding method, comprising:
determining the coding sequence of a prediction block in a current coding block based on the partition mode of the block and the availability of reference pixels of the current coding block;
determining a coding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block;
and writing the DT mode information into the code stream based on the determined coding sequence of a prediction block in the current coding block and the coding sequence of a transformation block in the prediction block, wherein the DT mode information at least comprises information about the partitioning mode of the block.
12. A decoding method, comprising:
parsing DT mode information from a code stream, wherein the DT mode information at least comprises information about a partitioning mode of a block;
determining a decoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block;
determining a decoding order of transform blocks in the prediction block based on a partition mode of a block and availability of reference pixels of a current coding block;
and decoding according to the determined decoding order of the prediction block in the current coding block and the determined decoding order of the transformation block in the prediction block.
13. An encoding apparatus comprising:
an intra prediction unit configured to intra predict a current encoding block based on availability of reference pixels of the current encoding block; and
a first encoding unit configured to write information for intra prediction, which includes at least information on an intra prediction mode, into a code stream.
14. A decoding apparatus, comprising:
a first parsing unit configured to parse information for intra prediction from a code stream, the information for intra prediction including at least information on an intra prediction mode;
a first calculation unit configured to calculate a prediction pixel value of the current encoding block from the parsed information for intra prediction based on availability of reference pixels of the current encoding block;
and the first decoding unit is configured to reconstruct the image according to the predicted pixel value to obtain a decoded image.
15. An encoding apparatus comprising:
the model building unit is configured to determine a reference sampling point for building a linear model based on the availability of the reference pixel of the current coding block, and build the linear model according to the determined reference sampling point;
a mode determination unit configured to calculate a prediction pixel value of a chrominance component of the current coding block according to the constructed linear model and determine an intra prediction mode of the current coding block; and
a second encoding unit configured to write information for intra prediction, which includes at least information on whether an intra prediction mode is a TSCPM mode, into a code stream.
16. A decoding apparatus, comprising:
a second parsing unit configured to parse information for intra prediction from the code stream, the information for intra prediction including at least information on whether an intra prediction mode is a TSCPM mode;
the model building unit is configured to determine a reference sampling point for building a linear model based on the availability of the reference pixel of the current coding block, and build the linear model according to the determined reference sampling point;
a second calculation unit configured to calculate a predicted pixel value of a chrominance component of the current coding block according to the constructed linear model;
and the second decoding unit is configured to reconstruct the chroma component of the current coding block according to the predicted pixel value of the chroma component of the current coding block to obtain a decoded image.
17. An encoding apparatus comprising:
a first order determination unit configured to determine a coding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block;
a second order determination unit configured to determine a coding order of the transform blocks in the prediction block based on a partition mode of the block and availability of reference pixels of a current coding block;
a third encoding unit configured to write extended tree (DT) mode information into the code stream based on the determined coding order of a prediction block in the current coding block and the coding order of a transform block in the prediction block, wherein the DT mode information includes at least information on a partition mode of the block.
18. A decoding apparatus, comprising:
a third parsing unit configured to parse DT mode information from the codestream, wherein the DT mode information includes at least information on a partition mode of the block;
a third order determination unit configured to determine a decoding order of a prediction block in a current coding block based on a partition mode of the block and availability of reference pixels of the current coding block;
a fourth order determination unit configured to determine a decoding order of the transform blocks in the prediction block based on a partition mode of the block and availability of reference pixels of a current encoding block;
and a third decoding unit configured to decode according to the determined coding order of the prediction block in the current coding block and the determined decoding order of the transform block in the prediction block.
19. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the encoding method or the decoding method of any one of claims 1 to 12.
20. A computing device, comprising:
a processor;
a memory storing a computer program which, when executed by the processor, implements the encoding method or the decoding method of any one of claims 1 to 12.
CN201910792560.8A 2019-08-26 2019-08-26 Encoding and decoding method and device Pending CN112437298A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910792560.8A CN112437298A (en) 2019-08-26 2019-08-26 Encoding and decoding method and device
PCT/KR2020/011171 WO2021040330A1 (en) 2019-08-26 2020-08-21 Method and device for coding and decoding

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910792560.8A CN112437298A (en) 2019-08-26 2019-08-26 Encoding and decoding method and device

Publications (1)

Publication Number Publication Date
CN112437298A true CN112437298A (en) 2021-03-02

Family

ID=74685639

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910792560.8A Pending CN112437298A (en) 2019-08-26 2019-08-26 Encoding and decoding method and device

Country Status (2)

Country Link
CN (1) CN112437298A (en)
WO (1) WO2021040330A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022197135A1 (en) * 2021-03-19 2022-09-22 현대자동차주식회사 Video coding method and device using adaptive order of divided sub-blocks
WO2023177198A1 (en) * 2022-03-15 2023-09-21 주식회사 케이티 Image encoding/decoding method and apparatus
WO2023200214A1 (en) * 2022-04-12 2023-10-19 현대자동차주식회사 Image encoding/decoding method and apparatus, and recording medium storing bitstream

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101718969B1 (en) * 2015-10-02 2017-03-23 인하대학교 산학협력단 Early Block Size Decision Scheme Fast HEVC Intra Prediction
US10652575B2 (en) * 2016-09-15 2020-05-12 Qualcomm Incorporated Linear model chroma intra prediction for video coding
CN110710214B (en) * 2017-09-21 2023-10-31 株式会社Kt Video signal processing method and device
EP4283991A3 (en) * 2017-09-28 2024-02-28 Samsung Electronics Co., Ltd. Encoding method and device, and decoding method and device
CN111386707B (en) * 2017-11-22 2024-05-17 英迪股份有限公司 Image encoding/decoding method and apparatus, and recording medium for storing bit stream

Also Published As

Publication number Publication date
WO2021040330A1 (en) 2021-03-04

Similar Documents

Publication Publication Date Title
RU2651181C2 (en) Coding and decoding video with increased fault tolerance
US10116942B2 (en) Method and apparatus for decoding a video using an intra prediction
CN104869408B (en) Method for decoding video
US10148947B2 (en) Method and device for determining parameters for encoding or decoding of an image of a video sequence
KR101379255B1 (en) Method and apparatus for encoding and decoding based on intra prediction using differential equation
CN112437298A (en) Encoding and decoding method and device
CN106031176B (en) Video encoding method and apparatus and video decoding method and apparatus involving intra prediction
CN101682775B (en) Motion vector searching method and device
US9503721B2 (en) Method and apparatus for predictive encoding/decoding of motion vector
BR112012025309B1 (en) METHOD OF ENCODING A VIDEO BY PERFORMING MESH FILTRATION BASED ON ENCODING UNITS, METHOD OF DECODING A VIDEO BY PERFORMING MESH FILTRATION BASED ON ENCODING UNITS, VIDEO ENCODING EQUIPMENT TO ENCODE A VIDEO BY PERFORMING FILTRATION IN MESH BASED ON ENCODING UNITS, VIDEO DECODING EQUIPMENT TO DECODE A VIDEO BY PERFORMING MESH FILTRATION BASED ON ENCODING UNITS, AND COMPUTER READable RECORDING MEDIA.
US20110243220A1 (en) Method and apparatus for encoding and decoding image and method and apparatus for decoding image using adaptive coefficient scan order
KR102070431B1 (en) Method and apparatus for encoding video with restricting bi-directional prediction and block merging, method and apparatus for decoding video
BRPI0807912B1 (en) VIDEO ENCODING METHOD USING INTRA PREDICTION, VIDEO DECODING METHOD USING INTRA PREDICTION, VIDEO ENCODING EQUIPMENT USING INTRA PREDICTION AND VIDEO DECODING EQUIPMENT USING INTRA PREDICTION
MX2012011650A (en) Method and apparatus for encoding and decoding image and method and apparatus for decoding image using adaptive coefficient scan order.
KR20130004548A (en) Method and apparatus for video encoding with intra prediction by unification of availability check, method and apparatus for video decoding with intra prediction by unification of availability check
KR20150034696A (en) Method of video coding by prediction of the partitioning of a current block, method of decoding, coding and decoding devices and computer programs corresponding thereto
KR20140043828A (en) Image coding method, image decoding method, image coding apparatus, and image decoding apparatus
KR20130116215A (en) Method and apparatus for encoding/decoding video for parallel processing
JP2019531031A (en) Method and apparatus for encoding video
KR20110100912A (en) Video encoding apparatus and method therefor, and video decoding apparatus and method therefor
US20110310975A1 (en) Method, Device and Computer-Readable Storage Medium for Encoding and Decoding a Video Signal and Recording Medium Storing a Compressed Bitstream
CN107306353B (en) Image space prediction mode selection method and device, and image compression method and device
KR20140031974A (en) Image coding method, image decoding method, image coding device, image decoding device, image coding program, and image decoding program
CN112218086A (en) Encoding method, decoding method, transmission method, encoding device, decoding device, and system
KR20200004348A (en) Method and apparatus for processing video signal through target region correction

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination