US20200314432A1 - Intra-frame and Inter-frame Combined Prediction Method for P Frames or B Frames - Google Patents
Intra-frame and Inter-frame Combined Prediction Method for P Frames or B Frames
- Publication number
- US20200314432A1 (application US 16/629,777; US201816629777A)
- Authority
- US
- United States
- Prior art keywords
- intra
- prediction
- inter
- frame
- prediction block
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/157—Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
- H04N19/159—Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/105—Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/107—Selection of coding mode or of prediction mode between spatial and temporal predictive coding, e.g. picture refresh
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/146—Data rate or code amount at the encoder output
- H04N19/147—Data rate or code amount at the encoder output according to rate distortion criteria
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/146—Data rate or code amount at the encoder output
- H04N19/149—Data rate or code amount at the encoder output by estimating the code amount by means of a model, e.g. mathematical model or statistical model
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/176—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Algebra (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Pure & Applied Mathematics (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Compression, Expansion, Code Conversion, And Decoders (AREA)
Abstract
Description
- The present application is a national stage filing under 35 U.S.C. § 371 of PCT/CN2018/107210, filed on Sep. 25, 2018, which claims priority to CN Application No. 201711381157.3, filed on Dec. 20, 2017. Both applications are incorporated herein by reference in their entirety.
- The present disclosure generally relates to the field of video coding, and specifically to a prediction method for inter prediction frames (i.e., P frames or B frames) that combines an intra prediction block and an inter prediction block to obtain a final prediction block, enhancing prediction accuracy and thereby improving coding efficiency.
- In the field of video coding, intra prediction and inter prediction are very effective tools for removing redundant information from video sequences. Specifically, inter prediction uses correlations between the current frame and its reference frames to reduce temporal redundancy, while intra prediction uses the similarity between spatially adjacent pixels to eliminate spatial redundancy.
- In coding standards of the prior art, intra prediction and inter prediction are independent of each other. Intra prediction may be used for intra prediction blocks in an intra prediction frame (I frame) or an inter prediction frame, while inter prediction may only be used for an inter prediction frame (a forward prediction frame (P frame) or a bidirectional prediction frame (B frame)).
- The main shortcomings of the above prediction technology are as follows:
- In the prior art, intra prediction and inter prediction are independent of each other; their respective advantages and disadvantages are not considered jointly, and one of the two is selected solely by a decision on the encoder side. The strengths of the two prediction methods therefore cannot be combined, which limits prediction performance to a certain extent.
- To overcome drawbacks in the existing technologies, the present disclosure provides an intra-frame and inter-frame combined prediction method for P frames or B frames, which effectively reduces distortion of prediction blocks and improves their prediction precision.
- The technical solutions provided by the present application are as follows:
- An intra-frame and inter-frame combined prediction method for P frames or B frames is provided, and whether to use the method is selected adaptively by means of a rate-distortion optimization (RDO) decision. The prediction method of the present application obtains a final prediction block by weighting an intra prediction block and an inter prediction block, wherein the weighting coefficients of the intra prediction block and the inter prediction block are obtained according to prediction distortion statistics of the two kinds of prediction. Therefore, prediction precision can be improved, and coding efficiency of the prediction blocks can be improved. The method comprises the following steps:
- 1) On the encoder side, performing rate-distortion optimization on each coding unit (CU) according to the following steps:
-
- 11) Firstly, performing intra prediction on a coding unit (CU);
- 12) Then, performing inter prediction on the coding unit (CU);
- 13) Calculating weighted average of the intra prediction block and the inter prediction block, to obtain an intra-frame and inter-frame combined prediction block;
- 14) Determining whether to use the intra-frame and inter-frame combined prediction method by means of an RDO decision; a 1-bit flag is written into the bitstream for each coding unit to mark whether the intra-frame and inter-frame combined prediction method is used;
- 15) Writing inter prediction information into the bitstream, if the intra-frame and inter-frame combined prediction method is used.
- In the intra-frame and inter-frame combined prediction, the inter prediction information will be used to obtain inter prediction blocks. When different inter prediction modes are used, different inter prediction information will be transmitted. If motion information of the inter prediction mode used in the intra-frame and inter-frame combined prediction is obtained by derivation, only syntax elements related to the derivation of the inter information need to be transmitted. If the motion information of the inter prediction mode used in the intra-frame and inter-frame combined prediction is obtained through motion estimation, the corresponding motion information needs to be transmitted. In the intra-frame and inter-frame combined prediction, a completely new inter prediction mode can also be defined for combining with intra prediction blocks, as long as the corresponding inter prediction information has been transmitted to the decoder side.
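- For illustration only (the cost formulation below is the standard rate-distortion criterion, and the function and variable names are assumptions rather than part of the claimed method), the RDO decision in step 14) can be sketched as a comparison of costs of the form D + λ·R:

```python
def rdo_prefers_combined(d_combined, r_combined,
                         d_best_conventional, r_best_conventional, lam):
    """Illustrative sketch of the RDO decision in step 14).

    d_* : reconstruction distortion of each candidate
    r_* : estimated bit cost (including the 1-bit flag and, for the combined
          mode, the transmitted inter prediction information)
    lam : Lagrange multiplier used by the encoder

    Returns True if the intra-frame and inter-frame combined prediction
    should be used for the current coding unit.
    """
    cost_combined = d_combined + lam * r_combined
    cost_conventional = d_best_conventional + lam * r_best_conventional
    return cost_combined < cost_conventional
```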
- 2) On the decoder side, for a coding unit, reading the 1-bit flag from the bitstream and decoding the coding unit according to the flag. The following steps are performed:
-
- 21) Firstly, for the coding unit (CU), performing intra prediction;
- 22) Determining whether to use the intra-frame and inter-frame combined prediction method or not on the decoder side according to the 1-bit flag read from the bitstream;
- 23) If the intra-frame and inter-frame combined prediction method is used, performing the following steps: Firstly, read the inter prediction information from the bitstream and perform inter prediction on the coding unit; then calculate the weighted average of the intra prediction block and the inter prediction block, so as to obtain an intra-frame and inter-frame combined prediction block. Finally, perform reconstruction of the current coding block using the intra-frame and inter-frame combined prediction block.
- 24) If the intra-frame and inter-frame combined prediction method is not used, reconstruction of the coding block is performed directly.
- The core of the present application includes a process of combining an intra prediction block and an inter prediction block, as shown in Formula 1:
-
P′ comb(x,y)=W intra(x,y)·P intra(x,y)+(1−W intra(x,y))·P inter(x,y), 0≤x,y<N Formula 1 - wherein P intra(x,y) is a pixel value of the intra prediction block, P inter(x,y) is a pixel value of the inter prediction block, and W intra(x,y) is the weighting coefficient for the intra prediction pixel value. P′ comb(x,y) is the pixel value of the weighted combination of intra prediction and inter prediction; x and y are pixel coordinates within the prediction block, where x=0 represents the first column and y=0 represents the first row of the prediction block. N represents the size of the current prediction block.
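- As a minimal illustration of Formula 1 (not part of the patent text; the array names below are assumptions), the pixel-wise weighted combination can be written as:

```python
import numpy as np

def combine_prediction(p_intra, p_inter, w_intra):
    """Pixel-wise weighted combination per Formula 1.

    p_intra, p_inter : N x N arrays holding the intra / inter prediction blocks
    w_intra          : N x N array of intra weights in [0, 1]
    Returns the combined prediction block P'_comb.
    """
    assert p_intra.shape == p_inter.shape == w_intra.shape
    return w_intra * p_intra + (1.0 - w_intra) * p_inter

# Example: with a uniform weight of 0.5, every output pixel is the average
# of the two predictions (here 110.0 for a 16x16 block).
p_comb = combine_prediction(np.full((16, 16), 120.0),
                            np.full((16, 16), 100.0),
                            np.full((16, 16), 0.5))
```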
- In the intra-frame and inter-frame combined prediction method of the present application, the weighting coefficients are designed according to the magnitude of the distortions of intra prediction and inter prediction; the weighting coefficients are inversely proportional to the distortions of the prediction blocks, and the distortions of the prediction blocks are obtained statistically.
- Compared with the prior art, the present disclosure has the following beneficial effects:
- The application provides a prediction method combining intra prediction and inter prediction, which can reduce the overall distortion of a prediction block, improve prediction precision, and thereby improve coding efficiency of the prediction block. Specifically, the technical advantages of the present invention are reflected in the following aspects:
- The present application determines by means of an RDO decision whether to use the intra-frame and inter-frame combined prediction method. Since not all prediction blocks are suited to the intra-frame and inter-frame combined prediction, using an RDO decision improves the practicality and robustness of the technical solution.
- In addition, the present application proposes weighting coefficients that are inversely proportional to the distortions of the prediction blocks. The method can thereby exploit the advantages of both intra prediction and inter prediction, since the better-predicted parts of the two methods are favored in the combination; thus, to a certain extent, areas with excessive distortion in the intra prediction block and the inter prediction block can be suppressed, so as to obtain a better prediction effect.
-
FIG. 1 is a flowchart of encoding of a coding unit (CU) of the present application. -
FIG. 2 is a flowchart of decoding of a coding unit (CU) of the present application. -
FIG. 3 is a schematic diagram of intra prediction modes in HEVC. -
FIG. 4 is a schematic diagram of weighting coefficients in units of rows or columns in a 16×16 intra-frame and inter-frame combined prediction block in a specific embodiment of the present application. - Hereinafter, the present disclosure is further described through the embodiments, but the scope of the present disclosure is not limited in any manner.
- The present disclosure provides an effective intra-frame and inter-frame combined prediction method, and adaptively determines whether to use it by means of an RDO decision. The proposed intra-frame and inter-frame combined prediction method obtains a new prediction block by weighting an intra prediction block and an inter prediction block. The weighting coefficients used are determined according to intra-frame and inter-frame prediction distortion statistics, and are inversely proportional to the corresponding distortion statistics of the prediction blocks. The invention can reduce distortion of the prediction blocks, and prediction precision and encoding efficiency of the prediction blocks can be improved.
-
FIG. 1 is a flowchart of encoding on the encoder side of the present application. For a coding unit, firstly perform intra prediction and inter prediction. Then, calculate the weighted average of the intra prediction block and the inter prediction block to obtain an intra-frame and inter-frame combined prediction block. After that, determine whether to use the intra-frame and inter-frame combined prediction method by means of a rate-distortion optimization decision, and output a 1-bit flag into the bitstream. If the intra-frame and inter-frame combined prediction method is used, code the used inter prediction information into the bitstream. -
FIG. 2 is a flowchart of decoding on the decoder side of the present application. For a coding unit, firstly read the 1-bit flag. Then, perform intra prediction. According to the value of the flag, determine whether to use the intra-frame and inter-frame combined prediction method or not. If the intra-frame and inter-frame combined prediction method is not used, reconstruction of the coding block may be performed directly. Otherwise, read the inter prediction information from the bitstream and perform inter prediction on the coding unit (CU), then calculate the weighted average of the intra prediction block and the inter prediction block to obtain an intra-frame and inter-frame combined prediction block. Finally, perform reconstruction of the current coding block. - The weight coefficients in the intra-frame and inter-frame combined prediction method proposed by the present application are designed according to intra-frame and inter-frame prediction distortion, wherein the distortions of the prediction blocks are obtained statistically, and the weight coefficients are inversely proportional to the distortion statistics of the prediction blocks. The weight coefficients are determined as follows:
- Firstly, the prediction distortion distribution of each intra prediction mode is gathered statistically and recorded as Dintra(x, y), and the prediction distortion of the inter prediction block is gathered statistically and recorded as Dinter(x, y). Therefore, in the intra-frame and inter-frame combined prediction, the weighting coefficients of the intra prediction block and the inter prediction block can be expressed as Formula 2 and Formula 3, respectively, and their sum is 1.

W intra(x,y)=D inter(x,y)/(D intra(x,y)+D inter(x,y)) Formula 2

W inter(x,y)=D intra(x,y)/(D intra(x,y)+D inter(x,y)) Formula 3

- Wherein, Wintra(x, y) and Winter(x, y) are the weighting coefficients of the intra prediction block and the inter prediction block, respectively, and x and y are coordinates of the pixels in the prediction blocks.
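- As an illustrative sketch of Formulas 2 and 3 (the array names and the small epsilon guard below are assumptions, not part of the patent text), the weight maps can be computed directly from the gathered distortion statistics:

```python
import numpy as np

def distortion_based_weights(d_intra, d_inter, eps=1e-9):
    """Per-pixel weights per Formulas 2 and 3.

    d_intra, d_inter : N x N arrays of statistically gathered prediction
                       distortions for intra and inter prediction.
    eps              : small guard against division by zero (an assumption,
                       not specified in the patent text).
    Returns (w_intra, w_inter); up to the epsilon guard the two weight maps
    sum to 1 at every pixel.
    """
    total = d_intra + d_inter + eps
    w_intra = d_inter / total   # larger inter distortion -> rely more on intra
    w_inter = d_intra / total   # larger intra distortion -> rely more on inter
    return w_intra, w_inter
```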
- By gathering statistics of coding results, a set of weighting coefficients can be generated for each prediction block size and for each intra prediction mode. In addition, considering that the distortion of the prediction blocks may differ between P frames and B frames, weighting coefficients for P frames and B frames need to be designed separately.
- In order to save space for storing the weighting coefficients, the weighting coefficients can be simplified. For example, in the present application, intra prediction blocks may be divided into four groups based on intra prediction modes. Taking the HEVC intra prediction modes as an example, as shown in FIG. 3, the first group includes modes 0 and 1, the second group includes modes 2-13, the third group includes modes 14-22, and the fourth group includes modes 23-34. A set of weighting coefficients is designed for each group.
- In addition, in order to further save space for storing the weighting coefficients, weighting coefficients in units of rows or columns can be used, that is, a row or a column of pixels in a prediction block may correspond to the same weighting coefficient. For example, in HEVC, weighting coefficients in units of rows or columns can be designed for each of the above four groups of intra prediction modes.
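- For illustration only (the function name below is a placeholder, not defined by this application), the grouping of HEVC intra prediction modes described above could be expressed as:

```python
def intra_mode_group(mode):
    """Map an HEVC intra prediction mode index (0-34) to one of the four
    weighting-coefficient groups described above (returns 1-4)."""
    if mode in (0, 1):       # group 1: planar (0) and DC (1)
        return 1
    if 2 <= mode <= 13:      # group 2: angular modes 2-13
        return 2
    if 14 <= mode <= 22:     # group 3: angular modes 14-22
        return 3
    if 23 <= mode <= 34:     # group 4: angular modes 23-34
        return 4
    raise ValueError("invalid HEVC intra prediction mode: %d" % mode)
```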
FIG. 4 shows the weighting coefficients in units of rows or columns designed for a 16×16 intra prediction block of the B frames. In FIG. 4, (a), (b), and (c) are the weighting coefficients of the second group, the third group, and the fourth group of intra prediction blocks, respectively. The intra prediction blocks in the first group directly use ½ as the weighting coefficient for the combination of prediction blocks.
- To avoid floating-point arithmetic, the floating-point weighting coefficients proposed in the present application may be converted into integers, and a right shift operation may be performed after weighting. Specifically, the weighting coefficient can be multiplied by the m-th power of 2, and the predicted value may be shifted to the right by m bits after the weighted calculation. After being converted into integer arithmetic, Formula 1 is rewritten as Formula 4. The value of m relates to the required computational accuracy: the higher the required accuracy, the larger the value of m and the more accurate the calculation result. -
P′ comb(x,y)=(2^m·W intra(x,y)·P intra(x,y)+(2^m−2^m·W intra(x,y))·P inter(x,y)+2^(m−1))>>m Formula 4 - In the intra-frame and inter-frame combined prediction, the inter prediction information will be used to obtain the inter prediction blocks. When different inter prediction modes are used, different inter prediction information will be transmitted. If motion information of the inter prediction mode used in the intra-frame and inter-frame combined prediction is obtained by derivation, such as the skip mode and the merge mode in HEVC, only syntax elements related to the derivation of the inter information need to be transmitted. Specifically, if the skip mode is used in the intra-frame and inter-frame combined prediction, the skip mode information, that is, the index of the candidate motion information of the skip mode, needs to be transmitted; and if the merge mode is used, the candidate index and residual information of the merge mode motion information need to be transmitted. If the motion information of the inter prediction mode used in the intra-frame and inter-frame combined prediction is obtained through motion estimation, the corresponding motion information needs to be transmitted. In the intra-frame and inter-frame combined prediction, a completely new inter prediction mode can also be defined for combining with intra prediction blocks, as long as the corresponding inter prediction information has been transmitted to the decoder side. For example, the first candidate motion information of the skip mode can always be used; since a fixed inter prediction mode is used in this case, there is no need to transmit any inter prediction information to the decoder side.
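- Referring back to Formula 4, a minimal fixed-point sketch follows (the choice m = 6 and the function name are assumptions used only for illustration):

```python
def combine_prediction_fixed_point(p_intra, p_inter, w_intra_scaled, m=6):
    """Integer-arithmetic combination per Formula 4.

    w_intra_scaled : intra weighting coefficient multiplied by 2**m and
                     rounded to an integer in [0, 2**m]
    The rounding offset 2**(m-1) is added before the right shift by m bits.
    """
    scale = 1 << m
    return (w_intra_scaled * p_intra
            + (scale - w_intra_scaled) * p_inter
            + (1 << (m - 1))) >> m

# Example: a floating-point weight of 0.75 becomes 48 when m = 6;
# combining pixel values 120 (intra) and 100 (inter) yields 115.
assert combine_prediction_fixed_point(120, 100, 48) == 115
```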
- It should be noted that the embodiments as disclosed are intended to facilitate further understanding of the present disclosure; however, those skilled in the art may understand that various substitutions and modifications are possible without departing from the spirit and scope of the present disclosure. Therefore, the present disclosure should not be limited to the contents disclosed in the embodiments, but should be governed by the appended claims.
Claims (10)
P′ comb(x,y)=W intra(x,y)·P intra(x,y)+(1−W intra(x,y))·P inter(x,y), 0≤x,y<N Formula 1
P′ comb(x,y)=(2^m·W intra(x,y)·P intra(x,y)+(2^m−2^m·W intra(x,y))·P inter(x,y)+2^(m−1))>>m Formula 4
P′ comb(x,y)=W intra(x,y)·P intra(x,y)+(1−W intra(x,y))·P inter(x,y), 0≤x,y<N Formula 1
P′ comb(x,y)=(2^m·W intra(x,y)·P intra(x,y)+(2^m−2^m·W intra(x,y))·P inter(x,y)+2^(m−1))>>m Formula 4
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711381157.3A CN107995489A (en) | 2017-12-20 | 2017-12-20 | A kind of combination forecasting method between being used for the intra frame of P frames or B frames |
CN201711381157.3 | 2017-12-20 | ||
PCT/CN2018/107210 WO2019119910A1 (en) | 2017-12-20 | 2018-09-25 | Intra-frame and inter-frame combined prediction method for p frames or b frames |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200314432A1 true US20200314432A1 (en) | 2020-10-01 |
US11051027B2 US11051027B2 (en) | 2021-06-29 |
Family
ID=62039172
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/629,777 Active US11051027B2 (en) | 2017-12-20 | 2018-09-25 | Intra-frame and inter-frame combined prediction method for P frames or B frames |
Country Status (3)
Country | Link |
---|---|
US (1) | US11051027B2 (en) |
CN (1) | CN107995489A (en) |
WO (1) | WO2019119910A1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210297688A1 (en) * | 2018-12-06 | 2021-09-23 | Huawei Technologies Co., Ltd. | Weighted prediction method for multi-hypothesis encoding and apparatus |
US11277624B2 (en) | 2018-11-12 | 2022-03-15 | Beijing Bytedance Network Technology Co., Ltd. | Bandwidth control methods for inter prediction |
US11330257B2 (en) | 2019-03-21 | 2022-05-10 | Beijing Bytedance Network Technology Co., Ltd. | Extended application of combined intra-inter prediction |
US20220224936A1 (en) * | 2018-04-04 | 2022-07-14 | Nippon Hoso Kyokai | Prediction image correcting device, image encoding device, image decoding device, and program |
US11438630B2 (en) * | 2019-06-19 | 2022-09-06 | Lg Electronics Inc. | Motion prediction-based image coding method and device |
CN115118977A (en) * | 2022-08-29 | 2022-09-27 | 华中科技大学 | Intra-frame prediction encoding method, system, and medium for 360-degree video |
US11509923B1 (en) | 2019-03-06 | 2022-11-22 | Beijing Bytedance Network Technology Co., Ltd. | Usage of converted uni-prediction candidate |
WO2023277603A1 (en) * | 2021-07-02 | 2023-01-05 | 현대자동차주식회사 | Method and device for encoding/decoding video |
US11582451B2 (en) | 2019-09-09 | 2023-02-14 | Beijing Bytedance Network Technology Co., Ltd. | Coefficient scaling for high precision image and video coding |
US11683500B2 (en) | 2019-09-21 | 2023-06-20 | Beijing Bytedance Network Technology Co., Ltd | Precision transform coefficient scaling and quantization for image and video coding |
US11838539B2 (en) | 2018-10-22 | 2023-12-05 | Beijing Bytedance Network Technology Co., Ltd | Utilization of refined motion vector |
US11917196B2 (en) | 2019-08-19 | 2024-02-27 | Beijing Bytedance Network Technology Co., Ltd | Initialization for counter-based intra prediction mode |
US11956465B2 (en) | 2018-11-20 | 2024-04-09 | Beijing Bytedance Network Technology Co., Ltd | Difference calculation based on partial position |
US12052420B2 (en) | 2019-04-23 | 2024-07-30 | Beijing Bytedance Network Technology Co., Ltd | Intra prediction and residual coding |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107995489A (en) | 2017-12-20 | 2018-05-04 | 北京大学深圳研究生院 | A kind of combination forecasting method between being used for the intra frame of P frames or B frames |
CN108632616B (en) * | 2018-05-09 | 2021-06-01 | 电子科技大学 | Method for inter-frame weighted prediction based on reference quality |
US11647222B2 (en) * | 2018-06-18 | 2023-05-09 | Industry Academy Cooperation Foundation Of Sejong University | Method and apparatus for encoding/decoding image |
CN111372086B (en) * | 2018-12-26 | 2021-08-03 | 华为技术有限公司 | Video image decoding method and device |
CN111385569B (en) * | 2018-12-28 | 2022-04-26 | 杭州海康威视数字技术股份有限公司 | Coding and decoding method and equipment thereof |
CN111010578B (en) * | 2018-12-28 | 2022-06-24 | 北京达佳互联信息技术有限公司 | Method, device and storage medium for intra-frame and inter-frame joint prediction |
CN109714596A (en) * | 2019-01-30 | 2019-05-03 | 江苏允博信息科技有限公司 | A method of the HEVC intraframe predictive coding based on deep learning |
CN113302920B (en) * | 2019-02-01 | 2024-09-10 | 北京字节跳动网络技术有限公司 | Extended application of combined inter-frame intra-frame prediction |
US11290726B2 (en) | 2019-02-07 | 2022-03-29 | Qualcomm Incorporated | Inter-intra prediction mode for video data |
GB2582929A (en) * | 2019-04-08 | 2020-10-14 | Canon Kk | Residual signalling |
WO2020253822A1 (en) * | 2019-06-21 | 2020-12-24 | Huawei Technologies Co., Ltd. | Adaptive filter strength signalling for geometric partition mode |
CN113794878B (en) * | 2019-09-23 | 2022-12-23 | 杭州海康威视数字技术股份有限公司 | Encoding and decoding method, device and equipment |
CN113709501B (en) * | 2019-12-23 | 2022-12-23 | 杭州海康威视数字技术股份有限公司 | Encoding and decoding method, device and equipment |
CN116711304A (en) * | 2020-12-28 | 2023-09-05 | Oppo广东移动通信有限公司 | Prediction method, encoder, decoder, and storage medium |
CN113794885B (en) * | 2020-12-30 | 2022-12-23 | 杭州海康威视数字技术股份有限公司 | Encoding and decoding method, device and equipment |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101877785A (en) * | 2009-04-29 | 2010-11-03 | 祝志怡 | Hybrid predicting-based video encoding method |
US9609343B1 (en) * | 2013-12-20 | 2017-03-28 | Google Inc. | Video coding using compound prediction |
CN107113425A (en) * | 2014-11-06 | 2017-08-29 | 三星电子株式会社 | Method for video coding and equipment and video encoding/decoding method and equipment |
CN107995489A (en) | 2017-12-20 | 2018-05-04 | 北京大学深圳研究生院 | A kind of combination forecasting method between being used for the intra frame of P frames or B frames |
-
2017
- 2017-12-20 CN CN201711381157.3A patent/CN107995489A/en active Pending
-
2018
- 2018-09-25 US US16/629,777 patent/US11051027B2/en active Active
- 2018-09-25 WO PCT/CN2018/107210 patent/WO2019119910A1/en active Application Filing
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220224936A1 (en) * | 2018-04-04 | 2022-07-14 | Nippon Hoso Kyokai | Prediction image correcting device, image encoding device, image decoding device, and program |
US11849141B2 (en) | 2018-04-04 | 2023-12-19 | Nippon Hoso Kyokai | Prediction image correcting device, image encoding device, image decoding device, and program |
US11877003B2 (en) * | 2018-04-04 | 2024-01-16 | Nippon Hoso Kyokai | Prediction image correcting device, image encoding device, image decoding device, and program |
US20240121428A1 (en) * | 2018-04-04 | 2024-04-11 | Nippon Hoso Kyokai | Prediction image correcting device, image encoding device, image decoding device, and program |
US11889108B2 (en) | 2018-10-22 | 2024-01-30 | Beijing Bytedance Network Technology Co., Ltd | Gradient computation in bi-directional optical flow |
US12041267B2 (en) | 2018-10-22 | 2024-07-16 | Beijing Bytedance Network Technology Co., Ltd. | Multi-iteration motion vector refinement |
US11838539B2 (en) | 2018-10-22 | 2023-12-05 | Beijing Bytedance Network Technology Co., Ltd | Utilization of refined motion vector |
US11956449B2 (en) | 2018-11-12 | 2024-04-09 | Beijing Bytedance Network Technology Co., Ltd. | Simplification of combined inter-intra prediction |
US11516480B2 (en) | 2018-11-12 | 2022-11-29 | Beijing Bytedance Network Technology Co., Ltd. | Simplification of combined inter-intra prediction |
US11284088B2 (en) | 2018-11-12 | 2022-03-22 | Beijing Bytedance Network Technology Co., Ltd. | Using combined inter intra prediction in video processing |
US11277624B2 (en) | 2018-11-12 | 2022-03-15 | Beijing Bytedance Network Technology Co., Ltd. | Bandwidth control methods for inter prediction |
US11843725B2 (en) | 2018-11-12 | 2023-12-12 | Beijing Bytedance Network Technology Co., Ltd | Using combined inter intra prediction in video processing |
US11956465B2 (en) | 2018-11-20 | 2024-04-09 | Beijing Bytedance Network Technology Co., Ltd | Difference calculation based on partial position |
US20210297688A1 (en) * | 2018-12-06 | 2021-09-23 | Huawei Technologies Co., Ltd. | Weighted prediction method for multi-hypothesis encoding and apparatus |
US11930165B2 (en) | 2019-03-06 | 2024-03-12 | Beijing Bytedance Network Technology Co., Ltd | Size dependent inter coding |
US11509923B1 (en) | 2019-03-06 | 2022-11-22 | Beijing Bytedance Network Technology Co., Ltd. | Usage of converted uni-prediction candidate |
US11425406B2 (en) | 2019-03-21 | 2022-08-23 | Beijing Bytedance Network Technology Co., Ltd. | Weighting processing of combined intra-inter prediction |
US11876993B2 (en) | 2019-03-21 | 2024-01-16 | Beijing Bytedance Network Technology Co., Ltd | Signaling of combined intra-inter prediction |
US11330257B2 (en) | 2019-03-21 | 2022-05-10 | Beijing Bytedance Network Technology Co., Ltd. | Extended application of combined intra-inter prediction |
US12052420B2 (en) | 2019-04-23 | 2024-07-30 | Beijing Bytedance Network Technology Co., Ltd | Intra prediction and residual coding |
US20220345749A1 (en) * | 2019-06-19 | 2022-10-27 | Lg Electronics Inc. | Motion prediction-based image coding method and device |
US11438630B2 (en) * | 2019-06-19 | 2022-09-06 | Lg Electronics Inc. | Motion prediction-based image coding method and device |
US11962810B2 (en) * | 2019-06-19 | 2024-04-16 | Lg Electronics Inc. | Motion prediction-based image coding method and device |
US11917196B2 (en) | 2019-08-19 | 2024-02-27 | Beijing Bytedance Network Technology Co., Ltd | Initialization for counter-based intra prediction mode |
US11582451B2 (en) | 2019-09-09 | 2023-02-14 | Beijing Bytedance Network Technology Co., Ltd. | Coefficient scaling for high precision image and video coding |
US11985317B2 (en) | 2019-09-09 | 2024-05-14 | Beijing Bytedance Network Technology Co., Ltd. | Coefficient scaling for high precision image and video coding |
US11683500B2 (en) | 2019-09-21 | 2023-06-20 | Beijing Bytedance Network Technology Co., Ltd | Precision transform coefficient scaling and quantization for image and video coding |
WO2023277603A1 (en) * | 2021-07-02 | 2023-01-05 | 현대자동차주식회사 | Method and device for encoding/decoding video |
CN115118977A (en) * | 2022-08-29 | 2022-09-27 | 华中科技大学 | Intra-frame prediction encoding method, system, and medium for 360-degree video |
Also Published As
Publication number | Publication date |
---|---|
WO2019119910A1 (en) | 2019-06-27 |
US11051027B2 (en) | 2021-06-29 |
CN107995489A (en) | 2018-05-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11051027B2 (en) | Intra-frame and inter-frame combined prediction method for P frames or B frames | |
US20210314568A1 (en) | Moving image decoding method and moving image coding method | |
US7738714B2 (en) | Method of and apparatus for lossless video encoding and decoding | |
US20180338139A1 (en) | Method and apparatus of intra mode coding | |
RU2559737C2 (en) | Method and device for coding/decoding of movement vector | |
US20210058627A1 (en) | Filtering method for intra-frame and inter-frame prediction | |
CN101884219B (en) | Method and apparatus for processing video signal | |
US11979578B2 (en) | Method and apparatus for encoding or decoding video data in FRUC mode with reduced memory accesses | |
CN102835110B (en) | Motion-vector prediction coding method, motion-vector prediction coding/decoding method, dynamic image encoding device, moving image decoding apparatus and program thereof | |
KR100846512B1 (en) | Method and apparatus for video encoding and decoding | |
CN105284111A (en) | Dynamic-image coding device, dynamic-image decoding device, dynamic-image coding method, dynamic-image decoding method, and program | |
US9313496B2 (en) | Video encoder and video encoding method as well as video decoder and video decoding method | |
CN100591136C (en) | Video frequency intraframe coding method based on null field decomposition | |
US20120263237A1 (en) | Video encoder and video decoder | |
KR20090087767A (en) | Method for predictive intra coding for image data | |
KR20140077988A (en) | Predictive coding method for motion vector, predictive decoding method for motion vector, video coding device, video decoding device, and programs therefor | |
CN116569549A (en) | Inter prediction method, encoder, decoder and storage medium | |
CN111510726B (en) | Coding and decoding method and equipment thereof | |
CN109714596A (en) | A method of the HEVC intraframe predictive coding based on deep learning | |
CA3039172C (en) | Motion video predict coding method, motion video predict coding device, motion video predict coding program, motion video predict decoding method, motion video predict decoding device, and motion video predict decoding program | |
CN114071159A (en) | Inter prediction method, encoder, decoder, and computer-readable storage medium | |
JP2013062767A (en) | Moving image encoder and program | |
KR20180117095A (en) | Coding method, decoding method, and apparatus for video global disparity vector. | |
JP2012023514A (en) | Encoding device and decoding device adaptively determining scan order of orthogonal transform coefficient |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PEKING UNIVERSITY SHENZHEN GRADUATE SCHOOL, CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, RONGGANG;FAN, KUI;LI, GE;AND OTHERS;REEL/FRAME:051467/0689 Effective date: 20200108 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |