CN105208387A - HEVC intra-frame prediction mode fast selection method - Google Patents

HEVC intra-frame prediction mode fast selection method

Info

Publication number
CN105208387A
CN105208387A CN201510675511.8A
Authority
CN
China
Prior art keywords
estimated
sad
coordinate
predictive mode
mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510675511.8A
Other languages
Chinese (zh)
Other versions
CN105208387B (en)
Inventor
朱威
张训华
沈吉龙
杨洋
陈朋
郑雅羽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN201510675511.8A priority Critical patent/CN105208387B/en
Publication of CN105208387A publication Critical patent/CN105208387A/en
Application granted granted Critical
Publication of CN105208387B publication Critical patent/CN105208387B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention relates to an HEVC intra-frame prediction mode fast selection method comprising the following steps: (1) input a PU to be estimated and establish the set of actually available intra-frame prediction modes; (2) calculate the sums of absolute differences between all pixels in the PU to be estimated and their spatially adjacent pixels in different directions; (3) judge the texture direction characteristic of the PU to be estimated from these directional sums of absolute differences; (4) determine the rough-stage mode search range according to the texture direction characteristic; (5) establish the rate-distortion optimization candidate mode set from the rough-stage search range and the set of actually available intra-frame prediction modes; (6) select the optimal intra-frame prediction mode. By exploiting the texture direction characteristic of the PU to be estimated, the rough-stage search range of prediction modes is reduced, and in turn the number of candidate modes subjected to rate-distortion optimization is reduced, so that good rate-distortion performance is maintained while the computational complexity of HEVC intra-frame prediction mode selection is significantly reduced.

Description

HEVC intra-frame prediction mode fast selection method
Technical field
The present invention relates to the field of digital video coding, and in particular to an HEVC intra-frame prediction mode fast selection method.
Background art
With the rapid development of multimedia technology, video data of various resolutions (standard-definition, high-definition and ultra-high-definition) has emerged one after another, and the transmission and storage of video data face enormous challenges. To meet the growing demand for video data compression and transmission, the Joint Collaborative Team on Video Coding (JCT-VC), organized by ISO/IEC and ITU-T, formulated the new-generation High Efficiency Video Coding standard (HEVC/H.265). At the same video quality, HEVC reduces the bitstream by about half compared with the previous-generation standard H.264 (see G. J. Sullivan, J.-R. Ohm, W.-J. Han, and T. Wiegand, "Overview of the High Efficiency Video Coding (HEVC) standard," IEEE Transactions on Circuits and Systems for Video Technology, vol. 22, no. 12, pp. 1649-1668, Dec. 2012), i.e. the coding efficiency is doubled, but the computational complexity increases several times. Although HEVC adopts a hybrid coding framework like traditional video coding standards, it introduces new coding techniques in many respects, such as quad-tree partitioning of coding tree units (CTU), multi-angle intra prediction modes, and inter prediction modes with multiple partition shapes. To code images more flexibly, HEVC defines three kinds of partitioning units: the coding unit (CU), the prediction unit (PU) and the transform unit (TU). The PU prediction process of HEVC comprises inter prediction and intra prediction; PUs of I frames or IDR frames use only intra prediction, while PUs of other frame types can use both intra prediction and inter prediction. To improve the compression efficiency of intra prediction, HEVC uses the spatially adjacent reconstructed pixels around a PU for intra prediction and supports up to 35 intra prediction modes (see J. Lainema, F. Bossen, W.-J. Han, J. Min, and K. Ugur, "Intra coding of the HEVC standard," IEEE Transactions on Circuits and Systems for Video Technology, vol. 22, no. 12, pp. 1792-1801, Dec. 2012).
Among all HEVC intra prediction modes, the Planar mode (number 0) and the DC mode (number 1) are suitable for flat regions, and numbers 2 to 34 correspond to the 33 angular prediction modes, where the mode numbered 10 predicts in the horizontal (rightward) direction, the mode numbered 26 predicts in the vertical (downward) direction, the mode numbered 18 predicts diagonally toward the lower right, the mode numbered 2 predicts diagonally toward the upper right, and the mode numbered 34 predicts diagonally toward the lower left. In the HEVC test model HM, the intra prediction process first performs a rough mode decision (RMD): the sum of absolute transformed differences (SATD) of the Hadamard-transformed prediction residual of the PU is computed to pre-screen prediction modes, with 8 candidate modes chosen for PUs of size 4×4 and 8×8 and 3 candidate modes for PUs of size 16×16, 32×32 and 64×64 (see L. Zhao, L. Zhang, X. Zhao, S. Ma, D. Zhao, and W. Gao, "Further encoder improvement for intra mode decision" (JCTVC-D283), Proceedings of the 4th JCT-VC meeting, pp. 1-4, Jan. 2011). Rate-distortion optimization (RDO) is then applied (see T. Wiegand, H. Schwarz, A. Joch, F. Kossentini, and G. J. Sullivan, "Rate-constrained coder control and comparison of video coding standards," IEEE Transactions on Circuits and Systems for Video Technology, 2003, 13(7): 688-703), and the mode with the minimum rate-distortion cost among the candidate modes is chosen as the optimal intra prediction mode of the PU. The HEVC intra prediction modes are richer and more varied than those of H.264 and are better suited to coding high-resolution video, but this also increases the computational complexity of HEVC intra coding.
Several fast intra prediction mode selection methods for HEVC already exist. The patent with application number 201210138816.1 reduces the number of candidate prediction modes in the RMD process to 2-5 for PUs of size 4×4, 8×8, 16×16 and 32×32. Qi Meibin et al. proposed a fast HEVC intra mode decision method based on image texture direction and spatial correlation (see Qi Meibin, Zhu Guanghui, Yang Yanfang, Jiang Jianguo, "HEVC intra prediction mode selection using texture and spatial correlation," Journal of Image and Graphics, 2014, 19(8): 1119-1125); the method uses the Sobel operator to obtain the PU texture direction to build a candidate prediction mode list, and adds the optimal intra prediction modes of spatially adjacent PUs to this list. The patent with application number 201410842187.X provides an HEVC intra prediction mode selection acceleration method: if the PU has texture-homogeneity characteristics, the first prediction mode chosen by RMD is directly taken as the optimal intra prediction mode; otherwise the first two prediction modes chosen by RMD are divided into three classes of cases to accelerate mode selection. Different from the above methods that reduce candidate prediction modes, the patent with application number 201410024635.5 proposes an SATD-based fast HEVC intra prediction method that reduces the computational complexity of HEVC intra prediction by stopping CU splitting: a set of adaptive thresholds is computed from the SATD, and if the SATD of the current CU is smaller than the given threshold, CU splitting is terminated. The patent with application number 201310445775.5, on the one hand, decides CU splitting according to texture complexity, and on the other hand deletes from the candidate prediction mode list the prediction modes that are least likely to become the optimal mode, according to the texture characteristics of the PU.
Summary of the invention
In order to effectively reduce the computational complexity of HEVC intra prediction while maintaining the coding rate-distortion performance, the invention provides an HEVC intra-frame prediction mode fast selection method.
The technical solution adopted to solve the above technical problem is as follows:
An HEVC intra-frame prediction mode fast selection method, comprising the following steps:
(1) Input a PU to be estimated and establish the set of actually available intra prediction modes:
According to the spatially adjacent reconstructed pixels that exist around the PU to be estimated and the spatially adjacent reconstructed pixels required by each HEVC intra prediction mode, all actually available intra prediction modes are chosen for the PU to be estimated to form the set Ω; that is, for each HEVC intra prediction mode, if the spatially adjacent reconstructed pixels that the mode needs for intra prediction exist around the PU to be estimated, the mode is added to Ω.
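Purely as an illustration of this step, the following Python sketch builds Ω from neighbour-availability flags; the per-mode reference requirements encoded here are a simplifying assumption for illustration, not the exact rule of the patent or of the HEVC specification:

def build_available_modes(left_ok, above_ok, above_right_ok, below_left_ok):
    """Collect the intra modes whose spatially adjacent reconstructed pixels exist (set Omega).
    Modes: 0 = Planar, 1 = DC, 2..34 = angular. The availability rules below are a
    simplification assumed for illustration only."""
    omega = set()
    if left_ok or above_ok:                  # Planar/DC need at least one reconstructed neighbour
        omega.update([0, 1])
    for mode in range(2, 35):
        if mode <= 9:                        # strongly bottom-left oriented angular modes
            available = left_ok and below_left_ok
        elif mode <= 26:                     # modes between horizontal and vertical
            available = left_ok or above_ok
        else:                                # strongly top-right oriented angular modes
            available = above_ok and above_right_ok
        if available:
            omega.add(mode)
    return omega

# Example: a PU at the top border of the picture has no reconstructed pixels above it.
omega = build_available_modes(left_ok=True, above_ok=False, above_right_ok=False, below_left_ok=True)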
(2) Calculate the sums of absolute differences between all pixels in the PU to be estimated and their spatially adjacent pixels in different directions:
For the 33 angular modes among the HEVC intra prediction modes, the texture direction characteristic of the PU to be estimated is correlated with the angular prediction mode that the PU finally chooses. Therefore, the texture direction characteristic of the PU to be estimated can be determined by computing sums of absolute differences between the pixels in the PU and their spatially adjacent pixels, so as to select the intra prediction mode quickly.
First, when the angular prediction mode numbered 18 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the lower right, the sum of absolute differences SAD_LU between all pixels in the PU to be estimated and their upper-left neighbours is calculated, as shown in formula (1):
SAD_{LU} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y-1) \right| \qquad (1)
In formula (1), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than or equal to N-1 within the PU; the pixel at coordinate (x-1, y-1) lies to the upper left of the pixel at (x, y). The pixel at coordinate (-1, -1) is the upper-left neighbour of the pixel at (0, 0); the pixel at (0, 0) is the upper-left corner of the PU to be estimated, the pixel at (0, N-1) is its lower-left corner, the pixel at (N-1, 0) is its upper-right corner, and the pixel at (N-1, N-1) is its lower-right corner. The pixels at coordinates (x, 0) form the upper boundary of the PU to be estimated, the pixels at (0, y) form its left boundary, the pixels at (N-1, y) form its right boundary, and the pixels at (x, N-1) form its lower boundary.
Similarly, when the angular prediction mode numbered 26 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts in the vertical (downward) direction, the sum of absolute differences SAD_U between all pixels in the PU to be estimated and their upper neighbours is calculated, as shown in formula (2):
SAD_{U} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x,y-1) \right| \qquad (2)
In formula (2), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x, y-1) lies directly above the pixel at (x, y).
When the angular prediction mode numbered 34 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the lower left, the sum of absolute differences SAD_RU between all pixels in the PU to be estimated and their upper-right neighbours is calculated, as shown in formula (3):
SAD_{RU} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x+1,y-1) \right| \qquad (3)
In formula (3), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x+1, y-1) lies to the upper right of the pixel at (x, y).
When the angular prediction mode numbered 10 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts in the horizontal (rightward) direction, the sum of absolute differences SAD_L between all pixels in the PU to be estimated and their left neighbours is calculated, as shown in formula (4):
SAD_{L} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y) \right| \qquad (4)
In formula (4), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x-1, y) lies to the left of the pixel at (x, y).
When the angular prediction mode numbered 2 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the upper right, the sum of absolute differences SAD_LB between all pixels in the PU to be estimated and their lower-left neighbours is calculated, as shown in formula (5):
SAD_{LB} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y+1) \right| \qquad (5)
In formula (5), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x-1, y+1) lies to the lower left of the pixel at (x, y).
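To make formulas (1)-(5) concrete, here is a small NumPy sketch (an assumed helper; it presumes the picture array already holds valid pixel values on the PU's outer border, e.g. the adjacent reconstructed pixels):

import numpy as np

def directional_sads(picture, x0, y0, n):
    """Compute SAD_LU, SAD_U, SAD_RU, SAD_L and SAD_LB of formulas (1)-(5)
    for the n x n PU whose top-left pixel sits at (x0, y0) in `picture`."""
    # Neighbour offsets (dx, dy): upper-left, above, upper-right, left, lower-left.
    offsets = {'LU': (-1, -1), 'U': (0, -1), 'RU': (1, -1), 'L': (-1, 0), 'LB': (-1, 1)}
    cur = picture[y0:y0 + n, x0:x0 + n].astype(np.int64)
    sads = {}
    for name, (dx, dy) in offsets.items():
        nb = picture[y0 + dy:y0 + dy + n, x0 + dx:x0 + dx + n].astype(np.int64)
        sads[name] = int(np.abs(cur - nb).sum())
    return sads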
(3) Judge the texture direction characteristic of the PU to be estimated from the sums of absolute differences with spatially adjacent pixels in different directions:
First, a branch is chosen according to the number of sums of absolute differences (SADs) computed in step (2): if step (2) computed fewer than 3 SADs, go to step (5); otherwise, sort the SADs computed in step (2) in ascending order and let the three smallest be SAD_MIN-0, SAD_MIN-1 and SAD_MIN-2. The texture of the PU to be estimated is then classified according to these three smallest SADs, as shown in formula (6):
Class = \begin{cases} 0, & \text{if } SAD_{MIN-0} > \alpha \times SAD_{MIN-2} \\ 1, & \text{else if } SAD_{MIN-0} < \beta \times SAD_{MIN-1} \\ 2, & \text{else if } SAD_{MIN-1} < \gamma \times SAD_{MIN-2} \\ 3, & \text{otherwise} \end{cases} \qquad (6)
In formula (6), Class denotes the texture class of the PU to be estimated: a value of 0 means the texture of the PU is relatively smooth, 1 means the texture shows an obvious horizontal, vertical or diagonal direction, 2 means the texture lies along some other angular direction, and 3 means the texture is complex. The parameters α, β and γ regulate the relations between the SAD_MIN-i (i = 0, 1, 2); α is set to 0.9~1.0, and β and γ are set to 0.6~1.0.
The texture class Class computed by formula (6) and the SAD relations then yield the texture direction characteristic of the PU to be estimated, as shown in Table 1. In Table 1, the 0-degree direction means the horizontal (rightward) direction, the π/2 direction means the vertical (downward) direction, the π/4 direction means the 45-degree direction toward the lower right, the -π/4 direction means the 45-degree direction toward the upper right, and the 3π/4 direction means the 45-degree direction toward the lower left. When the texture class Class equals 0, the texture direction characteristic of the PU to be estimated is recorded as relatively smooth texture. When Class equals 1, the texture direction characteristic depends on which SAD equals SAD_MIN-0: if it is SAD_LU, SAD_U, SAD_RU, SAD_L or SAD_LB, the PU is recorded as having texture in the π/4 direction, the π/2 direction, the 3π/4 direction, the 0-degree direction or the -π/4 direction, respectively. When Class equals 2, the texture direction characteristic is decided by whether SAD_MIN-0 and SAD_MIN-1 are two SAD values of adjacent directions among SAD_LU, SAD_U, SAD_RU, SAD_L and SAD_LB: (a) if SAD_LU equals SAD_MIN-0 and SAD_U equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_U equals SAD_MIN-0, the texture direction characteristic is recorded as texture in the [π/4, π/2] direction; (b) if SAD_U equals SAD_MIN-0 and SAD_RU equals SAD_MIN-1, or SAD_U equals SAD_MIN-1 and SAD_RU equals SAD_MIN-0, it is recorded as texture in the [π/2, 3π/4] direction; (c) if SAD_LU equals SAD_MIN-0 and SAD_L equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_L equals SAD_MIN-0, it is recorded as texture in the [0, π/4] direction; (d) if SAD_L equals SAD_MIN-0 and SAD_LB equals SAD_MIN-1, or SAD_L equals SAD_MIN-1 and SAD_LB equals SAD_MIN-0, it is recorded as texture in the [-π/4, 0] direction; (e) in all other cases the texture direction characteristic is recorded as complex texture direction. When Class equals 3, the texture direction characteristic of the PU to be estimated is recorded as complex texture direction.
Table 1 Texture direction characteristic of the PU to be estimated
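The classification of formula (6) and the direction mapping described above can be sketched as follows (Python; the default parameter values follow the embodiment, α = 0.95 and β = γ = 0.9, and the direction labels are illustrative names):

def classify_texture(sads, alpha=0.95, beta=0.9, gamma=0.9):
    """Return (Class, direction label) for the PU, following formula (6) and the
    textual description of Table 1. `sads` is the dict from directional_sads()."""
    if len(sads) < 3:
        return None, None                       # fewer than 3 SADs: jump to step (5)
    ordered = sorted(sads.items(), key=lambda kv: kv[1])
    (k0, s0), (k1, s1), (k2, s2) = ordered[:3]  # the three smallest SADs
    if s0 > alpha * s2:
        cls = 0                                 # relatively smooth texture
    elif s0 < beta * s1:
        cls = 1                                 # one clearly dominant direction
    elif s1 < gamma * s2:
        cls = 2                                 # texture between two adjacent directions
    else:
        cls = 3                                 # complex texture
    single = {'LU': 'pi/4', 'U': 'pi/2', 'RU': '3pi/4', 'L': '0', 'LB': '-pi/4'}
    pairs = {frozenset(('LU', 'U')): '[pi/4, pi/2]', frozenset(('U', 'RU')): '[pi/2, 3pi/4]',
             frozenset(('LU', 'L')): '[0, pi/4]', frozenset(('L', 'LB')): '[-pi/4, 0]'}
    if cls == 0:
        return cls, 'smooth'
    if cls == 1:
        return cls, single[k0]
    if cls == 2:
        return cls, pairs.get(frozenset((k0, k1)), 'complex')
    return cls, 'complex'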
(4) Determine the rough-stage mode search range according to the texture direction characteristic:
According to the texture direction characteristic of the PU to be estimated, the kinds of candidate prediction modes are reduced; the prediction modes after this adjustment form the rough-stage mode search range S, where the prediction modes in S are arranged according to the texture direction characteristic of the PU to be estimated, as shown in Table 2 below:
Table 2 Prediction modes in S
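Table 2 itself is not reproduced on this page, so the snippet below is purely hypothetical: it only illustrates the kind of reduced search range S that step (4) describes, by taking a window of angular modes around the anchor mode of the detected direction plus Planar and DC; it is not the patent's actual Table 2.

def rough_search_range(label, half_window=4):
    """Hypothetical stand-in for Table 2 (not the patent's actual table): map a
    texture-direction label to a reduced rough-stage mode search range S."""
    anchors = {'0': 10, 'pi/2': 26, 'pi/4': 18, '-pi/4': 2, '3pi/4': 34,
               '[0, pi/4]': 14, '[pi/4, pi/2]': 22, '[pi/2, 3pi/4]': 30, '[-pi/4, 0]': 6}
    if label == 'smooth':
        return {0, 1}                                   # Planar and DC only
    if label not in anchors:                            # complex texture: keep every mode
        return set(range(35))
    centre = anchors[label]
    angular = {m for m in range(centre - half_window, centre + half_window + 1) if 2 <= m <= 34}
    return {0, 1} | angular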
(5) Establish the rate-distortion optimization candidate mode set according to the rough-stage search range and Ω:
The prediction mode search range obtained from step (4) may still contain a fairly large number of prediction modes in some cases, so the prediction modes need to be screened further before step (6) selects the optimal prediction mode with the rate-distortion optimization technique.
First, the SATD-cost mode search range Ψ is determined: if execution reached the current step from step (4), Ψ is the intersection of the rough-stage mode search range S obtained in step (4) and the set Ω obtained in step (1); if execution reached the current step directly from step (3), Ω is assigned to Ψ.
Then the HEVC intra prediction of each prediction mode in Ψ is computed, and the SATD cost J of the prediction residual is calculated, as shown in formula (7):
J = SATD + λ × R    (7)
where J denotes the cost, SATD denotes the sum of absolute values of the Hadamard-transformed residual signal, λ denotes the Lagrange multiplier, and R denotes the number of bits required to code the selected mode.
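For illustration, a Hadamard-based SATD can be computed as in the sketch below; the block size handling and the absence of any normalisation factor are assumptions, since reference encoders apply their own conventions:

import numpy as np

def hadamard(n):
    # n x n Hadamard matrix for n a power of two (Sylvester construction).
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

def satd(residual):
    # Sum of absolute values of the Hadamard-transformed residual block (no normalisation).
    n = residual.shape[0]
    h = hadamard(n)
    return int(np.abs(h @ residual @ h.T).sum())

def satd_cost(residual, mode_bits, lam):
    # Rough-stage cost of formula (7): J = SATD + lambda * R.
    return satd(residual) + lam * mode_bits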
The prediction modes are then sorted in ascending order of SATD cost J, and the rate-distortion optimization candidate mode set Φ is established from the sorted prediction modes: if the mode ranked 1st is the DC or Planar mode, only the first-ranked mode is added to Φ; if the mode ranked 1st is an angular mode and the mode ranked 2nd is the DC or Planar mode, only the first two ranked modes are added to Φ; if the first two ranked modes are adjacent angular modes, only the first two ranked modes are added to Φ; if the first two ranked modes are non-adjacent angular modes, the first two ranked modes are added to Φ and then the angular modes adjacent to these two modes are also added to Φ; in all other cases, the first three ranked modes are added to Φ for PUs of size 16×16, 32×32 and 64×64, and the first eight ranked modes are added to Φ for PUs of size 4×4 and 8×8.
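These rules for assembling Φ can be written down directly; in the sketch below, `costs` is assumed to be a list of (mode, J) pairs for the modes in Ψ, with HEVC mode numbering (0 = Planar, 1 = DC, 2-34 angular):

def build_rdo_candidates(costs, pu_size):
    """Build the RDO candidate set Phi from SATD costs, following the rules of step (5)."""
    ranked = [m for m, _ in sorted(costs, key=lambda mc: mc[1])]
    first = ranked[0]
    second = ranked[1] if len(ranked) > 1 else None

    def is_angular(m):
        return m is not None and m >= 2

    if first in (0, 1):                                   # best mode is Planar or DC
        return {first}
    if is_angular(first) and second in (0, 1):            # angular best, Planar/DC second
        return {first, second}
    if is_angular(first) and is_angular(second):
        if abs(first - second) == 1:                      # two adjacent angular modes
            return {first, second}
        phi = {first, second}                             # non-adjacent: add their neighbours too
        for m in (first, second):
            phi.update(n for n in (m - 1, m + 1) if 2 <= n <= 34)
        return phi
    top = 8 if pu_size in (4, 8) else 3                   # fallback: top-8 (4x4, 8x8) or top-3
    return set(ranked[:top])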
(6) Choose the optimal intra prediction mode:
The rate-distortion optimization technique is applied to the candidate mode set Φ obtained in step (5), and the candidate mode with the minimum rate-distortion cost is chosen as the optimal intra prediction mode of the PU to be estimated, completing the intra prediction mode selection of the PU to be estimated.
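Step (6) then reduces to taking the minimum full rate-distortion cost over Φ; in the one-line sketch below, `rd_cost` stands in for the encoder's actual RDO evaluation (transform, quantisation and entropy coding are not shown):

def select_best_mode(phi, rd_cost):
    # rd_cost(mode) is assumed to return the full rate-distortion cost D + lambda * R.
    return min(phi, key=rd_cost)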
The technical concept of the invention is as follows: first, the SADs between all pixels in the PU and their spatially adjacent pixels in different directions are computed, and the texture characteristics of the PU are derived to determine the texture direction feature; then, according to the texture direction feature, the rough-stage mode search range is established, reducing the number of candidate modes that undergo the rough-stage mode search; finally, the rate-distortion optimization candidate mode set is established, further reducing the number of candidate modes that finally undergo rate-distortion optimization.
Compared with the prior art, the present invention has the following beneficial effects:
The present invention proposes an HEVC intra-frame prediction mode fast selection method. Compared with the prior art, the method has the following features and advantages: first, the texture direction feature of the PU is classified by comparing the sums of absolute differences of the original pixels inside the PU along different directions; then the prediction mode search range is reduced according to the texture direction feature; finally, the number of candidate modes that undergo rate-distortion optimization is reduced according to the ordering of the prediction modes by SATD cost. While maintaining good coding rate-distortion performance, the invention can significantly reduce the computational complexity of intra prediction mode selection in HEVC intra-coded and inter-coded frames.
Brief description of the drawings
Fig. 1 is the basic flowchart of the method of the invention.
Fig. 2 shows the 33 angular intra prediction modes of HEVC.
Embodiment
The present invention is described in detail below with reference to the embodiment and the accompanying drawings, but the present invention is not limited thereto.
As shown in Fig. 1, an HEVC intra-frame prediction mode fast selection method comprises the following steps:
(1) Input a PU to be estimated and establish the set of actually available intra prediction modes;
(2) Calculate the sums of absolute differences between all pixels in the PU to be estimated and their spatially adjacent pixels in different directions;
(3) Judge the texture direction characteristic of the PU to be estimated from the sums of absolute differences with spatially adjacent pixels in different directions;
(4) Determine the rough-stage mode search range according to the texture direction characteristic;
(5) Establish the rate-distortion optimization candidate mode set according to the rough-stage mode search range and the set of actually available intra prediction modes;
(6) Choose the optimal intra prediction mode.
Step (1) specifically comprises:
According to the spatially adjacent reconstructed pixels that exist around the PU to be estimated and the spatially adjacent reconstructed pixels required by each HEVC intra prediction mode, all actually available intra prediction modes are chosen for the PU to be estimated to form the set Ω; that is, for each HEVC intra prediction mode, if the spatially adjacent reconstructed pixels that the mode needs for intra prediction exist around the PU to be estimated, the mode is added to Ω.
Step (2) specifically comprises:
First, when the angular prediction mode numbered 18 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the lower right, the sum of absolute differences SAD_LU between all pixels in the PU to be estimated and their upper-left neighbours is calculated, as shown in formula (1):
SAD_{LU} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y-1) \right| \qquad (1)
In formula (1), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than or equal to N-1 within the PU; the pixel at coordinate (x-1, y-1) lies to the upper left of the pixel at (x, y). The pixel at coordinate (-1, -1) is the upper-left neighbour of the pixel at (0, 0); the pixel at (0, 0) is the upper-left corner of the PU to be estimated, the pixel at (0, N-1) is its lower-left corner, the pixel at (N-1, 0) is its upper-right corner, and the pixel at (N-1, N-1) is its lower-right corner. The prediction directions of the differently numbered angular modes in the HEVC standard are shown in Fig. 2.
Similarly, when the angular prediction mode numbered 26 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts in the vertical (downward) direction, the sum of absolute differences SAD_U between all pixels in the PU to be estimated and their upper neighbours is calculated, as shown in formula (2):
SAD_{U} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x,y-1) \right| \qquad (2)
In formula (2), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x, y-1) lies directly above the pixel at (x, y).
When the angular prediction mode numbered 34 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the lower left, the sum of absolute differences SAD_RU between all pixels in the PU to be estimated and their upper-right neighbours is calculated, as shown in formula (3):
SAD_{RU} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x+1,y-1) \right| \qquad (3)
In formula (3), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x+1, y-1) lies to the upper right of the pixel at (x, y).
When the angular prediction mode numbered 10 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts in the horizontal (rightward) direction, the sum of absolute differences SAD_L between all pixels in the PU to be estimated and their left neighbours is calculated, as shown in formula (4):
SAD_{L} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y) \right| \qquad (4)
In formula (4), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x-1, y) lies to the left of the pixel at (x, y).
When the angular prediction mode numbered 2 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the upper right, the sum of absolute differences SAD_LB between all pixels in the PU to be estimated and their lower-left neighbours is calculated, as shown in formula (5):
SAD_{LB} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y+1) \right| \qquad (5)
In formula (5), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64); p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, where x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU; the pixel at coordinate (x-1, y+1) lies to the lower left of the pixel at (x, y).
Step (3) specifically comprises:
First, a branch is chosen according to the number of sums of absolute differences (SADs) computed in step (2): if step (2) computed fewer than 3 SADs, go to step (5); otherwise, sort the SADs computed in step (2) in ascending order and let the three smallest be SAD_MIN-0, SAD_MIN-1 and SAD_MIN-2. The texture of the PU to be estimated is then classified according to these three smallest SADs, as shown in formula (6):
Class = \begin{cases} 0, & \text{if } SAD_{MIN-0} > \alpha \times SAD_{MIN-2} \\ 1, & \text{else if } SAD_{MIN-0} < \beta \times SAD_{MIN-1} \\ 2, & \text{else if } SAD_{MIN-1} < \gamma \times SAD_{MIN-2} \\ 3, & \text{otherwise} \end{cases} \qquad (6)
In formula (6), Class denotes the texture class of the PU to be estimated: a value of 0 means the texture of the PU is relatively smooth, 1 means the texture shows an obvious horizontal, vertical or diagonal direction, 2 means the texture lies along some other angular direction, and 3 means the texture is complex. The parameters α, β and γ regulate the relations between the SAD_MIN-i (i = 0, 1, 2); α is set to 0.9~1.0 and β and γ to 0.6~1.0, and in this embodiment α is set to 0.95 and β and γ are set to 0.9.
The texture class Class computed by formula (6) and the SAD relations then yield the texture direction characteristic of the PU to be estimated, as shown in Table 1. In Table 1, the 0-degree direction means the horizontal (rightward) direction, the π/2 direction means the vertical (downward) direction, the π/4 direction means the 45-degree direction toward the lower right, the -π/4 direction means the 45-degree direction toward the upper right, and the 3π/4 direction means the 45-degree direction toward the lower left. When the texture class Class equals 2, the texture direction characteristic is decided by whether SAD_MIN-0 and SAD_MIN-1 are two SAD values of adjacent directions among SAD_LU, SAD_U, SAD_RU, SAD_L and SAD_LB: (a) if SAD_LU equals SAD_MIN-0 and SAD_U equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_U equals SAD_MIN-0, the texture direction characteristic is recorded as texture in the [π/4, π/2] direction; (b) if SAD_U equals SAD_MIN-0 and SAD_RU equals SAD_MIN-1, or SAD_U equals SAD_MIN-1 and SAD_RU equals SAD_MIN-0, it is recorded as texture in the [π/2, 3π/4] direction; (c) if SAD_LU equals SAD_MIN-0 and SAD_L equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_L equals SAD_MIN-0, it is recorded as texture in the [0, π/4] direction; (d) if SAD_L equals SAD_MIN-0 and SAD_LB equals SAD_MIN-1, or SAD_L equals SAD_MIN-1 and SAD_LB equals SAD_MIN-0, it is recorded as texture in the [-π/4, 0] direction; (e) in all other cases the texture direction characteristic is recorded as complex texture direction. When Class equals 3, the texture direction characteristic of the PU to be estimated is recorded as complex texture direction.
Table 1 Texture direction characteristic of the PU to be estimated
Step (4) specifically comprises:
According to the texture direction characteristic of the PU to be estimated, the kinds of candidate prediction modes are reduced; the prediction modes after this adjustment form the rough-stage mode search range S, where the prediction modes in S are arranged according to the texture direction characteristic of the PU to be estimated, as shown in Table 2 below:
Table 2 Prediction modes in S
Step (5) specifically comprises:
First, the SATD-cost mode search range Ψ is determined: if execution reached the current step from step (4), Ψ is the intersection of the rough-stage mode search range S obtained in step (4) and the set Ω obtained in step (1); if execution reached the current step directly from step (3), Ω is assigned to Ψ.
Then the HEVC intra prediction of each prediction mode in Ψ is computed, and the SATD cost of the prediction residual is calculated, as shown in formula (7):
J = SATD + λ × R    (7)
where J denotes the cost, SATD denotes the sum of absolute values of the Hadamard-transformed residual signal, λ denotes the Lagrange multiplier, and R denotes the number of bits required to code the selected mode.
The prediction modes are then sorted in ascending order of SATD cost J, and the rate-distortion optimization candidate mode set Φ is established from the sorted prediction modes: if the mode ranked 1st is the DC or Planar mode, only the first-ranked mode is added to Φ; if the mode ranked 1st is an angular mode and the mode ranked 2nd is the DC or Planar mode, only the first two ranked modes are added to Φ; if the first two ranked modes are adjacent angular modes, only the first two ranked modes are added to Φ; if the first two ranked modes are non-adjacent angular modes, the first two ranked modes are added to Φ and then the angular modes adjacent to these two modes are also added to Φ; in all other cases, the first three ranked modes are added to Φ for PUs of size 16×16, 32×32 and 64×64, and the first eight ranked modes are added to Φ for PUs of size 4×4 and 8×8.
Step (6) specifically comprises:
The rate-distortion optimization technique is applied to the candidate mode set Φ obtained in step (5), and the candidate mode with the minimum rate-distortion cost is chosen as the optimal intra prediction mode of the PU to be estimated, completing the intra prediction mode selection of the PU to be estimated.

Claims (4)

1. An HEVC intra-frame prediction mode fast selection method, characterized in that the selection method comprises the following steps:
(1) input a PU to be estimated and establish the set of actually available intra prediction modes:
choosing, according to the spatially adjacent reconstructed pixels existing around the PU to be estimated and the spatially adjacent reconstructed pixels required by each HEVC intra prediction mode, all actually available intra prediction modes for the PU to be estimated to form the set Ω;
(2) calculate the sums of absolute differences between all pixels in the PU to be estimated and their spatially adjacent pixels in different directions:
when the angular prediction mode numbered 18 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the lower right, calculating the sum of absolute differences SAD_LU between all pixels in the PU to be estimated and their upper-left neighbours, as shown in formula (1):
SAD_{LU} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y-1) \right| \qquad (1)
where, in formula (1), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64), p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU, and the pixel at coordinate (x-1, y-1) lies to the upper left of the pixel at (x, y);
when the angular prediction mode numbered 26 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts in the vertical (downward) direction, calculating the sum of absolute differences SAD_U between all pixels in the PU to be estimated and their upper neighbours, as shown in formula (2):
SAD_{U} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x,y-1) \right| \qquad (2)
where, in formula (2), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64), p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU, and the pixel at coordinate (x, y-1) lies directly above the pixel at (x, y);
when the angular prediction mode numbered 34 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the lower left, calculating the sum of absolute differences SAD_RU between all pixels in the PU to be estimated and their upper-right neighbours, as shown in formula (3):
SAD_{RU} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x+1,y-1) \right| \qquad (3)
where, in formula (3), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64), p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU, and the pixel at coordinate (x+1, y-1) lies to the upper right of the pixel at (x, y);
when the angular prediction mode numbered 10 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts in the horizontal (rightward) direction, calculating the sum of absolute differences SAD_L between all pixels in the PU to be estimated and their left neighbours, as shown in formula (4):
SAD_{L} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y) \right| \qquad (4)
where, in formula (4), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64), p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU, and the pixel at coordinate (x-1, y) lies to the left of the pixel at (x, y);
when the angular prediction mode numbered 2 exists in Ω, i.e. the PU to be estimated may use the angular mode that predicts diagonally toward the upper right, calculating the sum of absolute differences SAD_LB between all pixels in the PU to be estimated and their lower-left neighbours, as shown in formula (5):
SAD_{LB} = \sum_{x=0}^{N-1} \sum_{y=0}^{N-1} \left| p(x,y) - p(x-1,y+1) \right| \qquad (5)
where, in formula (5), the PU to be estimated is of size N×N (N = 4, 8, 16, 32, 64), p(x, y) is the pixel value at coordinate (x, y) in the PU to be estimated, x is the horizontal coordinate and y the vertical coordinate, both taking integer values greater than or equal to 0 and less than N within the PU, and the pixel at coordinate (x-1, y+1) lies to the lower left of the pixel at (x, y);
(3) judge the texture direction characteristic of the PU to be estimated from the sums of absolute differences with spatially adjacent pixels in different directions:
first choosing a branch according to the number of sums of absolute differences SAD computed in step (2): if step (2) computed fewer than 3 SADs, performing step (5); otherwise, first sorting the SADs computed in step (2) in ascending order and classifying the texture direction characteristic of the PU to be estimated;
(4) determine the rough-stage mode search range according to the texture direction characteristic;
(5) establish the rate-distortion optimization candidate mode set according to the rough-stage mode search range and Ω;
(6) choose the optimal intra prediction mode:
applying the rate-distortion optimization technique to the candidate mode set obtained in step (5) and choosing the candidate mode with the minimum rate-distortion cost as the optimal intra prediction mode of the PU to be estimated, thereby completing the intra prediction mode selection of the PU to be estimated.
2. The HEVC intra-frame prediction mode fast selection method according to claim 1, characterized in that, in step (3), if the three smallest SADs are, in ascending order, SAD_MIN-0, SAD_MIN-1 and SAD_MIN-2, the texture of the PU to be estimated is classified according to these three smallest SADs, as shown in formula (6):
Class = \begin{cases} 0, & \text{if } SAD_{MIN-0} > \alpha \times SAD_{MIN-2} \\ 1, & \text{else if } SAD_{MIN-0} < \beta \times SAD_{MIN-1} \\ 2, & \text{else if } SAD_{MIN-1} < \gamma \times SAD_{MIN-2} \\ 3, & \text{otherwise} \end{cases} \qquad (6)
in formula (6), Class denotes the texture class of the PU to be estimated: a value of 0 means the texture of the PU is relatively smooth, 1 means the texture shows an obvious horizontal, vertical or diagonal direction, 2 means the texture lies along some other angular direction, and 3 means the texture is complex; the parameters α, β and γ regulate the relations between the SAD_MIN-i (i = 0, 1, 2), with α set to 0.9~1.0 and β and γ set to 0.6~1.0;
the texture class Class computed by formula (6) and the SAD relations then yield the texture direction characteristic of the PU to be estimated, as shown in Table 1, where the 0-degree direction means the horizontal (rightward) direction, the π/2 direction means the vertical (downward) direction, the π/4 direction means the 45-degree direction toward the lower right, the -π/4 direction means the 45-degree direction toward the upper right, and the 3π/4 direction means the 45-degree direction toward the lower left.
Table 1 Texture direction characteristic of the PU to be estimated
3. The HEVC intra-frame prediction mode fast selection method according to claim 1, characterized in that, in step (4), according to the texture direction characteristic of the PU to be estimated obtained in step (3), the kinds of candidate prediction modes are reduced, the prediction modes after adjustment form the rough-stage mode search range S, and the prediction modes in S are arranged according to the texture direction characteristic of the PU to be estimated, as shown in Table 2 below.
Table 2 Prediction modes in S
4. The HEVC intra-frame prediction mode fast selection method according to claim 1, characterized in that, in step (5), first the SATD-cost mode search range Ψ is determined: if execution reached the current step from step (4), Ψ is the intersection of the rough-stage mode search range S obtained in step (4) and the set Ω obtained in step (1); if execution reached the current step directly from step (3), Ω is assigned to Ψ; then the HEVC intra prediction of each prediction mode in Ψ is computed and the SATD cost of the prediction residual is calculated; then the prediction modes are sorted in ascending order of SATD cost J, and the rate-distortion optimization candidate mode set Φ is established from the sorted prediction modes: when the mode ranked 1st is the DC or Planar mode, only the first-ranked mode is added to Φ; when the mode ranked 1st is an angular mode and the mode ranked 2nd is the DC or Planar mode, only the first two ranked modes are added to Φ; when the first two ranked modes are adjacent angular modes, only the first two ranked modes are added to Φ; when the first two ranked modes are non-adjacent angular modes, the first two ranked modes are added to Φ and then the angular modes adjacent to these two modes are added to Φ; in other cases, for PUs to be estimated of size 16×16, 32×32 and 64×64, the first three ranked modes are added to Φ, and for PUs of size 4×4 and 8×8, the first eight ranked modes are added to Φ.
CN201510675511.8A 2015-10-16 2015-10-16 HEVC intra-frame prediction mode fast selection method Active CN105208387B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510675511.8A CN105208387B (en) 2015-10-16 2015-10-16 HEVC intra-frame prediction mode fast selection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510675511.8A CN105208387B (en) 2015-10-16 2015-10-16 HEVC intra-frame prediction mode fast selection method

Publications (2)

Publication Number Publication Date
CN105208387A true CN105208387A (en) 2015-12-30
CN105208387B CN105208387B (en) 2018-03-13

Family

ID=54955775

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510675511.8A Active CN105208387B (en) 2015-10-16 2015-10-16 HEVC intra-frame prediction mode fast selection method

Country Status (1)

Country Link
CN (1) CN105208387B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105812825A (en) * 2016-05-10 2016-07-27 中山大学 Image coding method based on grouping
CN106331726A (en) * 2016-09-23 2017-01-11 合网络技术(北京)有限公司 HEVC(High Efficiency Video Coding)-based intra-frame prediction decoding method and apparatus
CN109361922A (en) * 2018-10-26 2019-02-19 西安科锐盛创新科技有限公司 Predict quantization coding method
CN109413435A (en) * 2018-10-26 2019-03-01 西安科锐盛创新科技有限公司 A kind of prediction technique based on video compress
CN109510996A (en) * 2018-10-26 2019-03-22 西安科锐盛创新科技有限公司 Rear selection prediction technique in bandwidth reduction
CN109618162A (en) * 2018-10-26 2019-04-12 西安科锐盛创新科技有限公司 Rear selection prediction technique in bandwidth reduction
CN109618169A (en) * 2018-12-25 2019-04-12 中山大学 For decision-making technique, device and storage medium in the frame of HEVC
CN109640092A (en) * 2018-10-26 2019-04-16 西安科锐盛创新科技有限公司 Rear selection prediction technique in bandwidth reduction
CN109660793A (en) * 2018-10-26 2019-04-19 西安科锐盛创新科技有限公司 Prediction technique for bandwidth reduction
CN110213576A (en) * 2018-05-03 2019-09-06 腾讯科技(深圳)有限公司 Method for video coding, video coding apparatus, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102665079A (en) * 2012-05-08 2012-09-12 北方工业大学 Adaptive fast intra prediction mode decision for high efficiency video coding (HEVC)
CN103517069A (en) * 2013-09-25 2014-01-15 北京航空航天大学 HEVC intra-frame prediction quick mode selection method based on texture analysis
CN103763570A (en) * 2014-01-20 2014-04-30 华侨大学 Rapid HEVC intra-frame prediction method based on SATD
CN104581152A (en) * 2014-12-25 2015-04-29 同济大学 HEVC intra-frame prediction mode decision accelerating method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102665079A (en) * 2012-05-08 2012-09-12 北方工业大学 Adaptive fast intra prediction mode decision for high efficiency video coding (HEVC)
CN103517069A (en) * 2013-09-25 2014-01-15 北京航空航天大学 HEVC intra-frame prediction quick mode selection method based on texture analysis
CN103763570A (en) * 2014-01-20 2014-04-30 华侨大学 Rapid HEVC intra-frame prediction method based on SATD
CN104581152A (en) * 2014-12-25 2015-04-29 同济大学 HEVC intra-frame prediction mode decision accelerating method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MENGMENG ZHANG等: "An adaptive fast intra mode decision in HEVC", 《IMAGE PROCESSING (ICIP), 2012 19TH IEEE INTERNATIONAL CONFERENCE ON》 *
Qi Meibin: "HEVC intra prediction mode selection using texture and spatial correlation", Journal of Image and Graphics *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105812825A (en) * 2016-05-10 2016-07-27 中山大学 Image coding method based on grouping
CN105812825B (en) * 2016-05-10 2019-02-26 中山大学 A kind of packet-based image encoding method
CN106331726A (en) * 2016-09-23 2017-01-11 合网络技术(北京)有限公司 HEVC(High Efficiency Video Coding)-based intra-frame prediction decoding method and apparatus
CN110213576B (en) * 2018-05-03 2023-02-28 腾讯科技(深圳)有限公司 Video encoding method, video encoding device, electronic device, and storage medium
CN110213576A (en) * 2018-05-03 2019-09-06 腾讯科技(深圳)有限公司 Method for video coding, video coding apparatus, electronic equipment and storage medium
CN109640092A (en) * 2018-10-26 2019-04-16 西安科锐盛创新科技有限公司 Rear selection prediction technique in bandwidth reduction
CN109618162A (en) * 2018-10-26 2019-04-12 西安科锐盛创新科技有限公司 Rear selection prediction technique in bandwidth reduction
CN109510996A (en) * 2018-10-26 2019-03-22 西安科锐盛创新科技有限公司 Rear selection prediction technique in bandwidth reduction
CN109660793A (en) * 2018-10-26 2019-04-19 西安科锐盛创新科技有限公司 Prediction technique for bandwidth reduction
CN109413435A (en) * 2018-10-26 2019-03-01 西安科锐盛创新科技有限公司 A kind of prediction technique based on video compress
CN109413435B (en) * 2018-10-26 2020-10-16 苏州市吴越智博大数据科技有限公司 Prediction method based on video compression
CN109361922B (en) * 2018-10-26 2020-10-30 西安科锐盛创新科技有限公司 Predictive quantization coding method
CN109660793B (en) * 2018-10-26 2021-03-16 西安科锐盛创新科技有限公司 Prediction method for bandwidth compression
CN109361922A (en) * 2018-10-26 2019-02-19 西安科锐盛创新科技有限公司 Predict quantization coding method
CN109618169A (en) * 2018-12-25 2019-04-12 中山大学 For decision-making technique, device and storage medium in the frame of HEVC
CN109618169B (en) * 2018-12-25 2023-10-27 中山大学 Intra-frame decision method, device and storage medium for HEVC

Also Published As

Publication number Publication date
CN105208387B (en) 2018-03-13

Similar Documents

Publication Publication Date Title
CN105208387A (en) HEVC intra-frame prediction mode fast selection method
CN105120292B (en) A kind of coding intra-frame prediction method based on image texture characteristic
CN103517069B (en) A kind of HEVC intra-frame prediction quick mode selection method based on texture analysis
CN101964906B (en) Rapid intra-frame prediction method and device based on texture characteristics
CN104754357B (en) Intraframe coding optimization method and device based on convolutional neural networks
CN103997646B (en) Fast intra-mode prediction mode selecting method in a kind of HD video coding
CN107277509B (en) A kind of fast intra-frame predicting method based on screen content
CN105791826B (en) A kind of HEVC interframe fast schema selection method based on data mining
CN106534846B (en) A kind of screen content and natural contents divide and fast encoding method
CN106507116B (en) A kind of 3D-HEVC coding method predicted based on 3D conspicuousness information and View Synthesis
CN103118262B (en) Rate distortion optimization method and device, and video coding method and system
CN104902271A (en) Prediction mode selection method and device
CN104811729B (en) A kind of video multi-reference frame coding method
CN101820546A (en) Intra-frame prediction method
CN104284186A (en) Fast algorithm suitable for HEVC standard intra-frame prediction mode judgment process
CN106688238A (en) Improved reference pixel selection and filtering for intra coding of depth map
CN109587503A (en) A kind of 3D-HEVC depth map intra-frame encoding mode high-speed decision method based on edge detection
CN110351552B (en) Fast coding method in video coding
CN105187826A (en) Rapid intra-frame mode decision method specific to high efficiency video coding standard
CN104883566B (en) The fast algorithm that a kind of intra prediction block size suitable for HEVC standard divides
CN109151467B (en) Screen content coding inter-frame mode rapid selection method based on image block activity
CN103702131B (en) Pattern-preprocessing-based intraframe coding optimization method and system
CN104333755B (en) The CU based on SKIP/Merge RD Cost of B frames shifts to an earlier date terminating method in HEVC
CN107454425B (en) A kind of SCC intraframe coding unit candidate modes reduction method
Xue et al. Fast coding unit decision for intra screen content coding based on ensemble learning

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant