CN110519604A - Escape pixel decoding method in index map coding - Google Patents


Info

Publication number
CN110519604A
CN110519604A · CN201910496007.XA · CN201910496007A · CN 110519604 A
Authority
CN
China
Prior art keywords
palette
pixel
decoding
index
encoding
Prior art date
Legal status
Granted
Application number
CN201910496007.XA
Other languages
Chinese (zh)
Other versions
CN110519604B (en)
Inventor
庄子德
陈庆晔
孙域晨
夜静
刘杉
许晓中
金廷宣
Current Assignee
HFI Innovation Inc
Original Assignee
HFI Innovation Inc
Priority date
Filing date
Publication date
Application filed by HFI Innovation Inc filed Critical HFI Innovation Inc
Priority claimed from CN201580061695.7A external-priority patent/CN107005717B/en
Publication of CN110519604A publication Critical patent/CN110519604A/en
Application granted granted Critical
Publication of CN110519604B publication Critical patent/CN110519604B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103Selection of coding mode or of prediction mode
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146Data rate or code amount at the encoder output
    • H04N19/147Data rate or code amount at the encoder output according to rate distortion criteria
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/157Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • H04N19/159Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/176Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N19/423Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation characterised by memory arrangements
    • H04N19/426Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation characterised by memory arrangements using memory downsizing methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process
    • H04N19/463Embedding additional information in the video signal during the compression process by compressing encoding parameters before transmission
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/593Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/90Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using coding techniques not provided for in groups H04N19/10-H04N19/85, e.g. fractals
    • H04N19/93Run-length coding

Abstract

Disclosed is a video coding and decoding method that reduces implementation cost by reusing the transform coefficient buffer for palette coding. If the current prediction mode is an intra prediction mode or an inter prediction mode, information related to the transform coefficients of the prediction residual of the current block, generated by intra prediction or inter prediction, is stored in the transform coefficient buffer. If the current prediction mode is a palette coding mode, information related to the palette data associated with the current block is stored in the transform coefficient buffer instead. The current block is then encoded or decoded based on the information related to the transform coefficients when it is coded in an intra prediction mode or inter prediction mode, or based on the information related to the palette data stored in the transform coefficient buffer when the current prediction mode is a palette coding mode.

Description

Escape pixel decoding method in index map coding
[Cross reference]
This application claims priority to: U.S. Provisional Application No. 62/078,595, filed November 12, 2014; U.S. Provisional Application No. 62/087,454, filed December 4, 2014; U.S. Provisional Application No. 62/119,950, filed February 24, 2015; U.S. Provisional Application No. 62/145,578, filed April 10, 2015; U.S. Provisional Application No. 62/162,313, filed May 15, 2015; and U.S. Provisional Application No. 62/170,828, filed June 4, 2015. The contents of the above provisional applications are incorporated herein by reference.
[technical field]
The present invention relates to palette coding of video data. In particular, the present invention relates to various techniques that conserve system memory or increase system throughput by reusing the transform coefficient buffer, grouping escape values, initializing the palette predictor, and defining palette predictor entry semantics and palette entry semantics.
[background technique]
High Efficiency Video Coding (HEVC) is a new coding standard developed in recent years. In an HEVC system, the fixed-size macroblock of H.264/AVC is replaced by a flexible block called a coding unit (CU). Pixels in a CU share the same coding parameters to improve coding efficiency. A CU can start from a largest CU (LCU), which is also called a coding tree unit (CTU) in HEVC. In addition to the concept of the coding unit, HEVC also introduces the concept of the prediction unit (PU). Once the splitting of the CU hierarchical tree is done, each leaf CU is further split into one or more prediction units (PUs) according to the prediction type and PU partition. Several coding tools have been developed for screen content coding. Those tools relevant to the present invention are briefly reviewed as follows.
Palette coding
During the development of HEVC screen content coding (SCC), several proposals have been disclosed to address palette-based coding. For example, JCTVC-N0247 (Guo et al., "RCE3: Results of Test 3.1 on Palette Mode for Screen Content Coding", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 14th Meeting: Vienna, AT, 25 July - 2 August 2013, Document: JCTVC-N0247) and JCTVC-O0218 (Guo et al., "Evaluation of Palette Mode Coding on HM-12.0+RExt-4.1", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 15th Meeting: Geneva, CH, 23 October - 1 November 2013, Document: JCTVC-O0218) disclose palette prediction and sharing techniques. In JCTVC-N0247 and JCTVC-O0218, the palette of each color component is constructed and transmitted. A palette can be predicted (or shared) from its left neighboring CU to reduce the bit rate. All pixels within the given block are then coded using their palette indices. An example of the encoding process according to JCTVC-N0247 is as follows.
1. Transmission of the palette: the size of the color index table (also called the palette table) is transmitted first, followed by the palette elements (i.e., the color values).
2. Transmission of pixel palette index values (an index points to a color in the palette): the index values of the pixels in the CU are encoded in raster scan order. For each position, a flag is first transmitted to indicate whether the "run mode" or the "copy above" run mode is used.
2.A. "Run mode": in "run mode", a palette index is signaled first, followed by "palette_run" (e.g., M). No further information needs to be transmitted for the current position and the following M positions, since they all have the same palette index as the signaled one. The palette index (e.g., i) is shared by all three color components, which means that the reconstructed pixel values are (Y, U, V) = (paletteY[i], paletteU[i], paletteV[i]) (assuming the color space is YUV).
2.B. "Copy above" run mode: in the "copy above" run mode, a value "copy_run" (e.g., N) is transmitted to indicate that for the following N positions (including the current one), the palette index is equal to the palette index at the same position in the row above.
3. Transmission of residuals: the palette indices transmitted in stage 2 are converted back to pixel values and used as the prediction. Residual information is transmitted using HEVC residual coding and is added to the prediction for reconstruction.
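The two run modes above can be sketched as a small decoder loop. This is an illustrative simplification only: it uses plain raster order rather than the traverse scan introduced later, and an ad-hoc tuple representation for the parsed syntax; it is not the normative decoding process.

```python
def decode_index_runs(ops, width):
    """Expand parsed run operations into a flat raster-order index map.

    ops: list of ("index", palette_index, run) for "run mode" or
         ("copy_above", run) for "copy above" run mode.
    A signaled run of M covers M + 1 positions, mirroring the
    palette_run / copy_run semantics described in the text.
    """
    out = []
    for op in ops:
        if op[0] == "index":           # "run mode": repeat an explicit index
            _, idx, run = op
            out.extend([idx] * (run + 1))
        else:                          # "copy above": copy from the row above
            _, run = op
            for _ in range(run + 1):
                out.append(out[len(out) - width])
    return out
```

For a 4-wide block, `decode_index_runs([("index", 2, 3), ("copy_above", 3)], width=4)` produces two identical rows of index 2.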
In JCTVC-N0247, the palette of each component is constructed and transmitted. The palette can be predicted (or shared) from its left neighboring CU to reduce the bit rate. In JCTVC-O0218, each element in the palette is a triplet, representing a specific combination of the three color components. Predictive coding of the palette across CUs is removed.
In JCTVC-O0182 (Guo et al., "AHG8: Major-color-based screen content coding", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 15th Meeting: Geneva, CH, 23 October - 1 November 2013, Document: JCTVC-O0182), another palette coding method is disclosed. Instead of predicting the entire palette table from the left CU, each palette color entry can be predicted from the exactly corresponding palette color entry in the above CU or the left CU.
For the transmission of pixel palette index values, a predictive coding method is applied to the indices according to JCTVC-O0182. Lines of indices can be predicted by different modes. In particular, three line modes are used for an index line, i.e., horizontal mode, vertical mode and normal mode. In horizontal mode, all indices in the same line have the same value. If the value is the same as the first pixel of the above pixel line, only the line mode signaling bit is transmitted; otherwise, the index value is transmitted as well. In vertical mode, the current index line is the same as the above index line, so only the line mode signaling bit is transmitted. In normal mode, the indices in a line are predicted individually. For each index position, the left or above neighbor is used as the predictor, and the prediction symbol is transmitted to the decoder.
Furthermore, according to JCTVC-O0182, pixels are classified into major color pixels (with palette indices pointing to palette colors) and escape pixels. For a major color pixel, the decoder reconstructs the pixel value according to the major color index (i.e., the palette index in JCTVC-N0247 and JCTVC-O0218) and the palette table. For an escape pixel, the encoder further transmits the pixel value.
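The major-color / escape split amounts to a simple lookup at reconstruction time. The sketch below uses an assumed sentinel value and a dictionary of transmitted escape values; these are illustrative representations, not actual bitstream syntax.

```python
ESCAPE = -1  # illustrative sentinel; real syntax signals escape differently

def reconstruct_pixel(index, palette, escape_values, pos):
    """Major-color pixels come from the palette table; escape pixels use
    explicitly transmitted component values for that position."""
    if index == ESCAPE:
        return escape_values[pos]   # (Y, U, V) sent in the bitstream
    return palette[index]           # (Y, U, V) looked up from the table
```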
Palette table signaling (signaling)
In the reference software of the screen content coding (SCC) standard, SCM-2.0 (JCTVC-R1014: Joshi et al., Screen content coding test model 2 (SCM 2), Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 18th Meeting: Sapporo, JP, 30 June - 9 July 2014, Document: JCTVC-R1014), the palette table of the last palette-coded CU is used as the predictor for coding the current palette table. In palette table coding, palette_share_flag is signaled first. If palette_share_flag is 1, all the palette colors in the last coded palette table are reused for the current CU; in this case the current palette size is equal to the palette size of the last palette-coded CU. Otherwise (i.e., palette_share_flag is 0), the current palette table is signaled by indicating which palette colors in the last coded palette table can be reused, and by transmitting new palette colors. The size of the current palette is set to the size of the predicted palette (i.e., numPredPreviousPalette) plus the size of the transmitted palette (i.e., num_signaled_palette_entries). The predicted palette is a palette derived from the previously reconstructed palette-coded CUs. When coding the current CU in palette mode, the palette colors that are not predicted from the predicted palette are directly transmitted in the bitstream. For example, assume the current CU is coded in palette mode with a palette size equal to six. If three of the six major colors are predicted from the palette predictor, the other three are directly transmitted through the bitstream. The following pseudo code illustrates the transmission of the three palette colors using the above example syntax.
num_signaled_palette_entries = 3
for( cIdx = 0; cIdx < 3; cIdx++ )  // signal colors for different components
  for( i = 0; i < num_signaled_palette_entries; i++ )
    palette_entries[ cIdx ][ numPredPreviousPalette + i ]
Since the palette size is six in this example, palette indices from 0 to 5 are used to indicate each palette-coded pixel, and each palette index can be reconstructed into the major color from the palette color table.
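The predictor-reuse plus new-entries construction described above can be sketched as follows. The `reuse_flags` list stands in for the per-entry reuse signaling and is an assumed representation, not the actual syntax.

```python
def build_palette(predictor, reuse_flags, new_entries):
    """Current palette = reused predictor entries (kept in predictor order)
    followed by num_signaled_palette_entries new colors from the bitstream."""
    palette = [color for color, keep in zip(predictor, reuse_flags) if keep]
    palette.extend(new_entries)
    return palette
```

With three entries reused from a predictor and three newly signaled colors, the resulting palette size is six, matching the example in the text.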
In SCM-2.0, if wavefront parallel processing (WPP) is not applied, the last coded palette table is initialized (i.e., reset) at the beginning of each slice or at the beginning of each tile. If WPP is applied, the last coded palette table is initialized (reset) not only at the beginning of each slice or each tile, but also at the beginning of each CTU row.
Palette index map scan order
In the palette mode coding of SCM-3.0 (JCTVC-S1014: Joshi et al., Screen content coding test model 3 (SCM 3), Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 19th Meeting: Strasbourg, FR, 17-24 October 2014, Document: JCTVC-S1014), a traverse scan is used for index map coding, as shown in Fig. 1. Fig. 1 shows an example of the traverse scan for an 8×8 block. In the traverse scan, even rows are scanned from left to right and odd rows are scanned from right to left. The traverse scan applies to all block sizes in palette mode.
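The traverse (snake) scan order is straightforward to generate; a minimal sketch:

```python
def traverse_scan(width, height):
    """Yield (x, y) positions in traverse-scan order: even rows run
    left-to-right, odd rows right-to-left."""
    order = []
    for y in range(height):
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        order.extend((x, y) for x in xs)
    return order
```

For a 3×2 block this yields (0,0), (1,0), (2,0), then (2,1), (1,1), (0,1), i.e., the scan reverses direction at the end of each row instead of jumping back to the left edge.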
Palette index map coding in SCM-4.0
In the palette mode coding of SCM-4.0 (JCTVC-T1014: Joshi et al., Screen content coding test model 4 (SCM 4), Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 20th Meeting: Geneva, CH, 10-18 February 2015, Document: JCTVC-T1014), the palette indices are grouped and signaled in front of the coded data of the corresponding block (i.e., before the palette_run_mode and palette_run coding). On the other hand, the escape pixels are coded at the end of the coded data of the corresponding block. The syntax elements palette_run_mode and palette_run are coded between the palette indices and the escape pixels. Fig. 2 shows an exemplary flowchart of the index map syntax signaling according to SCM 4.0. The number of indices (210), the grouped index information (220) and the last run type (230) are signaled. After the index information is signaled, pairs of run type (240) and run length (250) are signaled repeatedly. Finally, a group of escape values (260) is signaled if needed.
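The grouped signaling order of Fig. 2 can be summarized as a list of syntax elements in transmission order. The element names follow the text; the function itself is only an illustration of the ordering, not a parser.

```python
def index_map_syntax_order(num_indices, num_runs, has_escapes):
    """Return the SCM-4.0 style transmission order: grouped indices first,
    then the last run type, then (run mode, run) pairs, escapes last."""
    order = ["num_palette_indices"]
    order += ["palette_index"] * num_indices   # all indices grouped up front
    order += ["last_run_type"]
    for _ in range(num_runs):                  # interleaved run signaling
        order += ["palette_run_mode", "palette_run"]
    if has_escapes:                            # grouped escape values at the end
        order += ["escape_values"]
    return order
```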
Initialization of the palette predictor
In SCM-4.0, a global palette predictor set is signaled in the PPS (picture parameter set). All palette predictor states (including PredictorPaletteSize, PreviousPaletteSize and PredictorPaletteEntries) are initialized using the values derived from the PPS, rather than being reset to 0.
Palette syntax
For the runs of indices in the index map, several elements need to be signaled:
1) Run type: whether the run is a copy-above run or a copy-index run.
2) Palette index: for a copy-index run, which index is used for the run.
3) Run length: the length of the run, for both the copy-above and copy-index run types.
4) Escape pixels: if there are N (N >= 1) escape pixels in a run, N pixel values need to be signaled for these N escape pixels.
In JCTVC-T0064 (JCTVC-T0064: Joshi et al., Screen content coding test model 4 (SCM 4), Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 20th Meeting: Geneva, CH, 10-18 February 2015, Document: JCTVC-T1014), all palette indices are grouped together. The number of palette indices is signaled first, followed by the palette indices.
According to the existing HEVC standard, while the palette indices of each color component are grouped together, most of the other palette coding related data for the different color components is interleaved in the bitstream. In addition, separate memory spaces are used to store inter/intra coded blocks and palette-coded blocks. It is desirable to develop techniques that improve system throughput and/or reduce system implementation cost.
[summary of the invention]
A video coding and decoding method is disclosed that reduces implementation cost by reusing the transform coefficient buffer for palette coding. If the current prediction mode is an intra prediction mode or an inter prediction mode, information related to the transform coefficients of the prediction residual of the current block, generated by intra prediction or inter prediction, is stored in a transform coefficient buffer. If the current prediction mode is a palette coding mode, information related to the palette data associated with the current block is stored in the transform coefficient buffer. If the current block is coded in an intra prediction mode or inter prediction mode, the current block is encoded or decoded based on the information related to the transform coefficients; or, if the current prediction mode is a palette coding mode, the current block is encoded or decoded based on the information related to the palette data stored in the transform coefficient buffer.
If the current prediction mode is a palette coding mode, the palette data may correspond to the palette run types, palette runs, palette indices, escape values, escape flags, the palette table associated with the current block, or any combination thereof. The information related to the palette data may correspond to the palette data itself, the parsed palette data, or the reconstructed palette data. For example, the parsed palette indices of the samples in the current block are reconstructed in the parsing stage, and the reconstructed palette indices and the reconstructed escape values are stored in the transform coefficient buffer at the decoder side. In another example, the parsed palette indices of the samples of the current block are reconstructed in the parsing stage, the reconstructed palette indices are further reconstructed into reconstructed pixel values using the palette table, and the reconstructed pixel values and the reconstructed escape values are stored in the transform coefficient buffer at the decoder side. Furthermore, a storage area can be designated in the parsing stage to store the palette table, and the storage area can be released from use by the palette table in the reconstruction stage. The escape flags can also be stored in the transform coefficient buffer at the decoder side. In another example, the escape flags are stored in one part of the transform coefficient buffer (e.g., the most significant bit (MSB) part of the transform coefficient buffer), and the reconstructed pixel values or escape values are stored in another part of the transform coefficient buffer.
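The MSB-packing example in the last sentence can be illustrated with simple bit operations, assuming (hypothetically) a 16-bit coefficient-buffer entry whose top bit holds the escape flag and whose remaining 15 bits hold the reconstructed value or index:

```python
ESCAPE_FLAG_BIT = 15  # assumed entry width: 16 bits, flag in the MSB

def pack_entry(value, is_escape):
    """Pack an escape flag (MSB) and a pixel value / palette index
    (remaining bits) into one coefficient-buffer entry."""
    assert 0 <= value < (1 << ESCAPE_FLAG_BIT)
    return (int(is_escape) << ESCAPE_FLAG_BIT) | value

def unpack_entry(entry):
    """Recover (escape_flag, value) from a packed buffer entry."""
    return entry >> ESCAPE_FLAG_BIT, entry & ((1 << ESCAPE_FLAG_BIT) - 1)
```

Packing both fields into one entry lets the palette mode reuse the same per-sample storage that intra/inter coding uses for transform coefficients, which is the memory-saving idea of this embodiment.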
In another embodiment, if the current prediction mode is a palette coding mode, all grouped escape values of the same color component are parsed from the video bitstream at the decoder side, or all escape values of the same color component are grouped together at the encoder side. The information comprising the escape values is then used for encoding or decoding the current block. The grouped escape values of the same color component can be signaled at the end of the coded palette data of the current block. The grouped escape values of different color components can be signaled separately for the current block. The grouped escape values of the same color component of the current block can be stored in the transform coefficient buffer. The grouped escape values of different color components can share the transform coefficient buffer by storing the grouped escape values of one color component in the transform coefficient buffer at a time.
In another embodiment, all initial palette predictor values of the same color component, grouped together in a sequence parameter set (SPS), picture parameter set (PPS) or slice header, are parsed from the video bitstream at the decoder side, or all initial palette predictor values of the same color component are grouped together at the encoder side. At least one palette-coded block in the corresponding sequence, picture or slice is encoded or decoded using the initial palette predictor values.
In another embodiment, at the decoder side, all palette predictor entries or palette entries of the same color component of the current block, grouped together, are parsed from the video bitstream, or at the encoder side, all palette predictor entries or palette entries of the same color component are grouped together. The current block is then encoded or decoded using the palette predictor composed of all the palette predictor entries, or the palette table composed of all the palette entries.
[Brief description of the drawings]
Fig. 1 shows the example of 8 × 8 pieces of traverse scanning.
Fig. 2 shows reflected according to the exemplary toner plate of screen content encoding and decoding test module edition 4 (SCM-4.0) index Penetrate grammer signaling.
Fig. 3 A shows the example index mapping overturning before index encoding and decoding.
Fig. 3 B shows the example of roll over indexing mapping corresponding with the index mapping in Fig. 3 A.
Fig. 4 shows the example that roll over indexing maps before index maps encoding and decoding, wherein adjacent formation above use Pixel come to predict the pixel in the last row in physical location be inefficient.
Fig. 5 A-B shows the prediction from the top CU mapped with roll over indexing of embodiment according to the present invention Example, wherein duplication top operational mode in index always from its physics top position predict, but regardless of index mapping whether It is reversed.In fig. 5, line filling block indicates roll over indexing mapping, and the transparent block in Fig. 5 B represents original index mapping.
Fig. 6 A-B shows showing from the another of top CU mapped with roll over indexing for embodiment according to the present invention Example, wherein running encoding and decoding pixel above the duplication of the sample predictions the first row of its physics proximal most position.In fig. 6, line is filled out Filling block indicates the index mapping of overturning, and the transparent block in Fig. 6 B represents original index mapping.
Fig. 7 A-B show embodiment according to the present invention predicted from the top CU that is mapped with roll over indexing it is another Example, wherein according to encoding and decoding pixel is run above the duplication of the last row of physical picture element position prediction of top adjacent C U.Scheming In 7A, line filling block indicate overturning index mapping, and Fig. 7 B transparent block represent primary index mapping.
Fig. 8 A shows the example of operational mode above the duplication of extension, wherein multiple from the lastrow being located above the boundary CU Make two row pixels (that is, L=2).
Fig. 8 B, which is shown through signaling syntax element pixel_num (M), indicates to predict that preceding M is a (that is, M from reconstructed pixel =11) sample across CU prediction example.
Fig. 9 A shows through the reconstructed pixel value of the last line of top CU the pixel value for predicting front two row sample Example.
Fig. 9 B shows through the reconstructed pixel value of the rightmost column of left CU showing for the pixel value for predicting first two columns sample Example.
Figure 10 A-C shows three kinds of different scan patterns for predicting across CU of embodiment according to the present invention.
Figure 11 A-C shows three kinds of different scanning moulds for predicting across CU according to another embodiment of the invention Formula.
Figure 12 A-B show embodiment according to the present invention for two kinds of different scannings across the CU reverse scan predicted Mode.
Fig. 13 illustrates an example of extending row-based pixel copying from a neighboring CU to an 8×8 CU coded in Inter mode.
Fig. 14 illustrates an example of changing the positions of neighboring reference pixels according to an embodiment of the present invention, where the above-right reference pixels are copied from the above-right CU.
Fig. 15 illustrates another example of changing the positions of neighboring reference pixels according to an embodiment of the present invention, where the above-right reference pixels are copied from the rightmost pixel of the third row.
Fig. 16 illustrates an example of decoding escape colors for N (N=5) escape pixels at different locations in the current coding block, where the pixel value of each occurrence of an escape pixel is still written into the bitstream, and horizontal traverse scan is used.
Fig. 17 illustrates an example of decoding escape colors according to another embodiment, where there are N (N=5) escape pixels at different locations in the current coding block and only the non-duplicated colors are decoded.
Fig. 18 illustrates an example of using a special index, denoted as N, for neighboring constructed pixels (NCPs) across the CU boundary.
Fig. 19 illustrates an example of using a special index for neighboring constructed pixels (NCPs) when the maximum index value is 0.
Fig. 20 illustrates an example of indicating neighboring constructed pixels (NCPs) using a special index when the maximum index value is 1.
Fig. 21 illustrates an exemplary flowchart of signaling to support cross-CU index prediction, where a new flag all_pixel_from_NCP_flag is added and the syntax according to SCM3.0 is used for cross-CU index prediction.
Fig. 22 illustrates an exemplary flowchart of signaling to support cross-CU index prediction, where a new flag all_pixel_from_NCP_flag is added and, when all_pixel_from_NCP_flag is off, the syntax according to SCM3.0 is used with cross-CU index prediction.
Fig. 23 illustrates another exemplary flowchart similar to Fig. 22. However, when the maximum index value is not 0, the syntax according to SCM3.0 is used for cross-CU index prediction.
Fig. 24 illustrates an exemplary flowchart of signaling to support cross-CU index prediction according to an embodiment of the present invention.
Fig. 25 illustrates an exemplary flowchart of signaling to support cross-CU index prediction according to another embodiment of the present invention.
Fig. 26 illustrates an exemplary flowchart of signaling to support cross-CU index prediction according to yet another embodiment of the present invention.
Fig. 27 illustrates an exemplary flowchart of signaling to support cross-CU index prediction according to yet another embodiment of the present invention.
Fig. 28A illustrates an exemplary flowchart of signaling to support cross-CU index prediction according to yet another embodiment of the present invention.
Fig. 28B illustrates an exemplary flowchart of signaling to support cross-CU index prediction according to yet another embodiment of the present invention.
Fig. 29 illustrates an example of source pixels for intra block copy prediction and compensation, where the dot-filled areas correspond to the unfiltered pixels in the current CTU (coding tree unit) and the left CTU.
Fig. 30 illustrates another example of source pixels for intra block copy prediction and compensation, where the dot-filled areas correspond to the unfiltered pixels in the current CTU (coding tree unit), the left CTU, the bottom four rows of the above CTU, and the bottom four rows of the above-left CTU.
Fig. 31 illustrates another example of source pixels for intra block copy prediction and compensation, where the dot-filled areas correspond to the unfiltered pixels in the current CTU (coding tree unit), N left CTUs, the bottom four rows of the above CTU, and the bottom four rows of N above-left CTUs.
Fig. 32 illustrates another example of source pixels for intra block copy prediction and compensation, where the dot-filled areas correspond to the unfiltered pixels in the current CTU (coding tree unit), the bottom four rows of the above CTU, and the bottom four rows of the left CTU.
Fig. 33 illustrates another example of source pixels for intra block copy prediction and compensation, where the dot-filled areas correspond to the unfiltered pixels in the current CTU (coding tree unit), the bottom four rows of the N left CTUs and the above CTU, the bottom four rows of the N above-left CTUs, and the right four columns of the (N+1)-th left CTU.
Fig. 34 illustrates an exemplary flowchart of a system incorporating an embodiment of the present invention to share the transform coefficient buffer for palette coded blocks.
[Detailed Description of the Invention]
The following description presents the best mode contemplated for carrying out the invention. This description is made for the purpose of illustrating the general principles of the invention and should not be taken in a limiting sense. The scope of the invention is best determined by reference to the appended claims.
Reusing the HEVC Transform Coefficient Buffer for Palette-Related Information
In HEVC, besides palette coding, Inter prediction and Intra prediction are available coding modes. When Inter or Intra prediction is used, transform coding is usually applied to the prediction residual resulting from Inter/Intra prediction. The transform coefficients are then quantized and entropy coded for inclusion in the coded bitstream. At the decoder side, the reverse operations are applied to the received bitstream. In other words, entropy decoding is applied to the bitstream to recover the coded symbols corresponding to the quantized transform coefficients. The quantized transform coefficients are then de-quantized and inverse transformed to reconstruct the Inter/Intra prediction residual. A transform coefficient buffer is commonly used at both the encoder side and the decoder side to store transform coefficients as needed between the entropy coding and the quantization/transform operations.
For color video data, multiple transform coefficient buffers may be needed. However, the system may also be configured to process one color component at a time, so that only one transform coefficient buffer is needed. When a block is palette coded, no transform is applied to the block and the transform coefficient buffer is not used. To save system implementation cost, an embodiment of the present invention reuses the transform coefficient buffer to store palette coding related data. Accordingly, for an Inter/Intra coded block, the parsed coefficients of a TU are stored in the coefficient buffer at the coefficient parsing stage. However, for SCC (screen content coding), a palette coded block does not need residual coding. Therefore, according to one embodiment of the present invention, the transform coefficient buffer is used to store palette coding related information, which may include the palette run type, palette index, palette run, escape value, escape flag, palette table, or any combination thereof.
For example, the palette indices of the samples in the block are parsed and reconstructed at the parsing stage. The transform coefficient buffer is used to store the reconstructed palette indices and the escape values.
In another example, the parsed palette indices of the samples of the block are reconstructed at the parsing stage. The parsed indices are used at the parsing stage to look up the palette table to reconstruct the pixel values of the corresponding color components. Therefore, the coefficient buffer is used to store the reconstructed pixel values and the escape values, or to store the reconstructed pixel values, the escape values and the escape flags. In this case, the palette table only needs to be stored at the parsing stage. There is no need to store and maintain the palette table at the reconstruction stage. The data depth required for transform coefficients may be higher than the data depth of the palette coding related data. For example, a transform coefficient may require a 16-bit buffer. However, if the maximum palette index is 63, only 6 bits are needed to store a palette index for palette coding. Furthermore, the maximum bit length of an escape value is equal to the bit depth, which is usually 8 or 10 bits. Therefore, part of the 16 bits (i.e., 8 or 6 bits) are free and can be used to store the escape flag information. For example, the MSBs (most significant bits) of the coefficient buffer can be used to store the escape flags. The remaining bits are used to store the reconstructed pixel values or the escape values. At the reconstruction stage, the MSBs can be used to directly reconstruct the escape values or the reconstructed pixel values.
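The bit-packing described above can be sketched in a few lines. This is an illustrative Python model, not the HEVC reference software; the field widths (a 16-bit buffer entry, a 1-bit escape flag in the MSB, and a value of at most 10 bits) are assumptions chosen to match the example bit depths in the text.

```python
ENTRY_BITS = 16          # assumed width of one transform coefficient buffer entry
ESCAPE_FLAG_BIT = 15     # use the MSB for the escape flag, as suggested above
VALUE_MASK = (1 << ESCAPE_FLAG_BIT) - 1  # low 15 bits hold the index/value

def pack_entry(value, is_escape):
    """Pack a reconstructed pixel value (or escape value) plus the escape
    flag into one 16-bit coefficient buffer entry."""
    assert 0 <= value <= VALUE_MASK
    return (int(is_escape) << ESCAPE_FLAG_BIT) | value

def unpack_entry(entry):
    """Recover (value, is_escape) from a packed entry."""
    return entry & VALUE_MASK, bool(entry >> ESCAPE_FLAG_BIT)

# A 10-bit escape value fits easily below the flag bit.
e = pack_entry(0x3FF, True)
print(unpack_entry(e))   # (1023, True)
```

Since the flag occupies only the otherwise-idle MSB, no extra storage beyond the existing coefficient buffer is needed.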
In yet another example, the palette indices of the samples are parsed and reconstructed at the parsing stage, and palette table lookup is also used at the parsing stage to reconstruct the reconstruction values of the different color components. The pixel values of the escape samples are also reconstructed from the escape values at the parsing stage. Therefore, the coefficient buffer is used to store the reconstructed pixel values. In this case, the palette table only needs to be stored at the parsing stage. There is no need to store and maintain the palette table at the reconstruction stage.
Grouping the Escape Values of the Same Color Component
In JCTVC-T0064 (JCTVC-T0064: Joshi et al., Screen content coding test model 4 (SCM 4), Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 20th Meeting: Geneva, CH, 10-18 Feb. 2015, Document: JCTVC-T1014), the escape values of the three color components are grouped together. In other words, for each sample, the escape values of the three components are coded in sequence. The syntax table for escape value coding according to JCTVC-T1014 is shown in Table 1.
Table 1.
Since there is no residual coding for the palette mode, the coefficient buffer can be reused to store the palette index map information. In HEVC coefficient parsing, only one TU of one color component is parsed at a time. The syntax parser may have only one coefficient buffer for one color component and may only access the one coefficient buffer of that color component. However, in palette coding, one palette index represents the pixel values of multiple color components. The palette mode can decode multiple color component values at once. The palette indices can be stored in one coefficient buffer of one color component. However, with the syntax parsing order of the current SCM-4.0, the escape values may require three coefficient buffers for the three color components.
Therefore, according to another embodiment, the syntax parsing order of the escape values is changed to overcome the problem of requiring three coefficient buffers to store the three color components. The escape values of the same color component are grouped together and signaled at the end of the palette syntax coding of the block. The escape values of different color components are signaled separately. An exemplary syntax for palette coding incorporating this embodiment is shown in Table 2.
Table 2.
In Table 2, the text in the line-filled regions indicates deletions. The for-loop statement over the different color components (i.e., "for (cIdx = 0; cIdx < numComps; cIdx++)") is moved up from the original position indicated by note (2-1) to the new position indicated by note (2-2). Accordingly, all escape values of one color component of the CU can be parsed together. The palette indices and escape values of the first color component (i.e., cIdx equal to 0) can be written out, and the parser can then reuse the same buffer to parse the information of the second color component. With the syntax changes in Table 2, one coefficient buffer storing the palette coding related data of one color component is sufficient for palette mode parsing. The implementation complexity and cost of the SCC parser will not increase.
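The effect of this reordering can be illustrated with a small Python sketch (illustrative only; in the real bitstream the values are entropy coded, not stored in a list): converting the SCM-4.0 per-sample interleaving into the per-component grouping means all escape values of component 0 can be parsed into one buffer, written out, and the same buffer reused for component 1 and then component 2.

```python
def group_escapes_by_component(interleaved, num_comps=3):
    """Reorder escape values from the SCM-4.0 per-sample order
    [s0c0, s0c1, s0c2, s1c0, s1c1, s1c2, ...] into the grouped order
    [all c0 values, all c1 values, all c2 values], which lets the
    parser fill and reuse a single coefficient buffer per component."""
    return [interleaved[i]
            for c in range(num_comps)
            for i in range(c, len(interleaved), num_comps)]

# Two escape samples, three components each (values are made up):
print(group_escapes_by_component([1, 2, 3, 4, 5, 6]))   # [1, 4, 2, 5, 3, 6]
```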
Grouping the Palette Predictor Initialization of the Same Color Component
In SCM 4.0, a set of PPS (picture parameter set) syntax elements is signaled to specify the palette predictor initializers. The existing syntax of the PPS extension is shown in Table 3.
Table 3.
The palette predictor initialization process according to the existing SCM 4.0 is as follows:
The outputs of this process are the initialized palette predictor variables PredictorPaletteSize and PredictorPaletteEntries.
PredictorPaletteSize is derived as follows:
If palette_predictor_initializer_present_flag is equal to 1, PredictorPaletteSize is set equal to num_palette_predictor_initializer_minus1 plus 1.
Otherwise (palette_predictor_initializer_present_flag is equal to 0), PredictorPaletteSize is set equal to 0.
The PredictorPaletteEntries array is derived as follows:
If palette_predictor_initializer_present_flag is equal to 1,
for (i = 0; i < PredictorPaletteSize; i++)
    for (comp = 0; comp < 3; comp++)
        PredictorPaletteEntries[i][comp] = palette_predictor_initializers[i][comp]
Otherwise (palette_predictor_initializer_present_flag is equal to 0), PredictorPaletteEntries is set equal to 0.
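The derivation above can be modeled directly. A minimal Python sketch (variable names follow the text; the entry values are made up for illustration):

```python
def init_palette_predictor(present_flag, initializers):
    """initializers: list of [Y, Cb, Cr] triplets signaled in the PPS.
    Returns (PredictorPaletteSize, PredictorPaletteEntries), with the
    entries indexed [i][comp] as in the pseudocode above."""
    if present_flag:
        size = len(initializers)  # num_palette_predictor_initializer_minus1 + 1
        entries = [[initializers[i][comp] for comp in range(3)]
                   for i in range(size)]
    else:
        size, entries = 0, []
    return size, entries

size, entries = init_palette_predictor(1, [[100, 128, 128], [200, 90, 60]])
print(size, entries[1][2])   # 2 60
```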
In one embodiment, the predictor initialization values for the Y component, the Cb component and the Cr component can each be grouped together. The syntax change is shown in Table 4. Compared with Table 3, the two for-loop statements are swapped as indicated by note (4-1) and note (4-2), in order to group the palette predictor initialization of the same color component.
Table 4.
An exemplary palette predictor initialization process corresponding to the above embodiment is as follows.
The outputs of this process are the initialized palette predictor variables PredictorPaletteSize and PredictorPaletteEntries.
PredictorPaletteSize is derived as follows:
If palette_predictor_initializer_present_flag is equal to 1, PredictorPaletteSize is set equal to num_palette_predictor_initializer_minus1 plus 1.
Otherwise (palette_predictor_initializer_present_flag is equal to 0), PredictorPaletteSize is set equal to 0.
The PredictorPaletteEntries array is derived as follows:
If palette_predictor_initializer_present_flag is equal to 1,
for (comp = 0; comp < 3; comp++)
    for (i = 0; i < PredictorPaletteSize; i++)
        PredictorPaletteEntries[i][comp] = palette_predictor_initializers[i][comp]
Otherwise (palette_predictor_initializer_present_flag is equal to 0), PredictorPaletteEntries is set equal to 0.
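Whether the loops run entry-major (the Table 3 order) or component-major (the Table 4 order), the resulting PredictorPaletteEntries array is identical; only the order in which the values are visited, and hence signaled, changes. A quick Python check of this equivalence (illustrative values only):

```python
init = [[100, 128, 128], [200, 90, 60], [50, 40, 30]]   # made-up initializers
size = len(init)

# Entry-major (existing SCM 4.0 order): inner loop over components.
a = [[0] * 3 for _ in range(size)]
for i in range(size):
    for comp in range(3):
        a[i][comp] = init[i][comp]

# Component-major (grouped order): outer loop over components, so all Y
# values are processed together, then all Cb, then all Cr.
b = [[0] * 3 for _ in range(size)]
for comp in range(3):
    for i in range(size):
        b[i][comp] = init[i][comp]

print(a == b)   # True
```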
If the palette predictor initializers are signaled in the SPS (sequence parameter set) or the slice header, the same grouping method can be applied. In other words, all initialization values of the Y component are signaled together, all initialization values of the Cb component are signaled together, and all initialization values of the Cr component are signaled together.
Grouping the Palette Predictor Update Process of the Same Color Component
The existing palette predictor update process is as follows.
The variable numComps is derived as shown in equation (1):
numComps = (ChromaArrayType == 0) ? 1 : 3    (1)
The variables PredictorPaletteSize and PredictorPaletteEntries are modified as shown by the pseudocode in Table 5.
Table 5.
As shown in Table 5, the first part of the newPredictorPaletteEntries update process is performed in the double for-loop indicated by note (5-1) and note (5-2), where the for-loop associated with the color components (i.e., cIdx) is the inner for-loop. Therefore, for each palette entry, the three color components are updated. The remaining newPredictorPaletteEntries update uses the two for-loop statements indicated by note (5-3) and note (5-4), where the for-loop over the different color components is the inner loop. Therefore, for each entry, the three color components are updated. The two for-loop statements of the PredictorPaletteEntries update are indicated by note (5-5) and note (5-6), where the for-loop over the different color components is the outer loop. Therefore, the PredictorPaletteEntries values are updated for each color component.
In one embodiment, the palette predictors of the three respective components are grouped together and then updated. An exemplary modification to the existing update process is as follows.
The derivation of the variable numComps remains the same as in equation (1). The variables PredictorPaletteSize and PredictorPaletteEntries are modified as shown in the following exemplary pseudocode in Table 6.
Table 6.
In the above example, the two for-loop statements of the first part of the newPredictorPaletteEntries update are indicated by note (6-1) and note (6-2), where the for-loop over the different color components is the outer loop. Therefore, the remaining newPredictorPaletteEntries values are updated for each color component. The newPredictorPaletteEntries update process in the double for-loop indicated by note (6-3) and note (6-4) is the same as the existing update process in Table 5. The PredictorPaletteEntries update process in the double for-loop indicated by note (6-5) and note (6-6) is also the same as the existing update process in Table 5.
In another embodiment, the variables PredictorPaletteSize and PredictorPaletteEntries are modified as shown in the following exemplary pseudocode in Table 7.
Table 7.
In the above example, the two for-loop statements of the first part of the newPredictorPaletteEntries update are indicated by note (7-1) and note (7-2), where the for-loop over the different color components is the outer loop. Therefore, the remaining newPredictorPaletteEntries values are updated for each color component. In the newPredictorPaletteEntries update process in the double for-loop indicated by note (7-3) and note (7-4), the for-loop over the different components is the outer loop. Therefore, the newPredictorPaletteEntries values are updated for each color component. The PredictorPaletteEntries update process in the double for-loop indicated by note (7-5) and note (7-6) is the same as the existing update process in Table 5.
In yet another embodiment, the variables PredictorPaletteSize and PredictorPaletteEntries are modified as shown in the following exemplary pseudocode in Table 8.
Table 8.
In the above example, the two-loop statements of the newPredictorPaletteEntries and PredictorPaletteEntries update processes are indicated by note (8-1), note (8-2), note (8-3) and note (8-4), where the for-loop over the different color components is the outer loop. Therefore, the newPredictorPaletteEntries values and the PredictorPaletteEntries values are updated for each color component.
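As a rough model of what Tables 5-8 compute: the updated predictor is the current palette followed by the previous predictor entries that were not reused, truncated to the maximum predictor size; the entry-major versus component-major loop order does not change this result. The following Python sketch is a simplification under that assumption (the reuse-flag signaling and size bookkeeping of the actual spec are abstracted away):

```python
def update_palette_predictor(cur_palette, prev_predictor, reused, max_size):
    """cur_palette / prev_predictor: lists of [Y, Cb, Cr] entries.
    reused[i] is True if prev_predictor[i] was copied into cur_palette,
    so it must not be duplicated in the new predictor."""
    new_pred = [list(e) for e in cur_palette]
    for i, entry in enumerate(prev_predictor):
        if len(new_pred) >= max_size:
            break
        if not reused[i]:
            new_pred.append(list(entry))
    return new_pred

pred = update_palette_predictor(
    cur_palette=[[1, 2, 3]],
    prev_predictor=[[9, 9, 9], [7, 7, 7]],
    reused=[True, False],
    max_size=4)
print(pred)   # [[1, 2, 3], [7, 7, 7]]
```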
Various examples of grouping the palette predictors of the same color component in the update process have been illustrated in Table 6 through Table 8, where HEVC syntax elements are used to demonstrate the update process with the palette predictors of the same color component grouped together, according to embodiments of the present invention. However, the present invention is not restricted to the specific syntax elements and specific pseudocode listed. A person skilled in the art may practice the present invention without departing from the spirit of the present invention.
Grouping the Palette Entry Semantics of the Same Color Component
In the current SCM4.0, the syntax element palette_entry is used to specify the value of a component in a palette entry of the current palette. The variable PredictorPaletteEntries[cIdx][i] specifies the i-th element in the palette predictor for the color component cIdx. The variable numComps is derived as shown in equation (1). The variable CurrentPaletteEntries[cIdx][i] specifies the i-th element in the current palette for the color component cIdx and is derived as shown in the following exemplary pseudocode in Table 9.
Table 9.
As shown in Table 9, the CurrentPaletteEntries from the palette predictor are updated as shown by note (9-1) and note (9-2), where the for-loop over the different color components is the inner loop. Therefore, for each entry, the three color components are updated. Furthermore, the CurrentPaletteEntries from the new palette entries are updated as shown by note (9-3) and note (9-4), where the for-loop over the different color components is the outer loop. Therefore, the CurrentPaletteEntries values are updated for each color component.
In one embodiment, the palette predictors of each of the three components can be grouped together. An exemplary change to the existing process is as follows.
The variable numComps is derived as shown in equation (1). The variable CurrentPaletteEntries[cIdx][i] specifies the i-th element in the current palette for the color component cIdx and is derived as shown in the following exemplary pseudocode in Table 10.
Table 10.
As shown in Table 10, the CurrentPaletteEntries from the palette predictor are updated as shown by note (10-1) and note (10-2), where the for-loop over the different color components is the outer loop. Therefore, the CurrentPaletteEntries values are updated for each color component. The CurrentPaletteEntries from the new palette entries are updated as shown by note (10-3) and note (10-4), which is the same as in Table 9.
In another embodiment, the three components can be grouped together. The change is as follows.
The variable numComps is derived as shown in equation (1):
The variable CurrentPaletteEntries[cIdx][i] specifies the i-th element in the current palette for the color component cIdx and is derived as shown in the following exemplary pseudocode in Table 11.
Table 11.
As shown in Table 11, the CurrentPaletteEntries from the palette predictor and the CurrentPaletteEntries from the new palette entries are updated as shown by note (11-1), note (11-2), note (11-3) and note (11-4), where the for-loop over the different color components is the outer loop. Therefore, the CurrentPaletteEntries values are updated for each color component.
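The content common to Tables 9-11 can be summarized as follows: the current palette consists of the reused predictor entries followed by the newly signaled entries, and the loop nesting order only decides whether values are filled per entry or per color component. A simplified Python sketch under that reading (the reuse signaling is abstracted into a flag list, and indexing is simplified to [i][comp] for readability):

```python
def derive_current_palette(predictor, reused_flags, new_entries):
    """predictor: predictor palette entries as [Y, Cb, Cr] triplets.
    reused_flags[i]: whether predictor entry i is copied into the palette.
    new_entries: newly signaled palette_entry triplets appended afterwards."""
    palette = [list(predictor[i]) for i, r in enumerate(reused_flags) if r]
    palette.extend(list(e) for e in new_entries)
    return palette

cur = derive_current_palette(
    predictor=[[10, 11, 12], [20, 21, 22]],
    reused_flags=[False, True],
    new_entries=[[30, 31, 32]])
print(cur)   # [[20, 21, 22], [30, 31, 32]]
```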
Various examples of grouping the palette entry semantics of the same color component in the update process have been illustrated in Table 10 and Table 11, where HEVC syntax elements are used to demonstrate the update process with the palette entry semantics of the same color component grouped together, according to embodiments of the present invention. However, the present invention is not restricted to the specific syntax elements and specific pseudocode listed. A person skilled in the art may practice the present invention without departing from the spirit of the present invention.
Grouping of Other Palette Syntax
In SCM-4.0, the palette indices are grouped at the front, and the escape values are grouped at the end. The palette indices and escape values are coded as bypass bins. Grouping the escape values together with the palette indices (i.e., grouping the bypass coded bins) can increase the parsing throughput. In SCM-4.0, the number of escape values to be parsed depends on the palette run modes, the palette indices and the palette runs. Therefore, the escape values can only be grouped at the end. In order to group the escape values together with the palette indices, several methods are disclosed below.
Method 1: grouping the escape values at the front together with the palette indices:
If the escape values are grouped at the front, the number of escape values to be parsed should be independent of the palette runs. For escape value parsing, in order to remove the data dependency on the palette runs, an embodiment of the present invention changes the copy-above run mode behavior when an escape sample is copied from the above row. If the predictor is an escape sample, the escape sample is treated as a predefined color index in the predictor copy mode.
In the copy-above run mode, a "palette run" value is transmitted or derived to indicate the number of subsequent samples to be copied from the above row. The color indices are equal to the color indices in the above row. In one embodiment, if the above or left sample is an escape sample, the color index of the above sample is treated as a predefined color index (e.g., 0). The current index is set to the predefined index. These predictor copy modes do not need escape values. In this method, even if the signaled index is equal to the escape index, the palette run can still be signaled for the index run mode. If the run for the escape index is greater than 0 (e.g., N, N > 0), the first sample is reconstructed with the coded escape value. The index of the first sample can be set to the escape index or to the predefined color index. The indices of the remaining N samples are set to the predefined index (e.g., index 0), and the remaining N samples are reconstructed with the value of the predefined index (e.g., index 0). In one embodiment, except for the first sample of an index run mode, the maximum codeword of the index in the run mode (e.g., adjustIndexMax) is fixed (e.g., fixed to indexMax - 1). For the first sample in a CU, adjustIndexMax is equal to indexMax. Redundant index removal can still be applied. In this method, the number of escape values to be parsed depends on the number of parsed/reconstructed indices that are equal to the escape index. For example, if the palette indices are coded with a truncated binary code and the coded bins are all 1s, the parsed index is the escape index. The number of escape values to be parsed is independent of the palette runs. Therefore, the syntax for the escape values can be placed at the front (i.e., before the palette runs) together with the palette indices.
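The key property of Method 1 is that the number of escape values can be determined from the parsed indices alone, before any run is parsed. A small Python illustration (assuming, as in the text, that an index equal to indexMax is the escape index and that every escape sample carries one escape value per color component):

```python
def count_escape_values(parsed_indices, index_max, num_comps=3):
    """With the redefined copy-above behavior, only samples whose own
    signaled index equals the escape index (indexMax) carry escape
    values, independent of the palette runs. Each such sample needs
    one escape value per color component."""
    escape_samples = sum(1 for idx in parsed_indices if idx == index_max)
    return escape_samples * num_comps

# indexMax = 4, so index 4 is the escape index; the runs are irrelevant here.
print(count_escape_values([0, 2, 4, 4, 1], index_max=4))   # 6
```

Because this count never consults the run values, the escape values can be signaled (and parsed) before the palette runs.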
Syntax order example 1: number of copy_above_run or number of index_run → last_run_mode → run type grouping (context coded) → palette index grouping (bypass coded) → escape value grouping (bypass coded) → run length grouping.
In this case, last_run_mode indicates whether the last run mode is copy_above_run or index_run. For example, when coding/decoding the grouped run types, if the number of coded/decoded index_runs is equal to the number of index_runs and last_run_mode is index_run, the run type grouping is terminated. If the number of coded/decoded index_runs is equal to the number of index_runs and last_run_mode is copy_above_run, the run type grouping is also terminated, and a copy_above_run is inserted at the end.
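The termination rule shared by these syntax-order examples can be sketched as follows. This is a hypothetical Python model of the described behavior (run types are decoded until the signaled number of index_runs has been consumed; if last_run_mode indicates copy_above_run, one trailing copy_above_run is appended without being explicitly coded):

```python
INDEX_RUN, COPY_ABOVE_RUN = 0, 1

def decode_run_types(coded_types, num_index_runs, last_run_mode):
    """coded_types: explicitly coded run types, in order.
    Stop once num_index_runs index_runs have been seen; if the last
    run is copy_above_run, it is inserted rather than read."""
    out, seen = [], 0
    for t in coded_types:
        out.append(t)
        if t == INDEX_RUN:
            seen += 1
        if seen == num_index_runs:
            break
    if last_run_mode == COPY_ABOVE_RUN:
        out.append(COPY_ABOVE_RUN)
    return out

types = decode_run_types([INDEX_RUN, COPY_ABOVE_RUN, INDEX_RUN],
                         num_index_runs=2, last_run_mode=COPY_ABOVE_RUN)
print(types)   # [0, 1, 0, 1]
```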
Syntax order example 2: number of copy_above_run or index_run → run type grouping (context coded) → last_run_mode → palette index grouping (bypass coded) → escape value grouping (bypass coded) → run length grouping.
In this case, last_run_mode indicates whether the last run mode is copy_above_run or index_run. For example, when coding/decoding the grouped run types, if the number of coded/decoded index_runs is equal to the number of index_runs, the run type grouping is terminated and last_run_mode is signaled. If last_run_mode is copy_above_run, a copy_above_run is inserted at the end.
Syntax order example 3: number of copy_above_run or index_run → run type grouping (context coded) → palette index grouping (bypass coded) → escape value grouping (bypass coded) → last_run_mode → run length grouping.
In this case, last_run_mode indicates whether the last run mode is copy_above_run or index_run. For example, when coding/decoding the grouped run types, if the number of coded/decoded index_runs is equal to the number of index_runs, the run type grouping is terminated. last_run_mode is signaled. If last_run_mode is copy_above_run, a copy_above_run is inserted at the end.
Syntax order example 4: number of copy_above_run or number of index_run → run type grouping (context coded) → palette index grouping (bypass coded) → last_run_mode → escape value grouping (bypass coded) → run length grouping.
In this case, last_run_mode indicates whether the last run mode is copy_above_run or index_run. For example, when coding/decoding the grouped run types, if the number of coded/decoded index_runs is equal to the number of index_runs, the run type grouping is terminated. last_run_mode is signaled. If last_run_mode is copy_above_run, a copy_above_run is inserted at the end.
Syntax order example 5: number of copy_above_run or number of index_run → palette index grouping → escape value grouping → last_run_mode → run type grouping → run length grouping.
In this case, last_run_mode indicates whether the last run mode is copy_above_run or index_run. For example, when coding/decoding the grouped run types, if the number of coded/decoded index_runs is equal to the number of index_runs, the run type grouping is terminated. If last_run_mode is copy_above_run, one unsignaled copy_above_run is inserted at the end. For the last palette run mode, the palette run is inferred to last until the end of the PU/CU.
The quantity of syntax sequence example 6:copy_above_run or the quantity of index_run → palette index aggregation → Last_run_mode → jump out value aggregation → operation type aggregation → running length aggregation.
In this case, last_run_mode indicate the last one operational mode be copy_above_run or index_run.For example, in the aggregation of coding/decoding operation type, if encoding and decoding/decoded index_run number is equal to The quantity of index_run, then operation type aggregation will be terminated.If last_run_mode is copy_above_run, Copy_above_run is inserted in end, without indicating.For the last one palette operational mode, palette operation is pushed away Break as at the end of PU/CU.
The quantity of syntax sequence example 7:copy_above_run or the quantity of index_run → palette index aggregation → Jump out value aggregation → last_run_mode → staggeredly { palette operation type, palette running length }.
In this case, last_run_mode indicate the last one operational mode be copy_above_run or index_run.For example, in the aggregation of coding/decoding operation type, if encoding and decoding/decoded index_run quantity is equal to The quantity of index_run, then operation type aggregation will be terminated.If last_run_mode is copy_above_run, Copy_above_mn is inserted in end, without indicating.For the last one palette operational mode, palette operation is pushed away Break as at the end of PU/CU.
The quantity of syntax sequence example 8:copy_above_run or the quantity of index_run → palette index aggregation → Last_run_mode → jumping out value aggregation → interlocks { palette operation type, palette running length }.
In this case, lasr_run_mode indicate the last one operational mode be copy_above_run or index_run.For example, in the aggregation of coding/decoding operation type, if encoding and decoding/decoded index_run quantity is equal to The quantity of index_run, then operation type aggregation will be terminated.If last_run_mode is copy_above_run, Copy_above_run is inserted in end, without being labeled.For the last one palette operational mode, palette runs quilt It is inferred as at the end of PU/CU.
In the above examples, copy_above_run corresponds to an example syntax element for the above-mentioned "copy above run mode", and index_run corresponds to an example syntax element for the "index run mode".
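As an illustration of the termination rule shared by the examples above, the following sketch decodes a run type aggregation under the assumption that the run-type bins have already been entropy decoded into a simple sequence; the function and the list representation are hypothetical, not the normative syntax:

```python
# Hypothetical sketch (not the normative decoder): the run type aggregation
# terminates once the signaled quantity of index_run modes has been decoded;
# a trailing copy_above_run is appended when last_run_mode indicates one.
INDEX_RUN, COPY_ABOVE_RUN = 0, 1

def decode_run_types(bits, num_index_run, last_run_mode):
    """bits: iterator of already entropy-decoded run-type bins."""
    run_types = []
    seen_index_runs = 0
    for b in bits:
        run_types.append(b)
        if b == INDEX_RUN:
            seen_index_runs += 1
        # Terminate the aggregation: all index_run modes are accounted for.
        if seen_index_runs == num_index_run:
            break
    # If the last run is copy_above_run, it is inserted without being parsed.
    if last_run_mode == COPY_ABOVE_RUN:
        run_types.append(COPY_ABOVE_RUN)
    return run_types

# Example: two index runs signaled; the last run is a copy_above_run.
types = decode_run_types(iter([0, 1, 0, 1, 1]), 2, COPY_ABOVE_RUN)
```

Note how the two trailing bins of the input are never consumed: the aggregation stops as soon as the index_run count is reached, which is exactly what makes the up-front quantity signaling useful.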
In examples 1, 2, 3 and 5 of the above syntax sequences, "palette index aggregation (bypass encoding and decoding) → jump out value aggregation (bypass encoding and decoding)" can be replaced by "interleaved {palette index, jump out value}". The palette indexes and jump out values can be coded in an interleaved manner. If the parsed index is equal to the jump out index, the jump out value can be parsed immediately.
In examples 2 to 7 of the above syntax sequences, "last_run_mode" can be indicated after "the quantity of copy_above_run or the quantity of index_run".
In examples 1 to 8 of the above syntax sequences, "last_run_mode" can be indicated anywhere before the last palette run is indicated.
In examples 1 to 4 of the above syntax sequences, the operation type aggregation is decoded before the running length aggregation. Therefore, for palette run signaling, the maximum possible run can be further subtracted by the quantity of the remaining index run modes, the quantity of the remaining copy above run modes, or the quantity of the remaining index run modes plus the quantity of the remaining copy above run modes. For example, maxPaletteRun = nCbS*nCbS - scanPos - 1 - the quantity of remaining COPY_INDEX_MODE, or maxPaletteRun = nCbS*nCbS - scanPos - 1 - the quantity of remaining COPY_ABOVE_MODE, or maxPaletteRun = nCbS*nCbS - scanPos - 1 - the quantity of remaining COPY_ABOVE_MODE - the quantity of remaining COPY_INDEX_MODE. In the above examples, maxPaletteRun corresponds to an example syntax element for the maximum palette run, nCbS corresponds to an example syntax element for the size of the current luma coding block, and scanPos corresponds to the scan position of the current pixel.
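The bound formulas above can be illustrated numerically; the helper below is a hypothetical sketch (in the real syntax the remaining-mode counts come from the already decoded operation type aggregation):

```python
# Hypothetical sketch of the tightened run bound described in the text.
def max_palette_run(nCbS, scanPos, remaining_copy_index=0, remaining_copy_above=0):
    # Pixels left in the CU after the current scan position, further reduced
    # by the runs that must still start in the remaining modes.
    return nCbS * nCbS - scanPos - 1 - remaining_copy_index - remaining_copy_above

# 8x8 CU, current pixel at scan position 10, three index runs still pending.
bound = max_palette_run(8, 10, remaining_copy_index=3)
```

A tighter maximum shortens the binarization of the run length, which is the point of decoding the run types first.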
In examples 1 to 7 of the above syntax sequences, for the last palette run mode, the palette run is inferred to extend to the end of the PU/CU.
Method 2: aggregate the palette indexes and jump out values at the end.
In SCM-4.0, the context formation of the palette run depends on the palette index. Therefore, the palette index has to be coded before the palette run. In order to aggregate the palette indexes at the end, the context formation of the palette run should be independent of the palette index.
Therefore, the context formation of the palette run needs to be changed so that it only depends on the current palette run mode, the previous palette run mode, the previous palette run, the palette run mode of the previous sample, the palette run of the previous sample, or a combination of the above information. Alternatively, the palette run can be coded using bypass bins.
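A minimal sketch of such an index-independent context selection; the particular mapping from run modes to a context id below is an assumption for illustration, not the SCM context table:

```python
# Hypothetical context selection that depends only on run modes, never on the
# palette index, so the palette indexes can be grouped after all runs.
def run_ctx(cur_run_mode, prev_run_mode):
    # Four contexts addressed by the (current, previous) run mode pair.
    return (cur_run_mode << 1) | prev_run_mode

ctx = run_ctx(1, 0)
```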
The various examples of the syntax sequence of palette index mapping encoding and decoding are as follows.
Syntax sequence example 1: the quantity of copy_above_run or the quantity of index_run → last_run_mode → operation type aggregation (context encoding and decoding) → running length aggregation (independent of palette index) → palette index aggregation (bypass encoding and decoding) → jump out value aggregation (bypass encoding and decoding).
In this case, last_run_mode indicates whether the last palette run is copy_above_run or index_run. For example, when coding/decoding the operation type aggregation, if the quantity of coded/decoded index_run is equal to the quantity of index_run and last_run_mode is index_run, the operation type aggregation is terminated. If the quantity of coded/decoded index_run is equal to the quantity of index_run and last_run_mode is copy_above_run, the operation type aggregation is also terminated and a copy_above_run is inserted at the end. For the last palette run mode, the palette run is inferred to extend to the end of the PU/CU.
Syntax sequence example 2: the quantity of copy_above_run or the quantity of index_run → operation type aggregation → last_run_mode → running length aggregation → palette index aggregation → jump out value aggregation.
In this case, last_run_mode indicates whether the last palette run is copy_above_run or index_run. For example, when coding/decoding the operation type aggregation, if the quantity of coded/decoded index_run is equal to the quantity of index_run, the operation type aggregation is terminated and last_run_mode is indicated. If last_run_mode is copy_above_run, a copy_above_run is inserted at the end. For the last palette run mode, the palette run is inferred to extend to the end of the PU/CU.
Syntax sequence example 3: interleaved {palette operation type, palette running length} → palette index aggregation → jump out value aggregation.
In examples 1 to 3 of the above syntax sequences, "palette index aggregation → jump out value aggregation" can be replaced by "interleaved {palette index, jump out value}". The palette indexes and jump out values can be coded in an interleaved manner. If the parsed index is equal to the jump out index, the jump out value can be parsed immediately.
In examples 1 to 3 of the above syntax sequences, "last_run_mode" can be indicated anywhere before the last palette run is indicated.
In examples 1 and 2 of the above syntax sequences, the palette run aggregation can be decoded before the palette index aggregation. Therefore, for palette run signaling, the maximum possible run can be further subtracted by the quantity of the remaining index run modes, the quantity of the remaining copy above run modes, or the quantity of the remaining index run modes plus the quantity of the remaining copy above run modes. For example, maxPaletteRun = nCbS*nCbS - scanPos - 1 - the quantity of remaining COPY_INDEX_MODE, or maxPaletteRun = nCbS*nCbS - scanPos - 1 - the quantity of remaining COPY_ABOVE_MODE, or maxPaletteRun = nCbS*nCbS - scanPos - 1 - the quantity of remaining COPY_ABOVE_MODE - the quantity of remaining COPY_INDEX_MODE.
In examples 1 and 2 of the above syntax sequences, for the last palette run mode, the palette run is inferred to extend to the end of the PU/CU.
The invention further relates to the various aspects of palette encoding and decoding disclosed below.
Removing the line buffer in palette index map parsing
In SCM-3.0, the four palette index map syntax elements (i.e., palette run mode, palette index, palette run and jump out value) are coded in an interleaved manner. Although the context of the palette run mode is modified in SCM-4.0 to be independent of the palette run mode of the above sample, the index map parsing still needs the information of the above row. For example, when the copy above run mode is used, the quantity of jump out values to be parsed depends on the quantity of jump out pixels to be copied from the above row. When the previous coding mode is the copy above run mode, the index reconstruction also depends on the palette index of the above sample. In order to save the line buffer in palette index map parsing, several methods to remove the data dependency in the above cases are disclosed.
Method 1: if the predicted value is a jump out sample, the jump out value is directly copied in the predicted value copy mode.
In order to remove the dependency in calculating the quantity of jump out pixels to be parsed in the copy above run mode, an embodiment according to the present invention modifies the behavior of the copy above run mode for the samples copied from the above row.
In the copy above run mode, a "palette run" value is transmitted or derived to indicate the quantity of following samples to be copied from the above row. The color index is equal to the color index in the above row. According to one embodiment, if the predicted value (i.e., the above position) is a jump out sample, the current sample copies not only the index (jump out index) but also the jump out value from the above row. No jump out value needs to be parsed for these samples. In this method, even if the signaled index is equal to the jump out index, a run can still be signaled for the index run mode. If the run for the jump out index is greater than 0 (e.g., N, N > 0), the decoder fills the reconstructed values (or jump out values) of the N samples starting from the first sample. The jump out value can be indicated after the run syntax.
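A sketch of this modified copy above behavior under hypothetical data structures (a list of above-row indexes and a parallel list of their jump out values); when the above predictor is a jump out sample, both its index and its jump out value are copied, so nothing is parsed for it:

```python
ESCAPE = -1  # hypothetical marker for the jump out index

def copy_above(above_indices, above_escape_vals, run_len):
    # Copy run_len indices from the above row; where the predictor is a
    # jump out sample, also copy its jump out value instead of parsing one.
    idx = list(above_indices[:run_len])
    esc = [above_escape_vals[i] if above_indices[i] == ESCAPE else None
           for i in range(run_len)]
    return idx, esc

# Above row: index 2, a jump out sample with value 137, index 0.
idx, esc = copy_above([2, ESCAPE, 0], [None, 137, None], 3)
```

Because the jump out value rides along with the copy, the parser never needs the above row to decide how many jump out values follow.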
In order to remove the data dependency between the index parsing and the index reconstruction of the index run mode when the previous mode is the copy above run mode, the redundant index removal is disabled when the previous mode is the copy above run mode, as described in JCTVC-T0078 (Kim et al., CE1-related: simplification for index map coding in Palette mode, ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11 Joint Collaborative Team on Video Coding (JCT-VC), 20th meeting: Geneva, CH, 10-18 Feb. 2015, document: JCTVC-T0078).
Based on method 1, the index map parsing does not depend on the information of the above row. The entropy decoder can parse all palette index map syntaxes by only using the dependency on the previously coded palette run mode and palette index.
In one embodiment, in the copy index run mode, a "palette run" value is transmitted or derived to indicate the quantity of following samples to be coded in the bitstream. The color index of the current position is coded. However, if the sample of the current position is a jump out sample, not only the index of the current sample (the jump out index) is coded, but also the jump out value is coded. In this method, a run is signaled for the index run mode even if the signaled index is equal to the jump out index. If the run for the jump out index is greater than 0 (e.g., N, N > 0), the decoder fills the reconstructed values (or jump out values) of the N samples starting from the first sample. The jump out value can be indicated after the run syntax.
In order to remove the data dependency between the index parsing and the reconstruction of the copy index run mode when the previous mode is the copy index run mode, the redundant index removal is disabled when the previous mode is the copy index run mode.
With the above method, the index map parsing does not depend on the previously parsed index information. The entropy decoder can parse all palette index map syntaxes by only using the dependency on the previously coded palette run mode and palette index.
Method 2: if the predicted value is a jump out sample, the jump out sample is treated as a predefined color index in the predicted value copy mode.
In the copy above run mode, a "palette run" value is transmitted or derived to indicate the quantity of following samples to be copied from the above row. The color index is equal to the color index in the above row. According to one embodiment, if the above or left sample is a jump out sample, the color index of the above sample is treated as a predefined color index (e.g., 0). The current index is set to the predefined index. These predicted value copy modes do not need a jump out value. According to one embodiment, a palette run is signaled for the index run mode even if the signaled index is equal to the jump out index. The jump out value can be indicated after the run syntax. If the run for the jump out index is greater than 0 (e.g., N, N > 0), the first sample is reconstructed with the coded jump out value. The index of the first sample can be set to the jump out index or the predefined color index. The indexes of the remaining N samples are set to the predefined index (e.g., index 0), and the remaining N samples are reconstructed with the value of the predefined index (e.g., index 0). In another example, if the run for the jump out index is greater than 0 (e.g., N, N > 0), the first sample is reconstructed with the jump out value, and the jump out values of the following N samples also need to be signaled. The remaining samples are reconstructed with their respectively signaled jump out values. The indexes of the first sample and the remaining N samples can be set to the predefined index (e.g., index 0).
According to method 2, except for the first sample, the maximum codeword index in the index run mode (i.e., AdjustedIndexMax) is fixed (e.g., fixed to indexMax-1). For the first sample in the CU, AdjustedIndexMax is equal to indexMax. The redundant index removal can still be applied.
According to method 2, although the index map reconstruction still needs the index values of the above samples, the entropy decoder can parse all palette index map syntaxes by only using the dependency on the previously coded palette run mode and palette index.
According to JCTVC-T0078, in order to remove the data dependency of the index reconstruction of the index run mode when the previous mode is the copy above run mode, the redundant index removal is disabled when the previous mode is the copy above run mode.
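Method 2's predictor substitution can be sketched as follows; the ESCAPE marker, the helper name and the choice of 0 as the predefined index follow the example in the text but are otherwise hypothetical:

```python
PREDEFINED_INDEX = 0  # predefined color index assumed in the text
ESCAPE = -1           # hypothetical marker for the jump out index

def predictor_index(above_index):
    # Method 2: a jump out predictor is treated as the predefined color
    # index, so the predicted value copy mode needs no jump out value.
    return PREDEFINED_INDEX if above_index == ESCAPE else above_index
```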
Index map flipping in the copy above mode
The present invention also relates to problems with flipping the index map before index map coding. After the decoder flips the index map, the prediction source in the copy above run mode differs from the original physical above pixels. Fig. 3A shows an example of index map flipping before index coding. Fig. 3B shows an example of the flipped index map.
Fig. 3A shows an example of the original coding unit. After the index map flipping, the pixels in the last row (i.e., pixel 0 to pixel 7) are flipped to the first row, as shown in Fig. 3B. If the prediction can cross the CU, the current first-row pixels are predicted by the above neighboring constructed pixels (NCP). As shown in Fig. 3B, the line-filled blocks indicate the flipped index map, and the clear blocks in Fig. 3A indicate the original index map. For the pixels in the other rows after flipping, the prediction in the copy above run mode becomes the prediction by the pixels at the physically below positions before flipping. In this method, the index reconstruction does not need a second pass.
However, flipping the index map before index map coding means that the pixels in the last row of the physical positions are predicted by the above NCP, as shown in Fig. 4. This prediction process is inefficient because the distance between the predictor and the underlying index to be predicted is very large.
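The row flip described for Fig. 3A/3B can be sketched as a simple row reversal over a toy index map (illustration only, not codec code):

```python
def flip_index_map(index_map):
    # Reverse the row order so the last coded row becomes the first row,
    # letting the first row be predicted from the above NCP row.
    return index_map[::-1]

flipped = flip_index_map([[0, 1], [2, 3], [4, 5]])
```

Flipping twice restores the original map, so the decoder can undo the flip after parsing without a second reconstruction pass.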
Therefore, methods to improve the coding efficiency related to index map flipping are disclosed below.
Method 1: as shown in Fig. 5A and Fig. 5B, regardless of whether the index map is flipped, the index in the copy above run mode is predicted from its physically above position (or the left position if the transpose flag is on). As shown in Fig. 5A, the line-filled blocks indicate the flipped index map, and the clear blocks in Fig. 5B represent the original index map.
Method 2: signal a different run scan start position. This method is similar to method 1, where the index in the copy above run mode is predicted from the pixel at the physically above position. Additional information can be signaled to indicate the "run scan start position" or the "scan pattern". The "run scan start position" can be top-left, top-right, bottom-left or bottom-right. The "scan pattern" can be horizontal scan, vertical scan, horizontal traverse scan or vertical traverse scan.
Method 3: if the index map is flipped, the copy above run coded pixels in the first row are predicted from the samples at their physically nearest positions, as shown in Fig. 6A and Fig. 6B. Fig. 6A shows the flipped samples as indicated by the line-filled blocks, and Fig. 6B shows the samples at the original physical positions. If the transpose flag is off, the copy above run coded pixels in the last row of the physical positions are predicted from the samples at their physically nearest positions.
Method 4: if the index map is flipped, the copy above run coded pixels are predicted from the last row of the physical pixel positions of the above neighboring CU, as shown in Fig. 7A and Fig. 7B. Fig. 7A shows the flipped pixels indicated by the line-filled blocks, and Fig. 7B shows the samples at the original physical positions. If the transpose flag is off, the first row (or the first M rows) of the physical positions can be predicted from the physical pixel positions of the above neighboring CU. M can be 1, 2 or 3. M can also depend on the CU size. M can be signaled so that the decoder can decode accordingly.
Cross-CU prediction
In order to further improve the coding efficiency, a special run is disclosed. This special run extends the copy above run starting from the first sample of the palette coded CU. The special run can be signaled once. The samples in the extended copy above run mode are predicted from the reconstructed pixels of the neighboring CU. The remaining samples in the CU are coded using the palette syntax specified in SCM-4.0 or SCM-3.0, except that the total palette coded samples in the PU/CU are reduced.
Method 1: a syntax element (e.g., line_num, denoted as L) is signaled first to indicate that the first L rows of samples are predicted from the reconstructed pixels of the neighboring CU, where L is a positive integer. The remaining samples are coded using the palette syntax in SCM-4.0 or SCM-3.0, except that the total palette coded samples in the PU/CU are reduced. For the first L rows of samples, their pixel values are predicted from the reconstructed pixels in the neighboring CU. For example, if palette_transpose_flag is 0, the reconstructed pixels in the above CU are used. The pixel values of the first L rows of samples are the reconstructed pixel values of the last row of the above CU. It is similar to applying Intra vertical prediction to the first L rows of samples, while the remaining rows are coded in the normal palette mode.
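A sketch of the vertical Intra-like fill in method 1, assuming the last reconstructed row of the above CU is available as a list; the names are hypothetical:

```python
def predict_first_rows(above_last_row, L, cu_width):
    # Replicate the last reconstructed row of the above CU into the first
    # L rows of the current CU (vertical Intra-like prediction).
    return [list(above_last_row[:cu_width]) for _ in range(L)]

# 4-wide CU, first two rows (L=2) copied from the above CU's last row.
rows = predict_first_rows([9, 8, 7, 6], 2, 4)
```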
Fig. 8A shows an example of the extended copy above run mode, where two rows of pixels (i.e., L=2) are copied from the last row located above the CU boundary 810.
Method 2: a syntax element (e.g., pixel_num, denoted as M) is signaled first to indicate that the first M samples are predicted from the reconstructed pixels of the neighboring CU, where M is a positive integer. The remaining samples are coded using the palette syntax in SCM-4.0 or SCM-3.0, except that the total palette coded samples in the PU/CU are reduced. For example, if palette_transpose_flag is 0, the reconstructed pixels in the above CU are used. The pixel values of the first M samples are the reconstructed pixel values of the last row of the above CU. It is similar to applying Intra vertical prediction to the first M samples. If the width of the CU is CU_width, according to the syntax in SCM-4.0, the (M+1)-th to the (M+CU_width)-th samples in the CU cannot be coded in the copy above run mode. In other words, according to the syntax in SCM-4.0, the samples with scan positions equal to M to (M+CU_width-1) cannot be coded in the copy above run mode. Fig. 8B shows an example of the cross-CU prediction, where the first M (i.e., M=11) samples predicted from the reconstructed pixels are indicated by the syntax element pixel_num (M).
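The scan position restriction stated above (positions M to M+CU_width-1 cannot use the copy above run mode) can be expressed directly; the helper name is hypothetical:

```python
def copy_above_allowed(scan_pos, M, cu_width):
    # Disallow the copy above run mode when the predictor row would fall
    # inside the M samples copied from the neighboring CU.
    return not (M <= scan_pos < M + cu_width)

# M=11 copied samples in an 8-wide CU: positions 11..18 are disallowed.
allowed = copy_above_allowed(11, 11, 8)
```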
For example, the syntax table of palette_coding in SCM-3.0 can be modified, as shown in table 12.
Table 12.
As shown in Table 12, the syntax element pixel_num is incorporated as shown in note (12-1), where pixel_num is signaled before the palette index map coding. The scan positions of the remaining palette samples are coded starting from pixel_num, as shown in note (12-2). The previous palette sample position is derived according to note (12-3). The first row of samples after the first pixel_num samples is not allowed to use the copy above mode, as shown in note (12-4).
The variable AdjustedIndexMax indicating the adjusted maximum index is derived as follows:
AdjustedIndexMax = indexMax
if( scanPos > pixel_num )
  AdjustedIndexMax -= 1
The variable AdjustedRefIndex indicating the adjusted maximum reference index is derived as follows:
AdjustedRefIndex = indexMax + 1
if( scanPos > pixel_num )
  if( palette_run_type_flag[ xcPrev ][ ycPrev ] != COPY_ABOVE_MODE )
    AdjustedRefIndex = PaletteIndexMap[ xcPrev ][ ycPrev ]
  else
    AdjustedRefIndex = PaletteIndexMap[ xC ][ yC - 1 ]
In the above method 1 and method 2, a syntax element copy_from_neighboring_CU_flag can be signaled first. If copy_from_neighboring_CU_flag is 0, line_num and pixel_num are not signaled and are inferred to be 0. If copy_from_neighboring_CU_flag is 1, line_num and pixel_num can be signaled. The actual line_num and pixel_num are equal to the parsed line_num and pixel_num increased by 1.
Method 3: in this method, the neighboring pixels are used to predict the current pixels coded in the palette mode. num_copy_pixel_line is signaled first to indicate that the first num_copy_pixel_line rows of samples are predicted from the reconstructed pixels of the neighboring CU. Except that the starting position is changed and the total palette coded samples in the PU/CU are reduced, the remaining samples are coded by the normal palette index map coding.
For the first num_copy_pixel_line rows of samples, their pixel values are predicted from the reconstructed pixels of the neighboring CU, where the syntax element num_copy_pixel_line corresponds to the quantity of pixel lines to be copied. For example, if palette_transpose_flag is 0, the reconstructed pixels in the above CU are used, as shown in Fig. 9A. The pixel values of the first num_copy_pixel_line (e.g., K) rows of samples are predicted by the reconstructed pixel values of the last row of the above CU. It is similar to applying Intra vertical prediction to the first K rows of samples, while the remaining rows are coded in the normal palette mode. If palette_transpose_flag is 1, the reconstructed pixels in the left CU are used, as shown in Fig. 9B.
The syntax of num_copy_pixel_line can be signaled after the palette mode flag and before the palette table coding. If num_copy_pixel_line is equal to the CU width or CU height, palette_transpose_flag is signaled to indicate whether the entire CU is predicted from the above pixels or the left pixels. Since all samples in the current CU are predicted, the palette table coding and the index map coding are skipped.
If num_copy_pixel_line is less than the CU width or CU height, the normal palette table coding and palette index map coding are applied. In SCM-4.0, if MaxPaletteIndex is equal to 0, palette_transpose_flag is inferred to be 0. However, according to the current method, if MaxPaletteIndex is equal to 0 and num_copy_pixel_line is not equal to 0, palette_transpose_flag still needs to be signaled to indicate whether the first num_copy_pixel_line rows or columns of samples are predicted from the reconstructed pixels in the above CU or the left CU. For the index map coding, the starting sample position is set to num_copy_pixel_line*CU_width. The samples with sample positions between num_copy_pixel_line*CU_width and (num_copy_pixel_line+1)*CU_width-1 cannot select the copy above run mode. In other words, the samples with sample positions less than CU_width*(num_copy_pixel_line+1) cannot be signaled as the copy above run mode.
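Both derived positions in method 3 come directly from the formulas above; a small sketch with hypothetical names:

```python
def index_map_positions(num_copy_pixel_line, cu_width):
    # Starting sample position for index map coding, and the first scan
    # position at which the copy above run mode becomes selectable again.
    start = num_copy_pixel_line * cu_width
    first_copy_above_ok = cu_width * (num_copy_pixel_line + 1)
    return start, first_copy_above_ok

# Two copied lines in an 8-wide CU.
start, ok = index_map_positions(2, 8)
```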
Table 13 shows the example syntax table of the palette encoding and decoding according to the embodiment of method disclosed above.
Table 13.
In Table 13, the syntax element num_copy_pixel_line is incorporated before all other syntaxes, as shown in note (13-1). If num_copy_pixel_line is equal to nCbS (i.e., the CU width or CU height), the syntax palette_transpose_flag is incorporated as shown in note (13-2). The palette indexes of the entire CU are assigned to -1, as shown in note (13-3), indicating that the pixel values are copied from the neighboring CU. If num_copy_pixel_line is not equal to 0 and MaxPaletteIndex is not greater than 0 (e.g., MaxPaletteIndex is equal to 0), the syntax element palette_transpose_flag is incorporated as shown in note (13-4). The palette indexes of the samples in the num_copy_pixel_line rows are assigned to -1, as shown in note (13-5). The first row of samples after the first nCbS*num_copy_pixel_line samples is not allowed to use the copy above mode, as shown in note (13-6).
The variable AdjustedMaxPaletteIndex is derived as follows:
AdjustedMaxPaletteIndex = MaxPaletteIndex
if( PaletteScanPos > num_copy_pixel_line * nCbS )
  AdjustedMaxPaletteIndex -= 1
The variable AdjustedRefPaletteIndex is derived as follows:
AdjustedRefPaletteIndex = MaxPaletteIndex + 1
if( PaletteScanPos > num_copy_pixel_line * nCbS ) {
  xcPrev = x0 + travScan[ PaletteScanPos - 1 ][ 0 ]
  ycPrev = y0 + travScan[ PaletteScanPos - 1 ][ 1 ]
  if( palette_run_type_flag[ xcPrev ][ ycPrev ] != COPY_ABOVE_MODE ) {
    AdjustedRefPaletteIndex = PaletteIndexMap[ xcPrev ][ ycPrev ]    (7-80)
  } else
    AdjustedRefPaletteIndex = PaletteIndexMap[ xC ][ yC - 1 ]
}
If PaletteIndexMap[xC][yC] is equal to -1, the corresponding pixel value is the same as its neighboring pixel. According to this method, if the above or left pixels are unavailable (for example, the samples at the frame boundary, or the inter coded samples when constrained intra prediction is applied), the color with the palette index equal to 0 is used. If the palette table of the current CU is not coded (for example, num_copy_pixel_line is equal to CU_width), the first palette in the palette predictor table is used.
In another example, if the above or left pixels are unavailable, the HEVC intra prediction boundary pixel padding method can be used to generate substitute neighboring pixels.
The number of lines using the copy above pixel line mode can be derived according to the syntax element num_copy_pixel_line_indication instead of directly signaling num_copy_pixel_line. If num_copy_pixel_line_indication is 0, num_copy_pixel_line is derived as 0. If num_copy_pixel_line_indication is 1, num_copy_pixel_line is derived as N, which is a predefined number. If num_copy_pixel_line_indication is k, num_copy_pixel_line is derived as k*N.
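The derivation of num_copy_pixel_line from num_copy_pixel_line_indication reduces to a single multiplication; a minimal sketch, assuming N is the predefined number:

```python
def derive_num_copy_pixel_line(indication, N):
    # indication 0 -> 0, 1 -> N, k -> k*N, where N is a predefined number.
    return indication * N
```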
Method 4: in the method, num_copy_pixel_ is indicated before palette_transpose_flag grammer Line grammer.Table 14 shows the exemplary toner plate encoding and decoding syntax table according to the embodiment of this method.
Table 14.
In Table 14, the syntax element num_copy_pixel_line is incorporated after the palette_escape_val_present_flag signaling, as shown in note (14-1). If num_copy_pixel_line is not equal to 0 and MaxPaletteIndex is not greater than 0 (e.g., MaxPaletteIndex is equal to 0), the syntax palette_transpose_flag is incorporated as shown in note (14-2). The palette indexes of the samples in the num_copy_pixel_line rows are assigned to -1, as shown in note (14-3). The first row of samples after the first nCbS*num_copy_pixel_line samples is not allowed to use the copy above mode, as shown in note (14-4).
Syntactic element num_copy_pixel_line can be in syntactic element palette_escape_val_present_ It is indicated before flag.Table 15 shows the example syntax table of the palette encoding and decoding according to this method.
Table 15.
In table 15, before palette_escape_val_present_flag grammer signaling, syntactic element num_ Copy_pixel_line is incorporated into, and is such as annotated shown in (15-1).If num_copy_pixel_line not equal to 0 and MaxPaletteIndex, then can be according to mode shown in annotation (15-2) no more than 0 (such as MaxPaletteIndex is equal to 0) Merge grammer palette_transpose_flag.The palette index of sample in num_copy_pixel_line row is divided With being -1, such as annotate shown in (15-3).First sample after the one before nCbS* num_copy_pixel_line row is capable not Allow to replicate top mode, such as annotates shown in (15-4).
In this embodiment, if the syntax element num_copy_pixel_line is equal to CU_width, the syntax elements NumPredictedPaletteEntries and num_signaled_palette_entries shall both be 0, and the first coded syntax element palette_predictor_run shall be 1.
In the syntax design where num_copy_pixel_line is signaled before palette_escape_val_present_flag (e.g., Table 15), NumPredictedPaletteEntries and num_signaled_palette_entries shall be 0, the first coded palette_predictor_run shall be 1, and palette_escape_val_present_flag is inferred to be 0.
Method 5: According to this method, num_copy_pixel_line is signaled after palette_transpose_flag. In this syntax design, the syntax element palette_transpose_flag needs to be signaled even if MaxPaletteIndex is equal to 0. Table 16 shows an example syntax table of palette coding according to an embodiment of this method.
Table 16.
Since the syntax element palette_transpose_flag needs to be signaled even when MaxPaletteIndex is equal to 0, the syntax can be incorporated outside the test "if(MaxPaletteIndex > 0)" (as shown in note (16-1)). Meanwhile, the syntax element palette_transpose_flag inside the test "if(MaxPaletteIndex > 0)" is deleted, as indicated by the line-filled text in note (16-2). The syntax element num_copy_pixel_line is incorporated as shown in note (16-3). The palette indices of the samples in the first num_copy_pixel_line rows are assigned to -1, as shown in note (16-4).
If num_copy_pixel_line is an even number, it is natural to use a left-to-right scan for the first normal line, as shown in Fig. 10A. However, if num_copy_pixel_line is an odd number, two types of scans can be selected. One is a left-to-right scan of the first normal line as shown in Fig. 10B, and the other is a right-to-left scan of the first normal line as shown in Fig. 10C.
As shown in Fig. 10B, a traverse scan moving downward from the first normal line is used. In Fig. 10C, a traverse scan starting from the first sample of the current CU is used, skipping the scan of the first num_copy_pixel_line rows. In Tables 13 to 16, the scan in Fig. 10C is used. The syntax tables can be modified correspondingly for the scan in Fig. 10B.
Method 6: Table 17 shows an example syntax table of palette coding for the scan in Fig. 10B.
Table 17.
In Table 17, the syntax element num_copy_pixel_line is incorporated before all other syntax, as shown in note (17-1). If num_copy_pixel_line is equal to nCbS (i.e., the CU width or CU height), the syntax palette_transpose_flag is incorporated, as shown in note (17-2). The palette indices of the entire CU are assigned to -1, as shown in note (17-3), indicating that the pixel values are copied from the neighboring CU. If num_copy_pixel_line is not equal to 0 and MaxPaletteIndex is not greater than 0 (e.g., MaxPaletteIndex is equal to 0), the syntax palette_transpose_flag can be incorporated in the manner of note (17-4). The palette indices of the samples in the first num_copy_pixel_line rows are assigned to -1, as shown in note (17-5). PaletteScanPos is reset to 0, as shown in note (17-6). Since PaletteScanPos is reset, the actual sample index needs to be increased by num_copy_pixel_line*nCbS, as shown in notes (17-7 and 17-10 to 17-12). The vertical position needs to be increased by num_copy_pixel_line, as shown in notes (17-8, 17-9 and 17-13).
The variable AdjustedMaxPaletteIndex is derived as follows:

AdjustedMaxPaletteIndex = MaxPaletteIndex
if( PaletteScanPos > 0 )
	AdjustedMaxPaletteIndex -= 1
The variable adjustedRefPaletteIndex is derived as follows:

adjustedRefPaletteIndex = MaxPaletteIndex + 1
if( PaletteScanPos > 0 ) {
	xcPrev = x0 + travScan[ PaletteScanPos - 1 ][ 0 ]
	ycPrev = y0 + travScan[ PaletteScanPos - 1 ][ 1 ] + num_copy_pixel_line
	if( palette_run_type_flag[ xcPrev ][ ycPrev ] != COPY_ABOVE_MODE ) {
		adjustedRefPaletteIndex = PaletteIndexMap[ xcPrev ][ ycPrev ]
	} else {
		adjustedRefPaletteIndex = PaletteIndexMap[ xC ][ yC - 1 ]
	}
}
Note that in Methods 3 to 6 above, the signaling of num_copy_pixel_line can also be constructed by reusing the palette_run syntax elements and contexts existing in the HEVC SCC specification. For example, it can be signaled as shown in Table 18.
Table 18.
In Table 18, with the above syntax, the decoded copy_pixel_line_length can semantically be the number of copied pixel rows. The maximum value of this coded element is the block height minus 1.
The decoded copy_pixel_line_length can also be the actual number of samples using the copy-pixel mode, in which case the number of copied pixel lines is derived as copy_pixel_line_length/block_width. Note that in this method, a conformance constraint is applied to copy_pixel_line_length so that it must be a multiple of block_width.
Method 7: In this method, the current syntax structure with a modified decoding process is used to indicate whether a block starts with the copy-pixel mode from outside the current CU, and the number of lines (number of samples) using the copy-pixel mode. For example, this can be accomplished as follows:
The copy-above run mode is allowed for the first sample in a block. This is signaled using the syntax element palette_run_type_flag[0][0]. If palette_run_type_flag[0][0] is 1, the following run of samples for the signaled number of lines is filled with copied pixels. If palette_run_type_flag[0][0] is 0, the remaining syntax signaling stays the same as the current syntax structure.
When palette_run_type_flag[0][0] is 1, the same syntax element used for signaling the palette run length is used to signal the number of lines using copied pixels. There are two methods of signaling the number of lines using the palette run length syntax.
o The decoded palette run length R for the copy-pixel mode semantically means the number of lines (rather than the number of samples). Therefore, the actual run of copied pixels is the decoded value R*block_width for a horizontal scan, or the decoded value R*block_height for a vertical scan. The maximum value of the decoded R shall be the block height (or width).
o Another method is that the decoded palette run length R is the actual run of copied pixels.
This method does not need to change the semantics and decoding process of the copy-pixel run.
Note that for this method, a conformance constraint must be applied to the decoded value R so that it must be a multiple of block_width.
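The two run-length semantics above can be contrasted in a short sketch (Python, hypothetical helper name; a horizontal scan is assumed, so the multiplier is block_width):

```python
def copy_pixel_run_in_samples(R, block_width, r_means_lines):
    """Interpret the decoded palette run length R for the copy-pixel mode."""
    if r_means_lines:
        # First semantics: R counts lines; the actual sample run is R * block_width.
        return R * block_width
    # Second semantics: R is already the sample run, but a conformance
    # constraint requires it to be a multiple of block_width.
    assert R % block_width == 0, "conformance: R must be a multiple of block_width"
    return R
```

Either way the decoder ends up with the same number of copied samples; only the signaled value and its conformance constraint differ.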
When palette_run_type_flag[0][0] is 1, the samples of the line right after the copied pixel rows cannot use the copy-above run mode. The parsing criterion is modified according to this condition so as not to parse the run_type flag of this line.
When palette_run_type_flag[0][0] is 0, the remaining samples in the first row cannot use the copy-above run mode. The parsing criterion is modified according to this condition so as not to parse the run_type flags of these samples.
Table 19 shows an example syntax table of palette coding according to this method. The coded "palette run length R" of the copy-pixel mode semantically means the number of lines (not the number of samples).
Table 19.
In Table 19, since the syntax element palette_transpose_flag needs to be signaled even if MaxPaletteIndex is equal to 0, the syntax is incorporated outside the test "if(MaxPaletteIndex > 0)" (as shown in note (19-1)). Meanwhile, the line-filled text in note (19-2) indicates deletion of the syntax element palette_transpose_flag inside the test "if(MaxPaletteIndex > 0)". The palette indices of the entire CU are first reset to -1, as shown in note (19-3), indicating that the pixel values are copied from the neighboring CU. The palette_run_type_flag and paletteRun of the first sample (PaletteScanPos == 0) are used to indicate num_copy_pixel_line. As shown in note (19-5), for the first sample, if palette_run_type_flag is equal to COPY_ABOVE_MODE, maxPaletteRun is set equal to nCbS-1, as shown in note (19-4), and num_copy_pixel_line is equal to the decoded paletteRun. Otherwise, num_copy_pixel_line is set to 0.
Table 20 shows another exemplary syntax table of palette coding according to this method. The coded "palette run length R" of the copy-pixel mode is the actual run length, and a conformance constraint is applied so that R must be a multiple of CU_width.
Table 20.
In Table 20, since the syntax element palette_transpose_flag needs to be signaled even if MaxPaletteIndex is equal to 0, the syntax is incorporated outside the test "if(MaxPaletteIndex > 0)" (as shown in note (20-1)). Meanwhile, the line-filled text in note (20-2) indicates deletion of the syntax element palette_transpose_flag inside the test "if(MaxPaletteIndex > 0)". The palette indices of the entire CU are first reset to -1, as shown in note (20-3), indicating that the pixel values are copied from the neighboring CU. The palette_run_type_flag and paletteRun of the first sample (PaletteScanPos == 0) are used to indicate num_copy_pixel_line. As shown in note (20-4), for the first sample, if palette_run_type_flag is equal to COPY_ABOVE_MODE, num_copy_pixel_line is equal to the decoded paletteRun/nCbS. Note that in this case, a conformance constraint is applied to paletteRun so that it must be a multiple of nCbS. Otherwise (i.e., palette_run_type_flag is not equal to COPY_ABOVE_MODE), num_copy_pixel_line is set to 0.
Method 8: According to this method, only num_copy_pixel_line equal to 0 or CU_width is tested. A shortcut of the palette mode is provided by introducing a new syntax element pred_from_neighboring_pixels. If pred_from_neighboring_pixels is 1, palette_transpose_flag is signaled.
If pred_from_neighboring_pixels is 1 and palette_transpose_flag is 0, all samples are predicted from the pixels of the above CU. If pred_from_neighboring_pixels is 1 and palette_transpose_flag is 1, all samples are predicted from the pixels of the left CU. If the neighboring pixels are unavailable, there are two methods to generate substitute pixels. According to the first method, the Intra prediction boundary pixel generation method can be used, which is similar to horizontal or vertical Intra prediction without residual coding. According to the second method, the color with palette index equal to 0 is used. If the palette table of the current CU is not coded (for example, num_copy_pixel_line is equal to CU_width), the first palette entry in the palette predictor table is used.
Table 21 shows an example syntax table of palette coding according to this method.
Table 21.
In Table 21, the syntax element pred_from_neighboring_pixels is incorporated as shown in note (21-1). If pred_from_neighboring_pixels is true, palette_transpose_flag is incorporated and the palette indices of the entire CU are set to -1, as shown in note (21-2).
In Methods 3 to 8, only one palette_transpose_flag is signaled. If palette_transpose_flag is equal to 0, the vertical copy-pixel mode is used first, and then the horizontal index scan is used. Otherwise, the horizontal copy-pixel mode is used first, and then the vertical index scan is used.
Alternatively, two transpose flags can be signaled, e.g., palette_copy_pixel_transpose_flag and palette_scan_transpose_flag. palette_copy_pixel_transpose_flag indicates the direction of the copy-pixel mode from the neighboring CU. palette_scan_transpose_flag indicates the direction of the palette scan. For example, palette_copy_pixel_transpose_flag equal to 0 indicates the copy-pixel mode from the above CU, and palette_copy_pixel_transpose_flag equal to 1 indicates the copy-pixel mode from the left CU. palette_scan_transpose_flag equal to 0 indicates that a horizontal scan is used, and palette_scan_transpose_flag equal to 1 indicates that a vertical scan is used.
For example, in Fig. 11A, palette_copy_pixel_transpose_flag is 0 and palette_scan_transpose_flag is 0. In Fig. 11B, palette_copy_pixel_transpose_flag is 0 and palette_scan_transpose_flag is 1. In Fig. 11C, palette_copy_pixel_transpose_flag is 1 and palette_scan_transpose_flag is 0.
Cross-CU Prediction by Reversed Traverse Scan and Rotated Traverse Scan
According to one embodiment, prediction across CUs can be achieved with a reversed traverse scan and a rotated traverse scan. Fig. 12A shows an example of cross-CU prediction by a reversed traverse scan, and Fig. 12B shows an example of cross-CU prediction by a rotated traverse scan. For both scans, the scan positions of the normal index map coding start at 0 and end at nCbS*nCbS-num_copy_pixel_line*nCbS-1. For the remaining samples with sample positions equal to or greater than nCbS*nCbS-num_copy_pixel_line*nCbS, PaletteIndexMap[xC][yC] is set to -1, which means that their pixel values are the same as the neighboring pixels.
Context Formation and Binarization of num_copy_pixel_line
The syntax element num_copy_pixel_line can be coded using the K-th order Exponential-Golomb code (EG-K code), the truncated K-th order Exponential-Golomb code (truncated EG-K code), a unary code truncated at N plus an EG-K code, or the same binarization method used for palette runs (i.e., binarized into palette_run_msb_id_plus1 and palette_run_refinement_bits).
Context bins can be used for coding num_copy_pixel_line. For example, the same binarization method as used for the palette run can be used for num_copy_pixel_line. The first N bits of palette_run_msb_id_plus1 can use context-coded bins. For example, N can be 3. The remaining bins are coded as bypass bins. The contexts can be shared with palette run coding. For example, the contexts can be shared with the extended copy-above run mode.
Since the probability of num_copy_pixel_line being equal to CU_width or 0 is greater than the probability of num_copy_pixel_line being another number, the code binarization is modified to shorten the codeword of num_copy_pixel_line equal to CU_width. For example, the value of CU_width is inserted before the number P (e.g., 1) in the codeword table used for binarization. Values of num_copy_pixel_line equal to or greater than P are increased by 1 at the encoder side. At the decoder side, if the parsed codeword is P, it indicates that num_copy_pixel_line is equal to CU_width. If the parsed codeword is less than P, num_copy_pixel_line is equal to the parsed codeword. If the parsed codeword is greater than P, num_copy_pixel_line is equal to the parsed codeword minus 1.
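The codeword remapping above is a simple invertible value shuffle. The sketch below (Python, hypothetical helper names) shows the encoder-side and decoder-side mappings, assuming P = 1 as in the example:

```python
def map_to_codeword(value, cu_width, P=1):
    # Encoder side: CU_width takes the short codeword P;
    # values >= P are shifted up by 1 to make room.
    if value == cu_width:
        return P
    return value + 1 if value >= P else value

def map_from_codeword(codeword, cu_width, P=1):
    # Decoder side: invert the mapping.
    if codeword == P:
        return cu_width
    return codeword if codeword < P else codeword - 1
```

The round trip map_from_codeword(map_to_codeword(v, w), w) returns v for every legal value v, including v equal to CU_width.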
In another embodiment, two extra bits are used to indicate whether num_copy_pixel_line is equal to 0, CU_width, or another number. For example, "0" indicates that num_copy_pixel_line is equal to 0, "10" indicates that num_copy_pixel_line is equal to CU_width, and "11" followed by a codeword for L indicates that num_copy_pixel_line is equal to L+1.
In yet another embodiment, the two extra bits are assigned differently. For example, "0" indicates that num_copy_pixel_line is equal to CU_width, "10" indicates that num_copy_pixel_line is equal to 0, and "11" followed by a codeword for L indicates that num_copy_pixel_line is equal to L+1.
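The first prefix assignment can be sketched as a small bit parser (Python, hypothetical helper name; a unary code terminated by 0 stands in for the unspecified codeword of L):

```python
def parse_num_copy_pixel_line(bits, cu_width):
    """Parse the two-extra-bit prefix scheme: '0' -> 0, '10' -> CU_width,
    '11' + codeword(L) -> L + 1. Returns (value, bits consumed)."""
    if bits[0] == 0:
        return 0, 1
    if bits[1] == 0:
        return cu_width, 2
    # Remaining bits: a codeword for L; here a unary code terminated by 0.
    L, pos = 0, 2
    while bits[pos] == 1:
        L += 1
        pos += 1
    return L + 1, pos + 1
```

With this scheme the two most probable values (0 and CU_width) cost at most two bits, and every other value pays the two-bit "11" prefix plus its own codeword.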
For Methods 4 to 7 mentioned in cross-CU region prediction, num_copy_pixel_line is signaled after the palette table coding. The binarization of num_copy_pixel_line can be modified according to the decoded information of the palette table coding. For example, if NumPredictedPaletteEntries and num_signaled_palette_entries are not both 0, it means that at least one sample row/column is coded by the normal palette mode. Therefore, num_copy_pixel_line should not be equal to CU_width, and the codeword range of num_copy_pixel_line is restricted to from 0 to (CU_width-1). For example, if the truncated K-th order Exponential-Golomb code (truncated EG-K code), a unary code truncated at N plus an EG-K code, or the same binarization method used for palette runs is used to code num_copy_pixel_line, then cMax, MaxPaletteRun, or the maximum possible value is set to CU_width-1. The binarization method for palette runs binarizes the run into palette_run_msb_id_plus1 and palette_run_refinement_bits.
Search Method for Determining num_copy_pixel_line
In another embodiment, search methods for determining the number of copied pixel lines (i.e., num_copy_pixel_line) are disclosed.
Method 1: At the encoder side, the value of num_copy_pixel_line is determined. The first num_copy_pixel_line columns/rows are predicted from the neighboring pixels. The remaining samples are used to derive the palette and the index map of the remaining samples.
Method 2: At the encoder side, the palette is first derived using the samples of the entire CU. According to this palette table, the optimal value of num_copy_pixel_line can be determined using a rate-distortion optimization (RDO) process. Interpolation can be used to estimate the bit cost of different num_copy_pixel_line values. For example, if the bit cost of num_copy_pixel_line = 0 is R0 and CU_width is 16, the bit cost of num_copy_pixel_line = 3 is estimated as R0*(CU_width-3)/CU_width (i.e., 13/16*R0).
After num_copy_pixel_line is determined, the first num_copy_pixel_line columns/rows are predicted from the neighboring pixels. The remaining samples are used to derive the palette and the index map of the remaining samples again.
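The interpolation in Method 2 is a linear scaling of the measured bit cost by the fraction of rows still coded by the palette. A minimal sketch (Python, hypothetical helper name, assuming the cost scales with the number of remaining rows as stated):

```python
def estimated_bit_cost(R0, num_copy_pixel_line, cu_width):
    # R0: measured bit cost when no rows are copied (num_copy_pixel_line = 0).
    # The cost is assumed proportional to the rows still palette-coded.
    return R0 * (cu_width - num_copy_pixel_line) / cu_width
```

For R0 = 16 bits and CU_width = 16, copying 3 rows gives an estimate of 13 bits (13/16 of R0), matching the worked example above.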
Row-Based Pixel Copying from a Neighboring CU
In the preceding sections, row-based pixel copying from a neighboring CU is disclosed for the palette mode. The syntax element num_copy_pixel_row indicating the number of copied pixel rows is signaled first, to indicate that the first num_copy_pixel_row rows of samples are predicted from the reconstructed pixels in the neighboring CU. A similar concept can be applied to other modes, such as Inter mode, Intra BC mode, and Intra mode.
In one embodiment, row-based pixel copying from a neighboring CU is applied to a PU or CU in Inter mode and/or Intra mode. The syntax element num_copy_pixel_row is signaled first. If num_copy_pixel_row is not equal to 0, the syntax element copy_pixel_row_direction_flag is signaled to indicate the direction of the copied pixel rows. If copy_pixel_row_direction_flag is 0, num_copy_pixel_row indicates that the first num_copy_pixel_row rows of samples are predicted from the reconstructed pixels of the above CU. If copy_pixel_row_direction_flag is 1, num_copy_pixel_row indicates that the first num_copy_pixel_row columns of samples are predicted from the reconstructed pixels of the left CU. For example, Fig. 13 shows an example of an 8x8 CU coded in Inter mode. The num_copy_pixel_row of this CU is 3, and the copy_pixel_row_direction_flag of this CU is 0. The predictors in the top three rows are replaced by the reconstructed pixel values of the last row of the above CU. The remaining pixels are predicted by the original Inter mode. It is similar to performing Inter-mode prediction on the entire CU/PU and then replacing the first num_copy_pixel_row rows or columns with the neighboring pixels.
The Intra prediction neighboring pixel construction can be used to generate the neighboring reference pixels. If the neighboring pixels are unavailable (e.g., outside the picture boundary), the reference pixel padding method in Intra prediction can be used to generate the neighboring reference pixels. The smoothing filter in Intra prediction can be applied or turned off.
The syntax elements num_copy_pixel_row and copy_pixel_row_direction_flag can be signaled at the CU level or PU level. num_copy_pixel_row and copy_pixel_row_direction_flag can be signaled before the CU or PU, in the middle of the CU or PU, or at the end of the CU or PU. For example, if num_copy_pixel_row and copy_pixel_row_direction_flag are signaled at the CU level and before the CU, num_copy_pixel_row and copy_pixel_row_direction_flag can be signaled before part_mode. The codeword of part_mode can be adaptively changed according to the values of num_copy_pixel_row and copy_pixel_row_direction_flag. For example, if copy_pixel_row_direction_flag is 0 and num_copy_pixel_row is equal to or greater than CU_height/2, PART_2NxN and PART_2NxnU are invalid. The codeword binarization of part_mode is modified accordingly. For example, if PART_2NxN and PART_2NxnU are removed, the binarization of part_mode is shown in Table 22.
Table 22.
In Table 22, the text with line-filled background corresponds to deleted text. In another example, if num_copy_pixel_row and copy_pixel_row_direction_flag are signaled at the CU level and in the middle or at the end of the CU, then num_copy_pixel_row and copy_pixel_row_direction_flag can be signaled after part_mode. After part_mode is received, the values of num_copy_pixel_row and copy_pixel_row_direction_flag can be restricted. For example, if part_mode is PART_2NxN and copy_pixel_row_direction_flag is 0, the value of num_copy_pixel_row is restricted to the range from 0 to CU_height/2.
Intra Boundary Reference Pixels
In Intra mode, if num_copy_pixel_row is not 0, the neighboring reference pixels can be the same as those derived in HEVC. This is the same as performing Intra prediction of the PU and then replacing the first rows or columns with the pixels of the neighboring CU.
In another embodiment, if num_copy_pixel_row is not 0, the positions of the neighboring reference pixels are changed. Fig. 14 and Fig. 15 show examples of changing the positions of the neighboring reference pixels according to this embodiment. In Fig. 14 and Fig. 15, num_copy_pixel_row is 3 and copy_pixel_row_direction_flag is 0. The above reference pixels and the above-left reference pixel are moved to the third row. As shown in Fig. 14, the above-right reference pixels are copied from the above-right CU. As shown in Fig. 15, the above-right reference pixels are copied from the rightmost pixel of the third row.
Residual Coding for Regions Predicted from a Neighboring CU
According to one embodiment, if num_copy_pixel_row is N and copy_pixel_row_direction_flag is 0, the top N rows are predicted from the neighboring CU. The residual of the top N rows can be restricted to 0. According to another embodiment, the residual of the top N rows can be signaled. For Inter mode, the HEVC residual quadtree is used.
Context Formation and Binarization of num_copy_pixel_row
num_copy_pixel_row can be coded using the K-th order Exponential-Golomb code (EG-K code), the truncated K-th order Exponential-Golomb code (truncated EG-K code), a unary code truncated at N plus an EG-K code, or the same binarization method used for palette runs (i.e., binarized into palette_run_msb_id_plus1 and palette_run_refinement_bits).
Context bins can be used for coding num_copy_pixel_row. For example, the same binarization method as used for palette runs can be used for num_copy_pixel_row. The first N bits of palette_run_msb_id_plus1 can use context-coded bins. For example, N can be 3. The remaining bins are coded as bypass bins. The contexts can be shared with palette run coding. For example, the contexts can be shared with the extended copy-above run mode.
Since the probability of num_copy_pixel_row being equal to CU_width or 0 is higher than the probability of num_copy_pixel_row being another number, the code binarization can be modified to shorten the codeword of num_copy_pixel_row equal to CU_width. For example, the value of CU_width can be inserted before the number M (e.g., 1) in the codeword table. Values of num_copy_pixel_row equal to or greater than M are increased by 1 at the encoder side. At the decoder side, if the parsed codeword is M, it indicates that num_copy_pixel_row is equal to CU_width. If the parsed codeword is less than M, num_copy_pixel_row is equal to the parsed codeword. If the parsed codeword is greater than M, num_copy_pixel_row is equal to the parsed codeword minus 1.
According to another embodiment, two extra bits are used to indicate whether num_copy_pixel_row is equal to 0, CU_width, or another number. For example, "0" indicates that num_copy_pixel_row is equal to 0, "10" indicates that num_copy_pixel_row is equal to CU_width, and "11" followed by a codeword for L indicates that num_copy_pixel_row is equal to L+1. In another example, "0" indicates that num_copy_pixel_row is equal to CU_width, "10" indicates that num_copy_pixel_row is equal to 0, and "11" followed by a codeword for L indicates that num_copy_pixel_row is equal to L+1.
For Methods 4 to 7 mentioned in the prediction across part of a CU, num_copy_pixel_row can be signaled after the palette table coding. The binarization of num_copy_pixel_row can be modified according to the decoded information of the palette table coding. For example, if NumPredictedPaletteEntries and num_signaled_palette_entries are not both 0, it means that the samples of at least one row/column are coded in the normal palette mode, so num_copy_pixel_row should not be equal to CU_width. Therefore, the codeword range of num_copy_pixel_row is restricted to from 0 to (CU_width-1). For example, if the truncated K-th order Exponential-Golomb code (truncated EG-K code), a unary code truncated at N plus an EG-K code, or the same binarization method used for palette runs (binarized into palette_run_msb_id_plus1 and palette_run_refinement_bits) is used to code num_copy_pixel_row, then cMax, MaxPaletteRun, or the maximum possible value is set to CU_width-1.
In this section, the above CU_width can be replaced by CU_height, PU_width, or PU_height, depending on whether the row-based pixel copying from neighbors is applied at the CU level or the PU level.
Number of Coded Indices
In the SCM-4.0 palette index map coding, the number of indices is signaled first. To signal the number of indices, the variable "number of indices - palette size" is derived first. A mapping process is then performed to map "number of indices - palette size" to a "mapped value". The mapped value is binarized and signaled with the same binarization method as "coeff_abs_level_remaining". The prefix part is signaled by a truncated Rice code. The suffix part is signaled by an Exponential-Golomb code. This binarization process can be considered to have an input cParam, and cParam is set to (2 + indexMax/6). However, indexMax/6 requires a division or a lookup-table operation. Therefore, according to an embodiment of the invention, cParam is set to (2 + indexMax/M), where M is a power of 2 (i.e., 2^n, n being an integer). Therefore, indexMax/M can be implemented by right-shifting indexMax by n. For example, setting cParam to (2 + indexMax/4) or (2 + indexMax/8) can be implemented by right-shifting indexMax by 2 or 3, respectively.
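The division-free derivation can be stated in two lines. The sketch below (Python, hypothetical helper names) contrasts the SCM-4.0 formula with the proposed power-of-2 variant, in which the division becomes a right shift:

```python
def c_param_scm(index_max):
    # SCM-4.0 style: 2 + indexMax / 6 needs a division or a lookup table.
    return 2 + index_max // 6

def c_param_shift(index_max, n=3):
    # Proposed: M = 2**n, so indexMax / M is a right shift by n.
    return 2 + (index_max >> n)
```

With n = 2 (M = 4) or n = 3 (M = 8), the shift replaces the division while keeping cParam close to the original value for typical indexMax.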
Grouping All Escape Colors Before Coding/Parsing the Palette Index Map
In the current HEVC SCC specification or its previous versions, during the coding of the index map, the values of escape pixels in palette coding are either signaled in an interleaved manner with the other regular indices, or the pixel values are grouped together after the index map coding is completed. According to one embodiment of the present invention, all escape pixel values are grouped before the index map coding.
Suppose there are N escape pixels at different positions in the current coding block, where N is a positive integer. In one embodiment, before the palette index map of the coding block is encoded/decoded, all color values of these escape pixels are encoded/decoded together. In this way, when an index is decoded as an escape index, the corresponding pixel value no longer needs to be decoded. Note that some escape pixels may have the same color value. In one embodiment, the pixel value of each occurrence of an escape pixel is still written into the bitstream.
Fig. 16 shows an example of decoding escape colors with N = 5 according to an embodiment of this method, where the pixel value of each occurrence of an escape pixel is still written into the bitstream. In this example, a horizontal traverse scan is used. According to the decoding order, each escape pixel can find the corresponding color in the decoded table.
In another embodiment, only non-repetitive color values are written into the bitstream, and an index into these written colors is signaled for each escape pixel.
Fig. 17 shows an example of decoding escape colors according to an embodiment of this method for N = 5, where only non-repetitive color values are written into the bitstream. In this example, a horizontal traverse scan is used. Only the non-repeated colors are decoded (e.g., M = 3), and the index into the color table is signaled for each occurrence of an escape pixel (e.g., N = 5). According to the decoding order, each escape pixel can find the corresponding color in the decoded table.
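The deduplicating embodiment amounts to building a small color table plus a per-pixel index list at the encoder. A minimal sketch (Python, hypothetical helper name, not the normative process):

```python
def group_escape_colors(escape_pixels):
    """Build the deduplicated color table (M unique entries, in
    first-occurrence order) and one table index per escape pixel (N entries)."""
    table = []
    indices = []
    for color in escape_pixels:
        if color not in table:
            table.append(color)      # new unique color: written to bitstream
        indices.append(table.index(color))  # index signaled per escape pixel
    return table, indices
```

For five escape pixels using three distinct colors, this yields a table of M = 3 colors and N = 5 indices, mirroring the Fig. 17 example.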
Termination of Escape Color Signaling
In one embodiment, the number of encoded/decoded escape colors is signaled. For example, after each escape color is encoded or decoded, a 1-bit flag "end_of_escape_color_flag" is used to signal whether this is the last color to be encoded or decoded. When the decoded end_of_escape_color_flag is 1, no more escape colors need to be decoded. Suppose there are N escape colors in the current coding block and the last M pixels have the same color value, where M and N are integers with M <= N. In another embodiment, only one color value of these M pixels needs to be sent, and end_of_escape_color_flag is set to 1. The last (M-1) escape pixels are inferred to share the last decoded escape color value. For N = 5 and M = 3, an example of this method is shown in Table 23.
Table 23.
In another embodiment, the total number of escape colors is explicitly signaled before the color values are encoded/decoded. In addition, end_of_escape_color_flag can be bypass coded or context coded.
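The early-termination inference described above can be sketched as follows (Python, hypothetical helper name; coded_colors stands for the N-M+1 color values actually sent before the flag terminates the list):

```python
def decode_escape_colors(coded_colors, total_escape_pixels):
    """Reconstruct all escape colors of a block when the list was terminated
    early: the remaining pixels are inferred to share the last decoded value."""
    colors = list(coded_colors)
    while len(colors) < total_escape_pixels:
        colors.append(coded_colors[-1])   # inferred shared last escape color
    return colors
```

For N = 5 and M = 3 (the Table 23 example), only three color values are sent and the last two pixels reuse the third value.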
Restriction on the Total Number of Escape Colors Allowed in a Coding Block
The total number of decoded escape colors can be limited by setting a maximum allowed number at a high-level header, such as the sequence level, picture level, or slice level.
If the maximum number is reached, according to one embodiment, the remaining escape pixels are inferred to share the value of the last decoded escape color. In another embodiment, the remaining escape pixels are inferred to share a specific decoded escape color value, such as the most frequent color.
Pixel Copying from Outside the CU
In SCM-3.0, the color index value range depends on the palette size and the escape flag. If the escape flag is off, the maximum index value is equal to the palette size. If the escape flag is on, the maximum index value is equal to (palette size + 1). If the maximum index value is equal to N, the possible index value range is from 0 to (N-1). In SCM-3.0, a maximum index value equal to 0 is not allowed. If the maximum index value is equal to 1, all color indices in the CU will be inferred to be 0: when there is only one possible index value, all color indices are assumed to be 0.
According to the disclosure above, a pixel can be signaled as COPY_ABOVE. In this case, not only the pixel index of the above pixel but also its pixel value is copied. The decoder can reconstruct a COPY_ABOVE pixel from the copied pixel value without referring to the palette. If the above pixel lies across the CU boundary, according to the disclosure above, a special index (denoted as N) is assigned to the neighboring constructed pixels (NCPs) of the neighboring CU. When a pixel is signaled as COPY_ABOVE, not only the pixel index (N) of the above pixel but also the pixel value of the above pixel, indicated by the filled region in Figure 18, is copied, where Figure 18 illustrates the CU boundary (1810).
With the method of copying NCP pixel values, the assumption used for handling palette-coded CUs with zero or one index value no longer holds. For the case where the maximum index value equals 0, all pixels in the CU can be predicted from the NCPs, as shown in Figure 19.
If the maximum index value equals 1, not all color indices need to be 0. Some pixels can be 0, while other pixels can be predicted from the NCPs, as shown in Figure 20.
For the cases illustrated in Figures 19 and 20, no corresponding syntax is available in SCM3.0. Therefore, new syntax to signal these cases is disclosed below.
Syntax elements for index prediction across CUs
In SCM3.0, a palette-coded CU includes the following syntax:
palette_share_flag equal to 1 specifies that the palette size equals previousPaletteSize and that the entire set of palette entries is the same as the previous palette entries.
palette_transpose_flag equal to 1 specifies that the transpose process is applied to the associated palette indices of the current CU. palette_transpose_flag equal to 0 specifies that the transpose process is not applied to the associated palette indices of the current CU.
palette_escape_val_present_flag specifies the presence of escape-coded sample values.
palette_prediction_run[i] specifies the difference between the index of the currently reused entry and the index of the next reused entry in the previous palette previousPaletteEntries, with the following exceptions: palette_prediction_run equal to 0 indicates that the difference between the indices of the current and next reused entries is 1, and palette_prediction_run equal to 1 indicates that no more entries of the previous palette previousPaletteEntries are reused.
num_signaled_palette_entries specifies the number of entries in the palette that are explicitly signaled for the current coding unit.
palette_entries specifies the i-th element in the palette for color component cIdx.
palette_run_coding() specifies the run coding mode of the index map.
To provide syntax for index prediction across CUs, embodiments of the present invention disclose the following syntax examples:
Syntax example 1: a new flag all_pixel_from_NCP_flag is added. If all_pixel_from_NCP_flag is off, the remaining syntax is the same as SCM3.0. In the first row, the copy-run mode can be signaled to allow prediction across CUs. If all_pixel_from_NCP_flag is on, it implies that all pixels are predicted from the NCPs. palette_transpose_flag can indicate prediction from the left NCPs or the above NCPs; other prediction directions can also be signaled. If all_pixel_from_NCP_flag is on, the signaling of palette_share_flag, palette_escape_val_present_flag, palette_prediction_run, num_signaled_palette_entries, palette_entries, and palette_run_coding() can be skipped.
Figure 21 shows an exemplary flowchart of the signaling supporting index prediction across CUs according to the example above. In step 2110, whether all_pixel_from_NCP_flag equals 1 is tested. If the result is "Yes", step 2130 is performed. If the result is "No", step 2120 is performed. In step 2130, palette_transpose_flag is signaled to indicate prediction from the left NCPs or the above NCPs. In step 2120, the SCM3.0-based syntax is used for index prediction across CUs.
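The Figure 21 decision of syntax example 1 can be sketched as follows; a minimal illustration returning the list of syntax elements still to be signaled, with a hypothetical function name (the SCM3.0 element list is the one enumerated above).

```python
def example1_syntax(all_pixel_from_ncp_flag):
    """Syntax example 1 sketch: when all_pixel_from_NCP_flag is on,
    all pixels are predicted from NCPs and only the prediction
    direction (left vs. above NCPs) still needs to be signaled;
    otherwise the SCM3.0 palette syntax follows."""
    if all_pixel_from_ncp_flag:
        return ["palette_transpose_flag"]
    return ["palette_share_flag", "palette_escape_val_present_flag",
            "palette_prediction_run", "num_signaled_palette_entries",
            "palette_entries", "palette_run_coding()"]
```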
Syntax example 2: a new flag any_pixel_from_NCP_flag is added. If any_pixel_from_NCP_flag is off, the remaining syntax is the same as SCM3.0, and in the first row the copy-run mode is not signaled (no prediction across CUs). If any_pixel_from_NCP_flag is on, it implies that some pixels are predicted from the NCPs. The encoder can signal palette_share_flag, palette_prediction_run, num_signaled_palette_entries, and palette_escape_val_present_flag, and the decoder can derive the maximum index value from this information. If the maximum index value equals 0, all pixels are predicted from the NCPs and palette_run_coding() can be skipped. If the maximum index value is greater than 0, some pixels are predicted from the NCPs and palette_run_coding() can be signaled.
Figure 22 shows an exemplary flowchart of the signaling supporting index prediction across CUs according to the example above. As shown in step 2210, whether any_pixel_from_NCP_flag equals 1 is tested. If the result is "Yes", step 2230 is performed. If the result is "No", step 2220 is performed. In step 2220, the SCM3.0-based syntax is used for index prediction, and no prediction across CUs is needed. In step 2230, various syntax elements including palette_share_flag, palette_prediction_run, num_signaled_palette_entries, and palette_escape_val_present_flag are signaled. The decoder derives the maximum index value from this information and checks in step 2240 whether the maximum index value equals 0. If the result is "Yes", step 2260 is performed. If the result is "No", step 2250 is performed. In step 2250, palette_transpose_flag and palette_run_coding() are signaled. In step 2260, palette_transpose_flag is signaled and palette_run_coding() is skipped (i.e., all pixels are predicted from the NCPs).
Syntax example 3: a new flag any_pixel_from_NCP_flag is added. If any_pixel_from_NCP_flag is off, the remaining syntax is the same as SCM3.0, and in the first row the copy-run mode is not signaled (no prediction across CUs). If any_pixel_from_NCP_flag is on, it implies that some pixels are predicted from the NCPs. The encoder can signal palette_share_flag, palette_prediction_run, num_signaled_palette_entries, and palette_escape_val_present_flag, and the decoder can derive the maximum index value from this information. If the maximum index value equals 0, all pixels are predicted from the NCPs and palette_run_coding() can be skipped, as shown in Figure 19. Otherwise, the remaining syntax is the same as SCM3.0, and in the first row the copy-run mode is signaled (prediction across CUs). Note that if the maximum index value equals 1, palette_run_coding() and palette_transpose_flag can be skipped.
Figure 23 shows an exemplary flowchart of the signaling supporting index prediction across CUs according to the example above. The flowchart is essentially the same as that in Figure 22, except for the case where the maximum index value is not 0 (i.e., the "No" path from step 2240). In this case, as shown in step 2310, the SCM3.0 syntax is used with prediction across CUs.
Syntax example 4: all_pixel_from_NCP_flag in syntax example 1 and any_pixel_from_NCP_flag in syntax examples 2 and 3 can be merged into palette_prediction_run. In SCM3.0, palette_prediction_run is run-length coded. If the first run (i.e., palette_prediction_run[0]) equals a fixed or derived value, all_pixel_from_NCP_flag or any_pixel_from_NCP_flag is inferred to be on. The value can be 0 or 1.
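The inference of syntax example 4 can be sketched as follows; a minimal illustration with a hypothetical function name, where the sentinel value 0 is one of the two choices (0 or 1) the text allows.

```python
def infer_ncp_flag_from_run(palette_prediction_run, sentinel=0):
    """Syntax example 4 sketch: all_pixel_from_NCP_flag or
    any_pixel_from_NCP_flag is inferred to be on when the first run,
    palette_prediction_run[0], equals a fixed or derived value
    (the hypothetical sentinel here)."""
    return bool(palette_prediction_run) and palette_prediction_run[0] == sentinel
```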
Syntax example 5: as shown in step 2410 of Figure 24, the encoder can signal palette_share_flag, palette_prediction_run, and num_signaled_palette_entries. The palette size can then be derived from this information.
The palette size is checked to determine whether it equals 0, as shown in step 2420. If the palette size is greater than 0 (i.e., the "No" path from step 2420), the remaining syntax is the same as SCM3.0, as shown in step 2430. In the first row, the copy-run mode can be signaled depending on whether prediction across CUs is used.
If the palette size equals 0 (i.e., the "Yes" path from step 2420), any_pixel_from_NCP_flag is signaled. Step 2440 checks whether any_pixel_from_NCP_flag is on. If any_pixel_from_NCP_flag is off (i.e., the "No" path from step 2440), palette_escape_val_present_flag is inferred to be on, as shown in step 2450, and the SCM3.0-based syntax is used for index prediction without prediction across CUs. If any_pixel_from_NCP_flag is on, palette_escape_val_present_flag is signaled. If any_pixel_from_NCP_flag is on (i.e., the "Yes" path from step 2440) and palette_escape_val_present_flag is off (i.e., the "No" path from step 2460), all pixels can be predicted from the NCPs and palette_run_coding() can be skipped, as shown in step 2470. If any_pixel_from_NCP_flag is on (i.e., the "Yes" path from step 2440) and palette_escape_val_present_flag is on, some pixels are predicted from the NCPs and palette_run_coding() can be signaled, as shown in step 2480.
Syntax example 6: this example is essentially the same as syntax example 5, except for the case where any_pixel_from_NCP_flag is on (i.e., the "Yes" path from step 2440) and palette_escape_val_present_flag is on (i.e., the "Yes" path from step 2460). In this case, as shown in step 2510 of Figure 25, all pixels are escape indices.
Syntax example 7: the encoder can signal palette_share_flag, palette_prediction_run, and num_signaled_palette_entries, as shown in step 2610 of Figure 26. The palette size can then be derived from this information.
The palette size is checked in step 2620 to determine whether it equals 0. If the palette size equals 0 (i.e., the "Yes" path from step 2620), all_pixel_from_NCP_flag is signaled, and step 2640 checks whether all_pixel_from_NCP_flag is on. If all_pixel_from_NCP_flag is on (i.e., the "Yes" path from step 2640), all pixels are implied to be predicted from the NCPs, as shown in step 2660. In this case, palette_transpose_flag can be signaled to imply prediction from the left NCPs or the above NCPs; other prediction directions can also be signaled. Otherwise, the syntax is the same as SCM3.0, as shown in step 2650. In the first row, the copy-run mode can be signaled (i.e., prediction across CUs).
Syntax example 8: in syntax example 8, run coding can be signaled for the cases in Figure 27. The flowchart in Figure 27 is similar to that in Figure 26, except that steps 2630 and 2650 of Figure 26 are replaced by steps 2710 and 2720 (i.e., signaling palette_escape_val_present_flag, palette_transpose_flag, and palette_run_coding()).
Syntax example 9: as shown in step 2810 of Figure 28A, the encoder can send palette_share_flag, palette_reuse_flag(), num_signaled_palette_entries, and palette_escape_val_present_flag, and the decoder can derive the maximum index value from this information. If the maximum index value equals 0 or 1 (i.e., the "No" path from step 2820), palette_run_coding() can be skipped. If the maximum index value equals 0 (i.e., the "No" path from step 2830), all pixels are predicted from the NCPs, as shown in step 2850. As shown in step 2840, if the maximum index value is greater than 1 (i.e., the "Yes" path from step 2820), palette_transpose_flag can be signaled to imply prediction from the left NCPs or the above NCPs; other prediction directions can also be signaled. If the maximum index value equals 1 (i.e., the "Yes" path from step 2830), all color indices in the CU are inferred to be 0 or escape, as shown in step 2860.
Syntax example 10: as shown in step 2812 of Figure 28B, the encoder can signal palette_share_flag, palette_reuse_flag(), num_signaled_palette_entries, and palette_escape_val_present_flag, and the decoder can derive the maximum index value from this information. If the maximum index value equals 0 (i.e., the "No" path from step 2822), all pixels are predicted from the NCPs, as shown in step 2832, and palette_run_coding() can be skipped. If the maximum index value is greater than 0 (i.e., the "Yes" path from step 2822), palette_transpose_flag can be signaled to imply prediction from the left NCPs or the above NCPs, as shown in step 2842; other prediction directions can also be signaled. In this case, palette_transpose_flag and palette_run_coding() can be signaled, as illustrated by the partial NCP prediction in Figure 20.
In the syntax examples above, the NCPs can be the nearest above row or the nearest left column. If the number of NCP rows is greater than 1 (e.g., the two nearest above rows or the two nearest left columns), additional signaling may be needed to indicate which NCP row is used for prediction. For the "all pixels are predicted from NCPs" case in the syntax examples, the NCPs can be restricted to the nearest above row or the nearest left column.
Although specific syntax elements are used to illustrate the syntax examples supporting index prediction across CUs according to embodiments of the present invention, these specific syntax elements shall not be construed as limitations of the present invention. Those skilled in the art may use other syntax elements and semantics to perform index prediction across CUs without departing from the spirit of the present invention.
Syntax element for enabling index prediction across CUs
For index prediction across CUs, an enable flag can be signaled in the PPS (picture parameter set) or SPS (sequence parameter set). Furthermore, the flag is signaled only when palette_mode_enabled_flag in the SPS is true; otherwise it is inferred to be false. When the enable flag for index prediction across CUs is false, index prediction across CUs is disabled. In another embodiment, when the enable flag for index prediction across CUs is false, the neighboring indices or values can be inferred to be predefined values. For example, the neighboring index for each color component can be set to 0, and the neighboring value can be set to 128. In the present invention, the method of predicting the indices or values in a block is not limited to using the information of neighboring pixels (or blocks).
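The enable-flag dependency above can be sketched as follows; a minimal illustration with a hypothetical function name, where None stands in for a flag absent from the bitstream.

```python
def cross_cu_index_pred_enabled(palette_mode_enabled_flag, coded_enable_flag):
    """Sketch of the dependency above: the cross-CU index prediction
    enable flag is signaled only when palette_mode_enabled_flag in the
    SPS is true; otherwise it is inferred to be false.
    coded_enable_flag is None when the flag is absent."""
    if not palette_mode_enabled_flag:
        return False  # inferred false, nothing is parsed
    return bool(coded_enable_flag)
```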
Context selection for run modes: in SCM 3.0, the syntax element palette_mode for index map coding is context coded. palette_mode has two contexts, and the context is selected according to the palette_mode of the above index. However, the indices of the first row have no above index.
Several methods of handling the context selection are disclosed herein:
1. The first row uses one fixed context. The context can be the first context, used when the above index is coded by the INDEX-RUN mode, or the second context, used when the above index is coded by the COPY-run mode.
2. The indices of the first row can use a third context.
3. All indices in a CU use the same context to code palette_mode. The context can be selected according to the CU size, and all indices in the CU use the same context to code palette_mode. In this case there can be two contexts: if the CU size is greater than a fixed or derived threshold, the first context is used; otherwise, the other context is used. In another example, the number of contexts can be reduced to 1, in which case the context is the same for all CU sizes.
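Method 3 above can be sketched as follows; a minimal illustration with a hypothetical function name, where threshold=16 is an invented example value (the text only says fixed or derived).

```python
def palette_mode_context(cu_size, threshold=16, num_contexts=2):
    """Method 3 sketch: one context per CU, selected from the CU size;
    all indices in the CU share it.  With num_contexts=1, the same
    context is used for every CU size.  threshold is a hypothetical
    fixed value."""
    if num_contexts == 1:
        return 0
    return 0 if cu_size > threshold else 1
```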
Modified intra prediction scheme
In another embodiment, to achieve the same effect as the prediction schemes disclosed in Figures 19 and 20, in conventional intra prediction based on HEVC, the HEVC range extensions, or HEVC SCC, the syntax element rqt_root_cbf is signaled to indicate whether the TUs (transform units) rooted from the current CU have residual. The signaling of rqt_root_cbf can be the same as the rqt_root_cbf signaling for inter CUs in HEVC.
For intra prediction with rqt_root_cbf according to embodiments of the present invention, rqt_root_cbf can be selectively applied to a subset of the intra prediction modes, including luma and chroma intra prediction modes. rqt_root_cbf is used only in those intra prediction modes in the subset. In one example, the modification applies only to luma intra prediction modes equal to the horizontal or vertical prediction mode, and the chroma intra prediction modes are not modified.
In another embodiment, a CU-level flag is signaled for an intra CU to indicate whether residual signals are coded for this intra CU. Similarly, the flag can be selectively applied to a subset of the intra prediction modes.
Intra block copy (IntraBC) search
One embodiment of the present invention changes the source pixels of IntraBC. The pixels used for IntraBC prediction and compensation can be, depending on the pixel position, unfiltered pixels (i.e., before deblocking) or filtered pixels (i.e., after deblocking and SAO (sample adaptive offset)).
For example, as shown in Figure 29, the pixels used for IntraBC prediction and compensation can be the unfiltered pixels of the current CTU (2910) and the left CTU (2920). The other pixels still use filtered pixels. Figure 29 shows an example of the source pixels according to an embodiment of the present invention, where the dot-filled pixels are unfiltered pixels and the transparent pixels are filtered pixels used for IntraBC.
In another example, as shown in Figure 30, the pixels used for IntraBC prediction and compensation are the unfiltered pixels of the current CTU (3010), the left CTU (3020), the bottom four rows of the above CTU (3030), and the bottom four rows of the above-left CTU (3040). The other pixels use filtered pixels. In Figure 30, the dot-filled pixels are unfiltered pixels and the transparent pixels are filtered pixels used for IntraBC.
In another example, as shown in Figure 31, the pixels used for IntraBC prediction and compensation are the unfiltered pixels of the current CTU, the N left CTUs, the bottom four rows of the above CTU, and the bottom four rows of the N above-left CTUs, where N is a positive integer. The other pixels use filtered pixels. In Figure 31, the dot-filled pixels are unfiltered pixels and the transparent pixels are filtered pixels used for IntraBC.
In another example, the pixels used for IntraBC prediction and compensation are the unfiltered pixels of the current CTU, the bottom four rows of the above CTU, and the right four columns of the left CTU. The other pixels use filtered pixels, as shown in Figure 32. In Figure 32, the dot-filled pixels are unfiltered pixels and the transparent pixels are filtered pixels used for IntraBC.
In another example, the pixels used for IntraBC prediction and compensation are the unfiltered pixels of the current CTU, the N left CTUs, the bottom four rows of the above CTU and of the N above-left CTUs, and the right four columns of the (N+1)-th left CTU, where N is a positive integer. The other pixels use filtered pixels. In Figure 33, the dot-filled pixels are unfiltered pixels and the transparent pixels are filtered pixels used for IntraBC.
Figure 34 shows an exemplary flowchart of a system sharing the transform coefficient buffer for palette-coded blocks according to an embodiment of the present invention. The system determines the current prediction mode of the current block in step 3410 and designates a storage region as the transform coefficient buffer in step 3420. In step 3430, if the current prediction mode is an intra prediction mode or inter prediction mode, first information related to the transform coefficients of the prediction residual of the current block generated by intra prediction or inter prediction is stored in the transform coefficient buffer. In step 3440, if the current prediction mode is the palette coding mode, information related to the palette data associated with the current block is stored in the transform coefficient buffer. In step 3450, if the current block is coded in an intra prediction mode or inter prediction mode, the current block is encoded or decoded based on the information related to the transform coefficients; or, if the current block is coded in the palette coding mode, the current block is encoded or decoded based on the information related to the palette data stored in the transform coefficient buffer.
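The Figure 34 flow can be sketched as follows; a minimal illustration with hypothetical class and method names, showing only the buffer-sharing idea, not a codec implementation.

```python
class SharedCoeffBuffer:
    """Sketch of the Figure 34 flow: a single storage region serves as
    the transform coefficient buffer.  Intra/inter blocks store residual
    transform coefficients in it, while palette-coded blocks reuse the
    same region for palette data, so only one payload is live per block."""

    def __init__(self):
        self.payload = None
        self.kind = None

    def store_for_block(self, prediction_mode, data):
        if prediction_mode in ("intra", "inter"):
            self.kind = "transform_coefficients"
        elif prediction_mode == "palette":
            self.kind = "palette_data"
        else:
            raise ValueError("unknown prediction mode")
        self.payload = data
```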
The foregoing description is presented to enable those of ordinary skill in the art to practice the present invention in the context of a particular application and its requirements. Various modifications to the described embodiments will be apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments. Therefore, the present invention is not intended to be limited to the particular embodiments shown and described, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein. In the foregoing detailed description, various specific details are set forth in order to provide a thorough understanding of the present invention. Nevertheless, those skilled in the art will understand that the present invention may be practiced.
The embodiments of the present invention described above may be implemented in various hardware, software code, or a combination of both. For example, an embodiment of the present invention may be one or more electronic circuits integrated into a video compression chip, or program code integrated into video compression software, to perform the processes described herein. An embodiment of the present invention may also be program code to be executed on a digital signal processor (DSP) to perform the processes described herein. The invention may also involve a number of functions performed by a computer processor, a digital signal processor, a microprocessor, or a field programmable gate array (FPGA). These processors can be configured to perform particular tasks according to the invention by executing machine-readable software code or firmware code that defines the particular methods embodied by the invention. The software code or firmware code may be developed in different programming languages and in different formats or styles, or compiled for different target platforms. However, different code formats, styles, and languages of software code, and other means of configuring code to perform the tasks, will not depart from the spirit and scope of the invention.
The invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described examples are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is therefore indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (7)

1. A video coding method using multiple prediction modes including a palette coding mode, the method comprising:
receiving a video bitstream corresponding to coded data of a current block, or input data associated with the current block;
if a current prediction mode is the palette coding mode, parsing from the video bitstream, at a decoder side, all escape values of a same color component grouped together, or grouping together, at an encoder side, all escape values of the same color component; and
decoding or encoding the current block using information including the escape values.
2. The video coding method according to claim 1, wherein the grouped escape values of the same color component are signaled at the end of the coded palette data of the current block.
3. The video coding method according to claim 1, wherein the grouped escape values for different color components are signaled separately for the current block.
4. The video coding method according to claim 1, further comprising storing the grouped escape values of the same color component of the current block in a transform coefficient buffer.
5. The video coding method according to claim 4, wherein the grouped escape values for different color components share the transform coefficient buffer, by storing the grouped escape values of one color component in the transform coefficient buffer at a time.
6. A video coding method using multiple prediction modes including a palette coding mode, the method comprising:
parsing from a video bitstream, at a decoder side, all initial palette predictor values of a same color component grouped together in a sequence parameter set, picture parameter set, or slice header, or grouping together, at an encoder side, all initial palette predictor values of the same color component; and
decoding or encoding at least one palette-coded block in the corresponding sequence, picture, or slice using the initial palette predictor values.
7. A video coding method using multiple prediction modes including a palette coding mode, the method comprising:
parsing from a video bitstream, at a decoder side, all palette predictor entries or palette entries of a same color component of a current block grouped together, or grouping together, at an encoder side, all palette predictor entries or palette entries of the same color component; and
decoding or encoding the current block using a palette predictor composed of all the palette predictor entries, or a palette table composed of all the palette entries.
CN201910496007.XA 2014-11-12 2015-11-12 Methods of escape pixel coding in index map coding Active CN110519604B (en)

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US201462078595P 2014-11-12 2014-11-12
US62/078,595 2014-11-12
US201462087454P 2014-12-04 2014-12-04
US62/087,454 2014-12-04
US201562119950P 2015-02-24 2015-02-24
US62/119,950 2015-02-24
US201562145578P 2015-04-10 2015-04-10
US62/145,578 2015-04-10
US201562162313P 2015-05-15 2015-05-15
US62/162,313 2015-05-15
US201562170828P 2015-06-04 2015-06-04
US62/170,828 2015-06-04
CN201580061695.7A CN107005717B (en) 2014-11-12 2015-11-12 Methods of escape pixel coding in index map coding
PCT/CN2015/094410 WO2016074627A1 (en) 2014-11-12 2015-11-12 Methods of escape pixel coding in index map coding

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201580061695.7A Division CN107005717B (en) 2014-11-12 2015-11-12 Methods of escape pixel coding in index map coding

Publications (2)

Publication Number Publication Date
CN110519604A true CN110519604A (en) 2019-11-29
CN110519604B CN110519604B (en) 2022-04-01

Family

ID=55953749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910496007.XA Active CN110519604B (en) 2014-11-12 2015-11-12 Skip pixel coding and decoding method in index mapping coding and decoding

Country Status (6)

Country Link
US (2) US10666974B2 (en)
EP (1) EP3207712B1 (en)
KR (2) KR20190101495A (en)
CN (1) CN110519604B (en)
AU (2) AU2015345649A1 (en)
WO (1) WO2016074627A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105872539B (en) * 2015-02-08 2020-01-14 同济大学 Image encoding method and apparatus, and image decoding method and apparatus
CN107534783B (en) * 2015-02-13 2020-09-08 联发科技股份有限公司 Method for encoding and decoding palette index map of block in image
WO2016146076A1 (en) 2015-03-18 2016-09-22 Mediatek Inc. Method and apparatus for index map coding in video and image compression
US10951895B2 (en) * 2018-12-31 2021-03-16 Alibaba Group Holding Limited Context model selection based on coding unit characteristics
US10924750B2 (en) * 2019-03-01 2021-02-16 Alibaba Group Holding Limited Palette size constraint in palette mode for video compression system
US11202101B2 (en) * 2019-03-13 2021-12-14 Qualcomm Incorporated Grouped coding for palette syntax in video coding
US11252442B2 (en) * 2019-04-08 2022-02-15 Tencent America LLC Method and apparatus for video coding
KR20200132761A (en) * 2019-05-15 2020-11-25 현대자동차주식회사 Method and apparatus for parallel encoding and decoding of video data
JP7436519B2 (en) * 2019-05-31 2024-02-21 バイトダンス インコーポレイテッド Palette mode with intra block copy prediction
WO2020264456A1 (en) 2019-06-28 2020-12-30 Bytedance Inc. Chroma intra mode derivation in screen content coding
WO2020264457A1 (en) 2019-06-28 2020-12-30 Bytedance Inc. Techniques for modifying quantization parameter in transform skip mode
KR20220035154A (en) * 2019-07-21 2022-03-21 엘지전자 주식회사 Image encoding/decoding method, apparatus and method of transmitting bitstream for signaling chroma component prediction information according to whether or not the palette mode is applied
US11330306B2 (en) * 2019-08-07 2022-05-10 Tencent America LLC Method and apparatus for video coding
CN116684634A (en) * 2019-08-26 2023-09-01 Lg电子株式会社 Encoding device, decoding device, and bit stream transmitting device
KR20210027175A (en) * 2019-08-30 2021-03-10 주식회사 케이티 Method and apparatus for processing a video
US20220295045A1 (en) * 2019-08-30 2022-09-15 Kt Corporation Video signal processing method and device
BR112022004606A2 (en) * 2019-09-12 2022-05-31 Bytedance Inc Video processing method, apparatus for processing video data and computer readable non-transient recording and storage media
WO2021133529A1 (en) * 2019-12-26 2021-07-01 Alibaba Group Holding Limited Methods for coding video data in palette mode
US20230179760A1 (en) * 2020-03-19 2023-06-08 Lg Electronics Inc. Image encoding/decoding method and apparatus based on palette mode, and recording medium that stores bitstream
US11683514B2 (en) * 2020-12-22 2023-06-20 Tencent America LLC Method and apparatus for video coding for machine
US11463716B2 (en) * 2021-02-25 2022-10-04 Qualcomm Incorporated Buffers for video coding in palette mode

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6237079B1 (en) * 1997-03-30 2001-05-22 Canon Kabushiki Kaisha Coprocessor interface having pending instructions queue and clean-up queue and dynamically allocating memory
CN102523367A (en) * 2011-12-29 2012-06-27 北京创想空间商务通信服务有限公司 Real-time image compression and reduction method based on plurality of palettes
CN103098466A (en) * 2010-09-13 2013-05-08 索尼电脑娱乐公司 Image processing device, image processing method, data structure for video files, data compression device, data decoding device, data compression method, data decoding method and data structure for compressed video files
US20140126814A1 (en) * 2011-11-01 2014-05-08 Zynga Inc. Image compression with alpha channel data
WO2014165784A1 (en) * 2013-04-05 2014-10-09 Qualcomm Incorporated Determining palette indices in palette-based video coding

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3219403B2 (en) * 1989-05-10 2001-10-15 キヤノン株式会社 Image storage device
US5446482A (en) * 1991-11-13 1995-08-29 Texas Instruments Incorporated Flexible graphics interface device switch selectable big and little endian modes, systems and methods
US6034667A (en) * 1992-01-21 2000-03-07 Radius Inc. Method and apparatus for displaying YUV color information on a pseudo-color RGB display
CA2180240A1 (en) * 1994-01-14 1995-07-20 Charles K. Chui Boundary-spline-wavelet compression for video images
US5801665A (en) * 1995-10-30 1998-09-01 Industrial Technology Research Institute Format converter for the conversion of conventional color display format to field sequential
US5936606A (en) * 1996-04-15 1999-08-10 Winbond Electronics Corp. On-screen edit/display controller
US6208350B1 (en) * 1997-11-04 2001-03-27 Philips Electronics North America Corporation Methods and apparatus for processing DVD video
US6021196A (en) * 1998-05-26 2000-02-01 The Regents of the University of California Reference palette embedding
US6347157B2 (en) * 1998-07-24 2002-02-12 Picsurf, Inc. System and method for encoding a video sequence using spatial and temporal transforms
US6441829B1 (en) * 1999-09-30 2002-08-27 Agilent Technologies, Inc. Pixel driver that generates, in response to a digital input value, a pixel drive signal having a duty cycle that determines the apparent brightness of the pixel
US7414632B1 (en) * 2000-01-07 2008-08-19 Intel Corporation Multi-pass 4:2:0 subpicture blending
US6891893B2 (en) * 2000-04-21 2005-05-10 Microsoft Corp. Extensible multimedia application program interface and related methods
US7649943B2 (en) * 2000-04-21 2010-01-19 Microsoft Corporation Interface and related methods facilitating motion compensation in media processing
US6940912B2 (en) * 2000-04-21 2005-09-06 Microsoft Corporation Dynamically adaptive multimedia application program interface and related methods
US7634011B2 (en) * 2000-04-21 2009-12-15 Microsoft Corporation Application program interface (API) facilitating decoder control of accelerator resources
US7143353B2 (en) * 2001-03-30 2006-11-28 Koninklijke Philips Electronics, N.V. Streaming video bookmarks
US6937759B2 (en) * 2002-02-28 2005-08-30 Nokia Corporation Method and device for reducing image by palette modification
US7302006B2 (en) * 2002-04-30 2007-11-27 Hewlett-Packard Development Company, L.P. Compression of images and image sequences through adaptive partitioning
US7433526B2 (en) * 2002-04-30 2008-10-07 Hewlett-Packard Development Company, L.P. Method for compressing images and image sequences through adaptive partitioning
US7072512B2 (en) * 2002-07-23 2006-07-04 Microsoft Corporation Segmentation of digital video and images into continuous tone and palettized regions
EP1445734B1 (en) 2003-02-06 2007-08-08 STMicroelectronics S.r.l. Method and apparatus for texture compression
WO2005079064A1 (en) * 2004-02-17 2005-08-25 Matsushita Electric Industrial Co., Ltd. Recording medium, reproduction device, program, and reproduction method
JP2008244981A (en) * 2007-03-28 2008-10-09 Seiko Epson Corp Video synthesis device and video output device
ES2727131T3 (en) * 2011-02-16 2019-10-14 Dolby Laboratories Licensing Corp Decoder with configurable filters
US9116634B2 (en) * 2011-06-10 2015-08-25 International Business Machines Corporation Configure storage class memory command
US8615138B2 (en) * 2011-11-03 2013-12-24 Google Inc. Image compression using sub-resolution images
KR102101304B1 (en) * 2013-03-15 2020-04-16 삼성전자주식회사 Memory controller and operating method of memory controller
US9609336B2 (en) * 2013-04-16 2017-03-28 Fastvdo Llc Adaptive coding, transmission and efficient display of multimedia (acted)
US9558567B2 (en) * 2013-07-12 2017-01-31 Qualcomm Incorporated Palette prediction in palette-based video coding
CN112383780B (en) * 2013-08-16 2023-05-02 上海天荷电子信息有限公司 Encoding and decoding method and device using a point-matching reference set and index back-and-forth scanning string matching
WO2015041647A1 (en) * 2013-09-19 2015-03-26 Entropic Communications, Inc. Parallel decode of a progressive jpeg bitstream
US20150110181A1 (en) * 2013-10-18 2015-04-23 Samsung Electronics Co., Ltd. Methods for palette prediction and intra block copy padding
US9477423B2 (en) * 2013-11-26 2016-10-25 Seagate Technology Llc Eliminating or reducing programming errors when programming flash memory cells
US20150181208A1 (en) * 2013-12-20 2015-06-25 Qualcomm Incorporated Thermal and power management with video coding
RU2679201C2 (en) * 2014-09-30 2019-02-06 Microsoft Technology Licensing, LLC Rules for intra-picture prediction modes when wavefront parallel processing is enabled
US10356440B2 (en) * 2014-10-01 2019-07-16 Qualcomm Incorporated Scalable transform hardware architecture with improved transpose buffer
CN105960802B (en) * 2014-10-08 2018-02-06 Microsoft Technology Licensing, LLC Adjustments to encoding and decoding when switching color spaces

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6237079B1 (en) * 1997-03-30 2001-05-22 Canon Kabushiki Kaisha Coprocessor interface having pending instructions queue and clean-up queue and dynamically allocating memory
CN103098466A (en) * 2010-09-13 2013-05-08 Sony Computer Entertainment Inc. Image processing device, image processing method, data structure for video files, data compression device, data decoding device, data compression method, data decoding method and data structure for compressed video files
US20140126814A1 (en) * 2011-11-01 2014-05-08 Zynga Inc. Image compression with alpha channel data
CN102523367A (en) * 2011-12-29 2012-06-27 Beijing Chuangxiang Space Business Communication Service Co., Ltd. Real-time image compression and restoration method based on multiple palettes
WO2014165784A1 (en) * 2013-04-05 2014-10-09 Qualcomm Incorporated Determining palette indices in palette-based video coding

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
C. GISQUET ET AL.: "Non-SCCE3: memory reduction for palette mode software", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11 *
SHIH-TA HSIANG ET AL.: "Run coding of the palette index map using a universal entropy coding scheme", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11 *
XIAOYU XIU ET AL.: "Removal of parsing dependency in palette-based coding", Joint Collaborative Team on Video Coding (JCT-VC) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11 *

Also Published As

Publication number Publication date
EP3207712A4 (en) 2017-12-20
US20200288170A1 (en) 2020-09-10
CN110519604B (en) 2022-04-01
BR112017009946A2 (en) 2018-07-03
EP3207712B1 (en) 2021-06-09
AU2019201623B2 (en) 2020-10-22
AU2019201623A1 (en) 2019-04-04
US10666974B2 (en) 2020-05-26
US20190116380A1 (en) 2019-04-18
AU2015345649A1 (en) 2017-06-08
US11457237B2 (en) 2022-09-27
EP3207712A1 (en) 2017-08-23
KR102144154B1 (en) 2020-08-13
KR20170094544A (en) 2017-08-18
WO2016074627A1 (en) 2016-05-19
KR20190101495A (en) 2019-08-30

Similar Documents

Publication Publication Date Title
CN110519604A (en) Method of escape pixel coding in index map coding
CN107005717A (en) Method of escape pixel coding in index map coding
CN106716999B (en) Method of palette predictor signaling for video coding
CN107852505A (en) Method and apparatus for video decoding error handling using intra block copy mode
CN106797478B (en) Method of palette coding
KR102080450B1 (en) Inter-plane prediction
CN106537910B (en) Method of using an escape pixel as a predictor in index map coding
CN107534711B (en) Advanced coding techniques for high efficiency video coding (HEVC) screen content coding (SCC) extension
CN102884792B (en) Method and apparatus for unified significance map coding
KR101887798B1 (en) Method and apparatus of binarization and context-adaptive coding for syntax in video coding
CN105981388B (en) Method and apparatus for syntax redundancy removal in palette coding
CN107925769A (en) Method and system of decoded picture buffer for intra block copy mode
CN105874795A (en) Rules for intra-picture prediction modes when wavefront parallel processing is enabled
CN105981380A (en) Method and apparatus for palette initialization and management
CN109644271A (en) Method and apparatus of candidate set determination for binary-tree splitting blocks
CN106537916A (en) Palette mode encoding and decoding design
CN107852499A (en) Method of constrained intra block copy for reducing worst-case bandwidth in video coding
CN106464871A (en) Coding runs with escape in palette-based video coding
CN106464873A (en) Predictor palette initialization in palette-based video coding
KR20200019794A (en) Sample region merging
CN106231335A (en) Decoder, decoding method, encoder, and encoding method
CN107251555A (en) Method and apparatus of palette coding of monochrome content in video and image compression
BR112017009946B1 (en) ESCAPE PIXEL CODING METHODS IN INDEX MAP CODING

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant