CN101099391A - Methods of and apparatuses for adaptive entropy encoding and adaptive entropy decoding for scalable video encoding - Google Patents


Publication number
CN101099391A
CN101099391A · CNA2006800017078A · CN200680001707A
Authority
CN
China
Prior art keywords
syntactic element
entropy
context
coding
encoded
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2006800017078A
Other languages
Chinese (zh)
Inventor
全炳宇
崔雄一
Current Assignee
Samsung Electronics Co Ltd
Sungkyunkwan University
Original Assignee
Samsung Electronics Co Ltd
Sungkyunkwan University
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd, Sungkyunkwan University filed Critical Samsung Electronics Co Ltd
Publication of CN101099391A publication Critical patent/CN101099391A/en
Pending legal-status Critical Current

Classifications

    • H04N19/13: Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC] (under H04N19/00, H04N19/10, H04N19/102)
    • H04N19/30: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability (under H04N19/00)


Abstract

Methods and apparatuses are provided for adaptive entropy encoding and adaptive entropy decoding using various context models. A scalable entropy encoding method includes determining a context by referring either to both syntax elements in the same layer as a block including a syntax element to be encoded and syntax elements in lower layers, or only to the syntax elements in the lower layers, and performing entropy encoding on the syntax element using the determined context.

Description

Method and apparatus for adaptive entropy encoding and adaptive entropy decoding for scalable video coding
Technical field
Methods and apparatuses consistent with the present invention relate to adaptive entropy encoding and adaptive entropy decoding for scalable video coding, and more particularly, to adaptive entropy encoding and adaptive entropy decoding using various context models.
Background technology
Scalable video coding is video coding in which the amount of transmitted data is adjusted to heterogeneous terminal environments before transmission, and it is essential for processing data adaptively according to various transmission environments. With the development of mobile communication technology, transmission systems, high-performance semiconductors, and video compression technology, demand for video services that can adapt to various transmission environments is growing rapidly.
However, conventional video coding techniques developed for particular communication environments cannot encode data adaptively in response to variations in the environment, such as channel bandwidth, terminal processing capability, packet loss rate, and the user's mobile environment. Scalable video coding is an intelligent technique that adapts to such varied transmission environments. Examples of scalable video coding include spatial scalable coding, frame-rate-adaptive temporal scalable coding, and signal-to-noise-ratio (SNR) scalable coding based on video quality.
Conventional video standards include these scalable video coding techniques. Examples are MPEG-2 scalable coding, which is based mainly on video data transmission in asynchronous transfer mode (ATM) networks; the SNR, temporal, and spatial scalable coding of H.263 Annex O; and fine granular scalable coding based on MPEG-4. In addition, scalable video coding conforming to MPEG-4 AVC is being standardized; its purpose is to provide scalable video coding in terms of SNR, temporal, and spatial scalability.
Fig. 1 is a view illustrating an example of video coding using scalable video coding.
Referring to Fig. 1, it can be seen that scalable video coding can be performed in terms of SNR, temporal, and spatial scalability. Scalable video coding encodes video into a plurality of layers according to the network state, where each enhancement layer is encoded using the data of its immediately lower layer.
In the example of video coding shown in Fig. 1, when the transmission bit rate of the video data is below 41 kbps, only the base layer 110 is coded. When the transmission bit rate of the video data is between 41 kbps and 80 kbps, SNR scalable coding that improves video quality using the data of the base layer 110 is performed to create and encode the first enhancement layer 120. The picture size of each frame of the base layer 110 and the first enhancement layer 120 is the quarter common intermediate format (QCIF), and the base layer 110 and the first enhancement layer 120 are entropy-encoded at a rate of 15 frames per second.
The picture size of each frame of the second enhancement layer 130 and the third enhancement layer 140 is specified by CIF, and the second enhancement layer 130 and the third enhancement layer 140 are entropy-encoded at a rate of 30 frames per second. Thus, when the transmission bit rate of the video data is 115 kbps or higher, the second enhancement layer 130 is created by up-sampling the frames having the QCIF picture size in the first enhancement layer 120 into frames having the CIF picture size, and performing predictive coding on the up-sampled frames to further create intermediate frames (i.e., high-pass (H) frames). When the transmission bit rate of the video data is 256 kbps or higher, the third enhancement layer 140 is created by performing SNR scalable coding that improves video quality using the data of the second enhancement layer 130, which is immediately below the third enhancement layer 140.
Since, in terms of transmission order, the bi-predictive (B) frames or H frames of each layer come after the frames they reference for motion compensation, the layers can be temporally scalably coded. Referring to Fig. 1, I frames and P frames precede B frames or H frames in transmission order. As shown in Fig. 1, the transmission order among B frames and H frames varies with the index assigned to each B frame or H frame (indicated by the subscript in the frame name). A B frame is transmitted earlier when its index is low, whereas an H frame is transmitted earlier when its index is high.
For example, in the base layer 110 or the first enhancement layer 120, a B1 frame is motion-compensated with reference to an I frame and a P frame, and a B2 frame is motion-compensated with reference to the B1 frame. In the second enhancement layer 130 and the third enhancement layer 140, an H3 frame is motion-compensated with reference to an L3 frame, and an H2 frame is motion-compensated with reference to the H3 frame. Therefore, the frame transmission order is I -> P -> B1 -> B2 -> B3 in the base layer 110 and the first enhancement layer 120, and L3 -> H3 -> H2 -> H1 -> H0 in the second enhancement layer 130 and the third enhancement layer 140. The transmission order among frames having the same index is determined by the temporal order of the frames. Through temporal scalable coding, spatial scalable coding, and SNR scalable coding, a decoder can decode the layers at the scalable bit rates corresponding to the layers.
Although scalable video coding was standardized as far back as MPEG-2 and has been studied in depth, it has not yet come into common use. The reason is low coding efficiency. In other words, compared with a non-scalable video encoder, a scalable video encoder performs coding incrementally to improve the quality of a low-quality base layer. As a result, even at the same bit rate, the quality of some video may be severely degraded. Unless this coding-efficiency problem is solved, it is difficult to deploy scalable coding in the market.
To address this problem, research into overcoming the drop in coding efficiency in scalable coding is being carried out actively. For example, in spatial scalable coding, coding efficiency can be improved significantly, compared with independent coding of each layer, by using up-sampled frames of the lower layer in motion compensation. In other words, since there is high correlation between layers, high coding efficiency can be obtained in predictive coding by exploiting this high correlation.
In conventional scalable video coding, however, entropy coding does not use the correlation between layers, but is performed in the same manner as in non-scalable video coding. As a result, the reduction in coding efficiency cannot be resolved.
Summary of the invention
Technical Solution
The present invention provides a method and apparatus for context-based adaptive entropy encoding that use the correlation between syntax elements of layers, to improve the efficiency of entropy encoding in a scalable video encoder that encodes video into multiple layers.
The present invention also provides a method and apparatus for context-based adaptive entropy decoding that use the correlation between syntax elements of layers.
Advantageous Effects
According to the present invention, coding efficiency can be improved by performing entropy encoding in scalable video coding using not only syntax elements in the same layer but also syntax elements in a lower layer.
Description of drawings
The above and other aspects of the present invention will become more apparent from the following detailed description of exemplary embodiments thereof, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a view for explaining an example of video coding using scalable video coding;
Fig. 2 is a block diagram of a context-based adaptive binary arithmetic coding (CABAC) encoder;
Figs. 3A and 3B are reference diagrams for explaining the determination of a context;
Fig. 4 is a block diagram of a CABAC decoder;
Fig. 5 is a block diagram of an entropy encoder that performs encoding using mapping information between symbols and codewords;
Fig. 6 is a block diagram of an entropy decoder that performs decoding using mapping information between symbols and codewords;
Figs. 7A and 7B are reference diagrams for explaining the operation of a context determining unit according to an exemplary embodiment of the present invention;
Figs. 8A to 8G are block diagrams of adaptive entropy encoders using a context determining unit according to exemplary embodiments of the present invention;
Figs. 9A to 9G are block diagrams of adaptive entropy decoders using a context determining unit according to exemplary embodiments of the present invention;
Figs. 10A to 10C are flowcharts illustrating entropy encoding methods according to exemplary embodiments of the present invention;
Figs. 11A to 11C are flowcharts illustrating entropy decoding methods according to exemplary embodiments of the present invention;
Fig. 12 is a block diagram of a video encoder having an entropy encoder according to an exemplary embodiment of the present invention; and
Fig. 13 is a block diagram of a video decoder having an entropy decoder according to an exemplary embodiment of the present invention.
Embodiment
Best Mode
According to an aspect of the present invention, there is provided a scalable entropy encoding method. The method includes determining a context by referring either to both syntax elements in the same layer as a block including a syntax element to be encoded and syntax elements in a lower layer, or only to the syntax elements in the lower layer, and performing entropy encoding on the syntax element using the determined context.
According to another aspect of the present invention, there is provided an entropy encoding method. The method includes transforming a syntax element to be encoded into a binary string by referring to one of: syntax elements in the same layer as a block including the syntax element to be encoded, syntax elements in a lower layer, and both the syntax elements in the same layer and the syntax elements in the lower layer; determining a context by referring to the syntax elements in the same layer; and performing entropy encoding on the syntax element to be encoded using the determined context.
According to still another aspect of the present invention, there is provided a scalable entropy decoding method. The method includes receiving entropy-encoded data and determining a context by referring either to both syntax elements in the same layer as a block including an entropy-encoded syntax element and syntax elements in a lower layer, or only to the syntax elements in the lower layer, and performing entropy decoding on the entropy-encoded data using the determined context.
According to yet another aspect of the present invention, there is provided an entropy decoding method. The method includes receiving entropy-encoded data and determining a context by referring to syntax elements in the same layer as a block including an entropy-encoded syntax element; performing entropy decoding on the entropy-encoded data using the determined context; and transforming the entropy-decoded binary string into a symbol by referring to one of: the syntax elements in the same layer as the block including the entropy-encoded syntax element, syntax elements in a lower layer, and both the syntax elements in the same layer and the syntax elements in the lower layer.
According to a further aspect of the present invention, there is provided a scalable entropy encoder including a context determining unit and an entropy encoding engine. The context determining unit determines a context by referring either to both syntax elements in the same layer as a block including a syntax element to be encoded and syntax elements in a lower layer, or only to the syntax elements in the lower layer. The entropy encoding engine performs entropy encoding on the syntax element using the determined context.
According to a further aspect of the present invention, there is provided an entropy encoder including a binary string transformation unit, a context determining unit, and an entropy encoding engine. The binary string transformation unit transforms a syntax element to be encoded into a binary string by referring to one of: syntax elements in the same layer as a block including the syntax element to be encoded, syntax elements in a lower layer, and both the syntax elements in the same layer and the syntax elements in the lower layer. The context determining unit determines a context by referring to the syntax elements in the same layer. The entropy encoding engine performs entropy encoding on the syntax element to be encoded using the determined context.
According to another aspect of the present invention, there is provided a scalable entropy decoder including a context determining unit and an entropy decoding engine. The context determining unit receives entropy-encoded data and determines a context by referring either to syntax elements in the same layer as a block including an entropy-encoded syntax element, or only to syntax elements in a lower layer. The entropy decoding engine performs entropy decoding on the entropy-encoded data using the determined context.
According to a further aspect of the present invention, there is provided an entropy decoder including a context determining unit, an entropy decoding engine, and a symbol transformation unit. The context determining unit receives entropy-encoded data and determines a context by referring to syntax elements in the same layer as a block including an entropy-encoded syntax element. The entropy decoding engine performs entropy decoding on the entropy-encoded data using the determined context. The symbol transformation unit transforms the entropy-decoded binary string into a symbol by referring to one of: the syntax elements in the same layer as the block including the entropy-encoded syntax element, syntax elements in a lower layer, and both the syntax elements in the same layer and the syntax elements in the lower layer.
Mode for Invention
Entropy encoding and decoding according to exemplary embodiments of the present invention will now be described, including context-based entropy encoding and decoding that use the correlation between symbols of layers. In particular, context-based adaptive binary arithmetic coding (CABAC), which is used in H.264/MPEG-4 AVC because of its high coding efficiency, will be described in detail, along with methods such as adaptive arithmetic coding, Huffman coding, and universal variable length coding (UVLC). The present invention is not limited to CABAC, but can be applied to entropy encoding and decoding methods in general.
Fig. 2 is a block diagram of a CABAC encoder.
The CABAC encoder of Fig. 2 receives and encodes a syntax element (SE), and includes a binary string transformation unit 210, a context determining unit 220, and an adaptive arithmetic coding unit 250. A syntax element is an item of data in the compressed bitstream, and includes parameters and residual information needed to represent the video in compressed form or to parse the compressed data or header information. For example, in H.264, the syntax elements include parameters such as entropy_coding_mode_flag, which indicates which entropy coding mode is used, and mb_type, which indicates the macroblock type, as well as header information and residual information such as coeff_abs_level_minus1.
The binary string transformation unit 210 receives the symbol value of a syntax element to be encoded and transforms it into a binary string, i.e., a sequence of binary values such as 0 and 1. Table 1 shows an example in which symbol values of the macroblock type mb_type of a B slice, as defined in H.264, are transformed into binary strings. Referring to Table 1, the symbol B_L0_16×16 of mb_type of a B slice, which indicates that a given macroblock is motion-compensated in 16×16 units with reference index L0 (list 0), has the value 1. The symbol B_L0_16×16 is transformed by the binary string transformation unit 210 into the binary string "100" having three binary values.
In Table 1, BinIdx indicates the position of a binary value in the binary string. For example, a binary value whose BinIdx is 0 is located at the beginning of the binary string. The mapping between symbol values and binary strings is usually defined for each syntax element.
[Table 1]
Symbol value (symbol) of mb_type of a B slice | Binary string
0 (B_Direct_16×16) | 0
1 (B_L0_16×16) | 100
2 (B_L1_16×16) | 101
3 (B_Bi_16×16) | 110000
4 (B_L0_L0_16×8) | 110001
5 (B_L0_L0_8×16) | 110010
6 (B_L1_L1_16×8) | 110011
... | ...
BinIdx | 0123456
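To make the mapping concrete, the Table 1 style binarization can be sketched as a simple lookup. This is only an illustration of the idea under assumed names (`MB_TYPE_B_BINS`, `binarize_mb_type_b` are not from the standard or the patent), and it reproduces only the first rows of the table above.

```python
# Illustrative sketch of the symbol-to-binary-string mapping of Table 1.
MB_TYPE_B_BINS = {
    0: "0",       # B_Direct_16x16
    1: "100",     # B_L0_16x16
    2: "101",     # B_L1_16x16
    3: "110000",  # B_Bi_16x16
    4: "110001",  # B_L0_L0_16x8
}

def binarize_mb_type_b(symbol_value: int) -> str:
    """Transform a B-slice mb_type symbol value into its binary string."""
    return MB_TYPE_B_BINS[symbol_value]

print(binarize_mb_type_b(1))  # "100", the B_L0_16x16 example from the text
```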
After the symbols of the given syntax elements are transformed into binary strings, they are sequentially entropy-encoded by the adaptive arithmetic coding unit 250 with reference to the contexts determined by the context determining unit 220. The context determining unit 220 determines a context corresponding to each binary value 0 or 1 of the binary string. A context refers to the surrounding circumstances associated with the symbol to be encoded, which will be described later. Since a context is used by the probability estimation unit 230 to estimate a probability value for each of the binary values 0 and 1, if the context of a binary value changes, the probability value for the binary value also changes. The arithmetic coding engine 240 performs arithmetic coding using the probability values estimated by the probability estimation unit 230, and updates the probability model of the probability estimation unit 230 according to the encoded 0 or 1. Hereinafter, the determination of a context is described with reference to Figs. 3A and 3B.
Figs. 3A and 3B are reference diagrams for explaining the determination of a context.
In Fig. 3A, T 310 is a context template and includes a subset of the syntax elements encoded before the syntax element to be encoded. T 310 may also include, in addition to this subset, a portion of the binary string corresponding to the symbol value of the syntax element to be encoded, or may include only the portion of the binary string. T 310 may also include, in addition to this subset, attributes of previously encoded syntax elements and of the syntax element to be encoded, or may include only the attributes. For example, as shown in Fig. 3B, the context template of CABAC includes the symbol S0 of the neighboring block 340 to the left of the current block 330 containing the symbol to be encoded, and the symbol S1 of the neighboring block 350 above the block 330.
For example, in the case of the symbol mb_type of the B slice of Table 1, the context of the binary value whose BinIdx is 0 is determined by whether the symbols mb_type of the left and upper neighboring blocks are Direct. Whether the left and upper neighboring blocks are skipped or absent also influences the context. Therefore, the context template needed for the first binary value of the symbol mb_type includes the symbols mb_type of the left and upper neighboring blocks, as well as attributes of the left and upper neighboring blocks, such as information on whether the left and upper neighboring blocks are skipped or absent.
The context template may also include binary values whose BinIdx is smaller than the BinIdx of the binary value to be encoded. For example, for the symbol mb_type of Table 1, when the context of the binary value whose BinIdx is 2 is to be determined, the binary value whose BinIdx is 1 serves as the context template. Therefore, the context of the binary value whose BinIdx is 2 changes according to whether the encoded binary value whose BinIdx is 1 is 0 or 1.
A modeling function F receives the symbols {S0, S1, ..., S(T-1)} included in the context template T 310 for a given symbol value, and determines a context from the context set C 320 = {C0, C1, ..., C(N-1)}. The modeling function for the binary value whose BinIdx is 0 in mb_type of Table 1 is as follows:
ctxIdx = condTermFlag(A) + condTermFlag(B) + Offset ........... (1)
where ctxIdx is an index indicating the determined context C, Offset is the initial index value indicating the starting position of mb_type in the table defining contexts, and condTermFlag(M) indicates the state of a block M. condTermFlag(M) is 0 when the block M is absent (condition 1) or skipped (condition 2), or when the symbol mb_type of the block M is Direct (condition 3). In other words, if any one of conditions 1, 2, and 3 is satisfied, condTermFlag(M) is 0; otherwise, condTermFlag(M) is 1. In the example of mb_type, the modeling function F is expressed as F: T = {S0, S1} -> C = {C0, C1, ..., C(N-1)}. Here, S0 and S1 are the symbols mb_type of the left neighboring block 340 and the upper neighboring block 350 of the current block 330 shown in Fig. 3B. More specifically, S0 = condTermFlag(A) and S1 = condTermFlag(B).
Therefore, information on whether the blocks A and B exist or are skipped, together with their symbols mb_type, is input as the context template to obtain a context index ctxIdx; the context with the obtained context index ctxIdx is then extracted from the context set, and the context is thus determined.
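Equation (1) can be written directly in code. The sketch below is a minimal illustration under assumed representations: a neighboring block is a plain dict with hypothetical "skipped" and "mb_type" keys, and `None` stands for an absent block.

```python
def cond_term_flag(block) -> int:
    # 0 if the neighboring block is absent (condition 1), skipped (condition 2),
    # or its mb_type is Direct (condition 3); 1 otherwise.
    if block is None or block.get("skipped") or block.get("mb_type") == "Direct":
        return 0
    return 1

def ctx_idx(block_a, block_b, offset: int) -> int:
    # Equation (1): ctxIdx = condTermFlag(A) + condTermFlag(B) + Offset
    return cond_term_flag(block_a) + cond_term_flag(block_b) + offset

left = None                                          # absent  -> contributes 0
above = {"skipped": False, "mb_type": "B_L0_16x16"}  # coded, not Direct -> 1
print(ctx_idx(left, above, offset=3))                # 0 + 1 + 3 = 4
```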
The determined context is used by the adaptive arithmetic coding unit 250 to estimate a probability value for each binary value. The adaptive arithmetic coding unit 250 includes the probability estimation unit 230 and the arithmetic coding engine 240. The probability estimation unit 230 receives the context corresponding to each binary value obtained by the binary string transformation unit 210, and estimates a probability value for each binary value 0 or 1. The probability value corresponding to a context is determined through a mapping table between contexts and probability values. The determined probability value is used by the arithmetic coding engine 240 to encode the binary value. The arithmetic coding engine 240 sequentially encodes the binary values, and the probability estimation unit 230 updates the probability value for each encoded binary value.
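The per-context adaptive update described above can be sketched with simple counters. Note that real CABAC uses a finite-state-machine estimator rather than counts, so this is only an assumption-laden toy model of the idea that each context keeps its own probability, updated after every coded bin.

```python
class ProbabilityEstimator:
    """Toy per-context estimate of P(bin = 1), updated after each coded bin."""

    def __init__(self, num_contexts: int):
        # One (count of 0s, count of 1s) pair per context, Laplace-initialized.
        self.counts = [[1, 1] for _ in range(num_contexts)]

    def prob_one(self, ctx: int) -> float:
        zeros, ones = self.counts[ctx]
        return ones / (zeros + ones)

    def update(self, ctx: int, bin_value: int) -> None:
        self.counts[ctx][bin_value] += 1

est = ProbabilityEstimator(num_contexts=4)
print(est.prob_one(0))   # 0.5 before any bin is coded in context 0
est.update(0, 1)
est.update(0, 1)
print(est.prob_one(0))   # 0.75 after two 1-bins in context 0
```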
A general arithmetic encoder, which performs arithmetic coding on symbol values without binarization, differs from the CABAC encoder in that it has no binary string transformation unit 210; otherwise, it has the same structure and functions as the CABAC encoder. In a general arithmetic encoder, the context determining unit determines the context corresponding to the symbol value of a symbol, rather than the context corresponding to a binary value of a symbol.
Fig. 4 is a block diagram of a CABAC decoder.
The CABAC decoder extracts symbols from a compressed bitstream composed of 1s and 0s, i.e., video data compressed by the encoder. The context determining unit 410 determines a context corresponding to each binary value to be decoded, estimating the position within the binary string of each binary value that has not yet been decoded. The determined context is used by the adaptive arithmetic decoding unit 450 to decode binary values from the input bitstream. The probability estimation unit 420 receives the context and estimates a probability value for each of the binary values 0 and 1. The arithmetic decoding engine 430 sequentially decodes binary values from the input bitstream using the estimated probability values, and passes the decoded values back to the probability estimation unit 420 to update the probability values of the binary values. The binary values sequentially decoded by the adaptive arithmetic decoding unit 450 are transformed into symbol data by the symbol transformation unit 440. The mapping between symbols and binary values is performed using a mapping table such as Table 1, which is also used by the binary string transformation unit 210 of the CABAC encoder.
A general arithmetic decoder, in which the arithmetic decoding engine directly produces symbol values, differs from the CABAC decoder in that it does not need the symbol transformation unit 440 of the CABAC decoder.
Fig. 5 is a block diagram of an entropy encoder that performs encoding using mapping information between symbols and codewords.
Coding methods that use mapping information between symbols and codewords include Huffman coding, universal variable length coding (UVLC), and Exp-Golomb coding. Table 2 shows an example of mapping information between symbols and codewords.
[Table 2]
Symbol | Context 1 (0≤nC<2) | Context 2 (2≤nC<4) | Context 3 (4≤nC<8) | Context 4 (8≤nC) | Context 5 (nC=-1)
0 | 1 | 11 | 1111 | 000011 | 01
1 | 000101 | 001011 | 001111 | 000000 | 000111
2 | 01 | 10 | 1110 | 000001 | 1
3 | 00000111 | 000111 | 001011 | 000100 | 000100
4 | 000100 | 001111 | 01111 | 000101 | 000110
5 | 001 | 011 | 1101 | 000110 | 001
6 | 000000111 | 0000111 | 001000 | 001000 | 000011
7 | 00000110 | 001010 | 01100 | 001001 | 0000011
8 | 0000101 | 001001 | 01110 | 001010 | 0000010
9 | 00011 | 0101 | 1100 | 001011 | 000101
Referring to Fig. 5, the entropy encoder includes a context determining unit 510, an entropy encoding engine 520, and a symbol-codeword mapping information storage unit 530. The entropy encoding engine 520 outputs the codeword corresponding to an input symbol according to the context determined by the context determining unit 510, as shown in Table 2. Here, the context determining unit 510 determines the context corresponding to the symbol to be encoded using a context template that includes the symbols, or encoded symbols, of the blocks neighboring the block containing the symbol to be encoded. The entropy encoding engine 520 extracts the mapping information corresponding to the determined context from the symbol-codeword mapping information storage unit 530, which stores a plurality of symbol-codeword mapping tables, and entropy-encodes the input symbol. For example, in Table 2, when 0≤nC<2 is satisfied, context 1 is determined by the context determining unit 510. The entropy encoding engine 520 then outputs the codeword corresponding to the input symbol using the symbol-codeword mapping information for context 1 provided by the symbol-codeword mapping information storage unit 530.
Here, nC indicates the average number of non-zero transform coefficients in the left and upper neighboring blocks (S0 and S1 of Fig. 3B) of the block containing the symbol to be encoded. In other words, nC refers to the mean value output when the numbers of transform coefficients of the blocks S0 and S1 are input to the modeling function of Fig. 3A.
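The nC-based table switch of Table 2 amounts to a small dispatch, sketched below using only the first rows of each column (codewords are copied from Table 2; the helper names are hypothetical, not from the patent).

```python
# Codewords for symbols 0..2 in each of the five contexts of Table 2.
VLC_TABLES = {
    1: {0: "1", 1: "000101", 2: "01"},           # context 1: 0 <= nC < 2
    2: {0: "11", 1: "001011", 2: "10"},          # context 2: 2 <= nC < 4
    3: {0: "1111", 1: "001111", 2: "1110"},      # context 3: 4 <= nC < 8
    4: {0: "000011", 1: "000000", 2: "000001"},  # context 4: 8 <= nC
    5: {0: "01", 1: "000111", 2: "1"},           # context 5: nC = -1
}

def select_context(nC: int) -> int:
    if nC == -1:
        return 5
    if nC < 2:
        return 1
    if nC < 4:
        return 2
    if nC < 8:
        return 3
    return 4

def encode_symbol(symbol: int, nC: int) -> str:
    # Pick the mapping table from the context, then look up the codeword.
    return VLC_TABLES[select_context(nC)][symbol]

print(encode_symbol(0, nC=1))  # "1"      (context 1)
print(encode_symbol(2, nC=9))  # "000001" (context 4)
```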
Fig. 6 is a block diagram of an entropy decoder that performs decoding using mapping information between symbols and codewords.
Use the coding/decoding method of the map information between code element and the code word to comprise Hofmann decoding, general variable length decoding and Exp-Golomb decoding.With reference to Fig. 6, entropy decoder comprises context determining unit 610, entropy Decode engine 620 and code element-code word map information storage unit 630.
The context determining unit 610 determines the context corresponding to the entropy-encoded syntax element. The symbol-codeword mapping information storage unit 630 stores a plurality of pieces of symbol-codeword mapping information, as previously defined in Table 2. The entropy decoding engine 620 performs entropy decoding using the mapping information.
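Decoding with symbol-codeword mapping information amounts to matching codeword prefixes in the incoming bits. A minimal sketch of what engine 620 does, with an invented prefix-free codebook standing in for one of the Table-2 maps:

```python
TABLE = {"1": 0, "01": 1, "001": 2, "000": 3}  # invented prefix-free codebook

def decode_symbols(bits, table):
    """Entropy decoding engine 620: walk the bitstream, emitting a symbol
    each time the accumulated bits match a codeword of the selected table."""
    symbols, cw = [], ""
    for b in bits:
        cw += b
        if cw in table:
            symbols.append(table[cw])
            cw = ""
    return symbols
```

In the apparatus of Fig. 6, the context determined by unit 610 would select which such table is passed in.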
Because context-based adaptive entropy encoding methods adaptively control the probability value used for each binary value or symbol, their coding efficiency is far better than when the probability values are fixed. This is because, when the probability values are fixed, probability values trained in advance are used for each symbol, which causes a discrepancy between the probability value used for a symbol and its actual probability. Moreover, when the CABAC encoding method and other adaptive entropy encoding methods are applied to a scalable video encoder, a further improvement in coding efficiency can be achieved. This is because, when the CABAC method is applied to scalable video coding, more accurate probability estimation that exploits the correlation between layers is possible.
In other words, the entropy encoding method according to an exemplary embodiment of the present invention is based on the fact that, when entropy-encoding the symbols of a syntax element, more accurate statistical characteristics can be obtained by referring not only to symbols of syntax elements in the same layer, of the same or a different type as the syntax element to be encoded, but also to symbols of syntax elements of the same or a different type in a lower layer, thereby improving the entropy coding efficiency. This is because the syntax element to be encoded may have a higher correlation with a syntax element in a lower layer than with the syntax elements included in neighboring blocks in the same layer.
Accordingly, the context determining unit of the adaptive entropy encoding apparatus according to an exemplary embodiment of the present invention determines the context by referring not only to syntax elements in the same layer but also to information in a lower layer. Hereinafter, the operation of the context determining unit according to an exemplary embodiment of the present invention is described in further detail with reference to Figs. 7A and 7B.
Figs. 7A and 7B are reference diagrams for explaining the operation of the context determining unit according to an exemplary embodiment of the present invention.
Referring to Figs. 7A and 7B, the context determining unit according to an exemplary embodiment of the present invention differs from the context determining unit of a conventional entropy encoder as follows. First, the context template T 710 includes, in addition to the conventionally used context template {S_0, S_1, ..., S_(T-1)}, symbols {S_T, S_(T+1), ..., S_(T+K-1)} of syntax elements in a lower layer, of the same type as or of a different type from the syntax element to be encoded, together with their associated attributes. For example, as shown in Fig. 7B, when the current block 731 is included in layer M 730, the symbols {S_0, S_1} of the neighboring blocks included in layer M 730 and the symbols {S_T, S_(T+1), ..., S_(T+K-1)} of each lower layer of layer M 730, i.e., layer (M-1) 740 through layer 0 750, are included in the context template T 710. When the symbols of the lower layers are included in the context template T 710, the context set C 720 includes, in addition to the existing contexts {C_0, C_1, ..., C_(N-1)}, new contexts {C_N, C_(N+1), ..., C_(N+q-1)}.
The blocks of the lower layer preferably include the block located at the same spatial position as the current block of the current layer, together with the blocks neighboring that block. Referring to Fig. 7B, the symbol S_T 741 of layer (M-1) 740 and the symbol S_(T+K-5) 751 of layer 0 750 are symbols of the blocks located at the same spatial position as the current block 731. When the picture sizes of the current layer and a lower layer differ, the block located at the same spatial position as the current block is preferably determined after the picture sizes have been matched by upsampling or downsampling.
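A minimal sketch of how the extended context template of Figs. 7A and 7B might be assembled, assuming dyadic spatial scalability so that the co-located block is found by scaling the block coordinates (in place of explicit up/downsampling); all names and the dictionary layout are illustrative, not from the patent.

```python
def colocated_block(x, y, cur_w, low_w):
    """Find the lower-layer block at the same spatial position as (x, y),
    scaling coordinates when the picture sizes differ."""
    scale = low_w / cur_w
    return int(x * scale), int(y * scale)

def build_context_template(same_layer_syms, lower_layers, x, y, cur_w):
    """Context template T 710: same-layer symbols {S_0..S_(T-1)} plus, for
    each lower layer, the symbol of its co-located block {S_T..S_(T+K-1)}."""
    template = list(same_layer_syms)          # {S_0, ..., S_(T-1)}
    for layer in lower_layers:                # layer M-1 down to layer 0
        lx, ly = colocated_block(x, y, cur_w, layer["width"])
        template.append(layer["symbols"][(lx, ly)])
    return template
```

A real implementation would also append the neighbors of each co-located block, as the paragraph above notes.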
Depending on the type of scalable video coding, a layer may be a temporal layer, a spatial layer, an SNR layer, or a combination thereof.
Figs. 8A to 8G are block diagrams of adaptive entropy encoding apparatuses using the context determining unit according to exemplary embodiments of the present invention.
Fig. 8A is a block diagram of an adaptive entropy encoding apparatus using CABAC as the entropy encoding method, Fig. 8B is a block diagram of an adaptive entropy encoding apparatus using a general arithmetic coding method as the entropy encoding method, Fig. 8C is a block diagram of an adaptive entropy encoding apparatus whose entropy encoding method uses the mapping information between symbols and codewords, and Fig. 8D is a block diagram of a CABAC encoder that uses lower-layer information for binarization.
In the adaptive entropy encoding apparatus of Fig. 8A, a binary string transform unit 810 transforms the symbols of a syntax element into a binary string. A context determining unit 812 determines the context corresponding to each binary value of the binary string. When determining the context corresponding to each binary value, not only syntax element information in the same layer but also syntax element information in a lower layer is read from a lower-layer syntax element storage unit 814 and added to the context template for use in context determination. A probability estimating unit 816 searches for a probability value using the context determined by the context determining unit 812, and an arithmetic coding engine 818 receives the probability value and the binary values and performs arithmetic coding.
In the adaptive entropy encoding apparatus of Fig. 8B, unlike the apparatus of Fig. 8A that uses the CABAC encoding method, no transform of symbol values into a binary string is needed, so the binary string transform unit 810 is not included. Accordingly, the context determining unit 820 of Fig. 8B reads not only the syntax element information in the same layer but also the syntax element information in the lower layer from a lower-layer syntax element storage unit 822, adds the read syntax element information to the context template, and determines the context corresponding to the symbol to be encoded. A probability estimating unit 824 searches for a probability value using the context determined by the context determining unit 820, and an arithmetic coding engine 826 performs arithmetic coding on the input symbol.
In the adaptive entropy encoding apparatus of Fig. 8C, a context determining unit 830 reads not only the syntax element information in the same layer but also the syntax element information from a lower-layer syntax element storage unit 832, adds the read syntax element information to the context template, and determines the context corresponding to the symbol to be encoded. The determined context is used to select one of the plurality of pieces of mapping information in a symbol-codeword mapping information storage unit 834. An entropy encoding engine 836 outputs the codeword corresponding to the input symbol using the selected symbol-codeword mapping information.
In the adaptive entropy encoding apparatus of Fig. 8D, the syntax element information in the lower layer is used not only in a context determining unit 842 but also in a binary string transform unit 840 used for symbol binarization. In other words, when encoding a symbol of a given syntax element, the binary string transform unit 840 of the CABAC encoder transforms the symbol into a binary string using symbols of syntax elements of the same type as, or of a different type from, the given syntax element in the same layer or a lower layer, or using attributes of those syntax elements. That is, when transforming a symbol into a binary string, it is possible to refer only to syntax elements in the same layer, or to syntax elements in both the same layer and a lower layer. The lower-layer syntax element storage unit 844, context determining unit 842, probability estimating unit 846, and arithmetic coding engine 848 are identical to those of the adaptive entropy encoding apparatus of Fig. 8A.
Adaptively transforming symbols into binary strings by referring to the syntax elements in the same layer, the syntax elements in both the same layer and a lower layer, or the syntax elements in a lower layer improves coding efficiency, regardless of whether the context determining unit refers to syntax elements in the lower layer or in the same layer. A related exemplary embodiment of the present invention is shown in Fig. 8E. In other words, as described with reference to Fig. 2, a context determining unit 852 refers to the syntax element information in the same layer, while a binary string transform unit 850 refers to one of: the syntax elements in the same layer, the syntax elements in both the same layer and a lower layer, and the syntax elements in a lower layer. Accordingly, a layer syntax element storage unit 854 stores, depending on the selection of reference information, only one of: the syntax elements in the same layer, the syntax elements in both the same layer and a lower layer, and the syntax elements in a lower layer.
As a detailed implementation of the binary string transform units 840 and 850, a plurality of mapping tables between symbol values and binary strings may be stored and one of them selected depending on the selection of reference information; alternatively, a fixed table between symbol values and binary strings may be used, with the mapping between symbols and symbol values remapped depending on the selection of reference information. In the former case, the binary string corresponding to a symbol changes; for example, the binary string "100" corresponding to the symbol value 1 of the symbol mb_type is replaced with another string, so another mapping table is created. In the latter case, in a fixed mapping table such as Table 1, only the mapping between symbol values and binary strings changes. For example, when the binary string corresponding to the symbol value 1 is "100" and the binary string corresponding to the symbol value 2 is "101", the binary string corresponding to the symbol value 1 is changed to "101" and the binary string corresponding to the symbol value 2 is changed to "100".
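The two implementation options just described can be sketched as follows. The binary strings and table contents are invented for illustration; the real Table-1 binarizations of mb_type are not reproduced here.

```python
# Option 1: several symbol-to-binary-string tables; the reference
# information selects which table unit 840/850 uses.
TABLES = {
    0: {0: "0", 1: "100", 2: "101"},  # e.g. table used without lower-layer info
    1: {0: "0", 1: "101", 2: "100"},  # table selected when lower-layer info applies
}

def binarize_multi_table(symbol, table_id):
    return TABLES[table_id][symbol]

# Option 2: one fixed table of binary strings; the reference information
# only remaps which symbol value gets which string.
FIXED_STRINGS = ["0", "100", "101"]

def binarize_remapped(symbol, permutation):
    """permutation[symbol] gives the row of the fixed table to use."""
    return FIXED_STRINGS[permutation[symbol]]
```

With the permutation [0, 2, 1], symbol values 1 and 2 swap binary strings exactly as in the paragraph's example.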
Although the context determining units 812 and 820 and the probability estimating units 816 and 824 may be separate, as shown in Figs. 8A and 8B, a context determining unit 860 may be implemented by adding the function of the probability estimating unit 816 to the context determining unit 812 of Fig. 8A, as shown in Fig. 8F, and a context determining unit 870 may be implemented by adding the function of the probability estimating unit 824 to the context determining unit 820 of Fig. 8B, as shown in Fig. 8G. The probability estimating units 816 and 824 of Figs. 8A and 8B determine a particular probability value using the contexts determined by the context determining units 812 and 820, whereas in Figs. 8F and 8G the context determining units 860 and 870 determine the particular probability value directly. The context determining units 860 and 870 also perform the probability update function.
This change can also be applied to Figs. 8D and 8E.
Figs. 9A to 9G are block diagrams of adaptive entropy decoders using the context determining unit according to exemplary embodiments of the present invention.
More specifically, Figs. 9A to 9D are block diagrams of entropy decoders corresponding to the entropy encoders of Figs. 8A to 8D. Accordingly, Fig. 9A is a block diagram of an entropy decoder using CABAC (context-based adaptive binary arithmetic coding) as the entropy decoding method. Fig. 9B is a block diagram of an entropy decoder using a general arithmetic decoding method as the entropy decoding method. Fig. 9C is a block diagram of an entropy decoder using decoding based on the mapping information between symbols and codewords as the entropy decoding method. Fig. 9D is a block diagram of a CABAC decoder that uses lower-layer information for binarization.
In the entropy decoder of Fig. 9A, a context determining unit 910 determines the context corresponding to the binary value to be decoded from the entropy-encoded bitstream, according to the position of that binary value within the binary string being decoded. When determining the context corresponding to each binary value, the context determining unit 910 reads not only the syntax element information in the same layer but also the syntax element information in the lower layer from a lower-layer syntax element storage unit 911, and adds the read syntax element information to the context template for use in context determination. The determined context is used by an entropy decoding unit 912 to decode the binary values from the input bitstream. A probability estimating unit 913 receives the context and estimates a probability value for each of the binary values 0 and 1. An arithmetic decoding engine 914 sequentially decodes the binary values of the input bitstream using the estimated probability values and transfers them to the probability estimating unit 913 to update the probability values of the binary values. The binary values sequentially decoded by the arithmetic decoding engine 914 are transformed into symbol data by a symbol transform unit 915.
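The estimate-decode-update cycle between units 913 and 914 can be illustrated with a simple frequency-count probability model. Real CABAC uses a 64-state finite-state probability machine, so this Laplace-smoothed counter is only an assumption-laden stand-in that shows how the estimate for bit value 1 adapts as bits arrive.

```python
class AdaptiveBinModel:
    """Per-context adaptive model: P(bit = 1) is estimated from counts
    and updated after every decoded (or encoded) bit."""
    def __init__(self):
        self.counts = [1, 1]  # Laplace-smoothed counts for 0 and 1

    def p_one(self):
        return self.counts[1] / (self.counts[0] + self.counts[1])

    def update(self, bit):
        self.counts[bit] += 1

m = AdaptiveBinModel()
history = []
for bit in [1, 1, 0, 1]:          # bits handed back by the decoding engine
    history.append(round(m.p_one(), 3))  # estimate used for this bit
    m.update(bit)                        # update, as unit 913 does
```

The estimate drifts toward the observed bit statistics, which is the adaptivity the preceding paragraphs credit for the coding gain.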
A general arithmetic decoder, whose arithmetic decoding engine produces symbol values directly, has the same configuration and functions as the CABAC decoder except that it does not require the symbol transform unit 915. However, the detailed implementations may differ, so different reference numerals are used.
In the entropy decoder of Fig. 9C, a context determining unit 920 reads not only the syntax element information in the same layer but also the syntax element information in the lower layer from a lower-layer syntax element storage unit 921, using the symbol values of the symbols rather than their binary values, adds the read information to the context template, and determines the context. The determined context is used to select one of the plurality of pieces of mapping information in a symbol-codeword mapping information storage unit 923. An entropy decoding engine 924 outputs the symbol corresponding to the codeword using the selected symbol-codeword mapping information.
In the entropy decoder of Fig. 9D, the lower-layer syntax element information stored in a lower-layer syntax element storage unit 931 is used not only in a context determining unit 930 but also in a symbol transform unit 932 that transforms binary values into symbols. The transform process is the inverse of the binarization process described in detail with reference to Fig. 8D.
Adaptively transforming binary values into symbols by referring to the syntax elements in the same layer, the syntax elements in both the same layer and a lower layer, or the syntax elements in a lower layer improves coding efficiency, regardless of whether the context determining unit refers to syntax elements in the lower layer or in the same layer. A related exemplary embodiment of the present invention is shown in Fig. 9E. In other words, a context determining unit 940 refers only to the syntax element information in the same layer, as described with reference to Fig. 4, while a symbol transform unit 942 refers to only one of: the syntax elements in the same layer, the syntax elements in both the same layer and the lower layer, and the syntax elements in the lower layer. Accordingly, a layer syntax element storage unit 944 stores, depending on the selection of reference information, only one of: the syntax elements in the same layer, the syntax elements in both the same layer and the lower layer, and the syntax elements in the lower layer.
Although the context determining units 910 and 916 and the probability estimating units 913 and 918 may be separate, as shown in Figs. 9A and 9B, the function of the probability estimating unit 913 may be added to the context determining unit 910 of Fig. 9A, as shown in Fig. 9F, and the function of the probability estimating unit 918 may be added to the context determining unit 916 of Fig. 9B, as shown in Fig. 9G. The probability estimating units 913 and 918 of Figs. 9A and 9B determine probability values using the contexts determined by the context determining units 910 and 916, whereas in Figs. 9F and 9G the context determining units 950 and 960 determine the particular probability value directly. The context determining units 950 and 960 also perform the probability update function.
Figures 10A to 10C are flowcharts illustrating entropy encoding methods according to exemplary embodiments of the present invention.
Referring to Figure 10A, in operation S1010 a symbol of the syntax element to be encoded is input. The symbol of the input syntax element may be transformed into a binary string; an example of the transform into a binary string is as described with reference to Table 1. In operation S1012, the context corresponding to the symbol value of the syntax element, or to each binary value of the binary string, is determined. Here, the context is determined by referring to syntax elements in the same layer, which includes the block containing the syntax element, and in a lower layer of that layer, or by referring only to syntax elements in the lower layer. The context determination is as described with reference to Figs. 7A and 7B.
In operation S1014, a probability value for the syntax element is estimated according to the determined context, and entropy encoding is performed using the estimated probability value. When the symbol of the syntax element is transformed into a binary string for entropy encoding, as described with reference to Fig. 1, a probability value is estimated for each of the binary values 0 and 1, and in operation S1016 the symbol value of the input syntax element, or the binary values corresponding to it, are entropy-encoded. In operation S1018, the probability model is updated according to the entropy-encoded symbol value or binary values. The context determination operation S1012 and the probability estimation operation S1014 may be integrated into a single operation; in that case the probability value for the syntax element can be estimated by referring to syntax elements in the same layer, which includes the block containing the syntax element, and in a lower layer of that layer, or only to syntax elements in the lower layer, without an explicit context determination.
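The sequence S1010 through S1018 can be sketched end-to-end. This is a toy reconstruction under stated assumptions: a unary binarization stands in for Table 1, and instead of driving an arithmetic coder the sketch simply records each bit with the probability estimate that would have been handed to the coder.

```python
def unary_binarize(v):
    """Assumed Table-1-style binarization for the sketch (S1010)."""
    return "1" * v + "0"

def encode_syntax_element(value, context_of, models):
    """S1012 determine context -> S1014 estimate -> S1016 encode -> S1018 update."""
    out = []
    for pos, bit in enumerate(unary_binarize(value)):
        ctx = context_of(pos)                  # S1012: may consult lower-layer elements
        model = models.setdefault(ctx, [1, 1])
        p1 = model[1] / (model[0] + model[1])  # S1014: probability estimate
        out.append((bit, round(p1, 2)))        # S1016: bit handed to the coder
        model[int(bit)] += 1                   # S1018: probability model update
    return out

models = {}
bits = encode_syntax_element(2, lambda pos: min(pos, 1), models)
```

Here `context_of` is the hook where the same-layer and lower-layer reference information of Figs. 7A and 7B would enter; the toy version merely switches context by bit position.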
Next, an entropy encoding method according to another exemplary embodiment of the present invention is described with reference to Figure 10B. In operation S1020, a symbol of the syntax element to be encoded is input, and in operation S1022 the context corresponding to the syntax element is determined. The context determination operation S1022 is identical to the context determination operation S1012 of Figure 10A. As described with reference to Fig. 8C, in operation S1024 one of a plurality of pieces of symbol-codeword mapping information is selected according to the determined context. In operation S1026, the syntax element is entropy-encoded according to the selected symbol-codeword mapping information, and the codeword corresponding to the input symbol is output. In Figure 10B, the context determination operation S1022 and the mapping information selection operation S1024 may be integrated into a single operation; in that case the symbol-codeword mapping information for the syntax element can be selected by referring to syntax elements in the same layer, which includes the block containing the syntax element, and in a lower layer of that layer, or only to syntax elements in the lower layer, without an explicit context determination.
Next, a CABAC entropy encoding method that uses information in the same layer or a lower layer for binarization is described with reference to Figure 10C. In operation S1030, a symbol of the syntax element to be encoded is input. In operation S1032, the symbol of the syntax element is transformed into a binary string. When transforming the symbol into a binary string, the binary string corresponding to each symbol is determined according to the syntax element information in the same layer or a lower layer. In other words, each symbol of the given syntax element is transformed into a binary string using one of: syntax elements of the same or a different type as the given syntax element in the same layer, syntax elements of the same or a different type in both the same layer and a lower layer, and syntax elements of the same or a different type in a lower layer, or using attributes of those syntax elements. That is, when transforming a symbol into a binary string, reference may be made to only one of: the syntax elements in the same layer, the syntax elements in both the same layer and the lower layer, and the syntax elements in the lower layer.
In operation S1034, the context corresponding to each binary value of the symbol of the syntax element is determined. Here, the context is determined by referring to only one of: the syntax elements in the same layer, which includes the block containing the syntax element, the syntax elements in both the same layer and a lower layer, and the syntax elements in the lower layer. The context determination is as described with reference to Figs. 7A and 7B.
In operation S1036, a probability value for the syntax element is estimated according to the determined context, and entropy encoding is performed using the estimated probability value. When the symbol of the syntax element is transformed into a binary string for entropy encoding, as described with reference to Table 1, a probability value is estimated for each of the binary values 0 and 1, and in operation S1038 the syntax element is entropy-encoded using the estimated probability values. In operation S1040, the probability model is updated according to the symbol value or binary values of the entropy-encoded syntax element. The context determination operation S1034 and the probability estimation operation S1036 may be integrated into a single operation; in that case the probability value for the syntax element can be estimated by referring to syntax elements in the same layer, which includes the block containing the syntax element, and in a lower layer of that layer, or only to syntax elements in the lower layer, without an explicit context determination.
Figures 11A to 11C are flowcharts illustrating entropy decoding methods according to exemplary embodiments of the present invention.
Next, an entropy decoding method using CABAC is described with reference to Figure 11A. In operation S1110, a compressed bitstream is input, and in operation S1112 a context is determined according to the position of each binary value of the binary string to be decoded. When determining the context corresponding to each binary value in operation S1112, not only the syntax element information in the same layer but also the syntax element information in the lower layer is referred to, as described with reference to Fig. 9A. The context determination is as described with reference to Figs. 7A and 7B. In operation S1114, a probability value corresponding to each of the binary values 0 and 1 is estimated according to the determined context. In operation S1116, entropy decoding is performed using the estimated probability values, in which the binary values are sequentially decoded from the compressed bitstream. In operation S1118, the probability model for each of the binary values 0 and 1 is updated according to the sequentially decoded binary values. The symbol transform operation S1119 is as described for the symbol transform unit 915 of Fig. 9A.
The entropy decoding method of Figure 11A can also be applied to a general arithmetic decoding method in which the entropy decoding operation S1116 produces symbol values, unlike the method using CABAC. In this case, the symbol transform operation S1119 of Figure 11A is skipped, and the description given with reference to Figure 11A otherwise applies. However, because symbol values rather than binary values are decoded in the entropy decoding operation S1116, the context corresponding to the symbol to be decoded is determined in the context determination operation S1112, and in operation S1114 a probability value for each symbol value is estimated according to the determined context. When determining the context in operation S1112, not only the syntax element information in the same layer but also the syntax element information in the lower layer is referred to. After the entropy decoding operation S1116, in which symbol values are decoded from the compressed bitstream using the estimated probability values, the probability values for the decoded symbol values are updated according to the decoded symbol values in operation S1118.
Next, a decoding method using symbol-codeword mapping information is described with reference to Figure 11B. In operation S1120, a compressed bitstream is input. In operation S1122, the syntax element information in the same layer, which includes the block containing the symbol to be decoded, and the syntax element information in a lower layer of that layer are read and added to the context template to determine the context corresponding to the symbol to be decoded. As described with reference to Fig. 9C, in operation S1124 one of a plurality of pieces of mapping information is selected using the determined context. In operation S1126, the symbol corresponding to the codeword of the compressed bitstream is output using the selected symbol-codeword mapping information.
Next, a CABAC decoding method that uses information in the same layer or a lower layer for binarization is described with reference to Figure 11C. In operation S1130, a compressed bitstream is input, and in operation S1132 a context is determined according to the position of each binary value of the binary string to be decoded. In operation S1134, a probability value is estimated for each of the binary values 0 and 1 according to the determined context. In operation S1136, entropy decoding is performed to sequentially decode the binary values from the compressed bitstream using the estimated probability values. In operation S1138, the probability model for each of the binary values 0 and 1 is updated according to the sequentially decoded binary values. In operation S1140, the entropy-decoded binary values are transformed into symbols. The symbol transform operation S1140 is as described with reference to Fig. 9E. In other words, when transforming the binary values into a symbol, reference is made to one of: symbols of syntax elements of the same or a different type as the syntax element to be decoded in the same layer, symbols of syntax elements of the same or a different type in both the same layer and a lower layer, and symbols of syntax elements of the same or a different type in a lower layer.
The method of adaptively transforming binary values into symbols by referring to syntax elements in the same layer, in both the same layer and a lower layer, or in a lower layer improves coding efficiency, regardless of whether the context determination refers to syntax elements in the lower layer or in the same layer.
Figure 12 is a block diagram of a video encoder having an entropy encoder according to an exemplary embodiment of the present invention.
The video encoder comprises a motion estimation unit 1202, a motion compensation unit 1204, an intra prediction unit 1206, a transform unit 1208, a quantization unit 1210, a rearrangement unit 1212, an entropy encoding unit 1214, an inverse quantization unit 1216, an inverse transform unit 1218, and a frame memory 1222.
The video encoder encodes the macroblocks of the current picture in one mode selected from among various encoding modes. To do so, it is possible to compute the rate-distortion (RD) cost by performing encoding in all available modes of inter prediction and intra prediction, select the mode with the minimum RD cost as the best encoding mode, and perform encoding in the selected best encoding mode. For inter prediction, the motion estimation unit 1202 searches a reference picture for the predicted value of a macroblock of the current picture. If the motion estimation unit 1202 searches for a reference block in units of 1/2 pixel or 1/4 pixel, the motion compensation unit 1204 computes the intermediate pixel data and determines the reference block. Thus, inter prediction is performed by the motion estimation unit 1202 and the motion compensation unit 1204.
Intra prediction is performed by the intra prediction unit 1206, in which the predicted value of a macroblock of the current picture is searched for within the current picture itself. To determine whether to perform inter prediction or intra prediction for the current macroblock, it is possible to compute the RD cost in all encoding modes, determine the mode with the minimum RD cost as the encoding mode for the macroblock, and encode the current macroblock in the determined encoding mode.
Once the prediction data to be referred to by a macroblock of the current frame has been found by inter prediction or intra prediction, it is extracted from the macroblock of the current picture. The remainder of the macroblock is transformed by the transform unit 1208 and quantized by the quantization unit 1210. The remainder of the macroblock after the prediction data has been extracted is called the residual, and it is encoded to reduce the amount of data. The quantized residual passes through the rearrangement unit 1212 and is encoded by the entropy encoding unit 1214. The entropy encoding unit 1214 is configured as described with reference to Figs. 8A to 8G.
To obtain a reference picture to be used for inter prediction, the quantized picture passes through the inverse quantization unit 1216 and the inverse transform unit 1218 so that the current picture is reconstructed. The reconstructed current picture is stored in the frame memory 1222 and is later used for inter prediction of the next picture. Once the reconstructed current picture passes through the filter 1220, it approximates the original picture with an additional coding error.
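The prediction/residual/reconstruction path through units 1208-1218 can be sketched numerically. This is a bare illustration under assumptions: a single uniform quantizer step stands in for units 1210/1216, the transform itself is omitted, and the sample values are invented.

```python
STEP = 4  # assumed uniform quantizer step size (units 1210 / 1216)

def encode_block(block, prediction):
    """Extract the prediction, then quantize the residual (transform omitted)."""
    residual = [b - p for b, p in zip(block, prediction)]
    return [round(r / STEP) for r in residual]

def reconstruct_block(levels, prediction):
    """Inverse quantize (1216/1218) and add the prediction back, yielding
    the reconstructed picture stored in frame memory 1222."""
    residual = [l * STEP for l in levels]
    return [p + r for p, r in zip(prediction, residual)]

levels = encode_block([100, 90, 80], [96, 91, 77])
recon = reconstruct_block(levels, [96, 91, 77])
```

The reconstruction differs from the input block by the quantization error, which is the "additional coding error" the paragraph above refers to; it is the quantized `levels` that the entropy encoding unit 1214 would encode.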
Figure 13 is a block diagram of a video decoder having an entropy decoder according to an exemplary embodiment of the present invention.
The video decoder comprises an entropy decoding unit 1310, a rearrangement unit 1320, an inverse quantization unit 1330, an inverse transform unit 1340, an intra prediction unit 1350, a motion compensation unit 1360, a filter 1370, and a frame memory 1380.
Once a bitstream encoded by the video encoder is input, the entropy decoding unit 1310 extracts symbol data by performing entropy decoding on the input bitstream. The other components of the video decoder have the same configurations and functions as those described with reference to Figure 12.
The entropy encoding and entropy decoding methods according to example embodiments of the present invention can also be embodied as computer-readable code on a computer-readable recording medium. The code and code segments constituting the computer program can be easily constructed by programmers skilled in the art. The computer program can be stored in a computer-readable medium and read and executed by a computer to perform the entropy encoding and entropy decoding methods. Examples of computer-readable media include magnetic tapes, optical data storage devices, and carrier waves.
While example embodiments of the present invention have been particularly shown and described with reference to those embodiments, it will be understood by those of ordinary skill in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims.

Claims (45)

1. A scalable entropy encoding method comprising:
determining a context by referring to a syntax element in the same layer as a block including a syntax element to be encoded together with a syntax element in a lower layer, or by referring only to a syntax element in the lower layer; and
performing entropy encoding on the syntax element to be encoded using the determined context.
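The context-selection idea of claim 1 can be illustrated with a minimal sketch: when same-layer neighbor syntax elements are available, derive the context index from them together with the co-located lower-layer element; otherwise fall back to the lower layer alone. The function name and the index formula are illustrative assumptions, not the patent's actual derivation.

```python
# Hypothetical sketch of scalable context derivation: combine same-layer
# neighbor syntax elements with the co-located lower-layer element, or
# fall back to the lower layer when no same-layer neighbor is available.

def derive_context(same_layer_neighbors, lower_layer_value):
    """Context index from same-layer neighbors plus the lower-layer element,
    or from the lower layer alone when no same-layer neighbors exist."""
    if same_layer_neighbors:                 # e.g. values of the left/upper blocks
        return sum(same_layer_neighbors) + 2 * lower_layer_value
    return lower_layer_value                 # only the lower layer is referenced

ctx_full = derive_context([1, 0], 1)   # same layer together with lower layer
ctx_base = derive_context([], 1)       # lower layer only
```

Conditioning on the lower layer is what makes the scheme "scalable": the base layer is decodable on its own, yet its statistics sharpen the enhancement layer's probability model.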
2. The scalable entropy encoding method of claim 1, wherein the referred-to syntax element is of the same type as the syntax element to be encoded.
3. The scalable entropy encoding method of claim 1, wherein the context indicates a probability value.
4. The scalable entropy encoding method of claim 1, wherein the context indicates predetermined symbol-codeword mapping information.
5. The scalable entropy encoding method of claim 1, wherein the referred-to syntax elements include an element of the same type as the syntax element to be encoded and an element of a different type from the syntax element to be encoded.
6. The scalable entropy encoding method of claim 1, wherein the entropy encoding is arithmetic coding using a predetermined arithmetic expression or coding using symbol-codeword mapping information.
7. The scalable entropy encoding method of claim 1, wherein the entropy encoding is performed using predetermined symbol-codeword mapping information.
8. The scalable entropy encoding method of claim 6, wherein the entropy encoding is one of binary arithmetic coding, arithmetic coding, Huffman coding, and universal variable length coding.
9. The scalable entropy encoding method of claim 1, further comprising transforming the syntax element to be encoded into a binary string.
10. The scalable entropy encoding method of claim 9, wherein the syntax element to be encoded is transformed into a binary string by referring to a syntax element in the same layer as the block including the syntax element to be encoded, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer.
11. The scalable entropy encoding method of claim 10, wherein a mapping between the syntax element to be encoded and the binary string is fixed or is changed variably according to the determined context.
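The context-dependent binarization of claims 9 through 11 can be sketched as switching the symbol-to-binary-string mapping on the determined context. The choice of unary versus 2-bit fixed-length codes below is an illustrative assumption, not the patent's actual mapping tables.

```python
# Hypothetical sketch of context-dependent binarization: the mapping
# from syntax element value to binary string is switched by the
# determined context. Unary vs. fixed-length codes are illustrative.

def binarize(symbol, context):
    """Map a non-negative symbol to a binary string, chosen by context."""
    if context == 0:                     # mapping A: unary code
        return "1" * symbol + "0"
    return format(symbol, "02b")         # mapping B: 2-bit fixed-length code

u = binarize(3, 0)    # unary binarization
f = binarize(3, 1)    # fixed-length binarization
```

The same value thus produces different binary strings under different contexts, which is what "fixed or changed variably according to the determined context" amounts to in practice.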
12. An entropy encoding method comprising:
transforming a syntax element to be encoded into a binary string by referring to a syntax element in the same layer as a block including the syntax element to be encoded, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer;
determining a context by referring to a syntax element in the same layer; and
performing entropy encoding on the syntax element to be encoded using the determined context.
13. A scalable entropy decoding method comprising:
receiving entropy-encoded data and determining a context by referring to a syntax element in the same layer as a block including the entropy-encoded syntax element together with a syntax element in a lower layer, or by referring only to a syntax element in the lower layer; and
performing entropy decoding on the entropy-encoded data using the determined context.
14. The scalable entropy decoding method of claim 13, wherein the referred-to syntax element is of the same type as the encoded syntax element.
15. The scalable entropy decoding method of claim 13, wherein the context indicates a probability value.
16. The scalable entropy decoding method of claim 13, wherein the context indicates predetermined symbol-codeword mapping information.
17. The scalable entropy decoding method of claim 13, wherein the referred-to syntax elements include an element of the same type as the encoded syntax element and an element of a different type from the encoded syntax element.
18. The scalable entropy decoding method of claim 13, wherein the entropy decoding is arithmetic decoding using a predetermined arithmetic expression or decoding using symbol-codeword mapping information.
19. The scalable entropy decoding method of claim 18, wherein the entropy decoding is one of binary arithmetic decoding, arithmetic decoding, Huffman decoding, and universal variable length decoding.
20. The scalable entropy decoding method of claim 13, further comprising transforming the entropy-decoded binary string into a symbol.
21. The scalable entropy decoding method of claim 20, wherein the entropy-decoded binary string is transformed into a symbol by referring to a syntax element in the same layer as a block including the entropy-encoded syntax element, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer.
22. An entropy decoding method comprising:
receiving entropy-encoded data and determining a context by referring to a syntax element in the same layer as a block including the entropy-encoded syntax element;
performing entropy decoding on the entropy-encoded data using the determined context; and
transforming the entropy-decoded binary string into a symbol by referring to one of: a syntax element in the same layer as the block including the entropy-encoded syntax element, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer.
23. The entropy decoding method of claim 22, wherein a mapping between the binary string and the symbol is fixed or is changed variably according to the determined context.
24. A scalable entropy encoder comprising:
a context determining unit which determines a context by referring to a syntax element in the same layer as a block including a syntax element to be encoded together with a syntax element in a lower layer, or by referring only to a syntax element in the lower layer; and
an entropy encoding engine which performs entropy encoding on the syntax element to be encoded using the determined context.
25. The scalable entropy encoder of claim 24, wherein the context determining unit determines the context by referring to a syntax element of the same type as the syntax element to be encoded.
26. The scalable entropy encoder of claim 24, wherein the context indicates a probability value.
27. The scalable entropy encoder of claim 24, wherein the context indicates predetermined symbol-codeword mapping information.
28. The scalable entropy encoder of claim 24, wherein the context determining unit determines the context by referring to a syntax element of the same type as the syntax element to be encoded and a syntax element of a different type from the syntax element to be encoded.
29. The scalable entropy encoder of claim 24, wherein the entropy encoding engine performs arithmetic coding using a predetermined arithmetic expression or coding using symbol-codeword mapping information.
30. The scalable entropy encoder of claim 29, wherein the entropy encoding engine performs one of binary arithmetic coding, arithmetic coding, Huffman coding, and universal variable length coding.
31. The scalable entropy encoder of claim 24, further comprising a binary string transform unit which transforms the syntax element to be encoded into a binary string.
32. The scalable entropy encoder of claim 31, wherein the binary string transform unit transforms the syntax element to be encoded into a binary string by referring to one of: a syntax element in the same layer as the block including the syntax element to be encoded, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer.
33. An entropy encoder comprising:
a binary string transform unit which transforms a syntax element to be encoded into a binary string by referring to a syntax element in the same layer as a block including the syntax element to be encoded, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer;
a context determining unit which determines a context by referring to a syntax element in the same layer; and
an entropy encoding engine which performs entropy encoding on the syntax element to be encoded using the determined context.
34. The entropy encoder of claim 33, wherein the binary string transform unit transforms the syntax element to be encoded into a binary string using a mapping between the syntax element to be encoded and the binary string that is fixed or is changed variably according to the determined context.
35. A scalable entropy decoder comprising:
a context determining unit which receives entropy-encoded data and determines a context by referring to a syntax element in the same layer as a block including the entropy-encoded syntax element together with a syntax element in a lower layer, or by referring only to a syntax element in the lower layer; and
an entropy decoding engine which performs entropy decoding on the entropy-encoded data using the determined context.
36. The scalable entropy decoder of claim 35, wherein the context determining unit determines the context by referring to a syntax element of the same type as the encoded syntax element.
37. The scalable entropy decoder of claim 35, wherein the context indicates a probability value.
38. The scalable entropy decoder of claim 35, wherein the context indicates one of a plurality of pieces of predetermined symbol-codeword mapping information.
39. The scalable entropy decoder of claim 35, wherein the context determining unit determines the context by referring to an element of the same type as the encoded syntax element and an element of a different type from the encoded syntax element.
40. The scalable entropy decoder of claim 35, wherein the entropy decoding engine performs arithmetic decoding using a predetermined arithmetic expression or decoding using symbol-codeword mapping information.
41. The scalable entropy decoder of claim 40, wherein the entropy decoding engine performs one of binary arithmetic decoding, arithmetic decoding, Huffman decoding, and universal variable length decoding.
42. The scalable entropy decoder of claim 35, further comprising a symbol transform unit which transforms the entropy-decoded binary string into a symbol.
43. The scalable entropy decoder of claim 42, wherein the symbol transform unit transforms the entropy-decoded binary string into a symbol by referring to a syntax element in the same layer as a block including the entropy-encoded syntax element, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer.
44. An entropy decoder comprising:
a context determining unit which receives entropy-encoded data and determines a context by referring to a syntax element in the same layer as a block including the entropy-encoded syntax element;
an entropy decoding engine which performs entropy decoding on the entropy-encoded data using the determined context; and
a symbol transform unit which transforms the entropy-decoded binary string into a symbol by referring to a syntax element in the same layer as the block including the entropy-encoded syntax element, a syntax element in a lower layer, or both a syntax element in the same layer and a syntax element in the lower layer.
45. The entropy decoder of claim 44, wherein the symbol transform unit transforms the entropy-decoded binary string into a symbol using a mapping between the encoded element and the binary string that is fixed or is changed variably according to the determined context.
CNA2006800017078A 2005-01-14 2006-01-14 Methods of and apparatuses for adaptive entropy encoding and adaptive entropy decoding for scalable video encoding Pending CN101099391A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR20050003918 2005-01-14
KR1020050003918 2005-01-14
KR1020050031410 2005-04-15

Publications (1)

Publication Number Publication Date
CN101099391A true CN101099391A (en) 2008-01-02

Family

ID=37173591

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2006800017078A Pending CN101099391A (en) 2005-01-14 2006-01-14 Methods of and apparatuses for adaptive entropy encoding and adaptive entropy decoding for scalable video encoding

Country Status (3)

Country Link
JP (1) JP2008527902A (en)
KR (1) KR100636229B1 (en)
CN (1) CN101099391A (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102263948A (en) * 2010-05-27 2011-11-30 飞思卡尔半导体公司 Video processing system, computer program product and method for decoding an encoded video stream
CN102598514A (en) * 2009-08-21 2012-07-18 汤姆森特许公司 Methods and apparatus for explicit updates for symbol probabilities of an entropy encoder or decoder
CN102783036A (en) * 2010-02-26 2012-11-14 捷讯研究有限公司 Encoding and decoding methods and devices using a secondary codeword indicator
CN102783035A (en) * 2010-02-18 2012-11-14 捷讯研究有限公司 Parallel entropy coding and decoding methods and devices
WO2012163199A1 (en) * 2011-05-27 2012-12-06 Mediatek Inc. Method and apparatus for line buffer reduction for video processing
CN104581154A (en) * 2014-12-31 2015-04-29 湖南国科微电子有限公司 Entropy coding method and entropy coder circuit
CN104813589A (en) * 2012-12-14 2015-07-29 英特尔公司 Protecting against packet loss during transmission of video information
CN104853217A (en) * 2010-04-16 2015-08-19 Sk电信有限公司 Video encoding/decoding apparatus and method
CN105144712A (en) * 2013-04-09 2015-12-09 西门子公司 Method for coding sequence of digital images
CN105917408A (en) * 2014-01-30 2016-08-31 高通股份有限公司 Indicating frame parameter reusability for coding vectors
CN105959014A (en) * 2010-09-30 2016-09-21 夏普株式会社 Methods and systems for context initialization in video coding and decoding
CN106488237A (en) * 2010-04-05 2017-03-08 三星电子株式会社 Low complex degree entropy coding/decoding method and apparatus
TWI575886B (en) * 2011-01-14 2017-03-21 Ge影像壓縮有限公司 Entropy encoding and decoding scheme
CN107087189A (en) * 2011-11-07 2017-08-22 太格文-Ii有限责任公司 Method for encoding images and picture coding device
CN107465926A (en) * 2011-06-16 2017-12-12 Ge视频压缩有限责任公司 Context initialization in entropy code
CN108471538A (en) * 2010-04-13 2018-08-31 Ge视频压缩有限责任公司 Decode device, method and the device for encoding Saliency maps of Saliency maps
US10499176B2 (en) 2013-05-29 2019-12-03 Qualcomm Incorporated Identifying codebooks to use when coding spatial components of a sound field
US10770087B2 (en) 2014-05-16 2020-09-08 Qualcomm Incorporated Selecting codebooks for coding vectors decomposed from higher-order ambisonic audio signals

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100703776B1 (en) * 2005-04-19 2007-04-06 삼성전자주식회사 Method and apparatus of context-based adaptive arithmetic coding and decoding with improved coding efficiency, and method and apparatus for video coding and decoding including the same
KR101154999B1 (en) 2005-07-08 2012-07-09 엘지전자 주식회사 Method for modeling coding information of video signal for compressing/decompressing coding information
WO2007008018A1 (en) 2005-07-08 2007-01-18 Lg Electronics Inc. Method for modeling coding information of a video signal to compress/decompress the information
KR101158439B1 (en) 2005-07-08 2012-07-13 엘지전자 주식회사 Method for modeling coding information of video signal for compressing/decompressing coding information
KR100935492B1 (en) * 2007-10-29 2010-01-06 중앙대학교 산학협력단 Apparatus and method for predicting motion using 1-bit transform and apparatus and method for encoding image using the same
CN102754433B (en) * 2010-01-26 2015-09-30 维德约股份有限公司 Low complex degree, high frame-rate video encoder
WO2011126282A2 (en) 2010-04-05 2011-10-13 Samsung Electronics Co., Ltd. Method and apparatus for encoding video by using transformation index, and method and apparatus for decoding video by using transformation index
US9049450B2 (en) 2010-04-05 2015-06-02 Samsung Electronics Co., Ltd. Method and apparatus for encoding video based on internal bit depth increment, and method and apparatus for decoding video based on internal bit depth increment
WO2011126277A2 (en) * 2010-04-05 2011-10-13 Samsung Electronics Co., Ltd. Low complexity entropy-encoding/decoding method and apparatus
PL2559166T3 (en) 2010-04-13 2018-04-30 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Probability interval partioning encoder and decoder
JP2013526795A (en) * 2010-05-10 2013-06-24 サムスン エレクトロニクス カンパニー リミテッド Method and apparatus for transmitting and receiving layer coding video
WO2012036436A2 (en) * 2010-09-13 2012-03-22 한국전자통신연구원 Method and apparatus for entropy encoding/decoding
US8977065B2 (en) * 2011-07-21 2015-03-10 Luca Rossato Inheritance in a tiered signal quality hierarchy
US8531321B1 (en) * 2011-07-21 2013-09-10 Luca Rossato Signal processing and inheritance in a tiered signal quality hierarchy
PT3145197T (en) 2011-10-31 2018-08-10 Samsung Electronics Co Ltd Method for determining a context model for transform coefficient level entropy decoding
AU2015201780B2 (en) * 2011-10-31 2016-07-28 Samsung Electronics Co., Ltd. Method and apparatus for determining a context model for transform coefficient level entropy encoding and decoding
KR101975404B1 (en) * 2017-12-27 2019-08-28 세종대학교산학협력단 Apparatus and method for generating procedural content

Cited By (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9654138B2 (en) 2009-08-21 2017-05-16 Thomson Licensing Dtv Methods and apparatus for explicit updates for symbol probabilities of an entropy encoder or decoder
CN102598514A (en) * 2009-08-21 2012-07-18 汤姆森特许公司 Methods and apparatus for explicit updates for symbol probabilities of an entropy encoder or decoder
CN102598514B (en) * 2009-08-21 2015-12-09 汤姆森特许公司 The method and apparatus of the symbol probability of dominant renewal entropy coder or decoder
CN102783035A (en) * 2010-02-18 2012-11-14 捷讯研究有限公司 Parallel entropy coding and decoding methods and devices
CN102783035B (en) * 2010-02-18 2015-07-22 黑莓有限公司 Parallel entropy coding and decoding methods and devices
CN102783036A (en) * 2010-02-26 2012-11-14 捷讯研究有限公司 Encoding and decoding methods and devices using a secondary codeword indicator
CN102783036B (en) * 2010-02-26 2015-04-22 黑莓有限公司 Encoding and decoding methods and devices using a secondary codeword indicator
CN106488237A (en) * 2010-04-05 2017-03-08 三星电子株式会社 Low complex degree entropy coding/decoding method and apparatus
CN113556560A (en) * 2010-04-13 2021-10-26 Ge视频压缩有限责任公司 Coding of significance maps and transform coefficient blocks
US11252419B2 (en) 2010-04-13 2022-02-15 Ge Video Compression, Llc Coding of significance maps and transform coefficient blocks
US11297336B2 (en) 2010-04-13 2022-04-05 Ge Video Compression, Llc Coding of significance maps and transform coefficient blocks
CN108471538B (en) * 2010-04-13 2022-05-17 Ge视频压缩有限责任公司 Device and method for decoding significance map and device for encoding significance map
CN108471538A (en) * 2010-04-13 2018-08-31 Ge视频压缩有限责任公司 Decode device, method and the device for encoding Saliency maps of Saliency maps
CN104853217A (en) * 2010-04-16 2015-08-19 Sk电信有限公司 Video encoding/decoding apparatus and method
CN102263948A (en) * 2010-05-27 2011-11-30 飞思卡尔半导体公司 Video processing system, computer program product and method for decoding an encoded video stream
CN102263948B (en) * 2010-05-27 2016-08-03 飞思卡尔半导体公司 For decoding processing system for video and the method for the video flowing of coding
CN105959014B (en) * 2010-09-30 2019-07-23 夏普株式会社 The method and apparatus that video is decoded or is encoded
CN105959014A (en) * 2010-09-30 2016-09-21 夏普株式会社 Methods and systems for context initialization in video coding and decoding
CN106059593A (en) * 2010-09-30 2016-10-26 夏普株式会社 Methods and systems for context initialization in video coding and decoding
CN106060545A (en) * 2010-09-30 2016-10-26 夏普株式会社 Methods and systems for context initialization in video coding and decoding
CN106060545B (en) * 2010-09-30 2019-06-14 夏普株式会社 The method and system of context initialization in Video coding and decoding
US10404272B2 (en) 2011-01-14 2019-09-03 Ge Video Compression, Llc Entropy encoding and decoding scheme
US10855309B2 (en) 2011-01-14 2020-12-01 Ge Video Compression, Llc Entropy encoding and decoding scheme
US11405050B2 (en) 2011-01-14 2022-08-02 Ge Video Compression, Llc Entropy encoding and decoding scheme
CN107196662B (en) * 2011-01-14 2021-07-30 Ge视频压缩有限责任公司 Entropy encoding apparatus and method, entropy decoding apparatus and method, and storage medium
CN107196662A (en) * 2011-01-14 2017-09-22 Ge视频压缩有限责任公司 Entropy code apparatus and method, entropy decoding apparatus and method and storage medium
US9806738B2 (en) 2011-01-14 2017-10-31 Ge Video Compression, Llc Entropy encoding and decoding scheme
US9698818B2 (en) 2011-01-14 2017-07-04 Ge Video Compression, Llc Entropy encoding and decoding scheme
US10826524B2 (en) 2011-01-14 2020-11-03 Ge Video Compression, Llc Entropy encoding and decoding scheme
US10644719B2 (en) 2011-01-14 2020-05-05 Ge Video Compression, Llc Entropy encoding and decoding scheme
US9647683B2 (en) 2011-01-14 2017-05-09 Ge Video Compression, Llc Entropy encoding and decoding scheme
US10581454B2 (en) 2011-01-14 2020-03-03 Ge Video Compression, Llc Entropy encoding and decoding scheme
US10090856B2 (en) 2011-01-14 2018-10-02 Ge Video Compression, Llc Entropy encoding and decoding scheme
TWI640169B (en) * 2011-01-14 2018-11-01 Ge影像壓縮有限公司 Entropy encoding and decoding scheme
US20190013822A1 (en) 2011-01-14 2019-01-10 Ge Video Compression, Llc Entropy encoding and decoding scheme
US10224953B2 (en) 2011-01-14 2019-03-05 Ge Video Compression, Llc Entropy encoding and decoding scheme
US10419017B2 (en) 2011-01-14 2019-09-17 Ge Video Compression, Llc Entropy encoding and decoding scheme
TWI575886B (en) * 2011-01-14 2017-03-21 Ge影像壓縮有限公司 Entropy encoding and decoding scheme
US9866848B2 (en) 2011-05-27 2018-01-09 Hfi Innovation Inc. Method and apparatus for line buffer reduction for video processing
WO2012163199A1 (en) * 2011-05-27 2012-12-06 Mediatek Inc. Method and apparatus for line buffer reduction for video processing
US9762918B2 (en) 2011-05-27 2017-09-12 Hfi Innovation Inc. Method and apparatus for line buffer reduction for video processing
US9986247B2 (en) 2011-05-27 2018-05-29 Hfi Innovation Inc. Method and apparatus for line buffer reduction for video processing
CN107465926A (en) * 2011-06-16 2017-12-12 Ge视频压缩有限责任公司 Context initialization in entropy code
CN107087189A (en) * 2011-11-07 2017-08-22 太格文-Ii有限责任公司 Method for encoding images and picture coding device
CN104813589A (en) * 2012-12-14 2015-07-29 英特尔公司 Protecting against packet loss during transmission of video information
CN105144712B (en) * 2013-04-09 2018-09-11 西门子公司 Method for encoded digital image sequence
CN105144712A (en) * 2013-04-09 2015-12-09 西门子公司 Method for coding sequence of digital images
US10250874B2 (en) 2013-04-09 2019-04-02 Siemens Aktiengesellschaft Method for coding sequence of digital images
US10499176B2 (en) 2013-05-29 2019-12-03 Qualcomm Incorporated Identifying codebooks to use when coding spatial components of a sound field
US11146903B2 (en) 2013-05-29 2021-10-12 Qualcomm Incorporated Compression of decomposed representations of a sound field
US11962990B2 (en) 2013-05-29 2024-04-16 Qualcomm Incorporated Reordering of foreground audio objects in the ambisonics domain
CN105917408A (en) * 2014-01-30 2016-08-31 高通股份有限公司 Indicating frame parameter reusability for coding vectors
US10770087B2 (en) 2014-05-16 2020-09-08 Qualcomm Incorporated Selecting codebooks for coding vectors decomposed from higher-order ambisonic audio signals
CN104581154A (en) * 2014-12-31 2015-04-29 湖南国科微电子有限公司 Entropy coding method and entropy coder circuit
CN104581154B (en) * 2014-12-31 2016-03-02 湖南国科微电子股份有限公司 A kind of entropy coding method and entropy coder circuit

Also Published As

Publication number Publication date
JP2008527902A (en) 2008-07-24
KR100636229B1 (en) 2006-10-19
KR20060083100A (en) 2006-07-20

Similar Documents

Publication Publication Date Title
CN101099391A (en) Methods of and apparatuses for adaptive entropy encoding and adaptive entropy decoding for scalable video encoding
US9774863B2 (en) Video decoder with enhanced CABAC decoding
US10277903B2 (en) Video encoding method and video encoding for signaling SAO parameters
US7262721B2 (en) Methods of and apparatuses for adaptive entropy encoding and adaptive entropy decoding for scalable video encoding
KR101187243B1 (en) Fast parsing of variable-to-fixed-length codes
CN108965891B (en) Method and apparatus for video encoding and decoding
CN105325004A (en) Video encoding method and apparatus, and video decoding method and apparatus based on signaling of sample adaptive offset parameters
JP2021520087A (en) Methods and equipment for video coding and decoding based on CABAC's neural network implementation
CN111263149B (en) Video encoding method and apparatus, and video decoding method and apparatus
CN104539948A (en) Video processing system and video processing method
AU2019356526B2 (en) A method and apparatus for image compression
US20140269896A1 (en) Multi-Frame Compression
US8421655B2 (en) Apparatus for parallel entropy encoding and decoding

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20080102