CN100403801C - Adaptive entropy coding/decoding method based on context - Google Patents
- Publication number
- CN100403801C (grant) · CNB2005101048530A / CN200510104853A (application)
- Authority
- CN
- China
- Prior art keywords
- context
- level
- coding
- run
- coefficient
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Abstract
The present invention relates to a context-based adaptive entropy coding/decoding method. During encoding, the quantized DCT coefficients of the current transform block are scanned to form a sequence of (level, run) pairs; each pair in the sequence is then entropy coded in reverse scan order. During coding, the context statistical model is constructed adaptively from the values of the pairs already coded in the current block. A context-model weighted-fusion technique is further provided to improve the compression performance of the model, and the entropy coding is driven by the resulting context statistical model. The decoding method is the inverse of the encoding method. The present invention describes a long-memory Markov model with a small number of context states and achieves high compression performance.
Description
Technical field
The present invention relates to the technical field of multimedia (image, video, audio) compression and transmission, and in particular to a context-modeling-based adaptive entropy coding/decoding method for DCT coefficients in signal compression. The object of the invention is to improve the quality of the context statistical model used in adaptive entropy coding of DCT coefficients. In all data compression techniques and systems, adaptive entropy coding plays a crucial role, and adaptive entropy coding must be driven by a context statistical model; the quality of that model ultimately determines the compression performance of the whole system.
Background technology
The Discrete Cosine Transform (DCT) is widely used in the compression of video and audio signals. After the DCT, the statistical and perceptual redundancy of the signal can be better understood, exploited, and removed, so the transformed signal is more amenable to compression. Most current international and domestic compression standards (JPEG, MPEG, H.264, and the Chinese Audio Video Standard, AVS) encode in the DCT domain, which fully demonstrates the effectiveness of the transform. The DCT itself, however, produces no reduction in data volume, because the number of transform coefficients equals the number of original signal samples; what actually achieves compression is the entropy coding of the DCT coefficients. Any entropy coding method for DCT coefficients, such as Huffman coding or arithmetic coding, must rely on an estimate of the probability distribution of the DCT coefficients.
In a practical DCT-based signal compression system, the DCT coefficients are entropy coded one by one in an agreed scanning order. One of the most widely used orders is the zigzag scan adopted in the JPEG and MPEG standards. The method of the present invention is general and applies to any DCT coefficient scanning order.
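As an illustration of one such scanning order, the zigzag scan used by JPEG and MPEG can be sketched as follows. This is an illustrative aid, not part of the claimed method, and the function names are hypothetical:

```python
def zigzag_order(n):
    # Visit the (row, col) positions of an n x n block along anti-diagonals,
    # alternating direction on each diagonal, as in the JPEG/MPEG zigzag scan.
    cells = [(r, c) for r in range(n) for c in range(n)]
    return sorted(cells, key=lambda rc: (rc[0] + rc[1],
                                         -rc[0] if (rc[0] + rc[1]) % 2 == 0 else rc[0]))

def zigzag_scan(block):
    # Arrange a 2-D coefficient block into the 1-D scanned sequence.
    return [block[r][c] for r, c in zigzag_order(len(block))]
```

After such a scan, low-frequency coefficients come first and the long runs of zeros cluster at the end of the sequence, which is what makes (level, run) pair coding effective.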
Let x_1, x_2, ..., x_N denote the DCT coefficient sequence after scanning, where N reflects the size of the block used by the DCT. The minimum code length required to encode this sequence is

    L_min = - Σ_{i=1}^{N} log2 P(x_i | x^{i-1}),

where x^{i-1} denotes the sequence {x_{i-1}, x_{i-2}, ..., x_1} of all coefficients preceding the current coefficient x_i. If the conditional probability P(x_i | x^{i-1}) were known, adaptive arithmetic coding could approach this ideal minimum code length L_min. Under an estimated conditional probability P̂(x_i | x^{i-1}), arithmetic coding achieves the code length

    L̂ = - Σ_{i=1}^{N} log2 P̂(x_i | x^{i-1}).

The crux of the problem is how to perform the probability estimation so that P̂ becomes a good estimate of the conditional probability P(x_i | x^{i-1}).

Let C(x^{i-1}) denote a subsequence of x^{i-1} that is significant for the statistical properties of the current coefficient x_i; it is called the model context. The estimated conditional probability P̂(x_i | C(x^{i-1})) then serves as the statistical model of the source.
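The link between the estimated probabilities and the achievable code length can be checked numerically. The sketch below simply accumulates -log2 of the probabilities the model assigned to the symbols actually coded; it is an illustrative aid, not part of the patented method:

```python
import math

def achievable_code_length(assigned_probs):
    # assigned_probs: the estimated conditional probability the model gave to
    # each symbol that was actually emitted.  Arithmetic coding approaches the
    # sum of -log2 of these probabilities, measured in bits.
    return sum(-math.log2(p) for p in assigned_probs)
```

The better the estimates track the true conditional probabilities, the closer this value comes to the ideal minimum code length L_min.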
The context statistical model, embodied as a form of probability estimation, is the core of every signal compression system. The quality of the model, that is, the accuracy of its probability estimates, ultimately determines the compression performance.
In 2001, Nokia proposed in the H.264 reference model TML8.5 several methods for adaptive binary arithmetic coding of the (run, level) pair sequence of a DCT coefficient block (level is a nonzero residual coefficient; run is the number of zero coefficients between the current nonzero coefficient and the previous one). That method encodes each pair in the same order in which the pairs are generated by the scan, and during coding constructs the context only from the most recently coded level value in the block (a first-order Markov model). Patent publication US2004112683 discloses a method of coding transform coefficients that completes the coding of a DCT coefficient block in two scans, one forward and one backward. In the forward scan, the first plane of the coefficients (the significance map) is coded with contexts constructed from the physical frequency positions of the coefficients; in the backward scan, the remaining planes of the nonzero coefficients are coded. This method, however, does not capture a long-memory Markov model, and its compression performance is comparatively low.
Summary of the invention
The object of the present invention is to overcome the deficiencies of the above existing coding techniques by providing a context-modeling-based adaptive entropy coding/decoding method for DCT coefficients in signal compression. It improves on existing DCT coefficient entropy coding/decoding techniques and provides a high-performance entropy coding module for the Chinese AVS standard, so that the overall performance of the Chinese AVS system reaches or exceeds the level of the world's most advanced video compression technology.
To achieve the above object, the present invention adopts a context-modeling-based adaptive entropy coding/decoding method for DCT coefficients in signal compression, wherein, during encoding, the following steps are carried out:
Step 2: if the count of nonzero coefficients is zero, output the EOB information (EOB is the End-of-Block marker), i.e. one (0, 0) pair; if the count is nonzero, look up the (level, run) pair at the position of the current nonzero coefficient in the scan result.
Step 3: if the first coefficient is being coded, construct the first context from the empty sequence; otherwise, construct the first context from all pairs already coded in the coefficient block; then, using the context weighting scheme, entropy code the absolute value of the current level.
Step 4: if the count of nonzero coefficients is zero, encoding ends; otherwise, encode the sign bit of the level, using as context all pairs already coded in the coefficient block and the absolute value of the level coded in step 3.
Step 6: decrement the counter of nonzero coefficients by 1 and return to step 2.
During decoding, the following steps are carried out:
Step 3: if the decoded level amplitude is zero, the EOB information, i.e. one (0, 0) pair, has been obtained; go to step 7. Otherwise, go to step 4.
Step 6: construct the first context from all pairs already decoded, and using the context weighting technique, decode the absolute value of the level of the next (level, run) pair; go to step 3.
Step 7: reconstruct the coefficient block from the decoded sequence of (level, run) pairs.
For practicality and compatibility, the technical details of the present invention have been carefully tuned toward the Chinese AVS baseline profile, without making disruptive changes to the existing Chinese AVS system.
(1) The present invention encodes the pairs one by one in the order opposite to the order in which they are generated by the scan; this reverse-scan coding order markedly improves the effectiveness of the context model.
(2) During coding, all (level, run) pairs already coded in the current coefficient block are used to construct the context (a high-order Markov model), and a novel context weighting technique is applied, which improves compression performance by combining multiple context models.
(3) The present invention adopts an innovative context quantization method that describes a long-memory Markov model with a small number of context states, thereby avoiding the harmful context-dilution problem that arises when modeling high-order Markov models.
Compared with the prior art, the present invention achieves the above objects without increasing computational complexity, is suitable for real-time use, and is compatible with the Chinese AVS reference video codec framework.
The technical scheme of the present invention is described in further detail below with reference to the drawings and embodiments.
Description of drawings
Fig. 1 is the encoding flow chart of the present invention;
Fig. 2 is the decoding flow chart of the present invention;
Fig. 3 is an example of zigzag scanning;
Fig. 4 shows the coding order of the block coefficients in the present invention.
Embodiment
Figs. 1 and 2 show embodiment 1 of the present invention. The specific encoding steps are:
Step 102: if the count of nonzero coefficients is zero, output the EOB information, i.e. one (0, 0) pair; if the count is nonzero, look up the (level, run) pair at the position of the current nonzero coefficient in the scan result.
Step 103: if the first coefficient is being coded, construct the first context from the empty sequence; otherwise, construct the first context from all pairs already coded in the coefficient block, and using the context weighting scheme, entropy code the absolute value of the current level. Constructing the first context comprises: defining two random variables, the first recording the amplitude-variation information of the levels of all pairs already coded in the current coefficient block, the second recording the position of the current level to be coded in the reverse scan order; from these two random variables, the first context is constructed by context weighting, so that a long-memory Markov model is described with few context states.
Step 104: if the count of nonzero coefficients is zero, encoding ends; otherwise, encode the sign bit of the level, using as context all pairs already coded in the coefficient block and the absolute value of the level just coded in step 103.
During decoding, carry out following steps:
Step 202: decode the level amplitude of the first (level, run) pair in backward scan order, according to the context model.
Step 203: if the decoded level amplitude is zero, the EOB information, i.e. one (0, 0) pair, has been obtained; go to step 7. Otherwise, go to step 4.
Embodiment 2 of the present invention is the implementation of the invention in the Chinese AVS standard:
Description of the key techniques of this embodiment:
1. Scanning order and coding order of the DCT coefficient block:
Scanning a DCT coefficient block arranges a two-dimensional (or multidimensional) array of DCT coefficients into a one-dimensional sequence. The guiding principle is that, after rearrangement, the probability of a coefficient being zero increases monotonically along the sequence. The classical zigzag scans of JPEG and MPEG, and the scan patterns for different block sizes and field/frame modes in the Chinese AVS standard and H.264, all follow this rule.
The one-dimensional coefficient sequence after scanning approximately satisfies the rule that the probability of zero increases along the sequence. After conversion into (level, run) pairs, the pairs are encoded in backward scan order, i.e. in order of increasing probability of the coefficient being nonzero.
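A minimal sketch of this conversion and of the reverse coding order, with the EOB pair (0, 0) terminating the stream as described above (function names are hypothetical):

```python
def to_level_run_pairs(scanned):
    # level = a nonzero coefficient; run = the number of zeros between it and
    # the previous nonzero coefficient, counted in scan order.
    pairs, run = [], 0
    for c in scanned:
        if c == 0:
            run += 1
        else:
            pairs.append((c, run))
            run = 0
    return pairs

def coding_order(pairs):
    # Pairs are coded in reverse scan order (last nonzero coefficient first),
    # followed by the end-of-block pair (0, 0).
    return list(reversed(pairs)) + [(0, 0)]
```

Note that trailing zeros after the last nonzero coefficient produce no pair at all; their presence is implied by the EOB.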
2. Binarization of the elements to be coded:
Before binary arithmetic coding is performed, the values of the elements to be coded, level and run, must first be converted into a series of binary decisions.
Level is a signed integer. The sign bit is separated first, with one bit '0'/'1' representing '+'/'−'; the remaining absolute value is a nonnegative integer, binarized by unary representation. For example, −2 is binarized as (1)001 and +1 as (0)01.
Run is an unsigned integer and is binarized directly by unary representation.
Table 1: unary binarization
Table 1 gives the unary binarization of nonnegative integers. The binarization result consists of a prefix and a suffix: the prefix is a string of zeros, and the suffix is a single terminating 1. The binarized string is called a bin string, and each 0 or 1 in it is called a bin.
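The binarization of the examples above (−2 → (1)001, +1 → (0)01) can be sketched as follows; this is an illustrative aid with hypothetical function names:

```python
def unary(n):
    # Unary binarization of a nonnegative integer: a prefix of n zeros
    # followed by a single terminating 1, as described for Table 1.
    return "0" * n + "1"

def binarize_level(level):
    # Sign bit first ('0' for +, '1' for -), then the magnitude in unary.
    return ("1" if level < 0 else "0") + unary(abs(level))

def binarize_run(run):
    # run is unsigned and is binarized directly in unary.
    return unary(run)
```

Each 0 or 1 produced here is one bin, and each bin is subsequently coded by the binary arithmetic coder under its own context.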
3. Context definition and quantization:
A nonnegative integer variable Lmax records, for each (level, run) pair to be coded, the maximum level amplitude already coded in the current coefficient block. Lmax is initialized to 0 before each coefficient block is coded. The primary context index is selected by quantizing Lmax into 5 levels, as shown in Table 2.
Lmax | 0 | 1 | 2 | [3, 4] | [5, +∞)
---|---|---|---|---|---
Primary context index | 0 | 1 | 2 | 3 | 4

Table 2: quantization of Lmax into the primary context index
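The 5-level quantization of Table 2 amounts to a small lookup, sketched here as an illustrative aid:

```python
def primary_context(lmax):
    # Quantize Lmax, the largest |level| already coded in this block,
    # into one of the 5 primary context indices of Table 2.
    if lmax <= 2:
        return lmax          # 0, 1, 2 map to themselves
    return 3 if lmax <= 4 else 4  # [3,4] -> 3, [5,+inf) -> 4
```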
Lmax records the coding history of the current coefficient block and classifies the current (level, run) pairs by their differing statistical properties. When the binarization result of the current level and run is coded, different bins further select a secondary context index according to Table 3.
Model index | Bins coded with this context model
---|---
0 | The first bin of absLevel (this bin also signals EOB)
1 | The second bin of absLevel, if it exists
2 | The third and subsequent bins of absLevel, if they exist
3 | The first bin of run, if absLevel = 1
4 | The second and subsequent bins of run, if absLevel = 1 and they exist
5 | The first bin of run, if absLevel > 1
6 | The second and subsequent bins of run, if absLevel > 1 and they exist

Table 3: secondary context mechanism
When each binary decision is coded, the corresponding context is selected from a total of 35 contexts according to the primary context index (0 to 4) and the secondary context index (0 to 6), and the probability estimate under that context drives the arithmetic coder.
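With 5 primary and 7 secondary indices there are 35 contexts in total. One natural flat indexing is sketched below; the exact storage layout is an assumption, since the text only fixes the count:

```python
def context_index(primary, secondary):
    # primary in 0..4 (from Lmax, Table 2); secondary in 0..6 (Table 3).
    assert 0 <= primary <= 4 and 0 <= secondary <= 6
    return primary * 7 + secondary  # yields 0..34, i.e. 35 distinct contexts
```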
4. Context weighting scheme:
To further improve the coding efficiency of the EOB information, ReverseP denotes the position of the level to be coded in the reverse scan order of the coefficients; ReverseP is initialized to 0 before each coefficient block is coded. For an 8×8 DCT coefficient block, ReverseP is limited to [0, 63] and is uniformly quantized into 32 levels, which serve as the accompanying context indices.
When the first bin of absLevel is coded, one context is selected according to the primary context index and the secondary context index (value 0), with probability estimate p1; another context is selected from the 32 accompanying contexts according to ReverseP, with probability estimate p2. The simple weighted average (p1 + p2)/2 of p1 and p2 drives the binary arithmetic coder.
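The weighting step can be sketched directly from this description. The floor division by 2 follows from mapping [0, 63] uniformly onto 32 levels; the function names are hypothetical:

```python
def accompanying_context(reverse_p):
    # ReverseP lies in [0, 63] for an 8x8 block and is uniformly
    # quantized into 32 accompanying context indices.
    return reverse_p // 2

def fused_probability(p1, p2):
    # Equal-weight fusion of the two context models' probability estimates;
    # the fused estimate drives the binary arithmetic coder.
    return (p1 + p2) / 2
```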
The flow of embodiment 2 is as follows; the specific encoding flow is:
Fig. 3 is an example of zigzag scanning, and Fig. 4 shows the coding order of the block coefficients in the present invention.
Step 2: judge whether the value of the variable icoef is zero. If so, assign (0, 0) to the variable (level, run); if the value of icoef is nonzero, look up the (level, run) value at position icoef in the scan result. The context variables are initialized as Lmax = 0 and ReverseP = 0. The (0, 0) pair inserted when icoef reaches 0 represents the end-of-block information EOB.
Context model for the first bin of absLevel: the primary context index is selected according to the value of Lmax (Table 2), and the secondary context index is 0; at the same time, the accompanying context index is selected according to the value of ReverseP, and the first bin is coded with the context weighting technique.
Context models for the other bins of absLevel: the primary context index is selected according to the value of Lmax, and the secondary context index is selected according to the position of the bin in the binarization of the current absLevel (see Table 3).
Step 4: if the value of the variable icoef is zero, encoding ends.
Step 5: if the value of the variable icoef is nonzero, encode the sign bit of the level according to the (level, run) value. The sign bit is coded with equal probability: if the level is negative, the binary arithmetic coder codes '1', otherwise it codes '0'.
Step 6: encode run, using as context all pairs already coded in the current coefficient block and the value of the level just coded in step 104. The binarization of run follows Table 1; the primary context index is selected according to the value of Lmax, and the secondary context index is selected according to the level amplitude of the current (level, run) pair and the position of the bin in the binarization of the current run (see Table 3).
Step 7: decrement the variable icoef by 1 and return to step 2.
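Putting steps 2 to 7 together, the symbol stream produced for one block can be sketched as below. Context selection and the arithmetic coder itself are omitted; representing EOB as the lone unary bit "1" (magnitude 0) is an assumption consistent with the (0, 0) pair and model 0 of Table 3:

```python
def block_symbol_stream(scanned):
    # scanned: the 1-D coefficient sequence after zigzag scanning.
    pairs, run = [], 0
    for c in scanned:
        if c == 0:
            run += 1
        else:
            pairs.append((c, run))
            run = 0
    bits = []
    for level, r in reversed(pairs):                # reverse scan order
        sign = "1" if level < 0 else "0"
        bits.append(sign + "0" * abs(level) + "1")  # sign + unary |level|
        bits.append("0" * r + "1")                  # unary run
    bits.append("1")  # EOB: magnitude 0 in unary (assumed representation)
    return bits
```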
Table 4: an example of context model index selection
Finally, it should be noted that the above embodiments merely illustrate, and do not limit, the technical scheme of the present invention. Although the invention has been described in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical scheme of the invention may still be modified or equivalently replaced without departing from the spirit and scope of the technical scheme, and all such modifications fall within the scope of the claims of the present invention.
Claims (5)
1. A context-based adaptive entropy coding method, wherein the following steps are carried out:
Step 1: scan the current DCT-transformed and quantized coefficient block to form a sequence of (level, run) pairs, and obtain the count of nonzero coefficients; initialize the context models of the coefficient block.
Step 2: if the count of nonzero coefficients is zero, output the EOB information; if the count is nonzero, look up the (level, run) pair at the position of the current nonzero coefficient in the scan result.
Step 3: if the first coefficient is being coded, construct the first context from the empty sequence; otherwise, construct the first context from all pairs already coded in the coefficient block; then, using the context weighting scheme, entropy code the absolute value of the current level.
Step 4: if the count of nonzero coefficients is zero, encoding ends; otherwise, encode the sign bit of the level, using as context all pairs already coded in the coefficient block and the absolute value of the level coded in step 3.
Step 5: construct the second context from all pairs already coded in the coefficient block and the absolute value and sign bit of the current level coded in steps 3 and 4, and entropy code the run.
Step 6: decrement the counter of nonzero coefficients by 1 and return to step 2.
2. The adaptive entropy coding method according to claim 1, wherein the (level, run) pairs are coded in the order opposite to the scanning order of step 1.
3. The adaptive entropy coding method according to claim 1 or 2, wherein constructing the first context in step 3 comprises: defining two random variables, the first recording the amplitude-variation information of the levels of all pairs already coded in the current coefficient block, the second recording the position of the current level to be coded in the reverse scan order; and constructing the first context from these two random variables by context weighting.
4. The adaptive entropy coding method according to claim 1 or 2, wherein constructing the second context in step 5 comprises: defining two random variables, the first recording the amplitude-variation information of the levels of all pairs already coded in the current coefficient block, the second recording the absolute value of the current level coded in step 3; the run is coded with the context constructed from these two variables.
5. A context-based adaptive entropy decoding method, wherein the following steps are carried out:
Step 1: initialize the first and second context models;
Step 2: decode, according to the first context model, the level amplitude of the first (level, run) pair in backward scan order;
Step 3: if the decoded level amplitude is zero, the EOB information has been obtained; go to step 7; otherwise, go to step 4;
Step 4: decode the sign bit of the level;
Step 5: construct the second context from all pairs already decoded and the value of the level just decoded in steps 2 and 3, and decode the run;
Step 6: construct the first context from all pairs already decoded, and using the context weighting scheme, decode the absolute value of the level of the next (level, run) pair; go to step 3;
Step 7: reconstruct the coefficient block from the decoded sequence of (level, run) pairs.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNB2005101048530A CN100403801C (en) | 2005-09-23 | 2005-09-23 | Adaptive entropy coding/decoding method based on context |
Publications (2)
Publication Number | Publication Date |
---|---|
CN1741616A CN1741616A (en) | 2006-03-01 |
CN100403801C true CN100403801C (en) | 2008-07-16 |
Family
ID=36093813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB2005101048530A Expired - Fee Related CN100403801C (en) | 2005-09-23 | 2005-09-23 | Adaptive entropy coding/decoding method based on context |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN100403801C (en) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101137047B (en) * | 2006-08-29 | 2010-09-15 | 昆山杰得微电子有限公司 | Method for analyzing and enhancing coding efficiency through effective residual error coefficient |
CN101290771B (en) * | 2007-04-20 | 2011-07-13 | 中兴通讯股份有限公司 | Bit consumption controlling method based on advanced audio decoder |
CA2871252C (en) * | 2008-07-11 | 2015-11-03 | Nikolaus Rettelbach | Audio encoder, audio decoder, methods for encoding and decoding an audio signal, audio stream and computer program |
RU2487473C2 (en) * | 2008-12-03 | 2013-07-10 | Нокиа Корпорейшн | Switching between discrete cosine transform coefficient coding modes |
CN101646086B (en) * | 2009-08-21 | 2011-07-20 | 香港应用科技研究院有限公司 | Method and device of preamble reference modelling of 4*4 block conversion coefficient indication |
CN102845065A (en) * | 2010-04-19 | 2012-12-26 | 捷讯研究有限公司 | Methods and devices for reordered parallel entropy coding and decoding |
CN104811706B (en) | 2011-01-06 | 2017-10-27 | 三星电子株式会社 | The coding method of video and the coding/decoding method and device of device and video |
CN104093020B (en) * | 2011-03-10 | 2017-11-17 | 华为技术有限公司 | The coding method of conversion coefficient, the coding/decoding method of conversion coefficient, and device |
CN104093018B (en) * | 2011-03-10 | 2017-08-04 | 华为技术有限公司 | The coding method of conversion coefficient, the coding/decoding method of conversion coefficient, and device |
CN102685503B (en) * | 2011-03-10 | 2014-06-25 | 华为技术有限公司 | Encoding method of conversion coefficients, decoding method of conversion coefficients and device |
CN104094607B (en) * | 2011-05-04 | 2017-04-26 | 宁波观原网络科技有限公司 | Modeling method and system based on context in transform domain of image/video |
EP2721820A1 (en) * | 2011-06-16 | 2014-04-23 | Fraunhofer Gesellschaft zur Förderung der angewandten Forschung E.V. | Context initialization in entropy coding |
KR101464978B1 (en) | 2011-07-01 | 2014-11-26 | 삼성전자주식회사 | Method and apparatus for entropy encoding and decoding using hierarchical-structured data unit |
CN102256125B (en) * | 2011-07-14 | 2013-06-05 | 北京工业大学 | Context adaptive arithmetic coding method for HEVC (High Efficiency Video Coding) |
CN103858433B (en) * | 2011-08-25 | 2017-08-15 | 汤姆逊许可公司 | Layered entropy encoding and decoding |
TR201811137T4 (en) * | 2011-10-31 | 2018-08-27 | Samsung Electronics Co Ltd | Method for determining a context model for entropy decoding for the conversion coefficient level. |
US9247257B1 (en) * | 2011-11-30 | 2016-01-26 | Google Inc. | Segmentation based entropy encoding and decoding |
AU2013211004B2 (en) | 2012-01-20 | 2016-03-17 | Ge Video Compression, Llc | Transform coefficient coding |
US9774856B1 (en) | 2012-07-02 | 2017-09-26 | Google Inc. | Adaptive stochastic entropy coding |
US9538175B2 (en) * | 2012-09-26 | 2017-01-03 | Qualcomm Incorporated | Context derivation for context-adaptive, multi-level significance coding |
US9509998B1 (en) | 2013-04-04 | 2016-11-29 | Google Inc. | Conditional predictive multi-symbol run-length coding |
US9392288B2 (en) | 2013-10-17 | 2016-07-12 | Google Inc. | Video coding using scatter-based scan tables |
CN105141966B (en) * | 2015-08-31 | 2018-04-24 | 哈尔滨工业大学 | The context modeling method of conversion coefficient in video compress |
EP4307683A1 (en) * | 2021-03-17 | 2024-01-17 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Coefficient coding/decoding method, encoder, decoder, and computer storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1224978A (en) * | 1998-01-26 | 1999-08-04 | 大宇电子株式会社 | Context-based arithmetic encoding/decoding method and apparatus |
WO2003026307A2 (en) * | 2001-09-14 | 2003-03-27 | Siemens Aktiengesellschaft | Method for producing video coding and programme-product |
WO2003027940A1 (en) * | 2001-09-14 | 2003-04-03 | Nokia Corporation | Method and system for context-based adaptive binary arithmetic coding |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20080716 Termination date: 20210923 |