CN111787325A - Entropy encoder and encoding method thereof - Google Patents

Publication number
CN111787325A
Authority
CN
China
Prior art keywords
module
context model
encoder
interval
updating module
Prior art date
Legal status
Granted
Application number
CN202010632430.0A
Other languages
Chinese (zh)
Other versions
CN111787325B (en)
Inventor
王世超
严韫瑶
文湘鄂
宋磊
Current Assignee
Beijing Boya Huishi Intelligent Technology Research Institute Co ltd
Original Assignee
Beijing Boya Huishi Intelligent Technology Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Boya Huishi Intelligent Technology Research Institute Co ltd filed Critical Beijing Boya Huishi Intelligent Technology Research Institute Co ltd
Priority to CN202010632430.0A priority Critical patent/CN111787325B/en
Publication of CN111787325A publication Critical patent/CN111787325A/en
Application granted granted Critical
Publication of CN111787325B publication Critical patent/CN111787325B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/13 Adaptive entropy coding, e.g. adaptive variable length coding [AVLC] or context adaptive binary arithmetic coding [CABAC]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/184 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being bits, e.g. of the compressed video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention discloses an entropy encoder comprising a first, a second and a third partial circuit connected in sequence, wherein the first partial circuit is a ram memory; the second partial circuit comprises a binarization module and a context model selection module; and the third partial circuit comprises a context model updating module, an encoder probability interval and interval lower limit updating module, an encoding counter state updating module and a bit generating module. The binarization module and the context model selection module perform parallel processing, as do the context model updating module and the encoder probability interval and interval lower limit updating module. The encoder probability interval and interval lower limit updating module, the encoding counter state updating module and the bit generating module operate as a three-stage pipeline. By dividing the binary arithmetic coding module into three functionally independent sub-modules, the clock frequency of the whole binary arithmetic coding module is increased, and the output code rate of the entropy encoder is improved accordingly.

Description

Entropy encoder and encoding method thereof
Technical Field
The invention relates to the technical field of video coding, and in particular to an entropy encoder and an encoding method thereof.
Background
AVS2 is the new-generation Chinese national standard succeeding AVS1 and comprises three parts: video, audio and system. AVS2 video coding targets ultra-high-definition video and supports efficient compression of ultra-high-resolution (4K and above) and high-dynamic-range video. The AVS2 entropy encoder is a core coding module of the AVS2 video coding system; its task is to compress the video sequence without losing any information.
In the prior art, the AVS2 entropy encoder mainly includes a binarization module, a context model module and a binary arithmetic coding module. The encoder involves very tight coding dependencies and complex computation, which makes its hardware implementation challenging. If the entropy encoder only needs to output a low bit rate, its hardware design is not complicated; if a high output bit rate is required, however, the encoder must be optimized to reduce the time spent encoding each single bit.
Disclosure of Invention
In view of the above deficiencies of the prior art, the present invention provides an entropy encoder and an encoding method thereof. The object is achieved by the following technical solutions.
A first aspect of the present invention proposes an entropy encoder comprising: a first, a second and a third partial circuit connected in sequence, the first partial circuit being a ram memory storing syntax elements; the second partial circuit comprises a binarization module and a context model selection module; the third partial circuit comprises a context model updating module, an encoder probability interval and interval lower limit updating module, an encoding counter state updating module and a bit generating module;
the binarization module and the context model selection module perform parallel processing; the context model updating module and the encoder probability interval and interval lower limit updating module perform parallel processing; and the encoder probability interval and interval lower limit updating module, the encoding counter state updating module and the bit generating module operate as a three-stage pipeline.
A second aspect of the present invention proposes an encoding method applying the entropy encoder as described in the first aspect above, the method comprising:
reading a syntax element in a ram memory through a binarization module and converting the syntax element into a binary string, transmitting the binary string to an encoder probability interval and interval lower limit updating module, simultaneously reading the syntax element in the ram memory through a context model selecting module and selecting a corresponding context model, and transmitting an index of the context model to the encoder probability interval and interval lower limit updating module and the context model updating module;
updating the coding interval and the interval lower limit by the encoder probability interval and interval lower limit updating module by using the binary string and the context model corresponding to the index, and transmitting an intermediate variable generated in the updating process to the coding counter state updating module; updating the context model corresponding to the index through the context model updating module;
generating coded data according to the intermediate variable and updating the count value of the coding counter through the coding counter state updating module, and transmitting the coded data and the count value to the bit generating module;
and generating a bit code stream by the bit generation module according to the counting value and the coded data.
Based on the entropy encoder of the first aspect, the critical path of the large, computationally complex binary arithmetic coding module is split into three functionally independent sub-modules, namely the encoder probability interval and interval lower limit updating module, the encoding counter state updating module and the bit generating module, which operate as a three-stage pipeline. This reduces the amount of computation each module performs per unit time, increases the clock frequency of the whole binary arithmetic coding module, shortens the time needed to encode the syntax elements of a largest coding unit (LCU), and thereby improves the output code rate of the entropy encoder.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of an entropy encoder in accordance with an exemplary embodiment of the present invention;
fig. 2 is a flowchart illustrating an embodiment of an encoding method of an entropy encoder according to an exemplary embodiment of the present invention.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present invention. The word "if" as used herein may, depending on the context, be interpreted as "when", "upon" or "in response to a determination".
At present, in order to increase the output code rate of an entropy encoder, entropy encoding is generally performed in parallel by multiple entropy encoding modules, which increases hardware cost and complicates the control flow.
To solve this technical problem, the invention provides an entropy encoder that raises the overall operating frequency of the encoder while keeping a single entropy encoding path, and thereby improves the output code rate of the encoder.
The entropy encoder involves very tight coding dependencies and complex computation. The number of clock cycles the binary arithmetic coding module needs to finish encoding the syntax elements of a largest coding unit (LCU) is fixed, and because the binary arithmetic coding module is a large module with complex operations, its operating clock frequency is relatively low, encoding is time-consuming, and the output code rate is low.
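The interval-narrowing principle that creates these tight coding dependencies can be sketched as follows. This is a deliberately idealized model using exact arithmetic: the hardware described in this patent instead keeps fixed-precision probability interval and interval lower limit registers with renormalization and a coding counter, and takes its probabilities from the adaptive context model; none of those details appear in this sketch.

```python
from fractions import Fraction

def encode_bins(bins, p0):
    """Narrow [low, low + width) once per bin; any value inside the final
    interval identifies the whole bin string (exact arithmetic, no
    renormalisation: an idealisation of the hardware registers)."""
    low, width = Fraction(0), Fraction(1)
    for b in bins:
        split = width * p0          # sub-interval assigned to bin value 0
        if b == 0:
            width = split
        else:
            low += split
            width -= split
    return low, width

def decode_bins(value, n, p0):
    """Recover n bins by locating `value` inside the nested intervals."""
    low, width = Fraction(0), Fraction(1)
    out = []
    for _ in range(n):
        split = width * p0
        if value < low + split:
            out.append(0)
            width = split
        else:
            out.append(1)
            low += split
            width -= split
    return out
```

Because every bin update reads the interval produced by the previous one, the chain of updates forms the long critical path that the invention splits into pipeline stages.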
By analysing the coding principle of the binary arithmetic coding module, the inventors split its computationally complex critical path into three functionally independent sub-modules, namely the encoder probability interval and interval lower limit updating module, the encoding counter state updating module and the bit generating module, which operate as a three-stage pipeline. This reduces the amount of computation each module performs per unit time, increases the clock frequency of the whole binary arithmetic coding module, shortens the time needed to encode the syntax elements of a largest coding unit (LCU), and improves the output code rate of the entropy encoder.
Meanwhile, the inventors also split the context model module into a context model selection module and a context model updating module with independent functions.
As shown in the structural diagram of fig. 1, the entropy encoder includes a first partial circuit 10, a second partial circuit 20 and a third partial circuit 30 connected in sequence, where the first partial circuit 10 is a ram memory for storing syntax elements; the second partial circuit 20 comprises a binarization module and a context model selection module; and the third partial circuit 30 includes a context model updating module, an encoder probability interval and interval lower limit updating module, an encoding counter state updating module and a bit generating module.
The encoder probability interval and interval lower limit updating module, the encoding counter state updating module and the bit generating module operate as a three-stage pipeline.
Further, to prevent the data of a selected context model from being modified by the context model updating module before it has been used for encoding, the context model updating module is placed in the third partial circuit 30 and processes in parallel with the encoder probability interval and interval lower limit updating module. In addition, the binarization module in the second partial circuit 20 processes in parallel with the context model selection module, which further increases the amount of computation the whole encoder performs per unit time.
It should be noted that a FIFO has limited hardware resources and provides no addressing: if the syntax elements of several largest coding units were buffered in it, the second partial circuit could not tell one maximum coding unit from the next when reading, because there is no distinguishing mark between them. A FIFO could therefore hold the syntax elements of only one maximum coding unit at a time, and the next unit could be stored only after the current one had been completely encoded, resulting in low coding efficiency. Since a ram memory supports address-based access, the invention instead uses a ram memory in the first partial circuit to store the syntax elements of multiple maximum coding units, allocating each maximum coding unit a corresponding address; the second partial circuit reads one maximum coding unit by its address, which further improves the coding efficiency.
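The difference the ram memory makes can be sketched as follows: each largest coding unit gets its own address, so several LCUs can be buffered at once and read out individually. The class and method names and the flat list of syntax elements are illustrative assumptions, not the patent's actual design.

```python
class LcuRam:
    """Per-address LCU storage, as described for the first partial circuit.
    One address (bank) is allocated per largest coding unit."""

    def __init__(self, num_lcus):
        self.banks = [None] * num_lcus   # one address per largest coding unit

    def write(self, lcu_addr, syntax_elements):
        self.banks[lcu_addr] = list(syntax_elements)

    def read(self, lcu_addr):
        # Unlike a FIFO, several LCUs can be buffered at once because the
        # second partial circuit selects exactly one LCU by its address.
        return self.banks[lcu_addr]
```

A FIFO in the same place would force the reader to consume elements strictly in arrival order, with no way to tell where one LCU ends and the next begins.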
In an embodiment, as shown in fig. 1, the entropy encoder further includes a fourth circuit 40 and a fifth circuit 50, where the fourth circuit 40 is a code stream serializing module, and the fifth circuit 50 is a pseudo start code detecting module.
In an embodiment, as shown in fig. 1, the entropy encoder further includes a first FIFO (First In First Out) memory located between the first partial circuit 10 and the second partial circuit 20, a second FIFO memory located between the encoder probability interval and interval lower limit updating module and the encoding counter state updating module, a third FIFO memory located between the encoding counter state updating module and the bit generating module, and a fourth FIFO memory located after the fifth partial circuit 50.
The FIFO memories make the preceding and following modules independent of one another: data loss caused by overflow or underflow is avoided, no clock cycles are wasted on handshake signals between adjacent modules, and the processing performance of the entropy encoder is improved.
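The decoupling role of the FIFO memories can be modelled minimally as below: full/empty flags let the producer and consumer stages run independently without per-word handshakes. The depth and the method names are illustrative assumptions.

```python
from collections import deque

class SyncFifo:
    """Minimal model of a synchronous FIFO decoupling two pipeline stages."""

    def __init__(self, depth):
        self.depth = depth
        self.q = deque()

    def full(self):
        return len(self.q) == self.depth

    def empty(self):
        return len(self.q) == 0

    def push(self, word):
        # The producer only consults full(); no handshake with the consumer.
        if self.full():
            raise OverflowError("write would overflow; producer checks full()")
        self.q.append(word)

    def pop(self):
        # The consumer only consults empty(); no handshake with the producer.
        if self.empty():
            raise IndexError("read would underflow; consumer checks empty()")
        return self.q.popleft()
```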
It should be noted that, because the invention splits the critical path of the binary arithmetic coding module and thereby greatly increases its operating clock frequency, synchronous FIFOs are sufficient to meet the coding requirement of the single-path binary arithmetic coding module. Based on this, the first, second, third and fourth FIFO memories are all synchronous FIFOs.
The following compares the design before and after optimization with a specific example:
Before optimization, the critical path of the binary arithmetic coding module is not split, so the module's maximum operating clock frequency reaches only 200MHz, while the preceding and following modules can run at 400MHz. In this case two binary arithmetic coding paths have to be used, each with one asynchronous FIFO;
After optimization, the critical path of the binary arithmetic coding module is split and its operating clock frequency reaches 400MHz, so a single binary arithmetic coding path with only one synchronous FIFO suffices.
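The before/after comparison amounts to simple throughput arithmetic, sketched below under the assumption that every stage handles one bin per clock cycle (the patent does not state per-cycle throughput, so this is only a back-of-envelope model):

```python
import math

def bac_paths_needed(stage_mhz, bac_mhz):
    # Number of parallel binary-arithmetic-coding paths required so the
    # coder keeps up with the surrounding stages, assuming one bin per
    # clock per path (an assumption of this model, not a patent figure).
    return math.ceil(stage_mhz / bac_mhz)
```

With the figures from the text, `bac_paths_needed(400, 200)` gives 2 paths before optimization, and `bac_paths_needed(400, 400)` gives 1 afterwards.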
Fig. 2 is a flowchart of an embodiment of the encoding method of the entropy encoder according to an exemplary embodiment of the present invention; the method is applied to the entropy encoder described in fig. 1. As shown in fig. 2, the encoding method includes the following steps:
step 201: reading the syntax elements in the ram memory by a binarization module and converting the syntax elements into binary strings, transmitting the binary strings to an encoder probability interval and interval lower limit updating module, simultaneously reading the syntax elements in the ram memory by a context model selecting module and selecting a corresponding context model, and transmitting the indexes of the context model to the encoder probability interval and interval lower limit updating module and the context model updating module.
The binarization module stores the converted binary string into a first FIFO memory, so that the first FIFO memory transmits the binary string to the encoder probability interval and interval lower limit updating module. The context model selecting module stores the index of the selected context model into the first FIFO memory, so that the first FIFO memory transmits the index to the encoder probability interval and interval lower limit updating module and the context model updating module.
It should be noted that a maximum coding unit is usually 64x64. If the syntax elements of a 64x64 unit were binarized directly by the binarization module, the processing throughput would be very low, so the binarization module splits the 64x64 maximum coding unit into four independent 32x32 paths and converts them in parallel, improving the binarization throughput per unit time.
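The 64x64-to-four-32x32 split can be sketched as a plain quadrant split. The quadrant ordering and the list-of-lists representation are assumptions; the text only states that four independent 32x32 paths are converted in parallel.

```python
def split_lcu(lcu):
    """Split one 64x64 largest coding unit into four 32x32 quadrants that
    the binarization module could convert in parallel."""
    half = len(lcu) // 2
    top, bottom = lcu[:half], lcu[half:]
    return [
        [row[:half] for row in top],     # top-left
        [row[half:] for row in top],     # top-right
        [row[:half] for row in bottom],  # bottom-left
        [row[half:] for row in bottom],  # bottom-right
    ]
```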
Step 202: and updating the coding interval and the interval lower limit by using a context model corresponding to the binary string and the index through an encoder probability interval and interval lower limit updating module, transmitting an intermediate variable generated in the updating process to an encoding counter state updating module, and updating the context model corresponding to the index through a context model updating module.
The encoder probability interval and interval lower limit updating module stores the generated intermediate variable into the second FIFO memory, so that the second FIFO memory transmits the intermediate variable to the encoding counter state updating module.
Step 203: and generating coded data according to the intermediate variable and updating the count value of the coding counter by the coding counter state updating module, and transmitting the coded data and the count value to the bit generating module.
The coding counter state updating module stores the coded data and the count value into a third FIFO memory, and the third FIFO memory transmits the coded data and the count value to the bit generating module.
Step 204: and generating a bit code stream by a bit generation module according to the counting value and the coded data.
In an embodiment, after step 204, the bit generation module may further transmit the generated bit code stream to the code stream serialization module; the code stream serialization module serializes and converts the bit code stream and transmits the converted stream to the pseudo start code detection module, which performs pseudo start code detection on the converted bit code stream and processes each detected pseudo start code by an insertion into the bit code stream.
Specifically, the code stream serialization module serializes and converts the bit code stream according to a preset code stream bit width, and every time the pseudo start code detection module detects a pseudo start code, it performs the insertion at the detected position, until detection is finished.
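The patent does not spell out what is inserted when a pseudo start code is found, so the sketch below uses H.264-style byte stuffing as an illustrative stand-in: whenever two zero bytes are followed by a byte small enough to complete a start-code pattern, an assumed 0x02 escape byte is inserted first. AVS2's actual pseudo-start-code scheme differs in detail; the trigger condition and escape value here are assumptions.

```python
def escape_pseudo_start_codes(payload):
    """Scan the serialized code stream and insert an escape byte wherever a
    pseudo start code would otherwise appear (illustrative stand-in only)."""
    out = bytearray()
    zeros = 0
    for b in payload:
        if zeros == 2 and b <= 0x03:   # 00 00 0x would emulate a start code
            out.append(0x02)           # assumed escape byte breaks the pattern
            zeros = 0
        out.append(b)
        zeros = zeros + 1 if b == 0 else 0
    return bytes(out)
```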
Further, after performing the insertion into the bit code stream, the pseudo start code detection module may store the resulting bit code stream in the fourth FIFO memory for transmission to the subsequent stage.
This completes the flow shown in fig. 2, through which the encoding process of the entropy encoder is carried out. Because the binary arithmetic coding module is split into the encoder probability interval and interval lower limit updating module, the encoding counter state updating module and the bit generating module, which operate as a three-stage pipeline, each sub-module can start processing the next maximum coding unit as soon as it finishes its part of the current one, without waiting for the current unit to be completely encoded, so the coding efficiency of the whole binary arithmetic coding module is improved. In addition, during encoding the binarization module and the context model selection module process in parallel, as do the context model updating module and the encoder probability interval and interval lower limit updating module, which further increases the output code rate of the encoder.
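The benefit of the three-stage pipeline over a purely sequential module can be quantified with the classic fill-then-stream cycle count, assuming (for illustration only) that each stage spends one clock per work item:

```python
def pipelined_cycles(items, stages):
    # Fill the pipeline in (stages - 1) cycles, then complete one item
    # per cycle. Unit stage latencies are an illustrative assumption.
    return stages - 1 + items

def sequential_cycles(items, stages):
    # Without pipelining, every item occupies all stages back to back.
    return items * stages
```

For example, 8 work items through 3 stages take 10 cycles pipelined versus 24 sequentially, which is the kind of gain the split of the binary arithmetic coding module is aiming at.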
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. An entropy encoder, characterized by comprising: a first, a second and a third partial circuit connected in sequence, the first partial circuit being a ram memory storing syntax elements; the second partial circuit comprises a binarization module and a context model selection module; the third partial circuit comprises a context model updating module, an encoder probability interval and interval lower limit updating module, an encoding counter state updating module and a bit generating module;
the binarization module and the context model selection module perform parallel processing; the context model updating module and the encoder probability interval and interval lower limit updating module perform parallel processing; and the encoder probability interval and interval lower limit updating module, the encoding counter state updating module and the bit generating module operate as a three-stage pipeline.
2. An entropy encoder as claimed in claim 1, further comprising: a fourth partial circuit connected with the third partial circuit, a fifth partial circuit connected with the fourth partial circuit;
the fourth part of circuits are code stream serialization modules, and the fifth part of circuits are pseudo start code detection modules.
3. An entropy encoder as claimed in claim 2, further comprising: a first FIFO memory for connecting the first and second circuit portions;
the second FIFO memory is used for connecting the encoder probability interval and interval lower limit updating module and the encoding counter state updating module;
a third FIFO memory for connecting the code counter state updating module and the bit generating module;
and the fourth FIFO memory is connected with the output end of the fifth partial circuit.
4. An entropy encoder as claimed in claim 3, wherein the first, second, third and fourth FIFO memories are synchronous FIFOs.
5. An encoding method applying an entropy encoder as claimed in any one of claims 1-4, characterized in that the method comprises:
reading a syntax element in a ram memory through a binarization module and converting the syntax element into a binary string, transmitting the binary string to an encoder probability interval and interval lower limit updating module, simultaneously reading the syntax element in the ram memory through a context model selecting module and selecting a corresponding context model, and transmitting an index of the context model to the encoder probability interval and interval lower limit updating module and the context model updating module;
updating the coding interval and the interval lower limit by the encoder probability interval and interval lower limit updating module by using the binary string and the context model corresponding to the index, and transmitting an intermediate variable generated in the updating process to the coding counter state updating module; updating the context model corresponding to the index through the context model updating module;
generating coded data according to the intermediate variable and updating the count value of the coding counter through the coding counter state updating module, and transmitting the coded data and the count value to the bit generating module;
and generating a bit code stream by the bit generation module according to the counting value and the coded data.
6. The method of claim 5, further comprising:
transmitting the bit code stream to a code stream serialization module through the bit generation module;
serializing and converting the bit code stream through the code stream serializing module, and transmitting the converted bit code stream to a pseudo initial code detection module;
and carrying out pseudo initial code detection on the converted bit code stream through the pseudo initial code detection module, and inserting the detected pseudo initial code into the bit code stream.
7. The method of claim 5,
the binarization module stores the binary string into a first FIFO memory, so that the first FIFO memory transmits the binary string to an encoder probability interval and interval lower limit updating module, the context model selection module stores the index of the context model into the first FIFO memory, and the first FIFO memory transmits the index to the encoder probability interval and interval lower limit updating module and the context model updating module.
8. The method of claim 5, wherein the encoder probability interval and interval floor update module stores the intermediate variable in a second FIFO memory, such that the second FIFO memory transmits the intermediate variable to the encoder counter state update module.
9. The method of claim 5, wherein the encoder counter status update module stores the encoded data and the count value in a third FIFO memory, such that the third FIFO memory transmits the encoded data and the count value to the bit generation module.
10. The method of claim 6, further comprising:
and after inserting the detected pseudo start code into the bit code stream, the pseudo start code detection module stores the bit code stream inserted with the pseudo start code into a fourth FIFO memory.
CN202010632430.0A 2020-07-03 2020-07-03 Entropy encoder and encoding method thereof Active CN111787325B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010632430.0A CN111787325B (en) 2020-07-03 2020-07-03 Entropy encoder and encoding method thereof

Publications (2)

Publication Number Publication Date
CN111787325A true CN111787325A (en) 2020-10-16
CN111787325B CN111787325B (en) 2022-03-08

Family

ID=72758521




Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050204058A1 (en) * 1997-10-14 2005-09-15 Philbrick Clive M. Method and apparatus for data re-assembly with a high performance network interface
US20080137974A1 (en) * 2006-12-07 2008-06-12 Sunplus Technology Co., Ltd. JBIG coding and decoding system
US20100007534A1 (en) * 2008-07-14 2010-01-14 Girardeau Jr James Ward Entropy decoder with pipelined processing and methods for use therewith
CN101848311A (en) * 2010-02-21 2010-09-29 哈尔滨工业大学 JPEG2000 EBCOT encoder based on Avalon bus
CN102176750A (en) * 2011-03-10 2011-09-07 西安电子科技大学 High-performance adaptive binary arithmetic encoder
US20110248871A1 (en) * 2010-04-09 2011-10-13 Korea Electronics Technology Institute Decoding device for context-based adaptive binary arithmetic coding (cabac) technique
CN102724491A (en) * 2012-06-15 2012-10-10 北京博雅华录视听技术研究院有限公司 Statistical multiplexing method based on parallel coding
CN105025296A (en) * 2014-04-30 2015-11-04 北京大学 Advance arithmetic coder and realization method thereof
CN105791828A (en) * 2015-12-31 2016-07-20 杭州士兰微电子股份有限公司 Binary arithmetic encoder and encoding method thereof
CN106921859A (en) * 2017-05-05 2017-07-04 郑州云海信息技术有限公司 A kind of CABAC entropy coding methods and device based on FPGA
CN109922341A (en) * 2017-12-13 2019-06-21 博雅视云(北京)科技有限公司 The advanced entropy coder implementation method of AVS2 and device
US20190200043A1 (en) * 2017-12-21 2019-06-27 Qualcomm Incorporated Probability initialization and signaling for adaptive arithmetic coding in video coding


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112911314A (en) * 2021-01-14 2021-06-04 北京博雅慧视智能技术研究院有限公司 Coding method of entropy coder and entropy coder
CN112911314B (en) * 2021-01-14 2023-08-18 北京博雅慧视智能技术研究院有限公司 Coding method of entropy coder and entropy coder
WO2023045204A1 (en) * 2021-09-22 2023-03-30 苏州浪潮智能科技有限公司 Method and system for generating finite state entropy coding table, medium, and device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant