US20110102212A1 - Entropy decoding device - Google Patents

Entropy decoding device

Info

Publication number
US20110102212A1
Authority
US
United States
Prior art keywords
module
node
entropy decoding
decoding device
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/632,917
Other versions
US7928868B1 (en)
Inventor
Sheng-Che Huang
Yi-Shin LI
Hsieh-Fu Tsai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hon Hai Precision Industry Co Ltd
Original Assignee
Hon Hai Precision Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hon Hai Precision Industry Co Ltd filed Critical Hon Hai Precision Industry Co Ltd
Assigned to HON HAI PRECISION INDUSTRY CO., LTD. reassignment HON HAI PRECISION INDUSTRY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HUANG, SHENG-CHE, LI, YI-SHIN, TSAI, HSIEH-FU
Application granted granted Critical
Publication of US7928868B1 publication Critical patent/US7928868B1/en
Publication of US20110102212A1 publication Critical patent/US20110102212A1/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
    • H03M7/30Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/40Conversion to or from variable length codes, e.g. Shannon-Fano code, Huffman code, Morse code
    • H03M7/4006Conversion to or from arithmetic code


Abstract

An entropy decoding device offers all nodes on a decoding tree and a most probable symbol for each node, and predicts presumptive information of a next node. The entropy decoding device decodes an encoded bit stream and outputs a decoded content that includes real information of the next node. The entropy decoding device further generates a flush instruction to the table look-up module when the prediction misses, and updates the most probable symbol.

Description

    BACKGROUND
  • 1. Technical Field
  • Embodiments of the present disclosure relate to decoding devices, and especially to an entropy decoding device.
  • 2. Description of Related Art
  • Context-based adaptive binary arithmetic coding (CABAC) is a tool selected by H.264/AVC to compress video data streams. An entropy decoding process of CABAC can be seen as a binary tree traversal. However, throughput of the entropy decoding process is limited due to complex arithmetic calculations and the dependence between binary nodes.
  • The entropy decoding process involves the most demanding arithmetic calculations in the decoder. Therefore, the performance and efficiency of H.264/AVC decoding will improve if the speed of the entropy decoding process increases.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of one embodiment of an application environment of an entropy decoding device as disclosed;
  • FIG. 2 is a flowchart of one embodiment of an entropy decoding process as disclosed;
  • FIG. 3 is one exemplary embodiment of a traversing of a decoding tree as disclosed; and
  • FIG. 4 is one exemplary embodiment of a prediction of a next node as disclosed.
  • DETAILED DESCRIPTION
  • Referring to FIG. 1, a block diagram of one embodiment of an application environment of an entropy decoding device 100 is disclosed. In one embodiment, the entropy decoding device 100 may be a set-top box with an entropy decoding function.
  • In one embodiment, the entropy decoding device 100 decodes context-based adaptive binary arithmetic coding (CABAC) bit streams, according to the H.264/AVC standard. The H.264/AVC standard defines decoding tables and the decoding process in detail. Each decoding table represents a decoding tree, and each decoding tree comprises a plurality of branch nodes and leaf nodes. The decoding tree is structured as a binary tree. In other embodiments, the entropy decoding device 100 may decode video encoded by other standards, such as context-based adaptive variable-length coding (CAVLC).
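The binary decoding tree described above can be sketched as a small data structure. This is a hedged illustration only: the node numbering, payload values, and field names below are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    # Illustrative node of a binary decoding tree: a branch node has two
    # children, while a leaf node carries a decoded symbol.
    index: int
    left: Optional["Node"] = None    # followed when the decoded bin is 0
    right: Optional["Node"] = None   # followed when the decoded bin is 1
    symbol: Optional[int] = None     # payload, meaningful only for leaves

    def is_leaf(self) -> bool:
        return self.left is None and self.right is None

# A tiny example tree: root 0 branches to leaf 1 and branch node 2,
# which in turn branches to leaves 3 and 4.
tree = Node(0,
            left=Node(1, symbol=10),
            right=Node(2,
                       left=Node(3, symbol=30),
                       right=Node(4, symbol=40)))
print(tree.is_leaf(), tree.right.left.is_leaf())  # False True
```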
  • The entropy decoding device 100 comprises a fetching section 10, an executing section 20, and an output section 30. Furthermore, the entropy decoding device 100 comprises a processor 40 to execute at least one computerized instruction for the modules of the sections 10, 20 and 30.
  • The fetching section 10 comprises a storage module 11, a table look-up module 12, and a state buffer 13.
  • The storage module 11 is operable to store an encoded bit stream of video data.
  • The table look-up module 12 is operable to select a context probability model for the encoded bit stream, to offer dependence between nodes on a decoding tree.
  • In one embodiment, the table look-up module 12 comprises a context table 120. In one embodiment, the context table 120 is operable to store a plurality of context probability models.
  • Each context probability model comprises a most probable symbol (MPS), a least probable symbol (LPS), and a state corresponding to each of the nodes on the decoding tree. In one embodiment, the MPS value follows the context probability model of the H.264/AVC standard, to indicate whether the next node is more probably a left child or a right child. The MPS value is equal to 0 or 1. MPS=0 means the next node is probably a left child, and MPS=1 means the next node is probably a right child. In one embodiment, different nodes have their own MPS values.
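A minimal sketch of such a context table, assuming a plain dict keyed by node index; the entries, the 0..63 state range noted from H.264/AVC, and all names are illustrative rather than the patent's own.

```python
from dataclasses import dataclass

@dataclass
class ContextModel:
    # Illustrative per-node context entry: the MPS steers the prediction;
    # the state indexes a probability estimate (0..63 in H.264/AVC).
    mps: int    # 0 -> next node probably a left child, 1 -> probably right
    state: int  # probability state index

# Hypothetical context table for two branch nodes of a decoding tree.
context_table = {0: ContextModel(mps=0, state=12),
                 2: ContextModel(mps=1, state=5)}

def probable_child(node_index: int) -> str:
    # Map the node's MPS value to a predicted branch direction.
    return "left" if context_table[node_index].mps == 0 else "right"

print(probable_child(0), probable_child(2))  # left right
```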
  • The state buffer 13 is operable to read a current node and the MPS from the table look-up module 12. In one embodiment, the state buffer 13 reads and discards the current node and the MPS in a pipeline, which improves the flexibility of reading and discarding nodes and the speed of the entropy decoding device 100.
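The read-and-discard behavior of the state buffer might be modeled, very loosely, as a FIFO of (node, MPS) pairs; the class and method names here are invented for illustration and are not from the patent.

```python
from collections import deque

class StateBuffer:
    # Loose stand-in for the patent's state buffer: it queues (node, MPS)
    # pairs read from the look-up module, and can discard the head entry
    # or flush everything after a misprediction.
    def __init__(self):
        self._entries = deque()

    def read(self, node, mps):
        self._entries.append((node, mps))

    def current(self):
        return self._entries[0]

    def discard_current(self):
        return self._entries.popleft()

    def flush(self):
        self._entries.clear()

buf = StateBuffer()
buf.read("node0", 0)
buf.read("node2", 1)
buf.discard_current()
print(buf.current())  # ('node2', 1)
```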
  • The executing section 20 comprises an arithmetic decoder module 21, a parsing module 22, a prediction module 23, an update module 24, and a timing module 25. In one embodiment, the timing module 25 is operable to offer clock cycles.
  • The arithmetic decoder module 21 is connected to the state buffer 13 and the storage module 11. The arithmetic decoder module 21 is operable to decode the encoded bit stream according to the current node, and output a decoded content. In one embodiment, the decoded content comprises real information of a next node.
  • The parsing module 22 is connected to the arithmetic decoder module 21, to determine whether the current node in the arithmetic decoder module 21 is a leaf node or a branch node. In one embodiment, the parsing module 22 is further operable to output the decoded content to the output section 30 if the current node is the leaf node, and output the decoded content to the update module 24 if the current node is the branch node.
  • The prediction module 23 is operable to predict presumptive information of the next node. In one embodiment, the prediction module 23 predicts whether the next node is a left child node or a right child node, according to the current node and the MPS of the current node from the state buffer 13.
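A hedged sketch of this prediction step: given the current branch node and its MPS, pick the presumptive next node. The dict-based node layout is hypothetical.

```python
def predict_next(current: dict, mps: int) -> dict:
    # Presumptive next node: MPS=0 points at the left child and MPS=1 at
    # the right child, following the MPS interpretation described above.
    return current["left"] if mps == 0 else current["right"]

# Hypothetical branch node 0 with children 1 and 2.
node0 = {"index": 0,
         "left": {"index": 1},
         "right": {"index": 2}}

print(predict_next(node0, 0)["index"], predict_next(node0, 1)["index"])  # 1 2
```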
  • In one embodiment, the arithmetic decoder module 21 and the prediction module 23 operate synchronously.
  • The update module 24 is connected to the parsing module 22 and the prediction module 23, for determining if the prediction hits or misses. The update module 24 is further operable to generate a flush instruction to the table look-up module 12 if the prediction misses. In one embodiment, it should be understood that a prediction hit means the presumptive information of the next node conforms to the real information of the next node. A prediction miss means the presumptive information of the next node does not conform to the real information of the next node.
  • In one embodiment, the arithmetic decoder module 21 is further operable to decode the next node in the state buffer 13, if the prediction hits.
  • In one embodiment, the table look-up module 12 updates the MPS, according to the flush instruction. In one embodiment, a means for updating the MPS comprises maintaining the context probability model, according to a probability state conversion means in the H.264/AVC standard. The context probability model comprises an offset and a range. The entropy decoding device 100 may change a prediction direction according to the MPS updating, such as converting MPS=0 to MPS=1.
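The MPS update can be illustrated with a simplified stand-in for the H.264/AVC probability state conversion. The real standard drives this with two 64-entry look-up tables (transIdxMPS and transIdxLPS); the linear increments below only mimic their direction of change and are not the standard's actual values.

```python
def update_context(mps: int, state: int, decoded_bin: int,
                   max_state: int = 62):
    # Simplified probability-state update (NOT the real H.264/AVC tables):
    # seeing the MPS raises confidence, seeing the LPS lowers it, and an
    # LPS at state 0 flips the MPS, reversing the prediction direction.
    if decoded_bin == mps:
        return mps, min(state + 1, max_state)
    if state == 0:
        return 1 - mps, 0  # MPS=0 converts to MPS=1 (and vice versa)
    return mps, state - 1

print(update_context(0, 5, 0))  # (0, 6)  MPS observed, state rises
print(update_context(0, 5, 1))  # (0, 4)  LPS observed, state falls
print(update_context(0, 0, 1))  # (1, 0)  LPS at state 0 flips the MPS
```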
  • The state buffer 13 is further operable to flush the current node, and read the next node according to the real information of the next node in the decoded content, after the table look-up module 12 receives the flush instruction.
  • The output section 30 comprises an output module 31, to output the decoded content.
  • Referring to FIG. 2, a flowchart of one embodiment of an entropy decoding process is shown. Depending on the embodiment, additional blocks may be added, others deleted, and the ordering of the blocks may be changed.
  • In block S201, the storage module 11 stores an encoded bit stream of video data.
  • In block S202, the table look-up module 12 selects a context probability model corresponding to the encoded bit stream. The context probability model comprises an MPS, an LPS, and a state corresponding to each of the nodes on the decoding tree. In one embodiment, the MPS value for leaf nodes is null.
  • In block S203, the state buffer 13 reads a current node and the MPS from the table look-up module 12. In one embodiment, the state buffer 13 reads and discards the current node and the MPS in a pipeline.
  • In block S204, the prediction module 23 predicts presumptive information of a next node, and the arithmetic decoder module 21 decodes the encoded bit stream and outputs a decoded content. In one embodiment, the decoded content comprises real information of the next node.
  • In one embodiment, the prediction module 23 and the arithmetic decoder module 21 operate synchronously.
  • In block S205, the parsing module 22 determines whether the current node in the arithmetic decoder module 21 is a leaf node or a branch node. In one embodiment, the process goes to block S206, if the current node is the leaf node, and goes to block S207, if the current node is the branch node.
  • In block S206, the parsing module 22 outputs the decoded content to the output module 31. The output module 31 outputs the decoded content on the leaf node.
  • In block S207, the update module 24 determines whether the prediction hits or misses. In one embodiment, the prediction hit means that the presumptive information of the next node conforms to the real information of the next node. The prediction miss means that the presumptive information of the next node does not conform to the real information of the next node. In one embodiment, it goes to block S203, if the prediction hits, and goes to block S208, if the prediction misses.
  • Back in block S203, the current node is updated according to the prediction.
  • In block S208, the update module 24 generates a flush instruction to the table look-up module 12, to discard the next node information in the state buffer 13. Then, the process goes back to block S202.
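The flow of blocks S201 through S208 can be condensed into one hedged sketch. Everything here is hypothetical: the tree layout, the MPS table, and a toy "arithmetic decoder" that simply consumes one pre-decoded bin per branch node.

```python
from collections import deque

# Hypothetical decoding tree: node -> ("branch", left, right) or ("leaf", out)
TREE = {0: ("branch", 1, 2),
        1: ("leaf", "A"),
        2: ("branch", 3, 4),
        3: ("leaf", "B"),
        4: ("leaf", "C")}

def decode_symbol(bins: deque, mps_table: dict):
    node, flushes = 0, 0
    while TREE[node][0] == "branch":
        _, left, right = TREE[node]
        # S204: predict the next node from the MPS while the "decoder"
        # produces the real next node from the bit stream.
        presumptive = left if mps_table[node] == 0 else right
        real_bin = bins.popleft()
        real = left if real_bin == 0 else right
        if presumptive != real:
            # S207/S208: miss -> flush the presumed node, update the MPS.
            flushes += 1
            mps_table[node] = real_bin
        node = real  # S203: continue from the real next node
    return TREE[node][1], flushes  # S206: output the content on the leaf

symbol, flushes = decode_symbol(deque([1, 1]), {0: 0, 2: 1})
print(symbol, flushes)  # C 1  (one miss at the root, then a hit)
```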
  • One exemplary embodiment of traversal of a decoding tree is shown in FIG. 3. As shown, F is short for the fetching section 10, E is short for the executing section 20, and O is short for the output section 30.
  • The table look-up module 12 selects a decoding tree, as shown in FIG. 3. Here, the current node is 0 and the MPS=0.
  • As shown in FIG. 4, when the prediction hits, for example when the current node is 0, the device goes on to read the next node 2.
  • When the prediction misses, for example when the current node is 2, the device discards the next node 3, reads node 4 instead, and updates the MPS.
  • The entropy decoding device 100 speeds up the decoding process by prediction via the MPS. The MPS may be updated according to the decoded content, which increases the accuracy of the prediction and further speeds up the entropy decoding device 100.
  • Although the features and elements of the present disclosure are described as embodiments in particular combinations, each feature or element can be used alone or in other various combinations within the principles of the present disclosure to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed.

Claims (9)

1. An entropy decoding device, comprising:
a storage module, to store an encoded bit stream;
a table look-up module, to select a context probability model for the encoded bit stream, wherein the context probability model comprises all nodes on a decoding tree and a most probable symbol corresponding to each of the nodes;
a prediction module, to predict presumptive information of a next node according to a current node and the most probable symbol of the current node;
an arithmetic decoder module, to decode the encoded bit stream according to the current node, and output a decoded content, wherein the decoded content comprises real information of the next node;
an update module, to determine whether the presumptive information of the next node conforms to the real information of the next node, and generate a flush instruction to the table look-up module and update the most probable symbol according to the flush instruction when the presumptive information of the next node does not conform to the real information of the next node; and
an output module, to output the decoded content.
2. The entropy decoding device as claimed in claim 1, further comprising a state buffer, to read the current node and the most probable symbol from the table look-up module, and output the current node and the most probable symbol to the prediction module and the arithmetic decoder module.
3. The entropy decoding device as claimed in claim 2, wherein the state buffer is further to flush the current node, and read the next node according to the real information of the next node in the decoded content after the table look-up module receives the flush instruction.
4. The entropy decoding device as claimed in claim 1, further comprising a timing module, to generate clock cycles during which the arithmetic decoder module and the prediction module operate synchronously.
5. The entropy decoding device as claimed in claim 1, further comprising a parsing module, to determine the current node as a leaf node or a branch node.
6. The entropy decoding device as claimed in claim 5, wherein the parsing module is further operable to output the decoded content to the output module when the current node is the leaf node.
7. The entropy decoding device as claimed in claim 5, wherein the parsing module is further operable to output the decoded content to the update module when the current node is the branch node.
8. The entropy decoding device as claimed in claim 1, wherein the table look-up module further comprises a context table, to store the context probability model.
9. The entropy decoding device as claimed in claim 1, wherein a means for updating the most probable symbol comprises maintaining the context probability model according to a probability state conversion means in the H.264/AVC standard.
US12/632,917 2009-10-29 2009-12-08 Entropy decoding device Expired - Fee Related US7928868B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN200910309029.7A CN102055483B (en) 2009-10-29 2009-10-29 Entropy decoding device
CN200910309029 2009-10-29
CN200910309029.7 2009-10-29

Publications (2)

Publication Number Publication Date
US7928868B1 (en) 2011-04-19
US20110102212A1 (en) 2011-05-05

Family

ID=43858638

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/632,917 Expired - Fee Related US7928868B1 (en) 2009-10-29 2009-12-08 Entropy decoding device

Country Status (2)

Country Link
US (1) US7928868B1 (en)
CN (1) CN102055483B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102045558B (en) * 2009-10-22 2012-09-19 鸿富锦精密工业(深圳)有限公司 Entropy decoding method
US8416104B2 (en) * 2010-04-23 2013-04-09 Certicom Corp. Method and apparatus for entropy decoding
US8421655B2 (en) * 2010-04-23 2013-04-16 Certicom Corp. Apparatus for parallel entropy encoding and decoding
CA2794771C (en) * 2010-05-21 2016-03-29 Research In Motion Limited Methods and devices for reducing sources in binary entropy coding and decoding
BR112013032333B1 (en) * 2011-06-16 2022-07-26 Ge Video Compression, Llc AUXILIARY MODE SWITCHING FOR ENTROPY ENCODING
US10097833B2 (en) * 2014-12-26 2018-10-09 Intel Corporation Method and system of entropy coding using look-up table based probability updating for video coding

Citations (2)

Publication number Priority date Publication date Assignee Title
US20060022848A1 (en) * 2004-07-15 2006-02-02 Kabushiki Kaisha Toshiba Arithmetic code decoding method and apparatus
US7808406B2 (en) * 2005-12-05 2010-10-05 Huawei Technologies Co., Ltd. Method and apparatus for realizing arithmetic coding/decoding

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CA2156889C (en) * 1994-09-30 1999-11-02 Edward L. Schwartz Method and apparatus for encoding and decoding data

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
US20060022848A1 (en) * 2004-07-15 2006-02-02 Kabushiki Kaisha Toshiba Arithmetic code decoding method and apparatus
US7808406B2 (en) * 2005-12-05 2010-10-05 Huawei Technologies Co., Ltd. Method and apparatus for realizing arithmetic coding/decoding

Also Published As

Publication number Publication date
CN102055483B (en) 2013-05-08
CN102055483A (en) 2011-05-11
US7928868B1 (en) 2011-04-19

Similar Documents

Publication Publication Date Title
RU2763532C2 (en) Encoding of sampling array for low latency
US7928868B1 (en) Entropy decoding device
US7411529B2 (en) Method of decoding bin values using pipeline architecture and decoding device therefor
US7365660B2 (en) Method and device for decoding syntax element in CABAC decoder
US8094048B2 (en) Method of decoding syntax element in context-based adaptive binary arithmetic coding decoder and decoding device therefor
US7245242B2 (en) Decoding systems and methods
US7304590B2 (en) Arithmetic decoding apparatus and method
CN101562455B (en) Context-based adaptive binary arithmetic coding (cabac) decoding apparatus and decoding method thereof
TW200952498A (en) CABAC decoding unit and decoding method
US10127913B1 (en) Method of encoding of data stream, method of decoding of data stream, and devices for implementation of said methods
KR101030726B1 (en) Memory efficient multimedia huffman decoding method and apparatus for adapting huffman table based on symbol from probability table
US8970405B2 (en) Method and apparatus for entropy decoding
Lin et al. A branch selection multi-symbol high throughput CABAC decoder architecture for H.264/AVC
JP2008199100A (en) Device for decoding variable length code
CN101267559A (en) Universal entropy decoding method and device for video decoder
KR102296153B1 (en) Dedicated arithmetic encoding instruction
US8421655B2 (en) Apparatus for parallel entropy encoding and decoding
JP2013009167A (en) Entropy encoding apparatus, entropy decoding apparatus, entropy encoding method and entropy decoding method
JP5201052B2 (en) Device for speeding up decoding of variable length codes
US10742783B2 (en) Data transmitting apparatus, data receiving apparatus and method thereof having encoding or decoding functionalities
TWI396448B (en) Entropy decoding device
CN101188753B (en) A table structure for video entropy decoding search and corresponding decoding method
RU2787846C1 (en) Sample arrige encoding for low delay
JP6509916B2 (en) Method and apparatus for performing arithmetic coding based on concatenated ROM-RAM table
Tian et al. Review of Existing Statistical Codec Designs

Legal Events

Date Code Title Description
AS Assignment

Owner name: HON HAI PRECISION INDUSTRY CO., LTD., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HUANG, SHENG-CHE;LI, YI-SHIN;TSAI, HSIEH-FU;REEL/FRAME:023617/0837

Effective date: 20091201

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20150419