WO2000036751A1 - Code book construction for variable to variable length entropy encoding - Google Patents

Code book construction for variable to variable length entropy encoding Download PDF

Info

Publication number
WO2000036751A1
Authority
WO
WIPO (PCT)
Prior art keywords
grouping
symbol
groupings
probability
symbols
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US1999/029003
Other languages
English (en)
French (fr)
Inventor
Wei-Ge Chen
Ming-Chieh Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Corp
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to JP2000588899A priority Critical patent/JP4559631B2/ja
Priority to DE69916661T priority patent/DE69916661T2/de
Priority to EP99968085A priority patent/EP1147612B1/en
Priority to AT99968085T priority patent/ATE265106T1/de
Publication of WO2000036751A1 publication Critical patent/WO2000036751A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
    • H03M7/30Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/40Conversion to or from variable length codes, e.g. Shannon-Fano code, Huffman code, Morse code
    • H03M7/42Conversion to or from variable length codes, e.g. Shannon-Fano code, Huffman code, Morse code using table look-up for the coding or decoding process, e.g. using read-only memory

Definitions

  • the invention generally relates to data compression, and more specifically relates to a form of entropy coding.
  • input data is encoded by an encoder, transmitted over a communication channel (or simply stored), and decoded by a decoder.
  • an input signal is typically pre-processed, sampled, converted, compressed or otherwise manipulated into a form for transmission or storage. After transmission or storage, the decoder attempts to reconstruct the original input.
  • One fundamental limitation of this simple model is that a given communication channel has a certain capacity or bandwidth. Consequently, it is frequently necessary to reduce the information content of input data in order to allow it to be reliably transmitted, if at all, over the communication channel.
  • an optimal encoding is to use equal length code words, where each bit of an n-bit code allows distinguishing among 2^n equally probable input possibilities.
  • a single bit can distinguish two possibilities; two bits can distinguish four possibilities, etc.
  • Entropy encoding operates by assigning variable-length codes (e.g., code book entries) to fixed-sized blocks of input. That is, a random variable X, which is known to take on values x_1..x_m with corresponding probabilities p_1..p_m, is mapped to an entry within a set of code words {y_i}.
  • a code word y_i^k, having length k bits, will be referenced simply as y_i, with k implied.
  • the code alphabet is likely to be a series of binary digits ⁇ 0, 1 ⁇ , with code lengths measured in bits. It is assumed code words are constructed so only a single scan of a compressed representation needs to be inspected in order to reconstruct appropriate output.
  • the difficulty in entropy encoding the source signal depends on the number m of possible values X may take. For small m, there are few possible messages, and therefore the code book for the messages can be very small (e.g., only a few bits need to be used to unambiguously represent all possible messages).
  • One possible set of code book entries is assigning "1" to represent message x1, "01" for message x2, "000" for message x3, and "001" for message x4. This gives an average code length of 1.56 bits instead of 2 for encoding the random variable X -- a significant savings.
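To make the savings concrete, the average code length is the probability-weighted sum of the code word lengths. The text does not state the symbol probabilities, so the values below are assumed purely to illustrate the calculation:

```python
# Average code length for the four-entry code book above, under
# ASSUMED probabilities (the text does not list them).
code = {"x1": "1", "x2": "01", "x3": "000", "x4": "001"}
probs = {"x1": 0.64, "x2": 0.16, "x3": 0.10, "x4": 0.10}  # illustrative only
avg_bits = sum(probs[s] * len(code[s]) for s in code)
print(round(avg_bits, 2))  # 1.56 bits per symbol, vs. 2 bits for fixed-length codes
```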
  • Coding sequences of X's produces further savings as it is known from information theory studies that the entropy of a coherent series X_1..X_n is less than or equal to the sum of each individual X's entropy.
  • vector Huffman coding can compress a coherent source much more efficiently than scalar Huffman coding .
  • the efficiency of vector Huffman coding is only limited by practical concerns. In order to achieve higher compression ratios, bigger vector dimensions are needed. Higher dimension, however, increases code book sizes beyond practical limits. For example, for source symbols having 30 possible values, a dimension of only 6 corresponds to a code book of 729 million entries.
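The blow-up noted above is easy to check: a vector code book needs one entry per possible symbol vector, so its size grows as the alphabet size raised to the vector dimension.

```python
# Vector Huffman code book size grows exponentially in the vector dimension.
alphabet_size, dimension = 30, 6
entries = alphabet_size ** dimension
print(entries)  # 729000000 -- the 729 million entries noted above
```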
  • these methods of entropy coding are characterized as fixed-to-variable length coding, as the source symbols have fixed length and the code words have variable length depending on the probability of the corresponding source symbol.
  • Other methods of entropy coding take the opposite approach, where a variable number of source symbols are grouped together and then translated into code words having equal length.
  • if the source is composed of independent X's and the symbol groupings achieve equal probability, such a reverse scheme is provably optimal.
  • in practice, however, such solutions require resources exceeding those practically (if at all) available.
  • consequently, the variable-to-fixed length approach is not useful.
  • the invention relates to a method of assigning variable length codes to variable length input sequences.
  • entropy-type codes are assigned to probable input sequences, thus allowing a particular input stream to be encoded in a compressed format.
  • the invention may be configured so as to reduce the size of the code book required for performing encoding and decoding.
  • variable length code words might only be assigned to inputs that are highly probable, and where default codes can be assigned to less probable sequences.
  • the degree of probability required for assignment of a specific code to a specific input is adjusted according to a desired code book size.
  • the input stream to encode can be of any data type, such as numbers, characters, or a binary data stream which encodes audio, video or other types of data.
  • the input stream is referenced herein as a series of symbols, where each "symbol” refers to the appropriate measurement unit for the particular input.
  • a code book is constructed for groupings of symbols, in which variable-sized groups of symbols are each assigned a variable length code based on probability of occurrence of symbol groupings.
  • possible groupings of symbols are generated and compared against the probability of the generated grouping occurring in exemplary input used to generate the code book.
  • Such exemplary input is assumed to approximate arbitrary input likely to be received and require encoding.
  • a data structure may be used to track symbol combinations (e.g., the groupings). This structure is used to associate the new symbol with previously received symbols, so that arbitrarily long groupings of previously received symbols are tracked.
  • One possible configuration for the data structure is a tree-type data structure, in which successive symbol groupings form new leaf nodes. These nodes may contain an entire grouping or just the single symbol extension to a previous parent node. In this latter configuration, the path from the root of the tree corresponds to a particular grouping.
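The node-per-extension variant described above can be sketched as a small trie; the class and method names here are illustrative, not from the patent:

```python
class GroupNode:
    """Tree node holding only the single-symbol extension to its parent;
    the root-to-node path spells out the full symbol grouping."""
    def __init__(self, symbol=None, parent=None):
        self.symbol = symbol       # single-symbol extension (None at the root)
        self.parent = parent
        self.children = {}         # symbol -> child GroupNode
        self.probability = 0.0     # updated as exemplary input is scanned

    def extend(self, symbol):
        """Return (creating if needed) the child grouping for `symbol`."""
        return self.children.setdefault(symbol, GroupNode(symbol, self))

    def grouping(self):
        """Walk parent links to recover the full grouping for this node."""
        node, path = self, []
        while node is not None and node.symbol is not None:
            path.append(node.symbol)
            node = node.parent
        return list(reversed(path))

root = GroupNode()
leaf = root.extend("A").extend("B")   # path root -> A -> B
print(leaf.grouping())                # ['A', 'B']
```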
  • one or more trivial groupings are selected, such as single symbol "groups" containing symbols from the input alphabet.
  • the probability of these initial groupings is evaluated to determine the grouping most likely to occur as input, where such probability is necessarily computed with respect to exemplary inputs.
  • the most probable grouping is then expanded with symbols from the alphabet to form tentative groupings.
  • the probability of these tentative groupings is then evaluated to identify the most probable tentative expansions, and the least probable groupings combined into a single grouping.
  • the concept of a code book is to assign code words to symbol groupings.
  • the invention can be configured so that code book size is restricted. One method of doing so is avoiding assigning codes to all input sequences. Instead, only probable input sequences are stored in the code book and assigned an entropy-type code. Improbable sequences are represented in the code book as an input sequence prefix followed by a special expansion character suffix. This suffix character represents all possible input sequence extensions to the prefix. The prefix-suffix pairing represents all possible input sequences beginning with the prefix that do not have an entry in the code book. Thus, after evaluating the tentative extensions, two code book entries result, one for the most probable extension, and one to represent all other extensions (again, assuming only keeping one most probable extension) .
  • FIG. 1 is a block diagram of a computer system that may be used to implement a variable to variable entropy encoding.
  • FIG. 2 shows a basic communication model.
  • FIG. 3 is a flowchart showing creation of a code book having variable length entries for variable length symbol groupings.
  • FIGS. 4-10 illustrate creation of a code book pursuant to FIG. 3 for an alphabet {A, B, C}.
  • the invention has been implemented in an audio/visual codec. This is only one example of how the invention may be implemented.
  • the invention is designed to be utilized wherever entropy-type coding may be utilized, and is applicable to compression of any type of data. Briefly described, optimal entropy encoding requires excessive resources, and the illustrated embodiments provide a nearly optimal encoding solution requiring far fewer resources.
  • FIG. 1 and the following discussion are intended to provide a brief, general description of a suitable computing environment in which the invention may be implemented. While the invention will be described in the general context of computer-executable instructions of a computer program that runs on a personal computer, those skilled in the art will recognize that the invention also may be implemented in combination with other program modules. Generally, program modules include routines, programs, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like.
  • the illustrated embodiment of the invention also is practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. But, some embodiments of the invention can be practiced on stand alone computers. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
  • an exemplary system for implementing the invention includes a computer 20, including a processing unit 21, a system memory 22, and a system bus 23 that couples various system components including the system memory to the processing unit 21.
  • the processing unit may be any of various commercially available processors, including Intel x86, Pentium and compatible microprocessors from Intel and others, the Alpha processor by Digital, and the PowerPC from IBM and Motorola. Dual microprocessors and other multi-processor architectures also can be used as the processing unit 21 .
  • the system bus may be any of several types of bus structure including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of conventional bus architectures such as PCI, AGP, VESA, MicroChannel, ISA and EISA, to name a few.
  • the system memory includes read only memory (ROM) 24 and random access memory (RAM) 25.
  • a basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within the computer 20, such as during start-up, is stored in ROM 24.
  • the computer 20 further includes a hard disk drive 27, a magnetic disk drive 28, e.g., to read from or write to a removable disk 29, and an optical disk drive 30, e.g., for reading a CD-ROM disk 31 or to read from or write to other optical media.
  • the hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by a hard disk drive interface 32, a magnetic disk drive interface 33, and an optical drive interface 34, respectively.
  • the drives and their associated computer-readable media provide nonvolatile storage of data, data structures, computer-executable instructions, etc. for the computer 20.
  • a number of program modules may be stored in the drives and RAM 25, including an operating system 35, one or more application programs (e.g., Internet browser software) 36, other program modules 37, and program data 38.
  • a user may enter commands and information into the computer 20 through a keyboard 40 and pointing device, such as a mouse 42.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
  • These and other input devices are often connected to the processing unit 21 through a serial port interface 46 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a universal serial bus (USB) .
  • a monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48.
  • personal computers typically include other peripheral output devices (not shown), such as speakers and printers.
  • the computer 20 is expected to operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 49.
  • the remote computer 49 may be a web server, a router, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 20, although only a memory storage device 50 has been illustrated in FIG. 1.
  • the computer 20 can contact the remote computer 49 over an Internet connection established through a Gateway 55 (e.g., a router, dedicated-line, or other network link), a modem 54 link, or by an intra-office local area network (LAN) 51 or wide area network (WAN) 52.
  • the present invention is described below with reference to acts and symbolic representations of operations that are performed by the computer 20, unless indicated otherwise. Such acts and operations are sometimes referred to as being computer-executed. It will be appreciated that the acts and symbolically represented operations include the manipulation by the processing unit 21 of electrical signals representing data bits which causes a resulting transformation or reduction of the electrical signal representation, and the maintenance of data bits at memory locations in the memory system (including the system memory 22, hard drive 27, floppy disks 29, and CD-ROM 31) to thereby reconfigure or otherwise alter the computer system's operation, as well as other processing of signals.
  • the memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, or optical properties corresponding to the data bits.
  • in a basic communication model, there is a data source or sender 200, a communication channel 204, and a data receiver 208.
  • the source may be someone speaking on a telephone, over telephone wires, to another person.
  • the source may be a television or radio broadcast transmitted by wireless methods to television or radio receivers.
  • the source may be a digital encoding of some data, whether audio, visual, or other, transmitted over a wired or wireless communication link (e.g., a LAN or the internet) to a corresponding decoder for the information.
  • an encoder 202 is used to prepare the data source for transmission over the communication channel 204.
  • the encoder is responsible for converting the source data into a format appropriate for the channel 204.
  • a decoder 206 is required to reverse the encoding process so as to present sensible data to the receiver.
  • the encoder 202 is frequently designed to utilize compression schemes for the transmission of the data. Compression is desirable since, except for unusual circumstances, communication bandwidth is limited. Therefore, for complex data sources, such as audio or video data, the source data needs to be compressed to allow its transmission over conventional transmission paths.
  • a particularly effective coding method is known as entropy encoding, which utilizes a "code book" containing short code words that have been pre-assigned to highly probable input data.
  • such coders can capitalize on data coherency, and are particularly effective when symbols have non-uniform probability distributions.
  • FIG. 3 is a flowchart showing a preferred method for generating a code book.
  • FIG. 3 illustrates how to create a code book having variable length code assignments for variable length symbol groupings.
  • prior art techniques either require fixed-length codes or fixed blocks of input.
  • Preferred implementations overcome the resource requirements of large dimension vector encoding, and the inapplicability of coding into words of equal lengths, by providing an entropy based variable-to-variable code, where variable length code words are used to encode variable length X sequences.
  • let y_i represent each source symbol group {x_j}, for 1 <= j <= N_i, having probability P_i of occurring within the input stream (FIG. 2 channel 204), where each group is assigned a corresponding code word having L_i bits. It is presumed that each x_j is drawn from a fixed alphabet of predetermined size. The objective is to minimize the average code length L = sum over i of P_i * L_i.
  • instead of finding a general solution to the problem, the problem is separated into two different tasks.
  • the first task is identification of a (sub-optimal) grouping of a set of input symbols {x_i} through an empirical approach described below.
  • the second task is assigning an entropy-type code for the grouped symbols {y_i}. Note that it is known that if the source is not coherent (i.e., the input is independent or without memory), any grouping that has the same configuration of {N_j} can achieve the same coding efficiency. In this situation, the first task becomes inconsequential.
  • This initial configuration assumes that an exemplary input stream is being used to train creation of the code book.
  • a computer may be programmed with software constructions such as data structures to track receipt of each symbol from an input.
  • data structures may be implemented as a binary-type tree structure, hash table, or some combination of the two. Other equivalent structures may also be used.
  • the probability of occurrence for each y_i is computed 302. Such probability is determined with respect to any exemplary input used to train code book generation. As further symbols are added to the symbol data structure, the probabilities are dynamically adjusted.
  • the most probable grouping y_i is identified 304 (denoted as y_mp). If 306 the highest probability symbol is a grouping of previously lower probability symbols, then the grouping is split 308 into its constituent symbols, and processing restarted from step 302. (Although symbols may be combined, the group retains memory of all symbols therein so that symbols can be extracted.)
  • processing continues at step 310, in which the most probable grouping is tentatively extended with single-symbol extensions x_i.
  • preferably, y_mp is extended with each symbol from the X alphabet.
  • a predictor can be used to only generate an extension set containing only probable extensions, if the alphabet is very large and it is known many extensions are unlikely. For example, such a predictor may be based on semantic or contextual meaning, so that very improbable extensions can be ignored a priori.
  • the probability for each tentative expansion of y_mp is then computed 312, and only the most probable extension retained 314.
  • the rest of the lower probability extensions are collapsed together 316 as a combined grouping and stored in the code book with a special symbol to indicate a combined grouping.
  • This wild-card symbol represents any arbitrary symbol grouping having y_mp as a prefix, but with an extension (suffix) different from the most probable extension. That is, if y_mp + x_mp is the most probable root and extension, then the other less probable extensions are represented as y_mp*, where * != x_mp. (Note that this discussion presumes, for clarity, serial processing of single-symbol extensions; however, parallel execution of multiple symbol extensions is contemplated.)
  • Code book construction is completed by repeating 318 steps 302-316 until all extensions have been made, or the number of code book entries reaches a predetermined limit.
  • the effect of repeatedly applying the above operations is to automatically collect symbol groupings having high correlation, so that correlation across group boundaries is reduced.
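The construction loop of steps 302-318 can be sketched as follows, under several simplifying assumptions: probabilities are counted by greedily parsing the exemplary input, the splitting of combined groupings (steps 306-308) is omitted, and retaining the prefix grouping itself stands in for the collapsed wild-card entry. Names and data layout are illustrative, not the patent's implementation:

```python
from collections import Counter

def greedy_parse(stream, groups):
    """Greedily split `stream` into the longest matching groupings."""
    longest = max(len(g) for g in groups)
    out, i = [], 0
    while i < len(stream):
        for length in range(min(longest, len(stream) - i), 0, -1):
            cand = tuple(stream[i:i + length])
            if cand in groups:
                out.append(cand)
                i += length
                break
        else:
            out.append((stream[i],))  # no entry: parse the symbol alone
            i += 1
    return out

def build_groupings(stream, alphabet, max_groups=8):
    groups = {(s,) for s in alphabet}     # trivial initial groupings (step 300)
    while len(groups) < max_groups:
        counts = Counter(greedy_parse(stream, groups))
        y_mp = max(groups, key=lambda g: counts[g])      # most probable (304)
        tentative = {y_mp + (s,) for s in alphabet}      # tentative extensions (310)
        parsed = greedy_parse(stream, (groups - {y_mp}) | tentative)
        ext_counts = Counter(g for g in parsed if g in tentative)
        if not ext_counts:
            break
        best = ext_counts.most_common(1)[0][0]           # retain the best (314)
        if best in groups:
            break
        groups.add(best)  # y_mp itself remains as a stand-in for y_mp* (316)
    return groups

# Exemplary input from FIGS. 4-10, limited to one extension round.
groups = build_groupings(list("AAABBAACABABBAB"), "ABC", max_groups=4)
print(sorted(groups))  # [('A',), ('A', 'B'), ('B',), ('C',)]
```

On this input the first round parses nine groups with "A A" appearing 2/9 and "A B" 4/9, matching the counts given in the FIG. 5 discussion below, so "A B" is the extension retained.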
  • One structure for a code book is an N-ary (e.g., binary, ternary, etc.) tree, in which symbol groupings guide a traversal of the tree structure.
  • Leaf nodes of the tree represent the end of a recognized symbol sequence, where an entropy- type code is associated with the sequence.
  • Nodes can be coded in software as a structure, class definition, or other structure allowing storage of a symbol or symbols associated with the node.
  • the code book may be structured as a table having each string of input symbols sorted by probability of occurrence, with highly probable inputs at the top of the table.
  • the table can be sorted according to the first symbol, i.e., all symbol series beginning with "A" are grouped together, followed by series starting with "B", etc. With this arrangement, all entries within a grouping are sorted according to their probabilities of occurrence. The position of the beginning of each section is marked/tracked so that a hash-type function (e.g., a look-up based on the first symbol) can be used to locate the correct portion of the code book table. In this look-up table approach to storing the code book, once the first symbol is hashed, look-up simply requires a search of the corresponding table section until a matching entry is located.
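The first-symbol look-up table organization just described might be sketched like this; the entries, probabilities, and codes are invented for illustration:

```python
from itertools import groupby

# Code book entries: (symbol string, probability, code word). The values
# here are INVENTED to illustrate the table layout, not taken from the text.
entries = [
    ("AB", 0.40, "0"),
    ("BA", 0.25, "110"),
    ("AA", 0.20, "10"),
    ("B",  0.10, "1110"),
    ("C",  0.05, "1111"),
]

# One section per first symbol, each section sorted by descending probability.
table = {
    first: list(section)
    for first, section in groupby(
        sorted(entries, key=lambda e: (e[0][0], -e[1])),
        key=lambda e: e[0][0],
    )
}

def lookup(symbols):
    """Hash on the first symbol, then scan that section for a match."""
    for string, _, code in table.get(symbols[0], []):
        if symbols.startswith(string):
            return string, code
    return None

print(lookup("ABBA"))  # ('AB', '0')
```

Because each section is sorted by probability, the scan tends to terminate early for the most frequent inputs.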
  • FIGS. 4-10 illustrate creation of a code book pursuant to FIG. 3.
  • the code book is defined with respect to an exemplary input stream "A A A B B A A C A B A B B A B".
  • one or more exemplary inputs may be used to generate a code book that is then used by encoders and decoders to process arbitrary inputs.
  • the code book is presented as a tree structure, although it may in fact be implemented as a linear table, hash table, database, etc.
  • the tree is oriented left-to-right, where the left column (e.g., "A" and "X0") represents the top row of a tree-type structure, and successively indented rows represent the "children" of the previous row's node (e.g., in a top-down tree for FIG. 5, node "A" is a first-row parent node for a second-row middle-child node "B").
  • FIG. 4 shows an initial grouping for the input stream "A A A B B A A C A B A B B A B".
  • This initial trivial grouping can be created based on different criteria, the simplest being having a first-level node for every character in the alphabet.
  • FIG. 4 illustrates this technique by starting with only two initial groups, group A 400 having probability 8/15, and group X0 402 having probability 7/15, where X0 represents all remaining low probability symbols in the alphabet, e.g., B and C.
  • the leaf-node having highest probability is selected for extension (see also FIG. 3 discussion regarding processing sequence) .
  • group A 400 is tentatively expanded by each character in the alphabet (or one may limit the expansion to some subset thereof as described for creating the initial grouping) .
  • Probabilities are then recomputed with respect to the input stream "A A A B B A A C A B A B B A B" to determine values for the tentative extensions A 406, B 408, and C 410.
  • the result is nine parsing groups, where "A A" appears 2/9, "A B" appears 4/9, and "A C" appears 0/9.
  • FIG. 6 shows the collapse into X1 412 for FIG. 5. Processing repeats with identification of the node having highest probability, e.g., node B 408 at probability 4/9.
  • this node 408 is tentatively extended with symbols A 414, B 416, C 418, and as discussed above, the tentative grouping with highest probability is retained.
  • the result is eight parsing groups in which the symbol sequence "A B A" 414 appears once, "A B B" 416 appears once, and "A B C" 418 does not appear at all. Since tentative extensions A 414 and B 416 have the same probability of occurrence, a rule needs to be defined to choose which symbol to retain.
  • possible rules include retaining the highest row's node (e.g., the left-most child node in a top-down tree), or the left-most row's node (e.g., the node closest to the root of a top-down tree); it is only necessary that the rule be applied consistently.
  • code book entries can be created to account for such end-of-input sequences, or the input having no entry can be escaped with a special character and inserted in the encoded output stream. For example, a special symbol can be used to indicate end of input, thereby implying how to handle the trailing characters on decoding.
  • the next step is to expand the node currently having highest probability with respect to the input stream.
  • the highest node in the tree (X0 402) is extended. (Although it is only necessary to be consistent, it is preferable to expand higher level nodes since this may increase coding efficiency by increasing the number of long code words.)
  • X0 402 is a combined node, so it must be split instead of extended.
  • FIG. 9 illustrates the result of splitting node X0 into its constituent symbols B 422 and C 424.
  • FIG. 10 shows the result of retaining high-probability node B 422. Note that grouping X0 now only represents a single symbol "C". After revising probabilities, the node having highest probability must be identified and split or extended.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
PCT/US1999/029003 1998-12-14 1999-12-07 Code book construction for variable to variable length entropy encoding Ceased WO2000036751A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2000588899A JP4559631B2 (ja) 1998-12-14 1999-12-07 可変長から可変長へのエントロピー符号化のためのコードブック構成
DE69916661T DE69916661T2 (de) 1998-12-14 1999-12-07 Codebuchkonstruktion für entropiekodierung von variabler zu variabler länge
EP99968085A EP1147612B1 (en) 1998-12-14 1999-12-07 Code book construction for variable to variable length entropy encoding
AT99968085T ATE265106T1 (de) 1998-12-14 1999-12-07 Codebuchkonstruktion für entropiekodierung von variabler zu variabler länge

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/211,532 US6404931B1 (en) 1998-12-14 1998-12-14 Code book construction for variable to variable length entropy encoding
US09/211,532 1998-12-14

Publications (1)

Publication Number Publication Date
WO2000036751A1 true WO2000036751A1 (en) 2000-06-22

Family

ID=22787319

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1999/029003 Ceased WO2000036751A1 (en) 1998-12-14 1999-12-07 Code book construction for variable to variable length entropy encoding

Country Status (6)

Country Link
US (1) US6404931B1 (en)
EP (1) EP1147612B1 (en)
JP (1) JP4559631B2 (en)
AT (1) ATE265106T1 (en)
DE (1) DE69916661T2 (en)
WO (1) WO2000036751A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017171611A1 (en) * 2016-03-31 2017-10-05 Zeropoint Technologies Ab Variable-sized symbol entropy-based data compression

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6624761B2 (en) 1998-12-11 2003-09-23 Realtime Data, Llc Content independent data compression method and system
US6604158B1 (en) 1999-03-11 2003-08-05 Realtime Data, Llc System and methods for accelerated data storage and retrieval
US6601104B1 (en) * 1999-03-11 2003-07-29 Realtime Data Llc System and methods for accelerated data storage and retrieval
US20030191876A1 (en) * 2000-02-03 2003-10-09 Fallon James J. Data storewidth accelerator
US20010047473A1 (en) 2000-02-03 2001-11-29 Realtime Data, Llc Systems and methods for computer initialization
GB2367459A (en) * 2000-09-28 2002-04-03 Roke Manor Research Method of compressing data packets
US7417568B2 (en) 2000-10-03 2008-08-26 Realtime Data Llc System and method for data feed acceleration and encryption
US9143546B2 (en) 2000-10-03 2015-09-22 Realtime Data Llc System and method for data feed acceleration and encryption
US8692695B2 (en) 2000-10-03 2014-04-08 Realtime Data, Llc Methods for encoding and decoding data
US6735339B1 (en) * 2000-10-27 2004-05-11 Dolby Laboratories Licensing Corporation Multi-stage encoding of signal components that are classified according to component value
US7386046B2 (en) 2001-02-13 2008-06-10 Realtime Data Llc Bandwidth sensitive data compression and decompression
US7016547B1 (en) * 2002-06-28 2006-03-21 Microsoft Corporation Adaptive entropy encoding/decoding for screen capture content
US7433824B2 (en) * 2002-09-04 2008-10-07 Microsoft Corporation Entropy coding by adapting coding between level and run-length/level modes
ATE449405T1 (de) 2002-09-04 2009-12-15 Microsoft Corp Entropische kodierung mittels anpassung des kodierungsmodus zwischen niveau- und lauflängenniveau-modus
US7313817B2 (en) * 2003-06-17 2007-12-25 Lockheed Martin Corporation Data transmission system utilizing efficient complexity estimation of the kolmogorov complexity for data transmission
US7724827B2 (en) * 2003-09-07 2010-05-25 Microsoft Corporation Multi-layer run level encoding and decoding
US7782954B2 (en) * 2003-09-07 2010-08-24 Microsoft Corporation Scan patterns for progressive video content
US7688894B2 (en) * 2003-09-07 2010-03-30 Microsoft Corporation Scan patterns for interlaced video content
US7941311B2 (en) * 2003-10-22 2011-05-10 Microsoft Corporation System and method for linguistic collation
US7684981B2 (en) * 2005-07-15 2010-03-23 Microsoft Corporation Prediction of spectral coefficients in waveform coding and decoding
US7599840B2 (en) * 2005-07-15 2009-10-06 Microsoft Corporation Selectively using multiple entropy models in adaptive coding and decoding
US7693709B2 (en) 2005-07-15 2010-04-06 Microsoft Corporation Reordering coefficients for waveform coding or decoding
US7565018B2 (en) * 2005-08-12 2009-07-21 Microsoft Corporation Adaptive coding and decoding of wide-range coefficients
US8599925B2 (en) * 2005-08-12 2013-12-03 Microsoft Corporation Efficient coding and decoding of transform blocks
US7933337B2 (en) * 2005-08-12 2011-04-26 Microsoft Corporation Prediction of transform coefficients for image compression
US8184710B2 (en) * 2007-02-21 2012-05-22 Microsoft Corporation Adaptive truncation of transform coefficient data in a transform-based digital media codec
US7774205B2 (en) * 2007-06-15 2010-08-10 Microsoft Corporation Coding of sparse digital media spectral data
US8179974B2 (en) * 2008-05-02 2012-05-15 Microsoft Corporation Multi-level representation of reordered transform coefficients
US8406307B2 (en) 2008-08-22 2013-03-26 Microsoft Corporation Entropy coding/decoding of hierarchically organized data
KR20110011357A (ko) * 2009-07-28 2011-02-08 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding data whose bit order has been changed
DE102010009263B4 (de) 2010-02-25 2012-04-12 Knorr-Bremse Systeme für Nutzfahrzeuge GmbH Method for transmitting data to an electronic control unit
DE102012211031B3 (de) * 2012-06-27 2013-11-28 Siemens Aktiengesellschaft Method for encoding a data stream
US10044405B2 (en) 2015-11-06 2018-08-07 Cable Television Laboratories, Inc Signal power reduction systems and methods
US9673836B1 (en) * 2016-09-23 2017-06-06 International Business Machines Corporation System level testing of entropy encoding

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62247626A (ja) * 1986-04-19 1987-10-28 Fuji Photo Film Co Ltd Encoding method
EP0283735A2 (en) * 1987-02-24 1988-09-28 Hayes Microcomputer Products, Inc. Adaptive data compression method and apparatus
US5550541A (en) * 1994-04-01 1996-08-27 Dolby Laboratories Licensing Corporation Compact source coding tables for encoder/decoder system
JPH09232968A (ja) * 1996-02-23 1997-09-05 Kokusai Denshin Denwa Co Ltd &lt;Kdd&gt; Variable-length code generation apparatus
US5959560A (en) * 1997-02-07 1999-09-28 Said; Amir Data compression via alphabet partitioning and group partitioning

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4122440A (en) 1977-03-04 1978-10-24 International Business Machines Corporation Method and means for arithmetic string coding
US4464650A (en) 1981-08-10 1984-08-07 Sperry Corporation Apparatus and method for compressing data signals and restoring the compressed data signals
US4558302A (en) * 1983-06-20 1985-12-10 Sperry Corporation High speed data compression and decompression apparatus and method
JPS61107818A (ja) 1984-10-30 1986-05-26 Nec Corp Entropy encoding method and apparatus
JPH0821863B2 (ja) 1985-04-13 1996-03-04 Canon Inc. Data processing method
US5532694A (en) 1989-01-13 1996-07-02 Stac Electronics, Inc. Data compression apparatus and method using matching string searching and Huffman encoding
US5003307A (en) 1989-01-13 1991-03-26 Stac, Inc. Data compression apparatus with shift register search means
US5479562A (en) * 1989-01-27 1995-12-26 Dolby Laboratories Licensing Corporation Method and apparatus for encoding and decoding audio information
DE3912605B4 (de) 1989-04-17 2008-09-04 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Digital coding method
US5254990A (en) 1990-02-26 1993-10-19 Fujitsu Limited Method and apparatus for compression and decompression of data
US5049881A (en) 1990-06-18 1991-09-17 Intersecting Concepts, Inc. Apparatus and method for very high data rate-compression incorporating lossless data compression and expansion utilizing a hashing technique
US5227789A (en) 1991-09-30 1993-07-13 Eastman Kodak Company Modified huffman encode/decode system with simplified decoding for imaging systems
JP3134424B2 (ja) 1991-10-31 2001-02-13 Sony Corporation Variable-length coding method and apparatus
US5229768A (en) 1992-01-29 1993-07-20 Traveling Software, Inc. Adaptive data compression system
US5227788A (en) 1992-03-02 1993-07-13 At&T Bell Laboratories Method and apparatus for two-component signal compression
US5406279A (en) 1992-09-02 1995-04-11 Cirrus Logic, Inc. General purpose, hash-based technique for single-pass lossless data compression
US5442350A (en) 1992-10-29 1995-08-15 International Business Machines Corporation Method and means providing static dictionary structures for compressing character data and expanding compressed data
US5400075A (en) 1993-01-13 1995-03-21 Thomson Consumer Electronics, Inc. Adaptive variable length encoder/decoder
JP2505980B2 (ja) 1993-04-16 1996-06-12 International Business Machines Corporation Static dictionary creation method and computer-executed system
JP3210996B2 (ja) 1993-07-30 2001-09-25 Mitsubishi Electric Corp High-efficiency encoding apparatus and high-efficiency decoding apparatus
JP3278297B2 (ja) 1994-07-20 2002-04-30 Fujitsu Ltd Data compression method, data decompression method, data compression apparatus, and data decompression apparatus
JPH08116263A (ja) * 1994-10-17 1996-05-07 Fujitsu Ltd Data processing apparatus and data processing method
JPH08167852A (ja) * 1994-12-13 1996-06-25 Fujitsu Ltd Data compression method and apparatus
JPH08205169A (ja) * 1995-01-20 1996-08-09 Matsushita Electric Ind Co Ltd Moving picture encoding apparatus and decoding apparatus
EP0731614B1 (en) * 1995-03-10 2002-02-06 Kabushiki Kaisha Toshiba Video coding/decoding apparatus
US5884269A (en) 1995-04-17 1999-03-16 Merging Technologies Lossless compression/decompression of digital audio data
JP3181809B2 (ja) 1995-05-31 2001-07-03 Sharp Corporation Circuit for restoring compressed codes for data compression
US5825830A (en) 1995-08-17 1998-10-20 Kopf; David A. Method and apparatus for the compression of audio, video or other data
US5819215A (en) 1995-10-13 1998-10-06 Dobson; Kurt Method and apparatus for wavelet based data compression having adaptive bit rate control for compression of digital audio or other sensory data
US5933104A (en) 1995-11-22 1999-08-03 Microsoft Corporation Method and system for compression and decompression using variable-sized offset and length fields
US5831559A (en) 1996-01-24 1998-11-03 Intel Corporation Encoding/decoding video signals using multiple run-val mapping tables
JP3566441B2 (ja) 1996-01-30 2004-09-15 Sharp Corporation Dictionary creation apparatus for text compression
JP3277792B2 (ja) 1996-01-31 2002-04-22 Hitachi, Ltd. Data compression method and apparatus
US5790706A (en) 1996-07-03 1998-08-04 Motorola, Inc. Method and apparatus for scanning of transform coefficients
DE19637522A1 (de) 1996-09-13 1998-03-19 Bosch Gmbh Robert Method for reducing data in video signals
US5999949A (en) 1997-03-14 1999-12-07 Crandall; Gary E. Text file compression system utilizing word terminators
US5946043A (en) 1997-12-31 1999-08-31 Microsoft Corporation Video coding using adaptive coding of block parameters for coded/uncoded blocks
US6100824A (en) 1998-04-06 2000-08-08 National Dispatch Center, Inc. System and method for data compression
US6029126A (en) 1998-06-30 2000-02-22 Microsoft Corporation Scalable audio coder and decoder

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62247626A (ja) * 1986-04-19 1987-10-28 Fuji Photo Film Co Ltd Encoding method
EP0283735A2 (en) * 1987-02-24 1988-09-28 Hayes Microcomputer Products, Inc. Adaptive data compression method and apparatus
US5550541A (en) * 1994-04-01 1996-08-27 Dolby Laboratories Licensing Corporation Compact source coding tables for encoder/decoder system
JPH09232968A (ja) * 1996-02-23 1997-09-05 Kokusai Denshin Denwa Co Ltd &lt;Kdd&gt; Variable-length code generation apparatus
US5883589A (en) * 1996-02-23 1999-03-16 Kokusai Denshin Denwa Co., Ltd. Variable length code construction apparatus
US5959560A (en) * 1997-02-07 1999-09-28 Said; Amir Data compression via alphabet partitioning and group partitioning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 012, no. 118 (E - 600) 13 April 1988 (1988-04-13) *
PATENT ABSTRACTS OF JAPAN vol. 1998, no. 01 30 January 1998 (1998-01-30) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017171611A1 (en) * 2016-03-31 2017-10-05 Zeropoint Technologies Ab Variable-sized symbol entropy-based data compression
CN109075798A (zh) * 2016-03-31 2018-12-21 Zeropoint Technologies AB Variable-sized symbol entropy-based data compression
US10862507B2 (en) 2016-03-31 2020-12-08 Zeropoint Technologies Ab Variable-sized symbol entropy-based data compression

Also Published As

Publication number Publication date
US6404931B1 (en) 2002-06-11
ATE265106T1 (de) 2004-05-15
DE69916661D1 (de) 2004-05-27
JP2002533005A (ja) 2002-10-02
EP1147612B1 (en) 2004-04-21
DE69916661T2 (de) 2004-08-19
JP4559631B2 (ja) 2010-10-13
EP1147612A1 (en) 2001-10-24

Similar Documents

Publication Publication Date Title
US6404931B1 (en) Code book construction for variable to variable length entropy encoding
EP1142129B1 (en) Variable to variable length entropy encoding
US9223765B1 (en) Encoding and decoding data using context model grouping
US6100824A (en) System and method for data compression
EP1142130A1 (en) Entropy code mode switching for frequency-domain audio coding
JPS6356726B2 (en)
US6668092B1 (en) Memory efficient variable-length encoding/decoding system
Yang et al. Universal lossless data compression with side information by using a conditional MPM grammar transform
Willems et al. Reflections on “the context tree weighting method: Basic properties”
Jacob et al. Comparative analysis of lossless text compression techniques
US20060125660A1 (en) Digital data compression robust relative to transmission noise
Fenwick Improvements to the block sorting text compression algorithm
Kieffer et al. Survey of grammar-based data structure compression
Rani et al. A survey on lossless text data compression techniques
WO2005043765A2 (en) Resilient parameterized prefix codes for adaptive coding
Gibson Lossless Source Coding
JPH1155125A (ja) Compression/decompression method for character data
Erkan et al. A New Text Compression Algorithm Based on Index Permutation and Suffix Coding
Herrera Alcántara et al. Incompressibility and lossless data compression: An approach by pattern discovery
Rottenstreich et al. Optimal compression of element pairs in fixed-width memories
CN117465471 (zh) Lossless compression system for text files and compression method thereof
Tyagi et al. Implementation of Data Compression Algorithms
Morita et al. Data Compression of ECG Based on the Edit Distance Algorithms
Sabuj et al. A Modified Approach to Huffman Algorithm for Unequal Bit Costs
Topaloglu et al. Polymorphic compression

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref country code: JP

Ref document number: 2000 588899

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 1999968085

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1999968085

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 1999968085

Country of ref document: EP