WO2002021784A1 - Soft-output error-trellis decoder for convolutional codes - Google Patents


Info

Publication number
WO2002021784A1
Authority
WO
WIPO (PCT)
Prior art keywords
syndrome
optimum state
decoder
trellis
error
Application number
PCT/US2001/026785
Other languages
French (fr)
Inventor
Meir Ariel
Ofer Amrani
Reuven Meidan
Original Assignee
Motorola Inc.
Application filed by Motorola Inc.
Priority to AU2001286842A1
Publication of WO2002021784A1

Classifications

    • H: ELECTRICITY
        • H03: ELECTRONIC CIRCUITRY
            • H03M: CODING; DECODING; CODE CONVERSION IN GENERAL
                • H03M 13/00: Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
                    • H03M 13/37: Decoding methods or techniques, not specific to the particular type of coding provided for in groups H03M13/03 - H03M13/35
                        • H03M 13/3723: Decoding using means or methods for the initialisation of the decoder
                        • H03M 13/39: Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes
                            • H03M 13/3905: Maximum a posteriori probability [MAP] decoding or approximations thereof based on trellis or lattice decoding, e.g. forward-backward algorithm, log-MAP decoding, max-log-MAP decoding
        • H04: ELECTRIC COMMUNICATION TECHNIQUE
            • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
                • H04L 1/00: Arrangements for detecting or preventing errors in the information received
                    • H04L 1/004: Arrangements using forward error control
                        • H04L 1/0045: Arrangements at the receiver end
                            • H04L 1/0047: Decoding adapted to other signal detection operation
                                • H04L 1/005: Iterative decoding, including iteration between signal detection and decoding operation
                            • H04L 1/0055: MAP-decoding
                        • H04L 1/0056: Systems characterized by the type of code used
                            • H04L 1/0064: Concatenated codes

Definitions

  • A syndrome modifier 216 performs a search through the computed syndrome vector to detect predetermined syndrome patterns stored in syndrome pattern memory 218, which correspond to specific error patterns e_p, such as single-error syndrome patterns or double-error syndrome patterns, in the received hard-decision vector v. If a syndrome pattern p is discovered in the syndrome vector, syndrome modifier 216 simplifies the syndrome vector by subtracting the syndrome pattern p and records the found error pattern e_p in an estimated error vector e according to the search results. The modified syndrome vector S' is simplified in the sense that it has more zeros than the original syndrome vector S.
  • The modified syndrome vector S' is sent to a syndrome-based decoder 220 for use in constructing a simplified error trellis.
  • the search for the most likely error vector e that produces the syndrome vector s can be described by a search through an error trellis diagram for the path with the minimum accumulated weight.
  • Meir Ariel and Jakov Snyders provide an in-depth discussion and detailed examples of how to construct and decode an error trellis in an article entitled "Soft Syndrome Decoding of Binary Convolutional Codes," published in Vol. 43 of the IEEE Transactions on Communications, pages 288-297 (1995), the disclosure of which is incorporated herein by reference thereto.
  • An error trellis that describes the same coset of the convolutional code to which the symbol-by-symbol vector v belongs is used to search for the most likely transmitted data sequence.
  • the shape of the error trellis depends only on the value of the computed syndrome vector, and a modified syndrome vector produces a simplified error trellis.
  • an accelerated search through a simplified error trellis is possible due to its irregular structure.
  • the error trellis can be substantially simplified without affecting the optimum output of the decoding procedure.
  • the syndrome modifier 216 simplifies the syndrome vector to promote long strings of zeros, which allows the construction and processing of an error trellis by the syndrome-based decoder 220 to be more efficient.
  • an optimum state is detected by optimum state detector 222 from a search through the syndrome based error trellis.
  • An optimum state is one that lies on the least-weight error path through the error trellis.
  • a backward search is performed to detect the optimum state.
  • the identified optimum state is generated at output 207 for use by soft output decoder 204.
  • Soft output decoder 204 uses the optimum state as the starting state for backward recursion.
  • A flow chart describing operation of the soft output error trellis decoder 170 is depicted in FIG. 3.
  • the soft-output decoder employs a variable-length window for backward and forward recursion. State metrics are saved during the backward recursion, and soft bit metrics are output during the forward recursion.
  • a key component of the decoder is a searcher for an optimum state. Such a state is defined as having a high probability to lie along the optimum path through the error trellis. The optimum path is the most likely path through the error trellis.
  • The construction and properties of error trellises can be found in "Error Trellises for Convolutional Codes - Part 1: Construction", by M. Ariel and J. Snyders.
  • A window W, which is a sequence of w symbols, is received, wherein w is a positive integer preferably less than the full trellis block length.
  • W is processed to produce soft output metrics {λ(d_k)} at the output of the decoder 170.
  • w can be less than 100 sections, such as 36 sections, whereas the full trellis length will be thousands of sections long.
  • The soft output metrics are computed sequentially starting with λ(d_0).
  • m is set to 0 in step 300.
  • the variable m represents an index of the number of trellis locations (or stages) considered, and will be used to locate the optimum state that is the final stage of the variable length window.
  • The w soft input (a priori LLR) symbols are received in step 301 and the symbol detector 210 outputs a symbol-by-symbol hard-decision vector v_w corresponding thereto in step 302.
  • A syndrome S_w related to the received vector v_w is calculated by the window syndrome calculator in step 303.
  • preprocessing is performed in steps 304-307, the purpose of which is to construct a degenerated error trellis. In such a trellis, identifying optimum states is immediate.
  • the preprocessing is composed of the following steps:
  • 2) if the patterns match, and the accumulated metric of the corresponding error pattern is below some predetermined threshold T, in step 306, then the syndrome S_w is modified by S_w ← S_w + S_p in step 305;
  • the threshold T is a predetermined value representing a probability level.
  • the metric of an error pattern is proportional to the probability of that pattern.
  • the threshold T is set to assure that only highly probable events are considered, and will be dependent on the application for the trellis decoder.
  • the accumulated metric is the sum of the metrics of the bits composing the error pattern.
  • a window error trellis is then composed based upon the modified syndrome Sw in step 308.
  • a backward search is made through the error trellis to identify an optimum state, as indicated in step 309.
  • the optimum state is the first state encountered during the backward search through the error trellis that meets the following criteria: all surviving paths leading to all states at some time index k are traced back to this single state at time index k-s, where s is a positive integer.
  • If such an optimum state is identified after searching backwards through m stages of the error trellis, as determined in step 310 (m being incremented until it is equal to w or an optimum state is detected in steps 309, 310 and 311), the optimum state is provided to soft output decoder 204 (in steps 312 and 314), and s_(w-m) is a stage with reliable state metrics.
  • the soft output decoder performs backward recursion decoding over the w-m stages of the error trellis and the backward recursion state metrics generated are stored in memory (for example a memory which is not shown that is associated with the soft output decoder 204), as indicated in step 312.
  • The window is then shifted w-m stages to the right in step 315, such that it is located between stages s_(w-m) and s_(2w-m), if it is determined in step 316 that the entire trellis block has not yet been decoded.
  • If no optimum state was found, the window size is increased by a predetermined amount Δ in step 313.
  • The process described above for the w stages is repeated for the w+Δ stages, beginning from step 301 for window size w+Δ.
  • the variable m is reset to 0 in step 316. In this manner, the window length is initiated at a desired maximum length w but can be increased dynamically to look for an optimum state without having to decode and store state metrics for the entire block.
  • The optimum constraint can be relaxed, and the following method could be applied: partition the sequence of length w into two parts, w_1 and w_2, where w_1 > 0. w_1 and w_2 may be of equal or different lengths.
  • A sub-optimum state within w_2 is identified.
  • The sub-optimum state used for backward recursion can be selected by some sub-optimal criterion such as: the state at time k-s that is connected to the maximum number of surviving paths leading to states at some time index k, where s is a positive integer.
  • Alternatively, the sub-optimum state can be selected at random. Backward and forward recursions are performed through w_1 and the portion of w_2 to the left of the sub-optimum state.
  • the maximum constraint length might be 100 sections or more, for example.
  • the soft output decoder 204 uses the information generated by the trellis composer and optimum state detector 202 to generate the decoded sequence.
  • A soft input-soft output (SISO) decoder is used, such as a SISO for turbo codes that is based on the MAP algorithm described in a paper by L.R. Bahl, J. Cocke, F. Jelinek, and J. Raviv entitled "Optimum Decoding of Linear Codes for Minimizing Symbol Error Rate", IEEE Transactions on Information Theory, Vol. IT-20, March 1974, pp.
  • turbo coders are constructed with interleavers and constituent codes, which are usually systematic convolutional codes, but can alternately be block codes.
  • MAP algorithms not only minimize the probability of error for an information bit given the received sequence, they also provide the probability that the information bit is either a 1 or 0 given the received sequence.
  • the BCJR algorithm provides a soft output decision for each bit position (stage) wherein the influences of the soft inputs within the block are broken into contributions from the past (earlier soft inputs), the present soft input, and the future (later soft inputs).
  • This decoder algorithm requires a forward and a backward generalized Viterbi recursion on the trellis to arrive at an optimum soft output for each trellis location, or stage.
  • These a posteriori probabilities, or more commonly the log-likelihood ratio (LLR) of the probabilities are passed between SISO decoding steps in iterative turbo decoding.
  • the probability that the decoded bit is equal to 1 (or 0) in the trellis given the received sequence is composed of a product of terms due to the Markov property of the code.
  • the Markov property states that the past and the future are independent given the present.
  • The present, γ_t(n,m), is the probability of being in state m at time t and generating the symbol y_t when the previous state at time t-1 was n.
  • The present operates as a branch metric.
  • The past, α_t(m), is the probability of being in state m at time t with the received sequence {y_1, ..., y_t}, and the future, β_t(m), is the probability of generating the received sequence {y_(t+1), ..., y_N} from state m at time t.
  • The probability α_t(m) can be expressed as a function of α_(t-1)(n) and γ_t(n,m) and is called the forward recursion.
  • Variants of the maximum a posteriori (MAP) algorithm include log-MAP, max-log-MAP, constant-log-MAP, etc.
  • the MAP decoder minimizes the decoded bit error probability for each information bit based on all received bits.
  • typical prior art MAP decoders require a large memory to support full block decoding.
  • the soft output decoder 204 avoids memory problems associated with prior MAP type decoders by decoding windows of the error trellis rather than the entire block.
  • the windows each have a length of w-m, wherein m is dynamically determined based on the detection of an optimum state through the error trellis as described hereinabove.
  • Backward recursion is initiated from the optimum state detected for each window.
  • Forward recursion starts from the known final state of the previous window.
  • the decoder slides to the next adjacent window, the length of which is dynamically determined.
  • the syndrome modifier in conjunction with a syndrome-based decoder simplifies the construction of a degenerated error trellis, which is used to identify an optimum state within a window.
  • An optimum state may thus be identified without decoding the entire block length N of the trellis decoder.
  • the optimum state so detected can then be used as the starting state for backward recursion within a window without the need for a soft output decoder training period.
  • the soft outputs can be generated during the forward recursion.
  • a dynamically assigned soft-output decoder window is provided based upon the location of an optimum state within a sub-block of input data.
  • the optimum state closest to the end of the sub-block is selected as the starting point for backward recursion and the ending point for forward recursion to allow the largest possible soft-output decoder window size within the input sequence of length w.
  • the optimum state identified can be used with any decoder and it is envisioned that the optimum state detected could be provided to a soft output decoder which is not processing the error trellis generated by the syndrome detector. While specific components and functions of the soft-decision decoder for convolutional codes are described above, fewer or additional functions could be employed by one skilled in the art within the true spirit and scope of the present invention. The invention should be limited only by the appended claims.
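The backward search of steps 309-311 can be sketched as follows. This is a minimal sketch, assuming surviving paths are represented as back-pointer dictionaries; the function name, data layout, and trace-back depth s are illustrative assumptions, not the patent's implementation:

```python
def find_optimum_state(survivors, s=3):
    """Backward search for an 'optimum' state: the first state to which
    ALL surviving paths at some stage k trace back at stage k - s.

    survivors[k][state] = predecessor state at stage k-1 on the surviving
    path into `state` (an illustrative back-pointer representation).
    Returns (stage, state) or None if the survivors never converge.
    """
    last = len(survivors) - 1
    for k in range(last, s - 1, -1):           # backward through the trellis
        states = set(survivors[k].keys())
        for back in range(s):                   # trace each survivor back s stages
            states = {survivors[k - back][st] for st in states}
        if len(states) == 1:                    # all paths merged: optimum state
            return k - s, states.pop()
    return None
```

The returned state is the one to which every surviving path converges, which is why it can seed the backward recursion without a separate training period.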

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Error Detection And Correction (AREA)

Abstract

A soft output error trellis decoder (170) computes an approximated maximum a posteriori probability on each information bit in a demodulated received signal. A trellis composer (220) and optimum state detector (222) composes a window error trellis from a window syndrome and identifies an optimum state from the window error trellis. The decoder (170) performs decoding using optimum states identified by the optimum state detector (222). The decoder generates a soft output sequence to be used in iterative decoding of concatenated codes.

Description

SOFT-OUTPUT ERROR-TRELLIS DECODER FOR CONVOLUTIONAL CODES
Technical Field
The present invention relates generally to communication systems, and more particularly to a soft-output decoder for use in a receiver of a convolutional code communication system.
Background Art
Convolutional codes are often used in digital communication systems (e.g., the direct sequence code division multiple access (DS-CDMA), IS-95, IS-136, and Global System for Mobile Communications (GSM) time division multiple access (TDMA) standards) to protect transmitted information. At the transmitter, an outgoing code vector may be described using a trellis diagram whose complexity is determined by the constraint length of the encoder. Although computational complexity increases with increasing constraint length, the robustness of the coding also increases with constraint length.
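The encoder-side trellis described above can be made concrete with a minimal rate-1/2 feedforward convolutional encoder; the (7, 5) octal generators and constraint length K=3 are illustrative choices, not taken from this patent:

```python
def conv_encode(bits, gens=(0b111, 0b101)):
    """Rate-1/2 feedforward convolutional encoder (illustrative generators
    7,5 in octal, constraint length K=3). Returns the coded bit sequence."""
    K = max(g.bit_length() for g in gens)      # constraint length
    state = 0                                   # (K-1)-bit shift register
    out = []
    for b in bits:
        reg = (b << (K - 1)) | state            # newest bit plus register contents
        for g in gens:
            out.append(bin(reg & g).count("1") & 1)  # parity of the tapped bits
        state = reg >> 1                        # shift: drop the oldest bit
    return out
```

Each input bit advances one trellis stage over 2^(K-1) states, which is why decoding complexity grows exponentially with the constraint length.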
At the receiver, a soft-decision decoder, such as a Viterbi decoder, uses a trellis structure to perform an optimal search for the maximum likelihood transmitted code vector. More recently, turbo codes have been developed that outperform conventional coding techniques. Turbo codes are generally composed of two or more convolutional codes and turbo interleavers. Turbo decoding is iterative using a soft output decoder to decode the constituent convolutional codes.
The soft output decoder provides a reliability measure on each information bit which helps the soft output decoder decode the other convolutional codes. The soft output decoder computes an approximated probability for each information bit in a received signal. The soft output decoder is usually a MAP (maximum a posteriori) decoder, which uses both backward and forward decoding to determine the soft output. However, because of memory, processing, and numerical tradeoffs, MAP decoding is usually limited to a sub-optimal approximation. All of these variants require both forward and backward decoding over the block.
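The forward/backward structure of a MAP decoder can be sketched in probability-domain form. The trellis encoding (branch-metric array `gamma`, branch bit labels) and the function name are assumptions for illustration; practical decoders work in the log domain for numerical stability:

```python
import numpy as np

def map_soft_output(gamma, info_bit_of_branch):
    """Forward-backward (BCJR-style) soft output on a small trellis.
    gamma[t, n, m]: branch probability for the transition n -> m at stage t.
    info_bit_of_branch[n, m]: the information bit labelling branch n -> m
    (or -1 if the branch does not exist). Returns one LLR per stage."""
    N, M, _ = gamma.shape
    alpha = np.zeros((N + 1, M)); alpha[0, 0] = 1.0    # start in state 0
    beta = np.zeros((N + 1, M)); beta[N, :] = 1.0 / M  # unknown final state
    for t in range(N):                                  # forward recursion
        alpha[t + 1] = alpha[t] @ gamma[t]
        alpha[t + 1] /= alpha[t + 1].sum()              # normalise
    for t in range(N - 1, -1, -1):                      # backward recursion
        beta[t] = gamma[t] @ beta[t + 1]
        beta[t] /= beta[t].sum()
    llrs = []
    for t in range(N):
        p = [1e-300, 1e-300]                            # P(bit=0), P(bit=1)
        for n in range(M):
            for m in range(M):
                b = info_bit_of_branch[n, m]
                if b >= 0:
                    p[b] += alpha[t, n] * gamma[t, n, m] * beta[t + 1, m]
        llrs.append(float(np.log(p[1] / p[0])))
    return llrs
```

The sign of each LLR gives the hard decision and its magnitude the reliability, which is the measure passed between constituent decoders in turbo decoding.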
For future standards, such as the 3GPP (third generation partnership project for wireless systems), an 8-state turbo code with a block length of N=5120 will need 40960 words of intermediate storage, which may be unacceptable for practical implementation. Future systems employing larger frames and a greater number of states will require even more memory. By comparison, a Viterbi decoder that does not produce soft outputs for an N=5120, 8-state trellis requires less than 100 words of intermediate storage. There is a need for a soft output decoder that reduces overall memory and processing requirements for decoding convolutional codes without the limitations imposed by prior art turbo and MAP decoders.
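The storage figure above follows from holding one word of state metric per state per trellis stage over the full block; a quick arithmetic check:

```python
# One intermediate-storage word per state per stage for full-block
# MAP decoding (the width of each word is implementation dependent).
states, block_len = 8, 5120
map_words = states * block_len
print(map_words)  # 40960, matching the figure cited above
```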
Brief Description of Drawings
FIG. 1 shows a block diagram of a communication system with a receiver having a soft- output decoder.
FIG. 2 shows a block diagram of the trellis composer and optimum state detector shown in FIG. 1.
FIG. 3 shows a more detailed flow chart of the operation of the soft output decoder including the syndrome calculator, syndrome modifier, and optimum state detector shown in FIG. 2.
Disclosure of the Invention
A soft output error trellis decoder 170 (FIG. 1) computes an approximated maximum a posteriori probability on each information bit in a demodulated received signal. A trellis composer and optimum state detector composes a window error trellis from a window syndrome and identifies an optimum state from the window error trellis. A soft output decoder performs decoding using optimum states identified by the optimum state detector. The soft output decoder generates a soft output sequence to be used in iterative decoding of concatenated codes.
Efficient soft-output decoding of convolutional codes uses a soft-output decoder based on a data-dependent error-trellis. The soft-output decoder includes two error-trellis decoders for decoding the sequence of signals received over the channel. Forward and backward recursions are conducted through a window of length (w-m) trellis stages, where w is fixed and m depends on the position of an optimum state in the window. Such a state lies, with high probability, along the optimum path through the error trellis. A syndrome based error trellis is constructed and used to identify an optimum state. The identified optimum state serves as the starting point for backward recursion and as the terminal state for forward recursion for a turbo decoder decoding windows of the syndrome based error trellis. The obtained soft-output metric may then be employed for iterative decoding of concatenated codes, also referred to as turbo codes.
The syndrome based error trellis is thus employed to identify optimum states without the need for either first processing the entire error trellis block (as required for a conventional trellis that generates soft outputs only after performing a backward recursion over the full length of a trellis block) or use of a trellis decoder training period prior to decoding each window in the trellis (as required for sliding window techniques that decode windows in a trellis block only after decoding in a learning period beyond the window). The performance of the improved decoder is substantially the same as that of a decoder using full trellis block decoding without the memory requirements of a full trellis block decoder or the computational complexity of prior art sliding window trellis decoders. This improved soft output MAP decoder provides optimal approximation for information bits in a received signal.
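The windowed schedule just described can be sketched as a driver loop. `find_optimum` and `decode_window` are hypothetical hooks standing in for the optimum-state search and the backward/forward recursions; their return conventions are illustrative assumptions:

```python
def windowed_decode(block_len, w, delta, find_optimum, decode_window):
    """Sliding-window schedule from the disclosure: decode a length-N block
    in windows of up to w stages. Within each window, search backward for
    an optimum state, decode the first (w - m) stages, then slide right.
    If no optimum state is found, grow the window by delta and retry.
    find_optimum(start, length) -> m (stages searched back) or None;
    decode_window(start, length) stands in for the recursions.
    Returns the list of (start, length) windows actually decoded."""
    windows, start, cur_w = [], 0, w
    while start < block_len:
        length = min(cur_w, block_len - start)
        m = find_optimum(start, length)
        if m is None and start + length < block_len:
            cur_w += delta                  # no optimum state: enlarge window
            continue
        m = m or 0                          # final window: decode to block end
        decode_window(start, length - m)
        windows.append((start, length - m))
        start += length - m                 # slide right by (w - m) stages
        cur_w = w                           # reset the window length
    return windows
```

With w=4 and a search that always locates an optimum state one stage from the window's end, the block is consumed in 3-stage slices; when the search fails, the window grows by delta before the search repeats.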
FIG. 1 shows a block diagram of a communication system 100 with a receiver 130 having a soft output error trellis decoder 170. Receiver 130 is shown as part of a cellular radiotelephone subscriber unit 101; however, the receiver may alternately be part of a digital telephone, a facsimile machine, modulator-demodulator (MODEM), two-way radio, a base station, or any other communication device that receives convolutionally encoded signals. In the subscriber unit 101, a microphone 105 picks up audio signals that are modulated by a transmitter 110 and broadcast by an antenna 120 after passing through a duplexer 125. The antenna 120 also receives radio frequency (RF) signals from a complementary transmitter in a transceiver 199, which may for example be a cellular base station. An RF front end 140 steps down the RF signal received through duplexer 125 to an analog baseband signal. An analog-to-digital (A/D) converter 146 converts the analog baseband signal to a digital signal. Those skilled in the art will recognize that the A/D converter 146 may not be required where the baseband signal is digital. Digital demodulator 150 processes the digital signal to produce a demodulated received signal vector r. The demodulated received signal vector r is connected to the soft output error trellis decoder 170. The soft output error trellis decoder generates an error trellis, identifies an optimum state within a relatively small number of sections of the full trellis block, and generates soft output data from the error trellis using the optimum state detected. The soft output decoder could be incorporated in the framework of iterative decoding of concatenated codes. At the output of the decoder 170, a digital-to-analog (D/A) converter 180 converts the decoded signal to the analog domain.
Those skilled in the art will recognize that the decoder can be an approximated a posteriori decoder, and that the D/A converter 180 can be omitted where the baseband signal, and thus the output of the receiver 130, is digital. In the illustrated embodiment, an audio amplifier 185 uses operational amplifiers to increase the gain of the recovered signal for reproduction through audio speaker 190. The decoder 170 may be implemented using any suitable circuitry, such as one or more digital signal processors (DSP), microprocessors, microcontrollers, or the like, and associated circuitry. It is envisioned that the decoder may include a trellis composer, an optimum state detector, and a soft output decoder embodied in a single DSP. The soft output decoder may decode the trellis constructed by the trellis composer to take advantage of the simplified error trellis produced, or the soft output decoder may decode received data independently of the error trellis using the optimum state from the optimum state detector, in which case the composed error trellis will only be used to identify the optimum state.
FIG. 2 shows a block diagram of the soft output iterative decoder 170 of FIG. 1 in greater detail. A demodulated received signal vector r from demodulator 150 enters the symbol-by-symbol detector 210 that produces a hard-decision vector v. The symbol-by-symbol detector merely examines the incoming signal and converts it to the closest valid (e.g., binary) symbol without regard for the value of the surrounding symbols. The output hard-decision vector v from symbol-by-symbol detector 210 is not necessarily a valid code vector. A syndrome calculator 214 multiplies the hard-decision vector v by scalar parity check matrix H to produce a syndrome vector S. If syndrome vector S = 0, then hard-decision vector v is the maximum likelihood estimated transmitted code vector ĉ, it is guaranteed that hard-decision vector v is identical to the output of a soft-decision decoder such as a Viterbi decoder, and no additional processing will be required.
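The symbol-by-symbol detection and syndrome computation described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes a BPSK mapping and uses a small (7,4) Hamming parity-check matrix in place of the scalar parity check matrix of a convolutional code.

```python
import numpy as np

def hard_decision(r):
    """Symbol-by-symbol detection: map each soft sample to the nearest
    binary symbol, ignoring neighboring symbols (BPSK: +1 -> 0, -1 -> 1)."""
    return (np.asarray(r) < 0).astype(np.uint8)

def syndrome(H, v):
    """Syndrome S = H * v^T over GF(2)."""
    return (np.asarray(H) @ v) % 2

# Hypothetical (7,4) Hamming parity-check matrix, for illustration only.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

r = [0.9, 1.1, -0.8, 1.0, 0.7, -1.2, 0.95]  # noisy BPSK samples
v = hard_decision(r)
S = syndrome(H, v)
# S is nonzero here, so v is not a codeword and further processing is needed;
# S == 0 would mean v is already the maximum likelihood codeword.
```

A zero syndrome short-circuits all later stages of the decoder, which is why the syndrome check comes first.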
If syndrome vector S ≠ 0, which is the typical case, the syndrome modifier 216 performs a search through the computed syndrome vector to detect predetermined syndrome patterns stored in syndrome pattern memory 218, which correspond to specific error patterns ep, such as single-error syndrome patterns or double-error syndrome patterns, in the received hard-decision vector v. If a syndrome pattern p is discovered in the syndrome vector, syndrome modifier 216 simplifies the syndrome vector by subtracting the syndrome pattern p and records the found error pattern ep in an estimated error vector e according to the search results. The modified syndrome vector S' is simplified in the sense that it has more zeros than the original syndrome vector S. If S' = 0, then the most likely error vector is the estimated error vector e, and the maximum likelihood estimated transmitted code vector ĉ = v - e. If modified syndrome vector S' = H(v - e) ≠ 0 after subtracting a found syndrome pattern p, the syndrome modifier 216 continues to search for syndrome patterns until all of the syndrome patterns in the syndrome pattern memory 218 have been compared with the modified syndrome vector S' or until the modified syndrome vector S' = 0.
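The syndrome-modifier idea, matching stored syndrome patterns against the computed syndrome and subtracting them over GF(2), can be sketched as follows. This simplification uses only single-error patterns (the columns of a hypothetical Hamming parity-check matrix) as the stored patterns; the patent's modifier also handles double-error and other stored patterns.

```python
import numpy as np

# Hypothetical (7,4) Hamming parity-check matrix, for illustration only; the
# patent's decoder uses the scalar parity check matrix of a convolutional code.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def modify_syndrome(S, H):
    """Match stored syndrome patterns against S and subtract (XOR) them.

    Here the stored patterns are just the columns of H, i.e. the syndromes
    of single bit errors; each match zeroes part of S and records the
    implied error bit in the estimated error vector e."""
    S = np.array(S, dtype=np.uint8)
    e = np.zeros(H.shape[1], dtype=np.uint8)
    for j in range(H.shape[1]):
        if S.any() and np.array_equal(S, H[:, j]):
            S = (S + H[:, j]) % 2   # subtract the pattern over GF(2)
            e[j] = 1                # record the found error pattern
    return S, e

S_mod, e = modify_syndrome([1, 0, 1], H)
# S_mod is all-zero, so the most likely codeword is v XOR e.
```

When the modified syndrome does not reach zero, the residue is passed on to the trellis construction stage, exactly as the text describes.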
After all of the syndrome patterns in the syndrome pattern memory 218 have been compared to the modified syndrome vector S', the modified syndrome vector S' is sent to a syndrome-based decoder 220 for use in constructing a simplified error trellis. The search for the most likely error vector e that produces the syndrome vector S can be described by a search through an error trellis diagram for the path with the minimum accumulated weight. Meir Ariel and Jakov Snyders provide an in-depth discussion and detailed examples of how to construct and decode an error trellis in an article entitled "Soft Syndrome Decoding of Binary Convolutional Codes," published in Vol. 43 of the IEEE Transactions on Communications, pages 288-297 (1995), the disclosure of which is incorporated herein by reference thereto. An error trellis that describes the same coset of the convolutional code to which the hard-decision vector v belongs is used to search for the most likely transmitted data sequence. The shape of the error trellis depends only on the value of the computed syndrome vector, and a modified syndrome vector produces a simplified error trellis. Once the error trellis is constructed, an accelerated search through a simplified error trellis is possible due to its irregular structure. Under low bit-error rate conditions, the error trellis can be substantially simplified without affecting the optimum output of the decoding procedure. In other words, the syndrome modifier 216 simplifies the syndrome vector to promote long strings of zeros, which allows the construction and processing of an error trellis by the syndrome-based decoder 220 to be more efficient.
Once the error trellis is constructed, an optimum state is detected by optimum state detector 222 from a search through the syndrome based error trellis. An optimum state is one that lies on the lowest-weight error path through the error trellis. A backward search is performed to detect the optimum state. Once the optimum state is identified, the identified optimum state is generated at output 207 for use by soft output decoder 204. Soft output decoder 204 uses the optimum state as the starting state for backward recursion.
A flow chart describing operation of the soft output error trellis decoder 170 is depicted in FIG. 3. As described briefly above, the soft-output decoder employs a variable-length window for backward and forward recursion. State metrics are saved during the backward recursion, and soft bit metrics are output during the forward recursion. A key component of the decoder is a searcher for an optimum state. Such a state is defined as having a high probability to lie along the optimum path through the error trellis. The optimum path is the most likely path through the error trellis. The construction and properties of error trellises can be found in "Error Trellises for Convolutional Codes - Part I: Construction", by M. Ariel and J. Snyders, IEEE Transactions on Communications, vol. COM-46, No. 12, pp. 1592-1612, Dec. 1998, and an in-depth discussion on optimum states and their identification can be found in "Error Trellises for Convolutional Codes - Part II: Decoding Methods", IEEE Transactions on Communications, vol. COM-47, No. 7, pp. 1015-1024, July 1999, the disclosures of both of which articles are incorporated herein by reference thereto. The backward recursion begins once the rightmost optimum state through the window is identified, and no learning period is required to increase the reliability of the state metrics.
More particularly, a window W, which is a sequence of w symbols wherein w is a positive integer preferably less than the full trellis block length, is processed to produce soft output metrics {Λ(dk)} at the output of the decoder 170. For example, it is envisioned that w can be less than 100 sections, such as 36 sections, whereas the full trellis length will be thousands of sections long. The soft output metrics are computed sequentially starting with Λ(d0). Initially, m is set to 0 in step 300. The variable m represents an index of the number of trellis locations (or stages) considered, and will be used to locate the optimum state that is the final stage of the variable length window. The w soft input (a priori LLR) symbols are received in step 301 and the symbol detector 210 outputs a symbol-by-symbol hard-decision vector vw corresponding thereto in step 302. A syndrome Sw related to the received vector vw is calculated by a window syndrome calculator in step 303. The syndrome is calculated as Sw = Hw·vw^T, where Hw is the scalar parity check matrix associated with the window W and ^T denotes transposition. Based on the syndrome Sw, preprocessing is performed in steps 304-307, the purpose of which is to construct a degenerated error trellis. In such a trellis, identifying optimum states is immediate. The preprocessing is composed of the following steps:
1) the syndrome Sw is searched backwards to match a syndrome pattern Sp in step 304 (this operation is a pattern matching process);
2) if the patterns match, and the accumulated metric λp of the corresponding error pattern is below some predetermined threshold T, as determined in step 306, then the syndrome Sw is modified by Sw ← Sw + Sp in step 305; and
3) the syndrome search is repeated until all of the syndrome patterns have been considered, as determined in step 307.
The threshold T is a predetermined value representing a probability level. The metric of an error pattern is proportional to the probability of that pattern. The threshold T is set to assure that only highly probable events are considered, and will be dependent on the application for the trellis decoder. The accumulated metric is the sum of the metrics of the bits composing the error pattern. Additionally, a detailed description of syndrome based decoding can be found in United States patents 5,936,972, entitled "Syndrome Based Channel Quality or Message Structure Determiner", issued to Meidan et al. on August 10, 1999, and 6,009,552, entitled "Soft-Decision Syndrome Based Decoder for Convolutional Codes", issued to Ariel et al. on December 28, 1999, the disclosures of both of which are incorporated herein by reference thereto.
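Steps 1-3 above, together with the threshold test on the accumulated metric, might look like the following sketch. The pattern dictionary, bit metrics, and sliding-window matching are illustrative assumptions rather than the patent's exact data layout; the actual stored patterns depend on the code.

```python
def preprocess(S, patterns, bit_metrics, threshold):
    """Sketch of preprocessing steps 1-3 (steps 304-307 of FIG. 3).

    `patterns` maps a syndrome pattern (a tuple of bits) to the positions of
    the bits in the corresponding error pattern; `bit_metrics[b]` is the
    metric of bit b. Both names are hypothetical."""
    S = list(S)
    found = []
    for sp, err_bits in patterns.items():
        # Accumulated metric: sum of the metrics of the error-pattern bits.
        lam = sum(bit_metrics[b] for b in err_bits)
        if lam >= threshold:
            continue                    # only highly probable events considered
        n, m = len(S), len(sp)
        for i in range(n - m, -1, -1):  # search the syndrome backwards
            if tuple(S[i:i + m]) == sp:
                for k in range(m):      # S <- S + Sp over GF(2)
                    S[i + k] ^= sp[k]
                found.append((i, err_bits))
                break
    return S, found
```

Each accepted pattern lengthens the runs of zeros in the syndrome, which is what degenerates the error trellis and makes the subsequent optimum-state search cheap.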
A window error trellis is then composed based upon the modified syndrome Sw in step 308. Starting from the last stage of the obtained error trellis, a backward search is made through the error trellis to identify an optimum state, as indicated in step 309. The optimum state is the first state encountered during the backward search through the error trellis that meets the following criterion: all surviving paths leading to all states at some time index k are traced back to this single state at time index k-s, where s is a positive integer. An in-depth discussion of optimum states and their identification can be found in "Error Trellises for Convolutional Codes - Part II: Decoding Methods" incorporated hereinabove by reference. If such an optimum state is identified after searching backwards through m stages of the error trellis, as determined in step 310 (m being incremented until it is equal to w or an optimum state is detected in steps 309, 310 and 311), the optimum state is provided to soft output decoder 204 (in steps 312 and 314), and stage sw-m is a stage with reliable state metrics. The soft output decoder performs backward recursion decoding over the w-m stages of the error trellis and the backward recursion state metrics generated are stored in memory (for example, a memory, not shown, associated with the soft output decoder 204), as indicated in step 312. Next, forward recursion decoding is performed over the window comprising the w-m stages of the error trellis and forward recursion soft metrics Λ(dk), for k = 0, 1, 2, ..., w-m-1, are output as indicated in step 314. The window is then shifted w-m stages to the right in step 315 such that it is located between stages sw-m and s2w-m, if it is determined in step 316 that the entire trellis block has not yet been decoded.
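The optimum-state criterion quoted above (all survivors at time k trace back to one state at time k-s) can be sketched against a conventional Viterbi-style survivor memory. The `survivors` structure and its indexing are assumptions made for illustration, not the patent's data layout: `survivors[t][state]` gives the predecessor at stage t-1 of `state` at stage t.

```python
def find_optimum_state(survivors, s):
    """Backward search for an optimum state (step 309 of FIG. 3).

    Starting at the last stage and moving left, trace every survivor path
    back s stages; if all of them merge into a single state, that state is
    optimum. Returns (stage, state), or None if no optimum state exists
    within the window."""
    last = len(survivors) - 1
    for k in range(last, -1, -1):
        if k - s < 0:
            break
        ancestors = set()
        for state in survivors[k]:
            st = state
            for t in range(k, k - s, -1):   # trace back s stages
                st = survivors[t][st]
            ancestors.add(st)
        if len(ancestors) == 1:             # all paths merge: optimum state
            return k - s, ancestors.pop()
    return None                             # triggers window growth (step 313)
```

Returning None corresponds to the branch of the flow chart that enlarges the window by Δ and repeats the search.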
If, however, no optimum state is identified within the w stages, as determined in steps 310 and 311, then the window size is increased by a predetermined amount Δ in step 313. The process described above for the w stages is repeated for the w+Δ stages, beginning from step 301 with window size w+Δ. The variable m is reset to 0 in step 316. In this manner, the window length is initiated at a desired maximum length w but can be increased dynamically to look for an optimum state without having to decode and store state metrics for the entire block.
Alternatively, if a maximum sequence length is reached, such that the sequence length w cannot be increased (e.g., because of memory size limitations), then the optimality constraint can be relaxed and the following method applied: partition the sequence of length w into two parts, w1 and w2, where w1 > 0. w1 and w2 may be of equal or different lengths. A sub-optimum state within w2 is identified. The sub-optimum state used for backward recursion can be selected by some sub-optimal criterion, such as: the state at time k-s that is connected to the maximum number of surviving paths leading to states at some time index k, where s is a positive integer. Alternatively, the sub-optimum state can be selected at random. Backward and forward recursions are performed through w1 and the portion of w2 to the left of the sub-optimum state. The maximum sequence length might be 100 sections or more, for example.
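The sub-optimal fallback criterion described above, choosing the state at time k-s reached by the largest number of surviving paths, might be sketched as follows. As before, the `survivors` memory (`survivors[t][state]` is the predecessor of `state` at stage t) is an illustrative assumption.

```python
from collections import Counter

def sub_optimum_state(survivors, s):
    """Sub-optimal fallback: pick the state at time k-s that the most
    surviving paths from the final stage k trace back to."""
    k = len(survivors) - 1
    counts = Counter()
    for state in survivors[k]:
        st = state
        for t in range(k, k - s, -1):       # trace each survivor back s stages
            st = survivors[t][st]
        counts[st] += 1
    state, _ = counts.most_common(1)[0]     # most frequently reached ancestor
    return k - s, state
```

Unlike the strict criterion, this always returns a state, which is what makes it usable when the window cannot grow any further.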
The soft output decoder 204 uses the information generated by the trellis composer and optimum state detector 202 to generate the decoded sequence. Those skilled in the art will recognize that any suitable decoder could be used for decoder 204, but soft output decoders are preferred. Most preferably, a soft input-soft output (SISO) decoder is used, such as a SISO for turbo codes that is based on the MAP algorithm described in a paper by L.R. Bahl, J. Cocke, F. Jelinek, and J. Raviv entitled "Optimal Decoding of Linear Codes for Minimizing Symbol Error Rate", IEEE Transactions on Information Theory, Vol. IT-20, March 1974, pp. 284-287 (the "BCJR algorithm" or "BCJR method"). As will also be recognized by those skilled in the art, turbo coders are constructed with interleavers and constituent codes, which are usually systematic convolutional codes, but can alternately be block codes. MAP algorithms not only minimize the probability of error for an information bit given the received sequence, they also provide the probability that the information bit is either a 1 or 0 given the received sequence.
The BCJR algorithm provides a soft output decision for each bit position (stage) wherein the influences of the soft inputs within the block are broken into contributions from the past (earlier soft inputs), the present soft input, and the future (later soft inputs). This decoder algorithm requires a forward and a backward generalized Viterbi recursion on the trellis to arrive at an optimum soft output for each trellis location, or stage. These a posteriori probabilities, or more commonly the log-likelihood ratio (LLR) of the probabilities, are passed between SISO decoding steps in iterative turbo decoding. The LLR for information bit ut is

Λ(ut) = log [ Σ_{(n,m)∈B1} αt-1(n) γt(n,m) βt(m) ] − log [ Σ_{(n,m)∈B0} αt-1(n) γt(n,m) βt(m) ]    (1)
for all bits in the decoded sequence (t = 1 to N). In equation (1), the probability that the decoded bit is equal to 1 (or 0) in the trellis given the received sequence is composed of a product of terms due to the Markov property of the code. The Markov property states that the past and the future are independent given the present. The present, γt(n,m), is the probability of being in state m at time t and generating the symbol yt when the previous state at time t-1 was n. The present operates as a branch metric. The past, αt(m), is the probability of being in state m at time t with the received sequence {y1, ..., yt}, and the future, βt(m), is the probability of generating the received sequence {yt+1, ..., yN} from state m at time t. The probability αt(m) can be expressed as a function of αt-1(n) and γt(n,m) and is called the forward recursion
αt(m) = Σ_{n=0}^{M-1} αt-1(n) γt(n,m),    m = 0, ..., M-1,    (2)
where M is the number of states. The reverse or backward recursion for computing the probabilities βt(n) is

βt-1(n) = Σ_{m=0}^{M-1} βt(m) γt(n,m),    n = 0, ..., M-1.    (3)

The overall a posteriori probabilities in equation (1) are computed by summing over the branches in the trellis B1 (B0) that correspond to ut = 1 (or 0).
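Equations (2) and (3) are the standard BCJR forward and backward recursions, and equation (1) combines them into a bit LLR. A compact sketch, assuming for illustration that the trellis starts and terminates in state 0:

```python
import numpy as np

def forward_backward(gamma):
    """Forward and backward recursions of equations (2) and (3).

    gamma[t-1] is an M x M matrix holding the branch metric gamma_t(n, m):
    the probability of moving from state n at time t-1 to state m at time t.
    The trellis is assumed (for this sketch) to start and end in state 0."""
    N = len(gamma)                      # number of trellis sections
    M = gamma[0].shape[0]               # number of states
    alpha = np.zeros((N + 1, M))
    beta = np.zeros((N + 1, M))
    alpha[0, 0] = 1.0                   # known start state
    beta[N, 0] = 1.0                    # known end state
    for t in range(1, N + 1):           # eq. (2): alpha_t(m) = sum_n alpha_{t-1}(n) gamma_t(n, m)
        alpha[t] = alpha[t - 1] @ gamma[t - 1]
    for t in range(N, 0, -1):           # eq. (3): beta_{t-1}(n) = sum_m beta_t(m) gamma_t(n, m)
        beta[t - 1] = gamma[t - 1] @ beta[t]
    return alpha, beta

def llr(alpha, beta, gamma, B1, B0, t):
    """Equation (1): LLR of bit u_t, where B1 (B0) lists the (n, m) state
    transitions of section t on which u_t = 1 (u_t = 0)."""
    num = sum(alpha[t - 1][n] * gamma[t - 1][n, m] * beta[t][m] for n, m in B1)
    den = sum(alpha[t - 1][n] * gamma[t - 1][n, m] * beta[t][m] for n, m in B0)
    return np.log(num / den)
```

A useful sanity check on such an implementation is that Σm αt(m)βt(m) is the same for every t, since each value is the total probability of the received sequence.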
Maximum a posteriori (MAP) type decoders (MAP, log-MAP, max-log-MAP, constant- log-MAP, etc.) utilize forward and backward generalized Viterbi recursions on the trellis in order to provide soft outputs, as is known in the art. The MAP decoder minimizes the decoded bit error probability for each information bit based on all received bits. However, typical prior art MAP decoders require a large memory to support full block decoding.
In summary, the soft output decoder 204 avoids memory problems associated with prior MAP type decoders by decoding windows of the error trellis rather than the entire block. The windows each have a length of w-m, wherein m is dynamically determined based on the detection of an optimum state through the error trellis as described hereinabove. Backward recursion is initiated from the optimum state detected for each window. Forward recursion starts from the known final state of the previous window. When a window is decoded, the decoder slides to the next adjacent window, the length of which is dynamically determined. The syndrome modifier in conjunction with a syndrome-based decoder simplifies the construction of a degenerated error trellis, which is used to identify an optimum state within a window. An optimum state may thus be identified without decoding the entire block length N of the trellis decoder. The optimum state so detected can then be used as the starting state for backward recursion within a window without the need for a soft output decoder training period. The soft outputs can be generated during the forward recursion. Additionally, a dynamically assigned soft-output decoder window is provided based upon the location of an optimum state within a sub-block of input data. The optimum state closest to the end of the sub-block is selected as the starting point for backward recursion and the ending point for forward recursion to allow the largest possible soft-output decoder window size within the input sequence of length w. Those skilled in the art will recognize that the optimum state identified can be used with any decoder and it is envisioned that the optimum state detected could be provided to a soft output decoder which is not processing the error trellis generated by the syndrome detector. 
While specific components and functions of the soft-decision decoder for convolutional codes are described above, fewer or additional functions could be employed by one skilled in the art within the true spirit and scope of the present invention. The invention should be limited only by the appended claims.
We claim:

1. A soft output decoder for decoding a demodulated received signal characterized by: an optimum state detector to identify an optimum state for a window of an error trellis, the optimum state detector including a syndrome calculator and outputting the optimum state; and a soft output decoder coupled to the optimum state detector, the soft output decoder generating soft outputs using a maximum a posteriori (MAP) type decoder, the MAP type decoder to decode the window using backward recursion starting from the optimum state received from the optimum state detector.
2. A soft output decoder according to claim 1 wherein the optimum state detector composes an error trellis using the syndrome calculator.
3. A soft output decoder according to claim 2, further including a syndrome modifier coupled to the syndrome calculator, the syndrome calculator to generate a syndrome vector and the syndrome modifier to locate and remove syndrome patterns from the syndrome vector to create a modified syndrome vector.
4. A soft output decoder according to claim 3, further including a syndrome pattern memory coupled to the syndrome modifier, the syndrome pattern memory storing syndrome patterns used for computing an estimated error vector, the error vector used to generate a modified syndrome.
5. A soft output decoder according to claim 1 further characterized by an error trellis composer to compose an error trellis from the received signal.
6. A soft output decoder according to claim 5 wherein the optimum state detector is coupled to the error trellis composer to detect an optimum state in a composed error trellis.
7. A soft output decoder according to claim 1 wherein the soft output decoder includes a backward recursion decoder, a forward recursion decoder, and wherein soft outputs are computed as a function of backward recursion metrics generated starting from the optimum state identified by the optimum state detector.
8. A radiotelephone having a soft output decoder characterized by: a hard-decision detector to compute a hard-decision vector from a demodulated received signal vector; a syndrome calculator, coupled to the hard-decision detector, to compute a syndrome vector from the hard-decision vector and a parity check matrix; a syndrome pattern memory to store at least one syndrome pattern; a syndrome modifier, coupled to the syndrome calculator and the syndrome pattern memory, to locate and remove at least one syndrome pattern from the syndrome vector and create a modified syndrome vector; an error trellis composer coupled to the syndrome modifier to compose an error trellis; an optimum state detector coupled to the syndrome modifier to identify an optimum state in an error trellis generated from the modified syndrome vector; and a decoder coupled to the optimum state detector, the decoder dynamically selecting a window size as a function of the optimum state and performing backward recursions from the optimum state identified by the optimum state detector.
9. A radiotelephone according to claim 8 wherein the syndrome modifier computes an estimated error trellis as a function of the at least one syndrome pattern.
10. A radiotelephone according to claim 9 wherein the soft output decoder includes a backward recursion decoder, a forward recursion decoder, and wherein the output decoder computes soft outputs as a function of backward recursion metrics generated starting from the optimum state identified by the optimum state detector.
11. A radiotelephone according to claim 8 wherein the syndrome-based decoder computes a remaining error vector from the modified syndrome vector.
12. A method of soft output decoding a block of demodulated received signal by sequentially decoding data sequences within the block, the method characterized by the steps of: composing an error trellis for a first sequence from at least one identified syndrome in the first sequence; identifying an optimum state in the error trellis; and decoding a window of the first sequence using backward and forward recursion, at least one of backward and forward recursion starting from the optimum state identified in the error trellis.
13. The method of claim 12 further including the step of identifying as the final stage of the window the trellis stage having the identified optimum state.
14. The method of claim 13 wherein the step of composing includes searching for at least one syndrome pattern.
15. The method of claim 14 wherein the step of composing includes analyzing each syndrome pattern stored in a syndrome memory.
16. The method of claim 14 further including the step of increasing the size of the first sequence if the optimum state is not detected.
17. The method according to claim 12 wherein the step of decoding comprises the steps of: performing backward recursion processing from the optimum state to the initial stage of the first sequence to generate backward recursion state metrics; performing forward recursion processing from the first stage of the first sequence to the stage associated with the optimum state to generate forward recursion state metrics; and generating soft outputs as a function of the forward recursion state metrics and the backward recursion state metrics.
18. The method according to claim 17 wherein the step of generating soft outputs as a function of the forward recursion state metrics and the backward recursion state metrics generates the soft outputs further as a function of branch metrics.
19. The method according to claim 12 wherein the step of composing an error trellis from at least one identified syndrome includes composing an error trellis over the first sequence, and further including incrementally increasing the size of the first sequence, if an optimum state is not detected, until a maximum sequence length is reached.
20. The method according to claim 19 wherein the step of increasing the first sequence is repeated until the maximum sequence length is reached, and if an optimal state is not identified in the maximum sequence length, then further including the steps of dividing the maximum sequence length into at least two stages and selecting a sub-optimum state in a later stage of the at least two stages.
21. The method according to claim 12 further including the steps of sliding to a second sequence and repeating the steps of composing, identifying and decoding.
22. The method according to claim 21 further including the step of iteratively sliding to a next sequence and repeating the steps of composing, identifying and decoding until the entire block is decoded.
23. The method according to claim 12 wherein the step of composing an error trellis from at least one identified syndrome composes an error trellis over a first sequence and further including the step of selecting the optimum state closest to the end of the first sequence.
PCT/US2001/026785 2000-09-06 2001-08-28 Soft-output error-trellis decoder for convolutional codes WO2002021784A1 (en)
