WO2005055433A1 - Decoding Device and Decoding Method - Google Patents
Decoding Device and Decoding Method
- Publication number
- WO2005055433A1 (PCT/JP2004/017284, JP2004017284W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- probability
- backward
- decoding
- window
- backward probability
- Prior art date
Links
Classifications
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M13/00—Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
- H03M13/37—Decoding methods or techniques, not specific to the particular type of coding provided for in groups H03M13/03 - H03M13/35
- H03M13/39—Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes
- H03M13/3905—Maximum a posteriori probability [MAP] decoding or approximations thereof based on trellis or lattice decoding, e.g. forward-backward algorithm, log-MAP decoding, max-log-MAP decoding
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M13/00—Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M13/00—Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
- H03M13/29—Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes combining two or more codes or code structures, e.g. product codes, generalised product codes, concatenated codes, inner and outer codes
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M13/00—Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
- H03M13/37—Decoding methods or techniques, not specific to the particular type of coding provided for in groups H03M13/03 - H03M13/35
- H03M13/39—Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes
- H03M13/3972—Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes using sliding window techniques or parallel windows
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M13/00—Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
- H03M13/37—Decoding methods or techniques, not specific to the particular type of coding provided for in groups H03M13/03 - H03M13/35
- H03M13/39—Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes
- H03M13/41—Sequence estimation, i.e. using statistical methods for the reconstruction of the original codes using the Viterbi algorithm or Viterbi processors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L1/00—Arrangements for detecting or preventing errors in the information received
Definitions
- the present invention relates to a decoding device and a decoding method, and is suitably applied to, for example, a decoding device and a decoding method that perform turbo decoding.
- VSF-OFCDM (Variable Spreading Factor-Orthogonal Frequency and Code Division Multiplexing)
- the turbo coding scheme is characterized in that convolutional coding and interleaving are applied in combination to the transmission data, and decoding is performed iteratively. It is known that this iterative decoding process exhibits excellent error correction capability not only for random errors but also for burst errors.
- in turbo decoding, likelihood information L(d_k) is calculated, and the calculated likelihood information L(d_k) is compared with the threshold "0".
- the likelihood information L(d_k) can be represented by the following equations in terms of the forward probability α_k and the backward probability β_k.
- y_k represents the input to the decoder at time point k
- R(·|·) represents the transition probability of the discrete Gaussian memoryless channel.
- q corresponds to 0 or 1
- π is the state transition probability of the trellis.
- in turbo decoding, the information bits d_k are decoded by performing such operations.
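The threshold comparison described above can be sketched in Python. Here the posterior bit probabilities stand in for the sums of α, β and γ terms that the MAP recursions would produce; the probability values are illustrative, not taken from the patent:

```python
import math

def llr(p_one, p_zero):
    """Log-likelihood ratio L(d_k) = log(P(d_k=1|y) / P(d_k=0|y))."""
    return math.log(p_one / p_zero)

def hard_decision(p_one, p_zero):
    """Decide d_k = 1 iff L(d_k) exceeds the threshold 0."""
    return 1 if llr(p_one, p_zero) > 0 else 0

# A bit whose posterior favors 1 decodes to 1; the reverse decodes to 0.
print(hard_decision(0.8, 0.2))  # -> 1
print(hard_decision(0.3, 0.7))  # -> 0
```

A posterior split of exactly 0.5/0.5 gives L(d_k) = 0, the decision boundary.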
- the turbo decoding procedure can be roughly divided into calculation of the forward probability α (hereinafter, the "A operation"), calculation of the backward probability β (hereinafter, the "B operation"), and calculation of the likelihood information L(d_k).
- the sliding window method is a technique for performing the A operation and the B operation efficiently.
- the sliding window method divides the entire data sequence into windows of a predetermined size and sets a training section in each window, so that the backward probability, which would otherwise have to be calculated starting from the end of the sequence, can instead be calculated starting from a point partway through it. According to this sliding window method, it suffices to accumulate backward probabilities in window units, so the memory capacity is greatly reduced compared to accumulating all the backward probabilities up to time point k-1.
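The window/training-section layout of the sliding window method can be sketched as follows. The window size, training length, and sequence length are parameters; the values in the example mirror the three-window figure discussed below (this is an illustrative sketch, not code from the patent):

```python
def sliding_windows(seq_len, win, train):
    """Partition time points [0, seq_len) into windows of size `win`.
    Each window's backward (B) recursion starts `train` points past the
    window's end -- the training section -- instead of at the end of the
    whole sequence, so only `win` backward probabilities need to be kept
    in memory at any one time."""
    layout = []
    for start in range(0, seq_len, win):
        end = min(start + win, seq_len)        # exclusive window end
        train_end = min(end + train, seq_len)  # exclusive training end
        layout.append((start, end, train_end))
    return layout

# Three windows of size 32 with a 4-point training section, as in FIG. 1.
print(sliding_windows(96, 32, 4))
# -> [(0, 32, 36), (32, 64, 68), (64, 96, 96)]
```

Note that the last window has no room for a training section, because the true termination value of the trellis is available there.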
- FIG. 1 is a schematic diagram conceptually showing a conventional repetition process of the B operation.
- the window size is set to 32, and three windows are used for convenience of explanation.
- the window at time points 0-31 is B#0, the window at time points 32-63 is B#1, and the window at time points 64-95 is B#2.
- the training section is appended after the last point of each window, and its length is generally set to about 4 to 5 times the constraint length.
- the initial value of each training section is reset to "0" at each iterative decoding, and the operation is performed using the reliability information for each window obtained in the previous decoding process.
- Non-Patent Document 1: Claude Berrou, "Near Optimum Error Correcting Coding and Decoding: Turbo-Codes," IEEE Trans. on Communications, Vol. 44, No. 10, Oct. 1996.
- the conventional sliding window method has a problem in that the training section is long, so the amount of computation and the memory capacity of the turbo decoder are large.
- furthermore, since the length of the training section is fixed, characteristic degradation may grow when the coding rate is high, and the training section must be lengthened to maintain the characteristics.
- An object of the present invention is to provide a decoding device and a decoding method that reduce the amount of calculation and the memory capacity and that prevent deterioration of characteristics even when the coding rate is high.
- the decoding device of the present invention divides a data sequence into a plurality of windows and comprises: backward probability calculating means for calculating, for each window, the backward probability using the backward probability calculated at a predetermined time point in the previous iterative decoding as the initial value in the current iterative decoding; storing means for storing the backward probability at the predetermined time point calculated by the backward probability calculating means; and likelihood calculating means for calculating likelihood information using the backward probability calculated by the backward probability calculating means.
- according to this configuration, the backward probability at a predetermined time point calculated in the previous iterative decoding is used as the initial value in the current iterative decoding to calculate the backward probability. Therefore, the amount of calculation and the memory capacity can be reduced, and characteristic degradation can be prevented even when the coding rate is high.
- FIG. 1 is a schematic diagram conceptually showing a conventional repetition process of a B operation.
- FIG. 2 is a block diagram showing a configuration of a turbo decoder according to Embodiment 1 of the present invention.
- FIG. 3 is a block diagram showing the internal configuration of the element decoder
- FIG. 4 is a schematic diagram conceptually showing a repetition process of a B operation according to the first embodiment of the present invention.
- FIG. 5 is a diagram showing decoding characteristics of the turbo decoder according to Embodiment 1 of the present invention.
- FIG. 6 is a schematic diagram conceptually showing a repetition process of a B operation according to the second embodiment of the present invention.
- FIG. 2 is a block diagram showing a configuration of the turbo decoder according to Embodiment 1 of the present invention.
- an element decoder 101 decodes a systematic bit sequence Y1 and a parity bit sequence Y2 together with a priori value La1, which is the reliability information output from the deinterleaver 105, and outputs an extrinsic value Le1 to the interleaver 102.
- the extrinsic value indicates the increase in the reliability of the symbol achieved by the element decoder.
- the interleaver 102 rearranges the extrinsic value Le1 output from the element decoder 101 and outputs the rearranged value to the element decoder 104 as a priori value La2. In the first iteration, since no decoding has yet been performed in the element decoder 104, "0" is substituted for the a priori value.
- element decoder 104 receives the sequence obtained by rearranging systematic bit sequence Y1 in interleaver 103, parity bit sequence Y3, and a priori value La2, performs decoding, and outputs extrinsic value Le2 to deinterleaver 105.
- deinterleaver 105 restores the rearrangement applied by the interleaver to extrinsic value Le2, and outputs the result to element decoder 101 as a priori value La1. Iterative decoding is thereby performed. After several iterations, the element decoder 104 calculates a posterior value L2, defined as a log posterior probability ratio, and outputs the calculation result to deinterleaver 106. Deinterleaver 106 deinterleaves the calculation result output from element decoder 104, and outputs the deinterleaved sequence to hard decision section 107.
- hard decision section 107 outputs a decoded bit sequence by performing a hard decision on the deinterleaved sequence.
- Error detection section 108 performs error detection on the decoded bit sequence and outputs a detection result
- FIG. 3 is a block diagram showing the internal configuration of the element decoder 101. It is assumed that the element decoder 104 has the same internal configuration as the element decoder 101. The following decoding operation is performed in window units of a predetermined size.
- the systematic bit Y1, the parity bit Y2, and the a priori value La1 obtained from the previous decoding result are input to the transition probability calculation unit 201, and the transition probability γ is calculated.
- the calculated transition probability γ is output to forward probability calculating section 202 and backward probability calculating section 203.
- the forward probability calculating unit 202 performs the operation of the above equation (3) (A operation) on the data sequence divided into windows, using the transition probability γ output from the transition probability calculating unit 201, to calculate the forward probability α(m).
- the calculated α(m) is output to likelihood calculating section 205.
- the backward probability calculation unit 203 performs the operation of the above equation (4) (B operation) on the data sequence divided into windows, using the transition probability γ output from the transition probability calculation unit 201 and the value stored in the memory 204 described later, to calculate the backward probability.
- the backward probability β(m) is output to likelihood calculating section 205.
- the backward probability to be used as the initial value is output to the memory 204.
- the memory 204 stores the backward probability at a predetermined point in time output from the backward probability calculation unit 203.
- the memory 204 temporarily stores the backward probability, and outputs the stored backward probability at the predetermined time point of the previous decoding to the backward probability calculation unit 203.
- the predetermined time point corresponds to a time point at which calculation of each window is started in the next iterative decoding.
- likelihood calculation section 205 calculates likelihood information using the forward probability α(m) output from forward probability calculation section 202 and the backward probability β(m) output from backward probability calculation section 203.
- FIG. 4 is a schematic diagram conceptually showing the repetition processing of the B operation according to the first embodiment of the present invention.
- the window size is set to 32, and three windows are used for convenience of explanation.
- at iteration 1, the window at time points 0-31 is B#0, the window at time points 32-63 is B#1, and the window at time points 64-95 is B#2.
- the size of the training section is set to 4, and the training section is appended after the end of each window.
- the calculation is started from the training section of each window and proceeds backward from later time points toward earlier ones. The backward probability at time point 36 within window B#1 is stored in the memory 204, and the backward probability at time point 68 within window B#2 is stored in the memory 204.
- at iteration 2, the backward probability at time point 36 stored in the memory 204 at iteration 1 is used as the initial value of the training section of window B#0, and the backward probability at time point 68 is used as the initial value of the training section of window B#1.
- as described above, by using the backward probability obtained in the previous decoding as the current initial value, the previous decoding process effectively serves as the current training section, and decoding accuracy can be improved.
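The benefit of warm-starting the training section can be illustrated with a toy model. Here a scalar contraction stands in for the B operation (the real B operation is the equation-(4) trellis recursion, not this update, and the converged value 1.0 is hypothetical); each training step halves the seed's error, so a seed taken from the previous iteration's stored β gets much closer to the converged value in the same number of training steps than a cold seed does:

```python
def train_beta(seed, steps, exact=1.0):
    """Toy stand-in for running the B operation through a training
    section: each step halves the distance between the current estimate
    and a (hypothetical) converged backward probability `exact`."""
    beta = seed
    for _ in range(steps):
        beta = 0.5 * beta + 0.5 * exact
    return beta

cold = train_beta(0.0, 4)   # conventional scheme: seed reset to "0"
warm = train_beta(0.9, 4)   # seed = beta stored in the previous iteration
print(abs(1.0 - cold), abs(1.0 - warm))  # warm seed leaves 10x less error
```

With a 4-step training section, the cold seed still has 1/16 of its initial error, while a seed already within 0.1 of the converged value retains only 0.1/16, which is why the patent argues the training section can be shortened or dropped.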
- also at iteration 2, the window size of window B#0 is increased by one so that it covers time points 0-32, and windows B#1 and B#2 are shifted backward by one time point, so that window B#1 covers time points 33-64 and window B#2 covers time points 65-96.
- accordingly, the time points of the training sections also shift.
- in general, at iteration i, the backward probability at the predetermined time point stored at iteration (i-1) is used as the initial value of the training section of each window, window B#0 is extended backward and the windows B#1, B#2, ... are shifted backward by one time point per iteration, and the B operation is performed.
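The per-iteration window layout described above can be sketched as follows. The one-point-per-iteration shift and the inclusive bounds are read off the B#0-B#2 example; the function itself is an illustrative assumption, not code from the patent:

```python
def window_bounds(win, n_windows, iteration):
    """Inclusive (start, end) time points of each window at a given
    iteration (1-based): window 0 grows backward by one point per
    iteration while the later windows slide back by the same amount,
    so each window's recursion starts exactly where the previous
    iteration stored a backward probability."""
    shift = iteration - 1
    bounds = [(0, win - 1 + shift)]
    for w in range(1, n_windows):
        bounds.append((w * win + shift, (w + 1) * win - 1 + shift))
    return bounds

print(window_bounds(32, 3, 1))  # -> [(0, 31), (32, 63), (64, 95)]
print(window_bounds(32, 3, 2))  # -> [(0, 32), (33, 64), (65, 96)]
```

With a training length of 4, the recursion for window B#0 at iteration 2 starts at time point 32 + 4 = 36, which is exactly the point whose backward probability iteration 1 stored in the memory 204.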
- FIG. 5 is a diagram showing decoding characteristics of the turbo decoder according to Embodiment 1 of the present invention.
- the vertical axis is the bit error rate (BER)
- the horizontal axis is Eb/N0.
- the solid line shows the decoding characteristics of the turbo decoder according to the present embodiment
- the dotted line shows the decoding characteristics of the conventional turbo decoder.
- the simulation specifications are as follows.
- AWGN (Additive White Gaussian Noise) channel
- the solid line shows that the turbo decoder of the present embodiment has excellent decoding characteristics.
- good decoding characteristics are obtained even though the coding rate is set as high as 5/6, so characteristic degradation can be prevented even at a high coding rate.
- the same decoding characteristics as the solid line can be obtained, which likewise prevents the characteristics from deteriorating.
- as described above, according to the present embodiment, the backward probability at a predetermined time point of each window in the previous decoding is set as the initial value of the current training section, and the position of the windows is shifted backward each time decoding is repeated.
- since the accuracy of the initial value improves with each iteration, the training section can be shortened, and the amount of calculation and the memory capacity can be reduced. Furthermore, even if the coding rate is high, characteristic degradation can be prevented.
- Embodiment 2 of the present invention describes a case where no training section is provided.
- the configuration of the turbo decoder according to the present embodiment is the same as that of FIG. 2, and the configuration of the element decoder is the same as that of FIG. 3, so FIG. 2 and FIG. 3 are used in the description. FIG. 6 is a schematic diagram conceptually showing the repetition processing of the B operation according to Embodiment 2 of the present invention.
- the window size is set to 32, and three windows are used for convenience of explanation. In this figure, at iteration 1, the window at time points 0-31 is B#0, the window at time points 32-63 is B#1, and the window at time points 64-95 is B#2.
- at iteration 1, the operation is performed for each window proceeding backward from later time points toward earlier ones; the backward probability at time point 32 of window B#1 is stored in the memory 204, and the backward probability at time point 64 of window B#2 is stored in the memory 204.
- at iteration 2, the backward probability at time point 32 stored in the memory 204 at iteration 1 is used as the initial value of window B#0, and the backward probability at time point 64 is used as the initial value of window B#1.
- also, the window size of window B#0 is increased by one so that it covers time points 0-32; windows B#1 and B#2 are shifted backward by one time point, so that window B#1 covers time points 33-64 and window B#2 covers time points 65-96.
- in general, at iteration i, the B operation is performed by using the backward probability at the predetermined time point stored at iteration (i-1) as the initial value, extending window B#0 backward and shifting the windows B#1, B#2, ... backward by one time point per iteration.
- as described above, according to the present embodiment, the backward probability of each window at a predetermined time point in the previous decoding is set as the initial value of the current operation, and the position of the windows is shifted backward each time decoding is repeated.
- since the accuracy of the initial value improves with each iteration, characteristic degradation can be prevented without providing a training section, and the calculation amount and the memory capacity can be reduced.
- similarly, the forward probability calculated in the previous iterative decoding may be used as the initial value to calculate the forward probability, whereby the training section used for calculating the forward probability can be shortened, and the calculation amount and the memory capacity can be reduced.
- in the above description, the amount by which the windows are shifted is i time points when the number of iterations is i, but the present invention is not limited to this; a shift amount of i x j may be used, where j is a positive number other than 0.
- a first aspect of the present invention is a decoding device comprising: backward probability calculating means for dividing a data sequence into a plurality of windows and calculating, for each window, the backward probability using the backward probability at a predetermined time point calculated in the previous iterative decoding as the initial value in the current iterative decoding; storage means for storing the backward probability at the predetermined time point calculated by the backward probability calculating means; and likelihood calculating means for calculating likelihood information using the backward probability calculated by the backward probability calculating means.
- a second aspect of the present invention is the decoding device according to the above aspect, wherein the backward probability calculating means shifts the position of the windows backward according to the number of decoding iterations to calculate the backward probability.
- according to these configurations, the backward probability at a predetermined time point calculated in the previous iterative decoding is used as the initial value in the current iterative decoding to calculate the backward probability for each window. Since the accuracy of the initial value improves as the number of iterations increases, the training section can be shortened, thereby reducing the amount of calculation and the memory capacity. In addition, even if the coding rate is high, characteristic degradation can be prevented.
- a third aspect of the present invention is a decoding device in which the storage means stores the backward probability at the start time point of the next iterative decoding, in response to the backward probability calculating means shifting the position of the windows backward.
- according to this configuration, the storage means stores the backward probability at the start time point of the next iterative decoding, that is, its initial value. Therefore, even when the windows shift and the calculation start position changes at each iteration, a highly accurate initial value is used and the training section can be shortened, so the amount of calculation and the memory capacity can be reduced.
- a fourth aspect of the present invention is a decoding device comprising: forward probability calculating means for dividing a data sequence into a plurality of windows and calculating, for each window, the forward probability using the forward probability at a predetermined time point calculated in the previous iterative decoding as the initial value in the current iterative decoding; and likelihood calculating means for calculating likelihood information using the forward probability calculated by the forward probability calculating means.
- according to this configuration, the forward probability at a predetermined time point calculated in the previous iterative decoding is used as the initial value in the current iterative decoding to calculate the forward probability for each window, so the accuracy of the initial value improves as the number of iterations increases. The training section can therefore be shortened, reducing the amount of calculation and the memory capacity. Furthermore, even if the coding rate is high, characteristic degradation can be prevented.
- a fifth aspect of the present invention is a decoding method in which a data sequence is divided into a plurality of windows, and the backward probability is calculated for each window using the backward probability at a predetermined time point calculated in the previous iterative decoding as the initial value in the current iterative decoding.
- according to this method, the backward probability at a predetermined time point calculated in the previous iterative decoding is used as the initial value in the current iterative decoding to calculate the backward probability for each window. Since the accuracy of the initial value improves as the number of iterations increases, the training section can be shortened, thereby reducing the amount of calculation and the memory capacity. Furthermore, even if the coding rate is high, characteristic degradation can be prevented.
- as described above, the decoding device of the present invention calculates the backward probability using the backward probability at a predetermined time point calculated in the previous iterative decoding as the initial value in the current iterative decoding, thereby reducing the amount of computation and the memory capacity.
- it also has the effect of preventing characteristic degradation even when the coding rate is high, and can be applied to a turbo decoder and the like.
Landscapes
- Physics & Mathematics (AREA)
- Probability & Statistics with Applications (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Error Detection And Correction (AREA)
- Detection And Prevention Of Errors In Transmission (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20040819763 EP1677423A1 (en) | 2003-12-01 | 2004-11-19 | Decoder apparatus and decoding method |
US10/581,032 US20070113144A1 (en) | 2003-12-01 | 2004-11-19 | Decoding apparatus and decoding method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-402218 | 2003-12-01 | ||
JP2003402218A JP2005167513A (ja) | 2003-12-01 | 2003-12-01 | 復号装置及び復号方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005055433A1 true WO2005055433A1 (ja) | 2005-06-16 |
Family
ID=34650007
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/017284 WO2005055433A1 (ja) | 2003-12-01 | 2004-11-19 | 復号装置及び復号方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20070113144A1 (ja) |
EP (1) | EP1677423A1 (ja) |
JP (1) | JP2005167513A (ja) |
KR (1) | KR20060096089A (ja) |
CN (1) | CN1883120A (ja) |
WO (1) | WO2005055433A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4227481B2 (ja) * | 2003-07-11 | 2009-02-18 | パナソニック株式会社 | 復号装置および復号方法 |
CN102484484B (zh) * | 2009-08-25 | 2014-08-20 | 富士通株式会社 | 发送机、编码装置、接收机以及解码装置 |
GB0915135D0 (en) * | 2009-08-28 | 2009-10-07 | Icera Inc | Decoding signals received over a noisy channel |
US20130142057A1 (en) * | 2011-12-01 | 2013-06-06 | Broadcom Corporation | Control Channel Acquisition |
US9197365B2 (en) * | 2012-09-25 | 2015-11-24 | Nvidia Corporation | Decoding a coded data block |
CN108400788A (zh) * | 2018-01-24 | 2018-08-14 | 同济大学 | Turbo译码的硬件实现方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001267938A (ja) * | 2000-01-31 | 2001-09-28 | Texas Instr Inc <Ti> | パラレル化されたスライディングウィンドウ処理によるmapデコーディング |
JP2002314437A (ja) * | 2001-04-17 | 2002-10-25 | Nec Corp | ターボ復号方式及びその方法 |
JP2002367291A (ja) * | 2001-06-11 | 2002-12-20 | Fujitsu Ltd | 情報記録再生装置及び方法並びに信号復号回路 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6563877B1 (en) * | 1998-04-01 | 2003-05-13 | L-3 Communications Corporation | Simplified block sliding window implementation of a map decoder |
EP1156588B1 (en) * | 1999-03-01 | 2007-05-09 | Fujitsu Limited | Method and apparatus for maximum a posteriori probability decoding |
JP3846527B2 (ja) * | 1999-07-21 | 2006-11-15 | 三菱電機株式会社 | ターボ符号の誤り訂正復号器、ターボ符号の誤り訂正復号方法、ターボ符号の復号装置およびターボ符号の復号システム |
SG97926A1 (en) * | 2000-08-29 | 2003-08-20 | Oki Techno Ct Singapore Pte | Soft-in soft-out decoder used for an iterative error correction decoder |
KR100390416B1 (ko) * | 2000-10-16 | 2003-07-07 | 엘지전자 주식회사 | 터보 디코딩 방법 |
JP4185314B2 (ja) * | 2002-06-07 | 2008-11-26 | 富士通株式会社 | 情報記録再生装置、光ディスク装置及び、データ再生方法 |
-
2003
- 2003-12-01 JP JP2003402218A patent/JP2005167513A/ja active Pending
-
2004
- 2004-11-19 CN CNA200480034409XA patent/CN1883120A/zh active Pending
- 2004-11-19 WO PCT/JP2004/017284 patent/WO2005055433A1/ja not_active Application Discontinuation
- 2004-11-19 EP EP20040819763 patent/EP1677423A1/en not_active Withdrawn
- 2004-11-19 KR KR20067010607A patent/KR20060096089A/ko not_active Application Discontinuation
- 2004-11-19 US US10/581,032 patent/US20070113144A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001267938A (ja) * | 2000-01-31 | 2001-09-28 | Texas Instr Inc <Ti> | パラレル化されたスライディングウィンドウ処理によるmapデコーディング |
JP2002314437A (ja) * | 2001-04-17 | 2002-10-25 | Nec Corp | ターボ復号方式及びその方法 |
JP2002367291A (ja) * | 2001-06-11 | 2002-12-20 | Fujitsu Ltd | 情報記録再生装置及び方法並びに信号復号回路 |
Non-Patent Citations (2)
Title |
---|
BERROU ET AL.: "Near optimum error correcting coding and decoding: turbo codes," IEEE TRANSACTIONS ON COMMUNICATIONS, vol. 44, no. 10, October 1996 (1996-10-01), pages 1261 - 1271, XP000919139 *
OHBUCHI K. ET AL.: "Turbo fugo ni okeru sub-log-map no kairo kibo sakugen hoshiki," THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS GIJUTSU KENKYU HOKOKU, vol. 99, no. 140, 25 June 1999 (1999-06-25), pages 1 - 6, XP002990220 *
Also Published As
Publication number | Publication date |
---|---|
EP1677423A1 (en) | 2006-07-05 |
US20070113144A1 (en) | 2007-05-17 |
JP2005167513A (ja) | 2005-06-23 |
KR20060096089A (ko) | 2006-09-05 |
CN1883120A (zh) | 2006-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1383246B1 (en) | Modified Max-LOG-MAP Decoder for Turbo Decoding | |
US8443265B2 (en) | Method and apparatus for map decoding and turbo decoder using the same | |
US8457219B2 (en) | Self-protection against non-stationary disturbances | |
JP4227481B2 (ja) | 復号装置および復号方法 | |
US20020091973A1 (en) | Pre-decoder for a turbo decoder, for recovering punctured parity symbols, and a method for recovering a turbo code | |
JP2004531116A (ja) | ターボデコーダ用インタリーバ | |
US8358713B2 (en) | High throughput and low latency map decoder | |
CN108134612B (zh) | 纠正同步与替代错误的级联码的迭代译码方法 | |
WO2005055433A1 (ja) | 復号装置及び復号方法 | |
US7634703B2 (en) | Linear approximation of the max* operation for log-map decoding | |
JP2004349901A (ja) | ターボ復号器及びそれに用いるダイナミック復号方法 | |
KR100738250B1 (ko) | Llr의 부호 비교를 이용한 터보 복호기의 반복복호제어장치 및 방법 | |
WO2012123514A1 (en) | State metrics based stopping criterion for turbo-decoding | |
US7055102B2 (en) | Turbo decoder using parallel processing | |
JP2006507736A (ja) | Fec復号化における消失判定手順 | |
KR100776910B1 (ko) | 비이진부호에서의 scr/sdr을 이용한 반복 복호 장치및 그 동작 방법 | |
JP4991481B2 (ja) | 反復復号装置及び反復復号方法 | |
KR20100027631A (ko) | 무선통신 시스템에서 데이터 복호 방법 | |
KR20080035404A (ko) | 블록 부호를 사용하는 통신 시스템에서 신호 송수신 장치및 방법 | |
Pukkila | Source and Channel Encoder and Decoder Modeling | |
KR20110096222A (ko) | 터보 코드 복호기 및 이를 위한 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200480034409.X Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004819763 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007113144 Country of ref document: US Ref document number: 10581032 Country of ref document: US Ref document number: 1020067010607 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 3151/DELNP/2006 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2004819763 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2004819763 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020067010607 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 10581032 Country of ref document: US |