CN107241106A - Polarization code decoding algorithm based on deep learning - Google Patents

Polarization code decoding algorithm based on deep learning


Publication number
CN107241106A
CN107241106A (application CN201710371218.1A)
Authority
CN
China
Prior art keywords
polarization code
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710371218.1A
Other languages
Chinese (zh)
Other versions
CN107241106B (en)
Inventor
张川 (Chuan Zhang)
徐炜鸿 (Weihong Xu)
吴至臻 (Zhizhen Wu)
尤肖虎 (Xiaohu You)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201710371218.1A priority Critical patent/CN107241106B/en
Publication of CN107241106A publication Critical patent/CN107241106A/en
Application granted granted Critical
Publication of CN107241106B publication Critical patent/CN107241106B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M13/00Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
    • H03M13/03Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words
    • H03M13/05Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words using block codes, i.e. a predetermined number of check bits joined to a predetermined number of information bits
    • H03M13/11Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words using block codes, i.e. a predetermined number of check bits joined to a predetermined number of information bits using multiple parity bits
    • H03M13/1102Codes on graphs and decoding on graphs, e.g. low-density parity check [LDPC] codes
    • H03M13/1191Codes on graphs other than LDPC codes
    • HELECTRICITY
    • H03ELECTRONIC CIRCUITRY
    • H03MCODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M13/00Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
    • H03M13/03Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words
    • H03M13/05Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words using block codes, i.e. a predetermined number of check bits joined to a predetermined number of information bits
    • H03M13/13Linear codes

Landscapes

  • Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Error Detection And Correction (AREA)

Abstract

The invention discloses a polar code decoding algorithm based on deep learning. A multi-dimensional scaled min-sum belief propagation (BP) decoding algorithm is first proposed to accelerate the convergence of the decoding algorithm. Then, exploiting the similarity between the BP factor graph and a deep neural network, a polar code decoder based on a deep neural network is realized; the decoder is trained with deep learning techniques and, compared with the original BP decoding algorithm, reduces the number of decoding iterations by nearly 90% while achieving better decoding performance. Finally, the invention gives a hardware implementation of the basic operation module of the deep neural network polar code decoder and reduces hardware consumption by 50% through hardware folding.

Description

Polarization code decoding algorithm based on deep learning
Technical field
The invention belongs to the field of deep neural networks and polar code decoding, and in particular relates to a polar code decoding algorithm based on deep learning.
Background technology
Polar codes were proposed by Erdal Arikan in the 2009 paper "Channel polarization: A method for constructing capacity-achieving codes for symmetric binary-input memoryless channels" as a coding scheme that can approach the Shannon limit. The channel polarization phenomenon refers to the fact that, as the number of channels tends to infinity, one part of the channels tends to become perfect while the other part tends to become pure-noise channels. Based on this channel polarization phenomenon, the relatively good channels among the combined channels are selected to construct the polar code. Polar codes are one of the key technologies in fifth-generation (5G) mobile communication systems.
The two most common polar code decoding algorithms are successive cancellation (SC) decoding and belief propagation (BP) decoding. SC decoding has low computational complexity and good error-correcting performance, but owing to the serial structure of the SC algorithm it suffers from long decoding latency.
Compared with SC decoding, BP decoding, owing to its parallel structure, has a much lower decoding latency than SC decoding for long codes. However, because BP decoding requires many iterations, its computational complexity is very high, and its decoding performance still lags behind SC. To reduce computational complexity, early-stopping and min-sum algorithms have been introduced into BP decoding, but they do not accelerate its convergence. Deep learning techniques and deep neural networks (DNN) have also been introduced to obtain better decoding performance, but the neural network complexity grows exponentially with the code length. How to achieve a good trade-off between complexity and decoding performance is therefore one of the focuses of research on deep-learning-based BP decoding algorithms.
Summary of the invention
Object of the invention: In view of the above problems, the present invention proposes a polar code decoding algorithm based on deep learning, which overcomes the slow convergence of existing polar code BP decoding algorithms at low signal-to-noise ratio and, using deep learning techniques, achieves better decoding performance with fewer iterations, reducing decoding complexity and decoding latency.
Technical scheme: To achieve the object of the present invention, the adopted technical solution is a polar code decoding algorithm based on deep learning, comprising the following steps:
(1) Based on the scaled min-sum BP algorithm, propose an improved multi-dimensional scaled min-sum BP algorithm;
(2) According to the similarity between the polar code BP decoding factor graph and the structure of a neural network, unfold the polar code BP decoding factor graph to form a deep neural network decoder;
(3) Generate all-zero codewords, transmit them over an AWGN channel, and train the deep neural network decoder using back-propagation and mini-batch stochastic gradient descent from deep learning;
(4) Based on the original BP decoder, give the hardware architecture of the improved BP decoder, and reduce hardware consumption by hardware folding.
In step (1), the multi-dimensional scaled min-sum BP algorithm is:

L_(i,j)^(t) = α_(i,j)^(t) · g( L_(i+1,2j-1)^(t), L_(i+1,2j)^(t) + R_(i,j+N/2)^(t) ),
L_(i,j+N/2)^(t) = α_(i,j+N/2)^(t) · g( R_(i,j)^(t), L_(i+1,2j-1)^(t) ) + L_(i+1,2j)^(t),
R_(i+1,2j-1)^(t) = β_(i+1,2j-1)^(t) · g( R_(i,j)^(t), L_(i+1,2j)^(t-1) + R_(i,j+N/2)^(t) ),
R_(i+1,2j)^(t) = β_(i+1,2j)^(t) · g( R_(i,j)^(t), L_(i+1,2j-1)^(t-1) ) + R_(i,j+N/2)^(t),

where L_(i,j)^(t) and R_(i,j)^(t) respectively denote the left- and right-propagating log-likelihood ratio messages at position (i, j) of the BP factor graph in the t-th iteration, α_(i,j)^(t) and β_(i,j)^(t) are the corresponding scale factors for left- and right-propagating messages, and g(x, y) = sign(x)·sign(y)·min(|x|, |y|).
In step (2), using the similarity between a deep neural network and the BP factor graph, the polar code factor graph is unfolded, a fixed number of iterations is selected, and the final output passes through a Sigmoid activation function, forming the deep neural network polar code decoder.
In step (3), the neural network is trained using back-propagation and mini-batch stochastic gradient descent from deep learning to obtain the optimal combination of scale factors of the multi-dimensional scaled min-sum algorithm. Only all-zero codewords transmitted over the additive white Gaussian noise channel are required. The Adam algorithm with a learning rate of 0.001 is introduced to adaptively adjust the learning rate and accelerate the training convergence of the deep neural network polar code decoder.
In step (4), hardware folding is applied: a suitable folding set is selected and the same module is time-multiplexed, so that the folded basic computing module comprises 1 adder, 1 g-function module and 1 multiplier.
Beneficial effects: Compared with the original polar code BP decoder, the remarkable advantages of the present invention are: the convergence of the decoding algorithm is greatly accelerated and the number of iterations required for convergence is reduced; the deep neural network polar code decoder with 5 iterations exceeds the performance of the original polar code BP decoder with 50 iterations, converging about 10 times faster; in addition, the hardware consumption after hardware folding is about 50% lower than that of the original deep neural network decoder.
Brief description of the drawings
Fig. 1 is the BP decoding factor graph of an 8-bit polar code;
Fig. 2 is one complete BP decoding iteration of the 8-bit polar code neural network decoder;
Fig. 3 is the structure of the deep neural network polar code decoder with T iterations for a 64-bit code;
Fig. 4 is the basic operation module with multi-dimensional scaling in the polar code decoder;
Fig. 5 is the polar code decoder basic operation module after hardware folding;
Fig. 6 is the multi-dimensional scaled min-sum computing module;
Fig. 7 is the performance comparison of the deep neural network polar code decoder and the conventional polar code BP decoder.
Detailed description of the embodiments
The technical scheme of the present invention is further described below with reference to the accompanying drawings and embodiments.
Fig. 1 shows the polar code BP decoding iteration factor graph. Polar code BP decoding iteratively propagates log-likelihood ratio messages from left to right and from right to left on the factor graph. Taking a polar code of length N = 8 as an example, the leftmost end of the factor graph corresponds to the bit information u and the rightmost end corresponds to the received codeword x.
Here (i, j) denotes a node of the factor graph, and each node holds left- and right-propagating log-likelihood ratio messages: the message propagated to the left in the t-th iteration is denoted L_(i,j)^(t) and the message propagated to the right is denoted R_(i,j)^(t). At the start of decoding, the messages at the leftmost and rightmost ends are initialized as follows:

L_(n+1,j)^(1) = ln( Pr(y_j | x_j = 0) / Pr(y_j | x_j = 1) ),
R_(1,j)^(1) = 0 if j ∈ A,  R_(1,j)^(1) = +∞ if j ∉ A,

where A denotes the information bit set and n = log2 N.
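The initialization above can be sketched in Python (a minimal illustration, assuming BPSK transmission over the AWGN channel so that the channel LLR is 2y/σ²; all names are illustrative, not from the patent):

```python
import numpy as np

def init_messages(y, sigma2, info_set, N):
    """Initialize BP messages for a length-N polar code (n = log2 N stages).

    L[i, j] / R[i, j] hold the left-/right-propagating LLR messages at
    column i, row j of the factor graph.
    """
    n = int(np.log2(N))
    L = np.zeros((n + 1, N))
    R = np.zeros((n + 1, N))
    L[n, :] = 2.0 * y / sigma2              # rightmost column: channel LLRs
    frozen = [j for j in range(N) if j not in info_set]
    R[0, frozen] = np.inf                   # frozen bits are known zeros
    return L, R
```

All other messages start at zero and are filled in by the iterations that follow.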
After initialization, conventional polar code BP decoding iterates according to the following equations:

L_(i,j)^(t) = g( L_(i+1,2j-1)^(t), L_(i+1,2j)^(t) + R_(i,j+N/2)^(t) ),
L_(i,j+N/2)^(t) = g( R_(i,j)^(t), L_(i+1,2j-1)^(t) ) + L_(i+1,2j)^(t),
R_(i+1,2j-1)^(t) = g( R_(i,j)^(t), L_(i+1,2j)^(t-1) + R_(i,j+N/2)^(t) ),
R_(i+1,2j)^(t) = g( R_(i,j)^(t), L_(i+1,2j-1)^(t-1) ) + R_(i,j+N/2)^(t),

where g(x, y) = sign(x)·sign(y)·min(|x|, |y|) and sign is the sign function.
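The min-sum function g can be written directly (a one-line sketch, vectorized with NumPy so it also applies element-wise to message arrays):

```python
import numpy as np

def g(x, y):
    """Min-sum approximation of the check-node (boxplus) operation:
    g(x, y) = sign(x) * sign(y) * min(|x|, |y|)."""
    return np.sign(x) * np.sign(y) * np.minimum(np.abs(x), np.abs(y))
```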
After the iterations finish, a bit that is not an information bit is decoded as 0; an information bit is decided according to:

û_j = 0 if L_(1,j)^(T) ≥ 0, and û_j = 1 otherwise,

where T is the final iteration.
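The final hard decision can be sketched as follows (assuming the decision is taken on the leftmost left-propagating LLRs, with frozen positions forced to 0; names are illustrative):

```python
import numpy as np

def hard_decision(L_left, info_set):
    """Decode each bit: frozen bits -> 0; an information bit -> 0 iff its
    final LLR is non-negative, else 1."""
    u_hat = np.zeros(len(L_left), dtype=int)
    for j in info_set:
        u_hat[j] = 0 if L_left[j] >= 0 else 1
    return u_hat
```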
Based on the existing scaled min-sum BP algorithm, an improved multi-dimensional scaled (multiple scaled) min-sum BP algorithm is proposed, in which each iteration applies a different scale factor to the function g, improving the decoding performance:

L_(i,j)^(t) = α_(i,j)^(t) · g( L_(i+1,2j-1)^(t), L_(i+1,2j)^(t) + R_(i,j+N/2)^(t) ),
L_(i,j+N/2)^(t) = α_(i,j+N/2)^(t) · g( R_(i,j)^(t), L_(i+1,2j-1)^(t) ) + L_(i+1,2j)^(t),
R_(i+1,2j-1)^(t) = β_(i+1,2j-1)^(t) · g( R_(i,j)^(t), L_(i+1,2j)^(t-1) + R_(i,j+N/2)^(t) ),
R_(i+1,2j)^(t) = β_(i+1,2j)^(t) · g( R_(i,j)^(t), L_(i+1,2j-1)^(t-1) ) + R_(i,j+N/2)^(t),

where α_(i,j)^(t) and β_(i,j)^(t) are the scale factors for left- and right-propagating messages, respectively.
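One processing element of the multi-dimensional scaled min-sum update can be transcribed directly from the four equations (a sketch with illustrative names; note that the L inputs to the R updates come from the previous iteration):

```python
import numpy as np

def g(x, y):
    return np.sign(x) * np.sign(y) * np.minimum(np.abs(x), np.abs(y))

def pe_update(L1, L2, R1, R2, L1_prev, L2_prev,
              alpha=(1.0, 1.0), beta=(1.0, 1.0)):
    """One processing element connecting nodes (i, j), (i, j+N/2) with
    (i+1, 2j-1), (i+1, 2j).

    L1 = L(i+1, 2j-1), L2 = L(i+1, 2j)  (current iteration t)
    R1 = R(i, j),      R2 = R(i, j+N/2)
    L1_prev, L2_prev: the same L messages from iteration t-1.
    Returns (L(i,j), L(i,j+N/2), R(i+1,2j-1), R(i+1,2j)).
    """
    L_out1 = alpha[0] * g(L1, L2 + R2)
    L_out2 = alpha[1] * g(R1, L1) + L2
    R_out1 = beta[0] * g(R1, L2_prev + R2)
    R_out2 = beta[1] * g(R1, L1_prev) + R2
    return L_out1, L_out2, R_out1, R_out2
```

With all scale factors set to 1 this reduces to the conventional min-sum BP update.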
Using the similarity between the polar code BP decoding factor graph and a neural network, the unfolded polar code BP decoding factor graph forms a deep neural network decoder. Fig. 2 shows one complete BP decoding iteration of the polar code neural network decoder: one complete 8-bit polar code BP iteration corresponds to the neural network, whose output passes through a Sigmoid activation function. For a 64-bit deep neural network polar code BP decoder with T iterations, the complete structure is shown in Fig. 3.
A number of noisy all-zero codewords are generated, and the loss function uses the following cross-entropy function to measure the decoding quality:

Loss = -(1/N) · Σ_{j=1}^{N} [ u_j ln(û_j) + (1 − u_j) ln(1 − û_j) ],

where u_j is the transmitted bit and û_j is the corresponding Sigmoid output of the decoder.
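This is the standard binary cross-entropy; with all-zero training codewords (u = 0) only the ln(1 − û) term survives. A minimal sketch:

```python
import numpy as np

def cross_entropy(u, u_hat, eps=1e-12):
    """Binary cross-entropy between true bits u and Sigmoid outputs u_hat.
    Outputs are clipped away from 0 and 1 for numerical stability."""
    u_hat = np.clip(u_hat, eps, 1.0 - eps)
    return -np.mean(u * np.log(u_hat) + (1.0 - u) * np.log(1.0 - u_hat))
```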
The neural network is trained using back-propagation and mini-batch stochastic gradient descent from deep learning to obtain the combination θ = {α, β} of optimal scale factors α_(i,j)^(t) and β_(i,j)^(t) of the multi-dimensional scaled min-sum algorithm. Only all-zero codewords transmitted over the additive white Gaussian noise (AWGN) channel are required, which considerably reduces the training complexity. By introducing the Adam algorithm with a learning rate of 0.001, the learning rate is adaptively adjusted and the training convergence of the deep neural network polar code decoder is accelerated.
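An Adam update step on the scale-factor vector θ = {α, β} can be sketched as follows (a textbook Adam implementation with the stated learning rate of 0.001, not the authors' exact training code; the gradient would come from back-propagating the cross-entropy loss through the unfolded decoder):

```python
import numpy as np

class Adam:
    """Minimal Adam optimizer for a flat parameter vector."""
    def __init__(self, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m = self.v = None
        self.t = 0

    def step(self, theta, grad):
        if self.m is None:
            self.m = np.zeros_like(theta)
            self.v = np.zeros_like(theta)
        self.t += 1
        # biased first/second moment estimates
        self.m = self.b1 * self.m + (1 - self.b1) * grad
        self.v = self.b2 * self.v + (1 - self.b2) * grad ** 2
        # bias correction, then the adaptive update
        m_hat = self.m / (1 - self.b1 ** self.t)
        v_hat = self.v / (1 - self.b2 ** self.t)
        return theta - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

Each training step would evaluate the loss on a mini-batch of noisy all-zero codewords and call `step` with the gradient with respect to θ.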
Based on the original BP decoder, the hardware architecture of the improved BP decoder is given. The basic computing module of the deep neural network decoder can be expressed as Fig. 4, where the s module is the g function with scaling, as shown in Fig. 6. The basic computing module consists of 2 adders, 2 g-function modules and 2 multipliers.
With hardware folding, a suitable folding set is selected and the same module is time-multiplexed, so that only 1 adder, 1 g-function module and 1 multiplier are required. The folded basic computing module can be expressed as Fig. 5, reducing the hardware consumption of the basic computing module in the polar code BP decoder by nearly 50%.
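The effect of folding can be illustrated behaviorally: instead of instantiating two of each unit, the four outputs of one processing element are computed over successive cycles on a single shared adder, g-module and multiplier (a software sketch of the schedule, not the hardware description):

```python
import numpy as np

def g(x, y):
    return np.sign(x) * np.sign(y) * np.minimum(np.abs(x), np.abs(y))

def folded_pe(L1, L2, R1, R2, L1_prev, L2_prev, a=(1.0, 1.0), b=(1.0, 1.0)):
    """Folded basic computing module: the four PE outputs are produced over
    four cycles, so each cycle uses at most one adder, one g-module and
    one multiplier (illustrative behavioral model, not HDL)."""
    out = {}
    s = L2 + R2                        # cycle 1: adder (sum reused below)
    out['L1'] = a[0] * g(L1, s)        # cycle 1: g-module + multiplier
    out['L2'] = a[1] * g(R1, L1) + L2  # cycle 2: g-module + mult. + adder
    out['R1'] = b[0] * g(R1, L2_prev + R2)   # cycle 3 (prev-iter. L feeds R)
    out['R2'] = b[1] * g(R1, L1_prev) + R2   # cycle 4
    return out
```

The folded module produces the same values as the parallel one, at half the hardware and roughly twice the cycle count.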
As shown in Fig. 7, the performance comparison between the deep neural network polar code decoder and the conventional polar code BP decoder shows that the convergence of the decoding algorithm is greatly accelerated and the number of iterations required for convergence is reduced: the deep neural network polar code decoder with 5 iterations exceeds the performance of the original polar code BP decoder with 50 iterations, converging about 10 times faster.

Claims (5)

1. A polar code decoding algorithm based on deep learning, characterized in that it comprises the following steps:
(1) based on the scaled min-sum BP algorithm, proposing an improved multi-dimensional scaled min-sum BP algorithm;
(2) according to the similarity between the polar code BP decoding factor graph and the structure of a neural network, unfolding the polar code BP decoding factor graph to form a deep neural network decoder;
(3) generating all-zero codewords, transmitting them over an AWGN channel, and training the deep neural network decoder using back-propagation and mini-batch stochastic gradient descent from deep learning;
(4) based on the original BP decoder, giving the hardware architecture of the improved BP decoder, and reducing hardware consumption by hardware folding.
2. The polar code decoding algorithm based on deep learning according to claim 1, characterized in that: in step (1), the multi-dimensional scaled min-sum BP algorithm is:
L_(i,j)^(t) = α_(i,j)^(t) · g( L_(i+1,2j-1)^(t), L_(i+1,2j)^(t) + R_(i,j+N/2)^(t) ),
L_(i,j+N/2)^(t) = α_(i,j+N/2)^(t) · g( R_(i,j)^(t), L_(i+1,2j-1)^(t) ) + L_(i+1,2j)^(t),
R_(i+1,2j-1)^(t) = β_(i+1,2j-1)^(t) · g( R_(i,j)^(t), L_(i+1,2j)^(t-1) + R_(i,j+N/2)^(t) ),
R_(i+1,2j)^(t) = β_(i+1,2j)^(t) · g( R_(i,j)^(t), L_(i+1,2j-1)^(t-1) ) + R_(i,j+N/2)^(t),
where L_(i,j)^(t) and R_(i,j)^(t) respectively denote the left- and right-propagating log-likelihood ratio messages at position (i, j) of the BP factor graph in the t-th iteration, α_(i,j)^(t) and β_(i,j)^(t) are the corresponding scale factors for left- and right-propagating messages, and g(x, y) = sign(x)·sign(y)·min(|x|, |y|).
3. The polar code decoding algorithm based on deep learning according to claim 1, characterized in that: in step (2), using the similarity between a deep neural network and the BP factor graph, the polar code factor graph is unfolded, a fixed number of iterations is selected, and the final output passes through a Sigmoid activation function, forming the deep neural network polar code decoder.
4. The polar code decoding algorithm based on deep learning according to claim 1, characterized in that: in step (3), the neural network is trained using back-propagation and mini-batch stochastic gradient descent from deep learning to obtain the optimal combination of scale factors of the multi-dimensional scaled min-sum algorithm; only all-zero codewords transmitted over the additive white Gaussian noise channel are required; the Adam algorithm with a learning rate of 0.001 is introduced to adaptively adjust the learning rate and accelerate the training convergence of the deep neural network polar code decoder.
5. The polar code decoding algorithm based on deep learning according to claim 1, characterized in that: in step (4), hardware folding is applied: a suitable folding set is selected and the same module is time-multiplexed, so that the folded basic computing module comprises 1 adder, 1 g-function module and 1 multiplier.
CN201710371218.1A 2017-05-24 2017-05-24 Deep learning-based polar code decoding algorithm Active CN107241106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710371218.1A CN107241106B (en) 2017-05-24 2017-05-24 Deep learning-based polar code decoding algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710371218.1A CN107241106B (en) 2017-05-24 2017-05-24 Deep learning-based polar code decoding algorithm

Publications (2)

Publication Number Publication Date
CN107241106A true CN107241106A (en) 2017-10-10
CN107241106B CN107241106B (en) 2020-07-14

Family

ID=59985067

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710371218.1A Active CN107241106B (en) 2017-05-24 2017-05-24 Deep learning-based polar code decoding algorithm

Country Status (1)

Country Link
CN (1) CN107241106B (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107659318A (en) * 2017-11-07 2018-02-02 东南大学 A kind of adaptive polarization code coding method
CN108023679A (en) * 2017-12-07 2018-05-11 中国电子科技集团公司第五十四研究所 Iterative decoding zoom factor optimization method based on parallel cascade system polarization code
CN108092672A (en) * 2018-01-15 2018-05-29 中国传媒大学 A kind of BP interpretation methods based on folding scheduling
CN108199807A (en) * 2018-01-19 2018-06-22 电子科技大学 A kind of polarization code reliability estimation methods
CN108418588A (en) * 2018-01-17 2018-08-17 中国计量大学 Low latency polarization code SMS design of encoder
CN108449091A (en) * 2018-03-26 2018-08-24 东南大学 A kind of polarization code belief propagation interpretation method and decoder based on approximate calculation
CN108540267A (en) * 2018-04-13 2018-09-14 北京邮电大学 A kind of multi-user data information detecting method and device based on deep learning
CN108777584A (en) * 2018-07-06 2018-11-09 中国石油大学(华东) A kind of fast Optimization of polarization code decoding parameter
CN108847848A (en) * 2018-06-13 2018-11-20 电子科技大学 A kind of BP decoding algorithm of the polarization code based on information post-processing
CN109586730A (en) * 2018-12-06 2019-04-05 电子科技大学 It is a kind of based on the polarization code BP decoding algorithm intelligently post-processed
CN109728824A (en) * 2018-12-06 2019-05-07 杭州电子科技大学 A kind of LDPC code iterative decoding method based on deep learning
CN109978079A (en) * 2019-04-10 2019-07-05 东北电力大学 A kind of data cleaning method of improved storehouse noise reduction self-encoding encoder
WO2019134553A1 (en) * 2018-01-02 2019-07-11 华为技术有限公司 Method and device for decoding
CN110719112A (en) * 2019-09-12 2020-01-21 天津大学 Deep learning-based parameter adaptive RS code decoding method
CN110798228A (en) * 2019-10-29 2020-02-14 南京宁麒智能计算芯片研究院有限公司 Polarization code turning decoding method and system based on deep learning
CN111313914A (en) * 2019-11-05 2020-06-19 北京航空航天大学 SCL simplified decoding method based on neural network classifier
CN111541517A (en) * 2020-04-17 2020-08-14 北京交通大学 List polarization code propagation decoding method
CN111697975A (en) * 2020-06-01 2020-09-22 西安工业大学 Polarization code continuous deletion decoding optimization algorithm based on full-connection neural network
CN112332863A (en) * 2020-10-27 2021-02-05 东方红卫星移动通信有限公司 Polar code decoding algorithm, receiving end and system under low signal-to-noise ratio scene of low earth orbit satellite
CN113014270A (en) * 2021-02-22 2021-06-22 上海大学 Partially folded polarization code decoder with configurable code length
CN117914446A * 2023-12-31 2024-04-19 Hangzhou Haiyan Technology Co., Ltd. Decoding method and system for algebraic code

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022244904A1 * 2021-05-21 2022-11-24 LG Electronics Inc. Method for transmitting/receiving signal in wireless communication system by using auto encoder, and apparatus therefor

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103259545A (en) * 2013-04-26 2013-08-21 西安理工大学 Quasi-cyclic low density odd-even check code belief propagation decoding method based on oscillation
CN103607208A (en) * 2013-11-25 2014-02-26 上海数字电视国家工程研究中心有限公司 LDPC minimum sum decoding method based on normalization correction factor sequences
CN103929210A (en) * 2014-04-25 2014-07-16 重庆邮电大学 Hard decision decoding method based on genetic algorithm and neural network
CN104539296A (en) * 2015-01-21 2015-04-22 西安电子科技大学 Method for improving BP (belief propagation) decoding by use of polarisation code based on early termination of iterative strategy
US20150333775A1 (en) * 2014-05-15 2015-11-19 Broadcom Corporation Frozen-Bit Selection for a Polar Code Decoder
CN105187073A (en) * 2015-10-13 2015-12-23 东南大学 BP decoding method and device for polarization code
US20160043743A1 (en) * 2012-12-03 2016-02-11 Digital PowerRadio, LLC Systems and methods for advanced iterative decoding and channel estimation of concatenated coding systems
CN105634507A (en) * 2015-12-30 2016-06-01 东南大学 Assembly-line architecture of polarization code belief propagation decoder
CN106571832A (en) * 2016-11-04 2017-04-19 华南理工大学 Multi-system LDPC cascaded neural network decoding method and device
CN106571831A (en) * 2016-10-28 2017-04-19 华南理工大学 LDPC hard decision decoding method based on depth learning and decoder

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160043743A1 (en) * 2012-12-03 2016-02-11 Digital PowerRadio, LLC Systems and methods for advanced iterative decoding and channel estimation of concatenated coding systems
CN103259545A (en) * 2013-04-26 2013-08-21 西安理工大学 Quasi-cyclic low density odd-even check code belief propagation decoding method based on oscillation
CN103607208A (en) * 2013-11-25 2014-02-26 上海数字电视国家工程研究中心有限公司 LDPC minimum sum decoding method based on normalization correction factor sequences
CN103929210A (en) * 2014-04-25 2014-07-16 重庆邮电大学 Hard decision decoding method based on genetic algorithm and neural network
US20150333775A1 (en) * 2014-05-15 2015-11-19 Broadcom Corporation Frozen-Bit Selection for a Polar Code Decoder
CN104539296A (en) * 2015-01-21 2015-04-22 西安电子科技大学 Method for improving BP (belief propagation) decoding by use of polarisation code based on early termination of iterative strategy
CN105187073A (en) * 2015-10-13 2015-12-23 东南大学 BP decoding method and device for polarization code
CN105634507A (en) * 2015-12-30 2016-06-01 东南大学 Assembly-line architecture of polarization code belief propagation decoder
CN106571831A (en) * 2016-10-28 2017-04-19 华南理工大学 LDPC hard decision decoding method based on depth learning and decoder
CN106571832A (en) * 2016-11-04 2017-04-19 华南理工大学 Multi-system LDPC cascaded neural network decoding method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BO YUAN等: "Architecture optimizations for BP polar decoders", 《2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING》 *
TOBIAS GRUBER等: "On deep learning-based channel decoding", 《2017 51ST ANNUAL CONFERENCE ON INFORMATION SCIENCES AND SYSTEMS (CISS)》 *
ZHANG Qingshuang et al.: "An improved belief propagation decoder for polar codes", Communication Technology *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107659318A (en) * 2017-11-07 2018-02-02 Southeast University Adaptive polar code encoding method
CN108023679A (en) * 2017-12-07 2018-05-11 The 54th Research Institute of China Electronics Technology Group Corporation Iterative decoding scaling factor optimization method based on parallel concatenated systematic polar codes
CN108023679B (en) * 2017-12-07 2020-06-16 The 54th Research Institute of China Electronics Technology Group Corporation Iterative decoding scaling factor optimization method based on parallel concatenated systematic polar codes
WO2019134553A1 (en) * 2018-01-02 2019-07-11 Huawei Technologies Co., Ltd. Method and device for decoding
CN108092672B (en) * 2018-01-15 2021-03-19 Communication University of China BP decoding method based on folded scheduling
CN108092672A (en) * 2018-01-15 2018-05-29 Communication University of China BP decoding method based on folded scheduling
CN108418588A (en) * 2018-01-17 2018-08-17 China Jiliang University Low-latency polar code SC decoder design
CN108418588B (en) * 2018-01-17 2022-02-11 China Jiliang University Low-latency polar code decoder
CN108199807A (en) * 2018-01-19 2018-06-22 University of Electronic Science and Technology of China Polar code reliability estimation method
CN108199807B (en) * 2018-01-19 2020-06-16 University of Electronic Science and Technology of China Polar code reliability estimation method
CN108449091B (en) * 2018-03-26 2021-05-11 Southeast University Polar code belief propagation decoding method and decoder based on approximate computing
CN108449091A (en) * 2018-03-26 2018-08-24 Southeast University Polar code belief propagation decoding method and decoder based on approximate computing
CN108540267A (en) * 2018-04-13 2018-09-14 Beijing University of Posts and Telecommunications Multi-user data detection method and device based on deep learning
CN108847848A (en) * 2018-06-13 2018-11-20 University of Electronic Science and Technology of China BP decoding algorithm for polar codes based on information post-processing
CN108847848B (en) * 2018-06-13 2021-10-01 University of Electronic Science and Technology of China BP decoding algorithm for polar codes based on information post-processing
CN108777584A (en) * 2018-07-06 2018-11-09 China University of Petroleum (East China) Fast optimization method for polar code decoding parameters
CN109586730B (en) * 2018-12-06 2020-07-07 University of Electronic Science and Technology of China Polar code BP decoding algorithm based on intelligent post-processing
CN109728824A (en) * 2018-12-06 2019-05-07 Hangzhou Dianzi University LDPC code iterative decoding method based on deep learning
CN109586730A (en) * 2018-12-06 2019-04-05 University of Electronic Science and Technology of China Polar code BP decoding algorithm based on intelligent post-processing
CN109978079A (en) * 2019-04-10 2019-07-05 Northeast Electric Power University Data cleaning method based on an improved stacked denoising autoencoder
CN110719112A (en) * 2019-09-12 2020-01-21 Tianjin University Parameter-adaptive RS code decoding method based on deep learning
CN110719112B (en) * 2019-09-12 2023-08-29 Tianjin University Parameter-adaptive RS code decoding method based on deep learning
CN110798228A (en) * 2019-10-29 2020-02-14 Nanjing Ningqi Intelligent Computing Chip Research Institute Co., Ltd. Polar code flip decoding method and system based on deep learning
CN111313914B (en) * 2019-11-05 2021-09-28 Beihang University SCL simplified decoding method based on neural network classifier
CN111313914A (en) * 2019-11-05 2020-06-19 Beihang University SCL simplified decoding method based on neural network classifier
CN111541517A (en) * 2020-04-17 2020-08-14 Beijing Jiaotong University List polar code propagation decoding method
CN111541517B (en) * 2020-04-17 2022-03-25 Beijing Jiaotong University List polar code propagation decoding method
CN111697975A (en) * 2020-06-01 2020-09-22 Xi'an Technological University Polar code successive cancellation decoding optimization algorithm based on a fully connected neural network
CN112332863A (en) * 2020-10-27 2021-02-05 Dongfanghong Satellite Mobile Communication Co., Ltd. Polar code decoding algorithm, receiver and system for low-SNR low-earth-orbit satellite scenarios
CN112332863B (en) * 2020-10-27 2023-09-05 Dongfanghong Satellite Mobile Communication Co., Ltd. Polar code decoding algorithm, receiver and system for low-SNR low-earth-orbit satellite scenarios
CN113014270A (en) * 2021-02-22 2021-06-22 Shanghai University Partially folded polar code decoder with configurable code length
CN117914446A (en) * 2023-12-31 2024-04-19 Hangzhou Haiyan Technology Co., Ltd. Decoding method and system for algebraic codes

Also Published As

Publication number Publication date
CN107241106B (en) 2020-07-14

Similar Documents

Publication Publication Date Title
CN107241106A (en) Polarization code decoding algorithm based on deep learning
US7539920B2 (en) LDPC decoding apparatus and method with low computational complexity algorithm
Nachmani et al. Near maximum likelihood decoding with deep learning
CN105515590B (en) Efficient low-complexity successive cancellation list decoding method for polar codes
CN107204780B (en) Joint BP decoding algorithm and device for polar-LDPC concatenated codes
CN108039891A (en) Polar code BP decoding method and device based on a multi-stage update schedule
CN110233628B (en) Self-adaptive belief propagation list decoding method for polarization code
CN115664899A (en) Channel decoding method and system based on graph neural network
Deng et al. Reduced-complexity deep neural network-aided channel code decoder: A case study for BCH decoder
CN110752894B (en) CNN-based LDPC code blind channel decoding method and decoder
Dai et al. New min-sum decoders based on deep learning for polar codes
CN109495116B (en) SC-BP mixed decoding method of polarization code and adjustable hardware architecture thereof
CN105680881A (en) LDPC decoding method and decoder
CN111835364B (en) Low-complexity neural BP decoding method for polar codes
Hernandez et al. The three/two Gaussian parametric LDLC lattice decoding algorithm and its analysis
CN111313913A (en) Low-delay cross-scheduling polarization code BP decoding method and device
CN108365918B (en) Non-binary LDPC code decoding method based on an effective concentration criterion
CN110166060A (en) High-throughput pipelined polar code BP decoder and its implementation method
Qin et al. Convolutional neural network-based polar decoding
CN110855298B (en) Low iteration number polarization code BP decoding method based on subchannel freezing condition
WO2010054526A1 (en) RS decoding device and key polynomial solving device used by the RS decoding device
Nayak et al. Green detnet: Computation and memory efficient detnet using smart compression and training
Nachmani et al. A gated hypernet decoder for polar codes
CN110717343A (en) Optimal alignment method based on transformer attention mechanism output
Debbabi et al. Analysis of ADMM-LP algorithm for LDPC decoding, a first step to hardware implementation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant