WO2018121887A1 - Apparatus and method for shaping the probability distribution of a data sequence - Google Patents

Apparatus and method for shaping the probability distribution of a data sequence

Info

Publication number
WO2018121887A1
Authority
WO
WIPO (PCT)
Prior art keywords
shaping
data sequence
symbols
probabilistic
probability distribution
Prior art date
Application number
PCT/EP2017/050023
Other languages
English (en)
Inventor
Marcin PIKUS
Wen Xu
Original Assignee
Huawei Technologies Duesseldorf Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Duesseldorf Gmbh filed Critical Huawei Technologies Duesseldorf Gmbh
Priority to PCT/EP2017/050023 priority Critical patent/WO2018121887A1/fr
Priority to CN201780081987.6A priority patent/CN110140330B/zh
Publication of WO2018121887A1 publication Critical patent/WO2018121887A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L27/00 Modulated-carrier systems
    • H04L27/32 Carrier systems characterised by combinations of two or more of the types covered by groups H04L27/02, H04L27/10, H04L27/18 or H04L27/26
    • H04L27/34 Amplitude- and phase-modulated carrier systems, e.g. quadrature-amplitude modulated carrier systems
    • H04L27/3405 Modifications of the signal space to increase the efficiency of transmission, e.g. reduction of the bit error rate, bandwidth, or average power
    • H04L27/3411 Modifications of the signal space to increase the efficiency of transmission, e.g. reduction of the bit error rate, bandwidth, or average power reducing the peak to average power ratio or the mean power of the constellation; Arrangements for increasing the shape gain of a signal set
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L1/00 Arrangements for detecting or preventing errors in the information received
    • H04L1/0001 Systems modifying transmission characteristics according to link quality, e.g. power backoff
    • H04L1/0002 Systems modifying transmission characteristics according to link quality, e.g. power backoff by adapting the transmission rate
    • H04L1/0003 Systems modifying transmission characteristics according to link quality, e.g. power backoff by adapting the transmission rate by switching between different modulation schemes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L27/00 Modulated-carrier systems
    • H04L27/32 Carrier systems characterised by combinations of two or more of the types covered by groups H04L27/02, H04L27/10, H04L27/18 or H04L27/26
    • H04L27/34 Amplitude- and phase-modulated carrier systems, e.g. quadrature-amplitude modulated carrier systems
    • H04L27/36 Modulator circuits; Transmitter circuits

Definitions

  • This invention relates to apparatus and methods for shaping the probability distribution of a data sequence.
  • PSCM Probabilistically shaped coded modulation
  • the scheme uses a shaping encoder (ShEnc) and a channel encoder (ChEnc) at the transmitter side and a channel decoder (ChDec) followed by a shaping decoder (ShDec) at the receiver side.
  • This brings a number of advantages.
  • the shaping encoder transforms uniformly distributed bits of the input message to a non-uniform distribution so that the channel input symbols approach the distribution that is able to achieve capacity for the channel.
  • the transmitter can adjust the rate of the transmission, without changing the parameters of the forward error correction (FEC) code.
  • FEC forward error correction
  • BICM Bit-Interleaved Coded Modulation
  • PAS probabilistic amplitude shaping
  • the shaping encoder 101 aims to produce a sequence of symbols (random variables) with a desired probability distribution given a sequence of symbols as an input.
  • the input symbols often have a uniform probability distribution.
  • the output symbols often have a non-uniform probability distribution.
  • a shaping encoder is sometimes referred to as a distribution matcher (DM), and a shaping decoder is called a distribution dematcher or inverse DM.
  • Distribution matching is usually performed on a block-to-block (or block-to- variable-length) basis.
  • the shaping encoder usually maps a uniformly distributed input sequence of fixed length to a fixed length (or variable length, depending on the input sequence) sequence of symbols distributed according to a desired probability distribution. The mapping should be one-to-one.
  • the shaping encoder 101 outputs a sequence 108 of n_c amplitudes.
  • Each amplitude is mapped independently 102, by a fixed mapping b_A, to a corresponding bit label of length m - 1.
  • a probabilistic shaper comprising an input configured to receive an input data sequence, which has a first probability distribution.
  • the probabilistic shaper also comprises a splitter configured to divide the input data sequence into a plurality of separate data sequences. It comprises a plurality of shaping encoders. Each shaping encoder is configured to receive a respective one of the separate data sequences and map it to an output data sequence having a different probability distribution from the first probability distribution. Each shaping encoder is configured to perform its mapping independently of the other shaping encoders. It also comprises a combiner configured to receive the output data sequences from the plurality of shaping encoders and combine them to form a single data sequence having a target probability distribution.
  • the probabilistic shaper is able to increase throughput by dividing up the incoming data sequence into sequences that can be processed in parallel.
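  • By way of an illustrative sketch only, the following Python snippet mimics the architecture described above: a splitter divides a uniform input bit sequence into two halves, each half is passed through a toy block-to-block shaping encoder (here a stand-in codebook that maps every 2 uniform bits to a 3-bit word with P(1) = 0.25; any real distribution matcher could take its place), and a combiner concatenates the shaped outputs. The function names, codebook and block sizes are assumptions made for this example.

```python
from itertools import islice

# Toy block-to-block shaping encoder: maps every 2 uniform input bits to a
# 3-bit output word from a codebook in which '1' occurs with probability 1/4
# instead of 1/2. A real distribution matcher (e.g. a CCDM) could be used here.
CODEBOOK = {(0, 0): (0, 0, 0), (0, 1): (0, 0, 1),
            (1, 0): (0, 1, 0), (1, 1): (1, 0, 0)}

def shape_block(bits):
    """Map a sequence of uniform bits to a biased bit sequence, block by block."""
    out = []
    it = iter(bits)
    while True:
        block = tuple(islice(it, 2))
        if len(block) < 2:
            break
        out.extend(CODEBOOK[block])
    return out

def probabilistic_shaper(input_bits, n_encoders=2):
    # Splitter: divide the input sequence into separate sub-sequences.
    k = len(input_bits) // n_encoders
    parts = [input_bits[i * k:(i + 1) * k] for i in range(n_encoders)]
    # Shaping encoders: each part is shaped independently (could run in parallel).
    shaped = [shape_block(part) for part in parts]
    # Combiner: here a simple concatenation (multiplexer).
    return [bit for seq in shaped for bit in seq]

if __name__ == "__main__":
    message = [0, 1, 1, 0, 1, 1, 0, 0]            # uniform input bits
    shaped = probabilistic_shaper(message)
    print(shaped, "fraction of ones:", sum(shaped) / len(shaped))
```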
  • the plurality of shaping encoders and the combiner may each be configured to map their respective received data sequences to output data sequences that are formed from a respective alphabet, and the shaping encoders may each be configured to map to a smaller alphabet than the combiner. This reduces the complexity of the individual shaping encoding operations and further increases throughput.
  • the combiner may be configured to receive the output data sequences and map them to a single data sequence that is formed from a target alphabet, wherein the target alphabet comprises a plurality of symbols and each symbol can be represented by one or more sub-symbols.
  • Each shaping encoder may be configured to map its respective separate data sequence to an output data sequence that is formed from an alphabet that comprises at least one of the sub-symbols. The shaping encoders are thus able to output individual data sequences, which can be independently formed and then jointly mapped to the target alphabet.
  • the splitter may be configured to divide the input data sequence into a number of separate data sequences that is equal to the number of the plurality of sub-symbols that represent each symbol in the target alphabet.
  • the probabilistic shaper may be configured such that one or more of those separate data sequences is not input into a shaping encoder but is instead input directly into the combiner. This can further improve throughput and reduces complexity.
  • Each sub-symbol may comprise one or more bit levels.
  • the combiner may be configured to treat the output data sequences from the plurality of shaping encoders as each providing a respective one or more bit levels of the sub-symbols. This assists the combiner to jointly map the outputs of the shaping encoder to the desired target alphabet.
  • the combiner may be configured to combine the output data sequences to form the single data sequence by combining one or more sub-symbols that are comprised in the output data sequences of the plurality of shaping encoders to form a combination of sub-symbols and mapping the combination of sub-symbols to the symbol in the target alphabet that is represented by that combination. This enables the combiner to jointly map the outputs of the shaping encoder to the desired target alphabet.
  • Each symbol in the target alphabet may be associated with a respective transmit power.
  • the combiner may be configured to treat each symbol in the target alphabet as being represented by a specific combination of sub-symbols, whereby a symbol in the target alphabet that is associated with a higher transmit power than another symbol in the target alphabet is treated as being represented by a specific combination of sub-symbols that has a probability that is lower than or equal to a probability of a specific combination of sub-symbols that represents the other symbol. This improves performance by enabling the output distribution of the output symbols to approximate a Gaussian distribution.
  • the combiner may be configured to use Natural Code mapping to map a combination of sub-symbols to a symbol in the target alphabet. This helps to provide a probabilistically ordered mapping.
  • the combiner may be configured to form the single data sequence to have a target probability distribution that is anticipated to offer improved transmission performance compared with the first probability distribution.
  • the improvements could arise in any area, including reduced error rates, reduced transmit power, reduced Peak-to-Average Power Ratio (PAPR) etc.
  • PAPR Peak-to-Average Power Ratio
  • a transmitter comprising a probabilistic shaper as described in any of the preceding paragraphs.
  • a probabilistic deshaper comprises an input configured to receive an input data sequence, which has a target probability distribution.
  • the probabilistic deshaper also comprises a divider configured to arrange the input data sequence into a plurality of separate data sequences. It comprises a plurality of shaping decoders, each configured to receive a respective one of the separate data sequences and demap it to an output data sequence having a different probability distribution from the target probability distribution.
  • Each shaping decoder performs said demapping independently of the other shaping decoders.
  • It also comprises a combiner configured to receive the output data sequences from the plurality of shaping decoders and combine them to form a single data sequence having a first probability distribution.
  • the probabilistic deshaper is thus able to reverse the processing of the probabilistic shaper and increase throughput by dividing up the incoming data sequence into sequences that can be processed in parallel.
  • the divider may comprise a demapper that is configured to receive the input data sequence, which is formed from a target alphabet, wherein the target alphabet comprises a plurality of symbols and each symbol can be represented by a plurality of sub-symbols. It may also be configured to demap the input data sequence to a demapped data sequence that is formed from an alphabet that comprises the plurality of sub-symbols. It may also be configured to output the demapped data sequence for arranging into the plurality of separate data sequences. The divider is thus able to divide the input data stream into appropriate sequences of sub-symbols that can be demapped to reclaim the original data that was input into the probabilistic shaper.
  • the divider may be configured to arrange the demapped data sequence into a plurality of separate data sequences by arranging data corresponding to the same sub-symbol in the same sequence and data corresponding to different sub-symbols in different sequences.
  • the divider is thus able to divide the input data stream into sequences of sub-symbols that can be independently demapped.
  • a receiver comprises a probabilistic deshaper described in any of the preceding paragraphs.
  • a method comprises receiving an input data sequence, which has a first probability distribution. It comprises dividing the input data sequence into a plurality of separate data sequences. It comprises independently mapping each separate data sequence to an output data sequence that has a different probability distribution from the first probability distribution. It also comprises combining the output data sequences to form a single data sequence having a target probability distribution.
  • Figure 1 shows an example of a prior art PAS system
  • Figure 2 shows an example of a probabilistic shaper according to an embodiment of the invention
  • Figure 3 shows an example of a probabilistic deshaper according to an embodiment of the invention
  • Figure 4 is a flow chart that illustrates an example of a shaping technique according to an embodiment of the invention
  • Figure 5 shows a more detailed example of a probabilistic shaper
  • Figure 6 shows an example of a probabilistic shaper configured to perform split shaping encoding
  • Figure 7 shows an example of a probabilistic shaper configured to perform binary split shaping encoding
  • Figure 8 shows an example of a probabilistic shaper incorporated in a PSCM transmit chain
  • Figure 9 shows simulation results for split shaping encoders that use Gray Binary Code mapping and Natural Binary Code mapping.
  • FIG. 2 shows an example of an apparatus for implementing probabilistic shaping.
  • the apparatus, which is termed a "probabilistic shaper" herein, is shown generally at 200. It comprises an input 201, a splitter 202 and a plurality of shaping encoders 203.
  • the splitter is configured to receive an input data sequence from the input and divide it into a plurality of separate data sequences.
  • the splitter might simply be a switch but it could be implemented by any suitable means for separating an input data stream, including e.g. a demultiplexer.
  • Each of the shaping encoders is configured to receive one of the separate data sequences and shape it to generate an output data sequence that has a different probability distribution from that of the original input data sequence.
  • the shaping encoders could be configured to produce output sequences that all have the same probability distribution or they could differ from each other. Each shaping encoder performs its mapping independently of the other shaping encoders, so each shaping operation is performed without reference to the shaping operations being performed by other shaping encoders.
  • a data sequence is typically formed of a plurality of bits or symbols. Those bits or symbols are usually taken from an alphabet that defines the available bits or symbols. For example, a binary alphabet is the set ⁇ 0, 1 ⁇ .
  • the "probability distribution" of a data sequence defines the relative proportions of the different alphabet members that the sequence contains. So, for example, the probability distribution of a binary sequence refers to its relative proportions of '1's and '0's. Changing a probability distribution of a sequence is referred to as "shaping" herein.
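  • As a small illustration of this definition, the following Python sketch computes the empirical probability distribution of a sequence, i.e. the relative proportion of each alphabet member it contains (the function name and the example sequence are chosen for this illustration only).

```python
from collections import Counter

def empirical_distribution(sequence):
    """Relative proportion of each alphabet member present in the sequence."""
    counts = Counter(sequence)
    total = len(sequence)
    return {symbol: count / total for symbol, count in counts.items()}

# A biased binary sequence: '1' occurs in 25% of the positions.
print(empirical_distribution([0, 0, 0, 1, 0, 0, 1, 0]))   # {0: 0.75, 1: 0.25}
```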
  • the input data sequence that is received by the probabilistic shaper is a uniform sequence, i.e. a sequence in which the relative proportions of the alphabet members are the same.
  • the shaping encoders are configured to transform these uniform distributions into "biased" distributions, where the term “biased” refers to the probability of the different alphabet members in a sequence being different.
  • the probabilistic shaper also includes a combiner 204.
  • the combiner is configured to receive the output data sequences from the plurality of shaping encoders and combine them to form a single data sequence having a target probability distribution.
  • the combiner may optionally incorporate a mapper 205, which is configured to map the output data sequences that the combiner receives from the shaping encoders to a target alphabet.
  • the probabilistic shaper will usually form part of a transmitter. For example, it could form part of a transmitter that is capable of formatting data for transmission according to a PSCM transmit scheme. This formatting is reversed at the receiver.
  • the processing that is performed by the probabilistic shaper shown in Figure 2 may be reversed by its mirror: the probabilistic deshaper shown in Figure 3.
  • FIG. 3 shows an example of an apparatus for implementing probabilistic deshaping.
  • the apparatus, which is termed a "probabilistic deshaper" herein, is shown generally at 300. It comprises an input 301, a divider 302 and a plurality of shaping decoders 303.
  • the divider is configured to arrange the input data sequence into a plurality of separate data sequences. As with splitter 202, the divider might simply be a switch. The job of the divider is likely to be slightly more complicated than that of splitter 202, however (as will become apparent from some of the examples below) so it is likely to be implemented by a more complex component, such as a demultiplexer.
  • the divider comprises an optional demapper 304, which may be required if split- shaping encoding has been implemented at the transmitter-side (which is described in more detail below).
  • Each of the shaping decoders is configured to receive one of the separate data sequences and shape it to generate an output data sequence.
  • the output data sequences have a different probability distribution from that of the original input data sequence.
  • the shaping decoders could be configured to produce output sequences that all have the same probability distribution or they could differ from each other.
  • Each shaping decoder performs its demapping independently of the other shaping decoders.
  • the probabilistic deshaper also has a combiner 305 configured to receive the output data sequences from the plurality of shaping decoders and combine them to form a single data sequence. For the deshaper, this single data sequence should be equivalent to the original data sequence that was processed in the transmitter, i.e. the single data sequence should have the first probability distribution.
  • the probabilistic shaper and deshaper work in a very similar way, in that both receive an input data sequence and divide it into a number of smaller, separate sequences, which are then independently mapped to an output sequence.
  • This general method is illustrated in Figure 4, which commences in step S401 with receiving an input data sequence.
  • for the probabilistic shaper, this input data sequence will have a first probability distribution.
  • for the probabilistic deshaper, this signal will be a received signal that has already gone through a shaping process and hence has the target distribution. Both the probabilistic shaper and the probabilistic deshaper then split their respective input sequences into a plurality of separate data sequences (step S402).
  • the probabilistic shaper and the probabilistic deshaper both include shaping encoders/decoders for translating their respective input sequences into output sequences that have a different probability distribution from the data sequence that was originally received by the apparatus (step S403).
  • both apparatus then combine the outputs from their respective shaping encoders or decoders to form a single data sequence (step S404).
  • this combination is likely to be a simple concatenation of the data sequences output by the different shaping decoders.
  • this operation may be more complicated and involve jointly mapping sub-symbols in the output sequences of the shaping encoders to target symbols (as explained in more detail below).
  • the apparatus shown in Figures 2 and 3 introduce a new architecture for block-to- block or block-to-variable-length shaping encoding and decoding. This new architecture provides reduced complexity and parallelisation, which improves throughput.
  • Figures 2 and 3 are intended to correspond to a number of functional blocks. This is for illustrative purposes only. Figures 2 and 3 are not intended to define a strict division between different parts of hardware on a chip or between different programs, procedures or functions in software.
  • some or all of the signal processing techniques described herein may be performed wholly or partly by a processor acting under software control.
  • some or all of the signal processing techniques described herein are likely to be performed wholly or partly in hardware. This particularly applies to techniques incorporating repetitive arithmetic operations, such as mapping and shaping.
  • the functional blocks are expected to be implemented as dedicated hardware in a transmitter/receiver chain.
  • a transmitter chain and a receiver chain may be implemented in the same device and components.
  • one or more of the components in the probabilistic shaper and deshaper may have a dual-purpose depending on whether the device is operating in a "transmit" mode or a "receive” mode.
  • the shaping encoders and decoders may be the same components, just configured to perform different mappings depending on whether they are operating in transmit mode or receive mode.
  • the majority of the description below concentrates on the transmitter-side, since it is the transmitter-side that mandates the receiver-side processing. It should be understood, even where this is not explicitly stated, that all of the techniques described below as being performed on the transmitter-side will be mirrored on the receiver-side in order to reverse the transmitter-processing and obtain the original data sequence.
  • The specific components found in a transmitter/receiver chain are dependent on the exact waveform and telecommunications protocol that the transmitter/receiver is configured to implement.
  • One or more implementations of the invention are described below with reference to an application in which the transmitter/receiver is configured to operate in accordance with a PSCM transmission scheme. This is for the purposes of example only; it should be understood that the scope of the invention is not limited to any particular transmission scheme, waveform, or telecommunications protocol.
  • A more detailed example of a probabilistic shaper is shown in Figure 5. This implementation may be termed the "separate shaping encoder" implementation.
  • the separate shaping encoder splits the input sequence into multiple shorter sequences.
  • an input data sequence of k bits 501 is split into two separate sequences 502, 503, each of length k/2.
  • the splitter itself is not shown in Figure 5.
  • the probabilistic shaper includes two building block shaping encoders 504, 505, each of which is configured to receive one of the separate data sequences.
  • the shaping encoders are configured to perform distribution matching on each sequence in parallel to generate respective output sequences 508, 509.
  • the shaping encoders are both configured to map their respective input sequences to output sequences having a probability distribution P A .
  • the shaping encoders could also map to output sequences having different probability distributions.
  • the multiplexor 510 is then configured to concatenate the shaping encoder output sequences to form a single output sequence 511.
  • the combiner is thus able to be implemented by a multiplexor 510, since the single output sequence is formed through concatenating the output sequences from the shaping encoders.
  • the shaping encoders may map to an intermediate alphabet that is different from the target alphabet.
  • the combiner may be configured to jointly map the output sequences of the shaping encoders to the target alphabet to form the single output sequence. This is described in more detail below.
  • the shaping encoders may map to a different alphabet from the combiner.
  • the shaping encoders may map to a smaller alphabet than the combiner. This results in multiple shaping encoders, all operating on shorter sequences in parallel with each other.
  • An example of a probabilistic shaper in which the shaping encoders and the combiner use different alphabets is shown in Figure 6.
  • the building block shaping encoders 604, 605 each perform a distribution matching independently on smaller alphabets than the target alphabet X.
  • Each shaping encoder uses its own respective alphabet, so shaping encoder 604 uses alphabet A to generate its output sequence 608 whereas shaping encoder 605 uses alphabet B to generate its output sequence 609.
  • the joint distribution of the symbols in the shaping encoder output sequences 608, 609 is mapped to the symbols of the target alphabet by mapper 610.
  • the general principle underlying the probabilistic shaper shown in Figure 6 is as follows.
  • the probabilistic shaper may have k parallel building block shaping encoders. Together these shaping encoders can be considered as implementing a "split shaping encoder".
  • the operation of the split shaping encoder will now be described with reference to an example in which the shaping system is configured as a CCDM (Constant Composition Distribution Matcher).
  • CCDM Constant Composition Distribution Matcher
  • the output alphabet of the split shaping encoder has a size of w^k, where w is the number of "levels" in the shaping encoder alphabets (which are the CCDM alphabets in this example).
  • each symbol in the target alphabet may be capable of being represented by one or more sub-symbols, and those sub-symbols may in turn form the alphabets of the shaping encoders.
  • each symbol in the target alphabet has a label L_i.
  • Each label may have k bits. These can be viewed as k "bit levels". The number of bit levels may be the same as the number of shaping encoders.
  • Each shaping encoder can, in effect, be considered as being responsible for mapping the input data sequence to one of the "bit levels".
  • the combiner correspondingly treats the output data sequences from the shaping encoders as each providing a respective bit-level for the symbol labels. (It is also possible for each shaping encoder to handle more than one bit level. This is described in more detail below).
  • the probability of a label is then a product of the probabilities of the corresponding bit levels. For example, the label b1b2...bk has probability P_B1(b1) · P_B2(b2) · ... · P_Bk(bk).
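  • As a minimal numeric illustration of this product rule (with bit-level probabilities assumed purely for the example):

```python
# Probability of a 3-bit label as the product of its bit-level probabilities.
# p_one[i] is the (assumed, illustrative) probability that bit level i equals 1.
p_one = [0.2, 0.3, 0.5]

def label_probability(label, p_one):
    probability = 1.0
    for bit, p1 in zip(label, p_one):
        probability *= p1 if bit else (1.0 - p1)
    return probability

print(label_probability((1, 0, 1), p_one))   # 0.2 * 0.7 * 0.5 = 0.07
```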
  • the input parameters to the shaping system are the target output distribution P_A and the length n_c of the output data sequence.
  • the output distribution P_A is emulated by outputting a sequence of symbols of a certain type, i.e. a sequence with a fixed empirical distribution.
  • n_a is the number of occurrences of the symbol a in the output sequence, so that n_a / n_c approximates P_A(a).
  • the labels can then be determined by selecting q^(k_c) sequences, each of which has the empirical distribution, and defining a one-to-one mapping between those labels and the symbols of the target alphabet.
  • the distribution matching is then performed for each bit level of the binary labels, where the nth "bit-level" refers to the nth bit of a binary label.
  • the binary output sequences are mapped back to symbols. In this way, the input bits can be processed independently and in parallel, such that a higher throughput can be obtained.
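  • The following Python sketch illustrates one simple fixed-to-fixed binary distribution matcher of the kind that could serve as such a building block: it maps a block of uniform input bits to a fixed-length binary word with a constant number of ones, using combinatorial unranking. It is a simplified stand-in for the arithmetic-coding-based CCDM of the literature, and the function names, block lengths and composition are assumptions made for this example.

```python
from math import comb, floor, log2

def index_to_composition(index, n, w):
    """Unrank: return the index-th length-n binary word containing exactly
    w ones, in lexicographic order."""
    word, ones_left = [], w
    for pos in range(n):
        remaining = n - pos - 1
        zero_count = comb(remaining, ones_left)   # words that place a 0 here
        if index < zero_count:
            word.append(0)
        else:
            word.append(1)
            index -= zero_count
            ones_left -= 1
    return word

def shape_constant_composition(bits, n, w):
    """Toy fixed-to-fixed binary distribution matcher: interpret the input bits
    as an integer index and map it to a length-n word with exactly w ones,
    so that the output empirically has P(1) = w / n."""
    k = floor(log2(comb(n, w)))                   # number of input bits that fit
    assert len(bits) == k
    index = int("".join(map(str, bits)), 2)
    return index_to_composition(index, n, w)

# 8 uniform input bits -> 14 output bits with exactly 3 ones (P(1) ~ 0.21).
print(shape_constant_composition([1, 0, 1, 1, 0, 0, 1, 0], n=14, w=3))
```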
  • the two shaping encoders 701, 702 in Figure 7 are bit-level (or binary) shaping encoders.
  • the apparatus in Figure 7 represents the transmitter-side implementation; corresponding bit-level (or binary) shaping decoders would form the building blocks of the receiver-side implementation. These building blocks could be replaced by any other binary source encoder or decoder (e.g. one that works on a block-to-block basis), which outputs biased bit sequences.
  • the target alphabet in the example of Figure 7 comprises four symbols, each of which is assigned a unique two-bit label.
  • the apparatus in Figure 7 is configured to perform distribution matching on two bit-levels, B 1 and B 2 , of binary labels assigned to the symbols.
  • the symbols in the target alphabet can be labelled as follows:
  • each symbol in the target alphabet is represented by two sub-symbols, 0 and 1, which form a unique label.
  • the shaping encoders each use the binary alphabet {0, 1}, i.e. alphabets that are formed from the sub-symbols.
  • the next stage is to find bit-level distributions such that a "good" approximation to the target distribution P_A is obtained.
  • the binary shaping encoders 701, 702 are each configured to map their respective input bits to output bit sequences that show the appropriate distribution of '1's and '0's according to the marginal probabilities of B_1 and B_2.
  • the mapper 703 treats the output from binary shaping encoder 701 as providing bit B_1 and the output from binary shaping encoder 702 as providing bit B_2. It therefore takes alternate bits from each sequence to jointly map bits B_1 and B_2 to the symbols of the target alphabet, to obtain a single output sequence of symbols that have the target probability distribution P_A.
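  • A sketch of this joint mapping step is given below, assuming an illustrative four-symbol target alphabet {a1, a2, a3, a4} with Natural Binary labels; the symbol names, label table and example bit streams are not taken from Figure 7 but chosen for illustration.

```python
# Joint mapping in the style of Figure 7: two independently shaped bit-level
# sequences are combined symbol by symbol, each (B1, B2) pair selecting one
# target symbol via a fixed label table. The alphabet {a1..a4} and the
# Natural-Binary label assignment below are assumptions made for this sketch.
LABEL_TO_SYMBOL = {(0, 0): "a1", (0, 1): "a2", (1, 0): "a3", (1, 1): "a4"}

def combine_bit_levels(b1_stream, b2_stream):
    """Mapper-703-style combiner: take one bit from each shaped stream per symbol."""
    assert len(b1_stream) == len(b2_stream)
    return [LABEL_TO_SYMBOL[(b1, b2)] for b1, b2 in zip(b1_stream, b2_stream)]

# Example outputs of the two binary shaping encoders (bit levels B1 and B2),
# taken here as given illustrative sequences.
b1 = [0, 0, 1, 0, 0, 0, 0, 1]
b2 = [1, 0, 1, 0, 1, 1, 0, 0]
print(combine_bit_levels(b1, b2))   # ['a2', 'a1', 'a4', 'a1', 'a2', 'a2', 'a1', 'a3']
```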
  • the shaping apparatus in Figure 7 has an input length of This is
  • the apparatus in Figure 7 is configured to perform binary distribution matching inside a non-binary distribution matcher, for reasons of reduced complexity and improved parallelization. These techniques are applicable to a PSCM scheme and can be incorporated before the channel code in a PSCM transmit chain (with the transmit chain being mirrored in the receive chain).
  • Figure 8 shows an example of part of a PSCM transmit chain that incorporates a split shaping arrangement 801 such as that described above. After the symbols are generated, the rest of the chain can function in the same way as the PAS system shown in Figure 1.
  • the mapping b_A (from symbols to bits, just before the FEC Encoder) is suitably a Gray binary code (GBC) mapping, which is considered optimal for BICM.
  • the divider may optionally include a demapper 304.
  • the demapper is configured to receive the input data sequence, which is formed from the target alphabet. It is also configured to demap that input data sequence to a demapped data sequence that is formed from the appropriate sub-symbols.
  • the divider is configured to arrange the output data sequences into a plurality of separate data sequences so that data corresponding to the same sub-symbol is directed to the same shaping decoder and data corresponding to different sub-symbols is directed to different shaping decoders. Essentially, this just reverses the process performed in the transmitter.
  • a performance close to Additive White Gaussian Noise (AWGN) channel capacity can be achieved if the signal amplitudes approximate a Maxwell-Boltzmann distribution, i.e. a target alphabet where the probability of each amplitude a is proportional to exp(−ν·a²).
  • ν is a parameter depending on the Signal-to-Noise Ratio (SNR).
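  • A short Python sketch of such a Maxwell-Boltzmann amplitude distribution is given below; the amplitude set and the value of the parameter ν are illustrative assumptions (in practice ν would be chosen according to the operating SNR).

```python
import math

def maxwell_boltzmann(amplitudes, nu):
    """P_A(a) proportional to exp(-nu * a^2), normalised over the amplitude set."""
    weights = [math.exp(-nu * a * a) for a in amplitudes]
    total = sum(weights)
    return [w / total for w in weights]

# The 8 positive amplitudes of a 16-ASK constellation; nu = 0.01 is an
# arbitrary illustrative value (in practice it depends on the SNR).
amplitudes = [1, 3, 5, 7, 9, 11, 13, 15]
probabilities = maxwell_boltzmann(amplitudes, nu=0.01)
print([round(p, 4) for p in probabilities])   # probabilities decrease with amplitude
```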
  • Tables 1 to 3 show results obtained by labelling the target symbols using Natural Binary Coding, i.e., 000, 001, 010, 011, 100, ..., 111.
  • the results show correlation coefficients between the bits of the Maxwell-Boltzmann distribution and 16-ASK modulation amplitudes, i.e. 8 amplitudes (represented by B1B2B3) on the positive part of the x-axis and 8 symmetric amplitudes on the negative part of the x-axis.
  • the results show that the bits of different bit-levels are nearly independent of each other, and consequently confirm that bit-level shaping should perform well.
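  • The following sketch reproduces this kind of check in Python: it computes the pairwise correlation coefficients between the Natural-Binary bit levels of amplitudes drawn from a Maxwell-Boltzmann distribution. The amplitude set and the parameter value are illustrative assumptions; the small correlation magnitudes obtained are consistent with the near-independence noted above.

```python
import math
from itertools import combinations

def bit_level_correlations(probs, n_bits):
    """Pearson correlation between the bit levels of Natural-Binary labels
    0 .. len(probs)-1, where label i occurs with probability probs[i]."""
    def bit(i, level):                       # level 0 is the most significant bit
        return (i >> (n_bits - 1 - level)) & 1
    mean = [sum(p * bit(i, lv) for i, p in enumerate(probs)) for lv in range(n_bits)]
    var = [sum(p * (bit(i, lv) - mean[lv]) ** 2 for i, p in enumerate(probs))
           for lv in range(n_bits)]
    corr = {}
    for l1, l2 in combinations(range(n_bits), 2):
        cov = sum(p * (bit(i, l1) - mean[l1]) * (bit(i, l2) - mean[l2])
                  for i, p in enumerate(probs))
        corr[(l1 + 1, l2 + 1)] = cov / math.sqrt(var[l1] * var[l2])
    return corr

# Maxwell-Boltzmann probabilities of the 8 positive 16-ASK amplitudes, labelled
# 000 .. 111 in Natural Binary order of increasing amplitude (nu is illustrative).
amplitudes = [1, 3, 5, 7, 9, 11, 13, 15]
weights = [math.exp(-0.01 * a * a) for a in amplitudes]
probs = [w / sum(weights) for w in weights]
print(bit_level_correlations(probs, n_bits=3))   # small magnitudes: levels nearly independent
```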
  • the building block shaping encoders can also work on higher alphabets. This includes alphabets that are not necessarily binary and/or alphabets in which each symbol comprises more than one bit (so that each shaping encoder maps to more than one output bit). Using a higher-order shaping encoder can increase the accuracy of approximation of the desired target distribution P_A. This is especially important when the correlation or dependency of some bit-levels is not weak.
  • individual shaping encoders can be configured to consider two or more bit-levels jointly to form the output sub-symbols.
  • the mapping of the sub-symbols to the target alphabet need not have any special structure.
  • Each target symbol could be randomly assigned its unique label of sub- symbols.
  • a preferred option is to assign the labels to achieve a probabilistically ordered mapping.
  • target symbols that are associated with a relatively high transmit power are generally mapped to sub-symbol combinations that have a lower probability than the sub-symbol combinations to which relatively low transmit power symbols are mapped. It is also possible for symbols that neighbour each other in magnitude to be mapped to sub-symbol combinations that have the same probability.
  • the probability distribution of transmit symbols in a PSCM system is preferably configured to ensure that the general trend of the output symbols is for their probability to decrease as their magnitude increases. This can be achieved as follows: First a series of assumptions is made.
  • the probabilistic shaper comprises k parallel building block shaping encoders.
  • the target alphabet has a size of 2^k and each symbol in the alphabet has a label L_i of k bits.
  • the binary target distribution that applies to each of the k building-block shaping encoders can be designated P_B1, P_B2, ..., P_Bk.
  • the probability of a label is then a product of the probabilities of the corresponding bit levels, e.g. the label b1b2...bk has probability P_B1(b1) · P_B2(b2) · ... · P_Bk(bk).
  • the labels L can then be sorted according to their probability, for example such that P(L_(1)) ≥ P(L_(2)) ≥ ... ≥ P(L_(2^k)), where (j) is the label index after the sorting operation.
  • the result is target symbols that have the desired probability distribution.
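  • The sorting-and-assignment step can be sketched as follows, assuming two bit levels with illustrative probabilities: labels are sorted by decreasing product probability and assigned to amplitudes of increasing magnitude, so that higher-power symbols receive lower-probability labels.

```python
from itertools import product

def probabilistically_ordered_mapping(p_one, amplitudes):
    """Assign k-bit labels to amplitudes so that the label probability is
    non-increasing as the amplitude (and hence transmit power) increases.
    p_one[i] is the probability that bit level i equals 1."""
    k = len(p_one)
    assert len(amplitudes) == 2 ** k

    def label_probability(label):
        pr = 1.0
        for bit, p1 in zip(label, p_one):     # product of bit-level marginals
            pr *= p1 if bit else (1.0 - p1)
        return pr

    labels = sorted(product((0, 1), repeat=k), key=label_probability, reverse=True)
    return {label: (amp, label_probability(label))
            for label, amp in zip(labels, sorted(amplitudes))}

# Two bit levels biased towards 0: the most probable label (0, 0) is assigned
# to the smallest amplitude and the least probable label (1, 1) to the largest.
for label, (amp, pr) in probabilistically_ordered_mapping([0.2, 0.4], [1, 3, 5, 7]).items():
    print(label, "->", amp, round(pr, 3))
```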
  • the shaping encoders may be configured to use a Natural Code as an alphabet.
  • a probabilistically ordered Natural Code can achieve maximum redundancy in terms of unequal distribution from symbol level to bit levels. It can also be shown that under some constraints on the distribution applied by each of the k building block shaping encoders, a Natural Code mapping is a mapping which results in the target symbols having a decreasing probability distribution. This is demonstrated below using a practical, worked example. In this example it is assumed that the probabilistic shaper has 4 parallel building block binary shaping encoders.
  • the target alphabet has a size of 16 and each symbol in the alphabet has a 4-bit label
  • the distributions for each of the 4 binary shaping encoders can be designated P_B1, P_B2, P_B3 and P_B4.
  • These distributions are subject to the following constraints:
  • This general theorem can be extended to cover any Natural Code combiner, i.e. it is possible to derive a necessary condition for a Natural Code to be a probabilistically ordered code.
  • A non-probabilistically ordered mapping may instead result in a probability distribution which is non-decreasing with increasing amplitude. This forfeits the achievable shaping gain for the AWGN channel.
  • An example of a non-probabilistically ordered mapping is Gray binary code (GBC) mapping.
  • GBC Gray binary code
  • Figure 9 illustrates the resulting symbol probability distribution using Natural Binary Code mapping and Gray Binary Code mapping.
  • the simulation results for Gray Binary Code are shown in plot 901.
  • the results show that this mapping is a non-probabilistically ordered mapping, and it consequently results in poor performance.
  • the simulation results for Natural Binary Code mapping are shown in plot 902. This mapping meets the constraints set out above and thus achieves a probabilistically ordered mapping.
  • the shaping encoders may be configured to implement a mapping that uses any Natural Code.
  • a Natural Code is an extension of Natural Binary Code, and extends to any N-ary alphabet.
  • the Natural Code defines an order of the sequences, e.g. 00, 01, 10, 11 for a binary code. Specifically, a Natural Code orders the sequences such that the rightmost symbol is the least significant one, the second rightmost symbol is the second least significant, and so on, with the left-most symbol being the most significant.
  • the 10-ary natural code is decimal numbering.
  • it is also possible to define a Natural Code on sequences which have mixed alphabets.
  • an alphabet may have a first symbol that is binary and a second that is 3-ary.
  • the Natural Code order would then be: 00, 01, 02, 10, 11, 12.
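  • A small sketch that enumerates such a mixed-alphabet Natural Code order (using Python's itertools, which varies the rightmost, least significant position fastest) is given below; the alphabet sizes follow the example above.

```python
from itertools import product

def natural_code(alphabet_sizes):
    """Enumerate all sequences over (possibly mixed) alphabets in Natural Code
    order: the rightmost symbol is the least significant and varies fastest."""
    return list(product(*(range(size) for size in alphabet_sizes)))

# First symbol binary (size 2), second symbol 3-ary (size 3):
print(natural_code([2, 3]))
# [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
```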
  • the input data sequence is divided into a number of separate data sequences that is equal to the number of shaping encoders.
  • Each separate data sequence is then processed by a shaping encoder before being mapped to the target alphabet.
  • the probabilistic shaper is still configured to divide the input data sequence into a number of separate data sequences that is equal to the number of the plurality of sub-symbols that represent each symbol in the target alphabet.
  • One or more of those separate data sequences is not input into a shaping encoder, however, but is instead input directly into the combiner. The reason for this is explained below.
  • the bit-level probabilities for the shaping encoders are preferably determined as follows.
  • a PSCM system, such as that shown in Figure 8, has the following free parameters available to it: (i) the probability distribution on each bit-level and (ii) the constellation scaling.
  • An optimization method can be employed to obtain a "good" distribution P_X that yields rates close to AWGN capacity.
  • the distribution P_X can be restricted to a Maxwell-Boltzmann distribution, i.e. P_X(x) proportional to exp(−ν·|x|²); the parameter ν, and hence P_X, can then be obtained from this optimization.
  • P_X corresponds to a distribution P_A on the amplitudes and therefore to a joint distribution on the bit labels (via a bit-to-symbol mapping).
  • the aim is to find a product of bit-level distributions that is close to this joint distribution.
  • the informational divergence D between the product of the bit-level distributions and the joint label distribution can be used as a measure of "closeness".
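  • The sketch below illustrates this measure: it approximates an assumed, illustrative joint label distribution by the product of its bit-level marginals and evaluates the informational divergence between the two; the direction of the divergence and the numeric values are choices made for this example.

```python
import math
from itertools import product

def kl_divergence(p, q):
    """Informational divergence D(p || q) = sum_x p(x) * log2(p(x) / q(x))."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def product_approximation(joint, k):
    """Approximate a joint distribution over k-bit labels by the product of
    its bit-level marginals."""
    labels = list(product((0, 1), repeat=k))
    marginal_one = [sum(p for label, p in zip(labels, joint) if label[level] == 1)
                    for level in range(k)]
    approx = []
    for label in labels:
        pr = 1.0
        for bit, m in zip(label, marginal_one):
            pr *= m if bit else (1.0 - m)
        approx.append(pr)
    return approx

# Illustrative joint distribution over 2-bit labels (e.g. induced by a target P_A).
target = [0.5, 0.2, 0.2, 0.1]
approx = product_approximation(target, k=2)
print([round(x, 3) for x in approx])                    # [0.49, 0.21, 0.21, 0.09]
print("D =", round(kl_divergence(approx, target), 4))   # small but non-zero divergence
```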
  • the probabilistic deshaper on the receiver-side is configured to reverse the processing that has been performed at the transmitter-side.
  • the shaping decoders are therefore preferably configured to use both an inverse probabilistically ordered mapping and Natural Code mapping in their operations to demap their respective input sequences to obtain the original data sequence.
  • the probabilistic deshaper can be configured to mirror the transmitter-side by skipping shaping decoding for any bit levels that were not shaped at the transmitter.
  • the desired distribution of the output symbols is preferably one that is anticipated to offer improved transmission performance compared with the distribution of the original data sequence.
  • the probabilistic shaper described herein is able to achieve improved transmission performance.
  • the improvements could arise in any area, including reduced error rates, reduced transmit power, reduced Peak-to-Average Power Ratio (PAPR) etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
  • Error Detection And Correction (AREA)

Abstract

A probabilistic shaper comprises an input configured to receive an input data sequence, which has a first probability distribution. The probabilistic shaper also comprises a splitter configured to divide the input data sequence into a plurality of separate data sequences. The probabilistic shaper comprises a plurality of shaping encoders. Each shaping encoder is configured to receive a respective one of the separate data sequences and to map that separate data sequence to an output data sequence having a probability distribution different from the first probability distribution. Each shaping encoder is configured to perform its mapping independently of the other shaping encoders. The probabilistic shaper also comprises a combiner configured to receive the output data sequences from the plurality of shaping encoders and to combine them to form a single data sequence having a target probability distribution. The probabilistic shaper can increase throughput by dividing the incoming data sequence into shorter sequences that can be processed in parallel.
PCT/EP2017/050023 2017-01-02 2017-01-02 Apparatus and method for shaping the probability distribution of a data sequence WO2018121887A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/EP2017/050023 WO2018121887A1 (fr) 2017-01-02 2017-01-02 Apparatus and method for shaping the probability distribution of a data sequence
CN201780081987.6A CN110140330B (zh) 2017-01-02 2017-01-02 用于整形数据序列概率分布的装置和方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2017/050023 WO2018121887A1 (fr) 2017-01-02 2017-01-02 Apparatus and method for shaping the probability distribution of a data sequence

Publications (1)

Publication Number Publication Date
WO2018121887A1 true WO2018121887A1 (fr) 2018-07-05

Family

ID=57860809

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/050023 WO2018121887A1 (fr) Apparatus and method for shaping the probability distribution of a data sequence

Country Status (2)

Country Link
CN (1) CN110140330B (fr)
WO (1) WO2018121887A1 (fr)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019034780A1 (fr) * 2017-08-17 2019-02-21 Sony Corporation Dispositif et procédé de mappage, dispositif et procédé de mise en forme de point de signal probabiliste
EP3605906A1 (fr) * 2018-08-02 2020-02-05 Nokia Solutions and Networks Oy Transmission d'amplitudes façonnées de manière probabiliste à l'aide d'étiquettes d'amplitude partiellement antisymétriques
CN110971559A (zh) * 2019-12-18 2020-04-07 南京信息工程大学 一种基于动态控制因子降低ofdm-pon信号峰均功率比的调制解调方法
WO2020108771A1 (fr) * 2018-11-30 2020-06-04 Huawei Technologies Co., Ltd. Dispositif et procédé de mise en forme probabiliste de constellations
WO2020170497A1 (fr) * 2019-02-20 2020-08-27 Mitsubishi Electric Corporation Système et procédé de communication pour communiquer des symboles de bits
CN113169809A (zh) * 2018-10-25 2021-07-23 华为技术有限公司 分布匹配器、信道编码器以及用于编码数据比特或符号的方法
CN113454962A (zh) * 2019-02-26 2021-09-28 三菱电机株式会社 分布整形方法和分布解整形方法、分布整形编码器和分布整形解码器以及传输系统
US11277225B2 (en) * 2017-08-17 2022-03-15 Sony Coroporation Probabilistic signal point shaping device and method
WO2022183472A1 (fr) * 2021-03-05 2022-09-09 Qualcomm Incorporated Sélection de codage liée à la mise en forme de constellation
WO2024016205A1 (fr) * 2022-07-20 2024-01-25 Qualcomm Incorporated Mise en correspondance de gris pour adaptation de distribution variable vers fixe
WO2024045107A1 (fr) * 2022-09-01 2024-03-07 Qualcomm Incorporated Codage arithmétique comprenant une détermination de séquence de symboles
WO2024148548A1 (fr) * 2023-01-12 2024-07-18 Qualcomm Incorporated Approximation en mise en forme de constellation probabiliste

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7148832B2 (ja) * 2019-08-26 2022-10-06 日本電信電話株式会社 通信機及び光伝送システム
CN111162801A (zh) * 2019-12-27 2020-05-15 上海微波技术研究所(中国电子科技集团公司第五十研究所) 基于序排列的概率整形分布匹配器的装置和方法
CN113746594A (zh) * 2020-05-29 2021-12-03 深圳市中兴微电子技术有限公司 概率整形编码装置、系统及方法
CN114285519B (zh) * 2020-09-27 2024-04-26 中兴通讯股份有限公司 数据发送、接收方法及终端、系统、设备、可读存储介质
CN116418411B (zh) * 2023-06-06 2023-09-22 众瑞速联(武汉)科技有限公司 一种用于波分复用系统的光信号编码方法及系统

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1324558A1 (fr) * 2001-12-28 2003-07-02 Sony International (Europe) GmbH Emetteur et méthode de radiodiffusion numérique multirésolution avec mise en forme de trellis gaussienne pour réduire la puissance du signal émis et décoder à plusieurs étages correspondant

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7756350B2 (en) * 2006-11-13 2010-07-13 Global Ip Solutions, Inc. Lossless encoding and decoding of digital data
US9881625B2 (en) * 2011-04-20 2018-01-30 Panasonic Intellectual Property Corporation Of America Device and method for execution of huffman coding
KR102149770B1 (ko) * 2013-08-26 2020-08-31 삼성전자주식회사 메모리 컨트롤러 및 그것의 동작 방법

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1324558A1 (fr) * 2001-12-28 2003-07-02 Sony International (Europe) GmbH Emetteur et méthode de radiodiffusion numérique multirésolution avec mise en forme de trellis gaussienne pour réduire la puissance du signal émis et décoder à plusieurs étages correspondant

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
BOCHERER GEORG ET AL: "Bandwidth Efficient and Rate-Matched Low-Density Parity-Check Coded Modulation", IEEE TRANSACTIONS ON COMMUNICATIONS, IEEE SERVICE CENTER, PISCATAWAY, NJ. USA, vol. 63, no. 12, 1 December 2015 (2015-12-01), pages 4651 - 4665, XP011593618, ISSN: 0090-6778, [retrieved on 20151215], DOI: 10.1109/TCOMM.2015.2494016 *
BUCHALI ET AL.: "Experimental Demonstration of Capacity Increase and Rate-Adaptation by Probabilistically Shaped 64-QAM", POSTDEADLINE PAPER PDP.3.4, 2015
G. BOCHERER ET AL.: "Bandwidth Efficient and Rate-Matched Low-Density Parity-Check Coded Modulation", IEEE TRANS. COMMUN., vol. 63, no. 12, 2015, pages 4651 - 4665, XP011593618, DOI: doi:10.1109/TCOMM.2015.2494016
P. SCHULTE; G. BOCHERER: "Constant Composition Distribution Matching", IEEE TRANS. INF. THEORY, vol. 62, no. 1, 2016, XP011594649, DOI: doi:10.1109/TIT.2015.2499181
SCHULTE PATRICK ET AL: "Constant Composition Distribution Matching", IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE PRESS, USA, vol. 62, no. 1, 1 January 2016 (2016-01-01), pages 430 - 434, XP011594649, ISSN: 0018-9448, [retrieved on 20151218], DOI: 10.1109/TIT.2015.2499181 *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11159273B2 (en) 2017-08-17 2021-10-26 Sony Corporation Mapping device and method, probabilistic signal point shaping device and method
WO2019034780A1 (fr) * 2017-08-17 2019-02-21 Sony Corporation Dispositif et procédé de mappage, dispositif et procédé de mise en forme de point de signal probabiliste
US11277225B2 (en) * 2017-08-17 2022-03-15 Sony Coroporation Probabilistic signal point shaping device and method
EP3605906A1 (fr) * 2018-08-02 2020-02-05 Nokia Solutions and Networks Oy Transmission d'amplitudes façonnées de manière probabiliste à l'aide d'étiquettes d'amplitude partiellement antisymétriques
CN110798267A (zh) * 2018-08-02 2020-02-14 诺基亚通信公司 使用部分反对称的幅度标签的概率成形的幅度的发送
JP2020048188A (ja) * 2018-08-02 2020-03-26 ノキア ソリューションズ アンド ネットワークス オサケユキチュア 部分的非対称増幅ラベルを用いた確率的整形振幅の伝送
CN110798267B (zh) * 2018-08-02 2023-02-17 诺基亚通信公司 使用部分反对称的幅度标签的概率成形的幅度的发送
US10944504B2 (en) 2018-08-02 2021-03-09 Nokia Solutions And Networks Oy Transmission of probabilistically shaped amplitudes using partially anti-symmetric amplitude labels
JP7007336B2 (ja) 2018-08-02 2022-01-24 ノキア ソリューションズ アンド ネットワークス オサケユキチュア 部分的非対称増幅ラベルを用いた確率的整形振幅の伝送
CN113169809B (zh) * 2018-10-25 2023-09-29 华为技术有限公司 分布匹配器、信道编码器以及用于编码数据比特或符号的方法
CN113169809A (zh) * 2018-10-25 2021-07-23 华为技术有限公司 分布匹配器、信道编码器以及用于编码数据比特或符号的方法
WO2020108771A1 (fr) * 2018-11-30 2020-06-04 Huawei Technologies Co., Ltd. Dispositif et procédé de mise en forme probabiliste de constellations
JP2022507015A (ja) * 2019-02-20 2022-01-18 三菱電機株式会社 通信システム及びビットのシンボルを通信する方法
CN113424467A (zh) * 2019-02-20 2021-09-21 三菱电机株式会社 通信系统及用于通信比特符号的方法
WO2020170497A1 (fr) * 2019-02-20 2020-08-27 Mitsubishi Electric Corporation Système et procédé de communication pour communiquer des symboles de bits
CN113424467B (zh) * 2019-02-20 2024-04-19 三菱电机株式会社 通信系统及用于通信比特符号的方法
CN113454962A (zh) * 2019-02-26 2021-09-28 三菱电机株式会社 分布整形方法和分布解整形方法、分布整形编码器和分布整形解码器以及传输系统
CN110971559B (zh) * 2019-12-18 2022-02-01 南京信息工程大学 一种降低ofdm-pon信号峰均功率比的调制解调方法
CN110971559A (zh) * 2019-12-18 2020-04-07 南京信息工程大学 一种基于动态控制因子降低ofdm-pon信号峰均功率比的调制解调方法
WO2022183472A1 (fr) * 2021-03-05 2022-09-09 Qualcomm Incorporated Sélection de codage liée à la mise en forme de constellation
WO2024016205A1 (fr) * 2022-07-20 2024-01-25 Qualcomm Incorporated Mise en correspondance de gris pour adaptation de distribution variable vers fixe
WO2024045107A1 (fr) * 2022-09-01 2024-03-07 Qualcomm Incorporated Codage arithmétique comprenant une détermination de séquence de symboles
WO2024148548A1 (fr) * 2023-01-12 2024-07-18 Qualcomm Incorporated Approximation en mise en forme de constellation probabiliste

Also Published As

Publication number Publication date
CN110140330B (zh) 2021-08-13
CN110140330A (zh) 2019-08-16

Similar Documents

Publication Publication Date Title
WO2018121887A1 (fr) Apparatus and method for shaping the probability distribution of a data sequence
EP3718228B1 (fr) Système et procédé de communication utilisant un ensemble de distribution matchers
Schulte et al. Divergence-optimal fixed-to-fixed length distribution matching with shell mapping
Pikus et al. Bit-level probabilistically shaped coded modulation
CN113424467B (zh) 通信系统及用于通信比特符号的方法
US9246510B2 (en) Apparatus and method for multilevel coding in communication systems
CN110199490B (zh) 一种概率成形操作的方法和装置
CN111670543B (zh) 用于信号整形的多组成编码
US12081274B1 (en) Probabilistic shaping techniques for high performance coherent optical transceivers
EP3949184B1 (fr) Algorithme d'appariement de distribution de longueur de bloc courte
CN110073640B (zh) 用于转换或重新转换数据信号的方法以及用于数据传输和/或数据接收的方法和系统
EP3306821B1 (fr) Procédé de conversion et de reconversion d'un signal de données et procédé et système de transmission de données et/ou de réception de données
US9136870B2 (en) Method and apparatus with error correction for dimmable visible light communication
WO2019197037A1 (fr) Codeur et décodeur à niveaux multiples avec mise en forme et procédés de codage et de décodage à niveaux multiples avec mise en forme
KR20220085049A (ko) 멀티-레벨 인코딩을 위한 장치
CN110892658B (zh) 对具有编码符号的目标概率分布的消息进行编码的设备和方法
KR100526510B1 (ko) 이동 통신시스템의 다중 안테나 신호 송수신 장치 및 방법
Djordjevic et al. Power efficient LDPC-coded modulation for free-space optical communication over the atmospheric turbulence channel
CN114616773A (zh) 分布匹配器及分布匹配方法
KR20080026584A (ko) 골든 코드 기술에 따른 시공간 부호의 블록 송신을 위한컨벌루티브 부호화 방법 및 시스템
Zhang et al. Optimal design of linear space code for MIMO optical wireless communications
Mittal et al. Channel State Information feedback overhead reduction using Arithmetic coding in massive MIMO systems
Tchamkerten et al. On the use of training sequences for channel estimation
KR102098299B1 (ko) 다중 안테나 시스템의 연판정 정보 생성 장치 및 그 방법
Salman et al. On the Capacity Region of the Three-Receiver Broadcast Channel With Receiver Message Cognition

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17700895

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17700895

Country of ref document: EP

Kind code of ref document: A1