
Analysis of Iterative Decoding

Alexei Ashikhmin

Research Department of Mathematics of Communications

Bell Laboratories

Mutual Information and Channel Capacity
LDPC Codes
Density Evolution Analysis of LDPC Codes
EXIT Functions Analysis of LDPC Codes
  Binary Erasure Channel
  Gaussian Channel
  MIMO Channel

Expander Codes

Shannon’s Channel Coding Theorem

In 1948, Claude Shannon, generally regarded as the father of the Information Age, published the paper "A Mathematical Theory of Communication", which laid the foundations of Information Theory.

In this remarkable paper he formulated the notion of channel capacity, the maximal rate at which information can be transmitted reliably over a channel.


Shannon proved that for any channel there exists a family of codes (including linear block codes) that achieves an arbitrarily small probability of error at any communication rate up to the channel capacity.

[Block diagram: Encoder → Channel → Decoder]

Linear binary codes

A binary linear [n, k] code C is a k-dimensional subspace of GF(2)^n; R = k/n is the code rate. Example: a [6, 2] code has 2^2 = 4 codewords of length 6 and rate R = 2/6 = 1/3.

Repetition code of length 3

Single parity check code of length 3: the sum of the code bits of any codeword equals zero mod 2, i.e. the number of ones in any codeword is even
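A minimal sketch (in Python, not part of the original slides) that enumerates the codewords of these two example codes and checks the parity property:

```python
import itertools

# [3,1] repetition code: codewords have all bits equal.
repetition = [c for c in itertools.product([0, 1], repeat=3) if len(set(c)) == 1]

# [3,2] single parity check code: codewords with an even number of ones.
spc = [c for c in itertools.product([0, 1], repeat=3) if sum(c) % 2 == 0]

print(repetition)                            # [(0, 0, 0), (1, 1, 1)]  -> rate R = 1/3
print(len(spc))                              # 4 codewords of length 3 -> rate R = 2/3
print(all(sum(c) % 2 == 0 for c in spc))     # the parity property holds
```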

Shannon proved that if R < C, then a typical (random) code has a probability of error that decreases exponentially fast with the code length n: P_error ≈ 2^(-n·E(R, SNR)), where E(R, SNR) > 0 is the error exponent (SNR is the Signal to Noise Ratio).

Shannon’s Channel Coding Theorem (Cont.)

The complexity of decoding a random code is exponential in the code length: brute-force ML decoding must compare the received word with all 2^(Rn) codewords.

We need codes with a nice structure

Algebraic codes (BCH, Reed-Solomon, Algebraic Geometry) have nice structure, but do not allow one to achieve capacity


Low Density Parity Check (LDPC) Codes

LDPC codes can be defined with the help of bipartite graphs

[Figure: bipartite Tanner graph with variable nodes on one side and check nodes on the other; an example bit assignment is shown on the variable nodes]

LDPC Codes – Definition (Cont.)

Sparse graph

Average variable node degree dv
Average check node degree dc

n is the code length
m is the number of parity checks
n - m is the number of information symbols
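A small sketch (an illustrative construction, not taken from the slides) of how such a sparse bipartite graph can be generated for a (dv, dc)-regular code:

```python
import numpy as np

rng = np.random.default_rng(0)

def regular_ldpc_parity_check(n, dv, dc):
    # Socket construction: each of the n variable nodes gets dv edge sockets,
    # each of the m = n*dv/dc check nodes gets dc sockets, and the two socket
    # lists are matched by a random permutation (double edges simply cancel mod 2).
    assert (n * dv) % dc == 0
    m = n * dv // dc
    var_sockets = np.repeat(np.arange(n), dv)      # variable end of every edge
    chk_sockets = np.repeat(np.arange(m), dc)      # check end of every edge
    rng.shuffle(chk_sockets)
    H = np.zeros((m, n), dtype=int)
    for v, c in zip(var_sockets, chk_sockets):
        H[c, v] ^= 1
    return H

H = regular_ldpc_parity_check(n=20, dv=2, dc=4)
print(H.shape, H.sum())                            # (10, 20) and at most 40 ones: sparse
```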

Belief Propagation Decoding

We receive from the channel a vector of corrupted symbols y = (y1, ..., yn)

For each symbol we compute the log-likelihood ratio Li = ln[ P(yi | xi = +1) / P(yi | xi = -1) ]

Belief Propagation Decoding (Cont.)

[Figure: messages passed along the edges of the sparse graph between variable and check nodes]
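A minimal sketch of sum-product belief propagation (an illustration under the usual LLR conventions, not the slides' own implementation); H is a parity-check matrix and llr holds the channel LLRs, e.g. Li = 2*yi/σ² for BPSK over an AWGN channel:

```python
import numpy as np

def belief_propagation(H, llr, iterations=50):
    m, n = H.shape
    rows, cols = np.nonzero(H)                # one (check, variable) pair per edge
    msg_vc = llr[cols].astype(float)          # variable-to-check messages
    msg_cv = np.zeros_like(msg_vc)            # check-to-variable messages
    hard = (llr < 0).astype(int)
    for _ in range(iterations):
        # Check-node update (tanh rule), excluding the edge being updated.
        t = np.tanh(msg_vc / 2)
        t[t == 0] = 1e-12
        for c in range(m):
            e = np.where(rows == c)[0]
            prod = np.prod(t[e])
            msg_cv[e] = 2 * np.arctanh(np.clip(prod / t[e], -0.999999, 0.999999))
        # Variable-node update: channel LLR plus all incoming messages but one.
        posterior = llr.astype(float)
        np.add.at(posterior, cols, msg_cv)
        msg_vc = posterior[cols] - msg_cv
        # Tentative hard decision; stop when all parity checks are satisfied.
        hard = (posterior < 0).astype(int)
        if not np.any((H @ hard) % 2):
            break
    return hard

# usage sketch: decoded_bits = belief_propagation(H, 2 * y / sigma**2)
```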


Density Evolution Analysis

Assume that we transmit +1, -1 through a Gaussian channel. Then

the received symbols y = x + noise are Gaussian random variables

their log-likelihood ratios (LLR) L = 2y/σ² are also Gaussian random variables

T. Richardson and R. Urbanke

Density Evolution Analysis (Cont.)

[Figure: the probability densities of the messages passed along the edges of the sparse graph are tracked from iteration to iteration]
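A concrete illustration (for the binary erasure channel rather than the Gaussian channel treated on the slides, since on the BEC density evolution collapses to a one-dimensional recursion on the erasure probability of the messages):

```python
def converges(q, dv, dc, iterations=10_000, tol=1e-6):
    # DE recursion for a (dv, dc)-regular ensemble on the BEC with erasure prob. q:
    #   x_{l+1} = q * (1 - (1 - x_l)**(dc - 1))**(dv - 1)
    x = q
    for _ in range(iterations):
        x = q * (1 - (1 - x) ** (dc - 1)) ** (dv - 1)
    return x < tol

def threshold(dv, dc):
    lo, hi = 0.0, 1.0
    for _ in range(50):                        # bisection on the channel parameter q
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if converges(mid, dv, dc) else (lo, mid)
    return lo

# Prints roughly 0.429: the threshold of the (3,6)-regular ensemble (about 0.4294),
# while capacity would allow erasure probabilities up to 1 - R = 0.5.
print(threshold(3, 6))
```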


Extrinsic Information Transfer (EXIT) Functions

Stephen ten Brink in 1999 came up with EXIT functions for analysis of iterative decoding of TURBO codes

Ashikhmin, Kramer, ten Brink 2002: EXIT functions analysis of LDPC codes and properties of EXIT functions in the binary erasure channel

E. Sharon, A. Ashikhmin, S. Litsyn 2003: EXIT functions for continuous channels

I. Sutskover, S. Shamai, J. Ziv 2003: bounds on EXIT functions

I. Land, S. Huettinger, P. Hoeher, J. Huber 2003: bounds on EXIT functions

Others

EXIT Functions (Cont.)

Average a priori information: I_A = I(X; A), where X is a transmitted code bit and A is the message arriving through the extrinsic (a priori) channel

Average extrinsic information: I_E = I(X; E), where E is the extrinsic output of the APP decoder

EXIT function: I_E = f(I_A)

[Block diagram: Source → Encoder → Extrinsic Channel → APP Decoder, which outputs extrinsic information]
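A Monte Carlo sketch (an illustration, not the method of the references above) of the quantity behind I_A and I_E: for an equiprobable bit X in {+1, -1} and an LLR L consistent with X, the mutual information satisfies I(X; L) = 1 - E[log2(1 + exp(-L)) | X = +1]:

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_information(llr_given_plus1):
    # I(X; L) for consistent LLRs, estimated from samples drawn given X = +1.
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-llr_given_plus1)))

# Example: LLRs of a BPSK-AWGN channel with noise standard deviation sigma,
# conditioned on X = +1:  y = 1 + noise,  L = 2*y/sigma**2.
sigma = 0.8
y = 1.0 + sigma * rng.standard_normal(1_000_000)
print(mutual_information(2 * y / sigma**2))   # capacity of this binary-input channel
```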

Simplex [15,4] Code and a Good Code of Infinite Length with R=4/15

R=4/15

Average a priori information: I_A
Average communication information: I_C
Average extrinsic information: I_E
EXIT function: I_E = f(I_A, I_C)

[Block diagram: Source → Encoder 1 → Extrinsic Channel and Encoder 2 → Communication Channel; the APP Decoder processes both outputs and produces extrinsic information]


EXIT Function in the Binary Erasure Channel

A. Ashikhmin, G. Kramer, S. ten Brink

The split support weights (or generalized Hamming weights) of a code are the numbers of subspaces of the code that have dimension r, support weight i on the first n positions, and support weight j on the remaining m positions.

The EXIT function of the code in the BEC can then be expressed in closed form through these split support weights.

[Block diagram: the communication channel and the extrinsic channel, each followed by a decoder]

Examples for the BEC with erasure probability q

Let dv = 2 and dc = 4; the code rate is R = 1 - dv/dc = 1/2. In the BEC the variable-node and check-node EXIT functions are

I_E,V = 1 - q(1 - I_A)^(dv-1)   and   I_E,C = I_A^(dc-1)

In the BEC with q = 0.3 decoding succeeds (the curves do not cross), but this code does not achieve capacity.
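A quick numerical check of this example (a sketch based on the EXIT formulas above; the threshold value 1/3 is a standard result for the (2,4)-regular ensemble, not a number taken from the slides):

```python
import numpy as np

q, dv, dc = 0.3, 2, 4
x = np.linspace(1e-6, 1 - 1e-6, 10_000)

vnd = 1 - q * (1 - x) ** (dv - 1)     # variable-node EXIT curve
cnd_inv = x ** (1.0 / (dc - 1))       # inverse of the check-node EXIT curve

# True: the variable curve stays above the inverted check curve, so the
# decoding tunnel is open at q = 0.3.
print(np.all(vnd > cnd_inv))

# The curves touch when q reaches the threshold 1/3 of this ensemble, while
# capacity would allow any q < 1 - R = 0.5: the code does not achieve capacity.
```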

Area Theorems for Binary Erasure Channel

Theorem: if the code bits pass only through the extrinsic channel (BEC), the area under the EXIT function equals 1 - R, where R is the code rate.

[Block diagram: Source → Encoder → Extrinsic Channel → APP Decoder, which outputs extrinsic information]

[EXIT curves: a code with large minimum distance vs. a code with small minimum distance]

Theorem: when the code bits are also transmitted over a communication channel, the area under the EXIT function is determined by the code rate and the capacity C of the communication channel.

[Block diagram: Source → Encoder 1 → Extrinsic Channel and Encoder 2 → Communication Channel; the APP Decoder processes both outputs and produces extrinsic information]

For successful decoding we must guarantee that the EXIT functions of the variable and check nodes do not intersect

This is possible only if the area under the variable node curve is larger than the area under the (inverted) check node curve; a short calculation is given below

To construct an LDPC code that achieves capacity in the BEC we must match the EXIT functions of the variable and check nodes.
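A short calculation behind the area condition (a sketch using the standard BEC EXIT formulas quoted above, not reproduced from the slides):

```latex
\[
A_V=\int_0^1\Bigl(1-q\,(1-I)^{d_v-1}\Bigr)\,dI = 1-\frac{q}{d_v},
\qquad
A_C^{\mathrm{inv}}=\int_0^1 I^{1/(d_c-1)}\,dI = 1-\frac{1}{d_c}.
\]
% The tunnel stays open only if A_V > A_C^inv, i.e. 1 - q/d_v > 1 - 1/d_c,
% which is exactly q < d_v/d_c = 1 - R: the erasure probability must lie below
% the Shannon limit 1 - R, and capacity is approached only when the curves match.
```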

Tornado LDPC codes (A. Shokrollahi)

Right-regular LDPC codes (A. Shokrollahi), obtained with the help of a Taylor series expansion of the EXIT functions


AWGN and Other Communication Channels

E. Sharon, A. Ashikhmin, S. Litsyn

To analyse LDPC codes we need EXIT functions of the repetition and single parity check codes in other (not BEC) channels

EXIT functions of repetition codes for the AWGN channel

Let T = tanh(L/2) be the "soft bits", where L is the log-likelihood ratio of a received symbol

Let f_T(t) be the conditional probability density of T given that +1 was transmitted

If the channel is T-consistent, i.e. if f_T(-t) = f_T(t)·(1 - t)/(1 + t), then the EXIT function of the repetition code of length n admits a closed-form expression in terms of the distribution of T
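A Monte Carlo sketch (an illustration under the usual Gaussian a priori model, not the closed-form expressions of Sharon, Ashikhmin, Litsyn) of one point of the EXIT function of a degree-dv variable node, i.e. a repetition code, over the BPSK-AWGN channel:

```python
import numpy as np

rng = np.random.default_rng(2)

def mi(llr):
    # I(X; L) for consistent LLRs, samples drawn given X = +1.
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-llr)))

def variable_node_exit_point(dv, sigma_ch, sigma_a, samples=200_000):
    # Channel LLRs for BPSK-AWGN given X = +1: mean 2/sigma_ch^2, std 2/sigma_ch.
    ch = 2.0 / sigma_ch**2 + (2.0 / sigma_ch) * rng.standard_normal(samples)
    # A priori LLRs modeled as consistent Gaussians: mean sigma_a^2/2, std sigma_a.
    a = sigma_a**2 / 2 + sigma_a * rng.standard_normal((samples, dv - 1))
    i_a = mi(a.ravel())
    i_e = mi(ch + a.sum(axis=1))      # extrinsic message: channel + (dv-1) a priori LLRs
    return i_a, i_e

print(variable_node_exit_point(dv=3, sigma_ch=1.0, sigma_a=1.5))
```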

How accurate can we be with EXIT functions

Let us take an LDPC code with a given variable node degree distribution and check node degree distribution

According to density evolution analysis this code can work in the AWGN channel at Eb/No = 0.3 dB

According to the EXIT function analysis the code can work at Eb/No = 0.30046 dB; the difference is only 0.00046 dB


Application to Multiple Antenna Channel

The capacity of the Multiple Input Multiple Output (MIMO) channel grows linearly with the minimum of the numbers of transmit and receive antennas

We assume that the detector knows the channel coefficients
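A small Monte Carlo sketch (an illustration, not from the slides) of the ergodic capacity of an nt x nr Rayleigh-fading MIMO channel when the receiver knows the channel matrix H: C = E[log2 det(I + (SNR/nt) H H^H)]:

```python
import numpy as np

rng = np.random.default_rng(3)

def mimo_capacity(nt, nr, snr, trials=20_000):
    caps = []
    for _ in range(trials):
        H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        M = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
        caps.append(np.log2(np.linalg.det(M).real))
    return float(np.mean(caps))

snr = 10.0                                  # linear SNR of 10, i.e. 10 dB
for n in (1, 2, 4):
    print(n, mimo_capacity(n, n, snr))      # grows roughly linearly in n
```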

Design of LDPC Codes for the MIMO Channel

S. ten Brink, G. Kramer, A. Ashikhmin

We construct combined EXIT function of detector and variable nodes and match it with the EXIT function of the check nodes

The resulting node degree distribution is different from the one obtained for the AWGN channel

Probability of Error and Decoding Complexity

Let A be an LDPC code with rate R = (1 - ε)C

Conjecture 1: the probability of decoding error of A decreases only polynomially with the code length

Conjecture 2: the complexity of decoding behaves like an explicit function of the gap to capacity ε and the code length


Expander Codes

M. Sipser and D. A. Spielman (1996)

Let us take a bipartite expander graph. Assign code bits to the edges so that the bits on the edges connected to a left (right) node form a codeword of the code C1 (C2).

[Figure: the bits on the edges incident to each left node form a codeword of C1; the bits on the edges incident to each right node form a codeword of C2]

A graph (V, E) is called an (α, β)-expander if every subset of at most α|V| vertices has at least β|V| neighbors

Decoding of Expander Codes

Iterate: maximum likelihood decoding of C1 at every left node, then maximum likelihood decoding of C2 at every right node
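A toy sketch of this alternating decoder (for readability the bipartite graph is the complete graph K_{7,7}, i.e. a product-code layout rather than a true expander, and C1 = C2 is the [7,4,3] Hamming code; this is an illustration, not the construction analyzed on the slides):

```python
import itertools
import numpy as np

# Generator matrix of a [7,4,3] Hamming code and its 16 codewords.
G = np.array([[1,0,0,0,1,1,0],
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
codebook = np.array([(np.array(m) @ G) % 2 for m in itertools.product([0, 1], repeat=4)])

def ml_decode(word):
    # Nearest-codeword decoding by exhaustive search over the 16 codewords.
    return codebook[(codebook != word).sum(axis=1).argmin()]

def decode(bits, iterations=3):
    # bits[i, j] is the bit on the edge joining left node i and right node j.
    bits = bits.copy()
    for _ in range(iterations):
        for i in range(7):                  # left nodes: re-impose C1 on each row
            bits[i, :] = ml_decode(bits[i, :])
        for j in range(7):                  # right nodes: re-impose C2 on each column
            bits[:, j] = ml_decode(bits[:, j])
    return bits

sent = np.zeros((7, 7), dtype=int)          # the all-zero codeword
received = sent.copy()
received[0, 2] ^= 1                         # two channel errors in different rows/columns
received[3, 5] ^= 1
print(np.array_equal(decode(received), sent))   # True: both errors are corrected
```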

M. Sipser and D. Spielman (1996) showed that an Expander code can decode d/48 errors, and that the minimum distance d grows linearly with the code length N

G. Zemor (2001) proved that an Expander code can decode d/4 errors

A. Barg and G. Zemor proved that Expander codes have a positive error exponent if R < C

R. Roth and V. Skachek (2003) proved that Expander codes can decode d/2 errors

What is the complexity of decoding of Expander codes?

Complexity of Decoding of Expander Codes

Let N be the overall code length and let n be the length of the constituent codes C1 and C2

We choose n = log2 N and let N tend to infinity

The complexity of ML decoding of C1 and C2 is O(2^n) = O(N)

The overall complexity of decoding is linear in N. At the same time, if R = (1 - ε)C, the complexity of decoding grows rapidly as ε → 0

Can we replace ML decoding with decoding up to half the minimum distance?

Threshold of Decoding of Expander Codes

Barg and Zemor: choose C2 to be a good code with R2 ≈ 1 and C1 to be a good code with R1 ≈ C (capacity). The rate of the Expander code is R = R1 + R2 - 1 ≈ C (capacity).


Codes with Polynomial Decoding Complexity and Positive Error Exponent

A.Ashikhmin and V.Skachek (preliminary results)

We assume that there exist LDPC codes satisfying Conjectures 1 and 2 above.

Let us use such LDPC codes as the constituent codes C1 and C2 in an Expander code Cexp with rate R = (1 - ε)C.

Theorem: the complexity of decoding of Cexp is polynomial in the code length N, and the error exponent E is positive; E is obtained by maximizing an expression over an auxiliary parameter i.

E is small, but positive. Hence P_error = 2^(-NE) decreases exponentially fast with the code length N.

[Figure: error exponents of random codes and of the concatenation of LDPC and Expander codes]

Thank you for your attention

