Channel Coding Part 1: Block Coding
Doug Young Suh
Physical Layer

Bit error rate depends on:
① transmitting power
② noise power
Raising power or reducing noise creates problems in hardware or operating cost.
Needs an algorithmic (logical) approach!!

[Figure: BPSK transmits -1 V or +1 V with probability 0.5 each; adding AWGN spreads the received values around -1 V and +1 V, so some bits are decided incorrectly.]

BER : bit error rate
Datalink/transport layer

Error control coding by addition of redundancy
- Block coding
- Convolutional coding

Error control in computer networks
- Error detection : ARQ (automatic repeat request)
- Error correction : FEC (forward error correction)

Errors and erasures
- Error : 0→1 or 1→0 at an unknown location
- Erasure : packet (frame) loss at a known location

Discrete memoryless channels

Decision between 0 and 1
- Hard decision : simple, loss of information
- Soft decision : complex
Channel model : BSC

BSC (binary symmetric channel)
Bit error rate p of BPSK with Gaussian noise: noise power density N0 and signal energy Ec.

P(1|1) = P(0|0) = 1 − p
P(0|1) = P(1|0) = p

p = Q(√(2Ec/N0)), where Q(x) = (1/√(2π)) ∫ₓ^∞ e^(−u²/2) du ≈ (1/(x√(2π))) e^(−x²/2) for x > 3

[Figure: BSC transition diagram — 0→0 and 1→1 with probability 1−p, 0→1 and 1→0 with probability p; plot of BER vs. SNR in dB.]
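The crossover probability above can be evaluated numerically; a minimal sketch in Python, using the standard identity Q(x) = erfc(x/√2)/2:

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x) = erfc(x/sqrt(2))/2."""
    return 0.5 * erfc(x / sqrt(2))

def bpsk_ber(ec_over_n0):
    """BSC crossover probability p = Q(sqrt(2*Ec/N0)) for BPSK over AWGN."""
    return Q(sqrt(2 * ec_over_n0))
```

Raising Ec/N0 (the SNR) drives p toward zero, which is exactly the BER-vs-SNR curve sketched above.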
Entropy and codeword

Example) Huffman coding
Uniform source: p = (1/4, 1/4, 1/4, 1/4), H(X) = 2 bits/symbol
Codewords {00, 01, 10, 11}; average codeword length L = 4 × (1/4) × 2 = 2 = H(X)

Non-uniform source: p = (1/2, 1/4, 1/8, 1/8), H(X) = 1.75 bits/symbol
Huffman codewords {0, 10, 110, 111}; average codeword length
L = (1/2)×1 + (1/4)×2 + (1/8)×3 + (1/8)×3 = 1.75 = H(X)
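Both averages can be checked directly; a small sketch with the probabilities and codewords taken from the example above:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def avg_length(probs, codewords):
    """Average codeword length L = sum of p_i * len(codeword_i)."""
    return sum(p * len(c) for p, c in zip(probs, codewords))

uniform = [1/4, 1/4, 1/4, 1/4]        # H(X) = 2 bits/symbol
skewed = [1/2, 1/4, 1/8, 1/8]         # H(X) = 1.75 bits/symbol
huffman = ["0", "10", "110", "111"]   # L = 1.75 = H(X)
```

For these dyadic distributions the Huffman length meets the entropy exactly.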
Dice vs. coin
Media Lab., Kyung Hee University

Fine-to-coarse Quantization
Dice: P(i) = 1/6 for i = 1, 2, 3, 4, 5, 6
Map {1,2,3} → head (H), {4,5,6} → tail (T), each with probability 1/2.
Example: 3 5 2 1 5 4 ∙∙∙ → H T H H T T ∙∙∙ (quantization)
H(X) = 2.5849 bits for the dice, H(X) = 1 bit for the coin.

Effects of quantization
- Data compression
- Information loss, but not all of it
Shannon coding theory

Entropy and bitrate R
Example) dice: p(i) = 1/6 for i = 1, …, 6
H(X) = Σ (1/6) log₂6 = 2.58 bits

Shannon coding theorem: no error, if H(X) < R(X) = 3 bits.
If R(X) = 2, the codewords {00, 01, 10, 11} can only represent {1, 2, {3,4}, {5,6}}.
With received side information Y = "even number": H(X|Y) = Σ (1/3) log₂3 = 1.58 < R(X|Y) = 2,
so if the receiver receives 2 more bits, the outcome is decodable.
BSC and mutual information

BSC (Binary Symmetric Channel)
H(X|Y) = − Σₓ Σᵧ p(x,y) log₂ p(x|y)
H(X|Y) = −p log₂ p − (1−p) log₂ (1−p)
p = 0.1 → 0.47, p = 0.2 → 0.72, p = 0.5 → 1

Mutual information (channel capacity)
I(X;Y) = H(X) − H(X|Y)
p = 0.1 → 0.53, p = 0.2 → 0.28, p = 0.5 → 0

1 bit transmission delivers I(X;Y) bits of information.

[Figure: BSC transition diagram (X=0/1 to Y=0/1, crossover probability p); H(X|Y), the loss of information, rises from 0 at p = 0 to 1 bit at p = 0.5 and falls back to 0 at p = 1.]
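The H(X|Y) and I(X;Y) values quoted above all come from the binary entropy function; a minimal sketch:

```python
from math import log2

def binary_entropy(p):
    """H_b(p) = -p log2 p - (1-p) log2(1-p); equals H(X|Y) of a BSC
    with crossover probability p and equiprobable inputs."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_mutual_information(p):
    """I(X;Y) = H(X) - H(X|Y) = 1 - H_b(p) for equiprobable inputs."""
    return 1.0 - binary_entropy(p)
```

Evaluating at p = 0.1, 0.2, 0.5 reproduces the 0.47/0.72/1 and 0.53/0.28/0 figures on the slide.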
(n,k) Code

(n,k) : k data bits plus n−k redundant bits (parity bits, check bits)
Code rate r = k/n : information of k bits is delivered by transmission of n bits.
Parity symbol ⊃ parity bit (for RS codes, a byte is a symbol.)

Example) (n,k) = (4,3) even-parity error detection code when the bit error rate is p = 0.001

Probability of non-detected errors (an even number of bit errors preserves parity and escapes detection):

P_nd = Σⱼ C(n,j) pʲ (1−p)ⁿ⁻ʲ, summed over even j = 2, 4, …, up to n (n even) or n−1 (n odd)

For (4,3) and p = 0.001: P_nd ≈ C(4,2) p² (1−p)² ≈ 6×10⁻⁶
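The sum above can be evaluated directly; a sketch for the even-parity case, where only even numbers of bit errors pass undetected:

```python
from math import comb

def p_undetected(n, p):
    """P_nd = sum over even j >= 2 of C(n,j) * p**j * (1-p)**(n-j);
    the range() step of 2 stops at n for even n and at n-1 for odd n."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(2, n + 1, 2))
```

p_undetected(4, 0.001) is about 6×10⁻⁶, matching the (4,3) example.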
Trade-offs
Trade-off 1 : Error performance vs. bandwidth
A uses less bandwidth but has a higher error rate than C at the same channel condition.

Trade-off 2 : Coding gain (D−E)
G (dB) = (Eb/N0)_uncoded (dB) − (Eb/N0)_coded (dB), measured at the same BER

Trade-off 3 : Capacity vs. bandwidth

[Figure: BER (10⁻², 10⁻⁴, 10⁻⁶) vs. Eb/N0 in dB for uncoded and coded curves A, B, C, D, E; the coding gain is the horizontal distance D−E at a given BER.]
Trade-off : an example

Example) Coded vs. Uncoded Performance
R = 4800 bps, (n, k) = (15, 11), t = 1. Performance of coding?

Sol) Without coding
Pr/N0 = 43,776 [Hz], where Pr : data transmitting power [Watt]
Eb/N0 = Pr/(N0 R) = 43,776/4800 = 9.12 → 9.6 dB
p_u = Q(√(2Eb/N0)) = Q(√18.24) ≈ 1.02×10⁻⁵
Block error ratio: P_M(u) = 1 − (1 − p_u)ᵏ ≈ 1.12×10⁻⁴, where k = 11
(continued) Trade-off : an example

With coding
R_c = 4800 × 15/11 = 6545 bps
Ec/N0 = Pr/(N0 R_c) = 43,776/6545 = 6.69 → 8.3 dB
p_c = Q(√(2Ec/N0)) = Q(√13.38) ≈ 1.36×10⁻⁴
Block error ratio: P_M(c) = Σⱼ₌₂¹⁵ C(15,j) p_cʲ (1−p_c)¹⁵⁻ʲ ≈ 1.94×10⁻⁶

Transmitter: Higher layer (11 kbps, Eb) → Datalink layer (11 ⇒ 15) → Physical layer (15 kbps, Ec)
Receiver: Physical layer (p = Q(x)) → Datalink layer (15 ⇒ 11) → Higher layer (P_M)

Although the coded channel bit error rate p_c is higher than p_u, the block error ratio drops from 1.12×10⁻⁴ to 1.94×10⁻⁶.

Code performance at low values of Eb/N0: too many errors to be corrected ⇒ Turbo codes
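The whole example can be reproduced numerically. A sketch (Q evaluated via erfc, so the results land near the slide's computed values; small differences come from Q-function table precision):

```python
from math import erfc, sqrt, comb

def Q(x):
    """Gaussian tail probability."""
    return 0.5 * erfc(x / sqrt(2))

Pr_over_N0 = 43_776.0   # received power / noise density [Hz]
R, n, k = 4800.0, 15, 11

# Uncoded: every transmitted bit is a message bit.
p_u = Q(sqrt(2 * Pr_over_N0 / R))
P_Mu = 1 - (1 - p_u) ** k                 # 11-bit block error ratio

# Coded: the bit rate grows to R*n/k, so energy per channel bit drops
# and the raw channel BER p_c actually gets worse ...
Rc = R * n / k
p_c = Q(sqrt(2 * Pr_over_N0 / Rc))
# ... but the (15,11) code corrects t=1 error, so a block fails only
# when 2 or more of the 15 channel bits are in error.
P_Mc = sum(comb(n, j) * p_c ** j * (1 - p_c) ** (n - j)
           for j in range(2, n + 1))
```

Running it shows P_Mc roughly two orders of magnitude below P_Mu despite p_c > p_u, which is the whole point of the trade-off.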
(n, k) code
n : length of a codeword
k : number of message bits
n−k : number of parity bits
Example) Even parity check is an (8,7) code.
Systematic code : message bits are left unchanged.
(A parity-check code is one example of a systematic code.)

GF(2) (GF : Galois field, pronounced /galwa/)
"field" : a set of variables closed under its operations.
closed : the result of an operation is also an element of the field.
Ex) The set of positive integers is closed under + and ×, but open under − and /.
Linear Block Code

GF(2) : Galois Field
GF(2) is closed under the following two operations.

+ | 0 1        × | 0 1
0 | 0 1        0 | 0 0
1 | 1 0        1 | 0 1

The two operations above are XOR and AND, respectively.

Message vector m = [m₀, m₁, …, m_(k−1)]
Parity vector  b = [b₀, b₁, …, b_(n−k−1)]
Code vector    c = [c₀, c₁, …, c_(n−1)] = [b | m]

[Diagram: m → block encoder → c → channel (r = c + e, e : error vector) → block decoder → m′]
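The GF(2) tables above map directly onto machine bit operations; a minimal sketch:

```python
# GF(2) arithmetic: addition is XOR, multiplication is AND.
def gf2_add(a, b):
    return a ^ b

def gf2_mul(a, b):
    return a & b

def gf2_vec_add(u, v):
    """Componentwise GF(2) addition; models the channel r = c + e."""
    return [a ^ b for a, b in zip(u, v)]
```

Note that adding an error vector twice cancels it, since every element is its own additive inverse in GF(2).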
Hamming distance

How many bits should be changed to make two words the same?
Example) c₁ = [0011000], c₂ = [1010001] → d(c₁, c₂) = 3

Example) Effect of repetition code
Send (0, 1) by using (000, 111) or (00000, 11111).

Minimum distance d_min
- Maximum error detection capability : d_min − 1
- Maximum error correction capability : t = ⌊(d_min − 1)/2⌋
◇ d_min = 2 : single error detection
◇ d_min = 3 : double error detection OR single error correction
◇ d_min = 4 : (double error detection AND single error correction)
  OR (triple error detection)
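Hamming distance is a one-liner; a sketch using the example pair of 7-bit words above:

```python
def hamming_distance(u, v):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(u, v))

c1 = [0, 0, 1, 1, 0, 0, 0]
c2 = [1, 0, 1, 0, 0, 0, 1]
```

hamming_distance(c1, c2) is 3; the repetition codes (000, 111) and (00000, 11111) have d_min = 3 and 5, giving t = 1 and t = 2 respectively.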
Error-Detecting/correcting capability

Double error correction (t = 2) with k message bits requires (Hamming bound):

2^(n−k) ≥ 1 + C(n,1) + C(n,2)

Perfect code : a code that satisfies the Hamming bound with equality.

Example) Extended Hamming code
(7,4) + one parity bit = (8,4) ⇒ d_min = 4
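The Hamming bound can be checked mechanically; a sketch, with t the number of correctable errors:

```python
from math import comb

def hamming_bound_holds(n, k, t):
    """Check 2**(n-k) >= sum_{j=0..t} C(n, j), necessary for t-error correction:
    the 2**(n-k) syndromes must cover all error patterns of weight <= t."""
    return 2 ** (n - k) >= sum(comb(n, j) for j in range(t + 1))

def is_perfect(n, k, t):
    """Perfect code: the Hamming bound is met with equality."""
    return 2 ** (n - k) == sum(comb(n, j) for j in range(t + 1))
```

Hamming (7,4) and (15,11) are perfect for t = 1 (8 = 1 + 7 and 16 = 1 + 15); the extended (8,4) code satisfies the bound but not with equality.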
Cyclic codes (Cyclic codes ⊂ Linear block codes)
For n = 7, X⁷+1 = (X+1)(X³+X²+1)(X³+X+1)
Generator polynomial g(X) = X³+X²+1 or X³+X+1
Note that g(X) divides X⁷+1, i.e., X⁷+1 = 0 when g(X) = 0.

g(X) = 1 + Σᵢ₌₁^(n−k−1) gᵢ Xⁱ + X^(n−k)

c(X) = X^(n−k) m(X) + b(X), where b(X) = X^(n−k) m(X) mod g(X),
so that c(X) mod g(X) = 0.

Example) mod operator (remainder operator): 7 mod 3 = 7 % 3 = 1
Example) Calculate c(X) for m = [1010].
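The systematic encoding rule c(X) = X^(n−k) m(X) + [X^(n−k) m(X) mod g(X)] can be sketched with integers standing in for GF(2) polynomials (bit i = coefficient of Xⁱ; how the message vector [1010] maps onto an integer depends on a bit-order convention, so that step is left open):

```python
# GF(2) polynomials as Python integers: bit i is the coefficient of X**i.
def gf2_mod(num, den):
    """Remainder of GF(2) polynomial division (XOR long division)."""
    dlen = den.bit_length()
    while num.bit_length() >= dlen:
        num ^= den << (num.bit_length() - dlen)
    return num

def cyclic_encode(m, g, n_minus_k):
    """c(X) = X^(n-k) m(X) + [X^(n-k) m(X) mod g(X)], so g(X) divides c(X)."""
    shifted = m << n_minus_k
    return shifted ^ gf2_mod(shifted, g)

G = 0b1011  # g(X) = X^3 + X + 1, one of the two generators above
```

Every resulting codeword is divisible by g(X), and the top k bits are the unchanged message, i.e., the code is systematic.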
Cyclic code

Example) Make the syndrome table of Hamming(7, 4) code.
Error patterns e:
0 0 0 0 0 0 0  →  s = 0 0 0
1 0 0 0 0 0 0
0 1 0 0 0 0 0
0 0 1 0 0 0 0
0 0 0 1 0 0 0
0 0 0 0 1 0 0
0 0 0 0 0 1 0
0 0 0 0 0 0 1
(each single-error pattern has its own distinct nonzero syndrome s)

Example) For Hamming(7,4), find s and e when r = [0101101].

Hamming (7,4) Decoding
[Example: received words are compared against codewords such as C₁ = [1010001] and C₂ = [1110010] and decoded to the nearest codeword.]
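A syndrome-table decoder for one common construction of Hamming(7,4) — the cyclic one with g(X) = X³+X+1, an assumed choice (the slides equally allow X³+X²+1):

```python
def gf2_mod(num, den):
    """Remainder of GF(2) polynomial division."""
    dlen = den.bit_length()
    while num.bit_length() >= dlen:
        num ^= den << (num.bit_length() - dlen)
    return num

G = 0b1011  # g(X) = X^3 + X + 1 (assumed generator)

def syndrome(r):
    """s(X) = r(X) mod g(X); zero exactly when r is a codeword."""
    return gf2_mod(r, G)

# Syndrome table: every single-bit error pattern X**i has a distinct syndrome.
table = {syndrome(1 << i): 1 << i for i in range(7)}

def correct(r):
    """Flip the bit located by the syndrome (no-op when s = 0)."""
    return r ^ table.get(syndrome(r), 0)
```

Since s(r) = s(c) + s(e) = s(e) for any codeword c, the table lookup recovers every single-bit error, which is exactly the t = 1 capability of the code.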
Example) Hamming (7,4) code
a) m₁ = [1101], c₁ = ?
b) m₂ = [1100], c₂ = ?

Parity check polynomial h(X)
g(X) h(X) = Xⁿ + 1
Since c(X) mod g(X) = 0, there exists an a(X) which satisfies c(X) = a(X) g(X).
Then c(X) h(X) = a(X) g(X) h(X) = a(X)(Xⁿ + 1), i.e., c(X) h(X) mod (Xⁿ + 1) = 0.
Entropy and Hamming (7,4)

k = 4, n = 7
How many codewords? 2ᵏ = 2⁴ = 16
Their entropy? P = 1/2ᵏ each → k bits/codeword

Information transmission rate = coding rate r = k/n [information bits/transmission]
The value of one transmitted bit is k/n.
Suitable when I(X;Y) = H(X) − H(X|Y) > k/n = 0.57
I(X;Y) : p = 0.1 → 0.53, p = 0.2 → 0.28, p = 0.5 → 0
→ suitable at a BER of (roughly) less than 10%.
Other Block Codes

(1) CRC codes : error detection only, for a long packet
- CRC-12 code : g(X) = X¹² + X¹¹ + X³ + X² + X + 1
- CRC-16 code
- CRC-CCITT code
Open question) How many combinations of non-detectable errors are there for the CRC-12 code used on 100-bit-long data? What is the probability of the non-detectable errors when BER is 0.01?

(2) BCH codes (Bose-Chaudhuri-Hocquenghem)
Block length : n = 2ᵐ − 1
Number of message bits : k ≥ n − mt
Minimum distance : d_min ≥ 2t + 1
Ex) (n, k, t) : (7,4,1) g(X)=13, (15,11,1) g(X)=23, (15,7,2) g(X)=721, (15,5,3) g(X)=2467,
(255,171,11) g(X)=15416214212342356077061630637 (generator polynomials in octal notation)

(3) Reed-Solomon codes : GF(2ᵐ) arithmetic
Ex) (n, k, t) = (31, 21, 5)
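CRC detection works exactly like the cyclic-code remainder check; a sketch using the CRC-12 generator listed above:

```python
# CRC-12 sketch: g(X) = X^12 + X^11 + X^3 + X^2 + X + 1.
CRC12 = 0b1100000001111  # bit i = coefficient of X**i

def gf2_mod(num, den):
    """Remainder of GF(2) polynomial division."""
    dlen = den.bit_length()
    while num.bit_length() >= dlen:
        num ^= den << (num.bit_length() - dlen)
    return num

def crc12_encode(data):
    """Append the 12-bit remainder so the whole frame is divisible by g(X)."""
    shifted = data << 12
    return shifted ^ gf2_mod(shifted, CRC12)

def crc12_check(frame):
    """True if no error is detected (remainder is zero)."""
    return gf2_mod(frame, CRC12) == 0
```

Any single-bit error changes the remainder (g has both a constant term and degree 12, so it never divides Xⁱ), so all single-bit errors are detected; only error polynomials divisible by g(X) slip through, which is what the open question above asks you to count.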