Advanced Digital Communication
Information and Entropy
1. Let p denote the probability of some event. Plot the amount of information gained by the occurrence of this event for 0 ≤ p ≤ 1. [Haykin 9.1]
2. A source emits one of four symbols s0, s1, s2, and s3 with probabilities 1/3, 1/6, 1/4, and 1/4, respectively. The successive symbols emitted by the source are statistically independent. Calculate the entropy of the source. [Haykin 9.3]
3. The sample function of a Gaussian process of zero mean and unit variance is uniformly sampled and then applied to a uniform quantizer having the input-output amplitude characteristic shown in Fig. 1. Calculate the entropy of the quantizer output. [Haykin 9.5]
[Fig. 1: input-output characteristic of the uniform quantizer; output levels ±0.5 and ±1.5, with decision thresholds at 0 and ±1.]

Given: ∫_x^∞ (1/√(2π)) e^(−y²/2) dy = 0.5 for x = 0, and 0.1611 for x = 1.
4. Consider a discrete memoryless source with source alphabet S = {s1, s2, s3} and source statistics {0.7, 0.15, 0.15}.
(a) Calculate the entropy of the source.
(b) Calculate the entropy of the second-order extension of the source. [Haykin 9.7]
Mutual Information and Channel Capacity
5. A nonsymmetric binary channel is shown in Figure 1.
(a) Find P(Y=0) and P(Y=1) when P(X=0) = 1/4, P(X=1) = 3/4, α = 0.75, and β = 0.9.
(b) Find H(X), H(Y), H(X|Y), and H(Y|X).
(c) Find the rate of information transmission over the channel.
[Figure 1: binary channel with input X ∈ {0, 1} and output Y ∈ {0, 1}; P(Y=0|X=0) = α, P(Y=1|X=0) = 1 − α, P(Y=1|X=1) = β, P(Y=0|X=1) = 1 − β; symbol rate r_s = 1000 symbols/sec.]
6. Find the rate of information transmission of the discrete channel shown in Figure 2.
[Fig. 2: three-input, three-output discrete channel; each input passes to the matching output with probability 0.8, and the remaining probability (0.1 or 0.2) goes to the neighbouring outputs; P(X=1) = P(X=2) = P(X=3) = 1/3; symbol rate r_s = 1000 symbols/sec.]
7. Two binary symmetric channels are connected in cascade. Find the overall channel capacity of the cascaded connection, assuming that both channels have the same transition probability as shown below. [Haykin 9.22]
8. An analog signal has a 4 kHz bandwidth. The signal is sampled at 2.5 times the Nyquist rate and each sample is quantized into one of 256 equally likely levels. Assume that the successive samples are statistically independent.
(a) What is the information rate of this source?
(b) Can the output of this source be transmitted without errors over a Gaussian channel with a bandwidth of 50 kHz and an S/N ratio of 23 dB?
(c) Can the output of this source still be transmitted without errors if the S/N ratio is reduced to 10 dB?
9. A black-and-white television picture may be viewed as consisting of approximately 3×10^5 elements, each of which may occupy one of 10 distinct brightness levels with equal probability. Assume that (1) the rate of transmission is 30 picture frames per second, and (2) the signal-to-noise ratio is 30 dB. Using the information capacity theorem, calculate the minimum bandwidth required to support the transmission of the resulting video signal.
(Note: As a matter of interest, commercial television transmissions actually employ a bandwidth of 4.2 MHz, which fits into an allocated bandwidth of 6 MHz.) [Haykin 9.31]
Solutions

1. I = −log2(p) bits, plotted for 0 ≤ p ≤ 1.
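A minimal plotting sketch for the curve above, assuming matplotlib is available; the grid of p values and the axis labels are illustrative choices, not part of the original problem.

```python
import numpy as np
import matplotlib.pyplot as plt

# Amount of information I(p) = -log2(p), plotted for 0 < p <= 1
# (the value diverges as p -> 0, so the grid starts just above zero).
p = np.linspace(0.01, 1.0, 200)
I = -np.log2(p)

plt.plot(p, I)
plt.xlabel("probability p")
plt.ylabel("information I(p) [bits]")
plt.title("Information gained by an event of probability p")
plt.grid(True)
plt.show()
```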
2. H = −Σ_{i=0}^{3} p_i log2(p_i) = 1.959 bits.
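A quick numerical check of the 1.959-bit figure (a sketch; the helper name `entropy` is mine):

```python
from math import log2

def entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Source of problem 2: symbols with probabilities 1/3, 1/6, 1/4, 1/4.
print(entropy([1/3, 1/6, 1/4, 1/4]))  # ~1.959 bits
```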
3. H = −Σ_{i=0}^{3} p(x_i) log2 p(x_i), where x_i denotes a representation level of the quantizer.
p(x_0) = p(x_3) = P(1 < input < ∞) = ∫_1^∞ (1/√(2π)) exp(−y²/2) dy = 0.1611
p(x_1) = p(x_2) = P(0 < input < 1) = ∫_0^1 (1/√(2π)) exp(−y²/2) dy = 0.5 − 0.1611 = 0.3389
Therefore, H = 1.91 bits.
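The level probabilities can be checked with the standard Gaussian tail via `erfc`; this is a sketch using the quantizer read off Fig. 1 (output levels ±0.5, ±1.5; decision thresholds at 0 and ±1). Note that erfc gives Q(1) ≈ 0.1587, slightly below the tabulated 0.1611 quoted in the problem, so the entropy comes out ≈ 1.90 bits rather than 1.91.

```python
from math import erfc, sqrt, log2

def q_func(x):
    """Gaussian tail probability Q(x) = P(Z > x) for Z ~ N(0, 1)."""
    return 0.5 * erfc(x / sqrt(2))

# Quantizer of problem 3: four output levels, thresholds at 0 and +-1,
# zero-mean unit-variance Gaussian input.
p_outer = q_func(1.0)       # P(input > 1), one outer level; ~0.1587
p_inner = 0.5 - p_outer     # P(0 < input < 1), one inner level; ~0.3413
probs = [p_outer, p_inner, p_inner, p_outer]

H = -sum(p * log2(p) for p in probs)
print(p_outer, p_inner, H)  # ~0.1587, ~0.3413, ~1.90 bits
```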
4. (a) H(S) = −Σ_{i=1}^{3} p_i log2(p_i) ≈ 1.18 bits/symbol.
(b) H(S²) = 2 H(S) ≈ 2.36 bits/symbol.
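A sketch confirming the extension property H(S²) = 2H(S) by enumerating the symbol pairs explicitly; the probabilities are those of the problem statement, and the variable names are mine.

```python
from math import log2
from itertools import product

probs = {"s1": 0.7, "s2": 0.15, "s3": 0.15}

def entropy(dist):
    """Entropy in bits of a dict mapping symbols to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Second-order extension: symbols are pairs, and the pair probabilities
# multiply because successive source symbols are statistically independent.
ext = {(a, b): pa * pb
       for (a, pa), (b, pb) in product(probs.items(), repeat=2)}

print(entropy(probs))  # H(S)   ~1.18 bits/symbol
print(entropy(ext))    # H(S^2) ~2.36 bits/symbol = 2 * H(S)
```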
5. (a) P(Y=0) = P(Y=0|X=0)P(X=0) + P(Y=0|X=1)P(X=1) = 0.2625; P(Y=1) = 1 − P(Y=0) = 0.7375.
(b) H(X) = −Σ_{i=0}^{1} P(X=i) log2 P(X=i) = 0.811 bits/symbol
H(Y) = −Σ_{i=0}^{1} P(Y=i) log2 P(Y=i) = 0.831 bits/symbol
H(Y|X) = Σ_{i=0}^{1} P(X=i) H(Y|X=i) = 0.555 bits/symbol
H(X|Y) = Σ_{i=0}^{1} P(Y=i) H(X|Y=i) = 0.536 bits/symbol
(c) I(X;Y) = H(X) − H(X|Y) ≈ 0.276 bits/symbol, or equivalently I(X;Y) = H(Y) − H(Y|X) ≈ 0.276 bits/symbol.
Therefore, the rate of information transmission is I(X;Y)·r_s = 276 bits/second.
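A sketch that reproduces the numbers above for the nonsymmetric channel (α = 0.75, β = 0.9, P(X=0) = 1/4, r_s = 1000 symbols/s); the helper names are mine.

```python
from math import log2

def H(dist):
    """Entropy in bits of a probability vector."""
    return -sum(p * log2(p) for p in dist if p > 0)

alpha, beta = 0.75, 0.9
px = [0.25, 0.75]
# Transition matrix rows P(Y | X = x): P(0|0) = alpha, P(1|1) = beta.
pyx = [[alpha, 1 - alpha],
       [1 - beta, beta]]

py = [sum(px[x] * pyx[x][y] for x in range(2)) for y in range(2)]
H_Y_given_X = sum(px[x] * H(pyx[x]) for x in range(2))
I = H(py) - H_Y_given_X                       # mutual information, bits/symbol

print(py)                                     # [0.2625, 0.7375]
print(H(px), H(py), H_Y_given_X, H(px) - I)   # H(X), H(Y), H(Y|X), H(X|Y)
print(I, 1000 * I)                            # ~0.276 bits/symbol, ~276 bits/s
```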
6. P(Y=1) = P(Y=1|X=1)P(X=1) + P(Y=1|X=2)P(X=2) + P(Y=1|X=3)P(X=3) = ...
P(Y=2) = P(Y=2|X=1)P(X=1) + P(Y=2|X=2)P(X=2) + P(Y=2|X=3)P(X=3) = ...
P(Y=3) = P(Y=3|X=1)P(X=1) + P(Y=3|X=2)P(X=2) + P(Y=3|X=3)P(X=3) = ...
H(Y) = −Σ_{i=1}^{3} P(Y=i) log2 P(Y=i) = ...
H(Y|X) = Σ_{i=1}^{3} P(X=i) H(Y|X=i) = ...
I(X;Y) = H(Y) − H(Y|X) = ... bits/symbol
Rate of information transmission = r_s · I(X;Y) = ... bits/second
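The blanks above can be filled with a generic computation once the transition matrix of Fig. 2 is written out. The matrix below is my reading of the garbled figure (diagonal entries 0.8, remaining mass on the neighbouring outputs) and should be checked against the original; the rest is a straightforward sketch.

```python
from math import log2

def entropy(dist):
    return -sum(p * log2(p) for p in dist if p > 0)

# Assumed transition matrix P(Y=j | X=i) read from Fig. 2 -- verify against
# the original figure before relying on the numerical result.
pyx = [[0.8, 0.2, 0.0],
       [0.1, 0.8, 0.1],
       [0.0, 0.2, 0.8]]
px = [1/3, 1/3, 1/3]
rs = 1000  # symbols per second

py = [sum(px[i] * pyx[i][j] for i in range(3)) for j in range(3)]
H_Y_given_X = sum(px[i] * entropy(pyx[i]) for i in range(3))
I = entropy(py) - H_Y_given_X

print(py, I, rs * I)  # output distribution, bits/symbol, bits/second
```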
7. The cascade of two binary symmetric channels, each with transition probability p, is equivalent to a single BSC whose transition probability is p′ = p(1 − p) + (1 − p)p = 2p(1 − p). Using the capacity result for the BSC, the overall capacity is C = 1 − H_b(2p(1 − p)), where H_b(·) denotes the binary entropy function.
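A sketch of the closing step, with the transition probability p left as a parameter since the figure's value is not reproduced here; the p = 0.1 call is only an example.

```python
from math import log2

def Hb(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def cascaded_bsc_capacity(p):
    """Capacity of two identical BSCs (crossover probability p) in cascade."""
    p_eff = 2 * p * (1 - p)   # effective crossover of the cascaded channel
    return 1 - Hb(p_eff)

# Example: p = 0.1 gives an effective crossover of 0.18 and C ~ 0.32 bits/use.
print(cascaded_bsc_capacity(0.1))
```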
8. (a) Sampling rate f_s = 2.5 × 2 × 4 kHz = 20 kHz.
Source entropy H(X) = −Σ_{i=1}^{256} p_i log2 p_i = log2 256 = 8 bits/symbol.
Information rate = f_s × H(X) = 160 kbits/second.
(b) Channel capacity C = B log2(1 + S/N) = 50 log2(1 + 199.5) = 382 kbits/second. Since the channel capacity exceeds the information rate, the output of this source can be transmitted without errors.
(c) With an S/N ratio of 10 dB (S/N = 10), C = 50 log2(1 + 10) ≈ 173 kbits/second, which still exceeds the 160 kbits/second information rate, so error-free transmission remains possible.
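A sketch checking parts (a)-(c) numerically, converting dB to a ratio via 10^(dB/10):

```python
from math import log2

bandwidth_hz = 4e3
fs = 2.5 * 2 * bandwidth_hz          # 2.5 x Nyquist rate = 20 kHz
rate = fs * log2(256)                # information rate = 160 kbits/s

def capacity(B_hz, snr_db):
    """Shannon capacity B * log2(1 + S/N) in bits/s."""
    return B_hz * log2(1 + 10 ** (snr_db / 10))

print(rate)                                           # 160000.0
print(capacity(50e3, 23), capacity(50e3, 23) > rate)  # ~382 kbit/s, True
print(capacity(50e3, 10), capacity(50e3, 10) > rate)  # ~173 kbit/s, True
```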
9. Source entropy per picture element = −Σ_{i=1}^{10} p_i log2 p_i = log2 10 = 3.32 bits.
Information rate = 3×10^5 × 3.32 × 30 ≈ 29.9 Mbits/second.
Let C equal the information rate: 29.9×10^6 = B log2(1 + 1000) ⇒ B ≈ 3 MHz.
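A sketch of the final arithmetic, solving C = B log2(1 + S/N) for the minimum bandwidth B:

```python
from math import log2

elements = 3e5          # picture elements per frame
levels = 10             # equally likely brightness levels
frames = 30             # frames per second
snr = 10 ** (30 / 10)   # 30 dB -> S/N = 1000

info_rate = elements * log2(levels) * frames   # ~29.9 Mbit/s
min_bandwidth = info_rate / log2(1 + snr)      # ~3 MHz

print(info_rate, min_bandwidth)
```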