JTO LICE Communication System
2015 ENGINEERS INSTITUTE OF INDIA .All Rights Reserved JTO LICE : Classroom , POSTAL, All India TEST Series 28-B/7, Jia Sarai, Near
IIT, Hauz Khas, New Delhi-110016. Ph. 011-26514888. www.engineersinstitute.com
LICE - JTO
Limited Internal Competitive Examination
STUDY MATERIAL
COMMUNICATION SYSTEM
CONTENTS
Chapter-1 Random signals & noise 3-16
Chapter-2 Modulation Techniques 16-49
Chapter-3 Digital Communication 50-65
Chapter-4 Stored program control (SPC) 65-72
Chapter-5 Propagation of waves 72-75
Chapter-6 Introduction to microwave engineering 75-84
Chapter-7 Satellite communications 84-109
Chapter-8 Optical communication 110-115
Frequently asked MCQ 115-131
Chapter-wise Solved Problems
SYLLABUS: Communication Systems
Amplitude, frequency and phase modulation, their generation and demodulation;
Noise; PCM; basic principles of SPC exchanges; Quantization & Coding; Time
division and frequency division multiplexing; Equalization; Optical communication in
free space & fibre optics; Propagation of signals at HF, VHF, UHF and Microwave
frequencies; Satellite Communication.
Chapter-1 RANDOM SIGNALS & NOISE
Random Signals and Noise
Signals can be classified as deterministic or random. Deterministic signals can be
completely specified as functions of time. Random signals are those whose values cannot be
predicted in advance. Radio communication signals are random signals, and are invariably
accompanied by noise. Noise-like signals, also called random processes, cannot be specified
exactly; only statements about their future behaviour, correct with a certain probability, can be made.
Probability:
The concept of probability occurs naturally when we contemplate the possible outcomes
of an experiment whose outcomes are not always the same. The probability of
occurrence of an event A is

P(A) = (number of possible favourable outcomes) / (total number of possible equally likely outcomes)   ... (i)
Two possible outcomes of an experiment are defined as being mutually exclusive if the
occurrence of one outcome precludes the occurrence of the other. Hence,

P(A1 or A2) = P(A1) + P(A2)   ... (ii)

where A1 and A2 are two mutually exclusive events.
The joint probability of two events A and B is given as:

P(A, B) = P(A) P(B/A)   ... (iii)

where P(B/A) is the conditional probability of outcome B given that A has occurred, and
P(A, B) is the probability of the joint occurrence of A and B. For independent events,
P(A, B) = P(A) P(B).
Random Variables:
The term random variable is used to signify a rule by which a real number is assigned
to each possible outcome of an experiment.
Given a random experiment with sample space S, a random variable X(λ) is a single-valued
real function that assigns a real number, called the value of X(λ), to each sample point λ of S.
The sample space S is termed the domain of the random variable X, and the collection of
all numbers [values of X(λ)] is termed the range of the random variable X.
The random variable X induces a probability measure on the real line
If X can take on only a countable number of distinct values, then X is called a discrete
random variable.
If X can assume any value within one or more intervals on the real line, then X is called a
continuous random variable.
Cumulative Distribution Function (CDF):
The cumulative distribution function associated with a random variable X is
defined as the probability that the outcome of an experiment will be one of the
outcomes for which X(λ) ≤ x, where x is any given number. If Fx(x) is the
cumulative distribution function, then

Fx(x) = P(X ≤ x)

where P(X ≤ x) is the probability of that outcome.
Fx(x) has the following properties:
1. 0 ≤ Fx(x) ≤ 1
2. Fx(−∞) = 0, Fx(∞) = 1
3. Fx(x1) ≤ Fx(x2) if x1 < x2.
Probability Density Function (PDF):
The probability density function fx(x) is defined in terms of the cumulative distribution
function Fx(x) as:

fx(x) = d Fx(x) / dx

The properties of the PDF are:
1. fx(x) ≥ 0, for all x
2. ∫_{−∞}^{+∞} fx(x) dx = 1
3. Fx(x) = ∫_{−∞}^{x} fx(u) du

The relation between probability and probability density can be written as:

P(x1 ≤ X ≤ x2) = ∫_{x1}^{x2} fx(x) dx
Example 1:
Consider the probability density f(x) = a e^{−b|x|}, where X is a random variable whose allowable values
range from x = −∞ to x = +∞.
Find:
a. The cumulative distribution function F (x).
b. The relationship between a and b
c. The probability that the outcome x lies between 1 and 2.
Solution:
a) The cumulative distribution function is:

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(u) du = ∫_{−∞}^{x} a e^{−b|u|} du

     = (a/b) e^{bx},           x ≤ 0
     = (a/b) (2 − e^{−bx}),    x ≥ 0

b) Since ∫_{−∞}^{+∞} f(x) dx = 1:

∫_{−∞}^{+∞} a e^{−b|x|} dx = 2a/b = 1,  hence  a = b/2

c) The probability that x lies in the range between 1 and 2 is

P(1 ≤ x ≤ 2) = ∫_{1}^{2} (b/2) e^{−bx} dx = (1/2)(e^{−b} − e^{−2b})
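The closed-form results above can be spot-checked numerically. The sketch below (an illustration, not part of the original material) takes b = 1, so a = b/2, and integrates the density with a simple trapezoidal rule:

```python
import math

def f(x, b=1.0):
    """Density from Example 1 with a = b/2: f(x) = (b/2) * exp(-b|x|)."""
    return (b / 2.0) * math.exp(-b * abs(x))

def integrate(fn, lo, hi, n=100_000):
    """Simple trapezoidal rule, adequate for these smooth integrands."""
    h = (hi - lo) / n
    s = 0.5 * (fn(lo) + fn(hi))
    for i in range(1, n):
        s += fn(lo + i * h)
    return s * h

b = 1.0
# (b) total probability must be 1 (integrate over a wide finite range)
total = integrate(f, -50, 50)
# (c) closed form from the worked example: (e^-b - e^-2b)/2
p_closed = 0.5 * (math.exp(-b) - math.exp(-2 * b))
p_numeric = integrate(f, 1, 2)
print(total, p_numeric, p_closed)
```

The total probability comes out close to 1 and the numerical value of P(1 ≤ x ≤ 2) matches the closed form.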
The average value of a random variable is given as:

X̄ = E(x) = m = Σ_i x_i P(x_i)

where X̄, E(x) and m all represent the average, expected or mean value of the random variable x.
The variance σ² of x is given as

σ² = E(x²) − m²

where σ is the standard deviation of x. If the average value m = 0, then σ² = E(x²).
The covariance μ of two random variables x and y is defined as: μ = E{(x − mx)(y − my)},
where mx and my denote the corresponding means. For independent random variables, μ = 0.
The correlation coefficient ρ between the variables x and y serves as a measure of the
extent to which x and y are dependent. It is given by

ρ = [E(xy) − mx my] / (σx σy)

ρ falls within the range −1 ≤ ρ ≤ 1.
When ρ = 0, the random variables X and Y are said to be uncorrelated. When random variables are
independent they are uncorrelated, but the converse is not true.
Gaussian PDF:
The Gaussian PDF is of great importance in the analysis of random variables. It is defined as

f(x) = [1 / √(2πσ²)] e^{−(x−m)² / 2σ²}

and is plotted in the figure below; m and σ² are the average value and variance of f(x), with

σ² = E(X²) − m²
The Gaussian density function

Thus,  m = ∫_{−∞}^{+∞} x f(x) dx  and  σ² = E[(x − m)²] = ∫_{−∞}^{+∞} (x − m)² f(x) dx

It can be deduced from the figure that when x − m = ±σ, f(x) has fallen to 0.606 of its peak
value. When x = m ± 2σ, f(x) falls to 0.135 of the peak value.
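The 0.606 and 0.135 ratios follow directly from the definition, since f(m ± σ)/f(m) = e^{−1/2} and f(m ± 2σ)/f(m) = e^{−2}. A quick check with the illustrative values m = 0, σ = 1:

```python
import math

def gaussian_pdf(x, m=0.0, sigma=1.0):
    # f(x) = (1/sqrt(2 pi sigma^2)) * exp(-(x - m)^2 / (2 sigma^2))
    return math.exp(-((x - m) ** 2) / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

m, sigma = 0.0, 1.0
peak = gaussian_pdf(m, m, sigma)
ratio_1sigma = gaussian_pdf(m + sigma, m, sigma) / peak      # e^(-1/2), about 0.606
ratio_2sigma = gaussian_pdf(m + 2 * sigma, m, sigma) / peak  # e^(-2), about 0.135
print(ratio_1sigma, ratio_2sigma)
```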
The Rayleigh PDF:
The Rayleigh PDF is defined by

f(r) = (r/σ²) e^{−r²/2σ²},   0 ≤ r < ∞
     = 0,                    r < 0
A plot of f (r) as a function of r is shown in figure below.
The Rayleigh PDF
The mean value is given by R̄ = σ√(π/2), the mean squared value is E[R²] = 2σ², and the variance is

σr² = (2 − π/2) σ²
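These moments can be verified by numerically integrating the Rayleigh density (a sketch with the illustrative choice σ = 1):

```python
import math

def rayleigh_pdf(r, sigma=1.0):
    """Rayleigh PDF: (r/sigma^2) * exp(-r^2 / (2 sigma^2)) for r >= 0."""
    return (r / sigma ** 2) * math.exp(-r ** 2 / (2 * sigma ** 2)) if r >= 0 else 0.0

def moment(k, sigma=1.0, hi=20.0, n=200_000):
    """k-th moment of R by trapezoidal integration (upper limit truncated)."""
    h = hi / n
    s = 0.0
    for i in range(1, n):
        r = i * h
        s += (r ** k) * rayleigh_pdf(r, sigma)
    return s * h

sigma = 1.0
mean = moment(1, sigma)     # should be sigma * sqrt(pi/2)
mean_sq = moment(2, sigma)  # should be 2 * sigma^2
var = mean_sq - mean ** 2   # should be (2 - pi/2) * sigma^2
print(mean, mean_sq, var)
```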
Autocorrelation and Power Spectral Density (PSD):
A random process n(t), being neither periodic nor of finite energy, has an autocorrelation
function defined by

R(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} n(t) n(t + τ) dt

where τ is known as the time-lag parameter. For a random process the PSD G(f) is given as:

G(f) = F[R(τ)] = ∫_{−∞}^{+∞} R(τ) e^{−j2πfτ} dτ

Thus G(f) and R(τ) constitute a Fourier transform pair.
Consider a deterministic waveform v(t), extending from −∞ to ∞. Let a section of the
waveform be selected such that vT(t) = v(t) for −T/2 ≤ t ≤ T/2 and zero otherwise. If VT(f)
is the Fourier transform of vT(t), then |VT(f)|² is the energy spectral density. Hence over the
interval T the normalized power density is |VT(f)|²/T. As T → ∞, vT(t) → v(t).

Thus, the physical significance of the PSD G(f) is that

G(f) = lim_{T→∞} |VT(f)|² / T

Also, the total energy of the signal v(t) is

E = ∫_{−∞}^{+∞} |V(f)|² df

This is known as Rayleigh's Energy Theorem.
Properties of the Autocorrelation function, R(τ):
1. R(τ) = R(−τ), i.e. it is an even function of τ.
2. R(0) = E[X²(t)]
3. max |R(τ)| = R(0), i.e. R(τ) has maximum magnitude at zero time lag.
Properties of PSD, G(f):
1. G(f) is a real function of frequency f for a random process.
2. G(f) = G(-f), i.e., it is an even function of f.
3. G(f) ≥ 0 for all f.
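The Fourier-pair relation between R(τ) and G(f) can be demonstrated on a short discrete sequence: the DFT of the circular autocorrelation equals the periodogram |X[k]|²/N. The sample values below are arbitrary illustrative numbers, not data from the text:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def circular_autocorr(x):
    """R[tau] = (1/N) * sum_n x[n] * x[(n+tau) mod N] for a real sequence."""
    N = len(x)
    return [sum(x[n] * x[(n + tau) % N] for n in range(N)) / N for tau in range(N)]

x = [0.3, -1.1, 0.7, 2.0, -0.4, 0.9, -1.6, 0.2]  # arbitrary fixed sample values
N = len(x)

# Periodogram estimate of the PSD: |X[k]|^2 / N
X = dft(x)
psd_direct = [abs(Xk) ** 2 / N for Xk in X]

# PSD as the DFT of the autocorrelation
R = circular_autocorr(x)
psd_from_R = [Sk.real for Sk in dft(R)]
print(psd_direct[0], psd_from_R[0])
```

The two PSD estimates agree to floating-point accuracy, and R[τ] is even under the circular indexing, matching property 1 above.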
Noise:
Noise is an unwanted signal of an unpredictable nature. It can be either man-made
or natural. The internally generated noise due to circuit components is of two types: shot noise
and thermal noise.
Shot Noise:
This type of noise is predominant in valves and transistors, as current in these devices is
due to the flow of discrete electrons or the diffusion of electrons and holes. The mean square value
of the shot noise current is:

I²SN = 2 e I₀ Δf  amp²

where e is the electronic charge, I₀ is the mean (dc) value of the current, and
Δf is the bandwidth of the device in Hz.
Thermal Noise:
The noise in a conductor or resistor due to the random motion of electrons is called thermal
noise. The mean square value of the thermal noise voltage appearing across the terminals of the
resistor R is

E²TN = 4 k T R Δf

where T is the absolute temperature, k is Boltzmann's constant, and Δf is the bandwidth.
Noise Figure:
The noise factor is defined as

F = (total available output noise power, due to device and source, per unit bandwidth) / (available output noise power due to the source alone)

Noise Figure = 10 log₁₀ F, in dB.
In terms of the equivalent noise temperature Te, we can write

Te = T0 (F − 1)

where Te is the equivalent noise temperature of the amplifier or receiver whose noise factor is F, and
T0 = 17 °C = 290 K.
We can also write,

F = (SNR)input / (SNR)output
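The F ↔ Te conversions above are straightforward to encode; the sketch below round-trips the value F = 1.6 used in a later example:

```python
import math

T0 = 290.0  # reference temperature, K

def noise_factor_from_temp(Te):
    """F = 1 + Te/T0."""
    return 1.0 + Te / T0

def equiv_temp_from_factor(F):
    """Te = T0 * (F - 1)."""
    return T0 * (F - 1.0)

def noise_figure_db(F):
    """Noise figure in dB from the (linear) noise factor F."""
    return 10.0 * math.log10(F)

Te = equiv_temp_from_factor(1.6)  # 174 K
print(Te, noise_factor_from_temp(Te), noise_figure_db(1.6))
```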
Example 2:
A pentode having an equivalent noise resistance of 5 kΩ is selected for operation as an
amplifier, with a signal source of emf 100 µV and an internal resistance of 125 kΩ connected
across the input. The grid-leak resistance is 175 kΩ. The bandwidth may be taken as 10 kHz
and the temperature as 27 °C. Calculate the signal-to-noise ratio at the output of the amplifier and
express the value in dB. (Given k = 1.38 × 10⁻²³ J/K.)
Solution:
The effective input resistance is

Reff = Rs Rg / (Rs + Rg) = (125 × 175) / (125 + 175) = 72.92 kΩ

The signal voltage at the grid is

Vs = 100 × (175/300) = 58.33 µV

The mean square noise voltage is

V²n = 4 (Ra + Reff) kTB
    = 4 (5 × 10³ + 72.92 × 10³) × 1.38 × 10⁻²³ × 300 × 10⁴
    = 1290.4 × 10⁻¹⁴ volt²

Vn = 3.592 µV

S/N = V²s / V²n = (58.33/3.592)² = 263.7
In dB: S/N = 10 log₁₀ 263.7 = 24.2 dB
Example: A receiver connected to an antenna whose resistance is 50 Ω has an equivalent noise
resistance of 30 Ω. Calculate the receiver's noise figure in decibels and its equivalent noise temperature.
Solution:

F = 1 + Req/R0 = 1 + 30/50 = 1 + 0.6 = 1.6

Noise figure = 10 log₁₀ 1.6 = 2.04 dB

Teq = T0 (F − 1) = 290 (1.6 − 1) = 290 × 0.6 = 174 K
Information Theory
Information
The amount of information contained in an event is closely related to its uncertainty. For
a discrete memoryless source (DMS), denoted by X, with symbols {x1, x2, ..., xm}, the
information content of a symbol is given by

I(xi) = log (1 / P(xi))

Unless mentioned otherwise, take the base as 2, since we are dealing with binary digits:

I(xi) = −log₂ P(xi)  bits
Average Information or Entropy:
Entropy of a source is given by

H(x) = −Σ_{i=1}^{m} Pi log Pi = Σ_{i=1}^{m} P(xi) I(xi)  bits/symbol

When all the input symbols have equal probability of occurrence we have maximum uncertainty and
hence maximum entropy. For example, if all the m events have equal probability of occurrence,

P(x1) = P(x2) = ... = P(xm) = 1/m

Hmax = log₂ m  bits/symbol
Example: A source is generating 3 possible symbols with probabilities 1/4, 1/4, 1/2. Find the information
associated with each of the symbols.
Solution: P(x1) = 1/4, P(x2) = 1/4, P(x3) = 1/2.
We know that

I(xi) = log₂ (1 / P(xi))

Using the above formula:

I(x1) = log₂ 4 = 2 bits,  I(x2) = log₂ 4 = 2 bits,  I(x3) = log₂ 2 = 1 bit

The probability of occurrence of x3 is the highest, hence the information associated with it is the least.
Information Rate:
If the time rate at which source x emits symbols is r (symbols/s), the information rate R of
the source is given by
R = rH b/s
Example: A source is generating 3 possible symbols with probabilities 1/4, 1/4, 2/4. Find the entropy and
the information rate if the source is generating one symbol per millisecond.
Solution: We have

H = −Σ P(xi) log₂ P(xi)
  = (1/4) log₂ 4 + (1/4) log₂ 4 + (1/2) log₂ 2
  = 3/2 bits/symbol

r = 1 symbol/millisecond = 10³ symbols/second

Information rate R = H × r = (3/2) × 10³ bits/second = 1.5 kbps
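The entropy and information-rate computation generalizes directly; a small helper sketch, applied to the probabilities from the example above:

```python
import math

def entropy(probs):
    """H = -sum p log2 p, in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def info_rate(probs, symbol_rate):
    """R = r * H, in bits/second."""
    return symbol_rate * entropy(probs)

# The worked example: P = 1/4, 1/4, 1/2 at 1 symbol per millisecond
H = entropy([0.25, 0.25, 0.5])
R = info_rate([0.25, 0.25, 0.5], symbol_rate=1000)
print(H, R)  # 1.5 bits/symbol, 1500 b/s
```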
Channel Capacity of a lossless channel:
C = r Cs = r log₂ m,  where m is the number of symbols in X.
Channel Capacity of the Additive White Gaussian Noise (AWGN) Channel:
The capacity of the AWGN channel is given by

C = B log₂ (1 + S/N)  b/s   ........ Shannon-Hartley law

where B is the channel bandwidth
and S/N is the SNR at the channel output.

Capacity with infinite bandwidth:
Each frequency component transmitted through the channel is affected by the same amount of white
noise power. Replacing N = N₀B, where N₀ is the power spectral density of the white noise and B is
the bandwidth:

C = B log₂ (1 + S/(N₀B))

As B → ∞, the Shannon-Hartley law gives C → 1.44 S/N₀.

Proof: For the AWGN channel,

C = B log₂ (1 + S/(N₀B))

Multiplying and dividing by S/N₀:

C = (S/N₀) (N₀B/S) log₂ (1 + S/(N₀B))

Assume x = S/(N₀B); as B → ∞, x → 0:

C = (S/N₀) lim_{x→0} (1/x) log₂ (1 + x)

We know that lim_{x→0} (1 + x)^{1/x} = e.

Note that as the bandwidth increases, the noise power N = N₀B also increases, so no channel can
have infinite capacity even with unlimited bandwidth.
Using the formula lim_{x→0} (1/x) log₂ (1 + x) = log₂ e:

C(B→∞) = (S/N₀) log₂ e = 1.44 S/N₀  bits/sec

For efficient use of channel capacity there is a trade-off between bandwidth and signal-to-noise
ratio.
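The limit can be seen numerically: with fixed S and N₀ (illustrative values below, not from the text), C grows with B but saturates at (S/N₀) log₂ e ≈ 1.44 S/N₀:

```python
import math

def capacity(S, N0, B):
    """Shannon-Hartley capacity with N = N0 * B."""
    return B * math.log2(1.0 + S / (N0 * B))

# Illustrative values: S = 1 W, N0 = 1 mW/Hz
S, N0 = 1.0, 1.0e-3
for B in (1e3, 1e4, 1e5, 1e6):
    print(B, capacity(S, N0, B))
print("limit:", (S / N0) * math.log2(math.e))  # = 1.44 * S/N0
```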
Example 3:
Consider a telegraph source having two symbols, dot and dash. The dot duration is 0.2 s; the
dash duration is 3 times the dot duration. The probability of the dot occurring is twice that
of the dash, and the time between symbols is 0.2 s. Calculate the information rate of the
telegraph source.
Solution:
Given, P(dot) = 2 P(dash)
Now, P(dot) + P(dash) = 3 P(dash) = 1
P(dash) = 1/3,  P(dot) = 2/3
Using

H(x) = Σ_{i=1}^{2} P(xi) I(xi)

we have H(x) = −P(dot) log₂ P(dot) − P(dash) log₂ P(dash)
= 0.667 (0.585) + 0.333 (1.585)
= 0.92 b/symbol
The average time per symbol is
Ts = P(dot) t_dot + P(dash) t_dash + t_space = 0.533 s/symbol
The average information rate is
R = r H(x) = H(x)/Ts = 0.92/0.533 = 1.725 b/s
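The arithmetic above can be reproduced directly (the small difference from 1.725 b/s comes from rounding H and Ts in the hand calculation):

```python
import math

# Example 3: dot twice as likely as dash
p_dot, p_dash = 2 / 3, 1 / 3
t_dot, t_dash, t_space = 0.2, 0.6, 0.2  # seconds

# Entropy in bits/symbol
H = -(p_dot * math.log2(p_dot) + p_dash * math.log2(p_dash))

# Average time per symbol, including the inter-symbol space
Ts = p_dot * t_dot + p_dash * t_dash + t_space

R = H / Ts  # information rate, b/s
print(H, Ts, R)
```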
Example 4:
Consider an AWGN channel with 4 kHz bandwidth and noise PSD η/2 = 10⁻¹² W/Hz. The
signal power required at the receiver is 0.1 mW. Calculate the capacity of this channel.
Solution:
B = 4000 Hz,  S = 0.1 × 10⁻³ W
N = ηB = 2 × 10⁻¹² × 4000 = 8 × 10⁻⁹ W

S/N = (0.1 × 10⁻³) / (8 × 10⁻⁹) = 1.25 × 10⁴

Capacity, C = B log₂ (1 + S/N) = 4000 log₂ (1 + 1.25 × 10⁴) = 54.44 × 10³ b/s
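Example 4 can be checked in a few lines:

```python
import math

B = 4000.0          # Hz
S = 0.1e-3          # W
eta_over_2 = 1e-12  # two-sided noise PSD, W/Hz

N = 2 * eta_over_2 * B      # in-band noise power: 8e-9 W
snr = S / N                 # 1.25e4
C = B * math.log2(1 + snr)  # about 54.44e3 b/s
print(N, snr, C)
```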
Mutual Information:
The mutual information I(x; y) of a channel is defined by

I(x; y) = H(x) − H(x/y)  b/symbol

where H(x) represents the uncertainty about the channel input before the channel output
is observed, and H(x/y) represents the uncertainty about the channel input after the channel
output is observed.
For a binary symmetric channel with error probability p,

I(x; y) = H(y) + p log₂ p + (1 − p) log₂ (1 − p)
Source Coding:
A conversion of the output of a DMS into a sequence of binary symbols is called source
coding.
An objective of source coding is to minimize the average bit rate required for
representation of the source by reducing the redundancy of the information source and also to
achieve power saving.
Code Length and Efficiency:
The average code word length is

L = Σ_{i=1}^{m} P(xi) ni

where ni is the length in bits of the code word for symbol xi.
Code efficiency: η = Lmin/L = H(x)/L

Code redundancy: γ = 1 − η

For a discrete source, two methods of source coding are used:
1. Shannon-Fano coding
2. Huffman coding
When the source is transmitting messages with unequal probabilities, variable-length source
coding is more efficient than fixed-length source coding.
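The efficiency gain from variable-length coding is easy to see on a hypothetical source (probabilities and code lengths below are illustrative, not from the text): with dyadic probabilities, matching code lengths to −log₂ P(xi) achieves 100% efficiency, while a fixed 2-bit code does not:

```python
import math

def entropy(probs):
    """H = -sum p log2 p, bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def avg_length(probs, lengths):
    """L = sum P(x_i) * n_i."""
    return sum(p * n for p, n in zip(probs, lengths))

# Hypothetical source: P = 1/2, 1/4, 1/8, 1/8 with a variable-length prefix
# code of lengths 1, 2, 3, 3 (e.g. 0, 10, 110, 111) versus a fixed 2-bit code
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)                       # 1.75 bits/symbol
L_var = avg_length(probs, [1, 2, 3, 3])  # 1.75 -> efficiency 100%
L_fix = avg_length(probs, [2, 2, 2, 2])  # 2.00 -> efficiency 87.5%
print(H, H / L_var, H / L_fix)
```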
SAMPLE FILE
To Buy complete package
Call +91-9990657855, 011-26514888