
Stochastic processes slides



Page 1: Stochastic processes slides
Page 2: Stochastic processes slides

MODULE 1

Page 3: Stochastic processes slides

BASIC CONCEPTS OF PROBABILITY THEORY

• RANDOM EXPERIMENT: A random experiment is an experiment in which the outcome varies in an unpredictable fashion when the experiment is repeated under the same conditions.

• SAMPLE SPACE: The sample space S of a random experiment is defined as the set of all possible outcomes.

Page 4: Stochastic processes slides

Discrete sample space: The sample space is countable, i.e., its outcomes can be put in one-to-one correspondence with the positive integers.

Continuous sample space: The sample space is not countable.

Ex: Picking up a number between 0 and 1.

Page 5: Stochastic processes slides

EVENT: An event is a subset of the sample space; the event occurs if and only if the outcome ξ of the experiment is in that subset. Ex: selecting a ball from a box of balls numbered 1 to 50.

Certain event: it consists of all outcomes hence always occurs.

Null event (impossible event): it contains no outcome and never occurs.

Elementary event: Event from a discrete sample space that consists of a single outcome.

Page 6: Stochastic processes slides

PROBABILITY: It is a measure of the chance of occurrence of an event.

THE AXIOMS OF PROBABILITY:

Axiom 1: 0 ≤ P[A]
Axiom 2: P[S] = 1
Axiom 3: For any mutually exclusive events A1, A2, …, An in the class C,

P(A1 ∪ A2 ∪ … ∪ An) = P(A1) + P(A2) + … + P(An).

Page 7: Stochastic processes slides

THEOREMS ON PROBABILITY:
Theorem 1: If A1 ⊂ A2, then P(A1) ≤ P(A2) and P(A2 − A1) = P(A2) − P(A1).
Theorem 2: The probability of an event lies between 0 and 1: 0 ≤ P(A) ≤ 1.
Theorem 3: P(Ф) = 0.
Theorem 4: P(A′) = 1 − P(A).
Theorem 5: If A1, A2, …, An are mutually exclusive and exhaustive events, P(A1) + P(A2) + … + P(An) = 1.
Theorem 6: P(A ∪ B) = P(A) + P(B) − P(A∩B).

Page 8: Stochastic processes slides

Example: For the experiment of tossing a die the sample space is S = {1, 2, 3, 4, 5, 6}. Find P(2 ∪ 5).
Sol: P(1) = P(2) = … = P(6) = 1/6, so P(2) = 1/6 and P(5) = 1/6.
P(2 ∪ 5) = P(2) + P(5) − P(2∩5) = 1/6 + 1/6 = 2/6 (since P(2∩5) = 0).

CONDITIONAL PROBABILITY: If A and B are two events such that P(A) > 0, the conditional probability of B given A is

P(B|A) = P(A∩B) / P(A), P(A) > 0.

Example: A die is tossed; A is the event that the number is odd and B the event that it is less than 4.
Sol: P(1) = P(2) = … = P(6) = 1/6.
P(A) = P(1) + P(3) + P(5) = 3/6, P(B) = P(1) + P(2) + P(3) = 3/6, P(A∩B) = P(1) + P(3) = 2/6.

P(B|A) = P(A∩B) / P(A) = (1/3) / (1/2) = 2/3.
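A minimal sketch (not part of the original slides) that checks this conditional-probability example by direct enumeration of the die's sample space:

```python
# Enumerate S = {1,...,6} with equal probabilities and verify P(B|A) = 2/3.
from fractions import Fraction

S = range(1, 7)
P = {s: Fraction(1, 6) for s in S}

A = {s for s in S if s % 2 == 1}   # event A: number is odd
B = {s for s in S if s < 4}        # event B: number is less than 4

P_A = sum(P[s] for s in A)
P_AB = sum(P[s] for s in A & B)

print("P(A)   =", P_A)             # 1/2
print("P(A∩B) =", P_AB)            # 1/3
print("P(B|A) =", P_AB / P_A)      # 2/3, matching the slide
```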

Page 9: Stochastic processes slides

INDEPENDENT EVENTS: If A and B are two independent events, P(A∩B) = P(A)·P(B), so the conditional probability of B given A is P(B|A) = P(A)·P(B)/P(A) = P(B).

Page 10: Stochastic processes slides

MODULE II

Page 11: Stochastic processes slides

RANDOM VARIABLE

A definition

Page 12: Stochastic processes slides

Classification of random variables

Real random variable: A random variable X is a function that assigns a real number X(ξ) to each outcome ξ in the sample space of a random experiment. Its domain is the sample space, and the set Sx of all values taken on by X is the range of the random variable; thus Sx is a subset of the set of all real numbers.
◦ P{X = +∞} = 0
◦ P{X = −∞} = 0.

Complex random variable: A complex random variable assigns a complex number Z(ξ) to every outcome: Z(ξ) = X(ξ) + jY(ξ), where X and Y are real random variables.

Page 13: Stochastic processes slides

Classification of random variables

Discrete random variable: takes a finite or countable number of values; it is defined as a random variable whose CDF is a right-continuous staircase function of x with jumps at a countable set of points Sx = {x0, x1, …, xn}.

Continuous random variable: takes an uncountably infinite number of values; it is defined as a random variable whose CDF Fx(x) is continuous everywhere and, in addition, is sufficiently smooth that it can be written as an integral of some non-negative function f(x).

Page 14: Stochastic processes slides

Mixed type: a random variable with a CDF that has jumps at a countable set of points {x0, x1, …, xn}, but that also increases continuously over at least one interval of values of x.

Lattice Type:

Page 15: Stochastic processes slides

SAMPLE SPACE OF A RANDOM VARIABLE: The sample space of a real random variable X is Sx = {x : x = X(ξ) for ξ ∈ S}.
X(ξ) → the number assigned to the outcome ξ.
X → the rule of correspondence between any element of the set S and the number assigned to it.
Ex: X(fi) = 10i, i = 1, 2, …, 6. If i = 1, X(f1) = 10·1 = 10. Then
{X ≤ 35} = {f1, f2, f3}, {X ≤ 5} = Ф, {20 ≤ X ≤ 45} = {f2, f3, f4}, {X = 35} = Ф, {X = 40} = {f4}.

Page 16: Stochastic processes slides

CUMULATIVE DISTRIBUTION FUNCTION (CDF)

The cumulative distribution function (CDF) of a random variable X is defined as the probability of the event {X ≤ x}:

Fx(x) = P[X ≤ x], for −∞ < x < +∞,

i.e., the probability of the event consisting of all outcomes ξ such that X(ξ) ≤ x.

For random variables X, Y, Z the CDFs are written Fx, Fy, Fz: Fx(x) = P(X ≤ x), Fy(y) = P(Y ≤ y), Fz(z) = P(Z ≤ z), each defined for arguments between −∞ and +∞.

Page 17: Stochastic processes slides

Example: tossing a coin the sample space is given by S= {h,t}, and P(h)=p , P(t)=q. Define X(h)=1, X(t)=0. Find F(x).

Sol: Case 1: x ≥ 1 (certain event).
Since X(h) = 1 and X(t) = 0, the event {X ≤ x} contains both outcomes:
F(x) = P(X ≤ x) = P({h, t}) = 1.

Case 2: 0 ≤ x < 1.
The event {X ≤ x} contains only the tail outcome, X(t) = 0:
F(x) = P(X ≤ x) = P(t) = q.

Case 3: x < 0. F(x) = P(X ≤ x) = 0.

Fx(x) = { 1 for x ≥ 1; q for 0 ≤ x < 1; 0 for x < 0 }.

Page 18: Stochastic processes slides

[Figure: plot of the staircase CDF Fx(x).]

Page 19: Stochastic processes slides

PROBABILITY DENSITY FUNCTION

The probability density function (PDF) of X, if it exists, is defined as the derivative of Fx(x):

fX(x) = dFX(x)/dx

Page 20: Stochastic processes slides

Properties of PDF

• f(x) ≥ 0

• P(a ≤ X ≤ b) = ∫ₐᵇ fX(x) dx

• The CDF of X can be obtained by integrating the PDF:
FX(x) = ∫₋∞ˣ fX(t) dt

• Letting x tend to infinity, we obtain the normalization condition for PDFs:
1 = ∫₋∞⁺∞ fX(t) dt

Page 21: Stochastic processes slides

Problem 1: Tossing of two coins; let X be the number of heads.

Solution:
P(HH) = P(HT) = P(TH) = P(TT) = 0.25
P(X=0) = P(TT) = 0.25
P(X=1) = P(TH) + P(HT) = 0.5
P(X=2) = P(HH) = 0.25

Page 22: Stochastic processes slides

DISTRIBUTION FUNCTION OF A DISCRETE RANDOM VARIABLE

It is defined as F(x) = P(X ≤ x), where x is any number in −∞ < x < ∞.

F(x) = { 0 for −∞ < x < x1
         f(x1) for x1 ≤ x < x2
         f(x1) + f(x2) for x2 ≤ x < x3
         …
         f(x1) + f(x2) + … + f(xn) for xn ≤ x < ∞ }

Page 23: Stochastic processes slides

F(x) = { 0 for x < 0; 0.25 for 0 ≤ x < 1; 0.75 for 1 ≤ x < 2; 1 for x ≥ 2 }

Page 24: Stochastic processes slides

Problem 2: Tossing of a die

X(i) = 10i ; for i= 1 to 6 S = {10, 20, 30, 40, 50, 60}

[Figure: staircase CDF of X for the die experiment.]

Page 25: Stochastic processes slides

Problem 3: The random variable takes the constant value X(ζ) = a for every ζ. Find its CDF and PDF.

Sol: Case 1: x ≥ a. Then {X ≤ x} = S, so F(x) = P(S) = 1.
Case 2: x < a. Then {X ≤ x} = Ф, so F(x) = P(Ф) = 0.
The PDF is therefore f(x) = δ(x − a).

Page 26: Stochastic processes slides

Example prob 4: X(t) = t for t in [0, T]; the outcome of the experiment is itself the value of the random variable.
Sol:
Case 1: x ≥ T: {X ≤ x} = S, so F(x) = P(S) = 1.
Case 2: 0 ≤ x < T: {X ≤ x} = {0 ≤ t ≤ x}, so F(x) = P{0 ≤ t ≤ x} = x/T.
Case 3: x < 0: {X ≤ x} = Ф, so F(x) = P(Ф) = 0.
F(x) = { 1 for x ≥ T; x/T for 0 ≤ x < T; 0 for x < 0 }.

Page 27: Stochastic processes slides

Example prob 5: Find the constant c for the density function

f(x) = { cx² for 0 < x < 3; 0 otherwise }.

Sol: ∫₀³ cx² dx = 1, so c·[x³/3]₀³ = 9c = 1 and c = 1/9.

P(1 < X < 2) = ∫₁² (1/9)x² dx = (1/27)[x³]₁² = (8 − 1)/27 = 7/27.
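A quick symbolic check (not from the slides) of c = 1/9 and P(1 < X < 2) = 7/27:

```python
import sympy as sp

x, c = sp.symbols("x c", positive=True)

# Solve the normalization condition ∫_0^3 c*x**2 dx = 1 for c
c_val = sp.solve(sp.integrate(c * x**2, (x, 0, 3)) - 1, c)[0]
print("c =", c_val)                           # 1/9

# P(1 < X < 2) with the density (1/9) x**2 on (0, 3)
print("P(1<X<2) =", sp.integrate(c_val * x**2, (x, 1, 2)))   # 7/27
```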

Page 28: Stochastic processes slides

EXAMPLE PROB 6: For the random variable of the previous example, with density

f(x) = { x²/9 for 0 < x < 3; 0 otherwise },

find the distribution function and P(1 < X ≤ 2).

Sol:
Case 1: x ≤ 0: F(x) = 0.
Case 2: 0 < x < 3: F(x) = ∫₀ˣ f(t) dt = ∫₀ˣ (t²/9) dt = x³/27.
Case 3: x ≥ 3: F(x) = ∫₀³ f(t) dt = 1.

F(x) = { 0 for x ≤ 0; x³/27 for 0 < x < 3; 1 for x ≥ 3 }.

P(1 < X ≤ 2) = F(2) − F(1) = 8/27 − 1/27 = 7/27.

Page 29: Stochastic processes slides

Some Properties of the Distribution Function:

1) It is a non-decreasing function of x: if x1 < x2, then F(x1) ≤ F(x2).
2) If F(x0) = 0, then F(x) = 0 for every x ≤ x0.
3) P(X > x) = 1 − P(X ≤ x) = 1 − F(x).
4) The function is continuous from the right: F(x⁺) = F(x).
5) The probability that X lies in an interval: P(x1 < X ≤ x2) = F(x2) − F(x1).
6) The probability at X = x: P(X = x) = F(x) − F(x⁻).

Page 30: Stochastic processes slides

Delta-function properties: ∫ φ(x)·δ(x) dx = φ(0) and ∫ φ(x)·δ(x − x0) dx = φ(x0).

REPRESENTING THE DENSITY FUNCTION IN TERMS OF IMPULSE (DELTA) FUNCTIONS:

Assume k is the magnitude of the discontinuity of the distribution function at a point x0: k = F(x0⁺) − F(x0⁻). The density then contains the impulse k·δ(x − x0) at that point.

Page 31: Stochastic processes slides

f(x) = dF(x)/dx = Σᵢ Pᵢ·δ(x − xᵢ)

For tossing a die (X(i) = 10i), the unit-impulse representation is
f(x) = (1/6)[δ(x − 10) + δ(x − 20) + … + δ(x − 60)].

Unit-step representation for tossing two coins:
F(x) = (1/4)U(x) + (1/2)U(x − 1) + (1/4)U(x − 2).

Page 32: Stochastic processes slides

The impulses in the density function are known as point masses Pᵢ placed at xᵢ; if the density is positive on the entire x-axis, the total mass is 1.

Page 33: Stochastic processes slides

PROBABILITY MASS FUNCTION

If the density function f(x) is finite, the mass in the interval [x, x + dx] equals f(x)dx.

The impulses pᵢ·δ(x − xᵢ) in the density function are known as point masses pᵢ placed at xᵢ.

• If the density is positive on the entire x-axis, the total mass is 1.
• The probability that the RV X takes values in a certain region of the x-axis equals the mass in that region; in particular, the distribution function F(x) equals the mass in the interval (−∞, x).

Page 34: Stochastic processes slides

CONDITIONAL PROBABILITY:

If A and B are two events, such that P (A)>0, the conditional probability of B given A is given by.

P(B|A) = P(A∩B) / P(A), if P(A) > 0.

Page 35: Stochastic processes slides

Example: tossing a die. The event A is that the number is odd, and the event B is that it is less than 4. Find the conditional probability of B given A.

Sol: P(1)=P(2)=P(3)=P(4)=P(5)=P(6)= 1/6 P (A) = P (1) +P (3) +P (5) =3/6. P (B) = P (1) +P (2) +P (3) =3/6 P (A∩B) = P (1) +P (3) =2/6.

P(B|A) = P(A∩B) / P(A) = (2/6) / (3/6) = 2/3.

Page 36: Stochastic processes slides

Total probability

If B1, B2, …, Bn form a partition of the sample space, then for any event A,

P(A) = P(A∩B1) + P(A∩B2) + … + P(A∩Bn)
     = P(A|B1)P(B1) + P(A|B2)P(B2) + … + P(A|Bn)P(Bn).

Page 37: Stochastic processes slides

INDEPENDENT PROBABILITY:

If A and B are two independent events, P(A∩B) = P(A)·P(B), so the conditional probability of B given A is P(B|A) = P(A)·P(B)/P(A) = P(B).

Page 38: Stochastic processes slides

Conditional CDF

FX(x|A) = P({X ≤ x} ∩ A) / P(A), if P(A) > 0.

Page 39: Stochastic processes slides

Conditional PDF

fX(x|A) = d/dx FX(x|A).

Page 40: Stochastic processes slides

Example 1: The waiting time X of a customer in a queueing system is 0 if he finds the system idle, and an exponentially distributed random length of time if he finds the system busy.

i.e. P[idle] = p, P[busy] = 1 − p. Find the CDF of X.

Page 41: Stochastic processes slides

Solution

F(x) can be expressed as the sum of a step function with amplitude p and a continuous function of x:

F(x) = P(X ≤ x) = P(X ≤ x | idle)·p + P(X ≤ x | busy)·(1 − p)
     = { 0 for x < 0
         p + (1 − p)(1 − e^(−λx)) for x ≥ 0 }

f(x) = F′(x) = p·δ(x) + (1 − p)·λe^(−λx), x ≥ 0.
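An illustrative numerical sketch of this mixed CDF (the values p = 0.3 and λ = 2.0 are assumed for illustration only, not from the slides), with a Monte Carlo cross-check:

```python
import numpy as np

rng = np.random.default_rng(0)
p, lam, n = 0.3, 2.0, 200_000

idle = rng.random(n) < p                        # with probability p the system is idle
waits = np.where(idle, 0.0, rng.exponential(1 / lam, n))

def F(x):
    # mixed CDF: jump of size p at 0, exponential tail for the busy case
    return np.where(x < 0, 0.0, p + (1 - p) * (1 - np.exp(-lam * x)))

for x in [0.0, 0.5, 1.0, 2.0]:
    print(x, F(x), np.mean(waits <= x))         # model CDF vs empirical CDF
```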

Page 42: Stochastic processes slides

Example: The PDF of samples of the amplitude of speech waveforms is found to decay exponentially at rate α, so it is modeled as

fX(x) = c·e^(−α|x|), −∞ < x < ∞.

Find c and P[|X| ≤ v].

Page 43: Stochastic processes slides

Solution

1 = ∫₋∞^∞ c·e^(−α|x|) dx = 2∫₀^∞ c·e^(−αx) dx = 2c/α, which gives c = α/2.

P[|X| ≤ v] = ∫₋ᵥ^ᵥ (α/2)·e^(−α|x|) dx = 2∫₀^ᵥ (α/2)·e^(−αx) dx = 1 − e^(−αv).

Page 44: Stochastic processes slides

Example: The lifetime of a machine has a continuous CDF F(x). Find the conditional PDF and CDF given the event A = {X > t}, i.e., the machine is still working at time t.

Page 45: Stochastic processes slides

Solution

Conditional CDF:
FX(x | X > t) = P[{X ≤ x} ∩ {X > t}] / P[X > t]
             = { 0 for x ≤ t
                 (FX(x) − FX(t)) / (1 − FX(t)) for x > t }

Conditional PDF:
fX(x | X > t) = fX(x) / (1 − FX(t)), x > t.

Page 46: Stochastic processes slides

Special Random Variables

Page 47: Stochastic processes slides

Continuous type Random Variable

1) Gaussian (Normal) Random Variable: The PDF is given by

fX(x) = (1/√(2πσ²))·e^(−(x−m)²/(2σ²)), −∞ < x < ∞,

which is bell-shaped and symmetrical around m.

CDF: FX(x) = (1/√(2πσ²)) ∫₋∞ˣ e^(−(y−m)²/(2σ²)) dy = G((x − m)/σ),

where G(x) = (1/√(2π)) ∫₋∞ˣ e^(−y²/2) dy.

Page 48: Stochastic processes slides

The constant 1/√(2πσ²) is a normalization constant and maintains the area under the curve f(x) equal to unity.

To verify, let Q = ∫₀^∞ e^(−x²/(2σ²)) dx. Then

Q² = ∫₀^∞∫₀^∞ e^(−(x²+y²)/(2σ²)) dx dy = ∫₀^(π/2)∫₀^∞ e^(−r²/(2σ²)) r dr dθ,

where x = r·cosθ, y = r·sinθ, dx dy = r dr dθ. With u = r²/(2σ²), du = r dr/σ²,

Q² = (π/2)·σ² ∫₀^∞ e^(−u) du = πσ²/2, so Q = σ√(π/2) and ∫₋∞^∞ e^(−x²/(2σ²)) dx = 2Q = σ√(2π).

Page 49: Stochastic processes slides

A Gaussian RV with m = 0 and σ = 1 is called a standard normal random variable.

Skewness: It is the measure of the asymmetry of the density function about the mean; the coefficient of skewness is μ₃/σ³, negative for a left-skewed density and positive for a right-skewed one.

Kurtosis: It is a measure of the peakedness of the density function; the upper (more peaked) curve has larger kurtosis than the lower one.

Page 50: Stochastic processes slides

2) Exponential distribution: the PDF of the exponential distribution is

fX(x) = { λe^(−λx) for x ≥ 0
          0 otherwise }

The parameter λ is the rate at which events occur, i.e., the probability of an event occurring by time x increases as the rate increases.

Page 51: Stochastic processes slides

If occurrences of events over non-overlapping intervals are independent, the exponential model applies, e.g., to the arrival times of telephone calls, bus arrival times at a bus stop, and waiting-time distributions.

Page 52: Stochastic processes slides

Example: If q(t) is the probability that in a time interval t no event has occurred, then

q(t1 + t2) = q(t1)·q(t2),

and with q(t) = P(X > t) = 1 − P(X ≤ t) we get q(t) = e^(−λt), i.e., P(X ≤ t) = 1 − e^(−λt).

Page 53: Stochastic processes slides

It has the memoryless property

If s, t ≥ 0, consider the events {X > s + t} and {X > s}:

P[X > s + t | X > s] = P[{X > s + t} ∩ {X > s}] / P[X > s]
                     = P[X > s + t] / P[X > s]
                     = (1 − F(s + t)) / (1 − F(s))
                     = e^(−λ(s+t)) / e^(−λs)
                     = e^(−λt)
                     = P[X > t].

Page 54: Stochastic processes slides

Example: The waiting time a customer spends at a restaurant is exponentially distributed with mean value 5 min.

Find the probability that the customer will spend more than 10 min.

Page 55: Stochastic processes slides

Solution

P(X > 10) = e^(−10/5) = e^(−2) = 0.1353.

The probability that the customer will spend an additional 10 min in the restaurant, given that he has already been there for more than 10 minutes, is (by the memoryless property)

P(X > 20 | X > 10) = P(X > 10) = e^(−2) = 0.1353; it does not depend on the past.
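A small sketch (not from the slides) that checks both numbers and the memoryless property by simulation:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(5.0, 1_000_000)            # exponential waiting times, mean 5

p_more_than_10 = np.mean(x > 10)               # ≈ exp(-2) ≈ 0.1353
p_conditional = np.mean(x > 20) / np.mean(x > 10)   # P(X>20 | X>10)

print("P(X > 10)         ≈", p_more_than_10)
print("P(X > 20 | X > 10) ≈", p_conditional)   # also ≈ 0.1353 (memoryless)
print("exp(-2)            =", np.exp(-2))
```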

Page 56: Stochastic processes slides

3) Gamma Distribution:

fX(x) = { (λ^α / Γ(α))·x^(α−1)·e^(−λx) for x > 0
          0 otherwise }

where Γ(α) = ∫₀^∞ x^(α−1)·e^(−x) dx.

The exponential distribution is a special case of the gamma distribution (α = 1).
If α = ν/2 and λ = 1/2, it is the chi-square distribution.
If α = m, an integer, it is the m-Erlang distribution.

Page 57: Stochastic processes slides

4) Uniform Distribution: the density function of the uniform distribution is

fX(x) = { 1/(b − a) for a ≤ x ≤ b
          0 otherwise }

Page 58: Stochastic processes slides

Discrete type random variable:

1) Bernoulli Random Variable: Only two possible outcomes; X takes the value 0 or 1 with P(X=1) = p and P(X=0) = q = 1 − p. Here p is the probability of success in each independent trial, and q is the probability of failure.

Page 59: Stochastic processes slides

2) Binomial Random Variable:

The probability that an event occurs exactly x times out of n independent trials, where x is the number of successes and n is the number of trials:

fX(x) = P(X = x) = C(n, x)·pˣ·(1 − p)ⁿ⁻ˣ, x = 0, 1, …, n.

Page 60: Stochastic processes slides

3) Poisson Random Variable

This is closely related to the binomial distribution when the number of trials is large and the probability of occurrence in each trial is small; the pmf is P(X = k) = e^(−λ)·λᵏ/k!, k = 0, 1, 2, …

Examples: number of counts of emission from a radioactive substance, number of demands for telephone connections, number of calls at a telephone exchange in a given time, number of printing errors in a book.

Page 61: Stochastic processes slides

Function of a random variable

X → random variable; g(X) → real-valued function. Define Y = g(X). The probability distribution of Y depends on the probability distribution (and CDF) of X.

Page 62: Stochastic processes slides

Example: A linear function Y = aX + b, a ≠ 0. If the CDF of X is F(x), find F(y).

Page 63: Stochastic processes slides

Solution

The event {Y ≤ y} occurs when A = {aX + b ≤ y} occurs.

If a > 0, A = {X ≤ (y − b)/a}, thus
FY(y) = P(X ≤ (y − b)/a) = FX((y − b)/a), a > 0.

If a < 0, A = {X ≥ (y − b)/a}, thus
FY(y) = P(X ≥ (y − b)/a) = 1 − FX((y − b)/a), a < 0.

Differentiating (with u = (y − b)/a, du/dy = 1/a):

fY(y) = (1/a)·fX((y − b)/a) for a > 0, and fY(y) = −(1/a)·fX((y − b)/a) for a < 0;

in both cases fY(y) = (1/|a|)·fX((y − b)/a).

Page 65: Stochastic processes slides

Example: Y = X², where X is any continuous random variable. Find the CDF and PDF of Y.

Page 66: Stochastic processes slides

Solution

For y ≥ 0 the event {Y ≤ y} occurs when {−√y ≤ X ≤ √y}; for y < 0 the event is null.

FY(y) = { FX(√y) − FX(−√y) for y ≥ 0
          0 for y < 0 }

PDF: differentiating with respect to y,

fY(y) = fX(√y)·(1/(2√y)) + fX(−√y)·(1/(2√y)) = [fX(√y) + fX(−√y)] / (2√y), y > 0.

Page 68: Stochastic processes slides

Example: Y = cos(X), where X is a uniformly distributed random variable in the interval (0, 2π); i.e., Y is obtained by sampling a sinusoid at a uniformly distributed random point of its period.

Find the PDF of Y.

Page 69: Stochastic processes slides

Solution

For −1 < y < 1 the equation y = cos(x) has two solutions in (0, 2π):
x0 = cos⁻¹(y) and x1 = 2π − x0.

Since fX(x) = 1/(2π) on (0, 2π) and |dy/dx| = |sin(x0)| = √(1 − y²),

fY(y) = fX(x0)/|dy/dx| + fX(x1)/|dy/dx| = 2·(1/(2π))/√(1 − y²) = 1/(π√(1 − y²)), −1 < y < 1.

CDF: FY(y) = 1/2 + sin⁻¹(y)/π, −1 < y < 1.

Y is said to have the arcsine distribution.

Page 70: Stochastic processes slides

Example

g(X) = { X + c for X < −c
         0 for −c ≤ X ≤ c
         X − c for X > c }

If y > 0: P(Y ≤ y) = P(X ≤ y + c) = FX(y + c).
If y < 0: P(Y ≤ y) = P(X ≤ y − c) = FX(y − c).
(At y = 0 the CDF of Y has a jump of magnitude P(−c ≤ X ≤ c).)

Page 71: Stochastic processes slides

Example

g(X) = { +1 for X ≥ 0
         −1 for X < 0 }

P(Y = −1) = P(X < 0) = FX(0),
P(Y = +1) = P(X ≥ 0) = 1 − FX(0).

Page 72: Stochastic processes slides

Expected value of a random variable (Mathematical Expectation)

It is an estimate of a random variable; it is a measure of central tendency.

Page 73: Stochastic processes slides

Expected value of a discrete random variable

A discrete RV X having the possible values x1, x2, …, xn has expected value defined as

E(X) = x1·P(X = x1) + x2·P(X = x2) + … + xn·P(X = xn) = Σⱼ xⱼ·P(X = xⱼ).

If P(X = xⱼ) = f(xⱼ), then

E(X) = x1·f(x1) + x2·f(x2) + … + xn·f(xn) = Σⱼ xⱼ·f(xⱼ).

Page 74: Stochastic processes slides

If all probabilities are equal,

E(X) = (x1 + x2 + … + xn)/n, the arithmetic mean of x1, …, xn.

If X has an infinite number of values x1, x2, …, then

E(X) = Σ_{j=1}^{∞} xⱼ·f(xⱼ), provided the infinite series converges.

Page 75: Stochastic processes slides

Expected value of a continuous random variable

The expected value of a continuous RV is given by

E(X) = ∫₋∞^∞ x·f(x) dx, provided the integral converges absolutely.

Page 76: Stochastic processes slides

Function of a Random Variable (Discrete)

If X is a RV with probability function f(x) and we define Y = g(X), the probability function h(y) of Y is given by

h(y) = P(Y = y) = Σ_{x | g(x) = y} P(X = x) = Σ_{x | g(x) = y} f(x).

If X takes the values x1, x2, …, xn and Y takes the values y1, y2, …, ym (m ≤ n), then

E(Y) = y1·h(y1) + y2·h(y2) + … + ym·h(ym)
     = g(x1)·f(x1) + g(x2)·f(x2) + … + g(xn)·f(xn),

so E[g(X)] = Σⱼ g(xⱼ)·f(xⱼ).

Page 77: Stochastic processes slides

Function of a Random Variable (Continuous)

If X is a continuous RV having probability density f(x),

E[g(X)] = ∫₋∞^∞ g(x)·f(x) dx.

Page 78: Stochastic processes slides

Problem: Fair Die Experiment

If 2 turns up, one wins Rs. 20/-; if 4 turns up, one wins Rs. 40/-; if 6 turns up, one loses Rs. 30/-. Find E(X), where the RV X is the amount of money won or lost.

Page 79: Stochastic processes slides

Solution

xⱼ:   0   +20   0   +40   0   −30
f(x): 1/6 1/6  1/6  1/6  1/6  1/6

E(X) = 0·(1/6) + 20·(1/6) + 0·(1/6) + 40·(1/6) + 0·(1/6) + (−30)·(1/6) = 5.
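The same expectation computed directly from the payoff table (a sketch, not part of the slides):

```python
from fractions import Fraction

payoff = {1: 0, 2: 20, 3: 0, 4: 40, 5: 0, 6: -30}   # winnings per die face
prob = Fraction(1, 6)

E_X = sum(payoff[face] * prob for face in payoff)
print("E(X) =", E_X)   # 5
```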

Page 80: Stochastic processes slides

Problem

If X is a RV whose density function is given by

f(x) = { x/2 for 0 < x < 2
         0 otherwise }

find E(X).

Page 81: Stochastic processes slides

Solution

E(X) = ∫₋∞^∞ x·f(x) dx = ∫₀² x·(x/2) dx = (1/2)·[x³/3]₀² = 4/3.

Page 82: Stochastic processes slides

Problem: Uniform RV

Page 83: Stochastic processes slides

Problem

Find E(3X² − 2X) if

f(x) = { x/2 for 0 < x < 2
         0 otherwise }

Page 84: Stochastic processes slides

Solution

E(3X² − 2X) = ∫₀² (3x² − 2x)·(x/2) dx = ∫₀² (3x³/2 − x²) dx = [3x⁴/8 − x³/3]₀² = 6 − 8/3 = 10/3.

Page 85: Stochastic processes slides

Theorems on Expected Value

Theorem 1: If c is a constant, E(cX) = c·E(X).
Theorem 2: E(X + Y) = E(X) + E(Y).
Theorem 3: E(XY) = E(X)·E(Y), if X and Y are two independent RVs.
Theorem 4: E(X + c) = E(X) + c.

Page 86: Stochastic processes slides

Variance and Standard Deviation

The variance is a parameter that measures the spread of the PDF of a random variable about the mean (the deviation around the mean):

Var(X) = σX² = E[(X − μ)²], which is a non-negative number.

The standard deviation σX = √Var(X) is its positive square root.

Page 87: Stochastic processes slides

Discrete Random Variable

If X is a discrete RV taking the values x1, x2, …, xn and having probability function f(x), the variance is given by

σX² = E[(X − μ)²] = Σ_{j=1}^{n} (xⱼ − μ)²·f(xⱼ).

Page 88: Stochastic processes slides

If all the probabilities are equal,

σ² = [(x1 − μ)² + (x2 − μ)² + … + (xn − μ)²] / n.

If X takes an infinite number of values x1, x2, …,

σX² = Σ_{j=1}^{∞} (xⱼ − μ)²·f(xⱼ), provided the series converges.

Page 89: Stochastic processes slides

Continuous Random Variable

If X is a continuous RV having density function f(x),

σX² = E[(X − μ)²] = ∫₋∞^∞ (x − μ)²·f(x) dx, provided the integral converges.

Page 90: Stochastic processes slides

The variance or standard deviation is the measure of the dispersion or scatter (spread of PDF) of the values of RV about the mean.

Small variance

Large variance

Page 91: Stochastic processes slides

Problem

If X is a continuous RV with PDF

f(x) = { x/2 for 0 < x < 2
         0 otherwise }

find the variance and standard deviation.

Page 92: Stochastic processes slides

Solution

E(X) = μ = 4/3 (from the earlier problem).

Variance = E[(X − μ)²] = ∫₋∞^∞ (x − μ)²·f(x) dx = ∫₀² (x − 4/3)²·(x/2) dx = 2/9.

Standard deviation σ = √(2/9) = √2/3.

* If the unit of X is cm, the unit of the variance is cm² and the unit of the standard deviation is cm.
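A numerical check of E(X) = 4/3 and Var(X) = 2/9 for this density (a sketch, not from the slides):

```python
from scipy.integrate import quad

f = lambda x: x / 2                                   # density on (0, 2)
mean, _ = quad(lambda x: x * f(x), 0, 2)
var, _ = quad(lambda x: (x - mean) ** 2 * f(x), 0, 2)

print("E(X)   =", mean)        # ≈ 1.3333 = 4/3
print("Var(X) =", var)         # ≈ 0.2222 = 2/9
print("sigma  =", var ** 0.5)  # ≈ 0.4714 = sqrt(2)/3
```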

Page 93: Stochastic processes slides

Theorems on Variance

Theorem 1: σ² = E[(X − μ)²] = E(X²) − μ², where μ = E(X).

Proof: E[(X − μ)²] = E[X² − 2μX + μ²] = E(X²) − 2μE(X) + μ² = E(X²) − 2μ² + μ² = E(X²) − μ².

Theorem 2: Var(X + c) = Var(X); Var(c) = 0.

Page 94: Stochastic processes slides

Theorem 3: Var(cX) = c²·Var(X).

Theorem 4: The quantity E[(X − a)²] is minimum when a = μ = E(X).

Proof: E[(X − a)²] = E[((X − μ) + (μ − a))²]
                   = E[(X − μ)²] + 2(μ − a)·E(X − μ) + (μ − a)²
                   = E[(X − μ)²] + (μ − a)², since E(X − μ) = 0.

So the minimum value of E[(X − a)²] occurs when a = μ.

Page 95: Stochastic processes slides

Theorem 5:If X and Y are independent RVs

Var(X+Y)=Var(X)+Var(Y)Var(X-Y)=Var(X)+Var(Y)

Page 96: Stochastic processes slides

Problem: Uniform RV — find the mean and variance of a RV uniformly distributed on (a, b).

Page 97: Stochastic processes slides

Solution

E(X) = ∫ₐᵇ x·(1/(b − a)) dx = (a + b)/2.

Var(X) = ∫ₐᵇ (x − (a + b)/2)²·(1/(b − a)) dx = (b − a)²/12.

Page 98: Stochastic processes slides

Standardized Random Variable

These are random variables whose mean value is 0 and whose variance is 1. The standardized random variable is

X* = (X − m)/σ, so that E[X*] = 0 and Var[X*] = 1.

Page 99: Stochastic processes slides

Moments

The rth moment of a RV X about the mean is defined as

μᵣ = E[(X − μ)ʳ], r = 0, 1, 2, …

It is also called the rth central moment. Note that μ₀ = 1, μ₁ = 0, and μ₂ = σ²: the second central moment (second moment about the mean) is the variance.

Page 100: Stochastic processes slides

Discrete Random Variable: μᵣ = Σⱼ (xⱼ − μ)ʳ·f(xⱼ)

Continuous Random Variable: μᵣ = ∫₋∞^∞ (x − μ)ʳ·f(x) dx

Page 101: Stochastic processes slides

Moment about the origin

The rth moment of X about the origin (rth raw moment) is defined as

μᵣ′ = E[Xʳ], where r = 0, 1, 2, …

Page 102: Stochastic processes slides

Skewness

It is the measure of the degree of asymmetry about the mean and is given by

α₃ = μ₃/σ³ = E[(X − μ)³]/σ³.

Page 103: Stochastic processes slides

Kurtosis

It is the measure of the degree of peakedness of the density function, given by α₄ = μ₄/σ⁴.

Page 104: Stochastic processes slides

Moment Generating Function

It is used to generate the moments of a random variable and is given by

MX(t) = E[e^(tX)]

Discrete Random Variable: MX(t) = Σₓ e^(tx)·f(x)

Continuous Random Variable: MX(t) = ∫₋∞^∞ e^(tx)·f(x) dx

Page 105: Stochastic processes slides

Proof:

MX(t) = E[e^(tX)] = E[1 + tX + t²X²/2! + t³X³/3! + …]
      = 1 + t·E(X) + (t²/2!)·E(X²) + (t³/3!)·E(X³) + …
      = 1 + μ₁′t + μ₂′t²/2! + μ₃′t³/3! + … + μᵣ′tʳ/r! + …

The coefficients of this expansion give the moments, hence the name moment generating function.

Page 106: Stochastic processes slides

μᵣ′ is the rth derivative of MX(t) evaluated at t = 0:

μᵣ′ = (dʳ/dtʳ) MX(t) |_{t=0}

Page 107: Stochastic processes slides

Theorems on Moment Generating Functions

Theorem 1: If MX(t) is the moment generating function of RV X and a and b are constants, then

M_{(X+a)/b}(t) = e^(at/b)·MX(t/b).

Proof:
M_{(X+a)/b}(t) = E[e^(t(X+a)/b)] = E[e^(tX/b)·e^(at/b)] = e^(at/b)·E[e^((t/b)X)] = e^(at/b)·MX(t/b).

Page 108: Stochastic processes slides

Theorem 2: If X and Y are two independent RVs having moment generating functions MX(t) and MY(t), then

M_{X+Y}(t) = MX(t)·MY(t).

Proof: M_{X+Y}(t) = E[e^(t(X+Y))] = E[e^(tX)·e^(tY)] = E[e^(tX)]·E[e^(tY)] = MX(t)·MY(t).

Page 109: Stochastic processes slides

Theorem 3 (Uniqueness theorem): Two RVs X and Y have the same probability distribution if and only if MX(t) = MY(t).

Page 110: Stochastic processes slides

Problem

A RV assumes the values 1 and −1 with probability ½ each. Find (a) the moment generating function and (b) the first 4 moments about the origin.

Page 111: Stochastic processes slides

Solution

(a) MX(t) = E[e^(tX)] = (1/2)e^(t) + (1/2)e^(−t).

(b) Expanding,
(1/2)(e^(t) + e^(−t)) = 1 + t²/2! + t⁴/4! + …

Comparing with MX(t) = 1 + μ₁′t + μ₂′t²/2! + μ₃′t³/3! + μ₄′t⁴/4! + …, we get
μ₁′ = 0, μ₂′ = 1, μ₃′ = 0, μ₄′ = 1, …

Page 112: Stochastic processes slides

Example 1: Find the moment generating function and the first 4 moments about the origin for

f(x) = { 2e^(−2x) for x ≥ 0
         0 otherwise }

Sol:
E[e^(tX)] = ∫₀^∞ e^(tx)·2e^(−2x) dx = 2∫₀^∞ e^(−(2−t)x) dx = 2/(2 − t), for t < 2.

Page 113: Stochastic processes slides

MX(t) = 1/(1 − t/2) = 1 + t/2 + t²/4 + t³/8 + t⁴/16 + …

Comparing with MX(t) = 1 + μ₁′t + μ₂′t²/2! + μ₃′t³/3! + μ₄′t⁴/4! + …, we get
μ₁′ = 1/2, μ₂′ = 1/2, μ₃′ = 3/4, μ₄′ = 3/2, …
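A symbolic sketch (not from the slides) that expands the MGF 2/(2 − t) and reads off the first four raw moments:

```python
import sympy as sp

t = sp.symbols("t")
M = 2 / (2 - t)                            # MGF of f(x) = 2*exp(-2x), x >= 0

series = sp.series(M, t, 0, 5).removeO()   # 1 + t/2 + t**2/4 + t**3/8 + t**4/16
for r in range(1, 5):
    # r-th raw moment = r! * (coefficient of t**r in the expansion)
    moment = sp.factorial(r) * series.coeff(t, r)
    print(f"mu'_{r} =", moment)            # 1/2, 1/2, 3/4, 3/2
```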

Page 114: Stochastic processes slides

Characteristic Function: simply replace t = iw, where i is the imaginary unit:

φX(w) = E[e^(iwX)] = MX(iw)
      = Σₓ e^(iwx)·f(x) for the discrete case
      = ∫ e^(iwx)·f(x) dx for the continuous case

Theorem 1: For the random variable (X + a)/b, the characteristic function is

φ_{(X+a)/b}(w) = e^(iaw/b)·φX(w/b).

Proof: φ_{(X+a)/b}(w) = E[e^(iw(X+a)/b)] = e^(iaw/b)·E[e^(i(w/b)X)] = e^(iaw/b)·φX(w/b).

Theorem 2 (Uniqueness theorem): X and Y have the same distribution if and only if φX(w) = φY(w).

Page 115: Stochastic processes slides

Theorem 3: If X and Y are independent RVs

φ_{X+Y}(w) = φX(w)·φY(w)

Page 116: Stochastic processes slides

Expanding the characteristic function,

φX(w) = 1 + iwμ₁′ + (iw)²μ₂′/2! + … + (iw)ʳμᵣ′/r! + …,

and the rth moment is obtained from

μᵣ′ = (1/iʳ)·(dʳ/dwʳ) φX(w) |_{w=0}.

Page 117: Stochastic processes slides

Problem

If X is a RV taking the values −1 and 1 with probability ½ each, find the characteristic function.

Page 118: Stochastic processes slides

Solution

φX(ω) = E[e^(iωX)] = (1/2)e^(iω) + (1/2)e^(−iω) = cos ω.

Page 119: Stochastic processes slides

Problem

If X is a RV with PDF

f(x) = { 1/(2a) for |x| ≤ a
         0 otherwise }

find the characteristic function.

Page 120: Stochastic processes slides

Solution

E[e^(iωX)] = ∫₋ₐᵃ e^(iωx)·(1/(2a)) dx = (1/(2a))·[e^(iωx)/(iω)]₋ₐᵃ = (e^(iaω) − e^(−iaω))/(2aiω) = sin(aω)/(aω).

Page 121: Stochastic processes slides

Markov and Chebyshev Inequality

• The mean and variance do not provide enough information to determine the CDF/PDF, but they do allow bounds on probabilities of the form P[X ≥ t].

Page 122: Stochastic processes slides

Markov Inequality

For a nonnegative RV X and a > 0,

P[X ≥ a] ≤ E[X]/a.

Proof:
E[X] = ∫₀^∞ x·f(x) dx = ∫₀ᵃ x·f(x) dx + ∫ₐ^∞ x·f(x) dx ≥ ∫ₐ^∞ x·f(x) dx ≥ a·∫ₐ^∞ f(x) dx = a·P[X ≥ a].

Page 123: Stochastic processes slides

Problem

The mean height of children in a class is 3 feet 6 inches. Find a bound on the probability that a kid in the class is taller than 9 feet.

Solution: P[H ≥ 9] ≤ 3.5/9 = 0.389.

Page 124: Stochastic processes slides

Chebyshev Inequality

The Markov inequality does not use any knowledge about the variability of the random variable about its mean.

If E[X] = m and Var(X) = σ², the Chebyshev inequality states

P[|X − m| ≥ a] ≤ σ²/a².

Proof: Let D² = (X − m)². Applying the Markov inequality to D²,

P[D² ≥ a²] ≤ E[(X − m)²]/a² = σ²/a².

If a RV has zero variance, the Chebyshev inequality implies that P[X = m] = 1, i.e., the RV is equal to its mean with probability 1.

Page 125: Stochastic processes slides

Problem

If E[X] = m and Var(X) = σ², the Chebyshev inequality for a = kσ gives

P[|X − m| ≥ kσ] ≤ 1/k².

If X is a Gaussian RV, then for k = 2 the exact value is P[|X − m| ≥ 2σ] = 0.0456, whereas the Chebyshev inequality gives the bound 0.25.

The inequality is useful when nothing about the distribution of the RV is known other than its mean and variance.
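A short comparison (not from the slides) of the Chebyshev bound against the exact Gaussian tail for several values of k:

```python
from scipy.stats import norm

for k in [1, 2, 3, 4]:
    exact = 2 * norm.sf(k)        # P(|Z| >= k) for a standard normal Z
    bound = 1 / k**2              # Chebyshev bound
    print(f"k={k}:  exact={exact:.4f}  Chebyshev bound={bound:.4f}")
# For k = 2: exact ≈ 0.0456 versus the bound 0.25, as on the slide.
```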

Page 126: Stochastic processes slides

Transform MethodsTransform MethodsThese are useful computational aids in the

solution of equations that involve derivatives and integral of functions.

Characteristic Function Probability Generating Function Laplace Transform of PDF

Page 127: Stochastic processes slides

Characteristic Function

It is defined as

φX(ω) = E[e^(jωX)] = ∫₋∞^∞ fX(x)·e^(jωx) dx,

i.e., the expected value of e^(jωX); it is the Fourier transform of the PDF of X (with a reversal in the sign of the exponent).

The PDF of X is recovered by the inverse formula

fX(x) = (1/2π) ∫₋∞^∞ φX(ω)·e^(−jωx) dω.

Every PDF and its characteristic function form a unique Fourier transform pair.

Page 128: Stochastic processes slides

Problem

Find the characteristic function of an exponentially distributed RV with parameter λ.

Solution:

φX(ω) = ∫₀^∞ λe^(−λx)·e^(jωx) dx = λ∫₀^∞ e^(−(λ−jω)x) dx = λ/(λ − jω).

Page 129: Stochastic processes slides

Characteristic Function for a Discrete RV

It is given by φX(ω) = Σₖ pX(xₖ)·e^(jωxₖ).

If the discrete RV is integer valued, the characteristic function is

φX(ω) = Σₖ pX(k)·e^(jωk),

which is the Fourier transform of the sequence pX(k).

Page 130: Stochastic processes slides

Probability Generating Function

The probability generating function of a nonnegative integer-valued RV N is defined as

GN(z) = E[z^N] = Σ_{k=0}^{∞} pN(k)·zᵏ,

i.e., the expected value of z^N; it is the z-transform of the pmf (with a change of sign in the exponent).

The pmf of N is recovered by

pN(k) = (1/k!)·(dᵏ/dzᵏ) GN(z) |_{z=0}.

Page 131: Stochastic processes slides

Laplace Transform of the PDF

It is useful for queueing systems, where one deals with service times, waiting times, delays, etc.

The Laplace transform of the pdf is given by

X*(s) = ∫₀^∞ fX(x)·e^(−sx) dx = E[e^(−sX)].

Page 132: Stochastic processes slides

Module 3

Page 133: Stochastic processes slides

Vector Random Variable

A vector random variable assigns a vector of real numbers to each outcome of the sample space of a random experiment.

Example: A random experiment consists of selecting a student's name from an urn; for the selected student let
H(ξ) = height of the student in inches,
A(ξ) = age of the student in years,
W(ξ) = weight of the student in kg.
Then the vector (H, A, W) is a vector random variable.

Page 134: Stochastic processes slides

We have two-dimensional events such as

{x1 < X ≤ x2, Y ≤ y2} and {x1 < X ≤ x2, y1 < Y ≤ y2},

which correspond to semi-infinite and rectangular regions of the (x, y) plane. [Figures: the corresponding regions.]

Page 135: Stochastic processes slides

Similarly, events of the form {X ≤ x1, y1 < Y ≤ y2} and {|X| ≤ x1, y1 < Y ≤ y2} correspond to semi-infinite and finite strips of the plane. [Figures: the corresponding regions.]

Page 136: Stochastic processes slides

MULTIPLE RANDOM VARIABLES

Page 137: Stochastic processes slides

Joint Distribution

If X and Y are two RVs, the sets {X ≤ x} and {Y ≤ y} are events with probabilities P(X ≤ x) = FX(x) and P(Y ≤ y) = FY(y). Their intersection {X ≤ x, Y ≤ y}, consisting of all outcomes ξ such that X(ξ) ≤ x and Y(ξ) ≤ y, is also an event, and

FXY(x, y) = P(X ≤ x, Y ≤ y)

is called the joint distribution of the RVs X and Y.

Page 138: Stochastic processes slides

Discrete Random Variables

For discrete RVs, P(X = x, Y = y) = f(x, y), where

f(x, y) ≥ 0 ……………(i)
Σₓ Σᵧ f(x, y) = 1 ……(ii)

Page 139: Stochastic processes slides

Let X be a RV which assumes any one of the values x1, x2, …, xm, and let Y be a RV which assumes any one of the values y1, y2, …, yn. Then the probability of the event that X = xⱼ and Y = yₖ is

P(X = xⱼ, Y = yₖ) = f(xⱼ, yₖ).

Page 140: Stochastic processes slides

X \ Y |  y1        y2       …   yn       | Total
x1    | f(x1,y1)  f(x1,y2)  …  f(x1,yn)  | f1(x1)
x2    | f(x2,y1)  f(x2,y2)  …  f(x2,yn)  | f1(x2)
x3    | f(x3,y1)  f(x3,y2)  …  f(x3,yn)  | f1(x3)
…     | …         …         …  …         | …
xm    | f(xm,y1)  f(xm,y2)  …  f(xm,yn)  | f1(xm)
Total | f2(y1)    f2(y2)    …  f2(yn)    | 1 (Grand Total)

Page 141: Stochastic processes slides

The probability that X = xⱼ is obtained by adding all entries in the row corresponding to xⱼ:

P(X = xⱼ) = f1(xⱼ) = Σ_{k=1}^{n} f(xⱼ, yₖ).

The probability that Y = yₖ is obtained by adding all entries in the column corresponding to yₖ:

P(Y = yₖ) = f2(yₖ) = Σ_{j=1}^{m} f(xⱼ, yₖ).

Page 142: Stochastic processes slides

f1(xⱼ) and f2(yₖ), or simply f1(x) and f2(y), are obtained from the margins of the table, hence they are called the marginal probability functions of X and Y.

Also Σⱼ f1(xⱼ) = 1 and Σₖ f2(yₖ) = 1, which can be written as

Σ_{j=1}^{m} Σ_{k=1}^{n} f(xⱼ, yₖ) = 1,

i.e., the total probability of all entries is 1.

The joint distribution function of X and Y is given by

F(x, y) = P(X ≤ x, Y ≤ y) = Σ_{u ≤ x} Σ_{v ≤ y} f(u, v),

which is the sum of all entries for which xⱼ ≤ x and yₖ ≤ y.

Page 143: Stochastic processes slides

Continuous Random Variables

The properties of the joint PDF for continuous RVs are

f(x, y) ≥ 0 ……………………(i)
∫∫ f(x, y) dx dy = 1 ………(ii)

Page 144: Stochastic processes slides

Joint Distribution Function for Continuous RVs

The joint distribution function of two RVs X and Y is given by

F(x, y) = P(X ≤ x, Y ≤ y) = ∫₋∞ˣ ∫₋∞ʸ f(u, v) dv du,

and the density function is

f(x, y) = ∂²F(x, y)/∂x∂y.

Page 145: Stochastic processes slides

Marginal Distribution Function

The marginal distribution function of X is given by

P(X ≤ x) = F1(x) = ∫_{u=−∞}^{x} ∫_{v=−∞}^{∞} f(u, v) dv du.

The marginal distribution function of Y is given by

P(Y ≤ y) = F2(y) = ∫_{u=−∞}^{∞} ∫_{v=−∞}^{y} f(u, v) dv du.

Page 146: Stochastic processes slides

Marginal Density Function

The marginal density function of RV X is given by

f1(x) = dF1(x)/dx = ∫_{v=−∞}^{∞} f(x, v) dv.

The marginal density function of RV Y is given by

f2(y) = dF2(y)/dy = ∫_{u=−∞}^{∞} f(u, y) du.

Page 147: Stochastic processes slides

Independence

If X and Y are two independent random variables, events that involve only X are independent of events that involve only Y.

For discrete RVs,

P(X = x, Y = y) = P(X = x)·P(Y = y), or f(x, y) = f1(x)·f2(y).

In general, n random variables X1, X2, …, Xn are independent when

P(X1 = x1, X2 = x2, …, Xn = xn) = P(X1 = x1)·P(X2 = x2)·…·P(Xn = xn).

Knowledge of the probabilities of the RVs in isolation is then sufficient to specify the probabilities of joint events.

Page 148: Stochastic processes slides

Or, If for all x and y, f(x,y) is the product of a function of x alone and a function of y alone (which are the marginal probability of X and Y), then, X and Y are independent.

If f(x, y) cannot be expressed as the product of a function of x alone and a function of y alone, then X and Y are dependent.

Page 149: Stochastic processes slides

For continuous RVs X and Y: if the events {X ≤ x} and {Y ≤ y} are independent for all x and y,

P(X ≤ x, Y ≤ y) = P(X ≤ x)·P(Y ≤ y),
F(x, y) = F1(x)·F2(y), and
f(x, y) = f1(x)·f2(y).

Page 150: Stochastic processes slides

Properties of the Joint CDF

The joint CDF is non-decreasing in the 'northeast' direction, i.e.,

FXY(x1, y1) ≤ FXY(x2, y2) if x1 ≤ x2 and y1 ≤ y2.

[Figure: points (x1, y1) and (x2, y2) in the (x, y) plane.]

Page 151: Stochastic processes slides

It is impossible for either X or Y to assume a value less than −∞, therefore

FX,Y(−∞, y) = FX,Y(x, −∞) = 0.

It is certain that X and Y will assume values less than infinity, therefore

FX,Y(∞, ∞) = 1.

If one of the variables approaches infinity while the other is kept fixed, the marginal cumulative distribution functions are obtained:

FX(x) = FX,Y(x, ∞) = P(X ≤ x, Y < ∞) = P(X ≤ x), and
FY(y) = FX,Y(∞, y) = P(X < ∞, Y ≤ y) = P(Y ≤ y).

Page 152: Stochastic processes slides

[Figure: the region {X ≤ x1, Y < ∞}, illustrating FX(x1) = P(X ≤ x1, Y < ∞).]

Page 153: Stochastic processes slides

[Figure: the region {X < ∞, Y ≤ y1}, illustrating FY(y1) = P(X < ∞, Y ≤ y1).]

Page 154: Stochastic processes slides

The joint CDF is continuous from the 'north' and from the 'east', i.e.,

lim_{x→a⁺} FX,Y(x, y) = FX,Y(a, y), and
lim_{y→b⁺} FX,Y(x, y) = FX,Y(x, b).

Page 155: Stochastic processes slides

Problem

If X and Y are two discrete RVs whose joint probability function is given by

f(x, y) = { c(2x + y) for 0 ≤ x ≤ 2, 0 ≤ y ≤ 3
            0 otherwise }

find (a) c, (b) P(X = 2, Y = 1), (c) P(X ≥ 1, Y ≤ 2).

Page 156: Stochastic processes slides

Solution: (a)

We have (entries are f(x, y)):

X \ Y |  0    1    2    3   | Total
  0   |  0    c    2c   3c  | 6c
  1   |  2c   3c   4c   5c  | 14c
  2   |  4c   5c   6c   7c  | 22c
Total |  6c   9c   12c  15c | 42c

Since 42c = 1, c = 1/42.
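A small numerical sketch (not from the slides) that builds the same table with numpy and verifies c and the probabilities of parts (b) and (c):

```python
import numpy as np

x = np.arange(3).reshape(-1, 1)     # x = 0, 1, 2 (rows)
y = np.arange(4).reshape(1, -1)     # y = 0, 1, 2, 3 (columns)
table = 2 * x + y                   # unnormalized joint pmf

c = 1 / table.sum()
f = c * table

print("c =", c)                                  # 1/42
print("P(X=2, Y=1)   =", f[2, 1])                # 5/42
print("P(X>=1, Y<=2) =", f[1:, :3].sum())        # 24/42 = 4/7
print("marginal of X =", f.sum(axis=1))          # 6c, 14c, 22c
print("marginal of Y =", f.sum(axis=0))          # 6c, 9c, 12c, 15c
```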

Page 157: Stochastic processes slides

(b) P(X = 2, Y = 1) = 5c = 5/42.

(c) P(X ≥ 1, Y ≤ 2) = Σ_{x≥1} Σ_{y≤2} f(x, y) = (2c + 3c + 4c) + (4c + 5c + 6c) = 24c = 24/42 = 4/7.

Page 158: Stochastic processes slides

Problem

Find the marginal probability functions of (a) X and (b) Y.

Page 159: Stochastic processes slides

Solution

Marginal probability function of X:

P(X = x) = f1(x) = { 6c = 1/7 for x = 0
                     14c = 1/3 for x = 1
                     22c = 11/21 for x = 2 }

Marginal probability function of Y:

P(Y = y) = f2(y) = { 6c = 1/7 for y = 0
                     9c = 3/14 for y = 1
                     12c = 2/7 for y = 2
                     15c = 5/14 for y = 3 }

Page 160: Stochastic processes slides

Problem

Show that the RVs X and Y of the previous problem are dependent.

Page 161: Stochastic processes slides

Solution

If X and Y were independent, then for all x and y

P(X = x, Y = y) = P(X = x)·P(Y = y).

But P(X = 2, Y = 1) = 5c = 5/42, while P(X = 2) = 11/21 and P(Y = 1) = 3/14, and

P(X = 2)·P(Y = 1) = (11/21)·(3/14) ≠ 5/42.

Hence X and Y are dependent.

Page 162: Stochastic processes slides

Problem

If X and Y are two continuous RVs whose joint density function is given by

f(x, y) = { cxy for 0 ≤ x ≤ 4, 1 ≤ y ≤ 5
            0 otherwise }

find (a) c, (b) P(1 ≤ X ≤ 2, 2 ≤ Y ≤ 3), (c) P(X ≥ 3, Y ≤ 2).

Page 163: Stochastic processes slides

Solution

(a) ∫∫ f(x, y) dx dy = ∫₀⁴ ∫₁⁵ cxy dy dx = c·(∫₀⁴ x dx)(∫₁⁵ y dy) = c·8·12 = 96c = 1, so c = 1/96.

(b) P(1 ≤ X ≤ 2, 2 ≤ Y ≤ 3) = ∫₁² ∫₂³ (xy/96) dy dx = (1/96)·(3/2)·(5/2) = 5/128.

(c) P(X ≥ 3, Y ≤ 2) = ∫₃⁴ ∫₁² (xy/96) dy dx = (1/96)·(7/2)·(3/2) = 7/128.
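A quick numerical check (not from the slides) of c = 1/96 and parts (b), (c) with scipy:

```python
from scipy.integrate import dblquad

f = lambda y, x: x * y / 96                          # dblquad integrates f(y, x) dy dx

total, _ = dblquad(f, 0, 4, lambda x: 1, lambda x: 5)   # x in (0,4), y in (1,5)
pb, _ = dblquad(f, 1, 2, lambda x: 2, lambda x: 3)      # P(1<=X<=2, 2<=Y<=3)
pc, _ = dblquad(f, 3, 4, lambda x: 1, lambda x: 2)      # P(X>=3, Y<=2)

print("total mass =", total)                         # 1.0
print("(b) =", pb, "vs 5/128 =", 5 / 128)
print("(c) =", pc, "vs 7/128 =", 7 / 128)
```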

Page 164: Stochastic processes slides

Problem

Find the marginal distribution functions of X and Y.

Page 165: Stochastic processes slides

Solution

Marginal distribution function of X:

F1(x) = P(X ≤ x) = ∫_{u=−∞}^{x} ∫_{v=−∞}^{∞} f(u, v) dv du.

For 0 ≤ x ≤ 4: F1(x) = ∫₀ˣ ∫₁⁵ (uv/96) dv du = (1/96)·(x²/2)·12 = x²/16.

Because F1(4) = 1, F1(x) = 1 for x > 4, and F1(x) = 0 for x < 0. So

F1(x) = { 0 for x < 0
          x²/16 for 0 ≤ x ≤ 4
          1 for x > 4 }

Page 166: Stochastic processes slides

Marginal distribution function of Y:

F2(y) = P(Y ≤ y) = ∫_{u=−∞}^{∞} ∫_{v=−∞}^{y} f(u, v) dv du.

For 1 ≤ y ≤ 5: F2(y) = ∫₀⁴ ∫₁ʸ (uv/96) dv du = (1/96)·8·(y² − 1)/2 = (y² − 1)/24.

Because F2(5) = 1, F2(y) = 1 for y > 5, and F2(y) = 0 for y < 1. So

F2(y) = { 0 for y < 1
          (y² − 1)/24 for 1 ≤ y ≤ 5
          1 for y > 5 }

Page 167: Stochastic processes slides

Change of Variables (Discrete RVs)

Theorem 1: If X is a discrete RV having probability function f(x) and another RV U is defined by U = φ(X), where to each value of X there corresponds one and only one value of U (and conversely X = ψ(U)), then the probability function of U is

g(u) = f(ψ(u)).

Proof: g(u) = P(U = u) = P(φ(X) = u) = P(X = ψ(u)) = f(ψ(u)).

Page 168: Stochastic processes slides

Theorem 2: If X and Y are discrete RVs having joint probability function f(x, y), and the RVs U and V are defined by

U = φ1(X, Y), V = φ2(X, Y),

where to each pair of values of X and Y there corresponds one and only one pair of values of U and V (and conversely X = ψ1(U, V), Y = ψ2(U, V)), then the joint probability function of U and V is given by

g(u, v) = f(ψ1(u, v), ψ2(u, v)).

Page 169: Stochastic processes slides

Proof:

g(u, v) = P(U = u, V = v) = P(φ1(X, Y) = u, φ2(X, Y) = v)
        = P(X = ψ1(u, v), Y = ψ2(u, v)) = f(ψ1(u, v), ψ2(u, v)).

Page 170: Stochastic processes slides

Change of Variables (Continuous RVs)

Theorem 1: If X is a continuous RV with probability density f(x), and another RV U is defined by U = φ(X) with inverse X = ψ(U), then the probability density of U is g(u), where

g(u)|du| = f(x)|dx|, i.e., g(u) = f(x)·|dx/du| = f(ψ(u))·|ψ′(u)|.

Page 171: Stochastic processes slides

Proof: Suppose u = φ(x) is an increasing function, i.e., as x increases, u increases. [Figure: monotone mapping of (x1, x2) onto (u1, u2).] Then

P(u1 < U < u2) = P(x1 < X < x2),
∫_{u1}^{u2} g(u) du = ∫_{x1}^{x2} f(x) dx = ∫_{u1}^{u2} f(ψ(u))·ψ′(u) du,

so g(u) = f(ψ(u))·ψ′(u), here with ψ′(u) > 0. This can likewise be proved for ψ′(u) < 0, giving g(u) = f(ψ(u))·|ψ′(u)| in general.

Page 172: Stochastic processes slides

Theorem 2: Define U = φ1(X, Y), V = φ2(X, Y), with inverse X = ψ1(U, V), Y = ψ2(U, V).

The joint density function g(u, v) of U and V is given by

g(u, v)·|du dv| = f(x, y)·|dx dy|, i.e., g(u, v) = f(ψ1(u, v), ψ2(u, v))·|J|,

where J is the Jacobian

J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v |
                      | ∂y/∂u  ∂y/∂v |

Page 173: Stochastic processes slides

Proof: If x and y increase as u and v increase,

P(u1 < U < u2, v1 < V < v2) = P(x1 < X < x2, y1 < Y < y2),
∫∫ g(u, v) du dv = ∫∫ f(x, y) dx dy = ∫∫ f(ψ1(u, v), ψ2(u, v))·J du dv,

so g(u, v) = f(ψ1(u, v), ψ2(u, v))·J, here with J > 0. This can also be proved for J < 0, giving |J| in general.

Page 174: Stochastic processes slides

Problem

If X is a discrete RV with probability function

f(x) = { 2^(−x) for x = 1, 2, 3, …
         0 otherwise }

find the probability function of the RV U = X⁴ + 1.

Page 175: Stochastic processes slides

Problem

If X is a continuous RV with density function

f(x) = { x²/81 for −3 < x < 6
         0 otherwise }

find the probability density function of U = (1/3)(12 − X).

Page 176: Stochastic processes slides

Solution

u = (12 − x)/3 gives x = 12 − 3u and dx/du = −3; x = −3 corresponds to u = 5 and x = 6 to u = 2. So

g(u) = f(x)·|dx/du| = ((12 − 3u)²/81)·3 = (12 − 3u)²/27, 2 < u < 5,
and g(u) = 0 otherwise.

Check: ∫₂⁵ (12 − 3u)²/27 du = 1.

Page 177: Stochastic processes slides

Solution (discrete problem)

u = x⁴ + 1 gives x = (u − 1)^(1/4), where u = 2, 17, 82, … for x = 1, 2, 3, …

g(u) = { 2^(−(u−1)^(1/4)) for u = 2, 17, 82, …
         0 otherwise }

Page 178: Stochastic processes slides

Problem

If X and Y are two continuous RVs whose joint density function is given by

f(x, y) = { xy/96 for 0 < x < 4, 1 < y < 5
            0 otherwise }

find the density function of U = X + 2Y.

Page 179: Stochastic processes slides

Solution

Let u = x + 2y and v = x (v chosen arbitrarily); then x = v, y = (u − v)/2.

The Jacobian is

J = | ∂x/∂u  ∂x/∂v | = | 0    1   | = −1/2, so |J| = 1/2.
    | ∂y/∂u  ∂y/∂v |   | 1/2  −1/2|

The range 0 < x < 4, 1 < y < 5 becomes 0 < v < 4, 2 < u − v < 10. [Figure: the region in the (u, v) plane bounded by v = 0, v = 4, u − v = 2 and u − v = 10, split into parts I, II, III.]

g(u, v) = f(x, y)·|J| = v(u − v)/384 for 2 < u − v < 10, 0 < v < 4, and 0 otherwise.

Page 180: Stochastic processes slides

The marginal density function of U is given by g(u) = ∫ g(u, v) dv:

g(u) = { ∫₀^(u−2) v(u − v)/384 dv = (u − 2)²(u + 4)/2304 for 2 ≤ u < 6
         ∫₀⁴ v(u − v)/384 dv = (3u − 8)/144 for 6 ≤ u < 10
         ∫_(u−10)^4 v(u − v)/384 dv = (348u − u³ − 2128)/2304 for 10 ≤ u < 14
         0 otherwise }
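A consistency check (not from the slides) that this piecewise density integrates to one, plus a Monte Carlo cross-check. The sampling step uses my own observation that f(x, y) = xy/96 factors into the independent densities x/8 on (0, 4) and y/12 on (1, 5):

```python
import numpy as np
from scipy.integrate import quad

def g(u):
    # piecewise density of U = X + 2Y derived above
    if 2 <= u < 6:
        return (u - 2) ** 2 * (u + 4) / 2304
    if 6 <= u < 10:
        return (3 * u - 8) / 144
    if 10 <= u < 14:
        return (348 * u - u ** 3 - 2128) / 2304
    return 0.0

total = sum(quad(g, a, b)[0] for a, b in [(2, 6), (6, 10), (10, 14)])
print("integral of g =", total)                      # 1.0

rng = np.random.default_rng(2)
n = 500_000
x = 4 * np.sqrt(rng.random(n))                       # inverse CDF of F_X(x) = x^2/16
y = np.sqrt(1 + 24 * rng.random(n))                  # inverse CDF of F_Y(y) = (y^2-1)/24
u = x + 2 * y
model = sum(quad(g, a, b)[0] for a, b in [(2, 6), (6, 8)])
print("P(U <= 8): model =", model, " Monte Carlo =", np.mean(u <= 8))
```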

Page 181: Stochastic processes slides

Expected Value of a Function of Random Variables

The expected value of Z = g(X, Y) can be obtained by

E(Z) = ∫∫ g(x, y)·fX,Y(x, y) dx dy if X and Y are jointly continuous,
E(Z) = Σᵢ Σₙ g(xᵢ, yₙ)·pX,Y(xᵢ, yₙ) if X and Y are discrete RVs.

Page 182: Stochastic processes slides

Sum of Random Variables

Let Z = X + Y. Find E(Z).

E(Z) = E(X + Y) = ∫∫ (x′ + y′)·fX,Y(x′, y′) dx′ dy′
     = ∫∫ x′·fX,Y(x′, y′) dx′ dy′ + ∫∫ y′·fX,Y(x′, y′) dx′ dy′
     = ∫ x′·fX(x′) dx′ + ∫ y′·fY(y′) dy′
     = E(X) + E(Y).

The expected value of the sum of two random variables is equal to the sum of the individual expected values; X and Y need not be independent.

Page 183: Stochastic processes slides

The expected value of the sum of n random variables is equal to the sum of the expected values:

E(X1 + X2 + … + Xn) = E(X1) + E(X2) + … + E(Xn).

Page 184: Stochastic processes slides

Sum of Discrete RVs

If X and Y are two discrete RVs,

E(X + Y) = Σₓ Σᵧ (x + y)·f(x, y) = Σₓ Σᵧ x·f(x, y) + Σₓ Σᵧ y·f(x, y) = E(X) + E(Y).

Page 185: Stochastic processes slides

Expected Value of the Product of Two RVs

If X and Y are two independent RVs, E(XY) = E(X)·E(Y).

Proof:
E(XY) = ∫∫ x′y′·fX,Y(x′, y′) dx′ dy′ = ∫∫ x′y′·fX(x′)·fY(y′) dx′ dy′
      = (∫ x′·fX(x′) dx′)(∫ y′·fY(y′) dy′) = E(X)·E(Y).

Page 186: Stochastic processes slides

Product of Two Discrete RVs

If X and Y are two independent discrete RVs, f(x, y) = f1(x)·f2(y), and

E(XY) = Σₓ Σᵧ xy·f(x, y) = Σₓ Σᵧ xy·f1(x)·f2(y)
      = (Σₓ x·f1(x))(Σᵧ y·f2(y)) = E(X)·E(Y).

Page 187: Stochastic processes slides

Product of Functions of Random Variables

If X and Y are two independent RVs and g(X, Y) = g1(X)·g2(Y), find E[g(X, Y)].

Solution:
E[g1(X)·g2(Y)] = ∫∫ g1(x′)·g2(y′)·fX(x′)·fY(y′) dx′ dy′
              = (∫ g1(x′)·fX(x′) dx′)(∫ g2(y′)·fY(y′) dy′)
              = E[g1(X)]·E[g2(Y)].

Page 188: Stochastic processes slides

In general, for n independent RVs X1, X2, …, Xn,

E[g1(X1)·g2(X2)·…·gn(Xn)] = E[g1(X1)]·E[g2(X2)]·…·E[gn(Xn)].

Page 189: Stochastic processes slides

Conditional Probability Distribution

For P(A) > 0, the probability of event B given that A has occurred is

P(B|A) = P(A∩B)/P(A), or P(A∩B) = P(B|A)·P(A).

If X and Y are discrete RVs with events (A: X = x) and (B: Y = y),

f(y|x) = f(x, y)/f1(x), where f1(x) is the marginal probability of X.

Page 190: Stochastic processes slides

The conditional probability function of Y given X is

f(y|x) = f(x, y)/f1(x),

and the conditional probability function of X given Y is

f(x|y) = f(x, y)/f2(y).

Page 191: Stochastic processes slides

Conditional Probability Distribution for Continuous RVs

If X and Y are two continuous RVs, the conditional probability density function of Y given X is

f(y|x) = f(x, y)/f1(x).

Page 192: Stochastic processes slides

Problem

The joint density function of two continuous RVs X and Y is given by

f(x, y) = { 3/4 + xy for 0 < x < 1, 0 < y < 1
            0 otherwise }

Find (a) f(y|x), and (b) P(Y > 1/2 | 1/2 < X ≤ 1/2 + dx).

Page 193: Stochastic processes slides

Solution

(a) For 0 < x < 1, the marginal density of X is

f1(x) = ∫₀¹ (3/4 + xy) dy = 3/4 + x/2,

so

f(y|x) = f(x, y)/f1(x) = (3/4 + xy)/(3/4 + x/2) for 0 < y < 1 (and 0 otherwise);
for other values of x it is not defined.

(b) P(Y > 1/2 | 1/2 < X ≤ 1/2 + dx) = ∫_{1/2}^{1} f(y | 1/2) dy = ∫_{1/2}^{1} (3/4 + y/2) dy = 9/16.

Page 194: Stochastic processes slides

Variance of the Sum of Two RVs

For two independent RVs X and Y, Var(X + Y) = Var(X) + Var(Y).

Proof:
Var(X + Y) = E[((X + Y) − (μX + μY))²]
           = E[((X − μX) + (Y − μY))²]
           = E[(X − μX)²] + 2E[(X − μX)(Y − μY)] + E[(Y − μY)²]
           = Var(X) + Var(Y),

since E[(X − μX)(Y − μY)] = E(X − μX)·E(Y − μY) = 0 for independent X and Y.

Page 195: Stochastic processes slides

Var(X − Y) = Var(X) + Var(Y)

Proof:
Var(X − Y) = E[((X − Y) − (μX − μY))²]
           = E[((X − μX) − (Y − μY))²]
           = E[(X − μX)²] − 2E[(X − μX)(Y − μY)] + E[(Y − μY)²]
           = Var(X) + Var(Y) for independent X and Y.

Page 196: Stochastic processes slides

Variance of the Sum of Two RVs

In general,

Var(X ± Y) = Var(X) + Var(Y) ± 2·Cov(X, Y), or σ²_{X±Y} = σ²_X + σ²_Y ± 2σ_{XY}.

Page 197: Stochastic processes slides

Correlation and Covariance of Two RVs

The jkth joint moment of two RVs X and Y is defined as

E[XʲYᵏ] = ∫∫ xʲyᵏ·fX,Y(x, y) dx dy for jointly continuous X and Y,
E[XʲYᵏ] = Σᵢ Σₙ xᵢʲyₙᵏ·pX,Y(xᵢ, yₙ) for discrete X and Y.

If j = 0, the moments of Y are obtained; if k = 0, the moments of X are obtained.

If j = k = 1, E[XY] gives the correlation of X and Y.

If E[XY] = 0, X and Y are orthogonal.

Page 198: Stochastic processes slides

jkth Central Moment about the Mean

The jkth central moment of X and Y about their means is given by

E[(X − μX)ʲ(Y − μY)ᵏ].

If j = 2, k = 0, the variance of X is obtained; if j = 0, k = 2, the variance of Y is obtained.

If j = k = 1, the covariance of X and Y is obtained:

Cov(X, Y) = E[(X − μX)(Y − μY)].

Page 199: Stochastic processes slides

Covariance of Two RVs

Cov(X, Y) = E[(X − μX)(Y − μY)]
          = E[XY − μY·X − μX·Y + μX·μY]
          = E[XY] − μX·μY
          = E[XY] − E[X]·E[Y].

Hence Cov(X, Y) = E[XY] if either of the RVs has mean value equal to zero.

Page 200: Stochastic processes slides

Problem

If X and Y are two independent RVs, each having density function

f(u) = { 2e^(−2u) for u ≥ 0
         0 otherwise }

find E[X + Y], E[X² + Y²], and E[XY].

Page 201: Stochastic processes slides

Solution

E[X + Y] = E[X] + E[Y] = ∫₀^∞ x·2e^(−2x) dx + ∫₀^∞ y·2e^(−2y) dy = 1/2 + 1/2 = 1.

E[X² + Y²] = E[X²] + E[Y²] = ∫₀^∞ x²·2e^(−2x) dx + ∫₀^∞ y²·2e^(−2y) dy = 1/2 + 1/2 = 1.

E[XY] = ∫₀^∞∫₀^∞ xy·2e^(−2x)·2e^(−2y) dx dy = (1/2)·(1/2) = 1/4.

Check: E[X + Y] = E[X] + E[Y], and E[XY] = E[X]·E[Y].

Page 202: Stochastic processes slides

Problem

If X and Y are two discrete independent RVs with

X = 1 with prob. 1/3, X = 0 with prob. 2/3,
Y = 2 with prob. 3/4, Y = 3 with prob. 1/4,

find E[3X + 2Y], E[XY], E[X²], and E[Y²].

Page 203: Stochastic processes slides

Solution

E[X] = 1·(1/3) + 0·(2/3) = 1/3,
E[Y] = 2·(3/4) + 3·(1/4) = 9/4,

E[3X + 2Y] = 3E[X] + 2E[Y] = 1 + 9/2 = 11/2.

Page 204: Stochastic processes slides

By independence,

E[XY] = E[X]·E[Y] = (1/3)·(9/4) = 3/4,

and expectations of products of functions of X and Y factor in the same way.

Page 205: Stochastic processes slides

E[X²] = 1²·(1/3) + 0²·(2/3) = 1/3,
E[Y²] = 2²·(3/4) + 3²·(1/4) = 3 + 9/4 = 21/4,

and hence, for example, E[Y² − 2X²] = E[Y²] − 2E[X²] = 21/4 − 2/3 = 55/12.

Page 206: Stochastic processes slides

Problem

If X and Y are two independent RVs, find the covariance.

Solution:

Cov(X, Y) = E[(X − μX)(Y − μY)] = E[X − μX]·E[Y − μY] = 0.

For pairs of independent RVs, the covariance is zero.

Page 207: Stochastic processes slides

Theorems on Covariance

Theorem 1: σXY = Cov(X, Y) = E[XY] − E[X]·E[Y].

Theorem 2: If X and Y are independent RVs, σXY = Cov(X, Y) = 0.

Theorem 3: Var(X ± Y) = Var(X) + Var(Y) ± 2·Cov(X, Y), or σ²_{X±Y} = σ²_X + σ²_Y ± 2σ_{XY}.

Theorem 4: Var(X ± Y) = Var(X) + Var(Y).

If X and Y are independent, Theorem 3 reduces to Theorem 4.

The converse of Theorem 2 is not necessarily true.

Page 208: Stochastic processes slides

Correlation Coefficient of X and Y

The correlation coefficient between the two RVs X and Y is given by

ρX,Y = Cov(X, Y)/(σX·σY) = (E[XY] − E[X]E[Y])/(σX·σY),

where σX = √Var(X) and σY = √Var(Y) are the standard deviations, and −1 ≤ ρX,Y ≤ 1.

X and Y are uncorrelated if ρX,Y = 0.

If X and Y are independent, Cov(X, Y) = 0 and ρX,Y = 0; i.e., independent RVs are uncorrelated (from Theorem 2).

Page 209: Stochastic processes slides

Problem

If X and Y are two discrete RVs whose joint probability function is given by

f(x, y) = { c(2x + y) for 0 ≤ x ≤ 2, 0 ≤ y ≤ 3
            0 otherwise }

find E[X], E[Y], E[XY], E[X²], E[Y²], Var(X), Var(Y), Cov(X, Y) and the correlation coefficient.

Page 210: Stochastic processes slides

Solution

We have (c = 1/42; entries are f(x, y)):

X \ Y |  0    1    2    3   | Total
  0   |  0    c    2c   3c  | 6c
  1   |  2c   3c   4c   5c  | 14c
  2   |  4c   5c   6c   7c  | 22c
Total |  6c   9c   12c  15c | 42c

Page 211: Stochastic processes slides

E[X] = Σₓ x·f1(x) = 0·(6c) + 1·(14c) + 2·(22c) = 58c = 58/42 = 29/21.

E[Y] = Σᵧ y·f2(y) = 0·(6c) + 1·(9c) + 2·(12c) + 3·(15c) = 78c = 78/42 = 13/7.

E[XY] = Σₓ Σᵧ xy·f(x, y) = 1·1·(3c) + 1·2·(4c) + 1·3·(5c) + 2·1·(5c) + 2·2·(6c) + 2·3·(7c) = 102c = 102/42 = 17/7.

Page 212: Stochastic processes slides

E[X²] = 0·(6c) + 1·(14c) + 4·(22c) = 102c = 17/7.

E[Y²] = 0·(6c) + 1·(9c) + 4·(12c) + 9·(15c) = 192c = 32/7.

Var(X) = E[X²] − (E[X])² = 102c − (58c)² = 17/7 − (29/21)² = 230/441.

Var(Y) = E[Y²] − (E[Y])² = 192c − (78c)² = 32/7 − (13/7)² = 55/49.

Cov(X, Y) = E[XY] − E[X]·E[Y] = 102c − (58c)(78c) = 17/7 − (29/21)(13/7) = −20/147.

ρX,Y = Cov(X, Y)/(σX·σY) = (−20/147)/√((230/441)·(55/49)) ≈ −0.178.
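A short numerical sketch (not from the slides) that recomputes these quantities directly from the table:

```python
import numpy as np

x = np.arange(3).reshape(-1, 1)
y = np.arange(4).reshape(1, -1)
f = (2 * x + y) / 42.0                     # joint pmf with c = 1/42

EX, EY = (x * f).sum(), (y * f).sum()
EXY = (x * y * f).sum()
EX2, EY2 = (x**2 * f).sum(), (y**2 * f).sum()
varX, varY = EX2 - EX**2, EY2 - EY**2
cov = EXY - EX * EY
rho = cov / np.sqrt(varX * varY)

print(EX, EY, EXY)             # 29/21, 13/7, 17/7
print(varX, varY, cov, rho)    # 230/441, 55/49, -20/147, ≈ -0.178
```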

Page 213: Stochastic processes slides

Problem

If X and Y are two continuous RVs with joint density

f(x, y) = { c(2x + y) for 2 < x < 6, 0 < y < 5
            0 otherwise }

find (a) c, (b) E[X], (c) E[Y], (d) E[XY], (e) E[Y²], (f) Var(X), (g) Var(Y), (h) Cov(X, Y), (i) the correlation coefficient.

Page 214: Stochastic processes slides

Solution

Using ∫∫ f(x, y) dx dy = 1: ∫₂⁶ ∫₀⁵ c(2x + y) dy dx = 210c = 1, so c = 1/210.

E[X] = (1/210) ∫₂⁶ ∫₀⁵ x(2x + y) dy dx = 268/63.

E[Y] = (1/210) ∫₂⁶ ∫₀⁵ y(2x + y) dy dx = 170/63.

E[XY] = (1/210) ∫₂⁶ ∫₀⁵ xy(2x + y) dy dx = 80/7.

Page 215: Stochastic processes slides

E[X²] = (1/210) ∫₂⁶ ∫₀⁵ x²(2x + y) dy dx = 1220/63.

E[Y²] = (1/210) ∫₂⁶ ∫₀⁵ y²(2x + y) dy dx = 1175/126.

Var(X) = E[X²] − (E[X])² = 5036/3969.

Var(Y) = E[Y²] − (E[Y])² = 16225/7938.

Cov(X, Y) = E[XY] − E[X]·E[Y] = −200/3969.

ρX,Y = Cov(X, Y)/(σX·σY) ≈ −0.03129.
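A quick verification of these moments with scipy (a sketch, not from the slides):

```python
import numpy as np
from scipy.integrate import dblquad

def moment(g):
    # integrate g(x, y) * f(x, y) over 2 < x < 6, 0 < y < 5; dblquad wants func(y, x)
    val, _ = dblquad(lambda y, x: g(x, y) * (2 * x + y) / 210,
                     2, 6, lambda x: 0, lambda x: 5)
    return val

EX, EY = moment(lambda x, y: x), moment(lambda x, y: y)
EXY = moment(lambda x, y: x * y)
EX2, EY2 = moment(lambda x, y: x**2), moment(lambda x, y: y**2)
cov = EXY - EX * EY
rho = cov / np.sqrt((EX2 - EX**2) * (EY2 - EY**2))

print(EX, EY, EXY)    # ≈ 268/63, 170/63, 80/7
print(cov, rho)       # ≈ -200/3969, ρ ≈ -0.0313
```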

Page 216: Stochastic processes slides

Problem

Let Θ be a uniformly distributed RV in the interval (0, 2π), and let X and Y be two RVs defined as X = cos Θ, Y = sin Θ. Show that X and Y are uncorrelated.

Page 217: Stochastic processes slides

Solution

The marginal PDFs of X and Y are arcsine densities, which are nonzero in the intervals −1 ≤ x ≤ 1 and −1 ≤ y ≤ 1. So if X and Y were independent, the point (X, Y) would assume all values in the square [−1, 1] × [−1, 1]. But (cos Θ, sin Θ) always lies on the unit circle, so this is not the case; hence X and Y are dependent.

[Figure: the point (cos Θ, sin Θ) on the unit circle inside the square [−1, 1]².]

Page 218: Stochastic processes slides

E[XY] = E[cos Θ · sin Θ] = (1/2π) ∫₀^(2π) cos θ · sin θ dθ = (1/4π) ∫₀^(2π) sin 2θ dθ = 0.

Since E[X] = E[Y] = 0, Cov(X, Y) = E[XY] = 0, so X and Y are uncorrelated.

Uncorrelated but dependent random variables.

Page 219: Stochastic processes slides

Conditional Expectation, Variance and Moments

The conditional expectation of a RV Y given X = x is

E[Y | X = x] = ∫₋∞^∞ y·f(y|x) dy, where X = x means x < X ≤ x + dx.

Properties:
1) If X and Y are independent, E[Y | X = x] = E[Y].
2) E[Y] = ∫ E[Y | X = x]·f1(x) dx = E[E[Y|X]].

Page 220: Stochastic processes slides

Theorem: E[Y] = E[E[Y|X]].

Proof:
RHS = E[E[Y|X]] = ∫ E[Y | X = x]·fX(x) dx
    = ∫ (∫ y·f(y|x) dy)·fX(x) dx
    = ∫∫ y·(fXY(x, y)/fX(x))·fX(x) dy dx
    = ∫∫ y·fXY(x, y) dx dy
    = ∫ y·fY(y) dy = E[Y].

Page 221: Stochastic processes slides

Hence,

E[h(Y)] = E[E[h(Y)|X]], where h(Y) is a function of Y, and in particular

E[Yᵏ] = E[E[Yᵏ|X]], where E[Yᵏ] is the kth moment about the origin.

Page 222: Stochastic processes slides

Problem

If X and Y are two RVs whose joint probability function is given by

f(x, y) = { c(2x + y) for 0 ≤ x ≤ 2, 0 ≤ y ≤ 3
            0 otherwise }

find E[Y | X = 2].

Page 223: Stochastic processes slides

Solution

The marginal probability function of X is

f1(x) = { 6c for x = 0
          14c for x = 1
          22c for x = 2 }

and the marginal probability function of Y is

f2(y) = { 6c for y = 0
          9c for y = 1
          12c for y = 2
          15c for y = 3 }

Page 224: Stochastic processes slides

The conditional probability function of Y given X = 2 is obtained as

f(y | X = 2) = f(2, y)/f1(2) = c(4 + y)/(22c) = (4 + y)/22, y = 0, 1, 2, 3.

E[Y | X = 2] = Σᵧ y·(4 + y)/22 = 0·(4/22) + 1·(5/22) + 2·(6/22) + 3·(7/22) = 38/22 = 19/11.

Page 225: Stochastic processes slides

Problem

The average time of travel from city A to city B is c hours by car and b hours by bus. A man cannot decide whether to drive the car or to take the bus, so he tosses a coin. What is the expected time of travel?

Page 226: Stochastic processes slides

Solution

X is the RV which is the outcome of the toss; Y is the travel time:

Y = { Ycar if X = 0
      Ybus if X = 1 }

Using Property 1 (the RVs Ycar and Ybus are independent of X):

E[Y | X = 0] = E[Ycar | X = 0] = E[Ycar] = c,
E[Y | X = 1] = E[Ybus | X = 1] = E[Ybus] = b.

Using Property 2 (for discrete RVs):

E[Y] = E[Y | X = 0]·P(X = 0) + E[Y | X = 1]·P(X = 1) = (c + b)/2.

Page 227: Stochastic processes slides

Conditional Variance

The conditional variance of Y given X is defined as

E[(Y − μY|x)² | X = x] = ∫ (y − μY|x)²·f(y|x) dy, where μY|x = E[Y | X = x].

The rth conditional moment of Y about any value a is given by

E[(Y − a)ʳ | X = x] = ∫ (y − a)ʳ·f(y|x) dy.

The usual theorems for variance and moments extend to conditional variance and moments.

Page 228: Stochastic processes slides

Gaussian PDF and CDF

Show that if X is a Gaussian distributed RV, its CDF can be expressed in terms of the standard Gaussian CDF.

PDF of X: f(x) = (1/√(2πσ²))·e^(−(x−m)²/(2σ²)).

CDF: P(X ≤ x) = (1/√(2πσ²)) ∫₋∞ˣ e^(−(t−m)²/(2σ²)) dt.

Let t′ = (t − m)/σ. Then

FX(x) = (1/√(2π)) ∫₋∞^((x−m)/σ) e^(−t′²/2) dt′ = Φ((x − m)/σ),

where Φ(x) = (1/√(2π)) ∫₋∞ˣ e^(−t²/2) dt is the CDF of a Gaussian RV with zero mean and unit variance.

Page 229: Stochastic processes slides

Gaussian PDF

Show that the Gaussian PDF integrates to one.

Solution: Take the square of the integral of the Gaussian PDF:

I = (1/√(2πσ²)) ∫₋∞^∞ e^(−x²/(2σ²)) dx

I² = (1/(2πσ²)) ∫₋∞^∞∫₋∞^∞ e^(−(x²+y²)/(2σ²)) dx dy.

Let x = r·cosθ, y = r·sinθ:

I² = (1/(2πσ²)) ∫₀^(2π)∫₀^∞ e^(−r²/(2σ²)) r dr dθ = (1/σ²) ∫₀^∞ r·e^(−r²/(2σ²)) dr = 1,

so I = 1.

Page 230: Stochastic processes slides

Joint Characteristic Function

The joint characteristic function of n RVs X1, X2, …, Xn is given by

φ_{X1,…,Xn}(ω1, …, ωn) = E[e^(j(ω1X1 + ω2X2 + … + ωnXn))].

The joint characteristic function of two RVs X and Y is

φ_{X,Y}(ω1, ω2) = E[e^(j(ω1X + ω2Y))] = ∫∫ f(x, y)·e^(j(ω1x + ω2y)) dx dy,

i.e., the 2-dimensional Fourier transform of the joint PDF of X and Y, with inverse

f_{X,Y}(x, y) = (1/4π²) ∫∫ φ_{X,Y}(ω1, ω2)·e^(−j(ω1x + ω2y)) dω1 dω2.

Page 231: Stochastic processes slides

Marginal Characteristic Function

φX(ω) = φ_{X,Y}(ω, 0), φY(ω) = φ_{X,Y}(0, ω).

If X and Y are independent,

φ_{X,Y}(ω1, ω2) = E[e^(jω1X)·e^(jω2Y)] = E[e^(jω1X)]·E[e^(jω2Y)] = φX(ω1)·φY(ω2).

Page 232: Stochastic processes slides

Problem

If Z = aX + bY, find the characteristic function of Z.

Solution:

φZ(ω) = E[e^(jωZ)] = E[e^(jω(aX + bY))] = φ_{X,Y}(aω, bω).

If X and Y are independent, φZ(ω) = φX(aω)·φY(bω).

Page 233: Stochastic processes slides

Expanding the joint characteristic function,

φ_{X,Y}(ω1, ω2) = E[e^(jω1X)·e^(jω2Y)] = Σᵢ Σₖ E[XⁱYᵏ]·(jω1)ⁱ(jω2)ᵏ / (i!·k!),

and the joint moments are obtained as

E[XⁱYᵏ] = (1/jⁱ⁺ᵏ)·∂ⁱ⁺ᵏ/∂ω1ⁱ∂ω2ᵏ φ_{X,Y}(ω1, ω2) |_{ω1=0, ω2=0}.

Page 234: Stochastic processes slides

Problem

If U and V are independent zero-mean, unit-variance Gaussian RVs, and X = U + V and Y = 2U + V, find the joint characteristic function of X and Y, and find E[XY].

Page 235: Stochastic processes slides

Solution

The joint characteristic function of X and Y is

φ_{XY}(ω1, ω2) = E[e^(j(ω1X + ω2Y))] = E[e^(j(ω1(U+V) + ω2(2U+V)))] = E[e^(j((ω1+2ω2)U + (ω1+ω2)V))].

Since U and V are independent zero-mean, unit-variance Gaussian RVs (using φX(ω) = e^(jmω − σ²ω²/2), here φU(ω) = φV(ω) = e^(−ω²/2)),

φ_{XY}(ω1, ω2) = e^(−(ω1+2ω2)²/2)·e^(−(ω1+ω2)²/2).

E[XY] = (1/j²)·∂²/∂ω1∂ω2 φ_{XY}(ω1, ω2) |_{ω1=0, ω2=0} = 3.

Page 236: Stochastic processes slides

Jointly Gaussian RVs

The RVs X and Y are said to be jointly Gaussian if their joint PDF has the form

f_{X,Y}(x, y) = 1/(2πσXσY√(1−ρ²)) ·
  exp{ −1/(2(1−ρ²)) · [ ((x−mX)/σX)² − 2ρ·((x−mX)/σX)·((y−mY)/σY) + ((y−mY)/σY)² ] }

for −∞ < x < ∞ and −∞ < y < ∞.

Page 237: Stochastic processes slides

Problem

The PDF of two jointly Gaussian RVs X and Y is given explicitly in the form f_{X,Y}(x, y) = K·e^(−Q(x, y)), where Q(x, y) is a specific quadratic form in x and y given on the slide. Find E[X], E[Y], Var(X), Var(Y) and Cov(X, Y) by comparing with the standard jointly Gaussian form above.

Page 238: Stochastic processes slides

Problem

The joint PDF of X and Y is given by

f_{X,Y}(x, y) = (1/(2π√(1−ρ²)))·e^(−(x² − 2ρxy + y²)/(2(1−ρ²))), −∞ < x, y < ∞.

Find the marginal PDFs.

Page 239: Stochastic processes slides

Solution

The marginal PDF of X is obtained by integrating f(x, y) over y:

fX(x) = (1/(2π√(1−ρ²))) ∫₋∞^∞ e^(−(x² − 2ρxy + y²)/(2(1−ρ²))) dy.

Completing the square, x² − 2ρxy + y² = (y − ρx)² + (1 − ρ²)x², so

fX(x) = (e^(−x²/2)/√(2π)) ∫₋∞^∞ (1/√(2π(1−ρ²)))·e^(−(y − ρx)²/(2(1−ρ²))) dy = e^(−x²/2)/√(2π).

Page 240: Stochastic processes slides

Hence the last integral equals one, because its integrand is a Gaussian PDF with mean ρx and variance 1 − ρ².

The marginal PDF of X is therefore a one-dimensional Gaussian PDF with mean 0 and variance 1. By symmetry, the marginal PDF of Y is also a Gaussian PDF with mean 0 and variance 1.

Page 241: Stochastic processes slides

N jointly Gaussian RV
The RVs X1, X2, ..., Xn are said to be jointly Gaussian if their joint PDF is given by

f_X(x) = f_{X1,...,Xn}(x1, ..., xn) = [1/((2π)^{n/2} |K|^{1/2})] exp{ −(1/2)(x − m)^T K^{−1} (x − m) }

where X and m are column vectors defined by

X = [X1, X2, ..., Xn]^T,  m = [E[X1], E[X2], ..., E[Xn]]^T

and

K = [ Var(X1)      Cov(X1,X2)  ...  Cov(X1,Xn)
      Cov(X2,X1)   Var(X2)     ...  Cov(X2,Xn)
      ...
      Cov(Xn,X1)   Cov(Xn,X2)  ...  Var(Xn) ]

Where, K is called the Covariance Matrix
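A sketch of this formula for n = 2 (assumes NumPy and SciPy; the mean vector m and covariance matrix K below are example values, not from the slides):

import numpy as np
from scipy.stats import multivariate_normal

m = np.array([1.0, -2.0])
K = np.array([[2.0, 0.5],
              [0.5, 1.0]])        # symmetric positive definite covariance matrix
x = np.array([0.5, -1.0])

d = x - m
quad = d @ np.linalg.inv(K) @ d   # (x - m)^T K^{-1} (x - m)
pdf_formula = np.exp(-0.5 * quad) / ((2 * np.pi) ** (len(m) / 2) * np.sqrt(np.linalg.det(K)))

print(pdf_formula, multivariate_normal(mean=m, cov=K).pdf(x))   # the two agree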

Page 242: Stochastic processes slides

Module 4

Page 243: Stochastic processes slides
Page 244: Stochastic processes slides

Module 5

Page 245: Stochastic processes slides

Stochastic Processes
The outcome of a random experiment is a function of time or space.
Examples:
Speech recognition system (based on voltage waveforms)
Image processing system (intensity of pixels is a function of space)
Queuing system (number of customers varies as a function of time)
Electricity demand varies as a function of temperature

Page 246: Stochastic processes slides

In electrical engineering, voltage or current is used for collecting, transmitting and processing information, as well as for controlling and providing power to a variety of devices.

Page 247: Stochastic processes slides

Signal
These are functions of time and belong to two classes:
Deterministic: described by functions in the mathematical sense, with time 't' as the independent variable.
Random signal

Page 248: Stochastic processes slides

Random signal
This always has some element of uncertainty associated with it, and hence it is not possible to determine its exact value at a given point in time.
Example: an audio waveform transmitted over a telephone channel. That is, we cannot precisely specify the value of a random signal in advance. However, we may describe its average properties, such as average power, spectral distribution, and the probability that the signal amplitude exceeds a certain value.

Page 249: Stochastic processes slides

The probabilistic model used for characterizing a random signal is called random process or stochastic process.

This deals with time varying waveforms that have some element of chance or randomness associated with them.

Example: data communication system in which a number of terminals are sending information in binary format over noisy transmission links to a central computer.

Page 250: Stochastic processes slides

Data Communication System

Page 251: Stochastic processes slides

Transmitted and Received Sequence

Page 252: Stochastic processes slides

Observation
By observing the waveform x1(t) over [t1, t2], we cannot predict with certainty its value for any t outside [t1, t2].
The knowledge of one member function xi(t) will not enable us to know the value of another member function xj(t).

Page 253: Stochastic processes slides

We should use a probabilistic model to describe or characterize the ensemble of waveforms, so that we can answer:
a) What are the spectral properties of the ensemble of waveforms?
b) How does the noise affect the system performance, as measured by the receiver's ability to recover the transmitted data correctly?
c) What is the optimum processing algorithm the receiver should use?

Page 254: Stochastic processes slides

Example
Toss N coins simultaneously, and repeat the N tossings once every T seconds. Draw the waveforms.

Page 255: Stochastic processes slides

Random Variable vs Random Process
A random variable maps the outcome of a random experiment to a set of real numbers; similarly, a random process maps the outcome of a random experiment to a set of waveforms or functions of time.

Page 256: Stochastic processes slides
Page 257: Stochastic processes slides

Suppose there is a large number of people, each flipping a fair coin every minute. If we assign the value 1 to a head and the value 0 to a tail, each person generates a member function of a random process.

Page 258: Stochastic processes slides

Tossing of coins

Page 259: Stochastic processes slides

Example: Tossing of a die
For the toss of a die, define X(t, ξ_i) = x_i(t), with

x1(t) = −4,  x2(t) = 2,  x3(t) = −2,  x4(t) = 4,  x5(t) = t/2,  x6(t) = −t/2

The set of waveforms {x1(t), x2(t), ..., x6(t)} is called an ensemble.

Page 260: Stochastic processes slides

Ensemble of waveforms: Tossing of Die

Page 261: Stochastic processes slides

For a specific value of time t0, X(t0, ξ) is the collection of numerical values of the various member functions at t = t0, where t is time and ξ represents an outcome in the sample space S.

Page 262: Stochastic processes slides

1. X(t, ξ), with t and ξ variable: a collection of functions of time, {x_i(t) | ξ_i ∈ S}.
2. X(t, ξ_i), with ξ fixed: x_i(t), a deterministic function of time.
3. X(t0, ξ), with t fixed: the collection of numerical values of the member functions at t = t0, {x_1(t0), x_2(t0), ..., x_n(t0)}, i.e., a random variable.
4. X(t0, ξ_i), with both fixed: the numerical value of the i-th member function at t = t0.

Page 263: Stochastic processes slides

Problem
For the die-tossing process, find
P[X(4) = 2],  P[X(4) ≤ 0],  P[X(4) = 2, X(0) = 0],  and  P[X(4) = 2 | X(0) = 0]

Page 264: Stochastic processes slides

Solution
Let A be the set of outcomes ξ_i such that X(4, ξ_i) = 2.
(a) A = {2, 5}, so P[X(4) = 2] = P(A) = 2/6 = 1/3.
(b) P[X(4) ≤ 0] = 3/6 = 1/2 (outcomes {1, 3, 6}).
(c) Let B = {ξ: X(4, ξ) = 2 and X(0, ξ) = 0} = {5}, so P[X(4) = 2, X(0) = 0] = P(B) = 1/6.
(d) P[X(4) = 2 | X(0) = 0] = P[X(4) = 2, X(0) = 0] / P[X(0) = 0] = (1/6)/(2/6) = 1/2.

Page 265: Stochastic processes slides

Classification of Random Process
X(t) continuous, t continuous: Continuous Random Process
X(t) continuous, t discrete: Continuous Random Sequence
X(t) discrete, t continuous: Discrete Random Process
X(t) discrete, t discrete: Discrete Random Sequence

Page 266: Stochastic processes slides

Types of Stochastic Process

Page 267: Stochastic processes slides

Classification of Random Process
Stationary Random Process: the probability distribution functions or averages do not depend on time 't'.
Non-stationary Random Process: the probability distribution functions or averages depend on time 't'.

Page 268: Stochastic processes slides

Classification of Random Process
Real-valued RP and complex-valued RP.
If a RP Z(t) is given by
Z(t) = A(t) cos(2πf_c t + θ(t)), where f_c is the carrier frequency and A(t), θ(t) are real-valued RPs,
then
Z(t) = real part of A(t) exp(jθ(t)) exp(j2πf_c t) = real part of w(t) exp(j2πf_c t)
Complex envelope: w(t) = A(t) exp(jθ(t)) = A(t) cos θ(t) + j A(t) sin θ(t) = X(t) + jY(t)
Here, w(t) is a complex-valued RP and X(t), Y(t) and Z(t) are real-valued RPs.

Page 269: Stochastic processes slides

Classification of Random Process
Based on observation of past values: predictable, and unpredictable.

Page 270: Stochastic processes slides

Definition of Random Process
A real-valued RP X(t), t ∈ T, is a measurable function on S × T that maps S × T onto R1 (the real line), where S is the sample space and t is a variable taking values in the set T ⊂ R1.
If T is a set of one or more intervals on the real line, X(t) is called a Random Process.
If T is a subset of the integers, X(t) is called a Random Sequence.

Page 271: Stochastic processes slides

A real-valued RP X(t) is described by its nth-order distribution function

F_{X1,...,Xn}(x1, x2, ..., xn) = P[X(t1) ≤ x1, X(t2) ≤ x2, ..., X(tn) ≤ xn]

for all n and all t1, t2, ..., tn.
It satisfies all the requirements of a joint probability distribution function (CDF).

Page 272: Stochastic processes slides

Methods of Description
1. Joint Distribution: the first-order distribution function is given by P[X(t1) ≤ a1], which gives an idea about the instantaneous amplitude distribution of the process. The second-order distribution function is given by P[X(t1) ≤ a1, X(t2) ≤ a2], which gives information about the structure of the signal in the time domain.

Page 273: Stochastic processes slides

Tossing of die
The outcomes and the corresponding waveforms are X(t, ξ_i) = x_i(t) as before (x1(t) = −4, x2(t) = 2, x3(t) = −2, x4(t) = 4, x5(t) = t/2, x6(t) = −t/2).
Find the joint probability P[X(0), X(6)] and the marginal probabilities P[X(0)] and P[X(6)].

Page 274: Stochastic processes slides

Waveform

Page 275: Stochastic processes slides

Joint probabilities of X(0) and X(6):

Values of X(0) \ Values of X(6):   −4    −3    −2     2     3     4   | Total
 −4                                1/6    0     0     0     0     0   | 1/6
 −2                                 0     0    1/6    0     0     0   | 1/6
  0                                 0    1/6    0     0    1/6    0   | 2/6
  2                                 0     0     0    1/6    0     0   | 1/6
  4                                 0     0     0     0     0    1/6  | 1/6
Total (marginal of X(6))           1/6   1/6   1/6   1/6   1/6   1/6  | 1 (grand total)

The row totals give the marginal probability of X(0); the column totals give the marginal probability of X(6).

Page 276: Stochastic processes slides

2. Analytical description of a RP using Random Variables
A RP Y(t) is expressed as
Y(t) = A cos(10^8 t + Θ)
where A and Θ are random variables.

Page 277: Stochastic processes slides

3. Average Values
Mean: μ_X(t) = E[X(t)]
Autocorrelation: R_XX(t1, t2) = E[X(t1) X*(t2)], where X* is the conjugate of X
Autocovariance: C_XX(t1, t2) = R_XX(t1, t2) − μ_X(t1) μ*_X(t2)
Correlation coefficient: r_XX(t1, t2) = C_XX(t1, t2) / √(C_XX(t1, t1) C_XX(t2, t2)),
where C_XX(t1, t1) is the variance of the random variable X(t1).

Page 278: Stochastic processes slides

Problem
For the die-tossing process, find μ_X(t), R_XX(t1, t2), C_XX(t1, t2) and r_XX(t1, t2).

Page 279: Stochastic processes slides

Solution
We have
μ_X(t) = E[X(t)] = (1/6) Σ_{i=1}^{6} x_i(t) = 0
R_XX(t1, t2) = E[X(t1) X(t2)] = (1/6) Σ_{i=1}^{6} x_i(t1) x_i(t2)
= (1/6)[16 + 4 + 4 + 16 + t1 t2/4 + t1 t2/4] = (1/6)[40 + t1 t2/2]
X is real, hence the conjugate is omitted.

Page 280: Stochastic processes slides

The autocovariance and correlation coefficient are given by
C_XX(t1, t2) = R_XX(t1, t2) − μ_X(t1) μ_X(t2) = (1/6)[40 + t1 t2/2]
and
r_XX(t1, t2) = (40 + t1 t2/2) / √[(40 + t1²/2)(40 + t2²/2)]
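A sketch verifying the mean and autocorrelation at a pair of sample times (assumes NumPy; member functions as reconstructed earlier, t1 and t2 are arbitrary):

import numpy as np

t1, t2 = 3.0, 5.0
waveforms = [lambda t: -4.0, lambda t: 2.0, lambda t: -2.0,
             lambda t: 4.0, lambda t: t / 2.0, lambda t: -t / 2.0]

mu = np.mean([x(t1) for x in waveforms])          # 0
R = np.mean([x(t1) * x(t2) for x in waveforms])   # should equal (40 + t1*t2/2)/6
print(mu, R, (40 + t1 * t2 / 2) / 6)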

Page 281: Stochastic processes slides

Problem
A RP X(t) is given by
X(t) = A cos(100 t + Θ)
where A is a RV with mean 0 and variance 1, Θ is uniformly distributed in [−π, π], and A and Θ are independent RVs.
Find μ_X(t) and R_XX(t1, t1 + τ).

Page 282: Stochastic processes slides

Solution
We have
μ_X(t) = E[A] E[cos(100t + Θ)] = 0
R_XX(t1, t2) = E[X(t1) X(t2)], where t2 = t1 + τ
= E[A²] E[cos(100t1 + Θ) cos(100t2 + Θ)]
= (1/2) E[cos(100(t2 − t1)) + cos(100(t1 + t2) + 2Θ)]
= (1/2) cos(100τ)   [the second cosine term averages to zero over Θ]

It is a function of τ and periodic in τ

If a process has a periodic component, its autocorrelation function will also have a periodic component with same period.
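A Monte Carlo sketch of this autocorrelation (assumes NumPy; the sample size and the values of t1 and τ are arbitrary):

import numpy as np

rng = np.random.default_rng(1)
N = 500_000
A = rng.standard_normal(N)               # zero mean, unit variance
theta = rng.uniform(-np.pi, np.pi, N)    # uniform in [-pi, pi]

t1, tau = 0.7, 0.02
x1 = A * np.cos(100 * t1 + theta)
x2 = A * np.cos(100 * (t1 + tau) + theta)
print(np.mean(x1 * x2), 0.5 * np.cos(100 * tau))   # estimate vs. (1/2) cos(100 tau)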

Page 283: Stochastic processes slides

4. Two or more RPs
If X(t) and Y(t) are two RPs, then
P[X(t1) ≤ x1, ..., X(tn) ≤ xn, Y(t'1) ≤ y1, ..., Y(t'n) ≤ yn]
is called the joint distribution function.
Cross-correlation function: R_XY(t1, t2) = E[X(t1) Y*(t2)]
Cross-covariance function: C_XY(t1, t2) = R_XY(t1, t2) − μ_X(t1) μ*_Y(t2)
Correlation coefficient: r_XY(t1, t2) = C_XY(t1, t2) / √(C_XX(t1, t1) C_YY(t2, t2))

Page 284: Stochastic processes slides

Equality: if their respective member functions are identical for each outcome ξ, the two RPs are called equal.
Uncorrelated: C_XY(t1, t2) = 0 for all t1, t2.
Orthogonal: R_XY(t1, t2) = 0 for all t1, t2.
Independent:
P[X(t1) ≤ x1, ..., X(tn) ≤ xn, Y(t'1) ≤ y1, ..., Y(t'm) ≤ ym] = P[X(t1) ≤ x1, ..., X(tn) ≤ xn] P[Y(t'1) ≤ y1, ..., Y(t'm) ≤ ym]
for all n, m and all t1, ..., tn, t'1, ..., t'm.
Independence implies uncorrelatedness, but the converse is not true.

Page 285: Stochastic processes slides

Problem
If Θ is a uniformly distributed RV in [0, 2π], and X = cos Θ, Y = sin Θ, show that X and Y are uncorrelated.

Page 286: Stochastic processes slides

Solution
We have
E[XY] = E[sin Θ cos Θ] = (1/2π) ∫_0^{2π} sin θ cos θ dθ = (1/4π) ∫_0^{2π} sin 2θ dθ = 0
E[X] = E[Y] = 0, so C_XY = E[XY] − E[X] E[Y] = 0.

Hence, X and Y are uncorrelated.

Page 287: Stochastic processes slides

Problem
Suppose X(t) is a RP with μ_X(t) = 3 and R_XX(t1, t2) = 9 + 4 exp(−0.2|t1 − t2|).
Find the mean, variance and covariance of the RVs Z and W, if Z = X(5) and W = X(8).

Page 288: Stochastic processes slides

Solution
E[Z] = μ_X(5) = 3,  E[W] = μ_X(8) = 3
E[Z²] = R(5, 5) = 9 + 4 = 13,  E[W²] = R(8, 8) = 13
Var(Z) = C(5, 5) = R(5, 5) − 9 = 4,  Var(W) = C(8, 8) = 4
E[ZW] = R(5, 8) = 9 + 4 e^{−0.2·3} = 9 + 4 e^{−0.6} = 11.195
Cov(Z, W) = C(5, 8) = R(5, 8) − 9 = 4 e^{−0.6} = 2.195

Page 289: Stochastic processes slides

Stationarity
It describes the time invariance of certain properties of a RP: whereas individual member functions of a RP may fluctuate rapidly as a function of time, the ensemble-averaged values, such as the mean of the process, may remain constant with respect to time.
A process is said to be stationary if its distribution functions or certain expected values are invariant with respect to a translation of the time axis.
Example: the signal from a radar used for both searching and tracking is non-stationary; if only the searching mode is considered, the signal can be modeled as a stationary RP.

Page 290: Stochastic processes slides

Types of stationarity: strict-sense stationarity and wide-sense stationarity.

Page 291: Stochastic processes slides

Strict sense stationarity
A RP X(t) is called strict-sense stationary (SSS), or stationary in the strict sense, if all of the distribution functions describing the process are invariant under a translation of time, i.e., for all ε, all k = 1, 2, ..., and all t1, t2, ..., tk:

P[X(t1) ≤ x1, X(t2) ≤ x2, ..., X(tk) ≤ xk] = P[X(t1 + ε) ≤ x1, X(t2 + ε) ≤ x2, ..., X(tk + ε) ≤ xk]

If the above equation holds for all kth-order distribution functions with k = 1, 2, ..., N, but not necessarily for k > N, the process is called Nth-order stationary.

Page 292: Stochastic processes slides

First and second order distribution functions
First-order distribution: P[X(t) ≤ x] = P[X(t + ε) ≤ x] for any ε   ...(i)
Second-order distribution: P[X(t1) ≤ x1, X(t2) ≤ x2] = P[X(t1 + ε) ≤ x1, X(t2 + ε) ≤ x2] for any ε   ...(ii)
From (i), the first-order distribution is independent of t; from (ii), the second-order distribution is strictly a function of the time difference (t2 − t1).
Consequently,
E[X(t)] = constant,  E[X(t1) X*(t2)] = R_XX(t2 − t1)

Page 293: Stochastic processes slides

Two real valued RPs X(t) and Y(t) are jointly SSS, if their joint distribution are invariant under a translation of time.

A complex RP, Z(t)=X(t)+jY(t) is SSS, if the processes X(t) and Y(t) are jointly stationary in strict sense.

Page 294: Stochastic processes slides

Wide sense stationarity
A RP X(t) is said to be stationary in the wide sense (WSS, or weakly stationary) if its mean is a constant and its autocorrelation function depends only on the time difference:
E[X(t)] = μ_X,  E[X(t + τ) X*(t)] = R_XX(τ)
Two processes X(t) and Y(t) are called jointly WSS if, in addition, E[X(t + τ) Y*(t)] = R_XY(τ).
For random sequences: E[X(k)] = μ_X,  E[X(n + k) X*(n)] = R_XX(k).
SSS implies WSS, but the converse is not true.

Page 295: Stochastic processes slides

Problem
If X(t, ξ1) = −5, X(t, ξ2) = −3, X(t, ξ3) = −1, X(t, ξ4) = 1, X(t, ξ5) = 3, X(t, ξ6) = 5 (each outcome equally likely), check for stationarity.

Page 296: Stochastic processes slides

Solution
We have
E[X(t)] = 0
R_XX(t1, t2) = E[X(t1) X(t2)] = (1/6)(25 + 9 + 1 + 1 + 9 + 25) = 70/6

Stationary in strict sense, since the translation of time axis does not result in any change in member function.

Page 297: Stochastic processes slides

Problem
If
Y(t, ξ1) = −6,  Y(t, ξ2) = −3 sin t,  Y(t, ξ3) = 3 sin t,
Y(t, ξ4) = −3 cos t,  Y(t, ξ5) = 3 cos t,  Y(t, ξ6) = 6
(each outcome equally likely), check for stationarity.

Page 298: Stochastic processes slides

Solution
We have
E[Y(t)] = 0
R_YY(t1, t2) = E[Y(t1) Y(t2)] = (1/6)[2(36) + 2(9) cos t1 cos t2 + 2(9) sin t1 sin t2] = (1/6)[72 + 18 cos(t2 − t1)]

Stationary in wide sense.

Page 299: Stochastic processes slides

Problem
Establish the necessary and sufficient condition for stationarity of the process
X(t) = a cos ωt + b sin ωt
where a and b are random variables.

Page 300: Stochastic processes slides

A RP X(t) is asymptotically stationary if the distribution function of X(t1 + τ), X(t2 + τ), ..., X(tn + τ) does not depend on τ when τ is large.
A RP X(t) is stationary in an interval if
P[X(t1) ≤ x1, ..., X(tk) ≤ xk] = P[X(t1 + ε) ≤ x1, ..., X(tk + ε) ≤ xk]
for all ε for which t1, ..., tk and t1 + ε, ..., tk + ε lie in the interval.
A RP X(t) is said to have stationary increments if its increments Y(t) = X(t + τ) − X(t) form a stationary process for every τ. Example: the Poisson and Wiener processes.
A RP is cyclostationary, or periodically stationary, if it is stationary under a shift of the time origin by integer multiples of a constant T0 (the period of the process).

Page 301: Stochastic processes slides

Time Averaging and Ergodicity
It is common practice in the laboratory to obtain multiple measurements of a variable and average them to reduce measurement errors.
If the value of the variable being measured is constant, and the errors are due to noise or to the instability of the measuring instrument, then averaging is indeed a valid and useful technique.
Time averaging is used to reduce the variance associated with the estimation of the value of a random signal or of the parameters of a random process.
Ergodicity is a relationship between the statistical (ensemble) average and the sample (time) average.

Page 302: Stochastic processes slides

Ergodicity
It is a property of a stationary RP with the assumption that the time average over one member function of the RP is equivalent to an average over the ensemble of functions.

After a sufficient length of time, the effect of initial conditions is negligible.

If one function of the ensemble is inspected over a long period of time, all salient characteristics of the ensemble of functions will be observed.

Any one function can be used to represent the whole ensemble.

Page 303: Stochastic processes slides

It can be interpreted in terms of the probability distribution function.
If f1(t), f2(t), ..., fn(t) are the member functions of the ensemble, then the first-order probability distribution p(y) is the probability that, at any given time t1, any function fj(t) lies between y and y + dy.
The ergodic property states that p(y) is also the probability that any one function fj(t), observed over time, lies between y and y + dy.

Page 304: Stochastic processes slides

Statistical characteristics of an ensemble can be determined by consideration of any one function.

The ensemble average is the average taken at a fixed time for a large number of member functions of the ensemble.

The time average is the average taken with large number of widely separated choices of initial time and using only one function of the ensemble.

Page 305: Stochastic processes slides

Special types of random processes: Poisson process, Random Walk process, Wiener process, Markov process.

Page 306: Stochastic processes slides

Poisson process
It is a continuous-time, discrete-amplitude random process. It is used to model phenomena such as the emission of photons from a light-emitting diode, the arrival of telephone calls, the occurrence of failures, etc.
A counting function Q(t), defined for t ∈ [0, ∞), is the number of events that have occurred during the time period [0, t).

Page 307: Stochastic processes slides

Q(t) is an integer-valued RP and is said to be a Poisson process if the following assumptions hold:
For any times t1 and t2 with t2 > t1, the number of events Q(t2) − Q(t1) that occur in the interval t1 to t2 is Poisson distributed:
P[Q(t2) − Q(t1) = k] = [λ(t2 − t1)]^k exp(−λ(t2 − t1)) / k!,  k = 0, 1, 2, ...
The number of events that occur in any interval of time is independent of the number of events that occur in other, non-overlapping time intervals.

Page 308: Stochastic processes slides

We have
P[Q(t) = k] = (λt)^k exp(−λt)/k!,  k = 0, 1, 2, ...
E[Q(t)] = Var[Q(t)] = λt
The autocorrelation of Q(t) is obtained (using E[X²] = Var[X] + (E[X])²) as
R_QQ(t1, t2) = E[Q(t1) Q(t2)]
= E[Q(t1)(Q(t2) − Q(t1))] + E[Q²(t1)]   for t2 > t1
= λt1 · λ(t2 − t1) + λt1 + (λt1)²
= λt1 + λ² t1 t2   for t2 > t1
= λ min(t1, t2) + λ² t1 t2   for all t1, t2

It is not a martingale, since the mean is time varying.
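A simulation sketch of these moments (assumes NumPy; lam, t1, t2 and the trial count are example values):

import numpy as np

rng = np.random.default_rng(2)
lam, t1, t2, n_trials = 2.0, 1.5, 4.0, 200_000

Q_t1 = rng.poisson(lam * t1, n_trials)                  # Q(t1)
increments = rng.poisson(lam * (t2 - t1), n_trials)     # Q(t2) - Q(t1), independent of Q(t1)
Q_t2 = Q_t1 + increments

print(Q_t1.mean(), Q_t1.var(), lam * t1)                       # mean = variance = lam*t1
print(np.mean(Q_t1 * Q_t2), lam * t1 + lam**2 * t1 * t2)       # R_QQ(t1,t2) for t2 > t1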

Page 309: Stochastic processes slides

Markov Process
A RP X(t) is called a first-order Markov process if, for all sequences of times t1 < t2 < ... < tk, k = 1, 2, ...,
P[X(tk) ≤ xk | X(tk−1) = xk−1, ..., X(t1) = x1] = P[X(tk) ≤ xk | X(tk−1) = xk−1]
i.e., the conditional probability distribution of X(tk), given all past values X(t1) = x1, ..., X(tk−1) = xk−1, depends only on the most recent value X(tk−1) = xk−1.

Page 310: Stochastic processes slides

Process with independent increments
A RP X(t) is said to have independent increments if, for all times t1 < t2 < ... < tk, k = 3, 4, ..., the RVs
X(t2) − X(t1), X(t3) − X(t2), ..., X(tk) − X(tk−1)
are mutually independent.

Page 311: Stochastic processes slides

Martingale
A RP X(t) is called a Martingale if
E[|X(t)|] < ∞ for all t, and
E[X(t2) | X(t1), t1 ≤ t2] = X(t1) for all t1 ≤ t2
i.e., it has constant mean.
It plays an important role in the prediction of future values of random processes based on past observations.

Page 312: Stochastic processes slides

Gaussian process
A RP X(t) is called a Gaussian process if all its nth-order distribution functions F_{X1,...,Xn}(x1, ..., xn), with Xi = X(ti), are n-variate Gaussian distributions for all t1, t2, ..., tn.
If the Gaussian process is also a Markov process, it is called a Gaussian-Markov process.

Page 313: Stochastic processes slides

Random Walk Process
It is a discrete version of the Wiener process, used to model the random motion of a particle.
Assumptions:
A particle moves along a horizontal line until it collides with another particle.
Each collision causes the particle to move 'up' or 'down' from its previous path by a distance 'd'.
Collisions take place once every T seconds, and the movement after a collision is independent of the particle's position.
It is analogous to tossing a coin once every T seconds and taking a step 'up' if a head shows and 'down' if a tail shows; this is called a Random Walk.

Page 314: Stochastic processes slides

Sample function of the Random Walk process
[Plot of X(n) versus t/T = n; the sample function takes values ..., −3d, −2d, −d, 0, d, 2d, ... in steps of d.]

Page 315: Stochastic processes slides

The position of the particle at t = nT is a random sequence X(n).
Assume X(0) = 0 and that a jump of ±d appears instantly after each toss.

Page 316: Stochastic processes slides

If k heads show up in the first n tosses, then the position of the particle at t = nT is given by
X(n) = kd − (n − k)d = (2k − n)d = md
where m = 2k − n; k = 0, 1, ..., n and m = −n, −n + 2, ..., n − 2, n.
If the sequence of jumps is denoted by a sequence of random variables {Ji}, then X(n) can be expressed as
X(n) = J1 + J2 + ... + Jn
The RVs Ji, i = 1, 2, ..., n are independent and have identical distribution functions with
P[Ji = d] = P[Ji = −d] = 1/2,  E[Ji] = 0,  E[Ji²] = (d² + d²)/2 = d²

Page 317: Stochastic processes slides

We have
E[X(n)] = E[J1 + J2 + ... + Jn] = 0
E[X²(n)] = n Var(Ji) = n d²
The number of heads in n tosses has a Binomial distribution, hence
P[X(n) = md] = P[k heads in n tosses] = C(n, k) (1/2)^k (1/2)^{n−k} = C(n, k)(1/2)^n,
with k = (m + n)/2; k = 0, 1, 2, ..., n.

Page 318: Stochastic processes slides

The autocorrelation function of the random walk sequence is given by
R_XX(n1, n2) = E[X(n1) X(n2)] = E[X(n1)(X(n2) − X(n1))] + E[X²(n1)]
If n2 > n1, X(n1) and X(n2) − X(n1) are independent RVs, since the number of heads from the 1st to the n1-th tossing is independent of the number of heads from the (n1 + 1)-th to the n2-th tossing. Hence,
R_XX(n1, n2) = E[X(n1)] E[X(n2) − X(n1)] + E[X²(n1)] = n1 d²   for n2 > n1
If n1 > n2, R_XX(n1, n2) = n2 d².
Hence,
R_XX(n1, n2) = min(n1, n2) d²

Random Walk is a Markov sequence and a Martingale.
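A simulation sketch of the random walk and its autocorrelation (assumes NumPy; d, the step count and the pair n1, n2 are example values):

import numpy as np

rng = np.random.default_rng(3)
d, n_steps, n_trials = 1.0, 50, 100_000
n1, n2 = 20, 35

jumps = rng.choice([d, -d], size=(n_trials, n_steps))   # J_i = +/- d with probability 1/2
X = np.cumsum(jumps, axis=1)                            # X(n) = J_1 + ... + J_n

print(X[:, n1 - 1].mean())                                        # ~ 0
print(np.mean(X[:, n1 - 1] * X[:, n2 - 1]), min(n1, n2) * d**2)   # R_XX(n1,n2) = min(n1,n2) d^2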

Page 319: Stochastic processes slides

Wiener process
Define Y(t) as a continuous random process for t ∈ [0, ∞) from the random sequence X(n) as
Y(t) = 0 at t = 0
Y(t) = X(n) for (n − 1)T < t ≤ nT, n = 1, 2, ...
Mean: E[Y(t)] = 0.  Variance: E[Y²(t)] ≈ d² (t/T).
Y(t) is a broken-line version of the sample functions of the random walk process.
The Wiener process is obtained from Y(t) by letting the time T between jumps and the step size d approach zero with the constraint d² = αT, so that the variance remains finite and nonzero for finite values of t.

Page 320: Stochastic processes slides

Properties of the Wiener process
W(t) is a continuous-amplitude, continuous-time, independent-increment process. It has a Gaussian distribution:
f_W(w; t) = (1/√(2παt)) exp(−w²/(2αt)),  with E[W(t)] = 0 and E[W²(t)] = αt
For any value of t', 0 ≤ t' ≤ t, the increment W(t) − W(t') has a Gaussian PDF with zero mean and variance α(t − t').
Autocorrelation of W(t): R_WW(t1, t2) = α min(t1, t2)
The Wiener process is a nonstationary Markov process and a Martingale.
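A sketch of the Wiener process as the limit of the random walk (assumes NumPy; alpha, T and the grid sizes are example values chosen to keep the arrays small):

import numpy as np

rng = np.random.default_rng(4)
alpha, T = 2.0, 1e-3
d = np.sqrt(alpha * T)                  # step size from the constraint d^2 = alpha*T
t_end, n_trials = 1.0, 10_000
n_steps = int(t_end / T)

jumps = rng.choice([d, -d], size=(n_trials, n_steps))
W = np.cumsum(jumps, axis=1)            # approximate Wiener sample paths

t = 0.6
k = int(t / T)
print(W[:, k - 1].var(), alpha * t)     # Var[W(t)] ~ alpha * t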

Page 321: Stochastic processes slides

Correlation Function
The core of statistical design theory is the mean square error criterion.
The synthesis should aim at minimization of the mean square error between the actual output and the desired output.
The input is assumed to be a stationary time series existing over all time.

Page 322: Stochastic processes slides

The mean square error is expressed as

e² = lim_{T→∞} (1/2T) ∫_{−T}^{T} [f_o(t) − f_d(t)]² dt

To express it in terms of the system characteristic and the input signal, f_o(t) is written in terms of f_i(t) and g(t), the unit impulse response.
The superposition theorem states that

f_o(t) = ∫_{−∞}^{∞} g(τ) f_i(t − τ) dτ,   F_o(s) = G(s) F_i(s)

Page 323: Stochastic processes slides

The mean square error is then expressed as

e² = lim_{T→∞} (1/2T) ∫_{−T}^{T} [ ∫ g(τ) f_i(t − τ) dτ − f_d(t) ]² dt
= lim_{T→∞} (1/2T) ∫_{−T}^{T} [ ∫∫ g(τ) g(σ) f_i(t − τ) f_i(t − σ) dτ dσ − 2 f_d(t) ∫ g(τ) f_i(t − τ) dτ + f_d²(t) ] dt
= ∫∫ g(τ) g(σ) [ lim_{T→∞} (1/2T) ∫_{−T}^{T} f_i(t − τ) f_i(t − σ) dt ] dτ dσ
  − 2 ∫ g(τ) [ lim_{T→∞} (1/2T) ∫_{−T}^{T} f_i(t − τ) f_d(t) dt ] dτ
  + lim_{T→∞} (1/2T) ∫_{−T}^{T} f_d²(t) dt

Page 324: Stochastic processes slides

The terms involving f_i(t) and f_d(t) are in the form of an average of the product of two time functions.
If we define the correlation function

φ_ab(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} f_a(t) f_b(t + τ) dt

then

e² = ∫∫ g(τ) g(σ) φ_ii(τ − σ) dτ dσ − 2 ∫ g(τ) φ_id(τ) dτ + φ_dd(0)

where
φ_ii(τ) is the autocorrelation function of the input signal f_i(t),
φ_dd(τ) is the autocorrelation function of the desired output, and
φ_id(τ) is the cross-correlation function between the input signal and the desired output.

Page 325: Stochastic processes slides

Measurement of the Autocorrelation function
The meter reads the autocorrelation function for one particular value of τ, since the wattmeter performs multiplication and averaging.
To plot the autocorrelation function, the delay line is varied and a number of discrete readings are taken.
It is qualitatively a measure of the regularity of the function.
If there is no DC component in the signal, the autocorrelation function will be small when the argument τ is taken larger than the interval over which values of the function are strongly dependent.
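A discrete-time, finite-record analogue of this delay-multiply-and-average measurement (a sketch; assumes NumPy, and the test signal — a 5 Hz sinusoid plus white noise sampled at fs — is purely illustrative):

import numpy as np

rng = np.random.default_rng(5)
fs, dur = 1000.0, 20.0
t = np.arange(0, dur, 1 / fs)
f1 = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.standard_normal(t.size)

def autocorr(x, max_lag):
    # phi(k) ~ (1/N) sum_t x(t) x(t+k), for lags k = 0..max_lag
    n = x.size
    return np.array([np.dot(x[:n - k], x[k:]) / (n - k) for k in range(max_lag + 1)])

phi = autocorr(f1, 400)
print(phi[0])      # ~ average power: 0.5 (sinusoid) + 0.25 (noise variance)
print(phi[200])    # lag = 0.2 s = one sine period: the periodic component persists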

Page 326: Stochastic processes slides

The autocorrelation function for any argument τ is the average of the product of e1 and e2, the values of the function τ seconds apart, and the average is given by

φ_11(τ) = E[e1 e2] = ∫∫ e1 e2 p(e1, e2) de1 de2

The expected or average value is the summation or integration of all products multiplied by their respective probabilities, where p(e1, e2) de1 de2 is the probability of any given product having its 1st term between e1 and e1 + de1 and its 2nd term between e2 and e2 + de2.

Page 327: Stochastic processes slides

Properties of the Autocorrelation Function

It is an even function of τ, i.e., φ_11(τ) = φ_11(−τ). Since the functions are averaged over a doubly infinite interval, the averaged product is independent of the direction of the shift:
φ_11(−τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} f_1(t) f_1(t − τ) dt = lim_{T→∞} (1/2T) ∫_{−T}^{T} f_1(t + τ) f_1(t) dt = φ_11(τ)

φ_11(0) is the autocorrelation function with zero argument, i.e., the average power of the time function. If f_1(t) represents the voltage across a 1 Ω resistor or the current through a 1 Ω resistor, φ_11(0) is the power consumed by the 1 Ω resistor.

Page 328: Stochastic processes slides

The maximum value of the autocorrelation function appears at τ = 0, when the function is multiplied by itself without shifting. Averaging both sides of [f_1(t) ± f_1(t + τ)]² ≥ 0 gives
φ_11(0) ± 2φ_11(τ) + φ_11(0) ≥ 0, i.e., −φ_11(0) ≤ φ_11(τ) ≤ φ_11(0).

Page 329: Stochastic processes slides

If the signal contains periodic components (or DC value), the autocorrelation function contain components of the same periods (or a DC component), i.e., a periodic wave shifted by one period is indistinguishable from the unshifted wave.

If the input signal contains only random components (no periodic components), the autocorrelation function tends to zero as τ tends to infinity. As the shift of time function becomes very large, the two functions f1(t) and f1(t+τ) becomes essentially independent.

Page 330: Stochastic processes slides

The autocorrelation function is equal to the sum of the autocorrelation functions of the individual frequency components, since the multiplication of components of different frequencies results in zero average value, i.e., voltage and current of different frequencies give zero average power.
A given autocorrelation function may correspond to an infinite number of different time functions.
The autocorrelation function of the derivative of f(t) can be expressed in terms of the autocorrelation function of f(t) as
φ_{1'1'}(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} f'_1(t) f'_1(t + τ) dt = −d²φ_11(τ)/dτ²

Page 331: Stochastic processes slides

Power Spectral Density (PSD)

If input is stationary time series and minimization of mean square error is used as design criterion, signals are described by correlation functions.

The correlation functions are sufficient data for synthesis of a minimum mean square error system.

It is convenient to describe the input signal in frequency domain characteristics.

If the autocorrelation function defines the system adequately, then frequency domain function must carry information contained in the autocorrelation functions.

Page 332: Stochastic processes slides

A function satisfying such requirements is the Laplace transform of the autocorrelation function,

Φ_11(s) = ∫_{−∞}^{∞} φ_11(τ) e^{−sτ} dτ

To determine what characteristics of the random input signal are measured by this frequency function, consideration of the Laplace transform is significant.
If f_p(t) is periodic,

f_p(t) = Σ_{n=−∞}^{∞} a_n e^{jnω0 t},  a_n = (1/T) ∫_{t0}^{t0+T} f_p(t) e^{−jnω0 t} dt,  a_0 = (1/T) ∫_{t0}^{t0+T} f_p(t) dt

Page 333: Stochastic processes slides

Amplitude spectrum
The energy in the signal is concentrated at isolated frequencies.
For aperiodic signals, however, the conversion from the Fourier series to the Fourier transform or Laplace integral transformation involves a limiting process, and the spectrum is called the amplitude density spectrum.
[Plot of |a_n| versus n: it indicates the actual amplitude of the signal component at the corresponding frequency.]

Page 334: Stochastic processes slides

If f_1(t) is the voltage across a 1 ohm resistor, the plot of |F_1(jω)|² versus ω is called the energy density spectrum, i.e., a direct indication of the energy dissipated in the 1 ohm resistor as a function of frequency.
The area under the curve between ω1 and ω2 is proportional to the total energy at all frequencies within these limits.
Example: f_1(t) = e^{−at} for t ≥ 0 and f_1(t) = 0 for t < 0, so F_1(s) = 1/(s + a) and |F_1(jω)|² = 1/(ω² + a²).

Page 335: Stochastic processes slides

The total energy is proportional to the area under the entire curve, i.e.,

(1/2π) ∫_{−∞}^{∞} |F_1(jω)|² dω = (1/2π) ∫_{−∞}^{∞} dω/(ω² + a²) = 1/(2a)

Thus, the spectrum of a periodic wave is called the amplitude spectrum and the spectrum of an aperiodic wave is called the amplitude density spectrum. Random time functions, however, are characterized by the Laplace transform of the autocorrelation function.

Page 336: Stochastic processes slides

Hence,

Φ_11(s) = ∫ φ_11(τ) e^{−sτ} dτ = ∫ e^{−sτ} [ lim_{T→∞} (1/2T) ∫_{−T}^{T} f_1(t) f_1(t + τ) dt ] dτ

Consider f_11(t), which exists over the time interval −T to T instead of over all time (f_11(t) = f_1(t) for |t| ≤ T, zero elsewhere). Since the Laplace transform of f_11(t) exists, the order of integration can be interchanged:

Φ_11(s) = lim_{T→∞} (1/2T) ∫_{−T}^{T} f_11(t) [ ∫ f_11(t + τ) e^{−sτ} dτ ] dt

Let t + τ = x (for T large, the limits on x also become −T and T):

Φ_11(s) = lim_{T→∞} (1/2T) [ ∫_{−T}^{T} f_11(t) e^{st} dt ] [ ∫_{−T}^{T} f_11(x) e^{−sx} dx ]

Page 337: Stochastic processes slides

So, we have

Φ_11(s) = lim_{T→∞} (1/2T) [ ∫_{−T}^{T} f_11(t) e^{st} dt ] [ ∫_{−T}^{T} f_11(x) e^{−sx} dx ] = lim_{T→∞} (1/2T) F_11(−s) F_11(s)

The two integrals are separate conjugate expressions when s = jω.
Now the consideration is reverted to the original function f_1(t), which is equal to f_11(t) in the interval −T to T, an interval which is allowed to become infinite after the integral of the above equation is evaluated.

Page 338: Stochastic processes slides

Random time functions involve infinite energy; hence it is necessary to convert, by the averaging process with respect to time, to a consideration of power.
Φ_11(jω) is the power spectral density. The significance of the PSD is that the power between the frequencies ω1 and ω2 is 1/2π times the integral of Φ_11(jω) from ω1 to ω2:

P_total = (1/2π) ∫_{−∞}^{∞} Φ_11(jω) dω

Inverse transformation gives

φ_11(τ) = (1/2π) ∫_{−∞}^{∞} Φ_11(jω) e^{jωτ} dω,  with φ_11(0) = (1/2π) ∫ Φ_11(jω) dω

φ_11(0) is the total power if f_1(t) is the voltage across a 1 ohm resistor. The integral yields the power dissipated in the resistor by all signal components with frequencies lying within the range ω1 to ω2.

Page 339: Stochastic processes slides

Measurement of PSD
f_1(t) is an electrical voltage passed through an ideal low-pass filter (gain 1 for |ω| < ω_c) whose output appears across a 1 ohm resistor.
The wattmeter measures the power dissipated in the 1 ohm resistor. The filter output voltage contains all frequencies of f_1(t) below ω_c, with no distortion introduced, but none of the frequencies above ω_c.
The wattmeter therefore reads the power dissipated in the resistor by all frequency components of f_1(t) from −ω_c to ω_c:

Wattmeter reading = (1/2π) ∫_{−ωc}^{ωc} Φ_11(jω) dω

Page 340: Stochastic processes slides

Characteristics of the Power Spectral Density
It measures the power density rather than the amplitude or phase spectra of a signal, i.e., the relative phase of the various frequency components is lost.
As a result of discarding the phase information, a given power density spectrum may correspond to a large number of different time functions.
It is purely real, i.e., the time-average power dissipated in a pure resistance is being measured.
It is an even function of frequency:
Φ_11(jω) = ∫ φ_11(τ) [cos ωτ − j sin ωτ] dτ = ∫ φ_11(τ) cos ωτ dτ
The sine term is zero since its integrand is an odd function of τ, and the cosine term is even.

Page 341: Stochastic processes slides

It is nonnegative at all frequencies. Negative values in any frequency band would indicate that power is being taken from the passive 1 ohm resistor.
If the input signal contains a periodic component whose Fourier series contains terms at frequencies ω1, ω2, ..., ωn, the PSD will contain impulses at ω1, −ω1, ω2, −ω2, ..., ωn, −ωn.
If f_1(t) contains a periodic component of frequency ω1, the autocorrelation function will contain a term of the form a1 cos ω1τ, and the PSD will contain the corresponding impulses at ±ω1.

Page 342: Stochastic processes slides

Problem
Let X(t) be a RP defined as
X(t) = a cos(2πf_0 t + Θ)
where Θ is a uniformly distributed RV in the interval (0, 2π).
Find the PSD of X(t).

Page 343: Stochastic processes slides

Solution
The mean of the process is zero. The autocorrelation and autocovariance are

C_XX(t1, t2) = R_XX(t1, t2) = E[a² cos(2πf_0 t1 + Θ) cos(2πf_0 t2 + Θ)]
= (a²/2π) ∫_0^{2π} cos(2πf_0 t1 + θ) cos(2πf_0 t2 + θ) dθ
= (a²/2) cos(2πf_0 (t1 − t2))

Page 344: Stochastic processes slides

The autocorrelation function is thus
R_XX(τ) = (a²/2) cos(2πf_0 τ)
and the PSD is
S_XX(f) = (a²/4) δ(f − f_0) + (a²/4) δ(f + f_0)
The signal has average power R_X(0) = a²/2. All the power is concentrated at the frequencies f_0 and −f_0, so the power density at these frequencies is infinite.
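A periodogram sketch of this result (assumes NumPy; a, f0, fs, the record length and the number of averaged realizations are illustrative choices):

import numpy as np

rng = np.random.default_rng(6)
a, f0, fs, N, n_real = 2.0, 50.0, 1000.0, 4096, 200
t = np.arange(N) / fs

psd = np.zeros(N // 2 + 1)
for _ in range(n_real):
    theta = rng.uniform(0, 2 * np.pi)
    x = a * np.cos(2 * np.pi * f0 * t + theta)
    psd += np.abs(np.fft.rfft(x)) ** 2 / (fs * N)   # periodogram at positive frequencies
psd /= n_real

freqs = np.fft.rfftfreq(N, 1 / fs)
print(freqs[np.argmax(psd)])          # ~ f0: the power is concentrated at f0
print(np.sum(psd) * (fs / N) * 2)     # ~ a^2/2 = average power R_X(0)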

Page 345: Stochastic processes slides

Response of Linear Systems to Random Inputs
Regardless of whether or not the system is linear, for each member function x(t) of the input process X(t) the system produces an output y(t), and the ensemble of output functions forms a random process Y(t), which is the response of the system to the random input signal X(t).
Given the description of the input process X(t) and that of the system, we wish to obtain the properties of Y(t), such as its mean, autocorrelation function, or lower-order probability distributions.

Page 346: Stochastic processes slides

Classification of Systems
A system is a functional relationship between the input x(t) and the output y(t), y(t) = f[x(τ); τ ≤ t].
Lumped: a dynamic system is called lumped if it can be modeled by a set of ordinary differential or difference equations.
Linear; Time Invariant; Causal.

Page 347: Stochastic processes slides

Response of LTIVC continuous-time systems
The input-output relationship of a linear, time-invariant and causal system driven by a deterministic input signal x(t) can be represented by the convolution integral

y(t) = ∫_{−∞}^{∞} h(τ) x(t − τ) dτ = ∫_{−∞}^{∞} h(t − τ) x(τ) dτ

where h(t) is the impulse response of the system and zero initial conditions are assumed.
For a stable system, ∫ |h(τ)| dτ < ∞, and h(τ) = 0 for τ < 0 (causality).

Page 348: Stochastic processes slides

In the frequency domain, the input-output relationship is expressed as

Y_F(f) = H(f) X_F(f)

y(t) is obtained by taking the inverse Fourier transform of Y_F(f). The forward and inverse transforms are defined as

Y_F(f) = ∫_{−∞}^{∞} y(t) e^{−j2πft} dt,   y(t) = ∫_{−∞}^{∞} Y_F(f) e^{j2πft} df

Page 349: Stochastic processes slides

When the input to the system is a RP X(t), the resulting output process Y(t) is given by

Y(t) = ∫ X(t − τ) h(τ) dτ = ∫ X(τ) h(t − τ) dτ

The above equation implies that each member function of X(t) produces a member function of Y(t).
The distribution functions of the process Y(t) are in general very difficult to obtain, except for the Gaussian case, in which Y(t) is Gaussian if X(t) is Gaussian.

Page 350: Stochastic processes slides

Mean and autocorrelation function
Assuming that h(t) and X(t) are real-valued and that the expectation and integration can be interchanged (because integration is a linear operator), the mean and autocorrelation function of the output are calculated as

E[Y(t)] = E[ ∫ X(t − τ) h(τ) dτ ] = ∫ h(τ) E[X(t − τ)] dτ = ∫ h(τ) μ_X(t − τ) dτ

and

R_YY(t1, t2) = E[Y(t1) Y(t2)] = E[ ∫∫ X(t1 − τ1) h(τ1) X(t2 − τ2) h(τ2) dτ1 dτ2 ]
= ∫∫ h(τ1) h(τ2) R_XX(t1 − τ1, t2 − τ2) dτ1 dτ2

Page 351: Stochastic processes slides

Stationarity of the output
We have Y(t) = ∫ X(t − τ) h(τ) dτ and Y(t + ε) = ∫ X(t + ε − τ) h(τ) dτ.
If the processes X(t) and X(t + ε) have the same distribution (i.e., X(t) is SSS), then the same is true for Y(t) and Y(t + ε), and hence Y(t) is SSS.
If X(t) is WSS, then the mean does not depend on t and we have

E[Y(t)] = μ_X ∫ h(τ) dτ = μ_X H(0)

Thus, the mean of the output does not depend on time.

Page 352: Stochastic processes slides

The autocorrelation function of the output is given by

R_YY(t1, t2) = ∫∫ h(τ1) h(τ2) R_XX(t2 − t1 + τ1 − τ2) dτ1 dτ2

Since the integral depends only on the time difference t2 − t1, R_YY(t1, t2) is also a function of t2 − t1. This, together with the constant mean, implies that the output process Y(t) is WSS if the input process X(t) is WSS.

Page 353: Stochastic processes slides

PSD of the output
When X(t) is WSS, it can be shown that

R_YX(τ) = E[Y(t + τ) X(t)] = E[ ∫ h(α) X(t + τ − α) dα · X(t) ] = R_XX(τ) * h(τ)
R_XY(τ) = R_XX(τ) * h(−τ)
R_YY(τ) = R_YX(τ) * h(−τ) = R_XX(τ) * h(τ) * h(−τ)

where * denotes convolution.
Taking the Fourier transform of both sides, the PSD of the output is obtained as

S_YY(f) = S_XX(f) |H(f)|²

The input spectral component at frequency f is modified according to |H(f)|²; hence |H(f)|² is sometimes called the power transfer function.
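A discrete-time sketch of the relation S_YY(f) = |H(f)|² S_XX(f), using averaged periodograms and a simple 3-tap FIR filter driven by white noise (assumes NumPy; the filter taps are an example, not from the slides):

import numpy as np

rng = np.random.default_rng(7)
N, n_real = 4096, 400
h = np.array([0.5, 0.3, 0.2])                  # example impulse response

Sxx = np.zeros(N // 2 + 1)
Syy = np.zeros(N // 2 + 1)
for _ in range(n_real):
    x = rng.standard_normal(N)                 # white input
    y = np.convolve(x, h, mode="same")         # filtered output
    Sxx += np.abs(np.fft.rfft(x)) ** 2 / N
    Syy += np.abs(np.fft.rfft(y)) ** 2 / N
Sxx /= n_real
Syy /= n_real

H = np.fft.rfft(h, N)                          # frequency response on the same grid
print(np.allclose(Syy, np.abs(H) ** 2 * Sxx, rtol=0.2, atol=0.05))  # approximately True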

Page 354: Stochastic processes slides

Mean square value of the output
The mean square value of the output is a measure of the average output power. It is given by

E[Y²(t)] = R_YY(0) = ∫_{−∞}^{∞} S_YY(f) df = ∫_{−∞}^{∞} S_XX(f) |H(f)|² df

Except in some cases, evaluation of the preceding integral is difficult. The assumption that R_XX(τ) can be expressed as a sum of complex exponentials [i.e., that S_XX(f) is a rational function of f] simplifies the integral.

Page 355: Stochastic processes slides

The transformation s = j2πf is used, and we write

S_XX(f)|_{f = s/2πj} = a(s) a(−s) / [b(s) b(−s)]

where a(s)/b(s) has all of its poles and zeros in the LHS of the s-plane and a(−s)/b(−s) has all of its roots in the RHS. Therefore,

E[Y²(t)] = (1/2πj) ∫_{−j∞}^{j∞} [c(s) c(−s) / (d(s) d(−s))] ds,
where S_XX(f) H(f) H*(f)|_{f = s/2πj} = c(s) c(−s)/[d(s) d(−s)]

Page 356: Stochastic processes slides

Problem
X(t), the input voltage to the RL circuit shown in the figure (series inductance L, output taken across R), is a stationary RP with
μ_X = 0 and R_XX(τ) = exp(−α|τ|)
Find the mean, PSD and autocorrelation of the output.

Page 357: Stochastic processes slides

Solution
We have μ_Y = μ_X H(0) = 0, and

H(f) = R / (R + j2πfL)

Also,

S_XX(f) = ∫ exp(−α|τ|) exp(−j2πfτ) dτ = 2α / (α² + 4π²f²)

So,

S_YY(f) = S_XX(f) |H(f)|² = [2α/(α² + 4π²f²)] · [R²/(R² + 4π²f²L²)]

Taking the inverse FT,

R_YY(τ) = [R²/(R² − α²L²)] [ exp(−α|τ|) − (αL/R) exp(−R|τ|/L) ]

Page 358: Stochastic processes slides

Problem
The input to an RC lowpass filter with H(f) = 1/(1 + jf/1000) is a zero-mean stationary RP with S_XX(f) = 10^{−12} watt/Hz.
Find the mean square value of the output Y(t).

Page 359: Stochastic processes slides

Solution
We have

S_YY(f) = S_XX(f) |H(f)|² = 10^{−12} / [(1 + jf/1000)(1 − jf/1000)]

With s = j2πf,

E[Y²(t)] = (1/2πj) ∫_{−j∞}^{j∞} [10^{−6}/(1 + s/2000π)] [10^{−6}/(1 − s/2000π)] ds = 10^{−12} (1000π) = π × 10^{−9} watt

Page 360: Stochastic processes slides

Design of the Stored-data Wiener filter
Determination of the optimum linear system lies in obtaining a linear system which minimizes the measure of error.
Assumptions: the system is linear; the time series is stationary; mean square error is the appropriate measure of system error.

e² = (1/2π) ∫_{−∞}^{∞} [ |G(jω)|² Φ_nn(jω) + |G_d(jω) − G(jω)|² Φ_ss(jω) ] dω

Page 361: Stochastic processes slides

The system is given by the figure below:
System: impulse response g(t), transfer function G(jω)
Input: f_i(t) = s(t) + n(t)
PSD: Φ_ii(jω) = Φ_ss(jω) + Φ_nn(jω)
Output: f_o(t); desired output: f_d(t)

Page 362: Stochastic processes slides

We have e(t) = f_d(t) − f_o(t). Writing G(jω) = A e^{jθ} and G_d(jω) = A_d e^{jθ_d}, where A, A_d, θ and θ_d are all real functions of the real variable ω,

e² = (1/2π) ∫ [ A² Φ_nn + |A_d e^{jθ_d} − A e^{jθ}|² Φ_ss ] dω
= (1/2π) ∫ [ A² Φ_nn + (A_d² − 2 A A_d cos(θ − θ_d) + A²) Φ_ss ] dω

The equation indicates the optimum choice of θ (if physical realizability conditions are neglected). Since A, A_d, Φ_nn and Φ_ss are nonnegative for all values of ω, the minimum value of the integral occurs when A A_d cos(θ − θ_d) is maximum, i.e., θ = θ_d, giving

e² = (1/2π) ∫ [ A² Φ_nn + (A_d − A)² Φ_ss ] dω

Page 363: Stochastic processes slides

Hence, minimization of the mean square error reduces to determination of the optimum value of A.
The equation also follows from purely physical reasoning: since the choice of θ has no effect on the mean square value of the noise component of the output, it is reasonable to select θ to minimize the signal distortion, and the best choice of θ is the one which results in no phase distortion.

Page 364: Stochastic processes slides

Hence, the mean square error can be expressed (by completing the square) as

e² = (1/2π) ∫ [ A²(Φ_ss + Φ_nn) − 2 A A_d Φ_ss + A_d² Φ_ss ] dω
= (1/2π) ∫ { (Φ_ss + Φ_nn) [ A − A_d Φ_ss/(Φ_ss + Φ_nn) ]² + A_d² Φ_ss Φ_nn/(Φ_ss + Φ_nn) } dω

Neither the squared term nor the last term can be negative for any value of ω; hence the minimum value of the mean square error occurs when the squared term is zero:

A = A_d Φ_ss / (Φ_ss + Φ_nn)

Page 365: Stochastic processes slides

Thus, the optimum transfer function, without regard to physical realizability, is given by

G_opt(jω) = [ Φ_ss(jω) / (Φ_ss(jω) + Φ_nn(jω)) ] G_d(jω)

and

e²_min = (1/2π) ∫ [ Φ_ss Φ_nn / (Φ_ss + Φ_nn) ] A_d² dω
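A discrete-frequency sketch of this non-causal (stored-data) Wiener filter, applied via the FFT (assumes NumPy; the AR(1) signal model, noise level and all parameter names are illustrative assumptions, not the slides' example):

import numpy as np

rng = np.random.default_rng(8)
N = 16384
a, sig_e, sig_n = 0.95, 0.3, 1.0           # AR(1) signal and white-noise parameters (examples)

e = sig_e * rng.standard_normal(N)         # generate signal s and observation x = s + n
s = np.zeros(N)
for n in range(1, N):
    s[n] = a * s[n - 1] + e[n]
x = s + sig_n * rng.standard_normal(N)

w = 2 * np.pi * np.fft.rfftfreq(N)         # PSDs on the FFT grid and the Wiener gain
phi_ss = sig_e**2 / np.abs(1 - a * np.exp(-1j * w)) ** 2
phi_nn = np.full_like(w, sig_n**2)
G_opt = phi_ss / (phi_ss + phi_nn)         # optimum gain with G_d(jw) = 1 (pure smoothing)

est = np.fft.irfft(np.fft.rfft(x) * G_opt, N)   # zero-phase, non-causal filtering
print(np.mean((x - s) ** 2), np.mean((est - s) ** 2))   # the Wiener estimate has lower MSE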

Page 366: Stochastic processes slides

Problem
Given a filtering problem with
Φ_ss(ω) = 36/(ω² + 36),  Φ_nn(ω) = a,  G_d(jω) = e^{−jωT}
i.e., the desired output is the signal component of the input with time delay T.
Solution:
G_opt(jω) = [Φ_ss/(Φ_ss + Φ_nn)] e^{−jωT} = [36/(36 + a(ω² + 36))] e^{−jωT} = [(36/a)/(ω² + 36(1 + a)/a)] e^{−jωT}
e²_min = (1/2π) ∫ [Φ_ss Φ_nn/(Φ_ss + Φ_nn)] dω = (1/2π) ∫ 36a dω/(aω² + 36(1 + a)) = 3√(a/(1 + a))
Now, compare the designed filter with a filter designed by classical theory.

Page 367: Stochastic processes slides

[Plot of the PSDs Φ_ss(ω) and Φ_nn(ω) versus ω; the two spectra cross at ω = ±ω_c.]

Page 368: Stochastic processes slides

Physically Realizable System
The design theory lies in the determination of a physically realizable network which minimizes the mean square error.
G(s) = E_out(s) / E_in(s)
If the input is a sine wave of frequency ω1, the output is also a sine wave of the same frequency ω1:
E_out = G(jω1) E_in,  |E_out| = |G(jω1)| |E_in|
i.e., in the passage of the signal through the network, the amplitude is multiplied by |G(jω1)| and the phase is advanced by the angle of G(jω1).

Page 369: Stochastic processes slides

Physical realizability of the network requires that G(s) be analytic in RHS of s-plane i.e., if G(s) is the ratio of polynomials, all poles must lie in the LHS of s-plane.

The impulse response must be zero for all negative time i.e., there must be no output before the application of the input.

Page 370: Stochastic processes slides

Summary
|G(jω)| can be the gain function of a physically realizable network only if there is at least one zero at infinity, since in any physical network the gain must fall off as the frequency is increased indefinitely.
g(t) is realizable if g(t) = 0 for t < 0 and g(t) approaches zero as t tends to infinity.
If G(s) is given as a ratio of polynomials with real coefficients, it is realizable if all its poles lie in the LHS of the s-plane, excluding infinity and the jω-axis. If there is a pole at infinity, G(s) can be realized within any desired accuracy over any finite portion of the frequency spectrum.

Page 371: Stochastic processes slides

In the classical design, an ideal low-pass filter is used with the cutoff at the frequency where the noise and signal power density spectra are equal, i.e., the value ω_c at which
Φ_nn(ω_c) = Φ_ss(ω_c):  a = 36/(ω_c² + 36), i.e., ω_c² = 36(1 − a)/a
The corresponding mean square error is equal to the sum of the signal and noise components.

Page 372: Stochastic processes slides

Noise component (noise passed by the ideal filter):
e_n² = (1/2π) ∫_{−ωc}^{ωc} Φ_nn(ω) dω = a ω_c/π
Signal component (signal rejected above the cutoff):
e_s² = (1/2π) ∫_{|ω|>ωc} Φ_ss(ω) dω = (1/π) ∫_{ωc}^{∞} 36 dω/(ω² + 36) = (6/π)[π/2 − arctan(ω_c/6)]

Page 373: Stochastic processes slides

Total mean square error:
e² = e_n² + e_s² = a ω_c/π + (6/π)[π/2 − arctan(ω_c/6)], where ω_c² = 36(1 − a)/a
When a = 0.5 (so that ω_c = 6), e² = 2.45, but by using the Wiener filter e² = 1.73.
The classical filter thus yields a mean square error about 42% greater than the filter designed by minimizing the mean square error.
The improved performance of the Wiener filter comes from its consideration of phase, whereas the classical filter neglects the phase entirely.
The Wiener filter does not exhibit a sharp cutoff, because of the possibility that the high-frequency components of the signal may add in just the correct phase to yield a very rapid change in the signal waveform.

Page 374: Stochastic processes slides

Design of the Real-time Wiener Filter
The expression for the optimum transfer function is then obtained as
G_opt(jω) = [Φ_ss/(Φ_ss + Φ_nn)] G_d(jω) = [(36/a)/(ω² + 36(1 + a)/a)] e^{−jωT}
It describes the optimum transfer function without consideration of physical realizability. The exponential term is the desired transfer function in the absence of noise.
In terms of s,
G_opt(s) = [(36/a)/(36(1 + a)/a − s²)] e^{−sT}
which has two poles, one in each half of the s-plane.

Page 375: Stochastic processes slides

The impulse response corresponding to the rational part of G_opt(s) can be obtained by inverse transformation, and the impulse response corresponding to the total G_opt(s), delayed by T, is then obtained from it.
[Sketches of g(t) versus t and of g_opt(t) versus t: the optimum response is two-sided, i.e., nonzero for negative time.]

Page 376: Stochastic processes slides

Regardless of the allowable delay, the optimum system is never exactly physically realizable: the optimum impulse response is never zero for all negative time, even though the response for negative time can be made as small as desired if a sufficiently great delay is admitted.
The difficulty arises regardless of G_d(jω), because Φ_ss/(Φ_ss + Φ_nn) is always a function of ω² or s², i.e., it has poles in both the left and right half-planes. The only situation in which the optimum transfer function is realizable is the trivial case in which there is no noise and G_d(jω) is itself realizable.

Page 377: Stochastic processes slides

Noise-free system
For a noise-free system, n = 0 and G_opt(jω) = G_d(jω).
If G_d(jω) corresponds to a nonrealizable network, the corresponding inverse waveform, i.e., the desired impulse response, is not zero for t < 0.
The simplest example is the design of a predictor, where the desired output is a prediction of the input, i.e., f_d(t) = s(t + α), where α is the prediction time and has a positive value. Hence,
G_opt(jω) = e^{jωα}

Page 378: Stochastic processes slides

Bode and Shannon Method
The approximation of G_opt(jω) by a realizable G(jω) is solved by the Bode and Shannon method, which is as follows.
Factorize Φ_ss(s) into two components,
Φ_ss(s) = Φ_ss+(s) Φ_ss−(s)
where Φ_ss+(s) has all its poles and zeros in the LHS of the s-plane and Φ_ss−(s) has them in the RHS. For example, if
Φ_ss(s) = (4 − 9s²)/(1 − 36s²), then Φ_ss+(s) = (2 + 3s)/(1 + 6s) and Φ_ss−(s) = (2 − 3s)/(1 − 6s)
The subsystems can be drawn in cascade as
s(t) → [g1(t) or G1(s)] → m(t) → [g2(t) or G2(s)] → f_o(t)
with PSDs Φ_ss(s), Φ_mm(s) and Φ_oo(s) at the respective points.

Page 379: Stochastic processes slides

Here, m(t) is white noise, and
G1(s) = 1/Φ_ss+(s), so that Φ_mm(s) = G1(s) G1(−s) Φ_ss(s) = Φ_ss(s)/[Φ_ss+(s) Φ_ss−(s)] = 1
The waveform of m(t) depends on the waveform of s(t), but the optimum filter is independent of this waveform and depends only on the power spectral density or autocorrelation function.
It is permissible to consider m(t) as a train of closely spaced, narrow, random, statistically independent pulses. Hence, the transfer function G2(s) is selected to operate on these pulses to give the best prediction of the input signal s(t).

Page 380: Stochastic processes slides

Each of these pulses produces an output proportional to the impulse response g2(t); hence the total output f_o(t) is the sum of all the individual responses.
[Sketch: the pulses of m(t) and the corresponding components of f_o(t).]

Page 381: Stochastic processes slides

The approximate impulse response for the second section of the predictor is
g2(t) = g_ss(t + α) for t ≥ 0, and g2(t) = 0 for t < 0
where g_ss(t) is the inverse transform of Φ_ss+(s).
The mean square error of prediction is the error introduced by neglecting the pulses of m(t) from t = 0 to t = α if the output at t = 0 is considered:
e² = ∫_0^α g_ss²(t) dt
The actual (desired) output power is given by f_d²(avg) = ∫_0^∞ g_ss²(t) dt.
The relative error introduced by prediction is measured by
e² / f_d²(avg) = ∫_0^α g_ss²(t) dt / ∫_0^∞ g_ss²(t) dt

Page 382: Stochastic processes slides

Problem
A system has Φ_ss(s) expressed as
Φ_ss(s) = 36 / [(1 − s²)(36 − s²)]
Find the optimum transfer function of the predictor for a prediction time α = 1/6, and the mean square error of prediction.

Page 383: Stochastic processes slides

Solution
We can now express
Φ_ss+(s) = 6/[(1 + s)(6 + s)],  Φ_ss−(s) = 6/[(1 − s)(6 − s)]
g_ss(t) = inverse transform of Φ_ss+(s) = (6/5)(e^{−t} − e^{−6t}) for t ≥ 0, and 0 for t < 0
With the prediction time α = 1/6, the optimum (unrealizable) impulse response for the second section is
g_2opt(t) = g_ss(t + 1/6) = (6/5)(e^{−1/6} e^{−t} − e^{−1} e^{−6t}) ≈ 1.016 e^{−t} − 0.44 e^{−6t}

Page 384: Stochastic processes slides

Physical realizability requires g2(t) to be zero for t < 0; hence
g2(t) = g_2opt(t) for t ≥ 0 and g2(t) = 0 for t < 0, i.e., g2(t) ≈ 1.016 e^{−t} − 0.44 e^{−6t}, t ≥ 0
[Sketches of g_ss(t), g_2opt(t) (shifted left by 1/6) and the truncated g2(t) versus t.]
So,
G2(s) ≈ 1.016/(s + 1) − 0.44/(s + 6) = (0.58 s + 5.66)/[(s + 1)(s + 6)]

Page 385: Stochastic processes slides

The overall transfer function for the optimum physically realizable predictor is G1(s) G2(s); hence
G(s) = [(1 + s)(6 + s)/6] G2(s) ≈ 0.942 + 0.1015 s
The mean square error of prediction is evaluated using the expressions above. Both integrals have the form
∫ g_ss²(t) dt = (36/25) ∫ (e^{−2t} − 2 e^{−7t} + e^{−12t}) dt
After substituting the limits (0 to 1/6 for the error, 0 to ∞ for the desired-output power), we get
e² / f_d²(avg) ≈ 0.05

-> Mean square value of e(t) is 5% of mean square value of input or desired output, i.e., there is not much difference between mean square value of input and actual output because of crosscorrelation between both.

Page 386: Stochastic processes slides

Noise included in the input
We have Φ_ii(s) = Φ_ss(s) + Φ_nn(s) (the input PSD) and
G_opt(s) = [Φ_ss(s)/(Φ_ss(s) + Φ_nn(s))] G_d(s)
Assumption: the signal and noise are uncorrelated.
Bode and Shannon method: the input is first converted to white noise, which is then considered as a sequence of statistically independent short-duration pulses.
Φ_ii(s) is factored as Φ_ii(s) = Φ_ii+(s) Φ_ii−(s), where Φ_ii+(s) contains all critical frequencies in the LHS of the s-plane. The input is passed through a system with transfer function
G1(s) = 1/Φ_ii+(s)

Page 387: Stochastic processes slides

It is incorrect, and this arises from the fact that when the system is designed to operate on the actual input, which possesses an autocorrelation function other than 0 for τ ≠ 0, future values of the input are in part determined by the present and past values.
If the realizability condition is neglected, the optimum transfer function for the second section is
G_2opt(s) = [Φ_ss(s)/Φ_ii−(s)] G_d(s)
The actual G2(s) used is the 'realizable part' of G_2opt(s), i.e.,
g2(t) = g_2opt(t) for t ≥ 0, and g2(t) = 0 for t < 0

It follows from the fact that for minimization of mean square error, the optimum operation on each side of the white noise pulses is independent of the other pulses.

This is true because of consideration of m(t) as train of pulses which are statistically Independent. Accordingly the pulses of m(t) which have already occurred must be given same weighting whether future pulses are to be considered or neglected.

This is the basic reason for converting the signal to white noise before physical realizability condition is introduced.

Page 388: Stochastic processes slides

Problem
Given
Φ_ss(s) = 36/[(1 − s²)(36 − s²)],  Φ_nn(s) = 0.5,  G_d(s) = e^{0.1s}  (prediction time 0.1)
find the optimum overall transfer function using the Bode and Shannon method.

Page 389: Stochastic processes slides

Solution
Φ_ii(s) = Φ_ss(s) + Φ_nn(s) = 36/[(1 − s²)(36 − s²)] + 0.5
= 0.5 (1.79 + s)(1.79 − s)(5.82 + s)(5.82 − s) / [(1 + s)(1 − s)(6 + s)(6 − s)]
Φ_ii+(s) = (1/√2)(1.79 + s)(5.82 + s)/[(1 + s)(6 + s)]
The input is then converted to white noise by passage through a network with transfer function
G1(s) = 1/Φ_ii+(s) = √2 (s + 1)(s + 6)/[(s + 1.79)(s + 5.82)]

Page 390: Stochastic processes slides

G_2opt(s) can be expressed as
G_2opt(s) = Φ_ss(s) G_d(s)/Φ_ii−(s) = 36√2 e^{0.1s} / [(1 + s)(6 + s)(1.79 − s)(5.82 − s)]
g_2opt(t) is obtained by taking the inverse transform of G_2opt(s); retaining the part for t ≥ 0,
g2(t) = 0.536 e^{−(t + 0.1)} − 0.111 e^{−6(t + 0.1)} ≈ 0.485 e^{−t} − 0.0607 e^{−6t}, t ≥ 0 (and 0 for t < 0)
Hence,
G2(s) ≈ 0.485/(s + 1) − 0.0607/(s + 6) = 0.424 (s + 6.72)/[(s + 1)(s + 6)]
The optimum overall transfer function is
G(s) = G1(s) G2(s) ≈ 0.6 (s + 6.72)/[(s + 1.79)(s + 5.82)]

Page 391: Stochastic processes slides

Assumptions in the Bode and Shannon method
The mean square error is the significant error measure.
Both the signal and the noise are stationary time series.
A linear system is desired.