Ch2 (Review of Probability)


  • 7/27/2019 Ch2 (Review of Probability)


A random variable is a number chosen at random as the outcome of an experiment. Random variables may be real or complex, and may be discrete or continuous. In signal processing, the random variables encountered are most often real and discrete.

We can characterize a random variable by its probability distribution or by its probability density function (pdf).

The distribution function for a random variable y is the probability that y does not exceed some value u,

$$F_y(u) = P(y \le u)$$

and

$$P(u < y \le v) = F_y(v) - F_y(u)$$

The probability density function is the derivative of the distribution:

$$f_y(u) = \frac{d}{du} F_y(u)$$

and,

$$P(u < y \le v) = \int_u^v f_y(y)\, dy$$

$$F_y(\infty) = 1, \qquad \int_{-\infty}^{\infty} f_y(y)\, dy = 1$$
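As a numerical check of these relations, one can integrate a pdf to obtain interval probabilities. The sketch below uses an exponential pdf f(y) = e^{-y} for y ≥ 0, which is an illustrative choice, not a density from the text:

```python
import math

# Illustrative pdf (assumption): exponential, f(y) = exp(-y) for y >= 0.
def f(y):
    return math.exp(-y) if y >= 0 else 0.0

def F(u, steps=100_000):
    # Distribution function F(u) = integral of f from 0 to u (trapezoidal rule).
    if u <= 0:
        return 0.0
    h = u / steps
    s = 0.5 * (f(0) + f(u)) + sum(f(i * h) for i in range(1, steps))
    return s * h

u, v = 0.5, 2.0
prob_uv = F(v) - F(u)                # P(u < y <= v) = F(v) - F(u)
exact = math.exp(-u) - math.exp(-v)  # closed form for this particular pdf
assert abs(prob_uv - exact) < 1e-6
```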

We can also characterize a random variable by its statistics. The expected value of g(x) is written E{g(x)} or $\overline{g(x)}$ and is defined as:

Continuous random variable:

$$\overline{g(x)} = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx$$

Discrete random variable:

$$\overline{g(x)} = \sum_x g(x)\, p(x)$$

The statistics of greatest interest are the moments of p(x). The kth moment of p(x) is the expected value of $x^k$. For a discrete random variable:

$$m_k = \overline{x^k} = \sum_x x^k\, p(x)$$

The first moment, $m_1$, is the mean of x.

Continuous:

$$\bar{x} = \int x\, f(x)\, dx$$

Discrete:

$$\bar{x} = \sum_x x\, p(x)$$

The second central moment, also known as the variance of p(x), is given by

$$\sigma^2 = \overline{(x - \bar{x})^2} = \sum_x (x - \bar{x})^2\, p(x) = m_2 - \bar{x}^2$$

To estimate the statistics of a random variable, we repeat the experiment which generates the variable a large number of times. If the experiment is run N times, then each value x will occur approximately N p(x) times; thus

$$\bar{x} \approx \frac{1}{N} \sum_{i=1}^{N} x_i, \qquad m_k \approx \frac{1}{N} \sum_{i=1}^{N} x_i^k$$
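These sample-average estimators can be sketched in a few lines. The uniform(0, 1) variable below is an illustrative choice; its true mean is 1/2 and its true second moment is 1/3:

```python
import random

random.seed(0)
N = 200_000
samples = [random.random() for _ in range(N)]  # N repeated trials

mean_est = sum(samples) / N               # (1/N) sum x_i, estimates the mean
m2_est = sum(x * x for x in samples) / N  # (1/N) sum x_i^2, estimates m_2

# True values for uniform(0, 1): mean = 1/2, m_2 = 1/3.
assert abs(mean_est - 0.5) < 0.01
assert abs(m2_est - 1.0 / 3.0) < 0.01
```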

A random variable has a uniform density on the interval (a, b) if:

$$f_x(x) = \begin{cases} 1/(b-a), & a \le x \le b \\ 0, & \text{otherwise} \end{cases}$$

$$F_x(x) = \begin{cases} 0, & x < a \\ (x-a)/(b-a), & a \le x \le b \\ 1, & x > b \end{cases}$$

Its variance is

$$\sigma^2 = \frac{1}{12}(b-a)^2$$
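A quick simulation check of the variance formula $\sigma^2 = (b-a)^2/12$; the endpoints a = 2, b = 5 are chosen for illustration:

```python
import random

random.seed(1)
a, b = 2.0, 5.0
N = 200_000
samples = [random.uniform(a, b) for _ in range(N)]  # uniform draws on (a, b)

mean = sum(samples) / N
var = sum((x - mean) ** 2 for x in samples) / N  # sample second central moment

# (b - a)^2 / 12 = 9/12 = 0.75 for this interval
assert abs(var - (b - a) ** 2 / 12) < 0.02
```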

The gaussian, or normal, density function is given by:

$$n(x; \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2/2\sigma^2}$$
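A minimal sketch of this density, with a numerical check that it integrates to 1; the values μ = 1, σ = 2 are illustrative:

```python
import math

def gauss(x, mu, sigma):
    # n(x; mu, sigma) = exp(-(x - mu)^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

mu, sigma = 1.0, 2.0
h = 0.001
# Riemann sum over [-20, 20]; the tails beyond are negligible for these parameters.
total = sum(gauss(-20 + i * h, mu, sigma) for i in range(40001)) * h
assert abs(total - 1.0) < 1e-4
```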

If two random variables x and y are to be considered together, they can be described in terms of their joint probability density f(x, y) or, for discrete variables, p(x, y).

Two random variables are independent if

$$p(x, y) = p(x)\, p(y)$$

Given a function g(x, y), its expected value is defined as:

Continuous:

$$\overline{g(x,y)} = \iint g(x,y)\, f(x,y)\, dx\, dy$$

Discrete:

$$\overline{g(x,y)} = \sum_{x,y} g(x,y)\, p(x,y)$$

And the joint moment for two discrete random variables is:

$$m_{ij} = \sum_{x,y} x^i y^j\, p(x,y)$$

Moments of two random variables X and Y:

$$E[x^n y^n] = \iint x^n y^n\, f_{X,Y}(x,y)\, dx\, dy$$

$$E[X] = \iint x\, f_{X,Y}(x,y)\, dx\, dy = \int x \left[ \int f_{X,Y}(x,y)\, dy \right] dx = \int x\, f_X(x)\, dx$$

$$E[Y] = \int y\, f_Y(y)\, dy$$

If X and Y are independent, then

$$f_{X,Y}(x,y) = f_X(x)\, f_Y(y)$$

and in this case

$$E[XY] = E[X]\, E[Y]$$
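The identity E[XY] = E[X]E[Y] for independent variables can be checked by simulation; the two independent uniform(0, 1) draws below are an illustrative choice:

```python
import random

random.seed(2)
N = 200_000
xs = [random.random() for _ in range(N)]  # X samples
ys = [random.random() for _ in range(N)]  # Y samples, drawn independently of X

e_x = sum(xs) / N
e_y = sum(ys) / N
e_xy = sum(x * y for x, y in zip(xs, ys)) / N

# For independent X, Y the product of means should match the mean of products.
assert abs(e_xy - e_x * e_y) < 0.01
```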


Two random variables x and y are jointly gaussian if their density function is:

$$n(x, y) = \frac{1}{2\pi \sigma_x \sigma_y \sqrt{1-r^2}} \exp\left\{ -\frac{1}{2(1-r^2)} \left[ \frac{x^2}{\sigma_x^2} - \frac{2 r x y}{\sigma_x \sigma_y} + \frac{y^2}{\sigma_y^2} \right] \right\}$$

where

$$r = \frac{\sigma_{xy}}{\sigma_x \sigma_y}$$

A random function is one arising as the outcome of an experiment. Random functions need not necessarily be functions of time, but in all cases of interest to us they will be.

A discrete stochastic process is characterized by many probability densities of the form

$$p(x_1, x_2, x_3, \ldots, x_n,\; t_1, t_2, t_3, \ldots, t_n)$$

If the individual values of the random signal are independent, then

$$p(x_1, x_2, \ldots, x_n,\; t_1, t_2, \ldots, t_n) = p(x_1, t_1)\, p(x_2, t_2) \cdots p(x_n, t_n)$$

If these individual probability densities are all the same, then we have a sequence of independent, identically distributed (i.i.d.) samples.

Mean and autocorrelation can be determined in two ways:

  • The experiment can be repeated many times and the average taken over all these functions. Such an average is called an ensemble average.

  • Take any one of these functions as being representative of the ensemble and find the average from a number of samples of this one function. This is called a time average.

If the time average and ensemble average of a random function are the same, it is said to be ergodic.

A random function is said to be stationary if its statistics do not change as a function of time.

Any ergodic function is also stationary.

For a stationary signal we have:

$$p(x_1, x_2,\; t_1, t_2) = p(x_1, x_2, \tau), \qquad \overline{x(t)} = \bar{x}$$

where

$$\tau = t_2 - t_1$$

And the autocorrelation function is:

$$r(\tau) = \overline{x_1 x_2} = \sum_{x_1, x_2} x_1 x_2\, p(x_1, x_2, \tau)$$

When x(t) is ergodic, its mean and autocorrelation are:

$$\bar{x} = \lim_{N \to \infty} \frac{1}{2N} \sum_{t=-N}^{N} x(t)$$

$$r(\tau) = \langle x(t)\, x(t+\tau) \rangle = \lim_{N \to \infty} \frac{1}{2N} \sum_{t=-N}^{N} x(t)\, x(t+\tau)$$
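These time averages over a single finite realization can be sketched as follows; the i.i.d. uniform(−1, 1) sequence is an illustrative stand-in for an ergodic signal:

```python
import random

random.seed(3)
N = 100_000
x = [random.uniform(-1.0, 1.0) for _ in range(N)]  # one realization

mean_t = sum(x) / N  # time-average estimate of the mean

def r(tau):
    # time average of x(t) x(t + tau) over the available samples
    return sum(x[t] * x[t + tau] for t in range(N - tau)) / (N - tau)

assert abs(mean_t) < 0.02            # true mean is 0
assert abs(r(0) - 1.0 / 3.0) < 0.01  # E[x^2] = 1/3 for uniform(-1, 1)
assert abs(r(5)) < 0.01              # independent samples are uncorrelated
```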

The cross-correlation of two ergodic random functions is:

$$r_{xy}(\tau) = \langle x(t)\, y(t+\tau) \rangle = \lim_{N \to \infty} \frac{1}{2N} \sum_{t=-N}^{N} x(t)\, y(t+\tau)$$

The subscript xy indicates a cross-correlation.

The Fourier transform of r(τ) (the autocorrelation function of an ergodic random function) is called the power spectral density of x(t):

$$S(\omega) = \sum_{\tau} r(\tau)\, e^{-j\omega\tau}$$

The cross-spectral density of two ergodic random functions is:

$$S_{xy}(\omega) = \sum_{\tau} r_{xy}(\tau)\, e^{-j\omega\tau}$$

For an ergodic signal x(t), r(τ) can be written as:

$$r(\tau) = x(\tau) * x(-\tau)$$

Then from elementary Fourier transform properties,

$$S(\omega) = X(\omega)\, X^*(\omega) = |X(\omega)|^2$$

If all values of a random signal are uncorrelated,

$$r(\tau) = \sigma^2\, \delta(\tau)$$

then this random function is called white noise. The power spectrum of white noise is constant,

$$S(\omega) = \sigma^2$$

White noise is a mixture of all frequencies.
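A simulation sketch of discrete white noise: i.i.d. samples should give r(0) ≈ σ² and r(τ) ≈ 0 elsewhere. The choice σ = 2 and the gaussian draw are illustrative:

```python
import random

random.seed(5)
N = 100_000
sigma = 2.0
w = [random.gauss(0.0, sigma) for _ in range(N)]  # iid -> white noise model

def r(tau):
    # time-average estimate of the autocorrelation at lag tau
    return sum(w[t] * w[t + tau] for t in range(N - tau)) / (N - tau)

assert abs(r(0) - sigma ** 2) < 0.1  # r(0) = sigma^2
assert abs(r(1)) < 0.1               # off-zero lags vanish: r(tau) = sigma^2 delta(tau)
assert abs(r(10)) < 0.1
```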
