Probability Theory and Random Processes

Page 1: Probablity

Probability Theory and Random Processes

Page 2: Probablity

Title: Probability Theory and Random Processes
Course Code: 07B41MA106 (3-1-0), 4 Credits
Pre-requisite: Nil
Objectives: To study
● Probability: its applications in studying the outcomes of random experiments
● Random variables: types, characteristics, modeling random data
● Stochastic systems: their reliability
● Random processes: types, properties and characteristics, with special reference to signal processing and trunking theory

Learning Outcomes: Students will be able to
(i) model real-life random processes using appropriate statistical distributions;
(ii) compute the reliability of different stochastic systems;
(iii) apply the knowledge of random processes in signal processing and trunking theory.

Page 3: Probablity

Evaluation Scheme

Evaluation Component       Weightage (in percent)
Teacher Assessment         25 (based on assignments, quizzes, attendance etc.)
T1 (1 hour)                20
T2 (1 hour 15 min)         25
T3 (1 hour 30 min)         30
----------------------------------------------------------------------
Total                      100

Page 4: Probablity

Reference Material:
1. T. Veerarajan. Probability, Statistics and Random Processes. Tata McGraw-Hill.
2. J. I. Aunon & V. Chandrasekhar. Introduction to Probability and Random Processes. McGraw-Hill International Ed.
3. A. Papoulis & S. U. Pillai. Probability, Random Variables and Stochastic Processes. Tata McGraw-Hill.
4. H. Stark and J. M. Woods. Probability and Random Processes with Applications to Signal Processing.

Page 5: Probablity

The study of probabilities originally came from gambling!

Origins of Probability

Page 6: Probablity

Why are Probabilities Important?

• They help you to make good decisions, e.g.,– Decision theory

• They help you to minimize risk, e.g.,– Insurance

• They are used in average-case time complexity analyses of
– Computer algorithms.
• They are used to model processes in
– Engineering.

Page 7: Probablity

Random Experiments
• An experiment whose outcome or result can be predicted with certainty is called a deterministic experiment.

• Although all possible outcomes of an experiment may be known in advance, the outcome of a particular performance of the experiment cannot be predicted owing to a number of unknown causes. Such an experiment is called a random experiment.

• A random experiment is an experiment that can be repeated over and over, possibly giving different results each time.
• e.g. throwing a fair 6-faced cubic die; the number of telephone calls received at a board in a 5-min interval.

Page 8: Probablity

Probability theory is a study of random orunpredictable experiments and is helpful in investigating the important features of these random experiments.

Page 9: Probablity

Probability Definitions

• For discrete math, we focus on the discrete version of probabilities.

• For each random experiment, there is assumed to be a finite set of discrete possible results, called outcomes. Each time the experiment is run, one outcome occurs. The set of all possible outcomes is called the sample space.

Page 10: Probablity

Example.

 

If the experiment consists of flipping two coins, then the sample space is:

 

S = {(H, H), (H, T), (T, H), (T, T)}

Page 11: Probablity

Example.

 

If the experiment consists of tossing two dice, then the sample space is:

 

S = {(i, j) | i, j = 1, 2, 3, 4, 5, 6}

Page 12: Probablity

More Probability Definitions

• A subset (say E) of the sample space is called an event. In other words, events are sets of outcomes.

( If the outcome of the experiment is contained in E, then we say E has occurred.)

For each event, we assign a number between 0 and 1, which is the probability that the event occurs.

Page 13: Probablity

Example.

 

If the experiment consists of flipping two

coins, and E is the event that a head

appears on the first coin, then E is:

 

E = {(H, H), (H, T)}

Page 14: Probablity

Example.

 

If the experiment consists of tossing two

dice, and E is the event that the sum of the

two dice equals 7, then E is:

 

E = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}
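The two-dice event can be checked by brute-force enumeration. A short Python sketch (not part of the slides) lists the sample space and picks out the outcomes summing to 7:

```python
from itertools import product

# Enumerate the sample space of two dice and the event "sum equals 7".
S = list(product(range(1, 7), repeat=2))
E = [(i, j) for (i, j) in S if i + j == 7]
print(len(S), len(E))  # 36 6
```

This confirms the six favourable outcomes, so p(E) = 6/36 = 1/6.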

Page 15: Probablity

Union: E ∪ F

Intersection: E ∩ F, also denoted as EF

(If E ∩ F = ∅, then E and F are said to be mutually exclusive.)

Page 16: Probablity

NOTE

It may or may not be true that all outcomes are equally likely. If they are, each of the n outcomes is assigned the same probability 1/n.

Page 17: Probablity

Definition of Probability in Text

The probability of an event E is the sum of the probabilities of the outcomes in E:

p(E) = n(E)/n(S) = (number of cases favourable to E) / (exhaustive number of cases in S)

Page 18: Probablity

Example

• Rolling a die is a random experiment.

• The outcomes are: 1, 2, 3, 4, 5, and 6, presumably each having an equal probability of occurrence (1/6).

• One event is “odd numbers”, which consists of outcomes 1, 3, and 5. The probability of this event is 1/6 + 1/6 + 1/6 = 3/6 = 0.5.

Page 19: Probablity

Example• An urn contains 4 green balls and 6 red balls.

What is the probability that a ball chosen from the urn will be green?

• There are 10 possible outcomes, and all are assumed to be equally likely. Of these, 4 of them yield a green ball. So probability is 4/10 = 0.4.

Page 20: Probablity

Example• What is the probability that a person wins the

lottery by picking the correct 6 lucky numbers out of 40? It is assumed that every number has the same probability of being picked (equally likely).

Using combinatorics, recall that the total number of ways we can choose 6 numbers out of 40 is:

C(40,6) = 40! / (34! 6!) = 3,838,380. Therefore, the probability is 1/3,838,380.
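The binomial coefficient can be evaluated directly with the standard library; a minimal sketch:

```python
import math

# Number of ways to choose 6 numbers out of 40; the single winning
# combination then has probability 1/C(40, 6).
total = math.comb(40, 6)
print(total)  # 3838380
```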

Page 21: Probablity

Examples
• Consider an experiment in which a coin is tossed twice.
• Sample space: {HH, HT, TH, TT}
• Let E be the event that at least one head shows up on the two tosses. Then E = {HH, HT, TH}.
• Let F be the event that heads occurs on the first toss. Then F = {HH, HT}.
• A natural assumption is that all four possible outcomes in the sample space are equally likely, i.e., each has probability ¼. Then P(E) = ¾ and P(F) = ½.

Page 22: Probablity

Probability as a Frequency

Let a random experiment be repeated n times and let an event A occur n_A times out of the n trials. The ratio n_A/n is called the relative frequency of the event A. As n increases, n_A/n shows a tendency to stabilise and to approach a constant value. This value, denoted by P(A), is called the probability of the event A, i.e.,

P(A) = lim (n→∞) n_A/n

Page 23: Probablity

Frequency Definition of Probability

• Consider probability as a measure of the frequency of occurrence.
– For example, the probability of "heads" in a coin flip is essentially equal to the number of heads observed in T trials, divided by T, as T approaches infinity:

Pr(heads) = lim (T→∞) (number of heads)/T

Page 24: Probablity

Probability as a Frequency• Consider a random experiment with possible outcomes

w1, w2, …,wn. For example, we roll a die and the possible outcomes are 1,2,3,4,5,6 corresponding to the side that turns up. Or we toss a coin with possible outcomes H (heads) or T (tails).

• We assign a probability p(wj) to each possible outcome wj in such a way that:

p(w1) + p(w2) + … + p(wn) = 1
• For the die, each outcome has probability 1/6. For the coin, each outcome has probability ½.
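The stabilising relative frequency is easy to see in simulation. A hedged sketch (seed and sample sizes are illustrative choices, not from the slides):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Relative frequency of heads in n fair-coin flips; as n grows the
# ratio tends to stabilise near the probability 1/2.
def relative_frequency(n):
    heads = sum(random.random() < 0.5 for _ in range(n))
    return heads / n

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```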

Page 25: Probablity

Example
To find the probability that a spare part produced by a machine is defective: if, out of 10,000 items produced, 500 are defective, the probability of a defective item is taken to be 500/10,000 = 0.05.

Page 26: Probablity

Axioms of Probability

[Venn diagrams: A ∪ B, A ∩ B, and the complement of A]

Axioms (where A and B are events):
• 0 ≤ P(A) ≤ 1
• P(S) = 1; P(∅) = 0
• P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
• If A and B are disjoint (mutually exclusive events), then P(A ∪ B) = P(A) + P(B)

Page 27: Probablity

Example• Rolling a die is a random experiment.• The outcomes are: 1, 2, 3, 4, 5, and 6. Suppose the die

is “loaded” so that 3 appears twice as often as every other number. All other numbers are equally likely. Then to figure out the probabilities, we need to solve:

p(1) + p(2) + p(3) + p(4) + p(5) + p(6) = 1 and p(3) = 2*p(1) and p(1) = p(2) = p(4) = p(5) = p(6). Solving, we get p(1) = p(2) = p(4) = p(5) = p(6) = 1/7 and p(3) = 2/7.

• One event is “odd numbers”, which consists of outcomes 1, 3, and 5. The probability of this event is: p(odd) = p(1) + p(3) + p(5) = 1/7 + 2/7 + 1/7 = 4/7.
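The loaded-die arithmetic can be verified with exact fractions; a minimal sketch:

```python
from fractions import Fraction

# Loaded die: p(3) = 2x while the other five faces have probability x,
# so 5x + 2x = 1 gives x = 1/7.
x = Fraction(1, 7)
p = {1: x, 2: x, 3: 2 * x, 4: x, 5: x, 6: x}
print(sum(p.values()), p[1] + p[3] + p[5])  # 1 4/7
```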

Page 28: Probablity

Theorem 1

The probability of the impossible event is zero, i.e., if ∅ is the (event) subset containing no sample point, P(∅) = 0.

Proof
The certain event S and the impossible event ∅ are mutually exclusive.
Hence P(S ∪ ∅) = P(S) + P(∅).
But S ∪ ∅ = S.
∴ P(S) = P(S) + P(∅), so P(∅) = 0.

Page 30: Probablity

Theorem (Addition Theorem)

If A and B are any two events,

P(A ∪ B) = P(A) + P(B) – P(A ∩ B) ≤ P(A) + P(B)

Proof:
A is the union of the mutually exclusive events A ∩ Bc and A ∩ B, and B is the union of the mutually exclusive events Ac ∩ B and A ∩ B.

∴ P(A) = P(A ∩ Bc) + P(A ∩ B) and P(B) = P(Ac ∩ B) + P(A ∩ B).

A ∪ B is the union of the mutually exclusive events A ∩ Bc, A ∩ B and Ac ∩ B.

∴ P(A ∪ B) = P(A ∩ Bc) + P(A ∩ B) + P(Ac ∩ B)
           = [P(A ∩ Bc) + P(A ∩ B)] + [P(Ac ∩ B) + P(A ∩ B)] – P(A ∩ B)
           = P(A) + P(B) – P(A ∩ B). Hence proved.

[Venn diagram: S containing A and B, split into the regions A ∩ Bc, A ∩ B and Ac ∩ B]

Page 33: Probablity

If event F occurs, what is the probability

that event E also occurs? 

This probability is called conditional

probability and denoted as p(E|F).

Conditional Probability

Definition of Conditional Probability

If p(F) > 0, then

p(E|F) = p(E ∩ F) / p(F)

Page 34: Probablity

Example.

 

An urn contains 8 red balls and 4 white

balls. We draw 2 balls from the urn

without replacement. What is the

probability that both balls are red?

Page 35: Probablity

Solution: Let E be the event that both balls drawn are red. Then p(E) = C(8, 2)/C(12, 2).

Or, we can solve the problem using the conditional probability approach.

Let E1 and E2 denote, respectively, the events that the first and second balls drawn are red. Then

p(E1E2) = p(E1) p(E2 | E1) = (8/12)(7/11)
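Both routes to the answer can be compared numerically; a minimal sketch:

```python
import math

# Both balls red: the direct count C(8,2)/C(12,2) and the chain
# p(E1) * p(E2 | E1) = (8/12) * (7/11) give the same value.
direct = math.comb(8, 2) / math.comb(12, 2)
chained = (8 / 12) * (7 / 11)
print(direct, chained)
```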

Page 36: Probablity

Multiplication Rule

 

p(E1E2E3 … En) = p(E1) p(E2|E1) p(E3|E1E2) … p(En|E1E2 … En−1)

Page 37: Probablity

Example.

 

A deck of 52 playing cards is randomly

divided into 4 piles of 13 cards each.

Compute the probability that each pile

has exactly one ace.

Page 38: Probablity

Solution: Define events  

E1 = the first pile has exactly one ace 

E2 = the second pile has exactly one ace 

E3 = the third pile has exactly one ace 

E4 = the fourth pile has exactly one ace

Page 39: Probablity

p(E1E2E3E4) = p(E1)p(E2| E1)p(E3|E1E2)p(E4|E1E2E3)

p(E1) = C(4,1)C(48,12)/C(52,13)

 

p(E2|E1) = C(3,1)C(36,12)/C(39,13)

 

p(E3|E1E2) = C(2,1)C(24,12)/C(26,13)

 

p(E4|E1E2 E3) = C(1,1)C(12,12)/C(13,13)

 

p(E1E2E3E4) ≈ 0.1055
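The product of the four conditional probabilities above can be evaluated directly; a minimal sketch:

```python
import math

C = math.comb

# Chain of conditional probabilities for "one ace in each 13-card pile".
p = (C(4, 1) * C(48, 12) / C(52, 13)) * \
    (C(3, 1) * C(36, 12) / C(39, 13)) * \
    (C(2, 1) * C(24, 12) / C(26, 13)) * \
    (C(1, 1) * C(12, 12) / C(13, 13))
print(round(p, 4))  # ≈ 0.1055
```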

Page 40: Probablity

Let E and F be events. We can express E as

E = EF ∪ EFc

Where Fc is the complementary event of F.

 

Therefore, we have

 

p(E) = p(EF) + p(EFc) = p(E|F) p(F) + p(E|Fc) p(Fc)

Page 41: Probablity

Independent Events

Two events E and F are independent if

p(EF)=p(E)p(F)

Two events that are not independent are said to be dependent.

Page 42: Probablity

p(EF) = p(E)p(F) if and only if p(E|F) = p(E).

 

If E and F are independent, then so are E and

Fc.

Page 43: Probablity

Theorem

If the events A and B are independent, then the events A and Bc are also independent.

Proof
The events A ∩ B and A ∩ Bc are mutually exclusive and A = (A ∩ B) ∪ (A ∩ Bc), so that P(A) = P(A ∩ B) + P(A ∩ Bc) (by the addition theorem).

∴ P(A ∩ Bc) = P(A) – P(A ∩ B)
            = P(A) – P(A) P(B)   (by the product theorem)
            = P(A)[1 – P(B)] = P(A) P(Bc)

Page 44: Probablity

Three events E, F, and G are independent if

p(EFG) = p(E)p(F)p(G)
p(EF) = p(E)p(F)
p(EG) = p(E)p(G)
p(FG) = p(F)p(G)

Page 45: Probablity

Example. Two fair dice are thrown. Let E denote the event that the sum of the dice is 7. Let F denote the event that the first die is 4 and let G be the event that the second die is 3. Is E independent of F? Is E independent of G? Is E independent of FG?

Page 46: Probablity

p(E) = 6/36 = 1/6, p(F) = 1/6, p(G) = 1/6, p(EF) = 1/36

⇒ E and F are independent.

Page 47: Probablity

p(EG) = 1/36 ⇒ E and G are independent.
p(FG) = 1/36 ⇒ F and G are independent.
p(EFG) = 1/36, but p(E) p(FG) = (1/6)(1/36) ≠ 1/36 ⇒ E and FG are NOT independent.

Page 48: Probablity

Bayes' Theorem

Let B1, B2, …, Bk be a partition of the sample space S. Let P(A/Bi) and P(Bi) be the conditional and prior probabilities for i = 1 to k. Then

P(Bi/A) = P(A/Bi) P(Bi) / Σ (j=1 to k) P(A/Bj) P(Bj),   i = 1, 2, …, k

Proof
Given B1, B2, …, Bk is a partition of the sample space S, i.e.,

(i) Bi ∩ Bj = ∅, i ≠ j

Page 49: Probablity

[Figure: the sample space S partitioned into B1, B2, B3, B4, with the event A overlapping each Bi]

Page 50: Probablity

(ii) ∪ (i=1 to k) Bi = S, (iii) P(Bi) > 0 ∀ i.

Let A be the event associated with S. Then

A = A ∩ S = A ∩ (B1 ∪ B2 ∪ … ∪ Bk)
  = (A ∩ B1) ∪ (A ∩ B2) ∪ … ∪ (A ∩ Bk)   (by distributive law)

Also, all the events A ∩ B1, A ∩ B2, …, A ∩ Bk are pairwise mutually exclusive, for (A ∩ Bi) ∩ (A ∩ Bj) = ∅, i ≠ j.

Page 51: Probablity

Then P(A) = P(A ∩ B1) + P(A ∩ B2) + … + P(A ∩ Bk).

However, each term P(A ∩ Bj) may be expressed as P(A/Bj) P(Bj), and hence we obtain the theorem on total probability:

P(A) = P(A/B1) P(B1) + P(A/B2) P(B2) + … + P(A/Bk) P(Bk)
     = Σ (j=1 to k) P(A/Bj) P(Bj)

Page 52: Probablity

P(A ∩ Bi) = P(Bi) P(A/Bi) = P(A) P(Bi/A)

∴ P(Bi/A) = P(Bi) P(A/Bi) / P(A)
          = P(Bi) P(A/Bi) / Σ (j=1 to k) P(Bj) P(A/Bj)

Page 53: Probablity

Example. In answering a question on a multiple-choice test, a student either knows the answer or guesses. Let p be the probability that the student knows the answer and 1 − p the probability that the student guesses. Assume that a student who guesses at the answer will be correct with probability 1/m, where m is the number of multiple-choice alternatives. What is the (conditional) probability that a student knew the answer to a question, given that his answer is correct?

Page 54: Probablity

Solution:

 

C = the student answers the question correctly,

K = the student actually knows the answer

p(K|C) = p(C|K) p(K) / [p(C|K) p(K) + p(C|~K) p(~K)]
       = p / [p + (1/m)(1 − p)]
       = mp / [1 + (m − 1)p]
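The posterior p / [p + (1 − p)/m] = mp / [1 + (m − 1)p] for this multiple-choice example can be checked numerically; the values p = 0.5 and m = 4 below are illustrative choices, not from the slides:

```python
# Posterior that the student knew the answer, given a correct answer.
# p = 0.5 and m = 4 are assumed illustrative values.
p, m = 0.5, 4
bayes = p / (p + (1 - p) / m)
closed_form = m * p / (1 + (m - 1) * p)
print(bayes, closed_form)
```

Both expressions agree, confirming the algebraic simplification.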

Page 55: Probablity

Example.

 

When coin A is flipped it comes up heads

with probability ¼, whereas when coin B

is flipped it comes up heads with probability ¾.

Suppose that one coin is randomly chosen and

is flipped twice. If both flips land heads, what is

the probability that coin B was the one chosen?

Page 56: Probablity

Solution:

 

C = coin B is chosen

H = both flips show head

p(C|H) = p(H|C) p(C) / [p(H|C) p(C) + p(H|~C) p(~C)]
       = (¾)²(½) / [(¾)²(½) + (¼)²(½)]
       = (9/16) / (10/16) = 9/10 = 0.9

Page 57: Probablity

Example. A laboratory test is 95 percent correct in detecting a certain disease when the disease is actually present. However, the test also yields a "false positive" result for 1 percent of the healthy people tested. If 0.5 percent of the population has the disease, what is the probability a person has the disease given that his test result is positive?


Page 58: Probablity

Solution:

 

D = the person has the disease

E = the test result is positive

p(D|E) = p(E|D) p(D) / [p(E|D) p(D) + p(E|~D) p(~D)]
       = (0.95)(0.005) / [(0.95)(0.005) + (0.01)(0.995)]
       = 95/294 ≈ 0.323
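The Bayes computation for the laboratory-test example is a one-liner to verify; a minimal sketch:

```python
# P(disease | positive) = (0.95)(0.005) / [(0.95)(0.005) + (0.01)(0.995)]
num = 0.95 * 0.005
posterior = num / (num + 0.01 * 0.995)
print(round(posterior, 3))  # 0.323
```

Despite the accurate test, the posterior is only about 1/3, because the disease is rare.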

Page 59: Probablity

Example.

A suspect is believed 60 percent guilty. Suppose now a new piece of evidence shows the criminal is left-handed. If 20 percent of the population is left-handed, and it turns out that the suspect is also left-handed, then does this change the guilty probability of the suspect? By how much?

Page 60: Probablity

Solution:

 

G = the suspect is guilty

LH = the suspect is left-handed

p(G|LH) = p(LH|G) p(G) / [p(LH|G) p(G) + p(LH|~G) p(~G)]
        = (1.0)(0.6) / [(1.0)(0.6) + (0.2)(0.4)]
        = 60/68 ≈ 0.88

Page 61: Probablity

Random Variable

Page 62: Probablity

Definition: A random variable (RV) is a function X : S → R that assigns a real number X(s) to every element s ∈ S, where S is the sample space corresponding to a random experiment E.

Discrete Random Variable
If X is a random variable (RV) which can take a finite number or countably infinite number of values, X is called a discrete RV.

E.g. 1. The number shown when a die is thrown, and 2. the number of alpha particles emitted by a radioactive source, are discrete RVs.

Page 63: Probablity

Example
Suppose that we toss two coins and consider the sample space associated with this experiment. Then S = {HH, HT, TH, TT}. Define the random variable X as follows: X is the number of heads obtained in the two tosses. Hence X(HH) = 2, X(HT) = 1 = X(TH) and X(TT) = 0.

Note that to every s in S there corresponds exactly one value X(s), while different values of s may lead to the same value of X.

E.g. X(HT) = X(TH)

Page 64: Probablity

Probability Function
If X is a discrete RV which can take the values x1, x2, x3, …, such that P(X = xi) = pi, then pi is called the probability function or probability mass function or point probability function, provided the pi (i = 1, 2, 3, …) satisfy the following conditions:

(i) pi ≥ 0, ∀ i, and
(ii) Σi pi = 1.

Page 65: Probablity

Example of a Discrete PDF

• Suppose that 10% of all households have no children, 30% have one child, 40% have two children, and 20% have three children.

• Select a household at random and let X = number of children.

• What is the pmf of X?

Page 66: Probablity

Example of a Discrete PDF

• We may list each value.– P(X = 0) = 0.10– P(X = 1) = 0.30– P(X = 2) = 0.40– P(X = 3) = 0.20

Page 67: Probablity

Example of a Discrete PDF

• Or we may present it as a chart.

x P(X = x)

0 0.10

1 0.30

2 0.40

3 0.20

Page 68: Probablity

Example of a Discrete PDF

• Or we may present it as a stick graph.

[Stick graph: P(X = x) against x = 0, 1, 2, 3 with heights 0.10, 0.30, 0.40, 0.20]

Page 69: Probablity

Example of a Discrete PDF

• Or we may present it as a histogram.

[Histogram: P(X = x) over x = 0, 1, 2, 3 with heights 0.10, 0.30, 0.40, 0.20]

Page 70: Probablity

Example
The pmf of a random variable X is given by p(i) = c λ^i / i!, i = 0, 1, 2, …, where λ is some positive value. Find (i) P(X = 0) and (ii) P(X ≥ 2).

Solution. Since Σ (i=0 to ∞) p(i) = 1, we have c Σ (i=0 to ∞) λ^i / i! = 1.

Page 71: Probablity

As Σ (i=0 to ∞) λ^i / i! = e^λ, we have c e^λ = 1, i.e., c = e^(−λ).

Hence P(X = 0) = e^(−λ) λ^0 / 0! = e^(−λ)

P(X ≥ 2) = 1 − P(X < 2)
         = 1 − [P(X = 0) + P(X = 1)]
         = 1 − e^(−λ) − λ e^(−λ)
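These two Poisson probabilities can be evaluated numerically; λ = 2 below is an illustrative value (the slides leave λ general):

```python
import math

# Poisson pmf p(i) = e^(-lam) lam^i / i!, with the assumed value lam = 2.
lam = 2.0

def pmf(i):
    return math.exp(-lam) * lam ** i / math.factorial(i)

p0 = pmf(0)                  # e^(-lam)
p_ge2 = 1 - pmf(0) - pmf(1)  # 1 - e^(-lam) - lam e^(-lam)
print(round(p0, 4), round(p_ge2, 4))
```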

Page 72: Probablity

P(X = x) = e^(−λ) λ^x / x!

Page 73: Probablity

If X represents the total number of heads obtained when a fair coin is tossed 5 times, find the probability distribution of X.

X : 0     1     2     3     4     5
P : 1/32  5/32  10/32 10/32 5/32  1/32
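The distribution of heads in 5 tosses follows from P(X = k) = C(5, k)(1/2)^5; a minimal sketch:

```python
import math

# Counts C(5, k) for k heads in 5 fair tosses; dividing by 2^5 = 32
# gives the probability distribution of X.
counts = [math.comb(5, k) for k in range(6)]
pmf = {k: c / 32 for k, c in enumerate(counts)}
print(counts)  # [1, 5, 10, 10, 5, 1]
```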

Page 74: Probablity

Continuous Random Variable
If X is an RV which can take all values (i.e., an infinite number of values) in an interval, then X is called a continuous RV.

E.g. X(x) = 0 for x ≤ 0; x for 0 < x < 1; 1 for x ≥ 1.

Page 75: Probablity

Probability Density Function
If X is a continuous RV, then f is said to be the probability density function (pdf) of X if it satisfies the following conditions:

(i) f(x) ≥ 0, ∀ x ∈ Rx, and (ii) ∫ (over Rx) f(x) dx = 1

(iii) For any a, b with −∞ < a < b < ∞,

P(a ≤ X ≤ b) = ∫ (a to b) f(x) dx

Page 76: Probablity

When X is a continuous RV,

P(X = a) = P(a ≤ X ≤ a) = ∫ (a to a) f(x) dx = 0

This means that it is almost impossible that a continuous RV assumes a specific value.

Hence P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b).

Page 77: Probablity

Probability Density Function

[Plot of fX(x) against x, for x from −3 to 15]

fX(x) = 0 if x < 0; (1/10) e^(−x/10) if x ≥ 0

Page 78: Probablity

Example
Check whether the function f(x) = 4x³, 0 ≤ x ≤ 1, is a probability density function or not.

Solution. Clearly f(x) ≥ 0 ∀ x ∈ [0, 1]. Also ∫ (0 to 1) f(x) dx = ∫ (0 to 1) 4x³ dx = 1.

Page 79: Probablity

A random variable X has the density function

f(x) = 1/4, −2 < x < 2; 0 elsewhere.

Obtain (i) P(X < 1), (ii) P(|X| > 1), (iii) P(2X + 3 > 5).

(Ans. ¾, 1/2, 1/4)

Page 80: Probablity

Find the formula for the probability distribution of the number of heads when a fair coin is tossed 4 times.

Page 81: Probablity

Cumulative Distribution Function (cdf)
If X is an RV, discrete or continuous, then P(X ≤ x) is called the cumulative distribution function of X, or distribution function of X, and denoted as F(x).

If X is discrete, F(x) = Σ (j : xj ≤ x) pj

If X is continuous, F(x) = P(−∞ < X ≤ x) = ∫ (−∞ to x) f(t) dt

Page 82: Probablity

Probability Density Function

[Plot of fX(x), repeated for comparison with the cdf below]

fX(x) = 0 if x < 0; (1/10) e^(−x/10) if x ≥ 0

Page 83: Probablity

Cumulative Distribution Function

[Plot of FX(x) rising from 0 to 1, for x from −3 to 15]

FX(x) = 0 if x < 0; 1 − e^(−x/10) if x ≥ 0

Page 84: Probablity

Probability Density Function

[Plot of fX(x) against x, for x from −100 to 800]

fX(x) = 0 if x < 0; 1/u if 0 ≤ x ≤ u; 0 if u < x

Page 85: Probablity

Cumulative Distribution Function

[Plot of FX(x) rising linearly from 0 to 1, for x from −100 to 800]

FX(x) = 0 if x < 0; x/u if 0 ≤ x ≤ u; 1 if u < x

Page 86: Probablity

If the probability density of a random variable is given by f(x) = K(1 − x²), 0 < x < 1; 0 otherwise, find (i) K, (ii) the cumulative distribution function of the random variable.

(Ans. K = 3/2; F(x) = (3/2)[x − x³/3], 0 < x < 1)

A random variable X has the density function

f(x) = K · 1/(1 + x²), −∞ < x < ∞; 0 otherwise.

Determine K and the distribution function.

(Ans. K = 1/π; F(x) = (1/π) tan⁻¹x + 1/2, −∞ < x < ∞)

Page 87: Probablity

Properties of the cdf F(x)

1. F(x) is a non-decreasing function of x, i.e., if x1 < x2, then F(x1) ≤ F(x2).

2. F(−∞) = 0 & F(∞) = 1.

3. If X is a discrete RV taking values x1, x2, …, where x1 < x2 < x3 < … < xi−1 < xi < …, then P(X = xi) = F(xi) − F(xi−1).

4. If X is a continuous RV, then (d/dx) F(x) = f(x) at all points where F(x) is differentiable.

Page 88: Probablity

Definitions

If X is a discrete RV, then the expected value or mean value of g(X) is defined as E{g(X)} = Σi g(xi) pi, where pi = P(X = xi) is the probability mass function of X.

If X is a continuous RV with pdf f(x), then E{g(X)} = ∫ (over Rx) g(x) f(x) dx.

Page 89: Probablity

Two expected values which are most commonly used for characterising a RV X are its mean μx and variance σx².

μx = E(X) = Σi xi pi, if X is discrete;
          = ∫ (over Rx) x f(x) dx, if X is continuous.

Var(X) = σx² = E[(X − μx)²]
       = Σi (xi − μx)² pi, if X is discrete;
       = ∫ (over Rx) (x − μx)² f(x) dx, if X is continuous.

The square root of the variance is called the standard deviation.

Page 90: Probablity

Var(X) = E{(X − μx)²} = E{X² − 2μx X + μx²}
       = E(X²) − 2μx E(X) + μx²   (since μx is a constant)
       = E(X²) − μx²              (since μx = E(X))

∴ Var(X) = E(X²) − {E(X)}²

Example
Find the expected value of the number on a die when thrown. (Ans. 7/2)
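The die expectation works out to 21/6; a minimal sketch:

```python
# E(X) for a fair die: each face i = 1..6 has probability 1/6.
mean = sum(range(1, 7)) / 6
print(mean)  # 3.5
```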

Page 91: Probablity

Example
A random variable X has the following probability distribution:

x    : −2   −1   0    1    2    3
p(x) : 0.1  K    0.2  2K   0.3  3K

(a) Find K, (b) evaluate P(X < 2) and P(−2 < X < 2), (c) find the cdf of X and (d) evaluate the mean of X.

The probability function of an infinite discrete distribution is given by P(X = j) = 1/2^j (j = 1, 2, 3, …). Verify that the total probability is 1 and find the mean and variance of the distribution. Find also P(X is even), P(X ≥ 5) & P(X is divisible by 3).

Page 92: Probablity

Example
If p(x) = x e^(−x²/2), x > 0; 0, x ≤ 0,
(a) show that p(x) is a pdf (of a continuous RV X);
(b) find its distribution function.

(F(x) = 1 − e^(−x²/2), x ≥ 0)

A continuous random variable has the probability density function defined by f(x) = k x e^(−λx), x ≥ 0, λ > 0; 0 otherwise. Determine the constant k and find the mean and variance.

(k = λ², mean 2/λ, variance 2/λ²)

Page 93: Probablity

Moments
If X is a discrete or continuous RV, E(X^r) is called the r-th order raw moment of X about the origin and denoted by μr′.

μr′ = E(X^r) = Σx x^r f(x), if X is discrete;
             = ∫ (−∞ to ∞) x^r f(x) dx, if X is continuous.

Since the first and second moments about the origin are given by μ1′ = E(X) and μ2′ = E(X²),

mean = μ1′ = first moment about the origin, and
Var(X) = E(X²) − {E(X)}² = μ2′ − μ1′².

Page 94: Probablity

E[{X − E(X)}¹] = E(X) − E(X) = 0; E[{X − E(X)}²] = Var(X).

E{(X − μx)^n} is called the n-th order central moment of X and denoted by μn.

E{|X|^n} & E{|X − μx|^n} are called absolute moments of X.

E{(X − a)^n} & E{|X − a|^n} are called generalised moments of X.

Page 95: Probablity

Two-Dimensional Random Variables

Definition: Let S be the sample space associated with a random experiment E. Let X = X(s) and Y = Y(s) be two functions, each assigning a real number to each outcome s ∈ S. Then (X, Y) is called a two-dimensional random variable.

Two-dimensional discrete RV: (X, Y) = (xi, yj), i = 1, 2, 3, …, m; j = 1, 2, 3, …, n.
Two-dimensional continuous RV.

Page 96: Probablity

Probability Function of (X,Y)

If (X, Y) is a two-dimensional discrete RV such that P(x = xi, y = yj) = pij, then pij is called the probability mass function of (X, Y), provided

(i) pij ≥ 0, ∀ i & j
(ii) Σj Σi pij = 1

The set of triplets {xi, yj, pij}, i = 1, 2, …, j = 1, 2, 3, …, is called the joint probability distribution of (X, Y).

Page 97: Probablity

• If (X,Y) is a two-dimensional continuous RV. The joint probability density function (pdf) f is a function satisfying the following conditions

1dxdy)y,x(f

yx,0)y,x(f

2

1

2

1

xx

yy2121 dydx)y,x(f]yYy,xXxPr[

Page 98: Probablity

Cumulative Distribution Function

If (X, Y) is a two-dimensional RV (discrete or continuous), then F(x, y) = P{X ≤ x & Y ≤ y} is called the cdf of (X, Y).

If (X, Y) is discrete, F(x, y) = Σ (yj ≤ y) Σ (xi ≤ x) pij.

If (X, Y) is continuous, F(x, y) = ∫ (−∞ to y) ∫ (−∞ to x) f(u, v) du dv.

Page 99: Probablity

• Properties of the joint CDF

0 ≤ F(x, y) ≤ 1, ∀ x, y

F(x, −∞) = F(−∞, y) = F(−∞, −∞) = 0

F(∞, ∞) = 1

F(x, y) is a nondecreasing function as either x, or y, or both, increase.

F(x, ∞) = FX(x), F(∞, y) = FY(y)

F(x, y) = Pr(X ≤ x, Y ≤ y)

f(x, y) = ∂²F(x, y) / ∂x ∂y

Page 100: Probablity

Example: tossing two coins

X … random variable associated with the first coin
Y … random variable associated with the second coin

Sample space {(0,0), (0,1), (1,0), (1,1)}; 0 for tail, 1 for head.

F(0,0) = 1/4, F(0,1) = 2/4, F(1,0) = 2/4, F(1,1) = 1

[Plot of F(x, y): steps of height 1/4 and 1/2 as x and y cross 0 and 1]

Page 101: Probablity

Example
Three balls are drawn at random without replacement from a box containing 2 white, 3 red and 4 black balls. If X denotes the number of white balls drawn and Y denotes the number of red balls drawn, find the joint probability distribution of (X, Y).

Solution
As there are only 2 white balls in the box, X can take the values 0, 1, 2 and Y can take the values 0, 1, 2, 3.

P(X = 0, Y = 0) = P(drawing 3 balls, none of which is white or red)
               = P(all the 3 balls drawn are black)
               = C(4,3)/C(9,3) = 1/21

Page 102: Probablity

P(X=0,Y=1)=3/14,P(X=0,Y=2)=1/7………


Page 103: Probablity

X Y

0 1 2 3

0 1/21 3/14 1/7 1/84

1 1/7 2/7 1/14 0

2 1/21 1/28 0 0

Page 104: Probablity

For the bivariate probability distribution of (X, Y) given below, find P(X ≤ 1), P(X ≤ 1, Y ≤ 3), P(X ≤ 1 / Y ≤ 3), P(Y ≤ 3 / X ≤ 1) & P(X + Y ≤ 4).

X \ Y   1     2     3     4     5     6
0       0     0     1/32  2/32  2/32  3/32
1       1/16  1/16  1/8   1/8   1/8   1/8
2       1/32  1/32  1/64  1/64  0     2/64

(Ans. 7/8, 9/32, 18/23, 9/28, 13/32)

Page 105: Probablity

Example
The joint density function of (X, Y) is given by

f(x, y) = 2 e^(−x) e^(−2y), 0 < x < ∞, 0 < y < ∞; 0 otherwise.

Compute (i) P(X > 1, Y < 1), (ii) P(X < Y), (iii) P(X < a).

(Ans. e^(−1)(1 − e^(−2)), 1/3, 1 − e^(−a))

Page 106: Probablity

Marginal probability density function

For every fixed j,

p(xj, y1) + p(xj, y2) + p(xj, y3) + … = P{X = xj} = f(xj)

and for every fixed k,

p(x1, yk) + p(x2, yk) + p(x3, yk) + … = P{Y = yk} = g(yk)

The probability functions f(xj) and g(yk) are also called marginal probability density functions.

In the continuous case,

fX(x) = ∫ f(x, y) dy,  fY(y) = ∫ f(x, y) dx

Page 107: Probablity

As an illustrative example, consider a joint pdf of the form

f(x, y) = (6/5)(x² + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 elsewhere.

Integrating this wrt y alone and wrt x alone gives the two marginal pdfs:

fX(x) = (6/5)(x² + 1/2), 0 ≤ x ≤ 1

fY(y) = (6/5)(1/3 + y), 0 ≤ y ≤ 1
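The marginal computation can be spot-checked numerically, assuming the joint pdf f(x, y) = (6/5)(x² + y) on the unit square used in this example:

```python
# Midpoint-rule check that f(x, y) = (6/5)(x^2 + y) on the unit square
# integrates to 1, and that the x-marginal matches (6/5)(x^2 + 1/2).
n = 400
h = 1.0 / n

def f(x, y):
    return 1.2 * (x * x + y)

def marginal_x(x):
    # integrate out y at fixed x
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

total = sum(marginal_x((i + 0.5) * h) for i in range(n)) * h
print(round(total, 4), round(marginal_x(0.5), 4))
```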

Page 108: Probablity

Independent Random Variables

Two random variables X and Y, with joint probability density function f(x, y) and marginal probability functions fX(x) and fY(y), are independent if

F(x, y) = FX(x) FY(y)

or

f(x, y) = fX(x) fY(y)

for all x, y.

Page 109: Probablity

Example
A machine is used for a particular job in the forenoon and for a different job in the afternoon. The joint probability of (X, Y), where X and Y represent the number of times the machine breaks down in the forenoon and in the afternoon respectively, is given in the following table. Examine if X and Y are independent RVs.

X Y 0 1 2

0 0.1 0.04 0.06

1 0.2 0.08 0.12

2 0.2 0.08 0.12

Page 110: Probablity

X & Y are independent if pij = pi* p*j ∀ i, j, where pi* and p*j are the marginal (row and column) totals.

p0* = 0.1 + 0.04 + 0.06 = 0.2;  p1* = 0.4;  p2* = 0.4

p*0 = 0.5;  p*1 = 0.2;  p*2 = 0.3

p00 = 0.1 = 0.2 × 0.5 = p0* p*0
p01 = 0.04 = 0.2 × 0.2 = p0* p*1, and similarly for every other cell.

Hence the RVs X and Y are independent.
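The row-column product check for this table can be scripted; a minimal sketch:

```python
# Breakdown table from the example; rows index X, columns index Y.
P = [[0.1, 0.04, 0.06],
     [0.2, 0.08, 0.12],
     [0.2, 0.08, 0.12]]

row = [sum(r) for r in P]                                 # marginal of X
col = [sum(P[i][j] for i in range(3)) for j in range(3)]  # marginal of Y

# independent iff every cell equals the product of its marginals
independent = all(abs(P[i][j] - row[i] * col[j]) < 1e-9
                  for i in range(3) for j in range(3))
print(row, col, independent)
```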

Page 111: Probablity

The cumulative distribution function of the continuous random variable (X, Y) is given by

F(x, y) = 1 − e^(−x) − e^(−y) + e^(−(x+y)), x, y > 0; 0 otherwise.

Prove that X and Y are independent.

f(x, y) = ∂²F(x, y)/∂x∂y = e^(−(x+y)), x, y > 0; 0 otherwise.

f1(x) = e^(−x), x > 0; 0 otherwise.
f2(y) = e^(−y), y > 0; 0 otherwise.

F1(x) = 0, x ≤ 0; 1 − e^(−x), x > 0.
F2(y) = 0, y ≤ 0; 1 − e^(−y), y > 0.

F1(x) F2(y) = (1 − e^(−x))(1 − e^(−y)) = F(x, y)

Hence X and Y are independent.

Page 113: Probablity

Example
Let X and Y be the lifetimes of two electronic devices. Suppose that their joint pdf is given by

f(x, y) = e^(−(x+y)), x ≥ 0, y ≥ 0.

Then X and Y are independent.

Example
Suppose that f(x, y) = 8xy, 0 ≤ x ≤ y ≤ 1. Check the independence of X and Y.

Page 114: Probablity

Expectation of Product of random variables

If X and Y are mutually independent random variables, then the expectation of their product exists and is

E(XY) = E(X) E(Y)

Page 115: Probablity

Example
Assuming that the lifetime X and brightness Y of a lightbulb are being modeled as continuous random variables, let the pdf be given by

f(x, y) = λ1 λ2 e^(−(λ1 x + λ2 y)), 0 < x < ∞, 0 < y < ∞.

Find the joint distribution function.

Example
A line of length a units is divided into two parts. If the first part is of length X, find E(X), Var(X) and E{X(a − X)}.

Page 116: Probablity

Expectation of Sum of random variables

If X1, X2, …, Xn are random variables, then the expectation of their sum exists and is

E(X1 + X2 + … + Xn) = E(X1) + E(X2) + … + E(Xn)

For two discrete RVs,

E(X + Y) = Σj Σk (xj + yk) p(xj, yk)
         = Σj xj Σk p(xj, yk) + Σk yk Σj p(xj, yk)
         = E(X) + E(Y)

Page 117: Probablity

Example
What is the mathematical expectation of the sum of points on n dice? (Ans. (7/2)n)

A box contains 2^n tickets among which C(n, r) tickets bear the number r (r = 0, 1, 2, …, n). A group of m tickets is drawn. Let S denote the sum of their numbers. Find E(S) and Var S. (Ans. E(S) = (n/2)m)

Page 118: Probablity

For independent X and Y,

E(XY) = Σ(j,k) xj yk p(xj, yk)
      = Σj Σk xj yk f(xj) g(yk)
      = [Σj xj f(xj)] [Σk yk g(yk)]
      = E(X) E(Y)

Page 119: Probablity

Example
If the joint pdf of (X, Y) is given by f(x, y) = 24y(1 − x), 0 ≤ y ≤ x ≤ 1, find E(XY).

[Figure: the triangular region 0 ≤ y ≤ x ≤ 1 bounded above by the line y = x]

E(XY) = ∫ (0 to 1) ∫ (0 to x) x y f(x, y) dy dx

Ans. 4/15
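The double integral over the triangle can be sanity-checked numerically; a rough midpoint-rule sketch (grid size is an arbitrary choice):

```python
# Midpoint-rule estimate of E(XY) = integral of xy * 24y(1 - x) over the
# triangle 0 <= y <= x <= 1; the exact answer is 4/15.
n = 1000
h = 1.0 / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y <= x:  # stay inside the triangular region
            total += x * y * 24 * y * (1 - x) * h * h
print(round(total, 3))
```

The staircase approximation of the diagonal boundary limits the accuracy, but the estimate lands close to 4/15 ≈ 0.2667.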

Page 120: Probablity

Binomial Distribution (re-visit)

Suppose that n Bernoulli trials, each of which results in a success with probability p and in a failure with probability 1 − p, are performed. If Sn represents the number of successes that occur in the n Bernoulli trials, then Sn is said to be a binomial random variable with parameters n and p.

Page 121: Probablity

Let Xk be the number of successes scored at the k-th trial. Since Xk assumes only the values 0 and 1 with corresponding probabilities q and p, we have

E(Xk) = 0·q + 1·p = p

Page 122: Probablity

Since

Sn = X1 + X2+…+ Xn

 We have  

E(Sn) = E(X1+ X2+…+ Xn)

= E(X1) + E(X2) +… + E(Xn) = np

Page 123: Probablity

Conditional Probability

Now, consider the case where the event M depends on some other random variable Y.

F(x | M) = Pr{X ≤ x | M} = Pr{X ≤ x, M} / Pr(M),  Pr(M) ≠ 0

Page 124: Probablity

Conditional Probability Density Function

f(x | y) = f(x, y) / fY(y)

f(y | x) = f(x, y) / fX(x)

f(y | x) = f(x | y) fY(y) / fX(x)
– the continuous version of Bayes' theorem

fX(x) = ∫ f(x, y) dy = ∫ f(x | y) fY(y) dy
fY(y) = ∫ f(x, y) dx = ∫ f(y | x) fX(x) dx
– another expression of the marginal pdf

Page 125: Probablity


Page 126: Probablity

Suppose that p(x, y), the joint probability mass function of X and Y, is given by p(0,0) = 0.4, p(0,1) = 0.2, p(1,0) = 0.1, p(1,1) = 0.3. Calculate the conditional probability mass function of X given that Y = 1.

(Ans. 2/5, 3/5)

Page 127: Probablity

Example
Suppose that 15 percent of the families in a certain community have no children, 20% have 1, 35% have 2, & 30% have 3 children; suppose further that each child is equally likely (and independently) to be a boy or a girl. If a family is chosen at random from this community, then B, the number of boys, and G, the number of girls, in this family will have the following joint probability mass function.

Page 128: Probablity

       j = 0   j = 1   j = 2   j = 3
i = 0  .15     .10     .0875   .0375
i = 1  .10     .175    .1125   0
i = 2  .0875   .1125   0       0
i = 3  .0375   0       0       0

(i = number of boys B, j = number of girls G)

P(B = 1, G = 1) = P(BG or GB) = P(BG) + P(GB)
               = .35 × (1/2)(1/2) × 2 = .175

P(B = 2, G = 1) = P(BBG or BGB or GBB)
               = P(BBG) + P(BGB) + P(GBB)
               = .30 × (1/2)(1/2)(1/2) × 3 = .1125

Page 129: Probablity

If the family chosen has one girl, compute the conditional probability mass function of the number of boys in the family

Ans.8/31,14/31,9/31,0

Page 130: Probablity

Example
The joint density of X and Y is given by

f(x, y) = (12/5) x (2 − x − y), 0 < x < 1, 0 < y < 1; 0 otherwise

Compute the conditional density of X, given that Y = y, where 0 < y < 1.

f(x / y) = f(x, y) / f_Y(y)

ans: f(x / y) = 6x(2 − x − y) / (4 − 3y)

Page 131: Probablity

Example
The joint pdf of a two-dimensional RV (X, Y) is given by

f(x, y) = x y^2 + x^2 / 8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1

Compute P(X > 1), P(Y < 1/2), P(X > 1 / Y < 1/2), P(Y < 1/2 / X > 1), P(X < Y) and P(X + Y ≤ 1).

(Ans. 19/24, 1/4, 5/6, 5/19, 53/480, 13/480)

Page 132: Probablity

Variance of a Sum of random variables

If X and Y are random variables, then the variance of their sum is

Var(X + Y) = E({(X + Y) − (μ_X + μ_Y)}^2)

Page 133: Probablity

Var(X + Y) = E({(X − μ_X) + (Y − μ_Y)}^2)

= E{(X − μ_X)^2} + E{(Y − μ_Y)^2} + 2 E{(X − μ_X)(Y − μ_Y)}

= Var(X) + Var(Y) + 2 E{(X − μ_X)(Y − μ_Y)}

= Var(X) + Var(Y) + 2 [E(XY) − μ_X μ_Y]

The covariance of X and Y is defined by

Cov(X, Y) = E{(X − μ_X)(Y − μ_Y)} = E(XY) − μ_X μ_Y

Page 134: Probablity

• If X and Y are mutually independent, then Cov(X,Y) = 0.

Q: Is the reverse of the above true?

• If X and Y are mutually independent, then

Var(X + Y) = Var(X) + Var(Y)

Page 135: Probablity

• If X1, …, Xn are mutually independent, and

Sn = X1 + …+ Xn, then

Var(Sn) = Var(X1) + … + Var(Xn)

Q: Let Sn be a binomial random variable with parameter n and p. Show that

Var(Sn) = np(1-p)

Page 136: Probablity

ExampleCompute Var(X) when X represents the outcomewhen we roll a fair die.

SolutionSince P(X=i)=1/6, i = 1,2,3,4,5,6, we obtain

E(X^2) = Σ_{i=1}^{6} i^2 P[X = i]

= (1^2 + 2^2 + 3^2 + 4^2 + 5^2 + 6^2)(1/6)

= 91/6

Page 137: Probablity

Var(X) = E(X^2) − [E(X)]^2

= 91/6 − (7/2)^2 = 35/12

Compute the variance of the sum obtained when 10independent rolls of a fair die are made.

Ans 175/6

Compute the variance of the number of heads resultingfrom 10 independent tosses of a fair coin.
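The die computation above (and the 10-roll exercise) can be verified exactly with rational arithmetic; a short illustrative sketch, not part of the original slides:

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)
ex = sum(i * p for i in faces)        # E(X) = 7/2
ex2 = sum(i * i * p for i in faces)   # E(X^2) = 91/6
var = ex2 - ex**2                     # 91/6 - 49/4 = 35/12

# Independent rolls: variances add, so 10 rolls give 10 * 35/12 = 175/6
var_sum_10 = 10 * var
```

For the coin question, each toss is Bernoulli with variance p(1−p) = 1/4, so 10 independent tosses give variance 10/4 = 5/2.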

Page 138: Probablity

The coefficient of correlation between X and Y, denoted by ρ_xy, is defined as

ρ_xy = C_xy / (σ_x σ_y), where C_xy = Cov(X, Y)

The correlation coefficient is a measure of dependence between RV's X and Y.

If ρ_xy = 0, we say that X and Y are uncorrelated. If E(XY) = 0, X and Y are said to be orthogonal RV's.

|ρ_xy| ≤ 1, or |C_xy| ≤ σ_x σ_y

Page 139: Probablity

Example Calculate the correlation coefficient for the followingheights (in inches) of fathers (X) & their sons (Y):

X Y

65 67

66 68

67 65

67 68

68 72

69 72

70 69

72 71

(Ans .603)
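The coefficient can be computed from deviations about the means; an illustrative stdlib-only sketch reproducing the stated answer:

```python
from math import sqrt

x = [65, 66, 67, 67, 68, 69, 70, 72]   # fathers' heights
y = [67, 68, 65, 68, 72, 72, 69, 71]   # sons' heights
n = len(x)
mx, my = sum(x) / n, sum(y) / n

cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
sx = sqrt(sum((a - mx) ** 2 for a in x) / n)
sy = sqrt(sum((b - my) ** 2 for b in y) / n)
r = cov / (sx * sy)   # ≈ 0.603
```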

Page 140: Probablity

ρ_xy = C_xy / (σ_x σ_y), where

C_xy = (Σxy)/n − (Σx/n)(Σy/n)

σ_x^2 = (Σx^2)/n − (Σx/n)^2,   σ_y^2 = (Σy^2)/n − (Σy/n)^2

Page 141: Probablity

Example If X,Y and Z are uncorrelated RVs with zero means and standard deviations 5, 12 and 9 respectivelyand if U=X+Y and V=Y+Z find the correlation coefficient between U and V.

(ans 48/65)

Example
Let the random variables X and Y have the joint pdf

f(x, y) = x + y, 0 < x < 1, 0 < y < 1; 0 elsewhere

Find the correlation coefficient between X and Y.

(ans. −1/11)

Page 142: Probablity

Conditional Expected Values

For a discrete (X, Y) with joint pmf p_ij:

E{g(X, Y) / Y = y_j} = Σ_i g(x_i, y_j) P(X = x_i / Y = y_j)

= Σ_i g(x_i, y_j) P{X = x_i, Y = y_j} / P{Y = y_j} = Σ_i g(x_i, y_j) p_ij / p_{*j}

For a continuous (X, Y):

E{g(X, Y) / Y} = ∫ g(x, y) f(x / y) dx

E{g(X, Y) / X} = ∫ g(x, y) f(y / x) dy

Page 143: Probablity

Conditional means

μ_{y/x} = E(Y / X) = ∫ y f(y / x) dy,   μ_{x/y} = E(X / Y) = ∫ x f(x / y) dx

Conditional variances are

σ_{y/x}^2 = E{(Y − μ_{y/x})^2 / X},   σ_{x/y}^2 = E{(X − μ_{x/y})^2 / Y}

Page 144: Probablity

Example The joint pdf of (X,Y) is given by f(x,y)=24xy, x>0,y>0,x+y<=1,and 0, elsewhere, find the conditionalmean and variance of Y, given X.

(Ans. f(y / x) = 2y / (1 − x)^2, E(Y / X) = (2/3)(1 − x), Var(Y / X) = (1 − x)^2 / 18)

If (X, Y) is uniformly distributed over the semicircle bounded by y = √(1 − x^2) and y = 0, find E(X / Y) and E(Y / X). Also verify that E{E(X / Y)} = E(X) and E{E(Y / X)} = E(Y).

Page 145: Probablity

Properties
(1) If X and Y are independent RV's, then E(Y/X) = E(Y) and E(X/Y) = E(X).
(2) E[E{g(X,Y)/X}] = E{g(X,Y)}; in particular, E{E(X/Y)} = E(X).
(3) E(XY) = E[X·E(Y/X)]
(4) E(X^2 Y^2) = E[X^2 E(Y^2 / X)]

Page 146: Probablity

Example
Three coins are tossed. Let X denote the number of heads on the first two coins, Y denote the number of tails on the last two, and Z denote the number of heads on the last two. Find

(a) The joint distribution of (i) X and Y (ii) X and Z
(b) The conditional distribution of Y given X = 1
(c) The covariance of (X, Y) and of (X, Z)
(d) E(Z / X = 1)
(e) A joint distribution that is not the joint distribution of X and Z in (a), but has the same marginals as in (b)

Page 147: Probablity

RVs (X_1, X_2, ..., X_n) are independent iff

f(x_1, x_2, ..., x_n) = f(x_1) f(x_2) ... f(x_n)

conditional density

f(x_1, x_2 / x_3) = f(x_1, x_2, x_3) / f(x_3)

f(x_1 / x_2, x_3) = f(x_1, x_2, x_3) / f(x_2, x_3)

Page 148: Probablity

DefinitionLet X denote a random variable with probability density function f(x) if continuous (probability mass function p(x) if discrete)

Then

M(t) = the moment generating function of X = E(e^{tX})

= ∫ e^{tx} f(x) dx,   if X is continuous

= Σ_x e^{tx} p(x),    if X is discrete

Page 149: Probablity

Example
If f(x, y) = x e^{−x(1+y)}, x > 0, y > 0; 0 otherwise, find the moment-generating function of XY.

Solution: M_XY(t) = E(e^{tXY}) = ∫_0^∞ ∫_0^∞ e^{txy} x e^{−x(1+y)} dy dx

= ∫_0^∞ x e^{−x} { ∫_0^∞ e^{−xy(1−t)} dy } dx

= ∫_0^∞ x e^{−x} · 1/(x(1 − t)) dx

= 1/(1 − t), t < 1

Page 150: Probablity

Properties

M_X(0) = 1

M_X(t) = Σ e^{tx} f(x) = Σ (1 + tx + t^2 x^2 / 2! + ...) f(x)

M_X'(t) = Σ (x + t x^2 + ...) f(x);   M_X'(0) = Σ x f(x) = E(X)

M_X''(t) = Σ (x^2 + t x^3 + ...) f(x);   M_X''(0) = Σ x^2 f(x) = E(X^2)

Page 151: Probablity

m_k = E(X^k) = the k-th derivative of M_X(t) at t = 0.

M_X(t) = 1 + m_1 t + m_2 t^2 / 2! + m_3 t^3 / 3! + ... + m_k t^k / k! + ...

m_k = E(X^k) = ∫ x^k f(x) dx,   if X is continuous

= Σ_x x^k p(x),   if X is discrete

Page 152: Probablity

Let X be a random variable with moment generating function M_X(t). Let Y = bX + a.

Then M_Y(t) = M_{bX+a}(t) = E(e^{(bX+a)t}) = e^{at} M_X(bt)

Let X and Y be two independent random variables with moment generating functions M_X(t) and M_Y(t).

Then M_{X+Y}(t) = M_X(t) M_Y(t)

Page 153: Probablity

6. Let X and Y be two random variables with moment generating functions M_X(t) and M_Y(t) and distribution functions F_X(x) and F_Y(y) respectively.

If M_X(t) = M_Y(t), then F_X(x) = F_Y(x).

This ensures that the distribution of a random variable can be identified by its moment generating function

Page 154: Probablity

Example If X represents the outcome, when a fair die is tossed, find the MGF of X and hence find E(X) and Var(X).

If a RV X has the MGF M(t) = 3/(3-t), obtain the standard deviation of X. (ans, 1/3)

(Ans. 7/2,35/12)

M(t) = (1/10) e^t + (2/10) e^{2t} + (3/10) e^{3t} + (4/10) e^{4t}

Find the p.d.f.

Page 155: Probablity

M(t) = (1/10) e^t + (2/10) e^{2t} + (3/10) e^{3t} + (4/10) e^{4t}

Since M(t) = Σ_x e^{tx} f(x) = f(a) e^{at} + f(b) e^{bt} + ...

comparing the two expressions,

f(x) = x/10, x = 1, 2, 3, 4; 0 otherwise

Page 156: Probablity

If X has the p.d.f. f(x) = 1/x^2, x ≥ 1; 0 otherwise, find the mean of X.

(Here ∫_1^∞ x (1/x^2) dx diverges, so the mean does not exist.)

Page 157: Probablity

The characteristic function of a random variable X is defined by φ_X(ω) = E(e^{iωX})

= Σ_x e^{iωx} f(x),   if X is discrete

= ∫ e^{iωx} f(x) dx,  if X is continuous

|e^{iωx}| = |cos ωx + i sin ωx| = (cos^2 ωx + sin^2 ωx)^{1/2} = 1

Page 158: Probablity

|φ(ω)| = |E(e^{iωX})| = |∫ e^{iωx} f(x) dx|

≤ ∫ |e^{iωx}| f(x) dx = ∫ f(x) dx = 1

Hence the characteristic function always exists, even when the moment-generating function may not exist.

Page 159: Probablity

Properties of Characteristic Function

1. E(X^n) = the co-efficient of (iω)^n / n! in the expansion of φ(ω) in a series of ascending powers of iω.

φ_X(ω) = E(e^{iωX}) = Σ_x e^{iωx} f(x)

= Σ_x (1 + iωx + (iωx)^2/2! + (iωx)^3/3! + ...) f(x)

= Σ_x f(x) + iω Σ_x x f(x) + ((iω)^2/2!) Σ_x x^2 f(x) + ...

Page 160: Probablity

2. E(X^n) = (1/i^n) [d^n φ(ω) / dω^n] at ω = 0.

3. If the characteristic function of a RV X is φ_x(ω) and if Y = aX + b, then φ_y(ω) = e^{ibω} φ_x(aω).

4. If X and Y are independent RVs, then φ_{x+y}(ω) = φ_x(ω) φ_y(ω).

5. If the characteristic function of a continuous RV X with density function f(x) is φ(ω), then f(x) = (1/2π) ∫ φ(ω) e^{−iωx} dω.

6. If the density function of X is known, the density function of Y = g(X) can be found from the CF of Y, provided Y = g(X) is one-one.

Page 161: Probablity

The characteristic function of a random variable X is given by

φ(ω) = 1 − |ω|, |ω| ≤ 1
     = 0, |ω| > 1

Find the pdf of X.

The pdf of X is f(x) = (1/2π) ∫ φ(ω) e^{−iωx} dω

Page 162: Probablity

f(x) = (1/2π) ∫_{−1}^{1} (1 − |ω|) e^{−iωx} dω

= (1/2π) [ ∫_{−1}^{0} (1 + ω) e^{−iωx} dω + ∫_{0}^{1} (1 − ω) e^{−iωx} dω ]

= (1/(2π x^2)) (2 − e^{ix} − e^{−ix}) = (1 − cos x)/(π x^2)

= (1/2π) { sin(x/2) / (x/2) }^2, −∞ < x < ∞

Page 163: Probablity

Show that the distribution for which the characteristic function is e^{−|ω|} has the density function

f(x) = (1/π) · 1/(1 + x^2), −∞ < x < ∞

[Use f(x) = (1/2π) ∫ φ(ω) e^{−iωx} dω.]

Page 164: Probablity

Reproductive property of Poisson distribution

If X_1 & X_2 are two independent RVs that follow Poisson distributions with parameters λ_1 & λ_2, prove that X_1 + X_2 also follows a Poisson distribution with parameter (λ_1 + λ_2).

φ_{x1}(ω) = e^{λ_1 (e^{iω} − 1)}

φ_{x2}(ω) = e^{λ_2 (e^{iω} − 1)}

Page 165: Probablity

Since X_1 & X_2 are independent RVs,

φ_{x1+x2}(ω) = φ_{x1}(ω) φ_{x2}(ω) = e^{(λ_1 + λ_2)(e^{iω} − 1)}

Page 166: Probablity

Joint Characteristic Function

If (X, Y) is a two-dimensional RV, then E(e^{i(ω_1 X + ω_2 Y)}) is called the joint characteristic function of (X, Y) and denoted by φ_xy(ω_1, ω_2).

φ_xy(ω_1, ω_2) = ∫∫ e^{i(ω_1 x + ω_2 y)} f(x, y) dx dy   (continuous case)

= Σ_i Σ_j e^{i(ω_1 x_i + ω_2 y_j)} p(x_i, y_j)   (discrete case)

(i) φ_xy(0, 0) = 1

(ii) E{X^m Y^n} = (1 / i^{m+n}) [∂^{m+n} φ_xy(ω_1, ω_2) / ∂ω_1^m ∂ω_2^n] at ω_1 = 0, ω_2 = 0

Page 167: Probablity

(iii) φ_x(ω) = φ_xy(ω, 0)  &  φ_y(ω) = φ_xy(0, ω)

(iv) If X and Y are independent,

φ_xy(ω_1, ω_2) = φ_x(ω_1) φ_y(ω_2), and conversely.

Page 168: Probablity

Compute the characteristic function of the discrete r.v.'s X and Y if the joint PMF is

P_XY = 1/3, x = y = 0
     = 1/6, x = ±1, y = 0
     = 1/6, x = y = ±1
     = 0, else

φ_XY(ω_1, ω_2) = Σ_{k=−1}^{1} Σ_{l=−1}^{1} e^{i(kω_1 + lω_2)} P_XY

= (1/3) e^{i(0·ω_1 + 0·ω_2)} + (1/6) e^{iω_1} + (1/6) e^{−iω_1} + (1/6) e^{i(ω_1+ω_2)} + (1/6) e^{−i(ω_1+ω_2)}

Page 169: Probablity

= 1/3 + (1/6)(cos ω_1 + i sin ω_1) + (1/6)(cos ω_1 − i sin ω_1)

+ (1/6)(cos(ω_1 + ω_2) + i sin(ω_1 + ω_2)) + (1/6)(cos(ω_1 + ω_2) − i sin(ω_1 + ω_2))

= 1/3 + (1/3) cos ω_1 + (1/3) cos(ω_1 + ω_2)

Page 170: Probablity

Example
Two RVs X and Y have the joint characteristic function φ_xy(ω_1, ω_2) = e^{−2ω_1^2 − 8ω_2^2}. Show that X and Y are both zero-mean RVs and also that they are uncorrelated.

By the property of the joint CF,

E{X^m Y^n} = (1 / i^{m+n}) [∂^{m+n} φ_xy(ω_1, ω_2) / ∂ω_1^m ∂ω_2^n] at ω_1 = 0, ω_2 = 0

E(X) = (1/i) [∂φ_xy/∂ω_1] at (0,0) = (1/i) (−4ω_1) e^{−2ω_1^2 − 8ω_2^2} | (0,0) = 0

Page 171: Probablity

E(Y) = (1/i) (−16ω_2) e^{−2ω_1^2 − 8ω_2^2} | (0,0) = 0

E(XY) = (1/i^2) [∂^2 φ_xy / ∂ω_1 ∂ω_2] at (0,0)

= −{(−4ω_1)(−16ω_2) e^{−2ω_1^2 − 8ω_2^2}} | (0,0) = 0

Page 172: Probablity

C_xy = E(XY) − E(X) E(Y) = 0, so X and Y are uncorrelated.

Page 173: Probablity

Compute the joint characteristic function of X and Y if

f_xy = (1/2π) exp{−(1/2)(x^2 + y^2)}

Ans. φ_xy(ω_1, ω_2) = (1/2π) ∫∫ e^{−(1/2)(x^2 + y^2)} e^{i(ω_1 x + ω_2 y)} dx dy = e^{−(ω_1^2 + ω_2^2)/2}

Page 174: Probablity

Random Variable

Page 175: Probablity

Binomial Distribution
The Bernoulli probability mass function is the density function of a discrete variable X having 0 and 1 as the only possible values.

The pmf of X is given by P(0) = P{X = 0} = 1 − p, P(1) = P{X = 1} = p, where p, 0 ≤ p ≤ 1, is the probability that the trial is a success.

A random variable X is said to be a Bernoulli random variable if its pmf satisfies the above equations for some p in (0,1).

Page 176: Probablity

An experiment consists of performing a sequence of subexperiments. If each subexperiment is identical, then the subexperiments are called trials.

Page 177: Probablity

Bernoulli Trials

• Each trial of an experiment that has only two possible outcomes (success or failure) is called a “Bernoulli trial.”

• If p is the probability of success, then (1-p) is the probability of failure.

• The probability of exactly k successes in n independent Bernoulli trials, with probability of success p and probability of failure q = 1-p, is given by a formula called the Binomial Distribution:

C(n, k) pk q(n-k)

Page 178: Probablity

Example of Bernoulli Trials
• Suppose I roll a die and I consider a 3 to be a success and any other number to be a failure.
• What is the probability of getting exactly 5 successes if I roll the die 20 times?
• Solution: C(20, 5) (1/6)^5 (5/6)^15
• What is the probability of getting 5 or more successes?
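Both questions can be answered numerically from the binomial formula; an illustrative stdlib-only sketch (not from the slides):

```python
from math import comb

def binom_pmf(n, k, p):
    """P(exactly k successes in n Bernoulli trials) = C(n,k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

p_exactly_5 = binom_pmf(20, 5, 1/6)                            # ≈ 0.1294
p_5_or_more = sum(binom_pmf(20, k, 1/6) for k in range(5, 21)) # ≈ 0.2313
```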

Page 179: Probablity

Theorem
If the probability of occurrence of an event (probability of success) in a single trial of a Bernoulli experiment is p, then the probability that the event occurs exactly r times out of n independent trials is nC_r p^r q^{n−r}, where q = 1 − p, the probability of failure of the event.

Proof
Getting exactly r successes means getting r successes and (n − r) failures simultaneously.

P(getting r successes and n − r failures) = p^r q^{n−r}

Page 180: Probablity

The trials from which the successes are obtained are not specified. There are nC_r ways of choosing r trials for successes. Once the r trials are chosen for successes, the remaining (n − r) trials should result in failures.

These nC_r ways are mutually exclusive. In each of these nC_r ways, P(getting exactly r successes) = p^r q^{n−r}.

Hence, by the addition theorem, the required probability is nC_r p^r q^{n−r}.

Page 181: Probablity

Example If war breaks out on the average once in 25 years,find the probability that in 50 years at a strech,there will be no war.

p = 1/25, q = 24/25, n = 50

P(no war in 50 years) = C(50, 0) (1/25)^0 (24/25)^50 = (24/25)^50

Page 182: Probablity

ExampleEach of two persons A and B tosses 3 fair coins.What is the probability that they obtain the samenumber of heads?

P(A&B get the same no. of heads)=P(they get no head each or 1 head each or 2 headseach or 3 heads each)

= P(A gets 0 heads) P(B gets 0 heads) + ...

= [C(3,0)(1/2)^3]^2 + [C(3,1)(1/2)^3]^2 + [C(3,2)(1/2)^3]^2 + [C(3,3)(1/2)^3]^2

= (1 + 9 + 9 + 1)/64 = 5/16

Page 183: Probablity

Example.

 

A game consists of 5 matches and two players, A and B. Any player who firstly wins 3 matches will be the winner of the game.

 

Player A wins each match with probability 2/3, and the matches are independent. What will be the probability for player A to win the game?

Page 184: Probablity

Solution:

 

Player A wins 3 matches or more out of the 5 matches.

 

This event has probability equal to:

C(5,3) (2/3)^3 (1/3)^2 + C(5,4) (2/3)^4 (1/3) + C(5,5) (2/3)^5

= 64/81
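The same sum over "3, 4 or 5 wins out of 5" can be sketched in a few lines (illustrative, stdlib only):

```python
from math import comb

p = 2/3  # probability A wins a single match
# P(A wins at least 3 of the 5 matches)
p_A_wins = sum(comb(5, k) * p**k * (1 - p) ** (5 - k) for k in range(3, 6))
# equals 64/81
```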

Page 185: Probablity

Example.

 

A sequence of independent trials is to be performed. Each trial results in a success with probability p and a failure with probability 1–p. What is the probability that

 

(a) at least 1 success occurs in the first n trials;

 

(b) exactly k success occurs in the first n trials;

Page 186: Probablity

Solution:

(a) The probability of no success in the first n trials is (1 − p)^n. Thus, the answer is 1 − (1 − p)^n.

(b) C(n, k) p^k (1 − p)^{n−k}

Page 187: Probablity

Assuming that p remains the same for all repetitions,if we consider n independent repetitions (or trials) of E and if the random variable X denotes the number of times the event A has occurred, then X is called a binomial random variable with parameters n and p

The pmf of a binomial random variable having parameters (n,p) is given by

P(i) = nC_i p^i q^{n−i}, i = 0, 1, ..., n, where p + q = 1

Page 188: Probablity

Example
It is known that a car produced by an automobile company will be defective with probability 0.01, independently of the others. The company sells the cars in packages of 10 and offers a money-back guarantee that at most 1 of the 10 cars is defective. What proportion of packages sold must the company replace?

P(X > 1) = 1 − P(X ≤ 1) = 1 − {P(X = 0) + P(X = 1)}

= 1 − {(.99)^10 + 10 (.01)(.99)^9} ≈ .004
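A quick numeric confirmation of the replacement proportion (an illustrative sketch, not from the slides):

```python
# Package of 10 cars, each defective with probability 0.01
p0 = 0.99**10                 # P(no defectives)
p1 = 10 * 0.01 * 0.99**9      # P(exactly one defective)
p_replace = 1 - p0 - p1       # ≈ 0.004
```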

Page 189: Probablity

mean = E(X) = Σ_{x=0}^{n} x · nC_x p^x (1 − p)^{n−x}

= Σ_{x=1}^{n} x · [n! / (x! (n − x)!)] p^x (1 − p)^{n−x}

= np Σ_{x=1}^{n} [(n − 1)! / ((x − 1)! (n − x)!)] p^{x−1} (1 − p)^{(n−1)−(x−1)}

= np Σ_{x=1}^{n} (n−1)C_{x−1} p^{x−1} (1 − p)^{(n−1)−(x−1)}

= np {p + (1 − p)}^{n−1} = np

Page 190: Probablity

Second moment E(X^2) = Σ_{x=0}^{n} x^2 · nC_x p^x (1 − p)^{n−x}

Show that the variance is np(1 − p).

Example
For a binomial distribution with mean 6 and standard deviation √2, find the first two terms of the distribution.

((1/3)^9, 2/2187)

The mean and variance of a binomial distribution are 4 and 3 respectively. Find P(X ≥ 1).

(1 − (3/4)^16 = .9899)

Page 191: Probablity

Poisson Distribution

The probability density function of the Poisson variate can be obtained as a limiting case of the Binomial probability density function under the following assumptions:

(i) The number of trials is increased indefinitely (n → ∞)
(ii) The probability of success in a single trial is very small (p → 0)
(iii) np is a finite constant, say np = λ

Consider the probability density function of a Binomial random variable X:

P(X = x) = nC_x p^x (1 − p)^{n−x} = [n! / (x! (n − x)!)] p^x (1 − p)^{n−x}

= [n(n − 1)...(n − x + 1) / x!] (λ/n)^x (1 − λ/n)^{n−x}

Page 192: Probablity

= [n(n − 1)...(n − x + 1) / n^x] (λ^x / x!) (1 − λ/n)^n (1 − λ/n)^{−x}

For a given x, as n → ∞, the terms (1 − 1/n), (1 − 2/n), ..., (1 − (x − 1)/n) and (1 − λ/n)^{−x} all tend to 1.

Page 193: Probablity

Also, Lt_{n→∞} (1 − λ/n)^n = e^{−λ}.

Hence, Lt_{n→∞} P(X = x) = e^{−λ} λ^x / x!

Poisson random variable

A random variable X taking on one of the values 0, 1, 2, ... is said to be a Poisson random variable with parameter λ if, for some λ > 0,

P(x) = P(X = x) = e^{−λ} λ^x / x!, x = 0, 1, 2, ...

Page 194: Probablity

Σ_{x=0}^{∞} P(x) = Σ_{x=0}^{∞} e^{−λ} λ^x / x! = e^{−λ} e^{λ} = 1

Mean = E(X) = Σ_{x=0}^{∞} x P(X = x) = Σ_{x=0}^{∞} x e^{−λ} λ^x / x!

= λ e^{−λ} Σ_{x=1}^{∞} λ^{x−1} / (x − 1)!

= λ e^{−λ} (1 + λ/1! + λ^2/2! + ...) = λ e^{−λ} e^{λ} = λ

Var(X) = λ

Page 195: Probablity

Example

The average number of radioactive particles passing through a counter during 1 millisecond in a laboratory experiment is 4. What is the probability that 6 particles enter the counter in a given millisecond?

If the probability of a defective fuse from a manufacturingunit is 2%, in a box of 200 fuses, find the probability that exactly 4 fuses are defective.

(e^{−4} 4^6 / 6!,  e^{−4} 4^4 / 4!)
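Both Poisson answers can be evaluated numerically; an illustrative stdlib-only sketch:

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """P(X = k) = e^(-lam) lam^k / k!"""
    return exp(-lam) * lam**k / factorial(k)

p_six_particles = poisson_pmf(4, 6)           # e^-4 * 4^6 / 6! ≈ 0.1042
lam_fuses = 200 * 0.02                        # n*p = 4 for the fuse problem
p_four_defective = poisson_pmf(lam_fuses, 4)  # e^-4 * 4^4 / 4! ≈ 0.1954
```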

Page 196: Probablity

Example

At a busy traffic intersection the probability p of an individual car having an accident is very small, say p = 0.0001. However, during certain peak hours of the day, say between 4 p.m. and 6 p.m., a large number of cars (say 1000) pass through the intersection. Under these conditions, what is the probability of two or more accidents occurring during that period?

In a component manufacturing industry, there is a small probability of 1/500 for any component to be defective. The components are supplied in packets of 10. Use the Poisson distribution to calculate the approximate number of packets containing (i) no defective (ii) one defective component in a consignment of 10,000 packets. (10000 × .9802 = 9802, 10000 × .0196 = 196)

Page 197: Probablity

Binomial Distribution

P(X = r) = nC_r p^r q^{n−r}; r = 0, 1, 2, ..., n

If we assume that n trials constitute a set and if we consider N sets, the frequency function of the binomial distribution is given by f(r) = N p(r) = N nC_r p^r q^{n−r}

Page 198: Probablity

Example
Fit a binomial distribution for the following data and hence find the theoretical frequencies:
x: 0 1 2 3 4
f: 5 29 36 25 5

Ans. 7, 26, 37, 24, 6

The following data are the number of seeds germinating out of 10 on damp filter paper for 80 sets of seeds. Fit a binomial distribution to these data:
x: 0 1 2 3 4 5 6 7 8 9 10
y: 6 20 28 12 8 6 0 0 0 0 0

(Ans. 6.89, 19.14, 23.94, 17.74, 8.63, 2.88, 0.67, 0.1, 0.01, 0, 0)

Page 199: Probablity

Fit a Poisson distribution for the following distribution:
x: 0 1 2 3 4 5
f: 142 156 69 27 5 1

Ans. 147, 147, 74, 25, 6, 1

Fit a Poisson distribution to the following data, which give the number of yeast cells per square for 400 squares:
No. of cells per square (x): 0 1 2 3 4 5 6 7 8 9 10
No. of squares (f): 103 143 98 42 8 4 2 0 0 0 0

(Ans. 107, 141, 93, 41, 14, 4, 1, 0, 0, 0, 0)

Page 200: Probablity

GEOMETRIC DISTRIBUTION

Suppose that independent trials, each having a probability p of success, are performed until a success occurs. If we let X = n, then P(X = n) = (1 − p)^{n−1} p, n = 1, 2, ...

Since, in order that X = n, it is necessary and sufficient that the first (n − 1) trials are failures and the nth trial is a success,

Σ_{n=1}^{∞} P(X = n) = Σ_{n=1}^{∞} (1 − p)^{n−1} p = p / {1 − (1 − p)} = 1

Geometric Random Variable

Page 201: Probablity

In a chemical engineering process industry it is known that, on the average, 1 in every 100 items is defective. What is the probability that the fifth item inspected is the first defective item found?

(ans. (.01)(.99)^4 = .0096)

At busy time, a telephone exchange will be working busy with full capacity . So people cannot get a line to use immediately. It may be of interest to know the number of attempts necessary in order to get a connection. Suppose that p = 0.05, then find the probability that 5 attempts are necessary for a successful call connection.

Ans .041
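Both answers follow from the geometric pmf P(X = n) = (1 − p)^(n−1) p; an illustrative stdlib-only sketch:

```python
def geometric_pmf(p, n):
    """P(first success occurs on trial n) = (1-p)^(n-1) * p."""
    return (1 - p) ** (n - 1) * p

p_fifth_defective = geometric_pmf(0.01, 5)  # (.99)^4 (.01) ≈ 0.0096
p_fifth_attempt = geometric_pmf(0.05, 5)    # (.95)^4 (.05) ≈ 0.041
```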

Page 202: Probablity

Mean

E(X) = Σ_{n=1}^{∞} n (1 − p)^{n−1} p = p Σ_{n=1}^{∞} n t^{n−1}, where t = 1 − p

= p [1 + 2t + 3t^2 + 4t^3 + ...]

= p [1 − t]^{−2}   (since |t| < 1)

= p [1 − (1 − p)]^{−2} = p/p^2 = 1/p

Page 203: Probablity

E(X^2) = Σ_{n=1}^{∞} n^2 (1 − p)^{n−1} p = p Σ_{n=1}^{∞} n^2 t^{n−1}, t = 1 − p

= p Σ_{n=1}^{∞} [n(n + 1) − n] t^{n−1}

= p Σ_{n=1}^{∞} n(n + 1) t^{n−1} − p Σ_{n=1}^{∞} n t^{n−1}

Page 204: Probablity

= p · 2(1 − t)^{−3} − p (1 − t)^{−2}

= 2p/p^3 − p/p^2 = 2/p^2 − 1/p

Var(X) = E(X^2) − [E(X)]^2 = 2/p^2 − 1/p − 1/p^2 = (1 − p)/p^2

Page 205: Probablity

Negative Binomial Distribution

Trials repeated until a fixed number of success occur.

Instead of finding the probability of r success in n trialswhen n is fixed

Probability that rth success occurs on the xth trial.

Negative binomial experiment

The number of x trials to produce r success in a negativebinomial experiment is called a negative binomialrandom variable and its probability distribution is called the negative binomial distribution

Page 206: Probablity

The negative binomial distribution is used when the number of successes is fixed and we're interested in the number of failures before reaching the fixed number of successes. An experiment which follows a negative binomial distribution will satisfy the following requirements: 1.The experiment consists of a sequence of independent trials. 2.Each trial has two possible outcomes, S or F. 3.The probability of success,is constant from one trial to another. 4.The experiment continues until a total of r successesare observed, where r is fixed in advance

Page 207: Probablity

Suppose we repeatedly throw a die, and consider a "1" to be a "success". The probability of success on each trial is 1/6. The number of trials needed to get three successes belongs to the infinite set { 3, 4, 5, 6, ... }. That number of trials is a (displaced) negative-binomially distributed random variable. The number of failures before the third success belongs to the infinite set { 0, 1, 2, 3, ... }. That number of failures is also a negative-binomially distributed random variable.

Page 208: Probablity

A Bernoulli process is a discrete time process, and so the number of trials, failures, and successes are integers. For the special case where r is an integer, the negative binomial distribution is known as the Pascal distribution.

A further specialization occurs when r = 1: in this case we get the probability distribution of failures before the first success (i.e. the probability of success on the (k+1)th trial), which is a geometric distribution.

Page 209: Probablity

Let the random variable Y denote the number of failures before the occurrence of the rth success. Then Y + r denotes the number of trials necessary to produce exactly r successes and y failures, with the rth success occurring at the (y + r)th trial.

pmf of Y: g(y) = (y+r−1)C_{r−1} p^{r−1} q^y · p = (y+r−1)C_{r−1} p^r q^y, where q = 1 − p

(the first y + r − 1 trials contain r − 1 successes and y failures, and the (y + r)th trial is a success)

Page 210: Probablity

Pat is required to sell candy bars to raise money for the 6th grade field trip. There are thirty houses in the neighborhood, and Pat is not supposed to return home until five candy bars have been sold. So the child goes door to door, selling candy bars. At each house, there is a 0.4 probability of selling one candy bar and a 0.6 probability of selling nothing.

Example

What’s the probability that Pat finishes on the tenth house?

Page 211: Probablity

A fair die is cast on successive independent trials untilthe second six is observed. The probability of observing exactly ten non-sixes before the second six is cast is ……..

Ans. 11C_1 (1/6)^2 (5/6)^10

(For the coin-tossing problem below, with p = 2/8 = 1/4: 4C_1 (1/4)^2 (3/4)^3)

Find the probability that a person tossing three coins will get either all heads or all tails for the second timeon the fifth toss.

Page 212: Probablity

The probability that an experiment will succeed is 0.8.If the experiment is repeated until four successful outcomes have occurred, what is the expected number of repetitions required?

Ans. 1

Page 213: Probablity

Σ_{y=0}^{∞} g(y) = Σ_{y=0}^{∞} (y+r−1)C_{r−1} p^r q^y

= p^r Σ_{y=0}^{∞} [(y + r − 1)! / ((r − 1)! y!)] q^y

= p^r (1 − q)^{−r} = p^r p^{−r} = 1

Page 214: Probablity

p^r (1 − q)^{−r} = p^r [1 + rq + r(r + 1) q^2 / 2! + ...] = p^r p^{−r} = 1

Mean E(X) = rq/p

Variance = rq/p^2

Page 215: Probablity

PROBABILITY DISTRIBUTIONS

Page 216: Probablity

Binomial Distribution

Discrete Distributions


Page 219: Probablity

Probability Density Function

f_X(x) = 0,    if x < 0
       = 1/u,  if 0 ≤ x ≤ u
       = 0,    if u < x

[Figure: pdf of a uniform distribution on [0, u]]

Page 220: Probablity

Cumulative Distribution Function

F_X(x) = 0,    if x < 0
       = x/u,  if 0 ≤ x ≤ u
       = 1,    if u < x

[Figure: cdf of a uniform distribution on [0, u]]

Page 221: Probablity

If X is uniformly distributed over the interval [0, 10], compute the probability that (a) 2 < X < 9, (b) 1 < X < 4, (c) X < 5, (d) X > 6.

Ans. 7/10,3/10,5/10,4/10
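For a uniform RV, each probability is just the length of the overlap divided by the interval length; an illustrative sketch (stdlib only):

```python
def uniform_prob(a, b, lo, hi):
    """P(lo < X < hi) for X ~ Uniform(a, b)."""
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0) / (b - a)

probs = [uniform_prob(0, 10, 2, 9),    # 7/10
         uniform_prob(0, 10, 1, 4),    # 3/10
         uniform_prob(0, 10, 0, 5),    # 5/10
         uniform_prob(0, 10, 6, 10)]   # 4/10
```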

Page 222: Probablity

mean = E(X) = ∫ x f(x) dx = ∫_a^b x · 1/(b − a) dx

mean = (1/2)(a + b)

Page 223: Probablity

Buses arrive at a specified stop at 15-min. intervals starting at 7 A.M.; that is, they arrive at 7, 7:15, 7:30 and so on. If a passenger arrives at the stop at a random time that is uniformly distributed between 7 and 7:30 A.M., find the probability that he waits (a) less than 5 min (b) at least 12 min. for a bus.

Ans. 1/3, 1/5

If X has uniform distribution in (−3, 3), find P(|X − 2| < 2).

Ans. 1/2

If X has uniform distribution in (−a, a), a > 0, find a such that P(|X| < 1) = P(|X| > 1).

a = 2

Page 224: Probablity

Example:

The total time it takes First Bank to process a loan application is uniformly distributed between 3 and 7 days. What is the probability that the application will be processed in less than 4 days?

What is the probability that it will take more than 6.5 days?

P(3 ≤ x < 4) = (4 − 3)(1/(7 − 3)) = .25

P(6.5 < x ≤ 7) = (7 − 6.5)(1/(7 − 3)) = .125

[Figure: uniform density on [3, 7] with height 1/(b − a) = .25; total area (b − a)·1/(b − a) = 1; shaded areas over [3, 4] and [6.5, 7]]

Page 225: Probablity

E(X) = ∫ x f(x) dx = ∫_a^b x · 1/(b − a) dx = (b^2 − a^2)/(2(b − a)) = (a + b)/2

E(X^2) = ∫_a^b x^2 · 1/(b − a) dx = (b^3 − a^3)/(3(b − a)) = (a^2 + ab + b^2)/3

variance = E(X^2) − [E(X)]^2

= (a^2 + ab + b^2)/3 − (a + b)^2/4 = (a^2 + b^2 − 2ab)/12 = (b − a)^2/12

Page 226: Probablity

moments: E(X^n) = (b^{n+1} − a^{n+1}) / ((n + 1)(b − a))

central moments: μ_r = E{(X − E(X))^r}

= (1/(b − a)) ∫_a^b (x − (a + b)/2)^r dx

= (1/(b − a)) [ (x − (a + b)/2)^{r+1} / (r + 1) ]_a^b

Page 227: Probablity

μ_r = 0, if r is odd

= (1/(r + 1)) ((b − a)/2)^r, if r is even

μ_{2n} = (1/(2n + 1)) ((b − a)/2)^{2n}, μ_{2n−1} = 0, for n = 1, 2, 3, ...

mean deviation = E(|X − E(X)|) = ∫_a^b |x − (a + b)/2| · 1/(b − a) dx = (1/4)(b − a)

Page 228: Probablity

The gamma function, denoted by Γ, is defined as

Γ(x) = ∫_0^∞ t^{x−1} e^{−t} dt, x > 0

Properties

1. Γ(x + 1) = ∫_0^∞ t^x e^{−t} dt = [−t^x e^{−t}]_0^∞ + x ∫_0^∞ t^{x−1} e^{−t} dt = x Γ(x)

2. Γ(1) = ∫_0^∞ e^{−t} dt = 1

Page 229: Probablity

3. Put x = n: Γ(n) = (n − 1) Γ(n − 1)

= (n − 1)(n − 2) Γ(n − 2) = ...

= (n − 1)(n − 2)...2·1·Γ(1) = (n − 1)!

(Γ(1) = 0! = 1)

Page 230: Probablity

Exponential Distribution

If the occurrences of events over nonoverlapping intervals are independent, such as arrival times of telephone calls or bus arrival times at a bus stop, then the waiting-time distribution of these events can be shown to be exponential.

• Time between arrivals to a queue (e.g., time between people arriving at a line to check out in a department store; people, machines, or telephone calls may wait in a queue)

• Lifetime of components in a machine

Page 231: Probablity

Exponential distribution

f(x) = λ e^{−λx}, x ≥ 0
     = 0, otherwise;  parameter λ > 0

Page 232: Probablity
Page 233: Probablity

Exponential distribution

f(x) = λ e^{−λx}, x ≥ 0; 0 otherwise; parameter λ > 0

moments: μ_r' = E(X^r) = ∫_0^∞ x^r λ e^{−λx} dx

= (1/λ^r) ∫_0^∞ y^r e^{−y} dy = Γ(r + 1)/λ^r = r!/λ^r

Page 234: Probablity

Example Let X have an exponential distribution with mean of 100 . Find the probability that X<90

Ans. 0.593

Customers arrive in a certain shop according to anapproximate Poisson process at a mean rate of 20per hour. What is the probability that the shopkeeper will have to wait for more than 5 minutes for his firstcustomer to arrive?

ans. e^{−5/3}
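Both answers use only the exponential cdf; an illustrative stdlib-only sketch (the rate for the shop is 20 per hour, i.e. a mean inter-arrival time of 3 minutes):

```python
from math import exp

def expon_cdf(x, mean):
    """P(X <= x) for an exponential RV with the given mean (rate = 1/mean)."""
    return 1 - exp(-x / mean)

p_less_90 = expon_cdf(90, 100)        # ≈ 0.593
p_wait_over_5 = 1 - expon_cdf(5, 3)   # e^(-5/3) ≈ 0.189
```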

Page 235: Probablity

Memoryless Property of the Exponential Distribution

If X is exponentially distributed, then

P(X > s + t / X > s) = P(X > t), for any s, t ≥ 0

P(X > k) = ∫_k^∞ λ e^{−λx} dx = [−e^{−λx}]_k^∞ = e^{−λk}

Now P(X > s + t / X > s) = P(X > s + t & X > s) / P(X > s)

= P(X > s + t) / P(X > s) = e^{−λ(s+t)} / e^{−λs}

= e^{−λt} = P(X > t).

Page 236: Probablity

If X represents the lifetime of a piece of equipment, then the above property states that if the equipment has been working for time s, the probability that it will survive an additional time t depends only on t (not on s) and is identical to the probability of survival for time t of a new piece of equipment.

Equipment does not remember that it has been in use for time s.

Page 237: Probablity

A crew of workers has 3 interchangeable machines, of which 2 must be working for the crew to do its job. When in use, each machine will function for an exponentially distributed time having parameter λ before breaking down. The workers decide to initially use machines A and B and keep machine C in reserve to replace whichever of A or B breaks down first. They will then be able to continue working until one of the remaining machines breaks down. When the crew is forced to stop working because only one of the machines has not yet broken down, what is the probability that the still operable machine is machine C?

Page 238: Probablity

Suppose the life length of an appliance has an exponential distribution with mean 10 years.

A used appliance is bought by someone. What is the probability that it will not fail in the next 5 years?

Suppose that the amount of waiting time a customerspends at a restaurant has an exponential distributionwith a mean value of 5 minutes Then find theprobability that a customer will spend more than 10 minutes in the restaurant

0.368

0.1353

Page 239: Probablity

Example
Suppose that the length of a phone call in minutes is an exponential random variable with parameter λ = 1/10.

If A arrives immediately ahead of B at a publictelephone booth, find the probability that B will haveto wait (i) more than 10 minutes, and (ii) between 10 and 20 minutes.

(ans. 0.368,0.233)
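Both parts, and the memoryless property itself, can be checked from the survival function; an illustrative stdlib-only sketch:

```python
from math import exp

lam = 1 / 10  # call length is exponential with parameter 1/10

def sf(t):
    """Survival function P(X > t) = e^(-lam * t)."""
    return exp(-lam * t)

p_more_10 = sf(10)            # e^-1 ≈ 0.368
p_10_to_20 = sf(10) - sf(20)  # e^-1 - e^-2 ≈ 0.233
# Memorylessness: P(X > 15 | X > 5) equals P(X > 10)
assert abs(sf(15) / sf(5) - sf(10)) < 1e-12
```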

Page 240: Probablity

Erlang distribution or General Gamma distribution

A continuous RV X is said to follow an Erlang distribution (or General Gamma distribution) with parameters λ > 0, k > 0, if its pdf is given by

f(x) = λ^k x^{k−1} e^{−λx} / Γ(k), for x > 0
     = 0, otherwise

∫_0^∞ f(x) dx = 1

Page 241: Probablity

When k = 1, the Erlang distribution reduces to the exponential distribution with parameter λ > 0.

When λ = 1, the pdf becomes

f(x) = x^{k−1} e^{−x} / Γ(k); x > 0, k > 0:

the Gamma distribution (or simple gamma distribution) with parameter k.

Page 242: Probablity

Example

The random variable X has the gamma distribution with density function

f(x) = 0 for x < 0; 2 e^{−2x} for x ≥ 0

Find the probability that X is not smaller than 3.

P(X ≥ 3) = ∫_3^∞ 2 e^{−2x} dx = [−e^{−2x}]_3^∞ = e^{−6}

Page 243: Probablity

Suppose that an average of 30 customers per hour arrive at a shop in accordance with a Poisson process; that is, if a minute is our unit, then λ = 1/2. What is the probability that the shopkeeper waits more than 5 minutes before both of the first two customers arrive?

Solution. If X denotes the waiting time in minutes until the second customer arrives, then X has Erlang (Gamma) distribution with k = 2, λ = 1/2.

P(X > 5) = ∫_5^∞ [λ^k x^{k−1} e^{−λx} / Γ(k)] dx

= 0.287
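For integer k, the Erlang survival function equals a Poisson partial sum (the waiting time to the k-th event exceeds x iff fewer than k events occur by time x); an illustrative stdlib-only sketch:

```python
from math import exp, factorial

def erlang_sf(x, k, lam):
    """P(X > x) for Erlang(k, lam), integer k: fewer than k Poisson events by x."""
    return exp(-lam * x) * sum((lam * x) ** j / factorial(j) for j in range(k))

p = erlang_sf(5, 2, 0.5)   # e^-2.5 * (1 + 2.5) ≈ 0.287
```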

Page 244: Probablity

Mean and Variance of Erlang Distribution

moments μ_r' = E(X^r)

= ∫_0^∞ x^r · λ^k x^{k−1} e^{−λx} / Γ(k) dx

= (1/(λ^r Γ(k))) ∫_0^∞ t^{k+r−1} e^{−t} dt = Γ(k + r) / (λ^r Γ(k))

mean = E(X) = k/λ,   var(X) = k/λ^2

Page 245: Probablity

m.g.f. M_X(t) = E(e^{tX})

= ∫_0^∞ e^{tx} λ^k x^{k−1} e^{−λx} / Γ(k) dx

= (λ^k / Γ(k)) ∫_0^∞ x^{k−1} e^{−(λ−t)x} dx

= (λ^k / Γ(k)) · Γ(k)/(λ − t)^k = (λ/(λ − t))^k = (1 − t/λ)^{−k}, t < λ

Page 246: Probablity

M_{(X_1+X_2+...+X_n)}(t) = M_{X_1}(t) M_{X_2}(t) ... M_{X_n}(t)

= (1 − t/λ)^{−k_1} (1 − t/λ)^{−k_2} ... (1 − t/λ)^{−k_n}

= (1 − t/λ)^{−(k_1 + k_2 + ... + k_n)}

Reproductive Property

Page 247: Probablity

The sum of a finite number of Erlang variables is also an Erlang variable.

If X_1, X_2, ..., X_n are independent Erlang variables with parameters (λ, k_1), (λ, k_2), ..., (λ, k_n), then X_1 + X_2 + ... + X_n is also an Erlang variable with parameter (λ, k_1 + k_2 + ... + k_n).

Page 248: Probablity

If a company employs n sales persons, its gross sales in thousands of rupees may be regarded as a RV having an Erlang distribution with λ = 1/2 and k = 80√n. If the sales cost is Rs. 8000 per person, how many sales persons should the company employ to maximise the expected profit?

Let X represent the gross sales (in Rupees) by n Sales persons

X will have Erlang distribution

Page 249: Probablity

Weibull Distribution

f(x) = αβ x^{β−1} e^{−αx^β}, x > 0;  parameters α, β > 0

When β = 1, the Weibull distribution reduces to the exponential distribution with parameter α.

μ_r' = E(X^r) = ∫_0^∞ x^r αβ x^{β−1} e^{−αx^β} dx

= ∫_0^∞ (y/α)^{r/β} e^{−y} dy = α^{−r/β} Γ(1 + r/β)   (putting y = αx^β)

Mean = E(X) = μ'₁ = α^{−1/β} Γ(1 + 1/β)

Var(X) = α^{−2/β} [Γ(1 + 2/β) − {Γ(1 + 1/β)}²]
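These moments are easy to evaluate with `math.gamma`; a sketch using the tube-life parameters α = 25, β = 2 from the example that follows (units are years, and the helper name is illustrative):

```python
import math

def weibull_mean_var(alpha, beta):
    """Mean and variance of the Weibull density f(x) = alpha*beta*x^(beta-1)*exp(-alpha*x^beta)."""
    mean = alpha ** (-1 / beta) * math.gamma(1 + 1 / beta)
    var = alpha ** (-2 / beta) * (math.gamma(1 + 2 / beta) - math.gamma(1 + 1 / beta) ** 2)
    return mean, var

m, v = weibull_mean_var(25, 2)
print(m, v)  # mean ≈ 0.177 years, variance ≈ 0.0086
```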

Example
Each of the 6 tubes of a radio set has a life length (in years) which may be considered as a RV that follows a Weibull distribution with parameters α = 25 and β = 2. If these tubes function independently of one another, what is the probability that no tube will have to be replaced during the first 2 months of service?

If X represents the life length of each tube, then its density function is given by f(x) = αβ x^{β−1} e^{−αx^β}, x > 0.

f(x) = 50x e^{−25x²}, x > 0

P(a tube is not to be replaced during the first 2 months)
= P(X > 1/6) = ∫_{1/6}^∞ 50x e^{−25x²} dx = e^{−25/36}

P(all 6 tubes are not to be replaced during the first 2 months)
= (e^{−25/36})⁶ = e^{−25/6} = 0.0155
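A quick numerical check of the final value (standard library only):

```python
import math

p_one = math.exp(-25 / 36)   # P(X > 1/6): one tube survives 2 months (1/6 year)
p_all = p_one ** 6           # the 6 tubes function independently
print(round(p_all, 4))       # 0.0155
```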

Properties of a Normal Distribution

• Continuous Random Variable

• Symmetrical in shape (Bell shaped)

• The probability of any given range of numbers is represented by the area under the curve for that range.

• Probabilities for all normal distributions are determined using the Standard Normal Distribution.

Probability for a Continuous Random Variable

Probability Density Function for the Normal Distribution

f(x) = (1/(σ√(2π))) e^{−(1/2)((x−μ)/σ)²}, −∞ < x < ∞, σ > 0

X ~ N(μ, σ)

Example: X ~ N(7, 4), i.e. μ = 7, σ = 4:

f(x) = (1/(4√(2π))) e^{−(1/2)((x−7)/4)²}, −∞ < x < ∞

∫_{−∞}^∞ f(x) dx = 1

Standard Normal Distribution N(0,1)

φ(z) = (1/√(2π)) e^{−z²/2}, −∞ < z < ∞

obtained by putting μ = 0, σ = 1 and changing x and f into z and φ respectively.

If X has distribution N(μ, σ) and Z = (X − μ)/σ, then Z has distribution N(0,1).

Values of φ(z) and ∫_0^z φ(z) dz are tabulated.

X ~ N(μ, σ)

E(X) = ∫_{−∞}^∞ x f(x) dx = (1/(σ√(2π))) ∫_{−∞}^∞ x e^{−(x−μ)²/(2σ²)} dx

Substituting t = (x − μ)/σ:
= (1/√(2π)) ∫_{−∞}^∞ (μ + σt) e^{−t²/2} dt
= μ (1/√(2π)) ∫_{−∞}^∞ e^{−t²/2} dt + σ (1/√(2π)) ∫_{−∞}^∞ t e^{−t²/2} dt = μ

Similarly, var(X) = σ².

Figure 6.3

Figure 6.5

Determining the Probability for a Standard Normal Random Variable

• P(−∞ < Z ≤ 1.62) = .5 + .4474 = .9474

• P(Z > 1.62) = 1 − P(−∞ < Z ≤ 1.62) = 1 − .9474 = .0526


Areas under the standard normal curve, P(0 ≤ Z ≤ z):

z 0.00 0.01 0.02 0.03 0.04 0.05 0.06 0.07 0.08 0.09

0.0 0.0000 0.0040 0.0080 0.0120 0.0160 0.0199 0.0239 0.0279 0.0319 0.0359

0.1 0.0398 0.0438 0.0478 0.0517 0.0557 0.0596 0.0636 0.0675 0.0714 0.0753

0.2 0.0793 0.0832 0.0871 0.0910 0.0948 0.0987 0.1026 0.1064 0.1103 0.1141

0.3 0.1179 0.1217 0.1255 0.1293 0.1331 0.1368 0.1406 0.1443 0.1480 0.1517

0.4 0.1554 0.1591 0.1628 0.1664 0.1700 0.1736 0.1772 0.1808 0.1844 0.1879

0.5 0.1915 0.1950 0.1985 0.2019 0.2054 0.2088 0.2123 0.2157 0.2190 0.2224

0.6 0.2257 0.2291 0.2324 0.2357 0.2389 0.2422 0.2454 0.2486 0.2517 0.2549

0.7 0.2580 0.2611 0.2642 0.2673 0.2704 0.2734 0.2764 0.2794 0.2823 0.2852


0.8 0.2881 0.2910 0.2939 0.2969 0.2995 0.3023 0.3051 0.3078 0.3106 0.3133

0.9 0.3159 0.3186 0.3212 0.3238 0.3264 0.3289 0.3315 0.3340 0.3365 0.3389

1.0 0.3413 0.3438 0.3461 0.3485 0.3508 0.3531 0.3554 0.3577 0.3599 0.3621

1.1 0.3643 0.3665 0.3686 0.3708 0.3729 0.3749 0.3770 0.3790 0.3810 0.3830

1.2 0.3849 0.3869 0.3888 0.3907 0.3925 0.3944 0.3962 0.3980 0.3997 0.4015

1.3 0.4032 0.4049 0.4066 0.4082 0.4099 0.4115 0.4131 0.4147 0.4162 0.4177

1.4 0.4192 0.4207 0.4222 0.4236 0.4251 0.4265 0.4279 0.4292 0.4306 0.4319

1.5 0.4332 0.4345 0.4357 0.4370 0.4382 0.4394 0.4406 0.4418 0.4429 0.4441

1.6 0.4452 0.4463 0.4474 0.4484 0.4495 0.4505 0.4515 0.4525 0.4535 0.4545


1.7 0.4554 0.4564 0.4573 0.4582 0.4591 0.4599 0.4608 0.4616 0.4625 0.4633

1.8 0.4641 0.4649 0.4656 0.4664 0.4671 0.4678 0.4686 0.4693 0.4699 0.4706

1.9 0.4713 0.4719 0.4726 0.4732 0.4738 0.4744 0.4750 0.4756 0.4761 0.4767

2.0 0.4772 0.4778 0.4783 0.4788 0.4793 0.4798 0.4803 0.4808 0.4812 0.4817

2.1 0.4821 0.4826 0.4830 0.4834 0.4838 0.4842 0.4846 0.4850 0.4854 0.4857

2.2 0.4861 0.4864 0.4868 0.4871 0.4875 0.4878 0.4881 0.4884 0.4887 0.4890

2.3 0.4893 0.4896 0.4898 0.4901 0.4904 0.4906 0.4909 0.4911 0.4913 0.4916

2.4 0.4918 0.4920 0.4922 0.4925 0.4927 0.4929 0.4931 0.4932 0.4934 0.4936

2.5 0.4938 0.4940 0.4941 0.4943 0.4945 0.4946 0.4948 0.4949 0.4951 0.4952

2.6 0.4953 0.4955 0.4956 0.4957 0.4959 0.4960 0.4961 0.4962 0.4963 0.4964


2.7 0.4965 0.4966 0.4967 0.4968 0.4969 0.4970 0.4971 0.4972 0.4973 0.4974

2.8 0.4974 0.4975 0.4976 0.4977 0.4977 0.4978 0.4979 0.4979 0.4980 0.4981

2.9 0.4981 0.4982 0.4982 0.4983 0.4984 0.4984 0.4985 0.4985 0.4986 0.4986

3.0 0.4987 0.4987 0.4987 0.4988 0.4988 0.4989 0.4989 0.4989 0.4990 0.4990

3.1 0.4990 0.4991 0.4991 0.4991 0.4992 0.4992 0.4992 0.4992 0.4993 0.4993

3.2 0.4993 0.4993 0.4994 0.4994 0.4994 0.4994 0.4994 0.4995 0.4995 0.4995

3.3 0.4995 0.4995 0.4995 0.4996 0.4996 0.4996 0.4996 0.4996 0.4996 0.4997

3.4 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4997 0.4998


Determining the probability of any Normal Random Variable


Interpreting Z

• In the figure, Z = −0.8 means that the value 360 is 0.8 standard deviations below the mean.

• A positive value of Z designates how many standard deviations (σ) X is to the right of the mean (μ).

• A negative value of Z designates how many standard deviations (σ) X is to the left of the mean (μ).

Example: A group of achievement scores is normally distributed with a mean of 76 and a standard deviation of 4. If one score is randomly selected, what is the probability that it is at least 80?

Z = (x − μ)/σ = (80 − 76)/4 = 1

P(x ≥ 80) = P(z ≥ 1) = .5 − P(0 ≤ z ≤ 1) = .5 − .3413 = .1587
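Instead of the table, Φ can be computed from `math.erf` via the identity Φ(z) = (1 + erf(z/√2))/2. A sketch for this example:

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

z = (80 - 76) / 4
p = 1 - phi(z)        # P(X >= 80) = P(Z >= 1)
print(round(p, 4))    # 0.1587
```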


Continuing, what is the probability that it is less than 70?

Z = (x − μ)/σ = (70 − 76)/4 = −1.5

P(x < 70) = P(z < −1.5) = .5 − P(−1.5 ≤ z ≤ 0) = .5 − .4332 = .0668

What proportion of the scores occurs within 70 and 85?

Z = (70 − 76)/4 = −1.5 and Z = (85 − 76)/4 = 2.25

P(70 ≤ x ≤ 85) = P(−1.5 ≤ z ≤ 2.25) = P(−1.5 ≤ z ≤ 0) + P(0 ≤ z ≤ 2.25)
= .4332 + .4878 = .9210

Time required to finish an exam is known to be normally distributed with a mean of 60 minutes and a standard deviation of 12 minutes. How much time should be allowed in order for 90% of the students to finish?

P(Z ≤ z) = .9 gives z = 1.28, so
x = μ + zσ = 60 + 1.28(12) = 75.36 minutes
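The reverse lookup (finding x for a given probability) can be sketched without a table by bisecting the erf-based Φ; this gives the unrounded z ≈ 1.2816 rather than the table's 1.28 (helper names are illustrative):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def phi_inv(p):
    """Invert phi by bisection (phi is strictly increasing)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

x = 60 + phi_inv(0.90) * 12   # z ≈ 1.2816 vs the table's rounded 1.28
print(round(x, 2))            # 75.38 (75.36 with z = 1.28)
```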

An automated machine that fills sugar sacks has an adjusting device to change the mean fill per sack. It is now being operated at a setting that results in a mean fill of 81.5 oz. If only 1% of the sacks filled at this setting contain less than 80.0 oz, what is the value of the variance for this population of fill weights? (Assume normality.)

P(x < 80.0) = .01 gives z = −2.33, so
−2.33 = (80.0 − 81.5)/σ  ⇒  σ = 1.5/2.33 = .6437  ⇒  σ² = .4144
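Solving the standardization equation for σ takes two lines (a sketch using the table value z = −2.33, so the answer matches the slide):

```python
z = -2.33                   # from the normal table: P(Z < -2.33) ≈ .01
sigma = (80.0 - 81.5) / z   # solve z = (x - mu) / sigma for sigma
variance = sigma ** 2
print(round(variance, 4))   # 0.4144
```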

Moment Generating Function of N(0,1)

M_Z(t) = E(e^{tZ}) = ∫_{−∞}^∞ e^{tz} φ(z) dz
= (1/√(2π)) ∫_{−∞}^∞ e^{tz − z²/2} dz
= (1/√(2π)) ∫_{−∞}^∞ e^{−(z−t)²/2 + t²/2} dz
= e^{t²/2} (1/√(2π)) ∫_{−∞}^∞ e^{−u²/2} du   (putting u = z − t)
= e^{t²/2}

The moment generating function of N(μ, σ)

With X = μ + σZ:
M_X(t) = E(e^{tX}) = E(e^{t(μ+σZ)}) = e^{μt} E(e^{(σt)Z}) = e^{μt} M_Z(σt)
= e^{μt} e^{σ²t²/2} = e^{μt + σ²t²/2}
= 1 + (μt + σ²t²/2)/1! + (μt + σ²t²/2)²/2! + …

If X has the distribution N(μ, σ), then Y = aX + b has the distribution N(aμ + b, aσ).

M_X(t) = e^{μt + σ²t²/2}
M_Y(t) = M_{aX+b}(t) = e^{bt} M_X(at) = e^{bt} e^{aμt + a²σ²t²/2} = e^{(aμ+b)t + a²σ²t²/2},
which is the m.g.f. of N(aμ + b, aσ).

In particular, if X has distribution N(μ, σ), then Z = (X − μ)/σ = (1/σ)X − μ/σ has the distribution N(μ/σ − μ/σ, σ/σ) = N(0, 1).

Additive Property of the Normal Distribution

If X_i (i = 1, 2, …, n) are n independent normal RVs with means μ_i and variances σ_i², then Σ_{i=1}^n a_i X_i is also a normal RV with mean Σ_{i=1}^n a_i μ_i and variance Σ_{i=1}^n a_i² σ_i².

M_{Σ a_i X_i}(t) = M_{a₁X₁}(t) M_{a₂X₂}(t) … M_{a_nX_n}(t)   (by independence)
= e^{a₁μ₁t + a₁²σ₁²t²/2} · e^{a₂μ₂t + a₂²σ₂²t²/2} · … · e^{a_nμ_nt + a_n²σ_n²t²/2}
= e^{(Σ a_iμ_i)t + (Σ a_i²σ_i²)t²/2}

Normal approximation to the binomial: X ~ B(n, p)

When n is very large and neither p nor q is very small, the standardized binomial variable is given by
Z = (X − np)/√(npq)

As X varies from 0 to n with step size 1, Z varies from −np/√(npq) to nq/√(npq) with step size 1/√(npq).

P(Z ≤ z) ≈ F(z) = (1/√(2π)) ∫_{−∞}^z e^{−t²/2} dt, −∞ < z < ∞

Example: Let X be the number of times that a fair coin, flipped 40 times, lands heads. Find P(X = 20). Use the normal approximation and compare it to the exact solution.

Here np = 20 and npq = 10, so with the continuity correction
P(X = 20) = P(19.5 < X < 20.5)
= P((19.5 − 20)/√10 ≤ (X − 20)/√10 ≤ (20.5 − 20)/√10)
= Φ(.16) − Φ(−.16) = .1272
(The exact value is C(40, 20)/2⁴⁰ ≈ .1254.)
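`math.comb` makes the exact comparison easy. A sketch: the table-rounded z = .16 gives .1272, while the erf-based Φ puts the approximation at about .1256, very close to the exact value:

```python
import math

def phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

exact = math.comb(40, 20) / 2 ** 40          # exact binomial probability
s = math.sqrt(10)                            # sqrt(npq)
approx = phi((20.5 - 20) / s) - phi((19.5 - 20) / s)
print(round(exact, 4), round(approx, 4))     # 0.1254 0.1256
```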


If 20% of the memory chips made in a certain plant are defective, what are the probabilities that in a lot of 100 randomly chosen for inspection (a) at most 15 will be defective? (b) exactly 15 will be defective?

np = 100(.20) = 20, √(npq) = √(100 × .20 × .80) = 4

(a) P(X ≤ 15) ≈ F((15.5 − 20)/4) = 0.1292

(b) P(X = 15) ≈ F((15.5 − 20)/4) − F((14.5 − 20)/4) = 0.0454

Fit a normal distribution to the following distribution and hence find the theoretical frequencies:

Class    Frequency
60-65          3
65-70         21
70-75        150
75-80        335
80-85        326
85-90        135
90-95         26
95-100         4
Total       1000

Rayleigh Distribution

If the pdf of a continuous RV X is f(x) = (x/α²) e^{−x²/(2α²)}, 0 ≤ x < ∞, then X follows a Rayleigh distribution with parameter α.

In communication systems, the signal amplitude values of a randomly received signal usually can be modeled as a Rayleigh distribution.

The Rayleigh distribution is the special case of the Weibull distribution f(x) = αβ x^{β−1} e^{−αx^β}, x > 0, with shape β = 2 (and Weibull scale 1/(2α²) in terms of the Rayleigh parameter α). Thus the Rayleigh distribution has a linear failure rate.

Chi-square Distribution

Let X have a gamma distribution with λ = 1/2 and k = r/2, where r is a positive integer. The pdf of X is

f(x) = x^{r/2 − 1} e^{−x/2} / (Γ(r/2) 2^{r/2}), 0 < x < ∞
f(x) = 0, x ≤ 0

X has a chi-square distribution χ²(r) with r degrees of freedom.

E(X) = k/λ = (r/2)/(1/2) = r
Var(X) = k/λ² = (r/2)/(1/4) = 2r

The mean equals the number of degrees of freedom and the variance equals twice the number of degrees of freedom.

m.g.f.: M_X(t) = (1 − 2t)^{−r/2}, t < 1/2


df \p

.005 .01 .025 .05 .10 .90 .95 .975 .99 .995

1 .00004 .00016 .00098 .0039 .0158 2.71 3.84 5.02 6.63 7.88

2 .0100 .0201 .0506 .1026 .2107 4.61 5.99 7.38 9.21 10.60

3 .0717 .115 .216 .352 .584 6.25 7.81 9.35 11.34 12.84

4 .207 .297 .484 .711 1.064 7.78 9.49 11.14 13.28 14.86

5 .412 .554 .831 1.15 1.61 9.24 11.07 12.83 15.09 16.75

6 .676 .872 1.24 1.64 2.20 10.64 12.59 14.45 16.81 18.55

7 .989 1.24 1.69 2.17 2.83 12.02 14.07 16.01 18.48 20.28

8 1.34 1.65 2.18 2.73 3.49 13.36 15.51 17.53 20.09 21.96

9 1.73 2.09 2.70 3.33 4.17 14.68 16.92 19.02 21.67 23.59

10 2.16 2.56 3.25 3.94 4.87 15.99 18.31 20.48 23.21 25.19

11 2.60 3.05 3.82 4.57 5.58 17.28 19.68 21.92 24.73 26.76

12 3.07 3.57 4.40 5.23 6.30 18.55 21.03 23.34 26.22 28.30

13 3.57 4.11 5.01 5.89 7.04 19.81 22.36 24.74 27.69 29.82


14 4.07 4.66 5.63 6.57 7.79 21.06 23.68 26.12 29.14 31.32

15 4.6 5.23 6.26 7.26 8.55 22.31 25 27.49 30.58 32.80

16 5.14 5.81 6.91 7.96 9.31 23.54 26.30 28.85 32.00 34.27

18 6.26 7.01 8.23 9.39 10.86 25.99 28.87 31.53 34.81 37.16

20 7.43 8.26 9.59 10.85 12.44 28.41 31.41 34.17 37.57 40.00

24 9.89 10.86 12.40 13.85 15.66 33.20 36.42 39.36 42.98 45.56

30 13.79 14.95 16.79 18.49 20.60 40.26 43.77 46.98 50.89 53.67

40 20.71 22.16 24.43 26.51 29.05 51.81 55.76 59.34 63.69 66.77

60 35.53 37.48 40.48 43.19 46.46 74.40 79.08 83.30 88.38 91.95

120 83.85 86.92 91.58 95.70 100.62 140.23 146.57 152.21 158.95 163.64


Exercises
1. Let X be χ²(10). Find P(3.25 ≤ X ≤ 20.5). Ans. 0.95
2. If (1 − 2t)^{−6}, t < 1/2, is the m.g.f. of the random variable X, find P(X < 5.23). Ans. 0.05
3. If X is χ²(5), determine the constants c and d so that P(c < X < d) = 0.95 and P(X < c) = 0.025. Ans. c = 0.831, d = 12.8
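For even degrees of freedom, χ²(r) is Erlang with k = r/2 and λ = 1/2, so its CDF has a closed form. A sketch checking exercise 2, where the m.g.f. (1 − 2t)^{−6} identifies r = 12 (the helper name is illustrative):

```python
import math

def chi2_cdf_even_df(x, r):
    """Chi-square CDF for even df r, via the Erlang(k = r/2, lam = 1/2) form:
    P(X <= x) = 1 - exp(-x/2) * sum_{j < k} (x/2)^j / j!"""
    k = r // 2
    return 1 - math.exp(-x / 2) * sum((x / 2) ** j / math.factorial(j) for j in range(k))

p = chi2_cdf_even_df(5.23, 12)   # (1 - 2t)^(-6) is the m.g.f. of chi-square(12)
print(round(p, 3))               # about 0.05, matching the table
```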

Beta Distribution

The random variable X is said to have a beta distribution with nonnegative parameters α and β if

f(x) = (1/B(α, β)) x^{α−1} (1 − x)^{β−1}, 0 < x < 1
f(x) = 0 otherwise

where B(α, β) = ∫_0^1 x^{α−1} (1 − x)^{β−1} dx = 2 ∫_0^{π/2} sin^{2α−1}θ cos^{2β−1}θ dθ

Depending on the values of α and β, the beta distribution takes on a variety of shapes.

The beta distribution provides greater flexibility than the uniform distribution on (0,1).

When α = β = 1, the beta distribution is the uniform distribution on (0,1).

Example
In a certain country, the proportion of highway sections requiring repairs in any given year is a random variable having the beta distribution with α = 3 and β = 2. Find
(a) on the average, what percentage of the highway sections require repairs in any given year;
(b) the probability that at most half of the highway sections will require repairs in any given year.

(a) Mean = α/(α + β) = 3/(3 + 2) = 0.60, i.e. on the average 60% of the highway sections require repairs in any given year.

(b) P(X ≤ 1/2) = ∫_0^{1/2} 12x²(1 − x) dx = 5/16
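The integral in (b) has the antiderivative 4x³ − 3x⁴, so both answers can be checked in a few lines:

```python
mean = 3 / (3 + 2)                      # E(X) = alpha / (alpha + beta)
F = lambda x: 4 * x ** 3 - 3 * x ** 4   # antiderivative of f(x) = 12 x^2 (1 - x)
p = F(0.5) - F(0.0)
print(mean, p)                          # 0.6 0.3125 (= 5/16)
```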

The Lognormal Distribution

If X = log T follows a normal distribution N(μ, σ), then T follows a lognormal distribution whose pdf is given by

f(t) = (1/(√(2π) s t)) exp[−(1/(2s²)) {log(t/t_m)}²], t > 0

where s is a shape parameter and the median time (or median time to failure) t_m is the location parameter, given by log t_m = μ.

MTTF = E(t) = t_m exp(s²/2)
var(T) = t_m² exp(s²) {exp(s²) − 1}

F(t) = P(T ≤ t) = P(log T ≤ log t)
= P((log T − log t_m)/s ≤ (1/s) log(t/t_m))
= P(Z ≤ (1/s) log(t/t_m)) = Φ((1/s) log(t/t_m))

Example
Fatigue wearout of a component has a lognormal distribution with t_m = 5000 hours and s = 0.20.
(a) Compute the MTTF and SD.
(b) Find the reliability of the component for 3000 hours.
(c) Find the design life of the component for a reliability of 0.95.

(a) MTTF = t_m exp(s²/2) = 5000 e^{0.02} = 5101 hours; SD = 1030 hours.

(b) R(3000) = ∫_{3000}^∞ f(t) dt = 1 − Φ((1/0.2) log(3000/5000)) = ∫_{−2.55}^∞ φ(z) dz = 0.9946

(c) For R(t) = 0.95 we need (1/0.2) log(t/5000) = −1.645, giving a design life t = 5000 e^{−0.329} ≈ 3598 hours.

R(t) and F(t) for any other t can be computed in the same way.
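All three parts reduce to a few calls to `exp`, `log` and Φ (a sketch; 1.645 is the standard normal value z with Φ(z) = 0.95):

```python
import math

def phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

t_med, s = 5000, 0.20
mttf = t_med * math.exp(s ** 2 / 2)
sd = math.sqrt(t_med ** 2 * math.exp(s ** 2) * (math.exp(s ** 2) - 1))
r_3000 = 1 - phi(math.log(3000 / t_med) / s)
design_life = t_med * math.exp(-1.645 * s)   # solve R(t) = 0.95
print(round(mttf), round(sd), round(r_3000, 4), round(design_life))
```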
