Probability & Statistics


Rubayet Karim, Assistant Professor
Dept. of Industrial & Production Engineering, Jessore University of Science & Technology

Probability is a branch of mathematics that deals with calculating the likelihood of a given event's occurrence, which is expressed as a number between 0 and 1.

Classical definition:

$$P(E) = \frac{n(E)}{N}$$

Where: P(E): probability of occurrence of event E; n(E): number of outcomes in E; N: total number of outcomes.

Empirical (relative frequency) definition:

$$P(E) = \frac{n(E)}{N}$$

Where: P(E): probability of occurrence of event E; n(E): number of occurrences of E; N: total number of trials.
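A minimal Python sketch (not from the original slides) contrasting the two definitions for a fair die; the event "roll an even number" is an illustrative choice:

```python
import random

random.seed(0)

# Classical definition: P(E) = n(E) / N over equally likely outcomes.
outcomes = [1, 2, 3, 4, 5, 6]           # sample space of one die roll
event = {2, 4, 6}                        # E: roll an even number
p_classical = len(event) / len(outcomes)

# Empirical definition: relative frequency of E over N trials.
N = 100_000
hits = sum(random.choice(outcomes) in event for _ in range(N))
p_empirical = hits / N

print(f"classical: {p_classical:.4f}")   # 0.5000
print(f"empirical: {p_empirical:.4f}")   # ~0.5
```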

Example. Experiment: Tossing 4 coins.

Trial: Tossing each coin.

We can consider the act of tossing each coin as a trial and thus say that there are 4 trials in the experiment of tossing 4 coins.
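A short sketch of this experiment, enumerating the 2^4 = 16 equally likely outcomes of the four trials; the event "exactly two heads" is an illustrative assumption, not from the slides:

```python
from itertools import product

# Sample space of the 4-coin experiment: each trial (one coin toss)
# has outcomes H or T, so there are 2**4 = 16 equally likely outcomes.
sample_space = list(product("HT", repeat=4))
print(len(sample_space))                  # 16

# Classical probability of an illustrative event: exactly two heads.
event = [s for s in sample_space if s.count("H") == 2]
print(len(event) / len(sample_space))     # 6/16 = 0.375
```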

In probability theory, an elementary event (also called an atomic event or simple event) is an event which contains only a single outcome in the sample space.

Example: Die rolling
• The possible outcomes of this experiment are 1, 2, 3, 4, 5 and 6. Each single outcome is an elementary event.
• When the objective is to get an even number from this experiment, the possible outcomes are 2, 4 and 6. This is not a single outcome, so it is not an elementary event.
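A small sketch of the same idea, treating events as sets of outcomes; the helper name is_elementary is hypothetical, introduced here only for illustration:

```python
# Elementary vs. compound events for one die roll, as sets of outcomes.
sample_space = {1, 2, 3, 4, 5, 6}

roll_a_three = {3}           # single outcome -> elementary event
roll_even = {2, 4, 6}        # three outcomes -> compound, not elementary

def is_elementary(event: set) -> bool:
    """An event is elementary iff it contains exactly one outcome."""
    return len(event) == 1

print(is_elementary(roll_a_three))   # True
print(is_elementary(roll_even))      # False
```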

The impossible event is the empty set { }; therefore P(∅) = 0.

If X and Y are independent, P(X | Y) = P(X) and P(Y | X) = P(Y).

Complement rule: P(A) + P(A′) = 1.

Types of Probability

There are four types:

Marginal probability P(X)
• The probability of X occurring.

Union probability P(X ∪ Y)
• The probability of X or Y occurring.

Joint probability P(X ∩ Y)
• The probability of X and Y occurring.

Conditional probability P(X | Y)
• The probability of X occurring given that Y has occurred.

General Law of Addition
P(X ∪ Y) = P(X) + P(Y) − P(X ∩ Y)
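The sketch below verifies all four types of probability and the general law of addition by exhaustive enumeration of two fair dice; the events X and Y are illustrative assumptions, not from the slides:

```python
from itertools import product
from fractions import Fraction

# Two fair dice, enumerated exhaustively: 36 equally likely outcomes.
space = list(product(range(1, 7), repeat=2))
N = len(space)

def prob(event):
    return Fraction(len(event), N)

X = {s for s in space if s[0] == 6}       # X: first die shows 6
Y = {s for s in space if sum(s) >= 10}    # Y: total is at least 10

print(prob(X))                            # marginal    P(X)     = 1/6
print(prob(X & Y))                        # joint       P(X ∩ Y) = 1/12
print(prob(X | Y))                        # union       P(X ∪ Y) = 1/4
print(Fraction(len(X & Y), len(Y)))       # conditional P(X | Y) = 1/2
print(prob(X) + prob(Y) - prob(X & Y) == prob(X | Y))  # addition law: True
```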

When X and Y are mutually exclusive, P(X ∩ Y) = 0 and the law of addition reduces to:
P(X ∪ Y) = P(X) + P(Y)

• Example: P(T ∪ C) = P(T) + P(C)

General Law of Multiplication
P(X ∩ Y) = P(X) P(Y | X) = P(Y) P(X | Y)

• Example: with P(S | M) = 0.2, the joint probability is P(S ∩ M) = P(M) P(S | M).

When X and Y are independent, P(X) = P(X | Y) and P(Y) = P(Y | X), so the law of multiplication reduces to:
P(X ∩ Y) = P(X) P(Y)
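A short enumeration check of the multiplication law and of the independent case; the events (each die showing an even face) are illustrative assumptions:

```python
from itertools import product
from fractions import Fraction

# Two fair dice again; X and Y are physically independent events.
space = list(product(range(1, 7), repeat=2))
N = len(space)

X = {s for s in space if s[0] % 2 == 0}      # first die even
Y = {s for s in space if s[1] % 2 == 0}      # second die even

pX, pY = Fraction(len(X), N), Fraction(len(Y), N)
pXY = Fraction(len(X & Y), N)                # joint P(X ∩ Y)
pY_given_X = Fraction(len(X & Y), len(X))    # conditional P(Y | X)

print(pXY == pX * pY_given_X)    # general law of multiplication: True
print(pXY == pX * pY)            # independence: P(X ∩ Y) = P(X) P(Y): True
```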

From the law of multiplication, conditional probability can be rewritten as Bayes' theorem:

$$P(X \mid Y) = \frac{P(X \cap Y)}{P(Y)} = \frac{P(Y \mid X)\,P(X)}{P(Y)}$$

For mutually exclusive, exhaustive events $X_1, X_2, \dots, X_n$:

$$P(X_i \mid Y) = \frac{P(Y \mid X_i)\,P(X_i)}{\sum_{j=1}^{n} P(Y \mid X_j)\,P(X_j)}$$

Bayes' rule
• The events are mutually exclusive (i.e., they conflict with each other).
• Together they must form a sample space.
• Most of the time we use reverse-time-order probability, i.e., P(cause | effect).
• When a conditional probability is stated in reverse time order, it is called a posterior probability.
• P(Y | X): time order (cause → effect).
• P(X | Y): reverse time order (effect → cause).

For three events:

$$P(X_1 \mid Y) = \frac{P(Y \mid X_1)\,P(X_1)}{P(Y \mid X_1)\,P(X_1) + P(Y \mid X_2)\,P(X_2) + P(Y \mid X_3)\,P(X_3)}$$
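A minimal sketch of this rule as code; the function name posterior and the example priors/likelihoods are hypothetical, introduced only for illustration:

```python
# Bayes' rule over mutually exclusive, exhaustive hypotheses X_1..X_n.
def posterior(priors, likelihoods):
    """Return P(X_i | Y) for each i, given P(X_i) and P(Y | X_i)."""
    joints = [p * l for p, l in zip(priors, likelihoods)]  # P(Y | X_i) P(X_i)
    total = sum(joints)                                    # P(Y), by total probability
    return [j / total for j in joints]

# Sanity check with two hypothetical hypotheses: posteriors sum to 1.
print(posterior([0.3, 0.7], [0.9, 0.2]))   # [~0.6585, ~0.3415]
```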

Example
Marie is getting married tomorrow, at an outdoor ceremony in the desert. In recent years, it has rained only 5 days each year. Unfortunately, the weatherman has predicted rain for tomorrow. When it actually rains, the weatherman correctly forecasts rain 90% of the time. When it doesn't rain, he incorrectly forecasts rain 10% of the time. What is the probability that it will rain on the day of Marie's wedding?

Solution: The sample space is defined by two mutually exclusive events: it rains or it does not rain. Additionally, a third event occurs when the weatherman predicts rain. Notation for these events appears below.
Event A1. It rains on Marie's wedding.
Event A2. It does not rain on Marie's wedding.
Event B. The weatherman predicts rain.

In terms of probabilities, we know the following:
P(A1) = 5/365 = 0.0136985 [It rains 5 days out of the year.]
P(A2) = 360/365 = 0.9863014 [It does not rain 360 days out of the year.]
P(B | A1) = 0.9 [When it rains, the weatherman predicts rain 90% of the time.]
P(B | A2) = 0.1 [When it does not rain, the weatherman predicts rain 10% of the time.]

We want to know P(A1 | B), the probability it will rain on the day of Marie's wedding, given a forecast for rain by the weatherman. The answer can be determined from Bayes' theorem, as shown below.
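Substituting these values into Bayes' theorem:

$$P(A_1 \mid B) = \frac{P(B \mid A_1)\,P(A_1)}{P(B \mid A_1)\,P(A_1) + P(B \mid A_2)\,P(A_2)} = \frac{(0.9)(0.0136985)}{(0.9)(0.0136985) + (0.1)(0.9863014)} \approx 0.111$$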

Note the somewhat unintuitive result. Even when the weatherman predicts rain, it rains only about 11% of the time. Despite the weatherman's gloomy prediction, there is a good chance that Marie will not get rained on at her wedding.

Example (continuous uniform distribution): the density is a constant, f(x) = c, over its interval.

P(X ≥ 12.5) = 0.1666
P(X < 12.5) = 1 − P(X ≥ 12.5) = 1 − 0.1666 = 0.8333
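A sketch of the calculation under an assumed support: the endpoints a = 10 and b = 13 are hypothetical (the slide omits them) and are chosen only because they reproduce the quoted tail probability 0.5/3 ≈ 0.1666:

```python
# HYPOTHETICAL interval endpoints: chosen so that P(X >= 12.5) = 0.5/3.
a, b = 10.0, 13.0           # assumed support of X ~ Uniform(a, b)
c = 1.0 / (b - a)           # constant density f(x) = c on [a, b]

p_ge = (b - 12.5) * c       # P(X >= 12.5): area under f right of 12.5
p_lt = 1.0 - p_ge           # complement rule: P(X < 12.5)

print(f"P(X >= 12.5) = {p_ge:.4f}")   # 0.1667
print(f"P(X <  12.5) = {p_lt:.4f}")   # 0.8333
```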

Application

The exponential distribution occurs naturally when describing the lengths of the inter-arrival times in a homogeneous Poisson process.

• Queuing theory: the service times of agents in a system (e.g., how long it takes a bank teller to serve a customer) are often modeled as exponentially distributed variables.

• Reliability theory: because of the memoryless property of this distribution, it is well suited to model the constant-hazard-rate portion of the bathtub curve.
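A brief simulation sketch (not from the slides) of exponentially distributed inter-arrival times; the rate of 2 events per unit time is an arbitrary assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
rate = 2.0                      # hypothetical Poisson arrival rate
gaps = rng.exponential(scale=1.0 / rate, size=100_000)  # inter-arrival times

# The mean inter-arrival time should be 1/rate.
print(f"mean gap: {gaps.mean():.4f}  (theory: {1.0 / rate:.4f})")

# Memoryless property: P(X > s + t | X > s) = P(X > t).
s, t = 0.5, 0.7
lhs = (gaps > s + t).mean() / (gaps > s).mean()
rhs = (gaps > t).mean()
print(f"P(X > s+t | X > s) = {lhs:.4f}, P(X > t) = {rhs:.4f}")
```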

Writing $\mu'_r = E[X^r]$ for the r-th raw moment:

1st moment of X: Mean,
$$E[X] = \mu'_1$$

1st & 2nd moments of X: Variance,
$$E[X^2] - (E[X])^2 = \mu'_2 - (\mu'_1)^2$$

1st, 2nd & 3rd moments of X: Skewness (third central moment),
$$E[X^3] - 3E[X]\,E[X^2] + 2(E[X])^3 = \mu'_3 - 3\mu'_1\mu'_2 + 2(\mu'_1)^3$$

1st, 2nd, 3rd & 4th moments of X: Kurtosis (fourth central moment),
$$E[X^4] - 4E[X]\,E[X^3] + 6(E[X])^2 E[X^2] - 3(E[X])^4 = \mu'_4 - 4\mu'_1\mu'_3 + 6(\mu'_1)^2\mu'_2 - 3(\mu'_1)^4$$
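These identities can be checked numerically; the sketch below estimates the raw moments of a hypothetical skewed sample and compares the formulas against direct computation of the central moments:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.gamma(shape=2.0, scale=1.5, size=200_000)  # hypothetical skewed sample

# Raw moments mu'_r = E[X^r], estimated from the sample.
m1, m2, m3, m4 = (np.mean(x**r) for r in (1, 2, 3, 4))

# Central moments via the raw-moment formulas above.
var  = m2 - m1**2
mom3 = m3 - 3*m1*m2 + 2*m1**3
mom4 = m4 - 4*m1*m3 + 6*m1**2*m2 - 3*m1**4

# Cross-check against direct computation of E[(X - mu)^r].
mu = m1
print(np.isclose(var,  np.mean((x - mu)**2)))   # True
print(np.isclose(mom3, np.mean((x - mu)**3)))   # True
print(np.isclose(mom4, np.mean((x - mu)**4)))   # True
```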

Example: The value of a piece of factory equipment after three years of use is 100(0.5)^X, where X is a random variable having moment generating function M_X(t) = 1/(1 − 2t) for t < 1/2. Calculate the expected value of this piece of equipment after three years of use.

Soln: Let the value Y = 100(0.5)^X.

So the expected value is
$$E[Y] = E[100(0.5)^X] = 100\,E[(0.5)^X] = 100\,E[e^{X \ln 0.5}] = 100\,M_X(\ln 0.5) = 100 \times \frac{1}{1 - 2\ln 0.5} = 41.9060$$
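As a sanity check, M_X(t) = 1/(1 − 2t) is the MGF of an exponential distribution with mean 2 (this follows from the MGF assumed above), so the result can be verified by simulation:

```python
import numpy as np

rng = np.random.default_rng(2)

# M_X(t) = 1/(1 - 2t), t < 1/2, is the MGF of Exp(mean 2).
x = rng.exponential(scale=2.0, size=1_000_000)

mc = np.mean(100 * 0.5**x)              # Monte Carlo estimate of E[100 * 0.5^X]
exact = 100 / (1 - 2 * np.log(0.5))     # 100 * M_X(ln 0.5)

print(f"Monte Carlo: {mc:.3f}")         # ~41.9
print(f"Exact:       {exact:.4f}")      # 41.9060
```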

THE END