Dr. Ahmed M. Sultan
Topics
• Review of probability theory
  – Random variables
  – Conditional probability and conditional expectation
• The analysis of variance
  – Introduction
  – Single-factor ANOVA
• Simple linear regression and correlation
  – Introduction
  – The simple linear regression model
  – Estimating model parameters
  – Inferences about the slope parameter
Topics (Cont.)
• Multivariate regression analysis
  – When to use multivariate regression
  – Control variables
  – Interpreting coefficients
  – Goodness of fit (R-squared statistic)
• The exponential distribution and the Poisson process
• Queueing theory
  – The M/M/1 queue
    • Steady-state probabilities
    • Some performance measures
Topics (Cont.)
  – The M/M/m queue
    • Steady-state probabilities
    • Some performance measures
  – The M/M/1/K queue
    • Steady-state probabilities
    • Some performance measures
• Discrete event simulation
  – Generating pseudo-random numbers
    • Congruential methods for generating pseudo-random numbers
    • Composite generators
  – Statistical tests for goodness of fit
Topics (Cont.)
• Generating stochastic variables
  – The inverse transformation method
  – Sampling from continuous probability distributions
• Data manipulation in MINITAB
  – Recording and transforming variables
  – Graphs and charts
  – Scatter plots
  – Histograms
  – Box plots and other charts
  – Cross tabulation
References
• Devore, J., "Probability and Statistics for Engineering and the Sciences"
• Willig, A., "A Short Introduction to Queueing Theory"
• Banks, J., et al., "Discrete-Event System Simulation"
Review of probability theory

1. Laws of probability

DEFINITION
If an event E occurs m times in an n-trial experiment, then the probability P(E) is defined as:

P(E) = lim (m/n) as n → ∞

i.e. the experiment is repeated infinitely often.
• e.g. in the case of flipping a coin, the longer the experiment is repeated, the closer the estimate of P(H) (or P(T)) gets to the theoretical value of 0.5.
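This convergence of the relative frequency m/n to P(H) can be illustrated with a short simulation (a minimal sketch; the function name and seed are illustrative, not part of the course material):

```python
import random

def estimate_p_heads(n, seed=42):
    """Estimate P(H) as the relative frequency m/n over n fair coin flips."""
    rng = random.Random(seed)
    m = sum(rng.randint(0, 1) for _ in range(n))  # count a 1 as heads
    return m / n

# The estimate tightens around the theoretical 0.5 as n grows.
for n in (10, 1_000, 100_000):
    print(n, estimate_p_heads(n))
```

Rerunning with different seeds changes the small-n estimates noticeably, while the large-n estimates stay close to 0.5.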
0 ≤ P(E) ≤ 1
P(E) = 0 … E is impossible
P(E) = 1 … E is certain (sure)
HW
• In a study to correlate senior-year high school students' scores in mathematics with enrollment in engineering colleges, 1000 students were surveyed:
400 have studied mathematics.
Engineering enrollment shows that of the 1000 seniors:
150 have studied mathematics
29 have not
Determine the probability of:
a. A student who studied mathematics is enrolled in engineering
b. A student who neither studied mathematics nor enrolled in engineering
c. A student is not studying engineering
1.1 Addition law of probability
E∪F … union of E and F
EF … intersection of E and F
If EF = ∅, E and F are mutually exclusive (occurrence of one precludes the other)
• Addition law:
P(E∪F) = P(E) + P(F) − P(EF)
Example
Rolling a die
S = {1, 2, 3, 4, 5, 6} … sample space
P(1) = P(2) = P(3) = P(4) = P(5) = P(6) = 1/6
Define
E = {1, 2, 3, or 4}
F = {3, 4, or 5}
EF = {3, 4}
P(E) = P(1) + P(2) + P(3) + P(4) = 4/6 = 2/3
P(F) = 3/6 = 1/2
P(EF) = 2/6 = 1/3
P(E∪F) = P(E) + P(F) − P(EF) = 2/3 + 1/2 − 1/3 = 5/6
which is intuitively clear since E∪F = {1, 2, 3, 4, 5}
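The die example can be checked by direct enumeration of the equally likely outcomes (a minimal sketch; the set names follow the example above):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space of a fair die
E = {1, 2, 3, 4}
F = {3, 4, 5}

def P(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(S))

# Addition law: P(E U F) = P(E) + P(F) - P(EF)
lhs = P(E | F)
rhs = P(E) + P(F) - P(E & F)
print(lhs, rhs)  # both are 5/6
```

Using `Fraction` keeps the arithmetic exact, so the two sides match without floating-point rounding concerns.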
HW
• A fair die is tossed twice. E and F represent the outcomes of the two tosses. Compute the following probabilities:
a. Sum of E and F is 11
b. Sum of E and F is even
c. Sum of E and F is odd and greater than 3
d. E is even and less than 6, and F is odd and greater than 1
e. E is greater than 2 and F is less than 4
f. E is 4 and the sum of E and F is odd
1.2 Conditional Probability
• The conditional probability of an event E is the probability that the event will occur given the knowledge that an event F has already occurred. This probability is written P(E|F), the notation for the probability of E given F.
• In the case where events E and F are independent (where event F has no effect on the probability of event E), the conditional probability of event E given event F is simply the probability of event E, that is P(E).
• If events E and F are not independent, then the probability of the intersection of E and F (the probability that both events occur) is defined by P(E and F) = P(F)P(E|F).
• From this definition, the conditional probability P(E|F) is easily obtained by dividing by P(F):
• P(E|F) = P(EF) / P(F), P(F) > 0
• Note: this expression is only valid when P(F) is greater than 0.
Example
In rolling a die, what is the probability that the outcome is 6, given that the roll turned up an even number?

Solution
E = {6}, F = {2, 4, 6}, thus
P(E|F) = P(EF)/P(F) = P(E)/P(F) = (1/6)/(1/2) = 1/3
Note that P(EF) = P(E) because E is a subset of F.
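The same conditional probability can be computed by enumeration (a sketch mirroring the sets in the solution above):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space of a fair die
E = {6}                  # the outcome is 6
F = {2, 4, 6}            # the outcome is even

def P(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(S))

p_E_given_F = P(E & F) / P(F)   # = (1/6) / (1/2)
print(p_E_given_F)  # 1/3
```

Note that `E & F == E` here, which is exactly the subset observation made in the solution.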
Example
Ninety percent of flights depart on time. Eighty percent of flights arrive on time. Seventy-five percent of flights depart on time and arrive on time.
(a) You are meeting a flight that departed on time. What is the probability that it will arrive on time?
(b) You have met a flight, and it arrived on time. What is the probability that it departed on time?
(c) Are the events, departing on time and arriving on time, independent?
Solution
Denote the events
A = {arriving on time}, D = {departing on time}.
P{A} = 0.8, P{D} = 0.9, P{AD} = 0.75.
(a) P{A|D} = P{AD} / P{D} = 0.75 / 0.9 = 0.8333
(b) P{D|A} = P{AD} / P{A} = 0.75 / 0.8 = 0.9375
(c) The events are not independent because P{A|D} ≠ P{A}, P{D|A} ≠ P{D}, P{AD} ≠ P{A}P{D}.
Actually, any one of these inequalities is sufficient to prove that A and D are dependent. Further, we see that P{A|D} > P{A} and P{D|A} > P{D}. In other words, departing on time increases the probability of arriving on time, and vice versa. This perfectly agrees with our intuition.
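The arithmetic in (a)-(c) can be verified directly (a minimal sketch; the variable names are illustrative):

```python
# Given probabilities: arrive on time, depart on time, both on time
p_A, p_D, p_AD = 0.8, 0.9, 0.75

p_A_given_D = p_AD / p_D   # (a) conditional probability of arriving given departed on time
p_D_given_A = p_AD / p_A   # (b) conditional probability of departed given arrived on time
print(round(p_A_given_D, 4))   # 0.8333
print(round(p_D_given_A, 4))   # 0.9375

# (c) Independence would require P{AD} == P{A} * P{D}; here 0.75 != 0.72
print(p_AD == p_A * p_D)       # False
```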
HW
In the example of tossing a die, if given that the outcome is less than 6, determine:
a. The probability of getting an even number
b. The probability of getting an odd number larger than 1
HW
• You can toss a fair coin up to 7 times. You will win 1000 SR if three tails appear before a head is encountered. What are your chances of winning?
HW
Graduating high school seniors with an ACT score of at least 26 can apply to two universities, A and B, for admission. The probability of being accepted at A is 0.4 and at B is 0.25. The chance of being accepted at both universities is only 15%.
1. Determine the probability that the student is accepted at B given that A has granted admission as well.
2. What is the probability that admission will be granted at A given that the student was accepted at B?
Random variables
Definition:
• Consider a random experiment with sample space S. A random variable X(ζ) is a single-valued real function that assigns a real number, called the value of X(ζ), to each sample point ζ of S. Often we use a single letter X for this function in place of X(ζ), and use r.v. to denote the random variable.
• Note that the terminology used here is traditional. Clearly a random variable is not a variable at all in the usual sense; it is a function.
• The sample space S is termed the domain of the r.v. X, and the collection of all numbers [values of X(ζ)] is termed the range of the r.v. X. Thus the range of X is a certain subset of the set of all real numbers.
• Note that two or more different sample points might give the same value of X(ζ), but two different numbers in the range cannot be assigned to the same sample point.
• EXAMPLE: In the experiment of tossing a coin, we might define the r.v. X as:
X(H) = 1, X(T) = 0
Note that we could also define another r.v., say Y or Z, with
Y(H) = 0, Y(T) = 1 or Z(H) = 1, Z(T) = 2
EXAMPLE
Consider an experiment of tossing 3 fair coins and counting the number of heads. Certainly, the same model suits the number of girls in a family with 3 children, the number of 1's in a random binary code consisting of 3 characters, etc.
Let X be the number of heads (girls, 1's). Prior to the experiment, its value is not known. All we can say is that X has to be an integer between 0 and 3.
Since assuming each value is an event, we can compute the probabilities:
• P{X = 0} = P{three tails} = P{TTT} = (1/2)(1/2)(1/2) = 1/8
• P{X = 1} = P{HTT} + P{THT} + P{TTH} = 3/8
• P{X = 2} = P{HHT} + P{HTH} + P{THH} = 3/8
• P{X = 3} = P{HHH} = 1/8
• Summarizing,
x      P{X = x}
0      1/8
1      3/8
2      3/8
3      1/8
Total  1
• This table contains everything that is known about random variable X prior to the experiment.
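The table above can be reproduced by enumerating all 8 equally likely outcomes of the three tosses (a minimal sketch):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=3))   # 8 equally likely triples
counts = Counter(o.count("H") for o in outcomes)
pmf = {x: Fraction(counts[x], len(outcomes)) for x in sorted(counts)}

for x, p in pmf.items():
    print(x, p)   # 0 1/8, 1 3/8, 2 3/8, 3 1/8
```

This is the same counting argument as on the previous slide, just carried out mechanically over the sample space.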
• Before we know the outcome ω, we cannot tell what X equals. However, we can list all the possible values of X and determine the corresponding probabilities.
• The collection of all the probabilities related to X is the distribution of X. The function P(x) = P{X = x} is the probability mass function, or pmf. The cumulative distribution function, or cdf, is defined as
F(x) = P{X ≤ x} = Σ_{y ≤ x} P(y)
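Given the pmf from the three-coin example, the cdf F(x) = P{X ≤ x} is just a partial sum of pmf values (a sketch):

```python
from fractions import Fraction

# pmf of the number of heads in 3 fair coin tosses
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def cdf(x):
    """F(x) = P{X <= x} = sum of P(y) over all y <= x."""
    return sum((p for y, p in pmf.items() if y <= x), Fraction(0))

print(cdf(1))  # 1/2
print(cdf(3))  # 1
```

As the definition requires, the cdf is non-decreasing and reaches 1 at the largest value of X.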
Types of random variables
• There are two types of random variables:
• A discrete random variable can take on only specified, distinct values.
• A continuous random variable can take on any value within an interval.
Probability distribution for a discrete random variable
• A probability distribution for a discrete random variable is a mutually exclusive listing of all possible numerical outcomes for that random variable, such that a particular probability of occurrence is associated with each outcome.
• Probability distribution for the toss of a die:
x      P{X = x}
1      1/6
2      1/6
3      1/6
4      1/6
5      1/6
6      1/6
• This is an example of a uniform distribution.
Discrete Probability Distributions
• Discrete probability distributions have 3 major properties:
1) ∑ P(x) = 1
2) P(x) ≥ 0
3) When you substitute a value of the random variable into the function, you find the probability that that particular value will occur.
• Three major discrete probability distributions: the binomial distribution, the hypergeometric distribution, and the Poisson distribution.
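The three properties can be checked for the fair-die pmf shown earlier (a minimal sketch):

```python
from fractions import Fraction

# Uniform pmf of a fair die: P{X = x} = 1/6 for x = 1..6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

assert sum(pmf.values()) == 1              # 1) probabilities sum to 1
assert all(p >= 0 for p in pmf.values())   # 2) every probability is >= 0
print(pmf[4])                              # 3) substituting x = 4 yields P{X = 4} = 1/6
```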