
Chapter 2: Random Variable and Probability Distributions

zlyang@smu.edu.sg, http://www.mysmu.edu/faculty/zlyang/

Yang Zhenlin


STAT151, Term I 2015-16 © Zhenlin Yang, SMU

Chapter Contents

Random Variable (r.v.)

– Discrete r.v.
– Continuous r.v.

Distribution of a Random Variable

– Probability mass function (pmf)
– Probability density function (pdf)

Expectation of a Random Variable

Variance and Standard Deviation

Moment Generating Function


Random Variable

In real-world problems, we are often faced with one or more quantities that do not have fixed values.

The values of such quantities depend on random actions, and they change from one experiment to another; for example:

Number of babies born in a certain hospital

Number of traffic accidents occurring on a road per month

The amount of rainfall in Singapore per year

The starting price of a stock each day, etc.

In probability, quantities introduced in these diverse examples are called Random Variables.

Studying the properties of random variables allows us to better understand real-world phenomena, and to control or predict their behavior.


Example 2.1 If in the experiment of rolling two fair dice, X is the sum, then X is a variable. Since the value of X changes from one experiment to another, X is also a "random variable". The possible values of X and the corresponding probabilities are summarized as follows:

X    2     3     4     5     6     7     8     9     10    11    12
pi   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

where, for example,

p2 = P(X=2) = P{(1,1)} = 1/36,

p3 = P(X=3) = P{(1,2), (2,1)} = 2/36,

p7 = P(X=7) = P{(1,6), (6,1), (2,5), (5,2), (3,4), (4,3)} = 6/36
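The table can be checked by brute-force enumeration of the 36 equally likely outcomes. A minimal Python sketch (not part of the original slides):

```python
# Enumerate all 36 equally likely outcomes of rolling two fair dice and
# tabulate the pmf of the sum X, reproducing the table above.
from fractions import Fraction
from collections import Counter

counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

for x, p in pmf.items():
    print(x, p)           # 2 1/36, 3 1/18 (= 2/36), ..., 7 1/6 (= 6/36), ..., 12 1/36
print(sum(pmf.values()))  # 1
```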


In Example 2.1, numerical values of X depend on the outcomes of the experiment, e.g.,

• if the outcome e = (2,3), then X = 2+3 = 5,

• if e = (5,6), then X = 5+6 = 11, etc.

Thus, X is a real-valued function.

On the other hand, if, for example, X = 3, then it must be that e = (1,2) or (2,1). Therefore, the inverse image of the function X is an event!

Definition 2.1. Let S be the sample space of an experiment. A real-valued function X defined on S and taking values in R (the set of real numbers) is called a random variable (r.v.) if for each interval I ⊆ R, the inverse image {e: X(e) ∈ I} is an event.


Example 2.2 Suppose that we toss a coin having a probability p of coming up heads, until the first head appears. Let N denote the number of flips required. Then, assuming the outcomes of successive flips are independent, N is a random variable taking values 1, 2, 3, …, with respective probabilities

P{N = 1} = P(H) = p,

P{N = 2} = P(TH) = (1 – p)p,

P{N = 3} = P(TTH) = (1 – p)²p,

and, in general, P{N = n} = P(TT···TH) = (1 – p)^(n–1) p, n = 1, 2, 3, ….

All probabilities add up to 1:

Σ_{n=1}^{∞} P{N = n} = Σ_{n=1}^{∞} (1 – p)^(n–1) p = p · 1/[1 – (1 – p)] = 1.
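As a quick numerical sanity check of this geometric-series argument, the following Python sketch truncates the infinite sum for an illustrative (assumed) value p = 0.3:

```python
# P{N = n} = (1 - p)^(n - 1) * p should sum to 1 over n = 1, 2, 3, ...
# The value p = 0.3 is only illustrative; the tail beyond n = 10,000 is negligible.
p = 0.3
total = sum((1 - p) ** (n - 1) * p for n in range(1, 10_000))
print(total)  # ~1.0
```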


Distribution of a Random Variable

F(x) “cumulates” all of the probabilities of the values of X up to and including x.

Definition 2.2 (Cumulative Distribution Function) The cumulative distribution function (CDF) of a random variable X is defined by

F(x) = P{X ≤ x}, for any real x.

A function F(x) is a CDF if and only if it satisfies:

(a) F(x) → 0 as x → –∞, and F(x) → 1 as x → ∞;

(b) lim_{h→0, h>0} F(x + h) = F(x) (right continuous);

(c) a < b implies F(a) ≤ F(b) (nondecreasing).


Example 2.3. A four-sided die has a different number, 1, 2, 3, or 4, affixed to each side. On each roll, each of the four numbers is equally likely to occur. A game consists of rolling the die twice. If X represents the maximum of the two rolls, then X is a random variable. Find and sketch the CDF of X.

Solution: Possible values for X are 1, 2, 3, and 4, and the corresponding probabilities are:

P(X = 1) = P{(1,1)} = 1/16
P(X = 2) = P{(1,2), (2,1), (2,2)} = 3/16
P(X = 3) = P{(1,3), (2,3), (3,3), (3,2), (3,1)} = 5/16
P(X = 4) = P{(1,4), (2,4), (3,4), (4,4), (4,3), (4,2), (4,1)} = 7/16.

Thus, the CDF of X is,

x      1     2     3     4
F(x)   1/16  4/16  9/16  1

[Figure: step plot of F(x), jumping to 1/16, 4/16, 9/16, and 1 at x = 1, 2, 3, 4.]
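The pmf and CDF of the maximum can also be obtained by enumerating the 16 equally likely outcomes; a small Python sketch:

```python
# X = max of two rolls of a fair four-sided die (Example 2.3).
from fractions import Fraction
from collections import Counter

counts = Counter(max(r1, r2) for r1 in range(1, 5) for r2 in range(1, 5))
pmf = {x: Fraction(c, 16) for x, c in sorted(counts.items())}

cdf, running = {}, Fraction(0)
for x, p in pmf.items():
    running += p
    cdf[x] = running

for x in pmf:
    print(x, pmf[x], cdf[x])  # 1 1/16 1/16, 2 3/16 1/4, 3 5/16 9/16, 4 7/16 1
```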


Example 2.4. A function is specified as follows:

F(x) =
  0,             x < 0
  x/4,           0 ≤ x < 1
  1/2,           1 ≤ x < 2
  x/12 + 1/2,    2 ≤ x < 3
  1,             x ≥ 3.

(a) Verify that it is a CDF.

(b) Plot F(x) and calculate: P(X < 2), P(X = 2), and P(1 ≤ X < 3).

[Figure: plot of F(x), a mixed CDF with jumps, passing through the levels 1/4, 1/2, 2/3, 3/4, and 1 over 0 ≤ x ≤ 3.]

Solution: (a) F(x) tends to 0 as x → –∞ and to 1 as x → ∞, is right continuous, and is nondecreasing, so it is a CDF.

(b) P(X < 2) = F(2–) = 1/2.

P(X = 2) = F(2) – F(2–) = (2/12 + 1/2) – 1/2 = 1/6.

P(1 ≤ X < 3) = P(X < 3) – P(X < 1) = F(3–) – F(1–) = (3/12 + 1/2) – 1/4 = 1/2.
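The probabilities in (b) can be reproduced directly from the piecewise CDF. In the sketch below, F and F_minus are hypothetical helper names, and the left limit F(x–) is approximated by evaluating F just below x:

```python
# Example 2.4: probabilities computed from the piecewise (mixed) CDF.
def F(x):
    if x < 0:
        return 0.0
    if x < 1:
        return x / 4
    if x < 2:
        return 1 / 2
    if x < 3:
        return x / 12 + 1 / 2
    return 1.0

def F_minus(x, eps=1e-9):
    return F(x - eps)  # numerical stand-in for the left limit F(x-)

print(F_minus(2))               # P(X < 2)     = 0.5
print(F(2) - F_minus(2))        # P(X = 2)     = 2/3 - 1/2, about 0.1667
print(F_minus(3) - F_minus(1))  # P(1 <= X < 3), about 0.5
```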


Note. Random variables can be classified as discrete, continuous, or mixed. The function given in Example 2.4 is a mixed CDF which has “jumps” at certain points. We will concentrate on pure discrete and pure continuous types of random variables.


Definition 2.3 (Discrete R.V. and Probability Mass Function) If the set of all possible values of a random variable, X, is a countable set, x_1, x_2, . . . , x_n, or x_1, x_2, . . ., then X is called a discrete random variable. The function

p(x) = P(X = x), x = x_1, x_2, . . .

that assigns the probability to each possible value of X is called the probability mass function (pmf).


So, the pmf tells how the probabilities are distributed across the values of the r.v. X.


A function p(x) is a pmf if and only if

(i) p(x) ≥ 0 for all x, and (ii) Σ_{all x} p(x) = 1.

Example 2.5. Is each of the following functions a pmf? If it is, find the constant k.

(a) p(x) = k(1/2)^x, x = 0, 1, 2, . . .

(b) p(x) = k[(1/2)^x – 1/2], x = –2, –1, 0, 1, 2.

Solution: (a) p(x) > 0 for all x = 0, 1, 2, . . . (provided k > 0), and

Σ_{all x} p(x) = k(1 + 1/2 + (1/2)² + ···) = 2k,

so 2k = 1 gives k = 1/2.

(b) Since p(0) = k/2 and p(2) = –k/4, the two values of the function have opposite signs no matter what value k takes. Therefore, it is impossible for p(x) to be always nonnegative, and hence p(x) cannot be a pmf.
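A numerical check of both parts, truncating the infinite sum in (a) (the snippet is illustrative only):

```python
# Part (a): with k = 1/2, the probabilities k * (1/2)^x, x = 0, 1, 2, ... sum to 1.
k = 0.5
total = sum(k * 0.5 ** x for x in range(200))  # truncated geometric series
print(total)  # 1.0 up to rounding, so p(x) is a valid pmf when k = 1/2

# Part (b): the bracketed factor (1/2)^x - 1/2 changes sign over x = -2, ..., 2,
# so k * [(1/2)^x - 1/2] cannot be nonnegative for every x unless k = 0.
for x in (-2, -1, 0, 1, 2):
    print(x, 0.5 ** x - 0.5)  # positive at x = -2, -1, 0; zero at x = 1; negative at x = 2
```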


The r.v.'s introduced above take a countable number of values. However, many real-world phenomena can only be described by variables taking uncountably many values, e.g., the arrival time of a train, the lifetime of a transistor, etc.

Definition 2.4 (Continuous R.V. and Probability Density Function) A r.v. X is said to be a continuous random variable if there exists a nonnegative function f, defined for all real x ∈ (–∞, ∞), such that for any set B of real numbers

P{X ∈ B} = ∫_B f(x) dx.

The function f is called the probability density function (pdf) of the r.v. X.


So, the pdf f(x) tells how likely the values of the r.v. X are to occur around x.


Some important properties can be deduced immediately from Definition 2.4:

i) F(t) = ∫_{–∞}^{t} f(x) dx (let B = (–∞, t]);

ii) P(a ≤ X ≤ b) = ∫_a^b f(x) dx (let B = [a, b]);

iii) P(X = a) = ∫_a^a f(x) dx = 0 (let a = b in ii));

iv) The pdf can be obtained from the CDF by differentiation:

f(x) = F′(x) = (d/dx)F(x), for all x where F′(x) exists.

Note that it is possible that F′(x) does not exist at a finite number of points, where f(x) can be defined as either the 'left' or 'right' derivative.


Example 2.6 A CDF has the following expression:

F(x) =
  0,              x < 0
  x,              0 ≤ x < 0.5
  0.25 + 0.5x,    0.5 ≤ x < 1.5
  1,              x ≥ 1.5.

Sketch the graph of F(x) and find the pdf f(x).


Solution: The graph of F(x) is shown below.

[Figure: graph of F(x), piecewise linear, rising from 0 at x = 0 to 0.5 at x = 0.5 and to 1 at x = 1.5, then constant at 1.]

Clearly, F(x) is not differentiable at x = 0, 0.5 and 1.5. Taking the 'right' derivatives, i.e., f(0) = 1, f(0.5) = 0.5, and f(1.5) = 0, we then have

f(x) =
  0,     x < 0
  1,     0 ≤ x < 0.5
  0.5,   0.5 ≤ x < 1.5
  0,     x ≥ 1.5.
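The right-derivative argument can be mimicked numerically; in this sketch, F and f_right are hypothetical helper names and h is a small step size:

```python
# Example 2.6: recover the pdf from the CDF by a numerical right derivative.
def F(x):
    if x < 0:
        return 0.0
    if x < 0.5:
        return x
    if x < 1.5:
        return 0.25 + 0.5 * x
    return 1.0

def f_right(x, h=1e-6):
    return (F(x + h) - F(x)) / h  # right derivative of F at x

for x in (-1.0, 0.0, 0.25, 0.5, 1.0, 1.5, 2.0):
    print(x, round(f_right(x), 3))  # 0.0, 1.0, 1.0, 0.5, 0.5, 0.0, 0.0
```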


Example 2.7 A machine produces copper wire, and occasionally there is a flaw at some point along the wire. The length of the wire (in meters) produced between successive flaws is a continuous r.v. X with pdf of the form

f(x) = c(1 + x)^(–3), x > 0,

where c is a constant. Find c and the CDF.

A function f(x) is a pdf if and only if it satisfies the properties

(i) f(x) ≥ 0 for all x, and (ii) ∫_{–∞}^{∞} f(x) dx = 1.


Solution: 1 = ∫_{–∞}^{∞} f(x) dx = ∫_0^∞ c(1 + x)^(–3) dx = c/2, so c = 2.

Now, P(X ≤ x) = ∫_0^x 2(1 + t)^(–3) dt = 1 – (1 + x)^(–2), hence

F(x) =
  0,                  x ≤ 0
  1 – (1 + x)^(–2),   x > 0.
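Both the normalizing constant and the closed-form CDF can be checked by numerical integration, assuming SciPy is available:

```python
# Example 2.7: check c = 2 and F(x) = 1 - (1 + x)^(-2) numerically.
import numpy as np
from scipy.integrate import quad

c = 2.0
f = lambda x: c * (1 + x) ** (-3)

total, _ = quad(f, 0, np.inf)
print(total)  # 1.0, so c = 2 makes f a valid pdf

for x in (0.5, 1.0, 5.0):
    area, _ = quad(f, 0, x)
    print(area, 1 - (1 + x) ** (-2))  # the two values agree
```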


Expectation of a Random Variable

Definition 2.5. (Expectation) The expected value of a discrete r.v. X with pmf p(x) is defined by

μ = E(X) = Σ_{all x} x p(x),

i.e., a weighted average of all possible values of X. It is also called the mean of the population represented by X.

The expected value of a continuous r.v. X with pdf f(x) is defined by

μ = E(X) = ∫_{–∞}^{∞} x f(x) dx,

if the integral is absolutely convergent. Otherwise we say that E(X) does not exist.


The expected value of a discrete r.v. is just the weighted average of the possible values of X, weighted by their chances of occurring.


In Example 2.1, X is the sum of the two numbers shown on the two dice. The expected sum is:

E(X) = Σ_{all x} x p(x) = 2(1/36) + 3(2/36) + 4(3/36) + ··· + 12(1/36) = 7.

In Example 2.7, we have

E(X) = ∫_0^∞ x · 2(1 + x)^(–3) dx = [–x(1 + x)^(–2)]_0^∞ + ∫_0^∞ (1 + x)^(–2) dx = [–(1 + x)^(–1)]_0^∞ = 1.
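Both expectations can be confirmed numerically (SciPy assumed available for the integral):

```python
# Mean of the dice sum (Example 2.1) and of the wire length (Example 2.7).
import numpy as np
from scipy.integrate import quad

dice_pmf = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}
print(sum(x * n / 36 for x, n in dice_pmf.items()))  # 7.0

mean_wire, _ = quad(lambda x: x * 2 * (1 + x) ** (-3), 0, np.inf)
print(mean_wire)  # ~1.0
```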


Expectation is an extremely important concept in summarizing characteristics of distributions. It gives a measure of central tendency. It has the following simple but important property:

If a and b are constants, then E[aX + b] = a E[X] + b.


Function of a Random Variable

The following results are important:

If X is a r.v., then a function of it, u(X), is also a r.v., and

E[u(X)] = Σ_{all x} u(x) p(x), if X is a discrete r.v. with pmf p(x);

E[u(X)] = ∫_{–∞}^{∞} u(x) f(x) dx, if X is a continuous r.v. with pdf f(x).


Example 2.8 The r.v. X has pmf p(x) = 1/3, for x = –1, 0, 1. Find E(X) and E(X²).

Solution: E(X) = Σ_{all x} x p(x) = (–1)(1/3) + (0)(1/3) + (1)(1/3) = 0,

E(X²) = Σ_{all x} x² p(x) = (–1)²(1/3) + (0)²(1/3) + (1)²(1/3) = 2/3.

The above result says that the expectation of the r.v. u(X) can simply be found through the distribution of the original r.v. X!


Variance of a Random Variable

Another important quantity of interest is the variance of a r.v. X, denoted by Var(X), which is defined by

Var(X) = E[(X – μ)²],

where μ = E(X). A related quantity is the standard deviation,

σ(X) = {Var(X)}^(1/2).

Variance and standard deviation (sd) measure how much the values of X vary around the mean μ.

As (X – μ)² is a function of X, i.e., u(X) = (X – μ)², from the results given above, we have

Var(X) = Σ_{all x} (x – μ)² p(x), if X is discrete with pmf p(x),

Var(X) = ∫_{–∞}^{∞} (x – μ)² f(x) dx, if X is continuous with pdf f(x).


Example 2.9. The time elapsed, in minutes, between the placement of an order of pizza and its delivery is a random variable with pdf

f(x) =
  1/15,   25 ≤ x ≤ 40
  0,      otherwise.

(a) Determine the mean and standard deviation of the time it takes for the pizza shop to deliver pizza.

(b) Suppose that it takes 12 minutes to bake pizza. Determine the mean and standard deviation of the time it takes for the delivery person to deliver pizza.

Solution: Let X be the time between order and delivery.

(a) E(X) = ∫_25^40 x(1/15) dx = 32.5,

σ² = Var(X) = ∫_25^40 (x – 32.5)²(1/15) dx = 18.75, and σ = 4.33.

(b) The time it takes for the delivery person to deliver the pizza is Y = X – 12. Therefore,

E(Y) = E(X – 12) = ∫_25^40 (x – 12)(1/15) dx = 32.5 – 12 = 20.5,

Var(Y) = E[(X – 12 – 20.5)²] = ∫_25^40 (x – 12 – 20.5)²(1/15) dx = 18.75.


If X is a r.v. with mean μ, and a and b are arbitrary constants, then

E(aX + b) = aE(X) + b,
Var(X) = E(X²) – μ²,
Var(aX + b) = a²Var(X),
σ(aX + b) = |a|σ(X).

To show the second result,

Var(X) = E[(X – μ)²]
= E(X² – 2μX + μ²)
= E(X²) – 2μE(X) + μ²
= E(X²) – μ².

To show the third result,

Var(aX + b) = E[(aX + b – E(aX + b))²]
= E[(aX + b – aE(X) – b)²]
= E[(aX – aE(X))²]
= E[a²(X – E(X))²]
= a²E[(X – E(X))²]
= a²Var(X).

We note that in Example 2.9, Var(X) and Var(Y) are the same. Why?


Example 2.10. The monthly sales at a computer store have a mean of $25,000 and a standard deviation of $4,000. Profits are 30% of the sales less fixed costs of $6,000. Find the mean and standard deviation of the monthly profit.


Solution: Let X = Sales. E(X) = 25,000, and Var(X) = 4,000².

Let Y = Profit. Then, Y = 0.30X – 6,000.

E(Profit) = E(Y) = 0.30 E(X) – 6,000 = (0.30)(25,000) – 6,000 = 1,500.

Var(Profit) = Var(Y) = Var[(0.30)X – 6,000] = (0.30)²Var(X) = 1,440,000.

σ(Profit) = σ(Y) = 0.3σ(X) = 1,200.
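The same arithmetic, written out as a tiny script to emphasize that only the linear-transformation rules E(aX + b) = aE(X) + b and σ(aX + b) = |a|σ(X) are used:

```python
# Example 2.10: profit Y = 0.30 * X - 6000.
mean_sales, sd_sales = 25_000, 4_000
mean_profit = 0.30 * mean_sales - 6_000   # a * E(X) + b
sd_profit = 0.30 * sd_sales               # |a| * sd(X); the constant -6000 drops out
print(mean_profit, sd_profit)             # 1500.0 1200.0
```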


Moment Generating Function*

The mean, variance and standard deviation are important characteristics of a distribution. For some distributions, it is rather difficult to compute these quantities directly. A special function defined below can help. More importantly, the uniqueness property of this function often helps to find the distribution of some useful functions of r.v.'s.

Definition 2.12 (Moment-Generating Function) Let X be a random variable, discrete or continuous. If there exists a positive number h such that

M(t) = E(e^(tX))

exists and is finite for –h < t < h, then M(t) is called the Moment-Generating Function (MGF) of X.


Why is it called the Moment Generating Function? Because it generates moments.

What are the moments?

• The term 'moment' is from mechanics, representing the product of a distance and its weight.

• So, x_i p(x_i) is a moment, and Σ_i x_i p(x_i), or E(X), is the moment of the 'system'.

• In statistics, E(X), E(X²), E(X³), …, are the 1st, 2nd, 3rd, …, moments about the origin;

• E(X – μ), E[(X – μ)²], E[(X – μ)³], etc., are the 1st, 2nd, 3rd central moments, or moments about the mean.

• See p. 69 of the text for details.


Property of MGF:

The M.G.F., if it exists, completely determines the distribution function. In other words, if two random variables, assuming the same set of values, have the same M.G.F., they must have the same distribution function.

E(X^r) = M^(r)(0), the rth derivative of M(t) evaluated at 0.

We use discrete r.v.'s to demonstrate these properties: if X and Y have possible values {v_1, v_2, …}, and pmfs p(x) and q(x), then

M_X(t) = E(e^(tX)) = e^(tv_1) p(v_1) + e^(tv_2) p(v_2) + ···,
M_Y(t) = E(e^(tY)) = e^(tv_1) q(v_1) + e^(tv_2) q(v_2) + ···.

Thus, if M_X(t) = M_Y(t), it must be that

p(v_i) = q(v_i), i = 1, 2, …


To understand the property E(X^r) = M^(r)(0), we have

M′(t) = (d/dt)[e^(tx_1) p(x_1) + e^(tx_2) p(x_2) + e^(tx_3) p(x_3) + ···]
= x_1 e^(tx_1) p(x_1) + x_2 e^(tx_2) p(x_2) + x_3 e^(tx_3) p(x_3) + ···,

M″(t) = (d/dt)[x_1 e^(tx_1) p(x_1) + x_2 e^(tx_2) p(x_2) + x_3 e^(tx_3) p(x_3) + ···]
= x_1² e^(tx_1) p(x_1) + x_2² e^(tx_2) p(x_2) + x_3² e^(tx_3) p(x_3) + ···.

Thus,

M′(0) = x_1 p(x_1) + x_2 p(x_2) + x_3 p(x_3) + ··· = E(X),
M″(0) = x_1² p(x_1) + x_2² p(x_2) + x_3² p(x_3) + ··· = E(X²).

In general,

M^(r)(t) = Σ_i x_i^r e^(tx_i) p(x_i), so that M^(r)(0) = Σ_i x_i^r p(x_i) = E(X^r).


Example 2.11. If X has the MGF

M(t) = (3/6)e^t + (2/6)e^(2t) + (1/6)e^(3t),

then, as the coefficients of the e terms are probabilities, the probabilities must be 3/6, 2/6, and 1/6; the values beside t are the values of X, which are 1, 2, and 3. The pmf of X is

x 1 2 3

p(x) 3/6 2/6 1/6
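The moment-generating property E(X^r) = M^(r)(0) can be checked symbolically for this MGF, assuming SymPy is available:

```python
# Example 2.11: moments read off M(t) = (3/6)e^t + (2/6)e^(2t) + (1/6)e^(3t).
import sympy as sp

t = sp.symbols('t')
M = (sp.Rational(3, 6) * sp.exp(t)
     + sp.Rational(2, 6) * sp.exp(2 * t)
     + sp.Rational(1, 6) * sp.exp(3 * t))

EX = sp.diff(M, t).subs(t, 0)      # M'(0)  = E(X)
EX2 = sp.diff(M, t, 2).subs(t, 0)  # M''(0) = E(X^2)
print(EX, EX2)  # 5/3 10/3, matching 1*(3/6)+2*(2/6)+3*(1/6) and 1*(3/6)+4*(2/6)+9*(1/6)
```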

Example 2.12. Suppose the MGF of X is

M(t) = e^t / (2 – e^t), t < ln 2.

Find the distribution of X.


Solution: Until we expand M(t), we cannot detect the coefficients of the e^(x_i t) terms. Recall that

1/(1 – z) = 1 + z + z² + z³ + ···, –1 < z < 1,

so we have

M(t) = e^t / (2 – e^t) = (e^t/2) / (1 – e^t/2)
= (e^t/2)[1 + (e^t/2) + (e^t/2)² + (e^t/2)³ + ···]
= (1/2)e^t + (1/2)²e^(2t) + (1/2)³e^(3t) + ···.

That is, P(X = x) = (1/2)^x, for positive integer x; the pmf of X is thus

p(x) = (1/2)^x, x = 1, 2, 3, ….
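The series expansion can be reproduced symbolically (SymPy assumed available) by treating u = e^t as the expansion variable; the coefficient of u^x is P(X = x):

```python
# Example 2.12: expand M(t) = e^t / (2 - e^t) as a power series in u = e^t.
import sympy as sp

u = sp.symbols('u')  # stands in for e^t
print(sp.series(u / (2 - u), u, 0, 6))
# u/2 + u**2/4 + u**3/8 + u**4/16 + u**5/32 + O(u**6), i.e. P(X = x) = (1/2)^x
```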