
Studienzentrum Gerzensee
Doctoral Program in Economics
Econometrics 2014
Week 1 Exercises

1. Let X and Y have a joint density $f_{X,Y}(x,y) = x + y$ for $0 < x < 1$ and $0 < y < 1$, and $f_{X,Y}(x,y) = 0$ elsewhere.
   (a) What is the density of X?
   (b) Find $E(Y \mid X = 3/4)$.
   (c) Find $\mathrm{Var}(Y \mid X = x)$ for $0 < x < 1$.
   (d) Let $Z = 2 + 3X$. What is the probability density of Z?

2. Let $f_X(x) = 2x$ for $0 < x < 1$ and $f_X(x) = 0$ elsewhere.
   (a) Find $E(1/X)$.
   (b) Let $Y = 1/X$.
       (i) Find $f_Y(y)$.
       (ii) Find $E(Y)$.

3. Suppose that X is distributed with CDF $F(x)$ and density $f(x)$. Show that $E(X) = \int_0^1 F^{-1}(t)\,dt$.

4. Suppose X has probability density $f(x) = 1/4$ for $-1 < x < 3$ and $f(x) = 0$ elsewhere. Let $Y = X^2$.
   (a) What is the CDF of Y?
   (b) What is the probability density function of Y?

5. Suppose X has probability density function (pdf) $f(x) = 1/x^2$ for $x \ge 1$, and $f(x) = 0$ elsewhere.
   (a) Compute $P(X \ge 5)$.
   (b) Derive the CDF of X.
   (c) Show that the mean of X does not exist.
   (d) Let $Y = 1/X$. Derive the probability density function of Y.
   (e) Compute $E(Y)$.

6. Suppose that X is a discrete random variable with the following distribution:

       x    Prob(X = x)
       1       0.25
       2       0.50
       3       0.25

   The distribution of Y|X = x is Bernoulli with $P(Y = 1 \mid X = x) = (1+x)^{-1}$.
   (a) Compute the mean of X.
   (b) Find the variance of X.
   (c) Suppose X = 2.
       (c.i) What is your best guess of the value of Y?
       (c.ii) Explain what you mean by “best”.
   (d) What is the distribution of X conditional on Y = 1?


7. (a) X is a Bernoulli random variable with $P(X = 1) = p$. What is the moment generating function for X?
   (b) $Y = \sum_{i=1}^{n} X_i$, where the $X_i$ are i.i.d. Bernoulli with $P(X_i = 1) = p$. What is the moment generating function for Y?
   (c) Use your answer in (b) to compute $E(Y)$, $E(Y^2)$ and $E(Y^3)$ for $n > 3$.

8. Let $M(t)$ denote the mgf of X. Let $S(t) = \ln(M(t))$. Let $S'(0)$ denote the first derivative of $S(t)$ evaluated at $t = 0$ and $S''(0)$ denote the second derivative.
   (a) Show that $S'(0) = E(X)$.
   (b) Show that $S''(0) = \mathrm{var}(X)$.

9. Suppose that $f_{X,Y}(x,y) = 1/4$ for $1 < x < 3$ and $1 < y < 3$, and $f_{X,Y}(x,y) = 0$ elsewhere.
   (a) Derive the marginal densities of X and Y.
   (b) Find $E(Y \mid X = 2.3)$.
   (c) Find $P[(X < 2.5) \cap (Y < 1.5)]$.
   (d) Let $W = X + Y$. Derive the moment generating function of W.

10. Show that if X is $N(0,1)$, then $E(g'(X) - Xg(X)) = 0$ for any differentiable function g whose first derivative is integrable. (Hint: use integration by parts and the fact that, because $g(x)$ is integrable, $\lim_{x \to \infty} g(x)f(x) = \lim_{x \to -\infty} g(x)f(x) = 0$, where $f(x)$ is the standard normal density.) Use this result to show that $E(X^{i+1}) = i\,E(X^{i-1})$.

11. X and Y are random variables. The marginal distribution of X is U[0,1], that is, X is uniformly distributed on [0,1]. The conditional distribution of Y|X = x is U[x, 1+x] for $0 \le x \le 1$.
    (a) Derive the mean of X.
    (b) Derive the variance of X.
    (c) Derive the joint probability density function of X and Y.
    (d) Derive the mean of Y.


12. Suppose that a random variable X is distributed $N(6,2)$. Let $\varepsilon$ be another random variable that is distributed $N(0,3)$. Suppose that X and $\varepsilon$ are statistically independent. Let the random variable Y be related to X and $\varepsilon$ by $Y = 4 + 6X + \varepsilon$.
    (a) Prove that the joint distribution of $(X, Y)$ is normal. Include in your answer the value of the mean vector and covariance matrix for the distribution.
    (b) Suppose that you knew that $X = 10$. Find the optimal (minimum mean square error) forecast of Y.

13. Y and X are two random variables with $Y \mid X = x \sim N(0, x^2)$ and $X \sim U(1,2)$.
    (a) What is the joint density of Y and X?
    (b) Compute $E(Y^4)$. (Hint: if $Q \sim N(0, \sigma^2)$, then $E(Q^4) = 3\sigma^4$.)
    (c) What is the density of $W = Y/X$? (Hint: what is the density of W conditional on $X = x$?)

14. Let $X \sim N(0,1)$ and suppose that
        $Y = \begin{cases} X & \text{if } 0.5 < X < 1 \\ 0 & \text{otherwise.} \end{cases}$
    (a) Find the distribution of Y.
    (b) Find $E(Y)$.
    (c) Find $\mathrm{Var}(Y)$.

15. Suppose that a random variable X has density function f and distribution function F, where F is strictly increasing. Let $Z = F(X)$ (so that Z is a random variable constructed by evaluating F at the random point X). Show that Z is distributed $U(0,1)$.

16. Let $f(x) = (1/2)^x$, $x = 1, 2, 3, \ldots$. Find $E(X)$.

17. Let $f(x_1, x_2) = 2x_1$ for $0 < x_1 < 1$, $0 < x_2 < 1$, and zero elsewhere, be the density of the vector $(x_1, x_2)$.
    (a) Find the marginal densities of $X_1$ and $X_2$.
    (b) Are the two random variables independent? Explain.
    (c) Find the density of $X_1$ given $X_2$.
    (d) Find $E(X_1)$ and $E(X_2)$.
    (e) Find $P(X_1 < 0.5)$.
    (f) Find $P(X_2 < 0.5)$.
    (g) Find $P[(X_1 < 0.5) \cap (X_2 < 0.5)]$.
    (h) Find $P[(X_1 < 0.5) \cup (X_2 < 0.5)]$.

18. Let $f(x) = 1$ for $0 < x < 1$ and zero elsewhere be the density of X. Let $Y = X^{1/2}$.
    (a) Find the density of Y.
    (b) Find the mean of Y.
    (c) Find the variance of Y.


19. Suppose that X and Y are independent random variables, g and h are two functions, and $E(g(X))$ and $E(h(Y))$ exist. Show that $E(g(X)h(Y)) = E(g(X))\,E(h(Y))$.

20. The following table gives the joint probability distribution between employment status and college graduation among those either employed or looking for work (unemployed) in the working-age U.S. population, based on the 1990 U.S. Census.

    Joint Distribution of Employment Status and College Graduation in the U.S. Population Aged 25-64, 1990

                                Unemployed (Y=0)   Employed (Y=1)   Total
    Non College Grads (X=0)          0.045             0.709        0.754
    College Grads (X=1)              0.005             0.241        0.246
    Total                            0.050             0.950        1.00

    (a) Compute E(Y).
    (b) The unemployment rate is the fraction of the labor force that is unemployed. Show that the unemployment rate is given by 1 − E(Y).
    (c) Calculate E(Y|X = 1) and E(Y|X = 0).
    (d) Calculate the unemployment rate for (i) college grads and (ii) non-college grads.
    (e) A randomly selected member of this population reports being unemployed. What is the probability that this worker is a college graduate? A non-college graduate?
    (f) Are educational achievement and employment status independent? Explain.
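
    If you want to check the table-based calculations numerically, one convenient (purely optional) approach is to store the joint distribution as an array and read off marginals and conditionals; a minimal sketch in Python with numpy, with labels chosen only for illustration:

        import numpy as np

        # Joint distribution from the table: rows index X (0 = non-grad, 1 = grad),
        # columns index Y (0 = unemployed, 1 = employed).
        joint = np.array([[0.045, 0.709],
                          [0.005, 0.241]])

        p_y = joint.sum(axis=0)                          # marginal distribution of Y
        e_y = p_y[1]                                     # E(Y) = P(Y = 1) for a binary Y
        p_y_given_x1 = joint[1] / joint[1].sum()         # distribution of Y given X = 1
        p_x_given_y0 = joint[:, 0] / joint[:, 0].sum()   # distribution of X given Y = 0

        print(e_y, p_y_given_x1[1], p_x_given_y0)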

21. Suppose that X has density $f(x)$, $E(X) = \mu$, $m$ = the median of X, and $E(X^2) < \infty$. (The median solves $F(m) = 0.5$.)
    (a) Show that $a = \mu$ solves $\min_a E[(X - a)^2]$.
    (b) Show that $a = m$ solves $\min_a E|X - a|$.

22. Suppose that Y and X are two random variables, and assume the existence of any moments needed for the calculations below.
    (a) Find the values of $\alpha_0$ and $\alpha_1$ that minimize $MSE(\alpha_0, \alpha_1) = E[(Y - \alpha_0 - \alpha_1 X)^2]$.
    (b) Find the values of $\alpha_0$, $\alpha_1$ and $\alpha_2$ that minimize $MSE(\alpha_0, \alpha_1, \alpha_2) = E[(Y - \alpha_0 - \alpha_1 X - \alpha_2 X^2)^2]$.

23. The density of Y is $e^{y}$ for $-\infty < y < 0$ and zero elsewhere.
    (a) Compute $P(Y \ge -1)$.
    (b) Find the value of c that minimizes $E|Y - c|$.


24. Suppose $X = (X_1\; X_2\; \cdots\; X_n)'$.
    (a) Show that
        $E(X) = \mu_X = \begin{pmatrix} E(X_1) \\ E(X_2) \\ \vdots \\ E(X_n) \end{pmatrix}$
    is the mean vector.
    (b) Let $\sigma_{ij} = E\{(X_i - \mu_i)(X_j - \mu_j)\}$. Show that $\sigma_{ij} = \sigma_{ji}$.
    (c) Show that if $X_i$ and $X_j$ are independent, then $\sigma_{ij} = 0$.

25. Suppose $X_i$, $i = 1, \ldots, n$, are i.i.d. random variables and let $Y = \sum_{i=1}^{n} X_i$.
    (a) Show that $M_Y(t) = \prod_{i=1}^{n} M_{X_i}(t)$.
    (b) When the X's are Bernoulli, Y is binomial. Use your results for the MGF to compute the mean and variance of Y.

26. Suppose Y is a Poisson random variable with $E(Y) = 1$. Let
        $X = \begin{cases} Y & \text{if } Y \le 2 \\ 2 & \text{if } Y > 2. \end{cases}$
    (a) Derive the CDF of X.
    (b) Derive the mean and variance of X.

27. Suppose that $W \sim \chi^2_3$ and that the distribution of X given $W = w$ is U[0, w] (that is, uniform on 0 to w).
    (a) Find E(X).
    (b) Find Var(X).
    (c) Suppose W = 4. What is the value of the minimum mean square error forecast of X?

28. Complete the following calculations.
    (a) Suppose $X \sim N(5,100)$. Find $P(-2 < X < 7)$.
    (b) Suppose $X \sim N(50,4)$. Find $P(X \ge 48)$.
    (c) Suppose $X \sim \chi^2_4$. Find $P(X \ge 9.49)$.
    (d) Suppose $X \sim \chi^2_6$. Find $P(1.64 \le X \le 12.59)$.
    (e) Suppose $X \sim \chi^2_6$. Find $P(X \le 3.48)$.
    (f) Suppose $X \sim t_9$. Find $P(1.38 \le X \le 2.822)$.
    (g) Suppose $X \sim N(5,100)$. Find $P(-1 < X < 6)$.
    (h) Suppose $X \sim N(46,2)$. Find $P(X \ge 48)$.
    (i) Suppose $X \sim \chi^2_6$. Find $P(1.24 \le X \le 12.59)$.
    (j) Suppose $X \sim F_{3,5}$. Find $P(X > 7.76)$.
    (k) Suppose $X \sim t_{10}$. Find $P(X \le 2.228)$.
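
    These probabilities are normally read from statistical tables; as an optional check of your table lookups, a few of them can be computed with scipy.stats (a sketch, assuming the usual parameterizations, e.g. N(5,100) denoting mean 5 and variance 100, so scipy's scale argument is the standard deviation 10):

        from scipy import stats

        # (a) X ~ N(5, 100): scale is the standard deviation, sqrt(100) = 10.
        print(stats.norm(loc=5, scale=10).cdf(7) - stats.norm(loc=5, scale=10).cdf(-2))

        # (c) X ~ chi-square with 4 degrees of freedom; sf(x) = 1 - cdf(x).
        print(stats.chi2(df=4).sf(9.49))

        # (f) X ~ t with 9 degrees of freedom.
        print(stats.t(df=9).cdf(2.822) - stats.t(df=9).cdf(1.38))

        # (j) X ~ F with (3, 5) degrees of freedom.
        print(stats.f(dfn=3, dfd=5).sf(7.76))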


29. If $X \sim N(1,4)$, find $P(1 < X < 9)$.

30. Find the 90th percentile of the $N(10, 30)$ distribution.

31. Suppose $X \sim N(\mu, \sigma^2)$ with $P(X < 89) = 0.90$ and $P(X < 94) = 0.95$. Find $\mu$ and $\sigma^2$.

32. Suppose $X \sim N(\mu, \sigma^2)$. Show that $E|X - \mu| = \sigma\sqrt{2/\pi}$.

33. $X \sim N(1,4)$ and $Y = e^X$.
    (a) What is the density of Y?
    (b) Compute E(Y). (Hint: what is the MGF of X?)

34. $X \sim N(0,1)$ and $Y|X \sim N(X,1)$.
    (a) Find (i) E(Y), (ii) Var(Y), (iii) Cov(X,Y).
    (b) Prove that the joint distribution of X and Y is normal.

35. X, Y, U are independent random variables: $X \sim N(5,4)$, $Y \sim \chi^2_1$, and U is distributed Bernoulli with $P(U=1) = 0.3$. Let $W = UX + (1-U)Y$. Find the mean and variance of W.

36. Let Y be the number of successes in n independent trials with $p = 0.25$. Find the smallest value of n such that $P(Y \ge 1) \ge 0.7$.

37. Suppose that the $3 \times 1$ vector X is distributed $N(0, I_3)$, and let V be a $2 \times 3$ non-random matrix that satisfies $VV' = I$. Let $Y = VX$.
    (a) Show that $Y'Y \sim \chi^2_2$.
    (b) Find $P(Y'Y > 6)$.

38. Let $l$ denote a $p \times 1$ vector of 1's, let $I_p$ denote the identity matrix of order p, suppose the $p \times 1$ vector Y is distributed $Y \sim N(0, \sigma^2 I_p)$ with $\sigma^2 = 4$, and let $M = I_p - (1/p)ll'$.
    (a) Show that M is idempotent.
    (b) Compute $E(Y'MY)$.
    (c) Compute $\mathrm{var}(Y'MY)$.

39. Suppose that X and Y are two random variables with a joint normal distribution, and further suppose that $\mathrm{var}(X) = \mathrm{var}(Y)$. Let $U = X + Y$ and $V = X - Y$. Prove that U and V are independent.


40. Let $X \sim N_p(\mu, \Sigma)$. Also let $X' = (X_1'\; X_2')$, $\mu' = (\mu_1'\; \mu_2')$, and
        $\Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}$
    be the partitions of X, $\mu$ and $\Sigma$ such that $X_1$ and $\mu_1$ are k-dimensional and $\Sigma_{11}$ is a $k \times k$ matrix. Show that $\Sigma_{12} = 0$ implies that $X_1$ and $X_2$ are independent.

41. Suppose $X_i$ are i.i.d. $N(\mu, \sigma^2)$ for $i = 1, \ldots, n$. Let
        $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$  and  $s^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2$.
    Show each of the following. (Hint: let $l$ denote an $n \times 1$ vector of 1's. Note that $\bar{X} = (l'l)^{-1}l'X$, where $X = (X_1\; X_2\; \cdots\; X_n)'$, and $\sum_{i=1}^{n}(X_i - \bar{X})^2 = X'[I - l(l'l)^{-1}l']X$.)
    (a) $\dfrac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0,1)$.
    (b) $(n-1)s^2/\sigma^2 \sim \chi^2_{n-1}$.
    (c) $\bar{X}$ and $(n-1)s^2/\sigma^2$ are independent.
    (d) $\dfrac{(\bar{X} - \mu)/(\sigma/\sqrt{n})}{\sqrt{\big[(n-1)s^2/\sigma^2\big]/(n-1)}} = \dfrac{\bar{X} - \mu}{s/\sqrt{n}} \sim t_{n-1}$.
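
    The matrix identities stated in the hint are easy to verify numerically before using them; a small sketch in Python with numpy (the data vector is arbitrary, since the identities are algebraic):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 8
        X = rng.normal(size=n)                 # any data vector works for checking the identities
        l = np.ones(n)

        xbar = (l @ X) / (l @ l)               # (l'l)^{-1} l'X
        M = np.eye(n) - np.outer(l, l) / n     # I - l(l'l)^{-1}l'

        print(np.isclose(xbar, X.mean()))
        print(np.isclose(X @ M @ X, np.sum((X - X.mean())**2)))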

42. Suppose $Y_i \sim$ i.i.d. $N(10, 4)$, and let $s^2 = (n-1)^{-1}\sum_{i=1}^{n}(Y_i - \bar{Y})^2$. Compute $E(\bar{Y} \mid s^2 = 5)$.

43. X and Y are random variables. The marginal distribution of X is $\chi^2_1$. The conditional distribution of Y|X = x is $N(x, x)$.
    (a) Derive the mean and variance of Y.
    (b) Suppose that you are told that X < 1. Derive the minimum mean square error forecast of Y.

44. X and Y have a bivariate normal distribution with parameters $\mu_X = 5$, $\mu_Y = 10$, $\sigma^2_X = 9$, $\sigma^2_Y = 4$, and $\rho_{XY} = 0.9$, where $\rho_{XY}$ is the correlation between X and Y. Find the shortest interval of the real line, say C, such that $P(Y \in C \mid X = 7) = 0.90$.


45. Suppose that X and Y are two random vectors that satisfy
        $E\begin{bmatrix} X \\ Y \end{bmatrix} = \begin{bmatrix} \mu_X \\ \mu_Y \end{bmatrix}$  and  $\mathrm{var}\begin{bmatrix} X \\ Y \end{bmatrix} = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{XY}' & \Sigma_{YY} \end{bmatrix}$,
    where X is $k \times 1$ and Y is a scalar. Let $\hat{Y} = a + bX$.
    (a) Derive the values of the scalar a and the $1 \times k$ matrix b that minimize $E[(Y - \hat{Y})^2]$.
    (b) Find an expression for the minimized value of $E[(Y - \hat{Y})^2]$.

46. Suppose that $X_i$ are i.i.d. $N(0,1)$ for $i = 1, \ldots, n$, and let $s^2 = n^{-1}\sum_{i=1}^{n} X_i^2$.
    (a) Show that $s^2 \xrightarrow{as} 1$.
    (b) Show that $s^2 \xrightarrow{p} 1$.
    (c) Show that $s^2 \xrightarrow{ms} 1$.
    (d) Suppose $n = 15$; find $P(s^2 \le 0.73)$ using an exact calculation.
    (e) Now let $s^2 = (n-1)^{-1}\sum_{i=1}^{n} (X_i - \bar{X}_n)^2$, where $\bar{X}_n = n^{-1}\sum_{i=1}^{n} X_i$.
        (i) Show that $E(s^2) = 1$.
        (ii) Show that $s^2 \xrightarrow{p} 1$.

47. Suppose $X_t = \varepsilon_t + \varepsilon_{t-1}$, where the $\varepsilon_t$ are i.i.d. with mean 0 and variance 1.
    (a) Find an upper bound for $P(|X_t| \ge 3)$.
    (b) Show that $n^{-1}\sum_{t=1}^{n} X_t \xrightarrow{p} 0$.
    (c) Show that $n^{-1/2}\sum_{t=1}^{n} X_t \xrightarrow{d} N(0, V)$, and derive an expression for V.

48. Let X have a Poisson distribution with $E(X) = 80$.
    (a) Use Chebyshev's inequality to find a lower bound for $P(70 \le X \le 90)$.
    (b) Is the answer the same for $P(70 < X < 90)$?

49. Suppose $X_i$, $i = 1, \ldots, n$, are a sequence of i.i.d. random variables with $E(X_i) = 0$ and $\mathrm{var}(X_i) = \sigma^2 < \infty$. Let $Y_i = iX_i$ and $\bar{Y} = n^{-2}\sum_{i=1}^{n} Y_i$. Prove that $\bar{Y} \xrightarrow{p} 0$. (Hint: can you show mean square convergence?)

50. $X_i \sim U[0,1]$ for $i = 1, \ldots, n$. Let $Y = \sum_{i=1}^{n} X_i$. Suppose that $n = 100$. Use an appropriate approximation to estimate the value of $P(Y \le 52)$.
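
    Whatever approximation you use for Problem 50 can be checked against brute-force simulation; a minimal sketch in Python with numpy (the number of replications, 200,000, is an arbitrary illustrative choice):

        import numpy as np

        rng = np.random.default_rng(1)
        reps, n = 200_000, 100

        # Simulate Y = sum of 100 independent U[0,1] draws, many times,
        # and estimate P(Y <= 52) by the empirical frequency.
        Y = rng.uniform(size=(reps, n)).sum(axis=1)
        print(np.mean(Y <= 52))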


51. Suppose $X_i \sim$ i.i.d. $N(0, \sigma^2)$ for $i = 1, \ldots, n$, and let
        $Y = \dfrac{X_1}{\sqrt{\sum_{i=2}^{n} X_i^2 / (n-1)}}$.
    (a) Prove that $Y \sim t_{n-1}$.
    (b) Prove that $Y \xrightarrow{d} Z$ where $Z \sim N(0,1)$ (as $n \to \infty$).

52. Let $Y \sim \chi^2_n$, let $X_n = n^{-1/2}Y - n^{1/2}$, and let $W_n = n^{-1}Y$.
    (a) Prove that $X_n \xrightarrow{d} X \sim N(0,2)$.
    (b) Prove that $W_n \xrightarrow{p} 1$.
    (c) Suppose that $n = 30$. Calculate $P(Y > 43.8)$.
    (d) Use the result in (a) to approximate the answer in (c).

53. Let $\bar{X}_1$ and $\bar{X}_2$ denote the sample means from two independent randomly drawn samples of Bernoulli random variables. Both samples are of size n, and the Bernoulli population has $P(X = 1) = 0.5$. Use an appropriate approximation to determine how large n must be so that $P(|\bar{X}_1 - \bar{X}_2| < 0.01) = 0.95$.

54. Suppose that $X_i \sim$ i.i.d. $N(0, \Sigma)$ for $i = 1, \ldots, n$, where $X_i$ is a $p \times 1$ vector. Let $\alpha$ denote a non-zero $p \times 1$ vector and let
        $Y_n = \dfrac{n\,(\alpha'\bar{X}_n)^2}{(n-1)^{-1}\sum_{i=1}^{n}(\alpha'X_i - \alpha'\bar{X}_n)^2}$.
    (a) Prove that $Y_n \sim F_{1,n-1}$. (Hint: recognize that $\alpha'\bar{X}_n$ is a scalar.)
    (b) Sketch a proof that $Y_n \xrightarrow{d} Y$ where $Y \sim \chi^2_1$ (as $n \to \infty$).


55. Let $Y \sim N_p(0, \lambda I_p)$, where $\lambda$ is a positive scalar. Let Q denote a $p \times k$ non-stochastic matrix with full column rank, let $P_Q = Q(Q'Q)^{-1}Q'$ and $M_Q = I - P_Q$. Finally, let $X_1 = Y'P_Q Y$ and $X_2 = Y'M_Q Y$. (A few useful matrix facts: (i) if A and B are two matrices with dimensions such that AB and BA are defined, then $\mathrm{tr}(AB) = \mathrm{tr}(BA)$; (ii) if Q is an idempotent matrix then $\mathrm{rank}(Q) = \mathrm{tr}(Q)$.)
    (a) Prove that $E(X_2) = (p - k)\lambda$.
    (b) Let $W = (X_1/X_2)\big((p-k)/k\big)$. Show that $W \sim F_{k,\,p-k}$.
    (c) Suppose that $p = 30$ and $k = 15$. Find $P(W > 2.4)$.
    (d) Suppose that $k = [p/2]$, where $[z]$ denotes the greatest integer less than or equal to z. Show that as $p \to \infty$, $\sqrt{p}(W - 1) \xrightarrow{d} R \sim N(0, V)$, and derive an expression for V.
    (e) Use your answer in (d) to approximate the answer to (c).
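
    The projection-matrix facts used in this problem can likewise be checked numerically; a small sketch in Python with numpy, using an arbitrary full-column-rank Q:

        import numpy as np

        rng = np.random.default_rng(2)
        p, k = 6, 2
        Q = rng.normal(size=(p, k))               # arbitrary full-column-rank matrix

        P = Q @ np.linalg.inv(Q.T @ Q) @ Q.T      # P_Q = Q(Q'Q)^{-1}Q'
        M = np.eye(p) - P                         # M_Q = I - P_Q

        print(np.allclose(M @ M, M))              # M_Q is idempotent
        print(np.isclose(np.trace(P), k))         # tr(P_Q) = rank(P_Q) = k
        print(np.isclose(np.trace(M), p - k))     # tr(M_Q) = p - k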

56. Suppose that $X_i$ is i.i.d. $N(\mu, 1)$ for $i = 1, \ldots, n$. The risk function for an estimator $\hat{\mu}$ is $R(\hat{\mu}, \mu) = E[(\hat{\mu} - \mu)^2]$.
    (a) Let $\hat{\mu}_1 = \bar{X}$. Derive $R(\hat{\mu}_1, \mu)$.
    (b) Let $\hat{\mu}_2 = \frac{1}{2}\bar{X}$. Derive $R(\hat{\mu}_2, \mu)$.
    (c) Which estimator is better?
    (d) (More challenging) Let $\hat{\mu}_3 = a\hat{\mu}_1 + (1-a)\hat{\mu}_2$, where $a = 1$ if $\hat{\mu}_1^2 > 1/n$ and equals 0 otherwise. Derive $R(\hat{\mu}_3, \mu)$. Is this estimator better than $\hat{\mu}_1$? Is this estimator better than $\hat{\mu}_2$?

57. Assume that $Y_i \sim$ i.i.d. $N(\mu, 1)$ for $i = 1, \ldots, n$. Let $\hat{\mu} = 1(|\bar{Y}| > c)\,\bar{Y}$, where c is a positive constant. (The notation $1(|\bar{Y}| > c)$ means that $1(|\bar{Y}| > c) = 1$ if $|\bar{Y}| > c$, and is equal to 0 otherwise.)
    (a) Derive $E(\hat{\mu})$.
    (b) Discuss the consistency of $\hat{\mu}$ when (i) $|\mu| > c$, (ii) $|\mu| < c$, and (iii) $|\mu| = c$.

58. Suppose X is a Bernoulli random variable with $P(X = 1) = p$. Suppose that you have a sample of i.i.d. random variables from this distribution, $X_1, X_2, \ldots, X_n$. Let $p_o$ denote the true, but unknown, value of p.
    (a) Write the log-likelihood function for p.
    (b) Compute the score function for p and show that the score has expected value 0 when evaluated at $p_o$.
    (c) Compute the information $I$.
    (d) Show that the maximum likelihood estimator is given by $\hat{p}_{MLE} = \bar{X}$.
    (e) Show that $\hat{p}_{MLE}$ is unbiased.
    (f) Show that $\sqrt{n}(\hat{p}_{MLE} - p_o) \xrightarrow{d} N(0, V)$, and derive an expression for V.
    (g) Propose a consistent estimator for V.


59. Let $Y_i$, $i = 1, \ldots, n$, denote an i.i.d. sample of Bernoulli random variables with $\Pr(Y_i = 1) = p$. I am interested in a parameter $\theta$ that is given by $\theta = e^p$.
    (a) Construct the likelihood function for $\theta$.
    (b) Show that $\hat{\theta}_{MLE} = e^{\bar{Y}}$.
    (c) Show that $\hat{\theta}_{MLE}$ is consistent.
    (d) Show that $\sqrt{n}(\hat{\theta}_{MLE} - \theta) \xrightarrow{d} N(0, V)$, and derive an expression for V.

60. Suppose $Y_i|\mu \sim$ i.i.d. $N(\mu, 1)$, $i = 1, \ldots, n$. Suppose that you know $\mu \le 5$. Let $\hat{\mu}_{MLE}$ denote the value of $\mu$ that maximizes the likelihood subject to the constraint $\mu \le 5$, and let $\bar{Y}$ denote the sample mean of the $Y_i$.
    (a) Show that $\hat{\mu}_{MLE} = a(\bar{Y})\,5 + (1 - a(\bar{Y}))\bar{Y}$, where $a(\bar{Y}) = 1$ for $\bar{Y} \ge 5$, and $a(\bar{Y}) = 0$ otherwise.
    (b) Suppose that $\mu = 3$. (i) Show that $a(\bar{Y}) \xrightarrow{p} 0$. (ii) Show that $\hat{\mu}_{MLE} \xrightarrow{p} \mu$. (iii) Show that $\sqrt{n}(\hat{\mu}_{MLE} - \mu) \xrightarrow{d} N(0,1)$.
    (c) Suppose that $\mu = 5$. Show that $\hat{\mu}_{MLE} \xrightarrow{p} \mu$.

µ . 61. Yi ~ i.i.d.N(µ, 1) for i = 1, 3, 5, … n–1, Yi ~ i.i.d.N(µ, 4) for i = 2, 4, …, n where n is even, and Yi and Yj are independent for i odd and j even.

(a) Write the log-likelihood function. (b) Derive the MLE of µ. (c) Is the MLE preferred to Y as an estimator of µ ? Explain. (Discuss the

properties of the estimators. Are the estimators unbiased? Consistent? Asymptotically Normal? Which estimator has lower quadratic risk?)

62. Yi ~ i.i.d.N(µY, 1) and Xi ~ i.i.d.N(µX, 1) for i = 1, …, n. Suppose Xi and Yj are independent for all i and j. A researcher is interested in θ = µYµX. Let θ denote the MLE if θ.

(a) Show that ˆ XYθ = .

(b) Suppose that µY = 0 and µX = 5. Show that ˆ ~ (0, )d

n W N Vθ → and derive an expression for V.

(c) Suppose that µY = 1 and µX = 5. Show that ( )ˆ 5 ~ (0, )d

n W N Vθ − → and

derive an expression for V.


63. Suppose that $y_i$ is i.i.d. $(0, \sigma^2)$ for $i = 1, \ldots, n$ with $E(y_i^4) = \kappa$. Let $\hat{\sigma}^2 = n^{-1}\sum_{i=1}^{n} y_i^2$.
    (a) Show that $\sqrt{n}(\hat{\sigma}^2 - \sigma^2) \xrightarrow{d} N(0, V_{\hat{\sigma}^2})$ and derive an expression for $V_{\hat{\sigma}^2}$.
    (b) Suppose that $n = 100$, $\hat{\sigma}^2 = 2.1$ and $n^{-1}\sum_{i=1}^{n} y_i^4 = 16$. Construct a 95% confidence interval for $\sigma$, where $\sigma$ is the standard deviation of y.

64. Suppose that $X_i \sim$ i.i.d. $N(0, \sigma^2)$, $i = 1, \ldots, n$.
    (a) Construct the MLE of $\sigma^2$.
    (b) Prove that the estimator is unbiased.
    (c) Compute the variance of the estimator.
    (d) Prove that $\sqrt{n}(\hat{\sigma}^2_{MLE} - \sigma^2) \xrightarrow{d} N(0, V)$. Derive an expression for V.
    (e) Suppose $\hat{\sigma}^2_{MLE} = 4$ and $n = 100$. Test $H_o: \sigma^2 = 3$ versus $H_a: \sigma^2 \ne 3$ using a Wald test with a size of 10%.
    (f) Suppose $\hat{\sigma}^2_{MLE} = 4$ and $n = 100$. Construct a 95% confidence interval for $\sigma^2$.
    (g) Suppose $\hat{\sigma}^2_{MLE} = 4$ and $n = 100$. Test $H_o: \sigma^2 = 3$ versus $H_a: \sigma^2 = 4$ using a test of your choosing. Set the size of the test at 5%. Justify your choice of test.
    (h) Suppose that the random variables were i.i.d. but non-normal. Would the confidence interval that you constructed in (f) have the correct coverage probability (i.e., would it contain the true value of $\sigma^2$ 95% of the time)? How could you modify the procedure to make it robust to the assumption of normality (i.e., so that it provides the correct coverage probability even if the data were drawn from a non-normal distribution)?
    (i) Construct the MLE of $\sigma$.
    (j) Is the MLE of $\sigma$ unbiased? Explain.
    (k) Prove that $\sqrt{n}(\hat{\sigma}_{MLE} - \sigma) \xrightarrow{d} N(0, W)$. Derive an expression for W.
    (l) Suppose $\hat{\sigma}_{MLE} = 3$ and $n = 200$. Construct a 90% confidence interval for $\sigma$.

65. $Y_i$, $i = 1, \ldots, n$, are i.i.d. Bernoulli random variables with $P(Y_i = 1) = p$.
    (a) Show that $\bar{Y}(1 - \bar{Y}) \xrightarrow{p} \mathrm{var}(Y)$, where $\mathrm{var}(Y)$ denotes the variance of Y.
    (b) I am interested in $H_o: p = 0.5$ vs. $H_a: p > 0.5$. The sample size is $n = 100$, and I decide to reject the null hypothesis if $\sum_{i=1}^{n} Y_i > 55$. Use the central limit theorem to derive the approximate size of the test.


66. $X_i$, $i = 1, \ldots, n$, are distributed i.i.d. Bernoulli random variables with $P(X_i = 1) = p$. Let $\bar{X}$ denote the sample mean.
    (a) Show that $\bar{X}$ is an unbiased estimator of the mean of X.
    (b) Let $\hat{\sigma}^2 = \bar{X}(1 - \bar{X})$ denote an estimator of $\sigma^2$, the variance of X.
        (b.i) Show that $\hat{\sigma}^2$ is a biased estimator of $\sigma^2$.
        (b.ii) Show that $\hat{\sigma}^2$ is a consistent estimator of $\sigma^2$.
        (b.iii) Show that $\sqrt{n}(\hat{\sigma}^2 - \sigma^2) \xrightarrow{d} N(0, V)$ and derive an expression for V.
    (c) Suppose $n = 100$ and $\bar{X} = 0.4$.
        (c.i) Test $H_0: p = 0.35$ versus $H_a: p > 0.35$ using a test with size of 5%.
        (c.ii) Construct a 95% confidence interval for $\sigma^2$.

67. Suppose $X_i$, $i = 1, \ldots, n$, are a sequence of independent Bernoulli random variables with $P(X = 1) = p$. Based on a sample of size n, a researcher decides to test the null hypothesis $H_o: p = 0.45$ using the following rule: reject $H_o$ if $\sum_{i=1}^{n} X_i \ge n/2$; otherwise, do not reject $H_o$.
    (a) When $n = 2$, compute the exact size of this test.
    (b) Write down an expression for the exact size when $n = 50$. Use the Central Limit Theorem to get an approximate value of the size.
    (c) Prove that $\lim_{n \to \infty} P\big(\sum_{i=1}^{n} X_i \ge n/2\big) = 0$ when $p = 0.45$.

68. $X_i \sim$ i.i.d. $N(0, \sigma^2)$ for $i = 1, \ldots, 10$. A researcher is interested in the hypothesis $H_o: \sigma^2 = 1$ versus $H_a: \sigma^2 > 1$. He decides to reject the null if $\frac{1}{10}\sum_{i=1}^{10} X_i^2 > 1.83$.
    (a) What is the size of his test?
    (b) Suppose that the alternative is true and that $\sigma^2 = 1.14$. What is the power of his test?
    (c) Is the researcher using the best test? Explain.

69. $X_i$ are i.i.d. $N(0, \sigma^2)$. Let $s^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2$. I am interested in the competing hypotheses $H_o: \sigma^2 = 1$ versus $H_a: \sigma^2 > 1$. Suppose $n = 15$ and I decide to reject the null if $s^2 > 5/3$.
    (a) What is the size of the test?
    (b) Suppose that the true value of $\sigma^2$ is 1.5. What is the power of the test?
    (c) Is this the most powerful test? Explain.


70. $X \sim N(0, \sigma^2)$. You are interested in the competing hypotheses $H_o: \sigma^2 = 1$ vs. $H_a: \sigma^2 = 2$. Based on a sample of size 1, you reject the null when $X^2 > 3.84$.
    (a) Compute the size of the test.
    (b) Compute the power of the test.
    (c) Prove that this is the most powerful test.
    (d) Let $\hat{\sigma}^2 = X^2$. (i) Show that $\hat{\sigma}^2$ is unbiased. (ii) Show that $\hat{\sigma}^2$ is the minimum mean squared error estimator of $\sigma^2$.

71. A coin is to be tossed 4 times and the outcomes are used to test the null hypothesis $H_o: p = \frac{1}{2}$ vs. $H_a: p > \frac{1}{2}$, where p denotes the probability that a “tails” appears. (“Tails” is one surface of the coin, and “heads” is the other surface.) I decide to reject the null hypothesis if “tails” appears in all four of the tosses.
    (a) What is the size of this test?
    (b) Sketch the power function of this test for values of $p > 1/2$.
    (c) Is this the best test to use? Explain.

72. $Y_i$, $i = 1, \ldots, 100$, are i.i.d. Poisson random variables with a mean of m. I am interested in the competing hypotheses $H_o: m = 4$ versus $H_a: m > 4$. I decide to reject the null if $\bar{Y} > 4.2$. Use an appropriate large-sample approximation to compute the size of my test.

73. A researcher has 10 i.i.d. draws from a $N(0, \sigma^2)$ distribution, $(X_1, X_2, \ldots, X_{10})$. He is interested in the competing hypotheses $H_o: \sigma^2 = 1$ vs. $H_A: \sigma^2 = 4$. He plans to test these hypotheses using the test statistic $\hat{\sigma}^2 = \frac{1}{10}\sum_{i=1}^{10} X_i^2$. The procedure is to reject $H_o$ in favor of $H_A$ if $\hat{\sigma}^2 > 1.83$; otherwise $H_o$ will not be rejected.
    (a) Determine the size of the test.
    (b) Determine the power of the test.
    (c) Is the researcher using the most powerful test given the size from (a)? If so, prove it. If not, provide a more powerful test.

74. Suppose that $X_i$ is i.i.d. with mean $\mu$, variance $\sigma^2$ and finite fourth moment. A random sample of 100 observations is collected, and you are given the following summary statistics: $\bar{X} = 1.23$ and $s^2 = 4.33$ (where $s^2 = (N-1)^{-1}\sum_{i=1}^{100}(X_i - \bar{X})^2$ is the sample variance).
    (a) Construct an approximate 95% confidence interval for $\mu$. Why is this an “approximate” confidence interval?
    (b) Let $\alpha = \mu^2$. Construct an approximate 95% confidence interval for $\alpha$.


75. A researcher is interested in computing the likelihood ratio statistic
        $LR = \dfrac{f(y, \theta_1)}{f(y, \theta_2)}$,
    but finds the calculation difficult because the density function $f(\cdot)$ is very complicated. However, it is easy to compute the joint density $g(y, x, \theta)$, where x denotes another random variable. Unfortunately, the researcher has data on y, but not on x. Show that the researcher can compute LR using the formula
        $LR = \dfrac{f(y, \theta_1)}{f(y, \theta_2)} = E_{\theta_2}\!\left[\dfrac{g(y, x, \theta_1)}{g(y, x, \theta_2)} \,\Big|\, y\right]$,
    where the notation $E_{\theta_2}[\,\cdot \mid y\,]$ means that the expectation should be taken using the distribution of x conditional on y and using the parameter value $\theta_2$.

76. Show that the LR test for the normal mean is consistent. (Write the alternative as $H_a: \mu \ne \mu_o$, and show that the power of the test approaches 1 as $n \to \infty$ for any fixed value of $\mu \ne \mu_o$.)

77. Suppose $X_i \sim$ i.i.d. Bernoulli with parameter p, for $i = 1, \ldots, n$. I am interested in testing $H_0: p = 0.5$ versus $H_a: p > 0.5$. Let $\bar{X}$ denote the sample mean of the $X_i$.
    (a) A researcher decides to reject the null if $\bar{X} > 0.5(1 + n^{-1/2})$. Derive the size of the test as $n \to \infty$.
    (b) Suppose $p = 0.51$. What is the power of this test as $n \to \infty$?
    (c) The researcher asks you if she should use another test instead of the test in (a). What advice do you have to offer her?

78. Consider two random variables X and Y. The distribution of X is given by the uniform distribution $X \sim U[0.5, 1.0]$, and the distribution of Y conditional on X is given by the normal $Y|X \sim N(0, \lambda X)$.
    (a) Provide an expression for the joint density of X and Y, say $f_{X,Y}(x,y)$.
    (b) Prove that $Z = Y/\sqrt{X}$ is distributed as $Z \sim N(0, \lambda)$.
    (c) You have a sample $\{X_i, Y_i\}_{i=1}^{N}$. Derive the log-likelihood function $L(\lambda \mid \{X_i, Y_i\}_{i=1}^{N})$.
    (d) Construct the MLE of $\lambda$.
    (e) Suppose that $N = 10$ and $\sum_{i=1}^{10} \big(Y_i/\sqrt{X_i}\big)^2 = 4.0$. Construct a 95% confidence interval for $\lambda$.


79. Suppose $Y_i \sim$ i.i.d. with density $f(y, \theta)$, for $i = 1, \ldots, n$, where $\theta$ is the $2 \times 1$ vector $\theta = (\theta_1\; \theta_2)'$. Let $\theta_0 = (\theta_{1,0}\; \theta_{2,0})'$ denote the true value of $\theta$. Let $s_i(\theta)$ denote the score for the i'th observation, let $\mathrm{var}[s_i(\theta_0)] = I(\theta_0)$ denote the $2 \times 2$ information matrix, and let $S_n(\theta) = \sum_{i=1}^{n} s_i(\theta)$ denote the score for the observations $\{Y_i\}_{i=1}^{n}$. Suppose that the problem is “regular” in the sense that $n^{-1/2}S_n(\theta_0) \xrightarrow{d} N(0, I(\theta_0))$, $\sqrt{n}(\hat{\theta} - \theta_0) \xrightarrow{d} N(0, I(\theta_0)^{-1})$, and so forth, where $\hat{\theta}$ is the MLE of $\theta$. Now, suppose the likelihood is maximized not over $\theta$, but only over $\theta_2$, with $\theta_1$ taking on its true value $\theta_1 = \theta_{1,0}$. That is, let $\tilde{\theta}_2$ solve $\max_{\theta_2} L_n(\theta_{1,0}, \theta_2)$, where $L_n(\theta_1, \theta_2)$ is the log-likelihood.
    (a) Show that $\sqrt{n}(\tilde{\theta}_2 - \theta_{2,0}) \xrightarrow{d} N(0, V)$ and derive an expression for V.
    (b) Show that $n^{-1/2}\,\dfrac{\partial L_n(\theta_{1,0}, \tilde{\theta}_2)}{\partial \theta_1} \xrightarrow{d} N(0, \Omega)$ and derive an expression for $\Omega$. (Note: $\partial L_n(\theta_{1,0}, \tilde{\theta}_2)/\partial \theta_1$ is the first element of $S_n(\theta_{1,0}, \tilde{\theta}_2)$.)

80. Suppose that two random variables X and Y have joint density $f(x,y)$ and marginal densities $g(x)$ and $h(y)$; suppose that the distribution of $Y|X$ is $N(X, \sigma^2)$. Denote this conditional density as $f_{Y|X}(y|x)$.
    (a) Show that $E(X \mid Y = y) = \dfrac{\int x f_{Y|X}(y|x)\, g(x)\, dx}{h(y)}$.
    (b) Show that $\dfrac{\partial f_{Y|X}(y|x)}{\partial y} = \dfrac{1}{\sigma^2}(x - y) f_{Y|X}(y|x)$, and solve this for $x f_{Y|X}(y|x)$.
    (c) Show that $\dfrac{\partial h(y)}{\partial y} = \int \dfrac{\partial f_{Y|X}(y|x)}{\partial y}\, g(x)\, dx$.
    (d) Using your results from (a) and (c), show that $E(X \mid Y = y) = y + \sigma^2 S(y)$, where
        $S(y) = \dfrac{\partial h(y)/\partial y}{h(y)} = \dfrac{\partial \ln(h(y))}{\partial y}.$
    Often this formula provides a simple method for computing $E(X \mid Y = y)$ when $f_{Y|X}(y|x)$ is $N(x, \sigma^2)$. (In the literature, this is called a “simple Bayes formula” for the conditional expectation.)
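
    As a sanity check on the formula in (d), consider a case where the answer is known in closed form: if $X \sim N(0,1)$ and $Y|X \sim N(X, \sigma^2)$, then the marginal of Y is $N(0, 1+\sigma^2)$ and $E(X \mid Y = y) = y/(1+\sigma^2)$. A minimal sketch in Python with scipy, under those illustrative assumptions:

        import numpy as np
        from scipy.stats import norm

        sigma2 = 0.5
        y = 1.3

        # Score S(y) of the marginal density h(y) = N(0, 1 + sigma2),
        # computed here by numerical differentiation of log h.
        h = lambda t: norm.pdf(t, loc=0.0, scale=np.sqrt(1.0 + sigma2))
        eps = 1e-6
        S = (np.log(h(y + eps)) - np.log(h(y - eps))) / (2 * eps)

        print(y + sigma2 * S)        # simple Bayes formula
        print(y / (1.0 + sigma2))    # known conditional mean; the two should agree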


81. Suppose that a test statistic for testing $H_o: \theta = 0$ vs. $H_a: \theta \ne 0$ constructed from a sample of size n, say $T_n$, is known to have a certain asymptotic null distribution
        $T_n \xrightarrow{d,\,H_o} X \sim F,$
    where X is a random variable with distribution function $F(x)$. Suppose that the distribution $F(x)$ depends on an unknown nuisance parameter $\rho$. Represent this dependence by $F(x, \rho)$. Let $x(\alpha, \rho)$ solve $F(x, \rho) = 1 - \alpha$, and suppose that $x(\alpha, \rho)$ is continuous in $\rho$. Let $\rho_o$ denote the true value of $\rho$ and let $\hat{\rho}$ denote a consistent estimator of $\rho$. Prove that a critical region defined by
        $T_n > x(\alpha, \hat{\rho})$
    produces a test with asymptotic size equal to $\alpha$. (Such a test is said to be asymptotically similar.)

82. $X_i \sim$ i.i.d. $U[0, \theta]$ for $i = 1, \ldots, n$, and let $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$.
    (a) Show that $\bar{X} \xrightarrow{p} \theta/2$.
    (b) Show that $\sqrt{n}(2\bar{X} - \theta) \xrightarrow{d} N(0, V)$ and derive an expression for V.
    (c) I am interested in the competing hypotheses $H_o: \theta = 1$ versus $H_a: \theta = 0.8$, and I decide to reject $H_o$ if $\bar{X} \le 0.45$.
        (c1) Using an appropriate approximation, compute the size of the test.
        (c2) Using an appropriate approximation, compute the power of the test.
        (c3) Is this the best test to use?

83. Suppose that $Y_t$ is generated by a “Moving Average-1” (MA(1)) process
        $Y_t = \mu + \varepsilon_t + \theta\varepsilon_{t-1},$
    where $\mu$ is a constant and $\varepsilon_t \sim$ i.i.d. $(0, \sigma^2)$ for all t. Let $\bar{Y}_n = n^{-1}\sum_{t=1}^{n} Y_t$. Prove that
        $\sqrt{n}(\bar{Y}_n - \mu) \xrightarrow{d} N\big(0,\ \sigma^2(1 + \theta)^2\big).$


84. A sample of size 30 from an i.i.d. $N(\mu, \sigma^2)$ population yields $\sum_{i=1}^{30} x_i = 50$ and $\sum_{i=1}^{30} x_i^2 = 120$.
    (a) Calculate the MLEs of $\mu$ and $\sigma^2$.
    (b) Suppose that you knew that $\mu = 2$. Calculate the MLE of $\sigma^2$.
    (c) Construct the LR, LM and Wald tests of $H_o: \mu = 2$ vs. $H_a: \mu \ne 2$ using a 5% critical value.
    (d) Construct the LR, LM and Wald tests of $H_o: \mu = 2,\ \sigma^2 = 1$ vs. $H_a: \mu \ne 2,\ \sigma^2 \ne 1$, using a 10% critical value.
    (e) Construct a joint approximate 95% confidence set for $\mu$ and $\sigma^2$. Plot the resulting confidence ellipse.
    (f) Calculate the MLE of $\theta = \mu/\sigma^2$. Calculate a 95% confidence interval for $\theta$.

85. An economist is flying to a conference where he will present a new research paper. While looking over his presentation he realizes that he failed to calculate the “p-value” for the key t-statistic in his paper. Unfortunately he doesn't have any statistical tables with him, but he does have a laptop computer and a program that can generate i.i.d. $N(0,1)$ random variables. He decides to estimate the p-value by simulating t-distributed random variables, constructed by transforming the standard normals produced by his random number generator. Specifically, he does the following. Let $t$ denote the numerical value of the economist's t-statistic. The p-value that he wants to estimate is
        $P = \Pr(t_{df} > t),$
    where $t_{df}$ is a t-distributed random variable with degrees of freedom equal to df. For his problem, the correct value of df is $df = 5$. The economist generates 100 draws from the $t_5$ distribution and estimates P by the sample fraction of the draws that are larger than $t$. That is, letting $x_i$ denote the i'th draw from the $t_5$ distribution, and $q_i$ denote the indicator variable
        $q_i = \begin{cases} 1 & \text{if } x_i > t \\ 0 & \text{otherwise,} \end{cases}$
    P is estimated as
        $\hat{P} = \frac{1}{100}\sum_{i=1}^{100} q_i.$
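
    A minimal sketch of the estimator just described, in Python with numpy (the value of t is a placeholder, and numpy's built-in Student-t generator is used here as a stand-in for the construction asked about in part (a) below):

        import numpy as np

        rng = np.random.default_rng(4)
        t_stat = 2.0       # placeholder for the economist's t-statistic
        df, draws = 5, 100

        # Simulate t_5 random variables and estimate P = Pr(t_5 > t) by the
        # fraction of draws exceeding the observed statistic.
        x = rng.standard_t(df, size=draws)
        q = (x > t_stat).astype(float)
        print(q.mean())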

    (a) If you had a random number generator that only generated i.i.d. $N(0,1)$ random variables, how would you generate a draw from the $t_5$ distribution?
    (b) Prove that the $q_i$'s are distributed as i.i.d. Bernoulli random variables with “success” probability equal to P.
    (c) Compute the mean and variance of $\hat{P}$ as a function of P.
    (d) Prove that $\hat{P}$ is a consistent estimator of P (as the number of simulated values of $q_i \to \infty$).
    (e) Suppose $\hat{P} = 0.04$. Construct a 90% confidence interval for P. (Hint: since the number of simulations is large ($N = 100$), you can use the Central Limit Theorem.)
    (f) Write the likelihood function for P as a function of the sample $q_i$'s.
    (g) Show that $\hat{P}$ is the MLE.
    (h) The economist thinks that he remembers looking up the p-value for the statistic and seems to remember that the correct value was 0.03. How would you construct the most powerful test of the null hypothesis that $P = 0.03$ versus the alternative that $P = 0.04$?

86. Suppose $X_i$, $i = 1, \ldots, n$, are distributed i.i.d. $U[0, \theta]$ (that is, uniformly over 0 to $\theta$).
    (a) What is the MLE of $\theta$?
    (b) Derive the CDF of $\hat{\theta}_{MLE}$.
    (c) Prove that $\hat{\theta}_{MLE} \xrightarrow{p} \theta_0$, where $\theta_0$ denotes the true value of $\theta$.
    (d) I am interested in testing $H_0: \theta = 1$ vs. $H_a: \theta > 1$, and do this by using a critical region defined by $\max_{1 \le i \le n}\{X_i\} > c$. Suppose that the sample size is $n = 100$.
        (i) Find the value of c so that the test has a size of 5%.
        (ii) Using this value of c, find the power of the test when $\theta = 1.02$.

87. $Y_i \sim$ i.i.d. with mean $\mu$ and variance $\sigma^2$ for $i = 1, \ldots, n$. A sample of size $n = 100$ yields $\frac{1}{100}\sum_{i=1}^{100} Y_i = 5$ and $\frac{1}{100}\sum_{i=1}^{100} Y_i^2 = 50$. Construct an approximate 95% confidence interval for $\mu$. (Justify the steps in your approximation.)

88. $Y_i$ are i.i.d. Bernoulli random variables with $P(Y_i = 1) = p$. I am interested in $H_o: p = \frac{1}{2}$ vs. $H_a: p > \frac{1}{2}$. The sample size is $n = 100$, and I decide to reject the null hypothesis if $\sum_{i=1}^{n} Y_i > 55$.
    (a) Use the central limit theorem to derive the approximate size of the test.
    (b) Suppose $p = 0.60$. Use the central limit theorem to derive the power of the test.

89. $Y \sim N(\mu, \sigma^2)$ and X is a binary random variable that is equal to 1 if $Y > 2$ and equal to 0 otherwise. Let $p = \mathrm{Prob}(X = 1)$. You have n i.i.d. observations on $Y_i$, and suppose the sample mean is $\bar{Y} = 4$ and the sample variance is $n^{-1}\sum_{i=1}^{n}(Y_i - \bar{Y})^2 = 9$.
    (a) Construct an estimate of p.
    (b) Explain how you would construct a 95% confidence interval for p.
    (c) Is the estimator that you used in (a) consistent? Explain why or why not.


90. $Y_i \sim$ i.i.d. $N(0, \sigma^2)$, $i = 1, \ldots, n$. Show that $S = \sum_{i=1}^{n} Y_i^2$ is a sufficient statistic for $\sigma^2$.

91. Suppose $Y_i|\mu \sim$ i.i.d. $N(\mu, 1)$, $i = 1, \ldots, n$, $\mu \sim N(2,1)$, and let $L(\hat{\mu}, \mu) = (\hat{\mu} - \mu)^2$. Suppose $n = 1$.
    (a) What is the MLE of $\mu$?
    (b) Derive the Bayes risk of the MLE.
    (c) Derive the posterior for $\mu$.
    (d) Derive the Bayes estimator of $\mu$.
    (e) Derive the Bayes risk of the Bayes estimator.
    (f) Suppose $Y_1 = 1.5$.
        (f.i) Construct a (frequentist) 95% confidence set for $\mu$.
        (f.ii) Construct a (Bayes) 95% credible set for $\mu$.

92. Suppose $Y_i \sim$ i.i.d. $N(\mu, 1)$. Let $\bar{Y}$ denote the sample mean. Suppose loss is quadratic, i.e., $L(\hat{\mu}, \mu) = (\hat{\mu} - \mu)^2$. Show that $\frac{1}{2}\bar{Y}$ is admissible.

93. Suppose $Y_1$ and $Y_2$ are i.i.d. with mean $\mu$ and variance $\sigma^2$. Let $\hat{\mu} = 0.25Y_1 + 0.75Y_2$. Suppose that risk is mean squared error, so that $R(\hat{\mu}, \mu) = E[(\hat{\mu} - \mu)^2]$.
    (a) Compute $R(\hat{\mu}, \mu)$.
    (b) Show that $\hat{\mu}$ is inadmissible.

94. Assume that $Y_i|\mu \sim$ i.i.d. $N(\mu, 1)$ for $i = 1, \ldots, n$. Suppose that $\mu \sim g$ for some prior density g. Show that the posterior distribution of $\mu$ depends on $\{Y_i\}_{i=1}^{n}$ only through $\bar{Y}$.

95. Assume that $Y_i|\mu \sim$ i.i.d. $N(\mu, 1)$ for $i = 1, \ldots, n$. Suppose that $\mu \sim N(0, \lambda)$ with probability p, and $\mu = 0$ with probability $1 - p$. Derive $E(\mu \mid \bar{Y})$.

96. $Y_i$ is i.i.d. $N(\mu, \sigma^2)$ for $i = 1, \ldots, n$. Let $\bar{Y}$ denote the sample mean, and consider $\hat{\mu} = \bar{Y} + 2$ as an estimator of $\mu$. Suppose that the loss function is $L(\hat{\mu}, \mu) = (\hat{\mu} - \mu)^2$. Show that $\hat{\mu}$ is an inadmissible estimator of $\mu$.