
Page 1: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

A SHORT COURSE ON ROBUST STATISTICS

David E. Tyler
Rutgers, The State University of New Jersey

Web-Site: www.rci.rutgers.edu/ dtyler/ShortCourse.pdf

Page 2: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

References

• Huber, P.J. (1981). Robust Statistics. Wiley, New York.

• Hampel, F.R., Ronchetti, E.M., Rousseeuw, P.J. and Stahel, W.A. (1986). Robust Statistics: The Approach Based on Influence Functions. Wiley, New York.

• Maronna, R.A., Martin, R.D. and Yohai, V.J. (2006). Robust Statistics: Theory and Methods. Wiley, New York.

Page 3: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PART 1

CONCEPTS AND BASIC METHODS

Page 4: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MOTIVATION

• Data Set: X1, X2, . . . , Xn

• Parametric Model: F (x1, . . . , xn | θ)

θ: Unknown parameter
F: Known function

• e.g. X1, X2, . . . , Xn i.i.d. Normal(µ, σ²)

Q: Is it realistic to believe we don’t know (µ, σ²), but we know e.g. the shape of the tails of the distribution?

A: The model is assumed to be approximately true, e.g. symmetric and unimodal (past experience).

Q: Are statistical methods which are good under the model reasonably good if the model is only approximately true?

ROBUST STATISTICS formally addresses this issue.

Page 5: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

CLASSIC EXAMPLE: MEAN vs. MEDIAN

• Symmetric distributions: µ = population mean = population median

• Sample mean: X̄ ≈ Normal(µ, σ²/n)

• Sample median: Median ≈ Normal(µ, (1/n) · 1/(4 f(µ)²))

• At the normal: Median ≈ Normal(µ, π σ²/(2n))

• Asymptotic Relative Efficiency of Median to Mean:

ARE(Median, X̄) = avar(X̄) / avar(Median) = 2/π = 0.6366
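A quick numerical check of this ratio (a minimal Monte Carlo sketch in Python/NumPy; the sample size and number of replications are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 101, 20000
samples = rng.normal(loc=0.0, scale=1.0, size=(reps, n))

var_mean = samples.mean(axis=1).var()
var_median = np.median(samples, axis=1).var()

# ARE(Median, Mean) = avar(Mean) / avar(Median); should be close to 2/pi = 0.6366
print(var_mean / var_median, 2 / np.pi)
```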

Page 6: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

CAUCHY DISTRIBUTION: X ~ Cauchy(µ, σ²)

f(x; µ, σ) = (1/(πσ)) [ 1 + ((x − µ)/σ)² ]⁻¹

• Mean: X̄ ~ Cauchy(µ, σ²)

• Median ≈ Normal(µ, π²σ²/(4n))

• ARE(Median, X̄) = ∞ or ARE(X̄, Median) = 0

• For t on ν degrees of freedom:

ARE(Median, X̄) = 4 Γ((ν + 1)/2)² / ( (ν − 2) π Γ(ν/2)² )

ν                 ≤ 2      3      4      5
ARE(Median, X̄)     ∞    1.621  1.125  0.960
ARE(X̄, Median)     0    0.617  0.888  1.041
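The table entries can be reproduced from the ARE formula above; a short sketch using the standard-library gamma function:

```python
from math import gamma, pi

def are_median_vs_mean_t(nu):
    """ARE(Median, Mean) at the t-distribution with nu > 2 degrees of freedom."""
    return 4 * gamma((nu + 1) / 2) ** 2 / ((nu - 2) * pi * gamma(nu / 2) ** 2)

for nu in (3, 4, 5):
    are = are_median_vs_mean_t(nu)
    print(nu, round(are, 3), round(1 / are, 3))   # compare with the table above
```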

Page 7: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MIXTURE OF NORMALS

Theory of errors: Central Limit Theorem gives plausibility to normality.

Normal(µ, σ2) with probability 1− ε

Normal(µ, (3σ)2) with probability ε

i.e. not all measurements are equally precise.

X ~ (1 − ε) Normal(µ, σ²) + ε Normal(µ, (3σ)²)

ε = 0.10

• Classic paper: Tukey (1960), A survey of sampling from contaminated distributions.

◦ For ε > 0.10 ⇒ ARE(Median, X̄) > 1

◦ The mean absolute deviation is more efficient than the sample standard deviation for ε > 0.01.

Page 8: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PRINCETON ROBUSTNESS STUDIES
Andrews et al. (1972)

Other estimates of location.

• α-trimmed mean: Trim a proportion α from both ends of the data set and then take the mean. (Throwing away data?)

• α-Winsorized mean: Replace a proportion α from both ends of the data set by the next closest observation and then take the mean.

• Example: 2, 4, 5, 10, 200
  Mean = 44.2    Median = 5
  20% trimmed mean = (4 + 5 + 10)/3 = 6.33
  20% Winsorized mean = (4 + 4 + 5 + 10 + 10)/5 = 6.6
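The same numbers can be reproduced with a few lines of Python (a minimal sketch; scipy.stats.trim_mean handles the trimming, and the Winsorization is done by hand here):

```python
import numpy as np
from scipy.stats import trim_mean

x = np.array([2., 4., 5., 10., 200.])
print(x.mean(), np.median(x))            # 44.2  5.0

# 20% trimmed mean: drop one observation from each end
print(trim_mean(x, 0.2))                 # 6.333...

# 20% Winsorized mean: replace one observation at each end by its nearest neighbour
xs = np.sort(x)
xs[0], xs[-1] = xs[1], xs[-2]
print(xs.mean())                         # 6.6
```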

Page 9: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Measuring the robustness of a statistic

• Relative Efficiency over a range of distributional models.

– There exist estimates of location which are asymptotically most efficient for the center of any symmetric distribution. (Adaptive estimation, semi-parametrics.)

– Robust?

• Influence Function over a range of distributional models.

• Maximum Bias Function and the Breakdown Point.

Page 10: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Measuring the effect of an outlier (not modeled)

• Good Data Set: x1, . . . , xn−1

Statistic: Tn−1 = T (x1, . . . , xn−1)

• Contaminated Data Set: x1, . . . , xn−1,x

Contaminated Value: Tn = T (x1, . . . , xn−1,x)

THE SENSITIVITY CURVE (Tukey, 1970)

SCn(x) = n (Tn − Tn−1)  ⇔  Tn = Tn−1 + (1/n) SCn(x)

THE INFLUENCE FUNCTION (Hampel, 1969, 1974)

Population version of the sensitivity curve.
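A minimal sketch of the sensitivity curve for the mean and the median on a fixed "good" sample (the sample itself and the grid of contamination values x are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
good = rng.normal(size=19)          # x_1, ..., x_{n-1}
xs = np.linspace(-10, 10, 401)      # locations of the added observation x

def sc(stat, x):
    n = len(good) + 1
    return n * (stat(np.append(good, x)) - stat(good))

sc_mean = np.array([sc(np.mean, x) for x in xs])      # unbounded, linear in x
sc_median = np.array([sc(np.median, x) for x in xs])  # bounded

print(sc_mean[-1], sc_median[-1])   # the mean's SC keeps growing; the median's levels off
```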

Page 11: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

THE INFLUENCE FUNCTION

• Statistic Tn = T (Fn) estimates T (F ).

• Consider F and the ε-contaminated distribution,

Fε = (1− ε)F + εδx

where δx is the point mass distribution at x.

(Figure: the density of F and of the contaminated distribution Fε.)

• Compare functional values: T(F) vs. T(Fε)

• Given Qualitative Robustness (Continuity):

T (Fε)→ T (F ) as ε→ 0

(e.g. the mode is not qualitatively robust)

• Influence Function (infinitesimal perturbation: Gateaux Derivative)

IF(x; T, F) = lim_{ε→0} [ T(Fε) − T(F) ] / ε = ∂T(Fε)/∂ε |_{ε=0}

Page 12: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

EXAMPLES

• Mean: T (F ) = EF [X ].

T(Fε) = E_{Fε}[X] = (1 − ε) E_F[X] + ε E[δx] = (1 − ε) T(F) + ε x

IF(x; T, F) = lim_{ε→0} [ (1 − ε)T(F) + εx − T(F) ] / ε = x − T(F)

• Median: T(F) = F⁻¹(1/2)

IF(x; T, F) = {2 f(T(F))}⁻¹ sign(x − T(F))

Page 13: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Plots of Influence Functions
Give insight into the behavior of a statistic.

Tn ≈ Tn−1 + (1/n) IF(x; T, F)

(Figures: influence functions of the Mean, Median, α-trimmed mean, and α-Winsorized mean; the last is somewhat unexpected.)

Page 14: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Desirable robustness properties for the influence function

• SMALL

– Gross Error Sensitivity

GES(T; F) = sup_x | IF(x; T, F) |

GES < ∞ ⇒ B-robust (Bias-robust)

– Asymptotic Variance

Note: E_F[ IF(X; T, F) ] = 0

AV(T; F) = E_F[ IF(X; T, F)² ]

• Under general conditions, e.g. Frechet differentiability,
√n ( T(X1, . . . , Xn) − T(F) ) → Normal(0, AV(T; F))

• Trade-off at the normal model: Smaller AV ↔ Larger GES

• SMOOTH (local shift sensitivity): protects e.g. against rounding error.

• REDESCENDING to 0.

Page 15: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

REDESCENDING INFLUENCE FUNCTION

• Example: Data Set of Male Heights in cm

180, 175, 192, . . ., 185, 2020, 190, . . .

• Redescender = Automatic Outlier Detector

Page 16: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

CLASSES OF ESTIMATES

L-statistics: Linear combination of order statistics

• Let X(1) ≥ . . . ≥ X(n) represent the order statistics.

T(X1, . . . , Xn) = Σ_{i=1}^n a_{i,n} X_(i)

where the a_{i,n} are constants.

• Examples:

◦ Mean: a_{i,n} = 1/n

◦ Median: for n odd, a_{i,n} = 1 if i = (n+1)/2 and 0 otherwise;
  for n even, a_{i,n} = 1/2 if i = n/2 or n/2 + 1 and 0 otherwise.

◦ α-trimmed mean.
◦ α-Winsorized mean.

• General form for the influence function exists

• Can obtain any desirable monotonic shape, but not redescending

• Do not readily generalize to other settings.

Page 17: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

M-ESTIMATES
Huber (1964, 1967)

Maximum likelihood type estimates under non-standard conditions

• One-Parameter Case. X1, . . . , Xn i.i.d. f (x; θ), θ ∈ Θ

• Maximum likelihood estimates

– Likelihood function: L(θ | x1, . . . , xn) = Π_{i=1}^n f(xi; θ)

– Minimize the negative log-likelihood:

min_θ Σ_{i=1}^n ρ(xi; θ) where ρ(xi; θ) = − log f(xi; θ).

– Solve the likelihood equations:

Σ_{i=1}^n ψ(xi; θ) = 0 where ψ(xi; θ) = ∂ρ(xi; θ)/∂θ

Page 18: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

DEFINITIONS OF M-ESTIMATES

• Objective function approach: θ̂ = arg min_θ Σ_{i=1}^n ρ(xi; θ)

• M-estimating equation approach: Σ_{i=1}^n ψ(xi; θ) = 0.

– Note: Unique solution when ψ(x; θ) is strictly monotone in θ.

• Basic examples.

– Mean. MLE for Normal: f(x) = (2π)^{-1/2} e^{−(x − θ)²/2} for x ∈ ℝ.

ρ(x; θ) = (x − θ)²  or  ψ(x; θ) = x − θ

– Median. MLE for Double Exponential: f(x) = (1/2) e^{−|x − θ|} for x ∈ ℝ.

ρ(x; θ) = | x − θ |  or  ψ(x; θ) = sign(x − θ)

• ρ and ψ need not be related to any density or to each other.

• Estimates can be evaluated under various distributions.

Page 19: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

M-ESTIMATES OF LOCATION

A symmetric and translation equivariant M-estimate.

Translation equivariance

Xi → Xi + a ⇒ Tn → Tn + a

gives ρ(x; t) = ρ(x − t) and ψ(x; t) = ψ(x − t)

Symmetric

Xi → −Xi ⇒ Tn → −Tn

gives ρ(−r) = ρ(r) or ψ(−r) = −ψ(r).

Alternative derivation
Generalization of the MLE for the center of symmetry for a given family of symmetric distributions:

f (x; θ) = g(|x− θ|)

Page 20: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

INFLUENCE FUNCTION OF M-ESTIMATES

M-FUNCTIONAL: T (F ) is the solution to EF [ψ(X;T (F ))] = 0.

IF (x;T, F ) = c(T, F )ψ(x;T (F ))

where c(T, F) = −1 / E_F[ ∂ψ(X; θ)/∂θ ], evaluated at θ = T(F).

Note: E_F[ IF(X; T, F) ] = 0.

One can decide what shape is desired for the Influence Function and then construct an appropriate M-estimate.

(Figures: example ψ shapes, one leading to the Mean and one to the Median.)

Page 21: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

EXAMPLE

Choose

ψ(r) =  c      for r ≥ c
        r      for | r | < c
        −c     for r ≤ −c

where c is a tuning constant.

Huber's M-estimate
An adaptively trimmed mean, i.e. the proportion trimmed depends upon the data.

Page 22: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

DERIVATION OF INFLUENCE FUNCTION FROM M-ESTIMATES
(Sketch)

Let Tε = T(Fε), and so

0 = E_{Fε}[ψ(X; Tε)] = (1 − ε) E_F[ψ(X; Tε)] + ε ψ(x; Tε).

Taking the derivative with respect to ε ⇒

0 = −E_F[ψ(X; Tε)] + (1 − ε) ∂/∂ε E_F[ψ(X; Tε)] + ψ(x; Tε) + ε ∂/∂ε ψ(x; Tε).

Let ψ′(x, θ) = ∂ψ(x, θ)/∂θ. Using the chain rule ⇒

0 = −E_F[ψ(X; Tε)] + (1 − ε) E_F[ψ′(X; Tε)] ∂Tε/∂ε + ψ(x; Tε) + ε ψ′(x; Tε) ∂Tε/∂ε.

Letting ε → 0 and using qualitative robustness, i.e. T(Fε) → T(F), then gives

0 = E_F[ψ′(X; T)] IF(x; T, F) + ψ(x; T(F)) ⇒ RESULTS.

Page 23: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

ASYMPTOTIC NORMALITY OF M-ESTIMATES
(Sketch)

Let θ = T(F). Using a Taylor series expansion of ψ(x; ·) about θ gives

0 = Σ_{i=1}^n ψ(xi; θ̂) = Σ_{i=1}^n ψ(xi; θ) + (θ̂ − θ) Σ_{i=1}^n ψ′(xi; θ) + . . . ,

or

0 = √n { (1/n) Σ_{i=1}^n ψ(xi; θ) } + √n (θ̂ − θ) { (1/n) Σ_{i=1}^n ψ′(xi; θ) } + Op(1/√n).

By the CLT, √n { (1/n) Σ_{i=1}^n ψ(xi; θ) } →d Z ~ Normal(0, E_F[ψ(X; θ)²]).

By the WLLN, (1/n) Σ_{i=1}^n ψ′(xi; θ) →p E_F[ψ′(X; θ)].

Thus, by Slutsky’s theorem, √n (θ̂ − θ) →d −Z / E_F[ψ′(X; θ)] ~ Normal(0, σ²),

where

σ² = E_F[ψ(X; θ)²] / E_F[ψ′(X; θ)]² = E_F[ IF(X; T, F)² ]

• NOTE: Proving Frechet differentiability is not necessary.

Page 24: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

M-estimates of location
Adaptively weighted means

• Recall translation equivariance and symmetry implies

ψ(x; t) = ψ(x− t) and ψ(−r) = −ψ(r).

• Express ψ(r) = r u(r) and let wi = u(xi − θ̂). Then

0 = Σ_{i=1}^n ψ(xi − θ̂) = Σ_{i=1}^n (xi − θ̂) u(xi − θ̂) = Σ_{i=1}^n wi (xi − θ̂)

⇒ θ̂ = Σ_{i=1}^n wi xi / Σ_{i=1}^n wi

The weights are determined by the data cloud.

(Figure: a one-dimensional data cloud; points far from θ̂ are heavily downweighted.)

Page 25: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

SOME COMMON M-ESTIMATES OF LOCATION

Huber’s M-estimate

ψ(r) =  c      for r ≥ c
        r      for | r | < c
        −c     for r ≤ −c

• Given a bound on the GES, it has maximum efficiency at the normal model.

• MLE for the “least favorable distribution”, i.e. the symmetric unimodal model with smallest Fisher Information within a “neighborhood” of the normal.
LFD = Normal in the middle and double exponential in the tails.

Page 26: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Tukey’s Bi-weight M-estimate (or bi-square)

u(r) = { (1 − r²/c²)_+ }²   where a_+ = max{0, a}

• Linear near zero

• Smooth (continuous second derivatives)

• Strongly redescending to 0

• NOT AN MLE.

Page 27: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

CAUCHY MLE

ψ(r) = (r/c) / (1 + r²/c²)

NOT A STRONG REDESCENDER.

Page 28: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

COMPUTATIONS

• IRLS: Iterative Re-weighted Least Squares Algorithm.

θ_{k+1} = Σ_{i=1}^n w_{i,k} xi / Σ_{i=1}^n w_{i,k}

where w_{i,k} = u(xi − θk) and θ0 is any initial value.

• Re-weighted mean = One-step M-estimate

θ1 = Σ_{i=1}^n w_{i,0} xi / Σ_{i=1}^n w_{i,0},

where θ0 is a preliminary robust estimate of location, such as the median.
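A minimal sketch of the IRLS iteration above, using the weight function u(r) = ψ(r)/r implied by Huber’s ψ and the median as the starting value θ0; the tuning constant c is an arbitrary choice here, since no scale statistic is used yet:

```python
import numpy as np

def huber_u(r, c=1.345):
    """Weight u(r) = psi(r)/r for Huber's psi: 1 in the middle, c/|r| in the tails."""
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def irls_location(x, u=huber_u, n_iter=50):
    x = np.asarray(x, dtype=float)
    theta = np.median(x)                 # initial value theta_0
    for _ in range(n_iter):
        w = u(x - theta)                 # w_{i,k} = u(x_i - theta_k)
        theta = np.sum(w * x) / np.sum(w)
    return theta

x = np.array([2., 4., 5., 10., 200.])
print(irls_location(x))                  # pulled toward the bulk of the data, unlike the mean
```

The value after a single pass of the loop is the one-step M-estimate (re-weighted mean) described above.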

Page 29: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PROOF OF CONVERGENCE
(Sketch)

θ_{k+1} − θk = Σ_{i=1}^n w_{i,k} xi / Σ_{i=1}^n w_{i,k} − θk
            = Σ_{i=1}^n w_{i,k} (xi − θk) / Σ_{i=1}^n w_{i,k}
            = Σ_{i=1}^n ψ(xi − θk) / Σ_{i=1}^n u(xi − θk)

Note: If θk > θ̂, then θ_{k+1} < θk; if θk < θ̂, then θ_{k+1} > θk.

Decreasing objective function

Let ρ(r) = ρ0(r²) and suppose ρ0′(s) ≥ 0 and ρ0″(s) ≤ 0. Then

Σ_{i=1}^n ρ(xi − θ_{k+1}) < Σ_{i=1}^n ρ(xi − θk).

• Examples:

– Mean ⇒ ρ0(s) = s

– Median ⇒ ρ0(s) = √s

– Cauchy MLE ⇒ ρ0(s) = log(1 + s).

• Includes redescending M-estimates of location.

• Generalization of EM algorithm for mixture of normals.

Page 30: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PROOF OF MONOTONE CONVERGENCE
(Sketch)

Let r_{i,k} = (xi − θk). By Taylor series with remainder term,

ρ0(r²_{i,k+1}) = ρ0(r²_{i,k}) + (r²_{i,k+1} − r²_{i,k}) ρ0′(r²_{i,k}) + (1/2)(r²_{i,k+1} − r²_{i,k})² ρ0″(r²_{i,*})

So,

Σ_{i=1}^n ρ0(r²_{i,k+1}) < Σ_{i=1}^n ρ0(r²_{i,k}) + Σ_{i=1}^n (r²_{i,k+1} − r²_{i,k}) ρ0′(r²_{i,k})

Now,

r u(r) = ψ(r) = ρ′(r) = 2 r ρ0′(r²), and so ρ0′(r²) = u(r)/2.

Also,

(r²_{i,k+1} − r²_{i,k}) = (θ_{k+1} − θk)² − 2(θ_{k+1} − θk)(xi − θk).

Thus,

Σ_{i=1}^n ρ0(r²_{i,k+1}) < Σ_{i=1}^n ρ0(r²_{i,k}) − (1/2)(θ_{k+1} − θk)² Σ_{i=1}^n u(r_{i,k}) < Σ_{i=1}^n ρ0(r²_{i,k})

• Slow but sure ⇒ Switch to Newton-Raphson after a few iterations.

Page 31: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PART 2

MORE ADVANCED CONCEPTS AND METHODS

Page 32: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

SCALE EQUIVARIANCE

• M-estimates of location alone are not scale equivariant in general, i.e.

Xi → Xi + a ⇒ θ̂ → θ̂ + a   (location equivariant)

Xi → b Xi   ⇏ θ̂ → b θ̂     (not scale equivariant)

(Exceptions: the mean and median.)

• Thus, the adaptive weights are not dependent on the spread of the data.

(Figure: two data clouds with different spreads; the extreme points are equally downweighted in both.)

Page 33: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

SCALE STATISTICS: sn

Xi → b Xi + a ⇒ sn → | b | sn

• Sample standard deviation:

sn = √{ (1/(n − 1)) Σ_{i=1}^n (xi − x̄)² }

• MAD (or more appropriately MADAM):

Median Absolute Deviation About the Median

s*n = Median | xi − Median |

sn = 1.4826 s*n →p σ at Normal(µ, σ²)

Page 34: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Example

2, 4, 5, 10, 12, 14, 200

• Median = 10

• Absolute Deviations: 8, 6, 5, 0, 2, 4, 190

• MADAM = 5 ⇒ sn = 7.413

• (Standard deviation = 72.7661)
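The example can be checked directly (a minimal sketch in NumPy):

```python
import numpy as np

x = np.array([2., 4., 5., 10., 12., 14., 200.])
med = np.median(x)                       # 10
madam = np.median(np.abs(x - med))       # 5
print(madam, 1.4826 * madam)             # 5.0  7.413  (rescaled to be consistent at the normal)
print(x.std(ddof=1))                     # ~72.77, inflated by the single outlier 200
```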

Page 35: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

M-estimates of location with auxiliary scale

θ̂ = arg min_θ Σ_{i=1}^n ρ( (xi − θ) / (c sn) )

or

θ̂ solves:  Σ_{i=1}^n ψ( (xi − θ) / (c sn) ) = 0

• c: tuning constant

• sn: consistent for σ at normal

Page 36: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Tuning an M-estimate

• Given a ψ-function, define a class of M-estimates via

ψc(r) = ψ(r/c)

• Tukey’s Bi-weight.

ψ(r) = r { (1 − r²)_+ }²  ⇒  ψc(r) = (r/c) { (1 − r²/c²)_+ }²

• c → ∞ ⇒ ≈ Mean

• c → 0 ⇒ Locally very unstable

Page 37: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Normal distribution. Auxiliary scale.

Page 38: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MORE THAN ONE OUTLIER

• GES: measure of local robustness.

• Measures of global robustness?

Xn = {X1, . . . , Xn}: n “good” data points

Ym = {Y1, . . . , Ym}: m “bad” data points

Zn+m = Xn ∪ Ym: εm-contaminated sample, with εm = m/(n + m)

Bias of a statistic: | T (Xn ∪ Ym)− T (Xn) |

Page 39: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MAX-BIAS and THE BREAKDOWN POINT

• Max-Bias under εm-contamination:

B(εm; T, Xn) = sup_{Ym} | T(Xn ∪ Ym) − T(Xn) |

• Finite sample contamination breakdown point: Donoho and Huber (1983)

ε*c(T; Xn) = inf{ εm | B(εm; T, Xn) = ∞ }

• Other concepts of breakdown (e.g. replacement).

• The definition of BIAS can be modified so that BIAS → ∞ as T(Xn ∪ Ym) goes to the boundary of the parameter space.

Example: If T represents scale, then choose

BIAS = | log{T (Xn ∪ Ym)} − log{T (Xn)} |.

Page 40: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

EXAMPLES

• Mean: ε*c = 1/(n + 1)

• Median: ε∗c = 1/2

• α-trimmed mean: ε∗c = α

• α-Winsorized mean: ε*c = α

• M-estimate of location with monotone and bounded ψ function:

ε∗c = 1/2

Page 41: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PROOF (sketch of lower bound)

Let K = sup_r |ψ(r)|.

0 = Σ_{i=1}^{n+m} ψ(zi − T_{n+m}) = Σ_{i=1}^n ψ(xi − T_{n+m}) + Σ_{i=1}^m ψ(yi − T_{n+m})

| Σ_{i=1}^n ψ(xi − T_{n+m}) | = | Σ_{i=1}^m ψ(yi − T_{n+m}) | ≤ mK

Breakdown occurs ⇒ | T_{n+m} | → ∞, say T_{n+m} → −∞

⇒ | Σ_{i=1}^n ψ(xi − T_{n+m}) | → | Σ_{i=1}^n ψ(∞) | = nK

Therefore, m ≥ n and so ε*c ≥ 1/2. []

Page 42: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Population Version of Breakdown Point
under contamination neighborhoods

• Model Distribution: F    • Contaminating Distribution: H

• ε-contaminated Distribution: Fε = (1 − ε)F + εH

• Max-Bias under ε-contamination:

B(ε; T, F) = sup_H | T(Fε) − T(F) |

• Breakdown Point: Hampel (1968)

ε*(T; F) = inf{ ε | B(ε; T, F) = ∞ }

• Examples

– Mean: T(F) = E_F(X) ⇒ ε*(T; F) = 0

– Median: T(F) = F⁻¹(1/2) ⇒ ε*(T; F) = 1/2

Page 43: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

ILLUSTRATION

(Figure: the max-bias curve B(ε; T, F) plotted against ε.)

GES ≈ ∂B(ε; T, F)/∂ε |_{ε=0}

ε*(T; F) = asymptote

Page 44: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Heuristic Interpretation of Breakdown Point
(subject to debate)

• Proportion of bad data a statistic can tolerate before becoming arbitrary or meaningless.

• If 1/2 the data is bad, then one cannot distinguish between the good data and the bad data? ⇒ ε* ≤ 1/2?

• Discussion: Davies and Gather (2005), Annals of Statistics.

Page 45: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Example: Redescending M-estimates of location with fixed scale

T(x1, . . . , xn) = arg min_t Σ_{i=1}^n ρ( | xi − t | / c )

Breakdown point. Huber (1984). For bounded increasing ρ,

ε*c(T; Xn) = ( 1 − A(Xn; c)/n ) / ( 2 − A(Xn; c)/n )

where A(Xn; c) = min_t Σ_{i=1}^n ρ((xi − t)/c) and sup ρ(r) = 1.

• Breakdown point depends on Xn and c

• ε* : 0 → 1/2 as c : 0 → ∞

• For large c, T(x1, . . . , xn) ≈ Mean!!

Page 46: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Explanation: Relationship between redescending M-estimates of location and kernel density estimates

Chu, Glad, Godtliebsen and Marron (1998)

• Objective function: Σ_{i=1}^n ρ( (xi − t)/c )

• Kernel density estimate: f̂(t) ∝ Σ_{i=1}^n κ( (xi − t)/h )

• Relationship: κ ∝ 1 − ρ and c = window width h

• Example: Normal kernel κ(r) = (1/√(2π)) exp(−r²/2)

⇒ ρ(r) = 1 − exp(−r²/2)

Welsch’s M-estimate

• Example: Epanechnikov kernel ⇒ skipped mean

Page 47: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

• “Outliers” less compact than “good” data ⇒ Breakdown will not occur.

• Not true for monotonic M-estimates.

Page 48: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PART 3

ROBUST REGRESSION AND MULTIVARIATE STATISTICS

Page 49: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

REGRESSION SETTING

• Data: (Yi,Xi) i = 1, . . . , n

– Yi ∈ ℝ: Response
– Xi ∈ ℝ^p: Predictors

• Predict: Y by X′β

• Residual for a given β: ri(β) = yi − x′iβ

• M-estimates of regression

– Generalization of the MLE for a symmetric error term
– Yi = Xi′β + εi where the εi are i.i.d. symmetric.

Page 50: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

M-ESTIMATES OF REGRESSION

• Objective function approach:

min_β Σ_{i=1}^n ρ(ri(β))

where ρ(r) = ρ(−r) and ρ ↑ for r ≥ 0.

• M-estimating equation approach:

Σ_{i=1}^n ψ(ri(β)) xi = 0

e.g. ψ(r) = ρ′(r)

• Interpretation: Adaptively weighted least squares.

◦ Express ψ(r) = r u(r) and wi = u(ri(β))

β̂ = [ Σ_{i=1}^n wi xi xi′ ]⁻¹ [ Σ_{i=1}^n wi yi xi ]

• Computations via IRLS algorithm.
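A minimal sketch of this adaptively weighted least squares iteration, using Huber weights on residuals standardized by the MADAM; the weight function, tuning constant, and toy data are assumptions here, and in practice library implementations (e.g. rlm in R's MASS package, or RLM in statsmodels) would normally be used:

```python
import numpy as np

def huber_w(r, c=1.345):
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / np.maximum(a, 1e-12))

def m_regression_irls(X, y, n_iter=50):
    """IRLS for an M-estimate of regression: beta = (X'WX)^{-1} X'Wy, W updated each step."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]            # least squares start
    for _ in range(n_iter):
        r = y - X @ beta
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12   # MADAM of the residuals
        w = huber_w(r / s)
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ y)
    return beta

# straight line with one gross outlier in y
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=20)
y[5] += 50.0
print(m_regression_irls(X, y))   # close to (1, 2); ordinary least squares is not
```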

Page 51: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

INFLUENCE FUNCTIONS FOR M-ESTIMATES OF REGRESSION

IF (y, x; T, F ) ∝ ψ(r) x

• r = y − x′T (F )

• Due to residual: ψ(r) • Due to Design: x

• GES =∞, i.e. unbounded influence.

• Outlier 1: highly influential for any ψ function.

• Outlier 2: highly influential for monotonic but not redescending ψ.

Page 52: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

BOUNDED INFLUENCE REGRESSION

• GM-estimates (Generalized M-estimates).

◦ Mallows-type (Mallows, 1975): Σ_{i=1}^n w(xi) ψ(ri(β)) xi = 0

Downweights outlying design points (leverage points), even if they are good leverage points.

◦ General form (e.g. Maronna and Yohai, 1981):

Σ_{i=1}^n ψ(xi, ri(β)) xi = 0

• Breakdown points.

– M-estimates: ε∗ = 0

– GM-estimates: ε∗ ≤ 1/(p + 1)

Page 53: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

LATE 70’s - EARLY 80’s

• Open problem: Is high breakdown point regression possible?

• Yes. Repeated Median. Siegel (1982).

– Not regression equivariant

• Regression equivariance: For a ∈ ℝ and A nonsingular,

(Yi, Xi) → (aYi, A′Xi) ⇒ β̂ → a A⁻¹ β̂

i.e. the fitted values satisfy Ŷi → a Ŷi

• Open problem: Is high breakdown point equivariate regression possible?

• Yes. Least Median of Squares. Hampel (1984), Rousseeuw, (1984)

Page 54: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Least Median of Squares (LMS)
Hampel (1984), Rousseeuw (1984)

min_β Median{ ri(β)² | i = 1, . . . , n }

• Alternatively: min_β MAD{ ri(β) }

• Breakdown point: ε* = 1/2

• Location version: SHORTH (Princeton Robustness Study)

Midpoint of the SHORTest Half.

Page 55: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

LMS: Mid-line of the shortest strip containing 1/2 of the data.

• Problem: Not √n-consistent, but only n^{1/3}-consistent:

√n || β̂ − β || →p ∞

n^{1/3} || β̂ − β || = Op(1)

• Not locally stable. e.g. Example is pure noise.

Page 56: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

S-ESTIMATES OF REGRESSION
Rousseeuw and Yohai (1984)

• For S(·), an estimate of scale (about zero): min_β S(ri(β))

• S = MAD ⇒ LMS

• S = sample standard deviation about 0 ⇒ Least Squares

• Bounded monotonic M-estimates of scale (about zero):

Σ_{i=1}^n χ( | ri | / s ) = 0

for χ ↑, and bounded above and below. Alternatively,

(1/n) Σ_{i=1}^n ρ( | ri | / s ) = ε

for ρ ↑, and 0 ≤ ρ ≤ 1.

• For LMS: ρ is a 0-1 jump function and ε = 1/2

• Breakdown point: ε∗ = min(ε, 1− ε)

Page 57: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

S-estimates of Regression

• √n-consistent and asymptotically normal.

• Trade-off: Higher breakdown point ⇒ lower efficiency and higher residual gross error sensitivity for Normal errors.

• One Resolution: One-step M-estimate via IRLS. (Note: R, S-plus, Matlab.)

Page 58: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

M-ESTIMATES OF REGRESSION WITH GENERAL SCALE
Martin, Yohai, and Zamar (1989)

min_β Σ_{i=1}^n ρ( | yi − xi′β | / (c sn) )

• Parameter of interest: β

• Scale statistic: sn (consistent for σ at normal errors)

• Tuning constant: c

• Monotonic bounded ρ ⇒ redescending M-estimate

• High Breakdown Point Examples

– LMS and S-estimates
– MM-estimates, Yohai (1987).
– CM-estimates, Mendes and Tyler (1994).

Page 59: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

(Figure: sn plotted against Σ_{i=1}^n ρ( | yi − xi′β | / (c sn) ).)

• Each β gives one curve

• S minimizes horizontally

• M minimizes vertically

Page 60: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MM-ESTIMATES OF REGRESSION
Default robust regression estimate in S-plus (also SAS?)

• Given a preliminary estimate of regression with breakdown point 1/2.
Usually an S-estimate of regression.

• Compute a monotone M-estimate of scale about zero for the residuals.
sn: usually from the S-estimate.

• Compute an M-estimate of regression using the scale statistic sn and with any desired tuning constant c.

• Tune to obtain reasonable ARE and residual GES.

• Breakdown point: ε∗ = 1/2

Page 61: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

TUNING

• High breakdown point S-estimates are badly tuned M-estimates.

• Tuning an S-estimate affects its breakdown point.

• MM-estimates can be tuned without affecting the breakdown point.

Page 62: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

COMPUTATIONAL ISSUES

• All known high breakdown point regression estimates are computationally intensive. Non-convex optimization problem.

• Approximate or stochastic algorithms.

• Random Elemental Subset Selection. Rousseeuw (1984).

– Consider exact fits of lines to p of the data points ⇒ (n choose p) such lines.

– Optimize the criterion over such lines.

– For large n and p, randomly sample from such lines so that there is a good chance, say e.g. a 95% chance, that one of the elemental subsets contains only good data even if half the data is contaminated. (A sketch of this search for LMS is given below.)
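A minimal sketch of this elemental-subset search for the LMS line (p = 2); the number of random subsets and the toy data are arbitrary choices:

```python
import numpy as np

def lms_line(x, y, n_subsets=1000, rng=np.random.default_rng(2)):
    """Approximate Least Median of Squares line via random elemental (2-point) fits."""
    best_crit, best = np.inf, None
    for _ in range(n_subsets):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])       # exact fit to the pair
        a = y[i] - b * x[i]
        crit = np.median((y - a - b * x) ** 2)  # median of squared residuals
        if crit < best_crit:
            best_crit, best = crit, (a, b)
    return best

x = np.linspace(0, 1, 40)
y = 1.0 + 2.0 * x + np.random.default_rng(3).normal(scale=0.05, size=40)
y[:15] = 8.0                                    # almost 40% of the data is bad
print(lms_line(x, y))                           # still close to (1, 2)
```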

Page 63: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MULTIVARIATE DATA

Page 64: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Robust Multivariate Location and Covariance Estimates
Parallel developments

• Multivariate M-estimates. Maronna (1976). Huber (1977).

◦ Adaptively weighted mean and covariances.
◦ IRLS algorithms.
◦ Breakdown point: ε* ≤ 1/(d + 1), where d is the dimension of the data.

• Minimum Volume Ellipsoid Estimates. Rousseeuw (1985).

◦ MVE is a multivariate version of LMS.

• Multivariate S-estimates. Davies (1987), Lopuhaä (1989)

• Multivariate CM-estimates. Kent and Tyler (1996).

• Multivariate MM-estimates. Tatsuoka and Tyler (2000). Tyler (2002).

Page 65: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MULTIVARIATE DATA

• d-dimensional data set: X1, . . .Xn

• Classical summary statistics:

– Sample Mean Vector: X̄ = (1/n) Σ_{i=1}^n Xi

– Sample Variance-Covariance Matrix:

Sn = (1/n) Σ_{i=1}^n (Xi − X̄)(Xi − X̄)ᵀ = {sij}

where sii = sample variance, sij = sample covariance.

• Maximum likelihood estimates under normality

Page 66: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

(Figure: scatter plot of the Hertzsprung−Russell Galaxy Data (Rousseeuw), with the ellipse from the sample mean and covariance.)

Visualization ⇒ Ellipse containing half the data:

(x − X̄)ᵀ Sn⁻¹ (x − X̄) ≤ c

Page 67: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Mahalanobis distances: d²i = (Xi − X̄)ᵀ Sn⁻¹ (Xi − X̄)

(Figure: Mahalanobis distances di plotted against the observation index.)

Mahalanobis angles: θij = cos⁻¹{ (Xi − X̄)ᵀ Sn⁻¹ (Xj − X̄) / (di dj) }

Page 68: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

(Figure: plot of the first two principal components, PC1 vs. PC2.)

Principal Components: Do not always reveal outliers.

Page 69: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

ENGINEERING APPLICATIONS

• Noisy signal arrays, radar clutter.
Test for signal, i.e. a detector, based on the Mahalanobis angle between the signal and the observed data.

• Hyperspectral Data.
Adaptive cosine estimator = Mahalanobis angle.

• BIG DATA

• Cannot clean the data by inspecting observations. Need automatic methods.

Page 70: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

ROBUST MULTIVARIATE ESTIMATES
Location and Scatter (Pseudo-Covariance)

• Trimmed versions: complicated

• Weighted mean and covariance matrices

µ̂ = Σ_{i=1}^n wi Xi / Σ_{i=1}^n wi

Σ̂ = (1/n) Σ_{i=1}^n wi (Xi − µ̂)(Xi − µ̂)ᵀ

where wi = u(d_{i,0}) and d²_{i,0} = (Xi − µ̂0)ᵀ Σ̂0⁻¹ (Xi − µ̂0)

and with µ̂0 and Σ̂0 being initial estimates, e.g.

→ Sample mean vector and covariance matrix.

→ High breakdown point estimates.
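A minimal one-step sketch of such a re-weighted mean and covariance, starting from the sample mean and covariance and using a hard-rejection weight on the Mahalanobis distances; the cutoff value is an assumption (a chi-square quantile is a common choice), and starting from a high breakdown point estimate works the same way:

```python
import numpy as np

def reweighted_mean_cov(X, cutoff=3.0):
    """One-step weighted mean/covariance: w_i = u(d_i0) from initial estimates (mu_0, Sigma_0)."""
    X = np.asarray(X, float)
    mu0, S0 = X.mean(axis=0), np.cov(X, rowvar=False)
    d2 = np.einsum('ij,jk,ik->i', X - mu0, np.linalg.inv(S0), X - mu0)  # squared Mahalanobis
    w = (d2 <= cutoff ** 2).astype(float)        # hard-rejection weights u(d)
    mu = (w[:, None] * X).sum(axis=0) / w.sum()
    Xc = X - mu
    Sigma = (w[:, None] * Xc).T @ Xc / len(X)    # (1/n) sum of w_i (X_i - mu)(X_i - mu)^T
    return mu, Sigma

rng = np.random.default_rng(4)
X = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=200)
X[:10] += 10.0                                   # 5% gross outliers
print(reweighted_mean_cov(X)[0])                 # near (0, 0)
```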

Page 71: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

MULTIVARIATE M-ESTIMATES

• MLE type for elliptically symmetric distributions

• Adaptively weighted mean and covariances

µ̂ = Σ_{i=1}^n wi Xi / Σ_{i=1}^n wi

Σ̂ = (1/n) Σ_{i=1}^n wi (Xi − µ̂)(Xi − µ̂)ᵀ

• wi = u(di) and d²i = (Xi − µ̂)ᵀ Σ̂⁻¹ (Xi − µ̂)

• Implicit equations

Page 72: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PROPERTIES OF M-ESTIMATES

• Root-n consistent and asymptotically normal.

• Can be tuned to have desirable properties:
– high efficiency over a broad class of distributions
– smooth and bounded influence function

• Computationally simple (IRLS)

• Breakdown point: ε∗ ≤ 1/(d+ 1)

Page 73: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

HIGH BREAKDOWN POINT ESTIMATES

• MVE: Minimum Volume Ellipsoid ⇒ ε* = 0.5
Center and scatter matrix of the smallest ellipsoid covering at least half of the data.

• MCD, S-estimates, MM-estimates, CM-estimates ⇒ ε* = 0.5

• Reweighted versions based on high breakdown start.

• Computationally intensive:

Approximate/probabilistic algorithms.
Elemental Subset Approach.

Page 74: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

(Figure: Hertzsprung−Russell Galaxy Data (Rousseeuw) with the sample mean and covariance ellipse.)

Page 75: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

(Figure: same data, with the MVE ellipse added.)

Page 76: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

(Figure: same data, with the Cauchy MLE ellipse added.)

Page 77: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

ELLIPTICAL DISTRIBUTIONS

and

AFFINE EQUIVARIANCE

Page 78: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

SPHERICALLY SYMMETRIC DISTRIBUTIONS

Z = R U where R ⊥ U, R > 0,

and U ∼d Uniform on the unit sphere.

Density: f(z) = g(z′z)

(Figure: scatter plot of a spherically symmetric sample in the (x, y) plane.)

Page 79: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

ELLIPTICAL SYMMETRIC DISTRIBUTIONS

X = AZ + µ ~ E(µ, Γ; g), where Γ = AA′

⇒ f(x) = |Γ|^{-1/2} g( (x − µ)′ Γ⁻¹ (x − µ) )

(Figure: scatter plot of an elliptically symmetric sample.)

Page 80: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

AFFINE EQUIVARIANCE

• Parameters of elliptical distributions:

X ~ E(µ, Γ; g) ⇒ X* = BX + b ~ E(µ*, Γ*; g)

where µ* = Bµ + b and Γ* = BΓB′

• Sample version:

Xi → Xi* = BXi + b, i = 1, . . . , n

⇒ µ̂ → µ̂* = Bµ̂ + b and V̂ → V̂* = BV̂B′

• Examples

– Mean vector and covariance matrix
– M-estimates
– MVE

Page 81: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

ELLIPTICAL SYMMETRIC DISTRIBUTIONS

f(x; µ, Γ) = |Γ|^{-1/2} g( (x − µ)′ Γ⁻¹ (x − µ) ).

• If second moments exist, shape matrix Γ ∝ covariance matrix.

• For any affine equivariant scatter functional: Σ(F ) ∝ Γ

• That is, any affine equivariant scatter statistic estimates a matrix proportional to the shape matrix Γ.

NON-ELLIPTICAL DISTRIBUTIONS

• Different scatter matrices estimate different population quantities.

• Analogy: For non-symmetric distributions: population mean ≠ population median.

Page 82: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

OTHER TOPICS

Projection-based multivariate approaches

• Tukey’s data depth (or half-space depth). Tukey (1974).

• Stahel-Donoho’s Estimate. Stahel (1981). Donoho (1982).

• Projection-estimates. Maronna, Stahel and Yohai (1994). Tyler (1996).

• Computationally Intensive.

Page 83: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

TUKEY’S DEPTH

Page 84: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

PART 5

ROBUSTNESS AND COMPUTER VISION

• Independent development of robust statistics within the computer vision/image understanding community.

Page 85: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler
Page 86: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler
Page 87: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler
Page 88: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

• Hough transform: ρ = x cos(θ) + y sin(θ)

• y = a + bx where a = ρ/sin(θ) and b = −cos(θ)/sin(θ)

• That is, an intersection in feature space refers to the line connecting two data points.

• Hough: Estimation in Data Space becomes Clustering in Feature Space.
Find the centers of the clusters.

• Terminology:
– Feature Space = Parameter Space
– Accumulator = Elemental Fit

• Computation: RANSAC (Random sample consensus)
– Randomly choose a subset from the Accumulator. (Random elemental fits.)
– Check to see how many data points are within a fixed neighborhood of the fit. (A sketch is given below.)
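A minimal sketch of the RANSAC step for a line; it differs from the LMS elemental-subset sketch earlier only in the criterion, counting points inside a fixed band rather than taking the median squared residual (the band width and number of trials are arbitrary choices):

```python
import numpy as np

def ransac_line(x, y, n_trials=500, band=0.05, rng=np.random.default_rng(5)):
    """Random elemental (2-point) fits; keep the fit with the largest consensus set."""
    best_count, best = -1, None
    for _ in range(n_trials):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])               # exact fit to the pair
        a = y[i] - b * x[i]
        count = np.sum(np.abs(y - a - b * x) <= band)   # points within a fixed neighborhood
        if count > best_count:
            best_count, best = count, (a, b)
    return best, best_count
```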

Page 89: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Alternative Formulation of Hough Transform/RANSAC

Σ ρ(ri) should be small, where ρ(r) = 1 for | r | > R and ρ(r) = 0 for | r | ≤ R.

That is, a redescending M-estimate of regression with known scale.
Note: Relationship to LMS.
Vision: Need e.g. a 90% breakdown point, i.e. tolerate 90% bad data.

Page 90: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Definition of Residuals?

Hough transform approach does not distinguish between:

• Regression line for regressing Y on X.

• Regression line for regressing X on Y.

• Orthogonal regression line.

(Note: Small stochastic errors ⇒ little difference.)

Page 91: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

GENERAL PARADIGM

• Line in 2-D: Exact fit to all pairs

• Quadratic in 2-D: Exact fit to all triples

• Conic Sections: Ellipse Fitting

(x− µ)′A(x− µ) = 1

◦ Linearizing: Let x*′ = (x1, x2, x1², x2², x1x2, 1)

◦ Ellipse: a′x* = 0

◦ Exact fit to all subsets of size 5. (A sketch of such an exact fit is given below.)
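A minimal sketch of one such exact elemental fit: for 5 points, the coefficient vector a in a′x* = 0 spans the null space of the 5 × 6 matrix of linearized points, computed here via the SVD (whether the fitted conic is actually an ellipse is not checked):

```python
import numpy as np

def conic_exact_fit(pts):
    """Exact conic through 5 points: solve a' x* = 0 with x* = (x1, x2, x1^2, x2^2, x1*x2, 1)."""
    x1, x2 = pts[:, 0], pts[:, 1]
    D = np.column_stack([x1, x2, x1**2, x2**2, x1 * x2, np.ones_like(x1)])  # 5 x 6
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]          # right singular vector spanning the null space

# five points on the unit circle x1^2 + x2^2 - 1 = 0
t = np.linspace(0, 2 * np.pi, 6)[:-1]
pts = np.column_stack([np.cos(t), np.sin(t)])
a = conic_exact_fit(pts)
print(a / a[2])            # proportional to (0, 0, 1, 1, 0, -1)
```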

Page 92: A SHORT COURSE ON ROBUST STATISTICS David E. Tyler

Hyperboloid Fitting in 4-D

• Epipolar Geometry: Exact fit to

◦ 5 pairs: Calibrated cameras (location, rotation, scale).
◦ 8 pairs: Uncalibrated cameras (focal point, image plane).
◦ Hyperboloid surface in 4-D.

• Applications: 2D → 3D, Motion.

• OUTLIERS = MISMATCHES