
Decision Analysis


A step-by-step approach to decision making under risk and uncertainty


Page 1: Decision Analysis

Decision Analysis

Scott Ferson, [email protected] September 2007, Stony Brook University, MAR 550, Challenger 165

Page 2: Decision Analysis

Outline

• Risk and uncertainty
• Expected utility decisions
  – St. Petersburg game, Ellsberg Paradox
• Decisions under uncertainty
  – Maximin, maximax, Hurwicz, minimax regret, etc.
• Junk science and the precautionary principle
• Decisions under ranked probabilities
  – Extreme expected payoffs
• Decisions under imprecision
  – E-admissibility, maximality, Γ-maximin, Γ-maximax, etc.
• Synopsis and conclusions

Page 3: Decision Analysis

Decision theory

• Formal process for evaluating possible actions and making decisions

• Statistical decision theory is decision theory using statistical information

• Knight (1921)
  – Decision under risk (probabilities known)
  – Decision under uncertainty (probabilities not known)

Page 4: Decision Analysis

Discrete decision problem

• Actions Ai (strategies, decisions, choices)

• Scenarios Sj

• Payoffs Xij for action Ai in scenario Sj

• Probability Pj (if known) of scenario Sj

• Decision criterion

        S1    S2    S3   …
  A1   X11   X12   X13   …
  A2   X21   X22   X23   …
  A3   X31   X32   X33   …
   .     .     .     .
        P1    P2    P3   …

Page 5: Decision Analysis

Decisions under risk

• If you make many similar decisions, then you’ll perform best in the long run using “expected utility” (EU) as the decision rule

• EU = maximize expected utility (Pascal 1670)

• Pick the action Ai for which Σj (Pj Xij) is biggest

Page 6: Decision Analysis

Example

              Scenario 1   Scenario 2   Scenario 3   Scenario 4
Action A          10            5           15            5
Action B          20           10            0            5
Action C          10           10           20           15
Action D           0            5           60           25
Probability       .5           .25          .15          .1

EU(A) = 10*.5 + 5*.25 + 15*.15 + 5*.1 = 9
EU(B) = 20*.5 + 10*.25 + 0*.15 + 5*.1 = 13
EU(C) = 10*.5 + 10*.25 + 20*.15 + 15*.1 = 12
EU(D) = 0*.5 + 5*.25 + 60*.15 + 25*.1 = 12.75

Maximizing expected utility prefers action B
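A minimal sketch (hypothetical, in Python) of this computation, mirroring the payoff matrix above:

import numpy as np

# Payoff matrix: rows = actions A-D, columns = scenarios 1-4
X = np.array([[10, 5, 15, 5],
              [20, 10, 0, 5],
              [10, 10, 20, 15],
              [0, 5, 60, 25]])
P = np.array([0.5, 0.25, 0.15, 0.1])   # scenario probabilities

eu = X @ P                       # expected utility of each action
print(eu)                        # [ 9.   13.   12.   12.75]
print("ABCD"[np.argmax(eu)])     # B -- maximize expected utility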

Page 7: Decision Analysis

Strategizing possible actions

• Office printing
  – Repair old printer
  – Buy new HP printer
  – Buy new Lexmark printer
  – Outsource print jobs

• Protecting Australia's marine resources
  – Undertake treaty to define marine reserve
  – Pay Australian fishing vessels not to exploit
  – Pay all fishing vessels not to exploit
  – Repel encroachments militarily
  – Support further research
  – Do nothing

Page 8: Decision Analysis

Scenario development

• Office printing
  – Printing needs stay about the same / decline / explode
  – Paper/ink/drum costs vary
  – Printer fails out of warranty

• Protecting Australia's marine resources
  – Fishing varies in response to 'healthy diet' ads / mercury scare
  – Poaching increases/decreases
  – Coastal fishing farms flourish / are decimated by viral disease
  – New longline fisheries adversely affect wild fish populations
  – International opinion fosters environmental cooperation
  – Chinese/Taiwanese tensions increase in areas near reserve

Page 9: Decision Analysis

How do we get the probabilities?

• Modeling, risk assessment, or prediction

• Subjective assessment
  – Asserting A means you'll pay $1 if not A
  – If the probability of A is P, then a Bayesian agrees to assert A for a fee of $(1−P), and to assert not-A for a fee of $P
  – Different people will have different Ps for a given A

Page 10: Decision Analysis

Rationality

• Your probabilities must make sense
• Coherent if your bets don't expose you to sure loss
  – guaranteed loss no matter what the actual outcome is
• Probabilities larger than one are incoherent
• Dutch books are incoherent
  – Let P(X) denote the price of a promise to pay $1 if X
  – Setting P(Hillary OR Obama) to something other than the sum P(Hillary) + P(Obama) is a Dutch book
  – If P(Hillary OR Obama) is smaller than the sum, someone could make a sure profit by buying it from you and selling you the other two

Page 11: Decision Analysis

St. Petersburg game

• Pot starts at 1¢
• Pot doubles with every coin toss
• Coin is tossed until "tail" appears
• You win whatever's in the pot
• What would you pay to play?

First tail    Winnings ($)
    1              0.01
    2              0.02
    3              0.04
    4              0.08
    5              0.16
    6              0.32
    7              0.64
    8              1.28
    9              2.56
   10              5.12
   11             10.24
   12             20.48
   13             40.96
   14             81.92
   15            163.84
    …                 …
   28      1,342,177.28
   29      2,684,354.56
   30      5,368,709.12
    …                 …

(table generated by: for i = 1 to 100 do say i, tab, tab, 2^(i-1)/100)

Page 12: Decision Analysis

What’s a fair price?

• The expected winnings would be a fair price
  – The chance of ending the game on the kth toss (i.e., the chance of getting k−1 heads in a row) is 1/2^k
  – If the game ends on the kth toss, the winnings would be 2^(k−1) cents

EU = (1/2)·1 + (1/4)·2 + (1/8)·4 + (1/16)·8 + (1/32)·16 + …
   = 1/2 + 1/2 + 1/2 + 1/2 + 1/2 + …
   = ∞ cents

So you should be willing to pay any price to play this game of chance

Page 13: Decision Analysis

St. Petersburg paradox

• The paradox is that nobody’s gonna pay more than a few cents to play

• To see why, and for a good time, call http://www.mathematik.com/Petersburg/Petersburg.html

• No “solution” really resolves the paradox
  – Bankrolls are actually finite
  – Can't buy what's not sold
  – Diminishing marginal utility of money
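The “finite bankrolls” point is easy to make concrete. A small sketch (hypothetical function, in Python) of the expected value when the banker can only cover winnings up to a fixed bankroll:

def truncated_ev(bankroll_cents):
    """Expected winnings (in cents) of a St. Petersburg game whose
    pot can never exceed the banker's bankroll."""
    ev, k = 0.0, 1
    while 2 ** (k - 1) <= bankroll_cents:    # rounds the banker can cover
        ev += (1 / 2 ** k) * 2 ** (k - 1)    # each feasible round adds 1/2 cent
        k += 1
    return ev

print(truncated_ev(100 * 100))           # $100 bankroll: 7 cents
print(truncated_ev(100_000_000 * 100))   # $100 million bankroll: 17 cents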

Page 14: Decision Analysis

Utilities

• Payoffs needn’t be in terms of money

• Probably shouldn't be if the marginal value of different amounts varies widely
  – Compare $10 for a child versus Bill Gates
  – A small profit may be a lot more valuable than the amount of money it takes to cover a small loss

• Use utilities in matrix instead of dollars

Page 15: Decision Analysis

Risk aversion

• EITHER get $50
• OR get $100 if a randomly drawn ball is red, from an urn with half red and half blue balls
• Which prize do you want?

EU is the same, but most people take the sure $50

Page 16: Decision Analysis

Ellsberg Paradox

• Balls can be red, black or yellow (probabilities are R, B, Y)
• A well-mixed urn has 30 red balls and 60 other balls
• Don't know how many are black, how many are yellow

Gamble A: get $100 if draw red          Gamble B: get $100 if draw black

Gamble C: get $100 if red or yellow     Gamble D: get $100 if black or yellow

Preferring A to B says R > B
Preferring D to C says R + Y < B + Y

Page 17: Decision Analysis

Persistent paradox

• Most people prefer A to B (so are saying R > B) but also prefer D to C (saying R < B)
• Doesn't depend on your utility function
• Payoff size is irrelevant
• Not related to risk aversion
• Evidence for ambiguity aversion
  – Can't be accounted for by EU

Page 18: Decision Analysis

Ambiguity aversion

• Balls can be either red or blue
• Two urns, both with 36 balls (the mix is known for one urn but not for the other)
• Get $100 if a randomly drawn ball is red
• Which urn do you wanna draw from?

Page 19: Decision Analysis

Assumptions

• Discrete decisions
• Closed world: Σj Pj = 1
• Analyst can come up with Ai, Sj, Xij, Pj
• Ai and Sj are few in number
• Xij are unidimensional
• Ai not rewarded/punished beyond payoff
• Picking Ai doesn't influence scenarios
• Uncertainty about Xij is negligible

Page 20: Decision Analysis

Why not use EU?

• Clearly doesn’t describe how people act• Needs a lot of information to use• Unsuitable for important unique decisions• Inappropriate if gambler’s ruin is possible• Sometimes Pj are inconsistent

• Getting even subjective Pj can be difficult

Page 21: Decision Analysis

Decisions under uncertainty

Page 22: Decision Analysis

Decisions without probability

• Pareto (some action dominates in all scenarios)

• Maximin (largest minimum payoff)

• Maximax (largest maximum payoff)

• Hurwicz (largest average of min and max payoffs)

• Minimax regret (smallest maximum regret)

• Bayes-Laplace (max EU assuming equiprobable scenarios)
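These criteria are simple enough to compute directly. A sketch (hypothetical, in Python) applied to the payoff matrix used on the following slides:

import numpy as np

# Payoff matrix from the example slides: rows = actions A-D, cols = scenarios 1-4
X = np.array([[10, 5, 15, 5],
              [20, 10, 0, 5],
              [10, 10, 20, 15],
              [0, 5, 60, 25]])
acts = "ABCD"

print(acts[np.argmax(X.min(axis=1))])    # maximin -> C
print(acts[np.argmax(X.max(axis=1))])    # maximax -> D

h = 0.5                                  # Hurwicz index of pessimism
print(acts[np.argmax(h * X.min(axis=1) + (1 - h) * X.max(axis=1))])  # -> D

R = X.max(axis=0) - X                    # regret R_ij = (max_i X_ij) - X_ij
print(acts[np.argmin(R.max(axis=1))])    # minimax regret -> D

print(acts[np.argmax(X.mean(axis=1))])   # Bayes-Laplace -> D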

Page 23: Decision Analysis

Maximin

• Cautious decision maker
• Select Ai that gives the largest minimum payoff (across Sj)
• Important if “gambler's ruin” is possible (e.g. extinction)
• Chooses action C

              Scenario Sj
               1    2    3    4
Action A      10    5   15    5
Action B      20   10    0    5
Action C      10   10   20   15
Action D       0    5   60   25

Page 24: Decision Analysis

Maximax

• Optimistic decision maker
• Loss-tolerant decision maker
• Examine max payoffs across Sj
• Select Ai with the largest of these
• Prefers action D

              Scenario Sj
               1    2    3    4
Action A      10    5   15    5
Action B      20   10    0    5
Action C      10   10   20   15
Action D       0    5   60   25

Page 25: Decision Analysis

Hurwicz

• Compromise of maximin and maximax
• Index of pessimism h, where 0 ≤ h ≤ 1
• Average min and max payoffs, weighted by h and (1−h) respectively
• Select Ai with the highest average
  – If h=1, it's maximin
  – If h=0, it's maximax
• Favors D if h=0.5

              Scenario Sj
               1    2    3    4
Action A      10    5   15    5
Action B      20   10    0    5
Action C      10   10   20   15
Action D       0    5   60   25

Page 26: Decision Analysis

Minimax regret

• Several competing decision makers
• Regret Rij = (max over actions of Xij under Sj) − Xij
• Replace Xij with regret Rij
• Select Ai with the smallest max regret
• Favors action D

Payoff                          Regret
     1    2    3    4                1    2    3    4
A   10    5   15    5           A   10    5   45   20
B   20   10    0    5           B    0    0   60   20
C   10   10   20   15           C   10    0   40   10
D    0    5   60   25           D   20    5    0    0

(the column maxima 20, 10, 60, 25 of the payoff table are the minuends)

Page 27: Decision Analysis

Bayes-Laplace

• Assume all scenarios are equally likely
• Use maximum expected value
• Chris Rock's lottery investments
• Prefers action D

EU(A) = 10*.25 + 5*.25 + 15*.25 + 5*.25 = 8.75
EU(B) = 20*.25 + 10*.25 + 0*.25 + 5*.25 = 8.75
EU(C) = 10*.25 + 10*.25 + 20*.25 + 15*.25 = 13.75
EU(D) = 0*.25 + 5*.25 + 60*.25 + 25*.25 = 22.5

Page 28: Decision Analysis

Pareto

• Choose an action if it can't lose
• Select Ai if its payoff is always biggest (across Sj)
• Chooses action B

              Scenario Sj
               1    2    3    4
Action A      10    5    5    5
Action B      20   15   30   25
Action C      10   10   20   15
Action D       0    5   20   25

Page 29: Decision Analysis

Why not

• Complete lack of knowledge of Pj is rare

• Except for Bayes-Laplace, the criteria depend non-robustly on extreme payoffs

• Intermediate payoffs may be more likely than extremes (especially when extremes don’t differ much)

Page 30: Decision Analysis

Junk science and the precautionary principle

Page 31: Decision Analysis

Junk science (sensu Milloy)

• “Faulty scientific data and analysis used to further a special agenda”
  – Myths and misinformation from scientists, regulators, attorneys, media, and activists seeking money, fame or social change that create alarm about pesticides, global warming, second-hand smoke, radiation, etc.

• Surely not all science is sound
  – Small sample sizes
  – Wishful thinking
  – Overreaching conclusions
  – Sensationalized reporting

• But Milloy has a very narrow definition of science
  – “The scientific method must be followed or you will soon find yourself heading into the junk science briar patch. … The scientific method [is] just the simple and common process of trial and error. A hypothesis is tested until it is credible enough to be labeled a ‘theory’…. Anecdotes aren’t data. … Statistics aren’t science.” (http://www.junkscience.com/JSJ_Course/jsjudocourse/1.html)

Page 32: Decision Analysis

Science is more

• Hypothesis testing
• Objectivity and repeatability
• Specification and clarity
• Coherence into theories
• Promulgation of results
• Full disclosure (biases, uncertainties)
• Deduction and argumentation

Page 33: Decision Analysis

Classical hypothesis testing

• Alpha
  – probability of Type I error
  – i.e., accepting a false statement
  – strictly controlled, usually at the 0.05 level, so false statements don't easily enter the scientific canon

• Beta
  – probability of Type II error
  – i.e., rejecting a true statement
  – one minus the power of the test
  – traditionally left completely uncontrolled

Page 34: Decision Analysis

Decision making

• Balances the two kinds of error
• Weights each kind of error with its cost

• Not anti-scientific, but has a much broader perspective than simple hypothesis testing

One might expect Milloy, who studied law, to be familiar with this idea from jurisprudence, where a Type I error (an innocent person is convicted and the guilty person escapes justice) is considered much worse than the Type II error (acquitting the guilty person).

(Milloy’s criticisms would have merit if he discussed the power of tests that don’t show significance)

Page 35: Decision Analysis

Why is a balance needed?

• Consider arriving at the train station 5 min before, or 5 min after, your train leaves
  – Errors of identical magnitude
  – Grossly different costs

• Decision theory = scientific way to make optimal decisions given risks and costs

Page 36: Decision Analysis

Statistics in the decision context

• Estimation and inference problems can be reexpressed as decision problems

• Costs are determined by the use that will be made of the statistic or inference

• The question isn’t just “whether” anymore, it’s “what are we gonna do”

Page 37: Decision Analysis

It’s not your father’s statistics

• Classical statistics addresses the use of sample information to make inferences which are, for the most part, made without regard to the use to which they'll be put

• Modern (Bayesian) statistics combines sample information with prior information and knowledge of the possible consequences of decisions, in order to make the best decision

Page 38: Decision Analysis

Policy is not science

• Policy making may be sound even if it does not derive specifically from application of the scientific method

• The precautionary principle is a non-quantitative way of acknowledging the differences in costs of the two kinds of errors

Page 39: Decision Analysis

Precautionary principle (PP)

• “Better safe than sorry”

• Has entered the general discourse, international treaties and conventions

• Some managers have asked how to “defend against” the precautionary principle (!)

• Must it mean no risks, no progress?

Page 40: Decision Analysis

Two essential elements

• Uncertainty
  – Without uncertainty, what's at stake would be clear, and negotiation and trades could resolve disputes

• High costs or irreversible effects
  – Without high costs, there'd be no debate. It is these costs that justify shifting the burden of proof.

Page 41: Decision Analysis

Proposed guidelines for using PP

• Transparency
• Proportionality
• Non-discrimination
• Consistency
• Explicit examination of the costs and benefits of action or inaction
• Review of scientific developments

(Science, 12 May 2000)

Page 42: Decision Analysis

But consistency is not essential

• Managers shouldn’t be bound by prior risky decisions

• “Irrational” risk choices are very common
  – e.g., driving cars versus pollutant risks

• Different risks are tolerated differently
  – control
  – scale
  – fairness

Page 43: Decision Analysis

Take-home messages

• Guidelines (a lumper's version)
  – Be explicit about your decisions
  – Revisit the question with more data

• Quantitative risk assessments can overrule PP

• Balancing errors and their costs is essential for sound decisions

Page 44: Decision Analysis

Ranked probabilities

Page 45: Decision Analysis

Intermediate approach

• Knight's division is awkward
  – Rare to know nothing about probabilities
  – But also rare to know them all precisely

• We'd like to have some hybrid approach

Page 46: Decision Analysis

Kmietowicz and Pearman (1981)

• Criteria based on extreme expected payoffs
  – Can be computed if probabilities can be ranked

• Arrange scenarios so that Pj ≥ Pj+1

• Extremize partial averages of payoffs, e.g.

  max over j of (Xi1 + Xi2 + … + Xij) / j

Page 47: Decision Analysis

Difficult example

• Neither action dominates the other
• Min and max are the same, so maximin, maximax and Hurwicz cannot distinguish
• Minimax regret and Bayes-Laplace favor A

              Scenario 1   Scenario 2   Scenario 3   Scenario 4
Action A          7.5         -5           15            9
Action B          5.5          9           -5           15

Page 48: Decision Analysis

If probabilities are ranked

• Maximin chooses B since 3.17 > 1.25
• Maximax slightly prefers A because 7.5 > 7.25
• Hurwicz favors B except under strong optimism
• Minimax regret still favors A somewhat (7.33 > 7)

                     Scenario 1   Scenario 2   Scenario 3   Scenario 4
Action A                 7.5         -5           15            9
Action B                 5.5          9           -5           15

Partial averages A       7.5        1.25         5.83         6.62
Partial averages B       5.5        7.25         3.17         6.12

(scenarios ordered from most likely to least likely)
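A short sketch (hypothetical, in Python) of the partial-averaging computation behind these numbers:

import numpy as np

# Payoffs with scenarios already ordered from most to least likely
X = np.array([[7.5, -5.0, 15.0, 9.0],    # action A
              [5.5,  9.0, -5.0, 15.0]])  # action B

# Partial averages: mean of the first j payoffs, for j = 1..4
partial = np.cumsum(X, axis=1) / np.arange(1, X.shape[1] + 1)
print(partial.round(2))   # [[7.5  1.25 5.83 6.62]
                          #  [5.5  7.25 3.17 6.12]]

# Maximin expected payoff: largest minimum partial average -> B (3.17 > 1.25)
print("AB"[np.argmax(partial.min(axis=1))])
# Maximax expected payoff: largest maximum partial average -> A (7.5 > 7.25)
print("AB"[np.argmax(partial.max(axis=1))])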

Page 49: Decision Analysis

Sometimes appropriate

• Uses available information more fully than criteria based on limiting payoffs

• Better than decisions under uncertainty if
  – several decisions are made (even if actions, scenarios, payoffs and rankings change)
  – the number of scenarios is large (because standard methods ignore intermediate payoffs)

Page 50: Decision Analysis

Extreme expected payoffs

• Focusing on maximin expected payoff is more conservative than traditional maximin

• Focusing on maximax expected payoff is more optimistic than the old maximax

• Focusing on minimax expected regret will have less regret than using minimax regret

Page 51: Decision Analysis

Generality

• Robust to revising scenario ranks
  – Mostly a selected action won't change by inversion of ranks or by the introduction of a new scenario

• Can easily extend to intervals for payoffs
  – Max (min) expected values are found by applying the partial averaging technique to all upper (lower) limits

Page 52: Decision Analysis

When it’s useful

• For difficult payoff matrices

• When you can only rank scenarios

• When multiple decisions must be made

• When the number of scenarios is large

• When facing identical problem a small number of times (up to 10 or so)

Page 53: Decision Analysis

When you shouldn’t use it

• If probability ranks are rank guesses

• If you actually know the risks

• When you face the identical problem often– You should be able to estimate probabilities

Page 54: Decision Analysis

Imprecise probabilities

Page 55: Decision Analysis

Decision making under imprecision

• The state of the world is a random variable s ∈ S
• The payoff (reward) of an action depends on s
• We identify an action a with its reward function fa : S → ℝ
  (i.e., fa is the action and its entire row from the payoff matrix)
• We'd like to choose the decision with the largest expected reward, but without precisely specifying
  1) the probability measure governing the scenarios s ∈ S
  2) the payoff from an action a under scenario s

Page 56: Decision Analysis

Imprecision about probabilities

• Subjective probability
  – Bayesian “rational agents” are compelled to either sell or buy any bet, but rational agents could decline to bet
  – Interval probability for event A is the range between the largest P such that, for a fee of $(1−P), you agree to pay $1 if A doesn't occur, and the smallest Q such that, for a fee of $Q, you agree to pay $1 if A occurs
  – (i.e., the range between the agent's largest buying price and smallest selling price)

• Frequentist probability
  – Incertitude or other uncertainties in simulations may preclude our getting a precise estimate of a frequency


Page 57: Decision Analysis

Comparing actions a and b

Strictly preferred   a > b    Ep(fa) > Ep(fb) for all p ∈ M
Almost preferred     a ≥ b    Ep(fa) ≥ Ep(fb) for all p ∈ M
Indifferent          a ≈ b    Ep(fa) = Ep(fb) for all p ∈ M
Incomparable         a || b   Ep(fa) < Ep(fb) and Eq(fa) > Eq(fb) for some p, q ∈ M

where Ep(f) = Σ p(s) f(s), summing over s ∈ S, and M is the set of possible probability distributions

Page 58: Decision Analysis

E-admissibility

• Fix p ∈ M and, assuming it's the correct probability measure, see which decision emerges as the one that maximizes EU

• The result is then the set of all such decisions for all p ∈ M

Page 59: Decision Analysis

Alternative: maximality

• Maximal decisions are undominated over all p

  a is maximal if there's no b where Ep(fb) ≥ Ep(fa) for all p ∈ M

• Actions cannot be linearly ordered, but only partially ordered

Page 60: Decision Analysis

Another alternative: Γ-maximin

• We could take the decision that maximizes the worst-case expected reward

• Essentially a worst-case optimization

• Generalizes two criteria from traditional theory
  – Maximize expected utility
  – Maximin

Page 61: Decision Analysis

Interval dominance

• If E(fa) > E(fb) then action b is inadmissible because a interval-dominates b

• Admissible actions are those that are not inadmissible to any other action

_

inadmissible

dominant

overlap

Page 62: Decision Analysis

Several IP decision criteria, nested from tightest to loosest:

  Γ-maximax ⊆ E-admissible ⊆ maximal ⊆ interval dominance
  Γ-maximin ⊆ maximal ⊆ interval dominance

Page 63: Decision Analysis

Example

• Suppose we are betting on a coin toss
  – Only know the probability of heads is in [0.28, 0.7]
  – Want to decide among seven available gambles

1: Pays 4 for heads, pays 0 for tails
2: Pays 0 for heads, pays 4 for tails
3: Pays 3 for heads, pays 2 for tails
4: Pays ½ for heads, pays 3 for tails
5: Pays 2.35 for heads, pays 2.35 for tails
6: Pays 4.1 for heads, pays −0.3 for tails
7: Pays 0.1 for heads, pays 0.1 for tails

(after Troffaes 2004)

Page 64: Decision Analysis

Problem setup

p(H) ∈ [0.28, 0.7]     p(T) ∈ [0.3, 0.72]

f1(H) = 4       f1(T) = 0
f2(H) = 0       f2(T) = 4
f3(H) = 3       f3(T) = 2
f4(H) = 0.5     f4(T) = 3
f5(H) = 2.35    f5(T) = 2.35
f6(H) = 4.1     f6(T) = −0.3
f7(H) = 0.1     f7(T) = 0.1

Page 65: Decision Analysis

M

• M is all Bernoulli probability distributions (mass at only two points, H and T) such that 0.28 ≤ p(H) ≤ 0.7

• It's a (one-dimensional) space of probability measures

[Figure: the unit interval of possible values of p, with M the segment from 0.28 to 0.7]

Page 66: Decision Analysis

[Figure: expected reward Ep(fi) for each of the seven actions, plotted as a function of p(H) from 0.2 to 0.8]
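The figure and the criteria on the next slides can be reproduced by brute force over a grid of p(H). A sketch (hypothetical, in Python; gamble 6's tail payoff read as −0.3):

import numpy as np

# Gambles as (payoff for heads, payoff for tails)
F = np.array([[4, 0], [0, 4], [3, 2], [0.5, 3],
              [2.35, 2.35], [4.1, -0.3], [0.1, 0.1]])
ps = np.linspace(0.28, 0.7, 4201)                       # grid over M
E = np.outer(ps, F[:, 0]) + np.outer(1 - ps, F[:, 1])   # E[p, i] = Ep(fi)

# E-admissible: best action for at least one p in M
e_adm = sorted({int(i) + 1 for i in np.argmax(E, axis=1)})
# Gamma-maximin / Gamma-maximax: best worst-case / best best-case expectation
g_maximin = int(np.argmax(E.min(axis=0))) + 1
g_maximax = int(np.argmax(E.max(axis=0))) + 1
# Maximal: no other gamble is at least as good for every p (and better for some)
maximal = [i + 1 for i in range(7)
           if not any((E[:, j] >= E[:, i]).all() and (E[:, j] > E[:, i]).any()
                      for j in range(7))]
# Interval dominance: no other gamble's lower expectation beats my upper one
int_dom = [i + 1 for i in range(7)
           if not any(E[:, j].min() > E[:, i].max() for j in range(7))]

print(e_adm, maximal, g_maximin, g_maximax, int_dom)
# [1, 2, 3] [1, 2, 3, 5] 5 2 [1, 2, 3, 5, 6]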

Page 67: Decision Analysis

E-admissibility

Probability              Preference
p(H) < 2/5               2
p(H) = 2/5               2, 3 (indifferent)
2/5 < p(H) < 2/3         3
p(H) = 2/3               1, 3 (indifferent)
2/3 < p(H)               1

Page 68: Decision Analysis

Criteria yield different answers

Γ-maximax: {2}
E-admissible: {1, 2, 3}
maximal: {1, 2, 3, 5}
Γ-maximin: {5}
interval dominance: {1, 2, 3, 5, 6}

Page 69: Decision Analysis

So many answers

• Different criteria are useful in different settings

• The more precise the input, the tighter the outputs

• The Γ criteria usually yield only one decision

• The Γ criteria are not good if many sequential decisions must be made

• Some argue that E-admissibility is best overall

• Maximality is close to E-admissibility, but might be easier to compute for large problems

Page 70: Decision Analysis

Traditional Bayesian answer

• Decision allows only one action, unless we’re indifferent between actions

• Action 3 (or possibly 2, or even 1); different people would get different answers

• Depends on which prior we use for p(H)

• Typically expresses no doubt about the decision that's made

Page 71: Decision Analysis

IP versus traditional approaches

• Decisions under IP allow indecision when your uncertainty entails it

• Bayes always produces a single decision (up to indifference), no matter how little information may be available

• IP unifies the two poles of Knight’s division into a continuum

Page 72: Decision Analysis

Comparison to Bayesian approach

• Axioms identical except IP doesn’t use completeness

• Bayesian rationality implies not only avoidance of sure loss & coherence, but also the idea that an agent must agree to buy or sell any bet at one price

• “Uncertainty of probability” is meaningful, and it’s operationalized as the difference between the max buying price and min selling price

• If you know all the probabilities (and utilities) perfectly, then IP reduces to Bayes

Page 73: Decision Analysis

Why Bayes fares poorly

• Bayesian approaches don’t distinguish ignorance from equiprobability

• Neuroimaging and clinical psychology shows humans strongly distinguish uncertainty from risk– Most humans regularly and strongly deviate from Bayes– Hsu (2005) reported that people who have brain lesions

associated with the site believed to handle uncertainty behave according to the Bayesian normative rules

• Bayesians are too sure of themselves (e.g., Clippy)

Page 74: Decision Analysis

IP does groups

• Bayesian theory does not work for groups
  – Rationality inconsistent with democratic process

• Scientific decisions are not ‘personal’
  – Teams, agencies, collaborators, companies, clients
  – Reviewers, peers

• IP does generalize to group decisions
  – Can be rational and coherent if indecision is admitted occasionally

Page 75: Decision Analysis

Take-home messages

• Antiscientific (or at least silly) to say you know more than you do

• Bayesian decision making always yields one answer, even if this is not really tenable

• IP tells you when you need to be careful and reserve judgment

Page 76: Decision Analysis

Synopsis and conclusions

Page 77: Decision Analysis

Decisions under risk

How?
– Payoffs and probabilities are known
– Select the decision that maximizes expected utility

Why?
– If you make many similar decisions, then you'll perform best in the long run using this rule

Why not?
– Needs a lot of information to use
– Unsuitable for important unique decisions
– Inappropriate if gambler's ruin is possible
– Getting subjective probabilities can be difficult
– Sometimes probabilities are inconsistent

Page 78: Decision Analysis

Bayesian (personalist) decisions

• Not a good description of how people act
  – Paradoxes (St. Petersburg, Ellsberg, Allais, etc.)

• No such thing as a ‘group decision’
  – Review panels, juries, teams, corporations
  – Cannot maintain rationality in this context
  – Unless run as a constant dictatorship

Page 79: Decision Analysis

Multi-criteria decision analysis

• Used when there are multiple, competing goals
  – E.g., USFS' multiple use (biodiversity, aesthetics, habitat, timber, recreation, …)
  – No universal solution; can only rank in one dimension

• Group decision based on subjective assessments
• Organizational help with conflicting evaluations
  – Identifying the conflicts
  – Deriving schemes for a transparent compromise

• Several approaches
  – Analytic Hierarchy Process (AHP); Evidential Reasoning; Weight of Evidence (WoE)

Page 80: Decision Analysis

Analytic Hierarchy Process

• Identify possible actions
  – buy house in Stony Brook / PJ / rent

• Identify and rank significant attributes
  – location > price > school > near bus

• For each attribute, and every pair of actions, specify a preference

• Evaluate the consistency (transitivity) of the matrix of preferences by eigenanalysis

• Calculate a score for each alternative and rank them

• Subject to rank reversals (e.g., without Perot, Bush beat Clinton)
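A minimal sketch (hypothetical numbers, in Python) of the eigenanalysis step: scores come from the principal eigenvector of a reciprocal pairwise-comparison matrix, and consistency is judged by how much the principal eigenvalue exceeds the matrix size:

import numpy as np

# Hypothetical pairwise preferences among 3 alternatives for one attribute:
# A[i, j] = how strongly alternative i is preferred to j (A[j, i] = 1/A[i, j])
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = int(np.argmax(vals.real))             # index of the principal eigenvalue
w = np.abs(vecs[:, k].real)
print((w / w.sum()).round(3))             # priority scores, e.g. [0.648 0.23 0.122]

n = A.shape[0]
CI = (vals.real[k] - n) / (n - 1)         # consistency index: 0 if perfectly transitive
print(round(CI, 4))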

Page 81: Decision Analysis

Decision under uncertainty

How?
– Probabilities are not known
– Use a criterion corresponding to your attitude about risk (Pareto, maximin, maximax, Hurwicz, minimax regret, Bayes-Laplace, etc.)
– Select an optimal decision under this criterion

Why?
– The answer reflects your attitudes about risk

Why not?
– Complete ignorance about probabilities is rare
– Results depend on extreme payoffs, except for Bayes-Laplace
– Intermediate payoffs may be more likely than extremes (especially when extremes don't differ much)

Page 82: Decision Analysis

Why IP?

• Uses all available information
• Doesn't require unjustified assumptions
• Tells you when you don't know
• Conforms with human psychology
• Can make rational group decisions
• Better in uncertainty-critical situations
  – Gains and losses heavily depend on unknowns
  – Nuclear risk, endangered species, etc.

Page 83: Decision Analysis

Policy juggernauts

• “Precautionary principle” is a mantra intoned by lefty environmentalists

• “Junk science” is an epithet used by right-wing corporatists

• Yet claims made with both are important and should be taken seriously, which can be done via risk assessment and decision analysis

Page 84: Decision Analysis

Underpinning for regulation

• Narrow definition of science?
  – Hypothesis testing is clearly insufficient
  – Need to acknowledge differential costs

• Decision theory?
  – Decision theory is only optimal for a unitary decision maker (group decisions are much more tenuous)
  – Gaming the decision is rampant

• Maybe environmental regulation should be modeled on game theory instead of decision theory

Page 85: Decision Analysis

Game-theoretic strategies

• Building trust
  – explicitness
  – reciprocity
  – inclusion of all stakeholders

• Checking
  – monitoring
  – adaptive management
  – renewal licensing

• Multiplicity
  – sovereignty, subsidiarity
  – some countries take a risk (GMOs, thalidomide)

Page 86: Decision Analysis

References

• Cosmides, L., and J. Tooby. 1996. Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty. Cognition 58: 1-73.
• Foster, K.R., P. Vecchia, and M.H. Repacholi. 2000. Science and the precautionary principle. Science 288 (12 May): 979-981.
• Hsu, M., M. Bhatt, R. Adolphs, D. Tranel, and C.F. Camerer. 2005. Neural systems responding to degrees of uncertainty in human decision-making. Science 310: 1680-1683.
• Kikuti, D., F.G. Cozman and C.P. de Campos. 2005. Partially ordered preferences in decision trees: computing strategies with imprecision in probabilities. Multidisciplinary IJCAI-05 Workshop on Advances in Preference Handling, R. Brafman and U. Junker (eds.), pp. 118-123. http://wikix.ilog.fr/wiki/pub/Preference05/WebHome/P40.pdf
• Kmietowicz, Z.W. and A.D. Pearman. 1976. Decision theory and incomplete knowledge: maximum variance. Journal of Management Studies 13: 164-174.
• Kmietowicz, Z.W. and A.D. Pearman. 1981. Decision Theory and Incomplete Knowledge. Gower, Hampshire, England.
• Knight, F.H. 1921. Risk, Uncertainty and Profit. L.S.E., London.
• Milloy, S. http://www.junkscience.com/JSJ_Course/jsjudocourse/1.html
• Plous, S. 1993. The Psychology of Judgment and Decision Making. McGraw-Hill.
• Sewell, M. "Expected utility theory". http://expected-utility-theory.behaviouralfinance.net/
• Troffaes, M. 2004. Decision making with imprecise probabilities: a short review. The SIPTA Newsletter 2(1): 4-7.
• Walley, P. 1991. Statistical Reasoning with Imprecise Probabilities. Chapman and Hall, London.

Page 87: Decision Analysis

Exercises

1. If you have a bankroll of $100, how many tosses could you allow in a finite version of the St. Petersburg game (if you were compelled to pay the winnings from this bankroll)? What are the expected winnings for a game limited to this many tosses? How much do you think your friends might actually pay to play this game? What are the numbers if the bankroll is $100 million?

2. Demographic simulations of an endangered marine turtle suggest that conservation strategy “Beachhead” will yield between 3 and 4.25 extra animals per unit area if it’s a normal year but only 3 if it’s a warm year, and that strategy “Longliner” will produce between 1 and 3 extra animals in a normal year and between 2 and 5 extra animals in a warm year. Assume that the probability of next year being warm is between 0.5 and 0.75. Graph the reward functions for these two strategies as a function of the probability of next year being warm. Which strategy would conservationists prefer? Why?

3. How are Γ-maximin and maximin expected payoff related?

Page 88: Decision Analysis

End

Page 89: Decision Analysis
Page 90: Decision Analysis

Dutch book example

Horse           Offered odds       Probability
Danger Spree    Evens              0.5
Windtower       3 to 1 against     0.25
Shoeless Bob    4 to 1 against     0.2
                                   0.95 (total)

A gambler could lock in a profit of 10, by betting 100, 50 and 40 on the three horses respectively

http://en.wikipedia.org/wiki/Dutch_book
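A quick arithmetic check of the lock-in (a sketch in Python; stakes as on the slide):

# Odds as (profit, stake); e.g. "3 to 1 against" returns 3 + 1 per unit staked
odds = {"Danger Spree": (1, 1), "Windtower": (3, 1), "Shoeless Bob": (4, 1)}
bets = {"Danger Spree": 100, "Windtower": 50, "Shoeless Bob": 40}

total_staked = sum(bets.values())                      # 190
for horse, (profit, stake) in odds.items():
    payout = bets[horse] * (profit + stake) / stake    # returned if this horse wins
    print(horse, payout - total_staked)                # profit of 10, whoever wins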

Page 91: Decision Analysis

Knight’s dichotomy bridged

Decisions under risk
– Probabilities known
– Maximize expected utility

Decisions under uncertainty
– Probabilities unknown
– Several possible strategies

Decisions under imprecision
– Probabilities somewhat known
– E-admissibility, partial averages, et al.

Page 92: Decision Analysis

Bayesians

• Updating with Bayes' rule
• Subjective probabilities (defined by bets)
• Decision analysis context

• Distribution even for a “fixed” parameter
• Allows natural confidence statements
• Uses all information, including priors

Page 93: Decision Analysis

Prior information

• Suppose I claim to be able to distinguish music by Haydn from music by Mozart

• What if the claim were that I can predict the flips of a coin taken from your pocket?

• Prior knowledge conditions us to believe the former claim but not the latter, even if the latter were buttressed by sample data of 10 flips at a significance level of 1/2^10

Page 94: Decision Analysis

Decision theory paradoxes

• St. Petersburg Paradox
• Ellsberg Paradox
• Allais Paradox
• Borel Paradox
• Rumsfeld's quandary (unknown unknowns)
• Open worldness
• Intransitivity