Bayesian Statistics and Decision Analysis
Session 10
• Using Statistics
• Bayes’ Theorem and Discrete Probability Models
• Bayes’ Theorem and Continuous Probability Distributions
• The Evaluation of Subjective Probabilities
• Decision Analysis: An Overview
• Decision Trees
• Handling Additional Information Using Bayes’ Theorem
• Utility
• The Value of Information
• Using the Computer
• Summary and Review of Terms
10-1 Bayesian Statistics and Decision Analysis
Classical inference: Data → Statistical conclusion
Bayesian inference: Data + Prior information → Statistical conclusion

Bayesian statistical analysis incorporates a prior probability distribution and the likelihood of the observed data to determine a posterior probability distribution of events.
Bayesian and Classical Statistics
• A medical test for a rare disease (affecting 0.1% of the population, so P(I) = 0.001) is imperfect:
– When administered to an ill person, the test will indicate the disease with probability 0.92: P(Z|I) = 0.92, so P(Z̄|I) = 0.08.
• The event (Z̄ ∩ I) is a false negative.
– When administered to a person who is not ill, the test will erroneously give a positive result with probability 0.04: P(Z|Ī) = 0.04, so P(Z̄|Ī) = 0.96.
• The event (Z ∩ Ī) is a false positive.
Bayes’ Theorem: Example 10.1 (1)
P(I) = 0.001    P(Ī) = 0.999
P(Z|I) = 0.92   P(Z|Ī) = 0.04

P(I|Z) = P(I ∩ Z) / P(Z)
       = P(Z|I)P(I) / [P(Z|I)P(I) + P(Z|Ī)P(Ī)]
       = (0.92)(0.001) / [(0.92)(0.001) + (0.04)(0.999)]
       = 0.00092 / (0.00092 + 0.03996)
       = 0.00092 / 0.04088
       = 0.0225
Example 10.1: Applying Bayes’ Theorem
Prior probabilities: P(I) = 0.001, P(Ī) = 0.999
Conditional probabilities: P(Z|I) = 0.92, P(Z̄|I) = 0.08, P(Z|Ī) = 0.04, P(Z̄|Ī) = 0.96
Joint probabilities:
P(Z ∩ I) = (0.001)(0.92) = 0.00092
P(Z̄ ∩ I) = (0.001)(0.08) = 0.00008
P(Z ∩ Ī) = (0.999)(0.04) = 0.03996
P(Z̄ ∩ Ī) = (0.999)(0.96) = 0.95904
Example 10.1: Decision Tree
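The arithmetic in Example 10.1 is easy to mis-track by hand. As a minimal sketch (variable names are my own, not from the slides), the same Bayes' theorem computation in Python:

```python
# Bayes' theorem for the rare-disease test of Example 10.1.
p_I = 0.001            # prior: probability a person is ill
p_Z_given_I = 0.92     # test positive given ill
p_Z_given_notI = 0.04  # false-positive rate: test positive given not ill

# Total probability of a positive test (denominator of Bayes' theorem).
p_Z = p_Z_given_I * p_I + p_Z_given_notI * (1 - p_I)
# Posterior probability of illness given a positive test.
p_I_given_Z = p_Z_given_I * p_I / p_Z

print(round(p_Z, 5))          # → 0.04088
print(round(p_I_given_Z, 4))  # → 0.0225
```

Even with a positive result, the posterior probability of illness is only about 2.25%, because the disease is so rare.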
Bayes’ theorem for a discrete random variable:

P(θ|x) = P(x|θ)P(θ) / Σᵢ P(x|θᵢ)P(θᵢ)

where θ is an unknown population parameter to be estimated from the data. The summation in the denominator is over all possible values of the parameter of interest, θᵢ, and x stands for the observed data set.

The likelihood function is the set of conditional probabilities P(x|θ) for given data x, considered as a function of an unknown population parameter, θ.
10-2 Bayes’ Theorem and Discrete Probability Models
Prior distribution of market share S:

S     P(S)
0.1   0.05
0.2   0.15
0.3   0.20
0.4   0.30
0.5   0.20
0.6   0.10
      1.00

Likelihood of x = 4 successes in n = 20 trials (binomial with p = S):

S     P(X = 4)
0.1   0.0898
0.2   0.2182
0.3   0.1304
0.4   0.0350
0.5   0.0046
0.6   0.0003
Example 10.2: Prior Distribution and Likelihoods of 4 Successes in 20 Trials
Prior distribution, likelihood, and posterior distribution:

S     P(S)   P(x|S)   P(S)P(x|S)   P(S|x)
0.1   0.05   0.0898   0.00449      0.06007
0.2   0.15   0.2182   0.03273      0.43786
0.3   0.20   0.1304   0.02608      0.34890
0.4   0.30   0.0350   0.01050      0.14047
0.5   0.20   0.0046   0.00092      0.01230
0.6   0.10   0.0003   0.00003      0.00040
      1.00            0.07475      1.00000

The values S = 0.2, 0.3, 0.4 form a 93% credible set (0.43786 + 0.34890 + 0.14047 ≈ 0.93).
Example 10.2: Prior Probabilities, Likelihoods, and Posterior Probabilities
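The posterior column above can be reproduced in a few lines. A sketch assuming only the slide's prior and a binomial likelihood (`binom_pmf` is a hand-rolled helper, not from the slides):

```python
from math import comb

# Discrete Bayes update for Example 10.2 (market share S).
S     = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
prior = [0.05, 0.15, 0.20, 0.30, 0.20, 0.10]

def binom_pmf(x, n, p):
    """Binomial probability of x successes in n trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

likelihood = [binom_pmf(4, 20, s) for s in S]            # 4 successes in 20 trials
joint = [pr * lk for pr, lk in zip(prior, likelihood)]   # P(S)P(x|S)
posterior = [j / sum(joint) for j in joint]              # normalize: P(S|x)

for s, post in zip(S, posterior):
    print(f"S={s}: P(S|x)={post:.5f}")
```

The printed values match the posterior column of the table (0.06007, 0.43786, …) to rounding.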
[Figure: prior distribution of market share (P(S) vs. S) and posterior distribution of market share (P(S|x) vs. S), for S = 0.1 to 0.6.]
Example 10.2: Prior and Posterior Distributions
Prior distribution (the posterior from the first sample):

S     P(S)
0.1   0.06007
0.2   0.43786
0.3   0.34890
0.4   0.14047
0.5   0.01230
0.6   0.00040
      1.00000

Likelihood of x = 3 successes in n = 16 trials (binomial with p = S):

S     P(X = 3)
0.1   0.1423
0.2   0.2463
0.3   0.1465
0.4   0.0468
0.5   0.0085
0.6   0.0008
Example 10.2: A Second Sampling with 3 Successes in 16 Trials
Prior distribution, likelihood, and posterior distribution:

S     P(S)      P(x|S)   P(S)P(x|S)   P(S|x)
0.1   0.06007   0.1423   0.0085480    0.049074
0.2   0.43786   0.2463   0.1078449    0.619138
0.3   0.34890   0.1465   0.0511138    0.293444
0.4   0.14047   0.0468   0.0065740    0.037741
0.5   0.01230   0.0085   0.0001046    0.000601
0.6   0.00040   0.0008   0.0000003    0.000002
      1.00000            0.1741856    1.000000

The values S = 0.2 and 0.3 form a 91% credible set.
Example 10.2: Incorporating a Second Sample
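Sequential updating is mechanical: the posterior from the first sample becomes the prior for the second. A sketch of both updates (helper names are my own):

```python
from math import comb

def binom_pmf(x, n, p):
    """Binomial probability of x successes in n trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def update(prior, S, x, n):
    """One discrete Bayes update: posterior proportional to prior times likelihood."""
    joint = [pr * binom_pmf(x, n, s) for pr, s in zip(prior, S)]
    total = sum(joint)
    return [j / total for j in joint]

S = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
prior = [0.05, 0.15, 0.20, 0.30, 0.20, 0.10]

post1 = update(prior, S, x=4, n=20)   # first sample: 4 successes in 20 trials
post2 = update(post1, S, x=3, n=16)   # second sample: the posterior is the new prior
print([round(p, 4) for p in post2])
```

The second posterior concentrates further around S = 0.2, matching the slide's table.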
Application of Bayes’ Theorem using Excel. The spreadsheet uses the BINOMDIST function in Excel to calculate the likelihood probabilities. The posterior probabilities are calculated using a formula based on Bayes’ Theorem for discrete random variables.
PRIOR DISTRIBUTION:
S      0.1          0.2          0.3          0.4         0.5          0.6
P(S)   0.05         0.15         0.2          0.3         0.2          0.1

LIKELIHOOD OF 4 OCCURRENCES IN 20 TRIALS GIVEN THE VALUES OF S:
S      0.1          0.2          0.3          0.4         0.5          0.6
P(x|S) 0.089778828  0.218199402  0.130420974  0.03499079  0.004620552  0.000269686

POSTERIOR DISTRIBUTION AFTER THE FIRST SAMPLE:
S      0.1          0.2          0.3          0.4         0.5          0.6
P(S|x) 0.060051633  0.437850349  0.348946078  0.14042871  0.012362456  0.000360778

LIKELIHOOD OF 3 OCCURRENCES IN 16 TRIALS GIVEN THE VALUES OF S:
S      0.1          0.2          0.3          0.4         0.5          0.6
P(x|S) 0.142344486  0.246290605  0.146496184  0.04680953  0.008544922  0.000811749

POSTERIOR DISTRIBUTION AFTER THE SECOND SAMPLE:
S      0.1          0.2          0.3          0.4         0.5          0.6
P(S|x) 0.049074356  0.619102674  0.293476795  0.03773804  0.00060646   1.68E-06
Example 10.2: Using Excel
We define f(θ) as the prior probability density of the parameter θ. We define f(x|θ) as the conditional density of the data x, given the value of θ. This is the likelihood function.

Bayes' theorem for continuous distributions:

f(θ|x) = f(x|θ)f(θ) / ∫ f(x|θ)f(θ) dθ

where the denominator is the total area under f(x|θ)f(θ), viewed as a function of θ.
10-3 Bayes’ Theorem and Continuous Probability Distributions
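The integral in the denominator rarely has a closed form, so a common numerical tactic is grid approximation. This sketch uses an assumed example (a standard-normal prior on θ and one observation x = 1 with σ = 1, not from the slides), chosen because the conjugate result — a posterior mean of exactly 0.5 — lets the approximation be checked:

```python
from math import exp, pi, sqrt

def normal_pdf(z, mean, sd):
    """Density of a normal distribution at z."""
    return exp(-0.5 * ((z - mean) / sd) ** 2) / (sd * sqrt(2 * pi))

x = 1.0
d = 0.001                                         # grid spacing
thetas = [i * d for i in range(-6000, 6001)]      # grid over theta: [-6, 6]

# Unnormalized posterior f(x|theta) f(theta) on the grid.
unnorm = [normal_pdf(x, t, 1.0) * normal_pdf(t, 0.0, 1.0) for t in thetas]
area = sum(unnorm) * d                            # approximates the integral
posterior = [u / area for u in unnorm]            # normalized posterior density

post_mean = sum(t * p for t, p in zip(thetas, posterior)) * d
print(round(post_mean, 3))                        # → 0.5
```

Conjugate normal-normal updating gives a posterior mean of (0 + 1)/2 = 0.5 here, which the grid reproduces.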
• Normal population with unknown mean μ and known standard deviation σ.
• The population mean μ is a random variable with a normal (prior) distribution with mean M′ and standard deviation σ′.
• Draw a sample of size n with sample mean X̄.

The posterior mean and variance of the normal distribution of the population mean μ:

M″ = [(1/σ′²)M′ + (n/σ²)X̄] / [(1/σ′²) + (n/σ²)]

σ″² = 1 / [(1/σ′²) + (n/σ²)]
The Normal Probability Model
M′ = 15, σ′ = 8, n = 10, X̄ = 11.54, s = 6.84 (used for σ)

M″ = [(1/8²)(15) + (10/6.84²)(11.54)] / [(1/8²) + (10/6.84²)] = 11.77

σ″ = √{1 / [(1/8²) + (10/6.84²)]} = 2.077

95% credible set for μ: M″ ± 1.96σ″ = 11.77 ± 1.96(2.077) = [7.699, 15.841]
The Normal Probability Model: Example 10.3
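A sketch of the conjugate normal update for Example 10.3, working in precisions (reciprocal variances); small differences from the slide's rounded 11.77 and 2.077 come from carrying full precision:

```python
from math import sqrt

# Prior on the population mean and the sample summary from Example 10.3.
M_prior, sd_prior = 15.0, 8.0
n, xbar, s = 10, 11.54, 6.84   # sample standard deviation treated as known sigma

w_prior = 1 / sd_prior**2      # precision contributed by the prior
w_data = n / s**2              # precision contributed by the sample mean

# Posterior mean is a precision-weighted average; posterior variance is the
# reciprocal of the total precision.
M_post = (w_prior * M_prior + w_data * xbar) / (w_prior + w_data)
sd_post = sqrt(1 / (w_prior + w_data))

lo, hi = M_post - 1.96 * sd_post, M_post + 1.96 * sd_post
print(round(M_post, 2), round(sd_post, 2))   # → 11.78 2.09
print(round(lo, 2), round(hi, 2))            # 95% credible set
```

The posterior mean sits between the prior mean (15) and the sample mean (11.54), pulled toward the data because the sample's precision exceeds the prior's.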
[Figure: density curves of the prior distribution (centered at 15), the likelihood (centered at the sample mean 11.54), and the posterior distribution (centered at 11.77).]
Example 10.3
• Based on the normal distribution with μ = 15 and σ = 8:
– 95% of a normal distribution is within 2 standard deviations of the mean: P(−1 < x < 31) = 0.95
– 68% of a normal distribution is within 1 standard deviation of the mean: P(7 < x < 23) = 0.68
10-4 The Evaluation of Subjective Probabilities
• Elements of a decision analysis:
– Actions: anything the decision maker can do at any time
– Chance occurrences: possible outcomes (sample space), with probabilities associated with each chance occurrence
– Final outcomes: the payoff, reward, or loss associated with an action
– Additional information: allows the decision maker to reevaluate the probabilities and the possible rewards and losses
– Decision: a course of action to take in each possible situation
10-5 Decision Analysis
[Decision tree: the decision is Market vs. Do not market; the chance occurrence is product successful (P = 0.75) or product unsuccessful (P = 0.25); the final outcomes are $100,000, −$20,000, and $0.]
Decision Tree: New-Product Introduction
                            Product is
Action                      Successful    Not Successful
Market the product          $100,000      −$20,000
Do not market the product   $0            $0
The expected value of X, denoted E(X):

E(X) = Σ x P(x)  (summed over all x)

E(Outcome) = (100,000)(0.75) + (−20,000)(0.25) = 75,000 − 5,000 = $70,000
Payoff Table and Expected Values of Decisions: New-Product Introduction
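The expected-value comparison above can be sketched directly (labels are my own):

```python
# Expected payoff of each action in the new-product decision.
payoffs = {"success": 100_000, "failure": -20_000}
probs = {"success": 0.75, "failure": 0.25}

e_market = sum(probs[o] * payoffs[o] for o in payoffs)   # 75,000 - 5,000
e_no_market = 0.0                                        # do not market: $0 either way
best = "market" if e_market > e_no_market else "do not market"
print(e_market, best)   # → 70000.0 market
```

Since $70,000 > $0, the expected-value criterion says to market the product.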
[Solved tree: Market has expected payoff (0.75)($100,000) + (0.25)(−$20,000) = $70,000; Do not market has expected payoff $0. The nonoptimal decision branch (Do not market) is clipped.]
Clipping the Nonoptimal Decision Branches
Solution to the New-Product Introduction Decision Tree
Outcome                Payoff      Probability   xP(x)
Extremely successful   $150,000    0.10          15,000
Very successful        120,000     0.20          24,000
Successful             100,000     0.30          30,000
Somewhat successful    80,000      0.10          8,000
Barely successful      40,000      0.10          4,000
Break even             0           0.10          0
Unsuccessful           −20,000     0.05          −1,000
Disastrous             −50,000     0.05          −2,500

Expected payoff: $77,500
New-Product Introduction: Extended-Possibilities
[Extended-possibilities decision tree: Market leads to the eight chance outcomes above (payoffs from −$50,000 to $150,000, with probabilities 0.05 to 0.30), for an expected payoff of $77,500; Do not market yields $0, so that branch is clipped.]
New-Product Introduction: Extended-Possibilities Decision Tree
[Decision tree for Example 10.4: the first decision is Lease vs. Not Lease; under Lease, the chance event Promote (Pr = 0.5) vs. Not Promote; the terminal payoffs are $680,000, $700,000, $740,000, $750,000, $780,000, $800,000, $900,000, and $1,000,000, with branch probabilities 0.05, 0.1, 0.15, 0.3, 0.4, 0.6, and 0.9.]
Example 10.4: Decision Tree
[Solved tree for Example 10.4: the expected payoffs at the chance nodes are $753,000, $716,000, $425,000, and $700,000; combining the Promote branch (Pr = 0.5) with the other branch gives $425,000 + (0.5)($716,000) = $783,000.]
Example 10.4: Solution
[New-product decision tree with testing: the first decision is Test vs. Not test. If Not test: Market (successful, Pr = 0.75 → $100,000; failure, Pr = 0.25 → −$20,000) or Do not market ($0). If Test: the test indicates success or failure; after either indication the firm may Market (successful → $95,000; failure → −$25,000, payoffs net of the test cost) or Do not market (−$5,000).]

New-Product Decision Tree with Testing
10-6 Handling Additional Information Using Bayes’ Theorem
P(S) = 0.75    P(IS|S) = 0.90    P(IF|S) = 0.10
P(F) = 0.25    P(IS|F) = 0.15    P(IF|F) = 0.85

P(IS) = P(IS|S)P(S) + P(IS|F)P(F) = (0.9)(0.75) + (0.15)(0.25) = 0.7125
P(IF) = P(IF|S)P(S) + P(IF|F)P(F) = (0.1)(0.75) + (0.85)(0.25) = 0.2875

P(S|IS) = P(IS|S)P(S) / [P(IS|S)P(S) + P(IS|F)P(F)]
        = (0.9)(0.75) / [(0.9)(0.75) + (0.15)(0.25)] = 0.9474
P(F|IS) = 1 − P(S|IS) = 1 − 0.9474 = 0.0526

P(S|IF) = P(IF|S)P(S) / [P(IF|S)P(S) + P(IF|F)P(F)]
        = (0.1)(0.75) / [(0.1)(0.75) + (0.85)(0.25)] = 0.2609
P(F|IF) = 1 − P(S|IF) = 1 − 0.2609 = 0.7391
Applying Bayes’ Theorem
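The pre-posterior probabilities above follow the same mechanical pattern; a sketch (variable names are my own):

```python
# Probabilities for the new-product test: S/F = product success/failure,
# IS/IF = test indicates success/failure.
p_S, p_F = 0.75, 0.25
p_IS_S, p_IF_S = 0.90, 0.10   # test indication given a successful product
p_IS_F, p_IF_F = 0.15, 0.85   # test indication given a failed product

# Total probabilities of each test indication.
p_IS = p_IS_S * p_S + p_IS_F * p_F        # 0.7125
p_IF = p_IF_S * p_S + p_IF_F * p_F        # 0.2875

# Posterior probability of success given each indication (Bayes' theorem).
p_S_IS = p_IS_S * p_S / p_IS              # 0.9474
p_S_IF = p_IF_S * p_S / p_IF              # 0.2609
print(round(p_IS, 4), round(p_S_IS, 4), round(p_S_IF, 4))
```

A favorable test raises the success probability from 0.75 to about 0.95; an unfavorable one drops it to about 0.26.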
[Solved tree with testing: given the test indicates success (P(IS) = 0.7125), marketing has expected payoff $86,866, using P(S|IS) = 0.9474 and P(F|IS) = 0.0526 on payoffs $95,000 and −$25,000. Given the test indicates failure (P(IF) = 0.2875), marketing has expected payoff $6,308, using P(S|IF) = 0.2609 and P(F|IF) = 0.7391. The Test branch is worth $66,003, while Not test followed by Market is worth $70,000, so the optimal decision is not to test and to market the product.]
Expected Payoffs and Solution
Prior information:

              Level of Economic
Profit        Activity             Probability
$3 million    Low                  0.20
$6 million    Medium               0.50
$12 million   High                 0.30

Reliability of the consulting firm:

Future State    Consultants' Conclusion
of Economy      High    Medium    Low
Low             0.05    0.05      0.90
Medium          0.15    0.80      0.05
High            0.85    0.10      0.05

Consultants say "Low":

Event     Prior   Conditional   Joint    Posterior
Low       0.20    0.90          0.180    0.818
Medium    0.50    0.05          0.025    0.114
High      0.30    0.05          0.015    0.068
P(Consultants say "Low")        0.220    1.000
Example 10.5: Payoffs and Probabilities
Consultants say "Medium":

Event     Prior   Conditional   Joint    Posterior
Low       0.20    0.05          0.010    0.023
Medium    0.50    0.80          0.400    0.909
High      0.30    0.10          0.030    0.068
P(Consultants say "Medium")     0.440    1.000

Consultants say "High":

Event     Prior   Conditional   Joint    Posterior
Low       0.20    0.05          0.010    0.029
Medium    0.50    0.15          0.075    0.221
High      0.30    0.85          0.255    0.750
P(Consultants say "High")       0.340    1.000

Alternative investment:

Profit        Probability
$4 million    0.50
$7 million    0.50

Consulting fee: $1 million
Example 10.5: Joint and Conditional Probabilities
[Decision tree for Example 10.5: the first decision is Hire consultants vs. Do not hire. Without consultants, Invest yields (0.2)(3) + (0.5)(6) + (0.3)(12) = $7.2 million expected, and the Alternative yields (0.5)(4) + (0.5)(7) = $5.5 million. With consultants (payoffs net of the $1 million fee: $2, $5, and $11 million for Low, Medium, and High; the alternative pays $3 and $6 million, expected $4.5 million), the conclusion is "High" (Pr = 0.34), "Medium" (Pr = 0.44), or "Low" (Pr = 0.22), and the posterior expected payoffs of investing are $9.413, $5.339, and $2.954 million, respectively. Taking the better branch at each node gives (0.34)(9.413) + (0.44)(5.339) + (0.22)(4.5) = $6.54 million for hiring, versus $7.2 million for not hiring.]
Example 10.5: Decision Tree
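Folding back a tree like this one means taking expectations at chance nodes and maxima at decision nodes. A sketch using the slide's posterior probabilities and net-of-fee payoffs (data structures and names are my own):

```python
# P(true state | consultants' conclusion), from the Bayes tables above.
posteriors = {
    "High":   {"Low": 0.029, "Medium": 0.221, "High": 0.750},
    "Medium": {"Low": 0.023, "Medium": 0.909, "High": 0.068},
    "Low":    {"Low": 0.818, "Medium": 0.114, "High": 0.068},
}
p_conclusion = {"High": 0.34, "Medium": 0.44, "Low": 0.22}
net_payoff = {"Low": 2, "Medium": 5, "High": 11}   # $M, after the $1M fee
alt_net = 0.5 * 4 + 0.5 * 7 - 1                    # alternative after fee: 4.5

# Hire branch: expectation over conclusions of the better action given each.
e_hire = sum(
    p_conclusion[c]
    * max(sum(post[s] * net_payoff[s] for s in post), alt_net)
    for c, post in posteriors.items()
)
# No-hire branch: better of investing (prior expectation) and the alternative.
e_no_hire = max(0.2 * 3 + 0.5 * 6 + 0.3 * 12, 0.5 * 4 + 0.5 * 7)
print(round(e_hire, 2), round(e_no_hire, 2))   # → 6.54 7.2
```

With the $1 million fee, hiring the consultants ($6.54 million expected) is worse than investing outright ($7.2 million), so the firm should not hire them.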
Application of Bayes’ Theorem to the information in Example 10.5 using Excel. The conditional probabilities of the consultants’ conclusion given the true future state, together with the prior distribution on the true future state, are used to calculate the joint probabilities for each combination of true state and consultants’ conclusion. The joint probabilities and Bayes’ Theorem are then used to calculate the probabilities of the consultants’ conclusions and the conditional probabilities of the true future state given the consultants’ conclusion.
SEE NEXT SLIDE FOR EXCEL OUTPUT.
Example 10.5: Using Excel
Distribution of consultants' conclusion given true future state:

              CONSULTANTS' CONCLUSION      Prior Prob.
TRUE STATE    "High"   "Medium"   "Low"    of True State
Low           0.05     0.05       0.9      0.2
Medium        0.15     0.8        0.05     0.5
High          0.85     0.1        0.05     0.3

Joint probability of true state and consultants' conclusion:

              CONSULTANTS' CONCLUSION
TRUE STATE    "High"   "Medium"   "Low"
Low           0.01     0.01       0.18
Medium        0.075    0.4        0.025
High          0.255    0.03       0.015

Prior probability of consultants' conclusion:

"High"   "Medium"   "Low"
0.34     0.44       0.22

Posterior distribution on the true future state given the consultants' conclusion:

              CONSULTANTS' CONCLUSION
TRUE STATE    "High"   "Medium"   "Low"
Low           0.029    0.023      0.818
Medium        0.221    0.909      0.114
High          0.75     0.068      0.068
Example 10.5: Using Excel
[Figure: a concave utility-of-dollars curve; each additional $1,000 brings a smaller additional utility.]

Utility is a measure of the total worth of a particular outcome. It reflects the decision maker’s attitude toward a collection of factors such as profit, loss, and risk.
10-7 Utility and Marginal Utility
[Figure: four utility-of-dollars curves — concave (risk averse), convex (risk taker), linear (risk neutral), and mixed.]
Utility and Attitudes toward Risk
Possible    Initial    Indifference Lottery               Utility
Returns     Utility
$1,500      0          —                                  0
4,300                  (1,500)(0.8) + (56,000)(0.2)       0.2
22,000                 (1,500)(0.3) + (56,000)(0.7)       0.7
31,000                 (1,500)(0.2) + (56,000)(0.8)       0.8
56,000      1          —                                  1
[Figure: assessed utility curve — utility from 0.0 to 1.0 plotted against dollars from $0 to $60,000, through the points above.]
Assessing Utility
The expected value of perfect information (EVPI):

EVPI = the expected monetary value of the decision situation when perfect information is available, minus the expected value of the decision situation when no additional information is available.
[Figure: expected net gain from sampling plotted against sample size; the net gain reaches its maximum (Max) at the optimal sample size n_max.]
10-8 The Value of Information
[Decision tree for Example 10.6: set the fare at $200 or $300; the competitor's fare is $200 (Pr = 0.6) or $300 (Pr = 0.4).]

Airline    Competitor's
Fare       Fare            Payoff
$200       $200            $8 million
$200       $300            $9 million
$300       $200            $4 million
$300       $300            $10 million

Expected payoffs: $8.4 million for the $200 fare, $6.4 million for the $300 fare.
Example 10.6: The Decision Tree
• If no additional information is available, the best strategy is to set the fare at $200:
E(Payoff | $200) = (0.6)(8) + (0.4)(9) = $8.4 million
E(Payoff | $300) = (0.6)(4) + (0.4)(10) = $6.4 million
• With further (perfect) information about the competitor's fare, the expected payoff could be:
E(Payoff | Information) = (0.6)(8) + (0.4)(10) = $8.8 million
• EVPI = 8.8 − 8.4 = $0.4 million
Example 10.6: Value of Additional Information
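The EVPI calculation generalizes: take the best expected payoff without information, then the expectation of the per-state best payoff. A sketch for Example 10.6 (structure and names are my own):

```python
# Competitor's fare probabilities and the payoff table ($ millions).
p = {"200": 0.6, "300": 0.4}
payoff = {
    ("200", "200"): 8, ("200", "300"): 9,    # our fare $200
    ("300", "200"): 4, ("300", "300"): 10,   # our fare $300
}

# Without information: pick the fare with the best expected payoff.
e_without = max(
    sum(p[c] * payoff[(ours, c)] for c in p) for ours in ("200", "300")
)   # max(8.4, 6.4)

# With perfect information: for each competitor fare, pick the best response,
# then take the expectation over competitor fares.
e_perfect = sum(
    p[c] * max(payoff[(ours, c)] for ours in ("200", "300")) for c in p
)

evpi = e_perfect - e_without
print(round(e_without, 1), round(e_perfect, 1), round(evpi, 1))   # → 8.4 8.8 0.4
```

The $0.4 million EVPI is an upper bound on what the airline should pay for any forecast of the competitor's fare.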