INC 551 Artificial Intelligence
Lecture 8
Models of Uncertainty
Inference by Enumeration
Bayesian Belief Network Model
Causes -> Effects
The graph structure shows the dependencies.
Burglar Alarm Example

My house has a burglar alarm, but sometimes it rings because of an earthquake. My neighbors, John and Mary, promise to call me if they hear the alarm. However, their ears are not perfect.
One Way to Create BBN
Computing Probability
P(j, m, a, ~b, ~e) = P(j|a) P(m|a) P(a|~b,~e) P(~b) P(~e)
= 0.90 x 0.70 x 0.001 x 0.999 x 0.998 ≈ 0.00063
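The product above can be checked directly. A minimal sketch, using the CPT entries from the alarm example (variable names are mine):

```python
# Each factor is one CPT entry from the burglar-alarm network.
p_j_given_a = 0.90       # P(JohnCalls=true | Alarm=true)
p_m_given_a = 0.70       # P(MaryCalls=true | Alarm=true)
p_a_given_nb_ne = 0.001  # P(Alarm=true | Burglary=false, Earthquake=false)
p_nb = 0.999             # P(Burglary=false)
p_ne = 0.998             # P(Earthquake=false)

# The joint probability is the product of one CPT entry per node:
p_joint = p_j_given_a * p_m_given_a * p_a_given_nb_ne * p_nb * p_ne
print(f"{p_joint:.5f}")  # -> 0.00063
```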
BBN Construction
There are many ways to construct a BBN for a problem because the events depend on each other (are related).
Therefore, the structure depends on the order in which you consider the events.
The simplest ordering gives the most compact network.
Not compact
Inference Problem
Find P(X_i | E = e)

For example, find

P(Burglary | JohnCalls = true, MaryCalls = true)
Inference by Enumeration
P(X | e) = P(X, e) / P(e) = α P(X, e) = α Σ_y P(X, e, y)

Note: let α = 1/P(e)

α is called the "normalization constant"

y ranges over the other (hidden) events, so the sum runs over all events.

For example:

P(Burglary | JohnCalls = true, MaryCalls = true)
= α Σ_y P(Burglary, JohnCalls = true, MaryCalls = true, y)
Calculation Tree
Inefficient: P(j|a)P(m|a) is recomputed for every value of e.
P(b | j, m) = α P(b, j, m)
= α x 0.001 x [ 0.002 x ((0.7 x 0.9 x 0.95) + (0.01 x 0.05 x 0.05))
              + 0.998 x ((0.7 x 0.9 x 0.94) + (0.01 x 0.05 x 0.06)) ]
Next, we have to find P(~b|j,m) and find α = 1/P(j,m)
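The whole enumeration, including ~b and the normalization by α, can be sketched in a few lines. The CPT values 0.001/0.999 for Burglary, 0.002/0.998 for Earthquake, and the alarm/call entries follow the standard version of this example; P(Alarm=true | ~Burglary, Earthquake) = 0.29 is an assumption not shown on the slides:

```python
# CPTs for the burglar-alarm network (0.29 is an assumed textbook value).
P_B = {True: 0.001, False: 0.999}          # P(Burglary)
P_E = {True: 0.002, False: 0.998}          # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,   # P(Alarm=true | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}            # P(JohnCalls=true | Alarm)
P_M = {True: 0.70, False: 0.01}            # P(MaryCalls=true | Alarm)

def p_b_jm(b):
    """Unnormalized P(Burglary=b, JohnCalls=true, MaryCalls=true)."""
    total = 0.0
    for e in (True, False):        # sum out Earthquake
        for a in (True, False):    # sum out Alarm
            p_a = P_A[(b, e)] if a else 1 - P_A[(b, e)]
            total += P_E[e] * p_a * P_J[a] * P_M[a]
    return P_B[b] * total

unnorm = {b: p_b_jm(b) for b in (True, False)}
alpha = 1 / sum(unnorm.values())   # normalization constant = 1 / P(j, m)
print(round(alpha * unnorm[True], 3))  # -> 0.284
```

Under these CPTs the posterior P(b | j, m) comes out to about 0.284.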
Approximate Inference
Idea: Count from real examples
We call this procedure “Sampling”
Sampling = get real examples from the world model
Sampling Example
Cloudy = there are clouds; Sprinkler = water spray
Sample each variable in turn; each step fixes one value:

(Cloudy, Sprinkler, Rain, WetGrass) = (?, ?, ?, ?)
(Cloudy, Sprinkler, Rain, WetGrass) = (T, ?, ?, ?)
(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, ?, ?)
(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, T, ?)
(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, T, T)

= 1 sample
Rejection Sampling
Idea: count only the samples that agree with e.

To find P(X_i | e)
Rejection Sampling
Drawback: few samples agree with e.
In the example above, out of 100 samples only 27 are usable.
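A minimal sketch of rejection sampling on the same sprinkler network, estimating P(Rain | Sprinkler = true) (CPT values as in the prior-sampling sketch; the non-slide entries are assumptions):

```python
import random

def prior_sample():
    c = random.random() < 0.5                  # P(Cloudy)
    s = random.random() < (0.1 if c else 0.5)  # P(Sprinkler | Cloudy)
    r = random.random() < (0.8 if c else 0.2)  # P(Rain | Cloudy)
    p_w = {(True, True): 0.99, (True, False): 0.90,
           (False, True): 0.90, (False, False): 0.0}[(s, r)]
    w = random.random() < p_w                  # P(WetGrass | Sprinkler, Rain)
    return c, s, r, w

# Keep only samples consistent with the evidence Sprinkler = true:
kept = [smp for smp in (prior_sample() for _ in range(10_000)) if smp[1]]
p_rain = sum(smp[2] for smp in kept) / len(kept)
print(len(kept), round(p_rain, 2))   # most samples are thrown away
```

Here only about 30% of the samples survive, which illustrates the drawback: the rarer the evidence, the more samples are wasted.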
Likelihood Weighting
Idea: generate only samples that are consistent with e.
However, we must use "weighted sampling".
e.g. Find P(Rain | Sprinkler = true, WetGrass = true)
Fix Sprinkler = true and WetGrass = true.
Weighted Sampling
Sample from P(Cloudy) = 0.5; suppose we get "true".
Sprinkler already has the value true, so we multiply the weight by P(Sprinkler = true | Cloudy = true) = 0.1.
Sample from P(Rain | Cloudy = true) = 0.8; suppose we get "true".
WetGrass already has the value true, so we multiply the weight by P(WetGrass = true | Sprinkler = true, Rain = true) = 0.99.
Finally, we get the sample (t, t, t, t) with weight 0.1 x 0.99 = 0.099.
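The weighted-sampling steps above can be sketched as code. Evidence variables are fixed and contribute to the weight; the others are sampled as usual (non-slide CPT entries are assumed textbook values):

```python
import random

def weighted_sample():
    """One sample for P(Rain | Sprinkler=true, WetGrass=true), with weight."""
    w = 1.0
    c = random.random() < 0.5                  # sample Cloudy normally
    # Evidence: Sprinkler = true -> multiply weight by P(s=true | c)
    w *= 0.1 if c else 0.5
    r = random.random() < (0.8 if c else 0.2)  # sample Rain | Cloudy
    # Evidence: WetGrass = true -> multiply weight by P(w=true | s=true, r)
    w *= 0.99 if r else 0.90
    return r, w

num = den = 0.0
for _ in range(10_000):
    r, w = weighted_sample()
    den += w                 # total weight
    if r:
        num += w             # weight of samples with Rain = true
print(round(num / den, 2))   # estimate of P(Rain | s=true, w=true)
```

The estimate is the weighted fraction of samples with Rain = true, which is what makes the fixed-evidence samples unbiased.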
Temporal Model (Time)
When the events are tagged with timestamp
Rain_day1 -> Rain_day2 -> Rain_day3
Each node is considered “state”
Markov Process
Let Xt = state.
For Markov processes, Xt depends only on a finite number of previous states X_{t-1}, X_{t-2}, ...
Hidden Markov Model (HMM)
Each state has an observation, Et.
We cannot see the "state"; we only see the "observation".
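A minimal sketch of an HMM in the spirit of the rain example: the hidden state follows a first-order Markov chain, and each state emits an observation. All probabilities here are illustrative assumptions, not values from the slides:

```python
import random

P_RAIN_GIVEN_RAIN = 0.7   # P(Rain_t | Rain_{t-1})        -- assumed
P_RAIN_GIVEN_DRY = 0.3    # P(Rain_t | not Rain_{t-1})    -- assumed
P_OBS_GIVEN_RAIN = 0.9    # P(Umbrella_t | Rain_t)        -- assumed
P_OBS_GIVEN_DRY = 0.2     # P(Umbrella_t | not Rain_t)    -- assumed

def simulate(T, rng=random):
    """Simulate T steps of hidden states and their observations."""
    states, obs = [], []
    rain = rng.random() < 0.5                 # initial state
    for _ in range(T):
        p = P_RAIN_GIVEN_RAIN if rain else P_RAIN_GIVEN_DRY
        rain = rng.random() < p               # Markov: depends on X_{t-1} only
        states.append(rain)
        p_e = P_OBS_GIVEN_RAIN if rain else P_OBS_GIVEN_DRY
        obs.append(rng.random() < p_e)        # we observe E_t, not X_t
    return states, obs

states, obs = simulate(10)
print(obs)   # an observer sees only this sequence, never `states`
```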