
Bayesian Networks II: Dynamic Networks and Markov Chains

By Peter Woolf ([email protected]), University of Michigan

Michigan Chemical Process Dynamics and Controls Open Textbook

version 1.0

Creative Commons

Existing plant measurements + physics, chemistry, and chemical engineering knowledge & intuition
→ Bayesian network models to establish connections
→ Patterns of likely causes & influences
→ Efficient experimental design to test combinations of causes
→ ANOVA & probabilistic models to eliminate irrelevant or uninteresting relationships
→ Process optimization (e.g. controllers, architecture, unit optimization, sequencing, and utilization)

Dynamical process modeling

Static Bayesian Network Example 1: Car failure diagnosis network
(From http://www.norsys.com/netlib/car_diagnosis_2.htm)

Static Bayesian Network Example 2: ALARM network (A Logical Alarm Reduction Mechanism), a medical diagnostic system for patient monitoring with 8 diagnoses, 16 findings, and 13 intermediate values

From Beinlich, Ingo, H. J. Suermondt, R. M. Chavez, and G. F. Cooper (1989) "The ALARM monitoring system: A case study with two probabilistic inference techniques for belief networks" in Proc. of the Second European Conf. on Artificial Intelligence in Medicine (London, Aug.), 38, 247-256. Also Tech. Report KSL-88-84, Knowledge Systems Laboratory, Medical Computer Science, Stanford Univ., CA.

Dynamic Bayesian Networks

[Figure: a network over the variables weight, ALT, survival, RBC, and procedure, drawn two ways: an unrolled network with separate copies of each node for Yesterday (ti-1) and Today (ti), and a collapsed network in which each node feeds into the next time slice.]

These are both examples of Dynamic Bayesian Networks (DBNs).
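To make the collapsed/unrolled picture concrete, here is a minimal Python sketch; the structure and all probabilities are hypothetical and are not the network in the figure. The collapsed network is stored as one conditional distribution per variable given the previous time slice, and unrolling it simply means sampling forward slice by slice.

import random

# Hypothetical collapsed DBN over binary variables (1 = high/yes, 0 = low/no).
# Each variable today depends on selected variables from yesterday; the
# structure and every probability below are made up for illustration only.

def sample_today(yesterday):
    """Sample one new time slice given the previous slice (one unrolling step)."""
    today = {}
    # procedure is treated as an outside decision, chosen at random here
    today["procedure"] = 1 if random.random() < 0.3 else 0
    # ALT today depends on ALT and procedure yesterday (hypothetical CPT)
    p_alt = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.7, (1, 1): 0.3}
    today["ALT"] = 1 if random.random() < p_alt[(yesterday["ALT"], yesterday["procedure"])] else 0
    # weight and RBC simply tend to persist (hypothetical)
    today["weight"] = yesterday["weight"] if random.random() < 0.9 else 1 - yesterday["weight"]
    today["RBC"] = yesterday["RBC"] if random.random() < 0.8 else 1 - yesterday["RBC"]
    # survival today depends on ALT and RBC yesterday (hypothetical)
    p_surv = 0.5 + 0.2 * yesterday["RBC"] - 0.3 * yesterday["ALT"]
    today["survival"] = 1 if random.random() < p_surv else 0
    return today

# Unroll the collapsed network: start from an initial slice and sample forward.
slice0 = {"weight": 1, "ALT": 0, "survival": 1, "RBC": 1, "procedure": 0}
trajectory = [slice0]
for t in range(5):
    trajectory.append(sample_today(trajectory[-1]))

for t, s in enumerate(trajectory):
    print(t, s)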

[Figure: the DBN unrolled over time slices ti-3 through ti+2 for the variables weight, ALT, survival, RBC, and procedure. The slices up through ti are labeled "Model derived from past data"; the slices after ti are labeled "Predicts future responses". An accompanying plot of ALT versus time (0 to 20) compares Observed and Predicted values.]

Dynamic Bayesian Networks: Predict to explore alternatives

[Figure: the DBN over weight, ALT, survival, RBC, and procedure unrolled from Today (ti) to Tomorrow (ti+1). A plot of ALT versus time (0 to 20) compares the Observed trajectory with predicted trajectories under treatment 1, treatment 2, and treatment 3.]

DBNs provide a suitable environment for model predictive control (MPC)!
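One way this connects to MPC, sketched in Python with an entirely made-up one-variable transition model (the function simulate_alt, the treatment effects, and all probabilities are assumptions, not taken from the example above): simulate the DBN forward under each candidate action sequence over a short horizon and keep the sequence with the best predicted outcome.

import itertools
import random

def simulate_alt(alt0, treatments, n_samples=500):
    """Average final ALT level after applying a treatment sequence (1 = treat, 0 = don't)."""
    total = 0.0
    for _ in range(n_samples):
        alt = alt0
        for treat in treatments:
            # hypothetical CPT: treating tends to lower ALT, not treating tends to raise it
            p_high = 0.3 if treat else 0.7
            p_high = 0.5 * p_high + 0.5 * alt   # plus some persistence of the current state
            alt = 1 if random.random() < p_high else 0
        total += alt
    return total / n_samples

# Enumerate every treatment sequence over a 3-step horizon and pick the one
# whose simulated ALT outcome is lowest -- the basic receding-horizon MPC move.
horizon = 3
best = min(itertools.product([0, 1], repeat=horizon),
           key=lambda seq: simulate_alt(alt0=1, treatments=seq))
print("Best treatment sequence over the horizon:", best)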

DBN: Thermostat example

From http://www.norsys.com/networklibrary.html#

[Figure: the thermostat DBN with nodes N (fluctuations), S (switch), G (temperature set point), H (heater), T (temperature), and V (value/cost), drawn both as an unrolled network over time slices ti and ti+1 and as a collapsed network.]

Simplified DBN

[Figure: a simplified version of the thermostat DBN keeping only N (fluctuations), H (heater), and T (temperature), unrolled over time slices ti and ti+1.]

A Dynamic Bayesian Network can be recast as a Markov chain.

Assume each variable is binary (states 1 or 0); any joint configuration can then be written as {010}, meaning N=0, H=1, T=0.

A Markov chain describes how the system transitions from state to state.

[Figure: state transition diagram connecting the eight joint states {000}, {001}, {010}, {011}, {100}, {101}, {110}, {111}.]


Each edge has a probability associated with it.
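A sketch of where those edge probabilities come from, in Python, using made-up conditional probabilities for the simplified thermostat network (N, H, T): enumerating the eight joint states and multiplying the per-variable conditionals gives one transition probability between every pair of states. The function p_next and all of its numbers are assumptions for illustration only.

import itertools
import numpy as np

def p_next(next_state, state):
    """Probability of moving to next_state = (N', H', T') from state = (N, H, T)."""
    n, h, t = state
    n2, h2, t2 = next_state
    # N: random fluctuations, independent of the past (hypothetical)
    p_n = 0.2 if n2 == 1 else 0.8
    # H: heater tends to switch on when the temperature was low (hypothetical)
    p_h_on = 0.9 if t == 0 else 0.1
    p_h = p_h_on if h2 == 1 else 1.0 - p_h_on
    # T: temperature tends to be high if the heater was on, nudged by N (hypothetical)
    p_t_high = 0.05 + 0.7 * h + 0.2 * n
    p_t = p_t_high if t2 == 1 else 1.0 - p_t_high
    return p_n * p_h * p_t

# Enumerate the joint states {000}, {001}, ..., {111} and build the transition matrix.
states = list(itertools.product([0, 1], repeat=3))
P = np.array([[p_next(s2, s1) for s2 in states] for s1 in states])

print(np.allclose(P.sum(axis=1), 1.0))   # every row of the induced transition matrix sums to 1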

              to:  {000}  {001}  {010}  {100}  {101}  {110}  {011}  {111}
from {000}:          0     p1     0      0      0     p2     0      0
from {001}:         p3     p4     0      0      0      0     0      0
from {010}:          0      0     0     p5      0      0     0      0
from {100}:          0      0     0      0      0     p6    p7      0
from {101}:          0      0    p8     p9    p10      0     0      0
from {110}:          0      0     0      0      0    p11     0      0
from {011}:          0      0     0      0    p12      0     0     p13
from {111}:          0      0     0      0      0      0    p14    p15

Note: all rows must sum to 1 (e.g. p1 + p2 = 1, p5 = 1, etc.).
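With numbers in place of p1..p15, ordinary Markov-chain calculations follow directly. A minimal Python sketch using arbitrary placeholder values (chosen only so that each row sums to 1) to propagate a starting state distribution forward in time:

import numpy as np

# Row/column order matches the matrix above.
states = ["{000}", "{001}", "{010}", "{100}", "{101}", "{110}", "{011}", "{111}"]

# Arbitrary placeholder values for p1..p15; each row must sum to 1.
P = np.array([
    [0.0, 0.6, 0.0, 0.0, 0.0, 0.4, 0.0, 0.0],  # p1 = 0.6, p2 = 0.4
    [0.3, 0.7, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],  # p3 = 0.3, p4 = 0.7
    [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0],  # p5 = 1
    [0.0, 0.0, 0.0, 0.0, 0.0, 0.5, 0.5, 0.0],  # p6 = 0.5, p7 = 0.5
    [0.0, 0.0, 0.2, 0.3, 0.5, 0.0, 0.0, 0.0],  # p8, p9, p10
    [0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0],  # p11 = 1
    [0.0, 0.0, 0.0, 0.0, 0.4, 0.0, 0.0, 0.6],  # p12 = 0.4, p13 = 0.6
    [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.3, 0.7],  # p14 = 0.3, p15 = 0.7
])
assert np.allclose(P.sum(axis=1), 1.0)          # all rows sum to 1

# Start in state {000} with probability 1 and step the chain forward 20 times.
dist = np.zeros(len(states))
dist[0] = 1.0
for _ in range(20):
    dist = dist @ P

for s, p in zip(states, dist):
    print(s, round(p, 3))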

Case Study 1: Synthetic Study

Situation: Imagine that we are exploring the effect of a DNA-damaging drug and UV light on the expression of 4 genes.

[Figures: simulated expression time courses for GFP and Genes A, B, and C under drug and UV treatment, shown first as idealized data and then as noisy data.]

Given idealized or noisy data, can we find any relationships between the drug, UV exposure, GFP, and the gene expression profiles?

See miniTUBA.demodata.xls

Case Study 1: Synthetic Study

Google “miniTUBA” or go to http://ncibi.minituba.org
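miniTUBA performs the learning step itself. Purely to make the idea concrete, here is a minimal Python sketch (not miniTUBA's algorithm) that generates a synthetic binarized time series and scores candidate parents of Gene A by how well yesterday's value predicts today's, using transition counts with add-one smoothing; every name and number here is hypothetical.

import numpy as np

# Synthetic binarized time series: 1 = on/high, 0 = off/low.
rng = np.random.default_rng(0)
T = 200
drug = rng.integers(0, 2, T)
uv = rng.integers(0, 2, T)
# Hypothetical ground truth: Gene A tends to follow yesterday's drug level.
gene_a = np.zeros(T, dtype=int)
for t in range(1, T):
    p_on = 0.85 if drug[t - 1] == 1 else 0.15
    gene_a[t] = rng.random() < p_on

def score(child, parent):
    """Log-likelihood of child(t) given parent(t-1), with add-one smoothing."""
    ll = 0.0
    for pa in (0, 1):
        rows = parent[:-1] == pa
        n = int(np.sum(rows))
        n_on = int(np.sum(child[1:][rows]))
        p_on = (n_on + 1) / (n + 2)
        ll += n_on * np.log(p_on) + (n - n_on) * np.log(1 - p_on)
    return ll

# A higher score means the candidate parent explains Gene A's dynamics better.
for name, parent in [("drug", drug), ("UV", uv)]:
    print(name, "-> Gene A :", round(score(gene_a, parent), 1))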

Case Study 1: Synthetic Data

• Observations:
– Stronger relationships require fewer observations to identify
– Noise in the measurements is okay
– Moderate binning errors are forgivable
– Uncontrolled experiments can be your friend in model learning

Take Home Messages

• Noisy, time-varying processes can be modeled as a Dynamic Bayesian Network (DBN)

• A DBN can be recast as a Markov model of a stochastic system

• DBNs can be learned directly from data using tools such as miniTUBA