Inference

Page 1: Inference. Overview The MC-SAT algorithm Knowledge-based model construction Lazy inference Lifted inference

Inference

Page 2:

Overview

The MC-SAT algorithm Knowledge-based model construction Lazy inference Lifted inference

Page 3:

MCMC: Gibbs Sampling

state ← random truth assignment
for i ← 1 to num-samples do
  for each variable x
    sample x according to P(x | neighbors(x))
    state ← state with new value of x
P(F) ← fraction of states in which F is true
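The loop above can be made concrete on a toy two-variable network. The single pairwise "agreement" potential and the weight w below are illustrative assumptions, not part of the slides:

```python
import math
import random

# Toy Markov network over two booleans x0, x1 with one pairwise potential
# phi(x0, x1) = e^w if x0 == x1, else 1 (hypothetical example model).
w = 1.0

def conditional_true(state, i):
    """P(x_i = True | neighbors(x_i)) under the agreement potential."""
    other = state[1 - i]
    p_true = math.exp(w) if other else 1.0
    p_false = 1.0 if other else math.exp(w)
    return p_true / (p_true + p_false)

def gibbs(num_samples, seed=0):
    rng = random.Random(seed)
    state = [rng.random() < 0.5 for _ in range(2)]   # random truth assignment
    agree = 0
    for _ in range(num_samples):
        for i in range(2):                            # for each variable x
            state[i] = rng.random() < conditional_true(state, i)
        agree += state[0] == state[1]
    return agree / num_samples                        # P(F) = fraction of states where F holds

# Query F = "x0 and x1 agree"; the exact value is e^w / (e^w + 1) ≈ 0.731
est = gibbs(20000)
```

The estimate converges to the true marginal because each sweep resamples every variable from its exact conditional.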

Page 4:

But … Insufficient for Logic

Problem:
  Deterministic dependencies break MCMC
  Near-deterministic ones make it very slow

Solution: combine MCMC and WalkSAT

→ the MC-SAT algorithm

Page 5:

The MC-SAT Algorithm

MC-SAT = MCMC + SAT
  MCMC: slice sampling with an auxiliary variable for each clause
  SAT: wraps around SampleSAT (a uniform sampler) to sample from highly non-uniform distributions

Sound: satisfies ergodicity & detailed balance
Efficient: orders of magnitude faster than Gibbs and other MCMC algorithms

Page 6:

Auxiliary-Variable Methods

Main ideas:
  Use auxiliary variables to capture dependencies
  Turn difficult sampling into uniform sampling

Given distribution P(x), sample from f(x, u), then discard u:

f(x, u) = 1 if 0 ≤ u ≤ P(x), 0 otherwise

∫ f(x, u) du = P(x), so marginalizing out u recovers P(x)

Page 7:

Slice Sampling [Damien et al. 1999]

[Figure: slice sampling. From x(k), draw u(k) uniformly from [0, P(x(k))]; then draw x(k+1) uniformly from the slice {x : P(x) ≥ u(k)}.]
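A minimal runnable sketch of this scheme, for the unnormalized density p(x) ∝ exp(-x²/2). The density is an illustrative assumption, chosen because its slice {x : p(x) ≥ u} is an interval we can compute in closed form:

```python
import math
import random

def p(x):
    # unnormalized standard-normal density (illustrative choice)
    return math.exp(-x * x / 2)

def slice_sample(n, x=0.0, seed=0):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        u = max(rng.uniform(0, p(x)), 1e-300)  # auxiliary variable u ~ U[0, p(x)]
        r = math.sqrt(-2 * math.log(u))        # slice {x : p(x) >= u} = [-r, r]
        x = rng.uniform(-r, r)                 # sample x uniformly from the slice
        samples.append(x)
    return samples

xs = slice_sample(20000)
mean = sum(xs) / len(xs)
var = sum(x * x for x in xs) / len(xs)   # should approach 1 for a standard normal
```

Each difficult draw from p(x) is replaced by two uniform draws, which is exactly the trade MC-SAT makes.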

Page 8:

Slice Sampling

Identifying the slice may be difficult

Introduce an auxiliary variable ui for each Фi

P(x) = (1/Z) ∏i Φi(x)

f(x, u1, …, un) = 1 if 0 ≤ ui ≤ Φi(x) for all i, 0 otherwise

Page 9:

The MC-SAT Algorithm

Approximate inference for Markov logic
Use slice sampling in MCMC

Auxiliary var. ui for each clause Ci: 0 ≤ ui ≤ exp(wi fi(x))

Ci unsatisfied: 0 ≤ ui ≤ 1, so exp(wi fi(x′)) ≥ ui for any next state x′

Ci satisfied: 0 ≤ ui ≤ exp(wi)
  With prob. 1 − exp(−wi), the next state x′ must satisfy Ci
  to ensure that exp(wi fi(x′)) ≥ ui

Page 10:

The MC-SAT Algorithm

Select a random subset M of the satisfied clauses
  Larger wi ⇒ Ci more likely to be selected
  Hard clause (wi = ∞): always selected
Slice = the states that satisfy all clauses in M
Sample uniformly from these

Page 11:

The MC-SAT Algorithm

X(0) ← a random solution satisfying all hard clauses
for k ← 1 to num_samples do
  M ← Ø
  forall Ci satisfied by X(k − 1)
    with prob. 1 − exp(−wi) add Ci to M
  endfor
  X(k) ← a uniformly random solution satisfying M
endfor
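The loop above can be sketched end-to-end on a toy weighted SAT problem, using exhaustive enumeration as a stand-in for SampleSAT (the two clauses, their weights, and the absence of hard clauses are all illustrative assumptions):

```python
import itertools
import math
import random

# Clauses are lists of (var, sign) literals over booleans; each carries a weight.
clauses = [([(0, True)], 2.0),                  # x0, weight 2.0
           ([(0, False), (1, True)], 1.5)]      # not-x0 or x1, weight 1.5
n_vars = 2

def satisfies(state, clause):
    return any(state[v] == sign for v, sign in clause)

def uniform_solution(m, rng):
    # stand-in for SampleSAT: enumerate and pick a uniform satisfying state
    sols = [s for s in itertools.product([False, True], repeat=n_vars)
            if all(satisfies(s, c) for c in m)]
    return rng.choice(sols)

def mc_sat(num_samples, seed=0):
    rng = random.Random(seed)
    x = uniform_solution([], rng)   # no hard clauses here, so any state works
    count_x1 = 0
    for _ in range(num_samples):
        # keep each satisfied clause with prob. 1 - exp(-w)
        m = [c for c, w in clauses
             if satisfies(x, c) and rng.random() < 1 - math.exp(-w)]
        x = uniform_solution(m, rng)  # slice: uniform over states satisfying M
        count_x1 += x[1]
    return count_x1 / num_samples

# Exact P(x1) under P(x) ∝ exp(sum of weights of satisfied clauses) is ≈ 0.76
est = mc_sat(30000)
```

Enumeration only works at toy scale; the real algorithm relies on SampleSAT precisely because this step is the expensive one.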

Page 12:

The MC-SAT Algorithm

Sound: satisfies ergodicity and detailed balance
(assuming we have a perfect uniform sampler)

Approximately uniform sampler [Wei et al. 2004]:
  SampleSAT = WalkSAT + simulated annealing
  WalkSAT: finds a solution very efficiently, but may be highly non-uniform
  Sim. anneal.: uniform sampling as temperature → 0, but very slow to reach a solution
Trade off uniformity vs. efficiency by tuning the probability of WalkSAT steps vs. SA steps

Page 13:

Combinatorial Explosion

Problem: If there are n constants and the highest clause arity is c, the ground network requires O(n^c) memory (and inference time grows in proportion)

Solutions:
  Knowledge-based model construction
  Lazy inference
  Lifted inference

Page 14:

Knowledge-Based Model Construction

Basic idea: most of the ground network may be unnecessary, because evidence renders the query independent of it

Assumption: evidence is a conjunction of ground atoms

Knowledge-based model construction (KBMC):
  First construct the minimum subset of the network needed to answer the query (generalization of KBMC)
  Then apply MC-SAT (or other inference)

Page 15:

Ground Network Construction

network ← Ø
queue ← query nodes
repeat
  node ← front(queue)
  remove node from queue
  add node to network
  if node not in evidence then
    add neighbors(node) to queue
until queue = Ø
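This is a breadth-first traversal that stops expanding at evidence nodes. A sketch in Python, where the adjacency dict is a hypothetical hand-encoding of the two-person network on the next slides (the atom-name strings and neighbor lists are illustrative assumptions):

```python
from collections import deque

def construct_network(query_nodes, evidence, neighbors):
    network, queue = set(), deque(query_nodes)
    while queue:
        node = queue.popleft()
        if node in network:
            continue                      # already grounded
        network.add(node)
        if node not in evidence:          # evidence blocks further expansion
            queue.extend(neighbors[node])
    return network

# Hypothetical Markov blanket structure for the Smokes/Cancer/Friends example
neighbors = {
    "Cancer(B)": ["Smokes(B)"],
    "Smokes(B)": ["Cancer(B)", "Smokes(A)", "Friends(A,B)",
                  "Friends(B,A)", "Friends(B,B)"],
    "Friends(B,B)": ["Smokes(B)"],
}
evidence = {"Smokes(A)", "Friends(A,B)", "Friends(B,A)"}
net = construct_network(["Cancer(B)"], evidence, neighbors)
```

Because Smokes(A) is evidence, it is added but never expanded, so Cancer(A) and Friends(A,A) are never grounded.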

Page 16:

Example Grounding

P( Cancer(B) | Smokes(A), Friends(A,B), Friends(B,A))

1.5  ∀x Smokes(x) ⇒ Cancer(x)
1.1  ∀x,y Friends(x,y) ⇒ (Smokes(x) ⇔ Smokes(y))

[Figure: ground network over Cancer(A), Cancer(B), Smokes(A), Smokes(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B)]

Page 24:

Example Grounding

P( Cancer(B) | Smokes(A), Friends(A,B), Friends(B,A))

1.5  ∀x Smokes(x) ⇒ Cancer(x)
1.1  ∀x,y Friends(x,y) ⇒ (Smokes(x) ⇔ Smokes(y))

Ground atoms: Cancer(A), Cancer(B), Smokes(A), Smokes(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B)

After conditioning on the evidence, the remaining factors are:

φ(Smokes(B)) = e^2.2 if Smokes(B), e^0 otherwise
φ(Smokes(B), Cancer(B)) = e^1.5 if Smokes(B) ⇒ Cancer(B), e^0 otherwise

Page 25:

Lazy Inference

Most domains are extremely sparse: most ground atoms are false, so most clauses are trivially satisfied

We can exploit this by:
  Having a default state for atoms and clauses
  Grounding only those atoms and clauses with non-default states

Typically reduces memory (and time) by many orders of magnitude

Page 26:

Example: Scientific Research

Author(person, paper)

Author(person1, paper) ∧ Author(person2, paper) ⇒ Coauthor(person1, person2)

1000 papers, 100 authors:
  Author: 100,000 possible groundings, but only a few thousand are true
  Coauthor clause: 10 million possible groundings, but only tens of thousands are unsatisfied
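The counts above follow directly from the domain sizes; a back-of-envelope check (the domain sizes are the ones stated on this slide):

```python
papers, authors = 1000, 100
atom_groundings = authors * papers              # Author(person, paper)
clause_groundings = authors * authors * papers  # one per (person1, person2, paper)
print(atom_groundings, clause_groundings)
```

Only the few thousand true atoms and the unsatisfied clause groundings ever need to be materialized, which is the gap lazy inference exploits.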

Page 27:

Lazy Inference

Here: LazySAT (lazy version of WalkSAT)
The method is applicable to many other algorithms (including MC-SAT)

Page 28:

Naïve Approach

Create the groundings and keep in memory:
  True atoms
  Unsatisfied clauses
Memory cost is O(# unsatisfied clauses)

Problem: we need to go to the KB for each flip. Too slow!

Solution idea: keep more things in memory:
  A list of active atoms
  Potentially unsatisfied clauses (active clauses)

Page 29:

LazySAT: Definitions

An atom is an active atom if:
  It is in the initial set of active atoms, or
  It was flipped at some point during the search

A clause is an active clause if:
  It can be made unsatisfied by flipping zero or more active atoms in it

Page 30:

LazySAT: The Basics

Activate all the atoms appearing in clauses unsatisfied by the evidence DB; create the corresponding clauses
Randomly assign truth values to all active atoms
Activate an atom when it is flipped, if not already active; potentially activate additional clauses
No need to go to the KB to calculate the change in cost for flipping an active atom

Page 31:

LazySAT

for i ← 1 to max-tries do
  active_atoms ← atoms in clauses unsatisfied by DB
  active_clauses ← clauses activated by active_atoms
  soln ← random truth assignment to active_atoms
  for j ← 1 to max-flips do
    if ∑ weights(sat. clauses) ≥ threshold then
      return soln
    c ← random unsatisfied clause
    with probability p
      vf ← a randomly chosen variable from c
    else
      for each variable v in c do
        compute DeltaGain(v), using weighted_KB if v ∉ active_atoms
      vf ← v with highest DeltaGain(v)
    if vf ∉ active_atoms then
      activate vf and add clauses activated by vf
    soln ← soln with vf flipped
return failure, best soln found
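The payoff of activation is that only clauses whose body atoms are already true can ever become unsatisfied. A toy sketch of that filter, using the Coa transitivity clause from the example slides (the 20-person domain and the two true atoms are illustrative assumptions):

```python
import itertools

# Assumed clause schema: Coa(x,y) ∧ Coa(y,z) ⇒ Coa(x,z), default atom state False.
people = [f"P{i}" for i in range(20)]
true_atoms = {("P0", "P1"), ("P1", "P2")}   # sparse evidence: most atoms false

# A grounding can be unsatisfied only if both body atoms are true;
# a false body atom satisfies the clause trivially, so we never ground it.
active = [(x, y, z)
          for (x, y), (y2, z) in itertools.product(true_atoms, true_atoms)
          if y == y2]
total = len(people) ** 3
print(len(active), total)
```

One potentially unsatisfied grounding out of 8,000: the ratio is what makes the lazy bookkeeping worthwhile.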


Page 35:

Example

Active atoms (default false):
  Coa(A,B): False
  Coa(A,A): False
  Coa(A,C): False
  …

Active clauses:
  Coa(C,A) ∧ Coa(A,A) ⇒ Coa(C,A)
  Coa(A,B) ∧ Coa(B,C) ⇒ Coa(A,C)
  Coa(C,B) ∧ Coa(B,B) ⇒ Coa(C,B)
  Coa(C,A) ∧ Coa(A,B) ⇒ Coa(C,B)
  Coa(C,B) ∧ Coa(B,A) ⇒ Coa(C,A)

Page 36:

Example

Active atoms (after a flip):
  Coa(A,B): False
  Coa(A,A): True
  Coa(A,C): False
  …

Active clauses:
  Coa(C,A) ∧ Coa(A,A) ⇒ Coa(C,A)
  Coa(A,B) ∧ Coa(B,C) ⇒ Coa(A,C)
  Coa(C,B) ∧ Coa(B,B) ⇒ Coa(C,B)
  Coa(C,A) ∧ Coa(A,B) ⇒ Coa(C,B)
  Coa(C,B) ∧ Coa(B,A) ⇒ Coa(C,A)


Page 38:

LazySAT Performance

Solution quality:
  Performs the same sequence of flips as WalkSAT, so it returns the same result
Memory cost:
  O(# potentially unsatisfied clauses)
Time cost:
  Much lower initialization cost
  Cost of creating active clauses is amortized over many flips

Page 39:

Lifted Inference

We can do inference in first-order logic without grounding the KB (e.g., resolution)

Let's do the same for inference in MLNs:
  Group atoms and clauses into "indistinguishable" sets
  Do inference over those

First approach: lifted variable elimination (not practical)
Here: lifted belief propagation

Page 40:

Belief Propagation

Nodes (x), Features (f)

μx→f(x) = ∏ h ∈ n(x)\{f} μh→x(x)

μf→x(x) = ∑ ~{x} ( e^(w f(x)) ∏ y ∈ n(f)\{x} μy→f(y) )
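The two message equations can be checked on the smallest possible graph: one pairwise agreement feature plus one unary bias feature (both features and their weights are illustrative assumptions). On a tree, BP is exact, so the belief must match brute-force enumeration:

```python
import math
from itertools import product

w, v = 0.7, 0.3
def f(x0, x1): return 1.0 if x0 == x1 else 0.0   # pairwise agreement feature
def g(x0):     return 1.0 if x0 else 0.0          # unary bias feature on x0

# x1 has no feature besides f, so its message to f is uniform
mu_x1_to_f = {False: 1.0, True: 1.0}

# mu_{f->x0}(x0) = sum over x1 of e^{w f(x0,x1)} * mu_{x1->f}(x1)
mu_f_to_x0 = {x0: sum(math.exp(w * f(x0, x1)) * mu_x1_to_f[x1]
                      for x1 in (False, True)) for x0 in (False, True)}
# mu_{g->x0}(x0) = e^{v g(x0)}  (unary feature: nothing to sum out)
mu_g_to_x0 = {x0: math.exp(v * g(x0)) for x0 in (False, True)}

# belief at x0 = normalized product of incoming feature messages
unnorm = {x0: mu_f_to_x0[x0] * mu_g_to_x0[x0] for x0 in (False, True)}
belief_true = unnorm[True] / sum(unnorm.values())

# exact marginal by enumeration, for comparison
wt = {s: math.exp(w * f(*s) + v * g(s[0]))
      for s in product((False, True), repeat=2)}
exact_true = sum(t for s, t in wt.items() if s[0]) / sum(wt.values())
```

The agreement confirms that the feature-to-node message sums out everything except the target variable.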

Page 41:

Lifted Belief Propagation

Nodes (x), Features (f)

μx→f(x) = ∏ h ∈ n(x)\{f} μh→x(x)

μf→x(x) = ∑ ~{x} ( e^(w f(x)) ∏ y ∈ n(f)\{x} μy→f(y) )


Page 43:

Lifted Belief Propagation

Nodes (x), Features (f)

μx→f(x) = ∏ h ∈ n(x)\{f} μh→x(x)

μf→x(x) = ∑ ~{x} ( e^(w f(x)) ∏ y ∈ n(f)\{x} μy→f(y) )

In the lifted network, the message exponents are functions of edge counts

Page 44:

Lifted Belief Propagation

Form lifted network composed of supernodes and superfeatures
  Supernode: set of ground atoms that all send and receive the same messages throughout BP
  Superfeature: set of ground clauses that all send and receive the same messages throughout BP

Run belief propagation on the lifted network
  Guaranteed to produce the same results as ground BP
  Time and memory savings can be huge

Page 45:

Forming the Lifted Network

1. Form initial supernodes: one per predicate and truth value (true, false, unknown)

2. Form superfeatures by doing joins of their supernodes

3. Form supernodes by projecting superfeatures down to their predicates
   Supernode = groundings of a predicate with the same number of projections from each superfeature

4. Repeat until convergence
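Step 1 can be sketched in Python for the Smokes/Friends example on the following slides. The 5-person domain and the evidence encoding are illustrative assumptions (the slides use a domain of N people):

```python
# One initial supernode per (predicate, truth value); with only positive
# evidence here, every non-evidence atom lands in the "unknown" supernode.
people = ["Ana", "Bob", "Charles", "James", "Harry"]
evidence_true = {("Smokes", ("Ana",)),
                 ("Friends", ("Bob", "Charles")),
                 ("Friends", ("Charles", "Bob"))}

atoms = [("Smokes", (p,)) for p in people] + \
        [("Friends", (p, q)) for p in people for q in people]

supernodes = {}
for atom in atoms:
    pred = atom[0]
    value = "true" if atom in evidence_true else "unknown"
    supernodes.setdefault((pred, value), []).append(atom)

print({k: len(v) for k, v in supernodes.items()})
```

Steps 2-3 then repeatedly split these sets by join/projection counts until no supernode needs further splitting.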

Page 46:

Example

∀x,y Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)

Evidence: Smokes(Ana), Friends(Bob, Charles), Friends(Charles, Bob)

N people in the domain

Page 47:

Example

Evidence: Smokes(Ana), Friends(Bob, Charles), Friends(Charles, Bob)

∀x,y Smokes(x) ∧ Friends(x,y) ⇒ Smokes(y)

Intuitive grouping:
  { Smokes(Ana) }
  { Smokes(Bob), Smokes(Charles) }
  { Smokes(James), Smokes(Harry), … }

Page 48:

Initialization

Supernodes:
  { Smokes(Ana) }
  { Smokes(X), X ≠ Ana }
  { Friends(Bob, Charles), Friends(Charles, Bob) }
  { Friends(Ana, X), Friends(X, Ana) }
  { Friends(Bob, X), X ≠ Charles }
  …

Superfeatures: (none yet)

Page 49:

Joining the Supernodes

Supernodes:
  { Smokes(Ana) }
  { Smokes(X), X ≠ Ana }
  { Friends(Bob, Charles), Friends(Charles, Bob) }
  { Friends(Ana, X), Friends(X, Ana) }
  { Friends(Bob, X), X ≠ Charles }
  …

Superfeatures (join in progress):
  Smokes(Ana)

Page 50:

Joining the Supernodes

Supernodes:
  { Smokes(Ana) }
  { Smokes(X), X ≠ Ana }
  { Friends(Bob, Charles), Friends(Charles, Bob) }
  { Friends(Ana, X), Friends(X, Ana) }
  { Friends(Bob, X), X ≠ Charles }
  …

Superfeatures (join in progress):
  Smokes(Ana) ∧ Friends(Ana, X)

Page 51:

Joining the Supernodes

Supernodes:
  { Smokes(Ana) }
  { Smokes(X), X ≠ Ana }
  { Friends(Bob, Charles), Friends(Charles, Bob) }
  { Friends(Ana, X), Friends(X, Ana) }
  { Friends(Bob, X), X ≠ Charles }
  …

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana

Page 52:

Joining the Supernodes

Supernodes:
  { Smokes(Ana) }
  { Smokes(X), X ≠ Ana }
  { Friends(Bob, Charles), Friends(Charles, Bob) }
  { Friends(Ana, X), Friends(X, Ana) }
  { Friends(Bob, X), X ≠ Charles }
  …

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana

Page 53:

Joining the Supernodes

Supernodes:
  { Smokes(Ana) }
  { Smokes(X), X ≠ Ana }
  { Friends(Bob, Charles), Friends(Charles, Bob) }
  { Friends(Ana, X), Friends(X, Ana) }
  { Friends(Bob, X), X ≠ Charles }
  …

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)

Page 54:

Joining the Supernodes

Supernodes:
  { Smokes(Ana) }
  { Smokes(X), X ≠ Ana }
  { Friends(Bob, Charles), Friends(Charles, Bob) }
  { Friends(Ana, X), Friends(X, Ana) }
  { Friends(Bob, X), X ≠ Charles }
  …

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Page 55:

Projecting the Superfeatures

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Page 56:

Projecting the Superfeatures

Supernodes:
  { Smokes(Ana) }

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Page 57:

Projecting the Superfeatures

Supernodes:
  { Smokes(Ana) }

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Populate with projection counts

Page 58:

Projecting the Superfeatures

Supernodes:
  { Smokes(Ana) }

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Projection counts for { Smokes(Ana) }: N-1 (from the first superfeature)

Page 59:

Projecting the Superfeatures

Supernodes:
  { Smokes(Ana) }

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Projection counts for { Smokes(Ana) }: N-1, 0

Page 60:

Projecting the Superfeatures

Supernodes:
  { Smokes(Ana) }

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Projection counts for { Smokes(Ana) }: N-1, 0, 0

Page 61:

Projecting the Superfeatures

Supernodes:
  { Smokes(Ana) }

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Projection counts for { Smokes(Ana) }: N-1, 0, 0, 0 (one per superfeature, in order)

Page 62:

Projecting the Superfeatures

Supernodes (with projection counts, one per superfeature in order):
  { Smokes(Ana) }: N-1, 0, 0, 0
  { Smokes(Bob), Smokes(Charles) }: 0, 1, 1, N-3

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Page 63:

Projecting the Superfeatures

Supernodes (with projection counts, one per superfeature in order):
  { Smokes(Ana) }: N-1, 0, 0, 0
  { Smokes(Bob), Smokes(Charles) }: 0, 1, 1, N-3
  { Smokes(X), X ≠ Ana, Bob, Charles }: 0, 1, 0, N-2

Superfeatures:
  Smokes(Ana) ∧ Friends(Ana, X) ∧ Smokes(X), X ≠ Ana
  Smokes(X) ∧ Friends(X, Ana) ∧ Smokes(Ana), X ≠ Ana
  Smokes(Bob) ∧ Friends(Bob, Charles) ∧ Smokes(Charles)
  Smokes(Bob) ∧ Friends(Bob, X) ∧ Smokes(X), X ≠ Charles

Page 64:

Theorem

There exists a unique minimal lifted network
The lifted network construction algorithm finds it
BP on the lifted network gives the same result as on the ground network

Page 65:

Representing Supernodes and Superfeatures

List of tuples: simple but inefficient
Resolution-like: use equality and inequality
Form clusters (in progress)

Page 66:

Open Questions

Can we do approximate KBMC/lazy/lifting?
Can KBMC, lazy and lifted inference be combined?
Can we have lifted inference over both probabilistic and deterministic dependencies? (Lifted MC-SAT?)
Can we unify resolution and lifted BP?
Can other inference algorithms be lifted?