Markov Entropy Approximation Scheme

David Poulin

Département de Physique, Université de Sherbrooke

Joint work with: Matt Leifer, Ersen Bilgin, and Matt Hastings

Quantum Computation and Quantum Spin Systems, Vienna, August 2009


Faculty position @ Sherbrooke
Canadian Excellence Research Chair

Solid state information processing
Experimentalists and Theorists

$10 million Research Grant
≥ 300 m² of new lab or office space under construction


Task
Compute thermal properties of spin systems on lattices.

Origin of the method
Refinement of quantum generalized belief propagation.

Result
An algorithm that produces lower bounds to the free energy by solving a finite-size convex optimization problem with linear constraints.

Why is this interesting?
Variational methods (PEPS, MPS, MERA) provide UPPER bounds to the free energy.


Outline

1 Belief propagation

2 Quantum belief propagation

3 Markov Entropy Approximation Scheme

4 Closing


Belief propagation

Description of the algorithm

Purpose
Approximate the solution to statistical inference problems involving a large number of random variables v (spins) located at the vertices v ∈ V of a graph G = (V, E). The graph's edges (u, v) ∈ E encode some kind of direct dependency relation among the random variables.

Algorithm architecture
Highly parallelizable: one processor per vertex.
Messages are exchanged between processors related by an edge.
Outgoing messages at v depend on local "fields" and incoming messages.
Exact when G is a tree, with complexity = depth(G).
Good heuristic on loopy graphs.


Belief propagation

Transfer matrix

Consider the 1d classical system with Hamiltonian $H = \sum_i h(v_i) + \sum_{\langle ij\rangle} J(v_i, v_j)$.

Its Gibbs distribution is (with $\mu(v) = e^{-\beta h(v)}$ and $\nu(u:v) = e^{-\beta J(u,v)}$)

$\rho(v_1, v_2, \ldots) = \frac{1}{Z}\, e^{-\beta H(v_1, v_2, \ldots)} = \frac{1}{Z}\,\mu(v_1)\,\nu(v_1:v_2)\,\mu(v_2)\,\nu(v_2:v_3)\,\mu(v_3)\cdots$

So the reduced state of spin N can be evaluated step by step:

$m_{1\to 2}(v_2) = \sum_{v_1} \mu(v_1)\,\nu(v_1:v_2)$
$m_{2\to 3}(v_3) = \sum_{v_2} m_{1\to 2}(v_2)\,\mu(v_2)\,\nu(v_2:v_3)$
$m_{3\to 4}(v_4) = \sum_{v_3} m_{2\to 3}(v_3)\,\mu(v_3)\,\nu(v_3:v_4)$
$\vdots$
$\rho(v_N) = m_{N-1\to N}(v_N)\,\mu(v_N)$
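As a concrete illustration (not part of the slides), here is a minimal Python sketch of this message recursion for a chain of ±1 spins; the function name chain_marginal and the particular field, coupling, and inverse temperature are illustrative assumptions.

```python
import numpy as np

def chain_marginal(h, J, beta, N):
    """Marginal of the last spin of a 1D chain via the message recursion above.

    h(v): local field energy, J(u, v): coupling energy; spins take values +/-1.
    """
    spins = np.array([+1, -1])
    mu = np.exp(-beta * np.array([h(v) for v in spins]))                        # mu(v) = e^{-beta h(v)}
    nu = np.exp(-beta * np.array([[J(u, v) for v in spins] for u in spins]))    # nu(u:v) = e^{-beta J(u,v)}

    m = np.ones(2)                       # no left neighbour yet
    for _ in range(N - 1):
        m = (m * mu) @ nu                # m_{i->i+1}(v) = sum_u m(u) mu(u) nu(u:v)
        m /= m.sum()                     # normalize for numerical stability
    rho = m * mu                         # rho(v_N) up to the constant 1/Z
    return rho / rho.sum()

# Example: uniform field and ferromagnetic coupling (illustrative values)
print(chain_marginal(h=lambda v: -0.5 * v, J=lambda u, v: -u * v, beta=1.0, N=20))
```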


Belief propagation

General setting

Graph G = (V, E).
Random variables $v_i$ for $i \in V$.
Functions $\mu_i(v_i)$ for $i \in V$ and $\nu_{ij}(v_i : v_j)$ for $(i, j) \in E$.

You can think of them as exponentials of local fields, $\mu(v) = e^{-\beta h(v)}$, and couplings, $\nu(u:v) = e^{-\beta J(u,v)}$.

Task (basic case)
Evaluate

$\rho_k(v_k) = \sum_{\{v_i\} - v_k} \prod_{i \in V} \mu_i(v_i) \prod_{(i,j) \in E} \nu_{ij}(v_i : v_j) = \sum_{\{v_i\} - v_k} e^{-\beta\left(\sum_{i \in V} h_i(v_i) + \sum_{(i,j) \in E} J_{ij}(v_i, v_j)\right)}$


Belief propagation

Belief propagation algorithm

Algorithm

Initialization: $m_{u\to v}(v) = \text{const}$.
Iterations: $m_{u\to v}(v) \propto \sum_u \mu(u)\,\nu(u:v) \prod_{v' \in n(u) - v} m_{v'\to u}(u)$.

Beliefs:
$b(u) \propto \mu(u) \prod_{v \in n(u)} m_{v\to u}(u)$
$b(u, v) \propto \mu(u)\,\mu(v)\,\nu(u:v) \prod_{w \in n(u) - v} m_{w\to u}(u) \prod_{w \in n(v) - u} m_{w\to v}(v)$
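As an illustration, a small sketch of these update rules for a generic pairwise model; the dictionary-based graph representation and all identifiers (mu, nu, edges) are my own choices, not part of the original algorithm description.

```python
import numpy as np

def belief_propagation(mu, nu, edges, n_iter=50):
    """Loopy BP for a pairwise model.

    mu[u]: length-q array of node factors; nu[(u, v)]: q x q array of edge factors;
    edges: list of undirected pairs (u, v). Messages run in both directions.
    """
    nbrs = {u: set() for u in mu}
    for u, v in edges:
        nbrs[u].add(v); nbrs[v].add(u)
    q = len(next(iter(mu.values())))
    msg = {(u, v): np.ones(q) / q for u in mu for v in nbrs[u]}   # uniform initialization

    for _ in range(n_iter):
        new = {}
        for (u, v) in msg:
            prod = mu[u].copy()
            for w in nbrs[u] - {v}:
                prod *= msg[(w, u)]                   # product of incoming messages except from v
            pair = nu[(u, v)] if (u, v) in nu else nu[(v, u)].T
            m = prod @ pair                           # sum over u of mu(u) nu(u:v) * incoming
            new[(u, v)] = m / m.sum()
        msg = new

    beliefs = {}
    for u in mu:
        b = mu[u].copy()
        for w in nbrs[u]:
            b *= msg[(w, u)]
        beliefs[u] = b / b.sum()
    return beliefs
```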


Belief propagation

Bethe free energy

Let $H = \sum_v h_v + \sum_{\langle u,v\rangle} J_{uv}$ be a local Hamiltonian on G.

Given a probability distribution P(V), the Gibbs free energy is $G = E - TS$, where

$E = \sum_V P(V)\,H(V)$ and $S = -\sum_V P(V) \log P(V)$.

The Gibbs distribution $P(V) = \frac{1}{Z} e^{-\beta H}$ is the stationary point of G.

E only depends on the two-body probabilities:

$E = \sum_{\langle u,v\rangle} P(u,v)\,J_{uv} + \sum_u P(u)\,h_u$

So does S if P is a product of local functions on a tree (which includes Gibbs distributions):

$S = -\sum_{\langle u,v\rangle} P(u,v) \log P(u,v) - \sum_u (1 - d_u)\,P(u) \log P(u)$

where $d_u$ is the degree of vertex u.


Belief propagation

Bethe free energy

The Bethe free energy extends this expression to arbitrary graphs:

$G_{\mathrm{Bethe}} = \sum_{\langle u,v\rangle} P(u,v)\,\big(T \log P(u,v) + J_{uv}\big) + \sum_u \big[(1 - d_u)\,T\,P(u) \log P(u) + P(u)\,h_u\big].$

This is seemingly easier to handle because it involves only two-body distributions.
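For concreteness, a short sketch (my own, assuming the marginals are supplied as NumPy arrays in the data layout shown) of how this Bethe free energy would be evaluated from pairwise and single-site marginals:

```python
import numpy as np

def bethe_free_energy(P_pair, P_node, J, h, T, degree):
    """Bethe free energy G = E - T * S_Bethe from pairwise / single-site marginals.

    P_pair[(u, v)]: q x q joint marginal, P_node[u]: length-q marginal,
    J[(u, v)]: q x q coupling energies, h[u]: length-q local field energies,
    degree[u]: number of neighbours of u.  (Illustrative data layout.)
    """
    E = sum(np.sum(P_pair[e] * J[e]) for e in P_pair) \
        + sum(np.sum(P_node[u] * h[u]) for u in P_node)
    S = -sum(np.sum(P_pair[e] * np.log(P_pair[e])) for e in P_pair) \
        - sum((1 - degree[u]) * np.sum(P_node[u] * np.log(P_node[u])) for u in P_node)
    return E - T * S
```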


Belief propagation

Bethe free energy and BP

Theorem (Yedidia, Freeman, and Weiss)

The fixed points of the belief propagation algorithm, b(u, v) and b(u), are stationary points of the Bethe free energy.

Use BP to solve the Bethe free energy minimization.
Exact on a tree, complexity = depth(G).
Most successful on graphs with only large loops:

LDPC and Turbo codes.
Spin glasses on Bethe lattices.
Random k-SAT assignments.


Belief propagation

Generalized belief propagation

When the graph contains small loops, "collapse" them into a single super-site.
Messages are exchanged between (possibly overlapping) regions.

If the clustered graph is a tree, the method is exact.


Belief propagation

Kikuchi free energy

Fix a set of regions on the graph.
Approximate the entropy by the sum of the entropies of the regions.
Correct for the over-counting of the sub-regions.
Correct for the sub-sub-regions, ...

1 2 3
4 5 6
7 8 9

$S = S_{1245} + S_{2356} + S_{4578} + S_{5689} - S_{45} - S_{58} - S_{25} - S_{56}$

Corresponds to Bethe's free energy when the regions are nearest-neighbor pairs.


Belief propagation

Kikuchi free energy and generalized BP

Theorem (Yedidia, Freeman, and Weiss)
The fixed points of the generalized belief propagation algorithm are stationary points of the Kikuchi free energy computed using the same regions.*

*Technical detail: the counting number of each sub-region must be non-positive.

Greatly improves numerical precision for graphs containing small loops.
Can use GBP instead of minimizing the Kikuchi free energy.


Quantum belief propagation

The algorithm

Defining the log-exp product $A \odot B = e^{\log A + \log B}$, the quantum and classical settings become very similar.

Let G = (V, E) be a graph with Hamiltonian $H = \sum_i h_i + \sum_{(i,j)} J_{ij}$. Defining $\mu_i = e^{-\beta h_i}$ and $\nu_{ij} = e^{-\beta J_{ij}}$, the Gibbs state is

$\rho_V = \frac{1}{Z} \left( \bigotimes_{i \in V} \mu_i \right) \odot \left( \bigodot_{(i,j) \in E} \nu_{ij} \right)$

We can execute BP by substituting $\odot$ for ordinary products.

Theorem
If G is a tree and $(G, \rho_V)$ is a quantum Markov random field, then the beliefs $b_u$ converge to the correct marginals $\rho_u = \mathrm{Tr}_{V - u}\{\rho_V\}$ in a time proportional to depth(G).

This doesn't happen very often, but similar conditions for generalized BP are more natural.
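As a quick illustration of the $\odot$ product: a sketch using SciPy's expm/logm; the single-qubit Hamiltonian terms below are arbitrary examples, not taken from the slides.

```python
import numpy as np
from scipy.linalg import expm, logm

def odot(A, B):
    """Log-exp product A ⊙ B = exp(log A + log B) for positive-definite matrices."""
    return expm(logm(A) + logm(B))

Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
beta = 0.5
mu = expm(-beta * Z)   # e^{-beta h} with h = Z (illustrative local term)
nu = expm(-beta * X)   # e^{-beta J} with J = X (illustrative coupling term)

# For non-commuting terms mu ⊙ nu differs from the ordinary product mu @ nu,
# but by construction it combines the exponents:
print(np.allclose(odot(mu, nu), expm(-beta * (Z + X))))   # True
```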


Quantum belief propagation

Effective thermal hamiltonian

[Figure: infinite spin chain with sites ..., −5, ..., −1, 0, 1, ..., 5, ...; the left half is traced out.]

$\rho = \frac{1}{Z} \exp\{-\beta H\}, \qquad H = \sum_{i=-\infty}^{\infty} h_i + J_{i,i+1}$

$\rho' = \mathrm{Tr}_{-\infty \ldots -1}\{\rho\} = \frac{1}{Z'} \exp\{-\beta H_{\mathrm{eff}}\}$

Effective thermal Hamiltonian: $H_{\mathrm{eff}} = \sum_{i=1}^{\infty} h_i + J_{i,i+1} + V_1 + V_2 + V_3 + V_4 + \ldots$

[Figure: $|\langle \sigma^z_0 \sigma^z_j \rangle|$ and $\|V_j\|_2$ decay with j; curves shown for β = 1 and β = 20.]


Quantum belief propagation

Generalized quantum belief propagation

$\sigma_{1-4} = e^{-\beta(h_1 + h_2 + h_3 + h_4 + J_{12} + J_{23} + J_{34})}$

$\sigma'_{2-4} = \mathrm{Tr}_1\{\sigma_{1-4}\}, \qquad h'_{2-4} = -\tfrac{1}{\beta} \log \sigma'_{2-4}$

$\sigma_{2-5} = e^{-\beta(h'_{2-4} + h_5 + J_{45})}$

$\sigma'_{3-5} = \mathrm{Tr}_2\{\sigma_{2-5}\}, \qquad h'_{3-5} = -\tfrac{1}{\beta} \log \sigma'_{3-5}$

$\sigma_{3-6} = e^{-\beta(h'_{3-5} + h_6 + J_{56})}$

$\vdots$

$\rho_N \approx b_N = \mathrm{Tr}_{N-3,N-2,N-1}\{\sigma_{N-3,N-2,N-1,N}\}$
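A rough NumPy/SciPy sketch of this sliding-window recursion for a spin-1/2 chain follows. It is my own illustrative implementation, not the authors' code; the data layout (h_list of 2×2 site terms, J_list of 4×4 bond terms, window width w) and the embedding helper are assumptions.

```python
import numpy as np
from scipy.linalg import expm, logm

def qbp_marginal(h_list, J_list, beta, w=4):
    """Sliding-window estimate of the last-site marginal of a spin-1/2 chain."""
    d = 2

    def emb(op, pos, n):
        # Embed an operator acting on k consecutive sites at position `pos` of an n-site window.
        k = int(round(np.log(op.shape[0]) / np.log(d)))
        return np.kron(np.kron(np.eye(d**pos), op), np.eye(d**(n - pos - k)))

    N = len(h_list)
    # First window: sites 0 .. w-1
    H = sum(emb(h_list[i], i, w) for i in range(w)) \
        + sum(emb(J_list[i], i, w) for i in range(w - 1))

    for right in range(w, N):               # slide the window one site at a time
        sigma = expm(-beta * H)
        # Trace out the leftmost site of the window
        sigma_p = np.einsum('ijik->jk', sigma.reshape(d, d**(w - 1), d, d**(w - 1)))
        h_eff = -logm(sigma_p) / beta       # effective Hamiltonian on the remaining w-1 sites
        H = emb(h_eff, 0, w) + emb(h_list[right], w - 1, w) + emb(J_list[right - 1], w - 2, w)

    sigma = expm(-beta * H)
    # Belief for the last site: trace out the first w-1 sites of the final window
    bN = np.einsum('ijik->jk', sigma.reshape(d**(w - 1), d, d**(w - 1), d))
    return bN / np.trace(bN)
```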


Quantum belief propagation

When does this work?

[Figure: chain partitioned into three consecutive regions A, B, C.]

This quantum generalized belief propagation is exact when

$I(A : C|B) = S(AB) + S(BC) - S(B) - S(ABC) = 0$

This is probably never exactly the case, but as B gets larger it seemsto be a good approximation.
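For reference, a small sketch of how I(A:C|B) can be evaluated numerically from a tripartite density matrix; the helper names and the reshape-based partial traces are my own, not from the slides.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues (natural log)."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def conditional_mutual_information(rho_abc, dims):
    """I(A:C|B) = S(AB) + S(BC) - S(B) - S(ABC) for a tripartite density matrix.

    dims = (dA, dB, dC); partial traces are done by index contraction."""
    dA, dB, dC = dims
    t = rho_abc.reshape(dA, dB, dC, dA, dB, dC)
    rho_ab = np.einsum('abcdec->abde', t).reshape(dA * dB, dA * dB)   # trace out C
    rho_bc = np.einsum('abcade->bcde', t).reshape(dB * dC, dB * dC)   # trace out A
    rho_b = np.einsum('abcadc->bd', t)                                # trace out A and C
    return (von_neumann_entropy(rho_ab) + von_neumann_entropy(rho_bc)
            - von_neumann_entropy(rho_b) - von_neumann_entropy(rho_abc))
```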


Quantum belief propagation

Results on trees

[Figure: specific heat vs. temperature, comparing the Bethe ansatz with QBP (sliding window); inset: zoom near T ≈ 0.48.]

[Figure: Edwards-Anderson order parameter on a Cayley tree with a transverse-field Ising Hamiltonian; phase diagram of the Ising spin glass as a function of temperature and transverse field strength Bx. Laumann, Scardicchio, and Sondhi '07; Bilgin and Poulin.]

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 21 / 33


Quantum belief propagation

2D anti-ferromagnetic Heisenberg model

Doesn’t work so well in 2D...

Quantum Monte Carlo: M.S. Makivic and H.-Q. Ding, PRB '91. 10th-order J/T expansion. Generalized quantum belief propagation, window size 7.

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 22 / 33


Markov Entropy Approximation Scheme

Outline

1 Belief propagation

2 Quantum belief propagation

3 Markov Entropy Approximation Scheme

4 Closing

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 23 / 33


Markov Entropy Approximation Scheme

Entropy is a mess...

F(T) = min_ρ {E(ρ) − T S(ρ)} = min_ρ {Tr(ρH) + T · Tr(ρ log ρ)}

Energy can be computed from pieces of ρ:

E(ρ) = ∑_i Tr(ρ_i h_i) + ∑_ij Tr(ρ_ij J_ij)

Hence the energy is easy to compute: it involves only small matrices. The same cannot be achieved with the entropy.
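As a quick check of this statement, the sketch below builds the Gibbs state of a small Heisenberg chain (4 sites, unit coupling, an illustrative choice) and verifies that summing Tr(ρ_ij J_ij) over the bonds reproduces Tr(ρH) exactly.

import numpy as np
from scipy.linalg import expm

X = np.array([[0., 1.], [1., 0.]], dtype=complex)
Y = np.array([[0., -1j], [1j, 0.]])
Z = np.array([[1., 0.], [0., -1.]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def embed_pair(op1, op2, i, n):
    """op1 on site i and op2 on site i+1 of an n-site chain, identity elsewhere."""
    mats = [I2] * n
    mats[i], mats[i + 1] = op1, op2
    out = np.array([[1. + 0j]])
    for m in mats:
        out = np.kron(out, m)
    return out

def pair_marginal(rho, i, n):
    """Reduced density matrix of sites (i, i+1)."""
    t = rho.reshape([2] * (2 * n))
    row, col = list(range(n)), list(range(n, 2 * n))
    keep = [i, i + 1]
    for k in range(n):
        if k not in keep:
            col[k] = row[k]
    out = np.einsum(t, row + col, [row[k] for k in keep] + [col[k] for k in keep])
    return out.reshape(4, 4)

n, beta = 4, 1.0
J_bond = sum(np.kron(P, P) for P in (X, Y, Z))            # Heisenberg bond operator
H = sum(embed_pair(P, P, i, n) for i in range(n - 1) for P in (X, Y, Z))

rho = expm(-beta * H)
rho /= np.trace(rho)

E_from_pieces = sum(np.trace(pair_marginal(rho, i, n) @ J_bond).real
                    for i in range(n - 1))
E_exact = np.trace(rho @ H).real
print(E_from_pieces, E_exact)     # identical: the energy only needs two-site marginals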

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 24 / 33


Markov Entropy Approximation Scheme

Breaking entropy apart

[Figure: a chain of sites 1, 2, 3, ..., N.]

S = S_N
  = (S_N − S_{N−1}) + S_{N−1}
  = (S_N − S_{N−1}) + (S_{N−1} − S_{N−2}) + S_{N−2}
  = ∑_j (S_j − S_{j−1})
  = ∑_j S(j | 1, 2, ..., j−1)

By strong sub-additivity, discarding part of the conditioning system can only increase a conditional entropy:

S(j | 1, 2, ..., j−1) ≤ S(j | {1, 2, ..., j−1} ∩ N(j))

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 25 / 33
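A small numerical check of this inequality, in the same Python/NumPy style as the earlier sketches. The model (a 5-site transverse-field Ising chain at β = 1) and the choice of neighbourhood N(j) = {j−1, j, j+1}, so that the conditioning set reduces to the single site j−1, are illustrative assumptions.

import numpy as np
from scipy.linalg import expm

X = np.array([[0., 1.], [1., 0.]], dtype=complex)
Z = np.array([[1., 0.], [0., -1.]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def embed(ops, n):
    """Tensor product over an n-site chain; `ops` maps site index -> operator."""
    out = np.array([[1. + 0j]])
    for k in range(n):
        out = np.kron(out, ops.get(k, I2))
    return out

def reduce_to(rho, keep, n):
    """Reduced density matrix on the qubits listed in `keep`."""
    t = rho.reshape([2] * (2 * n))
    row, col = list(range(n)), list(range(n, 2 * n))
    for k in range(n):
        if k not in keep:
            col[k] = row[k]
    out = np.einsum(t, row + col, [row[k] for k in keep] + [col[k] for k in keep])
    d = 2 ** len(keep)
    return out.reshape(d, d)

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

n, beta = 5, 1.0
H = sum(embed({k: X}, n) for k in range(n)) \
  + sum(embed({k: Z, k + 1: Z}, n) for k in range(n - 1))
rho = expm(-beta * H)
rho /= np.trace(rho)

S_exact = entropy(rho)
# Markov approximation: condition site j only on its left neighbour j-1
S_star = entropy(reduce_to(rho, [0], n))
for j in range(1, n):
    S_star += entropy(reduce_to(rho, [j - 1, j], n)) - entropy(reduce_to(rho, [j - 1], n))

print(S_exact, S_star)            # S_star >= S_exact, by strong sub-additivity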


Markov Entropy Approximation Scheme

Bounding the free energy

S∗ = ∑_j S(j | {1, 2, ..., j − 1} ∩ N(j)) ≥ S

S∗ can be computed from pieces ρ_P(j) of ρ, with P(j) = {1, 2, ..., j} ∩ N(j).

This provides an approximation to the free energy:

F(T) ≥ F∗(T) = min_{ρ_P(j) : consistent} { E({ρ_P(j)}) − T S∗({ρ_P(j)}) }

Consistent means that a global state with these marginals exists: ∃ρ such that ρ_P(j) is the reduced state of ρ on P(j). Deciding this is QMA-complete.

Relax the constraint to local consistency, i.e. the marginals must agree on the overlaps of patches:

Tr_{P(j)\P(k)}(ρ_P(j)) = Tr_{P(k)\P(j)}(ρ_P(k))

F∗(T) ≥ F∗∗(T) = min_{ρ_P(j) : locally consistent} { E({ρ_P(j)}) − T S∗({ρ_P(j)}) }

Convex optimization with linear constraints.
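Why these two inequalities hold, written out as a short derivation (LaTeX):

\begin{align*}
  S^*(\rho) \ge S(\rho)
    &\;\Longrightarrow\; E(\rho) - T\,S^*(\rho) \le E(\rho) - T\,S(\rho)
      \quad\text{for every } \rho \text{ and } T>0,\\
    &\;\Longrightarrow\; F^*(T) = \min_{\{\rho_{P(j)}\}\,\text{consistent}}
      \bigl\{E - T S^*\bigr\}
      \;\le\; \min_{\rho}\bigl\{E(\rho) - T S(\rho)\bigr\} = F(T),\\
  \{\text{consistent}\} \subseteq \{\text{locally consistent}\}
    &\;\Longrightarrow\; F^{**}(T) \le F^*(T) \le F(T).
\end{align*}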

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 26 / 33


Markov Entropy Approximation Scheme

2D anti-ferromagnetic Heisenberg model revisited

[Figure: energy and free energy per site vs. temperature (1/J) for the 2D antiferromagnetic Heisenberg model, comparing Free Energy MEAS, Energy MEAS, Energy from Exact Diagonalization (4x4 and 6x6), Energy from Monte Carlo, and Free Energy from a High Temperature Series. Inset: low-temperature behaviour near E0 ≈ −0.7062...]

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 27 / 33


Markov Entropy Approximation Scheme

Improved finite size effects

MEAS requires an 8-site diagonalization and matches a 16-site diagonalization. The constraints force the system to behave as if it were part of a larger lattice.

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 28 / 33


Markov Entropy Approximation Scheme

Positive entropy density constraint

Alternative method to estimate ground energy

Minimize Tr(ρH) on a finite lattice with translational constraints on ρ.

Our method is more accurate because it imposes a positive entropy density constraint. The free energy at the temperature where S∗ goes negative is a rigorous lower bound on E0 in the thermodynamic limit.

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 29 / 33


Closing

Outline

1 Belief propagation

2 Quantum belief propagation

3 Markov Entropy Approximation Scheme

4 Closing

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 30 / 33


Closing

Tightening the bound

S∗ > S because it neglects some correlations in the system. Had we worked with a different choice of patches, we would have neglected other correlations. The average p S∗1 + (1 − p) S∗2 is therefore still a valid upper bound to S. In addition, we can impose cross-constraints between the two patch families that will tighten the bound.

Combine patches that explore multiple length-scales.
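The convexity argument behind the mixed bound, written out (the weights p and the two patch families are whatever one chooses to combine):

\begin{align*}
  S \le S^{*}_{1} \ \text{ and } \ S \le S^{*}_{2}
  \quad\Longrightarrow\quad
  S \;\le\; p\,S^{*}_{1} + (1-p)\,S^{*}_{2}
  \qquad\text{for any } 0 \le p \le 1 .
\end{align*}

Using this mixed entropy in the variational functional therefore still yields a lower bound on the free energy, and the cross-constraints linking the two patch families shrink the feasible set of the relaxed program, so they can only raise (tighten) the resulting bound.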

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 31 / 33


Closing

Dual program

Numerical minimization might fail to find the true minimum of F∗. In that case our result would not be a rigorous lower bound to the system's free energy. Any feasible point of the dual program, however, provides a lower bound to the minimum of F∗, and hence to the system's free energy.
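This is ordinary weak duality for a convex program; a generic statement in LaTeX (f, x, A, b, K and λ are standard convex-optimization symbols, not notation from the slides):

\begin{align*}
  p^{\star} &= \min_{x \in K}\ \{\, f(x) \ :\ Ax = b \,\},
  \qquad L(x,\lambda) = f(x) + \lambda^{T}(Ax - b),
  \qquad g(\lambda) = \min_{x \in K} L(x,\lambda),\\
  g(\lambda) &\le L(x,\lambda) = f(x) \quad \text{for every feasible } x \in K
  \quad\Longrightarrow\quad g(\lambda) \le p^{\star} \ \text{ for every } \lambda .
\end{align*}

Here f would be the convex functional E − T S∗ of the patch marginals, K the set of positive semidefinite, unit-trace patch matrices, and Ax = b the linear local-consistency constraints, so evaluating the dual function at any multiplier certifies a lower bound even if the primal solver has not converged.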

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 32 / 33


Conclusion

Belief propagation is a powerful heuristic for solving statistical inference problems involving a large number of random variables in a Markov random field.

Fixed points of BP = stationary points of the Bethe free energy.

Generalized belief propagation improves on BP when the graph has small loops.

Fixed points of GBP = stationary points of the Kikuchi free energy.

Quantum generalized BP provides reliable estimates because the effective thermal Hamiltonian is short ranged.

The Markov Entropy Approximation Scheme uses strong sub-additivity to lower bound the system's free energy.

Linearizing the consistency constraints decreases this bound and yields a solvable problem.

Our (simple) numerical results interpolate nicely between high-T series and exact diagonalization of larger lattices.

David Poulin (Sherbrooke) Markov Entropy Approximation Scheme Vienna’09 33 / 33
