
Loop calculus in statistical physics and information theory

Michael Chertkov¹ & Vladimir Chernyak²,¹

¹ CNLS & T-13, LANL; ² Wayne State, Detroit

Thanks to M. Stepanov (UofA, Tucson)

Outline

1 Introduction
  Error Correction, Low-Density Parity-Check Codes
  Statistical Inference, Belief Propagation, Graphical Models
  Bethe Free Energy, Linear Programming Decoding
  Error-Floor, Pseudo-codewords & Instantons

2 Loop Calculus: General Statement & Derivation

3 Loop Calculus & Error-Correction
  Loop calculus & Instantons: Analysis of Error-Floor
  Effective Free Energy Approach: Towards better Decoding

4 Conclusions


Error Correction

Scheme:

Example of Additive White Gaussian Channel:

P(x^{out}\,|\,x^{in}) = \prod_{i\in\text{bits}} p(x^{out}_i\,|\,x^{in}_i), \qquad p(x|y) \sim \exp\bigl(-s^2(x-y)^2/2\bigr)

Channel is a noisy "black box": only statistical information about it is available

Encoding: use redundancy to redistribute the damaging effect of the noise

Decoding: reconstruct the most probable codeword from the noisy (polluted) channel output
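A minimal sketch (Python; not part of the original slides) of this channel model, assuming the slide's convention that 1/s² plays the role of the noise variance and that codeword bits are mapped to ±1; the helper names awgn_channel and log_likelihoods are illustrative:

```python
import numpy as np

def awgn_channel(x_in, s, rng=np.random.default_rng(0)):
    """Pass a +/-1 codeword through the AWGN channel p(x_out|x_in) ~ exp(-s^2 (x_out - x_in)^2 / 2),
    i.e. additive Gaussian noise of variance 1/s^2 acting independently on every bit."""
    return np.asarray(x_in, dtype=float) + rng.normal(scale=1.0 / s, size=len(x_in))

def log_likelihoods(x_out, s):
    """Bitwise log-likelihood ratios h_i = log p(x_out;i|+1) - log p(x_out;i|-1) = 2 s^2 x_out;i."""
    return 2.0 * s**2 * np.asarray(x_out)

# toy usage: the all-(+1) codeword through a moderately noisy channel
y = awgn_channel(np.ones(10), s=1.5)
print(log_likelihoods(y, s=1.5))
```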


Low Density Parity Check Codes

N bits, M checks, L = N − M information bits; example: N = 10, M = 5, L = 5

2^L codewords out of 2^N possible patterns

Parity check: H v = c = 0; example:

H = \begin{pmatrix}
1 & 1 & 1 & 1 & 0 & 1 & 1 & 0 & 0 & 0 \\
0 & 0 & 1 & 1 & 1 & 1 & 1 & 1 & 0 & 0 \\
0 & 1 & 0 & 1 & 0 & 1 & 0 & 1 & 1 & 1 \\
1 & 0 & 1 & 0 & 1 & 0 & 0 & 1 & 1 & 1 \\
1 & 1 & 0 & 0 & 1 & 0 & 1 & 0 & 1 & 1
\end{pmatrix}

LDPC = graph (parity check matrix) is sparse
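Not from the slides: a brute-force check of the parity-check condition for this toy H (feasible only because N = 10), confirming the 2^L = 32 codewords among the 2^N = 1024 patterns:

```python
import itertools
import numpy as np

# Example parity-check matrix from the slide: N = 10 bits, M = 5 checks.
H = np.array([
    [1, 1, 1, 1, 0, 1, 1, 0, 0, 0],
    [0, 0, 1, 1, 1, 1, 1, 1, 0, 0],
    [0, 1, 0, 1, 0, 1, 0, 1, 1, 1],
    [1, 0, 1, 0, 1, 0, 0, 1, 1, 1],
    [1, 1, 0, 0, 1, 0, 1, 0, 1, 1],
])

def is_codeword(v):
    """A 0/1 vector v is a codeword iff H v = 0 (mod 2)."""
    return not np.any(H.dot(v) % 2)

codewords = [v for v in itertools.product([0, 1], repeat=H.shape[1]) if is_codeword(np.array(v))]
print(len(codewords))  # 2^(N-M) = 32, since this H has full row rank over GF(2)
```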


Maximum Likelihood/Maximum-a-Posteriori

Exhaustive search for pre-image = the best one can possibly do

\mathrm{ML} = \arg\max_{\sigma\in\text{codewords}} P(x^{out}|\sigma); \qquad \mathrm{MAP}_i = \mathrm{sign}\!\left(\frac{\sum_\sigma \sigma_i\,P(x^{out}|\sigma)}{\sum_\sigma P(x^{out}|\sigma)}\right)

MAP ≈ BP = Belief Propagation (Bethe-Peierls), Gallager '61

Exact on a tree (Derivation Sketch: see Bonus Material)

Trading optimality for reduction in complexity: \sim 2^L \to \sim L

BP = solving equations on the graph:

\eta_{j\alpha} = h_j + \sum_{\beta\ni j,\,\beta\neq\alpha}\tanh^{-1}\Bigl(\prod_{i\in\beta,\,i\neq j}\tanh\eta_{i\beta}\Bigr)

Message Passing = iterative BP

Applies to a general inference problem on a (sparse) graph
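Below is a plain-loop transcription (Python; not the authors' code) of the message-passing update above; bp_decode and the dictionary-based message storage are illustrative choices, and no attention is paid to numerical corner cases beyond a simple clip:

```python
import numpy as np

def bp_decode(H, h, iters=50):
    """Iterative BP (sum-product) in the slide's notation:
    eta[j,a] = h[j] + sum_{checks b containing j, b != a} atanh( prod_{bits i in b, i != j} tanh(eta[i,b]) )."""
    M, N = H.shape
    checks = [np.nonzero(H[a])[0] for a in range(M)]    # bits participating in each check
    bits = [np.nonzero(H[:, j])[0] for j in range(N)]   # checks touching each bit
    eta = {(j, a): h[j] for j in range(N) for a in bits[j]}
    for _ in range(iters):
        # check-to-bit messages (clipped to keep atanh finite)
        u = {(a, j): np.arctanh(np.clip(np.prod([np.tanh(eta[i, a]) for i in checks[a] if i != j]),
                                        -0.999999, 0.999999))
             for a in range(M) for j in checks[a]}
        # bit-to-check messages
        eta = {(j, a): h[j] + sum(u[b, j] for b in bits[j] if b != a)
               for j in range(N) for a in bits[j]}
    m = np.array([h[j] + sum(u[b, j] for b in bits[j]) for j in range(N)])  # a-posteriori log-likelihoods
    return m, (m < 0).astype(int)   # sigma_j = sign(m_j); bit decision 1 where the sign is negative
```

With the toy H above and h = log_likelihoods(y, s) from the channel sketch, bp_decode(H, h) returns the a-posteriori log-likelihoods m_j that feed the MAP-style sign decision.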


Graphical models of Statistical Inference

Factorization (Forney ’01, Loeliger ’01)

P(\vec\sigma) = Z^{-1}\prod_{a\in X} f_a(\sigma_a), \qquad Z = \sum_{\vec\sigma}\prod_{a\in X} f_a(\sigma_a), \qquad f_a \ge 0

Variables live on the edges of the graph: \sigma_{ab} = \sigma_{ba} = \pm 1; e.g. \sigma_1 = (\sigma_{12}, \sigma_{14}, \sigma_{18}), \; \sigma_2 = (\sigma_{12}, \sigma_{13})

Error-Correction (bipartite):

f_i(\sigma_i) = \begin{cases} 1, & \sigma_{i\alpha} = \sigma_{i\beta} \;\;\forall\,\alpha,\beta\ni i \\ 0, & \text{otherwise} \end{cases}

f_\alpha(\sigma_\alpha) = \delta\!\Bigl(\prod_{i\in\alpha}\sigma_i,\,+1\Bigr)\exp\Bigl(\sum_{i\in\alpha}\sigma_i h_i/q_i\Bigr)

h_i: log-likelihoods; q_i: connectivity degrees
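For the toy code above one can evaluate Z of this bipartite model by brute force (an illustration, not from the slides). Summing out the bit factors f_i leaves one spin σ_i = ±1 per bit, and the 1/q_i weights recombine into exp(Σ_i h_i σ_i), so:

```python
import itertools
import numpy as np

def partition_function(H, h):
    """Z = sum over sigma in {-1,+1}^N of [prod_alpha delta(prod_{i in alpha} sigma_i, +1)] * exp(sum_i h_i sigma_i),
    evaluated by exhaustive enumeration (only sensible for the N = 10 toy example)."""
    M, N = H.shape
    Z = 0.0
    for sigma in itertools.product([-1, +1], repeat=N):
        sigma = np.array(sigma)
        if all(np.prod(sigma[np.nonzero(H[a])[0]]) == +1 for a in range(M)):  # every check satisfied
            Z += np.exp(np.dot(h, sigma))
    return Z
```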


Bethe free energy: variational approach (Yedidia, Freeman, Weiss '01; inspired by Bethe '35, Peierls '36)

F = -\sum_a\sum_{\sigma_a} b_a(\sigma_a)\ln f_a(\sigma_a) + \sum_a\sum_{\sigma_a} b_a(\sigma_a)\ln b_a(\sigma_a) - \sum_{(a,c)}\sum_{\sigma_{ac}} b_{ac}(\sigma_{ac})\ln b_{ac}(\sigma_{ac})

constraints:

\forall\, a,\, c\in a: \quad 0 \le b_a(\sigma_a),\, b_{ac}(\sigma_{ac}) \le 1

\forall\, a,\, c\in a: \quad \sum_{\sigma_a} b_a(\sigma_a) = \sum_{\sigma_{ac}} b_{ac}(\sigma_{ac}) = 1

\forall\, a,\, c\in a: \quad b_{ac}(\sigma_{ac}) = \sum_{\sigma_a\setminus\sigma_{ac}} b_a(\sigma_a)

Belief-Propagation Equations: \quad \frac{\delta F}{\delta b}\Big|_{\text{constraints}} = 0

(Variational Method in Stat Mech: see Bonus Material)

• Relaxation to minimum of the Bethe Free energy enforces convergence of iterative BP (Stepanov, Chertkov ’06)

LP decoding (Feldman, Wainwright, Karger '03)

LP decoding = minimization of a linear function over a bounded domain described by linear constraints

Fast and Discrete

"Large SNR" limit of BP: \quad F \approx E = -\sum_a\sum_{\sigma_a} b_a(\sigma_a)\ln f_a(\sigma_a)
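A direct transcription of the Bethe functional above into a small helper (Python; illustrative data layout, 0·ln 0 terms not handled); it is not part of the original material:

```python
import numpy as np

def bethe_free_energy(factors, beliefs_a, beliefs_ac):
    """F = -sum_a sum b_a ln f_a + sum_a sum b_a ln b_a - sum_(a,c) sum b_ac ln b_ac.
    factors and beliefs_a map a vertex to an array over its joint local states;
    beliefs_ac maps an edge (a, c) to an array over the shared variable's states."""
    F = 0.0
    for a, f in factors.items():
        b = beliefs_a[a]
        F += np.sum(b * (np.log(b) - np.log(f)))    # energy + vertex entropy terms
    for b in beliefs_ac.values():
        F -= np.sum(b * np.log(b))                  # edge entropy counter-terms
    return F
```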


Error-Floor

T. Richardson, Allerton ’03

BER vs SNR = measure of performance

Waterfall ↔ Error-floor

Suboptimal decoding causes the error-floor: at E_s/N_0 \to \infty,

\mathrm{FER}_{ML} \sim \exp(-d_{ML}\,E_s/N_0) \quad \text{vs} \quad \mathrm{FER}_{sub} \sim \exp(-d_{sub}\,E_s/N_0), \qquad d_{ML} > d_{sub}

Monte-Carlo is useless at BER \lesssim 10^{-8}
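A back-of-the-envelope illustration of the last point (not from the slides), using the standard rule of thumb that roughly 100/p independent trials are needed to estimate a rate p to about 10% relative accuracy:

```python
# Each trial is a full encode / channel / decode simulation, so the error-floor region is
# far beyond what brute-force Monte-Carlo can reach.
for p in (1e-4, 1e-6, 1e-8, 1e-10):
    print(f"FER ~ {p:.0e}: roughly {100 / p:.0e} Monte-Carlo trials needed")
```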


Pseudo-codewords and Instantons

Error-floor is caused by (rare) troublemakers

instantons → pseudo-codewords

log-likelihoods → a-posteriori log-likelihoods

Bibliography:

Pseudo-codewords & Error-floor: Wiberg '96; Forney et al. '99; Frey et al. '01; Richardson '04; Vontobel, Koetter '04-'06

Instantons: Stepanov et al. '04-'06 (Pseudo-Codeword Search: see Bonus Material)

Shopping list

Analyze the troublemakers

Improve decoding to reduce the error-floor


Loop Series for Partition function

(Chertkov, Chernyak '06)

Exact expression (for the partition function, etc.) in terms of BP:

Z = \sum_{\vec\sigma}\prod_a f_a(\sigma_a) = Z_0\Bigl(1 + \sum_C r(C)\Bigr), \qquad r(C) = \frac{\prod_{a\in C}\mu_a}{\prod_{(ab)\in C}\bigl(1-m_{ab}^2\bigr)}

C ∈ Generalized Loops = loops without loose ends

m_{ab} = \sum_{\sigma_a} b^{(bp)}_a(\sigma_a)\,\sigma_{ab}, \qquad \mu_a = \sum_{\sigma_a} b^{(bp)}_a(\sigma_a)\prod_{b\in a,\,(ab)\in C}\bigl(\sigma_{ab}-m_{ab}\bigr)

The Loop Series is finite

b^{(bp)}_{ab}, b^{(bp)}_a, Z_0 and all terms in the series are calculated within BP

BP is exact on a tree

BP is a gauge-fixing condition. Other choices of gauge would lead to different representations.

(Derivation Sketch: see Bonus Material)
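To make "loops without loose ends" concrete, here is a small enumeration (Python; illustrative, not from the slides) of the generalized loops of a toy graph, a 4-cycle with one chord:

```python
import itertools
from collections import Counter

edges = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]   # 4-cycle plus the chord (1, 3)

def generalized_loops(edges):
    """Nonempty edge subsets in which every involved vertex has degree >= 2 (no loose ends)."""
    loops = []
    for r in range(1, len(edges) + 1):
        for subset in itertools.combinations(edges, r):
            degree = Counter(v for e in subset for v in e)
            if all(d >= 2 for d in degree.values()):
                loops.append(subset)
    return loops

for C in generalized_loops(edges):
    print(C)
# For this graph: the two triangles, the outer 4-cycle, and the whole graph (their union).
```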


Synthesis of Pseudo-Codeword Search Algorithm & Loop Calculus

Consider pseudo-codewords one after another

For an individual pseudo-codeword/instanton, identify a critical loop giving the major contribution to the loop series: Z = Z_0\bigl(1+\sum_C r_C\bigr) \approx Z_0\bigl(1+r(\Gamma)\bigr)

Hint: look for single-connected loops and use local "triad" contributions as a tester:

r(\Gamma) = \prod_{\alpha\in\Gamma}\bar\mu^{(bp)}_\alpha, \qquad \bar\mu^{(bp)}_\alpha = \frac{\mu^{(bp)}_\alpha}{\sqrt{\bigl(1-(m^{(bp)}_i)^2\bigr)\bigl(1-(m^{(bp)}_j)^2\bigr)}}

(i and j label the two bits neighboring the check α along Γ)

Proof-of-Concept test [(155, 64, 20) code over AWGN]

∀ pseudo-codewords with 16.4037 < d < 20 (∼ 200 found) there always exists a simple, single-connected critical loop with r(Γ) ∼ 1.

Pseudo-codewords with the lowest d show r(Γ) = 1

Invariant with respect to other choices of the original codeword

A correction to the log-likelihood at a bit of the critical loop brings the cumulative result to zero. The correction to an a-posteriori log-likelihood always aligns with the correction to the respective log-likelihood.
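In practice the tester amounts to evaluating r(Γ) along a candidate single-connected loop from the BP marginals; a direct transcription of the general formula from the Loop Series slide (Python; illustrative data layout, with μ_a understood as already evaluated for the loop in question):

```python
import numpy as np

def loop_amplitude(mu, m, loop_vertices, loop_edges):
    """r(Gamma) = prod_{a in Gamma} mu_a / prod_{(ab) in Gamma} (1 - m_ab^2);
    mu maps loop vertices to their mu_a, m maps loop edges to the BP magnetizations m_ab."""
    numerator = np.prod([mu[a] for a in loop_vertices])
    denominator = np.prod([1.0 - m[e] ** 2 for e in loop_edges])
    return numerator / denominator
```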


Extended Variational Principle & Loop-Corrected BP

Bare BP variational principle:

\frac{\delta Z_0}{\delta\eta_{ab}}\Big|_{\eta^{(bp)}} = 0, \qquad Z_0 = \Bigl(\prod_{bc}2\cosh(\eta_{bc}+\eta_{cb})\Bigr)^{-1}\sum_{\vec\sigma}\prod_a P_a(\sigma_a)\Big|_{\eta^{(bp)}}

New choice of gauges, guided by the knowledge of the critical loop Γ:

\frac{\delta\exp(-F)}{\delta\eta_{ab}}\Big|_{\eta_{eff}} = 0, \qquad F \equiv -\ln(Z_0 + Z_\Gamma)

The BP equations are modified along the critical loop Γ:

\frac{\sum_{\sigma_a}\bigl(\tanh(\eta_{ab}+\eta_{ba})-\sigma_{ab}\bigr)P_a(\sigma_a)}{\sum_{\sigma_a}P_a(\sigma_a)}\Bigg|_{\eta_{eff}} = \frac{\prod_{d\in\Gamma}\mu_{d;\Gamma}}{\prod_{(a'b')\in\Gamma}\bigl(1-(m^{(*)}_{a'b'})^2\bigr)}\,\delta m_{a\to b;\Gamma}\Bigg|_{\eta_{eff}} \neq 0 \quad [\text{along }\Gamma]

Loop-Corrected BP Algorithm

1. Run the bare BP algorithm. Terminate if BP succeeds (i.e. a valid codeword is found).

2. If BP fails, find the most relevant loop Γ, the one with the maximal |r_Γ|. The triad search helps here.

3. Solve the modified BP equations for the given Γ. Terminate if the improved BP succeeds.

4. Return to Step 2 with an improved Γ-loop selection (a control-flow sketch follows below).
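A control-flow sketch of the algorithm (Python; not the authors' implementation). The routines bare_bp, find_critical_loop and loop_corrected_bp are placeholders for the steps named above, and "improved Γ-loop selection" is modeled here simply as excluding loops that were already tried:

```python
def decode_with_loop_correction(H, h, bare_bp, find_critical_loop, loop_corrected_bp, max_rounds=5):
    sigma, success = bare_bp(H, h)                           # step 1: bare BP
    tried = []
    for _ in range(max_rounds):
        if success:
            return sigma                                     # a valid codeword was found
        gamma = find_critical_loop(H, h, sigma, tried)       # step 2: loop with maximal |r_Gamma|
        sigma, success = loop_corrected_bp(H, h, gamma)      # step 3: BP modified along Gamma
        tried.append(gamma)                                  # step 4: improve the loop selection
    return None                                              # decoding failure
```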


LP-erasure (even simpler) algorithm

1. Run the LP algorithm. Terminate if LP succeeds (i.e. a valid codeword is found).

2. If LP fails, find the most relevant loop Γ, the one with the maximal amplitude r(Γ).

3. Modify the log-likelihoods along the loop Γ, introducing a shift towards zero, i.e. a complete or partial erasure of the log-likelihoods at these bits (sketched below). Run LP with the modified log-likelihoods. Terminate if the modified LP succeeds.

4. Return to Step 2 with an improved selection principle for the critical loop.
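Step 3 in isolation, as a sketch (Python; not the authors' code): a complete or partial erasure of the log-likelihoods on the bits of the critical loop before re-running LP, with strength = 1 being a complete erasure.

```python
import numpy as np

def erase_along_loop(h, loop_bits, strength=1.0):
    """Shift the log-likelihoods on the loop bits towards zero, leaving all other bits untouched."""
    h_mod = np.array(h, dtype=float)
    h_mod[list(loop_bits)] *= (1.0 - strength)
    return h_mod
```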

(155, 64, 20) Test

IT WORKS! All troublemakers (∼ 200 of them) previously found by the LP-based Pseudo-Codeword-Search Algorithm were successfully corrected by the loop-improved LP algorithm.

The method is invariant with respect to the choice of the codeword (used to generate pseudo-codewords).

General Conjecture:

Loop-erasure algorithm is capable of reducing the error-floor.


Results

Loop Calculus = generic tool for calculating marginal probabilities in terms of loops on the underlying graphical structure

Pseudo-Codeword-Search Algorithm = efficient way of describing the pseudo-codeword spectrum for LP decoding

Pseudo-codewords are all associated with respective critical loops

Effective Free Energy Approach = variational principle improving BP with critical-loop information

Loop-corrected BP and Loop-erasure = algorithms improving BP/LP with critical-loop information


Future Efforts

Improve and continue testing the simple LP-erasure algorithm.

The major improvement required is automation of the critical-loop identification scheme (towards a Monte Carlo test).

Testing other (longer) codes.

Testing other (e.g. correlated) channels.

All of the above for improving Loop-corrected BP.

Stat Mech & Inf. Theory (inter-symbol interference, network capacity) on a d-dimensional lattice/graph with/without disorder.

Other complementary developments, e.g. on

Improving BP [Survey Propagation = Mezard et al. '02; Generalized BP = Yedidia et al. '01]

Correcting for Loops in BP [Montanari, Rizzo '05; Parisi, Slanina '05]

Accelerating convergence of bare BP [Stepanov, Chertkov ’06]

Reducing LP complexity [Taghavi, Siegel ’06; Vontobel, Koetter ’06]

Improving LP [Dimakis, Wainwright ’06]


Bibliography

M. Chertkov, V.Y. Chernyak, Loop Calculus Helps to Improve Belief Propagation and Linear Programming Decodings of Low-Density-Parity-Check Codes, 44th Allerton Conference (September 27-29, 2006, Allerton, IL); arXiv:cs.IT/0609154.

M. Chertkov, V.Y. Chernyak, Loop Calculus in Statistical Physics and Information Science, Phys. Rev. E 73, 065102(R) (2006); cond-mat/0601487.

M. Chertkov, V.Y. Chernyak, Loop series for discrete statistical models on graphs, J. Stat. Mech. (2006) P06009; cond-mat/0603189.

M. Chertkov, M.G. Stepanov, An Efficient Pseudo-Codeword Search Algorithm for Linear Programming Decoding of LDPC Codes, arXiv:cs.IT/0601113; submitted to IEEE Transactions on Information Theory.

M.G. Stepanov, V. Chernyak, M. Chertkov, B. Vasic, Diagnosis of weakness in error correction: a physics approach to error floor analysis, Phys. Rev. Lett. 95, 228701 (2005). [See also http://www.arxiv.org/cond-mat/0506037 for an extended version with Supplements.]

V. Chernyak, M. Chertkov, M. Stepanov, B. Vasic, Error correction on a tree: An instanton approach, Phys. Rev. Lett. 93, 198702 (2004).

M.G. Stepanov, M. Chertkov, Instanton analysis of Low-Density-Parity-Check codes in the error-floor regime, arXiv:cs.IT/0601070; ISIT 2006 (July 2006, Seattle, WA).

M.G. Stepanov, M. Chertkov, Improving convergence of belief propagation decoding, arXiv:cs.IT/0607112; 44th Allerton Conference (September 27-29, 2006, Allerton, IL).

All papers are available at http://cnls.lanl.gov/~chertkov/pub.htm


Bonus Material: Instanton/Optimal Fluctuations/Pseudo-Codeword Analysis; Other Problems in Information/Computer Science; Statistical Mechanics on a Lattice: Re-summation of Loops

Optimal Fluctuation

Laplace method = large deviation = steepest descent = instanton = optimal fluctuation method

\mathrm{BER} = \int d(\text{noise})\,\mathrm{Weight}(\text{noise}), \qquad \mathrm{BER} \sim \mathrm{Weight}\bigl(\text{optimal configuration of the noise}\bigr)

optimal configuration of the noise = point on the error surface (ES) closest to zero

Chernyak, Chertkov, Stepanov, Vasic, Phys. Rev. Lett. 93, 198702 (2004)


(155,64,20) Tanner code. Gaussian Channel.

Instanton-amoeba (numerical minimization) implemented for 4 iterations of iterative BP

[Figure: three instanton configurations of the (155, 64, 20) code, with effective distances ≈ 10.076, ≈ 10.203 and ≈ 10.298]

Stepanov, Chertkov, Chernyak, Vasic, Phys Rev Lett 95, 228701 (2005) + cond-mat/0506037 + ISIT 2006


(155,64,20) Tanner code. Laplacian Channel.

[Figure: three instanton configurations with l²_eff = 7.6, 8 and 8]

Lesson:

Strong dependence of the error-floor performance on the channel

Stepanov, Chertkov, 43rd Allerton conference (2005), arXiv:cs.IT/0507031


Example: Inter-Symbol interference (Partial Response) channel

Di-code formulation

output: y_i = \sum_j J_{ij}\sigma_j + \xi_i, \qquad \langle\xi_i\rangle = 0, \quad \langle\xi_i\xi_j\rangle = \delta_{ij}/s^2

σ is encoded by an LDPC code

di-code = Inter-Symbol Interference + LDPC code

Decoding of the di-code = solving an inference problem on an extended Tanner graph
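A sketch of the channel model above (Python; not from the slides). The banded J below, the "1 − D" dicode response, is an illustrative choice of the inter-symbol-interference matrix:

```python
import numpy as np

def isi_channel(sigma, s, rng=np.random.default_rng(0)):
    """y_i = sum_j J_ij sigma_j + xi_i with white Gaussian noise of variance 1/s^2."""
    n = len(sigma)
    J = np.eye(n) - np.eye(n, k=-1)     # J_ii = 1, J_{i,i-1} = -1: the dicode (1 - D) partial response
    return J @ np.asarray(sigma, dtype=float) + rng.normal(scale=1.0 / s, size=n)
```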

Our approach

Formulate di-BP = minimum of the joint (di-) Bethe free energy

Test iterative version of the di-BP against Monte-Carlo simulations

Apply the instanton (and instanton-LP) approach to the analysis of the di-BP (di-code) error-floor

Develop loop-improved di-BP/LP

Anguita, Chertkov – in progress


Dilute Gas of Loops

If d (dimensionality) is large for a lattice problem, or if N is large for a random graph of size N:

Z = Z_0\Bigl(1+\sum_C r_C\Bigr) \approx Z_0\prod_{C_{sc}}\bigl(1+r_{C_{sc}}\bigr)

C_{sc} are single-connected loops.

The approximation allows an easy multi-scale re-summation.
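A toy numerical illustration (not from the slides) of the dilute-gas step: for individually small single-loop amplitudes the product over loops agrees with the truncated sum up to O(r²) corrections:

```python
import numpy as np

rng = np.random.default_rng(1)
r = rng.uniform(-0.05, 0.05, size=20)      # toy single-connected-loop amplitudes
print(1.0 + r.sum())                       # truncation of 1 + sum_C r_C
print(np.prod(1.0 + r))                    # dilute-gas resummation; agrees to O(r^2)
```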


Ising model

General Graphical Model

W(\vec\sigma) = Z^{-1}\prod_{a\in X} f_a(\sigma_a), \qquad Z = \sum_{\vec\sigma}\prod_{a\in X} f_a(\sigma_a)

Ising model

W(\vec\sigma) = Z_I^{-1}\exp\Bigl(\beta\sum_{\alpha=(i,j)\in X}\sigma_i\sigma_j + \sum_{i\in X} h_i\sigma_i\Bigr)

Z = \sum_{\vec\sigma}\prod_{\alpha=(i,j)\in X} e^{\beta\sigma_i\sigma_j}\prod_{i\in X} e^{h_i\sigma_i} = \sum_{\vec\sigma}\prod_{a\in X} f_a(\sigma_a), \qquad \{a\} = \{i\}\cup\{\alpha\}

f_i(\sigma_i) = \begin{cases} \exp(h_i\sigma_i), & \sigma_{i\alpha}=\sigma_{i\beta}=\sigma_i \;\;\forall\,\alpha,\beta\ni i \\ 0, & \text{otherwise} \end{cases}

f_\alpha\bigl(\sigma_\alpha=(\sigma_{\alpha i},\sigma_{\alpha j})\bigr) = \exp\bigl(\beta\,\sigma_{\alpha i}\sigma_{\alpha j}\bigr)
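A sanity check of the vertex/edge rewriting above on a 3-spin triangle (Python; illustrative, not from the slides): the replicated factor-graph form reproduces the direct Gibbs sum.

```python
import itertools
import numpy as np

beta, h = 0.4, np.array([0.1, -0.2, 0.3])
pairs = [(0, 1), (1, 2), (0, 2)]

# direct Ising sum
Z_direct = sum(np.exp(beta * sum(s[i] * s[j] for i, j in pairs) + np.dot(h, s))
               for s in itertools.product([-1, 1], repeat=3))

# factor-graph form: one copy sigma_{i,alpha} per (spin, edge); f_i enforces equal copies
keys = [(i, e) for e in pairs for i in e]
Z_factor = 0.0
for copies in itertools.product([-1, 1], repeat=len(keys)):
    sig = dict(zip(keys, copies))
    w = 1.0
    for i in range(3):                                  # f_i: exp(h_i sigma_i) if all copies agree, else 0
        vals = [sig[i, e] for e in pairs if i in e]
        w *= np.exp(h[i] * vals[0]) if len(set(vals)) == 1 else 0.0
    for e in pairs:                                     # f_alpha: exp(beta sigma_{alpha i} sigma_{alpha j})
        w *= np.exp(beta * sig[e[0], e] * sig[e[1], e])
    Z_factor += w
print(Z_direct, Z_factor)                               # the two numbers coincide
```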


Paramagnetic (high-temperature) State at h = 0

The only solution of the BP equations is the trivial one: η = 0

The Loop Series reduces to the High-Temperature Expansion
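A quick check (illustrative, not from the slides) on the 3-spin triangle at h = 0: with η = 0 the loop series coincides with the high-temperature expansion Z = 2^N (cosh β)^{#edges} (1 + Σ_C (tanh β)^{|C|}), and the only generalized loop of a triangle is the triangle itself.

```python
import itertools
import numpy as np

beta, pairs = 0.7, [(0, 1), (1, 2), (0, 2)]
Z_direct = sum(np.exp(beta * sum(s[i] * s[j] for i, j in pairs))
               for s in itertools.product([-1, 1], repeat=3))
Z_loop_series = 2**3 * np.cosh(beta)**3 * (1 + np.tanh(beta)**3)
print(Z_direct, Z_loop_series)   # equal
```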


BP is Exact on a Tree (Derivation Sketch)

Z(h) = \sum_{\vec\sigma}\prod_{\alpha=1}^{M}\delta\Bigl(\prod_{i\in\alpha}\sigma_i,\,1\Bigr)\exp\Bigl(\sum_{i=1}^{N}h_i\sigma_i\Bigr)

h_i is the log-likelihood at a bit (outcome of the channel)

Z^{\pm}_{j\alpha}(h_>) \equiv \sum_{\vec\sigma_>}^{\sigma_j=\pm 1}\prod_{\beta_>}\delta\Bigl(\prod_{i\in\beta}\sigma_i,\,1\Bigr)\exp\Bigl(\sum_{i_>}h_i\sigma_i\Bigr)

(the subscript ">" restricts sums and products to the branch of the tree rooted at bit j and growing away from check α)

Z^{\pm}_{j\alpha} = \exp(\pm h_j)\prod_{\beta\ni j,\,\beta\neq\alpha}\frac{1}{2}\Bigl[\prod_{i\in\beta,\,i\neq j}\bigl(Z^{+}_{i\beta}+Z^{-}_{i\beta}\bigr) \pm \prod_{i\in\beta,\,i\neq j}\bigl(Z^{+}_{i\beta}-Z^{-}_{i\beta}\bigr)\Bigr]

\eta_{j\alpha} \equiv \frac{1}{2}\ln\frac{Z^{+}_{j\alpha}}{Z^{-}_{j\alpha}}, \qquad \eta_{j\alpha} = h_j + \sum_{\beta\ni j,\,\beta\neq\alpha}\tanh^{-1}\Bigl(\prod_{i\in\beta,\,i\neq j}\tanh\eta_{i\beta}\Bigr)


Variational Method in Statistical Mechanics

Gibbs measure: P(x) = \frac{\exp(-E(x))}{Z}, \qquad Z \equiv \sum_x \exp(-E(x))

Exact Variational Principle:

F\{b(x)\} = \sum_x b(x)E(x) + \sum_x b(x)\ln b(x), \qquad \frac{\delta F}{\delta b(x)}\Big|_{b(x)=P(x)} = 0 \quad \text{under} \quad \sum_x b(x) = 1

(the minimum of F equals -\ln Z)

A factorized form of E(x), or other considerations, may suggest an approximate variational ansatz (mean-field, b(x) = \prod_i b_i(x_i), is the famous example)


Pseudo-Codeword (Instanton) Search Algorithm

LP decoding (σ_i = 0, 1; AWGN channel)

Minimize E = \sum_\alpha\sum_{\sigma_\alpha} b_\alpha(\sigma_\alpha)\sum_{i\in\alpha}\sigma_i(1-2x_i)/q_i, under 0 \le b_i(\sigma_i),\, b_\alpha(\sigma_\alpha) \le 1,

\forall\,\alpha:\; \sum_{\sigma_\alpha} b_\alpha(\sigma_\alpha)=1, \quad \&\quad \forall\, i\;\forall\,\alpha\ni i:\; b_i(\sigma_i) = \sum_{\sigma_\alpha\setminus\sigma_i} b_\alpha(\sigma_\alpha)

[Figure: the "0"-codeword, the error surface, and a dangerous pseudo-codeword]

Weighted Median:

x^{inst} = \frac{\sigma}{2}\,\frac{\sum_i\sigma_i}{\sum_i\sigma_i^2}, \qquad d = \frac{\bigl(\sum_i\sigma_i\bigr)^2}{\sum_i\sigma_i^2}

\mathrm{FER} \sim \exp(-d\cdot s^2/2) \quad (Wiberg '96; Forney et al. '01; Vontobel, Koetter '03, '05)

Pseudo-Codeword-Search Algorithm (Chertkov, Stepanov '06)

Start: Initiate x^{(0)}.

Step 1: x^{(k)} is decoded to σ^{(k)}.

Step 2: Find y^{(k)}, the weighted median between σ^{(k)} and "0".

Step 3: If y^{(k)} = y^{(k-1)}, set k_* = k and stop. Otherwise go to Step 1 with x^{(k+1)} = y^{(k)}.

(155, 64, 20), AWGN test: • Fast Convergence

[Figure: the eight lowest-d pseudo-codewords found (2x_i vs bit label i = 0, ..., 154), with d = 16.4037, 16.4044, 16.4058, 16.4060, 16.4095, 16.4113, 16.4119, 16.4145]

∼ 200 pseudo-codewords found within 16.4037 < d < 20
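The effective distance quoted above is straightforward to evaluate for any pseudo-codeword (a transcription, not the authors' code); for an ordinary codeword of Hamming weight 20 it reduces to d = 20, the upper end of the 16.4037 < d < 20 window:

```python
import numpy as np

def effective_distance(sigma):
    """d = (sum_i sigma_i)^2 / sum_i sigma_i^2, so that FER ~ exp(-d s^2 / 2)."""
    sigma = np.asarray(sigma, dtype=float)
    return sigma.sum() ** 2 / np.square(sigma).sum()

print(effective_distance(np.ones(20)))   # a weight-20 codeword gives d = 20.0
```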


Loop Series: Derivation Sketch

Replication:

Z = \sum_{\vec\sigma}\prod_a f_a(\sigma_a) = \sum_{\vec\sigma'}\prod_a f_a(\sigma_a)\prod_{(bc)}\frac{1+\sigma_{bc}\sigma_{cb}}{2}

(in the replicated, primed sum the two copies \sigma_{bc}, \sigma_{cb} of each edge variable are independent, i.e. \sigma_{bc} \neq \sigma_{cb} is allowed)

Local Gauge Freedom:

1+\pi\sigma = \frac{\exp(\sigma\eta+\pi\chi)}{\cosh(\eta+\chi)}\Bigl[1+(\tanh(\eta+\chi)-\sigma)(\tanh(\eta+\chi)-\pi)\cosh^2(\eta+\chi)\Bigr]

Z = \Bigl(\prod_{bc}2\cosh(\eta_{bc}+\eta_{cb})\Bigr)^{-1}\sum_{\vec\sigma}\prod_a P_a\prod_{bc}V_{bc}, \qquad P_a(\sigma_a) = f_a(\sigma_a)\prod_{b\in a}\exp(\eta_{ab}\sigma_{ab})

V_{bc}(\sigma_{bc},\sigma_{cb}) = 1+\bigl(\tanh(\eta_{bc}+\eta_{cb})-\sigma_{bc}\bigr)\bigl(\tanh(\eta_{bc}+\eta_{cb})-\sigma_{cb}\bigr)\cosh^2(\eta_{bc}+\eta_{cb})

Fixing the Gauge (η-fields on the graph)

BP equations: \sum_{\sigma_a}\bigl(\tanh(\eta^{(bp)}_{ab}+\eta^{(bp)}_{ba})-\sigma_{ab}\bigr)P_a(\sigma_a) = 0 \;\Rightarrow\; \eta^{bp}_{j\alpha} = h_j+\sum_{\beta\ni j,\,\beta\neq\alpha}\tanh^{-1}\Bigl(\prod_{i\in\beta,\,i\neq j}\tanh\eta^{bp}_{i\beta}\Bigr)

Geometrical Principle (no loose ends): expanding \prod_{(bc)}V_{bc} = 1 + \sum (\text{terms marked on "colored" edge subsets}); after the σ-summation only subsets without loose ends, i.e. generalized loops, survive.

Variational Principle: \prod_{(bc)}V_{bc}\to 1,\; Z\to Z_0, \qquad \frac{\delta Z_0}{\delta\eta_{ab}}\Big|_{\eta^{(bp)}} = 0, \qquad Z_0 = \Bigl(\prod_{bc}2\cosh(\eta_{bc}+\eta_{cb})\Bigr)^{-1}\sum_{\vec\sigma}\prod_a P_a(\sigma_a)
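The local gauge identity above can be verified numerically in a few lines (illustrative check, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(5):
    eta, chi = rng.normal(size=2)
    t, c = np.tanh(eta + chi), np.cosh(eta + chi)
    for sigma in (-1, 1):
        for pi in (-1, 1):
            lhs = 1 + pi * sigma
            rhs = np.exp(sigma * eta + pi * chi) / c * (1 + (t - sigma) * (t - pi) * c**2)
            assert np.isclose(lhs, rhs)
print("gauge identity holds for random eta, chi")
```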


Pseudo-Codewords & Loops

[Figure: a-posteriori log-likelihoods vs bit label (i = 1, ..., 155) for six instantons, each with its critical loop and the corresponding triad amplitudes marked: Instanton #1 (d_eff = 16.4037), #33 (d_eff = 16.7531), #50 (d_eff = 17.0171), #100 (d_eff = 17.7366), #140 (d_eff = 18.8339), #192 (d_eff = 19.9997)]