Composing graphical models with neural networks for structured representations and fast inference

Written by Matthew James Johnson, David Duvenaud, Alexander B. Wiltschko, Sandeep R. Datta, and Ryan P. Adams.

Published in NIPS 2016.

Presenter: Juho Lee

Motivation

• GMM: a Gaussian mixture model gives an interpretable, structured latent space (discrete cluster assignments), but its Gaussian observation model is too rigid for complex data such as images.

• VAE: a variational autoencoder gives a flexible neural network observation model, but its latent space is a generic Gaussian with no interpretable structure.

• GMM + SVAE: the structured variational autoencoder combines the two, attaching a neural network likelihood to a structured latent variable model (here a GMM), giving both interpretable structure and a flexible density model.

Conjugate Exponential Families

• An exponential family density has the form

$p(x \mid \theta) = h(x) \exp\{\langle \eta(\theta), t(x) \rangle - A(\eta(\theta))\}$

where $\eta(\theta)$ is the natural parameter, $t(x)$ is the sufficient statistics, and $A$ is the log-partition function, which normalizes the density.
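To make these three ingredients concrete, the sketch below (my own example, not from the slides; all function names are mine) writes the 1-D Gaussian in exponential family form.

```python
# A minimal sketch: the 1-D Gaussian N(mu, sigma^2) in exponential family form.
import numpy as np

def natural_params(mu, sigma2):
    """eta(theta) = (mu / sigma^2, -1 / (2 sigma^2))."""
    return np.array([mu / sigma2, -0.5 / sigma2])

def sufficient_stats(x):
    """t(x) = (x, x^2)."""
    return np.array([x, x ** 2])

def log_partition(eta):
    """A(eta) = -eta1^2 / (4 eta2) - (1/2) log(-2 eta2)."""
    return -eta[0] ** 2 / (4 * eta[1]) - 0.5 * np.log(-2 * eta[1])

def log_density(x, mu, sigma2):
    """log p(x | theta) = log h(x) + <eta, t(x)> - A(eta), with h(x) = (2 pi)^{-1/2}."""
    eta = natural_params(mu, sigma2)
    return -0.5 * np.log(2 * np.pi) + eta @ sufficient_stats(x) - log_partition(eta)

# Sanity check against the familiar Gaussian log-density formula.
x, mu, sigma2 = 0.3, 1.0, 2.0
expected = -0.5 * np.log(2 * np.pi * sigma2) - (x - mu) ** 2 / (2 * sigma2)
assert np.isclose(log_density(x, mu, sigma2), expected)
```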

• A conjugate prior is itself an exponential family over $\theta$ whose sufficient statistics stack the likelihood's natural parameter and log-partition function, $t_\theta(\theta) = (\eta(\theta), -A(\eta(\theta)))$:

$p(\theta \mid \lambda) = \exp\{\langle \lambda, t_\theta(\theta) \rangle - A_\theta(\lambda)\}$

• Compare to the likelihood viewed as a function of $\theta$: it contributes $(t(x), 1)$ in the same inner product, so the posterior stays in the prior's family with $\lambda \to \lambda + (t(x), 1)$.
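As a worked instance (my own example, not from the slides), Beta-Bernoulli conjugacy reduces posterior computation to adding natural parameters, recovering the textbook update:

```python
# Beta-Bernoulli conjugacy expressed purely as natural-parameter addition.
import numpy as np

def beta_to_natural(a, b):
    # Writing Beta(a, b) against t_theta(p) = (log(p/(1-p)), log(1-p))
    # gives natural parameter lambda = (a - 1, a + b - 2).
    return np.array([a - 1.0, a + b - 2.0])

def natural_to_beta(lam):
    a = lam[0] + 1.0
    b = lam[1] - lam[0] + 1.0
    return a, b

def posterior(lam, xs):
    # Each Bernoulli observation contributes (t(x), 1) = (x, 1).
    for x in xs:
        lam = lam + np.array([x, 1.0])
    return lam

xs = [1, 0, 1, 1]
lam_post = posterior(beta_to_natural(2.0, 3.0), xs)
# Matches the textbook update Beta(a + sum(x), b + n - sum(x)) = Beta(5, 4).
assert natural_to_beta(lam_post) == (5.0, 4.0)
```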

Variational Inference with Conjugate Exponential Families

• Assume that there exists a matrix of statistics such that the log-likelihood is linear in the prior's sufficient statistics,

$\log p(x \mid \theta) = \langle t_\theta(\theta), (t(x), 1) \rangle + \log h(x),$

which is exactly the conjugacy condition above.

• The objective function (ELBO):

$\mathcal{L}(q) = \mathbb{E}_q[\log p(x, z, \theta)] - \mathbb{E}_q[\log q] \le \log p(x)$

• By the calculus of variations, the optimal mean-field factor satisfies

$q^*(\theta) \propto \exp\{\mathbb{E}_{q_{-\theta}}[\log p(x, z, \theta)]\},$

and conjugacy keeps each optimal factor in the same exponential family as its prior.
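Filling in the standard one-step argument behind that claim (not verbatim from the slides): holding every other factor fixed, the ELBO is a negative KL divergence up to a constant, so the optimal factor is the exponentiated expected log-joint.

```latex
% Holding all factors except q(\theta) fixed:
\mathcal{L}(q)
  = \mathbb{E}_{q(\theta)}\!\big[\mathbb{E}_{q_{-\theta}}[\log p(x, z, \theta)]\big]
    - \mathbb{E}_{q(\theta)}[\log q(\theta)] + \mathrm{const}
  = -\mathrm{KL}\!\big(q(\theta)\,\big\|\,\tilde p(\theta)\big) + \mathrm{const},
\qquad
\tilde p(\theta) \propto \exp\{\mathbb{E}_{q_{-\theta}}[\log p(x, z, \theta)]\}.
% The KL term vanishes exactly at q^*(\theta) = \tilde p(\theta).
```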

Stochastic Variational Inference

• One can start by assuming the variational factor lies in the same exponential family as the prior,

$q(\theta) = \exp\{\langle \lambda, t_\theta(\theta) \rangle - A_\theta(\lambda)\},$

and optimize the ELBO with respect to the natural parameter $\lambda$.

• The gradient of the ELBO is computed as

$\nabla_\lambda \mathcal{L} = \nabla^2 A_\theta(\lambda)\,(\mathbb{E}_q[\eta_{\text{post}}] - \lambda),$

where $\eta_{\text{post}}$ is the natural parameter of the complete conditional $p(\theta \mid x, z)$.

Hoffman et al., Stochastic variational inference, JMLR 2013

• Fisher information matrix: for an exponential family $q_\lambda$, $F(\lambda) = \mathbb{E}_q[\nabla_\lambda \log q_\lambda\, \nabla_\lambda \log q_\lambda^\top] = \nabla^2 A_\theta(\lambda)$

• Natural gradient: precondition by the inverse Fisher information, $\widetilde\nabla_\lambda \mathcal{L} = F(\lambda)^{-1} \nabla_\lambda \mathcal{L} = \mathbb{E}_q[\eta_{\text{post}}] - \lambda$
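The identity $F(\lambda) = \nabla^2 A_\theta(\lambda)$ used above follows in two lines from the exponential family form (standard, not specific to these slides):

```latex
% Score function of an exponential family:
\nabla_\lambda \log q_\lambda(\theta) = t_\theta(\theta) - \nabla A_\theta(\lambda)
% so the Fisher information is the covariance of the sufficient statistics,
% which equals the Hessian of the log-partition function:
F(\lambda) = \mathrm{Cov}_{q_\lambda}[t_\theta(\theta)] = \nabla^2 A_\theta(\lambda)
```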

• The coordinate ascent update $\lambda \leftarrow \mathbb{E}_q[\eta_{\text{post}}]$ is therefore exactly a unit-step natural gradient ascent step on the ELBO

• Approximate it by stochastic (natural) gradient ascent: estimate the expected sufficient statistics on a minibatch, rescale to the full data set, and take a step of size $\rho_t$, as in the sketch below

Hoffman et al., Stochastic variational inference, JMLR 2013
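A minimal sketch of that update (my own code and variable names, assuming the minibatch's expected sufficient statistics have already been computed from the local factors):

```python
# One SVI step: stochastic natural-gradient ascent on the ELBO for the
# global natural parameter of a conjugate model.
import numpy as np

def svi_step(lam, lam_prior, batch_stats, batch_size, N, rho):
    # Unbiased estimate of the complete conditional's natural parameter:
    # rescale the minibatch sufficient statistics to the full data set.
    lam_hat = lam_prior + (N / batch_size) * batch_stats
    # The natural gradient is (lam_hat - lam), so a step of size rho is a
    # convex combination of the old and estimated parameters.
    return (1.0 - rho) * lam + rho * lam_hat

# Toy usage with made-up numbers; with rho = 1 and a full batch this
# recovers the coordinate ascent update.
lam = np.zeros(3)
lam_prior = np.ones(3)
for t in range(100):
    rho = (t + 1.0) ** -0.7                    # Robbins-Monro step sizes
    batch_stats = np.array([0.5, 0.2, 1.0])    # stands in for sum of E_q[(t, 1)]
    lam = svi_step(lam, lam_prior, batch_stats, batch_size=10, N=1000, rho=rho)
```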

Structured Variational Autoencoder (SVAE)

• Place a conjugate exponential family prior on the latent variables

• The likelihood is an arbitrary (nonlinear) function of the latents, e.g. a neural network decoder

• Train with the reparametrization trick plus (stochastic) natural gradient descent

• Mean-field approximation and the ELBO: with PGM parameters $\theta$, network parameters $\gamma$, local latents $z$, and observations $y$,

$q(\theta, \gamma, z) = q(\theta)\, q(\gamma)\, q(z), \qquad \mathcal{L} = \mathbb{E}_q\left[\log \frac{p(\theta)\, p(\gamma)\, p(z \mid \theta)\, p(y \mid z, \gamma)}{q(\theta)\, q(\gamma)\, q(z)}\right]$

• The optimal $q^*(z)$ is intractable because the neural network likelihood $p(y \mid z, \gamma)$ is not conjugate to $p(z \mid \theta)$; instead, consider the subproblem in which the likelihood is replaced by a conjugate potential $\psi(z; y, \phi) = \langle r(y; \phi), t(z) \rangle$ produced by a recognition network
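Writing out the subproblem's closed-form solution (notation mine, following the construction above): the conjugate potential simply adds to the expected prior natural parameter.

```latex
% Optimal local factor of the conjugate subproblem:
q^*(z) \propto \exp\{\,\mathbb{E}_{q(\theta)}[\log p(z \mid \theta)] + \psi(z; y, \phi)\,\}
% which stays in the prior's exponential family, with natural parameter
\lambda^*(\theta, \phi) = \mathbb{E}_{q(\theta)}[\eta_z(\theta)] + r(y; \phi)
```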

• Now optimize the surrogate bound obtained by plugging the subproblem's optimizer $q(z; \lambda^*(\theta, \phi))$ back into the true ELBO:

$\widehat{\mathcal{L}}(\lambda_\theta, \gamma, \phi) = \mathcal{L}\big(q(\theta)\, q(\gamma)\, q(z; \lambda^*(\theta, \phi))\big)$

• Optimizing $\lambda_\theta$ (the natural parameter of $q(\theta)$): natural gradient descent, exploiting the conjugate structure as in SVI

• Optimizing $\gamma$ and $\phi$: the reparametrization trick, differentiating through Monte Carlo samples $z \sim q(z; \lambda^*(\theta, \phi))$
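The sketch below (my own heavily simplified illustration, not the authors' code) runs one SVAE-style forward pass for the degenerate case where the PGM prior is a single Gaussian and both "networks" are affine maps; a real SVAE replaces the prior with a structured model and differentiates this whole computation.

```python
# A heavily simplified SVAE forward pass: recognition potentials combined
# with a conjugate Gaussian prior, reparametrized sampling, and an ELBO estimate.
import numpy as np

rng = np.random.default_rng(0)
D, K = 5, 2                                # observation dim, latent dim
W_r = rng.normal(size=(2 * K, D)) * 0.1    # recognition "network" (affine)
W_g = rng.normal(size=(D, K)) * 0.1        # generative "network" (affine)

def recognition_potentials(y):
    """Output conjugate Gaussian potentials (natural params) for q(z)."""
    out = W_r @ y
    return out[:K], -np.exp(out[K:])       # (eta1, eta2 < 0)

def conjugate_update(prior_eta1, prior_eta2, r1, r2):
    """Because the potentials are conjugate, combining them is addition."""
    return prior_eta1 + r1, prior_eta2 + r2

def natural_to_mean(eta1, eta2):
    var = -0.5 / eta2
    return eta1 * var, var                 # mean, variance

def elbo_estimate(y):
    # Prior p(z) = N(0, I) written in natural parameters.
    p1, p2 = np.zeros(K), -0.5 * np.ones(K)
    q1, q2 = conjugate_update(p1, p2, *recognition_potentials(y))
    mu, var = natural_to_mean(q1, q2)
    # Reparametrization trick: z = mu + sqrt(var) * eps.
    z = mu + np.sqrt(var) * rng.normal(size=K)
    # Decoder likelihood: p(y | z) = N(W_g z, I).
    log_lik = -0.5 * np.sum((y - W_g @ z) ** 2) - 0.5 * D * np.log(2 * np.pi)
    # Analytic KL(q(z) || p(z)) between diagonal Gaussians.
    kl = 0.5 * np.sum(mu ** 2 + var - np.log(var) - 1.0)
    return log_lik - kl

print(elbo_estimate(rng.normal(size=D)))
```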

• Examples:

– GMM SVAE: the latent PGM is a Gaussian mixture model, so the local subproblem is the usual conjugate GMM coordinate update, with recognition potentials standing in for the likelihood

– Latent switching linear dynamical systems: the latent PGM is a switching LDS, so the local subproblem is solved with standard message passing (forward-backward and Kalman smoothing), again with recognition potentials in place of the likelihood

• Some illustrations (figures in the original slides)
