
Page 1: Bayesian Inference

Bayesian Inference

Chris Mathys

Wellcome Trust Centre for Neuroimaging

UCL

SPM Course

London, May 12, 2014

Thanks to Jean Daunizeau and Jérémie Mattout for previous versions of this talk

Page 2: Bayesian Inference

A spectacular piece of information


Page 3: Bayesian Inference

A spectacular piece of information

Messerli, F. H. (2012). Chocolate Consumption, Cognitive Function, and Nobel Laureates. New England Journal of Medicine, 367(16), 1562–1564.

Page 4: Bayesian Inference

So will I win the Nobel prize if I eat lots of chocolate?

This is a question about uncertain quantities. Like almost all scientific questions, it cannot be answered by deductive logic. Nonetheless, quantitative answers can be given – but they can only be given in terms of probabilities.

Our question here can be rephrased in terms of a conditional probability:

p(Nobel prize | lots of chocolate)

To answer it, we have to learn to calculate such quantities. The tool for this is Bayesian inference.

Page 5: Bayesian Inference

«Bayesian» = logical, and logical = probabilistic

«The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man's mind.»

— James Clerk Maxwell, 1850

Page 6: Bayesian Inference

«Bayesian» = logical, and logical = probabilistic

But in what sense is probabilistic reasoning (i.e., reasoning about uncertain quantities according to the rules of probability theory) «logical»?

R. T. Cox showed in 1946 that the rules of probability theory can be derived from three basic desiderata:

1. Representation of degrees of plausibility by real numbers
2. Qualitative correspondence with common sense (in a well-defined sense)
3. Consistency

Page 7: Bayesian Inference

The rules of probability

By mathematical proof (i.e., by deductive reasoning), the three desiderata set out by Cox imply the rules of probability (i.e., the rules of inductive reasoning). This means that anyone who accepts the desiderata must accept the following rules:

1. Normalization: ∑_a p(a) = 1
2. Marginalization (also called the sum rule): p(b) = ∑_a p(a, b)
3. Conditioning (also called the product rule): p(a, b) = p(a|b) p(b) = p(b|a) p(a)

«Probability theory is nothing but common sense reduced to calculation.»

— Pierre-Simon Laplace, 1819
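To make the three rules concrete, here is a minimal sketch (not part of the original slides) that checks them numerically on a small, made-up joint distribution over two binary variables a and b:

```python
# A minimal sketch: checking the three rules on a made-up joint distribution p(a, b).
import numpy as np

p_ab = np.array([[0.3, 0.1],   # rows index a, columns index b (hypothetical numbers)
                 [0.2, 0.4]])

# 1. Normalization: the probabilities of all joint outcomes sum to one
assert np.isclose(p_ab.sum(), 1.0)

# 2. Marginalization (sum rule): p(b) = sum_a p(a, b)
p_b = p_ab.sum(axis=0)

# 3. Conditioning (product rule): p(a | b) = p(a, b) / p(b)
p_a_given_b = p_ab / p_b

# The product rule reassembles the joint: p(a, b) = p(a | b) p(b)
assert np.allclose(p_a_given_b * p_b, p_ab)
print("p(b) =", p_b)
print("p(a | b) =\n", p_a_given_b)
```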

Page 8: Bayesian Inference

Conditional probabilities

The probability of a given b is denoted by p(a|b). In general, this is different from the probability of a alone (the marginal probability of a), as we can see by applying the sum and product rules:

p(a) = ∑_b p(a, b) = ∑_b p(a|b) p(b)

Because of the product rule, we also have the following rule (Bayes' theorem) for going from p(a|b) to p(b|a):

p(b|a) = p(a|b) p(b) / p(a)

Page 9: Bayesian Inference

The chocolate example

In our example, it is immediately clear that p(Nobel | chocolate) is very different from p(chocolate | Nobel). While the first is hopeless to determine directly, the second is much easier to find out: ask Nobel laureates how much chocolate they eat. Once we know that, we can use Bayes' theorem. Under a model m, the posterior is the likelihood times the prior, divided by the evidence:

p(Nobel | chocolate, m) = p(chocolate | Nobel, m) p(Nobel | m) / p(chocolate | m)

Inference on the quantities of interest in fMRI/DCM studies has exactly the same general structure.
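As a purely illustrative sketch, here is the chocolate example in code. The numbers are invented placeholders, not estimates from Messerli (2012) or the slides; the point is only the mechanics of the update:

```python
# Hypothetical numbers, for illustration only -- not real estimates.
p_nobel = 1e-6                 # prior: p(Nobel) -- winning a Nobel prize is rare
p_choc_given_nobel = 0.30      # likelihood: p(lots of chocolate | Nobel laureate)
p_choc_given_not = 0.20        # p(lots of chocolate | not a laureate)

# evidence: p(chocolate), obtained from the sum and product rules
p_choc = p_choc_given_nobel * p_nobel + p_choc_given_not * (1 - p_nobel)

# Bayes' theorem: posterior = likelihood * prior / evidence
p_nobel_given_choc = p_choc_given_nobel * p_nobel / p_choc
print(p_nobel_given_choc)      # ~1.5e-6: still essentially zero
```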

Page 10: Bayesian Inference

Inference in SPM

Forward problem → likelihood: p(y|ϑ, m)
Inverse problem → posterior distribution: p(ϑ|y, m)

Page 11: Bayesian Inference

Inference in SPM

The generative model m consists of:

Likelihood: p(y|ϑ, m)
Prior: p(ϑ|m)

Bayes' theorem:

p(ϑ|y, m) = p(y|ϑ, m) p(ϑ|m) / p(y|m)
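To see what this means numerically, here is a minimal sketch (toy data and a toy one-parameter model, not SPM code): the posterior is computed on a parameter grid as likelihood × prior, normalized by the evidence:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
y = rng.normal(loc=1.5, scale=1.0, size=20)     # toy data with true mean 1.5 (illustrative)

theta = np.linspace(-4, 6, 2001)                # parameter grid
dtheta = theta[1] - theta[0]
prior = norm.pdf(theta, loc=0.0, scale=2.0)     # p(theta | m)
log_lik = norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0)   # log p(y | theta, m)

unnorm = np.exp(log_lik - log_lik.max()) * prior
# evidence p(y|m) up to the constant factored out above; the posterior is unaffected
evidence = unnorm.sum() * dtheta
posterior = unnorm / evidence                   # p(theta | y, m) on the grid

print("posterior mean:", (theta * posterior).sum() * dtheta)
```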

Page 12: Bayesian Inference

A simple example of Bayesian inference (adapted from Jaynes (1976))

Two manufacturers, A and B, deliver the same kind of components, which turn out to have the following lifetimes (in hours):

A: [sample lifetimes shown on the slide]    B: [sample lifetimes shown on the slide]

Assuming prices are comparable, from which manufacturer would you buy?

Page 13: Bayesian Inference

A simple example of Bayesian inference

How do we compare such samples?


Page 14: Bayesian Inference

A simple example of Bayesian inference

What next?

Page 15: Bayesian Inference

A simple example of Bayesian inference

The procedure in brief:

• Determine your question of interest («What is the probability that...?»)

• Specify your model (likelihood and prior)

• Calculate the full posterior using Bayes’ theorem

• [Pass to the uninformative limit in the parameters of your prior]

• Integrate out any nuisance parameters

• Ask your question of interest of the posterior

All you need is the rules of probability theory.

(Ok, sometimes you'll encounter a nasty integral – but that's a technical difficulty, not a conceptual one.)

Page 16: Bayesian Inference

A simple example of Bayesian inference

The question:

• What is the probability that the components from manufacturer B have a longer lifetime than those from manufacturer A?

• More specifically: given how much more expensive they are, how much longer do I require the components from B to last?

• Example of a decision rule: if the components from B live at least 3 hours longer than those from A with a probability of at least 80%, I will choose those from B.

Page 17: Bayesian Inference

A simple example of Bayesian inference

The model (bear with me, this will turn out to be simple):

• Likelihood (Gaussian): the lifetimes from each manufacturer are modelled as independent draws from a Gaussian with unknown mean and unknown precision.

• Prior (Gaussian-gamma): the conjugate prior for this likelihood, i.e. a Gaussian prior on the mean (conditional on the precision) and a gamma prior on the precision.

Page 18: Bayesian Inference

A simple example of Bayesian inference

The posterior (Gaussian-gamma): because the prior is conjugate, the posterior has the same Gaussian-gamma form, with updated parameters.

Parameter updates: the prior parameters are combined with the number of data points, their sample mean, and their sample variance.
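As a concrete sketch of such a conjugate update, assuming the standard Normal-Gamma parameterization (the slide's own symbols and prior values are not reproduced here, and the lifetimes below are made up):

```python
import numpy as np

def normal_gamma_update(y, mu0, kappa0, a0, b0):
    """Conjugate update for a Gaussian likelihood with unknown mean and precision.

    Assumed prior (standard parameterization):
        mu | lambda ~ N(mu0, (kappa0 * lambda)^-1),   lambda ~ Gamma(a0, rate=b0)
    Returns the posterior parameters (mu_n, kappa_n, a_n, b_n).
    """
    y = np.asarray(y, dtype=float)
    n, ybar = y.size, y.mean()
    ss = ((y - ybar) ** 2).sum()                      # within-sample sum of squares
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * ybar) / kappa_n
    a_n = a0 + n / 2
    b_n = b0 + 0.5 * ss + kappa0 * n * (ybar - mu0) ** 2 / (2 * kappa_n)
    return mu_n, kappa_n, a_n, b_n

# Usage with made-up lifetimes (the slide's actual samples are not reproduced here):
lifetimes_A = [42.0, 39.5, 47.1, 40.2, 43.8]
print(normal_gamma_update(lifetimes_A, mu0=40.0, kappa0=1.0, a0=1.0, b0=1.0))
```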

Page 19: Bayesian Inference

A simple example of Bayesian inference

The limit for which the prior becomes uninformative:

• In the uninformative limit of the prior parameters, the updates reduce to functions of the data alone.

• As promised, this is really simple: all you need is the number of data points, their mean, and their variance.

• This means that only the data influence the posterior, and all influence from the parameters of the prior has been eliminated.

• The uninformative limit should only ever be taken after the calculation of the posterior using a proper prior.

Page 20: Bayesian Inference

A simple example of Bayesian inference

Integrating out the nuisance parameter (the precision) gives rise to a t-distribution for the mean.

Page 21: Bayesian Inference

A simple example of Bayesian inference

The joint posterior is simply the product of our two independent posteriors over the mean lifetimes μ_A and μ_B. It will now give us the answer to our question:

p(μ_B − μ_A ≥ 3 | y_A, y_B) > 0.95

Note that the t-test told us that there was «no significant difference», even though there is a >95% probability that the parts from B will last at least 3 hours longer than those from A.
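The probability in question can be computed by sampling from the two posteriors. A minimal sketch using the uninformative-limit result (the marginal posterior of each mean is a shifted, scaled Student-t with n − 1 degrees of freedom); the lifetime samples below are placeholders, not the data from the slide:

```python
import numpy as np

rng = np.random.default_rng(1)

def posterior_mean_samples(y, size=500_000):
    """Draw samples of the mean from its marginal posterior in the uninformative limit:
    mu ~ ybar + (s / sqrt(n)) * t_{n-1}, with s the sample standard deviation."""
    y = np.asarray(y, dtype=float)
    n, ybar, s = y.size, y.mean(), y.std(ddof=1)
    return ybar + (s / np.sqrt(n)) * rng.standard_t(df=n - 1, size=size)

# Placeholder lifetime data (hours); the slide's actual samples are not reproduced here.
y_A = [38.9, 42.1, 44.3, 39.7, 41.5, 43.0, 40.2, 42.8, 41.1]
y_B = [48.2, 52.0, 50.5, 49.8]

mu_A = posterior_mean_samples(y_A)
mu_B = posterior_mean_samples(y_B)
print("p(mu_B - mu_A > 3 | data) ≈", np.mean(mu_B - mu_A > 3))
```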

Page 22: Bayesian Inference

Bayesian inference

The procedure in brief:

• Determine your question of interest («What is the probability that...?»)

• Specify your model (likelihood and prior)

• Calculate the full posterior using Bayes’ theorem

• [Pass to the uninformative limit in the parameters of your prior]

• Integrate out any nuisance parameters

• Ask your question of interest of the posterior

All you need is the rules of probability theory.


Page 23: Bayesian Inference

Frequentist (or: orthodox, classical) versus Bayesian inference: hypothesis testing

Classical:

• define the null, e.g. H0: ϑ = 0
• estimate the parameters and obtain the test statistic t ≡ t(Y), with its distribution p(t|H0) under the null
• apply the decision rule: if p(t > t*|H0) ≤ α, then reject H0

Bayesian:

• invert the model to obtain the posterior pdf p(ϑ|y)
• define the null, e.g. H0: ϑ > ϑ0
• apply the decision rule: if p(H0|y) ≥ α, then accept H0
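A minimal sketch (with made-up data) contrasting the two decision rules for a one-sample problem: the classical one-sided p-value under H0: ϑ = 0 versus the posterior probability of H0: ϑ > ϑ0 under the Student-t posterior used in the example above:

```python
import numpy as np
from scipy import stats

y = np.array([0.8, 1.4, 0.3, 1.1, 0.9, 1.6, 0.5, 1.2])   # made-up data
n, ybar, s = y.size, y.mean(), y.std(ddof=1)

# Classical: test statistic and one-sided p-value under H0: theta = 0
t_stat = ybar / (s / np.sqrt(n))
p_value = stats.t.sf(t_stat, df=n - 1)        # p(t > t* | H0)
print("classical: reject H0 at alpha = 0.05?", p_value <= 0.05)

# Bayesian: posterior of theta is a shifted, scaled t; probability of H0: theta > theta0
theta0 = 0.5
p_H0 = stats.t.sf((theta0 - ybar) / (s / np.sqrt(n)), df=n - 1)   # p(theta > theta0 | y)
print("Bayesian: p(H0 | y) =", round(p_H0, 3))
```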

Page 24: Bayesian Inference

Model comparison: general principles

• Principle of parsimony: «plurality should not be assumed without necessity»
• Automatically enforced by Bayesian model comparison

Model evidence («Occam's razor»):

p(y|m) = ∫ p(y|ϑ, m) p(ϑ|m) dϑ

[Figure: model evidence p(y|m) plotted over the space of all data sets, for models y = f(x) of differing complexity]
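A minimal sketch (toy models, made-up data) of how the evidence integral enforces parsimony: two models of the mean of Gaussian data, one with a tight prior and one with a very broad prior, compared by numerically integrating p(y|ϑ, m) p(ϑ|m) over ϑ:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
y = rng.normal(loc=0.3, scale=1.0, size=15)            # toy data with a small true effect

theta = np.linspace(-30, 30, 20001)
dtheta = theta[1] - theta[0]
log_lik = norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0)

def log_evidence(prior_sd):
    """log p(y|m) = log of the integral of p(y|theta) p(theta|m) over the grid."""
    prior = norm.pdf(theta, loc=0.0, scale=prior_sd)
    c = log_lik.max()                                  # factor out for numerical stability
    return c + np.log(np.sum(np.exp(log_lik - c) * prior) * dtheta)

print("simple model  (prior sd = 1):  ", log_evidence(1.0))
print("complex model (prior sd = 20): ", log_evidence(20.0))
# The broad-prior model spreads its predictions over many data sets and is penalized.
```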

Page 25: Bayesian Inference

Model comparison: negative variational free energy F


For any «approximate posterior» q(ϑ), the negative variational free energy F is defined as

F := ∫ q(ϑ) log [ p(y, ϑ|m) / q(ϑ) ] dϑ

Starting from the log-model evidence log p(y|m) and using the sum rule, multiplying and dividing by q(ϑ), and applying Jensen's inequality:

log p(y|m) = log ∫ p(y, ϑ|m) dϑ = log ∫ q(ϑ) [ p(y, ϑ|m) / q(ϑ) ] dϑ ≥ ∫ q(ϑ) log [ p(y, ϑ|m) / q(ϑ) ] dϑ = F

Equivalently, by the product rule p(y, ϑ|m) = p(ϑ|y, m) p(y|m),

F = log p(y|m) − KL[ q(ϑ), p(ϑ|y, m) ]

where the Kullback-Leibler divergence KL is never negative. Hence F is a lower bound on the log-model evidence.
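A minimal sketch (toy model, not SPM's variational scheme) that checks the bound numerically: a single observation y ~ N(ϑ, σ²) with prior ϑ ~ N(0, τ²), a Gaussian q(ϑ), F estimated by Monte Carlo, and the exact log evidence available in closed form:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
y, sigma, tau = 2.0, 1.0, 1.0                      # toy observation and model (illustrative)

def F(q_mean, q_sd, n_samples=200_000):
    """Monte Carlo estimate of F = E_q[ log p(y, theta) - log q(theta) ]."""
    th = rng.normal(q_mean, q_sd, size=n_samples)
    log_joint = norm.logpdf(y, loc=th, scale=sigma) + norm.logpdf(th, loc=0.0, scale=tau)
    log_q = norm.logpdf(th, loc=q_mean, scale=q_sd)
    return np.mean(log_joint - log_q)

log_evidence = norm.logpdf(y, loc=0.0, scale=np.sqrt(sigma**2 + tau**2))   # exact log p(y|m)

print("exact log p(y|m):          ", log_evidence)
print("F at a poor q:             ", F(q_mean=0.0, q_sd=2.0))
print("F at the true posterior q: ", F(q_mean=y * tau**2 / (sigma**2 + tau**2),
                                        q_sd=np.sqrt(sigma**2 * tau**2 / (sigma**2 + tau**2))))
# F never exceeds the exact log evidence and touches it when q equals the true posterior.
```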

Page 26: Bayesian Inference

Model comparison: F in relation to Bayes factors, AIC, BIC


The Bayes factor for comparing models m1 and m2 is the ratio of their evidences, BF12 = p(y|m1) / p(y|m2). Meaning of the Bayes factor: posterior odds = Bayes factor × prior odds, i.e.

p(m1|y) / p(m2|y) = [ p(y|m1) / p(y|m2) ] · [ p(m1) / p(m2) ]

F decomposes into an accuracy term and a complexity penalty:

F = ∫ q(ϑ) log p(y|ϑ, m) dϑ − KL[ q(ϑ), p(ϑ|m) ]

AIC := Accuracy − p          (p: number of parameters)
BIC := Accuracy − (p/2) · log N          (N: number of data points)
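A small sketch of how log evidences translate into a Bayes factor and posterior model probabilities; the log evidences below are hypothetical and equal prior model probabilities are assumed:

```python
import numpy as np

# Hypothetical log evidences for two models (e.g. from the grid sketch above).
log_ev = {"m1": -23.4, "m2": -26.1}

log_bf_12 = log_ev["m1"] - log_ev["m2"]
bf_12 = np.exp(log_bf_12)                      # Bayes factor BF12 = p(y|m1) / p(y|m2)

prior_odds = 1.0                               # assuming p(m1) = p(m2)
posterior_odds = bf_12 * prior_odds            # posterior odds = Bayes factor * prior odds
p_m1 = posterior_odds / (1.0 + posterior_odds) # posterior probability of m1
print(f"BF12 = {bf_12:.1f}, p(m1|y) = {p_m1:.3f}")
```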

Page 27: Bayesian Inference

A note on informative priors


• Any model consists of two parts: likelihood and prior.

  • The choice of likelihood requires as much justification as the choice of prior because it is just as «subjective».

• The data never speak for themselves. They only acquire meaning when seen through the lens of a model. However, this does not mean that all is subjective because models differ in their validity.

• In this light, the widespread concern that informative priors might bias results (while the form of the likelihood is taken as a matter of course requiring no justification) is misplaced.

• Informative priors are an important tool and their use can be justified by establishing the validity (face, construct, and predictive) of the resulting model as well as by model comparison.

Page 28: Bayesian Inference

Applications of Bayesian inference


Page 29: Bayesian Inference

[Figure: the SPM analysis pipeline: realignment, smoothing, normalisation (template), segmentation and normalisation, general linear model, Gaussian field theory, statistical inference (p < 0.05), posterior probability maps (PPMs), dynamic causal modelling, multivariate decoding]

Page 30: Bayesian Inference

Segmentation (mixture-of-Gaussians model)

[Figure: graphical model for segmentation into grey matter, white matter, and CSF; the i-th voxel value y_i depends on the i-th voxel label c_i, the class means μ_1 … μ_k, and the class variances σ²_1 … σ²_k; the labels depend on the class frequencies]
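A minimal sketch of the idea using simulated intensities and scikit-learn's GaussianMixture; this is not SPM's segmentation routine, which additionally uses spatial tissue priors:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Simulated voxel intensities from three tissue classes (arbitrary units).
intensities = np.concatenate([
    rng.normal(0.2, 0.05, 4000),   # CSF
    rng.normal(0.5, 0.05, 5000),   # grey matter
    rng.normal(0.8, 0.05, 5000),   # white matter
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities)
labels = gmm.predict(intensities)              # most probable class per voxel
print("class means:      ", gmm.means_.ravel())
print("class variances:  ", gmm.covariances_.ravel())
print("class frequencies:", gmm.weights_)
```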

Page 31: Bayesian Inference

fMRI time series analysis

[Figure: hierarchical model of an fMRI time series, with GLM coefficients, a prior variance of the GLM coefficients, a prior variance of the data noise, AR coefficients for correlated noise, and two design matrices X (short-term memory model and long-term memory model). PPMs show the regions best explained by the short-term memory model and the regions best explained by the long-term memory model.]
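A minimal sketch of a Bayesian GLM for a single voxel, simplified relative to the model in the figure: white noise with known variance and a fixed shrinkage prior, whereas SPM also models AR noise and estimates the prior variances. All numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 200
X = np.column_stack([np.ones(T), np.sin(np.linspace(0, 8 * np.pi, T))])  # toy design matrix
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.normal(0.0, 1.0, T)          # simulated voxel time series

sigma2 = 1.0          # data noise variance (assumed known here)
prior_var = 10.0      # prior variance of the GLM coefficients (fixed here)

# Gaussian prior N(0, prior_var * I) + Gaussian likelihood => Gaussian posterior
post_cov = np.linalg.inv(X.T @ X / sigma2 + np.eye(X.shape[1]) / prior_var)
post_mean = post_cov @ X.T @ y / sigma2
print("posterior mean of GLM coefficients:", post_mean)
print("posterior std of GLM coefficients: ", np.sqrt(np.diag(post_cov)))
# A PPM thresholds posterior probabilities, e.g. p(beta_1 > 0 | y) from this Gaussian.
```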

Page 32: Bayesian Inference

Dynamic causal modeling (DCM)

[Figure: four candidate models m1–m4 of how attention modulates connections in a network comprising stim, V1, V5, and PPC; a bar plot of the models' log evidences ln p(y|m); and the estimated effective synaptic strengths for the best model (m4)]

Page 33: Bayesian Inference

Model comparison for group studies

[Figure: differences in log-model evidences, ln p(y|m1) − ln p(y|m2), plotted across subjects]

Fixed effect: assume all subjects correspond to the same model.
Random effect: assume different subjects might correspond to different models.
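A minimal sketch of the fixed-effects case (the random-effects analysis requires the hierarchical approach alluded to above and is not shown): under the fixed-effects assumption the subjects' evidences multiply, so their log-evidence differences add. The numbers are made up:

```python
import numpy as np

# Made-up per-subject log-evidence differences ln p(y|m1) - ln p(y|m2).
log_ev_diff = np.array([1.2, -0.4, 2.1, 0.8, 1.5, -0.2, 0.9, 1.7])

group_log_bf = log_ev_diff.sum()       # fixed effect: evidences multiply, so logs add
print("group log Bayes factor (m1 vs m2):", group_log_bf)
print("group Bayes factor:", np.exp(group_log_bf))
```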

Page 34: Bayesian Inference

Thanks
