
Inverse problems in imaging science: From classical regularization methods to state-of-the-art Bayesian methods

Ali Mohammad-Djafari
Laboratoire des Signaux et Systemes,
UMR 8506 CNRS-SUPELEC-UNIV PARIS SUD 11
SUPELEC, 91192 Gif-sur-Yvette, France

http://lss.supelec.free.fr
Email: djafari@lss.supelec.fr
http://djafari.free.fr


Content

1. Preliminaries: Direct and indirect observation, errors and probability law, 1D signal, 2D and 3D image, ...

2. Inverse problems examples in imaging science

3. Classical methods: Generalized inversion and Regularization

4. Bayesian approach for inverse problems

5. Prior modeling: Gaussian, Generalized Gaussian (GG), Gamma, Beta; Gauss-Markov, GG-Markov; sparsity enforcing priors (Bernoulli-Gaussian, Bernoulli-Gamma, Cauchy, Student-t, Laplace)

6. Full Bayesian approach (Estimation of hyperparameters)

7. Hierarchical prior models

8. Bayesian Computation and Algorithms for Hierarchical models

9. Gauss-Markov-Potts family of priors

10. Applications and case studies


Preliminaries: Direct and indirect observation

Direct observation of a few quantities is possible: length, time, electrical charge, number of particles.

For many others, we can only measure them by transforming them (indirect observation). Example: a thermometer transforms a variation of temperature into a variation of length.

Imaging science is a perfect example of indirect observation, particularly when we want to see the inside of a body from the outside (Computed Tomography).

When measuring (observing) a quantity, errors are always present.

For any quantity (direct or indirect observation) we may define a probability law.


Probability law: Discrete and continuous variables

A quantity can be discrete or continuous

For discrete value quantities we define a probability distribution

P(X = k) = p_k, k = 1, ..., K, with ∑_{k=1}^{K} p_k = 1

For continuous value quantities we define a probability density.

P(a < X ≤ b) = ∫_a^b p(x) dx, with ∫_{−∞}^{+∞} p(x) dx = 1

For both cases, we may define:
- Most probable value (Mode), Median, Quantiles
- Regions of high probabilities, ...
- Expected value (Mean)
- Variance, Covariance
- Higher order moments
- Entropy
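As a tiny illustration of these summaries (my own sketch, with a made-up distribution p_k), the following NumPy lines compute the mode, median, mean, variance and entropy of a discrete probability law:

```python
import numpy as np

# A small discrete distribution p_k, k = 1..K (illustrative values)
k = np.arange(1, 6)
p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # sums to 1

mode     = k[np.argmax(p)]                 # most probable value
mean     = np.sum(k * p)                   # expected value
variance = np.sum((k - mean) ** 2 * p)     # second central moment
entropy  = -np.sum(p * np.log(p))          # Shannon entropy (nats)
# median: smallest k whose cumulative probability reaches 0.5
median   = k[np.searchsorted(np.cumsum(p), 0.5)]

print(mode, median, mean, variance, entropy)
```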


Representation of signals and images

Signal: f(t), f(x), f(ν)
- f(t): variation of temperature at a given position as a function of time t
- f(x): variation of temperature as a function of the position x on a line
- f(ν): variation of temperature as a function of the frequency ν

Image: f(x, y), f(x, t), f(ν, t), f(ν1, ν2)
- f(x, y): distribution of temperature as a function of the position (x, y)
- f(x, t): variation of temperature as a function of x and t
- ...

3D, 3D+t, 3D+ν, ... signals: f(x, y, z), f(x, y, t), f(x, y, z, t)
- f(x, y, z): distribution of temperature as a function of the position (x, y, z)
- f(x, y, z, t): variation of temperature as a function of (x, y, z) and t
- ...


Representation of signals

[Figure: a 1D signal g(t) (amplitude versus time), a 2D signal (an image) and a 3D signal]


Signals and images

A signal f(t) can be represented by {f(t), t = 0, ..., T − 1}

An image f(x, y) can be represented by {f(x, y), (x, y) ∈ R}

Finite domain observations: f = {f(t), t = 0, ..., T − 1}

Image: F = {f(x, y)}, a 2D table, or a 1D table f = {f(x, y), (x, y) ∈ R}

For a vector f we define p(f). Then, we can define:
- Most probable value: f̂ = arg max_f p(f)
- Expected value: m = E{f} = ∫ f p(f) df
- Covariance matrix: Σ = E{(f − m)(f − m)′}
- Entropy: H = E{− ln p(f)} = −∫ p(f) ln p(f) df


2. Inverse problems examples

Example 1: Measuring variation of temperature with a thermometer
- f(t): variation of temperature over time
- g(t): variation of the length of the liquid in the thermometer

Example 2: Seeing the outside of a body: making an image using a camera, a microscope or a telescope
- f(x, y): real scene
- g(x, y): observed image

Example 3: Seeing the inside of a body: Computed Tomography using X rays, US, Microwave, etc.
- f(x, y): a section of a real 3D body f(x, y, z)
- gφ(r): a line of the observed radiograph gφ(r, z)

Example 1: Deconvolution

Example 2: Image restoration

Example 3: Image reconstruction


Measuring variation of temperature with a thermometer

f(t) variation of temperature over time

g(t) variation of length of the liquid in thermometer

Forward model: Convolution

g(t) = ∫ f(t′) h(t − t′) dt′ + ǫ(t)

h(t): impulse response of the measurement system

Inverse problem: Deconvolution

Given the forward model H (impulse response h(t)) and a set of data g(ti), i = 1, ..., M, find f(t)
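To make this forward model concrete, here is a minimal NumPy sketch (not from the original slides; the signal, impulse response and noise level are illustrative assumptions) that generates data g from a chosen f:

```python
import numpy as np

rng = np.random.default_rng(0)

T = 100
f = np.zeros(T)
f[[20, 45, 70]] = [1.0, 0.6, 0.8]            # illustrative "true" signal (a few spikes)

h = np.exp(-0.3 * np.arange(15))              # illustrative impulse response of the instrument
h /= h.sum()

g_clean = np.convolve(f, h)[:T]               # discrete version of g(t) = (h * f)(t)
g = g_clean + 0.01 * rng.standard_normal(T)   # additive measurement errors ǫ(t)
```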


Measuring variation of temperature with a thermometer

Forward model: Convolution

g(t) = ∫ f(t′) h(t − t′) dt′ + ǫ(t)

[Figure: f(t) −→ Thermometer h(t) −→ g(t)]

Inversion: Deconvolution

[Figure: estimated f(t) and data g(t)]


Seeing outside of a body: Making an image with a camera, a microscope or a telescope

f(x, y) real scene

g(x, y) observed image

Forward model: Convolution

g(x, y) = ∫∫ f(x′, y′) h(x − x′, y − y′) dx′ dy′ + ǫ(x, y)

h(x, y): Point Spread Function (PSF) of the imaging system

Inverse problem: Image restoration

Given the forward model H (PSF h(x, y)) and a set of data g(xi, yi), i = 1, ..., M, find f(x, y)


Making an image with an unfocused camera

Forward model: 2D Convolution

g(x, y) = ∫∫ f(x′, y′) h(x − x′, y − y′) dx′ dy′ + ǫ(x, y)

[Block diagram: f(x, y) −→ h(x, y) −→ + ǫ(x, y) −→ g(x, y)]

Inversion: Image Deconvolution or Restoration


Making an image of the interior of a body

Different imaging systems:

[Figure: Passive imaging (detectors surround the object and measure its own emission) and active imaging (an incident wave illuminates the object and the measurement is made either in transmission or in reflection)]

Forward problem: Knowing the object, predict the data
Inverse problem: From the measured data, find the object


Seeing inside of a body: Computed Tomography

f(x, y) a section of a real 3D body f(x, y, z)

gφ(r) a line of the observed radiograph gφ(r, z)

Forward model: Line integrals or Radon Transform

gφ(r) = ∫_{L_{r,φ}} f(x, y) dl + ǫφ(r)
      = ∫∫ f(x, y) δ(r − x cos φ − y sin φ) dx dy + ǫφ(r)

Inverse problem: Image reconstruction

Given the forward model H (Radon Transform) and a set of data gφi(r), i = 1, ..., M, find f(x, y)


Computed Tomography: Radon Transform

Forward: f(x, y) −→ g(r, φ)
Inverse: f(x, y) ←− g(r, φ)


Microwave or ultrasound imaging

Measurements: wave diffracted by the object, g(ri)
Unknown quantity: f(r) = k0²(n²(r) − 1)
Intermediate quantity: φ(r)

g(ri) = ∫∫_D Gm(ri, r′) φ(r′) f(r′) dr′, ri ∈ S

φ(r) = φ0(r) + ∫∫_D Go(r, r′) φ(r′) f(r′) dr′, r ∈ D

Born approximation (φ(r′) ≃ φ0(r′)):

g(ri) = ∫∫_D Gm(ri, r′) φ0(r′) f(r′) dr′, ri ∈ S

Discretization:

g = Gm F φ, φ = φ0 + Go F φ −→ g = H(f) with F = diag(f), H(f) = Gm F (I − Go F)⁻¹ φ0

[Figure: measurement geometry, with the incident plane wave φ0 illuminating the object domain D (x, y, z), the total field (φ, f) inside the object, and the diffracted data g recorded on the measurement plane]


Fourier Synthesis in X ray Tomography

g(r, φ) = ∫∫ f(x, y) δ(r − x cos φ − y sin φ) dx dy

G(Ω, φ) = ∫ g(r, φ) exp[−jΩr] dr

F(ωx, ωy) = ∫∫ f(x, y) exp[−j(ωx x + ωy y)] dx dy

F(ωx, ωy) = G(Ω, φ) for ωx = Ω cos φ and ωy = Ω sin φ

[Figure: projection-slice relation, in which the projection p(r, φ) of f(x, y) at angle φ, after a 1D Fourier transform P(Ω, φ), gives F(ωx, ωy) along the line of angle φ in the (ωx, ωy) plane]


Fourier Synthesis in X ray tomography

G(ωx, ωy) = ∫∫ f(x, y) exp[−j (ωx x + ωy y)] dx dy

[Figure: f(x, y) and the lines of the (u, v) Fourier plane on which G(ωx, ωy) is observed in X ray tomography]

Forward problem: Given f(x, y), compute G(ωx, ωy)
Inverse problem: Given G(ωx, ωy) on those lines, estimate f(x, y)


Fourier Synthesis in Diffraction tomography

[Figure: diffraction tomography geometry, with an incident plane wave, the object f(x, y) and the diffracted wave ψ(r, φ); the Fourier transform f̂(ωx, ωy) is observed along semicircles determined by k0 in the (ωx, ωy) plane]


Fourier Synthesis in Diffraction tomography

G(ωx, ωy) = ∫∫ f(x, y) exp[−j (ωx x + ωy y)] dx dy

[Figure: f(x, y) and the semicircles of the (u, v) Fourier plane on which G(ωx, ωy) is observed in diffraction tomography]

Forward problem: Given f(x, y), compute G(ωx, ωy)
Inverse problem: Given G(ωx, ωy) on those semicircles, estimate f(x, y)


Fourier Synthesis in different imaging systems

G(ωx, ωy) = ∫∫ f(x, y) exp[−j (ωx x + ωy y)] dx dy

[Figure: regions of the (u, v) Fourier plane observed by different modalities: X ray Tomography, Diffraction, Eddy current, SAR & Radar]

Forward problem: Given f(x, y), compute G(ωx, ωy)
Inverse problem: Given G(ωx, ωy) on those algebraic lines, circles or curves, estimate f(x, y)


Inverse Problems: other examples and applications

X ray, Gamma ray Computed Tomography (CT)

Microwave and ultrasound tomography

Positron emission tomography (PET)

Magnetic resonance imaging (MRI)

Photoacoustic imaging

Radio astronomy

Geophysical imaging

Non Destructive Evaluation (NDE) and Testing (NDT) techniques in industry

Hyperspectral imaging

Earth observation methods (Radar, SAR, IR, ...)

Survey and tracking in security systems


3. General formulation of inverse problems and classical methods

General non linear inverse problems:

g(s) = [Hf(r)](s) + ǫ(s), r ∈ R, s ∈ S

Linear models:

g(s) = ∫∫ f(r) h(r, s) dr + ǫ(s)

If h(r, s) = h(r − s) −→ Convolution.

Discrete data:

g(si) = ∫∫ h(si, r) f(r) dr + ǫ(si), i = 1, ..., m

Inversion: Given the forward model H and the data g = {g(si), i = 1, ..., m}, estimate f(r)

Well-posed and ill-posed problems (Hadamard): existence, uniqueness and stability

Need for prior information


Inverse problems: Discretization

g(si) = ∫∫ h(si, r) f(r) dr + ǫ(si), i = 1, ..., M

f(r) is assumed to be well approximated by

f(r) ≃ ∑_{j=1}^{N} fj bj(r)

with bj(r) a basis or any other set of known functions

g(si) = gi ≃ ∑_{j=1}^{N} fj ∫∫ h(si, r) bj(r) dr, i = 1, ..., M

g = Hf + ǫ with Hij = ∫∫ h(si, r) bj(r) dr

H is a matrix of huge dimensions

LS solution: f̂ = arg min_f Q(f) with Q(f) = ∑_i |gi − [Hf]_i|² = ‖g − Hf‖²

does not give a satisfactory result.


Convolution: Discretization

[Block diagram: f(t) −→ h(t) −→ + ǫ(t) −→ g(t)]

g(t) = ∫ f(t′) h(t − t′) dt′ + ǫ(t) = ∫ h(t′) f(t − t′) dt′ + ǫ(t)

The signals f(t), g(t), h(t) are discretized with the same sampling period ∆T = 1.

The impulse response is finite (FIR): h(t) = 0 for t < −q∆T or t > p∆T.

g(m) = ∑_{k=−q}^{p} h(k) f(m − k) + ǫ(m), m = 0, ..., M


Convolution: Discretized matrix-vector form

If the system is causal (q = 0) we obtain

[g(0); g(1); ...; g(M)] = H [f(−p); ...; f(0); f(1); ...; f(M)]

with

H = ⎡ h(p) · · · h(0)   0    · · ·   0  ⎤
    ⎢  0   h(p) · · ·  h(0)  · · ·  ... ⎥
    ⎢ ...        ...          ...    0  ⎥
    ⎣  0   · · ·   0   h(p)  · · · h(0) ⎦

- g is an (M + 1)-dimensional vector
- f has dimension M + p + 1
- h = [h(p), ..., h(0)] has dimension (p + 1)
- H has dimensions (M + 1) × (M + p + 1)
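A small NumPy sketch of this construction (an illustration, not the authors' code; for simplicity it uses a causal, zero-based convention h = [h(0), ..., h(p)] and f(0), ..., f(M) instead of the slide's f(−p) offset):

```python
import numpy as np

def convolution_matrix(h, n_in, n_out):
    """Banded Toeplitz matrix H such that (H @ f)[m] = sum_k h[k] f[m - k]."""
    p = len(h) - 1                       # causal FIR of length p + 1
    H = np.zeros((n_out, n_in))
    for m in range(n_out):
        for k in range(p + 1):
            j = m - k                    # column index of f(m - k)
            if 0 <= j < n_in:
                H[m, j] = h[k]
    return H

h = np.array([0.5, 0.3, 0.2])            # illustrative causal impulse response
f = np.random.default_rng(0).standard_normal(50)

H = convolution_matrix(h, n_in=50, n_out=50)
# Same result as direct discrete convolution (truncated to the first 50 samples)
assert np.allclose(H @ f, np.convolve(f, h)[:50])
```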


Discretization of the Radon Transform in CT

[Figure: discretization geometry, where the object f(x, y) is divided into N pixels f1, ..., fN and a detector reading gi along the ray (r, φ) from source S to detector D involves the crossed pixels fj with weights Hij]

f(x, y) = ∑_j fj bj(x, y)

bj(x, y) = 1 if (x, y) ∈ pixel j, 0 elsewhere

g(r, φ) = ∫_L f(x, y) dl −→ gi = ∑_{j=1}^{N} Hij fj + ǫi −→ g = Hf + ǫ
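As a rough illustration of how such a matrix H can be assembled (my own sketch, not the authors' implementation), the following approximates each line integral by sampling points along the ray and accumulating contributions to the pixels it crosses; a real CT code would use exact ray-pixel intersection lengths:

```python
import numpy as np

def radon_matrix(n, radii, angles, step=0.25):
    """Crude parallel-beam system matrix for an n x n pixel grid on [-1, 1]^2.

    Row (r, phi): the line x cos(phi) + y sin(phi) = r is sampled every
    `step` pixel widths; each sample adds step * pixel_size to the pixel
    it falls in (approximation of the intersection length H_ij).
    """
    px = 2.0 / n                                   # pixel size
    H = np.zeros((len(radii) * len(angles), n * n))
    ts = np.arange(-1.5, 1.5, step * px)           # positions along the ray
    for a, phi in enumerate(angles):
        d = np.array([-np.sin(phi), np.cos(phi)])  # direction of the ray
        for k, r in enumerate(radii):
            p0 = r * np.array([np.cos(phi), np.sin(phi)])
            pts = p0[None, :] + ts[:, None] * d[None, :]
            ij = np.floor((pts + 1.0) / px).astype(int)
            ok = (ij >= 0).all(axis=1) & (ij < n).all(axis=1)
            cols = ij[ok, 1] * n + ij[ok, 0]       # flatten (ix, iy) -> column index
            np.add.at(H[a * len(radii) + k], cols, step * px)
    return H

# Example: 32 x 32 image, 45 detector positions, 30 angles
H = radon_matrix(32, np.linspace(-0.95, 0.95, 45),
                 np.linspace(0, np.pi, 30, endpoint=False))
```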


Inverse problems: Deterministic methods
Data matching

Observation model: gi = hi(f) + ǫi, i = 1, ..., M −→ g = H(f) + ǫ

Mismatch between the data and the output of the model: ∆(g, H(f))

f̂ = arg min_f ∆(g, H(f))

Examples:
- LS: ∆(g, H(f)) = ‖g − H(f)‖² = ∑_i |gi − hi(f)|²
- Lp: ∆(g, H(f)) = ‖g − H(f)‖^p = ∑_i |gi − hi(f)|^p, 1 < p < 2
- KL: ∆(g, H(f)) = ∑_i gi ln( gi / hi(f) )

In general, this does not give satisfactory results for inverse problems.


Inverse problems: Regularization theory

Inverse problems = ill-posed problems −→ need for prior information

Functional space (Tikhonov):

g = H(f) + ǫ −→ J(f) = ‖g − H(f)‖₂² + λ‖Df‖₂²

Finite dimensional space (Phillips & Twomey): g = H(f) + ǫ

• Minimum norm LS (MNLS): J(f) = ‖g − H(f)‖² + λ‖f‖²

• Classical regularization: J(f) = ‖g − H(f)‖² + λ‖Df‖²

• More general regularization:

J(f) = Q(g − H(f)) + λΩ(Df)    or    J(f) = ∆1(g, H(f)) + λ∆2(f, f∞)

Limitations:
• Errors are implicitly considered white and Gaussian
• Limited prior information on the solution
• Lack of tools for the determination of the hyperparameters
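To see the effect of the regularization term, here is a small NumPy illustration (my own sketch, with an arbitrary Gaussian blur, noise level and λ): the plain least-squares solution amplifies the noise, while the Tikhonov solution with a first-difference operator D stays stable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100

# Forward model: circular Gaussian blur H (illustrative) and noisy data g
h = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2); h /= h.sum()
H = np.array([np.roll(np.pad(h, (0, n - len(h))), m - 5) for m in range(n)])
f_true = np.zeros(n); f_true[30:60] = 1.0
g = H @ f_true + 0.01 * rng.standard_normal(n)

# First-difference operator D used in ||D f||^2
D = np.eye(n) - np.eye(n, k=1)

f_ls   = np.linalg.solve(H.T @ H + 1e-12 * np.eye(n), H.T @ g)   # (nearly) plain LS: unstable
lam    = 0.1
f_tikh = np.linalg.solve(H.T @ H + lam * D.T @ D, H.T @ g)       # J(f) = ||g - Hf||^2 + lam ||Df||^2

print(np.linalg.norm(f_ls - f_true), np.linalg.norm(f_tikh - f_true))
```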


Deconvolution example

Forward: f(t) −→ g(t) = h(t) ∗ f(t) + ǫ(t)

Inverse: f(t) ←− g(t)

[Figure: true f(t), observed g(t), and reconstructions by Inverse Filtering, Wiener filtering and Regularization]


Inversion: Probabilistic methods

Taking account of errors and uncertainties −→ Probability theory

Maximum Likelihood (ML)

Minimum Inaccuracy (MI)

Probability Distribution Matching (PDM)

Maximum Entropy (ME) and Information Theory (IT)

Bayesian Inference (Bayes)

Advantages:

Explicit account of the errors and noise

A large class of priors via explicit or implicit modeling

A coherent approach to combine information content of the data and priors

Limitations:

Practical implementation and cost of calculation


4. Bayesian inference for inverse problems

M : g = Hf + ǫ

Observation model M + hypothesis on the noise ǫ −→ p(g|f ; M) = pǫ(g − Hf)

A priori information p(f |M)

Bayes: p(f|g; M) = p(g|f; M) p(f|M) / p(g|M)

Link with regularization :

Maximum A Posteriori (MAP) :

f̂ = arg max_f p(f|g) = arg max_f p(g|f) p(f) = arg min_f { − ln p(g|f) − ln p(f) }

with Q(g, Hf) = − ln p(g|f) and λΩ(f) = − ln p(f)


Case of linear models and Gaussian priors

g = Hf + ǫ

Hypothesis on the noise: ǫ ∼ N(0, σǫ² I)

p(g|f) ∝ exp[ −(1/(2σǫ²)) ‖g − Hf‖² ]

Hypothesis on f: f ∼ N(0, σf² I)

p(f) ∝ exp[ −(1/(2σf²)) ‖f‖² ]

A posteriori:

p(f|g) ∝ exp[ −(1/(2σǫ²)) ( ‖g − Hf‖² + (σǫ²/σf²) ‖f‖² ) ]

MAP: f̂ = arg max_f p(f|g) = arg min_f J(f)

with J(f) = ‖g − Hf‖² + λ‖f‖², λ = σǫ²/σf²

Advantage: characterization of the solution

f|g ∼ N(f̂, P̂) with f̂ = (HᵗH + λI)⁻¹ Hᵗ g, P̂ = σǫ² (HᵗH + λI)⁻¹
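A compact NumPy sketch of these two formulas (illustrative only; the forward matrix, data and variances below are made up):

```python
import numpy as np

def gaussian_posterior(H, g, sigma_eps2, sigma_f2):
    """Posterior mean (= MAP) and covariance for g = Hf + eps,
    eps ~ N(0, sigma_eps2 I), f ~ N(0, sigma_f2 I)."""
    n = H.shape[1]
    lam = sigma_eps2 / sigma_f2
    A = H.T @ H + lam * np.eye(n)
    f_hat = np.linalg.solve(A, H.T @ g)      # (H'H + lam I)^{-1} H' g
    P_hat = sigma_eps2 * np.linalg.inv(A)    # posterior covariance
    return f_hat, P_hat

# Illustrative use with a random forward matrix
rng = np.random.default_rng(2)
H = rng.standard_normal((80, 50))
f_true = rng.standard_normal(50)
g = H @ f_true + 0.1 * rng.standard_normal(80)
f_hat, P_hat = gaussian_posterior(H, g, sigma_eps2=0.01, sigma_f2=1.0)
# The diagonal of P_hat gives posterior variances (error bars) on each component of f
```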


MAP estimation with other priors:

f̂ = arg min_f J(f) with J(f) = ‖g − Hf‖² + λΩ(f)

Separable priors:

- Gaussian: p(fj) ∝ exp[−α|fj|²] −→ Ω(f) = α ∑_j |fj|²

- Gamma: p(fj) ∝ fj^α exp[−βfj] −→ Ω(f) = α ∑_j ln fj + β fj

- Beta: p(fj) ∝ fj^α (1 − fj)^β −→ Ω(f) = α ∑_j ln fj + β ∑_j ln(1 − fj)

- Generalized Gaussian: p(fj) ∝ exp[−α|fj|^p], 1 < p < 2 −→ Ω(f) = α ∑_j |fj|^p

Markovian models:

p(fj|f) ∝ exp[ −α ∑_{i∈Nj} φ(fj, fi) ] −→ Ω(f) = α ∑_j ∑_{i∈Nj} φ(fj, fi)


MAP estimation and Compressed Sensing

g = Hf + ǫ

f = Wz

W a code book matrix, z coefficients

Gaussian prior:

p(z) = N(0, σz² I) ∝ exp[ −(1/(2σz²)) ∑_j |zj|² ]

J(z) = − ln p(z|g) = ‖g − HWz‖² + λ ∑_j |zj|²

Generalized Gaussian prior (sparsity, β = 1):

p(z) ∝ exp[ −λ ∑_j |zj|^β ]

J(z) = − ln p(z|g) = ‖g − HWz‖² + λ ∑_j |zj|^β

ẑ = arg min_z J(z) −→ f̂ = Wẑ
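There is no closed form for the β = 1 (ℓ1) criterion, but a simple iterative soft-thresholding scheme (ISTA) minimizes it. The sketch below illustrates that generic idea, not the authors' algorithm; the dictionary A = HW, the sparsity level and λ are arbitrary choices.

```python
import numpy as np

def ista(A, g, lam, n_iter=200):
    """Minimize ||g - A z||^2 + lam * sum |z_j| by iterative soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2          # spectral norm squared of A
    z = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ z - g)           # gradient of 0.5 * ||g - A z||^2
        z = z - grad / L
        z = np.sign(z) * np.maximum(np.abs(z) - lam / (2 * L), 0.0)   # soft threshold
    return z

# A plays the role of H W in the slide's notation; here a random dictionary for illustration
rng = np.random.default_rng(3)
A = rng.standard_normal((60, 120))
z_true = np.zeros(120); z_true[rng.choice(120, 5, replace=False)] = rng.standard_normal(5)
g = A @ z_true + 0.01 * rng.standard_normal(60)

z_hat = ista(A, g, lam=0.05)
f_hat = z_hat                              # f = W z; W taken as identity in this toy example
```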


Bayesian Estimation: Two simple priors

Example 1: Linear Gaussian case:

p(g|f, θ1) = N(Hf, θ1 I), p(f|θ2) = N(0, θ2 I) −→ p(f|g, θ) = N(f̂, P̂)

with f̂ = (H′H + λI)⁻¹ H′g, P̂ = θ1 (H′H + λI)⁻¹, λ = θ1/θ2

f̂ = arg min_f J(f) with J(f) = ‖g − Hf‖₂² + λ‖f‖₂²

Example 2: Double Exponential prior & MAP:

f̂ = arg min_f J(f) with J(f) = ‖g − Hf‖₂² + λ‖f‖₁


Deconvolution example

Forward: f(t) −→ g(t) = h(t) ∗ f(t) + ǫ(t)

Inverse: f(t) ←− g(t)

[Figure: sparse spike train f(t), observed data g(t), and the estimates f̂(t) obtained with Quadratic regularization (Gaussian prior) and with L1 regularization (Laplace prior)]


Main advantages of the Bayesian approach

MAP = Regularization

Posterior mean ? Marginal MAP ?

More information in the posterior law than only its mode or its mean

Meaning and tools for estimating hyperparameters

Meaning and tools for model selection

More specific and specialized priors, particularly through the hidden variables

More computational tools:
- Expectation-Maximization for computing the maximum likelihood parameters
- MCMC for posterior exploration
- Variational Bayes for analytical computation of the posterior marginals
- ...


Two main steps in the Bayesian approach

Prior modeling
- Separable: Gaussian, Gamma; sparsity enforcing: Generalized Gaussian, mixture of Gaussians, mixture of Gammas, ...
- Markovian: Gauss-Markov, GGM, ...
- Markovian with hidden variables (contours, region labels)

Choice of the estimator and computational aspects
- MAP, Posterior mean, Marginal MAP
- MAP needs optimization algorithms
- Posterior mean needs integration methods
- Marginal MAP and hyperparameter estimation need integration and optimization
- Approximations: Gaussian approximation (Laplace), numerical exploration (MCMC), Variational Bayes (separable approximation)


5. Prior modeling of signals

Gaussian: p(fj) ∝ exp[−α|fj|²]          Generalized Gaussian: p(fj) ∝ exp[−α|fj|^p], 1 ≤ p ≤ 2

Gamma: p(fj) ∝ fj^α exp[−βfj]           Beta: p(fj) ∝ fj^α (1 − fj)^β


Sparsity enforcing prior models

Sparse signals: Direct sparsity

[Figure: examples of directly sparse signals]

Sparse signals: Sparsity in a transform domain

[Figure: examples of signals that are sparse in a transform domain, with their transform coefficients]


Sparsity enforcing prior models

Simple heavy tailed models:
- Generalized Gaussian, Double Exponential
- Student-t, Cauchy
- Elastic net
- Symmetric Weibull, Symmetric Rayleigh
- Generalized hyperbolic

Hierarchical mixture models:
- Mixture of Gaussians, Bernoulli-Gaussian
- Mixture of Gammas, Bernoulli-Gamma
- Mixture of Dirichlet, Bernoulli-Multinomial


Simple heavy tailed models

• Generalized Gaussian, Double Exponential

p(f|γ, β) = ∏_j GG(fj|γ, β) ∝ exp[ −γ ∑_j |fj|^β ]

β = 1: Double exponential or Laplace. 0 < β ≤ 1 are of great interest for sparsity enforcing.

• Student-t and Cauchy models

p(f|ν) = ∏_j St(fj|ν) ∝ exp[ −((ν + 1)/2) ∑_j log(1 + fj²/ν) ]

The Cauchy model is obtained when ν = 1.


Mixture models

• Mixture of two Gaussians (MoG2) model

p(f|λ, v1, v0) = ∏_j ( λ N(fj|0, v1) + (1 − λ) N(fj|0, v0) )

• Bernoulli-Gaussian (BG) model

p(f|λ, v) = ∏_j p(fj) = ∏_j ( λ N(fj|0, v) + (1 − λ) δ(fj) )

• Mixture of Gammas

p(f|λ, v1, v0) = ∏_j ( λ G(fj|α1, β1) + (1 − λ) G(fj|α2, β2) )

• Bernoulli-Gamma model

p(f|λ, α, β) = ∏_j [ λ G(fj|α, β) + (1 − λ) δ(fj) ]


6. Full Bayesian approach

M : g = Hf + ǫ

Forward & errors model −→ p(g|f, θ1; M)
Prior models −→ p(f|θ2; M)
Hyperparameters θ = (θ1, θ2) −→ p(θ|M)

Bayes: p(f, θ|g; M) = p(g|f, θ; M) p(f|θ; M) p(θ|M) / p(g|M)

Joint MAP: (f̂, θ̂) = arg max_{(f,θ)} p(f, θ|g; M)

Marginalization 1: p(f|g; M) = ∫∫ p(f, θ|g; M) dθ −→ f̂

Marginalization 2: p(θ|g; M) = ∫∫ p(f, θ|g; M) df −→ θ̂

Approximate p(f, θ|g; M) by a separable one, q(f, θ) = q1(f) q2(θ), and then use them separately to find f̂ and θ̂.


Summary of Bayesian estimation 1

Simple Bayesian Model and Estimation

θ2 −→ Prior p(f|θ2), θ1 −→ Likelihood p(g|f, θ1) −→ Posterior p(f|g, θ) −→ f̂

Full Bayesian Model and Hyperparameter Estimation

α, β −→ Hyperparameter model p(θ|α, β): p(θ1), p(θ2)
Prior p(f|θ2) and Likelihood p(g|f, θ1) −→ Joint Posterior p(f, θ|g, α, β) −→ f̂, θ̂


Summary of Bayesian estimation 2

Marginalization 1

Joint Posterior p(f, θ|g) −→ marginalize over θ −→ p(f|g) −→ f̂

Marginalization 2

Joint Posterior p(f, θ|g) −→ marginalize over f −→ p(θ|g) −→ θ̂ −→ p(f|θ̂, g) −→ f̂

Variational Bayesian Approximation

p(f, θ|g) −→ Variational Bayesian Approximation −→ q1(f) −→ f̂ and q2(θ) −→ θ̂


Variational Bayesian Approximation

Full Bayesian: p(f, θ|g) ∝ p(g|f, θ1) p(f|θ2) p(θ)

Approximate p(f, θ|g) by q(f, θ|g) = q1(f|g) q2(θ|g) and then continue the computations.

Criterion: KL(q(f, θ|g) : p(f, θ|g))

KL(q : p) = ∫∫ q ln(q/p) = ∫∫ q1 q2 ln( q1 q2 / p )

Iterative algorithm: q1 −→ q2 −→ q1 −→ q2, · · ·

q1(f) ∝ exp[ ⟨ln p(g, f, θ; M)⟩_{q2(θ)} ]

q2(θ) ∝ exp[ ⟨ln p(g, f, θ; M)⟩_{q1(f)} ]

p(f, θ|g) −→ Variational Bayesian Approximation −→ q1(f) −→ f̂ and q2(θ) −→ θ̂
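As one concrete instance of these alternating updates, here is a sketch (my own illustration under standard conjugate choices, not the authors' code) for g = Hf + ǫ with a fixed Gaussian prior f ∼ N(0, (1/α)I) and an unknown noise precision τ = 1/σǫ² with a Gamma(a0, b0) hyperprior; q1(f) is then Gaussian and q2(τ) a Gamma density.

```python
import numpy as np

def vba_linear_gaussian(H, g, alpha=1.0, a0=1e-3, b0=1e-3, n_iter=50):
    """Alternating VB updates q1(f)=N(mu,Sigma), q2(tau)=Gamma(a,b) for g = Hf + noise."""
    M, N = H.shape
    tau_mean = 1.0                                     # initial <tau>
    a = a0 + 0.5 * M                                   # shape update is fixed
    for _ in range(n_iter):
        # q1(f): Gaussian with precision  <tau> H'H + alpha I
        Sigma = np.linalg.inv(tau_mean * H.T @ H + alpha * np.eye(N))
        mu = tau_mean * Sigma @ H.T @ g
        # q2(tau): Gamma(a, b), b uses the expected residual under q1(f)
        resid = g - H @ mu
        b = b0 + 0.5 * (resid @ resid + np.trace(H @ Sigma @ H.T))
        tau_mean = a / b
    return mu, Sigma, tau_mean

# Illustrative use
rng = np.random.default_rng(4)
H = rng.standard_normal((120, 60))
f_true = rng.standard_normal(60)
g = H @ f_true + 0.1 * rng.standard_normal(120)
f_hat, Sigma, tau_hat = vba_linear_gaussian(H, g)
print(1.0 / tau_hat)   # estimated noise variance (should be of the order of 0.1**2)
```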


7. Hierarchical models and hidden variables

All the mixture models and some of the simple models can be modeled via hidden variables z.

p(f) = ∑_{k=1}^{K} αk pk(f) −→ p(f|z = k) = pk(f), P(z = k) = αk, ∑_k αk = 1

Example 2: Student-t model

St(f|ν) ∝ exp[ −((ν + 1)/2) log(1 + f²/ν) ]

Infinite mixture:

St(f|ν) = ∫_0^∞ N(f|0, 1/z) G(z|α, β) dz, with α = β = ν/2

p(f|z) = ∏_j p(fj|zj) = ∏_j N(fj|0, 1/zj) ∝ exp[ −(1/2) ∑_j zj fj² ]

p(z|α, β) = ∏_j G(zj|α, β) ∝ ∏_j zj^{α−1} exp[−βzj] ∝ exp[ ∑_j (α − 1) ln zj − βzj ]
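A quick numerical check of this scale-mixture representation (illustrative sketch, ν = 3 chosen arbitrarily): draw z from the Gamma, then f|z from the Gaussian, and compare with direct Student-t samples.

```python
import numpy as np

rng = np.random.default_rng(5)
nu, n = 3.0, 200_000

# Hierarchical sampling: z ~ Gamma(nu/2, rate=nu/2), then f|z ~ N(0, 1/z)
z = rng.gamma(shape=nu / 2, scale=2.0 / nu, size=n)   # scale = 1/rate
f_mix = rng.normal(0.0, 1.0 / np.sqrt(z))

# Direct Student-t sampling for comparison
f_t = rng.standard_t(df=nu, size=n)

# The two empirical distributions should match (compare a few quantiles)
q = [0.05, 0.25, 0.5, 0.75, 0.95]
print(np.quantile(f_mix, q))
print(np.quantile(f_t, q))
```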


Summary of Bayesian estimation 3

• Full Bayesian Hierarchical Model with Hyperparameter Estimation

α, β, γ −→ Hyper prior model p(θ|α, β, γ): p(θ1), p(θ2), p(θ3)
Hidden variable p(z|θ3), Prior p(f|z, θ2), Likelihood p(g|f, θ1)
−→ Joint Posterior p(f, z, θ|g) −→ f̂, ẑ, θ̂

• Full Bayesian Hierarchical Model and Variational Approximation

α, β, γ −→ Hyper prior model p(θ|α, β, γ): p(θ1), p(θ2), p(θ3)
Hidden variable p(z|θ3), Prior p(f|z, θ2), Likelihood p(g|f, θ1)
−→ Joint Posterior p(f, z, θ|g) −→ VBA: q1(f) q2(z) q3(θ) −→ f̂, ẑ, θ̂


8. Bayesian Computation and Algorithms for Hierarchical models

Often, the expression of p(f, z, θ|g) is complex.

Its optimization (for Joint MAP) or its marginalization or integration (for Marginal MAP or PM) is not easy.

Two main techniques: MCMC and Variational Bayesian Approximation (VBA)

MCMC: needs the expressions of the conditionals p(f|z, θ, g), p(z|f, θ, g), and p(θ|f, z, g)

VBA: approximate p(f, z, θ|g) by a separable one

q(f, z, θ|g) = q1(f) q2(z) q3(θ)

and do any computations with these separable ones.
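The MCMC route typically takes the form of a Gibbs sampler cycling through those three conditionals. The skeleton below only illustrates that structure; the three sample_* functions are hypothetical placeholders for the problem-specific conditional draws.

```python
import numpy as np

def gibbs_sampler(g, init_f, init_z, init_theta,
                  sample_f, sample_z, sample_theta,
                  n_iter=1000, burn_in=500):
    """Generic Gibbs loop over p(f|z,theta,g), p(z|f,theta,g), p(theta|f,z,g).

    sample_f, sample_z, sample_theta are user-supplied (hypothetical) functions,
    each drawing one sample from the corresponding conditional distribution.
    """
    f, z, theta = init_f, init_z, init_theta
    samples = []
    for it in range(n_iter):
        f = sample_f(z, theta, g)          # draw f ~ p(f | z, theta, g)
        z = sample_z(f, theta, g)          # draw z ~ p(z | f, theta, g)
        theta = sample_theta(f, z, g)      # draw theta ~ p(theta | f, z, g)
        if it >= burn_in:
            samples.append((f.copy(), z.copy(), theta))
    # Posterior-mean estimate of f from the retained samples
    f_pm = np.mean([s[0] for s in samples], axis=0)
    return f_pm, samples
```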


Which images am I looking for?



Which image am I looking for?

[Figure: four image prior models: Gauss-Markov, Generalized Gauss-Markov, Piecewise Gaussian, Mixture of Gauss-Markov]


9. Gauss-Markov-Potts prior models for images

f(r), z(r), c(r) = 1 − δ(z(r) − z(r′))

p(f(r)|z(r) = k, mk, vk) = N(mk, vk)

p(f(r)) = ∑_k P(z(r) = k) N(mk, vk)    (Mixture of Gaussians)

Separable iid hidden variables: p(z) = ∏_r p(z(r))

Markovian hidden variables: p(z) Potts-Markov:

p(z(r)|z(r′), r′ ∈ V(r)) ∝ exp[ γ ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]

p(z) ∝ exp[ γ ∑_{r∈R} ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]


Four different cases

To each pixel of the image are associated two variables, f(r) and z(r)

- f|z Gaussian iid, z iid: Mixture of Gaussians
- f|z Gauss-Markov, z iid: Mixture of Gauss-Markov
- f|z Gaussian iid, z Potts-Markov: Mixture of Independent Gaussians (MIG with hidden Potts)
- f|z Markov, z Potts-Markov: Mixture of Gauss-Markov (MGM with hidden Potts)


Application of CT in NDT

Reconstruction from only 2 projections

g1(x) = ∫ f(x, y) dy, g2(y) = ∫ f(x, y) dx

Given the marginals g1(x) and g2(y), find the joint distribution f(x, y).

Infinite number of solutions: f(x, y) = g1(x) g2(y) Ω(x, y), where Ω(x, y) is a copula:

∫ Ω(x, y) dx = 1 and ∫ Ω(x, y) dy = 1
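A tiny numerical illustration of this non-uniqueness (my own sketch): build Ω(x, y) from any copula density evaluated at the marginal CDFs (here the Farlie-Gumbel-Morgenstern family); the product g1(x) g2(y) Ω(x, y) then has the same marginals as the independent solution g1(x) g2(y).

```python
import numpy as np

# Two marginal densities on a grid (illustrative choices)
x = np.linspace(-3, 3, 400)
dx = x[1] - x[0]
g1 = np.exp(-0.5 * x**2); g1 /= g1.sum() * dx          # ~ Gaussian
g2 = np.exp(-np.abs(x));  g2 /= g2.sum() * dx          # ~ Laplace

# A copula density (Farlie-Gumbel-Morgenstern family), evaluated at the CDFs
G1 = np.cumsum(g1) * dx
G2 = np.cumsum(g2) * dx
theta = 0.8
Omega = 1.0 + theta * np.outer(1 - 2 * G1, 1 - 2 * G2)  # Omega(x, y) = c(G1(x), G2(y))

f_indep = np.outer(g1, g2)            # the trivial solution g1(x) g2(y)
f_cop   = f_indep * Omega             # another joint density with the same marginals

# Both reproduce the marginals g1 and g2 (up to discretization error)
print(np.max(np.abs(f_cop.sum(axis=1) * dx - g1)))
print(np.max(np.abs(f_cop.sum(axis=0) * dx - g2)))
```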


Application in CT

The model components:
- g|f: g = Hf + ǫ, g|f ∼ N(Hf, σǫ² I) (Gaussian)
- f|z: iid Gaussian or Gauss-Markov
- z: iid or Potts
- c: binary, c(r) ∈ {0, 1}, c(r) = 1 − δ(z(r) − z(r′))


Proposed algorithm

p(f ,z,θ|g) ∝ p(g|f ,z,θ) p(f |z,θ) p(θ)

General scheme:

f̂ ∼ p(f|ẑ, θ̂, g) −→ ẑ ∼ p(z|f̂, θ̂, g) −→ θ̂ ∼ p(θ|f̂, ẑ, g)

Iterative algorithm:

- Estimate f using p(f|ẑ, θ̂, g) ∝ p(g|f, θ̂) p(f|ẑ, θ̂): needs optimization of a quadratic criterion.
- Estimate z using p(z|f̂, θ̂, g) ∝ p(g|f̂, z, θ̂) p(z): needs sampling of a Potts Markov field.
- Estimate θ using p(θ|f̂, ẑ, g) ∝ p(g|f̂, σǫ² I) p(f̂|ẑ, (mk, vk)) p(θ): conjugate priors −→ analytical expressions.


Results

[Figure: reconstruction results. Top row: Original, Backprojection, Filtered BP, LS. Bottom row: Gauss-Markov with positivity, GM with Line process, GM with Label process, together with the estimated hidden fields z and c.]


Application in Microwave imaging

g(ω) = ∫ f(r) exp[−j(ω · r)] dr + ǫ(ω)

g(u, v) = ∫∫ f(x, y) exp[−j(ux + vy)] dx dy + ǫ(u, v)

g = Hf + ǫ

[Figure: f(x, y), the data g(u, v), the reconstruction f̂ by inverse FT, and the reconstruction f̂ by the proposed method]


Color (Multi-spectral) image deconvolution

[Block diagram: fi(x, y) −→ h(x, y) −→ + ǫi(x, y) −→ gi(x, y)]

Observation model: gi = Hfi + ǫi, i = 1, 2, 3

Inversion: estimate the fi from the gi


Image fusion and joint segmentation (with O. Feron)

gi(r) = fi(r) + ǫi(r)

p(fi(r)|z(r) = k) = N(mik, σik²)

p(f|z) = ∏_i p(fi|z)

p(z) ∝ exp[ γ ∑_{r∈R} ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]

[Figure: observations g1, g2 −→ estimated f1, f2 and common segmentation z]


Data fusion in medical imaging (with O. Feron)

gi(r) = fi(r) + ǫi(r)

p(fi(r)|z(r) = k) = N(mik, σik²)

p(f|z) = ∏_i p(fi|z)

p(z) ∝ exp[ γ ∑_{r∈R} ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]

[Figure: observations g1, g2 −→ estimated f1, f2 and common segmentation z]


Super-Resolution (with F. Humblot)

gi(r) = [DMB fi](r) + ǫi(r)

p(fi(r)|z(r) = k) = N(mik, σik²)

p(f|z) = ∏_i p(fi|z)

p(z) ∝ exp[ γ ∑_{r∈R} ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]

[Figure: low resolution images =⇒ high resolution image]

Joint segmentation of hyper-spectral images (with N. Bali & A. Mohammadpour)

gi(r) = fi(r) + ǫi(r)

p(fi(r)|z(r) = k) = N(mik, σik²), k = 1, ..., K

p(f|z) = ∏_i p(fi|z)

p(z) ∝ exp[ γ ∑_{r∈R} ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]

mik follow a Markovian model along the index i


Segmentation of a video sequence of images (with P. Brault)

gi(r) = fi(r) + ǫi(r)

p(fi(r)|zi(r) = k) = N(mik, σik²), k = 1, ..., K

p(f|z) = ∏_i p(fi|zi)

p(z) ∝ exp[ γ ∑_{r∈R} ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]

zi(r) follow a Markovian model along the index i


Source separation: (with H. Snoussi & M. Ichir)

gi(r) = ∑_{j=1}^{N} Aij fj(r) + ǫi(r)

p(fj(r)|zj(r) = k) = N(mjk, σjk²)

p(z) ∝ exp[ γ ∑_{r∈R} ∑_{r′∈V(r)} δ(z(r) − z(r′)) ]

p(Aij) = N(A0ij, σ0ij²)

[Figure: sources f, observations g, estimated sources f̂ and segmentation ẑ]


Conclusions

Bayesian Inference for inverse problems

Different prior modeling for signals and images: separable, Markovian, without and with hidden variables

Sparsity enforcing priors

Gauss-Markov-Potts models for images incorporating hidden regions and contours

Two main Bayesian computation tools: MCMC and VBA

Application in different CT (X ray, Microwaves, PET, SPECT)

Current Projects and Perspectives :

Efficient implementation in 2D and 3D cases

Evaluation of performances and comparison between MCMC and VBA methods

Application to other linear and non linear inverse problems (PET, SPECT or ultrasound and microwave imaging)


Current Applications and Perspectives

We use these models for inverse problems in different signal and image processing applications such as:

Period estimation in biological time series

Signal deconvolution in Proteomic and molecular imaging

X ray Computed Tomography

Diffraction Optical Tomography

Microwave Imaging, Acoustic imaging and sources localization

Synthetic Aperture Radar (SAR) Imaging


Thanks to:

Graduated PhD students:
1. C. Cai (2013: Multispectral X ray Tomography)
2. N. Chu (2013: Acoustic sources localization)
3. Th. Boulay (2013: Non Cooperative Radar Target Recognition)
4. R. Prenon (2013: Proteomic and Mass Spectrometry)
5. Sh. Zhu (2012: SAR Imaging)
6. D. Fall (2012: Positron Emission Tomography, Non Parametric Bayesian)
7. D. Pougaza (2011: Copula and Tomography)
8. H. Ayasso (2010: Optical Tomography, Variational Bayes)
9. S. Fekih-Salem (2009: 3D X ray Tomography)
10. N. Bali (2007: Hyperspectral imaging)
11. O. Feron (2006: Microwave imaging)
12. F. Humblot (2005: Super-resolution)
13. M. Ichir (2005: Image separation in Wavelet domain)
14. P. Brault (2005: Video segmentation using Wavelet domain)
15. H. Snoussi (2003: Sources separation)
16. Ch. Soussen (2000: Geometrical Tomography)
17. G. Montemont (2000: Detectors, Filtering)
18. H. Carfantan (1998: Microwave imaging)
19. S. Gautier (1996: Gamma ray imaging for NDT)
20. M. Nikolova (1994: Piecewise Gaussian models and GNC)
21. D. Premel (1992: Eddy current imaging)


Thanks to:

Current PhD students:
- L. Gharsali (Microwave imaging for Cancer detection)
- M. Dumitru (Multivariate time series analysis for biological signals)
- S. AlAli (Electrical imaging of CO2 stocking under the earth)

Master students:
- A. Cai (Non-circular X ray Tomography)
- F. Fuc (Multi component signal analysis for biology applications)

Post-Docs:
- J. Lapuyade (2011: Dimensionality Reduction and multivariate analysis)
- S. Su (2006: Color image separation)
- A. Mohammadpour (2004: HyperSpectral image segmentation)


Thanks to my colleagues and collaborators

- B. Duchene & A. Joisel (L2S) (Inverse scattering and Microwave Imaging)
- N. Gac (L2S) (GPU Implementation)
- Th. Rodet (L2S) (Computed Tomography)
- A. Vabre & S. Legoupil (CEA-LIST) (3D X ray Tomography)
- E. Barat (CEA-LIST) (Positron Emission Tomography, Non Parametric Bayesian)
- C. Comtat (SHFJ, CEA) (PET, Spatio-Temporal Brain activity)
- J. Picheral (SSE, Supelec) (Acoustic sources localization)
- D. Blacodon (ONERA) (Acoustic sources separation)
- J. Lagoutte (Thales Air Systems) (Non Cooperative Radar Target Recognition)
- P. Grangeat (LETI, CEA, Grenoble) (Proteomic and Mass Spectrometry)
- F. Levi (CNRS-INSERM, Hopital Paul Brousse) (Biological rhythms and Chronotherapy of Cancer)


References 1

1. A. Mohammad-Djafari, "Bayesian approach with prior models which enforce sparsity in signal and image processing," EURASIP Journal on Advances in Signal Processing, Special issue on Sparse Signal Processing, 2012.
2. A. Mohammad-Djafari (Ed.), Problemes inverses en imagerie et en vision (Vol. 1 et 2), Hermes-Lavoisier, Traite Signal et Image, IC2, 2009.
3. A. Mohammad-Djafari (Ed.), Inverse Problems in Vision and 3D Tomography, ISTE, Wiley and Sons, ISBN 9781848211728, December 2009, Hardback, 480 pp.
4. A. Mohammad-Djafari, "Gauss-Markov-Potts Priors for Images in Computer Tomography Resulting to Joint Optimal Reconstruction and Segmentation," International Journal of Tomography & Statistics 11: W09, 76-92, 2008.
5. A. Mohammad-Djafari, "Super-Resolution: A short review, a new method based on hidden Markov modeling of HR image and future challenges," The Computer Journal, doi:10.1093/comjnl/bxn005, 2008.
6. H. Ayasso and A. Mohammad-Djafari, "Joint NDT Image Restoration and Segmentation using Gauss-Markov-Potts Prior Models and Variational Bayesian Computation," IEEE Trans. on Image Processing, TIP-04815-2009.R2, 2010.
7. H. Ayasso, B. Duchene and A. Mohammad-Djafari, "Bayesian Inversion for Optical Diffraction Tomography," Journal of Modern Optics, 2008.
8. N. Bali and A. Mohammad-Djafari, "Bayesian Approach With Hidden Markov Modeling and Mean Field Approximation for Hyperspectral Data Analysis," IEEE Trans. on Image Processing 17(2): 217-225, Feb. 2008.
9. H. Snoussi and J. Idier, "Bayesian blind separation of generalized hyperbolic processes in noisy and underdeterminate mixtures," IEEE Trans. on Signal Processing, 2006.


References 2

1. O. Feron, B. Duchene and A. Mohammad-Djafari, "Microwave imaging of inhomogeneous objects made of a finite number of dielectric and conductive materials from experimental data," Inverse Problems, 21(6): 95-115, Dec. 2005.
2. M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for blind source separation," IEEE Trans. on Signal Processing, 15(7): 1887-1899, Jul. 2006.
3. F. Humblot and A. Mohammad-Djafari, "Super-Resolution using Hidden Markov Model and Bayesian Detection Estimation Framework," EURASIP Journal on Applied Signal Processing, Special number on Super-Resolution Imaging: Analysis, Algorithms, and Applications, ID 36971, 16 pages, 2006.
4. O. Feron and A. Mohammad-Djafari, "Image fusion and joint segmentation using an MCMC algorithm," Journal of Electronic Imaging, 14(2): paper no. 023014, Apr. 2005.
5. H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging, 13(2): 349-361, April 2004.
6. A. Mohammad-Djafari, J.F. Giovannelli, G. Demoment and J. Idier, "Regularization, maximum entropy and probabilistic methods in mass spectrometry data processing problems," Int. Journal of Mass Spectrometry, 215(1-3): 175-193, April 2002.
7. H. Snoussi and A. Mohammad-Djafari, "Estimation of Structured Gaussian Mixtures: The Inverse EM Algorithm," IEEE Trans. on Signal Processing 55(7): 3185-3191, July 2007.
8. N. Bali and A. Mohammad-Djafari, "A variational Bayesian Algorithm for BSS Problem with Hidden Gauss-Markov Models for the Sources," in: Independent Component Analysis and Signal Separation (ICA 2007), Edited by M.E. Davies, Ch.J. James, S.A. Abdallah, M.D. Plumbley, 137-144, Springer (LNCS 4666), 2007.


References 3

1. N. Bali and A. Mohammad-Djafari, "Hierarchical Markovian Models for Joint Classification, Segmentation and Data Reduction of Hyperspectral Images," ESANN 2006, September 4-8, Belgium, 2006.
2. M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for wavelet-based blind source separation," IEEE Trans. on Image Processing 15(7): 1887-1899, July 2005.
3. S. Moussaoui, C. Carteret, D. Brie and A. Mohammad-Djafari, "Bayesian analysis of spectral mixture data using Markov Chain Monte Carlo methods sampling," Chemometrics and Intelligent Laboratory Systems 81(2): 137-148, 2005.
4. H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging 13(2): 349-361, April 2004.
5. H. Snoussi and A. Mohammad-Djafari, "Bayesian unsupervised learning for source separation with mixture of Gaussians prior," Journal of VLSI Signal Processing Systems 37(2/3): 263-279, June/July 2004.
6. F. Su and A. Mohammad-Djafari, "An Hierarchical Markov Random Field Model for Bayesian Blind Image Separation," International Congress on Image and Signal Processing (CISP 2008), 27-30 May 2008, Sanya, Hainan, China.
7. N. Chu, J. Picheral and A. Mohammad-Djafari, "A robust super-resolution approach with sparsity constraint for near-field wideband acoustic imaging," IEEE International Symposium on Signal Processing and Information Technology, pp. 286-289, Bilbao, Spain, Dec. 14-17, 2011.


Current PhD’s and projects

PhD’s:

1. Microwave imaging: PhD Leila Gharsalli (co-supervising B. Duchene)

2. Multivariate and multicomponents biological data processing: PhD Mircea Dumitru (co-supervising F. Levi), ERASYSBIO

3. ANR: HONTOMIN, PhD Safa AlAli (CO2 stock supervising using electrical imaging) (B. Duchene & G. Perruson)

4. New methods for reducing dose in Computed Tomography, PhD Li Wang (N. Gac)

5. Information fusion for radar target recognition, starting PhD, May Abou Chahine, Thales Systemes Aeroports

Post-docs

1. ANR: SURMITO (Optical imaging), S. Mehrab, (B. Duchene)

2. 3D Tomography (SAFRAN), Th. Boulay, (N. Gac)

