Tutorial: Advanced Information Theory in CVPR “in a Nutshell”
CVPR, June 13-18, 2010, San Francisco, CA

Shape Matching with I-Divergences
Anand Rangarajan
Groupwise Point-set Pattern Registration

Given N point-sets, denoted {X^p, p ∈ {1, ..., N}}, the task of multiple point-pattern matching or point-set registration is to recover the spatial transformations which yield the best alignment of all the shapes.
Problem Visualization

[Figure]
Group-wise Point-set Registration

Principal Technical Challenges
- Solving for nonrigid deformations between point-sets with unknown correspondence is a difficult problem.
- How do we align all the point-sets in a symmetric manner so that there is no bias toward any particular point-set?
From point-sets to density functions

[Figure]
Group-wise Point-set Registration

From point-sets to density functions
- Point-sets are represented by probability density functions.
- Intuitively, if these point-sets are aligned properly, the corresponding density functions should be similar.

Question: How do we measure the similarity between multiple density functions?
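Before answering that, it helps to fix the representation. Each point-set can be turned into a density with a Gaussian mixture, one equal-weight component per point (the Discussion slide confirms Gaussian mixtures are the model used here). A minimal sketch in NumPy; the kernel width `sigma` is a free smoothing parameter, not something the slides specify:

```python
import numpy as np

def gmm_density(points, sigma):
    """Return p(x): an equal-weight isotropic Gaussian mixture
    with one component centered on each row of `points`.

    points : (K, d) array of point locations.
    sigma  : kernel width (a free smoothing parameter).
    """
    K, d = points.shape
    norm = (2.0 * np.pi * sigma**2) ** (d / 2.0)

    def p(x):
        # Squared distances from x to every mixture center.
        sq = np.sum((points - x) ** 2, axis=1)
        return np.mean(np.exp(-sq / (2.0 * sigma**2))) / norm

    return p

# Example: density of a small 2D point-set, evaluated at one location.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
p = gmm_density(X, sigma=0.2)
print(p(np.array([0.5, 0.3])))
```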
Divergence Measures

Kullback-Leibler divergence:

    D_{KL}(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx

where p(x) and q(x) are probability density functions.

J divergence: given two probability density functions p and q, the symmetric KL divergence is defined as

    J(p, q) = \frac{1}{2} \left( D_{KL}(p \| q) + D_{KL}(q \| p) \right)
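For densities evaluated on a common discrete grid, both measures reduce to sums. A minimal sketch, assuming p and q are strictly positive and sum to one on the grid:

```python
import numpy as np

def kl(p, q):
    """D_KL(p || q) for discrete densities p, q (positive, summing to 1)."""
    return np.sum(p * np.log(p / q))

def j_divergence(p, q):
    """Symmetrized KL: J(p, q) = (D_KL(p||q) + D_KL(q||p)) / 2."""
    return 0.5 * (kl(p, q) + kl(q, p))

# Example on a 3-bin grid: J is symmetric in its arguments; KL is not.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])
print(kl(p, q), kl(q, p), j_divergence(p, q))
```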
Motivating the JS divergence

Modeling two shapes:

[Figure: two point-set shapes, labeled X and Y]

    p(X | \theta^{(1)}) = \prod_{i=1}^{N_1} \frac{1}{K_1} \sum_{a=1}^{K_1} p(X_i | \theta^{(1)}_a), \qquad
    p(Y | \theta^{(2)}) = \prod_{j=1}^{N_2} \frac{1}{K_2} \sum_{b=1}^{K_2} p(Y_j | \theta^{(2)}_b)
Motivating the JS divergence

Modeling the overlay of two shapes with identity of origin:

[Figure: shapes X and Y overlaid]

    p(X \cup Y | \theta^{(1)}, \theta^{(2)}) = p(X | \theta^{(1)}) \, p(Y | \theta^{(2)})
Motivating the JS divergence

Modeling the overlay of two shapes without identity of origin:

[Figure: combined shape Z]

    p(Z | \theta^{(1)}, \theta^{(2)}) = \frac{N_1}{N_1 + N_2} \, p(Z | \theta^{(1)}) + \frac{N_2}{N_1 + N_2} \, p(Z | \theta^{(2)})
Likelihood Ratio

- Which generative model do you prefer: the union of disparate shapes, where identity of origin is preserved, or one combined shape, where identity of origin is suppressed?
- Likelihood ratio:

    \log \Lambda = \log \frac{p(Z | \theta^{(1)}, \theta^{(2)})}{p(X \cup Y | \theta^{(1)}, \theta^{(2)})}
                 = \log \frac{\frac{N_1}{N_1 + N_2} \, p(Z | \theta^{(1)}) + \frac{N_2}{N_1 + N_2} \, p(Z | \theta^{(2)})}{p(X | \theta^{(1)}) \, p(Y | \theta^{(2)})}

- Z is understood to arise from a convex combination of the two mixture models p(Z | \theta^{(1)}) and p(Z | \theta^{(2)}), where the weight of each mixture is proportional to the number of points (N_1 and N_2, respectively) in each set.
- The weak law of large numbers leads to the Jensen-Shannon divergence; see the sketch below.
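To see why, normalize the log-likelihood ratio by the total number of points. Writing \pi_1 = N_1/(N_1 + N_2), \pi_2 = N_2/(N_1 + N_2), p_a = p(\cdot | \theta^{(a)}), and m = \pi_1 p_1 + \pi_2 p_2, a sketch of the limiting argument is:

    \frac{\log \Lambda}{N_1 + N_2}
      = \frac{1}{N_1 + N_2} \sum_{k=1}^{N_1 + N_2} \log m(Z_k)
        - \pi_1 \frac{1}{N_1} \sum_{i=1}^{N_1} \log p_1(X_i)
        - \pi_2 \frac{1}{N_2} \sum_{j=1}^{N_2} \log p_2(Y_j)
      \;\xrightarrow{\text{WLLN}}\;
      \mathbb{E}_m[\log m] - \pi_1 \mathbb{E}_{p_1}[\log p_1] - \pi_2 \mathbb{E}_{p_2}[\log p_2]
      = -H(m) + \pi_1 H(p_1) + \pi_2 H(p_2)
      = -\mathrm{JS}_\pi(p_1, p_2).

So the normalized log-likelihood ratio tends to the negative JS divergence: the more the two shape densities differ, the more the test favors the model that preserves identity of origin.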
JS Divergence for multiple shapes

JS-divergence of shape densities:

    \mathrm{JS}_\pi(P_1, P_2, \ldots, P_n) = H\!\left( \sum_i \pi_i P_i \right) - \sum_i \pi_i H(P_i)    (1)

where \pi = \{\pi_1, \pi_2, \ldots, \pi_n \mid \pi_i > 0, \sum_i \pi_i = 1\} are the weights of the probability densities P_i and H(P_i) is the Shannon entropy.
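On a shared discrete grid, Eq. (1) reduces to a few lines of code. A minimal sketch, assuming each density is given as a histogram over common bins:

```python
import numpy as np

def shannon_entropy(p):
    """H(P) = -sum p log p for a discrete density p (zero bins allowed)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def js_divergence(densities, weights):
    """JS_pi(P_1,...,P_n) = H(sum_i pi_i P_i) - sum_i pi_i H(P_i)."""
    densities = np.asarray(densities)   # shape (n, bins)
    weights = np.asarray(weights)       # positive, summing to 1
    mixture = weights @ densities
    return shannon_entropy(mixture) - np.sum(
        [w * shannon_entropy(p) for w, p in zip(weights, densities)])

# Example: three densities on a 3-bin grid, equal weights.
P = [[0.5, 0.3, 0.2], [0.2, 0.5, 0.3], [0.3, 0.2, 0.5]]
print(js_divergence(P, [1/3, 1/3, 1/3]))  # >= 0; zero iff all P_i are equal
```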
Atlas estimation

Formulation using the JS-divergence:

    \mathrm{JS}_\beta(P_1, P_2, \ldots, P_N) + \lambda \sum_{i=1}^{N} \| L f^i \|^2
      = H\!\left( \sum_i \beta_i P_i \right) - \sum_i \beta_i H(P_i) + \lambda \sum_{i=1}^{N} \| L f^i \|^2.

Here f^i is the deformation function corresponding to point-set X^i, and P_i = p(f^i(X^i)) is the probability density of the deformed point-set.
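A hedged sketch of how this objective might be assembled in code. The slides use nonrigid deformations with the regularizer ||L f||^2; here `deform` and `bending_energy` are hypothetical affine stand-ins for that machinery (my simplification, not the slides' method), and `js_divergence` is the snippet above:

```python
import numpy as np

def deform(X, theta):
    """Hypothetical stand-in: affine deformation f(X) = X @ A + t."""
    A, t = theta
    return X @ A + t

def density_on_grid(points, grid, sigma=0.1):
    """Gaussian-mixture density of `points`, normalized over `grid`."""
    sq = np.sum((grid[:, None, :] - points[None, :, :]) ** 2, axis=2)
    d = np.mean(np.exp(-sq / (2.0 * sigma**2)), axis=1)
    return d / d.sum()                  # discrete density on the grid

def bending_energy(theta):
    """Hypothetical stand-in for ||L f||^2: deviation of A from identity."""
    A, t = theta
    return np.sum((A - np.eye(A.shape[1])) ** 2)

def atlas_objective(point_sets, thetas, betas, lam, grid):
    """JS_beta(P_1,...,P_N) + lambda * sum_i ||L f^i||^2 (sketch only)."""
    densities = [density_on_grid(deform(X, th), grid)
                 for X, th in zip(point_sets, thetas)]
    return js_divergence(densities, betas) + lam * sum(
        bending_energy(th) for th in thetas)
```

Minimizing this over the deformation parameters aligns all the point-sets to the (implicit) mixture atlas without singling out any one of them.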
Multiple shapes: JS divergence

JS divergence in a hypothesis-testing framework:
- Construct a likelihood ratio between i.i.d. samples drawn from a mixture (\sum_a \pi_a P_a) and i.i.d. samples drawn from a heterogeneous collection of densities (P_1, P_2, \ldots, P_N).
- The likelihood ratio is then

    \Lambda = \frac{\prod_{k=1}^{M} \sum_{a=1}^{N} \pi_a P_a(x_k)}{\prod_{a=1}^{N} \prod_{k_a=1}^{N_a} P_a(x^a_{k_a})}.

- The weak law of large numbers gives us the JS-divergence.
Group-wise Registration Results

Experimental results on four 3D hippocampus point-sets.
Shape matching via CDF I-divergences

- Model each point-set by a cumulative distribution function (CDF).
- Quantify the distance among CDFs via an information-theoretic measure, typically the cumulative residual entropy (CRE).
- Minimize the dissimilarity measure over the space of coordinate transformation parameters.
Havrda-Charvát CRE

HC-CRE: Let X be a random vector in \mathbb{R}^d. We define the HC-CRE of X by

    \mathcal{E}_H(X) = -\int_{\mathbb{R}^d_+} (\alpha - 1)^{-1} \left( P^\alpha(|X| > \lambda) - P(|X| > \lambda) \right) d\lambda

where X = (x_1, x_2, \ldots, x_d), \lambda = (\lambda_1, \lambda_2, \ldots, \lambda_d), |X| > \lambda means |x_i| > \lambda_i for every i, and \mathbb{R}^d_+ = \{x \in \mathbb{R}^d : x_i \ge 0, \; i \in \{1, 2, \ldots, d\}\}.
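As a one-dimensional sanity check (my example, not from the slides): for X \sim \mathrm{Exp}(1) we have P(X > \lambda) = e^{-\lambda}, so with \alpha = 2,

    \mathcal{E}_H(X) = -\int_0^\infty \left( e^{-2\lambda} - e^{-\lambda} \right) d\lambda = -\left( \tfrac{1}{2} - 1 \right) = \tfrac{1}{2}.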
CDF-HC Divergence

CDF-HC Divergence: Given N cumulative probability distributions P_k, k ∈ {1, ..., N}, the CDF-HC divergence of the set {P_k} is defined as

    \mathrm{HC}(P_1, P_2, \ldots, P_N) = \mathcal{E}_H\!\left( \sum_k \pi_k P_k \right) - \sum_k \pi_k \mathcal{E}_H(P_k)

where 0 \le \pi_k \le 1, \sum_k \pi_k = 1, and \mathcal{E}_H is the HC-CRE.
CDF-HC Divergence

Let P = \sum_k \pi_k P_k. Then

    \mathrm{HC}(P_1, P_2, \ldots, P_N)
      = -(\alpha - 1)^{-1} \left( \int_{\mathbb{R}^d_+} P^\alpha(X > \lambda) \, d\lambda - \sum_k \pi_k \int_{\mathbb{R}^d_+} P_k^\alpha(X_k > \lambda) \, d\lambda \right)
      = \sum_k \pi_k \int_{\mathbb{R}^d_+} P_k^2(X_k > \lambda) \, d\lambda - \int_{\mathbb{R}^d_+} P^2(X > \lambda) \, d\lambda \qquad (\alpha = 2)
Dirac Mixture Model

    P_k(X_k > \lambda) = \frac{1}{D_k} \sum_{i=1}^{D_k} H(x, x_i)

where H(x, x_i) is the Heaviside step function, equal to 1 if all components of x are greater than the corresponding components of x_i, and 0 otherwise.
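Combining the \alpha = 2 form above with the Dirac-mixture estimator gives a directly computable divergence. A minimal sketch; the rectangular integration grid, the Riemann-sum approximation, and the survival-function convention for P(X > \lambda) are my assumptions, not details fixed by the slides:

```python
import numpy as np

def survival(points, lam_grid):
    """Empirical P(X > lambda) at each grid row: the fraction of points
    whose components all exceed the corresponding lambda components."""
    # points: (D, d); lam_grid: (G, d)  ->  (G,)
    return np.mean(np.all(points[None, :, :] > lam_grid[:, None, :],
                          axis=2), axis=1)

def cdf_hc(point_sets, weights, lam_grid, cell_volume):
    """CDF-HC with alpha = 2:
       sum_k pi_k * int P_k^2 d(lambda) - int P^2 d(lambda),
    integrals approximated by Riemann sums over lam_grid."""
    S = np.array([survival(X, lam_grid) for X in point_sets])  # (N, G)
    w = np.asarray(weights)
    mix = w @ S                        # survival of the weighted mixture
    per_set = np.sum(w * np.sum(S**2, axis=1)) * cell_volume
    return per_set - np.sum(mix**2) * cell_volume

# Example: two 2D point-sets on a coarse grid over [0, 1.2]^2.
rng = np.random.default_rng(0)
X1, X2 = rng.random((50, 2)), rng.random((50, 2)) + 0.1
ticks = np.linspace(0.0, 1.2, 25)
gx, gy = np.meshgrid(ticks, ticks)
grid = np.column_stack([gx.ravel(), gy.ravel()])
dv = (ticks[1] - ticks[0]) ** 2
print(cdf_hc([X1, X2], [0.5, 0.5], grid, dv))  # ~0 iff the CDFs coincide
```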
CDF-JS, PDF-JS & CDF-HC

[Figure: two registration examples, each shown before registration and after CDF-JS, PDF-JS, and CDF-HC registration]
2D Point-set Registration for CC

[Figure: seven corpus callosum point-sets (Point Set 1-7), shown before and after registration]
With outliers

[Figure: point-sets with outliers, shown before registration and after PDF-JS and CDF-HC registration]
With different α values

[Figure: the initial configuration and registration results for α = 1.1, 1.3, 1.5, 1.7, 1.9, 2, 3, 4, 5]
3D Point-set Registration for Duck

[Figure: four duck point-sets (Point Set 1-4), shown before and after registration]
3D Registration of Hippocampi

[Figure: four hippocampus point-sets (Point Set 1-4), shown before and after registration]
Group-Wise Registration Assessment

The Kolmogorov-Smirnov (KS) statistic was computed to measure the difference between the CDFs.

- With ground truth:

    \frac{1}{N} \sum_{k=1}^{N} D(F_g, F_k)

- Without ground truth (computed in the sketch below):

    K = \frac{1}{N^2} \sum_{k,s=1}^{N} D(F_k, F_s)
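For 1-D samples (e.g., a single coordinate of each registered point-set), the pairwise statistic K can be computed with SciPy's two-sample KS test. A minimal sketch, assuming D(F_k, F_s) is the standard two-sample KS distance between empirical CDFs:

```python
import numpy as np
from scipy.stats import ks_2samp

def pairwise_ks(samples):
    """K = (1/N^2) * sum_{k,s} D(F_k, F_s) over a list of 1-D samples."""
    N = len(samples)
    total = 0.0
    for k in range(N):
        for s in range(N):
            # D(F_k, F_k) = 0, so the diagonal contributes nothing.
            total += ks_2samp(samples[k], samples[s]).statistic
    return total / N**2

# Example: three well-aligned 1-D samples should give a small K.
rng = np.random.default_rng(1)
samples = [rng.normal(0.0, 1.0, 200) for _ in range(3)]
print(pairwise_ks(samples))
```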
KS statistic for comparison

Table: KS statistic

                        CDF-JS   PDF-JS   CDF-HC
    Olympic Logo        0.1103   0.1018   0.0324
    Fish with outliers  0.1314   0.1267   0.0722

Table: Average nearest-neighbor distance

                        CDF-JS   PDF-JS   CDF-HC
    Olympic Logo        0.0367   0.0307   0.0019
    Fish with outliers  0.0970   0.0610   0.0446
KS statistic for comparison (contd.)

Table: Non-rigid group-wise registration assessment without ground truth, using the KS statistic

                                  Before Registration   After Registration
    Corpus Callosum               0.3226                0.0635
    Corpus Callosum with outlier  0.3180                0.0742
    Olympic Logo                  0.1559                0.0308
    Fish                          0.1102                0.0544
    Hippocampus                   0.2620                0.0770
    Duck                          0.2287                0.0160
KS statistic for comparison (contd.)

Table: Non-rigid group-wise registration assessment without ground truth, using average nearest-neighbor distance

                                  Before Registration   After Registration
    Corpus Callosum               0.0291                0.0029
    Corpus Callosum with outlier  0.0288                0.0092
    Olympic Logo                  0.0825                0.0022
    Fish                          0.1461                0.0601
    Hippocampus                   13.7679               3.1779
    Duck                          15.4725               0.3280
Discussion

- I-divergences for shape matching avoid the correspondence problem
- Symmetric, unbiased registration and atlas estimation
- Shape densities modeled as Gaussian mixtures; cumulatives directly estimated
- JS (PDF- and CDF-based) and HC divergences used
- Estimated atlas useful in model-based segmentation