Linear Discriminant Analysis
LDA introduces the concept of classes; PCA does not use any notion of class. LDA can be seen as an enhancement of PCA. In face recognition, a class corresponds to a specific person, and the elements of a class are that person's face images. For example, with two classes, class 1 holds the images of the first person and class 2 the images of the second.
Multiple classes and PCA
- Suppose there are C classes in the training data.
- PCA is based on the sample covariance, which characterizes the scatter of the entire data set irrespective of class membership.
- The projection axes chosen by PCA therefore might not provide good discrimination power.

What is the goal of LDA?
- Perform dimensionality reduction while preserving as much of the class-discriminatory information as possible.
- Search for the directions along which the classes are best separated.
- Take into account not only the scatter within classes but also the scatter between classes.
LDA maximizes the between-class scatter while minimizing the within-class scatter.

[Figure: two clusters of points, Class A and Class B, and the projection direction that best separates them.]
Algorithm

Assumptions:
- Square images with width = height = N
- M is the number of images in the database
- P is the number of persons in the database
The database

In this example the training set consists of M = 8 images of P = 4 persons, two images per person. Each N x N image is flattened into a column vector of length N^2:

a = (a_1, a_2, ..., a_{N^2})^T,  b = (b_1, b_2, ..., b_{N^2})^T,  ...,  h = (h_1, h_2, ..., h_{N^2})^T

where (a, b) are the images of person 1, (c, d) of person 2, (e, f) of person 3, and (g, h) of person 4.
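As a concrete sketch of this setup, the following NumPy snippet builds the data matrix from eight flattened stand-in images. The side length N = 4 and the random pixel values are assumptions for illustration only; real face images would be loaded instead.

```python
import numpy as np

# Toy stand-in for the face database: M = 8 random "images" of side N = 4.
rng = np.random.default_rng(0)
N = 4                              # image side length (width = height = N)
M = 8                              # number of images
P = 4                              # number of persons, two images each

images = rng.random((M, N, N))     # 8 square images (random stand-ins)
X = images.reshape(M, N * N)       # each row is one flattened N^2-vector
a, b, c, d, e, f, g, h = X         # the eight vectors named as in the text
```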
Fisherfaces, the algorithm

We compute the average of all faces:

m = (1/M) (a + b + ... + h),  where M = 8
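With the images flattened into rows of a matrix, the overall mean face is one line of NumPy. The toy data below (N^2 = 4 for brevity) is an assumption for illustration.

```python
import numpy as np

# Eight toy flattened faces (rows a..h), with N^2 = 4 for brevity.
rng = np.random.default_rng(1)
faces = rng.random((8, 4))

M = len(faces)                     # M = 8
m = faces.sum(axis=0) / M          # m = (1/M)(a + b + ... + h)
```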
Fisherfaces, the algorithm

Compute the average face of each person:

x = (1/2)(a + b),  y = (1/2)(c + d),
z = (1/2)(e + f),  w = (1/2)(g + h)
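The per-person averages can be sketched the same way; again the faces are random toy vectors standing in for real images.

```python
import numpy as np

# Toy flattened faces a..h (two images per person), N^2 = 4 for brevity.
rng = np.random.default_rng(2)
faces = rng.random((8, 4))
a, b, c, d, e, f, g, h = faces

x = (a + b) / 2    # average face of person 1
y = (c + d) / 2    # person 2
z = (e + f) / 2    # person 3
w = (g + h) / 2    # person 4
```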
Fisherfaces, the algorithm

And subtract them from the training faces:

a_m = a - x,  b_m = b - x,  c_m = c - y,  d_m = d - y,
e_m = e - z,  f_m = f - z,  g_m = g - w,  h_m = h - w
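The centering step, sketched on the same toy data (an assumption for illustration). Note that with two images per person, the two centered images of a person are exact negatives of each other.

```python
import numpy as np

rng = np.random.default_rng(3)
faces = rng.random((8, 4))         # toy flattened faces a..h
a, b, c, d, e, f, g, h = faces
x, y, z, w = (a + b) / 2, (c + d) / 2, (e + f) / 2, (g + h) / 2

# Subtract each person's average face from that person's two images.
a_m, b_m = a - x, b - x
c_m, d_m = c - y, d - y
e_m, f_m = e - z, f - z
g_m, h_m = g - w, h - w
```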
Fisherfaces, the algorithm

We build the scatter matrices S_1, S_2, S_3, S_4:

S_1 = a_m a_m^T + b_m b_m^T,  S_2 = c_m c_m^T + d_m d_m^T,
S_3 = e_m e_m^T + f_m f_m^T,  S_4 = g_m g_m^T + h_m h_m^T

and the within-class scatter matrix S_W:

S_W = S_1 + S_2 + S_3 + S_4
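The scatter matrices are sums of outer products of the centered images; a minimal sketch on the same toy data (an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
faces = rng.random((8, 4))         # toy flattened faces a..h
a, b, c, d, e, f, g, h = faces
x, y, z, w = (a + b) / 2, (c + d) / 2, (e + f) / 2, (g + h) / 2
a_m, b_m, c_m, d_m = a - x, b - x, c - y, d - y
e_m, f_m, g_m, h_m = e - z, f - z, g - w, h - w

# Per-person scatter matrices (outer products of the centered images).
S1 = np.outer(a_m, a_m) + np.outer(b_m, b_m)
S2 = np.outer(c_m, c_m) + np.outer(d_m, d_m)
S3 = np.outer(e_m, e_m) + np.outer(f_m, f_m)
S4 = np.outer(g_m, g_m) + np.outer(h_m, h_m)

S_W = S1 + S2 + S3 + S4            # within-class scatter matrix
```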
Fisherfaces, the algorithm

The between-class scatter matrix (each person contributes 2 images, hence the factor 2):

S_B = 2(x - m)(x - m)^T + 2(y - m)(y - m)^T + 2(z - m)(z - m)^T + 2(w - m)(w - m)^T

We are searching for the matrix W maximizing

J(W) = |W^T S_B W| / |W^T S_W W|
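A compact sketch of S_B and the Fisher criterion J(W) on toy data (the data, shapes, and the helper function `J` are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
faces = rng.random((8, 4))                    # toy flattened faces a..h
m = faces.mean(axis=0)                        # overall mean face
means = faces.reshape(4, 2, -1).mean(axis=1)  # per-person means x, y, z, w

# Each person contributes 2 images, hence the factor 2 in S_B.
S_B = sum(2 * np.outer(mu - m, mu - m) for mu in means)

# Within-class scatter, computed compactly from the centered images.
centered = faces - np.repeat(means, 2, axis=0)
S_W = centered.T @ centered

def J(W):
    """Fisher criterion |W^T S_B W| / |W^T S_W W| for a projection W."""
    return np.linalg.det(W.T @ S_B @ W) / np.linalg.det(W.T @ S_W @ W)
```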
Fisherfaces, the algorithm

The columns of W are the eigenvectors of S_W^{-1} S_B. So we have to:
- compute the inverse of S_W,
- multiply the matrices,
- compute the eigenvectors.
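The eigen-step, sketched on toy data where S_W happens to be invertible (on raw high-dimensional pixels S_W is often singular, which is one motivation for combining LDA with a PCA step; the toy data are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
faces = rng.random((8, 4))                    # toy flattened faces a..h
m = faces.mean(axis=0)
means = faces.reshape(4, 2, -1).mean(axis=1)  # per-person means
centered = faces - np.repeat(means, 2, axis=0)
S_W = centered.T @ centered                   # within-class scatter
S_B = sum(2 * np.outer(mu - m, mu - m) for mu in means)

# Columns of W are the eigenvectors of S_W^{-1} S_B,
# sorted by decreasing eigenvalue.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
order = np.argsort(eigvals.real)[::-1]
W = np.real(eigvecs[:, order])
```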
Recognition

Project the training faces onto the LDA space. To classify a new face:
- project it onto the LDA space,
- run a nearest-neighbor classifier,
- the nearest training face is our answer.
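The recognition step can be sketched as follows. The projection matrix here is a random stand-in for the W computed from S_W^{-1} S_B, and the faces and the helper `classify` are toy assumptions, not part of the original slides.

```python
import numpy as np

rng = np.random.default_rng(7)
faces = rng.random((8, 4))                   # training faces a..h
labels = np.repeat(np.arange(4), 2)          # person id for each image

# Stand-in projection matrix; in the real algorithm these columns are
# the leading eigenvectors of S_W^{-1} S_B.
W = rng.random((4, 3))

train_proj = faces @ W                       # training faces in LDA space

def classify(face):
    """Project a face into LDA space; return the nearest neighbor's label."""
    dists = np.linalg.norm(train_proj - face @ W, axis=1)
    return labels[np.argmin(dists)]
```

A training face classifies as its own person, since its projection is at distance zero from itself.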