Approximate Nearest Subspace Search with applications to pattern recognition

Ronen Basri, Tal Hassner, Lihi Zelnik-Manor
Weizmann Institute, Caltech

Subspaces in Computer Vision

Zelnik-Manor & Irani, PAMI’06

Basri & Jacobs, PAMI’03


Nayar et al., IUW’96

• Illumination
• Faces
• Objects
• Viewpoint, Motion
• Dynamic textures
• …

Query

Nearest Subspace Search

Which is the nearest subspace?

Sequential Search

Sequential search: O(ndk)

Too slow!!

Is there a sublinear solution?

Database: n subspaces of dimension k in d dimensions
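As a baseline, the sequential scan can be sketched as follows (a minimal numpy sketch, assuming each database subspace is stored as a d x k matrix with orthonormal columns; all names are illustrative):

    import numpy as np

    def nearest_subspace_sequential(q, bases):
        """Brute-force O(ndk) scan over all subspaces.
        Each entry of `bases` is a d x k matrix with orthonormal columns."""
        best_idx, best_dist2 = -1, np.inf
        for i, S in enumerate(bases):
            residual = q - S @ (S.T @ q)      # part of q outside the i-th subspace
            dist2 = residual @ residual       # squared distance to that subspace
            if dist2 < best_dist2:
                best_idx, best_dist2 = i, dist2
        return best_idx, best_dist2

    # Example: n = 1000 random 4-dimensional subspaces in 60 dimensions
    rng = np.random.default_rng(0)
    bases = [np.linalg.qr(rng.standard_normal((60, 4)))[0] for _ in range(1000)]
    q = rng.standard_normal(60)
    print(nearest_subspace_sequential(q, bases))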

A Related Problem: Nearest Neighbor Search

Database: n points in d dimensions

Sequential search: O(nd)

There is a sublinear solution!

Approximate NN

[Figure: exact nearest neighbor at radius r, approximate answer returned within radius (1+ε)r]

• Tree search (KD-trees)
• Locality Sensitive Hashing

Fast!!
Query: logarithmic. Preprocessing: O(dn)
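For comparison, a point (1+ε)-ANN query with a KD-tree might look like the following minimal sketch (using scipy's cKDTree, whose eps argument allows approximate answers; the data here is synthetic):

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(0)
    points = rng.standard_normal((100_000, 60))   # database of n points in d dimensions
    tree = cKDTree(points)                        # preprocessing

    q = rng.standard_normal(60)
    dist, idx = tree.query(q, k=1, eps=0.1)       # (1+eps)-approximate nearest neighbor
    print(idx, dist)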

Is it possible to speed-up Nearest Subspace Search?

Existing point-based methods cannot be applied

Tree search, LSH

Our Suggested Approach
• Reduction to points
• Works for both linear and affine spaces

[Plot: run time vs. database size, sequential search vs. our method]

Problem Definition

S = subspace of dimension k (in d dimensions), q = query point

Find mappings u = f(S), v = g(q) such that

‖u − v‖² = μ·dist²(q, S) + ω

then apply standard point ANN to u, v.

Requirements: f and g are independent mappings, and ‖u − v‖² is a linear, hence monotonic, function of the original distance.

Finding a Reduction

dist²(q, S) = ‖SSᵀq − q‖² = −Vec(SSᵀ − I) · Vec(qqᵀ)

where S is a d×k matrix whose orthonormal columns span the subspace. Setting u = Vec(SSᵀ − I) and v = Vec(qqᵀ) gives

‖u − v‖² = ‖u‖² + ‖v‖² + 2·dist²(q, S)

Constants?
‖u‖² = d − k
‖v‖² = ‖q‖⁴  (depends on the query, but not on the subspace, so it is the same additive constant for every database entry)

[Figure: the query q, the subspace S, and its projection SSᵀq]

Feeling lucky? We are lucky!!

Basic Reduction

‖u − v‖² = μ·dist²(q, S) + ω,  with μ = 2 and ω = (d − k) + ‖q‖⁴

Want: minimize ω/μ

u = Vec(SSᵀ − I),  v = Vec(qqᵀ)
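The identity can be checked numerically; below is a minimal sketch (assuming, as above, that S is a d x k matrix with orthonormal columns):

    import numpy as np

    d, k = 60, 4
    rng = np.random.default_rng(0)

    S = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal basis of a random subspace
    q = rng.standard_normal(d)                         # query point

    dist2 = np.sum((q - S @ (S.T @ q)) ** 2)           # squared point-to-subspace distance

    u = (S @ S.T - np.eye(d)).ravel()                  # u = Vec(S S^T - I)
    v = np.outer(q, q).ravel()                         # v = Vec(q q^T)

    # Basic reduction: ||u - v||^2 = 2 dist^2(q, S) + (d - k) + ||q||^4
    lhs = np.sum((u - v) ** 2)
    rhs = 2 * dist2 + (d - k) + np.linalg.norm(q) ** 4
    print(np.allclose(lhs, rhs))                       # True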

Geometry of Basic Reduction

Database: u = Vec(SSᵀ − I) lies on a sphere (‖u‖² = d − k) and on a hyperplane.

Query: v = Vec(qqᵀ) lies on a cone (‖v‖² = ‖q‖⁴).

Improving the Reduction

Final Reduction
The basic mapping u = Vec(SSᵀ − I), v = Vec(qqᵀ) is adjusted by constants α, β, γ, chosen to reduce the ratio ω/μ of the additive to the multiplicative constant.

Can We Do Better?

If ω = 0 then ‖u − v‖² = μ·dist²(q, S), so any query with dist²(q, S) = 0 (i.e., q lying on S) must satisfy u = v. All points of a subspace would then map to the same point, forcing a trivial mapping. The additive constant is therefore inherent.

Final Mapping Geometry

ANS Complexities

Preprocessing: O(nkd²), linear in n

Query: O(d²) + T_ANN(n, d²), logarithmic in n
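Putting the pieces together, a minimal end-to-end sketch of this pipeline combines the basic reduction with a KD-tree for the point-ANN stage (the α, β, γ adjustment of the final reduction is omitted, and all names and parameters are illustrative):

    import numpy as np
    from scipy.spatial import cKDTree

    def subspace_to_point(S):
        """Map a d x k orthonormal basis to u = Vec(S S^T - I)."""
        return (S @ S.T - np.eye(S.shape[0])).ravel()

    def query_to_point(q):
        """Map a query q to v = Vec(q q^T)."""
        return np.outer(q, q).ravel()

    # Preprocessing: map n subspaces into d^2 dimensions, then build the ANN structure.
    rng = np.random.default_rng(0)
    d, k, n = 20, 4, 2000
    bases = [np.linalg.qr(rng.standard_normal((d, k)))[0] for _ in range(n)]
    tree = cKDTree(np.stack([subspace_to_point(S) for S in bases]))

    # Query: O(d^2) to map q, plus one point-ANN query in d^2 dimensions.
    q = rng.standard_normal(d)
    _, idx = tree.query(query_to_point(q), k=1, eps=0.5)
    print("approximate nearest subspace:", idx)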

Dimensionality May be Large

• Embedding in d² dimensions
• Might need to use small ε
• Current solution:
  – Use random projections (Johnson-Lindenstrauss lemma); see the sketch below
  – Repeat several times and select the nearest
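A minimal sketch of the random-projection step (assuming a plain Gaussian projection; dimensions and names are illustrative):

    import numpy as np

    def random_projection(points, target_dim, rng):
        """Project rows of `points` down to `target_dim` dimensions with a
        Gaussian matrix (Johnson-Lindenstrauss style), roughly preserving distances."""
        src_dim = points.shape[1]
        P = rng.standard_normal((src_dim, target_dim)) / np.sqrt(target_dim)
        return points @ P

    # Example: compress the mapped database vectors before the point-ANN stage;
    # in practice this is repeated with several projections and the nearest
    # candidate over all repetitions is kept.
    rng = np.random.default_rng(1)
    U = rng.standard_normal((5000, 3600))    # e.g. n = 5000 vectors in d^2 = 3600 dimensions
    U_low = random_projection(U, 200, rng)   # now 5000 x 200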

Synthetic Data

Varying database size

d=60, k=4

[Plot: run time vs. database size, sequential search vs. our method]

Varying dimension

n=5000, k=4

[Plot: run time vs. dimension, sequential search vs. our method]

Face Recognition (YaleB)

Database: 64 illuminations, k=9 subspaces

Query: new illumination
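One way the per-person k=9 subspaces could be built is by an SVD of that person's illumination images; the sketch below is an assumption-laden illustration, not necessarily the authors' exact pipeline:

    import numpy as np

    def illumination_subspace(images, k=9):
        """Return an orthonormal basis of the best rank-k subspace for one person's
        images (each image flattened to a vector), via SVD."""
        A = np.stack([img.ravel().astype(float) for img in images], axis=1)  # pixels x images
        U, _, _ = np.linalg.svd(A, full_matrices=False)
        return U[:, :k]

    # Recognizing a query image under a new illumination then amounts to a nearest
    # subspace search over the per-person bases, exact or approximate as above.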

Face Recognition Result

[Figure: retrieval results comparing the true nearest subspace (True NS) with the approximate search (Approx NS); wrong matches and wrong persons are marked]

Retiling with Patches

[Figure panels: patch database, query, approximated image, wanted image]

Retiling with Subspaces

[Figure panels: subspace database, query, approximated image, wanted image]

Patches + ANN: ~0.6 sec
Subspaces + ANS: ~1.2 sec


Summary
• Fast, approximate nearest subspace search
• Reduction to point ANN
• Useful applications in computer vision
• Disadvantages:
  – Embedding in d²
  – Additive constant
• Other methods?
• Additional applications?

A lot more to be done…..

THANK YOU