November 2006

STRUCTURAL AND SYNTACTIC PATTERN RECOGNITION

Universitat Autònoma de Barcelona
Centre de Visió per Computador

Page 1

November 2006

STRUCTURAL AND SYNTACTIC PATTERN RECOGNITION

Universitat Autònoma de Barcelona

Centre de Visió per Computador

Page 2

CV master 2006/07

Contents

• Introduction.
• Structural Pattern Recognition.
  • One-dimensional structures. Strings.
  • Multidimensional structures. Graphs.
• Syntactic Pattern Recognition.
  • Formal Grammars. Definitions.
  • Representing patterns by formal grammars.
  • Types of formal grammars to represent visual patterns.
  • Recognition under noise and distortion.
• Structural and syntactic learning.
• Modelling of structured textures.
• Application examples.
• Bibliography.

Page 3

INTRODUCTION

Page 4

Object recognition

Object recognition is the task of locating and identifying components of a two-dimensional image of a scene as objects of the scene.

Strategies:

Fitting models to the photometry (suitable for complex images):
• Rigid models (Hough)
• Deformable models (snakes)

Feature vectors + statistical model

Fitting models to symbolic structures (suitable for complex representational models):
• Grammars
• Graphs

Combined approaches

Page 5

• Statistical PR: a class of patterns is described by a vector of numerical features. The similarity between two shapes is formulated as a distance (metric) defined on the feature space.

• Structural PR: based on the explicit or implicit representation of the structure of a class, where structure means the relational and hierarchical organization of low-level features or primitives into higher-level structures.

Statistical description: P = (#components, height, width)

Structural description:

Rleg, Lleg: VERTICAL RECTANGLE;

Rarm, Larm: HORIZONTAL RECTANGLE;

body: SQUARE;

head: CIRCLE;

P = head ⇑ (Larm ⇔ (body ⇑ (Lleg ⇔ Rleg)) ⇔ Rarm)

Statistical PR vs. Structural PR

Page 6

The Pattern Recognition framework

[Figure: the two recognition pipelines applied to the same image. Statistical PR extracts a feature vector x = (0.02, 0.00, 0.02, 0.15, 0.01, 0.15, 0.13, 0.07, 0.13, 0.09, 0.14, 0.09) and decides via p(wM / x); Structural PR extracts a graph G = (V,E) (vertices v1–v5, edges e1–e4) and matches G against a model graph GM. Class models are obtained by clustering (statistical) or by model learning (structural).]

Page 7

Statistical Pattern Recognition

E representation space of patterns.

Ω = {wk}, k = 1, ..., K: partition of E into classes.

p(wk) probability of each class.

Rm feature space.

x feature vector.

f : Rm → R+ probability distribution of vectors x in the feature space.

fw: Rm → R+ probability distribution of a class w.

d: Rm → Ω decision function.

According to a Bayesian model:

[Figure: the weighted class-conditional densities p(x/A)p(A) and p(x/B)p(B) plotted over the feature axis; the region where they overlap gives the error probability.]

Problem: estimation of f(x/w).

• parametric approach: f=F(x,q1,...,qn)

• Non-parametric approach: interpolation.

p(w/x) = f(x/w) p(w) / f(x)
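As a concrete illustration of the Bayesian rule above, here is a minimal Python sketch of the parametric approach, assuming (hypothetically) two classes A and B with 1-D Gaussian class-conditional densities; all numbers are illustrative, not taken from the slides.

```python
import math

def gaussian(x, mu, sigma):
    """Parametric class-conditional density f(x|w), modelled as a 1-D Gaussian."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def posterior(x, classes):
    """p(w|x) = f(x|w) p(w) / f(x), with f(x) = sum over w of f(x|w) p(w)."""
    joint = {w: gaussian(x, mu, sigma) * prior for w, (mu, sigma, prior) in classes.items()}
    evidence = sum(joint.values())
    return {w: j / evidence for w, j in joint.items()}

# illustrative classes: (mu, sigma, p(w)) per class
classes = {"A": (0.0, 1.0, 0.5), "B": (3.0, 1.0, 0.5)}
post = posterior(1.0, classes)
decision = max(post, key=post.get)   # Bayes decision: the maximum-posterior class
```

The decision function d simply picks the class with the largest posterior, which minimizes the error probability shaded in the figure.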

Page 8

• Structural Pattern Recognition uses symbolic representational models (strings, trees, graphs, arrays, etc.) to describe visual objects.

• Symbolic representations make it possible to describe spatial, temporal, conceptual, ... relationships between primitives, as well as their hierarchical structure.

[Figure: the same visual pattern represented symbolically in three ways: as the STRING (cbab)4, as a TREE, and as a GRAPH, all built from the primitives a, b, c.]

Symbolic representation of a visual pattern

Page 9

In terms of the representation model, two major approaches can be distinguished within the structural PR field:

Recognition based on structural prototypes:
• Explicit representation using structural prototypes.
• Recognition by relational matching using an implicit distance function.

Syntactic Pattern Recognition:
• Representation using formal grammars.
• A parser is used as the recognition engine.

Syntactic PR and Structural PR. Definition

Page 10

Syntactic PR or Structural PR?

The choice between grammars and structural prototypes depends on three factors:

• Completeness of knowledge about pattern structure
• Number of samples
• Number of common substructures

[Figure: each factor is drawn as an axis running from "prefer prototypes" to "prefer grammar".]

Page 11

Syntactic PR or Structural PR?

1. Number of samples: with few samples (⇓) it is difficult to derive a grammar; with many samples (⇑) parsing is more suitable.

2. Number of common substructures: with few common substructures (⇓), a grammar requires a production for each element, so prototypes are preferable.

3. Completeness of knowledge about pattern structure: if there is no knowledge (⇓), grammatical inference is difficult and prototypes are preferable; with complete knowledge, grammars are preferable.

Page 12

Equivalence between syntactic approaches and structural prototypes

From prototypes to grammar: let w1, ..., wM be the M classes, where wi = {pi1, ..., pini} is the set of prototypes of class i. Build the productions:

S → S1 | S2 | ... | SM
Si → pi1 | pi2 | ... | pini,  i = 1, ..., M.

From grammar to prototypes: if G is a grammar and L(G) = {x1, ..., xn} is the finite language accepted by G, use each xi as a prototype.

Page 13

STRUCTURAL PATTERN RECOGNITION

Page 14

Structural Pattern Recognition

The same structure is used to represent models and unknown patterns. Recognition is performed by direct comparison between the models and the unknown pattern using a distance function. Two types of structures are used to represent patterns:

• One-dimensional structures: strings.
• Multidimensional structures: graphs.


Page 15

Shape representation using strings

[Figure: a contour encoded over a discrete set of symbols as the chain code 22221220201000070070.]

Object boundary representation using strings of symbols. Each symbol represents an atomic component of the patterns.

The symbol alphabet must be suitably chosen for each case:

Attributed strings: each boundary segment becomes a symbol with attributes, e.g. F = a(la,φa) b(lb,φb) c(lc,φc) d(ld,φd) e(le,φe), where li is the segment length and φi its angle.

Chain Codes: each boundary step becomes one of 8 direction symbols (0–7).

[Figure: a polygonal boundary with segments a–e and their (l,φ) attributes, and the 8-direction chain-code compass.]

Page 16

String edit distance

• Σ alphabet of symbols; X, Y ∈ Σ*.

• Two strings:
X = x1 x2 ... xn
Y = y1 y2 ... ym

• Edit operations:
s = (x → y) substitution
s = (λ → y) insertion
s = (x → λ) deletion

• Edit sequence S = s1, s2, ..., sk.

• Cost function: c(S) = Σi=1..k c(si).

• Distance between X and Y:
d(X,Y) = min{ c(S) : S is an edit sequence transforming X into Y }

Page 17

String edit distance. Wagner & Fischer algorithm

X = x1 x2 ... xn
Y = y1 y2 ... ym

[Figure: the (n+1)×(m+1) dynamic-programming matrix, where cell Dij holds the edit distance between the prefixes x1...xi and y1...yj.]

• D0,0 = 0;
• Di,0 = Di-1,0 + c(xi → λ), i = 1, ..., n;
• D0,j = D0,j-1 + c(λ → yj), j = 1, ..., m;
• Di,j = min{ Di,j-1 + c(λ → yj), Di-1,j + c(xi → λ), Di-1,j-1 + c(xi → yj) };
• d(X,Y) = Dn,m.

d(X,Y) is computable in O(nm) time, following the dynamic-programming principle that each solution is the minimum over extensions of partial solutions.
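The recurrences above translate almost directly into code. A minimal Python sketch of the Wagner & Fischer dynamic program, with the substitution and insertion/deletion costs left as pluggable parameters (`sub`, `indel`) since the slides keep c(·) abstract; the unit-cost defaults are an assumption:

```python
def edit_distance(x, y, sub=lambda a, b: 0 if a == b else 1, indel=lambda a: 1):
    """Wagner-Fischer DP: D[i][j] = distance between the prefixes x[:i] and y[:j]."""
    n, m = len(x), len(y)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = D[i - 1][0] + indel(x[i - 1])           # delete x_i
    for j in range(1, m + 1):
        D[0][j] = D[0][j - 1] + indel(y[j - 1])           # insert y_j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(D[i][j - 1] + indel(y[j - 1]),   # insertion
                          D[i - 1][j] + indel(x[i - 1]),   # deletion
                          D[i - 1][j - 1] + sub(x[i - 1], y[j - 1]))  # substitution
    return D[n][m]
```

With unit costs this is the classic Levenshtein distance; attributed strings would supply cost functions based on the (l,φ) attributes instead.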

Page 18

Improving the string edit distance algorithm

Merge operation: consecutive symbols may be merged and matched as a block; for substrings bc of X = abcde and c'd'e' of Y = a'b'c'd'e'f':

c(bc → c'd'e') = c(bc → x) + c(c'd'e' → y) + c(x → y)

String edit distance for cyclic strings: since the starting symbol of a closed contour is arbitrary, X is matched against the doubled string YY:

d(X, YY)

[Figure: X = abcde aligned against Y doubled: a'b'c'd'e'f' a'b'c'd'e'f'.]

Page 19

Cyclic string matching

Algorithm cyclic_string_matching(X, Y)

• Input: two strings X = x1 ... xn and Y = y1 ... ym.
• Output: the cyclic edit distance dc(X,Y) = d(X,Y'), where Y' is the substring of Y² with the smallest edit distance to X.

Begin
  D(0,0).cost := 0;
  For i = m+1 to 2m do yi := yi-m;
  For i = 1 to m do D(0,i).cost := 0;
  For i = m+1 to 2m do D(0,i).cost := ∞;
  For i = 1 to n do D(i,0).cost := D(i-1,0).cost + c(xi → λ);
  For i = 1 to n do
    For j = 1 to 2m do
    Begin
      D(i,j).cost := min{ D(i-k,j-l) + c(xi-k+1...i → x') + c(yj-l+1...j → y') + c(x' → y') : k,l = 1, ..., M };
      D(i,j).ptr := (i-k', j-l'), where k' and l' are the values of k and l at which the minimum above is attained;
    End
  dc(X,Y) := min{ D(n,i) : i = m+1, ..., 2m };
End.
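A much simpler (and slower) reference implementation of a cyclic distance: match X against every rotation of Y using the plain edit distance from slide 17. This runs in O(nm²) rather than exploiting the Y² alignment above, and assumes unit costs:

```python
def edit_distance(x, y):
    """Unit-cost string edit distance using one rolling row of the DP matrix."""
    D = list(range(len(y) + 1))
    for i in range(1, len(x) + 1):
        prev, D[0] = D[0], i                  # prev holds D[i-1][j-1]
        for j in range(1, len(y) + 1):
            prev, D[j] = D[j], min(D[j] + 1, D[j - 1] + 1,
                                   prev + (x[i - 1] != y[j - 1]))
    return D[len(y)]

def cyclic_edit_distance(x, y):
    """Try every rotation of y, i.e. every possible start point on the closed contour."""
    return min(edit_distance(x, y[k:] + y[:k]) for k in range(len(y)))
```

Useful for checking the output of the faster algorithm on small contours.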

Page 20

Application example: string matching for 2D shapes

[Figure: two polygonal 2D shapes with boundary segments labelled a–h and a'–i'; a table of pairwise matching costs (values ranging from 0.16 to 2.48) determines the optimal correspondence between the two boundaries.]

Page 21

Iconic indexing by 2D strings

Representation based on the orthogonal projections of the objects of the image.

• V = {x1, ..., xn}: set of symbols that represent the objects of the image.

• A = {<, =, :}: set of special symbols that represent spatial relationships between elements of V.

• (u,v): 2D string that represents the vertical and horizontal projections of the object bounding boxes.
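A minimal sketch of building the (u,v) projection strings, under the simplifying assumption that each object occupies a single grid cell, so only < (before) and = (same position) occur and the same-cell symbol : is not needed; the coordinates below are illustrative:

```python
def projection_string(objects, axis):
    """objects: {name: (col, row)} grid coordinates. Builds the 1D string for one axis."""
    items = sorted(objects.items(), key=lambda kv: kv[1][axis])
    out = [items[0][0]]
    for (prev_name, prev_pos), (name, pos) in zip(items, items[1:]):
        out.append("=" if pos[axis] == prev_pos[axis] else "<")
        out.append(name)
    return "".join(out)

def two_d_string(objects):
    """(u, v): the strings for the horizontal (x) and vertical (y) projections."""
    return projection_string(objects, 0), projection_string(objects, 1)

boxes = {"A": (0, 0), "B": (0, 1), "C": (2, 0)}   # illustrative positions
u, v = two_d_string(boxes)
```

Real bounding boxes with overlaps need the richer relations of the 2D C-strings on the next slide.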

Example: objects A, B, C, D are encoded as (B:A<D=C, D<B:A=C), where < means "before", = "at the same position", and : "in the same cell".

[Figure: the four objects on the grid with their bounding boxes.]

Page 22

2D C-strings

Seven object relationships are defined to describe the different overlapping situations between objects:

A<B end(A)<begin(B)

A=B begin(A)=begin(B), end(A)=end(B)

A|B end(A)=begin(B)

A%B begin(A)<begin(B), end(A)>end(B)

A[B begin(A)=begin(B), end(A)>end(B)

A]B begin(A)<begin(B), end(A)=end(B)

A/B begin(A)<begin(B)<end(A)<end(B)
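The seven relations can be classified mechanically from begin/end coordinates. A small Python sketch, with intervals as (begin, end) pairs; it assumes A starts no later than B, as in the table above (otherwise swap the arguments):

```python
def interval_relation(a, b):
    """Classify the 2D C-string relation between intervals a=(begin,end), b=(begin,end).
    a plays the role of A in the table; raises if only the swapped relation applies."""
    (ab, ae), (bb, be) = a, b
    if ae < bb:
        return '<'            # end(A) < begin(B)
    if ab == bb and ae == be:
        return '='
    if ae == bb:
        return '|'            # edge-to-edge
    if ab < bb and ae > be:
        return '%'            # A strictly contains B
    if ab == bb and ae > be:
        return '['
    if ab < bb and ae == be:
        return ']'
    if ab < bb < ae < be:
        return '/'            # partial overlap
    raise ValueError("relation not covered; try interval_relation(b, a)")
```

Applying this to the x-projections and y-projections of two bounding boxes yields their entry in the u and v strings.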

[Figure: the seven relations illustrated with interval pairs A and B.]

S.Y. Lee, F.J. Hsu, "2D C-String: A New Spatial Knowledge Representation for Image Database Systems". Pattern Recognition, Vol. 23, No. 10, pp. 1077-1087, 1990.


Page 23

Construction of a 2D C-string (I)

[Figure: objects A–F; the projection string is built incrementally in four steps:
1. D]A]E]C
2. D]A]E]C|A=C=E]B
3. D]A]E]C|A=C=E]B|B=(C[E<F)
4. D]A]E]C|A=C=E]B|B=(C[E<F)|F
with the reference object and the cutting points marked at each step.]

Cutting points: points at the end of an object where an overlap (not an inclusion) with another object is present. The cutting points define the beginning of new sub-objects.

Page 24

Construction of a 2D C-string (II)

The algorithm looks for the cutting points between objects, and the resulting 2D C-string (u,v) consists of the horizontal and vertical projections.

[Figure: the scene of objects A–F and its two projection strings:]

u = D]A]E]C|A=C=E]B|B=(C[E<F)|F
v = A]B|B<D](C|F<E)

Page 25

Pattern representation using graphs

• A graph G=(V,E) is a non-empty set V of vertices together with a set E ⊆ V×V of edges. If the edges are ordered pairs, i.e. (x,y) ≠ (y,x) for (x,y), (y,x) ∈ E, then the graph is called a directed graph.

• Let S be a set of symbolic labels. A graph G=(V,E) is called vertex labeled if there exists a function m : V → S that assigns a label to each vertex. Similarly, G is edge labeled if there exists a function u : E → S that assigns a label to each edge.

• A labeled graph is a graph G=(V,E, m, u) where m and u are respectively, two labeling functions for vertices and edges.

[Figure: left, an unlabeled graph; right, the same graph as a labeled graph with vertex labels a, b and edge labels α, β.]

Page 26

Attributed graph

An attributed graph is a 4-tuple G=(V,E,LV,LE) where:

V is the set of vertices;
E is the set of edges;
LV : V → SV × AV^k and LE : E → SE × AE^l are the labeling functions,

where SV and SE are two sets of symbolic labels, and AV and AE are sets of attribute values (k attributes per vertex and l per edge).

[Figure: an attributed graph whose vertices carry labels with attributes, e.g. (α,3), (α,3.25), (β,2.25), and whose edges carry labels such as (t,1,2), (t,0,4), (t,2,5), (r,1,20).]

Page 27

Representation Structures for Graphs

[Figure: a labeled graph with vertices v1–v5 (vertex labels a, a, c, b, b) and edges e1–e7 (with numeric edge labels), represented two ways: as an adjacency matrix with the vertex labels on the diagonal and the edge labels in the off-diagonal cells, and as adjacency lists in which each vertex points to its incident edges and neighbouring vertices.]
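In code, the two representations can be held side by side. A Python sketch with an illustrative labeled graph (the vertex and edge labels are made up, not the ones in the figure):

```python
# A small labeled graph; vertex labels are symbols, edge labels are numbers.
vertices = {"v1": "a", "v2": "a", "v3": "c", "v4": "b", "v5": "b"}
edges = {("v1", "v2"): 1, ("v2", "v3"): 3, ("v3", "v4"): 1, ("v4", "v5"): 2}

# Adjacency lists: vertex -> [(neighbour, edge label), ...]
adj = {v: [] for v in vertices}
for (u, w), lab in edges.items():
    adj[u].append((w, lab))
    adj[w].append((u, lab))   # undirected: store both directions

# Adjacency matrix: vertex labels on the diagonal, edge labels elsewhere (None = no edge)
order = sorted(vertices)
index = {v: i for i, v in enumerate(order)}
M = [[None] * len(order) for _ in order]
for v in order:
    M[index[v]][index[v]] = vertices[v]
for (u, w), lab in edges.items():
    M[index[u]][index[w]] = M[index[w]][index[u]] = lab
```

Adjacency lists scale better for sparse graphs; the matrix gives O(1) edge-label lookup for the matching algorithms that follow.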

Page 28

Graph isomorphism and subgraph isomorphism

• A graph isomorphism between G=(V,E,LV,LE) and G'=(V',E',LV',LE') is a bijective mapping f : V → V' such that the following conditions are satisfied:

• LV(v) = LV'(f(v)) for all v ∈ V.

• For any edge e=(vi,vj) ∈ E there exists an edge e'=(f(vi),f(vj)) ∈ E' such that LE(e) = LE'(e').

• For any edge e'=(vi',vj') ∈ E' there exists an edge e=(f-1(vi'),f-1(vj')) ∈ E such that LE'(e') = LE(e).

• An injective mapping f : V → V' is a subgraph isomorphism if there exists a subgraph G'' ⊆ G' such that f is an isomorphism between the graphs G and G''.
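For small graphs the definition can be checked directly by brute force over all bijections, which is exponential but a faithful transcription of the three conditions. The encoding of graphs as label dictionaries is an assumption of this sketch:

```python
from itertools import permutations

def is_isomorphic(g1, g2):
    """Brute-force labeled graph isomorphism test.
    A graph is (vertex_labels: {v: label}, edge_labels: {(u, v): label}), undirected."""
    (vl1, el1), (vl2, el2) = g1, g2
    if len(vl1) != len(vl2) or len(el1) != len(el2):
        return False
    def edge(el, a, b):
        return el.get((a, b), el.get((b, a)))   # None if the edge is absent
    v1, v2 = list(vl1), list(vl2)
    for perm in permutations(v2):
        f = dict(zip(v1, perm))                 # candidate bijection V -> V'
        inv = {w: v for v, w in f.items()}
        if (all(vl1[v] == vl2[f[v]] for v in v1) and
                all(edge(el2, f[a], f[b]) == lab for (a, b), lab in el1.items()) and
                all(edge(el1, inv[a], inv[b]) == lab for (a, b), lab in el2.items())):
            return True
    return False

g1 = ({"v1": "a", "v2": "a", "v3": "b"}, {("v1", "v2"): "x", ("v2", "v3"): "y"})
g2 = ({"w1": "b", "w2": "a", "w3": "a"}, {("w2", "w3"): "x", ("w3", "w1"): "y"})
```

The O(|V|!) cost of this enumeration is exactly what the tree-search strategies of the following slides try to avoid.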


Page 29

Graph isomorphism and subgraph isomorphism. Graphical illustration

[Figure: graph G (vertices v1–v4, labels a, a, b, b). A graph isomorphism maps G onto a four-vertex graph via f(v1)=w1, f(v2)=w2, f(v3)=w3, f(v4)=w4. A subgraph isomorphism maps G into the five-vertex graph G' via f(v1)=w2, f(v2)=w4, f(v3)=w1, f(v4)=w3.]

Page 30

Graph matching drawbacks when dealing with images

• Subgraph isomorphism is NP-complete.

Strategies are required to reduce the search space:

• Indexing mechanisms (hashing).

• Prune criteria (lookahead criteria).

• Inconsistent hypotheses rejection.

• Approximate algorithms.

• ...

• Presence of noise and distortion.

Error-tolerant subgraph isomorphism between G and G’: minimum cost edit sequence that transforms G to a graph G’’ such that there exists a subgraph isomorphism between G’’ and G’.

These methods are usually application-dependent.

Page 31

Graph matching algorithms

Optimal algorithms:
• Backtrack tree search
• Backtracking with forward checking
• Discrete relaxation
• Maximal clique search

Error-tolerant algorithms

Approximate algorithms:
• Probabilistic relaxation
• Neural networks
• Genetic algorithms

Indexed search

Page 32

Backtrack tree search

G=(V,E,LV,LE) model graph and G’=(V’,E’,LV’,LE’) candidate graph.

Incremental mapping (state-space search): at each state, a new position of the vector f[i], i = 1, 2, ..., |V|, is assigned to a candidate vertex, provided the assignment is consistent with the former ones.

[Figure: model graph G (vertices 1–3, labels a, b, a) and candidate graph G' (vertices 1–4, labels a, b, a, a); the search tree expands partial assignments (1, 3, 4, 12, 13, 14, 32, 42, 324, 423, ...) and backtracks whenever an assignment is inconsistent.]

Page 33

Forward checking

It includes a lookahead procedure that rejects incompatible mappings as soon as they can be detected, avoiding the generation of the whole subtree beneath a vertex mapping whose assignment is incompatible with any future assignment.

At each model-to-candidate vertex mapping, the algorithm checks if there exists at least one mapping for each future model vertex to some candidate vertex that satisfies the subgraph isomorphism conditions.

[Figure: the same example; forward checking prunes inconsistent branches of the search tree earlier than plain backtracking, so fewer partial assignments are expanded.]

Page 34

Graph Matching with Forward Checking Algorithm

Algorithm Graph_Matching_Forward_Checking(G, G')

• Input: two attributed graphs G=(V,E,LV,LE) and G'=(V',E',LV',LE').
• Output: an assignment vector f[1 ... |V|] representing a subgraph isomorphism from G to G'; f[i] = j means that vertex vi ∈ V is mapped to vertex wj ∈ V'.
• Let P[1 ... |V|, 1 ... |V'|] be a compatibility matrix between the vertices of G and G'.

0. Begin
1. For i = 1 to |V| do
2.   For j = 1 to |V'| do
3.     If LV(vi) = LV'(wj) then P[i,j] := 1; else P[i,j] := 0;
4. For i = 1 to |V| do f[i] := 0;
5. Return Backtracking(P,1,f);
6. End.

Page 35

Backtracking. Algorithm

Algorithm Backtracking(P, i, f)

• Input: a compatibility matrix P between the vertices of two graphs G and G'; the current level i of the search tree; a vector f representing the partial isomorphism up to level i-1.
• Output: the partial isomorphism f up to level i.

0. Begin
1. If i > |V| then return f as the subgraph isomorphism from G to G';
2. Else
3. For j = 1 to |V'| do
4.   If P[i,j] = 1 then
5.   Begin
6.     f[i] := j;
7.     P' := P;
8.     For k = i+1 to |V| do P'[k,j] := 0;
9.     If Forward_checking(P',i,f) then return Backtracking(P',i+1,f);
10.    Else f[i] := 0;
11.  End
12. End.

Page 36

Forward checking. Algorithm

Algorithm Forward_checking(P, i, f)

• Input: a compatibility matrix P between the vertices of two graphs G and G'; the current level i of the search tree; a vector f representing the partial isomorphism up to level i-1.
• Output: TRUE if for each vertex vk ∈ V (k > i) there is at least one assignment compatible with the already mapped vertices in f; FALSE otherwise.

0. Begin
1. For k = i+1 to |V| do
2.   For j = 1 to |V'| do
3.     If P[k,j] = 1 then
4.       For l = 1 to i-1 do
5.       Begin
6.         Check the following edge-consistency constraints:
             (1) for each edge e=(vk,vl) ∈ E there is an edge e'=(wj,wf[l]) ∈ E' such that LE(e)=LE'(e');
             (2) for each edge e'=(wj,wf[l]) ∈ E' there is an edge e=(vk,vl) ∈ E such that LE(e)=LE'(e');
7.         If (1) or (2) is FALSE then P[k,j] := 0;
8.       End
9. If there exists a row k in P such that P[k,j] = 0 for j = 1, ..., |V'| then return FALSE;
10. Else return TRUE;
11. End.
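The three pseudocode fragments (matching, backtracking, forward checking) can be condensed into one recursive Python sketch. The encoding of graphs as label dictionaries is an assumption; the compatibility matrix P becomes a per-vertex domain set:

```python
def subgraph_isomorphism(model, cand):
    """Backtracking with forward checking, a compact sketch of the pseudocode above.
    A graph is (vlabels: {v: label}, elabels: {(u, v): label}), undirected."""
    (vl, el), (vl2, el2) = model, cand
    V, W = list(vl), list(vl2)

    def edge(labels, a, b):
        return labels.get((a, b), labels.get((b, a)))   # None if absent

    # compatibility matrix P, kept as per-vertex domains of same-labeled candidates
    domains = {v: {w for w in W if vl[v] == vl2[w]} for v in V}

    def consistent(v, w, f):
        # edge labels to every already-mapped vertex must agree in both graphs
        return all(edge(el, v, u) == edge(el2, w, f[u]) for u in f)

    def search(i, f):
        if i == len(V):
            return dict(f)                    # complete subgraph isomorphism found
        v = V[i]
        for w in domains[v] - set(f.values()):
            if consistent(v, w, f):
                f[v] = w
                # forward checking: every future vertex must keep >= 1 option
                if all(any(x not in f.values() and consistent(u, x, f)
                           for x in domains[u]) for u in V[i + 1:]):
                    res = search(i + 1, f)
                    if res is not None:
                        return res
                del f[v]
        return None

    return search(0, {})

model = ({"1": "a", "2": "b", "3": "a"}, {("1", "2"): "e", ("2", "3"): "e"})
cand = ({"1": "b", "2": "a", "3": "a", "4": "a"}, {("1", "2"): "e", ("1", "3"): "e"})
match = subgraph_isomorphism(model, cand)   # maps "2" (the only b) to "1"
```

Comparing edge labels in both directions enforces conditions (1) and (2) of the definition on slide 28.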

Page 37

Discrete relaxation. Formulation

Consistent labeling problem: given a set O of objects, a set L of labels, and a set R of labeling constraints, generate a set of hypotheses H that assign a label to each object such that the constraints are satisfied.

The consistent labeling problem can be solved with discrete relaxation techniques:

1. Node labeling. Hypotheses about compatible mappings between vertices are generated in terms of the local configurations of the model and candidate vertices.

2. Arc-consistency verification. The graph is interpreted as a constraint graph in order to validate the consistency between labelings of neighbouring vertices.

AC4 algorithm (Arc Consistency 4) [Mohr & Henderson, 1986]: an iterative process that eliminates inadmissible labels until stability is reached.
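A toy illustration of label pruning by relaxation. This is an AC-1-style sweep rather than the support-counter bookkeeping of AC4, so it is simpler but less efficient; the domains and the compatibility predicate below are illustrative:

```python
def relax(domains, edges, compatible):
    """Iteratively delete labels that have no compatible support at a
    neighbouring vertex, until the label sets are stable."""
    changed = True
    while changed:
        changed = False
        for u, v in edges:
            for a in list(domains[u]):
                if not any(compatible(u, a, v, b) for b in domains[v]):
                    domains[u].discard(a)
                    changed = True
            for b in list(domains[v]):
                if not any(compatible(u, a, v, b) for a in domains[u]):
                    domains[v].discard(b)
                    changed = True
    return domains

# toy run: two neighbouring vertices whose labels must differ
result = relax({"x": {1, 2}, "y": {1}}, [("x", "y")], lambda u, a, v, b: a != b)
```

AC4 obtains the same fixed point but propagates deletions through support counters and a stack, as in the pseudocode on the next slides.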

Page 38

Discrete relaxation. Example

• Node labeling: a set of consistent labels Lm is assigned to each model vertex m ∈ VM, in terms of the local attributes of each vertex:
L1 = {1,3,4}, L2 = {2}, L3 = {1,3,4}

• Edge consistency: incompatible labelings of neighbouring vertices are eliminated:
L1 = {3,4}, L2 = {2}, L3 = {3,4}

If v1 ⇔ v1', no assignment for v2 is compatible with that labeling, so label 1 is removed from L1.

[Figure: the example model graph G (vertices 1–3, labels a, b, a) and candidate graph G' (vertices 1–4) to which the labelings refer.]

Page 39

Discrete relaxation. AC4 algorithm

Algorithm Graph_Matching_AC4(G, G')

• Input: two attributed graphs G=(V,E,LV,LE) and G'=(V',E',LV',LE').
• Output: |V| sets Li of admissible mappings from each model vertex to input vertices.

Initialization step:
1. For each model vertex vi ∈ V do
2. Begin
3.   Li := {wk ∈ V' : LV(vi) = LV'(wk)};
4.   For each wk ∈ Li do Si,k := ∅;
5. End.
6. For each model edge e=(vi,vj) do
7.   For each hypothesis pair (vi,wk), (vj,wl) such that
8.     vi,vj ∈ V, wk,wl ∈ V', wk ∈ Li, wl ∈ Lj and R(vi,wk,vj,wl) do
9.   Begin
10.    Add (vj,wl) to Si,k; add (vi,wk) to Sj,l;
11.    Ci,j,k := Ci,j,k + 1; Cj,i,l := Cj,i,l + 1;
12.  End
13. STACK := ∅;
14. For each Ci,j,k = 0 do
15. Begin
16.   Push(STACK,(vi,wk));
17.   Li := Li - {wk};
18. End

Page 40

Discrete relaxation. AC4 algorithm (cont.)

Relaxation step:
19. While STACK ≠ ∅ do
20. Begin
21.   (vi,wk) := pop(STACK);
22.   For each (vj,wl) such that (vi,wk) ∈ Sj,l do
23.   Begin
24.     Remove (vi,wk) from Sj,l;
25.     Cj,i,l := Cj,i,l - 1;
26.     If Cj,i,l = 0 then
27.     Begin
28.       push(STACK,(vj,wl));
29.       Lj := Lj - {wl};
30.     End
31.   End
32. End

Page 41

Comparison between tree-search procedures for graph matching

Legend:
i.l. incompatible labeling
b.ch. backward checking
f.ch. forward checking
d.r. discrete relaxation

[Figure: the complete search tree for matching G (vertices 1–3, labels a, b, a) against G' (vertices 1–4); each incompatible labeling is annotated with the procedure that first detects it (backward checking, forward checking or discrete relaxation); the earlier the detection, the smaller the explored subtree.]

Page 42

Maximal clique search in an association graph (I)

An association graph GA = (VA,EA) between two attributed graphs G and G' is defined by:

• VA ⊆ V×V' = {(v,w) : v ∈ V, w ∈ V', and LV(v) = LV'(w)}

• EA ⊆ VA×VA = {((v1,w1),(v2,w2)) : (v1,w1),(v2,w2) ∈ VA and LE((v1,v2)) = LE'((w1,w2))}

It is a way to represent the compatible model-to-candidate mappings.

[Figure: the example graphs G and G' and their association graph GA, whose vertices are the compatible pairs 11, 13, 14, 22, 31, 33, 34.]

Page 43

Maximal clique search in an association graph (II)

• Clique: set of fully connected vertices of GA.

• Maximal clique: a clique that is not contained in any other clique.

⇒ The subgraph isomorphism between G and G’ is computed in terms of the maximal cliques of the association graph GA.

[Figure: the association graph GA and its maximal cliques; the cliques {13, 22, 34} and {14, 22, 33} each define a subgraph isomorphism.]

Page 44

Maximal cliques. Algorithm

Algorithm Graph_Matching_Clique_Detection(G, G')

• Input: two attributed graphs G=(V,E,LV,LE) and G'=(V',E',LV',LE').
• Output: the set of cliques of the association graph between G and G'.

0. Begin
1. Create the association graph GA=(VA,EA) between G and G'.
2. CLIQUES := Cliques(∅, VA);
3. Return CLIQUES;
4. End.

Algorithm Cliques(C, W)

• Input: a clique C and a set W of graph vertices that includes C.
• Output: all cliques that include C and are included in W.

0. Begin
1. If no vertex in W - C is connected to all elements of C then return {C};
2. Else
3. For each v ∈ W - C connected to all elements of C do
4.   Return Cliques(C ∪ {v}, W) ∪ Cliques(C, W - {v});
5. End.

(To obtain only maximal cliques, W should be restricted to W' = {w ∈ VA : w is connected to each node of W} before recursing.)
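The recursive Cliques(C, W) scheme above transcribes naturally into Python. As the slide notes, without the extra restriction on W the leaves are cliques that are maximal only within the current W, so some non-maximal cliques of the full graph may appear as well:

```python
def cliques(adj, C=frozenset(), W=None):
    """Cliques(C, W): either extend the clique C with a vertex of W connected
    to all of C, or discard that vertex from W, and recurse on both branches.
    adj: {vertex: set of neighbours}."""
    if W is None:
        W = frozenset(adj)
    ext = [v for v in W - C if C <= adj[v]]   # candidates connected to all of C
    if not ext:
        return {C}                            # C cannot be extended within W
    v = ext[0]
    return cliques(adj, C | {v}, W) | cliques(adj, C, W - {v})

# illustrative graph: a triangle {1,2,3} with a pendant vertex 4 attached to 3
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
result = cliques(adj)
```

Applied to an association graph, each returned clique is a set of mutually compatible vertex pairs, i.e. a candidate (sub)graph isomorphism.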

Page 45

Error-tolerant graph matching (I)

The matching involves an error model, which is often application-dependent. A well-known distortion model is inspired by string matching techniques [Wagner & Fischer, 1974].

The distortion is modeled by a set of graph edit operations that includes the insertion, deletion and substitution of both nodes and edges. Each edit operation has an associated cost. An edit sequence S between two graphs G and G' is defined as an ordered sequence of edit operations transforming G into G'.

[Figure: examples of node/edge substitution (relabeling), node/edge insertion and node/edge deletion.]

Page 46

Error-tolerant graph matching (II)


⇒ An error-tolerant subgraph isomorphism between two graphs G=(V,E,LV,LE) and G'=(V',E',LV',LE') is a tuple f=(S,fS) where:

• S is a sequence of edit operations transforming G into a graph G'', and

• fS is a subgraph isomorphism between G'' and G'.

⇒ The cost γ(f) of an error-tolerant subgraph isomorphism f=(S,fS) is defined as the cost of the underlying edit sequence, i.e.

γ(f) = γ(S) = Σi γ(si)

where the si are the edit operations of the sequence S.

[Diagram: S transforms G into G''; fS maps G'' into G'.]

Page 47

Error-tolerant graph matching. Algorithm (I)

Algorithm Error-tolerant_Subgraph_Isomorphism(G, G')

• Input: two attributed graphs G=(V,E,LV,LE) and G'=(V',E',LV',LE').
• Output: a list MATCH_FOUND of optimal error-tolerant subgraph isomorphisms from G to G'.

0. Begin
1. MATCH_FOUND := ∅;
2. For each w ∈ V' ∪ {∆} do
3. Begin
4.   p := {(v1,w)};
5.   C(p) := γ(v1 → w);
6.   Insert(OPEN,p);
7. End.
8. While OPEN ≠ ∅ do
9. Begin
10.  Select p ∈ OPEN such that C(p) is minimal;
11.  If C(p) > Threshold then return MATCH_FOUND;
12.  If p represents a complete mapping from G to G' then
13.  ...

Page 48

Error-tolerant graph matching. Algorithm (II)

10.   ...
11.   If C(p) > Threshold then return MATCH_FOUND;
12.   If p represents a complete mapping from G to G’ then
13.   Begin
14.     Insert(MATCH_FOUND, p);
15.     Threshold := C(p);
16.   End
17.   Else
18.   Begin
19.     /* Let p = {(v1,wi), ..., (vk,wj)} and Vp’ = {w ∈ V’ : (v,w) ∈ p} */
20.     For each w ∈ (V’ − Vp’) ∪ {∆} do
21.     Begin
22.       p’ := {(v1,wi), ..., (vk,wj), (vk+1,w)};
23.       Compute C(p’);
24.       Insert(OPEN, p’);
25.     End
26.   End
27. End
28. Return MATCH_FOUND;
29. End.
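A much-simplified sketch of the OPEN-list search above: nodes of G are assigned one by one to nodes of G' or to None (playing the role of ∆, i.e. deletion), and the cheapest partial mapping is expanded first, Dijkstra-style. The unit costs, the graph encoding and the omission of the Threshold are assumptions of this sketch; unmatched vertices of G' are not charged.

```python
import heapq
from itertools import count

def match(G, Gp):
    """G, Gp: (labels: dict node -> label, edges: set of frozenset node pairs)."""
    labels, edges = G
    labelsp, edgesp = Gp
    order = list(labels)                          # v1, v2, ... of G
    tie = count()                                 # tie-breaker for the heap
    OPEN = [(0.0, next(tie), [])]                 # (C(p), _, partial mapping p)
    while OPEN:
        c, _, p = heapq.heappop(OPEN)
        if len(p) == len(order):                  # complete mapping from G to G'
            return c, dict(zip(order, p))
        v = order[len(p)]
        used = set(p)
        for w in [x for x in labelsp if x not in used] + [None]:
            step = 0.0
            if w is None:
                step += 1.0                       # node deletion
            elif labels[v] != labelsp[w]:
                step += 1.0                       # node substitution
            for u, wu in zip(order, p):           # implied edge edits
                has_e = frozenset((v, u)) in edges
                has_ep = (w is not None and wu is not None
                          and frozenset((w, wu)) in edgesp)
                if has_e != has_ep:
                    step += 1.0                   # edge insertion/deletion
            heapq.heappush(OPEN, (c + step, next(tie), p + [w]))

G  = ({1: "a", 2: "b"}, {frozenset((1, 2))})
Gp = ({10: "a", 20: "b"}, {frozenset((10, 20))})
print(match(G, Gp))   # an exact match costs nothing
```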

Error-tolerant graph matching. Example

Original image Isomorphism result

Model graph


Approximate algorithms for graph matching

• The approximate algorithms (continuous optimization) do not build a state-space of possible mappings, but minimize a similarity function.

• Advantages:

• Solution in polynomial time.

• Applicable to both graph and subgraph isomorphism.

• Drawbacks:

• The minimization process can get stuck in a local minimum and therefore may not find the optimal solution.

Probabilistic relaxation

Similar to discrete relaxation but the compatibility between assignments does not have a binary formulation but is defined in terms of a probability function.

P(0)(v→w) is defined in terms of the affinity between the labels LV(v) and LV’(w).

Probability of the assignment v → w at iteration n+1:

    P(n+1)(v→w) = P(n)(v→w) · Q(n)(v→w) / Σ_{u∈V’} P(n)(v→u) · Q(n)(v→u)

Support function that combines the evidence of the neighboring vertices Gw:

    Q(n)(v→w) = Π_{y∈Gw} Σ_{x∈V} P(n)(x→y) · R(v→w, x→y)

Compatibility function, defined in a Bayesian framework:

    R(v→w, x→y) = P(x→y | v→w) / ( P(x→y) · P(v→w) )
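One iteration of the update rule can be sketched as follows; the toy compatibilities R and the initial probabilities are assumptions for illustration:

```python
def relax_step(P, R, V, Vp, neigh_w):
    """One probabilistic-relaxation iteration.
    P[(v, w)]: current probability of the assignment v -> w.
    neigh_w[w]: the neighbourhood G_w of w (product in the support function)."""
    Q = {}
    for v in V:
        for w in Vp:
            q = 1.0
            for y in neigh_w[w]:                           # product over y in G_w
                q *= sum(P[(x, y)] * R[(v, w, x, y)] for x in V)
            Q[(v, w)] = q
    Pn = {}
    for v in V:
        z = sum(P[(v, u)] * Q[(v, u)] for u in Vp)         # normalisation over u
        for w in Vp:
            Pn[(v, w)] = P[(v, w)] * Q[(v, w)] / z if z else 0.0
    return Pn

# Toy problem (assumed data): assignments 1->a and 2->b are mutually compatible.
good = {(1, "a"), (2, "b")}
R = {(v, w, x, y): 2.0 if (v, w) in good and (x, y) in good else 0.5
     for v in (1, 2) for w in "ab" for x in (1, 2) for y in "ab"}
P = {(v, w): 0.5 for v in (1, 2) for w in "ab"}
P = relax_step(P, R, [1, 2], ["a", "b"], {"a": ["b"], "b": ["a"]})
print(round(P[(1, "a")], 3))   # the consistent assignment is reinforced: 0.714
```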

Other graph matching approximate algorithms

• Neural networks. The most used are Hopfield networks, where the nodes represent mapping pairs (v,w) (v ∈ V, w ∈ V’) and the connection weights represent the compatibilities of assignments (R(v1,w1,v2,w2)). The key issue is the initialization of the network.

• Genetic algorithms. An iterative process in which, by means of genetic operators, the best of a set of candidate solutions is propagated to the set of successors. A solution is a set of genes (each gene is a mapping (v,w)). The most commonly used operators are crossover, reordering and mutation of solutions.

• Numerical methods:

• Decomposition of the adjacency matrices A(G) and A(G’) into principal components (eigendecomposition).

• Linear programming:

    min_P || A(G) − P A(G’) P^T ||_1

where P is a permutation matrix of size |V|×|V|.

⇒ These methods only work when G and G’ have the same number of vertices.
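For very small graphs the objective above can be checked by brute force over all permutation matrices; this is only an illustration of the criterion, not a practical algorithm:

```python
from itertools import permutations

# Brute-force version of  min_P || A(G) - P A(G') P^T ||_1  over permutations,
# feasible only for tiny graphs (illustrative sketch).
def best_permutation(A, Ap):
    n = len(A)
    assert n == len(Ap), "only defined for graphs with the same number of vertices"
    best = None
    for p in permutations(range(n)):
        # entrywise L1 distance between A and the permuted A'
        d = sum(abs(A[i][j] - Ap[p[i]][p[j]]) for i in range(n) for j in range(n))
        if best is None or d < best[0]:
            best = (d, p)
    return best

A  = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # path with centre vertex 1
Ap = [[0, 1, 1], [1, 0, 0], [1, 0, 0]]   # same path, relabelled (centre 0)
print(best_permutation(A, Ap))            # distance 0: the graphs are isomorphic
```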

Indexed search

⇒ Look for the presence of a graph in a graph database.

Diagram: the graph to be searched is mapped by a hash function to a hash code, which indexes a hash table over the graph database.

Example: organization of the database as a decision tree, based on the idea that the graphs G and G’ are isomorphic iff there exists a permutation matrix P such that A(G’) = P A(G) P^T, where:

• A(G) is the adjacency matrix of graph G,
• pij ∈ {0,1} for i,j = 1, ..., n,
• Σ_{i=1..n} pij = 1 for j = 1, ..., n,
• Σ_{j=1..n} pij = 1 for i = 1, ..., n.

Indexed search. Example (Messmer, 95)

The example (figure) lists the row/column permutations A–F of the adjacency matrix of a graph G1 (node labels a, a, b) and the permutations A’–F’ of a graph G2 (node labels a, a, a). The permuted matrices are organized in a decision tree whose branches are matrix rows; permutations leading to the same matrix share leaves (e.g. A’, D’, E’ and B’, C’, F’), so an unknown graph is matched against the whole database in a single traversal of the tree.

Random graphs

⇒ Graphs that involve a probabilistic description of data (combine structural and statistical PR). They can be seen as attributed graphs that have a probability distribution associated to the attributes of vertices and edges.

• A random graph RG=(W,B) is a set of random vertices W={α1,...,αm} and random edges B={...,βpq,...}.

• Probability of an instance G of RG under the isomorphism I: G → RG, where I maps each vertex v and edge e of G to outcomes a and b of the corresponding random elements:

    P(G,I) = Pr(α = a, ∀v ∈ W; β = b, ∀e ∈ B)

• Entropy of a random graph (a measure of its variability):

    H(RG) = − Σ_{(G,I)} P(G,I) log P(G,I)

• The isomorphism between two random graphs is defined in terms of an optimization process:

    min_{I} H(R)  with  R = RG ∪I RG’

(R is the random graph obtained by merging RG and RG’ under the mapping I).

Random graphs. Example

The example (figure) models hand-drawn symbols. Relation codes: u (above), v (bending), w (branching), x (crossing), y (near, gap), z (left of); primitives a–f. Twelve samples are merged into a single random graph whose vertices and edges carry attribute probability distributions, e.g. a(.25) b(.70) c(.05), d(.40) e(.55) f(.05), d(.05) e(.95).

Graph matching methods compared

Method | Graph isom. | Subgraph isom. | Error-tolerant subgr. isom. | Optimal | Complexity class
Backtrack tree search | Yes | Yes | Yes | Yes | NP
Forward checking | Yes | Yes | No | Yes | NP
Discrete relaxation | Yes | Yes | Yes (1) | Yes | NP (2)
Clique finding | Yes | Yes | No | Yes | NP
Graph edition | Yes | Yes | Yes | Yes | NP
Random graphs | Yes | Yes | Yes | Yes | NP
Probabilistic relaxation | Yes | Yes | Yes | No | P
Neural nets | Yes | Yes | Yes | No | P
Genetic algorithms | Yes | Yes | Yes | No | P
Eigendecomposition | Yes | No | No (3) | Yes | P
Linear programming | Yes | No | No | Yes | P
Indexed search | Yes | Yes | No | Yes | P (4)

(1) In some cases. (2) If backtracking follows relaxation. (3) Although it is able to find error-tolerant graph isomorphisms between close graphs. (4) Although the compilation of the database is NP.


SYNTACTIC PATTERN RECOGNITION

Syntactic Recognition

Definitions
Formal Grammars to represent patterns
Classification of formal grammars to represent patterns
    One-dimensional Grammars (string grammars)
    N-dimensional Grammars
Recognition under noise or distortion
    Stochastic Grammars
    Syntactical analysis (parsing) with error correction

Formal Grammars. Definitions (I)

• An alphabet Σ is a finite set of symbols.

• A word x over the alphabet Σ is a sequence of symbols x = a1 ... an with ai ∈ Σ, i = 1, ..., n.

• The empty word λ is the sequence with no symbols.

• Σ* is the set of all words over the alphabet Σ. For example, if Σ = {a, b}, then Σ* = {λ, a, b, aa, ab, ba, bb, aaa, aab, ...}.

• Σ+ is the set of all non-empty words over Σ, that is, Σ+ = Σ* − {λ}.

Formal Grammars. Definitions (II)

A formal grammar is a 4-tuple G=(N,T,P,S) where

• N is a finite set of non-terminal symbols,

• T is a finite set of terminal symbols,

• P = {p} is a finite set of productions or rewriting rules, where p = α→β, α ∈ (N∪T)+ and β ∈ (N∪T)*,

• S ∈ N is the start symbol of the grammar.

Condition: N∩T = ∅

Example: grammar for arithmetic expressions
G=(N,T,P,S)
N = {S, E, TER, F}
T = {d, ×, +, (, )}
P = { S → E,
      E → E + TER | TER,
      TER → TER × F | F,
      F → ‘(’ E ‘)’ | d }
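A recursive-descent recognizer for this arithmetic grammar can be sketched as follows; rewriting the left-recursive productions as loops, and letting 'x' stand for × and any digit for d, are assumptions of this sketch:

```python
def parse(s):
    """Raise SyntaxError on malformed input; return True iff all of s is consumed."""
    pos = 0
    def peek():
        return s[pos] if pos < len(s) else None
    def expr():                      # E -> TER ('+' TER)*
        nonlocal pos
        term()
        while peek() == '+':
            pos += 1
            term()
    def term():                      # TER -> F ('x' F)*
        nonlocal pos
        factor()
        while peek() == 'x':
            pos += 1
            factor()
    def factor():                    # F -> '(' E ')' | d
        nonlocal pos
        if peek() == '(':
            pos += 1
            expr()
            if peek() != ')':
                raise SyntaxError("expected ')'")
            pos += 1
        elif peek() is not None and peek().isdigit():
            pos += 1
        else:
            raise SyntaxError("expected digit or '('")
    expr()
    return pos == len(s)

def accepts(s):
    try:
        return parse(s)
    except SyntaxError:
        return False

print(accepts("3+4x(2+1)"))  # True
print(accepts("3+"))         # False
```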

Formal Grammars. Definitions (III)

Let G=(N,T,P,S) be a grammar and p = α→β a production in P.

• Any word v = xαy, with x,y ∈ (N∪T)*, is directly derivable to the word w = xβy. This is denoted v → w.

• v →* w denotes that there exists a sequence of words v = v0, v1, ..., vn = w such that vi → vi+1, i = 0, 1, ..., n−1.

• The language generated by G is defined as L(G) = {x | x ∈ T*, S →* x}.

Example: G=(N,T,P,S), N={S,A,B}, T={a,b,c}, P = { S → cAb, A → aBa, B → aBa | cb }

Language generated by G: L(G) = {c a^n c b a^n b ; n≥1}

Derivation sequence for n=2: S → cAb → caBab → caaBaab → caacbaab.

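The derivation and the membership test for L(G) = {c a^n c b a^n b ; n≥1} can be checked directly:

```python
import re

def in_L(word):
    """Membership test for L(G) = { c a^n c b a^n b : n >= 1 }."""
    m = re.fullmatch(r"c(a+)cb(a+)b", word)
    return bool(m) and len(m.group(1)) == len(m.group(2))

def derive(n):
    """The word produced by S -> cAb -> ... for a given n >= 1."""
    return "c" + "a" * n + "cb" + "a" * n + "b"

print(derive(2))          # caacbaab, as in the derivation sequence above
print(in_L(derive(2)))    # True
print(in_L("cacbaab"))    # False (unbalanced a's)
```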

Formal Grammars. Definitions (IV)

Chomsky Hierarchy:
• Unrestricted grammars (type 0): no restrictions on the form of the productions.

• Context-sensitive grammars (type 1): each production is of the form xAy → xzy, where x,y ∈ (N∪T)*, A ∈ N and z ∈ (N∪T)+.

• Context-free grammars (type 2): each production is of the form A → z, where A ∈ N and z ∈ (N∪T)*.

• Regular grammars (type 3): each production is of the form A → aB or A → a, where A,B ∈ N and a ∈ T.

Formal Grammars to represent patterns

• When the terminal alphabet of a grammar is a set of graphic primitives and their combinations have a topological meaning, a class of patterns can be represented by a formal grammar.
• Syntactical analysis (parsing). Deciding whether a pattern x belongs to the class represented by a grammar G amounts to answering the question: x ∈ L(G)?

G=(N,T,P,S)
N={S,A,B}
T={a,b,c}
P = { S → cAb, A → aBa, B → aBa | cb }
L(G) = {c a^n c b a^n b}

Graphical interpretation (figure): the terminals a, b and c are drawn as line primitives, so every word of L(G) corresponds to a planar contour obtained by concatenating those primitives.

Formal Grammars to represent patterns. Example (figure)

Formal Grammars to represent patterns

Computer vision tasks vs. syntactical recognition stages (diagram):

• Processing: Image → Filtered image.
• Analysis: filtered image → contours, regions, etc.; on the recognition side, Lexicographic Analysis (scanner), which uses the terminal alphabet.
• Interpretation: Syntactical Analysis (parser), which uses the grammar G to answer I ∈ L(G)?; then Semantical Analysis, which uses G plus semantical rules and domain knowledge to reach Image Understanding.

Syntactical analysis main concepts (parsing) (I)

Let G=(N,T,P,S) be a context-free grammar. A derivation tree is a tree such that:

• each leaf is labeled with a symbol a ∈ T,

• each internal node is labeled with a symbol A ∈ N,

• the root is labeled with the start symbol S,

• if a node labeled A ∈ N has successors labeled x1,...,xn ∈ (N∪T), then P contains a production A → x1...xn.

Example: S → cAb, A → aBa, B → aBa | cb

Derivation tree for the word caacbaab (figure): the root S expands to c A b, A expands to a B a, the inner B expands to a B a, and the innermost B expands to c b.

Syntactical analysis main concepts (parsing) (II)

x ∈ L(G) ⇔ a derivation tree for the word x ∈ T* can be constructed with the grammar G.

• Top-down parser: the derivation tree is built from the root to the leaves by applying the grammar productions.

• Bottom-up parser: the derivation tree is built from the leaves to the root by applying the grammar productions in the backward direction.

⇒ If the grammar is not context-free, then instead of a derivation tree we have a derivation graph.

Formal Grammars to represent patterns

One-dimensional Grammars (string grammars)
    PDL
    Plex grammars
N-dimensional Grammars
    web grammars
    graph grammars
    tree grammars
    array grammars
Attributed Grammars

One-dimensional Grammars: PDL

Picture Description Language (PDL): based on the concatenation of graphical primitives following a head-to-tail scheme. Each primitive (a, b, c, d in the figure) has a head (hd) and a tail (tl); the operators +, −, × and * combine the heads and tails of their operands, and ! reverses head and tail.

G=(N,T,P,S)
N = {S, A, CASA, TRIANGLE}
T = {a, b, c, d, (, ), +, −, ×, *, !}
P = { S → A | CASA,
      A → b+((TRIANGLE)+c),
      CASA → (d+(a+(!d)))*(TRIANGLE),
      TRIANGLE → (b+c)*a }

L(G) = { b+(((b+c)*a)+c), (d+(a+(!d)))*((b+c)*a) }

One-dimensional Grammars: Plex Grammars

• Nape (n-attaching-point entity): a symbol with n attaching points, represented as identifier(pc1, pc2, ..., pcn).

• Plex structure: a structure formed by joining napes. It has three components:
[ list of napes,
  list of internal connections among the napes,
  list of contact points with other napes or plex structures ]

• Plex grammar: a one-dimensional grammar able to generate plex structures.

Example (figure: letters L, H and F built from the napes ver(1,2,3), hor(1,2) and HELP(1,2,3)):
LETTER → TWOPART( )
LETTER → THREEPART( )
TWOPART( ) → L( )
L( ) → (ver, hor)(31)( )
THREEPART( ) → H( ) | F( )
H( ) → (HELP, ver)(22)( )
F( ) → (HELP, hor)(11)( )
HELP(1,2,3) → (ver, hor)(21)(10, 02, 30)

N-dimensional Grammars. Definitions (I)

PDL and Plex are one-dimensional grammars that represent multidimensional patterns by adding special relational symbols. The grammars described next are inherently multidimensional.

• A graph g=(V,E) is a non-empty finite set V of nodes with a set E ⊆ V×V of edges. If the edges are ordered pairs, i.e. (x,y) and (y,x) are distinct edges, the graph is a directed graph.

• Let S be a set of symbolic labels. A graph g=(V,E) is node-labeled if there exists a function µ: V → S that assigns a label to each node. Similarly, g is edge-labeled if there exists a function υ: E → S that assigns a label to each edge.

• A web is a node-labeled undirected graph. A web grammar is a formal grammar G=(N,Σ,P,S) where
  • N is the non-terminal alphabet,
  • Σ is the set of node labels, i.e. the terminal alphabet,
  • P is the set of productions, and
  • S is the start symbol.
The productions rewrite webs into other webs.

N-dimensional Grammars. Definitions (II)

Let S be a set of labels and Vα, Vβ the node sets of the webs α and β respectively. A web production is a triplet (α, β, f), where f : Vβ × Vα → 2^S (the set of subsets of labels) specifies how to join the nodes of β to the neighbors of each node of α.

The function f is called the embedding rule.

Example (figure): the production (α, β, f(B,A) = {C, D}) means: substitute α by β and connect the node B (in β) to those neighbors of the node A (in α) whose labels are C or D.

Web grammars. Example

G=(N,T,P,S)
N={S,A}
T={c, g, e, l}

P (figure): S → A, plus two productions that rewrite A into a subweb built around an l node connected to c and g nodes, one of which contains A again for recursion; both use the embedding rule f(l,A) = {c, e, l}.

Generated webs (figure): chains of l nodes, each attached to c and g nodes, terminated by an e node.

Web grammars. Example 2

G=(N,T,P,S)
N={B,C,S}
T={a}

P (figure): productions (α, β, f) with embedding rules such as f(a,S) = {a}, f(a,B) = {a}, f(a,C) = {a} and f(a,a) = {a}, rewriting S through the intermediate labels B and C into webs whose nodes are all labeled a.

Possible patterns (figure): small connected webs of a-labeled nodes.

Web grammars. Particular cases

Graph Grammars: a particular case of web grammars in which the terminal vocabulary has a unique symbol; that is, grammars whose underlying graphs are unlabeled.

Tree Grammars: a production α → β rewrites the subtree α of the tree w into the subtree β. The parent of the subtree α becomes the parent of the root of β, so no embedding function f needs to be defined.

Array Grammars: generate languages whose elements are connected arrays of symbols. An array grammar generates an array by placing symbols of an alphabet at coordinates of the plane.

Tree Grammar: an example

G=(N,T,P,S)
N = {S, A1, A2, A3, A4, A5, A6, A7, A8}
T = {$, a, b, c, d, e}

P (figure): S → $(A1 A2), with further productions A1 → a, A2 → c, A3 → d, A4 → b, A5 → c, A6 → d, A7 → c, A8 → d, each expanding into subtrees over the primitives a–e; the derived trees, rooted at $, describe an object drawn with those line primitives.

Tree Grammar. Example 2

G=(N,T,P,S)
N = {S, A}
T = {$, c, e, g, l}

P (figure): S → $ with children e and A, and two productions for A (one recursive, one terminal) over the primitives l, c and g, generating the same family of patterns as the web-grammar example: chains of l primitives with attached c and g primitives, closed by an e.

Array grammar. Example 1

The generated array (figure) is a rectangle of a symbols embedded in a background of # symbols.

G=(N,T,P,S)
N = {S, A, B, C, D}
T = {#, a}

P (figure): productions such as S# → aS, S → A and S → a rewrite pairs of adjacent cells; the auxiliary symbols A, B, C and D travel across the array, generating the rows of a's one after another.

Siromoney array grammars: Example 2

The generated array (figure) is a rectangular texture built from the primitives a1, a2, b1, b2.

G=(G1,G2)
G1 = ({S}, {S1,S2}, {S → S1 S2 S, S → S1 S2}, S)
G2 = <G21, G22>
G21 = ({S1,A1,B1}, {a1,b1}, {f1,g1}, {S1 → a1 A1 f1, A1 → a1 A1 g1, A1 → b1 B1, B1 g1 → a1 B1, B1 f1 → a1}, S1)
G22 = ({S2,A2,B2}, {a2,b2}, {f2,g2}, {S2 → a2 A2 f2, A2 → a2 A2 g2, A2 → b2 B2, B2 g2 → a2 B2, B2 f2 → a2}, S2)

In general, G=(G1,G2) where:

G1=(N,I,P,S) (horizontal grammar) is a context-free grammar where N is the non-terminal alphabet, I is the set of intermediate symbols {S1,...,Sk} with N∩I = ∅, and S ∈ N is the start symbol.

G2 = <G21,...,G2k>, where each G2i (vertical grammar) is G2i=(Ni,T,Fi,Pi,Si), with Ni the non-terminal alphabet, T the terminal alphabet, Fi the alphabet of indices, Pi a set of indexed productions, and Si ∈ Ni, Si ∈ I the start symbol.

Attribute Grammars

⇒ Attribute grammars add quantitative information (segment length and orientation, texture parameters of a region, 3D surface orientation, etc.) to the structural information given by the productions.

• Each symbol Y ∈ V (V=N∪T) is augmented with a vector of attribute values m(Y) = (x1,...,xk), where an attribute is a function α: Y → DY. Then m(Y) = (α1(Y),...,αk(Y)).

• Relation between the attributes of the two sides of a production A1...An → B1...Bm (Ai, Bj ∈ V):

αr(Ai) = fi(αr(B1), ..., αr(Bm)), i = 1, ..., n   (synthesized attributes)
αs(Bi) = fi(αs(A1), ..., αs(An)), i = 1, ..., m   (inherited attributes)

Example of synthesized attributes (L(G) = {c a^n c b a^n b}), with l the length attribute:

l(S) = l(c) + l(A) + l(b), for S → cAb,
l(A) = 2 l(a) + l(B), for A → aBa,
l(B) = 2 l(a) + l(B), for B → aBa,
l(B) = l(c) + l(b), for B → cb.
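Evaluating the synthesized length attribute bottom-up along a derivation of c a^n c b a^n b can be sketched as:

```python
# Lengths of the terminal symbols (each primitive has length 1 here).
l = {"a": 1, "b": 1, "c": 1}

def length_B(depth):
    """l(B) after applying B -> aBa 'depth' times and then B -> cb."""
    if depth == 0:
        return l["c"] + l["b"]                   # l(B) = l(c) + l(b)
    return 2 * l["a"] + length_B(depth - 1)      # l(B) = 2 l(a) + l(B)

def length_S(n):
    """l(S) for the word c a^n c b a^n b (n >= 1)."""
    l_A = 2 * l["a"] + length_B(n - 1)           # l(A) = 2 l(a) + l(B)
    return l["c"] + l_A + l["b"]                 # l(S) = l(c) + l(A) + l(b)

print(length_S(2))   # 8 == len("caacbaab")
```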

Parsing with noise and distortion

• Introduction of specific productions that model the error

• Stochastic Grammars

• Syntactical Analysis (parsing) with error correction

Productions that model errors

G=(N,T,P,S)
N={S,A,B}
T={a,b,c}
P = { S → cAb, A → aBa, B → aBa | cb }
L(G) = {c a^n c b a^n b}

The grammar is augmented with error productions to cover the distorted patterns (figure: the central cb kernel replaced by dd):

G’=(N’,T’,P’,S)
N’={S,A,B,ERROR}
T’={a,b,c,d}
P’ = { S → cAb, A → aBa, B → aBa | cb | ERROR, ERROR → dd }
L(G’) = {c a^n c b a^n b} ∪ {c a^n d d a^n b}

Stochastic Grammars

• A stochastic grammar is Gs = (N,T,Ps,S), where N, T, S are defined as in formal grammars and Ps is a finite set of productions of the form pij : Ai → Xij with

    Ai ∈ N;  Xij ∈ V+;  0 < pij ≤ 1;  Σ_{j=1..ni} pij = 1;  i = 1, ..., m.

• Any stochastic grammar has a characteristic grammar G=(N,T,P,S), the grammar obtained by deleting the probability pij of each production.

• If x ∈ L(G) there exists a derivation S = w0 → w1 → ... → wk = x; the probability that x is generated by Gs is the product of the probabilities of the productions ri applied in the derivation:

    p(x | Gs) = Π_{i=1..k} p(ri)

• Classification: x ∈ L(Gi^s) if p(x | Gi^s) = max_j { p(x | Gj^s) }, j = 1, ..., N.

• Bayes:  p(Gi^s | x) = p(x | Gi^s) p(Gi^s) / p(x) ≈ p(x | Gi^s),

since p(x) has no influence and the N classes are equiprobable with p(Gi^s) = 1/N.
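Computing p(x | Gs) for a stochastic version of the running example; the production probabilities 0.6/0.4 are illustrative assumptions:

```python
from math import prod

# Stochastic variant of the example grammar: B -> aBa (0.6) | cb (0.4);
# the probabilities of the alternatives for B sum to 1 as required.
p_prod = {"S->cAb": 1.0, "A->aBa": 1.0, "B->aBa": 0.6, "B->cb": 0.4}

def p_word(n):
    """p(c a^n c b a^n b | Gs): one S and one A production,
    n-1 applications of B->aBa, then B->cb."""
    rules = ["S->cAb", "A->aBa"] + ["B->aBa"] * (n - 1) + ["B->cb"]
    return prod(p_prod[r] for r in rules)

print(p_word(2))   # 1.0 * 1.0 * 0.6 * 0.4 = 0.24
```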

Error-correcting parsers

Recognition with error correction: find the distance between I and J, where I is the pattern to recognize and J is the pattern in L(G) nearest to I:

    D(L(G),I) = min{ D(ξ,I) | ξ ∈ L(G) }

1WLD (one-dimensional weighted Levenshtein distance) for one-dimensional grammars:

    D(X,Y) = min{ p·#substitutions + q·#insertions + r·#deletions }

Parsers allow substitutions, insertions and deletions in their productions:

A → αbβ with cost w,
A → αaβ with cost w + p,
A → αβ with cost w + r,
A → αabβ with cost w + q,
A → αbaβ with cost w + q.
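The 1WLD itself is the classical dynamic program; a sketch with weights p (substitution), q (insertion) and r (deletion):

```python
def wld(x, y, p=1, q=1, r=1):
    """Weighted Levenshtein distance between strings x and y."""
    n, m = len(x), len(y)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = D[i - 1][0] + r                # delete x[i-1]
    for j in range(1, m + 1):
        D[0][j] = D[0][j - 1] + q                # insert y[j-1]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = 0 if x[i - 1] == y[j - 1] else p
            D[i][j] = min(D[i - 1][j - 1] + sub,  # substitution / match
                          D[i][j - 1] + q,        # insertion
                          D[i - 1][j] + r)        # deletion
    return D[n][m]

print(wld("caacbaab", "caacdaab"))   # 1: one substitution b -> d
```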

Grammar expansion to allow error productions

Context-free grammar G=(N,T,P,S)  →  error-correcting grammar G’=(N’,T’,P’,S’), with N’ = N ∪ {S’} ∪ {Ea | a ∈ T} and T’ ⊇ T.

Construction of P’:

1) For each A → a0 b1 a1 b2 ... bm am ∈ P (m ≥ 0, ai ∈ N* and bi ∈ T), add

   A → a0 Eb1 a1 Eb2 ... Ebm am   with cost 0

   (Ebi ∈ N’, and 0 is the cost of this production).

2) Add to P’:

   S’ → S        cost 0
   S’ → Sa       cost c.ins     for all a ∈ T’
   Ea → a        cost 0         for all a ∈ T
   Ea → b        cost c.subs    for all a ∈ T, b ∈ T’, b ≠ a
   Ea → λ        cost c.del     for all a ∈ T
   Ea → bEa      cost c.ins     for all a ∈ T, b ∈ T’
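Step 2 of the construction can be sketched as a generator of error productions with symbolic costs; the string encoding of productions is an assumption of this sketch:

```python
def error_productions(T, T_prime):
    """Generate the E_a productions of step 2 for terminals T and T' >= T."""
    P = []
    for a in T:
        P.append((f"E{a} -> {a}", 0))                    # no error
        for b in T_prime:
            if b != a:
                P.append((f"E{a} -> {b}", "c.subs"))     # substitution
        P.append((f"E{a} -> lambda", "c.del"))           # deletion
        for b in T_prime:
            P.append((f"E{a} -> {b} E{a}", "c.ins"))     # insertion
    return P

for prod, c in error_productions({"a"}, {"a", "b"}):
    print(prod, c)
```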

Error-correcting grammar. Example

G=(N,T,P,S)
N={S,A,B}
T={a,b,c}
P = { S → cAb, A → aBa, B → aBa | cb }
L(G) = {c a^n c b a^n b}

G’=(N’,T’,P’,S’)
N’={S’,S,A,B,Ea,Eb,Ec}
T’={a,b,c}
P’ = S’ → S 0

S’ → Sa ci

S’ → Sb ci

S’ → Sc ci

S → EcAEb, 0

A → EaBEa, 0

B → EaBEa 0

B → EcEb 0

Ea→ a 0

Eb→ b 0

Ec→ c 0

Ea→ b cs

Ea→ c cs

Eb→ a cs

Eb→ c cs

Ec→ a cs

Ec→ b cs

Ea→ λ cd

Eb→ λ cd

Ec→ λ cd

Ea→ aEa ci

Eb→ bEb ci

Ec→ cEc ci


STRUCTURAL LEARNING

Structural learning

Inductive learning: the goal is to obtain a common representation model (a class representative) from a set of examples.

• x1, ..., xn: set of examples represented by a structural model.
• Ω: set of all possible patterns.
• d(x,y): distance function between two patterns x,y ∈ Ω.

    m = argmin_{z∈Ω} Σ_i d(z, xi)

• Grammatical inference.
• Mean string.
• Mean graph.

In general, these are NP-complete problems.

Regular grammar inference. Automata (I)

• A regular grammar (productions of the form A → aB | c) is equivalent to a deterministic finite automaton A=(T,Q,δ,q0,F) where:

  • T is the alphabet,
  • Q is the set of states,
  • δ : Q×T → Q is the transition function between states (extended to Q×T* in the usual way),
  • q0 is the initial state,
  • F ⊆ Q is the set of final states.

• The language accepted by A is L(A) = {x ∈ T* | δ(q0,x) ∈ F}.
• The non-terminal alphabet of the grammar corresponds to the set of states of the automaton.

Example (the automaton is shown in the figure):

Q0 → aQ1 | bQ2
Q1 → aQ0 | bQ3
Q2 → aQ3 | bQ2 | λ
Q3 → aQ3 | bQ3

L = {a^(2n) b^m | n≥0, m≥1} = (aa)* b b*

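The automaton of the example can be simulated directly from its transition table:

```python
# Transition table of the example automaton; q3 is a trap state.
delta = {("q0", "a"): "q1", ("q0", "b"): "q2",
         ("q1", "a"): "q0", ("q1", "b"): "q3",
         ("q2", "a"): "q3", ("q2", "b"): "q2",
         ("q3", "a"): "q3", ("q3", "b"): "q3"}
final = {"q2"}   # Q2 has the production Q2 -> lambda

def accepts(word, q="q0"):
    for sym in word:
        q = delta[(q, sym)]
    return q in final

# Words of (aa)*bb* are accepted, others rejected.
print([w for w in ("b", "aab", "aabbb", "ab", "aaab") if accepts(w)])
```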

Regular grammar inference. Automata (II)

• If δ : Q×T → Q* (a state and a symbol may lead to several states), the automaton is non-deterministic.
• A non-deterministic automaton can be transformed into a deterministic one by searching for equivalence classes among its states.

Example (figure): for L = {aba, abb}, the non-deterministic automaton with branches q0–q1–q2–q3 and q0–q4–q5–q6 (plus a trap state q7) is determinized by merging the equivalent states [q1,q4], [q2,q5] and [q3,q6].

Regular grammar inference from examples

Successors method: given a set I of sample words, a state is defined for each symbol of the alphabet (plus one for the empty prefix λ). If no counterexamples are given, an automaton A is obtained such that I ⊆ L(A).

I = {caab, bbaab, caab, bbab, cab, bbb, cb}

State Qλ ≡ successors of λ in I: (c, b).
State Qa ≡ successors of a in I: (a, b).
State Qb ≡ successors of b in I: (a, b).
State Qc ≡ successors of c in I: (a, b).

Resulting grammar (the automaton is shown in the figure):

Qλ → bQb | cQc
Qa → aQa | bQb
Qb → aQa | bQb | λ
Qc → aQa | bQb

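A sketch of the successors method; the state names and the dictionary encoding of the automaton are assumptions of this sketch:

```python
def infer(I):
    """One state per symbol (plus Q_lambda for the empty prefix);
    a transition (Q_s, t) -> Q_t is added whenever t follows s in a sample."""
    trans = {}
    finals = set()
    for word in I:
        state = "Q_lambda"
        for sym in word:
            trans[(state, sym)] = "Q_" + sym
            state = "Q_" + sym
        finals.add(state)            # the state where a sample ends is final
    return trans, finals

I = ["caab", "bbaab", "caab", "bbab", "cab", "bbb", "cb"]
trans, finals = infer(I)
print(sorted(set(trans.values())))   # states reached: ['Q_a', 'Q_b', 'Q_c']
print(finals)                        # every sample ends in b: {'Q_b'}
```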

Mean string

• S= x1, ..., xn set of strings on the alphabet Σ S ⊆ Σ*

• d(x,y) string edit distance.∑

∈Σ∈

=Syz

yzdminx ),(arg*

)]()([),,( 1111

nn in

i

m

in

i xmcxmcminxx →++→=∂∪Σ∈

KKλ

When strings represent 2D shapes:• Cyclic strings.• Merging cost must be considered to tolerate distortion and scale invariance.• The alphabet may not be finite.

Mean symbol


Following the Wagner & Fischer approach to computing the string edit distance, the mean string can be computed with an n-dimensional matrix, evaluating the following expression at each position:

D_{i1,...,in} = min [
  D_{i1-1,...,ij-1,...,in-1} + δ(x1_{i1}, ..., xj_{ij}, ..., xn_{in}),
  D_{i1-1,...,ij-1,...,in}   + δ(x1_{i1}, ..., xj_{ij}, ..., λ),
  ...,
  D_{i1,...,ij-1,...,in-1}   + δ(λ, ..., xj_{ij}, ..., xn_{in}),
  D_{i1-1,...,ij,...,in}     + δ(x1_{i1}, ..., λ, ..., λ),
  ...,
  D_{i1,...,ij,...,in-1}     + δ(λ, ..., λ, ..., xn_{in}),
  ... ]
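The exact n-dimensional computation is exponential in the number of strings. A cheap and commonly used surrogate is the set median, which restricts the search to the input set itself. A minimal sketch, assuming unit edit costs:

```python
# Set median string: the member of S minimizing the summed edit
# distance to all strings in S (a tractable approximation; the
# generalized median searches the whole of Sigma*).

def edit_distance(x, y):
    # Wagner & Fischer dynamic programming with unit costs.
    m, n = len(x), len(y)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = i
    for j in range(1, n + 1):
        D[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            D[i][j] = min(D[i - 1][j] + 1,                          # deletion
                          D[i][j - 1] + 1,                          # insertion
                          D[i - 1][j - 1] + (x[i - 1] != y[j - 1])) # substitution
    return D[m][n]

def set_median(S):
    return min(S, key=lambda z: sum(edit_distance(z, y) for y in S))
```

For S = {aab, abb, bbb}, the summed distances are 3, 2, 3, so the set median is abb.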

Mean string for polygonal approximations

[Figure: two polygonal shapes X (with segments a, b, c, d) and Y (with segments a', b', c'), and the edit sequence abcd → a'b'c' aligning them.]


Mean graph

• S = {G1, ..., Gn}: a set of attributed graphs with labels in the sets LV and LE.
• U: the set of all attributed graphs that can be generated from LV and LE.
• d(G, G'): graph edit distance.

Ḡ = arg min_{G ∈ U} Σ_{Gi ∈ S} d(G, Gi)

X. Jiang, A. Münger, H. Bunke, "Synthesis of Representative Graphical Symbols by Computing Generalized Median Graph". Proceedings of 3rd IAPR Int. Workshop on Graphics Recognition, pp. 187-194, Jaipur (India), 1999.

Some works impose a set of constraints to make the problem tractable. Since it is an NP-complete problem, finding good heuristics that allow fast convergence of the algorithm remains a challenge.
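As with strings, the set median is a tractable surrogate for the generalized median over U. In the sketch below, `d` is a stand-in for any graph edit distance; the toy `edge_dist` (size of the symmetric difference of edge sets) is purely illustrative, not a real graph edit distance:

```python
def set_median_graph(S, d):
    # Graph in S minimizing the summed distance to all members of S.
    return min(S, key=lambda g: sum(d(g, h) for h in S))

def edge_dist(g, h):
    # Toy distance on graphs represented as frozensets of edges.
    return len(g ^ h)

S = [frozenset({(1, 2), (2, 3)}),
     frozenset({(1, 2)}),
     frozenset({(1, 2), (2, 3), (3, 4)})]
```

The middle-sized graph wins here: it is one edge away from each of the other two.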


RECOGNITION OF STRUCTURED TEXTURES

Syntactic models for texture recognition

• Statistical textures: spatial distribution of gray levels following statistical models.

• Structured textures: identifiable texture element (texel) that is repeated according to a set of placement rules that often follow a geometric model.

Tasks in the recognition of structured textures

Primitive detection
Subwindows of regular size, contours, blobs, connected components, etc.
Classification in terms of circularity, orientation, intensity, moments, shape, etc.

Inference of the placement rules
Syntactic model.
Neighborhood graph.

Voronoi polygons and structured textures

Voronoi tessellation: from a set of points pi in the Euclidean plane, a partition of the plane is computed such that each pi has an associated polygonal region containing the points of the plane closer to pi than to any other pj.

Extraction of features:
• Voronoi polygons. Their geometric features define the texture.
• Delaunay graph. Neighborhood between texels.

Formal grammars for structured textures (Lu & Fu model)

[Figure: hierarchical representation of the texture. The texture image is mosaiced into windows of the same size; the texture is described in terms of patterns (1st level), which are in turn described in terms of primitives (2nd level).]

Formal grammars for structured textures (II)

[Figure: the texture window given as a grid of 0s and 1s, with the starting point of the scan marked.]

Tree-based representation

G = (N, T, P, S)
N = {S, A, B, C, D, E, F, N}
T = {1, 0}
P:
S → 1(F A F)    A → 1(F B F)
B → 1(F C F)    C → 1(F D F)
D → 1(N E N)
E → 1(F E F) | 1(F F)
F → 0 | 0(F)
N → 1 | 1(N)

Tree grammar

Example of distorted texture. Error-correcting parser

A model image, from which the grammar is inferred.

An image with the same pattern but distorted. An error-correcting parser is used to recognize the texture.


APPLICATIONS

Application examples of structural PR

• Diagram understanding.

W. Min, Z. Tang & L. Tang. Using Web Grammar to Recognize Dimensions in Engineering Drawings. Pattern Recognition, Vol. 26, No. 9, pp. 1407-1416, 1993.

• Interpretation of mathematical expressions.

A. Grbavec, D. Blostein. Mathematics Recognition Using Graph Rewriting. Proc. Third Intl. Conf. on Document Analysis and Recognition, Montreal, Canada, August 1995, pp. 417-421.

• Recognition of 2D shapes.

H. Bunke & U. Bühler. Applications of Approximate String Matching to 2D Shape Recognition. Pattern Recognition, Vol. 26, No. 12, pp. 1797-1812, 1993.

• Object tracking.

C. Wang & K. Abe. Region Correspondence by Inexact Attributed Planar Graph Matching. Fifth International Conference on Computer Vision, pp. 440-447, 1995.

• Stereo correspondence.

R. Horaud & T. Skordas. Correspondence Through Feature Grouping and Maximal Cliques. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 11, No. 11, pp. 1168-1180, Nov 1989.

Edit distance for symmetry detection (I)

Rotational symmetry: X = Y^n ⇒ dc(X, X).
Axial symmetry: X = Y Y^(-1) ⇒ dc(X, X^(-1)).

[Figure: a polygonal contour encoded as the cyclic string abcdefghijklmn, and the table of symmetry costs obtained for candidate axes P1-P4.]
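The two tests above can be sketched directly with the edit distance: rotate the cyclic string by |X|/n to test order-n rotational symmetry, and reverse it to test axial symmetry. Unit costs are assumed here; real shape strings would use attribute-dependent substitution and merge costs:

```python
def edit_distance(x, y):
    # Standard Wagner & Fischer dynamic programming, unit costs.
    m, n = len(x), len(y)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = i
    for j in range(1, n + 1):
        D[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            D[i][j] = min(D[i - 1][j] + 1, D[i][j - 1] + 1,
                          D[i - 1][j - 1] + (x[i - 1] != y[j - 1]))
    return D[m][n]

def rotational_symmetry_cost(x, order):
    # X = Y^order implies X equals its rotation by len(X)/order.
    shift = len(x) // order
    return edit_distance(x, x[shift:] + x[:shift])

def axial_symmetry_cost(x):
    # X = Y Y^(-1) implies X equals its reversal.
    return edit_distance(x, x[::-1])
```

A perfectly symmetric string scores 0; the cost grows with the amount of distortion.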

Edit distance for symmetry detection (II)

[Figure: three shapes with symmetry costs 0.69, 0.90, and 1.26.]

Formal grammars for diagram understanding

Formal grammars are often used to recognize and validate the dimensions in technical drawings (plans, engineering diagrams, etc.).

[Figure: a dimension set (text "20") and its graph representation. Nodes for text (T), arrows (A), witness lines (W), and contour (C) are linked by relations such as adjac, crossarrow, and followscontour.]

Formal grammars for mathematical expressions

The different elements (hand drawn) are recognized by an OCR module. A graph grammar is used to validate the mathematical expression.

The graph is built from the image and the OCR result

Parser

String matching for 2D shape correspondence

[Figure: model shapes M1..M15 and candidate shapes C1..C15.]

Correspondences:
C1 - M10 (d=1619.3)   C6 - M7 (d=2228.3)    C11 - M11 (d=1789.3)
C2 - M13 (d=1666.9)   C7 - M2 (d=1778.7)    C12 - M8 (d=1672.7)
C3 - M1 (d=1330.1)    C8 - M3 (d=1491.1)    C13 - M4 (d=1881.6)
C4 - M6 (d=1833.0)    C9 - M12 (d=1895.8)   C14 - M5 (d=1644.4)
C5 - M9 (d=1785.6)    C10 - M14 (d=1222.4)  C15 - M15 (d=1351.5)

Graph matching for object tracking

Correspondence between Region Adjacency Graphs

An object at different placements.

Graph matching for stereo correspondence

[Figure: original images; line detection; correspondence by maximal cliques.]


A CASE STUDY: SYMBOL RECOGNITION IN DOCUMENTS

Symbol recognition in Graphics Recognition


Symbol recognition: vectorization is not enough. There are many methods in the literature for symbol recognition in particular frameworks, but the research is not mature enough for general symbol recognition approaches; there are still unaddressed issues.

Automatic intelligent interpretation of raster graphic images:
Conversion to CAD format
Analysis of the information conveyed by graphic images
Database querying
Man-machine interfaces

Interaction with other graphics recognition tasks:
Feature and primitive extraction
Segmentation
Semantic interpretation

Symbol recognition in architectural drawings


• There is no standardization, just some notational conventions (for a group of architects, for a particular architect, or even for a single plan).
• Usually, 2D entities made of sets of line segments.
• Two types: prototype-based symbols and textured symbols.

Textures. Grouping similar geometrical entities according to a regular spatial organization.

Prototype-based symbols. Matching with a set of prototypes.

A graph-based symbol recognition approach

[Figure: overview of the approach. Token detection is applied both to the model symbol (giving a component graph) and to the input graph; shape and neighborhood matching against index tables drive vertex mapping, graph clustering, and parsing.]

Prototype patterns: the string growing algorithm

Idea of the algorithm: a branch-and-bound algorithm in which, after an initial model region is mapped to an input region, the corresponding boundary strings grow in parallel, guided by the similarity of their neighborhoods, until the exterior string is reached.

[Figure: five levels of the string growing process. Starting from the initial mapping A-A', the partial match grows to A-A', B-B', then adds C-C' and D-D', with the matched regions removed from the model graph and the input graph at each level.]

Error-tolerant graph matching. Example

[Figure: model graph and input image.]

Texture patterns: a graph grammar

[Figure: graph-grammar texture examples: one-dimensional and bidimensional textures, with one or two elements.]

Texture patterns: grammar inference (1)

[Figure: inference of the placement rules from texture examples.]

Texture patterns: grammar inference (2)


Texture patterns: grammar inference (3)


Texture patterns: grammar inference (4)


Texture patterns: parsing process


[Figure: step-by-step parsing of the texture graph. Terminal nodes r and o, linked by distance relations d1 and d2, are progressively reduced to the non-terminals R and O, and finally to the start symbol S', as the parser consumes the pattern.]

Results


[Figure: recognized symbols ranked by matching distance.]

wc: 0.67, 0.75, 0.79, 0.80
door: 1.81, 1.89, 1.94, 1.99, 2.05, 2.08, 2.13, 2.19, 2.48, 2.57
column: 0.34, 0.64, 1.00
window: 0.48, 0.51, 0.52
desk: 0.72
chair: 1.03
sink: 0.15, 0.45, 1.16, 1.19, 1.20
uri: 0.00, 0.58, 0.62
settee: 0.06, 0.21
table: 2.06, 2.85

Recognized by grammar (textured symbols): table (x4), stairs (x2), settee (x2)

Virtual Prototyping of Architectural Projects


Some useful references

• A. Belaid, Y. Belaid. Reconnaissance des formes. Méthodes et Applications. Inter Editions, Paris, 1992.

• H. Bunke, A. Sanfeliu. Syntactic and Structural Pattern Recognition. Theory and Applications. World Scientific Publishing Company, 1990.

• S.K. Chang. Principles of Pictorial Information Systems Design. Prentice-Hall, 1989.

• C.H. Chen, L.F. Pau, P.S. Wang. Handbook of Pattern Recognition & Computer Vision. World Scientific, Singapore, 1993.

• K.S. Fu. Syntactic Pattern Recognition and Applications. Prentice-Hall, 1982.

• P.S.P. Wang. Array Grammars, Patterns and Recognizers. World Scientific, 1989.


Exercise. Shape recognition by string edit distance
