
Q-Metrics in Theory and Practice


DESCRIPTION

A generalized class of normalized distance functions called Q-Metrics is described in this presentation. The Q-Metrics approach relies on a single functional with one bounded parameter, Lambda, that characterizes the conventional distance functions in a normalized, per-unit metric space. In addition to this coverage property, a distinguishing and extremely attractive characteristic of the Q-Metric function is its low computational complexity. Q-Metrics satisfy the standard metric axioms. Novel networks for classification and regression tasks are defined and constructed using Q-Metrics, and these new networks are shown to outperform conventional feed-forward back-propagation networks of the same size when tested on real data sets.


Page 1: Q-Metrics in Theory and Practice

2009:11:10

Magdi A. Mohamed

Q-Metrics in Theory and Practice

PRESENTATION TO UNIVERSITY OF FLORIDA – LOUISVILLE, FL


Figure: Q-Metrics for Different Lambda Values – graph of d(x, y) = 1 in 2-dimensional space for x = (x1, x2) and y = (y1, y2), showing the unit contours d_{λ=−1} = 1, d_{λ∈(−1,0)} = 1, and d_{λ=0} = 1 alongside d_t = d_{p=1} = 1, d_e = d_{p=2} = 1, and d_{p=∞} = 1.

Page 2: Q-Metrics in Theory and Practice


Q-Measure Concept

Fuzzy Measure Axioms – let A, B ⊆ X be non-intersecting sets:

• Boundary conditions: m(∅) = 0, m(X) = 1
• Monotonicity: A1 ⊆ A2 ⇒ m(A1) ≤ m(A2)
• Continuity: guaranteed for discrete spaces

Probability Measure (1933) replaces monotonicity by additivity:

p(A ∪ B) = p(A) + p(B)

Sugeno λ-Measure (1975) adds one more axiom:

g(A ∪ B) = g(A) + g(B) + λ g(A) g(B), for the unique λ that satisfies g(X) = 1

Q-Measure Extensions (2003) – for any choice of λ > −1, λ ≠ 0, define:

q(A) = [ ∏_{xi ∈ A} (1 + λ f_i) − 1 ] / [ ∏_{xi ∈ X} (1 + λ f_i) − 1 ]

where f_i ∈ [0,1] are density generators.

Figure: Convergence Behavior of Q-Measures – scaling factor f_n versus iteration n for three cases (Case 1–3).
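To make the definition concrete, here is a minimal Python sketch of the Q-measure as reconstructed above; the function name, variable names, and example densities are illustrative only.

```python
import numpy as np

def q_measure(subset_idx, densities, lam):
    """Q-measure of a subset A of X, per the reconstructed definition:
    q(A) = [prod_{xi in A}(1 + lam*f_i) - 1] / [prod_{xi in X}(1 + lam*f_i) - 1],
    for lam > -1, lam != 0, with density generators f_i in [0,1]."""
    densities = np.asarray(densities, dtype=float)
    num = np.prod(1.0 + lam * densities[list(subset_idx)]) - 1.0
    den = np.prod(1.0 + lam * densities) - 1.0
    return num / den

# Example: 4-element universe with made-up density generators
f = [0.2, 0.5, 0.1, 0.3]
A = [0, 2]                        # A = {x1, x3}
print(q_measure(A, f, lam=-0.5))  # lambda < 0 regime
print(q_measure(A, f, lam=2.0))   # lambda > 0 regime
print(q_measure(range(4), f, lam=2.0))  # q(X) = 1 by construction
```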

Page 3: Q-Metrics in Theory and Practice


Q-Measures in a nutshell

Figure: a universe X = {x1, x2, …, x9} with a subset A and its complement B = A^c, annotated with q(A) and q(A^c).

λ = 0: probability; λ > 0: plausibility; λ < 0: belief. For non-intersecting A, B ⊆ X, the sign of λ reflects how q(A ∪ B) departs from the additive value q(A) + q(B).

Q-measures provide more expressive and computationally attractive nonlinear models for uncertainty management.

When modeling a complex system, it is an oversimplification to assume that the interdependency among information sources is linear.

Page 4: Q-Metrics in Theory and Practice


Q-filter Computations: N=5 Tap Case – Nonlinearity, Adaptivity, and Model Capacity

Figure: the five window slots x1–x5 hold the signal values h1–h5 (here h5 < h2 < h1 < h3 < h4) and carry the adaptive density generators f1–f5. A threshold acts as the nonlinearity controller. Ranking the slot values h(x_i) builds the nested sets and their q-measures: q(∅) = 0.0, q({x4}), q({x4, x3}), q({x4, x3, x1}), q({x4, x3, x1, x2}), q({x4, x3, x1, x2, x5}) = 1.0. The total area is the Q-filter output value.
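The "total area" computation can be read as a fuzzy-integral (Choquet-style) accumulation over the sorted window: rank the slots by value, grow the nested sets, and weight each value increment by the q-measure of the corresponding set. The sketch below follows that reading; the exact integral form and all names are assumptions, not the author's code.

```python
import numpy as np

def q_measure(idx, densities, lam):
    """q(A) for subset indices idx, from the density generators."""
    densities = np.asarray(densities, dtype=float)
    num = np.prod(1.0 + lam * densities[list(idx)]) - 1.0
    den = np.prod(1.0 + lam * densities) - 1.0
    return num / den

def q_filter(window, densities, lam):
    """Choquet-style Q-filter output for one window position (N-tap case).
    Assumes nonnegative window values. Sort slots by signal value (descending),
    build the nested sets, and sum (h_(k) - h_(k+1)) * q(nested set)."""
    h = np.asarray(window, dtype=float)
    order = np.argsort(-h)            # slot indices ranked by value
    out, nested = 0.0, []
    for k, slot in enumerate(order):
        nested.append(int(slot))
        h_next = h[order[k + 1]] if k + 1 < len(order) else 0.0
        out += (h[slot] - h_next) * q_measure(nested, densities, lam)
    return out

# 5-tap example with made-up window values and density generators f1..f5
window = [0.3, 0.7, 0.8, 0.9, 0.2]    # x1..x5
f = [0.10, 0.20, 0.30, 0.25, 0.15]
print(q_filter(window, f, lam=-0.5))
```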

Page 5: Q-Metrics in Theory and Practice


Case Studies: CDMA Data Filtering for Cognitive Radio

Linear Filter Equalization – block diagram with signals S_uplink(t), S(t), I(t), Q(t), I_f(t), Q_f(t), and error e(t).

Solution comparison: existing linear filter (target) with 63 coefficients versus the Q-Filter solution with 7 coefficients.

Performance comparison (Training: 65,504 samples; Testing: 200,000 samples):
• Real component – Training: RMS = 31.31, Correlation = 99.14%; Testing: RMS = 31.25, Correlation = 99.11%
• Imaginary component – Training: RMS = 20.49, Correlation = 99.55%; Testing: RMS = 20.54, Correlation = 99.52%

Figure: Q-Filter Performance (Real) – Q-Filter output overlaid on the target signal over time.

Page 6: Q-Metrics in Theory and Practice


Q-Metric Concept

Metric Axioms: a function d(x, y) defined for x and y in a set X is a metric provided that:
• d(x, y) ≥ 0, and d(x, y) = 0 iff x = y
• d(x, y) = d(y, x)
• d(x, y) + d(y, z) ≥ d(x, z)
The pair (X, d) is called a Metric Space.

P-Metrics, d_p(x, y): defined, for 1 ≤ p < ∞, by:

d_p(x, y) = [ Σ_i |x_i − y_i|^p ]^(1/p)

Manhattan (Taxi-Cab) Distance, d_t(x, y): the p-metric with p = 1

Euclidean Distance, d_e(x, y): the p-metric with p = 2

Mahalanobis Distance, d_m(x, y): defined using a covariance matrix A by:

d_m(x, y) = [ (x − y)' A^(−1) (x − y) ]^(1/2)

Q-Metrics Definition, d_λ(x, y)

For x, y ∈ X = [0,1]^n and λ ∈ [−1, 0), define:

d_λ(x, y) = [ ∏_{i=1}^{n} (1 + λ |x_i − y_i|) − 1 ] / λ

We call the pair (X, d_λ) a Q-Metric Space.

Graph of d(x, y) = 1 in 2-D Space:

Figure: unit contours d_{λ=−1} = 1, d_{λ∈(−1,0)} = 1, and d_{λ=0} = 1, together with d_t = d_{p=1} = 1, d_e = d_{p=2} = 1, and d_{p=∞} = 1, plotted against Dimension 1 and Dimension 2 for x = (x1, x2) and y = (y1, y2).
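A short Python sketch of the Q-metric as reconstructed above; the behaviour as λ approaches 0 (recovering the taxicab sum) can be checked numerically. Names and values are illustrative.

```python
import numpy as np

def q_metric(x, y, lam):
    """Reconstructed Q-metric for x, y in [0,1]^n and lam in [-1, 0):
    d_lam(x, y) = (prod_i (1 + lam*|x_i - y_i|) - 1) / lam."""
    diff = np.abs(np.asarray(x, float) - np.asarray(y, float))
    return (np.prod(1.0 + lam * diff) - 1.0) / lam

x, y = [0.2, 0.9], [0.6, 0.1]
print(q_metric(x, y, lam=-1.0))    # one end of the lambda range
print(q_metric(x, y, lam=-1e-6))   # approaches |x1-y1| + |x2-y2| = 1.2
```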

Page 7: Q-Metrics in Theory and Practice


Q-Metric Based SVM: Nonlinear Classification and Regression Cases

• Novel QMB-SVC versus conventional RBF-SVC (classification)
• Novel QMB-SVR versus conventional RBF-SVR (regression)

Page 8: Q-Metrics in Theory and Practice


Q-Aggregates Concept – the math behind the effect

Aggregation Operator Axioms: a function h: [0,1]^n -> [0,1], n ≥ 2, is an aggregation operator provided that:
• h(0, 0, …, 0) = 0
• h(1, 1, …, 1) = 1
• h is monotonic non-decreasing in all its arguments
• h is continuous
• h is symmetric in all its arguments

EXISTING – Generalized Means: defined, for −∞ < α < ∞, by:

h_α(a_1, …, a_n) = [ (a_1^α + … + a_n^α) / n ]^(1/α)

NOVEL – Q-Aggregate Definition: for a_i ∈ [0,1], n ≥ 2, and λ > −1, λ ≠ 0, define:

h_λ(a_1, …, a_n) = [ ∏_{i=1}^{n} (1 + λ a_i) − 1 ] / [ (1 + λ)^n − 1 ]
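A small numeric sketch of the Q-aggregate as reconstructed above, checking the boundary axioms and the behaviour at a few λ values; the λ → 0 limit approaches the arithmetic mean. All inputs are made up for illustration.

```python
import numpy as np

def q_aggregate(a, lam):
    """Reconstructed Q-aggregate h_lam(a_1..a_n) for lam > -1, lam != 0:
    h = (prod_i (1 + lam*a_i) - 1) / ((1 + lam)^n - 1)."""
    a = np.asarray(a, dtype=float)
    return (np.prod(1.0 + lam * a) - 1.0) / ((1.0 + lam) ** a.size - 1.0)

a = [0.2, 0.6, 0.9]
print(q_aggregate(a, lam=-0.9))   # pulled toward the larger inputs for this example
print(q_aggregate(a, lam=1e-9))   # ~ arithmetic mean (0.5666...)
print(q_aggregate(a, lam=50.0))   # pulled toward the smaller inputs for this example
print(q_aggregate([0, 0, 0], 0.7), q_aggregate([1, 1, 1], 0.7))  # boundary axioms: 0.0 and 1.0
```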

Page 9: Q-Metrics in Theory and Practice


Aggregation Operations – prior art and Q-aggregate coverage

Aggregation operators span three regimes: intersection/conjunction operations, averaging/compensative operations (including the generalized means), and union/disjunction operations. The classical parametric families each cover only part of this range:

• Schweizer/Sklar (1961), parameter p ∈ (−∞, +∞)
• Hamacher (1978)
• Frank (1979), parameter s
• Yager (1980), parameter w
• Dubois/Prade (1980)
• Dombi (1982)

The limiting operators are min(a, b) and the drastic intersection i(a, b) = { b if a = 1; a if b = 1; 0 otherwise } on the conjunction side, and max(a, b) and the drastic union u(a, b) = { b if a = 0; a if b = 0; 1 otherwise } on the disjunction side.

Q-Aggregates (2003), with the single parameter λ ∈ (−1, +∞), cover the intersection, averaging, and union domains.

Page 10: Q-Metrics in Theory and Practice


QFS Supervised Learning for EKG: case study

Figure: processed EKG signal traces (roughly 1,900 samples each) for three configurations, with RMS = 0.128, RMS = 0.032, and RMS = 0.044.

Page 11: Q-Metrics in Theory and Practice


Conventional RBF Networks

Regression:

f(x) = [ Σ_{i=1}^{m} f_i e^(−d(x, c_i)) ] / [ Σ_{i=1}^{m} e^(−d(x, c_i)) ]

where x is the input vector, c_i the cluster centers, d(·,·) the distance function, f_i the output weights, and m the number of hidden nodes.

Classification:

f_j(x) = Σ_{i=1}^{m} w_ji e^(−d(x, c_i))

where x is the input vector, c_i the cluster centers, d(·,·) the distance function, w_ji the output weights, m the number of hidden nodes, and j the class label index.

Notes:
1. Distance functions d(·,·), i.e., metrics, play a key role in RBF neural networks.
2. The exponential function e^(−x) acts as a reversal operator.
3. The exponential function is computationally expensive, particularly in hardware implementations.
4. The output of an RBF network is a weighted average.
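For reference, a compact sketch of the two conventional RBF forms stated above, with made-up centers and weights; it illustrates the equations only.

```python
import numpy as np

def rbf_regression(x, centers, f_weights, dist):
    """Normalized RBF regression: f(x) = sum_i f_i e^{-d(x,c_i)} / sum_i e^{-d(x,c_i)}."""
    k = np.array([np.exp(-dist(x, c)) for c in centers])
    return np.dot(f_weights, k) / np.sum(k)

def rbf_classification(x, centers, W, dist):
    """RBF classification: f_j(x) = sum_i w_ji e^{-d(x,c_i)}, one score per class j."""
    k = np.array([np.exp(-dist(x, c)) for c in centers])
    return W @ k

euclidean = lambda a, b: np.linalg.norm(np.asarray(a, float) - np.asarray(b, float))

centers = np.array([[0.2, 0.3], [0.7, 0.8], [0.5, 0.1]])   # m = 3 hidden nodes
f_w = np.array([0.1, 0.9, 0.4])                            # regression output weights
W = np.array([[0.2, 0.7, 0.1],                             # 2 classes x 3 hidden nodes
              [0.6, 0.1, 0.3]])
x = [0.6, 0.7]
print(rbf_regression(x, centers, f_w, euclidean))
print(rbf_classification(x, centers, W, euclidean))
```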

Page 12: Q-Metrics in Theory and Practice


Weighted Q-Metrics

For x_i, y_i, w_i ∈ [0,1], i = 1, …, n, and λ ∈ [−1, 0):

d_{λ,w}(x, y) = [ ∏_{i=1}^{n} (1 + λ w_i |x_i − y_i|) − 1 ] / λ

with the limiting case λ → 0 giving the weighted taxicab sum Σ_{i=1}^{n} w_i |x_i − y_i|.

Recursive Weighted Q-Metrics Calculation Algorithm:

d_0 = 0
d_i = d_{i−1} + w_i |x_i − y_i| + λ w_i |x_i − y_i| d_{i−1},  i = 1, …, n
d_{λ,w}(x, y) = d_n
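A direct transcription of the reconstructed recursion above: one multiply-accumulate per dimension and no exponentials, which is the low-complexity property the presentation emphasizes. Names and test values are illustrative.

```python
def weighted_q_metric(x, y, w, lam):
    """Recursive weighted Q-metric (reconstructed):
    d_0 = 0;  d_i = d_{i-1} + w_i*|x_i - y_i| + lam*w_i*|x_i - y_i|*d_{i-1}."""
    d = 0.0
    for xi, yi, wi in zip(x, y, w):
        a = wi * abs(xi - yi)
        d = d + a + lam * a * d
    return d

x, y = [0.2, 0.9, 0.4], [0.6, 0.1, 0.5]
w = [1.0, 0.5, 0.8]
print(weighted_q_metric(x, y, w, lam=-0.7))
```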

Page 13: Q-Metrics in Theory and Practice


Weighted Q-Aggregate

For a_i, μ_i ∈ [0,1], i = 1, …, n, and λ > −1, λ ≠ 0:

A_{μ,λ}(a_1, …, a_n) = [ ∏_{i=1}^{n} (1 + λ μ_i a_i) − 1 ] / [ ∏_{i=1}^{n} (1 + λ μ_i) − 1 ]

with the limiting case λ → 0 giving the weighted average Σ μ_i a_i / Σ μ_i.

Recursive Weighted Q-Aggregate Calculation Algorithm:

ν_0 = 0, δ_0 = 0
ν_i = ν_{i−1} + μ_i a_i + λ μ_i a_i ν_{i−1}
δ_i = δ_{i−1} + μ_i + λ μ_i δ_{i−1},  i = 1, …, n
A_{μ,λ}(a_1, …, a_n) = ν_n / δ_n
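Under the reconstructed reading above, the same recursion can be run twice, once with the inputs a_i and once with a_i = 1, to produce the numerator and denominator of the weighted Q-aggregate; a short sketch with illustrative values:

```python
def weighted_q_aggregate(a, mu, lam):
    """Recursive weighted Q-aggregate (reconstructed): accumulate the
    numerator with inputs a_i and the denominator with a_i = 1, then divide."""
    num = den = 0.0
    for ai, mi in zip(a, mu):
        num = num + mi * ai + lam * mi * ai * num
        den = den + mi + lam * mi * den
    return num / den

print(weighted_q_aggregate([0.2, 0.6, 0.9], [1.0, 0.5, 0.8], lam=2.0))
print(weighted_q_aggregate([1.0, 1.0, 1.0], [1.0, 0.5, 0.8], lam=2.0))  # boundary axiom: 1.0
```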

Page 14: Q-Metrics in Theory and Practice


Logical Negation & Veracity Functions

• The negation operator can behave similarly to e^(−x).
• The veracity operator can behave as a transform from x to e^(−x).
• Both are computationally efficient since they require only accumulations, multiplications, and a division; no exponential function calculation is needed.

Sugeno Negation Operator:

n_λ(x) = (1 − x) / (1 + λ x), for x ∈ [0,1] and λ ∈ (−1, +∞)

Veracity Operator: v_λ(x)

Figure: x, its negation (notX), and the evidence curve plotted over [0, 1].
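A tiny sketch of the Sugeno negation operator as written above, printed next to e^(−x) for qualitative comparison; only one division is needed per evaluation. The sample points are arbitrary.

```python
import numpy as np

def sugeno_negation(x, lam):
    """Sugeno negation n_lam(x) = (1 - x) / (1 + lam*x), x in [0,1], lam > -1.
    n(0) = 1, n(1) = 0, monotone decreasing: a cheap stand-in for e^{-x} on [0,1]."""
    return (1.0 - x) / (1.0 + lam * x)

xs = np.linspace(0.0, 1.0, 5)
print(sugeno_negation(xs, lam=2.0))   # [1.0, 0.5, 0.25, 0.1, 0.0]
print(np.exp(-xs))                    # similar decreasing trend on [0,1]
```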

Page 15: Q-Metrics in Theory and Practice


The New Q-RBF Neural Networks

Notes:
• Use more powerful metrics instead of a fixed Lp or other classical type of metric.
• Use a better aggregation operation for classification than simple linear weighted averaging.
• The negation and veracity functions are computationally more attractive (lower cost) than e^(−x), and are suitable for hardware implementations, particularly on embedded platforms.

Regression:

f(x) = [ Σ_{i=1}^{m} f_i n_λ(d_λ(x, c_i)) ] / [ Σ_{i=1}^{m} n_λ(d_λ(x, c_i)) ]

i.e., the exponential kernel of the conventional RBF network is replaced by the negation of the Q-metric distance to each center.

Classification:

class scores are formed by applying the veracity transform to the Q-metric distances d_λ(x, c_j), j = 1, …, m, of the hidden nodes and combining them with the weighted Q-aggregate rather than a linear weighted sum.
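Putting the pieces together, a hypothetical Q-RBF regression forward pass might look like the sketch below: weighted Q-metric distances to the centers, Sugeno negation in place of the exponential kernel, and a normalized weighted output. This is an assembly of the slide's components under the reconstructions above, not the author's exact network; all names and parameter values are assumptions.

```python
import numpy as np

def weighted_q_metric(x, y, w, lam):
    # Recursive weighted Q-metric (reconstructed earlier in the deck).
    d = 0.0
    for xi, yi, wi in zip(x, y, w):
        a = wi * abs(xi - yi)
        d = d + a + lam * a * d
    return d

def sugeno_negation(x, lam):
    # Cheap e^{-x}-like reversal operator on [0,1].
    return (1.0 - x) / (1.0 + lam * x)

def q_rbf_regression(x, centers, feat_w, out_w, lam_d, lam_n):
    """Q-RBF regression sketch: activation_i = n_lam(d_lam(x, c_i));
    output = sum_i f_i * activation_i / sum_i activation_i.
    Inputs are assumed scaled so the distances stay within [0,1]."""
    act = np.array([sugeno_negation(weighted_q_metric(x, c, feat_w, lam_d), lam_n)
                    for c in centers])
    return np.dot(out_w, act) / np.sum(act)

centers = [[0.2, 0.3], [0.7, 0.8], [0.5, 0.1]]   # m = 3 hidden nodes
feat_w  = [1.0, 0.6]                             # per-dimension Q-metric weights
out_w   = np.array([0.1, 0.9, 0.4])              # hidden-to-output weights
print(q_rbf_regression([0.6, 0.7], centers, feat_w, out_w, lam_d=-0.7, lam_n=2.0))
```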

Page 16: Q-Metrics in Theory and Practice


Case Studies

Regression: RF Positioning – 2x4x1 neural network; Q-RBF RMS = 0.077, BP RMS = 0.110.
Figure: target trace versus Q-RBF output, and target trace versus BP output, over the test samples.

Classification: Driver Maneuver – 5x3x2 neural network; Q-RBF clearly has better classification results than BP.

Q-RBF confusion matrix:
        1     2
 1   2613   482
 2    223  2628

BP confusion matrix:
        1     2
 1   2019  1076
 2    131  2720

Page 17: Q-Metrics in Theory and Practice


Q-Aggregates: Union/Average DOMAINS

Page 18: Q-Metrics in Theory and Practice


Q-Aggregates: Intersection/Average DOMAINS