Johannes Kepler Universität Linz, Netzwerk für Forschung, Lehre und Praxis (Network for Research, Teaching, and Practice)
Knowledge-Based Methods in Image Processing and Pattern Recognition; Ulrich Bodenhofer
Unit 2
A Brief Introduction to Fuzzy Logic and Fuzzy Systems
Motivation
In our everyday life, we use vague, qualitative, imprecise linguistic terms like “small”, “hot”, “around two o’clock”

Even very complex and crucial human actions and decisions are based on such concepts, e.g. in
Process control
Driving
Financial/business decisions
Law
Motivation (cont’d)
These imprecise terms and the way they are processed, therefore, play a crucial role in everyday life

A mathematical model able to express the complex semantics of such terms would hence lead to more intelligent systems and open completely new opportunities

Concepts in classical mathematics and technology are inadequate to provide such models
L. A. Zadeh (1965)
“More often than not, the classes of objects encountered in the real physical world do not have precisely defined criteria of membership. [. . . ] Yet, the fact remains that such imprecisely defined “classes” play an important role in human thinking, particularly in the domains of pattern recognition, communication of information, and abstraction.”
Fuzzy Logic and Fuzzy Sets
Fuzzy logic is a generalized kind of logic
Fuzzy sets are the key to the semantics of vague linguistic terms
Fuzzy sets are based on fuzzy logic
What are Fuzzy Sets?
The idea behind fuzzy logic is to replace the set of truth values {0, 1} by the entire unit interval [0, 1].

A fuzzy set on a universe X is represented by a function which maps each element x ∈ X to a degree of membership from the unit interval [0, 1]. These so-called membership functions are direct generalizations of characteristic functions.
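The membership-function idea can be sketched in a few lines of code. The following is a minimal illustration, not taken from the slides: a triangular membership function modeling the vague term “around two o’clock”, with the shape parameters chosen by us.

```python
def triangular(a, b, c):
    """Return a triangular membership function with support [a, c] and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0          # outside the support: no membership
        if x <= b:
            return (x - a) / (b - a)   # rising edge
        return (c - x) / (c - b)       # falling edge
    return mu

# "around two o'clock": full membership at 2, fading out towards 1 and 3
around_two = triangular(1.0, 2.0, 3.0)
```

Note how `around_two` generalizes a characteristic function: it returns 1.0 at the prototype value, 0.0 far away, and intermediate degrees in between.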
What Do We Need?
In order to be able to process IF-THEN rules involving vague linguistic expressions which are modeled by fuzzy sets, we need proper generalizations of logical operations and an inference scheme.

Let us start with the first question: How can we extend the classical logical operations ∧, ∨, ¬ to the unit interval [0, 1]?
Triangular Norms
A mapping T : [0, 1]² → [0, 1] is a triangular norm (t-norm) if it has the following properties (for all x, y, z ∈ [0, 1]):
Commutativity: T (x, y) = T (y, x)
Associativity: T (x, T (y, z)) = T (T (x, y), z)
Non-decreasingness: x ≤ y ⇒ T (x, z) ≤ T (y, z)
Neutral element: T (x,1) = x
The Four Standard t-Norms
TM(x, y) = min(x, y)
TP(x, y) = x · y
TL(x, y) = max(x + y − 1,0)
TD(x, y) =
    x   if y = 1
    y   if x = 1
    0   otherwise
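The four standard t-norms are simple enough to write down directly. A minimal sketch (the function names are ours), which also lets us verify the t-norm axioms on a few sample points:

```python
def t_min(x, y):
    """TM: minimum t-norm."""
    return min(x, y)

def t_prod(x, y):
    """TP: product t-norm."""
    return x * y

def t_luka(x, y):
    """TL: Lukasiewicz t-norm."""
    return max(x + y - 1.0, 0.0)

def t_drastic(x, y):
    """TD: drastic product (0 everywhere except on the boundary)."""
    if y == 1.0:
        return x
    if x == 1.0:
        return y
    return 0.0
```

Each function has 1 as neutral element and is commutative, as the definition requires; e.g. `t_luka(0.4, 0.5)` is 0, showing how TL truncates at zero.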
The Four Standard t-Norms (Cont’d)
[Surface plots of the four standard t-norms TM, TP, TL, and TD over [0, 1]²]
Some Observations
For all x, y ∈ [0,1], we have:
TD(x, y) ≤ TL(x, y) ≤ TP(x, y) ≤ TM(x, y)
It is easy to check that TM is the largest possible t-norm and that TD is the smallest possible t-norm
TM is the only t-norm fulfilling idempotence (T (x, x) = x)
All except TD are continuous
TP is the only differentiable one
TP is the only one that is strictly increasing (in each argument on (0, 1])
Examples of Intersections
[Plots of the intersection of two fuzzy sets under TM, TP, TL, and TD]
Fuzzy Negations
A mapping N : [0, 1] → [0, 1] is called a negation if it is non-increasing and fulfills the boundary conditions N(0) = 1 and N(1) = 0. A negation is called strict if it is continuous and strictly decreasing (therefore, bijective). A strict negation is called strong if it is involutive, i.e.

N(N(x)) = x

for all x ∈ [0, 1]. The most common negation is the standard negation NS(x) = 1 − x.
Triangular Conorms
A mapping S : [0, 1]² → [0, 1] is a triangular conorm (t-conorm) if it has the following properties (for all x, y, z ∈ [0, 1]):
Commutativity: S(x, y) = S(y, x)
Associativity: S(x, S(y, z)) = S(S(x, y), z)
Non-decreasingness: x ≤ y ⇒ S(x, z) ≤ S(y, z)
Neutral element: S(x,0) = x
The Four Standard t-Conorms
SM(x, y) = max(x, y)
SP(x, y) = x + y − x · y
SL(x, y) = min(x + y,1)
SD(x, y) =
    x   if y = 0
    y   if x = 0
    1   otherwise
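The standard t-conorms mirror the standard t-norms via De Morgan duality, S(x, y) = NS(T(NS(x), NS(y))) with NS(x) = 1 − x. A minimal sketch (function names are ours):

```python
def s_max(x, y):
    """SM: maximum t-conorm."""
    return max(x, y)

def s_prob(x, y):
    """SP: probabilistic sum."""
    return x + y - x * y

def s_luka(x, y):
    """SL: bounded (Lukasiewicz) sum."""
    return min(x + y, 1.0)

def s_drastic(x, y):
    """SD: drastic sum (1 everywhere except on the boundary)."""
    if y == 0.0:
        return x
    if x == 0.0:
        return y
    return 1.0
```

Here 0 is the neutral element, and SP is exactly the De Morgan dual of the product t-norm: SP(x, y) = 1 − (1 − x)·(1 − y).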
The Four Standard t-Conorms (Cont’d)
[Surface plots of the four standard t-conorms SM, SP, SL, and SD over [0, 1]²]
Some Observations
For all x, y ∈ [0,1], we have:
SM(x, y) ≤ SP(x, y) ≤ SL(x, y) ≤ SD(x, y)
It is easy to check that SD is the largest possible t-conorm and that SM is the smallest possible t-conorm
SM is the only t-conorm that is idempotent (S(x, x) = x)
All except SD are continuous
SP is the only differentiable one
SP is the only one that is strictly increasing (in each argument on [0, 1))
Examples of Unions
[Plots of the union of two fuzzy sets under SM, SP, SL, and SD]
Implications
S-implication: For a t-conorm S and a negation N, we define

IS,N(x, y) = S(N(x), y)

Residual implication (R-implication): For a (left-)continuous t-norm T, we define

T→(x, y) = sup{u ∈ [0, 1] | T(x, u) ≤ y}
Examples
ISM,NS(x, y) = max(1− x, y) T→
M(x, y) =
1 if x ≤ y
y otherwise
ISP,NS(x, y) = 1− x + x · y T→
P(x, y) =
1 if x ≤ y
yx
otherwise
ISL,NS(x, y) = min(1− x + y,1) T→
L(x, y) = min(1− x + y,1)
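The closed forms of the residual implications can be cross-checked against the sup definition numerically. This is a sketch of ours: the supremum is approximated on a uniform grid, so agreement is only up to the grid resolution.

```python
def residuum(T, x, y, steps=10000):
    """Approximate T->(x, y) = sup{u in [0,1] | T(x, u) <= y} on a uniform grid."""
    return max(u / steps for u in range(steps + 1) if T(x, u / steps) <= y)

# Closed-form residua of TM, TP, TL (Goedel, Goguen, Lukasiewicz implications)
def r_min(x, y):
    return 1.0 if x <= y else y

def r_prod(x, y):
    return 1.0 if x <= y else y / x

def r_luka(x, y):
    return min(1.0 - x + y, 1.0)
```

For instance, the grid approximation of the residuum of TM at (0.8, 0.5) agrees with the Goedel implication value 0.5.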
Examples (cont’d)
[Surface plots of the three S-implications and the three residual implications T→M, T→P, T→L over [0, 1]²]
Aggregation Operators
A function

A : ⋃_{n∈ℕ} [0, 1]ⁿ → [0, 1]

is called an aggregation operator if it has the following properties:

1. A(x1, . . . , xn) ≤ A(y1, . . . , yn) whenever xi ≤ yi for all i ∈ {1, . . . , n}

2. A(x) = x for all x ∈ [0, 1]

3. A(0, . . . , 0) = 0 and A(1, . . . , 1) = 1
Aggregation Operators: Examples
All t-norms and t-conorms are aggregation operators (with the conventions T(x) = x and S(x) = x)

All weighted arithmetic and geometric means are aggregation operators
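As an illustration, a weighted arithmetic mean satisfies all three aggregation properties. A minimal sketch (`weighted_mean` is our own helper, not from the slides):

```python
def weighted_mean(weights):
    """Build an n-ary weighted arithmetic mean from weights summing to 1."""
    def A(*xs):
        assert len(xs) == len(weights), "one weight per argument"
        return sum(w * x for w, x in zip(weights, xs))
    return A

A3 = weighted_mean([0.2, 0.3, 0.5])
```

Monotonicity holds because all weights are non-negative; the boundary conditions hold because the weights sum to 1; and for n = 1 the single weight 1 gives the identity.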
Linguistic Variables
A linguistic variable is a quintuple of the form

V = (N, G, T, X, M),

where N, G, T, X, and M are defined as follows:

1. N is the name of the linguistic variable V

2. G is a grammar

3. T is the so-called term set, i.e. the set of linguistic expressions resulting from G

4. X is the universe of discourse

5. M is a mapping T → F(X) which defines the semantics (a fuzzy set on X) of each linguistic expression in T
The Basic Setup
Let us in the following consider a system with n inputs and one output. Assume that we have n linguistic variables

v1 = (N1, G1, T1, X1, M1),
...
vn = (Nn, Gn, Tn, Xn, Mn),

associated to the n inputs of the system and one linguistic variable associated to the output:

vy = (Ny, Gy, Ty, Xy, My)
Fuzzy Rule Base with m Rules
IF cond1 THEN action1
...
IF condm THEN actionm

The conditions condi and the actions actioni are expressions built up according to an appropriate syntax.
Example
Consider a system with two inputs and one output:

v1 = (N1 = “ϕ”, G1, T1 = {“nb”, “ns”, “z”, “ps”, “pb”}, X1 = [−30, 30], M1),

v2 = (N2 = “ϕ̇”, G2, T2 = {“nb”, “ns”, “z”, “ps”, “pb”}, X2 = [−30, 30], M2),

vy = (Ny = “f ”, Gy, Ty = {“nb”, “ns”, “z”, “ps”, “pb”}, Xy = [−100, 100], My)
Example (cont’d)
IF (ϕ is z and ϕ̇ is z) THEN f is z
IF (ϕ is ns and ϕ̇ is z) THEN f is ns
IF (ϕ is ns and ϕ̇ is ns) THEN f is nb
IF (ϕ is ns and ϕ̇ is ps) THEN f is z
...

How can we define a control function from these rules?
What Do We Need?
1. We have to feed our input values into the system

2. We have to evaluate the truth values of the conditions

3. We have to come to some conclusions/actions for each rule

4. We have to come to an overall conclusion/action for the whole set of rules

5. We have to get an output value

Steps 3 and 4 are usually considered the steps of actual inference.
Steps 1 and 2
Assume we are given n crisp input values xi ∈ Xi (i = 1, . . . , n) and assume we have fixed a De Morgan triple (T, S, N).

Then we can compute the truth value t(condi) of each condition condi recursively in the following way:

t(Ni is lij) = µ_{Mi(lij)}(xi)

t(a and b) = T(t(a), t(b))

Trivial extensions are necessary if the language allows more than only conjunctions.
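Steps 1 and 2 for a single rule condition can be sketched as follows. All names and membership-function shapes here are our own illustrative assumptions (triangular terms on [−30, 30], TP as the conjunction), not taken from the slides:

```python
def mu_ns(x):
    """Hypothetical 'negative small' term on [-30, 30], peak at -10."""
    return max(1.0 - abs(x + 10.0) / 10.0, 0.0)

def mu_z(x):
    """Hypothetical 'zero' term on [-30, 30], peak at 0."""
    return max(1.0 - abs(x) / 10.0, 0.0)

def t_prod(a, b):
    """TP chosen as the t-norm of the De Morgan triple."""
    return a * b

# Condition "(phi is ns and dphi is z)" evaluated at phi = -5, dphi = 2:
# step 1 feeds the crisp values in, step 2 combines the degrees with T
t_cond = t_prod(mu_ns(-5.0), mu_z(2.0))
```

With these inputs the two atomic degrees are 0.5 and 0.8, so the condition holds to degree 0.4.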
Steps 3 and 4: Basic Remarks
1. It may happen that the conditions of two or more rules
are fulfilled with a non-zero truth value
2. It may even happen that this is true for two or more
rules with different (conflicting?) actions
3. This is not at all a problem, but a great advantage!
4. In any case, the following basic requirement is obvious: The higher the truth value of a rule’s condition, the higher its influence on the output should be
Steps 3 and 4: Two Fundamental Approaches
Deductive interpretation: Rules are considered as logical deduction rules (implications)

Assignment interpretation: Rules are considered as conditional assignments (like in a procedural programming language)

Both approaches have in common that separate output/action fuzzy sets are computed for each rule. Finally, the output fuzzy sets of all rules are aggregated into one global output fuzzy set.
Step 3 in the Assignment Interpretation
We fix a t-norm T in advance. Assume that we consider the i-th rule, which looks as follows:

IF condi THEN Ny is lyj

Assume that the condition condi is fulfilled with a degree of ti. Then the output fuzzy set Oi is defined in the following way:

µOi(y) = T(ti, µ_{My(lyj)}(y))
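In code, step 3 is just a pointwise combination of the rule’s truth degree with the consequent’s membership function. A sketch with TM (Mamdani-style clipping) and a hypothetical output term of our own:

```python
def rule_output(T, t_i, mu_consequent):
    """mu_Oi(y) = T(t_i, mu_consequent(y)): the output fuzzy set of one rule."""
    return lambda y: T(t_i, mu_consequent(y))

def mu_ps(y):
    """Hypothetical 'positive small' term on [-100, 100], peak at 40."""
    return max(1.0 - abs(y - 40.0) / 40.0, 0.0)

# Condition fulfilled to degree 0.4; TM = min clips the consequent at 0.4
o_i = rule_output(min, 0.4, mu_ps)
```

With TM the consequent is cut off at height ti; with TP it would be scaled by ti instead.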
An Example

[Plots of a fuzzy set µA(x) on [1, 5] and the output fuzzy sets TM(0.4, µA(x)), TP(0.4, µA(x)), and TL(0.4, µA(x))]
Step 4 in the Assignment Interpretation
We fix an aggregation operator A in advance. Assume that the output fuzzy sets Oi of all rules (i = 1, . . . , m) have been computed. Then the output fuzzy set O is computed in the following way:

µO(y) = A(µO1(y), . . . , µOm(y))
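Step 4 aggregates the per-rule output fuzzy sets pointwise. A sketch with max (i.e. SM) as aggregation and two hypothetical, already clipped rule outputs of our own:

```python
def aggregate(outputs):
    """mu_O(y) = max_i mu_Oi(y): pointwise max-aggregation (SM)."""
    return lambda y: max(mu(y) for mu in outputs)

# Two hypothetical rule outputs, clipped at degrees 0.4 and 0.7 respectively
o1 = lambda y: min(0.4, max(1.0 - abs(y) / 40.0, 0.0))
o2 = lambda y: min(0.7, max(1.0 - abs(y - 40.0) / 40.0, 0.0))
mu_O = aggregate([o1, o2])
```

The resulting µO takes, at every point y, the larger of the two clipped membership degrees, so both rules influence the global output in proportion to how strongly they fired.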
Some Remarks
The assignment interpretation is by far the more common one in practice. There is only one package that seriously offers the deductive interpretation (LFLC). It uses I = T→L and T = TM.

The most common variant of the assignment-based approach is T = TM and A = SM. This classical variant is better known as Mamdani/Assilian inference or max-min inference. Another common variant uses T = TP and the sum/arithmetic mean as aggregation A. This variant is often called sum-prod inference.
Example
We consider the rule base from the previous example. We define the following fuzzy sets for the variables with names ϕ and ϕ̇ (left) and f (right):

[Membership functions of the five terms on X = [−30, 30] (left) and Xy = [−100, 100] (right)]
Step 5: Defuzzification
In many applications, we need a crisp value as output. The following variants are common:

Mean of maximum (MOM): The output is computed as the center of gravity of the area where µO takes its maximum, i.e.

ξMOM(O) := ∫_{Ceil(O)} y dy / ∫_{Ceil(O)} 1 dy,

where

Ceil(O) := {y ∈ Xy | µO(y) = max{µO(z) | z ∈ Xy}}
Step 5: Defuzzification (cont’d)
Center of gravity (COG): The output is computed as the center of gravity of the area under µO:

ξCOG(O) := ∫_{Xy} y · µO(y) dy / ∫_{Xy} µO(y) dy

Center of area (COA): The output is computed as the point which splits the area under µO into two equally-sized parts.
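COG defuzzification can be approximated by a Riemann sum over Xy. A numerical sketch of ours (the midpoint rule and grid resolution are our choices):

```python
def cog(mu, lo, hi, steps=10000):
    """Approximate xi_COG(O) = (int y*mu(y) dy) / (int mu(y) dy) over [lo, hi]."""
    h = (hi - lo) / steps
    ys = [lo + (k + 0.5) * h for k in range(steps)]  # midpoints of the grid cells
    num = sum(y * mu(y) for y in ys)
    den = sum(mu(y) for y in ys)
    return num / den
```

For a symmetric output fuzzy set the result coincides with the axis of symmetry; for a one-sided ramp it lies at the centroid of the triangle.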
Fuzzy Classification Systems
Fuzzy classification systems are ordinary fuzzy systems,
however, with the important difference that the universe of
the output variable Xy and the term set Ty coincide and
are finite sets of class labels.
Example
Ty = Xy = {“dog”, “horse”, “fish”}
IF (no-of-legs is four and height is tall) THEN class is horse
IF (no-of-legs is four and height is short) THEN class is dog
IF no-of-legs is zero THEN class is fish
How to Process Fuzzy Classification Rules?
IF condi THEN Ny is lyi

Assume that the conditions condi are fulfilled with degrees ti. We compute individual output fuzzy sets Oi for each rule in the following way:

µOi(x) = ti if x = lyi, 0 otherwise

The individual output fuzzy sets are then aggregated by means of the aggregation operator A. Most often, the maximum t-conorm SM is used.
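On a finite universe of class labels, aggregation with SM followed by defuzzification collapses to a per-class max and an argmax. A sketch (the rule list is a hypothetical example of ours):

```python
def classify(fired_rules):
    """fired_rules: (truth_degree, class_label) pairs, one per rule.
    Aggregates degrees per class with max (SM) and returns the winning class."""
    scores = {}
    for t, label in fired_rules:
        scores[label] = max(scores.get(label, 0.0), t)
    winner = max(scores, key=scores.get)  # class with maximal membership
    return winner, scores

winner, scores = classify([(0.2, "dog"), (0.7, "horse"), (0.4, "dog")])
```

Here the two "dog" rules aggregate to 0.4, so "horse" wins with membership 0.7.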
Defuzzification
In many cases, we need a crisp decision to which class the object belongs.

Essentially the only way to perform this defuzzification on a finite universe of class labels is to choose the class with maximal membership degree.
Summary
1. Feed our input values into the system: evaluate the truth degrees to which the inputs belong to the fuzzy sets associated to the linguistic terms

2. Evaluate the truth values of the conditions using fuzzy logical operations (e.g. a De Morgan triple (T, S, N))

3. Compute the conclusions/actions for each rule by connecting the truth value of the condition with the output fuzzy set using a t-norm T
Summary (cont’d)
4. Compute the overall conclusion/action for the whole set of rules by aggregating the output fuzzy sets with an aggregation operator A (most often a t-conorm)

5. Use defuzzification to get a crisp output value (optional)
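The five steps can be combined into a minimal end-to-end Mamdani-style sketch: one input, two rules, TM as t-norm, SM as aggregation, and COG defuzzification. All membership functions are illustrative assumptions of ours, not taken from the slides.

```python
# Input terms on [0, 10]
low = lambda x: max(1.0 - x / 10.0, 0.0)
high = lambda x: max(x / 10.0, 0.0)
# Output terms on [0, 100]
small = lambda y: max(1.0 - y / 100.0, 0.0)
large = lambda y: max(y / 100.0, 0.0)

def mamdani(x, steps=1000):
    """Rule base: IF x is low THEN y is small; IF x is high THEN y is large."""
    t1, t2 = low(x), high(x)                                 # steps 1-2: truth degrees
    o = lambda y: max(min(t1, small(y)), min(t2, large(y)))  # steps 3-4: TM clip, SM agg.
    h = 100.0 / steps                                        # step 5: COG defuzzification
    ys = [(k + 0.5) * h for k in range(steps)]
    return sum(y * o(y) for y in ys) / sum(o(y) for y in ys)
```

For x = 5 both rules fire to degree 0.5, the global output fuzzy set is symmetric about 50, and the defuzzified output is 50; at the extremes the output moves towards the centroid of a single ramp.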