Chapter 5 Classification: Alternative Techniques. Part 4: Artificial Neural Network Based Classifier



Artificial Neural Networks (ANN) / 1

X1  X2  X3  Y
 1   0   0  0
 1   0   1  1
 1   1   0  1
 1   1   1  1
 0   0   1  0
 0   1   0  0
 0   1   1  1
 0   0   0  0

[Figure: a black box with input nodes X1, X2, X3 and an output node Y]

Output Y is 1 if at least two of the three inputs are equal to 1.
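The rule behind the black box is a majority function; a minimal sketch in Python that reproduces the truth table above:

```python
# Truth table behind the "black box": Y = 1 when at least two of
# the three binary inputs are 1 (a majority function).
def black_box(x1, x2, x3):
    return 1 if (x1 + x2 + x3) >= 2 else 0

# Reproduce the table from the slide, row by row.
table = [(1, 0, 0), (1, 0, 1), (1, 1, 0), (1, 1, 1),
         (0, 0, 1), (0, 1, 0), (0, 1, 1), (0, 0, 0)]
for row in table:
    print(*row, black_box(*row))
```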


Artificial Neural Networks (ANN) / 2


[Figure: the same black box realized as a simple network: input nodes X1, X2, X3 feed an output node through links with weights 0.3, 0.3, 0.3; the output node uses threshold t = 0.4]

Y = I(0.3 X1 + 0.3 X2 + 0.3 X3 - 0.4 > 0),
where I(z) = 1 if z is true, and 0 otherwise.
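The weighted unit with the values from the slide does reproduce the majority rule; a small check in Python:

```python
from itertools import product

# The weighted unit from the slide: Y = I(0.3*X1 + 0.3*X2 + 0.3*X3 - 0.4 > 0).
def unit(x1, x2, x3, w=0.3, t=0.4):
    return 1 if w * x1 + w * x2 + w * x3 - t > 0 else 0

# It agrees with the rule "Y = 1 iff at least two inputs are 1"
# on all eight input combinations.
for x in product([0, 1], repeat=3):
    assert unit(*x) == (1 if sum(x) >= 2 else 0)
```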


Artificial Neural Networks (ANN) / 3

The model is an assembly of interconnected nodes and weighted links.

The output node sums its input values, weighted according to the links.

The weighted sum at the output node is compared against some threshold t.

[Figure: perceptron: input nodes X1, X2, X3 connect to output node Y through links with weights w1, w2, w3; the output node applies threshold t]

Perceptron model:

Y = I( Σ_i w_i X_i - t > 0 ),  or
Y = sign( Σ_i w_i X_i - t )
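The perceptron model above can be sketched as a small function; the weight values and threshold in the usage line are carried over from the earlier majority-function slide:

```python
# Perceptron: Y = sign(sum_i w_i * X_i - t), with output in {-1, +1}.
def perceptron(x, w, t):
    s = sum(wi * xi for wi, xi in zip(w, x)) - t
    return 1 if s > 0 else -1

# The majority unit, now in +-1 output form.
print(perceptron([1, 1, 0], [0.3, 0.3, 0.3], 0.4))   # -> 1
```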


General Structure of ANN

[Figure: a single neuron i: inputs I1, I2, I3 arrive over links with weights wi1, wi2, wi3; their weighted sum Si, together with threshold t, is passed through an activation function g to produce the output Oi = g(Si)]

[Figure: general structure of an ANN: an input layer (x1, x2, x3, x4, x5), a hidden layer, and an output layer producing y]

Training an ANN means learning the weights of the neurons.


Algorithm for Learning ANN

Initialize the weights (w0, w1, …, wk)

Adjust the weights so that the output of the ANN is consistent with the class labels of the training examples.
- Objective function: E = Σ_i [ Y_i - f(w_i, X_i) ]²
- Find the weights w_i that minimize this objective function, e.g., with the backpropagation algorithm.
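As an illustration of the objective function, the sketch below evaluates E on the majority-function data from the earlier slides; folding the threshold in as a constant -1 bias feature is an assumption made here for convenience:

```python
# f(w, x): a simple threshold unit with 0/1 output.
def f(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# Squared-error objective E(w) = sum_i (Y_i - f(w, X_i))^2.
# Learning searches for the weights w that minimize this value.
def E(w, data):
    return sum((y - f(w, x)) ** 2 for x, y in data)

# Majority-function data; the threshold is folded in as a constant -1
# input whose weight plays the role of t.
data = [((x1, x2, x3, -1), 1 if x1 + x2 + x3 >= 2 else 0)
        for x1 in (0, 1) for x2 in (0, 1) for x3 in (0, 1)]
print(E([0.3, 0.3, 0.3, 0.4], data))   # perfect weights give E = 0
```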


Artificial Neural Networks (ANN) / 2


Perceptron


Perceptron Learning Rule / 1

Let D = {(x_i, y_i) | i = 1, 2, …, N} be the set of training examples.

1. Initialize the weights w^(0) = (w_0^(0), w_1^(0), …, w_k^(0))
2. Repeat
   - For each training example (x_i, y_i):
     - Compute f(w^(k), x_i)
     - For each weight w_j, update the weight:
       w_j^(k+1) = w_j^(k) + λ [ y_i - f(w^(k), x_i) ] x_ij
3. Until the stopping condition is met
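The algorithm above can be sketched as a full training loop; using 0/1 labels (so the error on a mistake is +-1 rather than +-2) and a constant -1 bias input are assumptions made here for illustration:

```python
# Threshold unit with 0/1 output.
def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# Perceptron learning rule: w_j <- w_j + lam * (y_i - f(w, x_i)) * x_ij,
# iterated over epochs until no training example is misclassified.
def train(data, lam=0.1, epochs=100):
    n = len(data[0][0])
    w = [0.0] * n                      # initialize the weights
    for _ in range(epochs):
        errors = 0
        for x, y in data:
            e = y - predict(w, x)      # error, in {-1, 0, +1} for 0/1 labels
            if e != 0:
                errors += 1
                w = [wj + lam * e * xj for wj, xj in zip(w, x)]
        if errors == 0:                # stopping condition: epoch with no mistakes
            break
    return w

# Learn the majority function; the last feature is a constant -1 bias input
# standing in for the threshold t.
data = [((x1, x2, x3, -1), 1 if x1 + x2 + x3 >= 2 else 0)
        for x1 in (0, 1) for x2 in (0, 1) for x3 in (0, 1)]
w = train(data)
assert all(predict(w, x) == y for x, y in data)
```

The data is linearly separable, so the perceptron convergence theorem guarantees the loop terminates with zero training errors.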


Perceptron Learning Rule / 2

Weight update formula:

w_j^(k+1) = w_j^(k) + λ [ y_i - f(w^(k), x_i) ] x_ij,  where λ is the learning rate

Intuition:
- Update the weight based on the error e = y - f(w, x)
- If y = f(w, x), then e = 0: no update is needed
- If y > f(w, x), then e = 2: the weight must be increased so that f(w, x) will increase
- If y < f(w, x), then e = -2: the weight must be decreased so that f(w, x) will decrease


Perceptron Learning Rule / 3

Terminating condition: training stops when either
1. all weight changes Δw_ij in the previous epoch (i.e., iteration) were so small as to be below some specified threshold, or
2. the percentage of samples misclassified in the previous epoch is below some threshold, or
3. a pre-specified number of epochs has expired.

In practice, several hundreds of thousands of epochs may be required before the weights converge.


Example of Perceptron Learning

w_j^(k+1) = w_j^(k) + λ [ y_i - f(w^(k), x_i) ] x_ij,  with λ = 0.2

f(w, x) = sign( Σ_i w_i x_i ),  where sign(z) = 1 if z ≥ 0 and -1 otherwise


Perceptron Learning


Nonlinearly Separable Data
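XOR is the classic nonlinearly separable case: no single perceptron can separate it, but a two-layer network can. The hidden-unit weights below are hand-set for illustration (the slide itself does not give them): one hidden unit computes OR, the other AND, and the output computes "OR and not AND":

```python
# Step activation: 1 if the weighted sum exceeds the threshold, else 0.
def step(z):
    return 1 if z > 0 else 0

# Two-layer network computing XOR with hand-set weights.
def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)            # hidden unit 1: OR
    h2 = step(x1 + x2 - 1.5)            # hidden unit 2: AND
    return step(h1 - h2 - 0.5)          # output: h1 AND NOT h2

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, xor_net(x1, x2))
```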


Multilayer Neural Network / 1


Multilayer Neural Network / 2


Learning Multilayer Neural Network


Gradient Descent for Multilayer NN / 1


Gradient Descent for Multilayer NN / 2


Design Issues in ANN


Characteristics of ANN
