
Sparse Coding in Sparse Winner networks


Page 1: Sparse Coding  in Sparse Winner networks

Sparse Coding in Sparse Winner networks

Janusz A. Starzyk¹, Yinyin Liu¹, David Vogel²

¹ School of Electrical Engineering & Computer Science, Ohio University, USA

² Ross University School of Medicine, Commonwealth of Dominica

ISNN 2007: The 4th International Symposium on Neural Networks

Page 2: Sparse Coding  in Sparse Winner networks

Outline

• Sparse Coding

• Sparse Structure

• Sparse winner network with winner-take-all (WTA) mechanism

• Sparse winner network with oligarchy-take-all (OTA) mechanism

• Experimental results

• Conclusions

[Figure: cortical areas: Broca's area, pars opercularis, motor cortex, somatosensory cortex, sensory associative cortex, primary auditory cortex, Wernicke's area, visual associative cortex, visual cortex]

Page 3: Sparse Coding  in Sparse Winner networks

Kandel Fig. 23-5

Sparse Coding

• How do we take in sensory information and make sense of it?

Richard Axel, 1995

[Figure: somatosensory homunculus with body parts labeled (foot, hip, trunk, arm, hand, face, tongue, larynx); Kandel Fig. 30-1]

Page 4: Sparse Coding  in Sparse Winner networks

Sparse Coding

• Neurons become active representing objects and concepts

http://gandalf.psych.umn.edu/~kersten/kersten-lab/CompNeuro2002/

C. Connor, “Friends and grandmothers,” Nature, Vol. 435, June 2005

• Metabolic demands of the human sensory system and brain

• Statistical properties of the environment – not every single bit of information matters

• “Grandmother cell” by J.V. Lettvin – only one neuron on the top level representing and recognizing an object (extreme case)

• A small group of neurons on the top level representing an object

Producing sparse neural representations: “sparse coding”

Page 5: Sparse Coding  in Sparse Winner networks

Sparse Structure

• The 10^12 neurons in the human brain are sparsely connected

• On average, each neuron is connected to other neurons through about 10^4 synapses

• Sparse structure enables efficient computation and saves energy and cost

Page 6: Sparse Coding  in Sparse Winner networks

Sparse Coding in Sparse Structure

• Cortical learning: unsupervised learning

• Finding sensory input activation pathway

• Competition is needed: find neurons with stronger activities and suppress those with weaker activities

• Winner-take-all (WTA): a single winning neuron

• Oligarchy-take-all (OTA): a group of strongly active neurons as winners

[Figure: hierarchical network above the sensory input; arrow: increasing connection adaptability]

Page 7: Sparse Coding  in Sparse Winner networks

Outline

• Sparse Coding

• Sparse Structure

• Sparse winner network with winner-take-all (WTA) mechanism

• Sparse winner network with oligarchy-take-all (OTA) mechanism

• Experimental results

• Conclusions


Page 8: Sparse Coding  in Sparse Winner networks


• Local network model of cognition – R-net

• Primary layer and secondary layer

• Random sparse connection

• For associative memories, not for feature extraction

• Not in hierarchical structure

[Figure: R-net with a primary layer and a secondary layer]

David Vogel, “A neural network model of memory and higher cognitive functions in the cerebrum”

Sparse winner network with winner-take-all (WTA)

Page 9: Sparse Coding  in Sparse Winner networks


• Use secondary neurons to provide “full connectivity” in sparse structure

• More secondary levels can increase the sparsity

• Primary levels and secondary levels

[Figure: alternating primary and secondary levels (primary level h, secondary level s, primary level h+1) above the input pattern, with the winner at the top; axis: increasing number of overall neurons]

Sparse winner network with winner-take-all (WTA)

Hierarchical learning network:

Finding neuronal representations:

• Finding the global winner, which has the strongest signal strength

• For a large number of neurons, this is very time-consuming
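To make the primary/secondary structure concrete, here is a minimal sketch (Python with NumPy) that wires up a hierarchy of levels in which each neuron receives only a small random set of input links; the level sizes, fan-in, and random wiring are illustrative assumptions, not the network used by the authors.

```python
import numpy as np

def build_sparse_hierarchy(level_sizes, fan_in, seed=0):
    """Return one sparse weight matrix per pair of consecutive levels
    (primary and secondary levels interleaved as in the figure above)."""
    rng = np.random.default_rng(seed)
    weights = []
    for n_prev, n_next in zip(level_sizes[:-1], level_sizes[1:]):
        W = np.zeros((n_next, n_prev))
        for i in range(n_next):
            # each neuron connects to a small random subset of the level below
            links = rng.choice(n_prev, size=min(fan_in, n_prev), replace=False)
            W[i, links] = rng.random(len(links))
        weights.append(W)
    return weights

# toy usage: primary level h, secondary level s, primary level h+1
weights = build_sparse_hierarchy([64, 256, 128], fan_in=8)
print([W.shape for W in weights])
```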

Page 10: Sparse Coding  in Sparse Winner networks


• Data transmission: feed-forward computation

[Figure: feed-forward data transmission from the input pattern through levels h, s1, h+1, s2 toward the global winner]

Sparse winner network with winner-take-all (WTA)

Finding global winner using localized WTA:

• Winner tree finding: local competition and feedback

• Winner selection: feed-forward computation and weight adjustment

Page 11: Sparse Coding  in Sparse Winner networks


• Signal calculation

• Transfer function

s_i^{l+1} = Σ_j w_ij · s_j^{l},  where layer l is the input and layer l+1 is the output

A neuron's output is passed on only if its signal exceeds the activation threshold.

[Figure: an input pattern driving neurons j on layer l, which feed neuron i on layer l+1 through weights w_ij]

Sparse winner network with winner-take-all (WTA)

Data transmission: feed-forward computation
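A minimal sketch of this feed-forward computation, assuming NumPy arrays and a hard activation threshold; the array layout and the exact form of the thresholding are assumptions, not the authors' implementation.

```python
import numpy as np

def feed_forward(s_prev, W, theta):
    """One feed-forward step: s_i^{l+1} = sum_j w_ij * s_j^l,
    with neurons below the activation threshold theta kept inactive."""
    s = W @ s_prev                       # weighted sum over incoming links
    return np.where(s >= theta, s, 0.0)  # transfer function with threshold

# toy usage: 8 inputs feeding 4 neurons through a sparse weight matrix
rng = np.random.default_rng(0)
W = rng.random((4, 8)) * (rng.random((4, 8)) < 0.3)  # ~30% of links present
print(feed_forward(rng.random(8), W, theta=0.5))
```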

Page 12: Sparse Coding  in Sparse Winner networks


• Local competition

Current-mode WTA circuit (signals represented by currents)

• Local competitions on network

s_winner^{level+1} = max_{i = 1, 2, …, N} ( Σ_j w_ij · s_j^{level} )

Sparse winner network with winner-take-all (WTA)

Local neighborhood: the set of post-synaptic neurons of N_4^{level} and the set of pre-synaptic neurons of N_4^{level+1}

Local competition → local winner: N_4^{level+1} is the winner among neurons 4, 5, 6, 7, 8

The losers' branches (l1, l3) are logically cut off; the signal is passed on only through the winning branch (l2)

Winner tree finding: local competition and feedback

[Figure: a two-level fragment of the network showing a local neighborhood, the local winner, and the logically cut-off branches; neurons n_1^{h+1}, n_2^{h+1}, n_3^{h+1} with signals S_1^{h+1}, S_2^{h+1}, S_3^{h+1} above a secondary-level neuron n_1^{s}]
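The local competition can be sketched as follows: after the feed-forward step, each input neuron's set of post-synaptic neurons holds a local competition, and only the strongest of them keeps its branch while the losers' links are logically cut off. The neighborhood definition and data layout below are simplifying assumptions.

```python
import numpy as np

def local_wta(W, s_prev):
    """Local competitions: for every input neuron j, only its strongest
    post-synaptic neuron keeps the link; the other branches are cut off."""
    s_next = W @ s_prev                    # feed-forward signal strengths
    keep = np.zeros_like(W, dtype=bool)    # which links survive
    for j in range(W.shape[1]):            # neighborhood of input neuron j
        post = np.nonzero(W[:, j])[0]      # its post-synaptic neurons
        if post.size:
            winner = post[np.argmax(s_next[post])]
            keep[winner, j] = True         # local winner keeps branch j
    return s_next, keep

# toy usage with a random sparse weight matrix
rng = np.random.default_rng(1)
W = rng.random((6, 4)) * (rng.random((6, 4)) < 0.5)
s, keep = local_wta(W, rng.random(4))
print(s, int(keep.sum()), "links kept")
```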

Page 13: Sparse Coding  in Sparse Winner networks


The winner network is found: all the neurons directly or indirectly connected with the global winner neuron
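Collecting that winner network can be read as a simple traversal over the links that survived the local competitions; the adjacency representation below is an assumption used only to illustrate the idea.

```python
from collections import deque

def winner_network(global_winner, kept_links):
    """Collect every neuron reachable from the global winner through
    links that survived local competition (kept_links: neuron -> neighbors)."""
    seen = {global_winner}
    queue = deque([global_winner])
    while queue:
        n = queue.popleft()
        for m in kept_links.get(n, ()):
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return seen

# tiny example: winner 'A' keeps links to 'B' and 'C'; 'C' keeps a link to 'D'
print(winner_network('A', {'A': ['B', 'C'], 'C': ['D']}))
```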

Winner tree

[Figure: the winner tree spanning the network, with the global winner's signal S_winner marked along the winning branches; legend: input neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron]

Sparse winner network with winner-take-all (WTA)

Page 14: Sparse Coding  in Sparse Winner networks

• Signals are recalculated through the logically connected links

• Weights are adjusted using the concept of Hebbian learning

w_11^{h} ← w_11^{h} + α · x_1
w_21^{h} ← w_21^{h} + α · x_2
w_31^{h} ← w_31^{h} + α · x_3
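A minimal sketch of this adjustment, assuming a simple Hebbian strengthening of the winner's input links in proportion to their inputs; the exact update rule and the learning rate α are assumptions.

```python
import numpy as np

def hebbian_update(w, x, alpha=0.1):
    """Strengthen the winner's input weights (e.g. w11, w21, w31) in
    proportion to the corresponding inputs x1, x2, x3 (assumed rule)."""
    return w + alpha * x

w = np.array([0.2, 0.5, 0.8])   # weights on the winner's input links
x = np.array([1.0, 0.0, 1.0])   # current input signals
print(hebbian_update(w, x))     # only links carrying input are strengthened
```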

Sparse winner network with winner-take-all (WTA)


[Figure: neuron n_1^{h+1} receiving inputs x_1, x_2, x_3 from neurons n_1^{h}, n_2^{h}, n_3^{h} through weights w_11^{h}, w_21^{h}, w_31^{h}]

Winner selection: feed-forward computation and weight adjustment


Page 15: Sparse Coding  in Sparse Winner networks

Sparse winner network with winner-take-all (WTA)

[Plot: number of active neurons on the top level vs. number of input links to each neuron]

[Plot: average signal strength of active neurons on the top level, excluding the global winner, relative to that of the global winner, vs. number of input links to each neuron]

The number of global winners found is typically 1 with sufficient input links.

• 64-256-1028-4096 network
• Finds 1 global winner with over 8 input links

Page 16: Sparse Coding  in Sparse Winner networks

Outline

• Sparse Coding

• Sparse Structure

• Sparse winner network with winner-take-all (WTA) mechanism

• Sparse winner network with oligarchy-take-all (OTA) mechanism

• Experimental results

• Conclusions


Page 17: Sparse Coding  in Sparse Winner networks

• Signals propagate layer by layer

• Local competition is performed as each layer is reached

• Local WTA

• Multiple local winner neurons on each level

• Multiple winner neurons on the top level – oligarchy-take-all

• Oligarchy represents the sensory input

• Provides coding redundancy

• More reliable than WTA
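A minimal sketch of the oligarchy-take-all idea: signals are propagated level by level, each local competition keeps only its winner active, and the set of neurons still active on the top level is the oligarchy that represents the input. The fixed-size competition groups and the random weights are assumptions made only for illustration.

```python
import numpy as np

def ota_forward(x, weight_list, group_size=4):
    """Propagate a pattern upward, keeping one winner per local group on
    every level; the active set on the top level is the 'oligarchy'."""
    s = x
    for W in weight_list:
        s = W @ s                                   # feed-forward signals
        winners = np.zeros_like(s)
        for start in range(0, len(s), group_size):  # local competitions
            grp = slice(start, start + group_size)
            k = start + int(np.argmax(s[grp]))
            winners[k] = s[k]                       # local winner stays active
        s = winners
    return np.nonzero(s)[0]                         # indices of the oligarchy

# toy usage with random sparse weights
rng = np.random.default_rng(2)
Ws = [rng.random((32, 16)) * (rng.random((32, 16)) < 0.3),
      rng.random((64, 32)) * (rng.random((64, 32)) < 0.3)]
print(ota_forward(rng.random(16), Ws))
```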

Sparse winner network with oligarchy-take-all (OTA)

[Figure legend: active neuron, winner neuron in local competition, loser neuron in local competition, inactive neuron]

Page 18: Sparse Coding  in Sparse Winner networks

Outline

• Sparse Coding

• Sparse Structure

• Sparse winner network with winner-take-all (WTA)

• Sparse winner network with oligarchy-take-all (OTA)

• Experimental results

• Conclusions


Page 19: Sparse Coding  in Sparse Winner networks

Experimental Results

Input size: 8 x 8

original image

[Plot: initial output signal strength; neuronal activity per output neuron, with the activation threshold and the global winner marked]

[Plot: winner selected in testing (winner is neuron 3728); neuronal activity per output neuron, with the activation threshold and the global winner marked]

WTA scheme in sparse network

Page 20: Sparse Coding  in Sparse Winner networks

Experimental Results

64-bit input

digit | active neuron indices in OTA network
0 | 72 91 365 371 1103 1198 1432 1639 …
1 | 237 291 377 730 887 1085 1193 1218 …
2 | 294 329 339 771 845 1163 1325 1382 …
3 | 109 122 237 350 353 564 690 758 …
4 | 188 199 219 276 307 535 800 1068 …
5 | 103 175 390 450 535 602 695 1008 …
6 | 68 282 350 369 423 523 538 798 …
7 | 237 761 784 1060 1193 1218 1402 1479 …
8 | 35 71 695 801 876 1028 1198 1206 …
9 | 184 235 237 271 277 329 759 812 …

On average, 28.3 active neurons represent an object, varying from 26 to 34 neurons.

OTA scheme in sparse network

Page 21: Sparse Coding  in Sparse Winner networks


[Plot: percentage of correct recognition vs. number of bits changed in the pattern, comparing the performance of OTA, the performance of the winner (WTA) network, and the accuracy level of random recognition]

OTA has better fault tolerance than WTA
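The fault-tolerance comparison can be reproduced in outline: flip an increasing number of input bits, recompute the top-level representation, and count a trial as correct when the noisy representation is still closest to the stored representation of the original pattern. The overlap-based scoring below is an assumption about how recognition was judged, not the authors' exact procedure.

```python
import numpy as np

def flip_bits(pattern, n_bits, rng):
    """Return a copy of a binary pattern with n_bits random positions flipped."""
    noisy = pattern.copy()
    idx = rng.choice(len(pattern), size=n_bits, replace=False)
    noisy[idx] = 1 - noisy[idx]
    return noisy

def recognition_rate(patterns, encode, n_bits, trials=20, seed=0):
    """Fraction of noisy patterns whose representation overlaps most with the
    stored representation of the correct class (assumed scoring rule)."""
    rng = np.random.default_rng(seed)
    stored = [set(encode(p)) for p in patterns]      # e.g. OTA active-neuron sets
    correct = 0
    for _ in range(trials):
        k = rng.integers(len(patterns))
        rep = set(encode(flip_bits(patterns[k], n_bits, rng)))
        overlaps = [len(rep & s) for s in stored]    # overlap with each class
        correct += int(np.argmax(overlaps) == k)
    return correct / trials

# `encode` could be the ota_forward sketch above applied to the 64-bit digit patterns
```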

Experimental Results

Page 22: Sparse Coding  in Sparse Winner networks

Conclusions & Future work

• Sparse coding can be built in sparsely connected networks

• WTA scheme: local competitions accomplish the global competition using primary and secondary layers, allowing efficient hardware implementation

• OTA scheme: local competition produces a reduction in neuronal activity

• OTA – redundant coding: more reliable and robust

• WTA & OTA: learning memory for developing machine intelligence

Future work:

• Introducing temporal sequence learning

• Building a motor pathway on such a learning memory

• Combining it with a goal-creation pathway to build an intelligent machine