EGR 183: Modeling Neural Networks in silico. Dr. Needham - Fall 2007. Daniel Calrin B.S.E., Daniel Cook, Joshua Mendoza-Elias


Description: Short (expert) version PowerPoint summarizing the research paper file 'Paper EGR 183 Modeling Neural Networks in silico'.


Page 1: EGR 183 Final Presentation

EGR 183: Modeling Neural Networks in silico
Dr. Needham - Fall 2007

Daniel Calrin B.S.E., Daniel Cook, Joshua Mendoza-Elias

Page 2: EGR 183 Final Presentation

Background: Neurons

Page 3: EGR 183 Final Presentation

Background: LTP and LTD
The mechanisms of Long-term Potentiation

Page 4: EGR 183 Final Presentation

Background continued: The mechanisms of Long-term Depression

Page 5: EGR 183 Final Presentation

Short-term and Long-term Effects

Page 6: EGR 183 Final Presentation

The Basis of Hebbian Learning

Page 7: EGR 183 Final Presentation

Foundation for our Computer Model

Page 8: EGR 183 Final Presentation

Types of Neuron

Four types (labelled 1-4 in the diagram):
• Input Neuron: voltage-clamped, presynaptic to all neurons in the model
• Output Neuron
• Inhibitory Neurons: depress post-synaptic neurons (α = -1)
• Excitatory Neurons: average firing rates solved at each time step

Learning rule determines the change in synaptic strength.
Key: inhibitory synapse / excitatory synapse
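To make the model structure concrete, here is a minimal Python sketch (not the authors' code; the network size, which neurons are inhibitory, the clamp on the input neuron, and the [0, 1] rate bound are all assumptions): a vector of average firing rates is recomputed at each time step from the signed, weighted rates of the presynaptic neurons.

import numpy as np

# Minimal sketch of the model structure described above (assumptions noted in the lead-in).
N = 10                                  # hypothetical network size
rng = np.random.default_rng(0)

W = rng.uniform(0.0, 1.0, size=(N, N))  # synaptic strengths (all-to-all here)
sign = np.ones(N)                       # +1 = excitatory
sign[[3, 7]] = -1.0                     # example: neurons 3 and 7 are inhibitory

v = np.zeros(N)                         # average firing rates
v[0] = 1.0                              # input neuron held ("voltage-clamped") at 1.0

for t in range(100):
    # average firing rate of each neuron solved at this time step:
    # weighted presynaptic rates, inhibitory neurons entering with a negative sign
    total_input = W.T @ (sign * v)
    v = np.clip(total_input, 0.0, 1.0)  # keep rates in [0, 1]
    v[0] = 1.0                          # re-apply the clamp on the input neuron
    # ... the learning rule would update W here (see Page 9) ...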

Page 9: EGR 183 Final Presentation

Synaptic strengths

In phase: vi and vj are both firing at full rate (1.0 and 1.0) at t = t0. Two neurons firing full-speed: the strength increases by a factor of alpha (+α) at t = t0+1.

Out of phase: vi is firing (1.0) but vj is not (0.0) at t = t0. One neuron (vi) is firing but the other (vj) is not: the strength decreases by a factor of beta (-β) at t = t0+1.
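A sketch of this rule in Python, assuming hypothetical values for alpha, beta, and the firing threshold; the slide does not state whether the change is multiplicative or additive, so the multiplicative form below is an assumption.

# Sketch of the learning rule described on this slide (exact form assumed):
# in phase (both firing) -> strength grows by a factor of alpha;
# out of phase (only one firing) -> strength shrinks by a factor of beta.
def update_strength(w, v_i, v_j, alpha=0.1, beta=0.05, threshold=0.5):
    """Return the updated strength of the synapse between neurons i and j."""
    if v_i > threshold and v_j > threshold:      # in phase: both firing
        return w * (1.0 + alpha)
    if (v_i > threshold) != (v_j > threshold):   # out of phase: only one firing
        return w * (1.0 - beta)
    return w                                     # both silent: no change

# Example from the slide: rates (1.0, 1.0) -> strengthened; (1.0, 0.0) -> weakened.
print(update_strength(1.0, 1.0, 1.0))  # in phase: strengthened
print(update_strength(1.0, 1.0, 0.0))  # out of phase: weakened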

Page 10: EGR 183 Final Presentation

Inhibitory Neurons

Excitatory: an excitatory neuron vi is firing, potentiating the post-synaptic neuron. Weighted vi is summed positively into vj at t = t0+1:
vj = ... + wi,j vi + ...

Inhibitory: an inhibitory neuron vi is firing, depressing the post-synaptic neuron. Weighted vi is summed negatively into vj at t = t0+1:
vj = ... - wi,j vi + ...
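The two expressions above can be written as a single sum; the combined formula below is an interpretation of the slide, not a formula quoted from the paper:

\[
  v_j(t_0 + 1) \;=\; \sum_i s_i \, w_{i,j} \, v_i(t_0),
  \qquad
  s_i = +1 \text{ for excitatory neurons}, \quad
  s_i = -1 \text{ for inhibitory neurons}.
\]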

Page 11: EGR 183 Final Presentation

Results

• In-phase & out-of-phase components emerged, but we could not teach the model complete phasic inversion
• Further development is needed to do this: one-way connections (i.e. some strengths are 0)

Page 12: EGR 183 Final Presentation

Phase components
• Learning rule: α = | v1 - vN | (sketched below)

[Plot: firing rate vs. time for the Input Neuron and Output Neuron, showing convergence of the in-phase component and the out-of-phase component]
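A small Python sketch of this learning rate, assuming v1 is the input neuron's firing rate and vN is the output neuron's (that identification is not stated explicitly on the slide): alpha is recomputed at every time step from how far apart the two rates are.

def phase_alpha(v_input, v_output):
    # learning rate from the slide: alpha = |v1 - vN|
    return abs(v_input - v_output)

# rates nearly in phase -> small alpha; out of phase -> large alpha
print(phase_alpha(1.0, 0.9))  # ≈ 0.1
print(phase_alpha(1.0, 0.0))  # 1.0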

Page 13: EGR 183 Final Presentation

Maxima & Minima

[Plot: firing rate vs. time for the Input Neuron and Output Neuron, annotated: local max near minimum, local min near maximum, maximum at maximum]

Page 14: EGR 183 Final Presentation

Further developments

Short-term

• Sparse synapse matrix (i.e. some synapses are strength 0; a sketch follows this list)

• Asynchronous firing

• Multi-dimensional training (i.e. for character recognition, sound recognition, etc.)

Long-term

• Ca2+ Modeling

• Gene Expression Profile (DNA microarray data to reflect changes in synaptic efficacy)
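One way the proposed sparse synapse matrix could look in Python (the 30% connection density, the mask variable, and the apply_learning_rule call are hypothetical): a fixed binary mask zeroes out selected synapses, giving one-way connections, and is re-applied after each learning update so pruned synapses stay at strength 0.

import numpy as np

N = 10
rng = np.random.default_rng(0)
W = rng.uniform(0.0, 1.0, size=(N, N))   # dense synaptic strengths

mask = rng.random((N, N)) < 0.3           # keep ~30% of synapses (assumption)
np.fill_diagonal(mask, False)             # no self-connections
W *= mask                                 # pruned synapses have strength 0

# After each learning step (hypothetical learning-rule call):
# W = apply_learning_rule(W, v); W *= mask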

Page 15: EGR 183 Final Presentation

Biological parallel with the in silico model

Starovoytov et al. 2005. Light-directed electrical stimulation of neurons cultured on silicon wafers. J Neurophysiol 93: 1090-1098.

Page 16: EGR 183 Final Presentation

LDS in concert with Computer Simulation

• MEAs vs. LDS
• More real-time data, more quickly
• Scans: works on variably connected neural networks