
The search for organizing principles of brain function


Page 1: The search for organizing principles of brain function

The search for organizing principles of brain function

• Needed at multiple levels: synapse => cell => brain area (cortical maps) => hierarchy of areas

• Self-organization: Hebbian learning => feature-analyzing cells => cortical maps

• Information theory, a neural optimization principle, and applications

• Prediction, control, and the “local cortical circuit” (LCC)

Page 2: The search for organizing principles of brain function

Self-organization

• Pattern formation (Turing, 1952) from simple local rules (e.g., Hebb, 1949)

– Hebb rule: When the firing of cell A contributes to that of cell B, increase the efficiency (synaptic strength) with which A excites B to fire.

– An early puzzle: How does a layer of orientation-selective cells (Hubel & Wiesel, 1960-70s) form?

– An early example of the power of Hebbian learning: the Hebb rule + short-range connections + locally-correlated random electrical activity can give rise to orientation-selective cells and their patterning in a layer (RL). (A minimal sketch of the Hebb rule follows.)
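
A minimal sketch of the Hebb rule stated above, as a toy NumPy illustration (not the model referenced on this slide); the Oja normalization term and the correlated-input toy are my own additions to keep the example bounded:

```python
# Toy Hebbian learning for a single linear unit (illustrative sketch only).
import numpy as np

def hebbian_update(w, x_pre, y_post, lr=0.01, oja=True):
    """One Hebbian step: dw = lr * y * x  (minus lr * y^2 * w if Oja's rule is used)."""
    dw = lr * y_post * x_pre
    if oja:
        dw -= lr * (y_post ** 2) * w   # normalization; keeps |w| from growing without bound
    return w + dw

# Drive the unit with locally-correlated random activity, as on the slide:
rng = np.random.default_rng(0)
C = np.array([[1.0, 0.9], [0.9, 1.0]])          # correlated "spontaneous activity"
w = rng.normal(scale=0.1, size=2)
for _ in range(5000):
    x = rng.multivariate_normal([0.0, 0.0], C)
    w = hebbian_update(w, x, y_post=w @ x)
print(w)   # converges toward the principal correlation direction of the inputs
```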

Page 3: The search for organizing principles of brain function

Self-organization in cortical models

• Movie: J Sirosh, R Miikkulainen, & JA Bednar (UT Austin), 1996 [courtesy JA Bednar]

• http://www.cs.utexas.edu/~nn/web-pubs/htmlbook96/sirosh/or_quad.mpg

• Click for movie: or_quad.mov

Orientation map [figure] (R Linsker, 1986)

Page 4: The search for organizing principles of brain function

• Some higher-level properties that can result from Hebbian learning

– Feature-analyzing (selective) cells.

– “Infomax” principle (RL): Create a layer of cells whose outputs convey maximum (Shannon) information about the layer’s inputs, subject to biological constraints & costs (types of allowed processing, wiring length, energy cost, etc.). An optimal encoding principle. (A sketch of the objective follows.)
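
As a hedged illustration of the infomax objective (my own sketch, not RL's model): for a linear layer y = W x + noise, with Gaussian inputs of covariance C and independent Gaussian output noise of variance s2, the transmitted Shannon information has a closed form that a learning rule could maximize under constraints:

```python
# Infomax objective for a noisy linear layer (illustrative sketch; assumes Gaussian
# inputs and Gaussian output noise so the mutual information is in closed form).
import numpy as np

def infomax_objective(W, C, noise_var=0.1):
    """I(x; y) in nats for y = W x + n, with x ~ N(0, C) and n ~ N(0, noise_var * I)."""
    n_out = W.shape[0]
    sign, logdet = np.linalg.slogdet(W @ C @ W.T + noise_var * np.eye(n_out))
    return 0.5 * (logdet - n_out * np.log(noise_var))

# Toy usage: random weights on correlated 2-D inputs.
rng = np.random.default_rng(0)
C = np.array([[1.0, 0.8], [0.8, 1.0]])
W = rng.normal(size=(2, 2))
print(infomax_objective(W, C))
```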

• Various uses of infomax

– Models of neural learning & development

– Qualitative (RL, others) and quantitative (Atick et al.) experimental agreement

– Infomax-based ICA (independent component analysis) (Bell & Sejnowski, 1995): Reconstructs N statistically independent sources, given N linear combinations of them. (A sketch follows this list.)

– Nonlinear infomax is one way to generate “sparse representations.” Sparse coding was used to reconstruct 3 speech sources given only the composite signal at each of 2 receivers (RL, 2001).
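
A hedged sketch of infomax ICA in the spirit of Bell & Sejnowski (1995), using the natural-gradient form of the update; the logistic nonlinearity, learning rate, iteration count, and toy Laplacian sources are illustrative choices, not the original implementation:

```python
# Natural-gradient infomax ICA (illustrative sketch).
import numpy as np

def infomax_ica(X, n_iter=1000, lr=0.05):
    """X: (n_sources, n_samples) array of linear mixtures. Returns an unmixing matrix W."""
    n = X.shape[0]
    W = np.eye(n)
    for _ in range(n_iter):
        U = W @ X                               # current source estimates
        Y = 1.0 / (1.0 + np.exp(-U))            # logistic nonlinearity (super-Gaussian prior)
        # Natural-gradient infomax update, averaged over the batch:
        dW = (np.eye(n) + (1.0 - 2.0 * Y) @ U.T / X.shape[1]) @ W
        W += lr * dW
    return W

# Toy usage: two super-Gaussian (Laplacian) sources, two linear mixtures.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # mixing matrix
W = infomax_ica(A @ S)
print(W @ A)   # approximately a scaled/permuted identity if separation succeeded
```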

Page 5: The search for organizing principles of brain function
Page 6: The search for organizing principles of brain function

Sparse representation of mixture of sources

[Figure: spectrogram; horizontal axis time, vertical axis frequency]

Page 7: The search for organizing principles of brain function

Labeling using a source signature

[Figure: spectrogram; horizontal axis time, vertical axis frequency]

Can obtain the source signature from:

– Relative transfer function (attenuation & phase shift at each frequency) from source to the two receivers (used here).

– Other methods (none used here): pitch tracking; phoneme properties; de-mixing two overlapping sources using the two received mixtures; etc.

Page 8: The search for organizing principles of brain function

Masking & reconstruction

[Figure: spectrogram; horizontal axis time, vertical axis frequency]
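
A hedged sketch of the labeling-and-masking pipeline on the last few slides, written as a DUET-style illustration rather than RL's 2001 algorithm: each time-frequency bin of the two-receiver mixture is labeled by its interchannel amplitude ratio and phase-derived delay (the "source signature", assumed known here), a winner-take-all binary mask is built per source, and each source is resynthesized from the masked STFT:

```python
# Two-receiver, time-frequency masking separation (illustrative sketch).
import numpy as np
from scipy.signal import stft, istft

def separate_two_receiver(x_left, x_right, signatures, fs=16000, nperseg=512):
    """signatures: list of (amplitude_ratio, delay_seconds), one per source (assumed known)."""
    f, t, XL = stft(x_left, fs=fs, nperseg=nperseg)
    _, _, XR = stft(x_right, fs=fs, nperseg=nperseg)
    eps = 1e-12
    ratio = np.abs(XR) / (np.abs(XL) + eps)                     # attenuation per bin
    f_safe = np.maximum(f[:, None], f[1])                       # avoid division by zero at DC
    delay = -np.angle(XR * np.conj(XL)) / (2 * np.pi * f_safe)  # phase-derived delay per bin
    # Distance of every bin to every source signature, then winner-take-all labels:
    dists = np.stack([(ratio - a) ** 2 + (delay - d) ** 2 for a, d in signatures])
    label = np.argmin(dists, axis=0)
    sources = []
    for k in range(len(signatures)):
        _, s_k = istft(XL * (label == k), fs=fs, nperseg=nperseg)   # binary mask + resynthesis
        sources.append(s_k)
    return sources
```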

Page 9: The search for organizing principles of brain function

Acoustic separation demo

• Mixture of 3 stereo speech sources

• Source 1: reconstruction & original

• Source 2: reconstruction & original

• Source 3: reconstruction & original

Page 10: The search for organizing principles of brain function

The “local cortical circuit” (LCC)

• Substantial uniformity of cell organization & connectivity across neocortical areas (Mountcastle)

• Core functions of the LCC “module”?

– A recurrent neural net that can combine “bottom-up” data and “top-down” expectations. LCC role in: forming generalizations? stabilizing feature analysis within each cortical processing area? Bayesian inference?

– It’s long been clear that prediction, estimation, inference, & goal-directed motor control are important functions of mammalian brains.

– Recent work (RL): A neural-net algorithm for optimal Kalman estimation (prediction) and control. The algorithm implies a set of constraints on the network’s circuitry & signal flows. This architecture turns out to be similar to the LCC. (A standard Kalman step is sketched below.)
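
For reference, a minimal sketch of one standard Kalman predict/update step (textbook linear-Gaussian form; the neural-network implementation mentioned above is not shown here):

```python
# One Kalman filter step for x_{t+1} = A x_t + w,  z_t = C x_t + v (illustrative sketch).
import numpy as np

def kalman_step(x, P, z, A, C, Q, R):
    """Propagate the estimate (x, P) through the dynamics, then correct with measurement z."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    S = C @ P_pred @ C.T + R                    # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

# Example: 1-D random walk observed with noise.
x, P = np.array([0.0]), np.array([[1.0]])
x, P = kalman_step(x, P, z=np.array([0.7]), A=np.eye(1), C=np.eye(1),
                   Q=0.01 * np.eye(1), R=0.1 * np.eye(1))
```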

Page 11: The search for organizing principles of brain function

Some other important unsolved problems

• “Fast learning”: animals vs. neural nets

– Learning causal relations: deterministic or statistical? Learning powerful invariances and the “right” representations. Is statistical learning over-emphasized?

• Principles governing the processing, segregation, & integration of information streams (e.g., color, form, “what” & “where”)?

• Common ground between perception & human concept formation: Learning similarity metrics that are useful for forming generalizations & for behavior.

• How is information coded? (Firing rates, spike timing, place coding, synchrony & phase-locking, …?)

• What representations are really used by the brain? Some surprises -- e.g., “change blindness” (R Rensink demo).

• The “binding problem”; self-awareness & consciousness

• Tools: How to probe circuit dynamics (of multiple interconnected cells) at fine spatial & temporal resolution?