NTL – Converging Constraints
• Basic concepts and words derive their meaning from embodied experience.
• Abstract and theoretical concepts derive their meaning from metaphorical maps to more basic embodied concepts.
• Structured Connectionist Models can capture both of these processes nicely.
• Grammar extends this by Constructions: pairings of form with embodied meaning.
Simulation-based language understanding
“Harry walked to the cafe.”
Schema: walk    Trajector: Harry    Goal: cafe

[Figure: the Utterance feeds an Analysis Process, which draws on Constructions, General Knowledge, and the current Belief State to produce a Simulation Specification; that specification then drives the Simulation (here, Harry's walk to the cafe).]
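A minimal sketch of the pipeline above, with hypothetical names (the real analyzer is construction-based; this toy version just pattern-matches the example sentence):

```python
from dataclasses import dataclass

@dataclass
class SimSpec:
    """Parameters the analysis process hands to the simulation."""
    schema: str     # embodied action schema, e.g. "walk"
    trajector: str  # the mover
    goal: str       # spatial goal of the motion

def analyze(utterance: str) -> SimSpec:
    # Toy stand-in for analysis with constructions, general knowledge,
    # and belief state: just recognize the slide's example sentence.
    if utterance == "Harry walked to the cafe.":
        return SimSpec(schema="walk", trajector="Harry", goal="cafe")
    raise ValueError("no matching construction")

print(analyze("Harry walked to the cafe."))
# SimSpec(schema='walk', trajector='Harry', goal='cafe')
```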
Background: Primate Motor Control
• Relevant requirements (Stromberg, Latash, Kandel, Arbib, Jeannerod, Rizzolatti)
  – Should model the coordinated, distributed, parameterized control programs required for motor action and perception.
  – Should be an active structure.
  – Should be able to model concurrent actions and interrupts.
• Model
  – The NTL project has developed a computational model (x-schemas) that satisfies these requirements.
  – Details, papers, etc. are available on the web at http://www.icsi.berkeley.edu/NTL
Active representations
• Many inferences about actions derive from what we know about executing them
• A representation based on stochastic Petri nets captures the dynamic, parameterized nature of actions
Walking:
• bound to a specific walker, with a direction or goal
• consumes resources (e.g., energy)
• may have a termination condition (e.g., walker at goal)
• an ongoing, iterative action

[Figure: Petri-net walking schema with bindings walker=Harry and goal=home; an energy resource is consumed until the walker-at-goal condition holds.]
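A minimal sketch of such an active representation, assuming a plain marked Petri net (the actual x-schemas are stochastic and richer); names like `PetriNet` are illustrative, not NTL code:

```python
class PetriNet:
    """Toy marked Petri net: a transition fires when every input place has a token."""
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        assert self.enabled(name)
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Walking schema bound to walker=Harry, goal=home. 'energy' is the consumed
# resource; tokens in 'distance' are remaining steps; 'at_goal' is the
# termination condition.
walk = PetriNet({"ready": 1, "energy": 5, "distance": 3, "at_goal": 0})
walk.add_transition("step", inputs=["ready", "energy", "distance"],
                    outputs=["ready"])

while walk.enabled("step"):        # ongoing, iterative action
    walk.fire("step")              # each step burns energy and closes distance

if walk.marking["distance"] == 0:  # termination condition: walker at goal
    walk.marking["at_goal"] = 1
print(walk.marking)  # {'ready': 1, 'energy': 2, 'distance': 0, 'at_goal': 1}
```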
Somatotopy of Action Observation

[Figure: brain-imaging panels for Foot Action, Hand Action, and Mouth Action.]

Buccino et al., Eur J Neurosci 2001
Language Development in Children
• 0-3 mo: prefers sounds in native language
• 3-6 mo: imitation of vowel sounds only
• 6-8 mo: babbling in consonant-vowel segments
• 8-10 mo: word comprehension; starts to lose sensitivity to consonants outside native language
• 12-13 mo: word production (naming)
• 16-20 mo: word combinations, relational words (verbs, adjectives)
• 24-36 mo: grammaticization, inflectional morphology
• 3 years to adulthood: vocabulary growth, sentence-level grammar for discourse purposes
food: apple, juice, bottle, spoon, banana, cookie
toys: ball, bead, truck, hammer, box, horse
misc.: cow, shoe, eye, door
people: girl, baby, daddy, mommy, boy
sound: woof, moo, choo-choo, boom
emotion: yum, whee, uh-oh, oh
action: go, get, sit, open
prep.: down, up, out, in, on
demon.: this, there, here, that
social: yes, no, more, bye, hi

Words learned by most 2-year-olds in a play school (Bloom 1993)
Learning Spatial Relation Words (Terry Regier)
A model of children learning spatial relations.
Assumes the child hears one word label per scene.
Program learns well enough to label novel scenes correctly.
Extended to simple motion scenarios, like INTO.
System works across languages.
Mechanisms are neurally plausible.
Learning System
We’ll look at the details next lecture.

[Figure: a structured connectionist network (based on the visual system), extended to handle dynamic relations (e.g., into).]
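A minimal sketch of the training setup, with a generic softmax classifier standing in for Regier's structured connectionist network; the feature set and word list are invented for illustration:

```python
import numpy as np

WORDS = ["above", "below", "in"]

def features(scene):
    """Toy spatial features of the trajector relative to the landmark."""
    dx, dy, containment = scene
    return np.array([dx, dy, containment, 1.0])  # last entry is a bias term

# One word label per scene, as the model assumes the child hears.
train = [((0.0,  1.0, 0.0), "above"),
         ((0.1, -1.0, 0.0), "below"),
         ((0.0,  0.0, 1.0), "in")]

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(len(WORDS), 4))
for _ in range(200):                    # plain softmax regression
    for scene, word in train:
        x = features(scene)
        p = np.exp(W @ x); p /= p.sum()
        y = np.eye(len(WORDS))[WORDS.index(word)]
        W += 0.5 * np.outer(y - p, x)   # gradient step on log-likelihood

novel = (0.05, 0.9, 0.0)                # an unseen "above" scene
print(WORDS[int(np.argmax(W @ features(novel)))])  # -> above
```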
Limitations
• Scale
• Uniqueness/Plausibility
• Grammar
• Abstract Concepts
• Inference
• Representation
• Biological Realism
Constrained Best Fit in Nature

inanimate:
• physics: lowest energy state
• chemistry: molecular minima

animate:
• biology: fitness, MEU (neuroeconomics)
• vision: threats, friends
• language: errors, NTL
Learning Verb Meanings (David Bailey)
A model of children learning their first verbs.
Assumes the parent labels the child’s actions.
The child knows the parameters of the action and associates them with the word.
The program learns well enough to: 1) label novel actions correctly; 2) obey commands using new words (in simulation).
The system works across languages.
Mechanisms are neurally plausible.
Training Results (David Bailey)
English
• 165 training examples, 18 verbs
• Learns the optimal number of word senses (21)
• 32 test examples: 78% recognition, 81% action
• All mistakes were close (lift ~ yank, etc.)
• Learned some particle constructions (CXNs), e.g., “pull up”
Farsi
• With identical settings, learned senses not present in English
Model Merging and Recruitment
Word learning requires “fast mapping”.
Recruitment learning is a connectionist-level model of this.
Model merging is a practical computational-level method for fast mapping.
Bailey’s thesis outlines the reduction, and some versions have been built.
The full story requires Bayesian MDL, covered later.
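A minimal sketch of model merging in the spirit of Bailey's learner (a greedy distance threshold stands in for the Bayesian MDL score): start with one sense per labeled example and merge the closest senses until none are similar enough.

```python
import numpy as np

def merge_senses(examples, threshold=0.5):
    """examples: one feature vector per observed, labeled action.
    Returns the merged word senses (cluster centers)."""
    senses = [np.asarray(e, dtype=float) for e in examples]
    counts = [1] * len(senses)
    while len(senses) > 1:
        # find the closest pair of current senses
        d, i, j = min((np.linalg.norm(senses[a] - senses[b]), a, b)
                      for a in range(len(senses))
                      for b in range(a + 1, len(senses)))
        if d > threshold:      # stand-in for "merging worsens MDL": stop
            break
        total = counts[i] + counts[j]
        senses[i] = (senses[i] * counts[i] + senses[j] * counts[j]) / total
        counts[i] = total
        del senses[j], counts[j]
    return senses

# e.g. (force, elbow-extension) parameters for four performances of "push"
examples = [[0.9, 0.1], [0.85, 0.15], [0.2, 0.95], [0.25, 0.9]]
print(len(merge_senses(examples)))  # -> 2 distinct senses survive
```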
The Idea of Recruitment Learning
• Suppose we want to link up node X to node Y
• The idea is to pick the two nodes in the middle to link them up
• Can we be sure that we can find a path to get from X to Y?
The probability of finding no path from X to Y through K intermediate layers, where each of the N nodes has fan-out B and connection density F = B/N, is:

P(no link) = (1 - F)^(B^K)

The point is, with a fan-out of 1000, if we allow 2 intermediate layers, we can almost always find a path.
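Checking that claim numerically (N, the total node count, is an assumed brain-scale figure, not from the slide):

```python
# P(no path from X to Y through K intermediate layers) = (1 - F)**(B**K)
B = 1_000        # fan-out per node (from the slide)
K = 2            # intermediate layers (from the slide)
N = 100_000_000  # assumed total number of nodes
F = B / N        # connection density
print((1 - F) ** (B ** K))  # ~4.5e-05, so a path almost always exists
```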
Recruiting triangle nodes
• Let’s say we are trying to remember a green circle
• currently there are weak connections between the concepts (dotted lines)

[Figure: concepts blue and green (has-color) and round and oval (has-shape), weakly linked to candidate triangle nodes.]

Strengthen these connections
• and you end up with this picture

[Figure: a recruited “Green circle” triangle node, now strongly bound to green via has-color and to round via has-shape.]
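A minimal sketch of that strengthening step (the random weak links and the boost rule are illustrative assumptions): pick the triangle node whose weak links best reach all the active concepts, then strengthen exactly those links.

```python
import random

random.seed(0)
CONCEPTS = ["green", "blue", "round", "oval", "has-color", "has-shape"]

# A pool of triangle nodes, each born with weak random links (the dotted lines).
triangle_nodes = [{c: random.uniform(0.0, 0.2) for c in CONCEPTS}
                  for _ in range(50)]

def recruit(active, nodes, boost=1.0):
    """Recruit the node best pre-connected to every active concept,
    then strengthen those connections."""
    best = max(nodes, key=lambda n: min(n[c] for c in active))
    for c in active:
        best[c] += boost
    return best

# Remembering a green circle: bind color and shape through one node.
green_circle = recruit(["green", "has-color", "round", "has-shape"],
                       triangle_nodes)
print({c: round(w, 2) for c, w in green_circle.items()})
```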