
Language, Mind, and Brain: Experience Alters Perception

Chapter 8

The New Cognitive Neurosciences M. Gazzaniga (ed.)

Sep 7, 2001

Relevant points from Stein et al. (Chap. 5)

• The anterior ectosylvian sulcus (AES) functions as an association area and feeds back to the superior colliculus (SC)

• Multisensory neurons prefer spatially and temporally coincident events

• Neonatal SC neurons do not receive cortical input. By around 20 weeks postnatal:

– Percentage of multisensory neurons increases from ~0% to ~60%

– Receptive fields (RFs) of SC neurons shrink

– Multisensory neurons become spatially aligned and integrated

Relevant points (contd.)

• One dominant modality (e.g. vision) may first develop and influence the development of other modalities (e.g. audition)

• Sensory integration seems to occur after development of individual modalities

=> Initially, the modalities are developmentally related but not integrated

End of critical development for cats: ~135 days ≈ 20 weeks. 20 weeks × 7 days/week = 140 days ≈ 5 months (about 0.6%-1% of the lifespan)

Speculation: basic perceptual abilities of infants may be developed by ~5 months postnatal, at which point integration starts to occur.

Overview of the paper

• Development versus learning
• Language as a case study
  – Developmental changes in speech
  – Crossmodal influences
• The perceptual magnet effect
  – Acoustic structure of vowels
  – Experience alters perception
• A theory of speech development (Native Language Magnet)
• Argument for NLM
  – Nature of language input
  – Vocal learning
  – Brain correlates

• Conclusion

Development vs. Learning

• An organism's behavior changes over time. Are the changes caused by:
  – Unfolding of the genetic program (e.g. growing a leg)?
  – A general learning mechanism that detects and adapts to patterns in the world (e.g. a baby learning to use a walker)?
  – A combination of both (and if so, how are they related)?

• Kuhl's four types of models:
  A) Development and learning are independent
  B) Learning <=> development
  C) Development enables learning
  D) Development enables learning, and learning triggers development

Issue: What is development?

• All learning involves physical changes to the brain
• Which physical changes are due to 'development', as opposed to 'learning'?

• Possible distinction 1: developmental changes are 'internally triggered'
  – Problem: Kuhl's eventual position is that learning sets up developmental changes, which would make them 'externally triggered'
• Possible distinction 2: aspects of behavior that were not environmentally specified are 'developmental'
  – Problem: this defines development with respect to behavior, not physical changes in the brain

Language as a case study

Skinner (Learning + Instruction)
Claim: A child learns language by associating sounds with objects and events.
Argument: Language is learnt by the same mechanisms that are used to learn other patterns in the world (learning associations with feedback from a "teacher").

Chomsky (Development + Selection)
Claim: A child grows language, just as he grows arms and legs.
Argument: Grammatical structure is too complex to be learnt; the child does not need a lot of input, just some of the right kind of input to set the parameters of the developing language.

The debate applies at three levels:
• phonology (the sounds of speech)
• word structure (the ordering of speech sounds)
• grammar (the ordering of words)

Language as a case study: Developmental changes

• 1-4 months: Cooing – vowel-like sounds
• 5-10 months: Canonical babbling – consonant-vowel strings ("bababa")
• 10-15 months: First words – consistent use of a word to refer to objects
• 15 months onwards: Meaningful speech – long intonated utterances
• 18-24 months: Two-word utterances – usually verb + noun pairs

• Perception: infants are initially able to discriminate between most speech contrasts; later they can discriminate between only native-language speech contrasts

Deaf children follow a similar developmental pattern

Language as a case study: Crossmodal influences

The McGurk effect:
  Auditory information for [b] (closure at the front of the mouth)
  + visual information for [g] (closure at the back of the mouth)
  => perception of [d] (closure in the middle of the mouth)

• 18- to 20-week-old infants prefer correlated visual + speech signals
• Robust influence of vision on speech perception
• Polymodal representation of speech

The perceptual magnet effect: Acoustic structure of vowels

Each vowel has characteristic resonance frequencies (spectral peaks).

Formants = spectral peaks

The quality of a vowel is approximately given by the locations of the first two formants (F1 and F2).

[Figure: vocal tract shape and corresponding spectrum, with formant peaks F1, F2, F3]

from P. Ladefoged, Elements of Acoustic Phonetics, 2nd ed.
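To make the formant idea concrete, here is a minimal sketch (not part of the chapter) of estimating the resonance frequencies of a recorded vowel via linear prediction (LPC); the waveform `y`, the sample rate `sr`, the LPC order, and the file name are illustrative assumptions.

```python
# Hedged sketch: estimate vowel resonance frequencies (formant candidates)
# from a mono recording using linear prediction. Assumes `y` is a float
# waveform and `sr` its sample rate; order=12 is a typical choice, not a
# value taken from the chapter.
import numpy as np
import librosa

def estimate_formants(y, sr, order=12):
    """Return LPC-derived resonance frequencies in Hz, lowest first."""
    a = librosa.lpc(y, order=order)        # coefficients of the LPC polynomial
    roots = np.roots(a)
    roots = roots[np.imag(roots) > 0]      # keep one root per conjugate pair
    freqs = np.angle(roots) * sr / (2 * np.pi)
    return np.sort(freqs)

# The two lowest returned frequencies approximate F1 and F2, which roughly
# determine vowel quality.
# y, sr = librosa.load("vowel.wav", sr=None)   # hypothetical file name
# f1, f2 = estimate_formants(y, sr)[:2]
```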

The perceptual magnet effect: Language experience alters perception

Discrimination is worse near the prototypes of phonetic categories

The better the prototype, the worse the discrimination near it (as if the prototype attracts the percepts towards itself)

Effect seen in humans but not in monkeys

[Figure: 1-dimensional MDS solutions (vertical spacing arbitrary, chosen to prevent overlap)]

from Iverson & Kuhl, JASA 97(1)
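As a toy illustration of the "magnet" idea (my own sketch, not the Iverson & Kuhl model): equally spaced stimuli are pulled toward a category prototype, so perceived steps shrink near the prototype and stay roughly unchanged far from it. The `pull` and `width` parameters are made up for illustration.

```python
# Toy perceptual-magnet sketch: warp a 1-D stimulus dimension toward a
# category prototype so that discrimination (perceived step size) is worse
# near the prototype.
import numpy as np

def perceive(stimulus, prototype, pull=0.6, width=1.0):
    """Map stimuli to warped percepts; pull/width are illustrative parameters."""
    d = stimulus - prototype
    attraction = pull * np.exp(-(d ** 2) / (2 * width ** 2))
    return stimulus - attraction * d       # percept moves toward the prototype

stimuli = np.linspace(0.0, 4.0, 9)         # equally spaced acoustic steps
percepts = perceive(stimuli, prototype=0.0)
print(np.diff(percepts))                   # steps near the prototype (0.0) come out smaller
```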

The perceptual magnet effect: Effect of native language

Adults exhibit the PME only for native language speech contrasts

6-month-old infants exhibit a similar native-language effect

The perceptual magnet effect: Theoretical implications

"Language input sculpts the brain … [to highlight] contrasts used in language, while de-emphasizing those that are not, and this happens prior to word learning"

The PME may assist infants in chunking the sound stream into words

=> Learning (PME) promotes development (word acquisition)

Issue: Paradoxical nature of learning

Speech sounds: discriminating among frequently-encountered stimuli is more difficult

Face recognition: discrimination among frequently-encountered stimuli is much easier

• When does learning involve loss of ability, and when does learning sharpen ability?

• Are both loss and sharpening caused by the same learning/developmental process?

• What kind of neural mechanisms could provide this kind of learning? (competitive-learning neural nets?)
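As a minimal, self-contained illustration of the competitive-learning idea raised in the last question (my own sketch, not a model from the chapter): units compete for each input, only the winner's weights move, and frequently heard inputs end up "owning" units that behave like prototypes.

```python
# Hedged sketch of winner-take-all competitive learning: inputs drawn from two
# clusters stand in for two frequently heard vowels; after training, some unit
# weights sit near each cluster centre, i.e. prototypes have formed.
import numpy as np

rng = np.random.default_rng(0)
n_units, dim, lr = 4, 2, 0.1
weights = rng.normal(size=(n_units, dim))       # random initial "prototypes"

def train_step(x):
    """Move the closest unit (the competition winner) toward input x."""
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    weights[winner] += lr * (x - weights[winner])
    return winner

for _ in range(2000):
    centre = np.array([0.0, 0.0]) if rng.random() < 0.5 else np.array([3.0, 3.0])
    train_step(centre + rng.normal(scale=0.3, size=dim))

print(weights)   # several rows end up near (0, 0) or (3, 3)
```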

The “Native Language Magnet” model of speech development

Phase 1 (ability at birth):
  – Infants can hear almost all speech contrasts
  – This predisposition may be due to auditory abilities

Phase 2 (ability at 6 months):
  – Infants start representing native-language information
  – Language-specific perceptual maps start to form

Phase 3 (stabilization of perceptual maps):
  – Native speech distinctions are maximized
  – Non-native speech distinctions are minimized


Argument for NLM: Critical periods

• Critical periods are not strictly timed. During "sensitive periods":
  – exposure to some kinds of information may be more effective,
  – and may thus alter the duration of the sensitive period

• Why are second languages difficult to learn?
  – A maturationally defined temporal window for learning
  – Neural commitment resulting from early learning

• Reason for neural commitment: interference (speculation)
  – Brain plasticity is governed by statistical sufficiency
  – Once the speech prototypes stabilize, further data are ignored for processing efficiency
  – Thus old patterns interfere with learning new patterns

Argument for NLM: Nature of language input to the child

• 'Motherese' is hyperarticulated
• Possibly to give infants clearer prototypes of speech sounds and thus promote the PME

Argument for NLM: Vocal learning

• Production becomes "discrete" at around the same time the PME is first seen
• By ~20 weeks, infants are also more likely to imitate correctly

Argument for NLM: Vocal learning (contd.)

Possible explanation for discrete productions:

Due to perceptual magnet effects, infants' speech representations become more discrete:

=> they hear more discretely and so produce more discretely

=> Perceptual representations serve as guides for production


Argument for NLM: Brain correlates

• Dissociation between speech and nonspeech signals
  – Phonetic processing engaged the left hemisphere
  – Pitch processing of the same stimuli engaged the right hemisphere

• Speech becomes lateralized by ~ 4 months

Conclusion: Learning and development are interrelated

• Development (the genetic program) sets up the constraints for effective learning:
  – auditory sensitivity and natural discontinuities
  – the neural mechanisms for pattern detection

• Possible explanation:

Experience with linguistic input causes

the perceptual magnet effect, which causes

the formation of discrete perceptual representations, which allows for

word acquisition, which triggers

the left-hemisphere dominance, which allows

better representation and processing?