Week 6: (March 15, 2011) Auditory Maps and orienting: need for Coordinate Transformations



The barn owl (Tyto alba):

Ears are placed asymmetrically

View of the left side; view of the right side.

(Takahashi)

Convergence of many ICc neurons onto a single ICx neuron creates ITD sensitivity.

(Takahashi)
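To illustrate the idea, here is a minimal sketch in Python (not Takahashi's model; all parameter values are assumed): each ICc frequency channel is ITD-tuned but phase-ambiguous within its band, and summing many channels onto one ICx neuron leaves a single peak at the true ITD.

```python
# A minimal sketch (not from the slides) of how summing phase-ambiguous ITD tuning
# curves from many frequency channels ("ICc neurons") can yield a single ICx neuron
# tuned to one true ITD.
import numpy as np

true_itd = 50e-6                             # sound-source ITD: 50 microseconds (assumed)
itds = np.linspace(-300e-6, 300e-6, 601)     # candidate ITDs tested
freqs = np.linspace(3e3, 9e3, 12)            # 12 frequency channels (assumed range)

# Each ICc channel responds periodically in ITD (phase ambiguity within one band).
icc = np.array([0.5 * (1 + np.cos(2 * np.pi * f * (itds - true_itd))) for f in freqs])

# Convergence onto one ICx neuron: sum across frequency channels.
icx = icc.sum(axis=0)

best = itds[np.argmax(icx)]
print(f"ICx peak at {best*1e6:.0f} us (true ITD = {true_itd*1e6:.0f} us)")
```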

Neurons in the owl’s ventral lemniscus are sensitive to the interaural level difference (ILD)! (Equivalent to the LSO in mammals.)

(Knudsen/Konishi)

Formation of a space-specific neuron in the owl’s ICx:

Convergence of ITD- and ILD-sensitive neurons
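A minimal sketch of this convergence (illustrative only; the Gaussian tuning widths and preferred locations are assumptions, and the multiplicative combination is just one simple way to model an AND-like interaction of an ITD-tuned azimuth cue and an ILD-tuned elevation cue):

```python
# A minimal sketch (illustrative, not the owl's actual circuit) of how combining an
# ITD-tuned input (azimuth cue) with an ILD-tuned input (elevation cue) yields a
# space-specific receptive field, as in the owl's ICx.
import numpy as np

az = np.linspace(-90, 90, 181)        # azimuth (deg)
el = np.linspace(-60, 60, 121)        # elevation (deg)
AZ, EL = np.meshgrid(az, el)

# Assumed tuning: Gaussian ITD tuning centered on the preferred azimuth,
# Gaussian ILD tuning centered on the preferred elevation.
itd_input = np.exp(-0.5 * ((AZ - 20) / 15) ** 2)   # prefers ~+20 deg azimuth
ild_input = np.exp(-0.5 * ((EL + 10) / 15) ** 2)   # prefers ~-10 deg elevation

# AND-like (here: multiplicative) convergence produces a restricted spatial RF.
icx_rf = itd_input * ild_input

peak = np.unravel_index(np.argmax(icx_rf), icx_rf.shape)
print(f"RF peak at azimuth {AZ[peak]:.0f} deg, elevation {EL[peak]:.0f} deg")
```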

The neural map of auditory space in the owl’s ICx

The formation of this computational map depends critically on the quality of the owl’s visual system, through feedback connections from the SC (more on this later).

General Organization of the Mammalian Auditory System

ACOUSTIC

subcortical pathways I

The Direct Sound-Localization and Orienting Pathway:

Superior Colliculus: eye-head orienting

Inferior Colliculus: sound direction

Brainstem: acoustic cue processing

Orienting of eyes, head, body (and pinnae) involves the midbrain Superior Colliculus (SC):

[Scheme: VISION and AUDITION converge on the SC, which drives EYE and HEAD movements (sensory to motor).]

SC:
- Multisensory
- Sensorimotor 'interface'
- Topographic map of saccadic gaze shifts

SC: topographic map of gaze shifts (saccade vectors)

[Figure: firing rate and horizontal/vertical eye position as a function of time.]

Independent of eye position.

Making an eye movement towards a sound requires a coordinate transformation:

This transformation necessitates a signal about eye position re. head, E.

[Figure: visual (V) and auditory (A) target vectors plotted on the SC motor map, with the eye and head aligned versus not aligned (remapped vectors V' and A'). Map axes: rostral-caudal and up-down; amplitude isolines at 2, 5, 10, 20 and 40 deg; direction meridians from -60 to +30 deg.]
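For reference, an SC motor map of this kind is often described with a log-polar afferent mapping (Ottes, Van Gisbergen & Eggermont, 1986). The sketch below uses the commonly cited monkey parameter values (Bu = 1.4 mm, Bv = 1.8 mm, A = 3 deg); these values are assumptions here and are not taken from the slides.

```python
# A sketch of the log-polar afferent mapping often used to describe the SC motor map
# (after Ottes, Van Gisbergen & Eggermont, 1986). Parameter values are the commonly
# cited monkey values and are assumptions, not taken from the slides.
import numpy as np

Bu, Bv, A = 1.4, 1.8, 3.0   # mm, mm, deg

def saccade_to_map(R, phi_deg):
    """Map a saccade vector (amplitude R in deg, direction phi in deg) to anatomical
    SC coordinates (u: rostral-caudal, v: medial-lateral, both in mm)."""
    phi = np.deg2rad(phi_deg)
    u = Bu * np.log(np.sqrt(R**2 + 2 * A * R * np.cos(phi) + A**2) / A)
    v = Bv * np.arctan2(R * np.sin(phi), R * np.cos(phi) + A)
    return u, v

# Example: a 20-deg saccade in the 30-deg (up-right) direction.
print(saccade_to_map(20.0, 30.0))
```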

Jay and Sparks (1984/1987):

Auditory responses in the SC are in eye-centered coordinates.

Question: How do these cells get their information?

Hypothesis: The midbrain IC could convey this signal.

Tuning of an IC neuron to eye position.

1. FR increases for rightward eye fixations.

2. FR increases only during the acoustic response:

“GAIN FIELD”
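A minimal sketch of such a gain field (the functional form and parameter values are assumed, not fitted to the recorded neuron): eye position scales only the sound-evoked response, not the baseline firing.

```python
# A minimal sketch (assumed functional form) of an eye-position "gain field": the
# neuron's acoustic response is scaled by eye position, while its baseline is not.
import numpy as np

def ic_firing_rate(sound_on, eye_pos_deg, baseline=5.0, acoustic_gain=40.0, k=0.02):
    """Firing rate (spikes/s) of a model IC neuron.
    sound_on: whether the acoustic stimulus is present.
    eye_pos_deg: horizontal eye position re. head (deg, rightward positive).
    k: assumed gain-field slope (fractional change per deg)."""
    rate = baseline
    if sound_on:
        # Eye position modulates only the sound-evoked component (the gain field).
        rate += acoustic_gain * (1.0 + k * eye_pos_deg)
    return max(rate, 0.0)

for eye in (-20, 0, 20):
    print(f"eye {eye:+d} deg: spont {ic_firing_rate(False, eye):.1f}, "
          f"sound {ic_firing_rate(True, eye):.1f} spikes/s")
```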

Neural Network Model of IC-SC mapping

Activity of model IC neurons: a peaked Gaussian tuning curve, sound-level and sound-position modulation, and eye-position modulation are randomly distributed across the IC population (240 IC neurons, 12 freq bands; 100 SC neurons in the map).

[Model scheme: brainstem pathways supply a tonotopic code of a sound at (AZ, EL) through its HRTF/ILD cues; the frequency-specific, eye-position-modulated IC activity projects to the SC, which carries a topographic code of eye-motor error.]
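A toy sketch of the model IC population described above (functional forms and parameter ranges are assumptions; only the population sizes are taken from the slide). An SC map of 100 neurons would then have to be trained to read out the eye-centered motor error from this activity.

```python
# A toy sketch of the model IC population: 240 neurons in 12 frequency bands, each with
# Gaussian frequency tuning and multiplicative modulation by sound position and eye
# position, with modulation strengths drawn at random (assumed ranges).
import numpy as np

rng = np.random.default_rng(0)
n_ic, n_bands = 240, 12

best_freq_band = rng.integers(0, n_bands, n_ic)     # tonotopic channel of each neuron
pos_slope = rng.uniform(-0.01, 0.01, (n_ic, 2))     # modulation by sound (az, el), per deg
eye_slope = rng.uniform(-0.01, 0.01, (n_ic, 2))     # modulation by eye position, per deg
freq_sigma = 1.0                                    # tuning width in bands (assumed)

def ic_population(spectrum, sound_pos, eye_pos):
    """spectrum: level per frequency band (12,); sound_pos, eye_pos: (az, el) in deg."""
    bands = np.arange(n_bands)
    freq_tuning = np.exp(-0.5 * ((bands[None, :] - best_freq_band[:, None]) / freq_sigma) ** 2)
    drive = freq_tuning @ spectrum                  # acoustic drive per neuron
    gain = (1 + pos_slope @ np.asarray(sound_pos)) * (1 + eye_slope @ np.asarray(eye_pos))
    return np.clip(drive * gain, 0, None)

r = ic_population(np.ones(n_bands), sound_pos=(20, 30), eye_pos=(-30, 30))
print(r.shape, r.mean())
```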

Example of a typical IC model neuron: ‘gain field’

Simulation result for TH = (+20, +30) deg and EH = (-30, +30) deg => ME = TH - EH = (+50, 0) deg.

[Diagram: M = H - E: the eye-centered motor error M equals the head-centered target vector H minus the eye-in-head position E.]
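A worked example of this transformation, reproducing the numbers from the simulation above (purely illustrative):

```python
# The eye-centered motor error ME is the head-centered target TH minus the
# eye-in-head position EH (M = H - E on the slide).
import numpy as np

def eye_centered_motor_error(target_re_head, eye_re_head):
    """Both arguments are (azimuth, elevation) in deg; returns ME = TH - EH."""
    return np.asarray(target_re_head) - np.asarray(eye_re_head)

TH = (20, 30)    # sound location re. head (deg)
EH = (-30, 30)   # eye position re. head (deg)
print(eye_centered_motor_error(TH, EH))   # -> [50  0], as in the simulation on the slide
```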

Reference frames: including the head and body

[Scheme: nested reference frames, Eye within Head within Body, plus the target T.
Vision: target re. eye
Audition: target re. head
Somatosensation: target re. body
Linking signals: eye re. head and head re. body.]

Eye movements require oculocentric error signals

Head movements require craniocentric error signals

Vision is Eye-Centered

Audition is Head-Centered
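Putting these four statements together (a minimal sketch; the vector notation and example numbers are assumptions): each modality needs the complementary transformation, using eye position re. head, to supply both error signals.

```python
# Eye movements need an oculocentric (eye-centered) error, head movements a
# craniocentric (head-centered) error. Vision delivers the target eye-centered,
# audition head-centered, so each needs the other transformation via eye-in-head position.
import numpy as np

def errors_from_vision(target_re_eye, eye_re_head):
    """Visual target: already an oculocentric error; add eye position for the head."""
    oculocentric = np.asarray(target_re_eye)
    craniocentric = oculocentric + np.asarray(eye_re_head)
    return oculocentric, craniocentric

def errors_from_audition(target_re_head, eye_re_head):
    """Auditory target: already a craniocentric error; subtract eye position for the eye."""
    craniocentric = np.asarray(target_re_head)
    oculocentric = craniocentric - np.asarray(eye_re_head)
    return oculocentric, craniocentric

eye_re_head = np.array([-30.0, 30.0])                   # deg (az, el), example values
print(errors_from_audition([20.0, 30.0], eye_re_head))  # -> ([50, 0], [20, 30])
```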

In head-free orienting (human):

Eyes and head indeed both move toward a visual or auditory target.

Goossens & Van Opstal, Exp. Brain Research, 114 (1997)

Studying coordinate transformations - I:

Does the auditory system keep sounds in head-centered coordinates?

First, the rationale behind the idea:

“the double-step paradigm and pure-tone localization”

Goossens and Van Opstal, J Neurophysiol 1999

Studying coordinate transformations - I

Goossens & Van Opstal, J Neurophysiol 1999

Double-Step Paradigm

Pure-Tone Localization Paradigm

[Stimuli: visual target (V) and sound (S).]

The sound-localization system should be able to account for intervening movements of the eyes and head:

[Schematic (azimuth vs. elevation): from the initial fixation point F, the first gaze shift ∆G is made to the visual target V; the sound S lies at TH relative to the head. Required gaze shifts: GH = TH from the initial fixation, but GS = TH - ∆G after the intervening gaze shift.]
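The two predictions contrasted by the double-step paradigm can be written out directly (a sketch; the example numbers are hypothetical):

```python
# Double-step logic: the sound S is at TH re. the head when presented; an intervening
# gaze shift dG (to the visual target) occurs before the response. If the intervening
# movement is ignored, the predicted second gaze shift is GH = TH; if it is accounted
# for (spatially accurate updating), it is GS = TH - dG.
import numpy as np

def predicted_gaze_shifts(TH, dG):
    TH, dG = np.asarray(TH, float), np.asarray(dG, float)
    return {"no updating (GH = TH)": TH,
            "updated (GS = TH - dG)": TH - dG}

# Hypothetical example values (not from the paper):
print(predicted_gaze_shifts(TH=(25, -10), dG=(15, 0)))
```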

Sound localization responses are spatially accurate

(Goossens and Van Opstal, 1999)

Pure-Tone Localization Behaviour: do head movements help?

Pure-Tone Localization: (in)dependent of head orientation?

Pure-Tone Localization depends on head orientation AND on tone frequency!

Sounds appear to be represented in a spatial (body-centered) reference frame (TSPACE = THEAD + HSPACE):

Computation WITHIN the (tonotopic) auditory system
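The relation on the slide can be written out directly (the example numbers are hypothetical):

```python
# TSPACE = THEAD + HSPACE: the head-centered sound location is combined with head
# orientation in space to give a spatial (body/world-centered) target representation.
import numpy as np

def target_in_space(target_re_head, head_in_space):
    """Both arguments are (azimuth, elevation) in deg."""
    return np.asarray(target_re_head) + np.asarray(head_in_space)

print(target_in_space(target_re_head=(10, 20), head_in_space=(15, -5)))   # -> [25 15]
```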

Dynamic coordinate transformations for multisensory orienting:
