Trends in Cognitive Sciences (Cell Press)


  • Editor: Stavroula Kousta; Executive Editor, Neuroscience: Katja Brose; Journal Manager: Rolf van der Sanden; Journal Administrator: Myarca Bonsink

    Advisory Editorial Board: R. Adolphs, Caltech, CA, USA; R. Baillargeon, U. Illinois, IL, USA; N. Chater, University College London, UK; P. Dayan, University College London, UK; S. Dehaene, INSERM, France; D. Dennett, Tufts U., MA, USA; J. Driver, University College London, UK; Y. Dudai, Weizmann Institute, Israel; A.K. Engel, Hamburg University, Germany; M. Farah, U. Pennsylvania, PA, USA; S. Fiske, Princeton U., NJ, USA; A.D. Friederici, MPI, Leipzig, Germany; O. Hikosaka, NIH, MD, USA; R. Jackendoff, Tufts U., MA, USA; P. Johnson-Laird, Princeton U., NJ, USA; N. Kanwisher, MIT, MA, USA; C. Koch, Caltech, CA, USA; M. Kutas, UCSD, CA, USA; N.K. Logothetis, MPI, Tübingen, Germany; J.L. McClelland, Stanford U., CA, USA; E.K. Miller, MIT, MA, USA; E. Phelps, New York U., NY, USA; R. Poldrack, U. Texas Austin, TX, USA; M.E. Raichle, Washington U., MO, USA; T.W. Robbins, U. Cambridge, UK; A. Wagner, Stanford U., CA, USA; V. Walsh, University College London, UK

    Editorial Enquiries: Trends in Cognitive Sciences

    Cell Press, 600 Technology Square, Cambridge, MA 02139, USA. Tel: +1 617 397 2817; Fax: +1 617 397 2810; E-mail: tics@elsevier.com

    February 2011, Volume 15, Number 2, pp. 47–94

    Cover: Although vision holds a central role in social interactions, the social perception of actions also relies on auditory and olfactory information. On pages 47–55, Salvatore M. Aglioti and Mariella Pazzaglia review recent evidence showing how actions can be guided by sounds and smells, both independently and within the context of the multimodal perceptions and representations that characterize real-world experiences. Notably, non-visual information appears to have a crucial role not only in guiding actions, but also in anticipating others' actions and thus in shaping social interactions more generally.

    Forthcoming articles:
    Cognitive neuroscience of self-regulation failure (Todd Heatherton and Dylan D. Wagner)
    Representing multiple objects as an ensemble enhances visual cognition (George A. Alvarez)
    Songs to syntax: the linguistics of birdsong (Robert C. Berwick, Kazuo Okanoya, Gabriel J. Beckers and Johan J. Bolhuis)
    Connectivity constrains the organization of object knowledge (Bradford Zack Mahon and Alfonso Caramazza)
    Specifying the self for cognitive neuroscience (Kalina Christoff, Diego Cosmelli, Dorothée Legrand and Evan Thompson)


    47 Sounds and scents in (social) action (Salvatore M. Aglioti and Mariella Pazzaglia)

    56 Value, pleasure and choice in the ventral prefrontal cortex (Fabian Grabenhorst and Edmund T. Rolls)

    68 Cognitive culture: theoretical and empirical insights into social learning strategies (Luke Rendell, Laurel Fogarty, William J.E. Hoppitt, Thomas J.H. Morgan, Mike M. Webster and Kevin N. Laland)

    77 Visual search in scenes involves selective and nonselective pathways (Jeremy M. Wolfe, Melissa L.-H. Võ, Karla K. Evans and Michelle R. Greene)

    85 Emotional processing in anterior cingulate and medial prefrontal cortex (Amit Etkin, Tobias Egner and Raffael Kalisch)

  • Sounds and scents in (social) action
    Salvatore M. Aglioti1,2 and Mariella Pazzaglia1,2

    1 Dipartimento di Psicologia, Sapienza University of Rome, Via dei Marsi 78, Rome I-00185, Italy; 2 IRCCS Fondazione Santa Lucia, Via Ardeatina 306, Rome I-00179, Italy

    Although vision seems to predominate in triggering the simulation of the behaviour and mental states of others, the social perception of actions might rely on auditory and olfactory information not only when vision is lacking (e.g. in congenitally blind individuals), but also in daily life (e.g. hearing footsteps along a dark street prompts an appropriate fight-or-flight reaction and smelling the scent of coffee prompts the act of grasping a mug). Here, we review recent evidence showing that non-visual, telereceptor-mediated motor mapping might occur as an autonomous process, as well as within the context of the multimodal perceptions and representations that characterize real-world experiences. Moreover, we discuss the role of auditory and olfactory resonance in anticipating the actions of others and, therefore, in shaping social interactions.

    Telereceptive senses, namely vision, audition and olfaction, process information coming from both the near and the distant external environment.

    Perceiving and interacting with the world and with other individuals might appear to be guided largely by vision, which, according to classical views, leads over audition, olfaction and touch, and commands, at least in human and non-human primates, most types of cross-modal and perceptuo-motor interactions [1]. However, in sundry daily life circumstances, our experience with the world is inherently cross-modal [2]. For example, inputs from all sensory channels combine to increase the efficiency of our actions and reactions. Seeing flames, smelling smoke or hearing a fire alarm might each be sufficient to create an awareness of a fire. However, the combination of all these signals ensures that our response to danger is more effective. The multimodal processing of visual, acoustic and olfactory information is even more important for our social perception of the actions of other individuals [3]. Indeed, vision, audition and olfaction are the telereceptive senses that process information coming from both the near and the distant external environment, on which the brain then defines the self–other border and the surrounding social world [4,5].

    Behavioural studies suggest that action observation and execution are coded according to a common representational medium [6]. Moreover, neural studies indicate that seeing actions activates a fronto-parietal neural network that is also active when performing those same actions [7,8]. Thus, the notion that one understands the actions of others by simulating them motorically is based mainly on visual studies (Box 1). Vision is also the channel used for studying the social nature of somatic experiences (e.g. touch and pain) [9–11] and emotions (e.g. anger, disgust and happiness) [12]. In spite of the notion that seeing might be informed by what one hears or smells, less is known about the possible mapping of actions through the sound and the odour associated with them, either in the absence of vision or within the context of clear cross-modal perception. In this review, we question the exclusive supremacy of vision in action mapping, not to promote a democracy of the senses, but to highlight the crucial role of the other two telereceptive channels in modulating our actions and our understanding of the world in general, and of the social world in particular.

    The sound and flavour of actions

    Classic cross-modal illusions, such as ventriloquism or the McGurk effect, indicate that vision is a key sense in several circumstances [13,14]. Therefore, when multisensory cues are simultaneously available, humans display a robust tendency to rely more on visual than on other forms of sensory information, particularly when dealing with spatial tasks (a phenomenon referred to as the Colavita visual dominance effect) [15]. However, our knowledge is sometimes dominated by sound and is filtered through a predominantly auditory context. Auditory stimuli might, for example, capture visual stimuli in temporal localization tasks [16]. Moreover, the presentation of two beeps and a single flash induces the perception of two visual stimuli [17]. Thus, sound-induced flash illusions create the mistaken belief that we are seeing what we are, in fact, only hearing.

    This pattern of results might be in keeping with the notion that multisensory processing reflects modality appropriateness rules, whereby vision dominates in spatial tasks, and audition in temporal ones [18]. However, psychophysical studies indicate that the degradation of visual inputs enables auditory inputs to modulate spatial localization [19]. This result is in keeping with the principle of inverse effectiveness [20], according to which multisensory integration is more probable or stronger for the unisensory stimuli that evoke relatively weak responses when presented in isolation. Notably, the recording of neural activity from the auditory cortex of alert monkeys watching naturalistic audiovisual stimuli indicates not only that congruent bimodal events provide more information than do unimodal ones, but also that suppressed responses are less variable and, thus, more informative than are enhanced responses [21].

    Relevant to the present review is that action sounds might be crucial for signalling socially dangerous or unpleasant events. Efficient mechanisms for matching audition with action might be important, even at basic levels, because they might ensure the survival of all hearing individuals. For example, in the dark of the primordial nights, ancestral humans probably detected potential dangers (e.g. the footsteps of enemies) mainly by audition and, therefore, implemented effective fight-or-flight behaviour. However, action–sound mediated inferences about others might also occur

    Corresponding author: Aglioti, S.M. (salvatoremaria.aglioti@uniroma1.it).

    1364-6613/$ – see front matter © 2010 Elsevier Ltd. All rights reserved. doi:10.1016/j.tics.2010.12.003 Trends in Cognitive Sciences, February 2011, Vol. 15, No. 2, p. 47