Progress in Brain Research, volume 156, 2006
Lang, P. J. & Davis, M. (2006). Emotion, motivation, and the brain: reflex
foundations in animal and human research. Prog. Brain Res., 156, 3-29.
Notes: NIMH Center for the Study of Emotion and Attention, Department of Clinical and
Health Psychology, University of Florida, FL 32610-0165, USA. plang@phhp.ufl.edu
This review will focus on a motivational circuit in the brain, centered on the amygdala,
that underlies human emotion. This neural circuitry of appetitive/approach and
defensive/avoidance was laid down early in our evolutionary history in primitive cortex,
sub-cortex, and mid-brain, to mediate behaviors basic to the survival of individuals and
the propagation of genes to coming generations. Thus, events associated with appetitive
rewards, or that threaten danger or pain, engage attention and prompt information
gathering more than other input does. Motive cues also occasion metabolic arousal and
anticipatory responses, and mobilize the organism to prepare for action. Findings are
presented from research with animals, elucidating these psychophysiological (e.g.,
cardiovascular, neuro-humoral) and behavioral (e.g., startle potentiation, "freezing")
patterns in emotion, and defining their mediating brain circuits. Parallel results are
described from experiments with humans, showing similar activation patterns in brain
and body in response to emotion cues, co-varying with participants' reports of affective
valence and increasing emotional arousal.
20070118
Schupp, H. T., Flaisch, T., Stockburger, J., & Junghofer, M. (2006). Emotion and
attention: event-related brain potential studies. Prog. Brain Res., 156, 31-51.
Notes: Department of Psychology, University of Konstanz, Konstanz, and Institute for
Biomagnetism and Biosignalanalysis, Munster University Hospital, Germany.
Harald.Schupp@uni-konstanz.de
Emotional pictures guide selective visual attention. A series of event-related brain
potential (ERP) studies is reviewed, demonstrating the consistent and robust modulation
of specific ERP components by emotional images. Specifically, pictures depicting natural
pleasant and unpleasant scenes are associated with an increased early posterior
negativity, late positive potential, and sustained positive slow wave compared with
neutral contents. These modulations are considered to index different stages of stimulus
processing including perceptual encoding, stimulus representation in working memory,
and elaborate stimulus evaluation. Furthermore, the review includes a discussion of
studies exploring the interaction of motivated attention with passive and active forms of
attentional control. Recent research is reviewed exploring the selective processing of
emotional cues as a function of stimulus novelty, emotional prime pictures, learned
stimulus significance, and in the context of explicit attention tasks. It is concluded that
ERP measures are useful to assess the emotion-attention interface at the level of distinct
processing stages. Results are discussed within the context of two-stage models of
stimulus perception brought out by studies of attention, orienting, and learning.
Codispoti, M., Ferrari, V., De Cesarei, A., & Cardinale, R. (2006). Implicit and
explicit categorization of natural scenes. Prog. Brain Res., 156, 53-65.
Notes: Department of Psychology, University of Bologna, Viale Berti Pichat, 5-40127
Bologna, Italy. maurizio.codispoti@unibo.it
Event-related potential (ERP) studies have consistently found that emotionally arousing
(pleasant and unpleasant) pictures elicit a larger late positive potential (LPP) than neutral
pictures in a window from 400 to 800 ms after picture onset. In addition, an early ERP
component has been reported to vary with emotional arousal in a window from about 150
to 300 ms, with affective, compared to neutral, stimuli prompting significantly less
positivity over occipito-temporal sites. Similar early and late ERP components have been
found in explicit categorization tasks, suggesting that selective attention to target features
results in similar cortical changes. Several studies have shown that the affective
modulation of the LPP persists even when the same pictures are repeated several times,
when they are presented as distractors, or when participants are engaged in a competing
task. These results indicate that categorization of affective stimuli is an obligatory
process. On the other hand, perceptual factors (e.g., stimulus size) seem to affect the
early ERP component but not the affective modulation of the LPP. Although both early and
late ERP components vary with stimulus relevance, they are differentially affected by
stimulus and task manipulations and thus appear to index different facets of picture
processing.
Pourtois, G. & Vuilleumier, P. (2006). Dynamics of emotional effects on spatial
attention in the human visual cortex. Prog. Brain Res., 156, 67-91.
Notes: Neurology & Imaging of Cognition, Clinic of Neurology, University Hospital &
Department of Neurosciences, University Medical Center, University of Geneva,
Switzerland. gilles.pourtois@medecine.unige.ch
An efficient detection of threat is crucial for survival and requires an appropriate
allocation of attentional resources toward the location of potential danger. Recent
neuroimaging studies have begun to uncover the brain machinery underlying the
reflexive prioritization of spatial attention to locations of threat-related stimuli. Here, we
review functional brain imaging experiments using event-related potentials (ERPs) and
functional magnetic resonance imaging (fMRI) in a dot-probe paradigm with emotional
face cues, in which we investigated the spatio-temporal dynamics of attentional orienting
to a visual target when the latter is preceded by either a fearful or happy face, at the same
(valid) location or at a different (invalid) location in visual periphery. ERP results
indicate that fearful faces can bias spatial attention toward the threat-related location, and
enhance the amplitude of the early exogenous visual P1 activity generated within the
extrastriate cortex in response to a target following a valid rather than invalid fearful
face. Furthermore, this gain control mechanism in extrastriate cortex (at 130-150 ms) is
preceded by an earlier modulation of activity in posterior parietal regions (at 40-80 ms)
that may provide a critical source of top-down signals on visual cortex. Happy faces
produced no modulation of ERPs in extrastriate and parietal cortex. fMRI data also show
increased responses in the occipital visual cortex for valid relative to invalid targets
following fearful faces, but in addition reveal significant decreases in intraparietal cortex
and increases in orbitofrontal cortex when targets are preceded by an invalid fearful face,
suggesting that negative emotional stimuli may not only draw but also hold spatial
attention more strongly than neutral or positive stimuli. These data confirm that threat
may act as a powerful exogenous cue and trigger reflexive shifts in spatial attention
toward its location, through a rapid temporal sequence of neural events in parietal and
temporo-occipital areas, with dissociable neural substrates for engagement benefits in
attention affecting activity in extrastriate occipital areas and increased disengagement
costs affecting intraparietal cortex. These brain-imaging results reveal how emotional
signals related to threat can play an important role in modulating spatial attention to
afford flexible perception and action.
Sabatinelli, D., Lang, P. J., Bradley, M. M., & Flaisch, T. (2006). The neural basis
of narrative imagery: emotion and action. Prog. Brain Res., 156, 93-103.
Notes: NIMH Center for the Study of Emotion and Attention, University of Florida, PO
Box 100165 HSC, Gainesville, FL 32608, USA. sabat@ufl.edu
It has been proposed that narrative emotional imagery activates an associative network of
stimulus, semantic, and response (procedural) information. In previous research,
predicted response components have been demonstrated through psychophysiological
methods in the peripheral nervous system. Here we investigate central nervous system
concomitants of pleasant, neutral, and unpleasant narrative imagery with functional
magnetic resonance imaging. Subjects were presented with brief narrative scripts over
headphones, and then imagined themselves engaged in the described events. During
script perception, auditory association cortex showed enhanced activation for affectively
arousing (pleasant and unpleasant) scripts, relative to neutral scripts. Structures
involved in language processing (left middle frontal gyrus) and spatial navigation
(retrosplenium) were also active during script presentation. At the onset of narrative
imagery, supplementary motor area, lateral cerebellum, and left inferior frontal gyrus
became active, showing enhanced signal change during affectively arousing (pleasant and
unpleasant), relative to neutral scripts. These data are consistent with a bioinformational
model of emotion that considers response mobilization as the measurable output of
narrative imagery.
Wiens, S. (2006). Subliminal emotion perception in brain imaging: findings,
issues, and recommendations. Prog. Brain Res., 156, 105-121.
Notes: Department of Psychology, Stockholm University, Frescati Hagvag, 106 91
Stockholm, Sweden. sws@psychology.su.se
Many theories of emotion propose that emotional input is processed preferentially due to
its relevance for the organism. Further, because consciousness has limited capacity, these
considerations imply that emotional input ought to be processed even if participants are
perceptually unaware of the input (subliminal perception). Although brain imaging
studies have examined the effects of unattended, suppressed (in binocular rivalry), and
visually masked
emotional pictures, conclusions regarding subliminal perception have been mixed. The
reason is that subliminal perception demands a concept of an awareness threshold or
limen, but there is no agreement on how to define and measure this threshold. Although
different threshold concepts can be identified in psychophysics (signal detection theory),
none maps directly onto perceptual awareness. Whereas it may be tempting to equate
unawareness with the complete absence of objective discrimination ability (d'=0), this
approach is incompatible with lessons from blindsight and denies the subjective nature of
consciousness. This review argues that perceptual awareness is better viewed as a
continuum of sensory states than as a binary state. When levels of awareness are
characterized carefully in terms of objective discrimination and subjective experience,
findings can be informative regarding the relative independence of effects from
awareness and the potentially moderating role of awareness in processing emotional
input. Thus, because the issue of a threshold concept may never be resolved completely,
the emphasis should be not on proving subliminal perception but on comparing effects at
various levels of awareness.
Junghofer, M., Peyk, P., Flaisch, T., & Schupp, H. T. (2006). Neuroimaging
methods in affective neuroscience: selected methodological issues. Prog. Brain Res.,
156, 123-143.
Notes: Institute for Biosignalanalysis and Biomagnetism, University of Munster,
Munster, Germany. markus.junghofer@uni-muenster.de
A current goal of affective neuroscience is to reveal the relationship between emotion
and dynamic brain activity in specific neural circuits. In humans, noninvasive
neuroimaging measures are of primary interest in this endeavor. However,
methodological issues, unique to each neuroimaging method, have important
implications for the design of studies, interpretation of findings, and comparison across
studies. With regard to event-related brain potentials, we discuss the need for dense
sensor arrays to achieve reference-independent characterization of field potentials and
improved estimates of cortical brain sources. Furthermore, limitations and caveats
regarding sparse sensor sampling are discussed. With regard to event-related magnetic
field (ERF) recordings, we outline a method to achieve magnetoencephalography (MEG)
sensor standardization, which improves effect sizes in typical neuroscientific
investigations, avoids ghost effects, and facilitates comparison of MEG
waveforms across studies. Focusing on functional magnetic resonance imaging (fMRI),
we question the unjustified application of proportional global signal scaling in emotion
research, which can greatly distort statistical findings in key structures implicated in
emotional processing and may contribute to conflicting results in affective
neuroscience fMRI studies, in particular with respect to limbic and paralimbic structures.
Finally, a distributed EEG/MEG source analysis with statistical parametric mapping is
outlined providing a common software platform for hemodynamic and electromagnetic
neuroimaging measures. Taken together, to achieve consistent and replicable patterns of
the relationship between emotion and neuroimaging measures, methodological aspects
associated with the various neuroimaging techniques may be as important as the
definition of emotional cues and the task context used to study emotion.
Kissler, J., Assadollahi, R., & Herbert, C. (2006). Emotional and semantic
networks in visual word processing: insights from ERP studies. Prog. Brain Res., 156,
147-183.
Notes: Department of Psychology, University of Konstanz, P. O. Box D25, D-78457
Konstanz, Germany. johanna.kissler@uni-konstanz.de
The event-related brain potential (ERP) literature concerning the impact of emotional
content on visual word processing is reviewed and related to general knowledge on
semantics in word processing: emotional connotation can enhance cortical responses at
all stages of visual word processing following the assembly of the visual word form (up to
200 ms), such as semantic access (around 200 ms), allocation of attentional resources
(around 300 ms), contextual analysis (around 400 ms), and sustained processing and
memory encoding (around 500 ms). Even earlier effects have occasionally been reported
with subliminal or perceptual threshold presentation, particularly in clinical populations.
Here, the underlying mechanisms are likely to diverge from the ones operational in
standard natural reading. The variability in timing of the effects can be accounted for by
dynamically changing lexical representations that can be activated as required by the
subjects' motivational state, the task at hand, and additional contextual factors.
Throughout, subcortical structures such as the amygdala are likely to contribute to these
enhancements. Further research will establish whether or when emotional arousal,
valence, or additional emotional properties drive the observed effects and how
experimental factors interact with these. Meticulous control of other word properties
known to affect ERPs in visual word processing, such as word class, length, frequency,
and concreteness, as well as the use of more standardized EEG procedures, is vital. Mapping the
interplay between cortical and subcortical mechanisms that give rise to amplified cortical
responses to emotional words will be of the highest priority for future research.
Fischler, I. & Bradley, M. (2006). Event-related potential studies of language and
emotion: words, phrases, and task effects. Prog. Brain Res., 156, 185-203.
Notes: Psychology Department, PO Box 112250, University of Florida, Gainesville, FL
32611, USA. ifisch@ufl.edu
This chapter reviews research that focuses on the effects of emotionality of single words,
and of simple phrases, on event-related brain potentials when these are presented visually
in various tasks. In these studies, presentation of emotionally evocative language material
has consistently elicited a late (c. 300-600 ms post-onset) positive-going, largely frontal-central shift in the event-related potentials (ERPs), relative to neutral materials. Overall,
affectively pleasant and unpleasant words or phrases are quite similar in their
neuroelectric profiles and rarely differ substantively. This emotionality effect is enhanced
in both amplitude and latency when emotional content is task relevant, but is also reliably
observed in other semantically engaging tasks. On the other hand, it
can be attenuated or eliminated when the task does not involve semantic evaluation (e.g.,
lexical decisions to words or orthographic judgments to the spelling patterns) or when
comprehension of phrases requires integration of the connotative meaning of several
words (e.g., compare dead puppy and dead tyrant). Taken together, these studies suggest
that the emotionality of written language has a rapid and robust impact on ERPs, which
can be modulated by specific task demands as well as the linguistic context in which the
affective stimulus occurs.
Jackson, M. A. & Crosson, B. (2006). Emotional connotation of words: role of
emotion in distributed semantic systems. Prog. Brain Res., 156, 205-216.
Notes: Nemours Children's Clinic, Neurology Division, 807 Children's Way,
Jacksonville, FL 32207, USA. acato@nemours.org
One current doctrine regarding lexical-semantic functions asserts separate input and
output lexicons with access to a central semantic core. In other words, processes related
to word form have separate representations for input (comprehension) vs. output
(expression), while processes related to meaning are not split along the input-output
dimension. Recent evidence from our laboratory suggests that semantic processes related
to emotional connotation may be an exception to this rule. The ability to distinguish
among different emotional connotations may be linked distinctly both to attention
systems that select specific sensory input for further processing and to intention systems
that select specific actions for output. In particular, the neuroanatomic substrates for
emotional connotation on the input side of the equation appear to differ from the
substrates on the output side of the equation. Implications for semantic processing of
emotional connotation and its relationship to attention and motivation systems are discussed.
Keil, A. (2006). Macroscopic brain dynamics during verbal and pictorial
processing of affective stimuli. Prog. Brain Res., 156, 217-232.
Notes: Department of Psychology, University of Konstanz, PO Box D23, D-78457
Konstanz, Germany. Andreas.Keil@uni-konstanz.de
Emotions can be viewed as action dispositions, preparing an individual to act efficiently
and successfully in situations of behavioral relevance. To initiate optimized behavior, it
is essential to accurately process the perceptual elements indicative of emotional
relevance. The present chapter discusses effects of affective content on neural and
behavioral parameters of perception, across different information channels.
Electrocortical data are presented from studies examining affective perception with
pictures and words in different task contexts. As a main result, these data suggest that
sensory facilitation has an important role in affective processing. Affective pictures
appear to facilitate perception as a function of emotional arousal at multiple levels of
visual analysis. If the discrimination between affectively arousing and nonarousing
content relies on fine-grained differences, amplification of the cortical representation
may occur as early as 60-90 ms after stimulus onset. Affectively arousing information as
conveyed via visual verbal channels was not subject to such very early enhancement.
However, electrocortical indices of lexical access and/or activation of semantic networks
showed that affectively arousing content may enhance the formation of semantic
representations during word encoding. It can be concluded that affective arousal is
associated with activation of widespread networks, which act to optimize sensory
processing. On the basis of prioritized sensory analysis for affectively relevant stimuli,
subsequent steps such as working memory, motor preparation, and action may be
adjusted to meet the adaptive requirements of the situation perceived.
Grandjean, D., Banziger, T., & Scherer, K. R. (2006). Intonation as an interface
between language and affect. Prog. Brain Res., 156, 235-247.
Notes: Swiss Center for Affective Sciences, University of Geneva, 7 rue des Battoirs,
1205 Geneva, Switzerland. Didier.Grandjean@pse.unige.ch
The vocal expression of human emotions is embedded within language and the study of
intonation has to take into account two interacting levels of information--emotional and
semantic meaning. In addition to the discussion of this dual coding system, an extension
of Brunswik's lens model is proposed. This model includes the influences of conventions,
norms, and display rules (pull effects) and psychobiological mechanisms (push effects)
on emotional vocalizations produced by the speaker (encoding) and the reciprocal
influences of these two aspects on attributions made by the listener (decoding), allowing
the dissociation and systematic study of the production and perception of intonation.
Three empirical studies are described as examples of how these different phenomena can
be dissociated at the behavioral and neurological levels in the study of intonation.
Wildgruber, D., Ackermann, H., Kreifelts, B., & Ethofer, T. (2006). Cerebral
processing of linguistic and emotional prosody: fMRI studies. Prog. Brain Res., 156, 249-268.
Notes: Department of Psychiatry, University of Tubingen, Osianderstr. 24, 72076
Tubingen, Germany. dirk.wildgruber@med.uni-tuebingen.de
During acoustic communication in humans, information about a speaker's emotional state
is predominantly conveyed by modulation of the tone of voice (emotional or affective
prosody). Based on lesion data, a right hemisphere superiority for cerebral processing of
emotional prosody has been assumed. However, the available clinical studies do not yet
provide a coherent picture with respect to interhemispheric lateralization effects of
prosody recognition and intrahemispheric localization of the respective brain regions. To
further delineate the cerebral network engaged in the perception of emotional tone, a
series of experiments was carried out based upon functional magnetic resonance imaging
(fMRI). The findings obtained from these investigations allow for the separation of three
successive processing stages during recognition of emotional prosody: (1) extraction of
suprasegmental acoustic information predominantly subserved by right-sided primary
and higher order acoustic regions; (2) representation of meaningful suprasegmental
acoustic sequences within posterior aspects of the right superior temporal sulcus; (3)
explicit evaluation of emotional prosody at the level of the bilateral inferior frontal
cortex. Moreover, implicit processing of affective intonation seems to be bound to
subcortical regions mediating automatic induction of specific emotional reactions such as
activation of the amygdala in response to fearful stimuli. As concerns lower level
processing of the underlying suprasegmental acoustic cues, linguistic and emotional
prosody seem to share the same right hemisphere neural resources. Explicit judgment of
linguistic aspects of speech prosody, however, appears to be linked to left-sided language
areas, whereas bilateral orbitofrontal cortex has been found to be involved in explicit evaluation
of emotional prosody. These differences in hemispheric lateralization effects might
explain why specific impairments in nonverbal emotional communication subsequent to
focal brain lesions are relatively rare clinical observations compared with the more
frequent aphasic disorders.
Pihan, H. (2006). Affective and linguistic processing of speech prosody: DC
potential studies. Prog. Brain Res., 156, 269-284.
Notes: Department of Neurology, Schulthess Klinik, 8008 Zurich, and Department of
Neurology, Inselspital, University of Bern, 3010 Bern, Switzerland. hans_pihan@freesurf.ch
Speech melody or prosody subserves linguistic, emotional, and pragmatic functions in
speech communication. Prosodic perception is based on the decoding of acoustic cues
with a predominant function of frequency-related information perceived as speaker's
pitch. Evaluation of prosodic meaning is a cognitive function implemented in cortical and
subcortical networks that generate continuously updated affective or linguistic speaker
impressions. Various brain-imaging methods allow delineation of neural structures
involved in prosody processing. In contrast to functional magnetic resonance imaging
techniques, DC (direct current, slow) components of the EEG directly measure cortical
activation without temporal delay. Activation patterns obtained with this method are
highly task specific and intraindividually reproducible. The studies presented here
investigated the topography of prosodic stimulus processing as a function of acoustic
stimulus structure and of linguistic or affective task demands, respectively. Data obtained
from measuring DC potentials demonstrated that the right hemisphere has a predominant
role in processing emotions from the tone of voice, irrespective of emotional valence.
However, right hemisphere involvement is modulated by diverse speech- and language-related conditions that are associated with left hemisphere participation in prosody
processing. The degree of left hemisphere involvement depends on several factors such
as (i) articulatory demands on the perceiver of prosody (possibly, also the poser), (ii) a
relative left hemisphere specialization in processing temporal cues mediating prosodic
meaning, and (iii) the propensity of prosody to act on the segment level in order to
modulate word or sentence meaning. The specific role of top-down effects in terms of
either linguistically or affectively oriented attention on lateralization of stimulus
processing is not clear and requires further investigation.
Kotz, S. A., Meyer, M., & Paulmann, S. (2006). Lateralization of emotional
prosody in the brain: an overview and synopsis on the impact of study design. Prog.
Brain Res., 156, 285-294.
Notes: Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a,
04103 Leipzig, Germany. kotz@cbs.mpg.de
Recently, research on the lateralization of linguistic and nonlinguistic (emotional)
prosody has experienced a revival. However, neither neuroimaging nor patient evidence
draws a coherent picture substantiating right-hemispheric lateralization of prosody, and
of emotional prosody in particular. The current overview summarizes positions and data on
the lateralization of emotion and emotional prosodic processing in the brain and proposes
that: (1) the realization of emotional prosodic processing in the brain is based on
differentially lateralized subprocesses and (2) methodological factors can influence the
lateralization of emotional prosody in neuroimaging investigations. The latter evidence
reveals that emotional valence effects are strongly right lateralized in studies using
compact blocked presentation of emotional stimuli. In contrast, data obtained from event-related studies are indicative of bilateral or left-accented lateralization of emotional
prosodic valence. These findings suggest a strong interaction between language and
emotional prosodic processing.
Dietrich, S., Ackermann, H., Szameitat, D. P., & Alter, K. (2006). Psychoacoustic
studies on the processing of vocal interjections: how to disentangle lexical and prosodic
information? Prog. Brain Res., 156, 295-302.
Notes: Max-Planck-Institute for Human Cognitive and Brain Sciences, Stephanstrasse 1a,
04103 Leipzig, Germany. dietrich@cbs.mpg.de
Both intonation (affective prosody) and lexical meaning of verbal utterances participate
in the vocal expression of a speaker's emotional state, an important aspect of human
communication. However, it is still a matter of debate how the information of these two
'channels' is integrated during speech perception. In order to further analyze the impact of
affective prosody on lexical access, so-called interjections, i.e., short verbal emotional
utterances, were investigated. The results of a series of psychoacoustic studies indicate
that the processing of emotional interjections is mediated by a divided cognitive
mechanism encompassing both lexical access and the encoding of prosodic data.
Emotional interjections could be separated into elements with high- or low-lexical
content. As concerns the former items, both prosodic and propositional cues have a
significant influence upon recognition rates, whereas the processing of the low-lexical
cognates depends almost solely upon prosodic information. Incongruencies between
lexical and prosodic data structures compromise stimulus identification. Thus, the
analysis of utterances characterized by a dissociation of the prosodic and lexical
dimension revealed prosody to exert a stronger impact upon listeners' judgments than
lexicality. Taken together, these findings indicate that both propositional and prosodic
speech components closely interact during speech perception.
Pell, M. D. (2006). Judging emotion and attitudes from prosody following brain
damage. Prog. Brain Res., 156, 303-317.
Notes: School of Communication Sciences and Disorders, McGill University, 1266 Ave.
des Pins Ouest, Montreal, QC, H3G 1A8, Canada. marc.pell@mcgill.ca
Research has long indicated a role for the right hemisphere in the decoding of basic
emotions from speech prosody, although there are few data on how the right hemisphere
is implicated in processes for understanding the emotive "attitudes" of a speaker from
prosody. We describe recent clinical studies that compared how well listeners with and
without focal right hemisphere damage (RHD) understand speaker attitudes such as
"confidence" or "politeness," which are signaled in large part by prosodic features of an
utterance. We found that RHD listeners as a group were abnormally insensitive to both the
expressed confidence and the expressed politeness of speakers, and that these difficulties
often correlated with impairments for understanding basic emotions from prosody in
many RHD individuals. Our data emphasize a central role for the right hemisphere in the
ability to appreciate emotions and speaker attitudes from prosody, although the precise
source of these social-pragmatic deficits may arise in different ways in the context of
right hemisphere compromise.
Schwaninger, A., Wallraven, C., Cunningham, D. W., & Chiller-Glaus, S. D.
(2006). Processing of facial identity and expression: a psychophysical, physiological, and
computational perspective. Prog. Brain Res., 156, 321-343.
Notes: Department Bulthoff, Max Planck Institute for Biological Cybernetics,
Spemannstr. 38, 72076 Tubingen, Germany. adrian.schwaninger@tuebingen.mpg.de
A deeper understanding of how the brain processes visual information can be obtained by
comparing results from complementary fields such as psychophysics, physiology, and
computer science. In this chapter, empirical findings are reviewed with regard to the
proposed mechanisms and representations for processing identity and emotion in faces.
Results from psychophysics clearly show that faces are processed by analyzing
component information (eyes, nose, mouth, etc.) and their spatial relationship (configural
information). Results from neuroscience indicate separate neural systems for recognition
of identity and facial expression. Computer science offers a deeper understanding of the
required algorithms and representations, and provides computational modeling of
psychological and physiological accounts. An interdisciplinary approach taking these
different perspectives into account provides a promising basis for better understanding
and modeling of how the human brain processes visual information for recognition of
identity and emotion in faces.
Ethofer, T., Pourtois, G., & Wildgruber, D. (2006). Investigating audiovisual
integration of emotional signals in the human brain. Prog.Brain Res., 156, 345-361.
Notes: Section of Experimental MR of the CNS, Department of Neuroradiology, Otfried-Müller-Str. 51, University of Tübingen, 72076 Tübingen, Germany.
thomas.ethofer@med.uni-tuebingen.de
Humans can communicate their emotional state via facial expression and affective
prosody. This chapter reviews behavioural, neuroanatomical, electrophysiological and
neuroimaging studies pertaining to audiovisual integration of emotional communicative
signals. Particular emphasis will be given to neuroimaging studies using positron
emission tomography (PET) or functional magnetic resonance imaging (fMRI).
Conjunction analyses, interaction analyses, correlation analyses between haemodynamic
responses and behavioural effects, and connectivity analyses have been employed to
analyse neuroimaging data. There is no general agreement as to which of these
approaches can be considered "optimal" to classify brain regions as multisensory. We
argue that these approaches provide complementary information as they assess different
aspects of multisensory integration of emotional information. Assets and drawbacks of
the different analysis types are discussed and demonstrated on the basis of one fMRI data
set.
Adolphs, R. & Spezio, M. (2006). Role of the amygdala in processing visual
social stimuli. Prog.Brain Res., 156, 363-378.
Notes: Division of the Humanities and Social Sciences, HSS 228-77, California Institute
of Technology, Pasadena, CA 91125, USA. radolphs@hss.caltech.edu
We review the evidence implicating the amygdala as a critical component of a neural
network of social cognition, drawing especially on research involving the processing of
faces and other visual social stimuli. We argue that, although it is clear that social
behavioral representations are not stored in the amygdala, the most parsimonious
interpretation of the data is that the amygdala plays a role in guiding social behaviors on
the basis of socioenvironmental context. Thus, it appears to be required for normal social
cognition. We propose that the amygdala plays this role by attentionally modulating
several areas of visual and somatosensory cortex that have been implicated in social
cognition, and in helping to direct overt visuospatial attention in face gaze. We also
hypothesize that the amygdala exerts attentional modulation of simulation in
somatosensory cortices such as supramarginal gyrus and insula. Finally, we argue that
the term emotion be broadened to include increased attention to bodily responses and
their representation in cortex.
Keysers, C. & Gazzola, V. (2006). Towards a unifying neural theory of social
cognition. Prog.Brain Res., 156, 379-401.
Notes: BCN Neuro-Imaging-Centre, University Medical Center Groningen, University of
Groningen, A. Deusinglaan 2, 9713AW Groningen, The Netherlands. c.keysers@rug.nl
Humans can effortlessly understand a lot of what is going on in other people's minds.
Understanding the neural basis of this capacity has proven quite difficult. Since the
discovery of mirror neurons, a number of successful experiments have approached the
question of how we understand the actions of others from the perspective of sharing their
actions. Recently we have demonstrated that a similar logic may apply to understanding
the emotions and sensations of others. Here, we therefore review evidence that a single
mechanism (shared circuits) applies to actions, sensations and emotions: witnessing the
actions, sensations and emotions of other individuals activates brain areas normally
involved in performing the same actions and feeling the same sensations and emotions.
We propose that these circuits, shared between the first (I do, I feel) and third person
perspective (seeing her do, seeing her feel) translate the vision and sound of what other
people do and feel into the language of the observer's own actions and feelings. This
translation could help understand the actions and feelings of others by providing intuitive
insights into their inner life. We propose a mechanism for the development of shared
circuits on the basis of Hebbian learning, and underline that shared circuits could
integrate with more cognitive functions during social cognition.
Chakrabarti, B. & Baron-Cohen, S. (2006). Empathizing: neurocognitive
developmental mechanisms and individual differences. Prog.Brain Res., 156, 403-417.
Notes: Autism Research Centre, University of Cambridge, Psychiatry Department,
Douglas House, 18B Trumpington Rd, Cambridge CB2 2AH, UK. bc249@cam.ac.uk
This chapter reviews the Mindreading System model encompassing four neurocognitive
mechanisms (ID, EDD, SAM, and ToMM) before reviewing the revised empathizing
model encompassing two new neurocognitive mechanisms (TED and TESS). It is argued
that the empathizing model is more comprehensive because it entails perception,
interpretation, and affective responses to other agents. Sex differences in empathy
(female advantage) are then reviewed, as a clear example of individual differences in
empathy. This leads into an illustration of individual differences using the Empathy
Quotient (EQ). Finally, the neuroimaging literature in relation to each of the
neurocognitive mechanisms is briefly summarized and a new study is described that tests
whether different brain regions respond to the perception of different facial expressions of
emotion, as a function of the observer's EQ.
Leiberg, S. & Anders, S. (2006). The multiple facets of empathy: a survey of
theory and evidence. Prog.Brain Res., 156, 419-440.
Notes: Institute of Medical Psychology and Behavioral Neurobiology, University of
Tübingen, Tübingen, Germany. sleiberg@ukaachen.de
Empathy is the ability to perceive and understand other people's emotions and to react
appropriately. This ability is a necessary prerequisite for successful interpersonal
interaction. Empathy is a multifaceted construct including low-level mechanisms like
emotional contagion as well as high-level processes like perspective-taking. The ability
to empathize varies between individuals and is considered a stable personality trait: some
people are generally more successful in empathizing than others. In this chapter we will
first present different conceptualizations of the construct of empathy, and refer to
empathy-regulating processes as well as to the relationship between empathy and social
behavior. Then, we will review peripheral physiological and brain imaging studies
pertaining to low- and high-level empathic processes, empathy-modulating processes,
and the link between empathy and social behavior. Further, we will present evidence
regarding interindividual differences in these processes as an important source of
information for solving the conundrum of how the comprehension of others' emotions is
achieved by our brains.
Hennenlotter, A. & Schroeder, U. (2006). Partly dissociable neural substrates for
recognizing basic emotions: a critical review. Prog.Brain Res., 156, 443-456.
Notes: Department of Neuropsychology, Max Planck Institute for Human Cognitive and
Brain Sciences, Stephanstrasse 1A, D-04103 Leipzig, Germany. hennen@cbs.mpg.de
Facial expressions are powerful non-verbal displays of emotion which signal valence
information to others and constitute an important communicative element in social
interaction. Six basic emotional expressions (fear, disgust, anger, surprise, happiness, and
sadness) have been shown to be universal in their performance and perception. Recently,
a growing number of clinical and functional imaging studies have aimed at identifying
partly dissociable neural subsystems for recognizing basic emotions. Convincing results
have been obtained for fearful and disgusted facial expressions only. Empirical evidence
for a specialized neural representation of anger, surprise, sadness, or happiness is more
limited, primarily due to lack of clinical cases with selective impairments in recognizing
these emotions. In functional imaging research, the detection of dissociable neural
responses requires direct comparisons of signal changes associated with the perception of
different emotions, which are often not provided. Only recently has evidence been
obtained that the recruitment of emotion-specific neural subsystems may be closely
linked to characteristic facial features of single expressions such as the eye region for
fearful faces. Investigations into the neural systems underlying the processing of such
diagnostic cues for each of the six basic emotions may be helpful to further elucidate
their neural representation.
Sommer, M., Hajak, G., Döhnel, K., Schwerdtner, J., Meinhardt, J., & Müller, J.
L. (2006). Integration of emotion and cognition in patients with psychopathy. Prog.Brain
Res., 156, 457-466.
Notes: Department of Psychiatry, Psychotherapy and Psychosomatics, University of
Regensburg, Universitätsstrasse 84, D-93053 Regensburg, Germany. monika.sommer@medbo.de
Psychopathy is a personality disorder associated with emotional characteristics like
impulsivity, manipulativeness, affective shallowness, and absence of remorse or
empathy. The impaired emotional responsiveness is considered to be the hallmark of the
disorder. There are two theories that attempt to explain the emotional dysfunction and the
poor socialization in psychopathy: (1) the low-fear model and (2) the inhibition of
violence model. Both approaches are supported by several studies. Studies using aversive
conditioning or startle modulation underline the severe difficulties in processing
negative stimuli in psychopaths. Studies that explore the processing of emotional
expressions show that psychopathic individuals are impaired in processing sad or fearful facial
expressions or vocal affect. In the cognitive domain, psychopaths show performance
deficits in the interpretation of the motivational significance of stimuli. Studies
investigating the impact of emotions on cognitive processes show that in psychopaths, in
contrast to healthy controls, negative emotions drain no resources from a cognitive task. It
is suggested that dysfunctions in the frontal cortex, especially the orbitofrontal cortex, the
cingulate cortex, and the amygdala are associated with the emotional and cognitive impairments.
Kucharska-Pietura, K. (2006). Disordered emotional processing in schizophrenia
and one-sided brain damage. Prog.Brain Res., 156, 467-479.
Notes: Whitchurch Hospital, Cardiff and Vale NHS Trust, Cardiff CF14 7XB, UK. katepietura@hotmail.com
The work concentrates on the problem of human emotions in healthy and pathologically
changed brains, mainly in persons afflicted with schizophrenia or with organic
impairments localized in one of the cerebral hemispheres. This chapter presents the state
of current knowledge concerning the hemispheric lateralization of emotions among
healthy people, psychiatric patients, and patients with one-sided brain lesion, on the basis
of clinical observations, the results of experimental work, and the newest neuroimaging
techniques. The numerous experiments and scientific methods used to assess the
hemispheric lateralization of emotions and the discrepancies in their results point toward
a lack of consistent theory in the field of hemispheric specialization in the regulation of
emotional processes. Particular scientific interest was taken in the emotions of persons
afflicted with schizophrenia, either in its early or late stages. This was inspired by the
emotional behavior of schizophrenic patients on a psychiatric ward and their ability to
perceive and express emotions during various stages of the schizophrenic process. In
order to examine the cerebral manifestations of emotional deficits and the specialization
of cerebral hemispheres for emotional processes, the author has described the emotional
behavior of patients with unilateral cerebral stroke, i.e., patients with damage to the right
or left cerebral hemisphere. Overall, the inferior performance of emotional tasks by right-hemisphere-damaged patients compared to other groups might support right-hemisphere
superiority for affect perception despite variations in the stimuli used.
Ende, G., Demirakca, T., & Tost, H. (2006). The biochemistry of dysfunctional
emotions: proton MR spectroscopic findings in major depressive disorder. Prog.Brain
Res., 156, 481-501.
Notes: NMR Research in Psychiatry, Central Institute of Mental Health, J5, 68159
Mannheim, Germany. ende@zi-mannheim.de
Key neural systems involved in the processing and communication of emotions are
impaired in patients with major depressive disorder (MDD). Emotional and behavioral
symptoms are thought to be caused by damage or dysfunction in specific areas of the
brain that are responsible for directing attention, motivating behavior, and learning the
significance of environmental stimuli. Functional brain studies with positron emission
tomography (PET) and functional magnetic resonance imaging (fMRI) give support for
functional abnormalities in MDD that are predominantly located in areas known to play
an important role in the communication and processing of emotions. Disturbances in
emotional processing as observed in MDD have, if any, only very subtle
morphometric brain correlates. With proton magnetic resonance spectroscopy (1H
MRS), brain metabolites can be measured noninvasively in vivo, thus furthering the
understanding of the effects of changes in neurotransmitters within the brain. The current
literature on 1H MRS studies in MDD is small, with a large diversity of MRS methods
applied, brain regions studied, and metabolite changes found. Nevertheless, there is
strong evidence that changes in neurometabolite concentrations in MDD occur within
brain regions involved in the processing and communication of emotions, and that these
changes can be monitored by 1H MRS. This review summarizes the literature about biochemical
changes quantified via 1H MRS in MDD patients in brain regions that play an important
role for the communication and processing of emotions.