Doallo S, Holguin SR, Cadaveira F.
Neuroreport, 17, 1797-1801
One open question on the relation between attention and emotion concerns the automatic processing of emotional visual stimuli outside the focus of attention. This study examined to what extent emotional processing at unattended locations is modulated by the processing load at attended locations. Event-related potentials were measured to task-irrelevant unpleasant and neutral pictures briefly presented at peripheral locations while participants performed a visual central task varying in load (low and high load). Unpleasant pictures elicited larger N1-P2 amplitudes at parieto-occipital and occipital sites than neutral pictures. This effect was only significant in the low-load condition. Data suggest that brain responses to the affective value of task-irrelevant peripheral pictures are modulated by attentional load at fixation.
This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Friday, December 15, 2006
ARTICLE UPDATE - Sequence of information processing for emotions based on the anatomic dialogue between prefrontal cortex and amygdala
H.T. Ghashghaei, C.C. Hilgetag, and H. Barbas
NeuroImage, in press
The prefrontal cortex and the amygdala have synergistic roles in regulating purposive behavior, effected through bidirectional pathways. Here we investigated the largely unknown extent and laminar relationship of prefrontal input–output zones linked with the amygdala using neural tracers injected in the amygdala in rhesus monkeys. Prefrontal areas varied vastly in their connections with the amygdala, with the densest connections found in posterior orbitofrontal and posterior medial cortices, and the sparsest in anterior lateral prefrontal areas, especially area 10. Prefrontal projection neurons directed to the amygdala originated in layer 5, but significant numbers were also found in layers 2 and 3 in posterior medial and orbitofrontal cortices. Amygdalar axonal terminations in prefrontal cortex were most frequently distributed in bilaminar bands in the superficial and deep layers, by columns spanning the entire cortical depth, and less frequently as small patches centered in the superficial or deep layers. Heavy terminations in layers 1–2 overlapped with calbindin-positive inhibitory neurons. A comparison of the relationship of input to output projections revealed that among the most heavily connected cortices, cingulate areas 25 and 24 issued comparatively more projections to the amygdala than they received, whereas caudal orbitofrontal areas were more receivers than senders. Further, there was a significant relationship between the proportion of ‘feedforward’ cortical projections from layers 2–3 to ‘feedback’ terminations innervating the superficial layers of prefrontal cortices. These findings indicate that the connections between prefrontal cortices and the amygdala follow similar patterns as corticocortical connections, and by analogy suggest pathways underlying the sequence of information processing for emotions.
ARTICLE UPDATE - The impact of processing load on emotion.
D.G.V. Mitchell, M. Nakic, D. Fridberg, N. Kamel, D.S. Pine and R.J.R. Blair
NeuroImage, in press
This event-related fMRI study examined the impact of processing load on the BOLD response to emotional expressions. Participants were presented with composite stimuli consisting of neutral and fearful faces upon which semi-transparent words were superimposed. This manipulation held stimulus-driven features constant across multiple levels of processing load. Participants made either (1) gender discriminations based on the face; (2) case judgments based on the words; or (3) syllable number judgments based on the words. A significant main effect for processing load was revealed in prefrontal cortex, parietal cortex, visual processing areas, and amygdala. Critically, enhanced activity in the amygdala and medial prefrontal cortex seen during gender discriminations was significantly reduced during the linguistic task conditions. A connectivity analysis conducted to investigate theories of cognitive modulation of emotion showed that activity in dorsolateral prefrontal cortex was inversely related to activity in the ventromedial prefrontal cortex. Together, the data suggest that the processing of task-irrelevant emotional information, like neutral information, is subject to the effects of processing load and is under top-down control.
Monday, December 04, 2006
ARTICLE UPDATE - Emotional memories are not all created equal: Evidence for selective memory enhancement
Adam K. Anderson, Yuki Yamaguchi, Wojtek Grabski, and Dominika Lacka
LEARNING & MEMORY, 13, 711-718
Human brain imaging studies have shown that greater amygdala activation to emotional relative to neutral events leads to enhanced episodic memory. Other studies have shown that fearful faces also elicit greater amygdala activation relative to neutral faces. To the extent that amygdala recruitment is sufficient to enhance recollection, these separate lines of evidence predict that recognition memory should be greater for fearful relative to neutral faces. Experiment 1 demonstrated enhanced memory for emotionally negative relative to neutral scenes; however, fearful faces were not subject to enhanced recognition across a variety of delays (15 min to 2 wk). Experiment 2 demonstrated that enhanced delayed recognition for emotional scenes was associated with increased sympathetic autonomic arousal, indexed by the galvanic skin response, relative to fearful faces. These results suggest that while amygdala activation may be necessary, it alone is insufficient to enhance episodic memory formation. It is proposed that a sufficient level of systemic arousal is required to alter memory consolidation resulting in enhanced recollection of emotional events.
Friday, December 01, 2006
ARTICLE UPDATE - Regulation of emotional responses elicited by threat-related stimuli.
Eippert F, Veit R, Weiskopf N, Erb M, Birbaumer N, Anders S.
Human Brain Mapping, in press
The capacity to voluntarily regulate emotions is critical for mental health, especially when coping with aversive events. Several neuroimaging studies of emotion regulation found the amygdala to be a target for downregulation and prefrontal regions to be associated with downregulation. To characterize the role of prefrontal regions in bidirectional emotion regulation and to investigate regulatory influences on amygdala activity and peripheral physiological measures, a functional magnetic resonance imaging (fMRI) study with simultaneous recording of self-report, startle eyeblink, and skin conductance responses was carried out. Subjects viewed threat-related pictures and were asked to up- and downregulate their emotional responses using reappraisal strategies. While startle eyeblink responses (in successful regulators) and skin conductance responses were amplified during upregulation, but showed no consistent effect during downregulation, amygdala activity was increased and decreased according to the regulation instructions. Trial-by-trial ratings of regulation success correlated positively with activity in amygdala during upregulation and orbitofrontal cortex during downregulation. Downregulation was characterized by left-hemispheric activation peaks in anterior cingulate cortex, dorsolateral prefrontal cortex, and orbitofrontal cortex and upregulation was characterized by a pattern of prefrontal activation not restricted to the left hemisphere. Further analyses showed significant overlap of prefrontal activation across both regulation conditions, possibly reflecting cognitive processes underlying both up- and downregulation, but also showed distinct activations in each condition. The present study demonstrates that amygdala responses to threat-related stimuli can be controlled through the use of cognitive strategies depending on recruitment of prefrontal areas, thereby changing the subject's affective state.
Wednesday, November 29, 2006
ARTICLE UPDATE - An electrophysiological study on the interaction between emotional content and spatial frequency of visual stimuli
Luis Carretié, José A. Hinojosa, Sara López-Martín and Manuel Tapia
Neuropsychologia, in press
Previous studies suggest that the magnocellular pathway, a visual processing system that rapidly provides low spatial frequency information to fast-responding structures such as the amygdala, is more involved in the processing of emotional facial expressions than the parvocellular pathway (which conveys all spatial frequencies). The present experiment explored the spatio-temporal characteristics of the spatial frequency modulation of affect-related neural processing, as well as its generalizability to non-facial stimuli. To that aim, the event-related potentials (ERPs) elicited by low-pass filtered (i.e., high spatial frequencies are eliminated) and intact non-facial emotional images were recorded from 31 participants using a 60-electrode array. The earliest significant effect of spatial frequency was observed at 135 ms from stimulus onset: N135 component of the ERPs. In line with previous studies, the origin of N135 was localized at secondary visual areas for low-pass filtered stimuli and at primary areas for intact stimuli. Importantly, this component showed an interaction between spatial frequency and emotional content: within low-pass filtered pictures, negative stimuli elicited the highest N135 amplitudes. By contrast, within intact stimuli, neutral pictures were those eliciting the highest amplitudes. These results suggest that high spatial frequencies are not essential for the initial affect-related processing of visual stimuli, which would mainly rely on low spatial frequency visual information. According to present data, high spatial frequencies would come into play later on.
Friday, November 24, 2006
ARTICLE UPDATE - Is this happiness I see? Biases in the identification of emotional facial expressions in depression and social phobia.
Joormann J, Gotlib IH.
Journal of Abnormal Psychology, 115, 705-714.
The present study was designed to examine the operation of depression-specific biases in the identification or labeling of facial expression of emotions. Participants diagnosed with major depression and social phobia and control participants were presented with faces that expressed increasing degrees of emotional intensity, slowly changing from a neutral to a full-intensity happy, sad, or angry expression. The authors assessed individual differences in the intensity of facial expression of emotion that was required for the participants to accurately identify the emotion being expressed. The depressed participants required significantly greater intensity of emotion than did the social phobic and the control participants to correctly identify happy expressions and less intensity to identify sad than angry expressions. In contrast, social phobic participants needed less intensity to correctly identify the angry expressions than did the depressed and control participants and less intensity to identify angry than sad expressions. Implications of these results for interpersonal functioning in depression and social phobia are discussed.
ARTICLE UPDATE - Green love is ugly: Emotions elicited by synesthetic grapheme-color perceptions.
Callejas A, Acosta A, Lupianez J.
Brain Research, in press
Synesthetes who experience grapheme-color synesthesia often report feeling uneasy when dealing with incongruently colored graphemes although no empirical data is available to confirm this phenomenon. We studied this affective reaction related to synesthetic perceptions by means of an evaluation task. We found that the perception of an incorrectly colored word affects the judgments of emotional valence. Furthermore, this effect competed with the word's emotional valence in a categorization task thus supporting the automatic nature of this synesthetically elicited affective reaction. When manipulating word valence and word color-photism congruence, we found that responses were slower (and less accurate) for inconsistent conditions than for consistent conditions. Inconsistent conditions were defined as those where semantics and color-photism congruence did not produce a similar assessment and therefore gave rise to a negative affective reaction (i.e., positive-valence words presented in a color different from the synesthete's photism or negative-valence words presented in the photism's color). We therefore observed a modulation of the congruency effect (i.e., faster reaction times to congruently colored words than incongruently colored words). Although this congruence effect has been taken as an index of the true experience of synesthesia, we observed that it can be reversed when the experimental manipulations turn an incongruently colored word into a consistent stimulus. To our knowledge, this is the first report of an affective reaction elicited by the congruency between the synesthetically induced color of a word and the color in which the word is actually presented. The underlying neural mechanisms that might be involved in this phenomenon are discussed.
Wednesday, November 22, 2006
ARTICLE UPDATE - Recent fear is resistant to extinction
Stephen Maren and Chun-hui Chang
PNAS, 103, 18020-18025
In some individuals, fearful experiences (e.g., combat) yield persistent and debilitating psychological disturbances, including posttraumatic stress disorder (PTSD). Early intervention (e.g., debriefing) after psychological trauma is widely practiced and argued to be an effective strategy for limiting subsequent psychopathology, although there has been considerable debate on this point. Here we show in an animal model of traumatic fear that early intervention shortly after an aversive experience yields poor long-term fear reduction. Extinction trials administered minutes after aversive fear conditioning in rats suppressed fear acutely, but fear suppression was not maintained the next day. In contrast, delivering extinction trials 1 day after fear conditioning produced an enduring suppression of fear memory. We further show that the recent experience of an aversive event, not the timing of the extinction intervention per se, inhibits the development of long-term fear extinction. These results reveal that the level of fear present at the time of intervention is a critical factor in the efficacy of extinction. Importantly, our work suggests that early intervention may not yield optimal outcomes in reducing posttraumatic stress, particularly after severe trauma.
Tuesday, November 14, 2006
ARTICLE UPDATE - Neural dynamics for facial threat processing as revealed by gamma band synchronization using MEG
Qian Luo, Tom Holroyd, Matthew Jones, Talma Hendler, and James Blair
NeuroImage, in press
Facial threat conveys important information about imminent environmental danger. The rapid detection of this information is critical for survival and social interaction. However, due to technical and methodological difficulties, the spatiotemporal profile for facial threat processing is unknown. By utilizing magnetoencephalography (MEG), a brain-imaging technique with superb temporal resolution and fairly good spatial resolution, Synthetic Aperture Magnetometry (SAM), a recently developed source analysis technique, and a sliding window analysis, we identified the spatiotemporal development of facial threat processing in the gamma frequency band. We also tested the dual-route hypothesis by LeDoux who proposed, based on animal research, that there are two routes to the amygdala: a quick subcortical route and a slower cortical route. Direct evidence with humans supporting this model has been lacking. Moreover, it has been unclear whether the subcortical route responds specifically to fearful expressions or to threatening expressions in general. We found early event-related synchronizations (ERS) in response to fearful faces in the hypothalamus/thalamus area (10–20 ms) and then the amygdala (20–30 ms). This was even earlier than the ERS response seen to fearful faces in visual cortex (40–50 ms). These data support LeDoux's suggestion of a quick, subcortical thalamo-amygdala route. Moreover, this route was specific for fear expressions; the ERS response in the amygdala to angry expressions had a late onset (150–160 ms). The ERS onset in prefrontal cortex followed that seen within the amygdala (around 160–210 ms). This is consistent with its role in higher-level emotional/cognitive processing.
Thursday, November 09, 2006
ARTICLE UPDATE - Mood Alters Amygdala Activation to Sad Distractors During an Attentional Task
Lihong Wang, Kevin S. LaBar and Gregory McCarthy
Biological Psychiatry, 60, 1139-1146
Background
A behavioral hallmark of mood disorders is biased perception and memory for sad events. The amygdala is poised to mediate internal mood and external event processing because of its connections with both the internal milieu and the sensory world. There is little evidence showing that the amygdala’s response to sad sensory stimuli is functionally modulated by mood state, however.
Methods
We investigated the impact of mood on amygdala activation evoked by sad and neutral pictures presented as distractors during an attentional oddball task. Healthy adults underwent functional magnetic resonance imaging during task runs that were preceded by sad or happy movie clips. Happy and sad mood induction was conducted within-subjects on consecutive days in counterbalanced order.
Results
Amygdala activation to sad distractors was enhanced after viewing sad movies relative to happy ones and was correlated with reaction time costs to detect attentional targets. The activation was higher in female subjects in the right hemisphere. The anterior cingulate, ventromedial and orbital prefrontal cortex, insula, and other posterior regions also showed enhanced responses to sad distractors during sad mood.
Conclusions
These findings reveal brain mechanisms that integrate emotional input and current mood state, with implications for understanding cognitive distractibility in depression.
Wednesday, November 08, 2006
ARTICLE UPDATE - Detecting fearful and neutral faces: BOLD latency differences in amygdala–hippocampal junction
A.A.T.S. Reinders, J. Gläscher, J.R. de Jong, A.T.M. Willemsen, J.A. den Boer and C. Büchel
NeuroImage, 33, 805-814
Evolutionary survival and procreation are augmented if an individual organism quickly detects environmental threats and rapidly initiates defensive behavioral reactions. Thus, facial emotions signaling a potential threat, e.g., fear or anger, should be perceived rapidly and automatically, possibly through a subcortical processing route which includes the amygdala. Using event-related functional magnetic resonance imaging (fMRI), we investigated the time course of the response in the amygdala to neutral and fearful faces, which appear from dynamically decreasing random visual noise. We aimed to detect differences of the amygdala response between fearful and neutral faces by estimating the latency of the blood oxygenation level-dependent (BOLD) response. We found that bilateral amygdala–hippocampal junction activation occurred earlier for fearful than for neutral faces. Our findings support the theory of a dual route architecture in which the subcortical thalamic–hippocampal–amygdala route serves fast preconscious threat perception.
ARTICLE UPDATE - A gender- and sexual orientation-dependent spatial attentional effect of invisible images
Yi Jiang, Patricia Costello, Fang Fang, Miner Huang, and Sheng He
PNAS, 103, 17048-17052
Human observers are constantly bombarded with a vast amount of information. Selective attention helps us to quickly process what is important while ignoring the irrelevant. In this study, we demonstrate that information that has not entered observers' consciousness, such as interocularly suppressed (invisible) erotic pictures, can direct the distribution of spatial attention. Furthermore, invisible erotic information can either attract or repel observers' spatial attention depending on their gender and sexual orientation. While unaware of the suppressed pictures, heterosexual males' attention was attracted to invisible female nudes, heterosexual females' attention was attracted to invisible male nudes, gay males behaved similarly to heterosexual females, and gay/bisexual females performed in-between heterosexual males and females.
Friday, November 03, 2006
ARTICLE UPDATE - Emotional modulation of pain: is it the sensation or what we recall?
Godinho F, Magnin M, Frot M, Perchet C, Garcia-Larrea L.
Journal of Neuroscience, 26, 11454-11461
Emotions modulate pain perception, although the mechanisms underlying this phenomenon remain unclear. In this study, we show that intensity reports significantly increased when painful stimuli were concomitant to images showing human pain, whereas pictures with identical emotional values but without somatic content failed to modulate pain. Early somatosensory responses (<200 ms) remained unmodified by emotions. Conversely, late responses showed a significant enhancement associated with increased pain ratings, localized to the right prefrontal, right temporo-occipital junction, and right temporal pole. In contrast to selective attention, which enhances pain ratings by increasing sensory gain, emotions triggered by seeing other people's pain did not alter processing in SI-SII (primary and second somatosensory areas), but may have biased the transfer to, and the representation of pain in short-term memory buffers (prefrontal), as well as the affective assignment to this representation (temporal pole). Memory encoding and recall, rather than sensory processing, appear to be modulated by empathy with others' physical suffering.
Monday, October 30, 2006
ARTICLE UPDATE - Time course of amygdala activation during aversive conditioning
Thomas Straube, Thomas Weiss, Hans-Joachim Mentzel, and Wolfgang H.R. Miltner
NeuroImage, in press
The time course of amygdala activation during aversive conditioning is a matter of debate. While some researchers reported rapid habituation, others found stable or no amygdalar responses to conditioned stimuli at all. In the present event-related fMRI study, we investigated whether the activity of the amygdala during aversive conditioning depends on attentional conditions. Subjects underwent aversive delay conditioning by pairing an electrical shock (unconditioned aversive stimulus) with a visual conditioned stimulus (CS+). For each singular presentation of the CS+ or a nonconditioned visual stimulus (CS−), subjects attended in random order to features that either differed between both stimuli (identification task) or that did not differ (distraction task). For the identification task trials, increased responses of the left amygdala to CS+ versus CS− were rapidly established but absent at the end of the conditioning trials. In contrast, under the distraction condition, amygdala activation to CS+ versus CS− was present during the late but not the early phase of conditioning. The results suggest that the time course of amygdala activity during aversive associative learning is strongly modulated by an interaction of attention and time.
Friday, October 27, 2006
ARTICLE UPDATE - Time-locked brain activity associated with emotion: a pilot MEG study.
Leon-Carrion J, McManis MH, Castillo EM, Papanicolaou AC.
Brain Injury, in press
OBJECTIVE: To examine the time course of brain activation in response to emotionally evocative pictures. METHODS AND PROCEDURES: Regions of the brain involved in the processing of affective stimuli in response to picture sets rated unpleasant, pleasant and affectively neutral, as well as the order of activation of each region, were investigated using magnetoencephalography in 10 normal adult volunteers. RESULTS: Spatiotemporal maps were found consisting of two basic components. The first, involving activation in the occipital and basal aspects of the temporal cortex, lasted, on average, 270 ms post-stimulus. The second component, involving activation in the mesial temporal lobes (MTL), extended from 270 to 850 ms post-stimulus. Either following activation of the mesial temporal lobe structures (serially) or simultaneously with it (in parallel), activation is also observed in the frontal structures. CONCLUSIONS: The temporal organization in the brain of an emotional stimulus requires the serial and alternating engagement of frontal and posterior cortices. It is suggested that lesions to the brain may disrupt this temporal course, altering the emotional response commonly observed in patients with brain injury.
Friday, October 20, 2006
ARTICLE UPDATE - Sex differences in brain activation patterns during processing of positively and negatively valenced emotional words
Hofer A, Siedentopf CM, Ischebeck A, Rettenbacher MA, Verius M, Felber S, Fleischhacker WW.
Psychological Medicine, in press
Background. Previous studies have suggested that men and women process emotional stimuli differently. In this study, we used event-related functional magnetic resonance imaging (fMRI) to investigate gender differences in regional cerebral activity during the perception of positive or negative emotions.
Method. The experiment comprised two emotional conditions (positively/negatively valenced words) during which fMRI data were acquired.
Results. Thirty-eight healthy volunteers (19 males, 19 females) were investigated. A direct comparison of brain activation between men and women revealed differential activation in the right putamen, the right superior temporal gyrus, and the left supramarginal gyrus during processing of positively valenced words versus non-words for women versus men. By contrast, during processing of negatively valenced words versus non-words, relatively greater activation was seen in the left perirhinal cortex and hippocampus for women versus men, and in the right supramarginal gyrus for men versus women.
Conclusions. Our findings suggest gender-related neural responses to emotional stimuli and could contribute to the understanding of mechanisms underlying the gender disparity of neuropsychiatric diseases such as mood disorders.
ARTICLE UPDATE - Preferential responses in amygdala and insula during presentation of facial contempt and disgust.
Fabio Sambataro, Savino Dimalta, Annabella Di Giorgio, Paolo Taurisano, Giuseppe Blasi, Tommaso Scarabino, Giuseppe Giannatempo, Marcello Nardini and Alessandro Bertolino
European Journal of Neuroscience, in press
Some authors consider contempt to be a basic emotion while others consider it a variant of disgust. The neural correlates of contempt have not so far been specifically contrasted with disgust. Using functional magnetic resonance imaging (fMRI), we investigated the neural networks involved in the processing of facial contempt and disgust in 24 healthy subjects. Facial recognition of contempt was lower than that of disgust and of neutral faces. The imaging data indicated significant activity in the amygdala and in globus pallidus and putamen during processing of contemptuous faces. Bilateral insula and caudate nuclei and left as well as right inferior frontal gyrus were engaged during processing of disgusted faces. Moreover, direct comparisons of contempt vs. disgust yielded significantly different activations in the amygdala. On the other hand, disgusted faces elicited greater activation than contemptuous faces in the right insula and caudate. Our findings suggest preferential involvement of different neural substrates in the processing of facial emotional expressions of contempt and disgust.
Friday, October 13, 2006
ARTICLE UPDATE - Individual differences in amygdala activity predict response speed during working memory.
Schaefer A, Braver TS, Reynolds JR, Burgess GC, Yarkoni T, Gray JR.
The Journal of Neuroscience, 26, 12120-12128
The human amygdala has classically been viewed as a brain structure primarily related to emotions and dissociated from higher cognition. We report here findings suggesting that the human amygdala also has a role in supporting working memory (WM), a canonical higher cognitive function. In a first functional magnetic resonance imaging (fMRI) study (n = 53), individual differences in amygdala activity predicted behavioral performance in a 3-back WM task. Specifically, higher event-related amygdala amplitude predicted faster response time (RT; r = -0.64), with no loss of accuracy. This relationship was not contingent on mood state, task content, or personality variables. In a second fMRI study (n = 21), we replicated the key finding (r = -0.47) and further showed that the correlation between the amygdala and faster RT was specific to a high working memory load condition (3-back) compared with a low working memory load condition (1-back). These results support models of amygdala function that can account for its involvement not only in emotion but also higher cognition.
Tuesday, October 03, 2006
ARTICLE UPDATE - Effects of emotional arousal on multiple memory systems: Evidence from declarative and procedural learning
Stephan Steidl, Salwa Mohi-uddin and Adam K. Anderson
Learning and Memory, in press
Extensive evidence documents emotional modulation of hippocampus-dependent declarative memory in humans. However, little is known about the emotional modulation of striatum-dependent procedural memory. To address how emotional arousal influences declarative and procedural memory, the current study utilized (1) a picture recognition and (2) a weather prediction (WP) task (a probabilistic classification learning task), which have been shown to rely on hippocampal- and striatum-based memory systems, respectively. Observers viewed arousing or neutral pictures after (Experiment 1) or during (Experiment 2) WP training trials. A 1-wk delayed picture recognition memory test revealed enhanced declarative memory for arousing compared with neutral pictures. Arousal during encoding impaired initial WP acquisition but did not influence retention when tested after a 1-wk delay. Data from a subsequent 3-mo delayed test, however, suggested that arousal during acquisition may enhance remote WP retention. These results suggest a potential dissociation between how readily emotional arousal influences hippocampus-dependent and striatum-dependent memory systems in humans.
ARTICLE UPDATE - Progress in Brain Research Volume 156
This issue is a special issue about understanding emotions.
Section I Attention and Motivation in Emotional Decoding
Chapter 1 Emotion, motivation, and the brain: Reflex foundations in animal and human research.
Peter J. Lang and Michael Davis
Chapter 2 Emotion and attention: event-related brain potential studies.
Harald T. Schupp, Tobias Flaisch, Jessica Stockburger and Markus Junghöfer
Chapter 3 Implicit and explicit categorization of natural scenes.
Maurizio Codispoti, Vera Ferrari, Andrea De Cesarei and Rossella Cardinale
Chapter 4 Dynamics of emotional effects on spatial attention in the human visual cortex.
Gilles Pourtois and Patrik Vuilleumier
Chapter 5 The neural basis of narrative imagery: emotion and action.
Dean Sabatinelli, Peter J. Lang, Margaret M. Bradley and Tobias Flaisch
Chapter 6 Subliminal emotion perception in brain imaging: findings, issues, and recommendations.
Stefan Wiens
Chapter 7 Neuroimaging methods in affective neuroscience: Selected methodological issues.
Markus Junghöfer, Peter Peyk, Tobias Flaisch and Harald T. Schupp
Section II Understanding Emotional Language Content
Chapter 8 Emotional and semantic networks in visual word processing: insights from ERP studies.
Johanna Kissler, Ramin Assadollahi and Cornelia Herbert
Chapter 9 Event-related potential studies of language and emotion: words, phrases, and task effects.
Ira Fischler and Margaret Bradley
Chapter 10 Emotional connotation of words: role of emotion in distributed semantic systems.
M. Allison Cato Jackson and Bruce Crosson
Chapter 11 Macroscopic brain dynamics during verbal and pictorial processing of affective stimuli.
Andreas Keil
Section III Understanding Emotional Intonation
Chapter 12 Intonation as an interface between language and affect.
Didier Grandjean, Tanja Bänziger and Klaus R. Scherer
Chapter 13 Cerebral processing of linguistic and emotional prosody: fMRI studies.
D. Wildgruber, H. Ackermann, B. Kreifelts and T. Ethofer
Chapter 14 Affective and linguistic processing of speech prosody: DC potential studies.
Hans Pihan
Chapter 15 Lateralization of emotional prosody in the brain: an overview and synopsis on the impact of study design.
Sonja A. Kotz, Martin Meyer and Silke Paulmann
Chapter 16 Psychoacoustic studies on the processing of vocal interjections: how to disentangle lexical and prosodic information?
Susanne Dietrich, Hermann Ackermann, Diana P. Szameitat and Kai Alter
Chapter 17 Judging emotion and attitudes from prosody following brain damage.
Marc D. Pell
Section IV Integrating Social Information
Chapter 18 Processing of facial identity and expression: a psychophysical, physiological, and computational perspective.
Adrian Schwaninger, Christian Wallraven, Douglas W. Cunningham and Sarah D. Chiller-Glaus
Chapter 19 Investigating audiovisual integration of emotional signals in the human brain.
Thomas Ethofer, Gilles Pourtois and Dirk Wildgruber
Chapter 20 Role of the amygdala in processing visual social stimuli.
Ralph Adolphs and Michael Spezio
Chapter 21 Towards a unifying neural theory of social cognition.
Christian Keysers and Valeria Gazzola
Chapter 22 Empathizing: neurocognitive developmental mechanisms and individual differences.
Bhismadev Chakrabarti and Simon Baron-Cohen
Chapter 23 The multiple facets of empathy: a survey of theory and evidence.
Susanne Leiberg and Silke Anders
Section V Understanding Emotional Disorders
Chapter 24 Partly dissociable neural substrates for recognizing basic emotions: a critical review.
Andreas Hennenlotter and Ulrike Schroeder
Chapter 25 Integration of emotion and cognition in patients with psychopathy.
Monika Sommer, Göran Hajak, Katrin Döhnel, Johannes Schwerdtner, Jörg Meinhardt and Jürgen L. Müller
Chapter 26 Disordered emotional processing in schizophrenia and one-sided brain damage.
Katarzyna Kucharska-Pietura
Chapter 27 The biochemistry of dysfunctional emotions: proton MR spectroscopic findings in major depressive disorder.
Gabriele Ende, Traute Demirakca and Heike Tost
ARTICLE UPDATE - “Did you see him in the newspaper?” Electrophysiological correlates of context and valence in face processing
Giulia Galli, Matteo Feurra and Maria Pia Viggiano
Brain Research, in press
Face recognition emerges from an interaction between bottom-up and top-down processing. Specifically, it relies on complex associations between the visual representation of a given face and previously stored knowledge about that face (e.g. biographical details). In the present experiment, the time-course of the interaction between bottom-up and top-down processing was investigated using event-related potentials (ERPs) and manipulating realistic, ecological contextual information. In the study phase, half of the faces (context faces) were framed in a newspaper page entitled with an action committed by the person depicted; these actions could have a positive or a negative value, so in this way emotional valence could be manipulated. The other half was presented on a neutral background (no-context faces). In the test phase, previously presented faces and new ones were presented on neutral backgrounds and an old/new discrimination was requested. The N170 component was modulated by both context (presence/absence at encoding) and valence (positive/negative). A reduction in amplitude was found for context faces as opposed to no-context faces. The same pattern was observed for negative faces compared to positive ones. Moreover, later activations associated with context and valence were differentially distributed over the scalp: context effects were prominent in left frontal areas, traditionally linked to person-specific information retrieval, whereas valence effects were broadly distributed over the scalp. In relation to recent neuroimaging findings on the neural basis of top-down modulations, present findings indicate that the information flow from higher-order areas might have modulated the N170 component and mediated the retrieval of semantic information pertaining to the study episode.
Monday, October 02, 2006
ARTICLE UPDATE - Neural Processing of Fearful Faces: Effects of Anxiety are Gated by Perceptual Capacity Limitations
Sonia J. Bishop, Rob Jenkins, and Andrew D. Lawrence
Cerebral Cortex, in press
Debate continues as to the automaticity of the amygdala's response to threat. Accounts taking a strong automaticity line suggest that the amygdala's response to threat is both involuntary and independent of attentional resources. Building on these accounts, prominent models have suggested that anxiety modulates the output of an amygdala-based preattentive threat evaluation system. Here, we argue for a modification of these models. Functional magnetic resonance imaging data were collected while volunteers performed a letter search task of high or low perceptual load superimposed on fearful or neutral face distractors. Neither high- nor low-anxious volunteers showed an increased amygdala response to threat distractors under high perceptual load, contrary to a strong automaticity account of amygdala function. Under low perceptual load, elevated state anxiety was associated with a heightened response to threat distractors in the amygdala and superior temporal sulcus, whereas individuals high in trait anxiety showed a reduced prefrontal response to these stimuli, consistent with weakened recruitment of control mechanisms used to prevent the further processing of salient distractors. These findings suggest that anxiety modulates processing subsequent to competition for perceptual processing resources, with state and trait anxiety having distinguishable influences upon the neural mechanisms underlying threat evaluation and "top-down" control.
ARTICLE UPDATE - Resolving Emotional Conflict: A Role for the Rostral Anterior Cingulate Cortex in Modulating Activity in the Amygdala
Amit Etkin, Tobias Egner, Daniel M. Peraza, Eric R. Kandel and Joy Hirsch
Neuron, 51, 871-882
Effective mental functioning requires that cognition be protected from emotional conflict due to interference by task-irrelevant emotionally salient stimuli. The neural mechanisms by which the brain detects and resolves emotional conflict are still largely unknown, however. Drawing on the classic Stroop conflict task, we developed a protocol that allowed us to dissociate the generation and monitoring of emotional conflict from its resolution. Using functional magnetic resonance imaging (fMRI), we find that activity in the amygdala and dorsomedial and dorsolateral prefrontal cortices reflects the amount of emotional conflict. By contrast, the resolution of emotional conflict is associated with activation of the rostral anterior cingulate cortex. Activation of the rostral cingulate is predicted by the amount of previous-trial conflict-related neural activity and is accompanied by a simultaneous and correlated reduction of amygdalar activity. These data suggest that emotional conflict is resolved through top-down inhibition of amygdalar activity by the rostral cingulate cortex.
ARTICLE UPDATE - Fast recognition of social emotions takes the whole brain: Interhemispheric cooperation in the absence of cerebral asymmetry
Marco Tamietto, Mauro Adenzato, Giuliano Geminiani and Beatrice de Gelder
Neuropsychologia, in press
Hemispheric asymmetry in emotional perception has been traditionally studied for basic emotions and very little is known about laterality for more complex social emotions. Here, we used the “redundant target paradigm” to investigate interhemispheric asymmetry and cooperation for two social emotions in healthy subjects. Facial expressions of flirtatiousness or arrogance were briefly presented either unilaterally in the left (LVF) or right visual field (RVF), or simultaneously to both visual fields (BVF) while participants responded to the target expression (flirtatious or arrogant, counterbalanced between blocks). In bilateral conditions the faces could show the same emotion (congruent condition) or two different expressions (incongruent condition). No difference between unilateral presentations was found, suggesting that the perception of social emotions is not hemispherically lateralized. Responses were faster and more accurate in bilateral displays with two emotionally congruent but physically different faces (i.e., a male and a female expressing the same emotion) than in unilateral conditions. This “redundant target effect” was consistent with a neural summation model, thereby showing that interhemispheric cooperation may occur for social emotions despite major perceptual differences between faces posing the same expression.
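The "neural summation model" mentioned above is usually tested in redundant-target studies via Miller's race-model inequality: if responses to bilateral displays are faster than any race between independent hemispheres would allow, coactivation (summation) is inferred. The sketch below is illustrative only; the function names and the reaction-time values in the test are made up, not taken from the article.

```python
# Illustrative race-model inequality check for a redundant-target design.
# At each time t, the race model bounds the redundant-target CDF by the sum
# of the two single-target CDFs; exceeding that bound suggests coactivation.

def cdf(rts, t):
    """Empirical cumulative probability of a response at or before time t."""
    return sum(rt <= t for rt in rts) / len(rts)

def violates_race_model(rt_left, rt_right, rt_both, t):
    """True if the redundant-target CDF exceeds the race-model bound at t."""
    bound = min(1.0, cdf(rt_left, t) + cdf(rt_right, t))
    return cdf(rt_both, t) > bound
```

In practice the inequality is evaluated across a range of quantiles rather than a single t, but the per-time-point logic is the same.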
ARTICLE UPDATE - Affective evaluations of objects are influenced by observed gaze direction and emotional expression
Andrew P. Bayliss, Alexandra Frischen, Mark J. Fenske and Steven P. Tipper
Cognition, in press
Gaze direction signals another person’s focus of interest. Facial expressions convey information about their mental state. Appropriate responses to these signals should reflect their combined influence, yet current evidence suggests that gaze-cueing effects for objects near an observed face are not modulated by its emotional expression. Here, we extend the investigation of perceived gaze direction and emotional expression by considering their combined influence on affective judgments. While traditional response-time measures revealed equal gaze-cueing effects for happy and disgust faces, affective evaluations critically depended on the combined product of gaze and emotion. Target objects looked at with a happy expression were liked more than objects looked at with a disgust expression. Objects not looked at were rated equally for both expressions. Our results demonstrate that facial expression does modulate the way that observers utilize gaze cues: Objects attended by others are evaluated according to the valence of their facial expression.
ARTICLE UPDATE - The influence of current mood on affective startle modulation.
Sabine M. Grüsser, Klaus Wölfling, Chantal P. Mörsen, Norbert Kathmann and Herta Flor
Experimental Brain Research, in press
The affect-modulated startle response is a reliable indicator of the affective processing of stimuli. It may be influenced by trait and state affective variables as well as psychopathological status. The aim of the present study was to determine the influence of the current mood state on startle modulation. Forty-five healthy volunteers viewed affective stimuli while eye blink responses and subjective emotional ratings were assessed. In addition, the current state of mood was assessed, pre and post the experimental procedure. Subjects were divided into those that were in a more positive and those that were in a more negative mood based on a median split. Compared to subjects in a positive mood those in a more negative mood showed significantly reduced startle amplitudes after viewing the negative and neutral stimuli. The results of the present study show that changes in startle responses are not only related to the current state of psychopathology but also to the general affective state of the participants during the assessments.
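The grouping step described above (dividing participants into more-positive and more-negative mood groups at the sample median) can be sketched as follows. This is a generic median-split illustration, not the authors' code; the function and variable names are hypothetical.

```python
# Hypothetical sketch of a median split on mood scores, as used in the study:
# participants at or below the sample median form the "negative" (low) group,
# those above it the "positive" (high) group.

def median_split(scores):
    """Label each score 'low' or 'high' relative to the sample median."""
    s = sorted(scores)
    n = len(s)
    median = s[n // 2] if n % 2 == 1 else (s[n // 2 - 1] + s[n // 2]) / 2
    return ["high" if x > median else "low" for x in scores]
```

Note that median splits dichotomize a continuous measure and discard within-group variance, which is one reason later work often prefers treating mood as a continuous covariate.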
ARTICLE UPDATE - The Experience of Emotion.
Barrett LF, Mesquita B, Ochsner KN, Gross JJ.
Annual Review of Psychology, in press
Experiences of emotion are content-rich events that emerge at the level of psychological description, but must be causally constituted by neurobiological processes. This chapter outlines an emerging scientific agenda for understanding what these experiences feel like and how they arise. We review the available answers to what is felt (i.e., the content that makes up an experience of emotion) and how neurobiological processes instantiate these properties of experience. These answers are then integrated into a broad framework that describes, in psychological terms, how the experience of emotion emerges from more basic processes. We then discuss the role of such experiences in the economy of the mind and behavior.
Wednesday, September 20, 2006
ARTICLE UPDATE - The psychological, neurochemical and functional neuroanatomical mediators of the effects of positive and negative mood on executive functions
Rachel L.C. Mitchell and Louise H. Phillips
Neuropsychologia, in press
In this review we evaluate the cognitive and neural effects of positive and negative mood on executive function. Mild manipulations of negative mood appear to have little effect on cognitive control processes, whereas positive mood impairs aspects of updating, planning and switching. These cognitive effects may be linked to neurochemistry: positive mood effects may be mediated by dopamine, whereas negative mood effects may be mediated by serotonin levels. Current evidence on the effects of mood on regional brain activity during executive functions indicates that the prefrontal cortex is a recurrent site of integration between mood and cognition. We conclude that there is a disparity between the importance of this topic and awareness of how mood affects executive functions in the brain. Most behavioural and neuroimaging studies of executive function in normal samples do not explore the potential role of variations in mood, yet the evidence we outline indicates that even mild fluctuations in mood can have a significant influence on neural activation and cognition.
ARTICLE UPDATE - Separating subjective emotion from the perception of emotion-inducing stimuli: An fMRI study
Amy S. Garrett and Richard J. Maddock
NeuroImage, in press
fMRI was used to dissociate neural responses temporally associated with the subjective experience of emotion from those associated with the perception of emotion-inducing stimuli in order to better define the emotion-related functions of the amygdala, lateral orbital frontal cortex (OFC), and hippocampus. Subjects viewed aversive pictures followed by an extended post-stimulus period of sustained subjective emotion. Brain regions showing activation paralleling the period of sustained subjective emotion were distinguished from those showing activation limited to the period of aversive picture presentation. Behavioral results showed that subjective ratings of emotion remained elevated for 20 s after offset of the aversive pictures. fMRI results showed that viewing aversive pictures activated the amygdala, lateral OFC, and hippocampus. Subjective emotion (present both during and after aversive pictures) was temporally associated with activation in the right lateral OFC and left hippocampus but not the amygdala. Ratings of subjective emotion were correlated with activation in the right lateral OFC and left hippocampus. The results support direct amygdala involvement in emotion perception but suggest that amygdala activation is not temporally associated with subjective emotion that occurs after the offset of emotion-related stimuli. The results are consistent with a general role for the lateral OFC in monitoring or reflecting on internal experience and show that hippocampal activation is sustained during a period of subjective emotion, possibly related to enhanced memory encoding for the aversive pictures.
ARTICLE UPDATE - The role of awareness in delay and trace fear conditioning in humans
Knight, David C.; Nguyen, Hanh T.; Bandettini, Peter A.
Cognitive, Affective, & Behavioral Neuroscience, 6, 157-162
Expression of conditional fear without awareness has been previously demonstrated during delay conditioning, a procedure in which the conditioned stimulus (CS) and unconditioned stimulus (UCS) overlap. However, less is known about the role of awareness in trace fear conditioning, where an interval of time separates the CS and UCS. The present study assessed skin conductance response (SCR) and UCS expectancy during delay and trace conditioning. UCS predictability was varied on a trial-by-trial basis by presenting perithreshold auditory CSs. Differential UCS expectancies were demonstrated only on perceived delay and trace trials. Learning-related SCRs were observed during both perceived and unperceived delay CSs. In contrast, differential SCRs were demonstrated only for perceived trace CSs. These data suggest that awareness is necessary for conditional responding during trace, but not delay, fear conditioning.
ARTICLE UPDATE - Interference produced by emotional conflict associated with anterior cingulate activation
Haas, Brian W.; Omura, Kazufumi; Constable, R. Todd; Canli, Turhan
Cognitive, Affective, & Behavioral Neuroscience, 6, 152-156
The anterior cingulate cortex (ACC) is involved in cognition and emotion. In the classic Stroop task, presentation of stimuli that are in response conflict with one another produces activation in the caudal ACC. In the emotional Stroop task, presentation of emotionally salient stimuli produces activation in the rostral ACC. Presentation of stimuli that are emotionally conflicting should activate the caudal ACC; stimuli that are emotionally salient should activate the rostral ACC. We tested this prediction using functional magnetic resonance imaging while subjects made emotional valence judgments of words overlaid on emotional faces (word-face Stroop task). Emotionally incongruent pairs were responded to more slowly than emotionally congruent pairs. Emotionally incongruent trials were associated with increased activation within the caudal ACC, whereas no ACC activation was found in response to emotional saliency. These results support the conflict-monitoring model of caudal ACC and extend this function to conflict within the domain of emotional stimuli.
ARTICLE UPDATE - Processing emotional pictures and words: Effects of valence and arousal
Kensinger, Elizabeth A.; Schacter, Daniel L.
Cognitive, Affective, & Behavioral Neuroscience, 6, 110-126
There is considerable debate regarding the extent to which limbic regions respond differentially to items with different valences (positive or negative) or to different stimulus types (pictures or words). In the present event-related fMRI study, 21 participants viewed words and pictures that were neutral, negative, or positive. Negative and positive items were equated on arousal. The participants rated each item for whether it depicted or described something animate or inanimate or something common or uncommon. For both pictures and words, the amygdala, dorsomedial prefrontal cortex (PFC), and ventromedial PFC responded equally to all high-arousal items, regardless of valence. Laterality effects in the amygdala were based on the stimulus type (word = left, picture = bilateral). Valence effects were most apparent when the individuals processed pictures, and the results revealed a lateral/medial distinction within the PFC: the lateral PFC responded differentially to negative items, whereas the medial PFC was more engaged during the processing of positive pictures.
ARTICLE UPDATE - Emotional constraints on intentional forgetting
B. Keith Payne and Elizabeth Corrigan
Journal of Experimental Social Psychology, in press
One way people control the contents of their minds is intentional forgetting—voluntarily forgetting events after they have happened. The events people would most like to forget are unpleasant and emotional. This study used a directed forgetting procedure with emotional and neutral pictures to examine whether people can intentionally forget emotional events as easily as mundane ones. When the to-be-forgotten list was neutral, participants showed successful intentional forgetting. But when the to-be-forgotten list was emotional, directed forgetting failed. Results contribute to understanding the ways that emotion constrains mental control by capturing mental processes including memory retrieval. Emotion may short-circuit attempts to forget those parts of the past people would most like to forget.
ARTICLE UPDATE - A neural network reflecting individual differences in cognitive processing of emotions during perceptual decision making
Katja Mériau, Isabell Wartenburger, Philipp Kazzer, Kristin Prehn, Claas-Hinrich Lammers, Elke van der Meer, Arno Villringer and Hauke R. Heekeren
NeuroImage, 33, 1016-1027
Even simple perceptual decisions are influenced by the emotional content of a stimulus. Recent neuroimaging studies provide evidence about the neural mechanisms of perceptual decision making on emotional stimuli. However, the effect of individual differences in cognitive processing of emotions on perceptual decision making remains poorly understood. Here, we investigated how changes in the fMRI signal during perceptual decision making on facial stimuli covaried with individual differences in the ability to identify and communicate one’s emotional state. Although this personality trait covaried with changes in activity in the dorsal anterior cingulate cortex (dACC) during gender decisions on facial expressions, there was no correlation during emotion decisions. Further, we investigated whether individual differences in the ability to cognitively process emotions depend on differences in the functional integration of emotional and cognitive brain regions. We therefore compared task-dependent changes in effective connectivity of dACC in individuals with good and with poor ability to cognitively process emotions using a psychophysiological interaction analysis. We found greater coupling of dACC with prefrontal regions in individuals with good ability to identify and communicate their emotional state. Conversely, individuals with poor ability in this domain showed greater coupling of dACC with the amygdala. Our data indicate that individual differences in the ability to identify and communicate one’s emotional state are reflected by altered effective connectivity of the dACC with prefrontal and limbic regions. Thus, we provide neurophysiological evidence for a theoretical model that posits that a discommunication between limbic areas and the neocortex impairs cognitive processing of emotions.
ARTICLE UPDATE - The effect of anticipation and the specificity of sex differences for amygdala and hippocampus function in emotional memory
Kristen L. Mackiewicz, Issidoros Sarinopoulos, Krystal L. Cleven, and Jack B. Nitschke
PNAS, 103, 14200-14205
Prior research has shown memory is enhanced for emotional events. Key brain areas involved in emotional memory are the amygdala and hippocampus, which are also recruited during aversion and its anticipation. This study investigated whether anticipatory processes signaling an upcoming aversive event contribute to emotional memory. In an event-related functional MRI paradigm, 40 healthy participants viewed aversive and neutral pictures preceded by predictive warning cues. Participants completed a surprise recognition task directly after functional MRI scanning or 2 weeks later. In anticipation of aversive pictures, bilateral dorsal amygdala and anterior hippocampus activations were associated with better immediate recognition memory. Similar associations with memory were observed for activation of those areas in response to aversive pictures. Anticipatory activation predicted immediate memory over and above these associations for picture viewing. Bilateral ventral amygdala activations in response to aversive pictures predicted delayed memory only. We found that previously reported sex differences of memory associations with left amygdala for women and with right amygdala for men were confined to the ventral amygdala during picture viewing and delayed memory. Results support an established animal model elucidating the functional neuroanatomy of the amygdala and hippocampus in emotional memory, highlight the importance of anticipatory processes in such memory for aversive events, and extend neuroanatomical evidence of sex differences for emotional memory.
Friday, September 01, 2006
ARTICLE UPDATE - Left hemisphere specialization for response to positive emotional expressions: a divided output methodology.
Root JC, Wong PS, Kinsbourne M.
Emotion, 6, 473-483
An extensive literature credits the right hemisphere with dominance for processing emotion. Conflicting literature finds left hemisphere dominance for positive emotions. This conflict may be resolved by attending to processing stage. A divided output (bimanual) reaction time paradigm in which response hand was varied for emotion (angry; happy) in Experiments 1 and 2 and for gender (male; female) in Experiment 3 focused on response to emotion rather than perception. In Experiments 1 and 2, reaction time was shorter when right-hand responses indicated a happy face and left-hand responses an angry face, as compared to reversed assignment. This dissociation did not obtain with incidental emotion (Experiment 3). Results support the view that response preparation to positive emotional stimuli is left lateralized.
ARTICLE UPDATE - Attending to affect: appraisal strategies modulate the electrocortical response to arousing pictures
Hajcak G, Moser JS, Simons RF.
Emotion, 6, 517-522.
Arousing (unpleasant and pleasant) pictures elicit increased neurophysiological measures of perceptual processing. In particular, the electrocortical late positive potential (LPP) is enhanced for arousing, compared with neutral, pictures. To determine whether the magnitude of the LPP is sensitive to the way stimuli are appraised, 16 participants viewed both pleasant and unpleasant pictures and categorized them along an affective or nonaffective dimension. Results indicate that the LPP was reduced for both pleasant and unpleasant pictures when participants made nonaffective, compared with affective, judgments. These results are consistent with previous studies that have used functional neuroimaging to investigate the role of appraisal on emotional processing. The results are further discussed in terms of the utility of using the LPP to study emotion regulation.
ARTICLE UPDATE - An emotion-induced attentional blink elicited by aversively conditioned stimuli.
Smith SD, Most SB, Newsome LA, Zald DH.
Emotion, 6, 523-527
The current study examines whether aversively conditioned stimuli can modulate attention to such a degree that they impair the perception of subsequently presented nonemotional targets. In the initial phase of this study, participants viewed 3 categories of photographs, 1 of which was paired with an aversive noise. Following conditioning, participants searched for a target embedded within a series of 17 rapidly presented images on each trial. Critically, a conditioned or unconditioned item from the initial phase appeared 200 ms or 800 ms before the target. At 200-ms lags but not 800-ms lags, the conditioned images impaired target detection relative to the other distractors. Thus, temporary visual deficits can be induced by otherwise neutral distractors whose aversive associations have only recently been learned.
ARTICLE UPDATE - EEG phase synchronization during emotional response to positive and negative film stimuli.
Costa T, Rognoni E, Galati D.
Neuroscience Letters, in press
In the present study the patterns of interdependency between different brain regions were investigated as volunteers looked at emotional and non-emotional film stimuli. The main goal was to evaluate the emotion-related differences and to check their consistency during the elaboration of the same type of stimuli in repeated presentations. A measure called synchronization index (SI) was used to detect interdependencies in EEG signals. The hypotheses were that emotional-information processing could involve variation in synchronized activity and that two valence-specific emotions - happiness and sadness - differ from each other. The SI obtained was compared among the various experimental conditions and significant changes were found. The results demonstrated an overall increase of SI during emotional stimulation and, in particular, during sadness, which yielded a pattern involving a large exchange of information among frontal channels. On the other hand, happiness was associated with a wider synchronization among frontal and occipital sites, although happiness itself was less synchronized. We conclude that the SI can be successfully applied for studying the dynamic cooperation between cortical areas during emotion responses.
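The synchronization index the authors describe is closely related to the standard phase-locking value used in EEG connectivity work. A minimal sketch of that measure, assuming the common Hilbert-transform formulation (not necessarily the authors' exact implementation):

```python
import numpy as np
from scipy.signal import hilbert

def phase_sync_index(x, y):
    """Phase-locking value between two equal-length signals.

    Instantaneous phase is extracted via the Hilbert transform;
    the index measures how consistently the phase difference is
    maintained over time: 1.0 = perfect locking, near 0 = none.
    """
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Two noisy sinusoids sharing a 10 Hz component stay phase locked;
# an independent noise channel does not.
t = np.linspace(0, 2, 512, endpoint=False)
rng = np.random.default_rng(0)
a = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.2 * rng.standard_normal(t.size)
c = rng.standard_normal(t.size)
print(phase_sync_index(a, b))  # high (close to 1)
print(phase_sync_index(a, c))  # low
```

In practice the signals would first be band-pass filtered into the frequency band of interest before phase extraction, and the index computed for every channel pair.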
ARTICLE UPDATE - Amygdala response to facial expressions reflects emotional learning.
Hooker CI, Germine LT, Knight RT, D'Esposito M.
Journal of Neuroscience, 26, 8915-8922
The functional role of the human amygdala in the evaluation of emotional facial expressions is unclear. Previous animal and human research shows that the amygdala participates in processing positive and negative reinforcement as well as in learning predictive associations between stimuli and subsequent reinforcement. Thus, amygdala response to facial expressions could reflect the processing of primary reinforcement or emotional learning. Here, using functional magnetic resonance imaging, we tested the hypothesis that amygdala response to facial expressions is driven by emotional association learning. We show that the amygdala is more responsive to learning object-emotion associations from happy and fearful facial expressions than it is to the presentation of happy and fearful facial expressions alone. The results provide evidence that the amygdala uses social signals to rapidly and flexibly learn threatening and rewarding associations that ultimately serve to enhance survival.
Friday, August 25, 2006
ARTICLE UPDATE - Feature binding and affect: Emotional modulation of visuo-motor integration
Lorenza S. Colzato, Nelleke C. van Wouwe and Bernhard Hommel
Neuropsychologia, in press
The primate cortex represents the external world in a distributed fashion, which calls for a mechanism that integrates and binds the features of a perceived or processed event. Animal and patient studies provide evidence that feature binding in the visual cortex is driven by the muscarinic–cholinergic system, whereas visuo-motor integration may be under dopaminergic control. Consistent with this scenario, we present evidence that the binding of visual and action features is modulated by emotions through the probable stimulation of the dopaminergic system. Interestingly, the impact of emotions on binding was restricted to tasks in which shape was task-relevant, suggesting that extracting affective information is not automatic but requires attention to shape.
Friday, August 18, 2006
ARTICLE UPDATE - Separating subjective emotion from the perception of emotion-inducing stimuli: An fMRI study.
Garrett AS, Maddock RJ.
NeuroImage, in press
fMRI was used to dissociate neural responses temporally associated with the subjective experience of emotion from those associated with the perception of emotion-inducing stimuli in order to better define the emotion-related functions of the amygdala, lateral orbital frontal cortex (OFC), and hippocampus. Subjects viewed aversive pictures followed by an extended post-stimulus period of sustained subjective emotion. Brain regions showing activation paralleling the period of sustained subjective emotion were distinguished from those showing activation limited to the period of aversive picture presentation. Behavioral results showed that subjective ratings of emotion remained elevated for 20 s after offset of the aversive pictures. fMRI results showed that viewing aversive pictures activated the amygdala, lateral OFC, and hippocampus. Subjective emotion (present both during and after aversive pictures) was temporally associated with activation in the right lateral OFC and left hippocampus but not the amygdala. Ratings of subjective emotion were correlated with activation in the right lateral OFC and left hippocampus. The results support direct amygdala involvement in emotion perception but suggest that amygdala activation is not temporally associated with subjective emotion that occurs after the offset of emotion-related stimuli. The results are consistent with a general role for the lateral OFC in monitoring or reflecting on internal experience and show that hippocampal activation is sustained during a period of subjective emotion, possibly related to enhanced memory encoding for the aversive pictures.
ARTICLE UPDATE - How do emotion and gaze direction interfere with overt orienting of visual attention?
Bonifacci P, Ricciardelli P, Lugli L, Chitti F, Nicoletti R.
Cognitive Processing, 7 (Suppl. 5), 155
BACKGROUND: Several studies using spatial cueing paradigms have demonstrated that observing gaze direction may trigger a reflexive visual orienting in the direction of the other person's gaze. Recently, it has been reported that facial expression may enhance the reflexive orienting of attention to gaze stimuli. However, some researchers have also suggested that, rather than facilitating attentional orienting away from the face, emotional expressions may, in fact, delay or prevent orienting. In the present study we investigated the effects of facial expression and gaze direction on a visual orienting oculomotor task to test whether an angry face may or may not interfere with visual orienting. METHOD: Participants performed an oculomotor task in which they had to make a saccade towards one of two lateral targets, depending upon the colour of the fixation dot which appeared at the centre of the computer screen. The instruction dot remained visible for 50 ms and then disappeared. At different time intervals (stimulus onset asynchronies, SOAs: 50, 100, 150 ms) following the onset of the instruction cue, a real face (gazing either to the right or to the left) was presented at the centre of the monitor. Gaze direction could be congruent or incongruent with respect to the instruction and target location. Facial expression was also manipulated. In half of the trials the diverted gaze (i.e., congruent or incongruent) appeared with an angry expression, whereas in the other half the face had a neutral expression. Participants were instructed to saccade either to the target on the left or to the target on the right (the targets were visible throughout the trial), as indicated by the instruction dot, while completely disregarding the face because it was irrelevant for the task. RESULTS: Eye movement recordings on correct trials showed that saccades congruent with the direction of the distracting gaze had shorter latencies than incongruent ones. 
However, the time-course of this effect varied depending on the facial expression. With a neutral expression, the congruency effect was found only at the shortest SOA (50 ms). On the contrary, for the angry face the congruency effect occurred at the longer SOAs. These findings suggest that gaze direction (even when task-irrelevant) is capable of interfering with the orienting of visual attention, and that faces with an angry expression may hold attention and affect its orienting longer than a neutral expression.
Wednesday, August 16, 2006
ARTICLE UPDATE - Attentional capture by task-irrelevant fearful faces is revealed by the N2pc component
Martin Eimer and Monika Kiss
Biological Psychology, in press
We measured the N2pc component as an electrophysiological indicator of attentional selection to investigate whether fearful faces can attract attention even when they are entirely task-irrelevant and attention is focused on another demanding visual monitoring task. Participants had to detect infrequent luminance changes of the fixation cross, while ignoring stimulus arrays containing a face singleton (a fearful face among neutral faces, or a neutral face among fearful faces) to the left or right of fixation. On trials without a target luminance change, an N2pc was elicited by fearful faces presented next to fixation, irrespective of whether they were singletons or not, demonstrating that irrelevant fearful faces can bias the spatial distribution of attention. The N2pc to fearful faces was attenuated when face arrays were presented simultaneously with a target luminance change, suggesting that concurrent target processing reduces attentional capture by emotionally salient events.
Friday, August 11, 2006
ARTICLE UPDATE - Neuroimaging Studies of Emotional Responses in PTSD.
Liberzon I, Martis B.
Annals of the New York Academy of Sciences, 1071, 87-109
Neuroimaging research offers a powerful and noninvasive means to understand healthy as well as dysregulated emotional processing in healthy subjects and PTSD patients. Functional neuroimaging findings suggest specific roles for subregions of the medial prefrontal (mPFC), orbito frontal (OFC), anterior cingulate (ACC), and insular cortices as well as the sublenticular extended amygdala (SLEA) and hippocampus in various components of emotional processing. Some of the same regions appear to be associated with emotional response to trauma, and with symptom formation in PTSD. Neuroimaging findings of emotional processing in healthy subjects and PTSD patients are discussed, addressing the specific roles of cortical regions like mPFC, ACC, and insula, and their potential contribution to PTSD pathophysiology. Processes of cognitive-emotional interactions and social emotions are discussed in an attempt to synthesize the prefrontal findings in healthy subjects and PTSD patients. Further links between functional neuroanatomy of emotional responses and neuroendocrine stress regulation are proposed.
ARTICLE UPDATE - Right hemispheric dominance in processing of unconscious negative emotion.
Sato W, Aoki S.
Brain and Cognition, in press
Right hemispheric dominance in unconscious emotional processing has been suggested, but remains controversial. This issue was investigated using the subliminal affective priming paradigm combined with unilateral visual presentation in 40 normal subjects. In either left or right visual fields, angry facial expressions, happy facial expressions, or plain gray images were briefly presented as negative, positive, and control primes, followed by a mosaic mask. Then nonsense target ideographs were presented, and the subjects evaluated their partiality toward the targets. When the stimuli were presented in the left, but not the right, visual fields, the negative primes reduced the subjects' liking for the targets, relative to the case of the positive or control primes. These results provided behavioral evidence supporting the hypothesis that the right hemisphere is dominant for unconscious negative emotional processing.
Friday, July 21, 2006
ARTICLE UPDATE - Neural substrates associated with evaluative processing during co-activation of positivity and negativity: A PET investigation.
Jung YC, An SK, Seok JH, Kim JS, Oh SJ, Moon DH, Kim JJ.
Biological Psychology, in press
Affective symmetries, such as the positivity offset and negativity bias, have been postulated to be attributable to distinct activation functions of the positive and negative affect systems. We investigated the neural substrates that are engaged when the positive and negative affect systems undergo parallel and integrative processing. Eleven subjects were scanned using H(2)(15)O PET while choosing the subjective feeling produced by a stimulation pair of pictures or words. Four different conditions were designed for contrast: pure positivity, pure negativity, positivity offset, and negativity bias. Dorsolateral prefrontal activation was associated with the positivity offset and negativity bias conditions, whereas ventromedial prefrontal activation, together with limbic and subcortical activations, was associated with the pure positivity and pure negativity conditions. The results indicated that the positivity offset and negativity bias are not merely due to asymmetric activations of the positive and negative systems, but also involve integrative processing at higher neocortical levels.
ARTICLE UPDATE - Anticipation of affective image modulates visual evoked magnetic fields (VEF).
Onoda K, Okamoto Y, Shishida K, Hashizume A, Ueda K, Kinoshita A, Yamashita H, Yamawaki S.
Experimental Brain Research, in press
We investigated the interaction between anticipation of positive and negative affective images and visual evoked magnetic fields (VEF). Participants (n = 13) were presented emotionally positive or negative images under different anticipatory conditions, and their subsequent brain responses were recorded by magnetoencephalography (MEG). In the Affective Cue conditions, the cue stimulus indicated the emotional valence of the image, which followed 2 s later. In the Null Cue conditions, the cue stimulus did not include any information about the valence of the image. In the No Cue conditions, the affective image was suddenly presented, without a cue stimulus. The VEF amplitude for the negative image in the Affective Cue condition was smaller than that of the positive image in the Affective Cue condition and that of the negative image in the Null Cue condition. This result suggests that anticipation of the valence of affective images modulates the processes of the visual cortex.
ARTICLE UPDATE - The locus ceruleus is involved in the successful retrieval of emotional memories in humans.
Sterpenich V, D'Argembeau A, Desseilles M, Balteau E, Albouy G, Vandewalle G, Degueldre C, Luxen A, Collette F, Maquet P.
Journal of Neuroscience, 26, 7416-7423.
Emotional memories are better remembered than neutral ones. The amygdala is involved in this enhancement not only by modulating the hippocampal activity, but possibly also by modulating central arousal. Using functional magnetic resonance imaging, we analyzed the retrieval of neutral faces encoded in emotional or neutral contexts. The pupillary size measured during encoding was used as a modulator of brain responses during retrieval. The interaction between emotion and memory showed significant responses in a set of areas, including the amygdala and parahippocampal gyrus. These areas responded significantly more for correctly remembered faces encoded in an emotional, compared with neutral, context. The same interaction conducted on responses modulated by the pupillary size revealed an area of the dorsal tegmentum of the ponto-mesencephalic region, consistent with the locus ceruleus. Moreover, a psychophysiological interaction showed that amygdalar responses were more tightly related to those of the locus ceruleus when remembering faces that had been encoded in an emotional, rather than neutral, context. These findings suggest that the restoration of a central arousal similar to encoding takes part in the successful retrieval of neutral events learned in an emotional context.
ARTICLE UPDATE - Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging.
Vuilleumier P, Pourtois G.
Neuropsychologia, in press
Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly with fearful expressions, but also with other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on fusiform; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with some increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces.
Other electrical components arise at later latencies and involve more sustained activities, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with distributed activity in time and space. Moreover, although traditional models in cognitive neuropsychology have often considered that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.
Friday, July 14, 2006
ARTICLE UPDATE - On rejecting emotional lures created by phonological neighborhood activation.
Starns JJ, Cook GI, Hicks JL, Marsh RL.
Journal of Experimental Psychology: Learning, Memory and Cognition, 32, 847-853.
The authors conducted 2 experiments to assess how phonologically related lures are rejected in a false memory paradigm. Some phonological lures were emotional (i.e., taboo) words, and others were not. The authors manipulated the presence of taboo items on the study list and reduced the ability to use controlled rejection strategies by dividing attention and forcing a short response deadline. The results converge on the idea that participants reduce false alarms to emotional lures by setting more stringent recognition criteria for these items based on their expected memorability. Additionally, emotional lures are less familiar than nonemotional lures because emotional lures have affective and semantic features that mismatch studied nonemotional items.
Monday, July 03, 2006
ARTICLE UPDATE - Temporal dynamics of face repetition suppression
Alumit Ishai, Philip C. Bikle and Leslie G. Ungerleider
Brain Research Bulletin, in press
Single-unit recordings and functional brain imaging studies have shown reduced neural responses to repeated stimuli in the visual cortex. Using MEG, we compared responses evoked by repetitions of neutral faces to those evoked by fearful faces, which were either task relevant (targets) or irrelevant (distracters). Faces evoked a bi-phasic response in extrastriate cortex, peaking at 160–185 ms and at 220–250 ms, with stronger responses to neutral faces at the earlier interval and stronger responses to fearful faces at the later interval. At both latencies, repetitions of neutral and fearful targets resulted in reduced amplitude of the MEG signal. Additionally, we found that the context in which targets were presented affected their processing: fearful distracters increased the responses evoked by both neutral and fearful targets. Our data indicate that valence enhancement and context effects can be detected in extrastriate visual cortex within 250 ms and that these processes likely reflect feedback from other regions.
Friday, June 30, 2006
ARTICLE UPDATE - Intentional modulation of emotional responding to unpleasant pictures: An ERP study.
Moser JS, Hajcak G, Bukay E, Simons RF.
Psychophysiology, 43, 292-296.
Intentionally altering responses to unpleasant stimuli affects physiological and hemodynamic activity associated with emotional and cognitive processing. In the present experiment, we measured the late-positive potential (LPP) of the visually evoked event-related brain potential to examine the effects of intentional emotion modulation on electrophysiological correlates of emotional and cognitive processing. Seventeen participants received instructions to view, suppress, and enhance emotional responses to unpleasant stimuli. Results revealed significantly decreased electrophysiological activity during suppression of emotional responses beginning around 250 ms poststimulus and lasting several hundred milliseconds. These data suggest that ERPs are sensitive to emotion modulation/regulation processes.
Wednesday, June 28, 2006
ARTICLE UPDATE - Event-related brain potential correlates of emotional face processing
Martin Eimer and Amanda Holmes
Neuropsychologia, in press
Results from recent event-related brain potential (ERP) studies investigating brain processes involved in the detection and analysis of emotional facial expression are reviewed. In all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was remarkably early, ranging from 120 to 180 ms post-stimulus in different experiments where faces were either presented at fixation or laterally, and with or without non-face distractor stimuli. While broadly distributed positive deflections beyond 250 ms post-stimulus have been found in previous studies for non-face stimuli, the early frontocentrally distributed phase of this emotional positivity is most likely face-specific. Similar emotional expression effects were found for six basic emotions, suggesting that these effects are not primarily generated within neural structures specialised for the automatic detection of specific emotions. Expression effects were eliminated when attention was directed away from the location of peripherally presented emotional faces, indicating that they are not linked to pre-attentive emotional processing. When foveal faces were unattended, expression effects were attenuated, but not completely eliminated. It is suggested that these ERP correlates of emotional face processing reflect activity within a neocortical system where representations of emotional content are generated in a task-dependent fashion for the adaptive intentional control of behaviour. Given the early onset of the emotion-specific effects reviewed here, it is likely that this system is activated in parallel with the ongoing evaluation of emotional content in the amygdala and related subcortical brain circuits.
ARTICLE UPDATE - Are you always on my mind? A review of how face perception and attention interact
Romina Palermo and Gillian Rhodes
Neuropsychologia, in press
In this review we examine how attention is involved in detecting faces, recognizing facial identity and registering and discriminating between facial expressions of emotion. The first section examines whether these aspects of face perception are “automatic”, in that they are especially rapid, non-conscious, mandatory and capacity-free. The second section discusses whether limited-capacity selective attention mechanisms are preferentially recruited by faces and facial expressions. Evidence from behavioral, neuropsychological, neuroimaging and psychophysiological studies from humans and single-unit recordings from primates is examined and the neural systems involved in processing faces, emotion and attention are highlighted. Avenues for further research are identified.
Friday, June 16, 2006
ARTICLE UPDATE - The influence of nonremembered affective associations on preference.
Ghuman AS, Bar M.
Emotion, 6, 215-223.
An important influence on our preference toward a specific object is its associations with affective information. Here, the authors concentrate on the role of memory in shaping such preferences. Specifically, the authors used a multistage behavioral paradigm that fostered associations between neutral shapes and affective images. Participants who explicitly remembered these affective associations preferred neutral shapes associated with positive images. Counterintuitively, participants who could not explicitly remember the associations preferred neutral shapes that were associated with negative images. Generally, the difference in preference between participants who could and could not remember the affective associations demonstrates a critical link between memory and preference formation. The authors propose that the preference for negatively associated items is a manifestation of a mechanism that produces an inherent incentive for rapidly assessing potentially threatening aspects of the environment.
ARTICLE UPDATE - Rapid picture presentation and affective engagement.
Smith JC, Low A, Bradley MM, Lang PJ.
Emotion, 6, 208-214.
Emotional reactions were assessed to pictorial stimuli presented in a continuous stream at rapid speeds that compromise conceptual memory and the processing of specific picture content. Blocks of unpleasant, neutral, or pleasant pictures were presented at the rate of either three pictures per second or seven pictures per second. Even with rapid presentation rates, startle reflexes, corrugator muscle activity, and skin conductance responses were heightened when viewing unpleasant pictures. These effects were stronger later in the aversive block, suggesting that cumulative exposure increasingly activates the defense system. The findings suggest that, despite conceptual masking inherent in rapid serial visual presentation, affective pictures prompt measurable emotional engagement.
ARTICLE UPDATE - Spontaneous retrieval of affective person knowledge in face perception.
Todorov A, Gobbini MI, Evans KK, Haxby JV.
Neuropsychologia, in press
In a functional magnetic resonance imaging experiment, we explored whether affective person knowledge based on memories formed from minimal information is spontaneously retrieved in face perception. In the first stage of the experiment, participants were presented with 120 unfamiliar faces. Each face was presented with a description of one of four types of behaviors: aggressive, disgusting, neutral, and nice. In the second stage, participants were scanned while engaged in a one-back recognition task in which they saw the faces that were associated with behaviors and 30 novel faces. Although this task is a simple perceptual task that neither demands person evaluation nor retrieval of person knowledge, neural responses to faces differed as a function of the behaviors. Faces associated with behaviors evoked stronger activity than did novel faces in regions implicated in social cognition: anterior paracingulate cortex and superior temporal sulcus. Explicit memory for the behaviors enhanced the neural response in these regions. Faces associated with disgusting behaviors evoked stronger activity in left anterior insula than did faces associated with aggressive behaviors. This effect was equally strong for faces associated with explicitly recalled behaviors and faces associated with non-recalled behaviors. The findings suggest that affective person knowledge acquired from minimal information is spontaneously retrieved in face perception, engaging neural systems for analysis of social cognition and emotions.
ARTICLE UPDATE - Human and computer recognition of facial expressions of emotion.
Susskind JM, Littlewort G, Bartlett MS, Movellan J, Anderson AK.
Neuropsychologia, in press
Neuropsychological and neuroimaging evidence suggests that the human brain contains facial expression recognition detectors specialized for specific discrete emotions. However, some human behavioral data suggest that humans recognize expressions as similar and not discrete entities. This latter observation has been taken to indicate that internal representations of facial expressions may be best characterized as varying along continuous underlying dimensions. To examine the potential compatibility of these two views, the present study compared human and support vector machine (SVM) facial expression recognition performance. Separate SVMs were trained to develop fully automatic optimal recognition of one of six basic emotional expressions in real-time with no explicit training on expression similarity. Performance revealed high recognition accuracy for expression prototypes. Without explicit training of similarity detection, magnitude of activation across each emotion-specific SVM captured human judgments of expression similarity. This evidence suggests that combinations of expert classifiers from separate internal neural representations result in similarity judgments between expressions, supporting the appearance of a continuous underlying dimensionality. Further, these data suggest similarity in expression meaning is supported by superficial similarities in expression appearance.
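The classifier scheme described above (one emotion-specific SVM per basic expression, with the pattern of activation magnitudes across classifiers standing in for expression similarity) can be sketched in a few lines. This is an illustrative reconstruction on synthetic feature vectors, not the authors' implementation; the feature dimensionality, the `LinearSVC` choice, and the correlation-based similarity measure are all assumptions for the sake of the sketch.

```python
# Sketch: one binary SVM per emotion (one-vs-rest); the profile of
# decision-function "activations" across all six SVMs is then used to
# compare expressions, as in the paper's similarity analysis.
# Feature vectors here are synthetic Gaussian clusters, one per emotion.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Synthetic "facial feature" data: one cluster of 40 samples per emotion.
centers = rng.normal(size=(len(EMOTIONS), 20))
X = np.vstack([c + 0.5 * rng.normal(size=(40, 20)) for c in centers])
y = np.repeat(np.arange(len(EMOTIONS)), 40)

# Train one emotion-specific SVM per expression category.
svms = [LinearSVC(C=1.0, dual=False).fit(X, (y == i).astype(int))
        for i in range(len(EMOTIONS))]

def activation_profile(sample):
    """Decision-function magnitude of every emotion-specific SVM."""
    return np.array([clf.decision_function(sample[None, :])[0] for clf in svms])

# Recognition = the SVM with the strongest activation.
anger_profile = activation_profile(centers[0])
recognized = EMOTIONS[int(np.argmax(anger_profile))]

# Similarity between two expressions = correlation of activation profiles
# (no explicit similarity training, mirroring the paper's logic).
fear_profile = activation_profile(centers[2])
similarity = np.corrcoef(anger_profile, fear_profile)[0, 1]
```

The key point the sketch captures is that similarity judgments fall out of the graded responses of independently trained expert classifiers, rather than being trained directly.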
Monday, June 12, 2006
ARTICLE UPDATE - Semantic Processing Precedes Affect Retrieval: The Neurological Case for Cognitive Primacy in Visual Processing
Justin Storbeck, Michael D. Robinson and Mark E. McCourt
Review of General Psychology, 10, 41-55.
According to the affective primacy hypothesis, visual stimuli can be evaluated prior to and independent of object identification and semantic analysis (Zajonc, 1980, 2000). Our review concludes that the affective primacy hypothesis is, from the available evidence, not likely correct. Although people can react to objects that they cannot consciously identify, such affective reactions are dependent upon prior semantic analysis within the visual cortex. The authors propose that the features of objects must first be integrated, and then the objects themselves must be categorized and identified, all prior to affective analysis. Additionally, the authors offer a preliminary neurological analysis of the mere exposure and affective priming effects that is consistent with the claim that semantic analysis is needed to elicit these effects. In sum, the authors conclude that the brain must know what something is in order to know whether it is good or bad.
Saturday, June 10, 2006
ARTICLE UPDATE - Functional neuroimaging of emotional learning and autonomic reactions
Martin Peper, Martin Herpers, Joachim Spreer, Jürgen Hennig and Josef Zentner
Journal of Physiology - Paris, in press
This article provides a selective overview of the functional neuroimaging literature with an emphasis on emotional activation processes. Emotions are fast and flexible response systems that provide basic tendencies for adaptive action. From the range of involved component functions, we first discuss selected automatic mechanisms that control basic adaptational changes. Second, we illustrate how neuroimaging work has contributed to the mapping of the network components associated with basic emotion families (fear, anger, disgust, happiness), and secondary dimensional concepts that organise the meaning space for subjective experience and verbal labels (emotional valence, activity/intensity, approach/withdrawal, etc.). Third, results and methodological difficulties are discussed in view of our own neuroimaging experiments that investigated the component functions involved in emotional learning. The amygdala, prefrontal cortex, and striatum form a network of reciprocal connections that show topographically distinct patterns of activity as a correlate of up- and down-regulation processes during an emotional episode. Emotional modulations of other brain systems have attracted recent research interest. Emotional neuroimaging calls for more representative designs that highlight the modulatory influences of regulation strategies and socio-cultural factors responsible for inhibitory control and extinction. We conclude by emphasising the relevance of the temporal process dynamics of emotional activations that may provide improved prediction of individual differences in emotionality.
Friday, June 09, 2006
ARTICLE UPDATE - Fear Recognition Ability Predicts Differences in Social Cognitive and Neural Functioning in Men
Ben Corden, Hugo D. Critchley, David Skuse and Raymond J. Dolan
Journal of Cognitive Neuroscience, 18, 889-897.
By testing the facial fear-recognition ability of 341 men in the general population, we show that 8.8% have deficits akin to those seen with acquired amygdala damage. Using psychological tests and functional magnetic resonance imaging (fMRI) we tested the hypothesis that poor fear recognition would predict deficits in other domains of social cognition and, in response to socially relevant stimuli, abnormal activation in brain regions that putatively reflect engagement of the "social brain." On tests of "theory of mind" ability, 25 "low fear scorers" (LFS) performed significantly worse than 25 age- and IQ-matched "normal (good) fear scorers" (NFS). In fMRI, we compared evoked activity during a gender judgement task to neutral faces portraying different head and eye gaze orientations in 12 NFS and 12 LFS subjects. Despite identical between-group accuracy in gender discrimination, LFS demonstrated significantly reduced activation in amygdala, fusiform gyrus, and anterior superior temporal cortices when viewing faces with direct versus averted gaze. In a functional connectivity analysis, NFS show enhanced connectivity between the amygdala and anterior temporal cortex in the context of direct gaze; this enhanced coupling is absent in LFS. We suggest that important individual differences in social cognitive skills are expressed within the healthy male population, which appear to have a basis in a compromised neural system that underpins social information processing.
ARTICLE UPDATE - Orbitofrontal Cortex and Social Behavior: Integrating Self-monitoring and Emotion–Cognition Interactions
Jennifer S. Beer, Oliver P. John, Donatella Scabini and Robert T. Knight
Journal of Cognitive Neuroscience, 18, 871-879.
The role of the orbitofrontal cortex in social behavior remains a puzzle. Various theories of the social functions of the orbitofrontal cortex focus on the role of this area in either emotional processing or its involvement in online monitoring of behavior (i.e., self-monitoring). The present research attempts to integrate these two theories by examining whether improving the self-monitoring of patients with orbitofrontal damage is associated with the generation of emotions needed to guide interpersonal behavior. Patients with orbitofrontal damage, patients with lateral prefrontal damage, and healthy controls took part in an interpersonal task. After completing the task, participants' self-monitoring was increased by showing them a videotape of their task performance. In comparison to healthy controls and patients with lateral prefrontal damage, orbitofrontal damage was associated with objectively inappropriate social behavior. Although patients with orbitofrontal damage were aware of social norms of intimacy, they were unaware that their task performance violated these norms. The embarrassment typically associated with inappropriate social behavior was elicited in these patients only after their self-monitoring increased from viewing their videotaped performance. These findings suggest that damage to the orbitofrontal cortex impairs self-insight that may preclude the generation of helpful emotional information. The results highlight the role of the orbitofrontal cortex in the interplay of self-monitoring and emotional processing and suggest avenues for neurorehabilitation of patients with social deficits subsequent to orbitofrontal damage.
ARTICLE UPDATE - Neural systems connecting interoceptive awareness and feelings.
Pollatos O, Gramann K, Schandry R.
Human Brain Mapping, in press
In many theories of emotion, the representation of bodily responses plays an important role in subjective feelings. We tested the hypothesis that the perception of bodily states is positively related to the experienced intensity of feelings as well as to the activity of first-order and second-order brain structures involved in the processing of feelings. Using a heartbeat perception task, subjects were separated into groups with either good or poor interoceptive awareness. During emotional picture presentation we measured high-density EEG and used spatiotemporal current density reconstruction to identify regions involved in both interoceptive awareness and emotion processing. We observed a positive relation between interoceptive awareness and the experienced intensity of emotions. Furthermore, the P300 amplitudes to pleasant and unpleasant pictures were enhanced for subjects with high interoceptive awareness. The source reconstruction revealed that interoceptive awareness is related to an enhanced activation in both first-order structures (insula, somatosensory cortices) and second-order structures (anterior cingulate, prefrontal cortices). We conclude that the perception of bodily states is a crucial determinant of the processing and the subjective experience of feelings.