This blog keeps you up-to-date with the latest emotion-related research. Feel free to browse and contribute.
Monday, October 30, 2006
ARTICLE UPDATE - Time course of amygdala activation during aversive conditioning depends on attention
Thomas Straube, Thomas Weiss, Hans-Joachim Mentzel, and Wolfgang H.R. Miltner
NeuroImage, in press
The time course of amygdala activation during aversive conditioning is a matter of debate. While some researchers have reported rapid habituation, others found stable amygdalar responses to conditioned stimuli, or none at all. In the present event-related fMRI study, we investigated whether the activity of the amygdala during aversive conditioning depends on attentional conditions. Subjects underwent aversive delay conditioning by pairing an electrical shock (unconditioned aversive stimulus) with a visual conditioned stimulus (CS+). For each presentation of the CS+ or of a nonconditioned visual stimulus (CS−), subjects attended in random order to features that either differed between the two stimuli (identification task) or did not differ (distraction task). In the identification task trials, increased responses of the left amygdala to CS+ versus CS− were rapidly established but were absent by the end of the conditioning trials. In contrast, under the distraction condition, amygdala activation to CS+ versus CS− was present during the late but not the early phase of conditioning. The results suggest that the time course of amygdala activity during aversive associative learning is strongly modulated by an interaction of attention and time.
Friday, October 27, 2006
ARTICLE UPDATE - Time-locked brain activity associated with emotion: a pilot MEG study.
Leon-Carrion J, McManis MH, Castillo EM, Papanicolaou AC.
Brain Injury, in press
PRIMARY OBJECTIVE: To examine the time course of brain activation in response to emotionally evocative pictures. METHODS AND PROCEDURES: Regions of the brain involved in the processing of affective stimuli in response to picture sets rated unpleasant, pleasant and affectively neutral, as well as the order of activation of each region, were investigated using magnetoencephalography in 10 normal adult volunteers. RESULTS: Spatiotemporal maps consisting of two basic components were found. The first, involving activation in the occipital and basal aspects of the temporal cortex, lasted, on average, 270 ms post-stimulus. The second, involving activation in the mesial temporal lobes (MTL), extended from 270 to 850 ms post-stimulus. Activation was also observed in frontal structures, either after (serially) or in parallel (simultaneously) with activation of the mesial temporal lobe structures. CONCLUSIONS: The temporal organization of emotional stimulus processing in the brain requires the serial and alternating engagement of frontal and posterior cortices. It is suggested that lesions to the brain may disrupt this temporal course, altering the emotional response commonly observed in patients with brain injury.
Friday, October 20, 2006
ARTICLE UPDATE - Sex differences in brain activation patterns during processing of positively and negatively valenced emotional words
Hofer A, Siedentopf CM, Ischebeck A, Rettenbacher MA, Verius M, Felber S, Wolfgang Fleischhacker W.
Psychological Medicine, in press
Background. Previous studies have suggested that men and women process emotional stimuli differently. In this study, we used event-related functional magnetic resonance imaging (fMRI) to investigate gender differences in regional cerebral activity during the perception of positive or negative emotions.
Method. The experiment comprised two emotional conditions (positively/negatively valenced words) during which fMRI data were acquired.
Results. Thirty-eight healthy volunteers (19 males, 19 females) were investigated. A direct comparison of brain activation between men and women revealed differential activation in the right putamen, the right superior temporal gyrus, and the left supramarginal gyrus during processing of positively valenced words versus non-words for women versus men. By contrast, during processing of negatively valenced words versus non-words, relatively greater activation was seen in the left perirhinal cortex and hippocampus for women versus men, and in the right supramarginal gyrus for men versus women.
Conclusions. Our findings suggest gender-related neural responses to emotional stimuli and could contribute to the understanding of mechanisms underlying the gender disparity of neuropsychiatric diseases such as mood disorders.
ARTICLE UPDATE - Preferential responses in amygdala and insula during presentation of facial contempt and disgust.
Fabio Sambataro, Savino Dimalta, Annabella Di Giorgio, Paolo Taurisano, Giuseppe Blasi, Tommaso Scarabino, Giuseppe Giannatempo, Marcello Nardini and Alessandro Bertolino
European Journal of Neuroscience, in press
Some authors consider contempt to be a basic emotion while others consider it a variant of disgust. The neural correlates of contempt have not so far been specifically contrasted with disgust. Using functional magnetic resonance imaging (fMRI), we investigated the neural networks involved in the processing of facial contempt and disgust in 24 healthy subjects. Facial recognition of contempt was lower than that of disgust and of neutral faces. The imaging data indicated significant activity in the amygdala and in globus pallidus and putamen during processing of contemptuous faces. Bilateral insula and caudate nuclei and left as well as right inferior frontal gyrus were engaged during processing of disgusted faces. Moreover, direct comparisons of contempt vs. disgust yielded significantly different activations in the amygdala. On the other hand, disgusted faces elicited greater activation than contemptuous faces in the right insula and caudate. Our findings suggest preferential involvement of different neural substrates in the processing of facial emotional expressions of contempt and disgust.
Friday, October 13, 2006
ARTICLE UPDATE - Individual differences in amygdala activity predict response speed during working memory.
Schaefer A, Braver TS, Reynolds JR, Burgess GC, Yarkoni T, Gray JR.
The Journal of Neuroscience, 26, 12120-12128
The human amygdala has classically been viewed as a brain structure primarily related to emotions and dissociated from higher cognition. We report here findings suggesting that the human amygdala also has a role in supporting working memory (WM), a canonical higher cognitive function. In a first functional magnetic resonance imaging (fMRI) study (n = 53), individual differences in amygdala activity predicted behavioral performance in a 3-back WM task. Specifically, higher event-related amygdala amplitude predicted faster response time (RT; r = -0.64), with no loss of accuracy. This relationship was not contingent on mood state, task content, or personality variables. In a second fMRI study (n = 21), we replicated the key finding (r = -0.47) and further showed that the correlation between amygdala activity and faster RT was specific to a high working memory load condition (3-back) compared with a low working memory load condition (1-back). These results support models of amygdala function that can account for its involvement not only in emotion but also in higher cognition.
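A minimal sketch of the kind of across-subject brain-behavior correlation reported above (e.g., r = -0.64 between event-related amygdala amplitude and response time). The data are simulated and all variable names are hypothetical; this illustrates the general logic, not the authors' analysis pipeline.

```python
# Illustration only: simulated data, not the authors' analysis pipeline.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_subjects = 53                                   # sample size of the first study
amygdala_amp = rng.normal(0.2, 0.1, n_subjects)   # hypothetical event-related amplitudes
mean_rt = 900 - 800 * amygdala_amp + rng.normal(0, 40, n_subjects)  # hypothetical mean RTs (ms)

# Across-subject Pearson correlation between amygdala amplitude and response time
r, p = pearsonr(amygdala_amp, mean_rt)
print(f"r = {r:.2f}, p = {p:.3g}")
```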
Tuesday, October 03, 2006
ARTICLE UPDATE - Effects of emotional arousal on multiple memory systems: Evidence from declarative and procedural learning
Stephan Steidl, Salwa Mohi-uddin and Adam K. Anderson
Learning and Memory, in press
Extensive evidence documents emotional modulation of hippocampus-dependent declarative memory in humans. However, little is known about the emotional modulation of striatum-dependent procedural memory. To address how emotional arousal influences declarative and procedural memory, the current study utilized (1) a picture recognition and (2) a weather prediction (WP) task (a probabilistic classification learning task), which have been shown to rely on hippocampal- and striatum-based memory systems, respectively. Observers viewed arousing or neutral pictures after (Experiment 1) or during (Experiment 2) WP training trials. A 1-wk delayed picture recognition memory test revealed enhanced declarative memory for arousing compared with neutral pictures. Arousal during encoding impaired initial WP acquisition but did not influence retention when tested after a 1-wk delay. Data from a subsequent 3-mo delayed test, however, suggested that arousal during acquisition may enhance remote WP retention. These results suggest a potential dissociation between how readily emotional arousal influences hippocampus-dependent and striatum-dependent memory systems in humans.
ARTICLE UPDATE - Progress in Brain Research Volume 156
This volume is a special issue devoted to understanding emotions.
Section I Attention and Motivation in Emotional Decoding
Chapter 1 Emotion, motivation, and the brain: Reflex foundations in animal and human research.
Peter J. Lang and Michael Davis
Chapter 2 Emotion and attention: event-related brain potential studies.
Harald T. Schupp, Tobias Flaisch, Jessica Stockburger and Markus Junghöfer
Chapter 3 Implicit and explicit categorization of natural scenes.
Maurizio Codispoti, Vera Ferrari, Andrea De Cesarei and Rossella Cardinale
Chapter 4 Dynamics of emotional effects on spatial attention in the human visual cortex.
Gilles Pourtois and Patrik Vuilleumier
Chapter 5 The neural basis of narrative imagery: emotion and action.
Dean Sabatinelli, Peter J. Lang, Margaret M. Bradley and Tobias Flaisch
Chapter 6 Subliminal emotion perception in brain imaging: findings, issues, and recommendations.
Stefan Wiens
Chapter 7 Neuroimaging methods in affective neuroscience: Selected methodological issues.
Markus Junghöfer, Peter Peyk, Tobias Flaisch and Harald T. Schupp
Section II Understanding Emotional Language Content
Chapter 8 Emotional and semantic networks in visual word processing: insights from ERP studies.
Johanna Kissler, Ramin Assadollahi and Cornelia Herbert
Chapter 9 Event-related potential studies of language and emotion: words, phrases, and task effects.
Ira Fischler and Margaret Bradley
Chapter 10 Emotional connotation of words: role of emotion in distributed semantic systems.
M. Allison Cato Jackson and Bruce Crosson
Chapter 11 Macroscopic brain dynamics during verbal and pictorial processing of affective stimuli.
Andreas Keil
Section III Understanding Emotional Intonation
Chapter 12 Intonation as an interface between language and affect.
Didier Grandjean, Tanja Bänziger and Klaus R. Scherer
Chapter 13 Cerebral processing of linguistic and emotional prosody: fMRI studies.
D. Wildgruber, H. Ackermann, B. Kreifelts and T. Ethofer
Chapter 14 Affective and linguistic processing of speech prosody: DC potential studies.
Hans Pihan
Chapter 15 Lateralization of emotional prosody in the brain: an overview and synopsis on the impact of study design.
Sonja A. Kotz, Martin Meyer and Silke Paulmann
Chapter 16 Psychoacoustic studies on the processing of vocal interjections: how to disentangle lexical and prosodic information?
Susanne Dietrich, Hermann Ackermann, Diana P. Szameitat and Kai Alter
Chapter 17 Judging emotion and attitudes from prosody following brain damage.
Marc D. Pell
Section IV Integrating Social Information
Chapter 18 Processing of facial identity and expression: a psychophysical, physiological, and computational perspective.
Adrian Schwaninger, Christian Wallraven, Douglas W. Cunningham and Sarah D. Chiller-Glaus
Chapter 19 Investigating audiovisual integration of emotional signals in the human brain.
Thomas Ethofer, Gilles Pourtois and Dirk Wildgruber
Chapter 20 Role of the amygdala in processing visual social stimuli.
Ralph Adolphs and Michael Spezio
Chapter 21 Towards a unifying neural theory of social cognition.
Christian Keysers and Valeria Gazzola
Chapter 22 Empathizing: neurocognitive developmental mechanisms and individual differences.
Bhismadev Chakrabarti and Simon Baron-Cohen
Chapter 23 The multiple facets of empathy: a survey of theory and evidence.
Susanne Leiberg and Silke Anders
Section V Understanding Emotional Disorders
Chapter 24 Partly dissociable neural substrates for recognizing basic emotions: a critical review.
Andreas Hennenlotter and Ulrike Schroeder
Chapter 25 Integration of emotion and cognition in patients with psychopathy.
Monika Sommer, Göran Hajak, Katrin Döhnel, Johannes Schwerdtner, Jörg Meinhardt and Jürgen L. Müller
Chapter 26 Disordered emotional processing in schizophrenia and one-sided brain damage.
Katarzyna Kucharska-Pietura
Chapter 27 The biochemistry of dysfunctional emotions: proton MR spectroscopic findings in major depressive disorder.
Gabriele Ende, Traute Demirakca and Heike Tost
ARTICLE UPDATE - “Did you see him in the newspaper?” Electrophysiological correlates of context and valence in face processing
Giulia Galli, Matteo Feurra and Maria Pia Viggiano
Brain Research, in press
Face recognition emerges from an interaction between bottom-up and top-down processing. Specifically, it relies on complex associations between the visual representation of a given face and previously stored knowledge about that face (e.g. biographical details). In the present experiment, the time course of the interaction between bottom-up and top-down processing was investigated using event-related potentials (ERPs) and manipulating realistic, ecological contextual information. In the study phase, half of the faces (context faces) were framed within a newspaper page whose headline described an action committed by the person depicted; these actions could have a positive or a negative value, allowing emotional valence to be manipulated. The other half were presented on a neutral background (no-context faces). In the test phase, previously presented faces and new ones were presented on neutral backgrounds and an old/new discrimination was required. The N170 component was modulated by both context (presence/absence at encoding) and valence (positive/negative): a reduction in amplitude was found for context faces as opposed to no-context faces, and the same pattern was observed for negative faces compared to positive ones. Moreover, later activations associated with context and valence were differentially distributed over the scalp: context effects were prominent over left frontal areas, traditionally linked to person-specific information retrieval, whereas valence effects were broadly distributed over the scalp. In line with recent neuroimaging findings on the neural basis of top-down modulations, the present findings indicate that the information flow from higher-order areas might have modulated the N170 component and mediated the retrieval of semantic information pertaining to the study episode.
Monday, October 02, 2006
ARTICLE UPDATE - Neural Processing of Fearful Faces: Effects of Anxiety are Gated by Perceptual Capacity Limitations
Sonia J. Bishop, Rob Jenkins, and Andrew D. Lawrence
Cerebral Cortex, in press
Debate continues as to the automaticity of the amygdala's response to threat. Accounts taking a strong automaticity line suggest that the amygdala's response to threat is both involuntary and independent of attentional resources. Building on these accounts, prominent models have suggested that anxiety modulates the output of an amygdala-based preattentive threat evaluation system. Here, we argue for a modification of these models. Functional magnetic resonance imaging data were collected while volunteers performed a letter search task of high or low perceptual load superimposed on fearful or neutral face distractors. Neither high- nor low-anxious volunteers showed an increased amygdala response to threat distractors under high perceptual load, contrary to a strong automaticity account of amygdala function. Under low perceptual load, elevated state anxiety was associated with a heightened response to threat distractors in the amygdala and superior temporal sulcus, whereas individuals high in trait anxiety showed a reduced prefrontal response to these stimuli, consistent with weakened recruitment of control mechanisms used to prevent the further processing of salient distractors. These findings suggest that anxiety modulates processing subsequent to competition for perceptual processing resources, with state and trait anxiety having distinguishable influences upon the neural mechanisms underlying threat evaluation and "top-down" control.
ARTICLE UPDATE - Resolving Emotional Conflict: A Role for the Rostral Anterior Cingulate Cortex in Modulating Activity in the Amygdala
Amit Etkin, Tobias Egner, Daniel M. Peraza, Eric R. Kandel and Joy Hirsch
Neuron, 51, 871-882
Effective mental functioning requires that cognition be protected from emotional conflict due to interference by task-irrelevant emotionally salient stimuli. The neural mechanisms by which the brain detects and resolves emotional conflict are still largely unknown, however. Drawing on the classic Stroop conflict task, we developed a protocol that allowed us to dissociate the generation and monitoring of emotional conflict from its resolution. Using functional magnetic resonance imaging (fMRI), we find that activity in the amygdala and dorsomedial and dorsolateral prefrontal cortices reflects the amount of emotional conflict. By contrast, the resolution of emotional conflict is associated with activation of the rostral anterior cingulate cortex. Activation of the rostral cingulate is predicted by the amount of previous-trial conflict-related neural activity and is accompanied by a simultaneous and correlated reduction of amygdalar activity. These data suggest that emotional conflict is resolved through top-down inhibition of amygdalar activity by the rostral cingulate cortex.
ARTICLE UPDATE - Fast recognition of social emotions takes the whole brain: Interhemispheric cooperation in the absence of cerebral asymmetry
Marco Tamietto, Mauro Adenzato, Giuliano Geminiani and Beatrice de Gelder
Neuropsychologia, in press
Hemispheric asymmetry in emotional perception has been traditionally studied for basic emotions and very little is known about laterality for more complex social emotions. Here, we used the “redundant target paradigm” to investigate interhemispheric asymmetry and cooperation for two social emotions in healthy subjects. Facial expressions of flirtatiousness or arrogance were briefly presented either unilaterally in the left (LVF) or right visual field (RVF), or simultaneously to both visual fields (BVF) while participants responded to the target expression (flirtatious or arrogant, counterbalanced between blocks). In bilateral conditions the faces could show the same emotion (congruent condition) or two different expressions (incongruent condition). No difference between unilateral presentations was found, suggesting that the perception of social emotions is not hemispherically lateralized. Responses were faster and more accurate in bilateral displays with two emotionally congruent but physically different faces (i.e., a male and a female expressing the same emotion) than in unilateral conditions. This “redundant target effect” was consistent with a neural summation model, thereby showing that interhemispheric cooperation may occur for social emotions despite major perceptual differences between faces posing the same expression.
ARTICLE UPDATE - Affective evaluations of objects are influenced by observed gaze direction and emotional expression
Andrew P. Bayliss, Alexandra Frischen, Mark J. Fenske and Steven P. Tipper
Cognition, in press
Gaze direction signals another person’s focus of interest. Facial expressions convey information about their mental state. Appropriate responses to these signals should reflect their combined influence, yet current evidence suggests that gaze-cueing effects for objects near an observed face are not modulated by its emotional expression. Here, we extend the investigation of perceived gaze direction and emotional expression by considering their combined influence on affective judgments. While traditional response-time measures revealed equal gaze-cueing effects for happy and disgust faces, affective evaluations critically depended on the combined product of gaze and emotion. Target objects looked at with a happy expression were liked more than objects looked at with a disgust expression. Objects not looked at were rated equally for both expressions. Our results demonstrate that facial expression does modulate the way that observers utilize gaze cues: Objects attended by others are evaluated according to the valence of their facial expression.
ARTICLE UPDATE - The influence of current mood on affective startle modulation.
Sabine M. Grüsser, Klaus Wölfling, Chantal P. Mörsen, Norbert Kathmann and Herta Flor
Experimental Brain Research, in press
The affect-modulated startle response is a reliable indicator of the affective processing of stimuli. It may be influenced by trait and state affective variables as well as by psychopathological status. The aim of the present study was to determine the influence of current mood state on startle modulation. Forty-five healthy volunteers viewed affective stimuli while eye-blink responses and subjective emotional ratings were assessed. In addition, current mood was assessed before and after the experimental procedure. Based on a median split, subjects were divided into a more positive and a more negative mood group. Compared to subjects in a more positive mood, those in a more negative mood showed significantly reduced startle amplitudes after viewing the negative and neutral stimuli. The results of the present study show that changes in startle responses are related not only to the current state of psychopathology but also to the general affective state of the participants during the assessments.
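A minimal sketch of a median-split group comparison of this kind. The data are simulated and all variable names are hypothetical; this illustrates the logic of the analysis, not the authors' actual procedure.

```python
# Illustration only: simulated data, not the authors' analysis.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
n_subjects = 45
mood = rng.normal(5.0, 1.5, n_subjects)              # hypothetical mood ratings
startle_negative = 50 + 2.0 * mood + rng.normal(0, 10, n_subjects)  # hypothetical startle amplitudes (a.u.)

# Median split into a more positive and a more negative mood group,
# then compare startle amplitudes to negative stimuli between groups
positive_group = mood > np.median(mood)
t, p = ttest_ind(startle_negative[positive_group], startle_negative[~positive_group])
print(f"t = {t:.2f}, p = {p:.3g}")
```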
ARTICLE UPDATE - The Experience of Emotion.
Barrett LF, Mesquita B, Ochsner KN, Gross JJ.
Annual Review of Psychology, in press
Experiences of emotion are content-rich events that emerge at the level of psychological description, but must be causally constituted by neurobiological processes. This chapter outlines an emerging scientific agenda for understanding what these experiences feel like and how they arise. We review the available answers to what is felt (i.e., the content that makes up an experience of emotion) and how neurobiological processes instantiate these properties of experience. These answers are then integrated into a broad framework that describes, in psychological terms, how the experience of emotion emerges from more basic processes. We then discuss the role of such experiences in the economy of the mind and behavior.