Klucken T, Tabbert K, Schweckendiek J, Merz CJ, Kagerer S, Vaitl D, Stark R.
Human Brain Mapping, in press
The ability to detect and learn contingencies between fearful stimuli and their predictive cues is an important capacity for coping with the environment. Contingency awareness refers to the ability to verbalize the relationships between conditioned and unconditioned stimuli. Although there is a heated debate about the influence of contingency awareness on conditioned fear responses, the neural correlates of the formation of contingency awareness have received little attention in human fear conditioning. Recent animal studies indicate that the ventral striatum (VS) could be involved in this process, but in human studies the VS is mostly associated with positive emotions. To examine this question, we reanalyzed four recently published classical fear conditioning studies (n = 117) with respect to the VS at three distinct levels of contingency awareness: subjects who did not learn the contingencies (unaware), subjects who learned the contingencies during the experiment (learned aware), and subjects who were informed about the contingencies in advance (instructed aware). The results showed significantly increased activations in the left and right VS in learned aware compared to unaware subjects. Interestingly, this activation pattern was found only in learned aware but not in instructed aware subjects. We assume that the VS is not involved when contingency awareness does not develop during conditioning or when contingency awareness is unambiguously induced prior to conditioning. VS involvement seems to be important for the transition from a contingency-unaware to a contingency-aware state. Implications for fear conditioning models as well as for the contingency awareness debate are discussed.
This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Saturday, April 25, 2009
ARTICLE UPDATE - Unmasking emotion: Exposure duration and emotional engagement.
Codispoti M, Mazzetti M, Bradley MM.
Psychophysiology, in press
Effects of exposure duration on emotional reactivity were investigated in two experiments that parametrically varied the duration of exposure to affective pictures from 25 to 6000 ms in the presence or absence of a visual mask. Evaluative, facial, autonomic, and cortical responses were measured. Results demonstrated that, in the absence of a visual mask (Experiment 1), emotional content modulated evaluative ratings and cortical, autonomic, and facial changes even with very brief exposures, and there was little evidence that emotional engagement increased with longer exposure. When information persistence was reduced by a visual mask (Experiment 2), differences as a function of hedonic content were absent for all measures when exposure duration was 25 ms but statistically reliable when exposure duration was 80 ms. Between 25 and 80 ms, individual differences in discriminability were critical in observing affective reactions to masked pictures.
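The "discriminability" referred to here is naturally read as a signal-detection measure such as d'. As a rough, hedged illustration (the trial counts and design details below are invented, not taken from the study), here is how a per-participant d' could be computed from detection reports on masked trials:

```python
# Illustrative only: a signal-detection d' from one participant's reports about
# masked pictures. Trial counts are made up; the hit-vs-false-alarm logic is the point.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' with a log-linear (+0.5) correction to avoid infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Example: detecting emotional content in 25-ms masked presentations
print(round(d_prime(hits=32, misses=28, false_alarms=20, correct_rejections=40), 2))
```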
ARTICLE UPDATE - Sleep promotes the neural reorganization of remote emotional memory.
Sterpenich V, Albouy G, Darsaud A, Schmidt C, Vandewalle G, Dang Vu TT, Desseilles M, Phillips C, Degueldre C, Balteau E, Collette F, Luxen A, Maquet P.
The Journal of Neuroscience, 16, 5143-5152
Sleep promotes memory consolidation, a process by which fresh and labile memories are reorganized into stable memories. Emotional memories are usually better remembered than neutral ones, even at long retention delays. In this study, we assessed the influence of sleep during the night after encoding on the neural correlates of recollection of emotional memories 6 months later. After incidental encoding of emotional and neutral pictures, one half of the subjects were allowed to sleep, whereas the others were totally sleep deprived, on the first postencoding night. During subsequent retest sessions with functional magnetic resonance imaging, taking place 3 days and 6 months later, subjects made recognition memory judgments about the previously studied and new pictures. Between these retest sessions, all participants slept as usual at home. At the 6-month retest, recollection was associated with significantly larger responses in subjects allowed to sleep than in sleep-deprived subjects in the ventral medial prefrontal cortex (vMPFC) and the precuneus, two areas involved in memory retrieval, as well as in the extended amygdala and the occipital cortex, two regions whose response was modulated by emotion at encoding. Moreover, functional connectivity was enhanced between the vMPFC and the precuneus, as well as between the extended amygdala, the vMPFC, and the occipital cortex, in the sleep group relative to the sleep-deprived group. These results suggest that sleep during the first postencoding night profoundly influences the long-term systems-level consolidation of emotional memory and modifies the functional segregation and integration associated with recollection in the long term.
Saturday, April 18, 2009
ARTICLE UPDATE - Reason, emotion and decision-making: risk and reward computation with feeling.
Quartz SR.
Trends in Cognitive Sciences, in press
Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown which specific parameters of uncertain decisions the brain encodes, the extent to which these parameters correspond to various decision-making frameworks, and their correspondence to emotional and rational processes. Here, I review research suggesting that emotional processes encode in a precise quantitative manner the basic parameters of financial decision theory, indicating a reorientation of emotional and cognitive contributions to risky choice.
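The "basic parameters of financial decision theory" mentioned in the abstract are usually taken to be a gamble's expected value (reward) and its variance or standard deviation (risk). A minimal worked sketch, with an invented two-outcome gamble and an illustrative mean-variance utility:

```python
# Hedged sketch: expected value and risk of a simple gamble, plus a mean-variance
# utility. The gamble and the risk-aversion coefficient are invented for illustration.
import numpy as np

outcomes = np.array([10.0, 0.0])   # possible payoffs of a two-outcome gamble
probs = np.array([0.5, 0.5])       # their probabilities

expected_value = np.dot(probs, outcomes)                    # first moment: reward
variance = np.dot(probs, (outcomes - expected_value) ** 2)  # second moment: risk
risk = np.sqrt(variance)                                    # standard deviation

lam = 0.5                                  # illustrative risk-aversion coefficient
utility = expected_value - lam * risk      # simple mean-variance utility
print(expected_value, risk, utility)
```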
ARTICLE UPDATE - Neural correlates of affective picture processing - a depth ERP study.
Brázdil M, Roman R, Urbánek T, Chládek J, Spok D, Mareček R, Mikl M, Jurák P, Halámek J, Daniel P, Rektor I.
Neuroimage, in press
Using functional neuroimaging techniques (PET and fMRI), various cortical, limbic, and paralimbic structures have been identified in the last decade as neural substrates of human emotion. In this study we used a novel approach (intracerebral recordings of event-related potentials) to add to our knowledge of the specific brain regions involved in affective picture processing. Ten patients with intractable epilepsy undergoing presurgical depth-electrode recording viewed pleasant, neutral, and unpleasant pictures while intracerebral event-related potentials (ERPs) were recorded. A total of 752 cortical and subcortical sites were investigated. Significant differences in ERPs to unpleasant as compared to neutral or pleasant pictures were frequently and consistently observed in recordings from various brain areas: the mesial temporal cortex (the amygdala, the hippocampus, the temporal pole), the lateral temporal cortex, the mesial prefrontal cortex (ACC and the medial frontal gyrus), and the lateral prefrontal cortex. Interestingly, the mean latencies of responses to emotional stimuli were somewhat shorter in the frontal lobe structures (with evidently earlier activation within lateral prefrontal areas when compared to the mesial prefrontal cortex) and longer in the temporal lobe regions. These differences, however, were not significant. Additional clearly positive findings were observed in some rarely investigated regions: the posterior parietal cortex, the precuneus, and the insula. An approximately equivalent number of positive findings was revealed in left and right hemisphere structures. These results are in agreement with a multisystem model of human emotion, distributed far beyond the typical limbic system and substantially comprising lateral aspects of both frontal lobes as well.
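To make the analysis logic concrete, here is a minimal sketch of a per-condition ERP contrast at a single recording site. The data are simulated; the study used depth-electrode recordings and its own statistics, so this only illustrates the general average-and-compare step.

```python
# Simulated sketch: average epochs per picture category at one contact and
# compare conditions pointwise. Epoch length, trial counts and effect size are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials, n_samples = 60, 500            # e.g. 500 samples of a 1-s post-stimulus epoch
epochs = {c: rng.normal(size=(n_trials, n_samples))
          for c in ("pleasant", "neutral", "unpleasant")}
epochs["unpleasant"][:, 150:250] += 0.8  # inject a larger deflection for unpleasant pictures

erps = {c: e.mean(axis=0) for c, e in epochs.items()}   # per-condition ERPs
print("peak of unpleasant ERP:", float(erps["unpleasant"].max()))

# Pointwise two-sample t-test, unpleasant vs. neutral (uncorrected, illustration only)
t, p = stats.ttest_ind(epochs["unpleasant"], epochs["neutral"], axis=0)
print("time points with p < 0.001:", int((p < 0.001).sum()))
```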
ARTICLE UPDATE - I'll Know What You're Like When I See How You Feel.
Ames DR, Johar GV.
Psychological Science, in press
Accumulating evidence suggests that targets' displays of emotion shape perceivers' impression of those targets. Prior research has highlighted generalization effects, such as an angry display prompting an impression of hostility. In two studies, we went beyond generalization to examine the interaction of displays and behaviors, finding new evidence of augmenting effects (behavior-correspondent inferences are stronger when behavior is accompanied by positive affect) and discounting effects (such inferences are weaker when behavior is accompanied by negative affect). Thus, the same display can have different effects on impressions depending on the behavior it accompanies. We found evidence that these effects are mediated by ascribed intentions and that they have a boundary: When behaviors and affective displays are repeated, the augmenting and discounting power of displays appears to wane.
Saturday, April 11, 2009
ARTICLE UPDATE - Laterality effect on emotional faces processing: ALE meta-analysis of evidence.
Fusar-Poli P, Placentino A, Carletti F, Allen P, Landi P, Abbamonte M, Barale F, Perez J, McGuire P, Politi PL.
Neuroscience Letters, in press
Recognizing emotion from facial expressions draws on diverse psychological processes implemented in a large array of neural structures. Two major theories of the cerebral lateralization of emotional perception have been proposed: (i) the Right-Hemisphere Hypothesis (RHH) and (ii) the Valence-Specific Hypothesis (VSH). To test these lateralization models we conducted a large voxel-based meta-analysis of current functional magnetic resonance imaging (fMRI) studies employing emotional faces paradigms in healthy volunteers. Two independent researchers conducted separate comprehensive PUBMED (1990-May 2008) searches to find all fMRI studies using a variant of the emotional faces paradigm in healthy subjects. Out of the 551 originally identified studies, 105 met the inclusion criteria. The overall database consisted of 1785 brain coordinates, yielding an overall sample of 1600 healthy subjects. We found no support for the hypothesis of overall right-lateralization of emotional processing. Conversely, across all emotional conditions the parahippocampal gyrus and amygdala, fusiform gyrus, lingual gyrus, precuneus, inferior and middle occipital gyri, posterior cingulate, middle temporal gyrus, and inferior and superior frontal gyri were activated bilaterally (p=0.001). There was a valence-specific lateralization of brain response during negative emotion processing in the left amygdala (p=0.001). Significant interactions between the approach and avoidance dimensions and prefrontal response were observed (p=0.001).
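For readers unfamiliar with ALE, the core of the method is simple: each reported peak coordinate is blurred with a 3-D Gaussian, per-study "modeled activation" maps are built, and those maps are combined across studies. The sketch below is a deliberately simplified toy version; the grid, FWHM and coordinates are illustrative assumptions, not the paper's settings.

```python
# Toy activation-likelihood-estimation (ALE) sketch on a coarse grid.
import numpy as np

GRID = np.stack(np.meshgrid(*[np.arange(-40, 41, 4)] * 3, indexing="ij"), axis=-1)  # mm grid
SIGMA = 10.0 / 2.355   # Gaussian width from an assumed 10-mm FWHM

def modeled_activation(foci_mm):
    """Per-study map: probability that at least one focus lies near each voxel."""
    ma = np.zeros(GRID.shape[:3])
    for focus in foci_mm:
        d2 = ((GRID - np.asarray(focus)) ** 2).sum(axis=-1)
        g = np.exp(-d2 / (2 * SIGMA ** 2))
        ma = 1.0 - (1.0 - ma) * (1.0 - g / g.max())   # union over the study's foci
    return ma

# Two toy "studies", each contributing a few peak coordinates (x, y, z in mm)
study_maps = [
    modeled_activation([(-20, -4, -16), (24, 0, -20)]),   # e.g. bilateral amygdala-like peaks
    modeled_activation([(-22, -6, -14)]),
]

# ALE map: union of the per-study modeled-activation maps
ale = 1.0 - np.prod([1.0 - m for m in study_maps], axis=0)
print("peak ALE value:", float(ale.max()))
```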
ARTICLE UPDATE - Neural systems of visual attention responding to emotional gestures.
Flaisch T, Schupp HT, Renner B, Junghöfer M.
Neuroimage, in press
Humans are the only species known to use symbolic gestures for communication. This affords a unique medium for nonverbal emotional communication with a distinct theoretical status compared to facial expressions and other biologically evolved nonverbal emotion signals. While a frown is a frown all around the world, the relation of emotional gestures to their referents is arbitrary and varies from culture to culture. The present studies examined whether such culturally based emotion displays guide visual attention processes. In two experiments, participants passively viewed symbolic hand gestures with positive, negative, and neutral emotional meaning. In Experiment 1, functional magnetic resonance imaging (fMRI) measurements showed that gestures of insult and approval enhance activity in selected bilateral visual-associative brain regions devoted to object perception. In Experiment 2, dense-sensor event-related brain potential (ERP) recordings revealed that emotional hand gestures are differentially processed as early as 150 ms poststimulus. Thus, the present studies provide converging neuroscientific evidence that emotional gestures provoke the cardinal signatures of selective visual attention, in terms of both the brain structures and the temporal dynamics previously shown for emotional face and body expressions. It is concluded that emotionally charged gestures are efficient in shaping selective attention processes already at the level of stimulus perception.
Sunday, April 05, 2009
ARTICLE UPDATE - Links between rapid ERP responses to fearful faces and conscious awareness.
Eimer M, Kiss M, Holmes A.
Journal of Neurophysiology, 2, 165-181
To study links between rapid ERP responses to fearful faces and conscious awareness, a backward-masking paradigm was employed where fearful or neutral target faces were presented for different durations and were followed by a neutral face mask. Participants had to report target face expression on each trial. When masked faces were clearly visible (200 ms duration), an early frontal positivity, a later more broadly distributed positivity, and a temporo-occipital negativity were elicited by fearful relative to neutral faces, confirming findings from previous studies with unmasked faces. These emotion-specific effects were also triggered when masked faces were presented for only 17 ms, but only on trials where fearful faces were successfully detected. When masked faces were shown for 50 ms, a smaller but reliable frontal positivity was also elicited by undetected fearful faces. These results demonstrate that early ERP responses to fearful faces are linked to observers' subjective conscious awareness of such faces, as reflected by their perceptual reports. They suggest that frontal brain regions involved in the construction of conscious representations of facial expression are activated at very short latencies.
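The key analysis step here is sorting epochs by the participant's report before averaging. A hedged sketch with simulated data (the epoch length, detection rate and analysis window are invented):

```python
# Illustrative split of briefly presented, masked fearful-face epochs by the
# participant's detection report, followed by separate averaging.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_samples = 80, 300                    # e.g. 300 samples of a post-stimulus epoch
epochs = rng.normal(size=(n_trials, n_samples))  # simulated fearful-face epochs (17-ms masked)
detected = rng.random(n_trials) < 0.4            # simulated "fearful" reports per trial

erp_detected = epochs[detected].mean(axis=0)
erp_undetected = epochs[~detected].mean(axis=0)

# Compare an early frontal window (illustrative 100-200 sample range) between averages
window = slice(100, 200)
print(erp_detected[window].mean() - erp_undetected[window].mean())
```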
ARTICLE UPDATE - Segregated and integrated coding of reward and punishment in the cingulate cortex.
Fujiwara J, Tobler PN, Taira M, Iijima T, Tsutsui KI.
Journal of Neurophysiology, in press
Affective stimuli fall into two major classes, reward and punishment, both of which are processed by the cingulate cortex. However, it is unclear whether the positive and negative affective values of monetary reward and punishment are processed by separate or common subregions of the cingulate cortex. We performed a functional magnetic resonance imaging (fMRI) study using a free-choice task and compared cingulate activations for different levels of monetary gain and loss. Gain-specific activation (increasing activation for increasing gain, but no activation change in relation to loss) occurred mainly in the anterior part of the anterior cingulate and in the posterior cingulate cortex. Conversely, loss-specific activation (increasing activation for increasing loss, but no activation change in relation to gain) occurred in between these areas, in the middle and posterior parts of the anterior cingulate. Integrated coding of gain and loss (increasing activation throughout the full range, from the biggest loss to the biggest gain) occurred in the dorsal part of the anterior cingulate, at the border with the medial prefrontal cortex. Finally, unspecific activation increases to both gains and losses (increasing activation with increasing gains and increasing losses, possibly reflecting attention) occurred in dorsal and middle regions of the cingulate cortex. Together, these results suggest separate and common coding of monetary reward and punishment in distinct subregions of the cingulate cortex. A further meta-analysis suggested that the reward- and punishment-specific areas found here overlapped with those processing positive and negative emotions, respectively.
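The four response profiles distinguished in the abstract (gain-specific, loss-specific, integrated, unspecific) map onto simple functions of outcome value, which is easy to see in a quick sketch (the value levels below are illustrative):

```python
# Sketch of the four coding patterns as response functions over monetary outcome
# value v (losses negative, gains positive). Labels follow the abstract; levels are invented.
import numpy as np

values = np.array([-30, -20, -10, 10, 20, 30])   # illustrative loss/gain levels

gain_specific = np.where(values > 0, values, 0)   # scales with gains, flat for losses
loss_specific = np.where(values < 0, -values, 0)  # scales with losses, flat for gains
integrated    = values                            # monotonic from biggest loss to biggest gain
unspecific    = np.abs(values)                    # rises with both gains and losses (attention/salience)

for name, r in [("gain-specific", gain_specific), ("loss-specific", loss_specific),
                ("integrated", integrated), ("unspecific", unspecific)]:
    print(f"{name:14s}", r)
```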