Jung YC, An SK, Seok JH, Kim JS, Oh SJ, Moon DH, Kim JJ.
Biological Psychology, in press
Affective asymmetries, such as the positivity offset and negativity bias, have been postulated to be attributable to distinct activation functions of the positive and negative affect systems. We investigated the neural substrates that are engaged when the positive and negative affect systems undergo parallel and integrative processing. Eleven subjects were scanned using H(2)(15)O PET while choosing the subjective feeling produced by a stimulus pair of pictures or words. Four different conditions were designed for contrast: pure positivity, pure negativity, positivity offset, and negativity bias. Dorsolateral prefrontal activation was associated with the positivity offset and negativity bias conditions, whereas ventromedial prefrontal activation, together with limbic and subcortical activations, was associated with the pure positivity and pure negativity conditions. The results indicate that the positivity offset and negativity bias are not merely due to asymmetric activations of the positive and negative systems, but that integrative processing at higher neocortical levels is also involved.
This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Friday, July 21, 2006
ARTICLE UPDATE - Anticipation of affective image modulates visual evoked magnetic fields (VEF).
Onoda K, Okamoto Y, Shishida K, Hashizume A, Ueda K, Kinoshita A, Yamashita H, Yamawaki S.
Experimental Brain Research, in press
We investigated the interaction between anticipation of positive and negative affective images and visual evoked magnetic fields (VEF). Participants (n = 13) were presented emotionally positive or negative images under different anticipatory conditions, and their subsequent brain responses were recorded by magnetoencephalography (MEG). In the Affective Cue conditions, the cue stimulus indicated the emotional valence of the image, which followed 2 s later. In the Null Cue conditions, the cue stimulus did not include any information about the valence of the image. In the No Cue conditions, the affective image was suddenly presented, without a cue stimulus. The VEF amplitude for the negative image in the Affective Cue condition was smaller than that of the positive image in the Affective Cue condition and that of the negative image in the Null Cue condition. This result suggests that anticipation of the valence of affective images modulates the processes of the visual cortex.
ARTICLE UPDATE - The locus ceruleus is involved in the successful retrieval of emotional memories in humans.
Sterpenich V, D'Argembeau A, Desseilles M, Balteau E, Albouy G, Vandewalle G, Degueldre C, Luxen A, Collette F, Maquet P.
Journal of Neuroscience, 26, 7416-7423.
Emotional memories are better remembered than neutral ones. The amygdala is involved in this enhancement not only by modulating the hippocampal activity, but possibly also by modulating central arousal. Using functional magnetic resonance imaging, we analyzed the retrieval of neutral faces encoded in emotional or neutral contexts. The pupillary size measured during encoding was used as a modulator of brain responses during retrieval. The interaction between emotion and memory showed significant responses in a set of areas, including the amygdala and parahippocampal gyrus. These areas responded significantly more for correctly remembered faces encoded in an emotional, compared with neutral, context. The same interaction conducted on responses modulated by the pupillary size revealed an area of the dorsal tegmentum of the ponto-mesencephalic region, consistent with the locus ceruleus. Moreover, a psychophysiological interaction showed that amygdalar responses were more tightly related to those of the locus ceruleus when remembering faces that had been encoded in an emotional, rather than neutral, context. These findings suggest that the restoration of a central arousal similar to encoding takes part in the successful retrieval of neutral events learned in an emotional context.
ARTICLE UPDATE - Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging.
Vuilleumier P, Pourtois G.
Neuropsychologia, in press
Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly with fearful expressions, but also with other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on fusiform; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with some increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces.
Other electrical components arise at later latencies and involve more sustained activities, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with distributed activity in time and space. Moreover, although traditional models in cognitive neuropsychology have often considered that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.
Friday, July 14, 2006
ARTICLE UPDATE - On rejecting emotional lures created by phonological neighborhood activation.
Starns JJ, Cook GI, Hicks JL, Marsh RL.
Journal of Experimental Psychology: Learning, Memory, and Cognition, 32, 847-853.
The authors conducted 2 experiments to assess how phonologically related lures are rejected in a false memory paradigm. Some phonological lures were emotional (i.e., taboo) words, and others were not. The authors manipulated the presence of taboo items on the study list and reduced the ability to use controlled rejection strategies by dividing attention and forcing a short response deadline. The results converge on the idea that participants reduce false alarms to emotional lures by setting more stringent recognition criteria for these items based on their expected memorability. Additionally, emotional lures are less familiar than nonemotional lures because emotional lures have affective and semantic features that mismatch studied nonemotional items.
Monday, July 03, 2006
ARTICLE UPDATE - Temporal dynamics of face repetition suppression
Alumit Ishai, Philip C. Bikle and Leslie G. Ungerleider
Brain Research Bulletin, in press
Single-unit recordings and functional brain imaging studies have shown reduced neural responses to repeated stimuli in the visual cortex. Using MEG, we compared responses evoked by repetitions of neutral faces to those evoked by fearful faces, which were either task relevant (targets) or irrelevant (distracters). Faces evoked a bi-phasic response in extrastriate cortex, peaking at 160–185 ms and at 220–250 ms, with stronger responses to neutral faces at the earlier interval and stronger responses to fearful faces at the later interval. At both latencies, repetitions of neutral and fearful targets resulted in reduced amplitude of the MEG signal. Additionally, we found that the context in which targets were presented affected their processing: fearful distracters increased the responses evoked by both neutral and fearful targets. Our data indicate that valence enhancement and context effects can be detected in extrastriate visual cortex within 250 ms and that these processes likely reflect feedback from other regions.