ARTICLE UPDATE - Affective learning enhances activity and functional connectivity in early visual cortex.
Damaraju E, Huang YM, Barrett LF, Pessoa L.
Neuropsychologia, in press
This study examined the impact of task-irrelevant affective information on early visual processing regions V1-V4. Fearful and neutral faces presented with rings of different colors were used as stimuli. During the conditioning phase, fearful faces presented with a certain ring color (e.g., black) were paired with mild electrical stimulation. Neutral faces shown with rings of that color, as well as fearful or neutral faces shown with another ring color (e.g., white), were never paired with shock. Our findings revealed that fearful faces evoked enhanced blood oxygen level dependent (BOLD) responses in V1 and V4 compared to neutral faces. Faces embedded in a color ring that was paired with shock (e.g., black) evoked greater BOLD responses in V1-V4 compared to a ring color that was never paired with shock (e.g., white). Finally, BOLD responses in early visual cortex were tightly interrelated (i.e., correlated) during an affectively potent context (i.e., ring color) but not during a neutral one, suggesting that increased functional integration was present with affective learning. Taken together, the results suggest that task-irrelevant affective information not only influences evoked responses in early, retinotopically organized visual cortex, but also determines the pattern of responses across early visual cortex.
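For readers curious about the functional-integration claim, here is a minimal Python sketch of the kind of analysis it implies: correlating ROI-averaged BOLD time series across V1-V4 separately for each ring-color context. Region names, array shapes, and condition labels are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
regions = ["V1", "V2", "V3", "V4"]
n_timepoints = 200

# Simulated ROI-averaged BOLD time series: {condition: timepoints x regions}
bold = {cond: rng.standard_normal((n_timepoints, len(regions)))
        for cond in ["shock_paired_ring", "unpaired_ring"]}

for cond, ts in bold.items():
    corr = np.corrcoef(ts, rowvar=False)                 # region x region correlations
    off_diag = corr[np.triu_indices(len(regions), k=1)]  # unique region pairs
    print(f"{cond}: mean inter-region r = {off_diag.mean():+.3f}")
```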
This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Saturday, May 30, 2009
ARTICLE UPDATE - Embodiment of emotion concepts.
Niedenthal PM, Winkielman P, Mondillon L, Vermeulen N.
Journal of Personality and Social Psychology, 96, 1120-1136
Theories of embodied cognition hold that higher cognitive processes operate on perceptual symbols and that concept use involves partial reactivations of the sensory-motor states that occur during experience with the world. On this view, the processing of emotion knowledge involves a (partial) reexperience of an emotion, but only when access to the sensory basis of emotion knowledge is required by the task. In 2 experiments, participants judged emotional and neutral concepts corresponding to concrete objects (Experiment 1) and abstract states (Experiment 2) while facial electromyographic activity was recorded from the cheek, brow, eye, and nose regions. Results of both studies show embodiment of specific emotions in an emotion-focused but not a perceptual-focused processing task on the same words. A follow-up in Experiment 3, which blocked selective facial expressions, suggests a causal, rather than simply a correlational, role for embodiment in emotion word processing. Experiment 4, using a property generation task, provided support for the conclusion that emotions embodied in conceptual tasks are context-dependent situated simulations rather than associated emotional reactions. Implications for theories of embodied simulation and for emotion theories are discussed.
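The facial EMG measure at the heart of these experiments reduces to a simple computation. Below is a hedged sketch of a standard reduction (rectify, baseline-correct, average by condition and muscle site); the condition labels, epoch lengths, and trial counts are assumptions, not the authors' recording parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
sites = ["cheek", "brow", "eye", "nose"]     # recording regions named in the abstract
baseline, epoch = 500, 3000                  # samples before/after word onset (assumed)

def emg_score(trial, baseline_len):
    rectified = np.abs(trial)                      # full-wave rectification
    base = rectified[:baseline_len].mean()         # pre-stimulus baseline level
    return rectified[baseline_len:].mean() - base  # baseline-corrected mean activity

# Fake epoched data: {condition: {site: trials x samples}}
data = {cond: {s: rng.standard_normal((40, baseline + epoch)) for s in sites}
        for cond in ["emotion_focus", "perceptual_focus"]}

for cond, site_data in data.items():
    for site, trials in site_data.items():
        scores = [emg_score(t, baseline) for t in trials]
        print(f"{cond:16s} {site:6s} mean EMG change = {np.mean(scores):+.4f}")
```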
ARTICLE UPDATE - Early and late temporo-spatial effects of contextual interference during perception of facial affect.
Frühholz S, Fehr T, Herrmann M.
International Journal of Psychophysiology, in press
Contextual features during recognition of facial affect are assumed to modulate the temporal course of emotional face processing. Here, we simultaneously presented colored backgrounds during valence categorizations of facial expressions. Subjects incidentally learned to perceive negative, neutral and positive expressions within a specific colored context. Subsequently, subjects made fast valence judgments while presented with the same face-color combinations as in the first run (congruent trials) or with different face-color combinations (incongruent trials). Incongruent trials induced significantly longer response latencies and significantly lower accuracy. Incongruent contextual information during processing of neutral expressions modulated the P1 and the early posterior negativity (EPN), both localized in occipito-temporal areas. Congruent contextual information during emotional face perception revealed an emotion-related modulation of the P1 for positive expressions and of the N170 and the EPN for negative expressions. The highest N170 amplitude was found for negative expressions in a negatively associated context, and the N170 amplitude varied with the amount of overall negative information. Incongruent trials with negative expressions elicited a parietal negativity, localized to superior parietal cortex, which most likely represents a posterior manifestation of the N450 as an indicator of conflict processing. A sustained activation of the late positive potential (LPP) over parietal cortex for all incongruent trials might reflect enhanced engagement with the facial expression under conditions of contextual interference. In conclusion, whereas early components seem to be sensitive to the emotional valence of facial expressions in specific contexts, late components seem to subserve interference resolution during emotional face processing.
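As a rough guide to how component effects like these are quantified, here is a sketch that extracts mean amplitudes in conventional time windows from epoched EEG data. The windows and electrode indices are assumptions for illustration, not the parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 500                                      # sampling rate in Hz (assumed)
epochs = rng.standard_normal((60, 64, 600))   # trials x channels x samples (fake data)
t = np.arange(600) / fs - 0.2                 # epoch from -200 ms to +1000 ms

windows = {"P1": (0.08, 0.12), "N170": (0.14, 0.19),
           "EPN": (0.20, 0.30), "N450": (0.40, 0.50), "LPP": (0.50, 0.90)}
occipitotemporal = [25, 26, 62, 63]           # illustrative channel indices

for name, (t0, t1) in windows.items():
    mask = (t >= t0) & (t <= t1)
    # average over trials, the chosen channels, and the time window
    amp = epochs[:, occipitotemporal][:, :, mask].mean()
    print(f"{name}: mean amplitude = {amp:+.3f} (arbitrary units)")
```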
Saturday, May 23, 2009
ARTICLE UPDATE - The Interrelations between Verbal Working Memory and Visual Selection of Emotional Faces.
Grecucci A, Soto D, Rumiati RI, Humphreys GW, Rotshtein P.
Journal of Cognitive Neuroscience, in press
Working memory (WM) and visual selection processes interact in a reciprocal fashion based on overlapping representations abstracted from the physical characteristics of stimuli. Here, we assessed the neural basis of this interaction using facial expressions that conveyed emotion information. Participants memorized an emotional word for a later recognition test and then searched for a face of a particular gender presented in a display with two faces that differed in gender and expression. The relation between the emotional word and the expressions of the target and distractor faces was varied. RTs for the memory test were faster when the target face matched the emotional word held in WM (on valid trials) relative to when the emotional word matched the expression of the distractor (on invalid trials). There was also enhanced activation on valid compared with invalid trials in the lateral orbital gyrus, superior frontal polar cortex (BA 10), lateral occipital sulcus, and pulvinar. Re-presentation of the WM stimulus in the search display led to the earlier onset of activity in the superior and inferior frontal gyri and the anterior hippocampus irrespective of the search validity of the re-presented stimulus. The data indicate that the medial temporal and prefrontal cortices are sensitive to the reappearance of stimuli that are held in WM, whereas a fronto-thalamic occipital network is sensitive to the behavioral significance of the match between WM and targets for selection. We conclude that these networks are modulated by high-level matches between the contents of WM, the behavioral goals, and our current sensory input.
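The behavioral validity effect reported here boils down to a simple contrast, sketched below on simulated trials; the RT values, trial counts, and the use of an independent-samples test are illustrative assumptions.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
n = 80                                        # trials per condition (assumed)
valid = 620 + 40 * rng.standard_normal(n)     # RT in ms, WM word matches the target
invalid = 660 + 40 * rng.standard_normal(n)   # RT in ms, WM word matches the distractor

t, p = ttest_ind(invalid, valid)              # compare the two trial types
print(f"validity effect = {invalid.mean() - valid.mean():.0f} ms, "
      f"t = {t:.2f}, p = {p:.4f}")
```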
ARTICLE UPDATE - Decoding of Emotional Information in Voice-Sensitive Cortices.
Ethofer T, Van De Ville D, Scherer K, Vuilleumier P.
Current Biology, in press
The ability to correctly interpret emotional signals from others is crucial for successful social interaction. Previous neuroimaging studies showed that voice-sensitive auditory areas [1-3] activate to a broad spectrum of vocally expressed emotions more than to neutral speech melody (prosody). However, this enhanced response occurs irrespective of the specific emotion category, making it impossible to distinguish different vocal emotions with conventional analyses [4-8]. Here, we presented pseudowords spoken in five prosodic categories (anger, sadness, neutral, relief, joy) during event-related functional magnetic resonance imaging (fMRI), then employed multivariate pattern analysis [9, 10] to discriminate between these categories on the basis of the spatial response pattern within the auditory cortex. Our results demonstrate successful decoding of vocal emotions from fMRI responses in bilateral voice-sensitive areas, which could not be obtained by using averaged response amplitudes only. Pairwise comparisons showed that each category could be classified against all other alternatives, indicating for each emotion a specific spatial signature that generalized across speakers. These results demonstrate for the first time that emotional information is represented by distinct spatial patterns that can be decoded from brain activity in modality-specific cortical areas.
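The pairwise decoding logic is worth seeing in code. The sketch below runs a linear support vector machine with cross-validation over every pair of prosodic categories on simulated voxel patterns; the shapes, trial counts, and scikit-learn pipeline are assumptions standing in for the authors' multivariate analysis.

```python
from itertools import combinations

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
emotions = ["anger", "sadness", "neutral", "relief", "joy"]
n_trials, n_voxels = 40, 300
# Fake voxel patterns from voice-sensitive cortex: {emotion: trials x voxels}
X = {e: rng.standard_normal((n_trials, n_voxels)) for e in emotions}

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
for a, b in combinations(emotions, 2):
    data = np.vstack([X[a], X[b]])
    labels = np.array([0] * n_trials + [1] * n_trials)
    acc = cross_val_score(clf, data, labels, cv=5).mean()
    print(f"{a} vs {b}: CV accuracy = {acc:.2f} (chance = 0.50)")
```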
ARTICLE UPDATE - Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: behavioral and brain evidence.
Schyns PG, Petro LS, Smith ML.
PLoS ONE
Competent social organisms will read the social signals of their peers. In primates, the face has evolved to transmit the organism's internal emotional state. Adaptive action suggests that the brain of the receiver has co-evolved to efficiently decode expression signals. Here, we review and integrate the evidence for this hypothesis. With a computational approach, we jointly examined facial expressions as signals for data transmission and the brain as receiver and decoder of these signals. First, we show in a model observer that facial expressions form a lowly correlated signal set. Second, using time-resolved EEG data, we show how the brain uses spatial frequency information impinging on the retina to decorrelate expression categories. Between 140 and 200 ms following stimulus onset, independently in the left and right hemispheres, an information processing mechanism starts locally with encoding the eye, irrespective of expression, then zooms out to process the entire face, and finally zooms back in to diagnostic features (e.g., the opened eyes in "fear", the mouth in "happy"). A model categorizer demonstrates that at 200 ms, the left and right brain have represented enough information to predict behavioral categorization performance.
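The "lowly correlated signal set" claim can be illustrated in a few lines: vectorize one template image per expression and inspect the pairwise Pearson correlations. The templates below are random stand-ins for real face stimuli, so the point is the procedure rather than the numbers.

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(5)
expressions = ["happy", "fear", "anger", "disgust", "sadness", "surprise"]
# One flattened 64x64 template per expression (random stand-ins)
templates = {e: rng.standard_normal(64 * 64) for e in expressions}

for a, b in combinations(expressions, 2):
    r = np.corrcoef(templates[a], templates[b])[0, 1]
    print(f"r({a}, {b}) = {r:+.3f}")
```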
Sunday, May 17, 2009
ARTICLE UPDATE - Involvement of medial prefrontal cortex in emotion during feedback presentation.
Jimura K, Konishi S, Asari T, Miyashita Y.
Neuroreport, in press
It has been suggested that the posterior medial prefrontal cortex (pMPFC) implements cognitive functions involved in negative feedback processing, and that feedback presentation also elicits emotional processes. This functional MRI study examined whether the pMPFC is associated with the emotional component of feedback processing. Participants received feedback while performing a version of a motion prediction task. The pMPFC was activated during negative feedback presentation, and emotion-related activity was extracted from this activation through a parametric imaging analysis. The emotional pMPFC activity was greater in participants who scored higher on depressive mood scales. The results suggest that the pMPFC also implements feedback-related emotional functions, which vary across individuals depending on depressive mood.
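For the methodologically inclined, here is a toy version of a parametric imaging analysis of this kind: fit each subject's data with an event regressor plus a parametrically modulated copy, then relate the fitted modulation weights to mood scores across subjects. Everything here is simulated, and HRF convolution is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(6)
n_scans, n_subjects = 180, 20

betas, mood = [], rng.uniform(0, 30, n_subjects)      # fake depressive-mood scores
for s in range(n_subjects):
    onset = np.zeros(n_scans)
    onset[::12] = 1.0                                 # negative-feedback events
    modulator = onset * rng.standard_normal(n_scans)  # trial-wise emotion values at events
    X = np.column_stack([onset, modulator, np.ones(n_scans)])
    # simulate data in which the modulation effect scales with the mood score
    y = X @ np.array([1.0, 0.02 * mood[s], 0.0]) + rng.standard_normal(n_scans)
    betas.append(np.linalg.lstsq(X, y, rcond=None)[0][1])  # fitted modulation weight

print(f"across-subject r(modulation beta, mood) = {np.corrcoef(betas, mood)[0, 1]:.2f}")
```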
Saturday, May 09, 2009
ARTICLE UPDATE - Social Anxiety and Anger Identification: Bubbles Reveal Differential Use of Facial Information With Low Spatial Frequencies.
Langner O, Becker ES, Rinck M.
Psychological Science, in press
We investigated the facial information that socially anxious and nonanxious individuals utilize to judge emotions. Using a reverse-correlation technique, we presented participants with face images that were masked with random bubble patterns. These patterns determined which parts of the face were visible in specific spatial-frequency bands. This masking allowed us to establish which locations and spatial frequencies were helping participants to successfully discriminate angry faces from neutral ones. Although socially anxious individuals performed as well as nonanxious individuals on the emotion-discrimination task, they did not utilize the same facial information for the task. The fine details (high spatial frequencies) around the eyes were discriminative for both groups, but only socially anxious participants additionally processed rough configural information (low spatial frequencies).
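The Bubbles technique itself is easy to sketch: decompose the image into spatial-frequency bands and reveal each band only through randomly placed Gaussian apertures. The band edges and bubble counts below are assumptions, not the study's exact parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)
size = 128
image = rng.standard_normal((size, size))        # stand-in for a face image

def bandpass(img, sigma_fine, sigma_coarse):
    # difference-of-Gaussians: keeps structure between the two blur scales
    return gaussian_filter(img, sigma_fine) - gaussian_filter(img, sigma_coarse)

def bubble_mask(n_bubbles, sigma):
    mask = np.zeros((size, size))
    ys = rng.integers(0, size, n_bubbles)
    xs = rng.integers(0, size, n_bubbles)
    mask[ys, xs] = 1.0
    mask = gaussian_filter(mask, sigma)          # turn points into Gaussian apertures
    return mask / mask.max()

# coarser bands get fewer, larger bubbles: (sigma_fine, sigma_coarse, n, aperture sigma)
bands = [(1, 2, 40, 3), (2, 4, 25, 6), (4, 8, 12, 12)]
stimulus = sum(bandpass(image, f, c) * bubble_mask(n, s) for f, c, n, s in bands)
print(f"masked stimulus: mean {stimulus.mean():.3f}, sd {stimulus.std():.3f}")
```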
ARTICLE UPDATE - Emotion Improves and Impairs Early Vision
Bocanegra BR, Zeelenberg R.
Psychological Science, in press
Recent studies indicate that emotion enhances early vision, but the generality of this finding remains unknown. Do the benefits of emotion extend to all basic aspects of vision, or are they limited in scope? Our results show that the brief presentation of a fearful face, compared with a neutral face, enhances sensitivity for the orientation of subsequently presented low-spatial-frequency stimuli, but diminishes orientation sensitivity for high-spatial-frequency stimuli. This is the first demonstration that emotion not only improves but also impairs low-level vision. The selective low-spatial-frequency benefits are consistent with the idea that emotion enhances magnocellular processing. Additionally, we suggest that the high-spatial-frequency deficits are due to inhibitory interactions between magnocellular and parvocellular pathways. Our results suggest an emotion-induced trade-off in visual processing, rather than a general improvement. This trade-off may benefit perceptual dimensions that are relevant for survival at the expense of those that are less relevant.
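The low- versus high-spatial-frequency split that drives this result can be approximated with a Gaussian blur: the blurred image keeps the low frequencies, and the residual carries the high frequencies. A minimal sketch, with an assumed blur width and a random stand-in stimulus:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(8)
stimulus = rng.standard_normal((128, 128))    # stand-in for a test image

lsf = gaussian_filter(stimulus, sigma=6)      # low spatial frequencies survive the blur
hsf = stimulus - lsf                          # residual carries the high spatial frequencies
print(f"LSF energy: {np.var(lsf):.3f}, HSF energy: {np.var(hsf):.3f}")
```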
Saturday, May 02, 2009
ARTICLE UPDATE - Binding and Inhibition in Episodic Memory - Cognitive, Emotional, and Neural Processes.
Bäuml KH, Pastötter B, Hanslmayr S.
Neuroscience and Biobehavioral Reviews, in press
The goal-directed use of human memory requires that irrelevant or unpleasant memories are, at least temporarily, reduced in their accessibility and memory for more relevant or pleasant information is enhanced, thus making memory more efficient. There is evidence that, in memory, inhibitory processes operate to serve this function. Results from three experimental paradigms are reviewed in which the action of intentionally and unintentionally recruited inhibitory processes has been suggested. The findings provide evidence on representational preconditions for the action of inhibitory processes, specifying binding structures in which inhibitory processes may be triggered and binding structures in which inhibitory processes are generally not observed. The findings also provide evidence on how inhibition affects memory representations, including changes at the memory unit level and changes in the binding between single units. Finally, current knowledge on the interplay between inhibition and emotion and on possible neural correlates of inhibitory processes is reviewed.
ARTICLE UPDATE - Coarse threat images reveal theta oscillations in the amygdala: A magnetoencephalography study.
Maratos FA, Mogg K, Bradley BP, Rippon G, Senior C.
Cognitive, Affective & Behavioral Neuroscience, 9, 133-143
Neurocognitive models propose a specialized neural system for processing threat-related information, in which the amygdala plays a key role in the analysis of threat cues. fMRI research indicates that the amygdala is sensitive to coarse, threat-relevant visual information - for example, low-spatial-frequency (LSF) fearful faces. However, fMRI cannot determine the temporal or spectral characteristics of neural responses. Consequently, we used magnetoencephalography to explore spatiotemporal patterns of activity in the amygdala and cortical regions with blurry (LSF) and normal angry, fearful, and neutral faces. Results demonstrated differences in amygdala activity between LSF threat-related and LSF neutral faces (50-250 msec after face onset). These differences were evident in the theta range (4-8 Hz) and were accompanied by power changes within visual and frontal regions. Our results support the view that the amygdala is involved in the early processing of coarse threat-related information and that theta is important in integrating activity within emotion-processing networks.
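A theta-band power estimate of the kind this MEG result implies can be sketched as a bandpass filter plus a Hilbert envelope, as below; the sampling rate, filter order, and source trace are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(9)
fs = 600                                      # MEG sampling rate in Hz (assumed)
t = np.arange(-0.2, 0.6, 1 / fs)              # epoch around face onset
signal = rng.standard_normal(t.size)          # stand-in amygdala source trace

b, a = butter(4, [4, 8], btype="bandpass", fs=fs)
theta = filtfilt(b, a, signal)                # 4-8 Hz component
power = np.abs(hilbert(theta)) ** 2           # instantaneous theta power

window = (t >= 0.05) & (t <= 0.25)            # 50-250 ms post-onset
print(f"mean theta power, 50-250 ms: {power[window].mean():.3f}")
```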
ARTICLE UPDATE - Do tests of executive functioning predict ability to downregulate emotions spontaneously and when instructed to suppress?
Gyurak A, Goodkind MS, Madan A, Kramer JH, Miller BL, Levenson RW.
Cognitive, Affective & Behavioral Neuroscience, 9, 144-152
Behavioral regulation is a hallmark feature of executive functioning (EF). The present study investigated whether commonly used neuropsychological test measures of EF (i.e., working memory, Stroop, trail making, and verbal fluency) were related to ability to downregulate emotion both spontaneously and when instructed to suppress emotional expressions. To ensure a wide range of EF, 24 frontotemporal lobar degeneration patients, 7 Alzheimer's patients, and 17 neurologically normal controls participated. Participants were exposed to an acoustic startle stimulus (single aversive noise burst) under three conditions: (1) unwarned, (2) warned with no instructions (to measure spontaneous emotion downregulation), and (3) warned with instructions to suppress (to measure instructed emotion downregulation). Results indicated that higher verbal fluency scores were related to greater emotion regulation (operationalized as reduction in body movement and emotional facial behavior when warned of the impending startle) in both regulation conditions. No relationships were found between emotion regulation in these conditions and the other EF measures. We conclude that, of four commonly used measures of EF, verbal fluency best indexes the complex processes of monitoring, evaluation, and control necessary for successful emotion regulation, both spontaneously and following instructions to suppress.
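The regulation measure and its relation to fluency reduce to a difference score and a rank correlation, sketched here on simulated participants; the sample size and the effect built into the fake data are for illustration only.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(10)
n = 48                                                    # participants (assumed)
fluency = rng.normal(40, 10, n)                           # verbal-fluency scores
unwarned = rng.normal(5.0, 1.0, n)                        # startle magnitude, unwarned
warned = unwarned - 0.05 * fluency + rng.normal(0, 1, n)  # built-in effect for illustration

regulation = unwarned - warned                            # larger = more downregulation
rho, p = spearmanr(fluency, regulation)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```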