Van Boven L. White K. Huber M.
Journal of Experimental Psychology: General, 138, 368-382
People tend to perceive immediate emotions as more intense than previous emotions. This immediacy bias in emotion perception occurred for exposure to emotional but not neutral stimuli (Study 1), when emotional stimuli were separated by both shorter (2 s; Studies 1 and 2) and longer (20 min; Studies 3, 4, and 5) delays, and for emotional reactions to pictures (Studies 1 and 2), films (Studies 3 and 4), and descriptions of terrorist threats (Study 5). The immediacy bias may be partly caused by immediate emotion's salience, and by the greater availability of information about immediate compared with previous emotion. Consistent with emotional salience, when people experienced new emotions, they perceived previous emotions as less intense than they did initially (Studies 3 and 5), a change in perception that did not occur when people did not experience a new immediate emotion (Study 2). Consistent with emotional availability, reminding people that information about emotions naturally decays from memory reduced the immediacy bias by making previous emotions seem more intense (Study 4). Implications for psychological theory and for other judgments and behaviors are discussed.
Emotion Rules
This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Wednesday, October 14, 2009
ARTICLE UPDATE - Emotion words, regardless of polarity, have a processing advantage over neutral words.
Kousta ST. Vinson DP. Vigliocco G.
Cognition, 112, 473-481
Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown of the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat. According to a different hypothesis which does not posit a privileged role for the aversive system, valence, regardless of polarity, facilitates processing due to the relevance of both negative and positive stimuli for survival and for the attainment of goals. Here, we present evidence that emotional valence has an overall facilitatory role in the processing of verbal stimuli, providing support for the latter hypothesis. We found no asymmetry between negative and positive words and suggest that previous findings of such an asymmetry can be attributed to failure to control for a number of critical lexical variables and to a sampling bias.
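For readers who want a concrete feel for the analysis the authors call for, here is a minimal Python sketch of a lexical decision regression that controls lexical covariates before testing an emotionality effect. All data, variable names, and effect sizes are simulated for illustration; this is not the authors' analysis.

```python
# Sketch: testing a valence effect on lexical decision RTs while controlling
# lexical covariates (frequency, length). Data are simulated; names are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 300
log_freq = rng.normal(3.0, 1.0, n)           # lexical covariates
length = rng.integers(3, 10, n).astype(float)
valence = rng.choice([-1, 0, 1], n)          # negative / neutral / positive
is_emotional = (valence != 0).astype(float)  # valence regardless of polarity

# Simulated RTs: faster for frequent, short, and emotional words
rt = 700 - 30 * log_freq + 15 * length - 25 * is_emotional + rng.normal(0, 40, n)

X = np.column_stack([log_freq, length, is_emotional])
model = LinearRegression().fit(X, rt)
print("Coefficients (log_freq, length, emotional):", model.coef_)
# A negative 'emotional' coefficient after controlling the lexical covariates
# would indicate facilitation for valenced words, regardless of polarity.
```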
ARTICLE UPDATE - Decoding of emotional information in voice-sensitive cortices.
Ethofer T. Van De Ville D. Scherer K. Vuilleumier P.
Current Biology, 19, 1028-1033
The ability to correctly interpret emotional signals from others is crucial for successful social interaction. Previous neuroimaging studies showed that voice-sensitive auditory areas activate to a broad spectrum of vocally expressed emotions more than to neutral speech melody (prosody). However, this enhanced response occurs irrespective of the specific emotion category, making it impossible to distinguish different vocal emotions with conventional analyses. Here, we presented pseudowords spoken in five prosodic categories (anger, sadness, neutral, relief, joy) during event-related functional magnetic resonance imaging (fMRI), then employed multivariate pattern analysis to discriminate between these categories on the basis of the spatial response pattern within the auditory cortex. Our results demonstrate successful decoding of vocal emotions from fMRI responses in bilateral voice-sensitive areas, which could not be obtained by using averaged response amplitudes only. Pairwise comparisons showed that each category could be classified against all other alternatives, indicating for each emotion a specific spatial signature that generalized across speakers. These results demonstrate for the first time that emotional information is represented by distinct spatial patterns that can be decoded from brain activity in modality-specific cortical areas.
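A rough idea of what pattern-based decoding looks like in practice: the sketch below runs pairwise, cross-validated classification of emotion categories from simulated voxel patterns. It only illustrates the general multivariate approach, not the authors' pipeline, and all data are synthetic.

```python
# Sketch: pairwise decoding of emotion categories from spatial response
# patterns, in the spirit of the multivariate analysis described above.
import numpy as np
from itertools import combinations
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
categories = ["anger", "sadness", "neutral", "relief", "joy"]
n_trials, n_voxels = 40, 200

# Give each category a weak, distinct spatial signature on top of noise
patterns, labels = [], []
for cat in categories:
    signature = rng.normal(0, 1, n_voxels)
    patterns.append(signature * 0.3 + rng.normal(0, 1, (n_trials, n_voxels)))
    labels += [cat] * n_trials
X = np.vstack(patterns)
y = np.array(labels)

# Pairwise classification: each emotion against every alternative
for a, b in combinations(categories, 2):
    mask = np.isin(y, [a, b])
    acc = cross_val_score(SVC(kernel="linear"), X[mask], y[mask], cv=5).mean()
    print(f"{a} vs {b}: {acc:.2f}")
```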
ARTICLE UPDATE - Acoustic profiles of distinct emotional expressions in laughter.
Szameitat DP. Alter K. Szameitat AJ. Wildgruber D. Sterr A. Darwin CJ.
Journal of the Acoustical Society of America, 126, 354-366
Although listeners are able to decode the underlying emotions embedded in acoustical laughter sounds, little is known about the acoustical cues that differentiate between the emotions. This study investigated the acoustical correlates of laughter expressing four different emotions: joy, tickling, taunting, and schadenfreude. Analysis of 43 acoustic parameters showed that the four emotions could be accurately discriminated on the basis of a small parameter set. Vowel quality contributed only minimally to emotional differentiation whereas prosodic parameters were more effective. Emotions are expressed by similar prosodic parameters in both laughter and speech.
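The discrimination result can be pictured with a small cross-validated classifier over a parameter set of the same size. The sketch below uses simulated stand-ins for the 43 acoustic parameters; it illustrates the general approach rather than the authors' statistics.

```python
# Sketch: discriminating four laughter emotions from a set of acoustic
# parameters with a cross-validated linear classifier. Features are simulated.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
emotions = ["joy", "tickling", "taunting", "schadenfreude"]
n_per_class, n_params = 30, 43

X, y = [], []
for emo in emotions:
    centre = rng.normal(0, 1, n_params)      # class-specific acoustic profile
    X.append(centre + rng.normal(0, 1, (n_per_class, n_params)))
    y += [emo] * n_per_class
X, y = np.vstack(X), np.array(y)

acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"Mean cross-validated accuracy: {acc.mean():.2f} (chance = 0.25)")
```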
ARTICLE UPDATE - Effects of emotionally contagious films on changes in hemisphere-specific cognitive performance.
Papousek I. Schulter G. Lang B.
Emotion, 9, 510-519
In the framework of models on the lateralized involvement of the cortical hemispheres in affect and psychopathology, the authors examined whether cognitive processes associated with the left and the right prefrontal cortex varied as a function of valence, motivational direction, or intensity of induced mood. Affective states (cheerfulness, anxiety, sadness, anger, and neutral mood) were experimentally induced by short "emotionally contagious films." Findings confirmed that the newly developed films were suitable to effectively elicit the expected affective states and to differentially change the dimensions of interest. Changes in verbal versus figural fluency performance were examined as a function of positive versus negative valence, approach versus withdrawal motivation, and low versus high emotional arousal. Level of interest was evaluated as a control. Both the tendency to withdraw and emotional arousal seemed to produce relative advantages for cognitive processes that are more strongly represented in the right than left prefrontal cortex. Findings suggest that changes in cognitive performance might be best explained by an additive combination of motivational direction and arousal.
ARTICLE UPDATE - Tell me about it: neural activity elicited by emotional pictures and preceding descriptions.
Macnamara A. Foti D. Hajcak G.
Emotion, 9, 531-543
Emotional pictures elicit enhanced parietal positivities beginning around 300 ms after stimulus presentation. The magnitude of these responses, however, depends on both intrinsic (stimulus-driven) and extrinsic (context-driven) factors. In the present study, event-related potentials were recorded while participants viewed unpleasant and neutral pictures that were described either more neutrally or more negatively prior to presentation. Temporospatial principal components analysis identified early and late positivities: both emotional images and descriptions had independent and additive effects on the early (334 ms) and midlatency (1,066 ms) positivities, whereas the latest positivity (1,688 ms) was sensitive only to description type. Results are discussed with regard to the time course of automatic and controlled processing of emotional stimuli.
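For orientation, here is a minimal temporal PCA on simulated ERP waveforms. Genuine temporospatial PCA also decomposes the electrode dimension, so treat this only as a toy version of the approach named in the abstract.

```python
# Sketch: a minimal temporal PCA on simulated ERP waveforms, loosely analogous
# to the temporospatial PCA mentioned above.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_waveforms, n_timepoints = 120, 600     # e.g. subjects x conditions x electrodes, 1-ms bins
t = np.arange(n_timepoints)

# Simulate two overlapping positivities plus noise
comp1 = np.exp(-0.5 * ((t - 330) / 60) ** 2)
comp2 = np.exp(-0.5 * ((t - 560) / 150) ** 2)
amps = rng.normal(1, 0.3, (n_waveforms, 2))
data = amps[:, [0]] * comp1 + amps[:, [1]] * comp2 + rng.normal(0, 0.2, (n_waveforms, n_timepoints))

pca = PCA(n_components=2).fit(data)
scores = pca.transform(data)             # one score per waveform per component
print("Explained variance:", pca.explained_variance_ratio_)
# Component scores can then be compared across picture type and description
# type, as in the analysis described above.
```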
ARTICLE UPDATE - Finding Comfort in a Joke: Consolatory Effects of Humor Through Cognitive Distraction
Strick M. Holland RW. van Baaren RB. van Knippenberg A.
Emotion, 9, 574-578
This study aimed to demonstrate that the cognitive demands involved in humor processing can attenuate negative emotions. A primary aspect of humor is that it poses cognitive demands needed for incongruency resolution. On the basis of findings that cognitive distraction prevents mood-congruent processing, the authors hypothesized that humorous stimuli attenuate negative emotions to a greater extent than do equally positive nonhumorous stimuli. To test this idea, the authors used a modified version of the picture-viewing paradigm of L. F. Van Dillen and S. L. Koole (2007). Participants viewed neutral, mildly negative, and strongly negative pictures, followed by either a humorous or an equally positive nonhumorous stimulus, and then rated their feelings. Participants reported less negative feelings in both mildly and strongly negative trials with humorous positive stimuli than with nonhumorous positive stimuli. Humor did not differentially affect emotions in the neutral trials. Stimuli that posed greater cognitive demands were more effective in regulating negative emotions than less demanding stimuli. These findings fully support Van Dillen and Koole's working memory model of distraction from negative mood and suggest that humor may attenuate negative emotions as a result of cognitive distraction.
ARTICLE UPDATE - Event-related potential correlates of the extraverts' sensitivity to valence changes in positive stimuli.
Yuan J. He Y. Lei Y. Yang J. Li H.
Neuroreport, 20, 1071-1076
This study investigated whether human sensitivity to valence intensity changes in positive stimuli varies with extraversion. Event-related potentials were recorded for highly positive, moderately positive, and neutral stimuli while participants (extraverts and nonextraverts) performed a standard/deviant categorization task, irrespective of the emotionality of deviants. Extraverts showed larger P2 and P3 amplitudes in the highly positive condition than in the moderately positive condition, which in turn elicited a larger P2 than the neutral condition. Conversely, nonextraverts showed no differences in either the P2 or the P3 component. Thus, extraverts, unlike less extraverted individuals, are sensitive to valence changes in positive stimuli, a sensitivity that may be underpinned by a biogenetic mechanism.
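A conventional way to quantify such effects is to average voltage in fixed post-stimulus windows and compare conditions. The sketch below does this on simulated ERPs; the window boundaries and data are assumptions for illustration, not the authors' values.

```python
# Sketch: extracting mean P2 and P3 amplitudes in fixed time windows and
# comparing two conditions within a group. Windows and data are illustrative.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(4)
n_subjects, n_timepoints = 20, 800        # 1-ms bins from stimulus onset
p2_window = slice(150, 250)               # assumed windows (ms)
p3_window = slice(300, 500)

def mean_amplitude(erp, window):
    """Mean amplitude of subject x time ERP data within a time window."""
    return erp[:, window].mean(axis=1)

# Simulated ERPs for two conditions (highly vs moderately positive)
high = rng.normal(2.0, 1.0, (n_subjects, n_timepoints))
moderate = rng.normal(1.5, 1.0, (n_subjects, n_timepoints))

for name, window in [("P2", p2_window), ("P3", p3_window)]:
    t, p = ttest_rel(mean_amplitude(high, window), mean_amplitude(moderate, window))
    print(f"{name}: t = {t:.2f}, p = {p:.3f}")
```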
ARTICLE UPDATE - Instrumental music influences recognition of emotional body language.
Van den Stock J. Peretz I. Grezes J. de Gelder B.
Brain Topography, 21, 216-220
In everyday life, emotional events are perceived by multiple sensory systems. Research has shown that recognition of emotions in one modality is biased towards the emotion expressed in a simultaneously presented but task irrelevant modality. In the present study, we combine visual and auditory stimuli that convey similar affective meaning but have a low probability of co-occurrence in everyday life. Dynamic face-blurred whole body expressions of a person grasping an object while expressing happiness or sadness are presented in combination with fragments of happy or sad instrumental classical music. Participants were instructed to categorize the emotion expressed by the visual stimulus. The results show that recognition of body language is influenced by the auditory stimuli. These findings indicate that crossmodal influences as previously observed for audiovisual speech can also be obtained from the ignored auditory to the attended visual modality in audiovisual stimuli that consist of whole bodies and music.
Sunday, October 04, 2009
ARTICLE UPDATE - Peripheral vision and preferential emotion processing.
De Cesarei A, Codispoti M, Schupp HT.
Neuroreport, in press
This study investigated the preferential processing of emotional scenes, which were presented in the periphery of the visual field. Building on well-established affective modulations of event-related potentials, which were observed for foveal stimuli, emotional and neutral images were presented at several locations in the visual field, while participants either viewed the pictures or were engaged by a distractor task. The findings clearly show that emotional processing varied with picture eccentricity, with emotional effects being maximal in the center and absent in the far periphery. Moreover, near-peripheral emotional stimuli modulated event-related potentials only when participants were passively viewing them. These results suggest that perceptual processing resources are needed for identification and emotional processing of peripheral stimuli.
ARTICLE UPDATE - Cultural Context Moderates the Relationship Between Emotion Control Values and Cardiovascular Challenge Versus Threat Responses.
Mauss IB, Butler EA.
Biological Psychology, in press
Cultural context affects people's values regarding emotions, as well as their experiential and behavioral but not autonomic physiological responses to emotional situations. Little research, however, has examined how cultural context influences the relationships among values and emotional responding. Specifically, depending on their cultural context, individuals' values about emotion control (ECV; the extent to which they value emotion control) may have differing meanings, and as such, be associated with differing responses in emotional situations. We examined this possibility by testing the effect of two cultural contexts (28 female Asian-American (AA) versus 28 female European-American (EA) undergraduate students) on the associations between individuals' ECV and emotional responding (experiential, behavioral, and cardiovascular) to a relatively neutral film clip and a laboratory anger provocation. In the AA group, greater ECV were associated with reduced anger experience and behavior, and a challenge pattern of cardiovascular responding. In the EA group, greater ECV were associated with reduced anger behavior but not anger experience, and a threat pattern of cardiovascular responding. These results are consistent with the notion that individuals' values about emotion are associated with different meanings in different cultural contexts, and in turn, with different emotional and cardiovascular responses.
ARTICLE UPDATE - Are irrational reactions to unfairness truly emotionally-driven? Dissociated behavioural and emotional responses in the Ultimatum Game
Civai C, Corradi-Dell'acqua C, Gamer M, Rumiati RI.
Cognition, in press
The "irrational" rejections of unfair offers by people playing the Ultimatum Game (UG), a widely used laboratory model of economical decision-making, have traditionally been associated with negative emotions, such as frustration, elicited by unfairness (Sanfey, Rilling, Aronson, Nystrom, & Cohen, 2003; van't Wout, Kahn, Sanfey, & Aleman, 2006). We recorded skin conductance responses as a measure of emotional activation while participants performed a modified version of the UG, in which they were asked to play both for themselves and on behalf of a third-party. Our findings show that even unfair offers are rejected when participants' payoff is not affected (third-party condition); however, they show an increase in the emotional activation specifically when they are rejecting offers directed towards themselves (myself condition). These results suggest that theories emphasizing negative emotions as the critical factor of "irrational" rejections (Pillutla & Murninghan, 1996) should be re-discussed. Psychological mechanisms other than emotions might be better candidates for explaining this behaviour.
ARTICLE UPDATE - Event-related delta and theta synchronization during explicit and implicit emotion processing.
Knyazev GG, Slobodskoj-Plusnin JY, Bocharov AV.
Neuroscience, in press
Emotion information processing may occur in two modes which are differently represented in conscious awareness. Fast online processing involves coarse-grained analysis of salient features, and is not represented in conscious awareness; offline processing takes hundreds of milliseconds to generate fine-grained analysis, and is represented in conscious awareness. These processing modes may be studied using event-related electroencephalogram theta and delta synchronization as a marker of emotion processing. Two experiments were conducted, which differed in the mode of emotional information presentation. In the explicit mode subjects were explicitly instructed to evaluate the emotional content of presented stimuli; in the implicit mode they performed a gender discrimination task. First, we show that in both experiments theta and delta synchronization is stronger upon presentation of "emotional" than "neutral" stimuli, and stronger in subjects who are more sensitive or more emotionally involved than in less sensitive or detached subjects. Second, we show that in the implicit mode theta and delta synchronization is more pronounced in an early (before 250 ms post-stimulus) processing stage, whereas in the explicit mode it is more pronounced in a later processing stage. Source localization analysis showed that implicit processing of angry and happy (relative to neutral) faces is associated with higher early (before 250 ms) theta synchronization in the right parietal cortex and the right insula, respectively. Explicit processing of angry and happy faces is associated with higher late (after 250 ms) theta synchronization in the left temporal lobe and bilateral prefrontal cortex, respectively.
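Event-related synchronization of this kind is commonly computed as the percent change in band-limited power relative to a pre-stimulus baseline. The sketch below does this for a simulated theta burst on a single channel; filter settings and timings are illustrative assumptions, not the authors' parameters.

```python
# Sketch: event-related synchronization (ERS) in the theta band as percent
# power change from baseline, via a band-pass filter and the Hilbert envelope.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                   # sampling rate (Hz)
t = np.arange(-0.5, 1.0, 1 / fs)           # -500 ms to +1000 ms around stimulus
rng = np.random.default_rng(6)

# Simulated trial: background noise plus a theta burst after stimulus onset
eeg = rng.normal(0, 1, t.size)
eeg += 2 * np.sin(2 * np.pi * 6 * t) * (t > 0.1) * (t < 0.5)

# Theta band-pass (4-8 Hz) and instantaneous power via the Hilbert transform
b, a = butter(4, [4, 8], btype="bandpass", fs=fs)
power = np.abs(hilbert(filtfilt(b, a, eeg))) ** 2

baseline = power[t < 0].mean()
ers = 100 * (power - baseline) / baseline  # percent change from baseline
print("Mean theta ERS 0-250 ms:    %.1f%%" % ers[(t >= 0) & (t < 0.25)].mean())
print("Mean theta ERS 250-1000 ms: %.1f%%" % ers[t >= 0.25].mean())
```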
Friday, September 25, 2009
ARTICLE UPDATE - When seeing outweighs feeling: a role for prefrontal cortex in passive control of negative affect in blindsight.
Anders S, Eippert F, Wiens S, Birbaumer N, Lotze M, Wildgruber D.
Brain, in press
Affective neuroscience has been strongly influenced by the view that a 'feeling' is the perception of somatic changes and has consequently often neglected the neural mechanisms that underlie the integration of somatic and other information in affective experience. Here, we investigate affective processing by means of functional magnetic resonance imaging in nine cortically blind patients. In these patients, unilateral postgeniculate lesions prevent primary cortical visual processing in part of the visual field which, as a result, becomes subjectively blind. Residual subcortical processing of visual information, however, is assumed to occur in the entire visual field. As we have reported earlier, these patients show significant startle reflex potentiation when a threat-related visual stimulus is shown in their blind visual field. Critically, this was associated with an increase of brain activity in somatosensory-related areas, and an increase in experienced negative affect. Here, we investigated the patients' response when the visual stimulus was shown in the sighted visual field, that is, when it was visible and cortically processed. Despite the fact that startle reflex potentiation was similar in the blind and sighted visual field, patients reported significantly less negative affect during stimulation of the sighted visual field. In other words, when the visual stimulus was visible and received full cortical processing, the patients' phenomenal experience of affect did not closely reflect somatic changes. This decoupling of phenomenal affective experience and somatic changes was associated with an increase of activity in the left ventrolateral prefrontal cortex and a decrease of affect-related somatosensory activity. Moreover, patients who showed stronger left ventrolateral prefrontal cortex activity tended to show a stronger decrease of affect-related somatosensory activity. Our findings show that similar affective somatic changes can be associated with different phenomenal experiences of affect, depending on the depth of cortical processing. They are in line with a model in which the left ventrolateral prefrontal cortex is a relay station that integrates information about subcortically triggered somatic responses and information resulting from in-depth cortical stimulus processing. Tentatively, we suggest that the observed decoupling of somatic responses and experienced affect, and the reduction of negative phenomenal experience, can be explained by a left ventrolateral prefrontal cortex-mediated inhibition of affect-related somatosensory activity.
ARTICLE UPDATE - The convergence of information about rewarding and aversive stimuli in single neurons.
Morrison SE, Salzman CD.
The Journal of Neuroscience, 29, 11471-11483
Neuroscientists, psychologists, clinicians, and economists have long been interested in how individuals weigh information about potential rewarding and aversive stimuli to make decisions and to regulate their emotions. However, we know relatively little about how appetitive and aversive systems interact in the brain, as most prior studies have investigated only one valence of reinforcement. Previous work has suggested that primate orbitofrontal cortex (OFC) represents information about the reward value of stimuli. We therefore investigated whether OFC also represents information about aversive stimuli, and, if so, whether individual neurons process information about both rewarding and aversive stimuli. Monkeys performed a trace conditioning task in which different novel abstract visual stimuli (conditioned stimuli, CSs) predicted the occurrence of one of three unconditioned stimuli (USs): a large liquid reward, a small liquid reward, or an aversive air-puff. Three lines of evidence suggest that information about rewarding and aversive stimuli converges in individual neurons in OFC. First, OFC neurons often responded to both rewarding and aversive USs, despite their different sensory features. Second, OFC neural responses to CSs often encoded information about both potential rewarding and aversive stimuli, even though these stimuli differed in both valence and sensory modality. Finally, OFC neural responses were correlated with monkeys' behavioral use of information about both rewarding and aversive CS-US associations. These data indicate that processing of appetitive and aversive stimuli converges at the single cell level in OFC, providing a possible substrate for executive and emotional processes that require using information from both appetitive and aversive systems.
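One of the single-neuron tests described above, a response to both US valences relative to baseline, can be sketched with simulated spike counts as below. The firing rates and statistics are illustrative only, not the authors' data or methods.

```python
# Sketch: testing whether a single neuron responds to both rewarding and
# aversive unconditioned stimuli relative to baseline. Counts are simulated.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
n_trials = 50

baseline = rng.poisson(5, n_trials)        # spikes in a pre-stimulus window
reward_us = rng.poisson(9, n_trials)       # large liquid reward
aversive_us = rng.poisson(8, n_trials)     # air-puff

for name, counts in [("reward US", reward_us), ("aversive US", aversive_us)]:
    u, p = mannwhitneyu(counts, baseline, alternative="two-sided")
    print(f"{name}: U = {u:.0f}, p = {p:.4f}")
# A neuron significant in both comparisons would count as responding to
# unconditioned stimuli of both valences.
```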
ARTICLE UPDATE - Propensity and sensitivity measures of fear and disgust are differentially related to emotion-specific brain activation.
Schäfer A, Leutgeb V, Reishofer G, Ebner F, Schienle A.
Neuroscience Letters, in press
Neuroimaging studies on individual differences in experiencing disgust and fear have indicated that disgust propensity and trait anxiety are able to moderate brain activity. The moderating role of disgust sensitivity and anxiety sensitivity has not been investigated thus far. Both sensitivity traits refer to the tendency of a person to perceive harmful consequences of experiencing fear and disgust. Eighteen female subjects viewed and subsequently rated pictures for the elicitation of disgust, fear and a neutral affective state. The viewing of the aversive pictures was associated with activation of visual processing areas, the amygdala, the insula and the orbitofrontal cortex (OFC). In the disgust condition, disgust propensity was positively correlated with activation of attention-related areas (parietal cortex, anterior cingulate cortex (ACC)) and brain regions involved in valence and arousal processing (OFC, insula). For the fear condition, we observed positive correlations between trait anxiety and activation of the ACC, the insula, and the OFC. Correlations between brain activity and sensitivity measures were exclusively negative and concerned areas crucial for emotion regulation, such as the medial and dorsolateral prefrontal cortex (MPFC, DLPFC). Thus, individuals high in disgust/anxiety sensitivity might have difficulty successfully controlling the specific affective experience.
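The brain-trait correlations reported here boil down to correlating a questionnaire score with per-subject activation estimates from a region of interest. A minimal sketch with simulated values (the ROI name and effect direction are assumptions for illustration):

```python
# Sketch: correlating a trait score with per-subject ROI activation estimates
# (e.g., contrast betas). Values are simulated.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(8)
n_subjects = 18

disgust_sensitivity = rng.normal(50, 10, n_subjects)
# Simulate a negative trait-activation relationship in a prefrontal ROI
dlpfc_beta = -0.02 * disgust_sensitivity + rng.normal(0, 0.2, n_subjects)

r, p = pearsonr(disgust_sensitivity, dlpfc_beta)
print(f"r = {r:.2f}, p = {p:.3f}")
```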
ARTICLE UPDATE - Brain networks involved in haptic and visual identification of facial expressions of emotion: An fMRI study
Kitada R, Johnsrude IS, Kochiyama T, Lederman SJ.
Neuroimage, in press
Previous neurophysiological and neuroimaging studies have shown that a cortical network involving the inferior frontal gyrus (IFG), inferior parietal lobe (IPL) and cortical areas in and around the posterior superior temporal sulcus (pSTS) region are employed in action understanding by vision and audition. However, the brain regions that are involved in action understanding by touch are unknown. Lederman et al. (2007) recently demonstrated that humans can haptically recognize facial expressions of emotion (FEE) surprisingly well. Here, we report a functional magnetic resonance imaging (fMRI) study in which we test the hypothesis that the IFG, IPL and pSTS regions are involved in haptic, as well as visual, FEE identification. Twenty subjects haptically or visually identified facemasks with three different FEEs (disgust, neutral and happiness) and casts of shoes (shoes) of three different types. The left posterior middle temporal gyrus, IPL, IFG, and bilateral precentral gyrus were activated by FEE identification relative to that of shoes, regardless of sensory modality. By contrast, an inferomedial part of the left superior parietal lobule was activated by haptic, but not visual, FEE identification. Other brain regions, including the lingual gyrus and superior frontal gyrus, were activated by visual identification of FEEs, relative to haptic identification of FEEs. These results suggest that haptic and visual FEE identification rely on distinct but overlapping neural substrates including the IFG, IPL and pSTS region.
Friday, September 11, 2009
ARTICLE UPDATE - Emotional Conception: How Embodied Emotion Concepts Guide Perception and Facial Action.
Halberstadt J, Winkielman P, Niedenthal PM, Dalle N.
Psychological Science, in press
This study assessed embodied simulation via electromyography (EMG) as participants first encoded emotionally ambiguous faces with emotion concepts (i.e., "angry," "happy") and later passively viewed the faces without the concepts. Memory for the faces was also measured. At initial encoding, participants displayed more smiling-related EMG activity in response to faces paired with "happy" than in response to faces paired with "angry." Later, in the absence of concepts, participants remembered happiness-encoded faces as happier than anger-encoded faces. Further, during passive reexposure to the ambiguous faces, participants' EMG indicated spontaneous emotion-specific mimicry, which in turn predicted memory bias. No specific EMG activity was observed when participants encoded or viewed faces with non-emotion-related valenced concepts, or when participants encoded or viewed Chinese ideographs. From an embodiment perspective, emotion simulation is a measure of what is currently perceived. Thus, these findings provide evidence of genuine concept-driven changes in emotion perception. More generally, the findings highlight embodiment's role in the representation and processing of emotional information.
ARTICLE UPDATE - Interactions of attention, emotion and motivation.
Raymond J.
Progress in Brain Research, 176, 293-308
Although successful visually guided action begins with sensory processes and ends with motor control, the intervening processes related to the appropriate selection of information for processing are especially critical because of the brain's limited capacity to handle information. Three important mechanisms--attention, emotion and motivation--contribute to the prioritization and selection of information. In this chapter, the interplay between these systems is discussed with emphasis placed on interactions between attention (or immediate task relevance of stimuli) and emotion (or affective evaluation of stimuli), and between attention and motivation (or the predicted value of stimuli). Although numerous studies have shown that emotional stimuli modulate mechanisms of selective attention in humans, little work has been directed at exploring whether such interactions can be reciprocal, that is, whether attention can influence emotional response. Recent work on this question (showing that distracting information is typically devalued upon later encounters) is reviewed in the first half of the chapter. The second half describes recent experiments exploring how prior value-prediction learning (i.e., learning to associate potential outcomes, good or bad, with specific stimuli) plays a role in visual selection and conscious perception. The results indicate that some aspects of motivation act on selection independently of traditionally defined attention and other aspects interact with it.
ARTICLE UPDATE - Emotional modulation of visual cortex activity: a functional near-infrared spectroscopy study.
Neuroreport, in press
Functional neuroimaging and electroencephalography reveal emotional effects in the early visual cortex. Here, we used functional near-infrared spectroscopy to examine haemodynamic responses evoked by neutral, positive and negative emotional pictures, matched for brightness, contrast, hue, saturation, spatial frequency and entropy. Emotion content modulated amplitude and latency of oxy, deoxy and total haemoglobin response peaks, and induced peripheral autonomic reactions. The processing of positive and negative pictures enhanced haemodynamic response amplitude, and this effect was paralleled by blood pressure changes. The processing of positive pictures was reflected in reduced haemodynamic response peak latency. Together these data suggest that the early visual cortex holds amplitude-dependent representation of stimulus salience and latency-dependent information regarding stimulus valence, providing new insight into affective interaction with sensory processing.
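Peak amplitude and peak latency of an event-related haemodynamic response can be read off directly once the response is averaged per condition. The sketch below does this on simulated oxyhaemoglobin (HbO) time courses; the response shape and condition differences are illustrative assumptions, not the authors' data.

```python
# Sketch: extracting peak amplitude and latency of event-related HbO responses
# per condition. Responses are simulated; this is not the authors' pipeline.
import numpy as np

fs = 10.0                                  # fNIRS sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)               # 0-20 s after picture onset
rng = np.random.default_rng(9)

def hbo_response(peak_amp, peak_lat):
    """Gamma-like haemodynamic response (peaks at peak_lat) plus noise."""
    shape = (t / peak_lat) ** 3 * np.exp(3 * (1 - t / peak_lat))
    return peak_amp * shape + rng.normal(0, 0.02, t.size)

responses = {
    "neutral": hbo_response(0.20, 6.5),
    "positive": hbo_response(0.30, 5.5),   # larger and earlier (illustrative)
    "negative": hbo_response(0.30, 6.5),   # larger, similar latency
}

for condition, hbo in responses.items():
    i = int(np.argmax(hbo))
    print(f"{condition:9s} peak = {hbo[i]:.2f}, latency = {t[i]:.1f} s")
```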
Wednesday, September 02, 2009
ARTICLE UPDATE - Appetitive vs. defensive responses to emotional cues. Autonomic measures and brain oscillation modulation.
Balconi M, Brambilla E, Falbo L.
Brain Research, in press
The present study explored the effects of subjective evaluation and of individual differences in BIS and BAS (Behavioural Inhibition and Activation Systems) on autonomic measures and brain oscillations in response to appetitive and aversive emotional stimuli. Multiple measures were recorded, including psychophysiological indices (skin conductance response, heart rate, and electromyography) and frequency bands (delta, theta, alpha, and gamma), while participants viewed IAPS pictures that varied in pleasantness (appetitive vs. aversive) and arousing power (high vs. low intensity). Both BIS and BAS measures significantly modulated behavioural, autonomic and brain oscillation responses. Withdrawal (BIS system) and appetitive (BAS system) behaviour showed opposite patterns of response. Also, the response at frontal cortical sites was more pronounced than at other sites. Nevertheless, no specific lateralization effect was found as a function of the BIS/BAS dichotomy. Moreover, autonomic variables and frequency band modulations were affected by valence and arousal ratings per se, with an increased response for high-arousing and negative or positive stimuli in comparison with low-arousing and neutral stimuli. The effects of subjective evaluation and individual differences are discussed in light of the coping activity model of emotion comprehension.
ARTICLE UPDATE - Brain oscillations and BIS/BAS (behavioral inhibition/activation system) effects on processing masked emotional cues: ERS/ERD and coherence.
Balconi M, Mazza G.
International Journal of Psychophysiology, in press
Alpha brain oscillation modulation was analyzed in response to masked emotional facial expressions. In addition, the behavioural activation (BAS) and behavioural inhibition (BIS) systems were considered as explanatory factors to verify the effect of motivational significance on cortical activity. Nineteen subjects viewed a wide range of facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness, and neutral). The results demonstrated that anterior frontal sites were more active than central and posterior sites in response to facial stimuli. Moreover, right-side responses varied as a function of emotion type, with increased right-frontal activity for negative emotions. Finally, whereas higher-BIS subjects showed greater right-hemisphere activation for some negative emotions (such as fear, anger, and surprise), Reward-BAS subjects were more responsive to the positive emotion (happiness) within the left hemisphere. The valence and potential threatening power of facial expressions are considered to elucidate these cortical differences.
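A common way to quantify the kind of left/right frontal differences discussed in this paper is a log-transformed alpha asymmetry index; the snippet below is a minimal Python sketch of that index (the electrode pairing and power values are hypothetical), not the ERS/ERD analysis the authors actually ran.

import numpy as np

def frontal_alpha_asymmetry(alpha_power_left, alpha_power_right):
    """Ln(right) - ln(left) frontal alpha power (e.g., F4 vs. F3).

    Because lower alpha power is conventionally read as greater cortical
    activity, positive values suggest relatively stronger left-frontal
    activation and negative values relatively stronger right-frontal activation.
    """
    return np.log(alpha_power_right) - np.log(alpha_power_left)

print(frontal_alpha_asymmetry(5.1, 4.2))   # toy alpha-power values (microvolts^2)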
ARTICLE UPDATE - Changing Fear: The Neurocircuitry of Emotion Regulation.
Hartley CA, Phelps EA.
Neuropsychopharmacology, in press
The ability to alter emotional responses as circumstances change is a critical component of normal adaptive behavior and is often impaired in psychological disorders. In this review, we discuss four emotional regulation techniques that have been investigated as means to control fear: extinction, cognitive regulation, active coping, and reconsolidation. For each technique, we review what is known about the underlying neural systems, combining findings from animal models and human neuroscience. The current evidence suggests that these different means of regulating fear depend on both overlapping and distinct components of a fear circuitry.
ARTICLE UPDATE - Emotional context modulates response inhibition: Neural and behavioral data.
Albert J, López-Martín S, Carretié L.
Neuroimage, in press
Although recent hemodynamic studies indicate that neural activity related to emotion and that associated with response inhibition constitute closely interrelated and mutually dependent processes, the nature of this relationship is still unclear. In order to explore the temporo-spatial characteristics of the interaction between emotion and inhibition, event-related potentials (ERPs) were measured as participants (N=30) performed a modified version of the Go/Nogo task that required the inhibition of prepotent responses to neutral cues during three different emotional contexts: negative, neutral, and positive. Temporal and spatial Principal Component Analyses were employed to reliably detect and quantify the ERP components related to response inhibition (i.e., Nogo-N2 and Nogo-P3), and a source-localization technique (sLORETA) provided information on their neural origin. Behavioral analyses revealed that reaction times (RTs) to Go cues were shorter during the positive context than during the neutral and negative contexts. ERP analyses showed that suppressing responses to Nogo cues within the positive context elicited larger frontocentral Nogo-P3 amplitudes and greater anterior cingulate cortex (ACC) activation than within the negative context. Regression analyses revealed that Nogo-P3 (i) was inversely related to RTs, supporting its association with the inhibition of a prepotent response, and (ii) was associated with contextual valence (amplitude increased as the context became more positive), but not with contextual arousal. These results suggest that withholding a prepotent response within positively valenced contexts is more difficult and requires more inhibitory control than within negatively valenced contexts.
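The temporal PCA step mentioned above can be hard to picture; here is a bare-bones Python sketch of a temporal PCA over synthetic ERP waveforms (the authors additionally used spatial PCA and source localisation, none of which is reproduced here, and all dimensions below are made up).

import numpy as np
from sklearn.decomposition import PCA

# Toy ERP matrix: rows = averaged waveforms (participant x context), columns =
# time points. In a temporal PCA the time points are the variables, and the
# extracted components play the role of latent ERP components (N2- or P3-like).
rng = np.random.default_rng(1)
n_waveforms, n_timepoints = 90, 300           # e.g., 30 participants x 3 contexts
erps = rng.standard_normal((n_waveforms, n_timepoints))

pca = PCA(n_components=5)
scores = pca.fit_transform(erps)              # component scores per waveform
loadings = pca.components_                    # temporal loadings (time courses)
print(scores.shape, loadings.shape, pca.explained_variance_ratio_.round(2))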
ARTICLE UPDATE - Personal space regulation by the human amygdala.
Kennedy DP, Gläscher J, Tyszka JM, Adolphs R.
Nature Neuroscience, in press
The amygdala plays key roles in emotion and social cognition, but how this translates to face-to-face interactions involving real people remains unknown. We found that an individual with complete amygdala lesions lacked any sense of personal space. Furthermore, healthy individuals showed amygdala activation upon close personal proximity. The amygdala may be required to trigger the strong emotional reactions normally following personal space violations, thus regulating interpersonal distance in humans.
Monday, August 24, 2009
ARTICLE UPDATE - When nonsense sounds happy or helpless: The Implicit Positive and Negative Affect Test (IPANAT).
Quirin M, Kazén M, Kuhl J.
Journal of Personality and Social Psychology, 97, 500-516
This article introduces an instrument for the indirect assessment of positive and negative affect, the Implicit Positive and Negative Affect Test (IPANAT). This test draws on participant ratings of the extent to which artificial words subjectively convey various emotions. Factor analyses of these ratings yielded two independent factors that can be interpreted as implicit positive and negative affect. The corresponding scales show adequate internal consistency, test-retest reliability, stability (Study 1), and construct validity (Study 2). Studies 3 and 4 demonstrate that the IPANAT also measures state variance. Finally, Study 5 provides criterion-based validity by demonstrating that correlations between implicit affect and explicit affect are higher under conditions of spontaneous responding than under conditions of reflective responding to explicit affect scales. The present findings suggest that the IPANAT is a reliable and valid measure with a straightforward application procedure.
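To make the factor-analytic logic concrete, here is a minimal Python sketch that extracts two latent factors from a participants-by-items matrix of ratings; the item count, rating scale and random data are placeholders, and the snippet is not the authors' analysis.

import numpy as np
from sklearn.decomposition import FactorAnalysis

# Toy ratings: 200 participants x 36 items (artificial word x emotion pairings),
# rated on an illustrative 4-point scale.
rng = np.random.default_rng(2)
ratings = rng.integers(1, 5, size=(200, 36)).astype(float)

fa = FactorAnalysis(n_components=2, random_state=0)
factor_scores = fa.fit_transform(ratings)     # per-participant scores on 2 factors
loadings = fa.components_                     # item loadings on the two factors
print(factor_scores.shape, loadings.shape)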
ARTICLE UPDATE - Smile Through Your Fear and Sadness.
Smith FW, Schyns PG.
Psychological Science, in press
It is well established that animal communication signals have adapted to the evolutionary pressures of their environment. For example, the low-frequency vocalizations of the elephant are tailored to long-range communications, whereas the high-frequency trills of birds are adapted to their more localized acoustic niche. Like the voice, the human face transmits social signals about the internal emotional state of the transmitter. Here, we address two main issues: First, we characterized the spectral composition of the facial features signaling each of the six universal expressions of emotion (happiness, sadness, fear, disgust, anger, and surprise). From these analyses, we then predicted and tested the effectiveness of the transmission of emotion signals over different viewing distances. We reveal a gradient of recognition over viewing distances constraining the relative adaptive usefulness of facial expressions of emotion (distal expressions are good signals over a wide range of viewing distances; proximal expressions are suited to closer-range communication).
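One simple way to approximate the loss of high spatial frequencies with viewing distance, the manipulation at the heart of this paper, is to low-pass filter the image; the Python sketch below does exactly that with a Gaussian blur on a random texture (the mapping from distance to blur width is an arbitrary assumption, not the authors' calibrated procedure).

import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_viewing_distance(image, distance_factor):
    """Crude proxy for greater viewing distance: attenuate high spatial
    frequencies with a Gaussian low-pass filter that widens with distance."""
    return gaussian_filter(image, sigma=distance_factor)

# Random texture standing in for a face stimulus
rng = np.random.default_rng(3)
face = rng.random((256, 256))
near = simulate_viewing_distance(face, 1.0)
far = simulate_viewing_distance(face, 8.0)
print(near.std(), far.std())   # contrast energy drops as simulated distance grows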
ARTICLE UPDATE - Response and habituation of the amygdala during processing of emotional prosody.
Wiethoff S, Wildgruber D, Grodd W, Ethofer T.
Neuroreport, in press
The role of the amygdala in processing acoustic information of affective value is still under debate. Using event-related functional MRI (fMRI), we showed increased amygdalar responses to various emotions (anger, fear, happiness, eroticism) expressed by prosody, a means of communication bound to language and consequently unique to humans. The smallest signal increases were found for fearful prosody, a finding that could not be explained by rapid response habituation to stimuli of this emotional category, challenging classical theories about fear specificity of the human amygdala. Our results converge with earlier neuroimaging evidence investigating emotional vocalizations, and these neurobiological similarities suggest that the two forms of communication might have common evolutionary roots.
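Habituation in event-related fMRI designs like this one is often tested with a parametric modulator that decays over successive presentations; the snippet below is a generic Python sketch of such a regressor (the decay constant and mean-centring are assumptions, and the authors' exact model is not specified here).

import numpy as np

def habituation_regressor(n_trials, tau=4.0):
    """Exponentially decaying weight per trial, mean-centred so it can serve
    as a parametric modulator alongside a main stimulus regressor in a GLM."""
    weights = np.exp(-np.arange(n_trials) / tau)
    return weights - weights.mean()

print(habituation_regressor(10).round(3))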
Monday, August 17, 2009
ARTICLE UPDATE - Distinct brain systems underlie the processing of valence and arousal of affective pictures.
Nielen MM, Heslenfeld DJ, Heinen K, Van Strien JW, Witter MP, Jonker C, Veltman DJ.
Brain & Cognition, in press
Valence and arousal are thought to be the primary dimensions of human emotion. However, the degree to which valence and arousal interact in determining brain responses to emotional pictures is still elusive. This functional MRI study aimed to delineate the neural systems responding to valence and arousal, and to their interaction. We measured neural activation in healthy females (N=23) to affective pictures using a 2 (Valence) x 2 (Arousal) design. Results show that arousal was preferentially processed by the middle temporal gyrus, hippocampus and ventrolateral prefrontal cortex. Regions responding to negative valence included visual and lateral prefrontal regions, whereas positive valence activated middle temporal and orbitofrontal areas. Importantly, distinct arousal-by-valence interactions were present in the anterior insula (negative pictures), and in the occipital cortex, parahippocampal gyrus and posterior cingulate (positive pictures). These data demonstrate that the brain not only differentiates between valence and arousal but also responds to specific combinations of the two, thereby highlighting the sophisticated nature of emotion processing in (female) human subjects.
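For readers unfamiliar with factorial contrasts, the sketch below shows how main effects and the valence-by-arousal interaction can be expressed as weights over the four condition estimates of a 2 x 2 design; the beta values are invented and the snippet only illustrates the logic, not the authors' fMRI model.

import numpy as np

# Assumed condition order: [negative/low, negative/high, positive/low, positive/high]
contrasts = {
    "valence (pos > neg)":  np.array([-1, -1,  1,  1]),
    "arousal (high > low)": np.array([-1,  1, -1,  1]),
    "valence x arousal":    np.array([ 1, -1, -1,  1]),
}

betas = np.array([0.2, 0.9, 0.3, 0.4])   # toy per-condition estimates from one region
for name, weights in contrasts.items():
    print(f"{name}: {weights @ betas:+.2f}")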
ARTICLE UPDATE - An electrophysiological investigation into the automaticity of emotional face processing in high versus low trait anxious individuals
Holmes A, Nielsen MK, Tipper S, Green S.
Cognitive, Affective, & Behavioral Neuroscience, 9, 323-334
To examine the extent of automaticity of emotional face processing in high versus low trait anxious participants, event-related potentials (ERPs) were recorded to emotional (fearful, happy) and neutral faces under varying task demands (low load, high load). Results showed that perceptual encoding of emotional faces, as reflected in P1 and early posterior negativity components, was unaffected by the availability of processing resources. In contrast, the postperceptual registration and storage of emotion-related information, as reflected in the late positive potential component at frontal locations, was influenced by the availability of processing resources, and this effect was further modulated by level of trait anxiety. Specifically, frontal ERP augmentations to emotional faces were eliminated in the more demanding task for low trait anxious participants, whereas ERP enhancements to emotional faces were unaffected by task load in high trait anxious participants. This result suggests greater automaticity in processing affective information in high trait anxious participants.
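ERP components such as the P1 and the late positive potential are typically quantified as mean amplitude within a latency window; here is a minimal Python sketch of that step on a synthetic epoch (the windows and sampling rate are illustrative, not the ones used in the study).

import numpy as np

def mean_amplitude(erp, times, window):
    """Mean ERP amplitude within a latency window given in seconds."""
    mask = (times >= window[0]) & (times < window[1])
    return erp[mask].mean()

# Synthetic single epoch: -100 to 798 ms at 500 Hz
times = np.arange(-0.1, 0.8, 0.002)
erp = np.random.default_rng(4).standard_normal(times.size)
print(mean_amplitude(erp, times, (0.08, 0.13)))   # P1-like window
print(mean_amplitude(erp, times, (0.40, 0.70)))   # LPP-like window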
ARTICLE UPDATE - Taboo words: The effect of emotion on memory for peripheral information.
Guillet R, Arndt J.
Memory & Cognition, 37, 866-879
In three experiments, we examined memory for peripheral information that occurred in the same context as emotion-inducing information. In the first two experiments, participants studied either a sentence (Experiment 1) or a pair of words (Experiments 2A-2C) containing a neutral peripheral word together with a neutral, negative-valence, or taboo word intended to induce an emotional response. At retrieval, the participants were asked to recall the neutral peripheral word from a sentence fragment or from the emotion-inducing word cue. In Experiment 3, we presented word pairs at encoding and tested memory with associative recognition. In all three experiments, memory for peripheral words was enhanced when they were encoded in the presence of emotionally arousing taboo words, but not when they were encoded in the presence of words that were only negative in valence. These data are consistent with priority-binding theory (MacKay et al., 2004) and inconsistent with the attention-narrowing hypothesis (Easterbrook, 1959) as well as with object-based binding theory (Mather, 2007).
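The behavioural outcome in these experiments boils down to recall of the peripheral word broken down by the emotion condition of its companion word; the toy Python sketch below shows that scoring step on invented trial records (condition labels and data are placeholders).

from collections import defaultdict

# Toy trial records: (condition of the companion word, peripheral word recalled?)
trials = [("neutral", True), ("negative", False), ("taboo", True),
          ("taboo", True), ("negative", True), ("neutral", False)]

counts = defaultdict(lambda: [0, 0])       # condition -> [recalled, total]
for condition, recalled in trials:
    counts[condition][0] += int(recalled)
    counts[condition][1] += 1

for condition, (hits, total) in counts.items():
    print(f"{condition}: {hits / total:.2f} proportion recalled")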