This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Showing posts with label facial expression.
Monday, March 16, 2009
ARTICLE UPDATE - Emotion perception in emotionless face images suggests a norm-based representation.
Neth D, Martinez AM.
Journal of Vision, 9(1), 5.1-5.11
Perception of facial expressions of emotion is generally assumed to correspond to underlying muscle movement. However, some individuals are often judged to have sadder or angrier faces even when their faces are neutral and motionless. Here, we report on one such effect caused by simple static configural changes. In particular, we show four variations in the relative vertical position of the nose, mouth, eyes, and eyebrows that affect the perception of emotion in neutral faces. The first two configurations make the vertical distance between the eyes and mouth shorter than average, resulting in the perception of an angrier face. The other two configurations make this distance larger than average, resulting in the perception of sadness. These percepts become stronger as the amount of configural change increases, suggesting a representation based on variations from a norm (prototypical) face.
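A minimal sketch (not from the paper) of how a configural measure of this kind could be quantified: the vertical eye-to-mouth distance of a face is expressed as a signed deviation from a norm (average) face, with shorter-than-norm distances corresponding to the "angrier" direction and longer-than-norm distances to the "sadder" direction. The landmark names, coordinates, and norm value below are illustrative assumptions.

```python
# Illustrative sketch: express the eye-mouth distance of a face as a
# deviation from a norm (average) face. All values are hypothetical.

def eye_mouth_distance(landmarks):
    """Vertical distance between the eye midpoint and the mouth centre.

    `landmarks` maps names to (x, y) pixel coordinates, with y increasing
    downwards (the usual image convention).
    """
    eye_y = (landmarks["left_eye"][1] + landmarks["right_eye"][1]) / 2.0
    mouth_y = landmarks["mouth_center"][1]
    return mouth_y - eye_y

def configural_deviation(landmarks, norm_distance):
    """Signed deviation from the norm face, as a proportion of the norm.

    Negative values = shorter eye-mouth distance (perceived angrier);
    positive values = longer distance (perceived sadder), following the
    pattern reported in the abstract.
    """
    return (eye_mouth_distance(landmarks) - norm_distance) / norm_distance

# Example with made-up pixel coordinates:
face = {"left_eye": (120, 210), "right_eye": (200, 210), "mouth_center": (160, 300)}
print(configural_deviation(face, norm_distance=100.0))  # negative -> "angrier" direction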
Saturday, November 22, 2008
ARTICLE UPDATE - Rapid influence of emotional scenes on encoding of facial expressions: an ERP study.
Righart R, de Gelder B.
Social Cognitive and Affective Neuroscience, 3, 270-278
In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 to facial expressions is modulated by the emotional context, it was not clear whether individuals use context information at this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes, and this effect was expressed in left occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing.
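For readers unfamiliar with the measure, the N170 effects reported above come down to comparing mean ERP amplitude in a post-stimulus time window across conditions. A minimal sketch with plain NumPy, assuming baseline-corrected epoched EEG is already available as a (trials x channels x samples) array; the time window, sampling rate, channel indices, and data are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

# epochs: (n_trials, n_channels, n_samples) baseline-corrected EEG in microvolts.
# sfreq: sampling rate in Hz; the epoch is assumed to start at stimulus onset (t = 0).
def mean_amplitude(epochs, sfreq, channels, tmin=0.15, tmax=0.20):
    """Mean amplitude in the tmin-tmax window (seconds) over the given channels,
    averaged across trials -- a simple N170 measure."""
    start, stop = int(tmin * sfreq), int(tmax * sfreq)
    return epochs[:, channels, start:stop].mean()

# Hypothetical comparison of the two critical conditions:
# fearful faces in fearful scenes vs. fearful faces in happy scenes.
rng = np.random.default_rng(0)
fear_in_fear = rng.normal(-6.0, 2.0, size=(40, 64, 256))   # fake data
fear_in_happy = rng.normal(-4.5, 2.0, size=(40, 64, 256))  # fake data
occipito_temporal = [56, 57, 58]  # assumed indices of occipito-temporal sites

print(mean_amplitude(fear_in_fear, 256, occipito_temporal))   # more negative N170
print(mean_amplitude(fear_in_happy, 256, occipito_temporal))
```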
Sunday, June 22, 2008
ARTICLE UPDATE - Mirror neuron activation is associated with facial emotion processing.
Enticott PG, Johnston PJ, Herring SE, Hoy KE, Fitzgerald PB.
Neuropsychologia, in press
Theoretical accounts suggest that mirror neurons play a crucial role in social cognition. The current study used transcranial magnetic stimulation (TMS) to investigate the association between mirror neuron activation and facial emotion processing, a fundamental aspect of social cognition, among healthy adults (n=20). Facial emotion processing of static (but not dynamic) images correlated significantly with an enhanced motor response, proposed to reflect mirror neuron activation. These correlations did not appear to reflect general facial processing or pattern recognition, and provide support to current theoretical accounts linking the mirror neuron system to aspects of social cognition. We discuss the mechanism by which mirror neurons might facilitate facial emotion recognition.
Saturday, June 14, 2008
ARTICLE UPDATE - Decoding of affective facial expressions in the context of emotional situations.
Sommer M, Döhnel K, Meinhardt J, Hajak G.
Neuropsychologia, in press
The ability to recognize other persons' affective states and to link these with aspects of the current situation arises early in development and is a precursor function of a Theory of Mind (ToM). Until now, studies have investigated either the processing of affective faces or affective pictures. In the present study, we tried to realize a scenario more similar to everyday situations. We employed fMRI and used a picture-matching task to explore the neural correlates associated with the integration and decoding of facial affective expressions in the context of affective situations. In the emotion condition, the participants judged an emotional facial expression with respect to the content of an emotional picture. In the two other conditions, participants indicated colour matches on the background of either affective or scrambled pictures. In contrast to colour matching on scrambled pictures, colour matching on emotional pictures resulted in longer reaction times and increased activation of the bilateral fusiform and occipital gyrus. These results indicate that, although the pictures were task-irrelevant, participants may have attended to their emotional background. The emotion task was associated with higher reaction times and with activation of the bilateral fusiform and occipital gyrus. Additionally, emotion attribution induced left amygdala activity. Possibly, attention processes and amygdala projections modulated the activation found in the occipital and fusiform areas. Furthermore, the involvement of the amygdala in the ToM precursor ability to link facial expressions with an emotional situation may indicate that the amygdala is involved in the development of stable ToM abilities.
Saturday, May 31, 2008
ARTICLE UPDATE - Audio-visual integration of emotion expression.
Collignon O, Girard S, Gosselin F, Roy S, Saint-Amour D, Lassonde M, Lepore F.
Brain Research, in press
Although emotions are usually recognized by combining facial and vocal expressions, the multisensory nature of affect perception has scarcely been investigated. In the present study, we show results of three experiments on multisensory perception of emotions using newly validated sets of dynamic visual and non-linguistic vocal clips of affect expressions. In Experiment 1, participants were required to categorise fear and disgust expressions displayed auditorily, visually, or using congruent or incongruent audio-visual stimuli. Results showed faster and more accurate categorisation in the bimodal congruent situation than in the unimodal conditions. In the incongruent situation, participants preferentially categorised the affective expression based on the visual modality, demonstrating a visual dominance in emotional processing. However, when the reliability of the visual stimuli was diminished, participants categorised incongruent bimodal stimuli preferentially via the auditory modality. These results demonstrate that visual dominance in affect perception does not occur in a rigid manner, but follows flexible, situation-dependent rules. In Experiment 2, we asked the participants to pay attention to only one sensory modality at a time in order to test the putative mandatory nature of multisensory affective interactions. We observed that even when they were asked to ignore concurrent sensory information, the irrelevant information significantly affected the processing of the target. This observation was especially true when the target modality was less reliable. Altogether, these findings indicate that the perception of emotion expressions is a robust multisensory process which follows rules that have been previously observed in other perceptual domains.
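One way to express the "visual dominance" result described above is to score, on incongruent audio-visual trials, how often the categorisation response matches the emotion shown in each modality. A minimal sketch; the trial encoding and data are illustrative assumptions, not the authors' scoring scheme.

```python
# Incongruent trials: each trial lists the emotion shown visually, the emotion
# heard auditorily, and the participant's categorisation. Fake example data.
trials = [
    {"visual": "fear", "auditory": "disgust", "response": "fear"},
    {"visual": "disgust", "auditory": "fear", "response": "disgust"},
    {"visual": "fear", "auditory": "disgust", "response": "disgust"},
    {"visual": "disgust", "auditory": "fear", "response": "disgust"},
]

follows_visual = sum(t["response"] == t["visual"] for t in trials) / len(trials)
follows_auditory = sum(t["response"] == t["auditory"] for t in trials) / len(trials)
print(f"responses matching the visual emotion: {follows_visual:.0%}")    # > 50% suggests visual dominance
print(f"responses matching the auditory emotion: {follows_auditory:.0%}")
```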
Friday, March 07, 2008
ARTICLE UPDATE - Valuating other people's emotional face expression: A combined functional magnetic resonance imaging and electroencephalography study
Seitz RJ, Schäfer R, Scherfeld D, Friederichs S, Popp K, Wittsack HJ, Azari NP, Franz M.
Neuroscience, in press
Reading the facial expression of other people is a fundamental skill for social interaction. Human facial expressions of emotions are readily recognized but may also evoke the same experiential emotional state in the observer. We used event-related functional magnetic resonance imaging and multi-channel electroencephalography to determine in 14 right-handed healthy volunteers (29 ± 6 years) which brain structures mediate the perception of such a shared experiential emotional state. Statistical parametric mapping showed that an area in the dorsal medial frontal cortex was specifically activated during the perception of emotions that reflected the seen happy and sad emotional face expressions. This area mapped to the pre-supplementary motor area, which plays a central role in the control of behavior. Low-resolution brain electromagnetic tomography-based analysis of the encephalographic data revealed that the activation was detected 100 ms after face presentation onset and lasted until 740 ms. Our observation substantiates recently emerging evidence suggesting that the subjective perception of an experiential emotional state (empathy) is mediated by the involvement of the dorsal medial frontal cortex.
Thursday, February 21, 2008
ARTICLE UPDATE - The use of aftereffects in the study of relationships among emotion categories.
Rutherford MD, Chattha HM, Krysko KM.
Journal of Experimental Psychology: Human Perception & Performance, 34, 27-40
The perception of visual aftereffects has long been recognized, and these aftereffects reveal relationships between perceptual categories. Thus, emotional expression aftereffects can be used to map the categorical relationships among emotion percepts. One might expect a symmetric relationship among categories, but an evolutionary, functional perspective predicts an asymmetrical relationship. In a series of 7 experiments, the authors tested these predictions. Participants fixated on a facial expression, then briefly viewed a neutral expression, then reported the apparent facial expression of the 2nd image. Experiment 1 revealed that happy and sad are opposites of one another; each evokes the other as an aftereffect. The 2nd and 3rd experiments revealed that fixating on any negative emotion yields an aftereffect perceived as happy, whereas fixating on a happy face results in the perception of a sad aftereffect. This suggests an asymmetric relationship among categories. Experiments 4-7 explored the mechanism driving this effect. The evolutionary and functional explanations for the category asymmetry are discussed.
Labels: aftereffect, emotional category, facial expression
ARTICLE UPDATE - The automaticity of emotion recognition.
Tracy JL, Robins RW.
Emotion, 8, 81-95
Evolutionary accounts of emotion typically assume that humans evolved to quickly and efficiently recognize emotion expressions because these expressions convey fitness-enhancing messages. The present research tested this assumption in 2 studies. Specifically, the authors examined (a) how quickly perceivers could recognize expressions of anger, contempt, disgust, embarrassment, fear, happiness, pride, sadness, shame, and surprise; (b) whether accuracy is improved when perceivers deliberate about each expression's meaning (vs. respond as quickly as possible); and (c) whether accurate recognition can occur under cognitive load. Across both studies, perceivers quickly and efficiently (i.e., under cognitive load) recognized most emotion expressions, including the self-conscious emotions of pride, embarrassment, and shame. Deliberation improved accuracy in some cases, but these improvements were relatively small. Discussion focuses on the implications of these findings for the cognitive processes underlying emotion recognition.
Tuesday, January 22, 2008
ARTICLE UPDATE - Neural circuitry for accurate identification of facial emotions.
Loughead J, Gur RC, Elliott M, Gur RE.
Brain Research, in press
Converging studies have revealed neural circuits for emotion processing, yet none has related activation to identification accuracy. We report a hybrid (block and event-related) fMRI study in 17 healthy adults, which permitted performance-based analysis. As in earlier studies, blocked analysis of the facial emotion identification task showed activation of amygdala, fusiform, thalamus, inferior and midfrontal regions. However, an event-related analysis of target stimuli demonstrated time-locked activation associated with correct identification of happy, sad, angry and fearful faces. Overall, correct detection of angry and fearful faces was associated with greater activation compared to incorrect responses, especially in the amygdala and fusiform gyrus. The opposite was observed for happy and sad faces, where greater thalamic and midfrontal activation portended incorrect responses. Results indicate that the fusiform cortex and amygdala respond differentially in the four target conditions (happy, sad, angry and fearful) along the dimension of threat-relatedness.
Friday, December 14, 2007
ARTICLE UPDATE - Simultaneous recording of EEG and facial muscle reactions during spontaneous emotional mimicry.
Achaibou A, Pourtois G, Schwartz S, Vuilleumier P.
Neuropsychologia, in press
The perception of emotional facial expressions induces covert imitation in emotion-specific muscles of the perceiver's face. The neural processes involved in these spontaneous facial reactions remain largely unknown. Here we concurrently recorded EEG and facial EMG in 15 participants watching short movie clips displaying either happy or angry facial expressions. EMG activity was recorded from the zygomaticus major (ZM), which elevates the lips during a smile, and the corrugator supercilii (CS), which knits the eyebrows during a frown. We found increased EMG activity of CS in response to angry expressions, and enhanced EMG activity of ZM for happy expressions, replicating earlier EMG studies. More importantly, we found that the amplitude of an early visual evoked potential (right P1) was larger when ZM activity to happy faces was high, and when CS activity to angry faces was high, as compared to when muscle reactions were low. Conversely, the amplitude of the right N170 component was smaller when the intensity of facial imitation was high. These combined EEG-EMG results suggest that early visual processing of face expression may determine the magnitude of subsequent facial imitation, with dissociable effects for P1 and N170. These findings are discussed against the classical dual-route model of face recognition.
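The key analysis described above relates single-trial mimicry strength (EMG) to ERP component amplitude, for example via a median split of trials into high versus low mimicry or a direct correlation. A minimal sketch under the assumption that per-trial P1 amplitudes and per-trial zygomaticus responses have already been extracted as vectors; the variable names and data are illustrative, not the authors' pipeline.

```python
import numpy as np
from scipy import stats

# Hypothetical per-trial measures for happy-face clips:
# p1_amp: right-occipital P1 amplitude (microvolts); zm_emg: zygomaticus response (z-scored).
rng = np.random.default_rng(1)
zm_emg = rng.normal(0.0, 1.0, size=120)
p1_amp = 5.0 + 0.8 * zm_emg + rng.normal(0.0, 1.0, size=120)  # fake positive coupling

# Median split: P1 on high- vs. low-mimicry trials (the high/low contrast in the abstract).
high = p1_amp[zm_emg >= np.median(zm_emg)]
low = p1_amp[zm_emg < np.median(zm_emg)]
t, p = stats.ttest_ind(high, low)
print(f"P1, high vs. low mimicry trials: t = {t:.2f}, p = {p:.3g}")

# Or a direct single-trial correlation between mimicry strength and P1 amplitude.
r, p = stats.pearsonr(zm_emg, p1_amp)
print(f"EMG-P1 correlation: r = {r:.2f}, p = {p:.3g}")
```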
Monday, November 19, 2007
ARTICLE UPDATE - Happy and fearful emotion in cues and targets modulate event-related potential indices of gaze-directed attentional orienting
Harlan M. Fichtenholtz, Joseph B. Hopfinger, Reiko Graham, Jacqueline M. Detwiler and Kevin S. LaBar
Social Cognitive and Affective Neuroscience, 2, 323-333
The goal of the present study was to characterize the effects of valence in facial cues and object targets on event-related potential (ERP) indices of gaze-directed orienting. Participants were shown faces at fixation that concurrently displayed dynamic gaze shifts and expression changes from neutral to fearful or happy emotions. Emotionally salient target objects subsequently appeared in the periphery and were spatially congruent or incongruent with the gaze direction. ERPs were time-locked to target presentation. Three sequential ERP components were modulated by happy emotion, indicating a progression from an expression effect to a gaze-by-expression interaction to a target emotion effect. These effects included larger P1 amplitude over contralateral occipital sites for targets following happy faces, larger centrally distributed N1 amplitude for targets following happy faces with leftward gaze, and faster P3 latency for positive targets. In addition, parietally distributed P3 amplitude was reduced for validly cued targets following fearful expressions. Results are consistent with accounts of attentional broadening and motivational approach by happy emotion, and facilitation of spatially directed attention in the presence of fearful cues. The findings have implications for understanding how socioemotional signals in faces interact with each other and with emotional features of objects in the environment to alter attentional processes.
Friday, October 19, 2007
ARTICLE UPDATE - Multiple Cues in Social Perception: The Time Course of Processing Race and Facial Expression.
Kubota JT, Ito TA.
Journal of Experimental Social Psychology, 43, 738-752
The purpose of the present study was to examine the time course of race and expression processing to determine how these cues influence early perceptual as well as explicit categorization judgments. Despite their importance in social perception, little research has examined how social category information and emotional expression are processed over time. Moreover, although models of face processing suggest that the two cues should be processed independently, this has rarely been directly examined. Event-related brain potentials were recorded as participants made race and emotion categorization judgments of Black and White men posing either happy, angry, or neutral expressions. Our findings support the view that race and emotion cues are processed independently and in parallel, relatively early in processing.
Friday, September 21, 2007
ARTICLE UPDATE - Social anxiety and interpretation biases for facial displays of emotion: Emotion detection and ratings of social cost.
Schofield CA, Coles ME, Gibb BE.
Behaviour Research and Therapy, in press
The current study assessed the processing of facial displays of emotion (Happy, Disgust, and Neutral) of varying emotional intensities in participants with high vs. low social anxiety. Use of facial expressions of varying intensities allowed for strong external validity and a fine-grained analysis of interpretation biases. Sensitivity to perceiving negative evaluation in faces (i.e., emotion detection) was assessed at both long (unlimited) and brief (60 ms) stimulus durations. In addition, ratings of perceived social cost were made, indicating what participants judged it would be like to have a social interaction with a person exhibiting the stimulus emotion. Results suggest that high social anxiety participants did not demonstrate biases in their sensitivity to perceiving negative evaluation (i.e., disgust) in facial expressions. However, high social anxiety participants did estimate the perceived cost of interacting with someone showing disgust to be significantly greater than low social anxiety participants did, regardless of the intensity of the disgust expression. These results are consistent with a specific type of interpretation bias in which participants with social anxiety have elevated ratings of the social cost of interacting with individuals displaying negative evaluation.
ARTICLE UPDATE - Dynamics of Visual Information Integration in the Brain for Categorizing Facial Expressions.
Schyns PG, Petro LS, Smith ML.
Current Biology, 17, 1580-1585
A key to understanding visual cognition is to determine when, how, and with what information the human brain distinguishes between visual categories. So far, the dynamics of information processing for categorization of visual stimuli has not been elucidated. By using an ecologically important categorization task (seven expressions of emotion), we demonstrate, in three human observers, that an early brain event (the N170 Event Related Potential, occurring 170 ms after stimulus onset [1-16]) integrates visual information specific to each expression, according to a pattern. Specifically, starting 50 ms prior to the ERP peak, facial information tends to be integrated from the eyes downward in the face. This integration stops, and the ERP peaks, when the information diagnostic for judging a particular expression has been integrated (e.g., the eyes in fear, the corners of the nose in disgust, or the mouth in happiness). Consequently, the duration of information integration from the eyes down determines the latency of the N170 for each expression (e.g., with "fear" being faster than "disgust," itself faster than "happy"). For the first time in visual categorization, we relate the dynamics of an important brain event to the dynamics of a precise information-processing function.
ARTICLE UPDATE - Own-sex effects in emotional memory for faces.
Armony JL, Sergerie K.
Neuroscience Letters, in press
The amygdala is known to be critical for the enhancement of memory for emotional, especially negative, material. Importantly, some researchers have suggested a sex-specific hemispheric lateralization in this process. In the case of facial expressions, another important factor that could influence memory success is the sex of the face, which could interact with the emotion depicted as well as with the sex of the perceiver. Whether this is the case remains unknown, as all previous studies of sex differences in emotional memory have employed affective pictures. Here we directly explored this question using functional magnetic resonance imaging in a subsequent memory paradigm for facial expressions (fearful, happy and neutral). Consistent with our hypothesis, we found that the hemispheric laterality of the amygdala involvement in successful memory for emotional material was influenced not only by the sex of the subjects, as previously proposed, but also by the sex of the faces being remembered. Namely, the left amygdala was more active for successfully remembered female fearful faces in women, whereas in men the right amygdala was more involved in memory for male fearful faces. These results confirm the existence of sex differences in amygdala lateralization in emotional memory but also demonstrate a subtle relationship between the observer and the stimulus in this process.
Tuesday, August 07, 2007
ARTICLE UPDATE - Anxiety and sensitivity to gaze direction in emotionally expressive faces.
Fox E, Mathews A, Calder AJ, Yiend J.
Emotion, 7, 478-486
This study investigated the role of neutral, happy, fearful, and angry facial expressions in enhancing orienting to the direction of eye gaze. Photographs of faces with either direct or averted gaze were presented. A target letter (T or L) appeared unpredictably to the left or the right of the face, either 300 ms or 700 ms after gaze direction changed. Response times were faster in congruent conditions (i.e., when the eyes gazed toward the target) relative to incongruent conditions (when the eyes gazed away from the target letter). Facial expression did influence reaction times, but these effects were qualified by individual differences in self-reported anxiety. High trait-anxious participants showed an enhanced orienting to the eye gaze of faces with fearful expressions relative to all other expressions. In contrast, when the eyes stared straight ahead, trait anxiety was associated with slower responding when the facial expressions depicted anger. Thus, in anxiety-prone people attention is more likely to be held by an expression of anger, whereas attention is guided more potently by fearful facial expressions.
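The gaze-cueing effect in studies like this one is usually summarised as the difference between incongruent and congruent response times, computed separately per expression (and here per anxiety group). A minimal sketch with pandas; the column names, data frame layout, and values are assumptions for illustration only.

```python
import pandas as pd

# One row per trial; assumed columns: group ("high"/"low" trait anxiety),
# expression, congruent (True if gaze pointed at the target), rt (ms). Fake data.
trials = pd.DataFrame({
    "group":      ["high", "high", "high", "high", "low", "low", "low", "low"],
    "expression": ["fear", "fear", "anger", "anger", "fear", "fear", "anger", "anger"],
    "congruent":  [True, False, True, False, True, False, True, False],
    "rt":         [420, 480, 450, 470, 440, 465, 445, 462],
})

mean_rt = trials.groupby(["group", "expression", "congruent"])["rt"].mean().unstack("congruent")
cueing_effect = mean_rt[False] - mean_rt[True]   # incongruent minus congruent, in ms
print(cueing_effect)  # larger values = stronger orienting to gaze for that cell
```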
Monday, August 06, 2007
ARTICLE UPDATE - Differential electrocortical responses to increasing intensities of fearful and happy emotional expressions.
Leppänen JM, Kauppinen P, Peltola MJ, Hietanen JK.
Brain Research, in press
Previous studies have shown differential event-related potentials (ERPs) to fearful and happy/neutral facial expressions. To investigate whether the brain systems underlying these ERP differences are sensitive to the intensity of fear and happiness, behavioral recognition accuracy and reaction times as well as ERPs were measured while observers categorized low-intensity (50%), prototypical (100%), and caricatured (150%) fearful and happy facial expressions. The speed and accuracy of emotion categorization improved with increasing levels of expression intensity, and 100% and 150% expressions were consistently classified as expressions of the intended emotions. Comparison of ERPs to 100% and 150% expressions revealed a differential pattern of ERPs to 100% and 150% fear expressions over occipital-temporal electrodes 190-290 ms post-stimulus (a negative shift in ERP activity for high-intensity fearful expressions). Similar ERP differences were not observed for 100% and 150% happy expressions, ruling out the possibility that the ERPs to high-intensity fear reflected a response to increased expression intensity per se. Together, these results suggest that differential electrocortical responses to fearful facial expressions over posterior electrodes are generated by a neural system that responds to the intensity of negative but not positive emotional expressions.
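Low-intensity and caricatured expressions of the kind used here are typically generated by linear interpolation/extrapolation between a neutral face and the prototypical expression (50% = halfway to the prototype, 150% = exaggerated beyond it). A minimal sketch of that idea on aligned image arrays; this is the generic morphing formula (in practice it is usually applied to landmark/shape coordinates plus texture rather than raw pixels) and not necessarily the authors' exact stimulus pipeline.

```python
import numpy as np

def morph_expression(neutral, prototype, intensity):
    """Blend along the neutral -> expression axis.

    intensity = 0.5 gives a 50% (low-intensity) expression,
    1.0 the prototypical expression, 1.5 a 150% caricature.
    `neutral` and `prototype` are aligned grayscale images of the same shape,
    with values in [0, 1].
    """
    morphed = neutral + intensity * (prototype - neutral)
    return np.clip(morphed, 0.0, 1.0)   # keep values in the valid image range

# Example with tiny fake "images":
neutral = np.full((2, 2), 0.5)
fear_prototype = np.array([[0.2, 0.8], [0.6, 0.4]])
print(morph_expression(neutral, fear_prototype, 1.5))  # exaggerated (caricatured) version
```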
Friday, May 25, 2007
ARTICLE UPDATE - Garner interference reveals dependencies between emotional expression and gaze in face perception.
Graham R, LaBar KS.
Emotion, 7, 296-313
The relationship between facial expression and gaze processing was investigated with the Garner selective attention paradigm. In Experiment 1, participants performed expression judgments without interference from gaze, but expression interfered with gaze judgments. Experiment 2 replicated these results across different emotions. In both experiments, expression judgments occurred faster than gaze judgments, suggesting that expression was processed before gaze could interfere. In Experiments 3 and 4, the difficulty of the emotion discrimination was increased in two different ways. In both cases, gaze interfered with emotion judgments and vice versa. Furthermore, increasing the difficulty of the emotion discrimination resulted in gaze and expression interactions. Results indicate that expression and gaze interactions are modulated by discriminability. Whereas expression generally interferes with gaze judgments, gaze direction modulates expression processing only when facial emotion is difficult to discriminate.
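In the Garner selective-attention paradigm used here, interference is quantified by comparing performance in baseline blocks (the irrelevant dimension held constant) with filtering blocks (the irrelevant dimension varying orthogonally): slower responses in filtering blocks indicate that the irrelevant dimension could not be ignored. A minimal sketch of that computation; the block labels and RT values are illustrative assumptions.

```python
import numpy as np

# Mean correct RTs (ms) for an expression-judgment task; hypothetical values.
rt = {
    "baseline":  np.array([612, 598, 605, 620]),   # gaze held constant
    "filtering": np.array([655, 640, 662, 649]),   # gaze varies orthogonally
}

garner_interference = rt["filtering"].mean() - rt["baseline"].mean()
print(f"Garner interference: {garner_interference:.1f} ms")  # > 0 -> gaze interfered with expression judgments
```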
Friday, March 23, 2007
ARTICLE UPDATE - Imitating expressions: emotion-specific neural substrates in facial mimicry
Lee TW, Josephs O, Dolan RJ, Critchley HD.
Social Cognitive Affective Neuroscience, 1, 122-135.
Intentionally adopting a discrete emotional facial expression can modulate the subjective feelings corresponding to that emotion; however, the underlying neural mechanism is poorly understood. We therefore used functional brain imaging (functional magnetic resonance imaging) to examine brain activity during intentional mimicry of emotional and non-emotional facial expressions and relate regional responses to the magnitude of expression-induced facial movement. Eighteen healthy subjects were scanned while imitating video clips depicting three emotional (sad, angry, happy) and two ‘ingestive’ (chewing and licking) facial expressions. Simultaneously, facial movement was monitored from displacement of fiducial markers (highly reflective dots) on each subject’s face. Imitating emotional expressions enhanced activity within right inferior prefrontal cortex. This pattern was absent during passive viewing conditions. Moreover, the magnitude of facial movement during emotion-imitation predicted responses within right insula and motor/premotor cortices. Enhanced activity was observed in ventromedial prefrontal cortex and frontal pole during imitation of anger, in ventromedial prefrontal and rostral anterior cingulate cortex during imitation of sadness, and in striatal, amygdala and occipitotemporal regions during imitation of happiness. Our findings suggest a central role for right inferior frontal gyrus in the intentional imitation of emotional expressions. Further, by entering metrics for facial muscular change into the analysis of brain imaging data, we highlight shared and discrete neural substrates supporting the affective, action and social consequences of somatomotor emotional expression.
ARTICLE UPDATE - The neural mechanism of imagining facial affective expression.
Kim SE, Kim JW, Kim JJ, Jeong BS, Choi EA, Jeong YG, Kim JH, Ku J, Ki SW.
Brain Research, in press
To react appropriately in social relationships, we have a tendency to simulate how others think of us through mental imagery. In particular, simulating other people's facial affective expressions through imagery in social situations enables us to enact vivid affective responses, which may be inducible from other people's affective responses that are predicted as results of our mental imagery of future behaviors. Therefore, this ability is an important cognitive feature of diverse advanced social cognition in humans. We used functional magnetic resonance imaging to examine brain activation during the imagery of emotional facial expressions as compared to neutral facial expressions. Twenty-one right-handed subjects participated in this study. We observed activation of the amygdala during the imagining of emotional facial affect versus the imagining of neutral facial affect. In addition, we also observed the activation of several areas of the brain, including the dorsolateral prefrontal cortex, ventral premotor cortex, superior temporal sulcus, parahippocampal gyrus, lingual gyrus, and the midbrain. Our results suggest that the areas of the brain known to be involved in the actual perception of affective facial expressions are also implicated in the imagery of affective facial expressions. In particular, given that the processing of information concerning the facial patterning of different emotions and the enactment of behavioral responses, such as autonomic arousal, are central components of the imagery of emotional facial expressions, we postulate a central role for the amygdala in the imagery of emotional facial expressions.