Helfinstein SM, White LK, Bar-Haim Y, Fox NA.
Behaviour Research and Therapy, in press
Anxious individuals show an attention bias towards threatening information. However, under conditions of sustained environmental threat this otherwise-present attention bias disappears. It remains unclear whether this suppression of attention bias can be caused by a transient activation of the fear system. In the present experiment, high socially anxious and low socially anxious individuals (HSA group, n=12; LSA group, n=12) performed a modified dot-probe task in which they were shown either a neutral or socially threatening prime word prior to each trial. EEG was collected and ERP components to the prime and faces displays were computed. HSA individuals showed an attention bias to threat after a neutral prime, but no attention bias after a threatening prime, demonstrating that suppression of attention bias can occur after a transient activation of the fear system. LSA individuals showed the opposite pattern: no evidence of a bias to threat with neutral primes but induction of an attention bias to threat following threatening primes. ERP results suggested differential processing of the prime and faces displays by HSA and LSA individuals. However, no group by prime interaction was found for any of the ERP components.
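The abstract does not spell out how the attention bias is scored, but in dot-probe studies it is commonly computed as the mean reaction time when the probe replaces the neutral stimulus minus the mean reaction time when it replaces the threat stimulus (positive values indicating vigilance toward threat). A minimal sketch of that conventional index, with made-up example values, not the authors' actual analysis:

```python
from statistics import mean

def attention_bias_score(rt_probe_at_neutral, rt_probe_at_threat):
    """Dot-probe attention bias in ms.

    rt_probe_at_neutral: RTs (ms) when the probe appears where the
        neutral face was (incongruent trials).
    rt_probe_at_threat: RTs (ms) when the probe appears where the
        threatening face was (congruent trials).
    Positive score = faster responses at the threat location,
    i.e., attention bias towards threat.
    """
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_threat)

# Hypothetical RTs for illustration only
bias = attention_bias_score([520, 540, 530], [500, 510, 490])
```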
This blog keeps you up to date with the latest emotion-related research. Feel free to browse and contribute.
Saturday, May 31, 2008
ARTICLE UPDATE - When do motor behaviors (mis)match affective stimuli? An evaluative coding view of approach and avoidance reactions.
Eder AB, Rothermund K.
Journal of Experimental Psychology: General, 137, 262-281
Affective-mapping effects between affective stimuli and lever movements are critically dependent upon the evaluative meaning of the response labels that are used in the task instructions. In Experiments 1 and 2, affective-mapping effects predicted by specific-muscle-activation and distance-regulation accounts were replicated when the standard response labels towards and away were used but were reversed when identical lever movements were labeled downwards and upwards. In Experiment 3, affective-mapping effects were produced with affectively labeled right and left lever movements that are intrinsically unrelated to approach and avoidance. Experiments 4 and 5 revealed that affective-mapping effects are not mediated by memory retrieval processes and depend on the execution of affectively coded responses. The results support the assumption that evaluative implications of action instructions assign affective codes to motor responses on a representational level that interact with stimulus evaluations on a response selection stage.
ARTICLE UPDATE - Time course of the involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum in emotional prosody perception.
Hoekert M, Bais L, Kahn RS, Aleman A.
PLoS One, 3, e2244
In verbal communication, not only the meaning of the words conveys information; the tone of voice (prosody) also conveys crucial information about the emotional state and intentions of others. In various studies right frontal and right temporal regions have been found to play a role in emotional prosody perception. Here, we used triple-pulse repetitive transcranial magnetic stimulation (rTMS) to shed light on the precise time course of involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum. We hypothesized that information would be processed in the right anterior superior temporal gyrus before being processed in the right fronto-parietal operculum. Right-handed healthy subjects performed an emotional prosody task. While participants listened to each sentence, a triplet of TMS pulses was applied to one of the regions at one of six time points (400-1900 ms). Results showed a significant main effect of Time for right anterior superior temporal gyrus and right fronto-parietal operculum. The largest interference was observed half-way through the sentence. This effect was stronger for withdrawal emotions than for the approach emotion. A further experiment with the inclusion of an active control condition, TMS over the EEG site POz (midline parietal-occipital junction), revealed stronger effects at the fronto-parietal operculum and anterior superior temporal gyrus relative to the active control condition. No evidence was found for sequential processing of emotional prosodic information from right anterior superior temporal gyrus to the right fronto-parietal operculum; rather, the results revealed more parallel processing. Our results suggest that both right fronto-parietal operculum and right anterior superior temporal gyrus are critical for emotional prosody perception at a relatively late time period after sentence onset.
This may reflect that emotional cues can still be ambiguous at the beginning of sentences, but become more apparent half-way through the sentence.
ARTICLE UPDATE - Audio-visual integration of emotion expression.
Collignon O, Girard S, Gosselin F, Roy S, Saint-Amour D, Lassonde M, Lepore F.
Brain Research, in press
Although emotions are usually recognized by combining facial and vocal expressions, the multisensory nature of affect perception has scarcely been investigated. In the present study, we show results of three experiments on multisensory perception of emotions using newly validated sets of dynamic visual and non-linguistic vocal clips of affect expressions. In Experiment 1, participants were required to categorise fear and disgust expressions displayed auditorily, visually, or using congruent or incongruent audio-visual stimuli. Results showed faster and more accurate categorisation in the bimodal congruent situation than in the unimodal conditions. In the incongruent situation, participants preferentially categorised the affective expression based on the visual modality, demonstrating a visual dominance in emotional processing. However, when the reliability of the visual stimuli was diminished, participants categorised incongruent bimodal stimuli preferentially via the auditory modality. These results demonstrate that visual dominance in affect perception does not occur in a rigid manner, but follows flexible situation-dependent rules. In Experiment 2, we requested the participants to pay attention to only one sensory modality at a time in order to test the putative mandatory nature of multisensory affective interactions. We observed that even if they were asked to ignore concurrent sensory information, the irrelevant information significantly affected the processing of the target. This observation was especially true when the target modality was less reliable. Altogether, these findings indicate that the perception of emotion expressions is a robust multisensory situation which follows rules that have been previously observed in other perceptual domains.
Friday, May 09, 2008
ARTICLE UPDATE - Affect, attention, or anticipatory arousal? Human blink startle modulation in forward and backward affective conditioning.
Mallan KM, Lipp OV, Libera M.
International Journal of Psychophysiology, in press
Affect modulates the blink startle reflex in the picture-viewing paradigm; however, the process responsible for reflex modulation during conditional stimuli (CSs) that have acquired valence through affective conditioning remains unclear. In Experiment 1, neutral shapes (CSs) and valenced or neutral pictures (USs) were paired in a forward (CS-->US) manner. Pleasantness ratings supported affective learning of positive and negative valence. Post-acquisition, blink reflexes were larger during the pleasant and unpleasant CSs than during the neutral CS. Rather than affect, attention or anticipatory arousal were suggested as sources of startle modulation. Experiment 2 confirmed that affective learning in the picture-picture paradigm was not affected by whether the CS preceded the US. Pleasantness ratings and affective priming revealed similar extents of affective learning following forward, backward or simultaneous pairings of CSs and USs. Experiment 3 utilized a backward conditioning procedure (US-->CS) to minimize effects of US anticipation. Again, blink reflexes were larger during CSs paired with valenced USs regardless of US valence, implicating attention rather than anticipatory arousal or affect as the process modulating startle in this paradigm.
ARTICLE UPDATE - Emotion and attention in visual word processing-An ERP study.
Kissler J, Herbert C, Winkler I, Junghofer M.
Biological Psychology, in press
Emotional words are preferentially processed during silent reading. Here, we investigate to what extent different components of the visual evoked potential, namely the P1, N1, the early posterior negativity (EPN, around 250ms after word onset) as well as the late positive complex (LPC, around 500ms) respond differentially to emotional words and whether this response depends on the availability of attentional resources. Subjects viewed random sequences of pleasant, neutral and unpleasant adjectives and nouns. They were first instructed to simply read the words and then to count either adjectives or nouns. No consistent effects emerged for the P1 and N1. However, during both reading and counting the EPN was enhanced for emotionally arousing words (pleasant and unpleasant), regardless of whether the word belonged to a target or a non-target category. A task effect on the EPN was restricted to adjectives, but the effect did not interact with emotional content. The later centro-parietal LPC (450-650ms) showed a large enhancement for the attended word class. A small and topographically distinct emotion-LPC effect was found specifically in response to pleasant words, both during silent reading and the active task. Thus, emotional word content is processed effortlessly and automatically and is not subject to interference from a primary grammatical decision task. The results are in line with other reports of early automatic semantic processing as reflected by posterior negativities in the ERP around 250ms after word onset. Implications for models of emotion-attention interactions in the brain are discussed.
ARTICLE UPDATE - Early emotion word processing: Evidence from event-related potentials.
Scott GG, O'Donnell PJ, Leuthold H, Sereno SC.
Biological Psychology, in press
Behavioral and electrophysiological responses were monitored to 80 controlled sets of emotionally positive, negative, and neutral words presented randomly in a lexical decision paradigm. Half of the words were low frequency and half were high frequency. Behavioral results showed significant effects of frequency and emotion as well as an interaction. Prior research has demonstrated sensitivity to lexical processing in the N1 component of the event-related brain potential (ERP). In this study, the N1 (135-180ms) showed a significant emotion by frequency interaction. The P1 window (80-120ms) preceding the N1 as well as post-N1 time windows, including the Early Posterior Negativity (200-300ms) and P300 (300-450ms), were examined. The ERP data suggest an early identification of the emotional tone of words leading to differential processing. Specifically, high frequency negative words seem to attract additional cognitive resources. The overall pattern of results is consistent with a time line of word recognition in which semantic analysis, including the evaluation of emotional quality, occurs at an early, lexical stage of processing.
ARTICLE UPDATE - Not all emotions are created equal: The negativity bias in social-emotional development.
Vaish A, Grossmann T, Woodward A
Psychological Bulletin, 134, 383-403
There is ample empirical evidence for an asymmetry in the way that adults use positive versus negative information to make sense of their world; specifically, across an array of psychological situations and tasks, adults display a negativity bias, or the propensity to attend to, learn from, and use negative information far more than positive information. This bias is argued to serve critical evolutionarily adaptive functions, but its developmental presence and ontogenetic emergence have never been seriously considered. The authors argue for the existence of the negativity bias in early development and that it is evident especially in research on infant social referencing but also in other developmental domains. They discuss ontogenetic mechanisms underlying the emergence of this bias and explore not only its evolutionary but also its developmental functions and consequences. Throughout, the authors suggest ways to further examine the negativity bias in infants and older children, and they make testable predictions that would help clarify the nature of the negativity bias during early development.
ARTICLE UPDATE - Response bias in "remembering" emotional stimuli: A new perspective on age differences.
Kapucu A, Rotello CM, Ready RE, Seidl KN.
Journal of Experimental Psychology: Learning, Memory and Cognition, 34, 703-711
Older adults sometimes show a recall advantage for emotionally positive, rather than neutral or negative, stimuli (S. T. Charles, M. Mather, & L. L. Carstensen, 2003). In contrast, younger adults respond "old" and "remember" more often to negative materials in recognition tests. For younger adults, both effects are due to response bias changes rather than to enhanced memory accuracy (S. Dougal & C. M. Rotello, 2007). We presented older and younger adults with emotional and neutral stimuli in a remember-know paradigm. Signal-detection and model-based analyses showed that memory accuracy did not differ for the neutral, negative, and positive stimuli, and that "remember" responses did not reflect the use of recollection. However, both age groups showed large and significant response bias effects of emotion: Younger adults tended to say "old" and "remember" more often in response to negative words than to positive and neutral words, whereas older adults responded "old" and "remember" more often to both positive and negative words than to neutral stimuli.
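The signal-detection analyses the abstract mentions separate memory accuracy (sensitivity, d') from response bias (criterion, c): a liberal criterion for emotional words inflates "old" responses without any accuracy change. A minimal sketch of those standard measures, using only the Python standard library; the example hit and false-alarm rates are invented for illustration:

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, false_alarm_rate):
    """Signal-detection sensitivity (d') and criterion (c).

    d' = z(H) - z(F): ability to discriminate old from new items.
    c  = -(z(H) + z(F)) / 2: negative c = liberal bias (more "old"
    responses overall), zero c = unbiased responding.
    Rates must lie strictly between 0 and 1.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    zh, zf = z(hit_rate), z(false_alarm_rate)
    return zh - zf, -(zh + zf) / 2

# Same sensitivity, different bias: a liberal criterion raises both
# hits and false alarms, leaving d' unchanged while c goes negative.
d_neutral, c_neutral = dprime_and_criterion(0.80, 0.20)   # unbiased
d_negative, c_negative = dprime_and_criterion(0.90, 0.35)  # liberal
```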
ARTICLE UPDATE - Conscious emotional experience emerges as a function of multilevel, appraisal-driven response synchronization.
Grandjean D, Sander D, Scherer KR.
Consciousness and Cognition, in press
In this paper we discuss the issue of the processes potentially underlying the emergence of emotional consciousness in the light of theoretical considerations and empirical evidence. First, we argue that componential emotion models, and specifically the Component Process Model (CPM), may be better able to account for the emergence of feelings than basic emotion or dimensional models. Second, we advance the hypothesis that consciousness of emotional reactions emerges when lower levels of processing are not sufficient to cope with the event and regulate the emotional process, particularly when the degree of synchronization between the components reaches a critical level and duration. Third, we review recent neuroscience evidence that bolsters our claim of the central importance of the synchronization of neuronal assemblies at different levels of processing.