Friday, September 28, 2007

ARTICLE UPDATE - The Neural Bases of Emotion Regulation: Reappraisal and Suppression of Negative Emotion.

Goldin PR, McRae K, Ramel W, Gross JJ.

Biological Psychiatry, in press

Background

Emotion regulation strategies are thought to differ in when and how they influence the emotion-generative process. However, no study to date has directly probed the neural bases of two contrasting (e.g., cognitive versus behavioral) emotion regulation strategies. This study used functional magnetic resonance imaging (fMRI) to examine cognitive reappraisal (a cognitive strategy thought to have its impact early in the emotion-generative process) and expressive suppression (a behavioral strategy thought to have its impact later in the emotion-generative process).

Methods

Seventeen women viewed 15-sec neutral and negative emotion-eliciting films under four conditions—watch-neutral, watch-negative, reappraise-negative, and suppress-negative—while providing emotion experience ratings and having their facial expressions videotaped.

Results

Reappraisal resulted in early (0–4.5 sec) prefrontal cortex (PFC) responses, decreased negative emotion experience, and decreased amygdala and insular responses. Suppression produced late (10.5–15 sec) PFC responses, decreased negative emotion behavior and experience, but increased amygdala and insular responses.

Conclusions

These findings demonstrate the differential efficacy of reappraisal and suppression on emotional experience, facial behavior, and neural response and highlight intriguing differences in the temporal dynamics of these two emotion regulation strategies.

Key Words: Amygdala; cognitive control; emotion; emotion regulation; fMRI; insula

ARTICLE UPDATE - Self produced and observed actions influence emotion: the roles of action fluency and eye gaze.

Hayes AE, Paul MA, Beuger B, Tipper SP.

Psychological Research, in press

Affective responses to objects can be influenced by cognitive processes such as perceptual fluency. Here we investigated whether the quality of motor interaction with an object influences affective response to the object. Participants grasped and moved objects using either a fluent action or a non-fluent action (avoiding an obstacle). Liking ratings were higher for objects in the fluent condition. Two further studies investigated whether the fluency of another person's actions influences affective response. Observers watched movie clips of the motor actions described above, in conditions where the observed actor could be seen to be looking towards the grasped object, or where the actor's head and gaze were not visible. Two results were observed: First, when the actor's gaze cannot be seen, liking ratings of the objects are reduced. Second, action fluency of observed actions does influence liking ratings, but only when the actor's gaze towards the object is visible. These findings provide supporting evidence for the important role of observed eye gaze in action simulation, and demonstrate that non-emotive actions can evoke empathic states in observers.

Friday, September 21, 2007

ARTICLE UPDATE - Things are sounding up: affective influences on auditory tone perception.

Weger UW, Meier BP, Robinson MD, Inhoff AW.

Psychonomic Bulletin & Review, 14, 517-521

Recent studies have documented robust and intriguing associations between affect and performance in cognitive tasks. The present two experiments sought to extend this line of work with reference to potential cross-modal effects. Specifically, the present studies examined whether word evaluations would bias subsequent judgments of low- and high-pitch tones. Because affective metaphors and related associations consistently indicate that positive is high and negative is low, we predicted and found that positive evaluations biased tone judgment in the direction of high-pitch tones, whereas the opposite was true of negative evaluations. Effects were found on accuracy rates, response biases, and reaction times. These effects occurred despite the irrelevance of prime evaluations to the tone judgment task. In addition to clarifying the nature of these cross-modal associations, the present results further the idea that affective evaluations exert large effects on perceptual judgments related to verticality.

ARTICLE UPDATE - Social anxiety and interpretation biases for facial displays of emotion: Emotion detection and ratings of social cost.

Schofield CA, Coles ME, Gibb BE.

Behaviour Research and Therapy, in press

The current study assessed the processing of facial displays of emotion (Happy, Disgust, and Neutral) of varying emotional intensities in participants with high vs. low social anxiety. Use of facial expressions of varying intensities allowed for strong external validity and a fine-grained analysis of interpretation biases. Sensitivity to perceiving negative evaluation in faces (i.e., emotion detection) was assessed at both long (unlimited) and brief (60 ms) stimulus durations. In addition, ratings of perceived social cost were made indicating what participants judged it would be like to have a social interaction with a person exhibiting the stimulus emotion. Results suggest that high social anxiety participants did not demonstrate biases in their sensitivity to perceiving negative evaluation (i.e., disgust) in facial expressions. However, high social anxiety participants did estimate the perceived cost of interacting with someone showing disgust to be significantly greater than did low social anxiety participants, regardless of the intensity of the disgust expression. These results are consistent with a specific type of interpretation bias in which participants with social anxiety have elevated ratings of the social cost of interacting with individuals displaying negative evaluation.

ARTICLE UPDATE - Dynamics of Visual Information Integration in the Brain for Categorizing Facial Expressions.

Schyns PG, Petro LS, Smith ML.

Current Biology, 17, 1580-1585

A key to understanding visual cognition is to determine when, how, and with what information the human brain distinguishes between visual categories. So far, the dynamics of information processing for categorization of visual stimuli have not been elucidated. By using an ecologically important categorization task (seven expressions of emotion), we demonstrate, in three human observers, that an early brain event (the N170 Event Related Potential, occurring 170 ms after stimulus onset [1-16]) integrates visual information specific to each expression, according to a pattern. Specifically, starting 50 ms prior to the ERP peak, facial information tends to be integrated from the eyes downward in the face. This integration stops, and the ERP peaks, when the information diagnostic for judging a particular expression has been integrated (e.g., the eyes in fear, the corners of the nose in disgust, or the mouth in happiness). Consequently, the duration of information integration from the eyes down determines the latency of the N170 for each expression (e.g., with "fear" being faster than "disgust," itself faster than "happy"). For the first time in visual categorization, we relate the dynamics of an important brain event to the dynamics of a precise information-processing function.

ARTICLE UPDATE - Own-sex effects in emotional memory for faces.

Armony JL, Sergerie K.

Neuroscience Letters, in press

The amygdala is known to be critical for the enhancement of memory for emotional, especially negative, material. Importantly, some researchers have suggested a sex-specific hemispheric lateralization in this process. In the case of facial expressions, another important factor that could influence memory success is the sex of the face, which could interact with the emotion depicted as well as with the sex of the perceiver. Whether this is the case remains unknown, as all previous studies of sex differences in emotional memory have employed affective pictures. Here we directly explored this question using functional magnetic resonance imaging in a subsequent memory paradigm for facial expressions (fearful, happy and neutral). Consistent with our hypothesis, we found that the hemispheric laterality of the amygdala involvement in successful memory for emotional material was influenced not only by the sex of the subjects, as previously proposed, but also by the sex of the faces being remembered. Namely, the left amygdala was more active for successfully remembered female fearful faces in women, whereas in men the right amygdala was more involved in memory for male fearful faces. These results confirm the existence of sex differences in amygdala lateralization in emotional memory but also demonstrate a subtle relationship between the observer and the stimulus in this process.

ARTICLE UPDATE - "Remembering" emotional words is based on response bias, not recollection.

Dougal S, Rotello CM.

Psychonomic Bulletin & Review, 14, 423-429

Recent studies have demonstrated that emotional stimuli result in a higher proportion of recognized items that are "remembered" (e.g., Kensinger & Corkin, 2003; Ochsner, 2000), leading to greater estimates of recollection by the dual-process model (Yonelinas, 1994). This result suggests that recognition judgments to emotional stimuli depend on a recollection process. We challenge this conclusion with receiver operating characteristic (ROC) curve data from two experiments. In both experiments, subjects studied neutral and emotional words. During the recognition test, subjects made old-new confidence ratings as well as remember-know judgments. Four models of remember-know judgments were fit to individual subjects' data: two versions of a one-dimensional signal-detection-based model (Donaldson, 1996; Wixted & Stretch, 2004), the dual-process model (Yonelinas, 1994), and the two-dimensional signal-detection-based model known as STREAK (Rotello, Macmillan, & Reeder, 2004). Consistent with the literature, we found that emotion increases subjective reports of "remembering." However, our ROC analyses and modeling work reveal that the effect is due to response bias differences rather than sensitivity change or use of a high-threshold recollection process.
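As an aside on the method: the bias-versus-sensitivity distinction the authors lean on comes straight from signal detection theory. The sketch below is a purely illustrative Python example with made-up counts (not the authors' analysis or data); it shows how old-new confidence ratings trace an empirical ROC and how the equal-variance Gaussian model separates sensitivity (d') from criterion placement (c), i.e., response bias.

    # Illustrative sketch only: separating sensitivity from response bias
    # in an old-new recognition task. Assumes 6-point confidence ratings
    # (6 = "sure old" ... 1 = "sure new"); all counts are hypothetical.
    import numpy as np
    from scipy.stats import norm

    # Hypothetical frequency counts per confidence level, highest first.
    old_counts = np.array([40, 25, 15, 10, 6, 4])    # studied (old) items
    new_counts = np.array([8, 12, 15, 20, 20, 25])   # unstudied (new) items

    # Cumulative hit and false-alarm rates give the empirical ROC points.
    hits = np.cumsum(old_counts) / old_counts.sum()
    fas  = np.cumsum(new_counts) / new_counts.sum()

    # Equal-variance Gaussian model: sensitivity (d') and bias (c) at the
    # operating point where ratings >= 4 are called "old".
    hit_rate, fa_rate = hits[2], fas[2]
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))

    print("ROC points (FA, hit):", list(zip(fas.round(2), hits.round(2))))
    print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")

On this view, a manipulation that only shifts the criterion moves the operating point along a fixed ROC curve, whereas a genuine change in memory strength or recollection lifts the curve itself, which is the distinction the ROC analyses in this paper are designed to test.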

Monday, September 17, 2007

ARTICLE UPDATE - Taking the Feeling out of Emotional Memories - A Study of Hypnotic Emotional Numbing: A Brief Communication

Bryant RA, Fearns S.

International Journal of Clinical and Experimental Hypnosis, 55, 426-434

This study investigated the influence of hypnotic emotional inhibition on emotional response to and recall of emotional features of autobiographical memories. Twenty-nine high hypnotizable participants were administered a hypnotic induction and either emotional suppression or control instructions and then were asked to recall a personal distressing or neutral autobiographical memory. Dependent variables included self-reported emotion, EMG corrugator muscle activity, and use of affective descriptors in autobiographical memories. Participants in the suppression condition displayed less emotional responsivity on self-report and EMG corrugator muscle activity than other participants during recall of the distressing memory. In contrast, emotional suppression did not influence the use of affective descriptors in the content of personal memories. These findings point to the capacity for hypnotic emotional inhibition to differentially influence affective and semantic components of the emotional response.

ARTICLE UPDATE - Is a neutral face really evaluated as being emotionally neutral?

Lee E, Kang JI, Park IH, Kim JJ, An SK.

Psychiatry Research, in press

Most of the functional neuroimaging studies on emotion have used neutral faces as a baseline condition. The aim of the present study was to explore whether prototypical neutral faces are evaluated as displaying neutral emotions. Twenty-one subjects performed the Extrinsic Affective Simon Task (EAST), a validated implicit task that measures the emotional evaluation of target stimuli. All stimuli consisted of two juxtaposed faces from standardized facial pictures. The attribute stimuli (positive vs. negative), which needed to be classified on the basis of extrinsic valence, were presented as black and white facial pictures. The target stimuli were color-filtered positive, negative, neutral, and positive/negative faces, and subjects were instructed to classify them on the basis of the filtered color (blue vs. green). Responses to the positive target faces were associated with positive emotions, and responses to the negative target faces were associated with negative emotions. For the neutral faces, the responses were similar to those for negative faces, while for the positive/negative stimuli, the responses were undifferentiated. These findings suggest that prototypical “neutral” faces may be evaluated as negative in some circumstances, indicating that the inclusion of neutral faces as a baseline condition might introduce an experimental confound in functional neuroimaging studies.

Friday, September 07, 2007

ARTICLE UPDATE - Opposing influences of emotional and non-emotional distracters upon sustained prefrontal cortex activity during a delayed-response working memory task.

Dolcos F, Diaz-Granados P, Wang L, McCarthy G.

Neuropsychologia, in press

Performance in delayed-response working memory (WM) tasks is typically associated with sustained activation in the dorsolateral prefrontal cortex (dlPFC) that spans the delay between the memoranda and the memory probe. Recent studies have demonstrated that novel distracters presented during the delay interval both affect sustained activation and impair WM performance. However, the effect of the performance-impairing distracters upon sustained dlPFC delay activity was related to the characteristics of the distracters: memoranda-confusable distracters increased delay activity, whereas memoranda-nonconfusable emotional distracters decreased delay activity. Because these different effects were observed in different studies, it is possible that different dlPFC regions were involved and the paradox is more apparent than real. To investigate this possibility, event-related fMRI data were recorded while subjects performed a WM task for faces with memoranda-confusable (novel faces) and memoranda-nonconfusable emotional (novel scenes) distracters presented during the delay interval. Consistent with previous findings, confusable face distracters increased dlPFC delay activity, while nonconfusable emotional distracters decreased dlPFC delay activity, and these opposing effects modulated activity in the same dlPFC regions. These results provide direct evidence that specific regions of the dlPFC are generally involved in mediating the effects of distraction, while showing sensitivity to the nature of distraction. These findings are relevant for understanding alterations in the neural mechanisms associated with both general impairment of cognitive control and with specific impairment in the ability to control emotional distraction, such as those observed in aging and affective disorders, respectively.

ARTICLE UPDATE - Anxiety and orienting of gaze to angry and fearful faces.

Mogg K, Garner M, Bradley BP.

Biological Psychology, in press

Neuroscience research indicates that individual differences in anxiety may be attributable to a neural system for threat-processing, involving the amygdala, which modulates attentional vigilance, and which is more sensitive to fearful than angry faces. Complementary cognitive studies indicate that high-anxious individuals show enhanced visuospatial orienting towards angry faces, but it is unclear whether fearful faces elicit a similar attentional bias. This study compared biases in initial orienting of gaze to fearful and angry faces, which varied in emotional intensity, in high- and low-anxious individuals. Gaze was monitored whilst participants viewed a series of face-pairs. Results showed that fearful and angry faces elicited similar attentional biases. High-anxious individuals were more likely to direct gaze at intense negative facial expressions, than low-anxious individuals, whereas the groups did not differ in orienting to mild negative expressions. Implications of the findings for research into the neural and cognitive bases of emotion processing are discussed.

ARTICLE UPDATE - Congruency, attentional set, and laterality effects with emotional words.

Techentin C, Voyer D.

Neuropsychology, 21, 646-655

The present study investigated the influence of attention and word-emotion congruency on auditory asymmetries with stimuli that include verbal and emotional components. Words were presented dichotically to 80 participants and were pronounced in either congruent or incongruent emotional tones. Participants were asked to identify the presence of a target word or emotion under 1 of 2 conditions. The blocked condition required detection of a word or emotional target in separate blocks. In the randomized condition, the target was changed across trials by means of a postcue. A right-ear advantage (REA) and a left-ear advantage (LEA) were found for word and emotion targets, respectively. However, the finding of a Condition x Stimulus Type x Ear x Congruency interaction indicated that in the randomized condition, a REA was obtained for words when the stimuli were congruent and a LEA was observed for emotions when the stimuli were incongruent. The findings suggest that randomizing the target reduced the influence of the attentional set established by blocking the target. It is likely that this promoted the detection of hemispheric interference in the randomized condition.

ARTICLE UPDATE - Neural Biases to Covert and Overt Signals of Fear: Dissociation by Trait Anxiety and Depression

Leanne M. Williams, Andrew H. Kemp, Kim Felmingham, Belinda J. Liddell, Donna M. Palmer and Richard A. Bryant

Journal of Cognitive Neuroscience, 19, 1595-1608

Although biases toward signals of fear may be an evolutionary adaptation necessary for survival, heightened biases may be maladaptive and associated with anxiety or depression. In this study, event-related potentials (ERPs) were used to examine the time course of neural responses to facial fear stimuli (versus neutral) presented overtly (for 500 msec with conscious attention) and covertly (for 10 msec with immediate masking to preclude conscious awareness) in 257 nonclinical subjects. We also examined the impact of trait anxiety and depression, assessed using psychometric ratings, on the time course of ERPs. In the total subject group, controlled biases to overtly processed fear were reflected in an enhancement of ERPs associated with structural encoding (120–220 msec) and sustained evaluation persisting from 250 msec and beyond, following a temporo-occipital to frontal topography. By contrast, covert fear processing elicited automatic biases, reflected in an enhancement of ERPs prior to structural encoding (80–180 msec) and again in the period associated with automatic orienting and emotion encoding (230–330 msec), which followed the reverse frontal to temporo-occipital topography. Higher levels of trait anxiety (in the clinical range) were distinguished by a heightened bias to covert fear (speeding of early ERPs), compared to higher depression which was associated with an opposing bias to overt fear (slowing of later ERPs). Anxiety also heightened early responses to covert fear, and depression to overt fear, with subsequent deficits in emotion encoding in each case. These findings are consistent with neural biases to signals of fear which operate automatically and during controlled processing, feasibly supported by parallel networks. Heightened automatic biases in anxiety may contribute to a cycle of hypervigilance and anxious thoughts, whereas depression may represent a "burnt out" emotional state in which evaluation of fear stimuli is prolonged only when conscious attention is allocated.