Wednesday, October 14, 2009

ARTICLE UPDATE - Immediacy Bias in Emotion Perception: Current Emotions Seem More Intense Than Previous Emotions

Van Boven L, White K, Huber M.

Journal of Experimental Psychology: General, 138, 368-382

People tend to perceive immediate emotions as more intense than previous emotions. This immediacy bias in emotion perception occurred for exposure to emotional but not neutral stimuli (Study 1), when emotional stimuli were separated by both shorter (2 s; Studies 1 and 2) and longer (20 min; Studies 3, 4, and 5) delays, and for emotional reactions to pictures (Studies 1 and 2), films (Studies 3 and 4), and descriptions of terrorist threats (Study 5). The immediacy bias may be partly caused by the salience of immediate emotion and by the greater availability of information about immediate compared with previous emotion. Consistent with emotional salience, when people experienced new emotions, they perceived previous emotions as less intense than they did initially (Studies 3 and 5), a change in perception that did not occur when people did not experience a new immediate emotion (Study 2). Consistent with emotional availability, reminding people that information about emotions naturally decays from memory reduced the immediacy bias by making previous emotions seem more intense (Study 4). Implications for psychological theory and for other judgments and behaviors are discussed.

ARTICLE UPDATE - Emotion words, regardless of polarity, have a processing advantage over neutral words.

Kousta ST, Vinson DP, Vigliocco G.

Cognition, 112, 473-481

Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown of the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat. According to a different hypothesis which does not posit a privileged role for the aversive system, valence, regardless of polarity, facilitates processing due to the relevance of both negative and positive stimuli for survival and for the attainment of goals. Here, we present evidence that emotional valence has an overall facilitatory role in the processing of verbal stimuli, providing support for the latter hypothesis. We found no asymmetry between negative and positive words and suggest that previous findings of such an asymmetry can be attributed to failure to control for a number of critical lexical variables and to a sampling bias.

ARTICLE UPDATE - Decoding of emotional information in voice-sensitive cortices.

Ethofer T, Van De Ville D, Scherer K, Vuilleumier P.

Current Biology, 19, 1028-1033

The ability to correctly interpret emotional signals from others is crucial for successful social interaction. Previous neuroimaging studies showed that voice-sensitive auditory areas activate to a broad spectrum of vocally expressed emotions more than to neutral speech melody (prosody). However, this enhanced response occurs irrespective of the specific emotion category, making it impossible to distinguish different vocal emotions with conventional analyses. Here, we presented pseudowords spoken in five prosodic categories (anger, sadness, neutral, relief, joy) during event-related functional magnetic resonance imaging (fMRI), then employed multivariate pattern analysis to discriminate between these categories on the basis of the spatial response pattern within the auditory cortex. Our results demonstrate successful decoding of vocal emotions from fMRI responses in bilateral voice-sensitive areas, which could not be obtained by using averaged response amplitudes only. Pairwise comparisons showed that each category could be classified against all other alternatives, indicating for each emotion a specific spatial signature that generalized across speakers. These results demonstrate for the first time that emotional information is represented by distinct spatial patterns that can be decoded from brain activity in modality-specific cortical areas.
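The pairwise decoding logic the abstract describes can be sketched compactly. The sketch below uses entirely synthetic data (made-up trial counts, voxel counts, and noise levels), and a simple nearest-centroid rule with leave-one-out cross-validation stands in for whatever multivariate classifier the authors actually employed; the point is only to show how category-specific spatial patterns can be classified when averaged amplitudes alone would not separate them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 20 "trials" per emotion category, each a 50-voxel
# spatial response pattern built from a category-specific template plus
# trial-to-trial noise. All numbers here are invented for illustration.
n_trials, n_voxels = 20, 50
templates = {"anger": rng.normal(0, 1, n_voxels),
             "joy": rng.normal(0, 1, n_voxels)}
X, y = [], []
for label, template in templates.items():
    for _ in range(n_trials):
        X.append(template + rng.normal(0, 0.8, n_voxels))
        y.append(label)
X, y = np.array(X), np.array(y)

def loo_nearest_centroid(X, y):
    """Leave-one-out pairwise classification by nearest class centroid."""
    correct = 0
    idx = np.arange(len(y))
    for i in idx:
        train = idx != i
        # Distance from the held-out pattern to each class centroid.
        dists = {label: np.linalg.norm(X[i] - X[train & (y == label)].mean(axis=0))
                 for label in set(y)}
        correct += min(dists, key=dists.get) == y[i]
    return correct / len(y)

acc = loo_nearest_centroid(X, y)
print(f"pairwise decoding accuracy: {acc:.2f}")
```

Leave-one-out cross-validation matters here: because each trial is classified by centroids computed without it, above-chance accuracy reflects a generalizable spatial signature rather than overfitting.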

ARTICLE UPDATE - Acoustic profiles of distinct emotional expressions in laughter.

Szameitat DP, Alter K, Szameitat AJ, Wildgruber D, Sterr A, Darwin CJ.

Journal of the Acoustical Society of America, 126, 354-366

Although listeners are able to decode the underlying emotions embedded in acoustical laughter sounds, little is known about the acoustical cues that differentiate between the emotions. This study investigated the acoustical correlates of laughter expressing four different emotions: joy, tickling, taunting, and schadenfreude. Analysis of 43 acoustic parameters showed that the four emotions could be accurately discriminated on the basis of a small parameter set. Vowel quality contributed only minimally to emotional differentiation whereas prosodic parameters were more effective. Emotions are expressed by similar prosodic parameters in both laughter and speech.

ARTICLE UPDATE - Effects of emotionally contagious films on changes in hemisphere-specific cognitive performance.

Papousek I, Schulter G, Lang B.

Emotion, 9, 510-519

In the framework of models on the lateralized involvement of the cortical hemispheres in affect and psychopathology, the authors examined whether cognitive processes associated with the left and the right prefrontal cortex varied as a function of valence, motivational direction, or intensity of induced mood. Affective states (cheerfulness, anxiety, sadness, anger, and neutral mood) were experimentally induced by short "emotionally contagious films." Findings confirmed that the newly developed films effectively elicited the expected affective states and differentially changed the dimensions of interest. Changes in verbal versus figural fluency performance were examined as a function of positive versus negative valence, approach versus withdrawal motivation, and low versus high emotional arousal. Level of interest was evaluated as a control. Both the tendency to withdraw and emotional arousal seemed to produce relative advantages for cognitive processes that are more strongly represented in the right than in the left prefrontal cortex. Findings suggest that changes in cognitive performance might be best explained by an additive combination of motivational direction and arousal.

ARTICLE UPDATE - Tell me about it: neural activity elicited by emotional pictures and preceding descriptions.

Macnamara A, Foti D, Hajcak G.

Emotion, 9, 531-543

Emotional pictures elicit enhanced parietal positivities beginning around 300 ms after stimulus presentation. The magnitude of these responses, however, depends on both intrinsic (stimulus-driven) and extrinsic (context-driven) factors. In the present study, event-related potentials were recorded while participants viewed unpleasant and neutral pictures that were described either more neutrally or more negatively prior to presentation. Temporospatial principal components analysis identified early and late positivities: both emotional images and descriptions had independent and additive effects on early (334 ms) and midlatency (1,066 ms) positivities, whereas the latest positivity (1,688 ms) was sensitive only to description type. Results are discussed with regard to the time course of automatic and controlled processing of emotional stimuli.

ARTICLE UPDATE - Finding Comfort in a Joke: Consolatory Effects of Humor Through Cognitive Distraction

Strick M, Holland RW, van Baaren RB, van Knippenberg A.

Emotion, 9, 574-578

This study aimed to demonstrate that the cognitive demands involved in humor processing can attenuate negative emotions. A primary aspect of humor is that it poses cognitive demands needed for incongruity resolution. On the basis of findings that cognitive distraction prevents mood-congruent processing, the authors hypothesized that humorous stimuli attenuate negative emotions to a greater extent than do equally positive nonhumorous stimuli. To test this idea, the authors used a modified version of the picture-viewing paradigm of L. F. Van Dillen and S. L. Koole (2007). Participants viewed neutral, mildly negative, and strongly negative pictures, each followed by either a humorous or an equally positive nonhumorous stimulus, and then rated their feelings. Participants reported less negative affect in both mildly and strongly negative trials when the positive stimulus was humorous than when it was nonhumorous. Humor did not differentially affect emotions in the neutral trials. Stimuli that posed greater cognitive demands were more effective in regulating negative emotions than less demanding stimuli. These findings fully support Van Dillen and Koole's working memory model of distraction from negative mood and suggest that humor may attenuate negative emotions through cognitive distraction.

ARTICLE UPDATE - Event-related potential correlates of the extraverts' sensitivity to valence changes in positive stimuli.

Yuan J, He Y, Lei Y, Yang J, Li H.

Neuroreport, 20, 1071-1076

This study investigated whether sensitivity to changes in the valence intensity of positive stimuli varies with extraversion. Event-related potentials were recorded for highly positive, moderately positive, and neutral stimuli while participants (extraverts and nonextraverts) performed a standard/deviant categorization task, irrespective of the emotionality of the deviants. Extraverts showed larger P2 and P3 amplitudes in the highly positive condition than in the moderately positive condition, which in turn elicited larger P2 amplitudes than the neutral condition. Conversely, nonextraverts showed no differences in either the P2 or the P3 component. Thus, extraverts, unlike less extraverted individuals, are sensitive to valence changes in positive stimuli, a sensitivity that may be underpinned by a biogenetic mechanism.

ARTICLE UPDATE - Instrumental music influences recognition of emotional body language.

Van den Stock J, Peretz I, Grezes J, de Gelder B.

Brain Topography, 21, 216-220

In everyday life, emotional events are perceived by multiple sensory systems. Research has shown that recognition of emotions in one modality is biased towards the emotion expressed in a simultaneously presented but task-irrelevant modality. In the present study, we combine visual and auditory stimuli that convey similar affective meaning but have a low probability of co-occurrence in everyday life. Dynamic face-blurred whole-body expressions of a person grasping an object while expressing happiness or sadness are presented in combination with fragments of happy or sad instrumental classical music. Participants were instructed to categorize the emotion expressed by the visual stimulus. The results show that recognition of body language is influenced by the auditory stimuli. These findings indicate that crossmodal influences, as previously observed for audiovisual speech, can also be obtained from the ignored auditory to the attended visual modality in audiovisual stimuli that consist of whole bodies and music.

Sunday, October 04, 2009

ARTICLE UPDATE - Peripheral vision and preferential emotion processing.

De Cesarei A, Codispoti M, Schupp HT.

Neuroreport, in press

This study investigated the preferential processing of emotional scenes, which were presented in the periphery of the visual field. Building on well-established affective modulations of event-related potentials, which were observed for foveal stimuli, emotional and neutral images were presented at several locations in the visual field, while participants either viewed the pictures or were engaged by a distractor task. The findings clearly show that emotional processing varied with picture eccentricity, with emotional effects being maximal in the center and absent in the far periphery. Moreover, near-peripheral emotional stimuli modulated event-related potentials only when participants were passively viewing them. These results suggest that perceptual processing resources are needed for identification and emotional processing of peripheral stimuli.

ARTICLE UPDATE - Cultural Context Moderates the Relationship Between Emotion Control Values and Cardiovascular Challenge Versus Threat Responses.

Mauss IB, Butler EA.

Biological Psychology, in press

Cultural context affects people's values regarding emotions, as well as their experiential and behavioral but not autonomic physiological responses to emotional situations. Little research, however, has examined how cultural context influences the relationships among values and emotional responding. Specifically, depending on their cultural context, individuals' values about emotion control (ECV; the extent to which they value emotion control) may have differing meanings, and as such, be associated with differing responses in emotional situations. We examined this possibility by testing the effect of two cultural contexts (28 female Asian-American (AA) versus 28 female European-American (EA) undergraduate students) on the associations between individuals' ECV and emotional responding (experiential, behavioral, and cardiovascular) to a relatively neutral film clip and a laboratory anger provocation. In the AA group, greater ECV were associated with reduced anger experience and behavior, and a challenge pattern of cardiovascular responding. In the EA group, greater ECV were associated with reduced anger behavior but not anger experience, and a threat pattern of cardiovascular responding. These results are consistent with the notion that individuals' values about emotion are associated with different meanings in different cultural contexts, and in turn, with different emotional and cardiovascular responses.

ARTICLE UPDATE - Are irrational reactions to unfairness truly emotionally-driven? Dissociated behavioural and emotional responses in the Ultimatum Game

Civai C, Corradi-Dell'acqua C, Gamer M, Rumiati RI.

Cognition, in press

The "irrational" rejections of unfair offers by people playing the Ultimatum Game (UG), a widely used laboratory model of economic decision-making, have traditionally been attributed to negative emotions, such as frustration, elicited by unfairness (Sanfey, Rilling, Aronson, Nystrom, & Cohen, 2003; van't Wout, Kahn, Sanfey, & Aleman, 2006). We recorded skin conductance responses as a measure of emotional activation while participants performed a modified version of the UG, in which they played both for themselves and on behalf of a third party. Our findings show that unfair offers are rejected even when participants' own payoff is not affected (third-party condition); however, emotional activation increases specifically when participants reject offers directed at themselves (myself condition). These results suggest that theories emphasizing negative emotions as the critical factor in "irrational" rejections (Pillutla & Murnighan, 1996) should be reconsidered. Psychological mechanisms other than emotions may be better candidates for explaining this behaviour.

ARTICLE UPDATE - Event-Related Delta And Theta Synchronization During Explicit And Implicit Emotion Processing.

Knyazev GG, Slobodskoj-Plusnin JY, Bocharov AV.

Neuroscience, in press

Emotion information processing may occur in two modes which are differently represented in conscious awareness. Fast online processing involves coarse-grained analysis of salient features, and is not represented in conscious awareness; offline processing takes hundreds of milliseconds to generate fine-grained analysis, and is represented in conscious awareness. These processing modes may be studied using event-related electroencephalogram theta and delta synchronization as a marker of emotion processing. Two experiments were conducted, which differed on the mode of emotional information presentation. In the explicit mode subjects were explicitly instructed to evaluate the emotional content of presented stimuli; in the implicit mode they performed a gender discrimination task. Firstly, we show that in both experiments theta and delta synchronization is stronger upon presentation of "emotional" than "neutral" stimuli, and in subjects who are more sensitive, or experience higher emotional involvement than in less sensitive or detached subjects. Secondly, we show that in the implicit mode theta and delta synchronization is more pronounced in an early (before 250 ms post-stimulus) processing stage, whereas in the explicit mode it is more pronounced in a later processing stage. Source localization analysis showed that implicit processing of angry and happy (relative to neutral) faces is associated with higher early (before 250 ms) theta synchronization in the right parietal cortex and the right insula, respectively. Explicit processing of angry and happy faces is associated with higher late (after 250 ms) theta synchronization in the left temporal lobe and bilateral prefrontal cortex, respectively.
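The event-related synchronization (ERS) measure used as the marker here can be sketched on synthetic data. The signal, sampling rate, burst timing, and crude FFT brick-wall band-pass below are all invented stand-ins for the authors' actual pipeline; the sketch only shows the core computation: band-limited power expressed as percentage change from a pre-stimulus baseline.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 250                                   # sampling rate in Hz (assumed)
t = np.arange(-0.5, 1.0, 1 / fs)           # 500 ms baseline + 1 s post-stimulus

# Synthetic single-trial EEG: white background noise plus a theta (6 Hz)
# burst from 100 to 250 ms post-stimulus.
eeg = rng.normal(0, 1, t.size)
burst = (t > 0.1) & (t < 0.25)
eeg[burst] += 3 * np.sin(2 * np.pi * 6 * t[burst])

# Crude theta-band (4-8 Hz) isolation with an FFT brick-wall filter.
spec = np.fft.rfft(eeg)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spec[(freqs < 4) | (freqs > 8)] = 0
theta = np.fft.irfft(spec, t.size)

# Band power, smoothed over ~100 ms, then expressed as percentage change
# relative to the pre-stimulus baseline -- the ERS measure.
power = np.convolve(theta ** 2, np.ones(25) / 25, mode="same")
baseline = power[t < 0].mean()
ers = 100 * (power - baseline) / baseline
print("peak post-stimulus theta ERS (%):", round(ers[(t > 0) & (t < 0.5)].max()))
```

With time-resolved ERS like this, comparing the early (before 250 ms) and late (after 250 ms) windows across task conditions is exactly the kind of contrast the study reports for implicit versus explicit processing.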

Friday, September 25, 2009

ARTICLE UPDATE - When seeing outweighs feeling: a role for prefrontal cortex in passive control of negative affect in blindsight.

Anders S, Eippert F, Wiens S, Birbaumer N, Lotze M, Wildgruber D.

Brain, in press

Affective neuroscience has been strongly influenced by the view that a 'feeling' is the perception of somatic changes and has consequently often neglected the neural mechanisms that underlie the integration of somatic and other information in affective experience. Here, we investigate affective processing by means of functional magnetic resonance imaging in nine cortically blind patients. In these patients, unilateral postgeniculate lesions prevent primary cortical visual processing in part of the visual field which, as a result, becomes subjectively blind. Residual subcortical processing of visual information, however, is assumed to occur in the entire visual field. As we have reported earlier, these patients show significant startle reflex potentiation when a threat-related visual stimulus is shown in their blind visual field. Critically, this was associated with an increase of brain activity in somatosensory-related areas, and an increase in experienced negative affect. Here, we investigated the patients' response when the visual stimulus was shown in the sighted visual field, that is, when it was visible and cortically processed. Despite the fact that startle reflex potentiation was similar in the blind and sighted visual field, patients reported significantly less negative affect during stimulation of the sighted visual field. In other words, when the visual stimulus was visible and received full cortical processing, the patients' phenomenal experience of affect did not closely reflect somatic changes. This decoupling of phenomenal affective experience and somatic changes was associated with an increase of activity in the left ventrolateral prefrontal cortex and a decrease of affect-related somatosensory activity. Moreover, patients who showed stronger left ventrolateral prefrontal cortex activity tended to show a stronger decrease of affect-related somatosensory activity. 
Our findings show that similar affective somatic changes can be associated with different phenomenal experiences of affect, depending on the depth of cortical processing. They are in line with a model in which the left ventrolateral prefrontal cortex is a relay station that integrates information about subcortically triggered somatic responses and information resulting from in-depth cortical stimulus processing. Tentatively, we suggest that the observed decoupling of somatic responses and experienced affect, and the reduction of negative phenomenal experience, can be explained by a left ventrolateral prefrontal cortex-mediated inhibition of affect-related somatosensory activity.

ARTICLE UPDATE - The convergence of information about rewarding and aversive stimuli in single neurons.

Morrison SE, Salzman CD.

The Journal of Neuroscience, 29, 11471-11483

Neuroscientists, psychologists, clinicians, and economists have long been interested in how individuals weigh information about potential rewarding and aversive stimuli to make decisions and to regulate their emotions. However, we know relatively little about how appetitive and aversive systems interact in the brain, as most prior studies have investigated only one valence of reinforcement. Previous work has suggested that primate orbitofrontal cortex (OFC) represents information about the reward value of stimuli. We therefore investigated whether OFC also represents information about aversive stimuli, and, if so, whether individual neurons process information about both rewarding and aversive stimuli. Monkeys performed a trace conditioning task in which different novel abstract visual stimuli (conditioned stimuli, CSs) predicted the occurrence of one of three unconditioned stimuli (USs): a large liquid reward, a small liquid reward, or an aversive air-puff. Three lines of evidence suggest that information about rewarding and aversive stimuli converges in individual neurons in OFC. First, OFC neurons often responded to both rewarding and aversive USs, despite their different sensory features. Second, OFC neural responses to CSs often encoded information about both potential rewarding and aversive stimuli, even though these stimuli differed in both valence and sensory modality. Finally, OFC neural responses were correlated with monkeys' behavioral use of information about both rewarding and aversive CS-US associations. These data indicate that processing of appetitive and aversive stimuli converges at the single cell level in OFC, providing a possible substrate for executive and emotional processes that require using information from both appetitive and aversive systems.

ARTICLE UPDATE - Propensity and sensitivity measures of fear and disgust are differentially related to emotion-specific brain activation.

Schäfer A, Leutgeb V, Reishofer G, Ebner F, Schienle A.

Neuroscience Letters, in press

Neuroimaging studies on individual differences in experiencing disgust and fear have indicated that disgust propensity and trait anxiety are able to moderate brain activity. The moderating role of disgust sensitivity and anxiety sensitivity has not been investigated thus far. Both sensitivity traits refer to the tendency of a person to perceive harmful consequences of experiencing fear and disgust. Eighteen female subjects viewed and subsequently rated pictures for the elicitation of disgust, fear and a neutral affective state. The viewing of the aversive pictures was associated with activation of visual processing areas, the amygdala, the insula and the orbitofrontal cortex (OFC). In the disgust condition, disgust propensity was positively correlated with activation of attention-related areas (parietal cortex, anterior cingulate cortex (ACC)) and brain regions involved in valence and arousal processing (OFC, insula). For the fear condition, we observed positive correlations between trait anxiety and activation of the ACC, the insula, and the OFC. Correlations between brain activity and sensitivity measures were exclusively negative and concerned areas crucial for emotion regulation, such as the medial and dorsolateral prefrontal cortex (MPFC, DLPFC). Thus, individuals high in disgust/anxiety sensitivity may have difficulty successfully controlling the specific affective experience.

ARTICLE UPDATE - Brain networks involved in haptic and visual identification of facial expressions of emotion: An fMRI study

Kitada R, Johnsrude IS, Kochiyama T, Lederman SJ.

Neuroimage, in press

Previous neurophysiological and neuroimaging studies have shown that a cortical network involving the inferior frontal gyrus (IFG), inferior parietal lobe (IPL) and cortical areas in and around the posterior superior temporal sulcus (pSTS) region is employed in action understanding by vision and audition. However, the brain regions that are involved in action understanding by touch are unknown. Lederman et al. (2007) recently demonstrated that humans can haptically recognize facial expressions of emotion (FEEs) surprisingly well. Here, we report a functional magnetic resonance imaging (fMRI) study in which we test the hypothesis that the IFG, IPL and pSTS regions are involved in haptic, as well as visual, FEE identification. Twenty subjects haptically or visually identified facemasks displaying three different FEEs (disgust, neutral and happiness) and casts of three different types of shoes. The left posterior middle temporal gyrus, IPL, IFG, and bilateral precentral gyrus were activated by FEE identification relative to that of shoes, regardless of sensory modality. By contrast, an inferomedial part of the left superior parietal lobule was activated by haptic, but not visual, FEE identification. Other brain regions, including the lingual gyrus and superior frontal gyrus, were activated by visual identification of FEEs, relative to haptic identification of FEEs. These results suggest that haptic and visual FEE identification rely on distinct but overlapping neural substrates including the IFG, IPL and pSTS region.

Friday, September 11, 2009

ARTICLE UPDATE - Emotional Conception: How Embodied Emotion Concepts Guide Perception and Facial Action.

Halberstadt J, Winkielman P, Niedenthal PM, Dalle N.

Psychological Science, in press

This study assessed embodied simulation via electromyography (EMG) as participants first encoded emotionally ambiguous faces with emotion concepts (i.e., "angry," "happy") and later passively viewed the faces without the concepts. Memory for the faces was also measured. At initial encoding, participants displayed more smiling-related EMG activity in response to faces paired with "happy" than in response to faces paired with "angry." Later, in the absence of concepts, participants remembered happiness-encoded faces as happier than anger-encoded faces. Further, during passive reexposure to the ambiguous faces, participants' EMG indicated spontaneous emotion-specific mimicry, which in turn predicted memory bias. No specific EMG activity was observed when participants encoded or viewed faces with non-emotion-related valenced concepts, or when participants encoded or viewed Chinese ideographs. From an embodiment perspective, emotion simulation is a measure of what is currently perceived. Thus, these findings provide evidence of genuine concept-driven changes in emotion perception. More generally, the findings highlight embodiment's role in the representation and processing of emotional information.

ARTICLE UPDATE - Interactions of attention, emotion and motivation.

Raymond J.

Progress in Brain Research, 176, 293-308

Although successful visually guided action begins with sensory processes and ends with motor control, the intervening processes related to the appropriate selection of information for processing are especially critical because of the brain's limited capacity to handle information. Three important mechanisms--attention, emotion and motivation--contribute to the prioritization and selection of information. In this chapter, the interplay between these systems is discussed with emphasis placed on interactions between attention (or immediate task relevance of stimuli) and emotion (or affective evaluation of stimuli), and between attention and motivation (or the predicted value of stimuli). Although numerous studies have shown that emotional stimuli modulate mechanisms of selective attention in humans, little work has been directed at exploring whether such interactions can be reciprocal, that is, whether attention can influence emotional response. Recent work on this question (showing that distracting information is typically devalued upon later encounters) is reviewed in the first half of the chapter. The second half describes recent experiments exploring how prior value-prediction learning (i.e., learning to associate potential outcomes, good or bad, with specific stimuli) plays a role in visual selection and conscious perception. The results indicate that some aspects of motivation act on selection independently of traditionally defined attention, whereas other aspects interact with it.

ARTICLE UPDATE - Emotional modulation of visual cortex activity: a functional near-infrared spectroscopy study.

Neuroreport, in press

Functional neuroimaging and electroencephalography reveal emotional effects in the early visual cortex. Here, we used functional near-infrared spectroscopy to examine haemodynamic responses evoked by neutral, positive and negative emotional pictures, matched for brightness, contrast, hue, saturation, spatial frequency and entropy. Emotion content modulated amplitude and latency of oxy, deoxy and total haemoglobin response peaks, and induced peripheral autonomic reactions. The processing of positive and negative pictures enhanced haemodynamic response amplitude, and this effect was paralleled by blood pressure changes. The processing of positive pictures was reflected in reduced haemodynamic response peak latency. Together these data suggest that the early visual cortex holds amplitude-dependent representation of stimulus salience and latency-dependent information regarding stimulus valence, providing new insight into affective interaction with sensory processing.
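The two response measures this abstract hinges on, peak amplitude (tracking salience/arousal) and peak latency (tracking valence), are simple to extract once a haemodynamic time course exists. The sketch below uses an invented gamma-shaped response and made-up amplitudes, latencies, and sampling rate; it is not the authors' model, only an illustration of how the two measures dissociate.

```python
import numpy as np

fs = 10.0                          # fNIRS sampling rate in Hz (assumed)
t = np.arange(0, 15, 1 / fs)       # seconds after picture onset

def gamma_hrf(t, peak_s, amp):
    """Illustrative gamma-shaped haemodynamic response (peaks at t = peak_s)."""
    return amp * (t / peak_s) ** 4 * np.exp(4 * (1 - t / peak_s))

# Hypothetical responses: emotional content enlarges the peak; positive
# content shortens its latency (per the abstract; exact values invented).
neutral = gamma_hrf(t, peak_s=6.0, amp=1.0)
positive = gamma_hrf(t, peak_s=5.2, amp=1.4)

def peak_measures(resp):
    i = int(np.argmax(resp))
    return resp[i], t[i]           # (peak amplitude, peak latency in s)

amp_n, lat_n = peak_measures(neutral)
amp_p, lat_p = peak_measures(positive)
print(f"neutral: {amp_n:.2f} at {lat_n:.1f} s; positive: {amp_p:.2f} at {lat_p:.1f} s")
```

Keeping amplitude and latency as separate readouts is what allows one to carry salience information and the other valence information, as the study proposes.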

Wednesday, September 02, 2009

ARTICLE UPDATE - Appetitive vs. defensive responses to emotional cues. Autonomic measures and brain oscillation modulation.

Balconi M, Brambilla E, Falbo L.

Brain Research, in press

The present study explored the effects of subjective evaluation and of individual differences in BIS and BAS (Behavioural Inhibition and Activation System) sensitivity on autonomic measures and brain oscillations in response to appetitive and aversive emotional stimuli. Multiple measures were recorded, including psychophysiological responses (skin conductance response, heart rate, and electromyography) and frequency bands (delta, theta, alpha, and gamma), while participants viewed IAPS pictures that varied in pleasantness (appetitive vs. aversive) and arousing power (high vs. low intensity). Both BIS and BAS measures significantly modulated behavioural, autonomic, and brain oscillation responses. Withdrawal-related (BIS) and appetitive (BAS) behaviour showed opposite response patterns. In addition, responses at frontal cortical sites were more pronounced than at other sites, although no lateralization effect was found as a function of the BIS/BAS dichotomy. Moreover, autonomic variables and frequency-band modulations were affected by valence and arousal ratings per se, with increased responses to high-arousing, negative or positive stimuli compared with low-arousing, neutral stimuli. The effects of subjective evaluation and individual differences are discussed in light of a coping-activity model of emotion comprehension.

ARTICLE UPDATE - Brain oscillations and BIS/BAS (behavioral inhibition/activation system) effects on processing masked emotional cues: ERS/ERD and coherence

Balconi M, Mazza G.

International Journal of Psychophysiology, in press

Alpha brain oscillation modulation was analyzed in response to masked emotional facial expressions. In addition, the behavioural activation (BAS) and behavioural inhibition (BIS) systems were considered as explanatory factors to verify the effect of motivational significance on cortical activity. Nineteen subjects were presented with a wide range of facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness, and neutral). The results demonstrated that anterior frontal sites were more active than central and posterior sites in response to facial stimuli. Moreover, right-side responses varied as a function of emotion type, with increased right-frontal activity for negative emotions. Finally, whereas higher-BIS subjects showed greater right-hemisphere activation for some negative emotions (such as fear, anger, and surprise), Reward-BAS subjects were more responsive to positive emotion (happiness) within the left hemisphere. The valence and potential threatening power of facial expressions are considered in elucidating these cortical differences.

ARTICLE UPDATE - Changing Fear: The Neurocircuitry of Emotion Regulation.

Hartley CA, Phelps EA.

Neuropsychopharmacology, in press

The ability to alter emotional responses as circumstances change is a critical component of normal adaptive behavior and is often impaired in psychological disorders. In this review, we discuss four emotional regulation techniques that have been investigated as means to control fear: extinction, cognitive regulation, active coping, and reconsolidation. For each technique, we review what is known about the underlying neural systems, combining findings from animal models and human neuroscience. The current evidence suggests that these different means of regulating fear depend on both overlapping and distinct components of a fear circuitry.

ARTICLE UPDATE - Emotional context modulates response inhibition: Neural and behavioral data.

Albert J, López-Martín S, Carretié L.

Neuroimage, in press

Although recent hemodynamic studies indicate that neural activity related to emotion and that associated with response inhibition constitute closely interrelated and mutually dependent processes, the nature of this relationship is still unclear. In order to explore the temporo-spatial characteristics of the interaction between emotion and inhibition, event-related potentials (ERPs) were measured as participants (N=30) performed a modified version of the Go/Nogo task that required the inhibition of prepotent responses to neutral cues during three different emotional contexts: negative, neutral, and positive. Temporal and spatial Principal Component Analyses were employed to detect and quantify, in a reliable manner, those ERP components related to response inhibition (i.e., Nogo-N2 and Nogo-P3), and a source-localization technique (sLORETA) provided information on their neural origin. Behavioral analyses revealed that reaction times (RTs) to Go cues were shorter during the positive context than during the neutral and negative contexts. ERP analyses showed that suppressing responses to Nogo cues within the positive context elicited larger frontocentral Nogo-P3 amplitudes and greater anterior cingulate cortex (ACC) activation than within the negative context. Regression analyses revealed that Nogo-P3 (i) was inversely related to RTs, supporting its association with the inhibition of a prepotent response, and (ii) was associated with contextual valence (amplitude increased as context valence became more positive), but not with contextual arousal. These results suggest that withholding a prepotent response within positively valenced contexts is more difficult and requires more inhibitory control than within negatively valenced contexts.
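
The temporal PCA step mentioned above (treating time points as variables and trials as observations, so that overlapping components such as the Nogo-N2 and Nogo-P3 separate into distinct temporal factors) can be sketched on simulated data. This is a minimal illustration under stated assumptions, not the authors' pipeline; the component shapes, sampling rate, and noise levels are made up.

```python
# Minimal sketch of temporal PCA on simulated ERP epochs (not the authors' code).
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 60, 300          # e.g., ~1200 ms epochs at 250 Hz (assumed)
t = np.arange(n_samples)

# Two simulated temporal components: an early negativity (N2-like)
# and a later positivity (P3-like), with trial-to-trial amplitude jitter.
n2 = -np.exp(-0.5 * ((t - 60) / 10) ** 2)
p3 = np.exp(-0.5 * ((t - 120) / 25) ** 2)
erps = (rng.normal(1, 0.2, (n_trials, 1)) * n2
        + rng.normal(1, 0.2, (n_trials, 1)) * p3
        + rng.normal(0, 0.05, (n_trials, n_samples)))

# Temporal PCA: time points are variables, trials are observations.
centered = erps - erps.mean(axis=0)
cov = centered.T @ centered / (n_trials - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:2]]        # first two temporal factors

explained = eigvals[order[:2]].sum() / eigvals.sum()
print(round(float(explained), 2))       # proportion of variance in top 2 factors
```

With two genuine underlying components, the first two factors should capture most of the trial-to-trial variance; in real data, factor scores (rather than raw peak amplitudes) would then be quantified per condition.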

ARTICLE UPDATE - Personal space regulation by the human amygdala.

Kennedy DP, Gläscher J, Tyszka JM, Adolphs R.

Nature Neuroscience, in press

The amygdala plays key roles in emotion and social cognition, but how this translates to face-to-face interactions involving real people remains unknown. We found that an individual with complete amygdala lesions lacked any sense of personal space. Furthermore, healthy individuals showed amygdala activation upon close personal proximity. The amygdala may be required to trigger the strong emotional reactions normally following personal space violations, thus regulating interpersonal distance in humans.

Monday, August 24, 2009

ARTICLE UPDATE - When nonsense sounds happy or helpless: The Implicit Positive and Negative Affect Test (IPANAT).

Quirin M, Kazén M, Kuhl J.

Journal of Personality and Social Psychology, 97, 500-516

This article introduces an instrument for the indirect assessment of positive and negative affect, the Implicit Positive and Negative Affect Test (IPANAT). This test draws on participant ratings of the extent to which artificial words subjectively convey various emotions. Factor analyses of these ratings yielded two independent factors that can be interpreted as implicit positive and negative affect. The corresponding scales show adequate internal consistency, test-retest reliability, stability (Study 1), and construct validity (Study 2). Studies 3 and 4 demonstrate that the IPANAT also measures state variance. Finally, Study 5 provides criterion-based validity by demonstrating that correlations between implicit affect and explicit affect are higher under conditions of spontaneous responding than under conditions of reflective responding to explicit affect scales. The present findings suggest that the IPANAT is a reliable and valid measure with a straightforward application procedure.
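
The "adequate internal consistency" reported for the IPANAT scales is conventionally checked with Cronbach's alpha. The sketch below is a generic illustration with made-up ratings, not the study's data or analysis code.

```python
# Cronbach's alpha for a small set of scale items (toy data, for illustration).
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 5 respondents x 3 items on a 1-4 scale.
ratings = np.array([[4, 4, 3], [2, 2, 2], [3, 3, 4], [1, 2, 1], [4, 3, 4]])
print(round(float(cronbach_alpha(ratings)), 2))
```

Values around .7 or above are usually read as acceptable internal consistency; the toy data above are deliberately highly correlated.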

ARTICLE UPDATE - Smile Through Your Fear and Sadness.

Smith FW, Schyns PG.

Psychological Science, in press

It is well established that animal communication signals have adapted to the evolutionary pressures of their environment. For example, the low-frequency vocalizations of the elephant are tailored to long-range communications, whereas the high-frequency trills of birds are adapted to their more localized acoustic niche. Like the voice, the human face transmits social signals about the internal emotional state of the transmitter. Here, we address two main issues: First, we characterized the spectral composition of the facial features signaling each of the six universal expressions of emotion (happiness, sadness, fear, disgust, anger, and surprise). From these analyses, we then predicted and tested the effectiveness of the transmission of emotion signals over different viewing distances. We reveal a gradient of recognition over viewing distances constraining the relative adaptive usefulness of facial expressions of emotion (distal expressions are good signals over a wide range of viewing distances; proximal expressions are suited to closer-range communication).
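
The core idea, that increasing viewing distance acts as a low-pass filter on the face (fewer cycles per face survive at a distance), can be illustrated with a simple blur. Everything here is an assumption for illustration: a random array stands in for a face image, and Gaussian blur stands in for the optics of distance.

```python
# Hedged illustration: viewing distance as a low-pass filter on an image.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
face = rng.normal(size=(128, 128))     # stand-in for a face image (assumption)

def simulate_distance(img, factor):
    """Blur proportional to a viewing-distance factor (1 = near)."""
    return gaussian_filter(img, sigma=factor)

near = simulate_distance(face, 1)
far = simulate_distance(face, 4)
# More distance -> stronger low-pass -> less high-frequency energy (variance).
print(near.var() > far.var())
```

Expressions whose diagnostic features live in low spatial frequencies (coarse mouth shape, say) would survive this filtering better than those relying on fine detail, which is the gradient the study reports.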

ARTICLE UPDATE - Response and habituation of the amygdala during processing of emotional prosody.

Wiethoff S, Wildgruber D, Grodd W, Ethofer T.

Neuroreport, in press

The role of the amygdala in processing acoustic information of affective value is still under debate. Using event-related functional MRI (fMRI), we showed increased amygdalar responses to various emotions (anger, fear, happiness, eroticism) expressed by prosody, a means of communication bound to language and consequently unique to humans. The smallest signal increases were found for fearful prosody, a finding that could not be explained by rapid response habituation to stimuli of this emotional category, challenging classical theories about fear specificity of the human amygdala. Our results converge with earlier neuroimaging evidence investigating emotional vocalizations, and these neurobiological similarities suggest that the two forms of communication might have common evolutionary roots.

Monday, August 17, 2009

ARTICLE UPDATE - Distinct brain systems underlie the processing of valence and arousal of affective pictures.

Nielen MM, Heslenfeld DJ, Heinen K, Van Strien JW, Witter MP, Jonker C, Veltman DJ.

Brain & Cognition, in press

Valence and arousal are thought to be the primary dimensions of human emotion. However, the degree to which valence and arousal interact in determining brain responses to emotional pictures is still elusive. This functional MRI study aimed to delineate neural systems responding to valence and arousal, and their interaction. We measured neural activation in healthy females (N=23) to affective pictures using a 2 (Valence) x 2 (Arousal) design. Results show that arousal was preferentially processed by the middle temporal gyrus, hippocampus, and ventrolateral prefrontal cortex. Regions responding to negative valence included visual and lateral prefrontal regions, whereas positive valence activated middle temporal and orbitofrontal areas. Importantly, distinct arousal-by-valence interactions were present in the anterior insula (negative pictures), and in the occipital cortex, parahippocampal gyrus, and posterior cingulate (positive pictures). These data demonstrate that the brain not only differentiates between valence and arousal but also responds to specific combinations of these two, thereby highlighting the sophisticated nature of emotion processing in (female) human subjects.

ARTICLE UPDATE - Special Issue on Music & Emotion

Annals of the New York Academy of Sciences, 1169

ARTICLE UPDATE - An electrophysiological investigation into the automaticity of emotional face processing in high versus low trait anxious individuals

Holmes A, Nielsen MK, Tipper S, Green S.

Cognitive, Affective, & Behavioral Neuroscience, 9, 323-334

To examine the extent of automaticity of emotional face processing in high versus low trait anxious participants, event-related potentials (ERPs) were recorded to emotional (fearful, happy) and neutral faces under varying task demands (low load, high load). Results showed that perceptual encoding of emotional faces, as reflected in P1 and early posterior negativity components, was unaffected by the availability of processing resources. In contrast, the postperceptual registration and storage of emotion-related information, as reflected in the late positive potential component at frontal locations, was influenced by the availability of processing resources, and this effect was further modulated by level of trait anxiety. Specifically, frontal ERP augmentations to emotional faces were eliminated in the more demanding task for low trait anxious participants, whereas ERP enhancements to emotional faces were unaffected by task load in high trait anxious participants. This result suggests greater automaticity in processing affective information in high trait anxious participants.

ARTICLE UPDATE - Taboo words: The effect of emotion on memory for peripheral information.

Guillet R, Arndt J.

Memory & Cognition, 37, 866-879

In three experiments, we examined memory for peripheral information that occurred in the same context as emotion-inducing information. In the first two experiments, participants studied either a sentence (Experiment 1) or a pair of words (Experiments 2A-2C) containing a neutral peripheral word, as well as a neutral, negative-valence, or taboo word, to induce an emotional response. At retrieval, the participants were asked to recall the neutral peripheral word from a sentence fragment or emotion-inducing word cue. In Experiment 3, we presented word pairs at encoding and tested memory with associative recognition. In all three experiments, memory for peripheral words was enhanced when they were encoded in the presence of emotionally arousing taboo words, but not when they were encoded in the presence of words that were only negative in valence. These data are consistent with priority-binding theory (MacKay et al., 2004) and inconsistent with the attention-narrowing hypothesis (Easterbrook, 1959), as well as with object-based binding theory (Mather, 2007).

Monday, August 10, 2009

ARTICLE UPDATE - Tuning the brain for novelty detection under emotional threat: the role of increasing gamma phase-synchronization.

Garcia-Garcia M, Yordanova J, Kolev V, Domínguez-Borràs J, Escera C.

Neuroimage, in press

Effective orienting of attention towards novel events is crucial for survival, particularly if they occur in a dangerous situation. This is why stimuli with emotional value are more efficient in capturing attention than neutral stimuli, and why the processing of unexpected novel stimuli is enhanced under a negative emotional context. Here we measured the phase-synchronization (PS) of gamma-band responses (GBR) from human EEG scalp-recordings during performance of a visual discrimination task in which task-irrelevant standard and novel sounds were presented in either a neutral or a negative emotional context, in order to elucidate the brain mechanisms by which emotion tunes the processing of novel events. Visual task performance was distracted by novel sounds, and this distraction was enhanced by the negative emotional context. Similarly, gamma PS was enhanced after novel as compared to standard sounds, and it was also larger to auditory stimuli in the negative than in the neutral emotional context, reflecting the synchronization of neural networks supporting enhanced attentional processing. Remarkably, the larger PS increase of GBR after novel sounds in the negative as compared to the neutral emotional context over midline and right frontal regions reveals that a negative emotional context tunes novelty processing by means of the PS of brain activity in the gamma frequency band around 40 Hz in specific neural networks.
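
Phase synchronization of gamma-band responses is commonly quantified as an inter-trial phase-locking value: band-pass the signal around 40 Hz, extract the instantaneous phase per trial via the Hilbert transform, and take the magnitude of the mean unit phase vector across trials. The sketch below illustrates that computation on simulated trials; the filter settings and data are assumptions, not the study's analysis.

```python
# Inter-trial phase-locking value (PLV) in the gamma band - illustrative sketch.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(1)
fs, n_trials, n_samples = 500, 40, 500           # 1 s epochs at 500 Hz (assumed)
t = np.arange(n_samples) / fs

# Simulated trials: a phase-consistent 40 Hz component plus broadband noise.
trials = np.sin(2 * np.pi * 40 * t) + rng.normal(0, 1.0, (n_trials, n_samples))

# Band-pass around 40 Hz, then extract the instantaneous phase per trial.
b, a = butter(4, [35, 45], btype="bandpass", fs=fs)
phases = np.angle(hilbert(filtfilt(b, a, trials, axis=1), axis=1))

# PLV across trials at each time point: |mean of unit phase vectors|.
# 1 = identical phase on every trial; near 0 = random phases.
plv = np.abs(np.exp(1j * phases).mean(axis=0))
print(round(float(plv.mean()), 2))
```

Because the simulated 40 Hz component has the same phase on every trial, the PLV comes out high; purely random phases would drive it toward zero.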

ARTICLE UPDATE - Tell me about it: Neural activity elicited by emotional pictures and preceding descriptions.

Macnamara A, Foti D, Hajcak G.

Emotion, 9, 531-543

Emotional pictures elicit enhanced parietal positivities beginning around 300 ms following stimulus presentation. The magnitude of these responses, however, depends on both intrinsic (stimulus-driven) and extrinsic (context-driven) factors. In the present study, event-related potentials were recorded while participants viewed unpleasant and neutral pictures that were described either more neutrally or more negatively prior to presentation; temporospatial principal components analysis identified early and late positivities: Both emotional images and descriptions had independent and additive effects on early (334 ms) and midlatency (1,066 ms) positivities, whereas the latest positivity (1,688 ms) was sensitive only to description type. Results are discussed with regard to the time course of automatic and controlled processing of emotional stimuli.

ARTICLE UPDATE - Immediacy bias in emotion perception: Current emotions seem more intense than previous emotions.

Van Boven L, White K, Huber M.

Journal of Experimental Psychology: General, 138, 368-382

People tend to perceive immediate emotions as more intense than previous emotions. This immediacy bias in emotion perception occurred for exposure to emotional but not neutral stimuli (Study 1), when emotional stimuli were separated by both shorter (2 s; Studies 1 and 2) and longer (20 min; Studies 3, 4, and 5) delays, and for emotional reactions to pictures (Studies 1 and 2), films (Studies 3 and 4), and descriptions of terrorist threats (Study 5). The immediacy bias may be partly caused by immediate emotion's salience, and by the greater availability of information about immediate compared with previous emotion. Consistent with emotional salience, when people experienced new emotions, they perceived previous emotions as less intense than they did initially (Studies 3 and 5)-a change in perception that did not occur when people did not experience a new immediate emotion (Study 2). Consistent with emotional availability, reminding people that information about emotions naturally decays from memory reduced the immediacy bias by making previous emotions seem more intense (Study 4). Discussed are implications for psychological theory and other judgments and behaviors.

ARTICLE UPDATE - What factors need to be considered to understand emotional memories?

Kensinger EA.

Emotion Review, 1, 120-121

In my original review (this issue), I proposed that to understand the effects of emotion on memory accuracy, we must look beyond effects of arousal and consider the contribution of valence. In discussing this proposal, the commentators raise a number of excellent points that home in on the question of when valence does (and does not) account for emotion's effects on memory accuracy. Though future research will be required to resolve this issue more fully, in this brief response, I address some of the concerns outlined by the commentators and suggest a few steps that may help to elucidate the dimensions that should be incorporated in models of emotional memory.

ARTICLE UPDATE - Why people rehearse their memories: Frequency of use and relations to the intensity of emotions associated with autobiographical memories.

Walker WR, Skowronski JJ, Gibbons JA, Vogl RJ, Ritchie TD.

Memory, in press

People may choose to rehearse their autobiographical memories in silence or to disclose them to other people. This paper focuses on five types of memory rehearsal: involuntary rehearsal, rehearsal to maintain an event memory, rehearsal to re-experience the emotion of an event, rehearsal to understand an event, and rehearsal for social communication. A total of 337 participants recalled event memories, provided estimates of how often each event was rehearsed and for what reason, and rated the affective characteristics of the events. Rehearsal frequency was highest for social communication and lowest for rehearsals aimed at understanding events. For many rehearsal types, rehearsal was more frequent for positive than negative events. Frequently rehearsed events tended to show less affective fading. The pattern changed when events were socially rehearsed. For positive events, increased social rehearsal was related to a reduction in affective fading. For negative events, increased social rehearsal was associated with increased affective fading.

Monday, August 03, 2009

ARTICLE UPDATE - Normative data on development of neural and behavioral mechanisms underlying attention orienting toward social-emotional stimuli: An exploratory study.

Lindstrom K, Guyer AE, Mogg K, Bradley BP, Fox NA, Ernst M, Nelson EE, Leibenluft E, Britton JC, Monk CS, Pine DS, Bar-Haim Y.

Brain Research, in press

The ability of positive and negative facial signals to influence attention orienting is crucial to social functioning. Given the dramatic developmental change in neural architecture supporting social function, positive and negative facial cues may influence attention orienting differently in relatively young or old individuals. However, virtually no research examines such age-related differences in the neural circuitry supporting attention orienting to emotional faces. We examined age-related correlations in attention-orienting biases to positive and negative face emotions in a healthy sample (N=37; 9-40 years old) using functional magnetic resonance imaging and a dot-probe task. The dot-probe task in an fMRI setting yields both behavioral and neural indices of attention biases towards or away from an emotional cue (happy or angry face). In the full sample, angry-face attention bias scores did not correlate with age, and age did not correlate with brain activation to angry faces. However, age did positively correlate with attention bias towards happy faces; age also negatively correlated with left cuneus and left caudate activation to a happy-bias fMRI contrast. Secondary analyses suggested age-related changes in attention bias to happy faces. The tendency in younger children to direct attention away from happy faces (relative to neutral faces) was diminished in the older age groups, in tandem with increasing neural deactivation. Implications for future work on developmental changes in attention-emotion processing are discussed.
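
The dot-probe bias index described above is a simple difference score: mean reaction time when the probe replaces the neutral face minus mean reaction time when it replaces the emotional face, with positive values indicating vigilance toward the emotional face. The numbers below are hypothetical, purely to show the arithmetic.

```python
# Dot-probe attention bias score on made-up reaction times (ms).
import statistics

rt_probe_at_emotional = [412, 398, 405, 420, 401]   # hypothetical trials
rt_probe_at_neutral = [431, 428, 419, 440, 425]

# Faster responses when the probe appears where the emotional face was
# imply attention was already there -> positive bias score.
bias = (statistics.mean(rt_probe_at_neutral)
        - statistics.mean(rt_probe_at_emotional))
print(bias)
```

In the study, such per-subject bias scores (computed separately for happy and angry faces) are what was correlated with age and with BOLD activation.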

Monday, July 27, 2009

ARTICLE UPDATE - Modulation of Perception and Brain Activity by Predictable Trajectories of Facial Expressions.

Furl N, van Rijsbergen NJ, Kiebel SJ, Friston KJ, Treves A, Dolan RJ.

Cerebral Cortex, in press

People track facial expression dynamics with ease to accurately perceive distinct emotions. Although the superior temporal sulcus (STS) appears to possess mechanisms for perceiving changeable facial attributes such as expressions, the nature of the underlying neural computations is not known. Motivated by novel theoretical accounts, we hypothesized that visual and motor areas represent expressions as anticipated motion trajectories. Using magnetoencephalography, we show predictable transitions between fearful and neutral expressions (compared with scrambled and static presentations) heighten activity in visual cortex as quickly as 165 ms poststimulus onset and later (237 ms) engage fusiform gyrus, STS and premotor areas. Consistent with proposed models of biological motion representation, we suggest that visual areas predictively represent coherent facial trajectories. We show that such representations bias emotion perception of subsequent static faces, suggesting that facial movements elicit predictions that bias perception. Our findings reveal critical processes evoked in the perception of dynamic stimuli such as facial expressions, which can endow perception with temporal continuity.

ARTICLE UPDATE - Event-related potentials to task-irrelevant changes in facial expressions.

Astikainen P, Hietanen JK.

Behavioural Brain Function, in press

ABSTRACT: BACKGROUND: Numerous previous experiments have used oddball paradigm to study change detection. This paradigm is applied here to study change detection of facial expressions in a context which demands abstraction of the emotional expression-related facial features among other changing facial features. METHODS: Event-related potentials (ERPs) were recorded in adult humans engaged in a demanding auditory task. In an oddball paradigm, repeated pictures of faces with a neutral expression ('standard', p = .9) were rarely replaced by pictures with a fearful ('fearful deviant', p = .05) or happy ('happy deviant', p = .05) expression. Importantly, facial identities changed from picture to picture. Thus, change detection required abstraction of facial expression from changes in several low-level visual features. RESULTS: ERPs to both types of deviants differed from those to standards. At occipital electrode sites, ERPs to deviants were more negative than ERPs to standards at 150-180 ms and 280-320 ms post-stimulus. A positive shift to deviants at fronto-central electrode sites in the analysis window of 130-170 ms post-stimulus was also found. Waveform analysis computed as point-wise comparisons between the amplitudes elicited by standards and deviants revealed that the occipital negativity emerged earlier to happy deviants than to fearful deviants (after 140 ms versus 160 ms post-stimulus, respectively). In turn, the anterior positivity was earlier to fearful deviants than to happy deviants (110 ms versus 120 ms post-stimulus, respectively). CONCLUSION: ERP amplitude differences between emotional and neutral expressions indicated pre-attentive change detection of facial expressions among neutral faces. The posterior negative difference at 150-180 ms latency resembled visual mismatch negativity (vMMN) - an index of pre-attentive change detection previously studied only to changes in low-level features in vision. 
The positive anterior difference in ERPs at 130-170 ms post-stimulus probably indexed pre-attentive attention orienting towards emotionally significant changes. The results show that the human brain can abstract emotion-related features of faces while engaged in a demanding task in another sensory modality.
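
The oddball schedule described (neutral standards at p = .9, fearful and happy deviants at p = .05 each) can be sketched as a toy sequence generator. This is an assumption-laden illustration, not the authors' stimulus code; in particular, the no-consecutive-deviants constraint is an added convention common in oddball designs, not something stated in the abstract.

```python
# Toy oddball-sequence generator: 90% standards, 5% fearful, 5% happy deviants,
# with deviants never presented back-to-back (an illustrative constraint).
import random

def oddball_sequence(n, seed=0):
    rng = random.Random(seed)
    seq, prev = [], "standard"
    for _ in range(n):
        r = rng.random()
        if prev != "standard" or r < 0.9:   # force a standard after any deviant
            stim = "standard"
        elif r < 0.95:
            stim = "fearful"
        else:
            stim = "happy"
        seq.append(stim)
        prev = stim
    return seq

seq = oddball_sequence(2000)
print(seq.count("standard") / len(seq))     # close to .9
```

Each deviant would additionally be drawn with a different facial identity than its neighbours, so that detecting the change requires abstracting the expression rather than any low-level feature.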

ARTICLE UPDATE - Evidence for mirror systems in emotions.

Bastiaansen JA, Thioux M, Keysers C.

Phil. Trans. R. Soc. B, 364, 2391-2404

Why do we feel tears well up when we see a loved one cry? Why do we wince when we see other people hurt themselves? This review addresses these questions from the perspective of embodied simulation: observing the actions and tactile sensations of others activates premotor, posterior parietal and somatosensory regions in the brain of the observer which are also active when performing similar movements and feeling similar sensations. We will show that seeing the emotions of others also recruits regions involved in experiencing similar emotions, although there does not seem to be a reliable mapping of particular emotions onto particular brain regions. Instead, emotion simulation seems to involve a mosaic of affective, motor and somatosensory components. The relative contributions of these components to a particular emotion and their interrelationship are largely unknown, although recent experimental evidence suggests that motor simulation may be a trigger for the simulation of associated feeling states. This mosaic of simulations may be necessary for generating the compelling insights we have into the feelings of others. Through their integration with, and modulation by, higher cognitive functions, they could be at the core of important social functions, including empathy, mind reading and social learning.

ARTICLE UPDATE - N400 during recognition of voice identity and vocal affect.

Toivonen M, Rämä P.

Neuroreport, in press

This study explored whether the neural processes underlying recognition of a speaker's voice and of vocal affect are dissociable, by measuring event-related potentials. Individuals were asked to identify a target emotion, or a target (congruent) speaker, among distracter (incongruent) emotions or speakers. The incongruent condition elicited a more negative N400-like response during both tasks, but the distributions differed. Whereas the response in the speaker task was more pronounced at frontal than at posterior recording sites, in the emotion task the opposite was true. Furthermore, the response was more pronounced at left recording sites for the speaker task and at right recording sites for the emotion task. The present results suggest that the neural substrates involved in processing speaker identity are different from those responsible for processing vocal affect.

Monday, July 20, 2009

ARTICLE UPDATE - Emotion words, regardless of polarity, have a processing advantage over neutral words.

Kousta ST, Vinson DP, Vigliocco G.

Cognition, in press

Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown of the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat. According to a different hypothesis which does not posit a privileged role for the aversive system, valence, regardless of polarity, facilitates processing due to the relevance of both negative and positive stimuli for survival and for the attainment of goals. Here, we present evidence that emotional valence has an overall facilitatory role in the processing of verbal stimuli, providing support for the latter hypothesis. We found no asymmetry between negative and positive words and suggest that previous findings of such an asymmetry can be attributed to failure to control for a number of critical lexical variables and to a sampling bias.

ARTICLE UPDATE - Amygdala activation predicts gaze toward fearful eyes.

Gamer M, Büchel C.

The Journal of Neuroscience, 29, 9123-9126

The human amygdala can be robustly activated by presenting fearful faces, and it has been speculated that this activation has functional relevance for redirecting the gaze toward the eye region. To clarify this relationship between amygdala activation and gaze-orienting behavior, functional magnetic resonance imaging data and eye movements were simultaneously acquired in the current study during the evaluation of facial expressions. Fearful, angry, happy, and neutral faces were briefly presented to healthy volunteers in an event-related manner. We controlled for the initial fixation by unpredictably shifting the faces downward or upward on each trial, such that the eyes or the mouth were presented at fixation. Across emotional expressions, participants showed a bias to shift their gaze toward the eyes, but the magnitude of this effect followed the distribution of diagnostically relevant regions in the face. Amygdala activity was specifically enhanced for fearful faces with the mouth aligned to fixation, and this differential activation predicted gazing behavior preferentially targeting the eye region. These results reveal a direct role of the amygdala in reflexive gaze initiation toward fearfully widened eyes. They mirror deficits observed in patients with amygdala lesions and open a window for future studies on patients with autism spectrum disorder, in which deficits in emotion recognition, probably related to atypical gaze patterns and abnormal amygdala activation, have been observed.

Monday, July 13, 2009

ARTICLE UPDATE - Short-term antidepressant treatment modulates amygdala response to happy faces.

Norbury R, Taylor MJ, Selvaraj S, Murphy SE, Harmer CJ, Cowen PJ.

Psychopharmacology, in press

RATIONALE: We have previously demonstrated that antidepressant medication facilitates the processing of positive affective stimuli in healthy volunteers. These early effects of antidepressants may be an important component in the therapeutic effects of antidepressant treatment in patients with depression and anxiety. OBJECTIVES: Here we used functional magnetic resonance imaging in a double-blind, randomised, placebo-controlled between-groups design to investigate the effects of short-term (7-10 days) treatment with the selective serotonin reuptake inhibitor, citalopram, on the amygdala response to positive and negative facial expressions in healthy volunteers. RESULTS: Citalopram was associated with increased amygdala activation to happy faces relative to placebo control, without changes in levels of mood or anxiety. CONCLUSIONS: These early, direct effects of antidepressant administration on emotional processing are consistent with a cognitive neuropsychological model of antidepressant action.

Monday, July 06, 2009

ARTICLE UPDATE - Human brain responsivity to masked different intensities of fearful eye whites: An ERP study.

Feng W, Luo W, Liao Y, Wang N, Gan T, Luo Y.

Brain Research, in press

Previous studies have shown differential event-related potentials (ERPs) to intensities of fearful facial expressions. There are indications that the eyes may be particularly relevant for the recognition of fearful expressions; even the amount of white sclera exposed above and on the sides of the dark pupil can activate an amygdala response. To investigate whether the ERP differences between intensities of fearful expressions are driven by the differential salience of the eyes in the fearful faces, ERPs were measured within a backward masking paradigm, where observers were asked to perform a gender decision task with male and female neutral faces. The emotional stimuli were low-intensity (50%), prototypical (100%), and caricatured (150%) fearful eye whites, derived from the corresponding intensities of fearful faces. Three groups of white squares with the same pixel areas as the eye whites were created as control conditions. Analysis of the ERP data showed a linear increase in the amplitude of the parietal-occipital P120 across the three intensities of fearful eye whites. These ERP effects proved sensitive to the intensities of negative emotion rather than to simple physical features, as the same pattern of differences was not observed for the white squares. Larger parietal-occipital P250 amplitudes were observed for caricatured (150%) than for low-intensity (50%) fearful eye whites, which may reflect the subcortical pathway of emotion-specific fear processing. The results demonstrate that the human brain is sensitive to intensities of fear, even when shown only graded fearful eye whites in the absence of awareness.

ARTICLE UPDATE - Genetics of Emotion Regulation.

Canli T, Ferri J, Duman EA.

Neuroscience, in press

Emotions can be powerful drivers of behavior that may be adaptive or maladaptive for the individual. Thus, the ability to alter one's emotions, to regulate them, should be beneficial to an individual's survival and fitness. What is the biological basis of this ability? And what are the biological mechanisms that impart individual differences in the ability to regulate emotion? In this article, we will first introduce readers to the construct of emotion regulation, and the various strategies that individuals may utilize to regulate their emotions. We will then point to evidence that suggests genetic contributions (alongside environmental contributions) to individual differences in emotion regulation. To date, efforts to identify specific genetic mechanisms involved in emotion regulation have focused on common gene variants (i.e., variants that exist in > 1% of the population, referred to as polymorphisms) and their association with specific emotion regulation strategies or the neural substrate mediating these strategies. We will discuss these efforts, and conclude with a call to expand the set of experimental paradigms and putative molecular mechanisms, in order to significantly advance our understanding of the molecular mechanisms by which genes are involved in emotion regulation.

Saturday, June 27, 2009

ARTICLE UPDATE - Mirror of the soul: a cortical stimulation study on recognition of facial emotions.

Giussani C, Pirillo D, Roux FE.

Journal of Neurosurgery, in press

Object: The capability of recognizing facial expressions of emotion has been hypothesized to depend on a right hemispheric cortical-subcortical network, and its impairment deeply disturbs social relationships. To spare right hemispheric cortical areas involved in recognizing facial emotion, the authors used intraoperative cortical stimulation and the awake surgery technique in a consecutive series of patients; the feasibility and value of mapping these areas during neurosurgical procedures are discussed. Methods: After a preoperative neuropsychological evaluation, 18 consecutive patients with right hemispheric lesions (5 metastases, 6 high-grade gliomas, 4 low-grade gliomas, 2 arteriovenous malformations, and 1 malignant meningioma) were tested by intraoperative cortical stimulation while performing a facial emotion recognition task along with sensorimotor and visuospatial tasks. Results: Three hundred eighty-six cortical sites were studied. Five (1.30%) reproducible interference sites for facial emotion recognition were identified in 5 patients: 1 in the medial segment of T1, 1 in the posterior segment of T1, 1 in the posterior segment of T2, and 2 in the supramarginal gyrus. No selective impairment was found for any particular emotion category. All facial emotion recognition sites were spared during surgery, and none of the patients experienced postoperative deficits in recognition of facial emotions. Conclusions: The finding of interference sites for facial emotion recognition in the right posterior perisylvian area, independent of sensorimotor or visuospatial orientation processes, reinforces the theory that anatomically and functionally segregated right hemisphere structures subserve this cognitive process. The authors advocate offering brain mapping of facial emotion recognition to patients with right posterior perisylvian tumors.

ARTICLE UPDATE - Influence of attention to somatic information on emotional and autonomic responses.

Murakami H, Ohira H, Matsunaga M, Kimura K.

Perceptual Motor Skills, 108, 531-539

The present study aimed to investigate the dissociable effects of two forms of self-focus on emotional and autonomic responses. One form is suppression, which involves suppressing one's heart rate and evaluating one's performance; the other is observation, which involves attending to one's own heart rate with no suppression and no evaluation. Twenty-six undergraduate and graduate students from the Nagoya University campus (13 men, 13 women), ages 18 to 24 years (M = 20.7, SD = 1.6), were recruited. Participants were provided with their own heart rate as feedback for 5 min., during which they carried out a self-focus manipulation. Several days after the experimental session for one condition, the same participants completed an experimental session for the other condition. The instruction to suppress enhanced physiological arousal and subsequent negative emotions; the instruction to observe, however, did not increase physiological arousal or negative emotions.

ARTICLE UPDATE - Worry tendencies predict brain activation during aversive imagery.

Schienle A, Schäfer A, Pignanelli R, Vaitl D.

Neuroscience Letters, in press

Because of its abstract nature, worrying might function as an avoidance response that allows cognitive disengagement from fearful imagery. The present functional magnetic resonance imaging study investigated neural correlates of aversive imagery and their association with worry tendencies, as measured by the Penn State Worry Questionnaire (PSWQ). Nineteen healthy women first viewed, and subsequently imagined, pictures from two categories, 'threat' and 'happiness'. Worry tendencies were negatively correlated with brain activation in the anterior cingulate cortex, the prefrontal cortex (dorsolateral, dorsomedial, ventrolateral), the parietal cortex, and the insula. These negative correlations between PSWQ scores and localized brain activation were specific to aversive imagery. Moreover, activation in the abovementioned regions was positively associated with the experienced vividness of both pleasant and unpleasant mental pictures. As the identified brain regions are involved in emotion regulation, vivid imagery, and memory retrieval, lowered activity in high PSWQ scorers might reflect cognitive disengagement from aversive imagery as well as insufficient refreshing of mental pictures. Our preliminary findings encourage future imagery studies on patients with generalized anxiety disorder, as excessive worrying is one of that disorder's main symptoms.

Friday, June 19, 2009

ARTICLE UPDATE - In search of specificity: functional MRI in the study of emotional experience.

Schienle A, Schäfer A.

International Journal of Psychophysiology, 73, 22-26.

The growing availability of functional magnetic resonance imaging (fMRI), with its high spatial resolution, has energized the search for specific neural substrates of basic emotions and their feeling components. In the present article, we address the question of whether recent fMRI studies on primary affective experiences have truly helped to pinpoint emotion-specific areas in the human brain, or whether these studies are afflicted with methodological problems which make such inferences difficult. As one approach for improvement, we suggest the combination of fMRI with methods characterized by high temporal resolution, such as electroencephalography (EEG). Simultaneous recording allows the correlation of temporally specific EEG components (e.g., the late positive potential) with regional blood-oxygen-level-dependent (BOLD) signals during affective experiences. Combined information on the source as well as the exact temporal pattern of a neural affective response will help to improve our understanding of emotion-specific brain activation.

Saturday, June 13, 2009

ARTICLE UPDATE - Emotions in motion: Dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations.

Brain Research, in press

In social contexts, facial expressions are dynamic in nature and vary rapidly in relation to situational requirements. However, there are very few fMRI studies using dynamic emotional stimuli. The aims of this study were (1) to introduce and evaluate a new stimulus database of static and dynamic emotional facial expressions, rated for arousal and recognizability both by the participants of the present fMRI study and by an external sample of 30 healthy women, (2) to examine the neural networks involved in emotion perception of static and dynamic facial stimuli separately, and (3) to examine the impact of motion on the emotional processing of dynamic compared to static face stimuli. A total of 16 females participated in the present fMRI study, performing a passive emotion perception task including static and dynamic faces with neutral, happy, and disgusted expressions. Comparing dynamic stimuli to static faces indicated enhanced emotion-specific brain activation patterns in the parahippocampal gyrus (PHG) including the amygdala (AMG), fusiform gyrus (FG), superior temporal gyrus (STG), inferior frontal gyrus (IFG), and occipital and orbitofrontal cortex (OFC). These regions have been linked, respectively, to emotional memory encoding, the perception of threat, facial identity, biological motion, the mirror neuron system, increased emotional arousal, and reward processing. Post hoc ratings of the dynamic stimuli revealed better recognizability than the static stimuli. In conclusion, dynamic facial expressions might provide a more appropriate approach for examining the processing of emotional face perception than static stimuli.

ARTICLE UPDATE - Emotion and space. Lateralized emotional word detection depends on line bisection bias.

Tamagni C, Mantei T, Brugger P.

Neuroscience, in press

There is converging evidence, from various independent areas of neuroscience, for a functional specialization of the left and right cerebral hemispheres for positive and negative emotions, respectively ("valence theory" of emotional processing). One subfield, however, has produced mixed results, i.e. work on the detection of parafoveally presented positively or negatively emotional words by healthy subjects. Right or left visual field advantages were described and interpreted as reflecting the superiority of either the left hemisphere (LH) for linguistic material, or of the right hemisphere (RH) for highly emotional stimuli. Here we show that 48 healthy, right-handed participants' performance on a lateralized lexical decision task depends on their individual inclination to bisect a line to the left or right of the objective center. Only those with a bisection bias to the right showed the LH advantage for word detection known from the neuropsychological literature. Negative emotional words were processed with comparable accuracy in the two visual fields. However, a recognition advantage for negative over positive emotional words was found exclusively for those participants with a leftward line bisection bias. These results suggest that in work on functional hemispheric differences state variables like stimulus lateralization and word emotionality may be less decisive than the trait variable of lateral hemispatial attention. We propose a cautious reconsideration of the concept of "hemisphericity", which once emphasized individual differences in baseline hemispheric arousal, but was later dismissed in a reaction to oversimplifications in popular science accounts.

ARTICLE UPDATE - EEG coherence in humans: relationship with success in recognizing emotions in the voice.

Kislova OO, Rusalova MN.

Neuroscience and Behavioral Physiology, in press

EEG recordings were made from two groups of subjects: those with high and those with low success in recognizing emotions from voices. The numbers of pairs of leads with different levels of coherence were compared between baseline conditions and emotion recognition, in six standard frequency ranges and in individual bands with 1-Hz steps. Significant differences were seen between the two groups both in baseline conditions and during recognition of emotions: in most cases, coherence was greater in subjects with poor recognition of emotions from voices.
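
The coherence measure compared here can be sketched in a few lines. The following is a minimal illustration, not the authors' pipeline: it computes magnitude-squared coherence between two synthetic EEG channels that share a theta-range (6 Hz) rhythm, with an assumed sampling rate and illustrative noise levels, using a window length chosen to give roughly 1-Hz frequency steps as in the study.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 250  # sampling rate in Hz (assumed for the example)
t = np.arange(0, 10, 1 / fs)

# Two synthetic "channels" sharing a common 6 Hz (theta-range) component
common = np.sin(2 * np.pi * 6 * t)
ch1 = common + 0.5 * rng.standard_normal(t.size)
ch2 = common + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence; nperseg = fs gives ~1 Hz frequency resolution
freqs, coh = coherence(ch1, ch2, fs=fs, nperseg=fs)

# Coherence should be high near the shared 6 Hz rhythm and low elsewhere
idx = np.argmin(np.abs(freqs - 6))
print(f"coherence at 6 Hz: {coh[idx]:.2f}")
```

Coherence values lie between 0 (independent signals) and 1 (perfect linear coupling at that frequency), which is the quantity counted per lead pair in the study.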

Saturday, June 06, 2009

ARTICLE UPDATE - Prolonged reduction of electrocortical activity predicts correct performance during rapid serial visual processing.

Keil A, Heim S.

Psychophysiology, in press

When two targets are shown in a rapid temporal stream of distractors, performance for the second target (T2) is typically reduced when presented between 200 and 500 ms after the first (T1). The present study used the steady-state visual evoked potential (ssVEP), a continuous index of electrocortical facilitation, to compare brain responses in trials with correct versus incorrect T2 responses. We found a reduction of the electrocortical response following T1 in trials with correct T2 identification. By contrast, incorrect T2 trials were characterized by enhanced electrocortical amplitude. Amplitude attenuation predictive of successful T2 report was sustained over time, suggesting a reduction of resources allocated to the distractor stream in correct trials. Across intertarget intervals, T2 performance was a linear function of the ssVEP amplitude reduction in correct trials, weighted by the stimulus onset asynchrony.

Saturday, May 30, 2009

PUBLICATION - Affective learning enhances activity and functional connectivity in early visual cortex.

Damaraju E, Huang YM, Barrett LF, Pessoa L.

Neuropsychologia, in press

This study examined the impact of task-irrelevant affective information on early visual processing regions V1-V4. Fearful and neutral faces presented with rings of different colors were used as stimuli. During the conditioning phase, fearful faces presented with a certain ring color (e.g., black) were paired with mild electrical stimulation. Neutral faces shown with rings of that color, as well as fearful or neutral faces shown with another ring color (e.g., white), were never paired with shock. Our findings revealed that fearful faces evoked enhanced blood oxygen level dependent (BOLD) responses in V1 and V4 compared to neutral faces. Faces embedded in a color ring that was paired with shock (e.g., black) evoked greater BOLD responses in V1-V4 compared to a ring color that was never paired with shock (e.g., white). Finally, BOLD responses in early visual cortex were tightly interrelated (i.e., correlated) during an affectively potent context (i.e., ring color) but not during a neutral one, suggesting that increased functional integration was present with affective learning. Taken together, the results suggest that task-irrelevant affective information not only influences evoked responses in early, retinotopically organized visual cortex, but also determines the pattern of responses across early visual cortex.

ARTICLE UPDATE - Embodiment of emotion concepts.

Niedenthal PM, Winkielman P, Mondillon L, Vermeulen N.

Journal of Personality and Social Psychology, 96, 1120-1136

Theories of embodied cognition hold that higher cognitive processes operate on perceptual symbols and that concept use involves partial reactivations of the sensory-motor states that occur during experience with the world. On this view, the processing of emotion knowledge involves a (partial) reexperience of an emotion, but only when access to the sensory basis of emotion knowledge is required by the task. In 2 experiments, participants judged emotional and neutral concepts corresponding to concrete objects (Experiment 1) and abstract states (Experiment 2) while facial electromyographic activity was recorded from the cheek, brow, eye, and nose regions. Results of both studies show embodiment of specific emotions in an emotion-focused but not a perceptual-focused processing task on the same words. A follow up in Experiment 3, which blocked selective facial expressions, suggests a causal, rather than simply a correlational, role for embodiment in emotion word processing. Experiment 4, using a property generation task, provided support for the conclusion that emotions embodied in conceptual tasks are context-dependent situated simulations rather than associated emotional reactions. Implications for theories of embodied simulation and for emotion theories are discussed.

ARTICLE UPDATE - Early and late temporo-spatial effects of contextual interference during perception of facial affect.

Frühholz S, Fehr T, Herrmann M.

International Journal of Psychophysiology, in press

Contextual features during recognition of facial affect are assumed to modulate the temporal course of emotional face processing. Here, we simultaneously presented colored backgrounds during valence categorizations of facial expressions. Subjects incidentally learned to perceive negative, neutral and positive expressions within a specific colored context. Subsequently, subjects made fast valence judgments while presented with the same face-color combinations as in the first run (congruent trials) or with different face-color combinations (incongruent trials). Incongruent trials induced significantly increased response latencies and significantly decreased performance accuracy. Contextual incongruent information during processing of neutral expressions modulated the P1 and the early posterior negativity (EPN), both localized in occipito-temporal areas. Contextual congruent information during emotional face perception revealed an emotion-related modulation of the P1 for positive expressions and of the N170 and the EPN for negative expressions. The highest N170 amplitude was found for negative expressions in a negatively associated context, and the N170 amplitude varied with the amount of overall negative information. Incongruent trials with negative expressions elicited a parietal negativity which was localized to superior parietal cortex and which most likely represents a posterior manifestation of the N450, an indicator of conflict processing. Sustained activation of the late positive potential (LPP) over parietal cortex for all incongruent trials might reflect enhanced engagement with the facial expression under conditions of contextual interference. In conclusion, whereas early components seem to be sensitive to the emotional valence of facial expressions in specific contexts, late components seem to subserve interference resolution during emotional face processing.

Saturday, May 23, 2009

ARTICLE UPDATE - The Interrelations between Verbal Working Memory and Visual Selection of Emotional Faces.

Grecucci A, Soto D, Rumiati RI, Humphreys GW, Rotshtein P.

The Journal of Cognitive Neuroscience, in press

Working memory (WM) and visual selection processes interact in a reciprocal fashion based on overlapping representations abstracted from the physical characteristics of stimuli. Here, we assessed the neural basis of this interaction using facial expressions that conveyed emotion information. Participants memorized an emotional word for a later recognition test and then searched for a face of a particular gender presented in a display with two faces that differed in gender and expression. The relation between the emotional word and the expressions of the target and distractor faces was varied. RTs for the memory test were faster when the target face matched the emotional word held in WM (on valid trials) relative to when the emotional word matched the expression of the distractor (on invalid trials). There was also enhanced activation on valid compared with invalid trials in the lateral orbital gyrus, superior frontal polar (BA 10), lateral occipital sulcus, and pulvinar. Re-presentation of the WM stimulus in the search display led to the earlier onset of activity in the superior and inferior frontal gyri and the anterior hippocampus irrespective of the search validity of the re-presented stimulus. The data indicate that the middle temporal and prefrontal cortices are sensitive to the reappearance of stimuli that are held in WM, whereas a fronto-thalamic occipital network is sensitive to the behavioral significance of the match between WM and targets for selection. We conclude that these networks are modulated by high-level matches between the contents of WM, the behavioral goals, and our current sensory input.

ARTICLE UPDATE - Decoding of Emotional Information in Voice-Sensitive Cortices.

Ethofer T, Van De Ville D, Scherer K, Vuilleumier P.

Current Biology, in press

The ability to correctly interpret emotional signals from others is crucial for successful social interaction. Previous neuroimaging studies showed that voice-sensitive auditory areas [1-3] activate to a broad spectrum of vocally expressed emotions more than to neutral speech melody (prosody). However, this enhanced response occurs irrespective of the specific emotion category, making it impossible to distinguish different vocal emotions with conventional analyses [4-8]. Here, we presented pseudowords spoken in five prosodic categories (anger, sadness, neutral, relief, joy) during event-related functional magnetic resonance imaging (fMRI), then employed multivariate pattern analysis [9, 10] to discriminate between these categories on the basis of the spatial response pattern within the auditory cortex. Our results demonstrate successful decoding of vocal emotions from fMRI responses in bilateral voice-sensitive areas, which could not be obtained by using averaged response amplitudes only. Pairwise comparisons showed that each category could be classified against all other alternatives, indicating for each emotion a specific spatial signature that generalized across speakers. These results demonstrate for the first time that emotional information is represented by distinct spatial patterns that can be decoded from brain activity in modality-specific cortical areas.
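
Multivariate pattern analysis of this kind amounts to training a classifier on the spatial pattern of responses across voxels and testing whether the categories can be discriminated above chance (0.2 for five classes). Below is a minimal sketch with synthetic data, not the authors' actual pipeline; the voxel count, trial counts, and noise level are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_voxels, n_trials_per_class = 50, 40
categories = ["anger", "sadness", "neutral", "relief", "joy"]

# Synthetic data: each category gets its own mean spatial pattern
# across voxels, plus trial-by-trial noise (values are hypothetical)
X, y = [], []
for label, cat in enumerate(categories):
    pattern = rng.standard_normal(n_voxels)  # category-specific spatial signature
    trials = pattern + 1.5 * rng.standard_normal((n_trials_per_class, n_voxels))
    X.append(trials)
    y += [label] * n_trials_per_class
X = np.vstack(X)
y = np.array(y)

# Cross-validated linear classifier: accuracy above chance (0.20)
# indicates category information in the spatial response pattern
acc = cross_val_score(LinearSVC(max_iter=10000), X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f} (chance = 0.20)")
```

The key point mirrored from the study is that classification operates on the spatial pattern of responses, so categories can be separated even when their averaged response amplitudes are indistinguishable.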

ARTICLE UPDATE - Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: behavioral and brain evidence.

Schyns PG, Petro LS, Smith ML.

PLoS ONE

Competent social organisms will read the social signals of their peers. In primates, the face has evolved to transmit the organism's internal emotional state. Adaptive action suggests that the brain of the receiver has co-evolved to efficiently decode expression signals. Here, we review and integrate the evidence for this hypothesis. With a computational approach, we co-examined facial expressions as signals for data transmission and the brain as receiver and decoder of these signals. First, we show in a model observer that facial expressions form a weakly correlated signal set. Second, using time-resolved EEG data, we show how the brain uses spatial frequency information impinging on the retina to decorrelate expression categories. Between 140 and 200 ms following stimulus onset, independently in the left and right hemispheres, an information processing mechanism starts locally with encoding the eye, irrespective of expression, then zooms out to process the entire face, then zooms back in to diagnostic features (e.g. the opened eyes in "fear", the mouth in "happy"). A model categorizer demonstrates that at 200 ms, the left and right brain have represented enough information to predict behavioral categorization performance.

Sunday, May 17, 2009

ARTICLE UPDATE - Involvement of medial prefrontal cortex in emotion during feedback presentation.

Jimura K, Konishi S, Asari T, Miyashita Y.

Neuroreport, in press

It has been suggested that the posterior medial prefrontal cortex (pMPFC) implements cognitive functions involved during negative feedback processing. It has also been suggested that the presentation of the feedback elicits emotional processes. This functional MRI study examined whether pMPFC was associated with the emotional component in feedback processing. Participants were exposed to feedback while performing a version of a motion prediction task. The pMPFC was activated during negative feedback presentation and emotion-related activity was extracted from the pMPFC activation through parametric imaging analysis. It was found that the emotional pMPFC activity was greater in participants who scored higher on depressive mood scales. The results suggest that pMPFC also implements feedback-related emotional functions, which individually vary depending on depressive moods.

Saturday, May 09, 2009

ARTICLE UPDATE - Social Anxiety and Anger Identification: Bubbles Reveal Differential Use of Facial Information With Low Spatial Frequencies.

Langner O, Becker ES, Rinck M.

Psychological Science, in press

We investigated the facial information that socially anxious and nonanxious individuals utilize to judge emotions. Using a reversed-correlation technique, we presented participants with face images that were masked with random bubble patterns. These patterns determined which parts of the face were visible in specific spatial-frequency bands. This masking allowed us to establish which locations and spatial frequencies were helping participants to successfully discriminate angry faces from neutral ones. Although socially anxious individuals performed as well as nonanxious individuals on the emotion-discrimination task, they did not utilize the same facial information for the task. The fine details (high spatial frequencies) around the eyes were discriminative for both groups, but only socially anxious participants additionally processed rough configural information (low spatial frequencies).
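
The bubbles masking idea can be illustrated in code: random Gaussian apertures reveal parts of the image, with larger apertures conventionally used for coarser spatial-frequency bands, and the trial-by-trial relation between revealed regions and correct responses identifies the diagnostic information. The sketch below is schematic and not the authors' implementation; the image size, bubble counts, and sigma values are illustrative assumptions.

```python
import numpy as np

def bubble_mask(shape, n_bubbles, sigma, rng):
    """Sum of Gaussian apertures ('bubbles') at random locations, clipped
    to [0, 1]. sigma sets the aperture size; in the Bubbles technique the
    apertures are scaled per spatial-frequency band (larger = coarser)."""
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

rng = np.random.default_rng(2)
face = rng.random((128, 128))  # stand-in for a face image
# Coarser bands get larger apertures (sigma values are illustrative)
masks = {band: bubble_mask(face.shape, n_bubbles=15, sigma=s, rng=rng)
         for band, s in [("high_sf", 3), ("mid_sf", 6), ("low_sf", 12)]}
revealed = {band: face * m for band, m in masks.items()}
```

Summing the masks from correct trials and subtracting those from incorrect trials (the reverse-correlation step) then yields a classification image showing which locations, in which band, supported discrimination.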

ARTICLE UPDATE - Emotion Improves and Impairs Early Vision

Bocanegra BR, Zeelenberg R.

Psychological Science, in press

Recent studies indicate that emotion enhances early vision, but the generality of this finding remains unknown. Do the benefits of emotion extend to all basic aspects of vision, or are they limited in scope? Our results show that the brief presentation of a fearful face, compared with a neutral face, enhances sensitivity for the orientation of subsequently presented low-spatial-frequency stimuli, but diminishes orientation sensitivity for high-spatial-frequency stimuli. This is the first demonstration that emotion not only improves but also impairs low-level vision. The selective low-spatial-frequency benefits are consistent with the idea that emotion enhances magnocellular processing. Additionally, we suggest that the high-spatial-frequency deficits are due to inhibitory interactions between magnocellular and parvocellular pathways. Our results suggest an emotion-induced trade-off in visual processing, rather than a general improvement. This trade-off may benefit perceptual dimensions that are relevant for survival at the expense of those that are less relevant.

Saturday, May 02, 2009

ARTICLE UPDATE - Binding and Inhibition in Episodic Memory -Cognitive, Emotional, and Neural Processes.

Bäuml KH, Pastötter B, Hanslmayr S.

Neuroscience and Biobehavioral Reviews, in press

The goal-directed use of human memory requires that irrelevant or unpleasant memories are, at least temporarily, reduced in their accessibility and memory for more relevant or pleasant information is enhanced, thus making memory more efficient. There is evidence that, in memory, inhibitory processes operate to serve this function. Results from three experimental paradigms are reviewed in which the action of intentionally and unintentionally recruited inhibitory processes has been suggested. The findings provide evidence on representational preconditions for the action of inhibitory processes, specifying binding structures in which inhibitory processes may be triggered and binding structures in which inhibitory processes are generally not observed. The findings also provide evidence on how inhibition affects memory representations, including changes at the memory unit level and changes in the binding between single units. Finally, current knowledge on the interplay between inhibition and emotion and on possible neural correlates of inhibitory processes is reviewed.

ARTICLE UPDATE - Coarse threat images reveal theta oscillations in the amygdala: A magnetoencephalography study.

Maratos FA, Mogg K, Bradley BP, Rippon G, Senior C.

Cognitive, Affective & Behavioral Neuroscience, 9, 133-143

Neurocognitive models propose a specialized neural system for processing threat-related information, in which the amygdala plays a key role in the analysis of threat cues. fMRI research indicates that the amygdala is sensitive to coarse, threat-relevant visual information, for example, low spatial frequency (LSF) fearful faces. However, fMRI cannot determine the temporal or spectral characteristics of neural responses. Consequently, we used magnetoencephalography to explore spatiotemporal patterns of activity in the amygdala and cortical regions with blurry (LSF) and normal angry, fearful, and neutral faces. Results demonstrated differences in amygdala activity between LSF threat-related and LSF neutral faces (50-250 msec after face onset). These differences were evident in the theta range (4-8 Hz) and were accompanied by power changes within visual and frontal regions. Our results support the view that the amygdala is involved in the early processing of coarse threat-related information and that theta activity is important in integrating activity within emotion-processing networks.

ARTICLE UPDATE - Do tests of executive functioning predict ability to downregulate emotions spontaneously and when instructed to suppress?

Gyurak A, Goodkind MS, Madan A, Kramer JH, Miller BL, Levenson RW.

Cognitive, Affective & Behavioral Neuroscience, 9, 144-152

Behavioral regulation is a hallmark feature of executive functioning (EF). The present study investigated whether commonly used neuropsychological test measures of EF (i.e., working memory, Stroop, trail making, and verbal fluency) were related to ability to downregulate emotion both spontaneously and when instructed to suppress emotional expressions. To ensure a wide range of EF, 24 frontotemporal lobar degeneration patients, 7 Alzheimer's patients, and 17 neurologically normal controls participated. Participants were exposed to an acoustic startle stimulus (single aversive noise burst) under three conditions: (1) unwarned, (2) warned with no instructions (to measure spontaneous emotion downregulation), and (3) warned with instructions to suppress (to measure instructed emotion downregulation). Results indicated that higher verbal fluency scores were related to greater emotion regulation (operationalized as reduction in body movement and emotional facial behavior when warned of the impending startle) in both regulation conditions. No relationships were found between emotion regulation in these conditions and the other EF measures. We conclude that, of four commonly used measures of EF, verbal fluency best indexes the complex processes of monitoring, evaluation, and control necessary for successful emotion regulation, both spontaneously and following instructions to suppress.

Saturday, April 25, 2009

ARTICLE UPDATE - Contingency learning in human fear conditioning involves the ventral striatum.

Klucken T, Tabbert K, Schweckendiek J, Merz CJ, Kagerer S, Vaitl D, Stark R.

Human Brain Mapping, in press

The ability to detect and learn contingencies between fearful stimuli and their predictive cues is an important capacity for coping with the environment. Contingency awareness refers to the ability to verbalize the relationships between conditioned and unconditioned stimuli. Although there is a heated debate about the influence of contingency awareness on conditioned fear responses, the neural correlates of the formation of contingency awareness have received little attention in human fear conditioning. Recent animal studies indicate that the ventral striatum (VS) could be involved in this process, but in human studies the VS is mostly associated with positive emotions. To examine this issue, we reanalyzed four recently published classical fear conditioning studies (n = 117) with respect to the VS at three distinct levels of contingency awareness: subjects who did not learn the contingencies (unaware), subjects who learned the contingencies during the experiment (learned aware), and subjects who were informed about the contingencies in advance (instructed aware). The results showed significantly increased activations in the left and right VS in learned aware compared to unaware subjects. Interestingly, this activation pattern was found in learned but not in instructed aware subjects. We assume that the VS is not involved when contingency awareness does not develop during conditioning or when it is unambiguously induced prior to conditioning. VS involvement seems to be important for the transition from a contingency-unaware to a contingency-aware state. Implications for fear conditioning models as well as for the contingency awareness debate are discussed.

ARTICLE UPDATE - Unmasking emotion: Exposure duration and emotional engagement.

Codispoti M, Mazzetti M, Bradley MM.

Psychophysiology, in press

Effects of exposure duration on emotional reactivity were investigated in two experiments that parametrically varied the duration of exposure to affective pictures from 25 to 6,000 ms in the presence or absence of a visual mask. Evaluative, facial, autonomic, and cortical responses were measured. Results demonstrated that, in the absence of a visual mask (Experiment 1), emotional content modulated evaluative ratings and cortical, autonomic, and facial changes even at very brief exposures, and there was little evidence that emotional engagement increased with longer exposure. When information persistence was reduced by a visual mask (Experiment 2), differences as a function of hedonic content were absent for all measures at an exposure duration of 25 ms but statistically reliable at 80 ms. Between 25 and 80 ms, individual differences in discriminability were critical in observing affective reactions to masked pictures.

ARTICLE UPDATE - Sleep promotes the neural reorganization of remote emotional memory.

Sterpenich V, Albouy G, Darsaud A, Schmidt C, Vandewalle G, Dang Vu TT, Desseilles M, Phillips C, Degueldre C, Balteau E, Collette F, Luxen A, Maquet P.

The Journal of Neuroscience, 16, 5143-5152

Sleep promotes memory consolidation, a process by which fresh and labile memories are reorganized into stable memories. Emotional memories are usually better remembered than neutral ones, even at long retention delays. In this study, we assessed the influence of sleep during the night after encoding onto the neural correlates of recollection of emotional memories 6 months later. After incidental encoding of emotional and neutral pictures, one-half of the subjects were allowed to sleep, whereas the others were totally sleep deprived, on the first postencoding night. During subsequent retest, functional magnetic resonance imaging sessions taking place 3 d and 6 months later, subjects made recognition memory judgments about the previously studied and new pictures. Between these retest sessions, all participants slept as usual at home. At 6 month retest, recollection was associated with significantly larger responses in subjects allowed to sleep than in sleep-deprived subjects, in the ventral medial prefrontal cortex (vMPFC) and the precuneus, two areas involved in memory retrieval, as well as in the extended amygdala and the occipital cortex, two regions the response of which was modulated by emotion at encoding. Moreover, the functional connectivity was enhanced between the vMPFC and the precuneus, as well as between the extended amygdala, the vMPFC, and the occipital cortex in the sleep group relative to the sleep-deprived group. These results suggest that sleep during the first postencoding night profoundly influences the long-term systems-level consolidation of emotional memory and modifies the functional segregation and integration associated with recollection in the long term.