Monday, July 27, 2009

ARTICLE UPDATE - Modulation of Perception and Brain Activity by Predictable Trajectories of Facial Expressions.

Furl N, van Rijsbergen NJ, Kiebel SJ, Friston KJ, Treves A, Dolan RJ.

Cerebral Cortex, in press

People track facial expression dynamics with ease to accurately perceive distinct emotions. Although the superior temporal sulcus (STS) appears to possess mechanisms for perceiving changeable facial attributes such as expressions, the nature of the underlying neural computations is not known. Motivated by novel theoretical accounts, we hypothesized that visual and motor areas represent expressions as anticipated motion trajectories. Using magnetoencephalography, we show that predictable transitions between fearful and neutral expressions (compared with scrambled and static presentations) heighten activity in visual cortex as early as 165 ms after stimulus onset and later (237 ms) engage fusiform gyrus, STS and premotor areas. Consistent with proposed models of biological motion representation, we suggest that visual areas predictively represent coherent facial trajectories. We show that such representations bias emotion perception of subsequent static faces, suggesting that facial movements elicit predictions that bias perception. Our findings reveal critical processes evoked in the perception of dynamic stimuli such as facial expressions, which can endow perception with temporal continuity.

ARTICLE UPDATE - Event-related potentials to task-irrelevant changes in facial expressions.

Astikainen P, Hietanen JK.

Behavioral and Brain Functions, in press

ABSTRACT: BACKGROUND: Numerous previous experiments have used the oddball paradigm to study change detection. This paradigm is applied here to study change detection of facial expressions in a context that demands abstraction of the emotional expression-related facial features among other changing facial features. METHODS: Event-related potentials (ERPs) were recorded in adult humans engaged in a demanding auditory task. In an oddball paradigm, repeated pictures of faces with a neutral expression ('standard', p = .9) were rarely replaced by pictures with a fearful ('fearful deviant', p = .05) or happy ('happy deviant', p = .05) expression. Importantly, facial identity changed from picture to picture. Thus, change detection required abstraction of facial expression from changes in several low-level visual features. RESULTS: ERPs to both types of deviants differed from those to standards. At occipital electrode sites, ERPs to deviants were more negative than ERPs to standards at 150-180 ms and 280-320 ms post-stimulus. A positive shift to deviants at fronto-central electrode sites in the analysis window of 130-170 ms post-stimulus was also found. Point-wise comparisons between the amplitudes elicited by standards and deviants revealed that the occipital negativity emerged earlier to happy deviants than to fearful deviants (after 140 ms versus 160 ms post-stimulus, respectively). In turn, the anterior positivity emerged earlier to fearful deviants than to happy deviants (110 ms versus 120 ms post-stimulus, respectively). CONCLUSION: ERP amplitude differences between emotional and neutral expressions indicated pre-attentive change detection of facial expressions among neutral faces. The posterior negative difference at 150-180 ms latency resembled the visual mismatch negativity (vMMN), an index of pre-attentive change detection previously studied only for changes in low-level visual features. The positive anterior difference in ERPs at 130-170 ms post-stimulus probably indexed pre-attentive attention orienting towards emotionally significant changes. The results show that the human brain can abstract emotion-related features of faces while engaged in a demanding task in another sensory modality.

ARTICLE UPDATE - Evidence for mirror systems in emotions.

Bastiaansen JA, Thioux M, Keysers C.

Phil. Trans. R. Soc. B, 364, 2391-2404

Why do we feel tears well up when we see a loved one cry? Why do we wince when we see other people hurt themselves? This review addresses these questions from the perspective of embodied simulation: observing the actions and tactile sensations of others activates premotor, posterior parietal and somatosensory regions in the brain of the observer which are also active when performing similar movements and feeling similar sensations. We will show that seeing the emotions of others also recruits regions involved in experiencing similar emotions, although there does not seem to be a reliable mapping of particular emotions onto particular brain regions. Instead, emotion simulation seems to involve a mosaic of affective, motor and somatosensory components. The relative contributions of these components to a particular emotion and their interrelationship are largely unknown, although recent experimental evidence suggests that motor simulation may be a trigger for the simulation of associated feeling states. This mosaic of simulations may be necessary for generating the compelling insights we have into the feelings of others. Through their integration with, and modulation by, higher cognitive functions, they could be at the core of important social functions, including empathy, mind reading and social learning.

ARTICLE UPDATE - N400 during recognition of voice identity and vocal affect.

Toivonen M, Rämä P.

Neuroreport, in press

This study explored whether the neural processes underlying recognition of a speaker's voice and vocal affect are dissociable, as measured by event-related potentials. Individuals were asked to identify a target emotion, or a target (congruent) speaker, among distracter (incongruent) emotions or speakers. The incongruent condition elicited a more negative N400-like response during both tasks, but the scalp distributions differed. Whereas the response in the speaker task was more pronounced at frontal than at posterior recording sites, in the emotion task the opposite was true. Furthermore, the response was more pronounced at left recording sites for the speaker task and at right recording sites for the emotion task. The present results suggest that the neural substrates involved in processing speaker identity are different from those responsible for processing vocal affect.

Monday, July 20, 2009

ARTICLE UPDATE - Emotion words, regardless of polarity, have a processing advantage over neutral words.

Kousta ST, Vinson DP, Vigliocco G.

Cognition, in press

Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown in the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat. According to a different hypothesis, which does not posit a privileged role for the aversive system, valence, regardless of polarity, facilitates processing due to the relevance of both negative and positive stimuli for survival and for the attainment of goals. Here, we present evidence that emotional valence has an overall facilitatory role in the processing of verbal stimuli, providing support for the latter hypothesis. We found no asymmetry between negative and positive words and suggest that previous findings of such an asymmetry can be attributed to a failure to control for a number of critical lexical variables and to a sampling bias.

ARTICLE UPDATE - Amygdala activation predicts gaze toward fearful eyes.

Gamer M, Büchel C.

The Journal of Neuroscience, 29, 9123-9126

The human amygdala can be robustly activated by presenting fearful faces, and it has been speculated that this activation has functional relevance for redirecting the gaze toward the eye region. To clarify this relationship between amygdala activation and gaze-orienting behavior, functional magnetic resonance imaging data and eye movements were simultaneously acquired in the current study during the evaluation of facial expressions. Fearful, angry, happy, and neutral faces were briefly presented to healthy volunteers in an event-related manner. We controlled for the initial fixation by unpredictably shifting the faces downward or upward on each trial, such that the eyes or the mouth were presented at fixation. Across emotional expressions, participants showed a bias to shift their gaze toward the eyes, but the magnitude of this effect followed the distribution of diagnostically relevant regions in the face. Amygdala activity was specifically enhanced for fearful faces with the mouth aligned to fixation, and this differential activation predicted gazing behavior preferentially targeting the eye region. These results reveal a direct role of the amygdala in reflexive gaze initiation toward fearfully widened eyes. They mirror deficits observed in patients with amygdala lesions and open a window for future studies on patients with autism spectrum disorder, in which deficits in emotion recognition, probably related to atypical gaze patterns and abnormal amygdala activation, have been observed.

Monday, July 13, 2009

ARTICLE UPDATE - Short-term antidepressant treatment modulates amygdala response to happy faces.

Norbury R, Taylor MJ, Selvaraj S, Murphy SE, Harmer CJ, Cowen PJ.

Psychopharmacology, in press

RATIONALE: We have previously demonstrated that antidepressant medication facilitates the processing of positive affective stimuli in healthy volunteers. These early effects of antidepressants may be an important component in the therapeutic effects of antidepressant treatment in patients with depression and anxiety. OBJECTIVES: Here we used functional magnetic resonance imaging in a double-blind, randomised, placebo-controlled between-groups design to investigate the effects of short-term (7-10 days) treatment with the selective serotonin reuptake inhibitor, citalopram, on the amygdala response to positive and negative facial expressions in healthy volunteers. RESULTS: Citalopram was associated with increased amygdala activation to happy faces relative to placebo control, without changes in levels of mood or anxiety. CONCLUSIONS: These early, direct effects of antidepressant administration on emotional processing are consistent with a cognitive neuropsychological model of antidepressant action.

Monday, July 06, 2009

ARTICLE UPDATE - Human brain responsivity to different intensities of masked fearful eye whites: An ERP study.

Feng W, Luo W, Liao Y, Wang N, Gan T, Luo Y.

Brain Research, in press

Previous studies have shown differential event-related potentials (ERPs) to different intensities of fearful facial expressions. There are indications that the eyes may be particularly relevant for the recognition of fearful expressions; even the amount of white sclera exposed above and to the sides of the dark pupil can activate an amygdala response. To investigate whether the ERP differences between intensities of fearful expressions are driven by the differential salience of the eyes in fearful faces, ERPs were measured within a backward masking paradigm in which observers performed a gender decision task on male and female neutral faces. The emotional stimuli were low-intensity (50%), prototypical (100%), and caricatured (150%) fearful eye whites, derived from fearful faces of the corresponding intensities. Three sets of white squares, matched in pixel area to the eye whites, were created as control conditions. Analysis of the ERP data showed a linear increase in the amplitude of the parieto-occipital P120 across the three intensities of fearful eye whites. These ERP effects proved sensitive to the intensity of negative emotion rather than to simple physical features, as the same pattern of differences was not observed for the white squares. Larger parieto-occipital P250 amplitudes were observed for caricatured (150%) than for low-intensity (50%) fearful eye whites, which might reflect a subcortical pathway for emotion-specific, fearful processing. The results demonstrate that the human brain is sensitive to the intensity of fear, even when only fearful eye whites are shown in the absence of awareness.

ARTICLE UPDATE - Genetics of Emotion Regulation.

Canli T, Ferri J, Duman EA.

Neuroscience, in press

Emotions can be powerful drivers of behavior that may be adaptive or maladaptive for the individual. Thus, the ability to alter one's emotions, to regulate them, should be beneficial to an individual's survival and fitness. What is the biological basis of this ability? And what are the biological mechanisms that impart individual differences in the ability to regulate emotion? In this article, we will first introduce readers to the construct of emotion regulation and the various strategies that individuals may utilize to regulate their emotions. We will then point to evidence that suggests genetic contributions (alongside environmental contributions) to individual differences in emotion regulation. To date, efforts to identify specific genetic mechanisms involved in emotion regulation have focused on common gene variants (i.e., variants that exist in > 1% of the population, referred to as polymorphisms) and their association with specific emotion regulation strategies or the neural substrates mediating these strategies. We will discuss these efforts, and conclude with a call to expand the set of experimental paradigms and putative molecular mechanisms, in order to significantly advance our understanding of the molecular mechanisms by which genes are involved in emotion regulation.