Monday, March 30, 2009

ARTICLE UPDATE - Between- and within-ear congruency and laterality effects in an auditory semantic/emotional prosody conflict task.

Techentin C, Voyer D, Klein RM.

Brain and Cognition, in press

The present study investigated the influence of within- and between-ear congruency on interference and laterality effects in an auditory semantic/prosodic conflict task. Participants were presented dichotically with words (e.g., mad, sad, glad) pronounced in either congruent or incongruent emotional tones (e.g., angry, happy, or sad) and identified a target word or emotion under one of two conditions. In the within-ear condition, the congruent or incongruent dimensions were bound within a single stimulus and, therefore, presented to the same ear. In the between-ear condition, the two dimensions were split between two stimuli and, therefore, presented to separate ears. Findings indicated interference in both conditions. However, the expected right ear advantage (REA) for words and left ear advantage (LEA) for emotions were obtained only in the between-ear condition. Factors involved in producing interference and laterality effects in dichotic listening tasks are discussed.

Saturday, March 21, 2009

ARTICLE UPDATE - Enhanced post-learning memory consolidation is influenced by arousal predisposition and emotion regulation but not by stimulus valence.

Nielson KA, Lorber W.

Neurobiology of Learning and Memory, in press

Emotionally arousing stimuli are more memorable than neutral ones, and arousal induced after learning enhances later retrieval. However, there has as yet been little study of how stimulus qualities might interact with induced arousal, or of how individual differences might influence the modulation of memory. Thus, the present study examined the effect of arousal induced after learning on memory for words that varied in both arousal and valence, as well as the influence of three individual-differences factors known to affect arousal response: emotional suppression, emotional reappraisal, and arousal predisposition. Seventy-six adults (57 female) viewed and rated 60 words that normatively ranged from high to low in arousal and valence. Ten minutes later, they viewed a 3-min comedic or neutral video clip. Arousal induced after learning enhanced memory at a one-week delay, spanning the lengthy task without preference for word type or serial position, in contrast with reports of arousal effects that interact with stimulus qualities. Importantly, a predisposition to arousal led to greater enhancement of long-term memory, whereas the use of emotional reappraisal, which reduces arousal responding, inhibited the ability of arousal to induce memory enhancement. Thus, individual differences that influence arousal responding can contribute to or interfere with memory modulation.

ARTICLE UPDATE - How do emotion and motivation direct executive control?

Pessoa L.

Trends in Cognitive Sciences, in press

Emotion and motivation have crucial roles in determining human behavior. Yet, how they interact with cognitive control functions is less understood. Here, the basic elements of a conceptual framework for understanding how they interact are introduced. More broadly, the 'dual competition' framework proposes that emotion and motivation affect both perceptual and executive competition. In particular, the anterior cingulate cortex is hypothesized to be engaged in attentional/effortful control mechanisms and to interact with several other brain structures, including the amygdala and nucleus accumbens, in integrating affectively significant signals with control signals in prefrontal cortex. An implication of the proposal is that emotion and motivation can either enhance or impair behavioral performance depending on how they interact with control functions.

ARTICLE UPDATE - Nonautomatic emotion perception in a dual-task situation.

Tomasik D, Ruthruff E, Allen PA, Lien MC.

Psychonomic Bulletin & Review, 16, 282-288

Are emotions perceived automatically? Two psychological refractory period experiments were conducted to ascertain whether emotion perception requires central attentional resources. Task 1 required an auditory discrimination (tone vs. noise), whereas Task 2 required a discrimination between happy and angry faces. The difficulty of Task 2 was manipulated by varying the degree of emotional expression. The stimulus onset asynchrony (SOA) between Task 1 and Task 2 was also varied. Experiment 1 revealed additive effects of SOA and Task 2 emotion-perception difficulty. Experiment 2 replicated the additive relationship with a stronger manipulation of emotion-perception difficulty. According to locus-of-slack logic, our participants did not process emotional expressions while central resources were devoted to Task 1. We conclude that emotion perception is not fully automatic.

Monday, March 16, 2009

ARTICLE UPDATE - Emotion perception in emotionless face images suggests a norm-based representation.

Neth D, Martinez AM.

Journal of Vision, 9, 5.1-5.11

Perception of facial expressions of emotion is generally assumed to correspond to underlying muscle movement. However, some individuals' faces are often perceived as sadder or angrier than others', even when neutral and motionless. Here, we report on one such effect caused by simple static configural changes. In particular, we show four variations in the relative vertical position of the nose, mouth, eyes, and eyebrows that affect the perception of emotion in neutral faces. The first two configurations make the vertical distance between the eyes and mouth shorter than average, resulting in the perception of an angrier face. The other two configurations make this distance larger than average, resulting in the perception of sadness. These perceptions increase with the amount of configural change, suggesting a representation based on variations from a norm (prototypical) face.

ARTICLE UPDATE - Flawless visual short-term memory for facial emotional expressions.

Bankó EM, Gál V, Vidnyánszky Z.

Journal of Vision, 9, 12.1-12.13

Facial emotions are important cues in human social interactions. Emotional expressions change continuously and thus should be monitored, memorized, and compared from time to time during social intercourse. However, it is not known how efficiently emotional expressions can be stored in short-term memory. Here we show that emotion discrimination is not impaired when the faces to be compared are separated by several seconds, requiring storage of fine-grained emotion-related information in short-term memory. Likewise, we found no significant effect of increasing the delay between the sample and the test face in the case of facial identity discrimination. Furthermore, a second experiment conducted on a large subject sample (N = 160) revealed flawless short-term memory for both facial emotions and facial identity even when observers performed the discrimination tasks only twice with novel faces. We also performed an fMRI experiment, which confirmed that discrimination of fine-grained emotional expressions in our experimental paradigm involved processing of high-level facial emotional attributes: during emotion discrimination, significantly stronger fMRI responses than during identity discrimination were found in a cortical network, including the posterior superior temporal sulcus, that is known to be involved in processing facial emotional expressions. These findings reveal flawless, high-resolution visual short-term memory for emotional expressions, which might underlie efficient monitoring of continuously changing facial emotions.

ARTICLE UPDATE - The role of mirror neurons in processing vocal emotions: evidence from psychophysiological data.

Ramachandra V, Depalma N, Lisiewski S.

International Journal of Neuroscience, 119, 681-690

Recent evidence suggests that the mirror neuron system may serve as a common neural substrate for processing motor, linguistic, emotional, and other higher-level cognitive information. The current study employed psychophysiological methods to elucidate the role of this system in processing vocal emotions. Skin conductance and heart rate were measured in 25 undergraduate students while they listened to emotional vocalizations and while they thought about them (internal production). The results revealed changes in skin conductance response and heart rate during both the "listening" and "thinking" conditions. This suggests an active role for the mirror neuron system in processing vocal emotions.

Monday, March 09, 2009

ARTICLE UPDATE - Specific and common brain regions involved in the perception of faces and bodies and the representation of their emotional expressions.

van de Riet WA, Grèzes J, de Gelder B.

Social Neuroscience, 4, 101-120

Many studies provide support for the role of the fusiform gyrus in face recognition and its sensitivity to emotional expressions. Recently, category-specific representation was also observed for neutral human bodies in the middle temporal/middle occipital gyrus (extrastriate body area), but it is not clear whether this area is also sensitive to emotional bodily expressions. Besides these areas, other regions that process the affective information carried by the face and the body may be common and/or specific to the face or the body. To clarify these issues, we performed a systematic comparison of how the whole brain processes faces and bodies and how their affective information is represented. Participants categorized emotional facial and bodily expressions while brain activity was measured using functional magnetic resonance imaging. Our results show, first, that the amygdala and the fusiform gyrus are sensitive to the recognition of facial and bodily fear signals. Secondly, the extrastriate body area and area V5/MT are specifically involved in processing bodies without being sensitive to the emotion displayed. Thirdly, other important areas, such as the superior temporal sulcus, the parietal lobe, and subcortical structures, selectively represent facial and bodily expressions. Finally, some face/body differences in activation are a function of the emotion expressed.